Emergent Technologies: Friend or Foe?

Good morning West Coast and good afternoon East Coast. I'm Daniel Sargent, Associate Professor at the Goldman School of Public Policy at UC Berkeley. I'm excited to be here today in partnership with the Center for Security in Politics and the Center for Long-Term Cybersecurity to host and moderate a discussion about the risks and opportunities of emergent technologies with two leading national security experts and lifelong public servants: former Secretary of Homeland Security Janet Napolitano and Senator Mark Warner of Virginia.

Please allow me to introduce our distinguished guests for today's discussion. Senator Mark Warner was elected as the US Senator from Virginia in November 2008 and reelected to a third term in November 2020. He serves on the Senate Finance, Banking, Budget, and Rules Committees, as well as the Select Committee on Intelligence, where he is the chairman. From 2002 to 2006, he served as Governor of Virginia. The first in his family to graduate from college, Mark Warner spent 20 years as a successful technology and business leader in Virginia before entering public office. An early investor in the cellular telephone business, he cofounded the company that became Nextel and invested in hundreds of start-up companies that created tens of thousands of jobs.

Janet Napolitano is a Professor of Public Policy at the Goldman School and directs the Center for Security in Politics at UC Berkeley. Previously, Janet served as the 20th President of the University of California and formerly as Secretary of Homeland Security under President Obama. She is a former two-term Governor of Arizona, a former Attorney General of Arizona, and a former US Attorney for the District of Arizona. Professor Napolitano, I am pleased to turn this conversation over to you.

Well, thanks, Professor Sargent. Good afternoon, Senator Warner. It's great to see you. Nice to see you, Madam Secretary.

I hope you had a good Thanksgiving and I know that you all have a very busy schedule over the next few weeks, so we really appreciate you taking this time to be with us and share with us some of your thoughts about new technologies, emerging technology, cybersecurity, but other issues as well. Let's just dive in if we can. What do you view as the key emergent technologies that the United States needs to be preparing for? Is it AI, quantum computing, anything else? What are your thoughts there? Well, Janet, one, it's great to see you and it's a great opportunity.

I appreciate this opportunity. Professor Sargent, thank you for the introduction. As Daniel said, I was fortunate enough to get involved in an emerging technology back in the early '80s: the beginnings of the wireless industry.

Just recently — literally just an hour ago — I finished a Zoom with all of my interns, and you should have seen the looks on their faces when I said cellphone technology was a cutting-edge emerging technology in the early '80s. They were baffled by that, because it seems so old school at this point. I'm not sure I can fully answer the question of what the emerging technologies are, but I know what we look at from the intelligence standpoint. We look at artificial intelligence. We're looking at quantum computing.

We're looking at hypersonics. We're looking at supply chain issues around things like semiconductor chips, both legacy chips and cutting-edge, next-generation chips. A week or two ago, our committee had a fascinating brief on what has taken place in biotech and bioengineering, and there was a great analogy made by a professor from Stanford and a not-so-much-startup company in life sciences that I know quite well: where we are now in biotech and bioengineering is the equivalent of 1975 in computing. If we think back to ones and zeros in 1975, there was a lot on the horizon that we surely could not have fully predicted.

I would put that list up there, but I think we have to include life sciences. I think we have to include the challenges and opportunities, particularly from the national security standpoint, of overhead satellites and all that is happening there. And I think we also have to recognize that no matter what definition we put around "emerging" today, we have to be nimble enough to amend it, so that when the next thing comes out of Berkeley or Stanford or MIT or Virginia Tech, we're flexible in our thinking and can move as the technology moves. Yeah. You still have

a pretty funny line you used in speeches about Nextel. Will you share that with us? I would always point out that, having been involved in Nextel and cofounded it, I was the only politician who said, even when I was speaking, leave your cellphone on — because when it goes off, where other people hear an annoying sound, I hear cha-ching, cha-ching. I was thinking about that joke last night, and I realized that even making that joke today, Janet, probably would not go over well with most Generation Z kids. Because they're saying, well, who uses a cellphone to call somebody on a voice call anymore? Yeah.

Time goes very quickly, both in tech and in biotech, which I'm very interested that you raised. Mark, what do you think are some of the risks that we already know of associated with new technologies? I'm reading the new Henry Kissinger book — and it's not just Henry Kissinger, it's also Eric Schmidt and another co-author — on AI. Obviously, I think AI poses some of the most dramatic issues: the ability of machines to outperform humans; the ability of machines not only to outperform but to take on a decision-making process that may be independent of human control; the idea that we would turn defense systems or other choices over to machines where there may be algorithmic bias too often built in by people who look like you and me, as opposed to people who look like the rest of the world. How we sort through all that, I'm not sure we have figured out at this point.

Let's just start with AI for a moment. I would argue, if we look back now to the late '90s and the 1996 Telecom Act, which put in place things like Section 230 and gave social media platforms total freedom to break things with total impunity — I think I would have had a more fervent debate. I would have come in as a policymaker saying, maybe we should put at least some initial guardrails in place, or have a trigger that says ten years from now we're going to put some guardrails in place, because we weren't sure, as we were thinking about social media platforms, what the good would be, but also, I would argue, the dark underbelly of social media. Now, in terms of putting in place some of those rules and regulations —

It's been, frankly, one of Congress's greatest failings that we've done nothing. Zippo. California obviously has moved further ahead on privacy, but on other guardrails, we've not been able to put any in place for what has become a fairly mature technology. And when we think about AI, which makes the issues around social media platforms look puny in comparison — the idea that we're going to allow this set of technologies (big data, machine learning, AI; in fact, we don't even have full definitions around each of these) to evolve on its own in the wild without any guardrails, and then think that ten years from now we're going to come back and put guardrails in place?

I don't think that works. Now, if you were to ask what guardrails ought to be put in place, I don't feel I'm competent enough to answer that by any means. I was in London recently at Google DeepMind; they have some fascinating things going on there. They're moving ahead on a series of areas that, at least at first blush, could benefit humanity. I wish there were an equal set of academics, ethicists, lawmakers, and policymakers thinking about how we ought to also put some guardrails around what's clearly going to be something that will fundamentally change all of our lives.

I just worry at times that with some of these technologies, when we wake up to the reality of it, it may be too late. Take the life sciences briefing, where the CEO was talking about the idea around nitrogen — I believe it was nitrogen — the need for it in fertilizer, and how much money we spend trying to create these fertilizers, whereas soybeans have the ability to do this process literally through the root structure. I had thought biotech over-promised and under-delivered for years. But they're literally talking now about grafting that biological engineering onto corn or wheat or other crops.

Obviously, environmentally, it's the right thing; it sounds like all these wonderful upsides. But for every upside you can think of in bioengineering, there's also the chance of dramatic downsides. How do we sort this out in a world where, frankly, our policymaking in Washington has gotten extraordinarily slow, as you well know?

I almost feel like I keep going on with this question, but it's important. One of the things that has really concerned me over the last few years has been the fact that America — and I would almost argue the West writ large — has retreated from a lot of the standard-setting entities where these new technologies are developed on an international basis. I saw this as a former telecom guy with 5G, where frankly we were asleep at the switch, and China — and when I say China, my beef is not with the Chinese people, it's with the PRC and Xi Jinping's policies — flooded the zone and took over the standard-setting entities around 5G. That has implications well beyond the technology standards. How do we think about each of these emerging technologies in a more holistic way, and not simply chase the technology, or chase the venture capitalists making money from the technology? Yeah, I think that's right.

I take it you would recommend that the administration lead an effort to reengage internationally with standard setting for these new technologies? The short answer is yes. When we think about standard setting, it's more than just — again, as a telecom guy — which frequencies and the basic technology nuts and bolts. I think we implicitly build in our biases.

When those biases come from democracies, we build in the notion of some level of transparency. We build in the notion that maybe there should not be total governmental control. We build in the notion that if a call is going from Berkeley to Buenos Aires, it maybe shouldn't be routed through Shanghai — the way Huawei equipment would route it.

I do think there is a chance here to re-engage an alliance of the willing amongst democracies around the world — one that is not, and should not be, American-only driven. I think there is a real willingness inside the administration; I had a long talk with Secretary Blinken about this recently, and they are putting in place both an emerging technologies division at the State Department and engagement around these standard-setting bodies. In the past, what would happen is we would send governmental people, and private industry would also come to these standard-setting bodies, and generally speaking, the West drove the agenda. I don't want to focus entirely on Huawei and 5G, but this was a case where government stepped back, particularly under Trump, and on top of that, the private sector was not sending as many experts to these associations and entities setting the standards, and China flooded the zone.

I think this is one of a variety of areas where there could actually be a re-emergence of American leadership in combination with our allies. Yeah. To what extent does the ability of Congress to be agile and nimble and proactive, as opposed to reactive, require the interaction of both parties?

What do you see as some of the issues with the partisanship we see out of DC now? And if I might add, are these issues ones that members of both parties are willing to engage on? I will give you the good news and the bad news.

Obviously, in your time both as Homeland Security Secretary and as Governor, you always led in a way that made that effort to be bipartisan. One of the values of bipartisanship is not that the objective solution set is necessarily better; it's that, particularly in our country, when we go from one team controlling power to the other, we don't relitigate the issue again. Whereas if we do it with only one team, it's constantly being relitigated, which I don't think makes much sense. I do think on technology issues.

There is still more agreement. I do think one of the great frustrations I have had, coming from the Intelligence Committee and being among the first to expose some of the manipulation done by the Russians using Facebook and other platforms, is that we still have not been able to move at all in that area. We've not been able to move on privacy.

Again, I point out that California has moved on privacy. Obviously, the Europeans have as well, and we're ceding that traditional leadership. That would be the bad news. No Section 230 reform, even though Facebook, in my daily political brief, says they are in favor of Section 230 reform.

We've not been able to reach consensus. I'll point out two areas where I think we are finding agreement. One is the need for America to up its game in the global and domestic production of semiconductor chips.

Seeing the shortages there, the CHIPS Act — which is part of what's called USICA, a broader-based research bill — got 68 votes in the Senate. It's crazy to me that the House has not taken it up. But that is an area where, with bipartisanship and some big money — $52 billion on semiconductors and two billion on 5G and O-RAN, open radio access network — Congress put down its marker. I'd also say, right now the defense bill is up. One of the amendments we're trying to get on it, which I think will get 75 votes, is around an area where you have a lot of expertise: cybersecurity.

As you know, we have no mandatory reporting requirement for cybersecurity incidents. Thank God SolarWinds reported and Colonial Pipeline reported, but neither of those entities was even required to tell CISA — the new entity, which I know you supported, that's ultimately been created to provide that non-regulatory domestic cybersecurity expertise. There was no requirement to report. We've worked to put a reporting requirement in place, with appropriate indemnification and privacy protections, because you've got to not only tell the government — the government has got to share it with other folks in the private sector. Those are two examples of where there is still bipartisanship. I take more than a little bit of pride in this: the Intelligence Committee, which I'm proud to be chairman of — we view ourselves as the de facto technology committee, because there is no committee on technology in the Senate, as you know, nor in the House for that matter. The House has a Science and Technology Committee, but it doesn't have as broad a scope as some of the things we're looking at.

We stay bipartisan on these issues, from AI to quantum to concerns about hypersonics. There is some good and some bad coming out of this, but it is an area where, in the past, America would already have exerted its leadership. Our failure to do so, I think, has cost us. Yeah, I think that's right. USICA is an interesting bill that was pushed, as you said, by bipartisan members of the Senate.

I think Senator Schumer played a leadership role in getting it through the Senate. What all is in USICA besides the CHIPS Act? Because I want to return to the CHIPS Act in a moment. USICA has basically $52 billion for chips, two billion for 5G and the next generation beyond 5G, called open radio access network. Then there is roughly $150-200 billion that's not appropriated, simply authorized.

A lot of the dispute has been over that part: it includes a dramatic plus-up of the National Science Foundation, and there are other areas as well. One of the battles became whether we should simply plus up the National Science Foundation, or also plus up the Department of Energy and our National Labs. Again, you guys in California are blessed with great ones; we've got at least one in Virginia. We reached some accommodation between NSF and DOE.

Frankly, it was not my debate; I'm not sure whether we got the right mark or hit the right split there, but I think that can be resolved. A lot of this USICA battle has been, I think, twofold. One, it's less a partisan battle and more a traditional Senate-versus-House one.

The fact is the Senate did all this — Todd Young, a Republican Senator, and Chuck Schumer took the lead on one piece of it; John Cornyn and I took the lead on the chips piece — and the House felt they were left behind. So there are some of the traditional House-versus-Senate exchanges. I think there is also an understandable concern that it be viewed as a pro-America research and development bill and not as an anti-China bill.

Again, one of the things I'm very sensitive to, or at least try to be, is recognizing that when we talk about a rising China, we make clear who our beef is with, and that this anti-China rhetoric is not used as a tool for anti-Asian American or anti-Chinese American discrimination, which I know we have seen. There have been stories recently where sometimes, I think, the FBI and others have gone a bit too far. Yeah. I need to disclose for the audience that I'm on an advisory committee for Intel, which is a semiconductor manufacturer — the largest in the United States and, indeed, the world. But I think one of the things we've noticed is that semiconductors are pinch points in the supply chain. If you don't have enough semiconductors, you can't produce enough cars.

Anything one uses has a semiconductor — a chip or chips — in it. And this leads to the issues with China: the bulk of semiconductors are actually manufactured at foundries in Taiwan. That raises the issue of China, Taiwan, and the United States. Are we prepared? Have we thought this through? What are your thoughts there? Well, you're right. This is an area you probably have more expertise in than I, but I've really tried to go to school on the industry over the last couple of years. Interestingly enough, as you know, this has been an area that's been boom and bust for some time. We have two fabs in Virginia: one that's expanding, and one that's actually been shut down for a number of years because it went through this boom-and-bust period.

As a matter of fact, pre-COVID there were even some concerns about oversupply in the chip industry. What happened with COVID was that we had this dramatic cutback in capacity, and with the consumer move toward buying more electronic products to use at home, that's where the chip manufacturers moved their business. The legacy industries — the autos — got left behind, which is one of the reasons why we still have some of those auto plants sitting idle.

I think we have to recognize that there is boom and bust, number one. Number two, I think we have to recognize that, as you know from Intel, not all chips are made the same. There are cutting-edge chips, legacy chips, memory chips.

There are places in this equation, such as packaging and machining and other areas, where we and some of our allies, like the Netherlands and elsewhere, are still doing pretty well. You point out appropriately, though, that Taiwan has some of that cutting-edge manufacturing. America's share of overall chip manufacturing has gone from roughly 40 percent to about 12 percent; the PRC itself has gone from 12 to somewhere on the path to 25-30. You have the national security concerns around Taiwan, and you have China itself making huge investments — estimates of $150 billion worth of capital investment.

You also have countries like South Korea talking about somewhere between $65 and $130 billion of investment, Japan talking about multi-billion-dollar investments, and Taiwan continuing to do well. There is a question here, on a national security basis, of whether we need some level of domestic manufacturing capacity in our country. The consensus I've come to is that unless we provide some additional subsidy dollars, there will not be new fabrication facilities built in America: these fab plants run between $8 and $15 billion, other nations are willing to subsidize $2-3 billion per fab, the plants take a lot of land and a lot of water, and industry experts say we're about 20 to 25 percent more expensive here in America.

I think we have to make these investments from a national security standpoint: we maintain the domestic supply chain, and we keep some of the innovation. I still think having these long-term facilities in America makes a great deal of sense. Twenty years ago, this would have been called industrial policy. But when we see other nations, particularly China, make these huge multi-billion-dollar investments, not just in chips but in a series of other areas, I think we in the United States — and by implication the West — need to pony up as well. We need to put our money where our mouth is.

I have found a lot of my Republican colleagues — I'm thinking about John Cornyn and John Thune and a number of other senators on the Intelligence Committee — for whom this is a pretty dramatic change: acknowledging that the market alone is not going to solve this. If we leave this to the market alone, there won't be additional fabrication facilities built in this country. We'll lose our domestic production capacity; we may still keep some of the innovation, but the innovation sometimes goes with the fabs too. I think this makes sense, and getting it right is why USICA, or the CHIPS bill, is so important — having a process for that $52 billion, about $12 billion of it for research and roughly $40 billion that could help subsidize seven to ten new fabs built over a number of years here in this country.

We've got to make sure we put appropriate controls in place, because what we're doing in CHIPS we may have to do in artificial intelligence, and are already doing in quantum computing. There may be a series of other emerging technology areas where we have to make these large-scale investments, not just on the research basis but on the development basis as well. Right. I think thought needs to go into what the critical elements of emerging technologies are where the United States, from a security standpoint, needs to put some of its public dollars to remain competitive with the world — and that's a very different approach to government spending than we've seen before, actually putting in significant dollars to support one private industry. I get this; as a matter of fact, over the Thanksgiving break I read a fairly good critique of the CHIPS bill saying maybe we're putting the money in the wrong place, and I would push back against some of it.

But I think maintaining domestic chip fabrication facilities and the basic research, and investing in some of these other cutting-edge technologies, is, I would argue, more important than adding an extra plane or ship or tank, because I think the competition of the 21st century is going to be about who wins the technology evolutions, not who builds the most traditional military hardware. Although, again, as you well know, even the most sophisticated military hardware has to have those bleeding-edge chips as an absolutely critical component. For sure. You can't easily segregate between hardware you need on the military side and what we need simply for domestic production of consumer goods and other material.

Let me ask you. Actually, let me add one thing here, Janet. One of the reasons why I think there's been part of this evolution is that COVID showed that the just-in-time global supply chain model we've all gotten used to has real vulnerabilities. At the end of the day, it may be worth an extra few cents on a chip, or an extra two cents on some PPE, to make sure there is a domestic — or if not entirely domestic, a domestic-plus-allied — supply chain. Again, my friend Debbie Stabenow argued very strongly with me in one part of the battle to make sure at least some of this went into legacy chips, which I was not 100 percent sure of at the beginning. The number of auto plants sitting idle in Michigan right now because they don't have access to chips — all the tanks in the world aren't going to fix that.

That is for sure. Turning to USICA, Mark, what do you hear on the House side? Is it going to be stuck there forever, or is it going to move? I think there is a recognition; there was an announcement right before Thanksgiving that we would have a conference on this bill. I'm not sure what the House bill is — there were some minor bills in this neighborhood, but nothing of this scale.

It is, in my mind, one of the highest priorities that we need to make happen. Secretary Raimondo, the Commerce Secretary, has been great from the administration's side at pushing this and advocating for it. I wish the White House had been as active in pushing for it. Again, this was a case of a bill that was passed in July, and — putting on my partisan hat for a moment — coming out of some of the challenges around Afghanistan.

If we had sent a strong signal by passing USICA that we were stepping up our game, not only on chips but on investment in emerging technology, I think that would have been a great win for the president and a great signal to the rest of the world. It's been more than a bit frustrating that, other than this internecine who-gets-credit fight, House versus Senate, there's not been a lot of clarity. I'm hoping to be on that conference committee, and I'm ready and anxious to meet and get to yes — the sooner the better. Agreed. We've talked a little bit about cybersecurity, but let's turn to that directly if we could.

When I started as Secretary in 2009, our chief threat streams still involved aviation security, and I spent maybe 10 percent of my time on cyber. By the time I left, four and a half years later, I was spending a good 40-50 percent of my time on cyber; it was exploding. We hope we don't end up in an actual kinetic war with China, but we can anticipate that we're in a battle in the so-called gray zone where cyber is concerned. How do you see our interactions with the Chinese in this area, but also the Russians, the Iranians, and others that are active in the cybersecurity realm? Secretary Mayorkas, the current Homeland Security Secretary, is probably spending 60 or 70 percent of his time with CISA — which, again, I know is the independent agency that was set up afterwards; it took us too long, not judging you by any means, but it took us too long to get it stood up — to try to have that domestic, non-regulatory cybersecurity expertise.

But it is the firefighter you call when the fire gets lit: call CISA so they can help you respond, public and private together. Huge issue. During your tenure, President Obama did send a strong signal to China, and for a while they cut back on some of the intellectual property theft. But it is estimated that China steals $300-500 billion a year of IP from us and from around the world. That's a lot of dollars you don't have to invest as a nation-state if you can steal it — through cyber, and not just cyber, but through joint ventures and other things. Let me also say, I think a lot of American and other businesses have turned a blind eye to the Chinese government's bad behavior out of fear that they would lose the Chinese market, and consequently have made compromises they wouldn't make anywhere else.

But intellectual property theft is where China has done most of its activity. Russia has done traditional information exfiltration, SolarWinds-style, but there is also Russia and its quasi-agents — people who may work for the GRU during the day and be cybercriminals at night. Their attacks have been much more ransomware-based. But let me step back for a minute: in SolarWinds, the bad guys — the Russians, in this case, as attributed — got into 18,000 companies.

Luckily, they just exfiltrated information, but if that had been a complete denial of service and shutdown of those 18,000 companies, our whole economy could have come crashing to a halt. This is an area where we are still vulnerable. There was a fascinating story I saw in the public press over the last three or four days that shows where this kind of cyber activity — still not kinetic, but conflict — may be headed. I've not gotten independent intel on this; this is just from public-domain stories about the recent back-and-forth between Iran and Israel, where the Iranians believed the Israelis shut down their gas stations, driving up the cost of gasoline and completely disrupting people's access to gasoline in Iran for a couple of weeks. The Israelis have then said the Iranians broke into the biggest gay dating website in Israel and disclosed a million and a half Israelis' private information.

This is different from a ransomware attack, different from stealing intellectual property, and different from traditional spying, but this may be where cyber conflict is headed: you're not bombing someone, you may not be violating the rules of war, but you are definitely affecting a domestic population's way of life. Whether it's number one, two, or three — as chairman of the Intelligence Committee, of the things that keep me up at night, cyber is definitely in the top three. I have to say I'm glad to hear that, and yet I'm not glad to hear it, because we wish it weren't such a present risk. But it certainly is.

One of the areas of risk associated with new technologies is the risk to our democracy itself, and the role of social media as an accelerant on the flame of extremism on both sides — though primarily, in the US recently, we've seen it on the right-wing side. What do you think Congress can do, or should think about, there? We're going to do some version of what California and the Europeans, with GDPR, have done and put in place some basic privacy rules, number one. Number two, there's the debate over the potential of breakup, which I have not moved to yet, but which I'm open to if we don't make some progress.

I was a telecom guy, as I mentioned earlier, and we ought to import some of the ideas from telecom. It used to be really hard to move from one telephone company to another for long distance; we got number portability. I think we need data portability and interoperability: if you get tired of a certain platform, you can easily migrate with all your data to NewCo and still talk or communicate with people who remain on the previous platform. So, data portability. This is something, again, I know the California legislature looked at but wasn't able to get across the line, and it probably would be too much for the American Congress to grapple with. But take the idea that Facebook and Google and Twitter and so forth are free — you and I both know they're not free.

Their model is simply based on sucking information out of us and then monetizing it. There's nothing implicitly or morally wrong with that, necessarily, but I think we ought to become informed consumers. I'm a big believer that there ought to be some requirement that these platform companies share with their consumers — or their products, as the case may be — how much the data they're sucking out of us is actually worth. Some level of data visibility law.

Then, finally, I think we do need to take on what I referenced earlier: Section 230, which, back in the late '90s when the legislation was passed, basically put in place complete impunity — a complete legal liability shield against any of the content on these platforms. Maybe that was right in the late '90s; I'm not sure, 25 years later, it still makes sense. Again, even the largest platform companies, like Facebook, say they're willing to make changes there. We have made certain carve-outs already — child pornography, bomb-making.

I've got legislation with Amy Klobuchar and Mazie Hirono called the Safe Tech Act that would say, let's at least attach liability for certain things that are already illegal when they are done over social media. If you're committing a civil rights violation in business and you're doing it over social media, there ought to be some liability. If you're aiding an illegal actor, which is basically what Facebook allowed when the Myanmar government was using Facebook as a platform to encourage people to go out and murder the Rohingya, there should be some liability there.

If there are levels of cyberbullying that are illegal in other arenas, maybe they should be illegal on social media as well, with the ability to enforce injunctive relief. I mean, there was this horrible case of somebody on the Grindr site who got manipulated; another person had his life basically ruined, and there was no ability to get injunctive relief to try to prevent what the platform didn't even deny was happening, because they claimed Section 230 protection. The First Amendment obviously needs to be preserved, and you have a right to say stupid stuff, but whether you have the right to have it amplified a billion times remains to be seen; I don't think there should be that same First Amendment protection. And if a platform is receiving benefits from paid advertising: there are prohibitions on television and radio and other mediums, so that if you are selling a faulty product or a pyramid scam, you'd be gone after.

There is no such prohibition on social media. Our Safe Tech Act, I think, preserves the First Amendment; it's not full content moderation, it simply enforces some of the laws that are already out there on social media. We're still looking, and we've had some good conversations with Republicans, to get bipartisan support.

There's a companion bill already in the House. There's been another approach that has gotten some bipartisan support that I've also been looking at, which looks at the algorithmic biases that may be in place, if the algorithm is sending you disproportionately to some place. I don't fully understand how they're phrasing this, because it's easy to say, but if it essentially sends you to an illegal site, by my definition that is a little harder to sort through.

Even this week it's been suggested that the House Energy and Commerce Committee, where this will probably arise, is going to have a hearing on this whole universe around Section 230. That, I hope, would move. I've gone on way too long on this answer. I wish we could have moved on some of these other areas, like data portability and data valuation.

I've got bipartisan legislation on what are called dark patterns, which, again, I think you know and this audience probably knows. But for those who don't, it's when something comes up on a site and you have no real ability to opt out: you get the big flashing light to sign up here, and you've got to go through three different pages to find a place to say, no, I don't want this. That's technically called dark pattern usage, and that ought to be prohibited as well. What you're saying is that there ought to be some guardrails that would go into the business model the platforms use, as opposed to the government necessarily; it's self-regulating the [inaudible]. I don't think we're going to get to content regulation, because of the First Amendment and because I don't know any bipartisan path where you can get to content regulation.

I also don't know if I want to. I disagree with some of my friends on the right who say these social media companies have an implicit anti-conservative bias. I think their bias is actually to make money. If you look at who the top 10 posters on Facebook are on a daily basis, most of this audience has probably not heard of seven of them, because they are far right-wing bloggers and posters.

The thing is, we obviously need to respect our First Amendment, and I think there are ways to respect the First Amendment and still put some appropriate guardrails in place. One last comment, because I've been following this abroad as well. I know this is a hard subject to grapple with, even when you look at content moderation in countries that don't have a First Amendment. After the great tragedy of the mosque shooting in New Zealand, some of the attacks and shootings in France, and the manipulation of social media in the UK and elsewhere, none of these other countries, or at least the Western democracies, have totally sorted this out. I actually think the British are going to come up with some legislation that they think will get fully vetted early next year.

They may be one of the first, but even in countries that don't have First Amendment protections, this is not an easy needle to thread. That's one of the reasons the idea of breakup has so much attraction and why I've not ruled it out: if we can't get some guardrails in place, if we can't add some pro-competition notions like data portability and data valuation, then, as some of my colleagues have said, we need to look at full breakup. I'm not taking that off the table.

Good to hear. So 230 is still an active topic of conversation with your colleagues? It is. How we can accept the status quo after the whistleblower from Facebook testified, was it a month or two ago, with absolutely damning comments about how that platform operates, is beyond me.

In her case, I think the most powerful testimony was about manipulating young women around eating disorders and other issues; to say that that status quo is acceptable is just beyond me. Totally agree. You've mentioned several times working with the other side of the aisle, and you were one of the lead negotiators on the so-called Bipartisan Infrastructure Act that the President recently signed. How did you get that done? I'm going to ask about the reconciliation bill too, but first, how did you get the bipartisan bill done? Give me the nice one first. Here's the slow pitch over the center of the plate. The mainstream media's attention span is pretty short, which is not exactly a news flash, but you're asking, how did this group come together? Well, most of this group, at least eight of the 10 of us, had worked very closely together with then-Secretary Mnuchin on the last COVID relief bill that took place in December of 2020, the so-called 908 bill that had $908 billion of relief.

My Republican friends like Susan Collins and Lisa Murkowski and Bill Cassidy and Mitt Romney and Rob Portman, we'd worked together in the past. This group came about with Rob Portman and Kyrsten Sinema; we had a prior existing working relationship, and there was trust.

It was still hard sausage making. It took a long time to get there. But the fact is we got 19 Republicans, and kudos to my Republican colleagues, who took enormous amounts of grief for working with us.

It's hard to deny that, for a nation like ours that hadn't made a meaningful investment in infrastructure in 50 years, this was good policy: not just roads and bridges, but things like resiliency, broadband deployment at $65 billion, and the energy component, a lot of transition to a smarter grid, investment in electric vehicle infrastructure, electric buses. It was, I think, a good piece of work. And taking off my partisan hat for a second, I thought it was completely stupid that Democrats in the House would not go ahead and pass it once we passed it in the middle of July, to give the president a big win and the country a big win.

There was a certain irony that it literally got signed about a week after our election in Virginia, where, wearing my partisan hat again, we lost the governorship. You lost the whole ballot. Yeah, we lost the whole ballot, partially because we didn't yet have this big win that we could have gone and talked to people in Virginia about; the gubernatorial candidate could have said, "Listen, I know what we can do here about this road, I know what we can do here about this broadband." That would've been a tangible item. I'm proud of the work. I think it will be significant for our country in a host of areas.

I do think, and you as a governor will get this more than most of my colleagues, that as we all know from being governors, passing the bill is just the first step; what matters is how it gets implemented. With this level of new spending, in areas where you're either creating programs entirely new or pumping up things like roads and bridges to historic numbers, we need the best oversight team possible.

I think the President's taken a good step with our mutual friend Mitch Landrieu. But I think there ought to be a whole team of people on implementation. Yeah. Turning to the next bill, the reconciliation bill, which is now in your chamber, in the Senate.

Again, give me your read on the lay of the land there. Well, I'm actually pretty optimistic. I'm not going to put a date certain on it.

It's at roughly $1.75 trillion; I was prepared to do more than that. Because, again, over a 10-year frame, the inflationary pressures we're feeling right now are partially due to supply chain, and to the extent they're due to government spending, it's because of the five trillion dollars we spent under both Trump and Biden on COVID relief.

But I think if we talk about what's in it, the components are pro-growth. We know we need more people back in the workforce, disproportionately women; well, childcare and guaranteed preschool are two pretty good places to start. There was a piece this morning on the news that one out of every five Americans is a caregiver at one level or another, whether it's for kids or for aging parents; providing some support, particularly for aging parents and the disabled, makes a lot of sense. And taking on climate change in a meaningful way: I would do a carbon tax, but even without that, there are hundreds of billions of dollars of incentives.

I say this as a father; one of my daughters is a Type 1 diabetic, and at least trying to bring down and cap the cost of a drug like insulin makes some sense to me. I guess I've been partially guilty of this as well: we've spent the last four months talking about topline numbers, and most Americans don't have the foggiest idea what's in this legislation. The component parts are popular.

I'm trying to talk about what's in it. If I could have constructed it a little differently, I would probably have tried to do fewer things for a longer period of time rather than the whole wish-list. Because, again, you and I both know from our time as governors, and you being more in the belly of the beast of the federal government, that the record of the federal government under any president in implementing a whole lot of new programs simultaneously has been mixed, to say the best. Yeah. One always worries

when you see a program that is only funded for a year or two years, given how the political lay of the land could change, regardless of the merits of the program. Neither party has clean hands on creating fiscal cliffs, whether it's tax cuts that expire at some point or new programs that start and stop. Again, if I could wave a partial magic wand, I would have made it fewer programs.

The only other thing I'd say on this second half, and I'm not sure we can shift the battleship again: a lot of the initiatives, particularly those for climate change, are extraordinarily important. But a lot of the other social initiatives in this plan feel like the list was put together pre-COVID. If there were a major change I would make in this legislation, it would be thinking through the fact that many Americans, post-COVID, are rethinking what their work-life balance ought to be. How we invest in human capital, and treat that investment in our tax, accounting, and reporting systems at least as well as we treat things like research and development and intangible goods, is the area I wish we had spent more time on. Let me just ask one concluding question. If you could step back a moment: the United States has been the world's leading economy because we have led in technology and innovation for years, and our universities have been talent magnets from around the world. What do you think the United States needs to do, and what does Congress need to do, to sustain that position as the number 1 innovation center for the globe? Let me step back and say, I think we've seen that without American leadership, the rest of the world founders a little bit.

We saw that when President Trump so dramatically tried to basically take America out of that leadership role. Waiting for the Europeans or the Japanese or any other countries to take on these macro risks on their own, without American leadership, meant the world suffered; I think democracy suffered over the last four years. To get this right, we need to make these investments, but we also need to make sure that we continue to be a magnet for top talent from around the world: immigration reform. We need to make sure that our challenges with China, for example, are focused on the challenge of the Communist Party and don't turn into anti-Asian or anti-Chinese political propaganda. I think there are countries like Australia that seem to manage that particular component, how you deal with the Chinese diaspora, better than America.

I think there are things we can learn there. Let's keep investing in our universities. Let's realize we are going to have to get into at least the area of quasi-industrial policy to stay competitive with China in many of these areas, and let's make sure we continue to be the place where the best talent from around the world wants to come study and live their lives. There you go. Thank you so much, Senator. It's been a wonderful conversation.

I know we have some questions coming in from the audience. I'm going to turn it over to Professor Sargent. Terrific. I'm going to seize the opportunity to lead off with a question of my own, if you don't mind. Earlier in your conversation, Senator Warner, I think you reflected that you wished there were a set of academic theorists and policymakers thinking about how to manage the consequences of artificial intelligence for security and for society. In response to that observation, I would like to ask you both to reflect upon the role that universities can play in managing threats emanating from emerging technologies. I'm mindful, as a historian, of the role that the service academies have historically played in anticipating and responding to disruptive technological change, from the advent of battleships to the rise of airpower.

Do you think that civilian universities, which are after all key generators of technological disruption, could be more proactive in anticipating the security risks and social risks that emanate from disruptive technologies, and in participating in the development of solutions? Yeah, do you want to go first or do you want me to go first? You go first. Well, it's a great question, and obviously, speaking to a mostly academic audience, I'm going to say yes, but with a couple of caveats. One, there's an interesting idea from Senator Kirsten Gillibrand and Senator Ben Sasse that I'm very intrigued with, and I want to see it fleshed out a little more, about creating, in a sense, the equivalent of a cybersecurity academy: maybe not guaranteeing military service, but recognizing that this is going to be an ongoing threat, and asking how we train people and how we move people in and out of government around cybersecurity.

This also raises a nerdy issue: security clearance reform, so you can get people in and out of government on an easier basis. On the issue of AI and the others, I absolutely think the academic community is critically important. I do think, though, it needs to be married in some format so that it gets the recognition it deserves. I am sure that at Berkeley, and at probably 50 other places around the country, there are academics and working groups looking at AI and its implications for ethics and policy. But how that information filters to policymakers, and how it's done in collaboration with the very investors who are making these decisions, the private capital, and some folks in the government, that's where I think we could still make some improvement. But the basic premise is absolutely dead on. Well, I have to agree with my friend, the distinguished senator from Virginia. I think what we need to develop is a better bridge between the academy and the policy-making world, and the political world, quite frankly.

If we can do that, one of the advantages to the political and policy-making world is to have access to those in the academy who are thinking not just about today's technology, but who can see around the corner, see what's in development, what's the next thing, so that as a country we can become more nimble, agile, and proactive. Perhaps, for example, on AI, where we know there are technical, legal, ethical, and moral questions, the idea would be to form an independent commission with one goal: in four months, give us your best recommendations on how we handle AI. I think academicians would leap at that opportunity. Let me simply add to the Secretary's comments that we did have Eric Schmidt and Bob Work produce a pretty good paper on AI that was somewhat multidisciplinary. But it needs to be ongoing, and I'll just say this so it doesn't sound like I'm playing to the audience here.

I would challenge the academic world. I see this in my own state with our institutions: the sausage making has gotten so ugly, and we've had the epitome of anti-science, anti-academic leadership under the former president.

I think a lot of academia has basically said, we're not going to mess with policymakers and politics. I think that is a horribly wrong decision. We need both the intellectual rigor our debates require and the ideas; as Janet said, the ability of people who can see around the corner a little bit. We need you more than ever, and it's going to be messy. Yeah, and one of the things Eric Schmidt and I have spoken about is the need to have a technologically sophisticated workforce in the government. That's one of the things we've been trying to think through at Berkeley: how do we support the development of that pipeline? And I mean in the government broadly, both in the interagency, in the federal executive agencies, and also on the staffs of the Senate and the House, because as you know, so much of the prep work is done by staff.

Yeah, and that's again the nitty-gritty issue I've been working on a lot: we even made some progress under Trump on getting security clearance reform done, so that people can move from academia into the government and, for that matter, back into the private sector. I see on a regular basis good staff, and I'm sitting with one right here, who get stolen away by industry or by academia, because the sausage-making process, at least recently, has been pretty damn messy. Yeah. You've got to make it easier to come back and forth.

For sure. Let me pick up, in response, on this metaphor of bridge-building, which you both deployed to describe the relationship between academia and government. I would be really interested to hear you reflect upon whether academics might be more proactive in engaging with Congress, as distinct from the executive agencies. Often, officials in the executive agencies are really fixated on operational, tactical, day-to-day problems.

Is Congress a venue in which academics might be more constructively engaged with longer-term, more strategic-level challenges? I'll start again. The short answer, again, is yes, and I think about the Intelligence Committee, where we have technical advisory groups in which we place academics. I've been chair for only about 10 or 11 months, but we need to use those more often.

There's a little bit of a challenge, when control of Congress swings radically from one party to the other and back again, in maintaining that continuity. Because you're building these relationships, particularly with academics, into the congressional sausage-making process, and I don't think any academic is going to come in and feel fully utilized on a short-term basis. There has to be some level of trusted relationship.

It goes again to what Secretary Napolitano said: we've got to do this not just at the member level; you've got to have trusted staff, people who can continue to build those relationships. I think that's an area we have underutilized. For that matter, we've underused resources outside of academia too. As Janet notes, there's a whole other industry in Washington of very smart people in the think tanks, and at least on the Democratic side, our ability to use even the brains that are 15 minutes down the road has been pretty poor over the last few years. I think, Daniel, one of the challenges is how Congress has evolved in its sausage making.

It's a big place. You've got 535 members, you've got all the staff, etc. The challenge is knowing how to plug in and where to plug in, working out some process by which Congress knows where to go and, in reverse, those in academia know where to go. An audience member asks: artificial intelligence is a fast-emerging technology that straddles industries from national defense to health care. When will Congress propose definitive federal legislation to create a regulatory body, or even a new department, to manage this new technology? It's a big question, but the topic is so important that it seems worth pondering.

My read is that there will need to be some crisis regarding AI before Congress actually acts. But I do think there will be a need for some regulatory approach to AI, and I think we would actually benefit if we had it, and if it were beginning now. But Congress moves when there's a crisis.

I would agree, and I think that's where we started the conversation. Trying to put guardrails in place only after the "AI industry is fully stood up" would be a mistake. How can we get ahead of it, or at least how can we stand up some regulatory entity that will at least be looking at it? But we don't even have a good definition at this point. As somebody who has spent a bunch of time on this, I can sort through only a little bit of the differences between big data, machine learning, and AI, where one begins and the other ends.

The [inaudible] book about the China-US competition around AI is fascinating. But I'm not sure most policymakers can make those distinctions. We need academia, combined with the emerging AI industry, to help us at least get the definitions right, so we can figure out what that regulatory, or at least advisory, group ought to be. Although I recommend that we grab hold of AI now, there are risks involved: incorrect technological assumptions, too much technological specificity, omissions, unintended consequences.

All of those go into developing a regulatory approach to a new technology, and we have to recognize those risks, but I think the benefit of grabbing hold now nonetheless outweighs them. Let me improvise a follow-up to that audience question, which is to ask you to ponder whether some degree of international cooperation is going to be necessary to deal with threats emanating from this really broad world of artificial intelligence. If we think specifically about, say, the battlefield use of artificial intelligence, there might be precedents in the Geneva Conventions, or other international treaty rules, for limiting the deployment of AI on the battlefield. But achieving such progress really requires international cooperation.

Do you think there's a realistic prospect of governments coming together to tame the disruptive effects of AI as a military technology? Yeah, that's a big question. Look at it historically, as somebody who's advocated that we ought to have some international norms around cybersecurity: you might, for instance, have a lower attribution requirement for striking back against someone who is raiding a hospital system or launching ransomware.

But my understanding is that when the rest of the world, including the Russians and the Chinese and others, were talking at the UN in the late '90s about international standards around cybersecurity, it was actually America that was reluctant, because we felt we were so far ahead that we didn't want to enter into any kind of international normative regime. That's number 1. Number 2,

I do think we need this internationally, and I keep coming back to the CCP. The Chinese Communist Party has access not only to all the government data, but to the Baidu data, the WeChat data, the Alibaba data. They have already created an Orwellian surveillance state with social credit scores. If that model of AI becomes the dominant one, I think we should all be extraordinarily concerned.

Again, this goes back to the point I've tried to make. I think the administration gets it, but they need to put some real emphasis behind it: there needs to be a coalition of the willing. It ought not be just the Five Eyes or NATO; it ought to include South Korea and Japan and Australia and Taiwan, Singapore, India, Israel. There ought to be this coalition of the willing, particularly around AI.

I totally agree, and I think that we should work to establish that kind of coalition of the willing regardless of the CCP. If we wait to see whether the CCP will come to the table, it won't happen. But there are many nations around the world that I think would join a coalition of the willing, and amassing that critical mass would be a good thing. Let me just add one thing there, and I think this will be part of my responsibility and that of others in government.

Just because we may have a short-term advantage in some subsets of AI doesn't mean we should walk away from that international order, because the value of that collaboration in the long run outweighs whatever short-term lead we have now. I'd argue the same thing about the race on quantum: whoever gets to quantum first can break through all defenses in the cyber world. I think we need to think about this in concert with others.

The history of nuclear weapons technology provides powerful corroboration for that point. But let me segue to an audience question about China's authoritarian abuse of artificial intelligence technologies. An audience member is curious to know what lessons we can learn from China's domestic application of AI and machine learning to abuse human rights in Xinjiang, where extensive and highly advanced surveillance systems monitor and automate aspects of forced labor camps.

As we think about the problem of how to regulate AI and machine learning as we deploy them domestically in the United States, I think what the question is asking us to contemplate is: should we see China as a cautionary example, and if so, what do we do about it? Well, using China as a cautionary example establishes, I think, the need for the United States and our coalition of the willing to take on these AI-related issues. Because I think the countries that would be in such a coalition don't want to see that Chinese model, the CCP model, as the template. I would simply add on to that.

Beyond Xinjiang, think about what's happened to the people of Hong Kong, where by one estimate 80 percent of the people took part in some level of protest. You've heard barely a peep since, and businesses that I know in Hong Kong feel like they now have to play by exactly the same rules as if they were in Shanghai or Beijing, because of this massive ability to use not only governmental data but everything else. The fact is that when the CCP changed the laws in China and explicitly said, in 2016, that every company's first obligation is not to its shareholders but to the CCP, that means all that data goes in.

It all goes back again to this coalition of the willing, in which we need to act together. Agreed. We've focused mostly in this conversation on the security threats resulting from emerging technologies.

But I wonder if we might take the last five minutes to reflect upon the disruptive effects for society at large, as several audience questions are inviting us to do. One question an audience member has posed is: how do we begin to grapple with the consequences of artificial intelligence for distributive equity in society? AI technologies are poised to create new forms of inequality, perhaps greater than any we've had to grapple with in the past. Is dealing with the social and economic consequences of technological innovation within the purview of lawmakers, or is this a problem that is simply too big to address? I'll start on that one. Let me analogize. Secretary Napolitano tried to do this as well when she was governor, and she may have been luckier than me, but I always think back 25 years to the promises of an interconnected world. The promise of an interconnected world was that you could build it anywhere. That should have been potentially hugely empowering for rural America.

We ended up showing we could build it in Beijing, and Shanghai, and Mumbai, but we didn't do a very good job of making sure rural Virginia, and smaller communities, and probably some that Janet has in Arizona, got their share. I think we do have to be aware of the socioeconomic impact, and I think we have to get educated. One of the things we both talked about at the beginning of our conversation was algorithmic biases that we may not even be conscious of. I do think that Congress has to sort that through.

How do we get ahead of that? I don't know. One of the things I've been working on a lot in this post-COVID era is racial wealth gap issues and access to capital. That doesn't solve every systemic racism problem, but one of the things that's come out of COVID is that we're seeing entrepreneurship among people of color in this country at an unprecedented level. A progressive government, one that I would argue wants the best American capitalism possible, ought to be encouraging that, again through access to capital. But that's slightly not answering your question, Daniel, because I think it's a smart one, and I don't know the answer.

Luckily, Secretary Napolitano has a great, concise answer that's going to lay out how we figure all that out on a social equity basis. Well, it is a big question. The subtext of the question is: is this a concern that Congress ought to take into account

2021-12-09 09:07
