TecHype Special: Robert Reich Interview / Deep Dive (full episode)
[BGM plays] (Brandie Nonnecke) Welcome to TecHype! I'm your host, Brandie Nonnecke. Today we're digging deep into an issue that affects us all: the role of emerging technologies, especially artificial intelligence, in reshaping the very foundation of our economy, our society, and even our democracy.
Whether you're feeling excitement or a twinge of anxiety, one thing is true: AI and emerging technologies are increasingly going to influence the way you live, work, and connect with others, from the way you receive information online to the number of jobs that will be created and lost.
AI is going to be integrated directly into your life. But how do we make sure these advancements actually benefit everyone? How do we ensure they do not destabilize, but rather reinforce, our democratic values? These are not questions only for the elite within tech circles; they affect you. We're honored to feature two guests today who have been on the front lines of these issues: Robert Reich, former U.S. Secretary of Labor, and Janet Napolitano, former Secretary of Homeland Security. As the nation prepares for a pivotal election year, America finds itself at a crossroads that is at once familiar and newly daunting. For decades, we've grappled with the convergence of economic power and democratic ideals, a central theme in the work of Reich, a prominent advocate for economic equity. Reich warns that the 2024 election is not merely about party preferences or ideological divides. Rather, it's about the very future of American democracy. He argues that with income inequality reaching unprecedented heights and corporations wielding immense influence, the institutions meant to represent the will of the people are being co-opted to serve the wealthy elite instead.
Reich sees this as a moment of reckoning: can democracy survive in a society where political and economic power rests increasingly in the hands of a select few? To grasp the gravity of this situation, Reich encourages a deep dive into the root causes of economic inequality. Let's start with the big picture. Right now, we're in an era of exponential technological advancement. Tools like artificial intelligence, deep learning, and the algorithms running on social media platforms are reshaping our information landscape. But this powerful technology has a dark side.
It can be harnessed to manipulate, distort and control the flow of information in ways we've never seen before. And that manipulation isn't just an inconvenience. It's a direct threat to democracy. Why?
Because for democracy to work, we need an informed public: citizens who have access to reliable information and can make educated decisions based on what's true. When people are bombarded with disinformation, content that's not just misleading but intentionally deceptive, it erodes our ability to separate fact from fiction. Consider, for example, how AI-generated content can create hyper-realistic deepfakes: videos or audio clips where someone appears to say or do something they never did.
These deepfakes make it harder to trust even video evidence, traditionally a cornerstone of reliable news. When trust in these foundational sources breaks down, so does trust in democratic processes and public institutions. That's why understanding these threats isn't optional; it's essential if we want to safeguard democracy. With rapid advances in AI, emerging technologies, and social media platforms, there is significant concern that these technologies could destabilize our democracy. What do you think are the most pressing challenges that emerging technologies pose to the integrity of our democratic institutions? Well, the biggest problem right now is the weaponization of disinformation, and that would be a huge problem anyway.
But if you have social media and technology providing all sorts of ways of fooling people into thinking that lies are the truth, and weaponizing those lies, then it's very, very difficult for the public to do anything about it. I mean, a democracy depends on an informed public. Our only two responses are, number one, making sure that every social media outlet and every technology that's being used has in it some corrective feature, some moderation feature, that makes sure the lies are not there or are somehow screened out.
And secondly, that people are trained in critical thinking so that they're not vulnerable to these sorts of lies.
The Liar's Dividend and the challenge of truth.
Here's another challenge we're up against.
The concept of the liar's dividend. Now, this isn't just a catchy phrase. It's a powerful force in today's information landscape.
The liar's dividend is what happens when people use AI-driven technology to blur the lines between reality and fiction so effectively that eventually nothing feels certain. We reach a point where truth and lies feel equally plausible, and suddenly people start to doubt everything. This is where phrases like "fake news" come into play, making it easy for people in power to dismiss uncomfortable truths as simply fabricated.
If we think about Orwell's 1984, this isn't too far off from the idea of doublethink, where citizens are expected to accept contradictory realities, bending to whatever the prevailing authority tells them. Now imagine that kind of influence in today's world. Leaders in both politics and business can muddy the waters so much that people begin to doubt even legitimate news sources.
This constant questioning and undermining of truth leaves the public disillusioned and cynical, ultimately weakening our collective understanding and ability to make informed decisions. It's a tactic that authoritarian regimes have long used to control their citizens. Only now, with advanced technology, it's happening on a much larger and more subtle scale.
Exactly. But it's not only presenting a lie as truth; it can also be presenting truth as a lie. Exactly right. And then we get to this problem of the liar's dividend, where there's no discernible truth. Everything is a lie.
And that's exactly what Orwell was warning us about in his novel 1984, which really was about the end of democracy, about authoritarianism. How do authoritarians persuade the public? Well, they take lies and turn them into truth. They take truth and turn it into lies. And here we are, 40 years later.
We're seeing it happen right before us. Well, it's fairly frightening, because I've been involved in education and public education for the last 45 years, and I've never seen anything like this in terms of a threat to democracy and a threat to public education, broadly conceived. So what's being done to counter this trend? Fortunately, some policymakers are starting to take action. Here in California, Governor Gavin Newsom recently signed a transparency act that addresses one of the most immediate concerns: distinguishing AI-generated content from authentic content.
According to the law, developers of AI systems now have to include clear watermarks on any content that's generated or altered by AI. It's a small but crucial step toward letting people know when they're looking at something that may have been manipulated by technology rather than by a human hand. The watermark policy might sound simple, but it's a major step forward in digital transparency. If people know when they're viewing AI-altered content, they're more likely to approach it critically.
They can start asking questions. Who created this? Why was it altered? Is the intention here to inform or to mislead? But California's law is just one piece of the puzzle. Ultimately, we need a coordinated national and even international response to tackle the manipulation of information. Other states and countries will have to follow California's lead to ensure the public can navigate this new digital landscape with some degree of confidence. So what do you think we can do now in the state of California? Governor Newsom recently signed a bill into law, the Transparency Act, which will require developers of AI systems that can either generate or modify content to include essentially a type of watermark that cannot be removed.
In doing so, the hope is that this will raise people's awareness that the content has been manipulated by AI. Do you think interventions like that will be effective at, you know, helping the public better distinguish truth from fiction? I think it could be helpful. Anything we do that signals to the public what the truth is, and that the truth might be manipulated, is an advance. It helps critical thinking. I mean, if people even want to be critically thoughtful, they do need markers.
The role of public spaces and third spaces.
One of the deeper issues here is how isolated we've become in the digital age. In the past, communities had what sociologists call third spaces, places outside of home and work where people from all walks of life could meet, talk and exchange ideas. Think of libraries, cafes, public parks, places that encourage a kind of social mixing and mutual understanding.
But with social media, we're increasingly isolated in echo chambers, online spaces that show us only what we want to see, usually opinions that match our own. Algorithms fuel this isolation by feeding us more content that we already agree with, making it harder to encounter differing viewpoints. This online isolation has real consequences for democracy. It means we're less likely to hear opposing perspectives or engage in constructive debate.
Imagine if, instead of being segmented by algorithms, we had online third spaces where people could engage in balanced discussions across a wide range of views. It could be transformative. Physical third spaces are still important, too. They provide environments where diverse groups can gather and discuss issues face to face, helping to rebuild the social fabric that's crucial for a healthy democracy.
One thing that you talk a lot about in your work is economic stability and social justice. I've been thinking a lot about the importance of third spaces. Right. One of the reasons why platforms are able to influence people is that Americans live very different lives, and we're no longer intermingling with each other. Our children go to vastly different schools.
They participate in different recreational activities. Is there any hope in the United States that we'll get back to a point where our democracy can flourish, where we no longer have these factions? Well, I'm very hopeful. And my hope rests upon the notion that we as a society fundamentally depend on people understanding what they owe each other as members of the same society. We talk a lot about rights, but we don't talk enough about duties. And the only way we know and understand our duties and responsibilities to each other is if we intermingle in these public spaces. As you're talking about, technology could be helpful.
I mean, Wikipedia, for example, is a public space. It's a public technology. There's no reason that every technology, every piece of software, and every platform has to be privately owned. Now, we can't ignore the influence of big tech platforms here. Platforms like X (formerly Twitter) and Facebook are major players in shaping public opinion. Some experts argue that these platforms are so central to our democratic discourse that they should be regulated like public utilities.
Think about it. Just as water and electricity are essential for daily life, so is the flow of information. But right now, this flow is controlled by private corporations who answer to shareholders rather than the public good. When billionaires like Elon Musk and Jeff Bezos hold so much influence over these platforms, they wield enormous power over our collective understanding of the world. For a healthy democracy, this level of concentrated control is unsustainable.
That's why we're seeing calls for antitrust laws to break up these tech giants, to ensure that no single entity has a monopoly over our information channels. It's not about restricting technology, but about ensuring that it serves society rather than a small elite. Right? And I think that there are some ideas around decentralized types of platforms, where they're not controlled solely by one company but rather by different groups. Now, you talked about our duties, and one of our duties in the United States is to vote. And we're on the eve of a very, very important presidential election. Yet we see the role of these emerging technologies in influencing voters and mobilizing them.
How do we best ensure that the tech behemoths don't have an undue influence on our voters and, in part, on our democracy? They are already having an undue influence. I think one of the people who is distorting our democracy more than any other person is Elon Musk. And he's going to be called out. And I don't care if I'm the only person calling him out. But he is doing things that are directly undermining our democracy because of his ownership of X and his willingness, and actually his encouragement, of disinformation on that platform. Right.
I mean, there's definitely, I think, an economic incentive. We have seen a lot of research showing that these hyperbolic disinformation stories tend to go viral and get people clicking and looking. I want to also point out that when Elon Musk took over X, he bought up all the shares, so he's the sole owner. How do we balance, though, the fact that he is the owner of a private company with the influence he has on our election, right?
He does have First Amendment rights. Well, he may have First Amendment rights, but X, or any platform that has that much influence, should be treated as a public utility. Okay. Well, even beyond that, you know, I find it anathema to democracy that somebody like Jeff Bezos owns the Washington Post and prevents The Washington Post from endorsing a candidate. I mean, that is an abuse of wealth and power in this country, and we cannot allow that.
So do you think that antitrust is the main way forward, where we break down some of these big companies, or the lateral ownership across different industries, where we're seeing these tech giants taking over large media platforms? Yes. I think antitrust is a major and very important initiative with regard to dealing with these giant tech behemoths, as you call them. But also, we must prevent any single wealthy person from controlling a very important vehicle through which the public understands what's happening. We can't have a democracy with that kind of centralization. As the great jurist Louis Brandeis said in the 1920s, we can either have great wealth in the hands of a few people, or we can have a democracy, but we can't have both. It's very true.
And are we not currently in that situation? We have these tech giants owning essentially the public sphere and shaping it. And I think a very aggressive response to that is needed to protect our democracy, not just our economy; antitrust is not just about economics, it's also about protecting democracy. And that aggressive antitrust approach is to be applauded. I think what the Biden administration has been doing is good, but we need more of it. Right. Lina Khan at the Federal Trade Commission has been leading a lot of this antitrust work, which has been quite impactful.
She's very good. Jonathan Kanter at the Antitrust Division of the Justice Department is also extremely good. They need more resources, and they also need an administration that does not call into question whether they're going to be reappointed. I mean, they need more staff.
Antitrust has got to be a response to the structural problem we're having right now: an economy that is not supporting democracy. And we are on the eve of an election.
And so I will pose the question: if we have a Harris-Walz administration, do you think that these issues will be taken up and those resources will be allocated appropriately so that we can deal with this issue? I think it's much more likely under a Harris-Walz administration than under a Trump administration. But even a Harris-Walz administration is going to have to be pushed, because never underestimate the political power of big aggregations of wealth. We have come to the point, partly because of the Supreme Court and the Citizens United decision, where big money is polluting our democracy.
Right. Companies have free speech, First Amendment rights, according to Citizens United. Yes. And then according to the Supreme Court, money is speech. Well, I'm sorry, money is not speech. Corporations are not people. And we need a Supreme Court that recognizes that.
Policies to support fair access and education.
Let's talk about equity, because even the best policies and the most innovative technologies are useless if they're not accessible to everyone. A big part of this is digital literacy: making sure people have the skills to navigate the digital landscape and identify misinformation. Education, therefore, has to be part of the solution.
Imagine if every student in America had access to high-quality, free education that included digital literacy. They'd be far better equipped to critically analyze the content they encounter online. Then there's the question of access. Not everyone has the same access to the internet or digital tools.
And this digital divide often falls along economic and racial lines. Universal internet access and universal basic income could help ensure that no one is left behind. When people have the resources they need, they're in a better position to engage in democratic processes, making these policies not just economic but democratic imperatives. I want to talk about what we do in our society to make sure that emerging technologies can lift everyone. What do you think are the key policy strategies that need to be implemented to ensure that everyone, from all ranks in our country, is able to harness emerging technologies in a way that benefits them? Three things.
Number one, you need a universal basic income, so everybody knows that they're not going to be stranded and their family is not going to be basically pushed off a cliff in this economy. Secondly, you need good education available to everyone, all the way through college, and public higher education that is free.
So everybody has an opportunity, and all kids have an opportunity, to make the most of their God-given talents. And thirdly, we need internet access for everyone that is, if not free, certainly practically free, because it is a vitally important piece of being a citizen and also of social learning. Yeah, I'm glad that you brought up digital inclusion and closing the digital divide. My PhD is in telecommunications, and I worked a lot on universal access and service policies.
And it's been a very significant challenge in the United States to close that gap. Do you think that the United States will be able to actually close the gap? And I will also say it's not just in rural areas. It's also in cities where we see people who can't afford to have access.
We are able. The question is, are we willing? We are the richest country in the history of the world. We have the capacity to give the best education, including internet access, to every child, to every family. If we don't do that, it's because we don't have the political will to do it. And if we don't have the political will to do it, it's because there's too much big money involved in our politics that stops us from doing it.
Yeah. And I will say, if we don't do it, I think that we will get knocked down from that top position to being lower on the global stage. We need to invest in education. I think so too, because I think that many of our young people have extraordinary capacities and abilities if they're given the chance. Yeah, I agree with you. Thank you so much for joining me today to talk about emerging technologies and AI and economic and social stability.
Thank you so much for having me. My pleasure. So where do we go from here? We have the resources to build a society where everyone has access to reliable information, digital skills, and a voice in the democratic process. What we lack right now is the political will to make that happen. The 2024 election could be pivotal in shaping the future of tech regulation, equity, and democracy.
As citizens, we have a responsibility to make our voices heard on these issues. Ultimately, democracy is about ensuring that everyone has a fair shot at making informed choices. If we're serious about protecting democracy, we need to make sure that technology uplifts society rather than dividing it. We need to create policies that protect public trust and encourage inclusive participation, rather than allowing unchecked tech giants to monopolize our information. The future of democracy depends on the actions we take today to hold technology to a higher standard. We're at a critical moment, one that could define how technology shapes our society for generations to come.
It's not just about AI, social media, or any one tool. It's about the kind of world we want to live in and the democracy we want to uphold. By taking informed action now, we can shape a future where technology serves all of us fairly. Thank you for joining us on this episode of TecHype, where we got to sit down with Robert Reich and Janet Napolitano. We uncovered ways emerging technologies are influencing our society, our economy, and even our democracy.
Want to learn more about other emerging technologies and the laws and policies that shape them? Check out our other episodes at TecHype.org.