Are We All “Zucked”? Technology and Power in the Internet Age

Thank you for joining us. My name is Scott Lowenstein. I'm a second-year MBA and a student leader for the Corporations and Society Initiative, or CASI, which is one of the co-sponsors of the event today, along with the Center for Internet and Society and CS+Social Good. CASI is about bringing together experts with business students and policymakers to ask questions and bring a broader perspective to the role of corporations and capitalism in society. Through research, education, and advocacy, CASI is trying to redefine, or at least question, some of the assumptions that form the core of the role of private and public institutions in our lives, and today we're talking about tech. In many ways the current crop of business school students kind of grew up with platform media. I remember my sister showing me "the Facebook" when it was a dot-edu kind of place, and I also remember 2011, when Facebook was saving democracy and civil society around the world. Now we're at an event called Zucked, so the times have changed. But it's this kind of evolving Facebook conversation that's a perfect fit for what CASI is trying to do, asking questions like: what does Facebook owe users, shareholders, and society at large? Are internet addiction and fake news the inevitable output of a shareholder-value-maximization model in the internet age? And how do we tame, or tamp down, what Tristan Harris calls runaway attention-maximizing technology? It's these questions, and many more, that Roger McNamee addresses in his book Zucked: Waking Up to the Facebook Catastrophe. Roger has 35 years of investing experience in Silicon Valley. He was an early advisor to Mark Zuckerberg. His most recent fund, Elevation, was co-founded with U2's Bono, and when he's not on book tour he's rocking out with his band Moonalice. I can't wait to hear about Roger's personal transformation from Facebook champion to whistleblower of sorts about the dangers of the platform. Moderating that conversation is Ann Grimes. Ann is an award-winning journalist who held senior editorial positions at The Washington Post and The Wall Street Journal. She ran Stanford's graduate journalism program and is now co-director of the Brown Institute for Media Innovation in the engineering department, and she has taught all over the school, in CS and engineering, at the d.school, and here at the business school in a class on media entrepreneurship, which is where I first heard Roger talk about some of these ideas last year. Following their conversation, we'll have some questions from students in the audience. But now, please join me in thanking and welcoming Ann and Roger.

Hi. Welcome. It's great to see such a big crowd here at your lunch hour, so thanks, everybody, for showing up, and thank you, Roger. My pleasure. Let me start off by saying that Roger and I have known each other for a while now, since my days at The Wall Street Journal, when I was covering venture capital and I thought of him as just another venture capitalist. But look at you now. You've written a really thoughtful, readable, well-scripted book. It's a best-seller. It's a great read. You're killing it; congratulations. And you're scaring me, not just because you're sort of moving in on my writing turf here,

but because what you're telling us all is that it looks like we are all really Zucked. So before we start, how many folks have read Roger's book, or have bought Roger's book? Thank you. Thank you. All right, there are books on sale out in the vestibule, so don't be shy with your wallets. Let me say a little bit about the book before we start our conversation. The tone of the book is really interesting. It's a manifesto of sorts. It's a call to action, and the tone is very thoughtful. In the book, Roger starts by saying that he is an investor, that he was an early investor in Facebook and still has his investment in Facebook, but that he came to feel his trust in Facebook was misplaced, and this is largely a story about trust. You say that even though Facebook provides a compelling experience for most users, it is terrible for America; it needs to be changed. You describe Facebook as a threat. Technology like Facebook's, you say, allows it to manipulate our attention. The design of Facebook allows bad actors to exploit it and harm innocent people. Democracy has been undermined because of design choices and business decisions by internet platforms, not just Facebook but many others, that deny responsibility for the consequences of their actions. The culture of these companies causes employees to be indifferent to the negative side effects of their success. You say these platforms have taken advantage of our trust, and you say you have had a conversion experience. This sounds like tough love to me, and there's a lot to unpack there. But before we do that, let's start a little bit with the back story. Tell us a bit more about your background and how you got involved with Facebook in the first place.

How many of you were at the GSB last year to see this? Okay. So here's the thing. They don't tell you this in business school, and they certainly didn't tell me: what you really need to have a successful career is just incredible good luck.

I started my career on the first day of the bull market of 1982. They assigned me to technology. I mean, I wanted to do tech, but they let me do tech, and this was in the days when the space shuttle was the biggest thing going on in Silicon Valley and the tech industry. So, man, I had, for the entirety of my career, a gale-force tailwind. I literally couldn't miss. And you know, they tell you in modern portfolio theory that, on average, active managers can't beat the market. What they don't tell you is that there are 200 or 300 million participants in the market. So statistically there are going to be tens of thousands of people who outperform the market by two standard deviations for their entire career on the basis of pure dumb luck, and I want you to understand: I'm one of them. Okay. So when I met Mark Zuckerberg in 2006, he had just turned 22 and I had just turned 50. I had been investing for half of my life. I got asked by one of Mark's colleagues whether I would take a meeting with his boss, because there was a crisis and he needed to talk to somebody on the outside. No other information provided. The thing to understand is that in 2006 there were a few things going on in Silicon Valley that were really worth paying attention to. The PayPal mafia, Peter Thiel and Elon Musk and Reid Hoffman and that gang, had two unbelievably powerful insights. The first was that the internet was shifting from a web of pages to a web of people. The second was that the limitations on technology that had constrained everybody around here for the prior 50 years, that made us focus on the narrowest, most important functionality people needed at any point in time, were going away, and for the first time you were going to be able to create global products. You were going to be able to create services like Amazon Web Services that would let entrepreneurs rent the hundred or two hundred million dollars' worth of infrastructure that previously you had to build just to launch your product, which meant the cost of a startup was going to go from a hundred million to ten million. It meant that instead of needing 45-to-50-year-old entrepreneurs, you could do it with 20-year-olds.
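An aside on the "pure dumb luck" arithmetic at the start of that answer. The back-of-the-envelope sketch below is an editorial illustration, not Roger's actual math: it assumes each market participant is a coin-flipper with a 50/50 chance of beating the market in any given year, that years are independent, that a "lucky career" means beating the market in at least 25 of 30 years, and that there are roughly 300 million participants. Under those assumed numbers, chance alone produces on the order of tens of thousands of career-long outperformers, which is the point being made.

```python
# Back-of-the-envelope check of the "pure dumb luck" claim above; the numbers
# and assumptions are illustrative, not the speaker's actual calculation.
# Assumptions: a 50/50 chance of beating the market each year, independent years,
# a "lucky career" = beating the market in at least 25 of 30 years,
# and roughly 300 million market participants.
from math import comb

participants = 300_000_000
years = 30
wins_needed = 25

# P(at least `wins_needed` winning years out of `years`) for a pure coin-flipper
prob_lucky_career = sum(comb(years, k) for k in range(wins_needed, years + 1)) / 2**years

expected_lucky = participants * prob_lucky_career
print(f"P(lucky career) = {prob_lucky_career:.2e}")            # ~1.62e-04
print(f"Expected lucky outperformers: {expected_lucky:,.0f}")  # ~48,700, i.e. tens of thousands
```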

So when Mark started Facebook, he literally started it at the perfect moment in time, and I was absolutely convinced, even in 2006, when it was just available to high school and college students with a school address, when it was just a picture, an address, and a relationship status, that he was going to be bigger than Google. And I was convinced of that because I had watched MySpace, Friendster, and America Online all fail in their social efforts, because, in my opinion, anonymity allowed trolls and bad actors to overwhelm those platforms. So I was really excited to meet Mark. The first day we met, he comes into my office, and you know, it's six months after the end of the timeline of The Social Network movie, and he looks just like Mark Zuckerberg. He's got the sandals, the skinny jeans, the t-shirt, the hoodie, the courier bag. He sits down, I sit down, I'm closer to him than I am to Ann right now, and we're in these comfy chairs in this conference room at Elevation. And I said, "Mark, before you say a word, I've got to give you some context for why I took this meeting. Look, if it hasn't already happened, either Google or Microsoft or Yahoo is going to offer a billion dollars for your company, and everybody you know, your parents, your board of directors, your management team, employees, friends, everybody is going to say, 'Mark, you've got no revenue; take the money. It's a billion dollars. You'll have six hundred, six hundred fifty million bucks; you'll be rich. Take the money. Your venture capitalist, Jim Breyer, will back your next company.' Mark, you need to know there are two things I know for sure after 24 years of investing. One is that no entrepreneur in the history of Silicon Valley ever had the perfect idea at the perfect time twice. Nobody. Not Steve Jobs, not anybody. And I'm convinced you've got the perfect idea at the perfect time; you're going to be bigger than Google. Secondly, I don't care who buys this company, they're going to kill it. If you believe in this idea, the only way to make it happen is for you to see it through."

What followed was the most painful five minutes of my entire life. Mark didn't say a word. He was thinking really hard: did he trust me, did he not trust me? In the end he relaxes and he goes, "You won't believe this, but that's the reason I'm here. That story you just told, that just happened, exactly as you said. How did you know?" And I go, "I didn't know. I just know that the people around you have been around here a long time, and this is how things work around here. What do you want to do?" And he said, "Well, what am I going to do? Everybody wants to sell the business but me, and I can't run it by myself." And I said, "Hang on, dude, you've got plenty of money, you've got a great idea, and it's really working. They don't get to tell you the music's over. Let me explain to you how to walk them back off the cliff." And so he went back, and the whole meeting took less than half an hour. He went back and he told everybody, and I think in retrospect they're all pretty happy that he talked them out of selling the company.

So he had this problem, right, that everybody wanted to sell the company, so it was helpful to have a new advisor who believed in his vision. For a period of three years I got to be one of many advisors that he had, and the principal thing I did was help to bring Sheryl in. I'd known Sheryl because she had introduced me to Bono, and I had helped to introduce her to Google. So you can imagine, I was just so proud of Facebook. I thought Mark was different. The culture of the Valley at the time was replacing the hippie libertarianism of Steve Jobs, that notion of empowering people, with a different kind of libertarianism that was really more about building monopolies, about disrupting and dominating, where you didn't really worry about what the rules were, you just went and did things. You didn't ask permission, you begged forgiveness. That culture was very uncomfortable for me, and in the early days I think Mark had a different view, or at least that's what I thought he had. And so I was blissfully a fan. I mean, just a huge fan. And I got to the end of my career and had retired before I realized there was something wrong. That's not to say there weren't signals; there were signals, and as an analyst I should have put them together better than I did. But I didn't, and I feel bad about that.

So how did you move from an advisor to a skeptic? Were there key turning points? To be clear, there was a middle period along the way where I was just a fan, where I was not regularly interacting with either Mark or Sheryl.

I stopped being an insider in 2009, because the things I was really good at he didn't need anymore. So I'm just a cheerleader. And then I retire in December of 2015, and in January I start to see things that don't fit my preconceived notion of this company I loved and these people I loved. First, in the Democratic primary in New Hampshire, I start to see memes coming out of Facebook groups that were notionally associated with the Bernie Sanders campaign, but they were spreading virally among my friend group in a way that suggested somebody was spending money to get people into a group whose only purpose was to spread misogynistic memes. That struck me as really weird. Then, two months later, Facebook expelled a company that was using the ad tools to gather data on people who expressed an interest in Black Lives Matter and was selling that data to police departments. Now, Facebook did the right thing, right? They expelled them. The problem was the damage had been done; police departments were harassing those people. Then, in June of 2016, the United Kingdom votes on the Brexit referendum to leave the European Union, and as you may know the outcome was a huge shock, an eight-point swing during the day of the referendum. And it occurred to me: wow, what if there is something about the Facebook tools that allow ideas to spread so rapidly that gives an unfair advantage to really intense, nasty messages over neutral ones? Because the Leave campaign had this intense, xenophobic message, and the Remain campaign was kind of "hey, stay the course." And I'm sitting there going, wow, if that's true, that's really bad for democracy. But I don't know; I have no data. So I start going to friends to ask whether anybody else is seeing this, and I actually reached out to Facebook too, but I didn't know what I was seeing, so it was really hard to connect with anyone, connect in the sense of conveying what my concerns were. We then find out about Manafort in August. And around Labor Day, Walt Mossberg, from Recode, asks me to write an op-ed, which I start drafting, but I'm not in any huge hurry, because it doesn't even occur to me that the U.S. election could be in danger. Again, a real failure of analysis; I guess I was retired. Anyway, long story short, two things happen. First, the Department of Housing and Urban Development cites Facebook for advertising tools that enable discrimination in housing, the very thing they were sued over a few weeks ago; that was the first salvo of it. And then the intelligence agencies say the Russians are actively interfering in the U.S. election. At that point I freaked out. I realize I'm not going to do an op-ed; I've got to reach out to Mark and Sheryl. They're my friends; I'm a huge believer in this company. And so on the 30th of October, nine days before the election, I send them the draft op-ed and say, "Guys, I think there is something wrong with the business model and the algorithms that's allowing bad actors to harm innocent people, and we've got to get on top of it now." In retrospect, I wish I hadn't sent them the op-ed; I wish I'd rewritten it, because the op-ed was designed to be provocative, and it would have been better if I'd been a little bit more level.
But they got right back to me, and they couldn't have been more friendly. They didn't embrace it like it was a real business issue, though; they treated it like a PR problem, and they handed me off to Dan Rose, who was a really good friend of mine, so I was actually totally psyched to talk to him. But Dan's job was essentially to contain the PR problem, and he goes, "Roger, we do not have a problem here. Section 230 of the Communications Decency Act says we're a platform, not a media company, so we're not responsible for what third parties post." And I go, "Dan, you've got civil rights problems, and you've got some weird stuff going on in the democracy. You think people are going to let you off the hook for that?" He goes, "Yeah, we don't have a problem." Anyway, the election happens, and at that point I go completely ballistic. I go, "Dan, you have got to get honest. You have got to do what Johnson & Johnson did when some asshole put poison in bottles of Tylenol in Chicago in 1982. The CEO of Johnson & Johnson, the very day the story happened, pulled every bottle of Tylenol off every retail shelf in the United States and Canada, no questions asked, and didn't put them back until they had invented tamper-proof packaging. Dan, you have to leap to the defense of the people who use your product. You cannot have this uncertainty. You are in a trust business."

If you let the trust get broken, you will never earn it back. I spent three months begging him to do this, and he just kept hiding behind Section 230, and eventually I give up. And it's at that point that I faced a moment of truth that maybe one or more of you will face at some point in your life. I saw something really wrong, something with issues I could already see for democracy and issues for civil rights. I was retired; I could have sat back and said this is somebody else's problem. But the other side of the coin was that I had been a small part of this and I had profited from it, and I felt a moral and emotional need to take what I knew, to take my biography, and try to start a conversation about what had happened. That was really the decision.

So as part of your journey from an investor to a skeptic to an activist, it struck me, in reading your story, that you think Facebook is essentially a closed system and that the leaders are not willing to engage in sort of this big-picture thinking. We've seen them go on the sort of apology tour over and over and over again, and I wonder... It's worked really well, right? Move fast, break things, apologize, repeat. Okay. All right. Some might say that. But when did you realize that they were... is this fair? Am I describing this correctly? Do you think theirs is a closed culture? To me the answer is yes, but that's not the operative thing. I mean, Facebook exists in a culture here in the Valley, which in turn exists in the culture of the United States of America, where we have deregulated for 44 years. We have essentially gone from a standard version of capitalism, where the government sets the rules and enforces them across the entire economy so that it's fair for everyone, to a world with essentially very few rules and almost no enforcement, where businesses are encouraged to grab whatever they can grab while they can grab it. And we've gone from the world Peter Drucker described, who was a great management guru of the industrial age back when I was young. He would say there are five stakeholders: shareholders, employees, the communities where employees live, customers, and suppliers, and that it was the duty of management to balance the interests of all five, because that's how you maintain a Zen-like harmony over time. But over the last 40 years we've abandoned four of the five. Essentially all companies in the economy abuse their suppliers, many of them abuse their customers, they couldn't care less about the communities where people live, and many of them have treated their employees badly. And by the way, that has been standard operating procedure. If Rex Tillerson is allowed to conduct a separate foreign policy at Exxon in contravention of our sanctions against Russia, if Wells Fargo Bank can get away with essentially defrauding millions of account holders, if the entire banking industry can do what it did in 2008

and not be punished, it's really hard to expect Silicon Valley to have a higher set of standards than that. So to me that was a big part of the problem. These are not bad people, right? That's the key thing. I don't think the managers are bad; I don't think the people who work there are bad. I think the issue is that the culture is just way off track. We kept doing the same thing over and over again, and we forget that there's a natural pendulum in society that keeps things in balance, and we're just out of balance. And when I look at Facebook, let's face it, that company was so successful for so long, you can't blame them for thinking, "if critic, then wrong." I'm sure they looked at me and said, "Roger, if you're so smart, where are your two and a half billion users?" I get that; that's just how the world was. Okay, the incentives are misaligned, and it's way bigger than Facebook, way bigger than Silicon Valley. And demonstrably, you can see the beginning of a change. We've had at least five teacher labor actions in the last year, all successful, the first successful teacher labor actions, I think, in a generation. The air traffic controllers, with a partial sick-out, ended a government shutdown in two hours. And McDonald's has abandoned the fight against $15. Those are all incredibly important indicators that we may just have reached the furthest limit of this pendulum swing and are going to begin to swing back. So I'm extremely optimistic about that. And again, I look at Facebook and their behavior even now: look, I give Mark a lot of credit, because he's coming out and talking in public now. I think what he's saying is nonsense, but his PR department creates it, and they do what their job is, which is to create nonsense to distract. I get that. I mean, I've been around a long time; this is how it works. I'm not going to hold them to a higher standard than everybody else, but I want to hold everybody to a higher standard than we've held them to for the last few years.

Okay, well, let me just drill down a little bit on this, because in the book, on the character issue and the culture issue, you say Mark Zuckerberg created a culture in which criticism and disagreement apparently had no place. You say you didn't realize that Mark Zuckerberg's ambition had no limit. You say that Facebook's secret sauce is its ability to imitate and improve on the ideas of others and scale them. What I'm saying is that I was a terrible analyst for a number of years, okay? That's really the conclusion; all of that was obvious to anybody covering the company. So can I take a minute? Because I want to talk about what I think the real problem is here. Facebook is by no means the only issue; Facebook is the biggest problem we have for democracy, but if you sit there and think about the clusters of problems, there are public health issues, there are democracy issues, there are privacy issues, and then there's the whole structure of the economy. And I want to dial it back to 2002, and I want to look at the incredible genius of Google and what they did. In 2002 Google behaves just like a classic business: they have one product, a search engine, and they gather data from the people who use it to make that product better. Completely normal, natural thing to do. The business model is based on ad targeting related to purchase intent, right? You want to go on vacation, you look up places to go, you look up airlines, you look up hotels, the whole nine yards. They do the analysis and they discover they only need a few percent of the data in order to make the engine better. So they look at it and go, well, is there any signal in the rest, in the other 97 or 98 percent? And they not only discover a signal, but it's for something much bigger than purchase intent: it's behavioral prediction. The big enchilada. But they look at it and go, "Holy cow, we've got a problem: we don't even know who these people are." So they create Gmail. Now Gmail gives them my identity to attach to purchase intent, and that makes the ads wildly more valuable.

Right? It makes the targeting better, so the consumer has a better experience. So far everybody's fine. But then, and this is where the culture of the Valley, the culture of America, takes over, they realize they can do way better. If you're going to be in the behavioral prediction market, what is the single best source of intelligence for behavioral prediction? It's people's emails. You tell people what you're going to do. So they've got to find an excuse for reading the emails. See the business model: we'll give it away for free, we'll put ads in it, and we'll tell people we have to scan the emails in order to target the ads. People are going to hate the ads, but we'll at least get to scan the emails. So they gave us the ads, we got pissed, we forced them to get rid of the ads, but they continued to scan the emails. You say to yourself, well, is that okay? Remember, they've told us they're not a media company; they're a platform, which means they're a common carrier. If the Postal Service reads your mail, they go to jail. If Federal Express reads your documents, they go to jail. Common carriers are not allowed to read the content of the stuff going through. And remember what consent is: consent in common law requires both sides to have the same understanding. Demonstrably not true here. That, however, is a genius move if you want to get into behavioral prediction, and there are no rules, right? Nobody's got a problem, so they move on. They go to Maps, to find out where everybody is. So with Maps they gather all this data and create this incredibly convenient set of services, and the next thing you know they can predict what you're going to do. But their problem here is that they need to understand the routes and everything that's going on, so they run some behavioral manipulation experiments. This route over here: not enough people take it, and we don't know how long it takes, so we're going to give it to you and tell you that's the suggested route this morning, and we're going to see if we can get you to take it. And of course we did, right? We've all had that experience; you get it in Waze now all the time. The thing says it's only going to take 23 minutes; 37 minutes later you're pulling up where you're going. So now they're in the behavioral manipulation business, and that's not what we signed up for. That's when they have the genius insight: there's all this unclaimed data in the economy, and the economy is incredibly inefficient because there's all this uncertainty, but if we get all this data, we can take the uncertainty out of it.

So they start driving cars up and down the street; they call it Street View. The Germans go, "Wait a minute, that's my house, that's my kid, that's my dog. You can't do that. That's what the Stasi used to do." Everybody else goes, "Eh, no problem." Then they do satellite view. So if you're in the breaking-and-entering business, you've now got Street View and satellite view. And then they tried Google Glass. Remember that? Now they're going around and getting real time; the other two things are static, but this is real time, and lots of facial recognition, lots of individual walking routes, driving routes, all kinds of cool stuff. But we hated it, right? We called them glassholes. Remember that? So they go back into the labs and they repackage it as a video game. They spin it out as a separate company called Niantic. They call it Pokémon Go. They've got a billion people wandering around with their smartphones taking pictures of everything. Now the behavioral manipulation gets really interesting. If we put a Pokémon on private property, will people climb over a fence? Yeah, it turns out they will. If we put a Pokémon in a Starbucks, can we get you to go in and buy some coffee? You bet. How about if we do it at a third Starbucks and give you ten cents off, and see whether you'll go a little further? Yeah, you'll do that too. It was the largest behavioral manipulation experiment ever run outside of China. Really, really successful. And we're in a different world now, right? Our whole thing is, we love these apps, and we think what we're getting is better targeting, but what's really going on is large-scale behavioral manipulation, and it's not our own data causing that; it's everybody else's data. Then what happens? They go, "Wait a minute, there's all this data that other people own. Let's go buy it." So they go to your bank, they go to Datalogix, they go to Experian and Equifax, and they buy all your financial data. They go to the cellular carriers and get all your location data; they've already got the data from Uber and Lyft. They go to health and wellness apps and they get all that stuff. They do all this tracking. They scan your documents. They've built a data avatar of each and every one of us, whether you're on the platform or not. And the problem is that only one percent of the value is in the stuff you post; it's the metadata, the tracking, the browsing history, and the stuff they acquire. And then we're in the world of certainty, right? Because that's what they're selling to the advertiser. So think about this: anybody here seen a Google CAPTCHA? They say, "Are you a robot? Look at these pictures of cars or street signs or whatever." You know what's going on, right? You're training the self-driving cars, Google's AI for self-driving cars. That's why they never show you pictures of livestock. I want the ones that ask you the difference between a pig and an elephant; that's what I'm waiting for.

But no, you're training the AI for self-driving, and they know you're a human because of the way your mouse moves. They save everything, so I assume they have this long profile of my mouse movements. Here's the thing: some day I'm likely to develop Parkinson's disease. Now, what app is likely to see the very first symptom of Parkinson's disease? An app that's measuring the movement of my mouse. And here's what's wrong with the model. I'm not their customer. I'm not even their product. I'm the fuel for this data avatar; I'm a source of data. They're under no obligation to tell me that I have a potential medical diagnosis. They're under no obligation to protect my privacy. And all the financial incentives say: sell that certainty to the highest bidder. Who will value it the most? My insurance company, which will either raise my rates or cut off my insurance, and I won't even know I'm sick, because if they tell me, then it's a pre-existing condition. Those are terrible incentives. So my basic point to you is that this is what has happened here. Google perfected it, but now Facebook, Amazon, and Microsoft are playing this game also. There's all this data in the economy, all this data in society, and with it you can take uncertainty out of life. But here's the thing to think about. When you wake up in the morning, you have free will; you have the ability to choose. Are you going to go to work today or not go to work, eat breakfast or not eat breakfast, exercise or not exercise? But if everything in life is certain, what happens to free will? In an economic sense, what happens to choice? That's the question we have to ask today. And what I've discovered is that when I go around and do this thing, it doesn't matter what the politics are of the people I talk to. They go, "Huh. That doesn't sound right. That sounds like China." And I go, that's because it is China. When Google says "you can't regulate us because we have to compete with China on AI," I go, whoa, I didn't want to do that. Why do we want to be competing with China on behavioral modification? That would be like saying we want to compete with them on cloning babies or time-release anthrax. We've got better things to do. AI is the penicillin of the 21st century; let's apply it to things that make our lives better.

So let's see here. Behavioral prediction, that's the root of all of it. You also talk about the addictions that this technology creates, and the anonymity, and the creation of tribes and polarization. In your book, the two biggest threats you outline, in addition to what you've just said, are to public health, with our addiction to our phones and technology, and to our democratic system. Roger, my question is: at the root of this, what's the key design problem here that you would like to see solved, to fix all this? So here's what the problem is. If you're in the behavioral prediction business, you do not want to be stuck with the data related to our public selves, because we're all on our best behavior there. If you want to find out what people are really about, you've got to pierce the shell of civility around each one of us, and the best way to do that is to provoke us. So think about what engagement is. If your problem is that you want to build a habit, and then with the habit you want to feed the habit with things that engage people, it turns out that people are most engaged by stuff that triggers fight-or-flight, which would be outrage or fear, or really conspiracy theories or disinformation; nobody really knows why disinformation is so successful, but it is. So essentially what this is about is that you want to get people into an unstable state to find out what they're really like. What happens when you expose them to anti-Semitism; how do they react?

What happens when you expose them to hate speech? What happens when you expose them to some kind of threat? The problem with the business model is that it depends on hate speech, fear, outrage, disinformation, and conspiracy theories. Those are the things that bring out what they need for behavioral prediction; those are not the things they need for great ad targeting. This is what you need if you want to eventually go to behavioral manipulation. And so my point here is that when I look at GDPR, when I look at the California privacy law, I see things that fail in two ways. They fail because, first of all, they only address one percent of the problem: they don't address metadata, they don't address browser history, and they don't address the data acquired from third parties, which is where most of the value is. But secondly, they happen after the fact. If you want to solve the problem, you have to take away the incentive to have the inappropriate stuff there in the first place. I don't want to be in the censoring business; I want to be in the business of eliminating the amplification of the worst content, the content that essentially creates political polarization and creates all these unhealthy outcomes. And I think the way to do that, the first step, is to ask political questions. Why is it legal for somebody to scan your email or your documents? Why is it legal for there to be third-party commerce in your most intimate personal data? Why is it even legitimate to capture data on children at all? That's the debate we need to have in 2020, and that's the debate that, interestingly enough, brings people together, because it doesn't matter whether they're left or right. This is an issue of right or wrong, not right or left. And what I've found in the ten weeks so far of my book tour is that it doesn't matter whether I'm on Fox or MSNBC, Fox Business or CNBC, conservative talk radio or NPR; everybody goes, "Wait a minute, you're right. Why is any of that stuff legal?" Because if you could take that away, then we would go back to a model that wasn't built on the bad stuff. Bad stuff would still happen, but it wouldn't be part of the design. And again, this is really important: I'm not suggesting that any of these people are bad people or that they did this on purpose. These are the unintended consequences of a well-intended strategy, where the well-intended strategy was to connect the whole world, in Facebook's case, or to organize all the world's information, in Google's case. The problem was that they were in such a hurry that they failed to put in circuit breakers; they failed to think about containment strategies, because that was the culture at the time. I get it. This is just like the chemical industry thirty years ago. Chemical companies used to be able to pour mercury into fresh water. They left mine tailings on the side of the hill. If you were a gas station, you poured used oil into the sewer. And for like 50 years nobody said anything. But then the externalities started to pile up, and people went, "Wait a minute, this is no good," and they went back to the guys who created the problem and said, "Hey, toxic oil spill, toxic chemical spill: your problem, you clean it up." I think we're looking at toxic digital spills, that's all. No, seriously, this isn't about people. This is about, hey, look, we had a business culture for a long time, and I just think it's time to change the culture. And the reason I want to use antitrust law is that I believe big problems create a much bigger business opportunity in the form of the solution. The really simple thing here is that these guys aren't going to go away; you're just going to create a huge new industry. The history of tech is that, starting in '56 with AT&T and the consent decree, first we created the computer industry: we got the transistor out of AT&T, and that created all of Silicon Valley. Then with IBM you created the software business and PCs. Then with the second AT&T case you accelerated cellular phones and created broadband data, which created the commercial internet. And then with Microsoft you created the second-wave internet.

I mean, every time, the target company goes to all-time highs. The big complaint about those cases was that you didn't actually get rid of the monopolies, you just created a whole new industry, and I'm going, hang on, I'm okay with that. But you've got to stop them from blocking everything. So we use the antitrust remedy that we used on AT&T long distance, which was that we created MCI and Sprint by giving them access, at extraordinarily low rates, to the long-distance lines. In this case, what you do is very simple: any new startup that has a business model that fits the pattern we're looking for, which is to say one that does not exploit people, gets free access to advertising for customer acquisition on the big platforms, and let's just say it's free up to a hundred million people. We could pick the number, but just think about that for a minute. What would that do for innovation? How many startups do you think we'd have if it was free for the first hundred million people? How would that change the cost of doing a startup? Because the problem you've got right now is that Facebook, Google, Amazon, and Microsoft have got like 80 percent of all the AI people in North America, and they're all working on behavioral modification. So if you want to do something else, you've got to compete with that. How does that work? You know, the control they provide: they have these marketplaces where they provide perfect information on one side, a Faustian bargain, because there's no brand value; everybody gets the same perfect information. But the folks on the other side, the consumers, only get the data that Google and Facebook give them. So once they figure out your economics, the prices you see and the options you see are tailored to what they know about you, because they're removing uncertainty. Again, I'm a big believer in choice. I'm a believer in capitalism.

I'd like to get back to it.

Okay. We're going to take a couple of questions. Let me ask one more, and the microphone is over there. Let me just ask one more, and then we'll turn to you, and if we could, I'd love at least one or two women to come over here and ask. Okay. So, it sounds like you feel there's some momentum. You and Tristan Harris, your colleague, have gained some momentum in capturing the attention of the public and of legislators in California as well as in Washington, D.C., and this debate and this conversation is becoming national. Don't give us the credit; I actually give journalists much more credit than I give ourselves, because today hardly a day goes by without another story about something really important. We like to hear that. In terms of solutions, we've talked a little bit about technical fixes. I think we'd like to hear what you think we personally can do, and what next-generation companies can do. Right. I want somebody to make a mouse product that tells me if I've got the first symptom of Parkinson's disease. It's an insurance product. Just think about every single thing that happens on the internet today and reverse the polarity of the business model: instead of treating the consumer as the fuel, treat them as the customer. What does that business model look like? It's not going to create things at the scale of Google or Facebook, but it is going to create a thousand companies with a billion-dollar market cap, and that should be enough. I mean, I think that's a cool thing, if you get to save democracy and you get to save public health and you get to save privacy and you get to save the economy. And I may be wrong, but let's have that conversation; let's have it right here, because this is where so many good ideas originate.

There's a question of how and when and if the regulators can jump in, and whether they'll be effective. You've suggested banning bots, stopping acquisitions, being transparent about political ads and algorithms. None of the stuff that has happened so far addresses the underlying problems. What we're really trying to do is to restart a regulatory infrastructure that's been moribund for 25 years, and that's really hard. I mean, the FTC and the Antitrust Division of the Department of Justice have done nothing related to consumers since the Microsoft case. I'm working with both of them, and it's a little bit slow going, but I have the economics department and law school of another university that have found a way to use Chicago School antitrust against these platforms, and the model is very straightforward. We've always said the services are free, but that's actually not correct: it's a barter of services for personal data. The data is the currency. And what happens, and it's really obvious when you look at either Facebook or Google, is that the price being paid in data has been rising geometrically; well, let's just say it's been rising on a very steep slope. The marker for that is simply average revenue per user: the suite of services really hasn't been changing much, and the individual value of an action on any service doesn't change much, and yet the value of the data being given up is going up very significantly.
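A minimal sketch of that average-revenue-per-user argument, under the assumption that ARPU can be read as the implied price a user pays, in data, for a roughly constant bundle of services. The ARPU figures below are hypothetical placeholders chosen only for illustration; they are not Facebook's or Google's actual numbers.

```python
# Toy illustration of the "data is the currency" argument: if the bundle of
# services a user receives stays roughly flat while average revenue per user
# (ARPU) keeps climbing, the implied price paid in data is rising.
# The ARPU values here are HYPOTHETICAL placeholders, not real company figures.
annual_arpu = {2014: 9.0, 2015: 12.0, 2016: 16.0, 2017: 21.0, 2018: 27.0}  # dollars per user per year

years = sorted(annual_arpu)
first, last = years[0], years[-1]

# Compound annual growth rate of the implied "price paid in data"
cagr = (annual_arpu[last] / annual_arpu[first]) ** (1 / (last - first)) - 1

print(f"Hypothetical ARPU went from ${annual_arpu[first]:.0f} to ${annual_arpu[last]:.0f}")
print(f"Implied compound annual growth rate: {cagr:.1%}")  # ~31.6% per year in this toy example
```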

And so for that model, a Nobel Prize was awarded last year for a similar analysis relative to carbon credits, so there are probably people in the econ department here working on things like this, and I would invite those people to reach out to me, because I think it's really straightforward. So I'm working with Justice and the FTC on that, and working with some state AGs. The Europeans got a huge running start, but they got their running start before we understood the full scope of the problem, so the things they've done are too narrow, and the thing in Australia is nuts. The thing in the UK is maybe nuts, maybe not; here it wouldn't work, there it may. But what's going on is that governments are fed up, and they're fed up because Facebook says the same thing over and over again expecting a different response, and Google says "we're not involved," and they're furious. So they're doing dumb stuff; they're reaching for ever bigger mallets. And we've got to get serious about this, because otherwise real harm is going to get done that you can't fix. I think New Zealand changes a lot of things politically, because New Zealand was really a piece of theater. That terrorist took the architecture of the internet, the ability of malcontented people to find each other and connect, to act anonymously in their own interest, and then he exploited the architecture of every social media platform, both in advance and after the fact, to amass a group, spread the toxic stuff, and then amplify it. And the problem here isn't just that it took seven to ten minutes for Facebook to even notice there was a problem, and like an hour before they did anything about it. The problem was that if it had taken one second, it would have already been too late, and there is no circuit breaker. I suspect at least one country in this calendar year will mandate a button that can shut down any internet service while a problem like that is occurring. Demonstrably, shutting down YouTube would have been extremely helpful at limiting the spread of this and potentially changing the incentives; for a terrorist event, it may be enough just to shut down YouTube. So we have to think all that stuff through. But really harsh stuff is coming, because the thing in New Zealand worked so well, and there's nothing out there to prevent it from happening again.

One last question: how about the role of investors in all of this? You still own stock in Facebook. I do. If you think it is such a catastrophe, to use the word in the title of your book, why are you still allowing your money to remain in Facebook? The answer on the money in Facebook is very simple: I didn't want anyone to accuse me of speaking ill of a company whose stock I had just sold. My theory was that I've made a lot of money in this stock, and if my actions were going to cause it to go down, I was going to share that with everyone else. That's a personal value judgment; you can agree or disagree. I think investors have a huge role to play. It is time to get off the nonsense of just focusing on shareholder value.

It's time to start looking at society; it's time to start looking at employees, the communities where they live, customers, and suppliers again. And I'm not going to win that argument by myself. But we have to stop looking at big data the way we're looking at it. We've got to start thinking about inverting the polarity and using that data to make consumers' lives more successful. We've got to stop viewing people as a source of fuel. It's really bad, and nothing is worth that. If we can't see that, then we need to read more books, we need to meet more people, we need to actually visit with the people who've been harmed. Because remember: you didn't need to be on Facebook to be dead in Myanmar; you just needed to be a Rohingya. You didn't need to be on any of these platforms to be dead in New Zealand; you just needed to be a Muslim in one of those mosques. You didn't need to be on a platform to have your business destroyed in the United Kingdom; you just needed to be EU-centric. This is not small stuff. These are people's lives, and we've lost sight of that. And again, I may be the wrong messenger for you, but think about the message. Find someone who is. Because we can do better. We really can.

Okay. We're going to take some questions, and our first question is going to come from Carlos Gomez Uribe, who has worked on algorithms at Netflix, Google, and most recently Facebook. He has been a more recent insider at Facebook than you have, Roger.

Thank you. More than a question, I mostly have a comment. It is quite refreshing to hear your perspective; I think it's important, and I'd love to see more discussions of this kind. And I also want to start with the obvious, which is the thing you ended with, which is that many of these algorithms can actually be quite helpful at improving how we make decisions. I worked on search at Google for two years, and I was at Netflix for over seven years, so I know both the recommendations there and the search algorithms inside and out; I invented a bunch of the ones that are running in production. And because, like you, I grew increasingly concerned with the role of Facebook in helping us understand our collective reality and then make decisions, for example at the ballot box, I reached out and offered to join them doing what I know how to do, which is to build algorithms. I joined them for a year and a half as a director, supposedly in charge of what they call integrity for News Feed, their terminology, which includes hate speech, misinformation, and so on. Before going to Facebook, I was super interested in being able to make a positive difference in diminishing these problems. I expected them to be the hardest technical problems I had seen, just because they reflect essentially all of human complexity, and I assumed that Facebook, having all the smart people that I knew they have, must have tried all the obvious things one should try, to make algorithms that are high quality, meaning algorithms that create

incentives for higher-quality content and for higher-quality interactions. And I left after a year and a half because, like you, I completely lost trust, not only in Mark and Sheryl but in at least three layers of leadership at Facebook.

One of the things that's interesting about this problem is, remember, they've eliminated pornography. I mean, they're plenty smart enough, if they want to get rid of a lot of this stuff; they could make it a lot less prevalent. The problem is they need something that activates the lizard brain, to find out what we're really like.

I think in the case of Facebook, yes, and maybe more generally when you're thinking about a business model that relies on advertising, where again the users of the product are the product themselves. But one point is that I don't think that is true of all internet companies.

Okay, you mean Netflix? Exactly. Can I say something about that? Because I want to point to Netflix; it's a really interesting case, because Netflix obviously has the ability to do harm if it wanted to, but demonstrably it is not doing so, at least that's the way it looks to me. And I look at Apple: they could do a lot of harm, and they're clearly trying not to. I mean, there are plenty of existence proofs of this being done.

But it also helps having people with morals, with ethics, people who care about the world, individuals at the top who are willing to spend the time that's necessary to understand how their products, their platform, actually work, and who are very thoughtful about it. That is not the way things are at Facebook, unfortunately. But I think you also pointed out that the business model is the problem, or a big piece of the problem. Yes, it's a big part. I mean, it helps a lot to be Apple or to be Netflix and to have built a model where the consumer is your customer.

So, the last point I want to make, and I'm happy to take any questions or thoughts afterwards. And specifically, I'm going to sign books afterwards outside, so anybody else who has questions, I'll be outside for a while. Sounds good; we have time for just a few more. Okay, so one last quick comment. The people in these rooms, whether it's Google, Facebook, or Netflix, making decisions about products that end up affecting pretty much the entire world, not just Americans, and it's bad for the world, not just America, came from Stanford, where they got their MBAs; they came from MIT, where I studied; from supposedly these amazing places where people are trying to be thoughtful. But once you get to these companies, not all of them, but particularly Facebook, you're too focused on how you get promoted and on how you meet all the goals and expectations, which are complete nonsense, that they set for you; but you know that your income depends on that. And ultimately I think that if we are to change the way these companies run, in addition to thinking about regulation and business models, we also have to think about education, about creating generations of people who care not only about becoming rich and famous, but who actually deeply care about building things that are genuinely positive for the world. And the technology is there; with AI you can do many good things.

Hi, thank you so much for coming; I really enjoyed the talk.
I'm a student, and I'm in a club called CS + Social Good here at Stanford, and I think we're also carrying on some of the discussions and conversations that you wish to have. So my question is more focused on a subtopic in your book. You argue that polarization is intensified on social media because personal views that had previously been kept in check by social pressures are not so kept in check on these digital platforms. Do you think Facebook's content moderation efforts are enough to solve that problem? And if you think they should have more content moderation, how should we weigh that against the right to free speech? It's a great question, so I'm going to answer the second part first. The question on moderation is really simple, in my opinion, which is: remember

that I'm a free speech person, but you have to distinguish, as my partner Renee DiResta says, between freedom of speech and freedom of reach. The issue we're dealing with here is that I don't want to restrict what people put up, but I don't want a business model that asymmetrically amplifies the most negative voices in society. And so I think content moderation, I'd now say with a probability of 0.8 or higher, can't work at this scale, no matter how many people you put on it. The basic notion is that the reason they want content moderation is that they want to gather all that data; they want to get that first-pass reaction from people so they can pierce the veneer of civility. So moderation fits the current business model, and I would argue that you have to change the business model if you're really serious about these problems. I just think moderation is the wrong approach, and I just don't think they'll ever get there. And again, I'm not a scientist, I can't prove that, but the big Motherboard essay on that topic, and all the journalism I've seen since then, makes me think moderation is not the way.

Okay. If you don't have to dash to class, we'll take just one more, over there. Can you give him the mic? You're going to have to speak much louder.

So on the one hand, you say that the core problem here, or one of the core problems, is culture, as opposed to specifically bad people. On the other hand, you've kind of admitted that Zuck created a culture at Facebook where criticism and disagreement pretty much had no place, and if those had been able to exist there, we probably wouldn't be facing these problems. So how do you change culture without actually being able to call out bad behavior from the people in charge? If you don't call that out, doesn't that actually create the culture that enables this?

That's a great question; these are all amazing questions. So, on the question of culture specific to a company versus the country as a whole: I believe that the culture of business in the country is the driving thing. Facebook has a specific set of cultural issues, but the thing that caused Google to go the way it went, the thing that caused Mark to believe it was okay to build a global footprint, to amass this massive network without any circuit breakers and without a containment strategy, those were the result of an external culture that they just grew up in at this point in time.

Isn't it time now to actually say, hey, whatever the reasons were, this isn't okay? I'm with you completely. So the question is, is it time to ask these leaders, like Zuck and Sandberg, to step down?

Okay, so here's what my problem is. Changing culture in a company, short of a massive crisis, is almost impossible to do, and only Zuck has the ability to change the culture at Facebook. What I'm praying for is that he gets a really good night's sleep, wakes up, has an epiphany, and realizes that he can do more good by fixing the culture and business model of Facebook than he could do with a thousand Chan Zuckerberg Initiatives. Now, I think counting on that is probably not super realistic, but I don't exclude it. I mean, with the Mark I knew years and years ago, that wasn't a zero percent possibility.

Aren't we kind of past that now? We may be. We may be. But keep in mind, there are limits to what I can say in that area. I'm not super credible when it comes to the specifics of Mark; I haven't talked to him. I'm out of touch there. My gut instinct, though, is that if you don't change the business model, it doesn't matter who runs the company.

But doesn't it also matter that you've been a friend? You're one of the few people who, being friends with Zuck, would have the power to actually tell him this in a way that he could hear it. If you can't even tell him this, isn't that a problem?

I agree with you. Sorry, I'm letting you draw that conclusion; don't make me keep saying it out loud. It can't be personal between me and them; that completely destroys my platform. I need to focus on the ideas. They are my friends, and it's just not necessary for me to go there to make the argument I need to make here. So I totally respect what you've just said, but I'm not going to engage with it beyond pointing out that if you don't fix the business model, it literally doesn't matter who runs it. Anybody you put in there with this business model, you're going to get the same outcome.

Okay, and with that, unfortunately, we're out of time. Roger is going to sign some books out here if you want to ask follow-up questions. All right, thank you for a great conversation.
