NBC's Jacob Ward: How Technology Shapes Our Thinking and Decisions

Join us this new year for new conversations at the Commonwealth Club. Hello and good evening. My name is DJ Patil, and I'm pleased to be today's moderator for this Commonwealth Club program, focused on Jake Ward's new book, The Loop: How Technology Is Creating a World Without Choices and How to Fight Back. This program, somewhat ironically, is unfortunately virtual due to COVID and Omicron, but at least we're able to do it through all this technology, and that's going to be one of the things we get into. Here at the Commonwealth Club, where I serve on the board of governors, we're luckily starting to be able to do some in-person programming, and it is fantastic. Our in-person events are picking up in the months to come, and I encourage you to learn more by going to the club's website at www.commonwealthclub.org and following the club on Twitter at @cwclub.

I'm pleased to welcome Jake back to the Commonwealth Club of California to discuss his new book. The focus of the book, how technology and artificial intelligence are working to limit the choices we make as humans, is a subject that's critically important to me personally. But to begin, let's get through some quick housekeeping items. If you have a question to ask Jake or me, though really you should just ask Jake, put it in the Zoom chat.

Of course, you can always follow along with Jake on social media via his Twitter handle, @byjacobward. Did I get that right, Jake? You got it. Yep, awesome. You can follow me at @dpatil, but today we're going to be here on the Zoom channel. So maybe to start, Jake, because there is so much to get into: first, I just want to say congratulations on getting this book out. It's available in all the usual places.

I strongly encourage you to get it, especially given how timely the conversations at hand are. But maybe to start, Jake: I remember how you and I first met, which was actually at a session where I had been trying to wrangle people to talk about ethics. And I remember the insightful, just great questions that you were asking. And now I've read the book, and there are so many dimensions that I hadn't even thought about that you've been able to bring together. So congratulations on doing so much.

And maybe just to start: you're a technology correspondent, you cover everything. Why this subject, of all the matters of the day? Well, first of all, I really appreciate you. Thank you so much for doing this. When they told me that you and I were going to be in conversation together, I was like, oh no, he's so smart about this stuff. And I remember being such a piker when I first talked to you about it. Oh God, he's going to blow me out of the water.

So I'm really flattered that you are here and taking such a personal interest in this. I remember very well you and I first meeting and talking about this stuff, and that was at a time when I was having a couple of parallel experiences that really set off the book. One was I had just done a documentary series for PBS called Hacking Your Mind, and through that we went around the world and met all of these scientists who were studying human behavior, studying the patterns in the unconscious decisions we make. And one of the big findings of the last 50 years, best popularized through the work of Daniel Kahneman, who wrote Thinking, Fast and Slow, which I know you know about, is this idea that we have these two brains, these two cognitive categories. Essentially, one is a fast thinking brain, an instinctive snake, stranger, and fire detection system. It grabs calories off trees without us thinking about it and gets us out of a burning building without our having to consciously coordinate our movements, right? It is an incredibly useful thing that's kept us alive for millions of years.

I think it's about 30 million years old. Then there's your slow thinking brain. Your slow thinking brain is a very new development in the history of our species.

It's probably only something like 70,000, maybe 100,000 years old. And it is what got us thinking, what else is there beyond calories and snakes and strangers and fire, and got us on our feet and exploring the world? And out of that has come all kinds of stuff: art and law and politics, right? All of these invented human systems that are part of our higher society. So all of that is in this PBS series, and I got to go meet all of these people talking about this.

And one of the big findings is that the vast majority of our decisions are being made by our instinctive, fast thinking brain, even the decisions that we think we are using our higher cognitive functions to accomplish. At the same time, I was meeting folks like you who were turning me on to this world of data science and pattern recognition algorithms. And while I wish to God, and this will be one of our themes tonight, that there were lots and lots of people like you who had been trained in this stuff and then deployed it for good, right, as the chief U.S. data scientist, unfortunately I was meeting lots and lots of people who did not have that goal. They were instead about making money using pattern recognition systems. And I had a very transformative evening in which, basically, I went to a dinner party of a bunch of entrepreneurs.

They were young app makers, and it was, you know, I think it was Indian food and beer and a very chill kind of atmosphere. And most of these folks were trying to deploy this stuff in the service of some really nice things.

There were some money saving apps, there were some exercise apps, you know, stuff along those lines. But one of the things that they had come together over was that a lot of them used to be behavioral scientists in some form or another, and they were really interested in learning the latest behavioral science and trying to bring it into their work. So we were sitting there, and we had this presentation that night, and I went through the whole process of saying: listen, I'm a journalist, and anything you say to me may end up in a book, you know, so be careful. Nonetheless, I got to witness this incredible thing. These two addiction experts, these PhDs, come up in front of us and describe their findings around addiction and the habituation of compulsive action.

They described, for instance, a study they were really interested by, in which you take people who had been addicted to cocaine and used to do it in nightclubs, and who have since gotten sober, and you bring them back to a nightclub. You thump the music in their ears, you flash the lights of the nightclub, and then you show them a mirror with baking soda on it. And you say, this is baking soda, and you make sure they understand that. And then you ask them, on a scale of one to ten:

How much would you like to do this baking soda? I don't remember the exact methodology, but it was something like that. People absolutely want to do a line of baking soda anyway. And the reason they were telling us that story is because they said, once the habit is built into the brain, it's just there.

The human brain forms these habits so thoroughly, and they can be so compulsive. And the moral of the story was: this is fantastic news for you and your business, and that's why we are here tonight, offering you, and anyone you know, our services as addiction experts. And at one point they were asked, is there any kind of company you wouldn't work for? And they said, and I quote: we don't want to be the salt police of the internet. They were absolutely agnostic about what they were going to do with this.

And they said that a lot of their colleagues from the program where they had studied were now working for the big casino companies. So I'm having this experience of learning about the patterns of human behavior, and learning about the systems that we have developed, these, you know, very powerful pattern recognition systems. And then I was finding this weird in-between world in which very qualified behavioral scientists were looking at the patterns of human behavior and the patterns in our circuitry and trying to make businesses out of them. And I realized, oh geez, I think I've got to write a book or do something with this. And so that's why I got into it.

I mean, it's amazing, the book. It starts with what is really just a mind trip, because you start by talking about timescales and what time is really like, and, if we think about it, how short an existence we actually have, as you pointed out, with our fight-or-flight versus our higher order functions, and how much technology is changing. And this first loop, as you're describing it, there are these systems. I'm wondering if you could go into a little bit of that, because I want to describe it almost as, like, Oliver Sacks meets Wired magazine. Ooh, do I like that, right? If you could write that and get it published someplace where I can then put it on the cover of the book, that would be really excellent. Thank you.

It's like this unvarnished view of: where am I in control? Hmm. And maybe could you talk a little bit about, like, what your favorite sort of unnerving finding is about who we are as humans? Could you go into that? Yeah, sure. So the concept of the loop, for me, is, like you say, actually three loops in the book. And, you know, everyone wants to beat up their own work after they've read it, and I'm not sure, if I did it again, that I would do it the same way. I think I'd do a better job articulating these three loops. But here's how.

Here's how I came down to it: there is basically a loop in the middle, which is the loop of our unconscious decision making. Again, this System 1 fast thinking brain versus System 2 slow thinking brain kind of cognition, right? That is the central loop. Then there's a modern construction, and this is the second loop, which is this sort of manipulative set of business models and mechanisms that we have deployed to take advantage of that first loop. These are things like cigarettes; gambling is another one. It is stuff that plays on that circuitry to make money. Now, the third loop for me is what's going to be made possible by artificial intelligence. And when I'm talking about artificial intelligence here, I'm talking just about pattern recognition systems, and the ways in which we're going to start essentially behaving in response to manipulation and analysis by businesses.

Already, we can feel it, right? I can't find my way from place to place anymore without Google Maps. I'm totally beholden to Google Maps, and that's starting to change the way I drive, the way I navigate, the way I make plans with people. And as that movement pattern is analyzed by businesses, this third loop is taking form, in which pretty soon I'm not going to know how to do anything except follow Google Maps around.

You know, I was talking to a friend of mine the other day who's trying to become a pilot, and he was saying that the FAA still insists that you learn the manual system on the flight computer for calculating where you're going, and all the young pilots are like, why would I do that? I just want to follow the blue line. I don't need to know how to do that, right? We are losing a set of choices and abilities that I think we used to have. And I think one of the examples is, if you Google, you know, GPS and cliff, it's shocking how many people ignored their higher level thinking functions because the car said turn right, and they drove off a cliff. That's right. So here's an example that I think shows the bigger picture we're looking at here. You'll remember there was an incident in 2017 in which a flight out of Chicago was overbooked, as so many are.

And a doctor on board refused to give up his seat and was beaten up by Chicago aviation police. It made the news, the United CEO had to apologize, and these days they are paying a lot more money to people when their flight is overbooked, I'll tell you that right now. At the time, somebody asked me a question that I then went on to investigate, which is: why did they choose the people they did, and how did that decision get made in the process? And it turned out to be this parable for what I was talking about. It turned out that, basically, they tried to get everybody to volunteer to get off the flight, and people would not, because it was the last flight out of Chicago and nobody wanted to give up their seats. So they offered money.

First round of bidding, nobody said yes. Second round of bidding, nobody said yes. Finally, they said, OK, then we're going to choose names at random, and those people will have to get off. They choose these four names, and three of those people dutifully get off the flight. This guy, Dr. David Dao, he's a pulmonologist, and he says, I can't get off the flight.

I've got rounds in the morning, patients to see. He was right, in the end; you're actually not supposed to take a doctor who's on call the next day off a flight. But something about it, and I talk about all of this in the book, I spoke to all these different experts who, if they had been sitting on the plane, could have said: careful, everybody.

This is the moment where you will all abandon your critical faculties, because they will have told you a system has chosen people at random, and right there anthropomorphism kicks in, the technical term for attributing more sophistication to a system than it actually possesses, simply because you don't understand how it works. And everybody in the chain of command, down to the aviation police, is just told: this guy's name was chosen by the computer, get him off. Right?

And when I looked at how they chose the people, oh, it's things like whether they have status on the flight and how recently they bought the ticket. But nobody was talking about that. All they said was: the computer says get him off, get him out of here. And everybody abandoned their critical faculties, until the aviation police, you know, beat this guy up.

He had to be hospitalized, you know. And for me, I'm just seeing, over and over again, the ways in which our brains are uniquely vulnerable to being given a verdict by a system we don't understand, and how desperate we are, as the gate agent that night was, not to have to make the hard decision. Nobody wants to be the person who stands there and says, I'm sorry, sir, you are getting off, or, I'm sorry, you are getting off. Instead they say: the computer is going to choose for us. And we're going to see that in hiring. We're going to see that in loan making. We're going to see that in who gets bail. And in the same way that you and I can agree we don't know how to find our way around anymore, right?

And people are driving right off a cliff using their GPS. I think when the computer says, hire these people, or, these people are committing fraud, whatever verdicts we ask it to render for us, it doesn't matter if it's right or not; we're going to believe it, because that's what our brains do. And I think nobody has taken that adequately into account. Regulators certainly are not thinking about it yet. And I think we have to rethink how we consider our vulnerabilities to these systems before we start deploying them on generations of people, as we're about to.

Mm hmm. It's interesting, because in the book you also come to the defense of the aviation police, I think that's what they're called. That's right. In fact, one of the officers sued the Chicago airport authority; I don't remember exactly what it's called, the civil aviation authority, whatever it is, in Chicago.

And at the time he was made fun of by, like, local newscasts: you know, another spurious lawsuit; of course he's trying to hand off responsibility for this. But when you read his lawsuit, it's so interesting.

What he says is: we were not adequately trained to know what to do when we are told to get a non-compliant passenger off a flight who's been chosen at random. And so, you know, I don't think he was reading the cognitive scientists that I was; I don't think we came at it for the same reasons. But his instinct was the same as mine, which is: wait a minute, I didn't have the faculties or the leverage or anything. He just understood that the system had everybody in its grip, and terrible decisions were made as a result. And for me, I do have sympathy for him, not least because I think we're going to see this over and over again in all these agencies, right? You know, I was just talking to somebody the other day at NBC News who was talking about the difficulty of hiring.

This was somebody who worked in an industry in which, you know, they would love to be able to hire people that the A.I. doesn't think are qualified, and they can't. They don't even have enough people on staff in the department; nobody's got time to go in and say, you know, this person actually is really cool in all these other ways. If the checkboxes are not checked by the pattern recognition system, then that person's out of the running. And so I am deeply sympathetic to anybody these days who says, wait a minute, why are we doing it this way, and shouldn't we think about a different way to do it? Because I think the profit motive and the incentive structure are going to make it harder and harder for us to do that if we don't slow this process down now. And we see it, I think, in policing too: if an officer questions a decision, they can be fired for some form of insubordination. So it's almost like you're damned if you do, damned if you don't.

The human gets fired, but there's no accountability. That's one of the things I've taken away from the loop structure, the way you've articulated it. Well, I appreciate that, and I think you're absolutely right. There's very little incentive to push back against this stuff. Now, I want to say here, you know, I am trying to push back against this whole thing, because I think it's really scary.

I also think that there are, in fact, some really extraordinarily positive uses of some of this technology, and some of it can, I think, be great. So there's a very fascinating guy named Michael Knapp, who runs a place called Green River. He's way out in the woods in Vermont, working with his team there. You know, he's just off the beaten path, and he only works with nonprofits, and he'll do some hospital and health work, but he has these very high standards for himself about who he will deploy A.I. on and with. And I was running my thesis by him.

You know, that we're going to lose human agency every time if we rely on these systems. He said, man, I wish I had that problem. You know, it's like my people aren't moving fast enough on this stuff; if only they would grab onto it the way the for-profit companies have, he said. For instance: if you gave me every birth certificate in the country and I could just feed it through a machine learning system, you know, it would kick back at me every apartment that needs to be repainted to avoid lead poisoning in this country. I could save millions of years of life if you gave me that opportunity, but I'm not allowed to do that.

Or, he said, in social services agencies, a person comes in and applies for help, and then we find the services for that person. He said it should be the opposite: I should look at the available resources and feed them to individual cases, go out and find individual cases. But we're not allowed to do it that way. You're not allowed to approach somebody that way.

So the problem, I think, is that nobody's making money taking lead paint out of apartments. Nobody's making money matching social services with the unhoused. We are making money on gambling, right? We're making money on addiction.

And I think A.I. has, in theory, more capacity to amplify our slow thinking brain, our sophisticated moral brain, right? This is the thesis of a new book by Daniel Kahneman. I was so disappointed by that book, because I wanted to say to him: yeah, but nobody's making money doing that.

They don't want to sell to System 2, the slow thinking brain. They want to sell to the fast thinking brain. They want to sell to the more instinctive systems that can't help but, you know, get angry and make a rash decision and the rest of it. So to me, it's capitalism that I worry about here. Well, it's a great point, because I think, you know, what we've seen in the early days of data science is the potential power of using data for, frankly, good.

You know, the one that I think about, given this recent fire in New York City, is one of the early cases in the Bloomberg administration, where they were cleverly using data to identify which buildings were likely to have lots of problems, based on complaints and calls to the city. And it wasn't like they did machine learning there. They just took a list and said, who's got the most complaints? And let's look at the places nearby. And so they could prioritize the building inspectors' or fire inspectors' time to be most efficient and effective. It almost seems like something government should be doing everywhere, and it is not.

And I also think about, like, medical errors in hospitals. Is it purely economics? Because I think about all these do-gooders who want to go into these fields; what's holding them back from giving agency to forces of good, for lack of a better term? Right, right. And I do think it's money. I mean, I just think it's money. It is much, much harder and less profitable to do what you are describing. And that is not to say that people aren't going to be able to do that kind of stuff, you know, make those fantastic things.

You know, I meet people every so often who, against all capitalist instincts, have decided: I'm going to make a business that does X, I'm going to work my brains out to do it. I met a kid the other day who wanted to subvert the cash bail system by creating a whole alternative. And, you know, it could be a huge moneymaker, but his whole purpose was to try and do away with cash bail, because it is a deeply unethical and unequal system that falls most heavily on the poor, and all of this stuff, right? But those people are very few and far between. And one of the difficulties, and I assume you have encountered this too, and I'd love to hear your thoughts about it as well, is that there is a tremendous culture.

In Silicon Valley, and in technology in general, of convincing the people who work in that industry that the work they are doing is for the better, is for, you know, a greater good.

It doesn't really matter what they're doing, making chairs, making vape pens; the line from the H.R. people, who tend to be referred to now as people-and-culture kind of people, right, is: you're doing good works in this world. And so I think a lot of people actually spend a huge amount of their time being convinced of that. Also, you know, some of the biggest companies refer to the people in that line of work as scientists, and, you know, they call their headquarters the campus, right? There's an academic veneer to how some of this is done that I think draws people in. It makes it very difficult for me as a journalist to have a frank conversation with them.

And that's putting aside the fact that these are the most secretive companies in the world. You know, when I talk to my colleagues at NBC who cover the military or cover the Pentagon, people will talk to them endlessly, not necessarily on the record, but they will talk to them endlessly. It is incredibly difficult as a journalist to speak to people inside one of these companies. And by the way, anyone listening who would like to speak with me: I'm very good at keeping a secret and would love to speak with you. But you know what I'm saying? I think there are many, many incentive structures built in here that are going to make it difficult for people to make good choices.

And I would also like to point out that much smarter people than me, people like you, have been saying this. There's a brilliant woman named Meredith Whittaker, who was inside Google. She's now a consultant to the FTC, she's working with Lina Khan, and she was one of the very first people to say: it is not OK.

I mean, thinking here about what you're saying about trying to push back, trying to raise your hand: it is not OK that inside certain companies, someone in the position of a data scientist cannot even ask what their work is going to be used for. You know, I was talking to somebody the other day at one of the big five tech companies, or six, however many there are now, who said that if you ask too many times, you immediately get fired. If you ask, you know, what is this going to be used for, too many times, they let you go. So I think it's a really complicated landscape in which to create any kind of groundswell of resistance to this kind of stuff.

And I worry about that, right? So, for those who are just joining us, we're in conversation with Jacob Ward about his new book, The Loop, and we're going to start taking questions in a bit, so I encourage you to ask them in the Zoom chat.

And you should definitely be following Jake on Twitter and other social media at @byjacobward. You know, what you're talking about with these corporations and transparency is, I think, a great jumping off point into national security. Some of these companies are talking about, you know, the need and the desire to work on national security, given the complex landscape, questions about Western values and what's being put into these systems. China is aggressively going after surveillance technologies.

We see A.I. being used to break into systems from Israeli companies. It's a very complex landscape, and there is the classic argument: if we don't do it, somebody else will. That's right. How do you square all these things as you look across the complexity of the world and, honestly, preserving our lifestyles? I know, I know, it is such a vast and important subject.

And maybe that's the next book, right? But I think, first of all, that thing of "if I don't do it, someone else will" is one of the great traps of this world. I've had people say that to me who work on national security related products, making surveillance systems, and I've heard it literally from people who make highly addictive casino simulators designed to ensnare old ladies.

You know, that way of thinking, that someone's got to do it and it might as well be me, I can do it a little more ethically than somebody else will, is a real, complicated thing. And it's a common thing. Now, national security. It's amazing that we have to start with the sigh: oh my God, it's so complicated, right? Right before the pandemic, I was just about to go to China to try to do a whole series for NBC about this, and one of the conversations we were having about it was about the language you have to use in order to speak to Chinese officials about this stuff.

And what I wanted to do, essentially, was go there and say: OK, you have the exact same technological capabilities, arguably even more sophisticated technological capabilities, and an entirely opposite political worldview. What would our lives look like if we were living in an authoritarian environment in which stability and control are the priority, as they are in China? And it's so fascinating to then talk about that with people here in the libertarian West, you know, the West Coast. I was at a dinner where someone was giving a presentation on the Chinese social credit system, where you have to register yourself, you know, look, I'm Jacob Ward, you've got to turn those credentials in to the government so they can monitor what you do online. And if you don't behave properly, if you post something about Tiananmen Square, or worse, if one of your friends posts something about Tiananmen Square, your credit score goes down, and you can't get a loan, and eventually you can't ride trains, and eventually you're basically a prisoner in your house. They were describing that system and the effect it's having on social cohesion and all the rest, and after the presentation the room was divided. Half the people said, wow, that's a nightmare, and the other half said, we should totally have that here in the United States.

You know, there is a way in which the most textbook form of communism and authoritarianism and where it goes, and the most wild form of libertarian, tech-fused capitalism, meet in the distance in this weird way that I haven't really figured out for myself yet. But I do know that over and over again I bump into people who essentially say: we have always assumed in the United States that our model is the model. And, you know, there's that phrase, Theodore Parker originated it, but Martin Luther King said it and Obama said it a lot: the arc of the moral universe is long, but it bends toward justice.

But you spend some time with behavioral scientists, political scientists, people who actually study this stuff, and they'll say: no, we don't know that to be true. The moral arc of the universe is so long, and we've been on it only very briefly. We don't know which way it bends.

We don't even know if it's an arc. You know, they are not optimistic about this stuff. It is all an experiment. And so I do not think it is in any way a foregone conclusion that our way of life, and how we deploy technology, and the sorts of conversations that you and I are having tonight, are a natural thing.

You know, that it's going to take place all around the world. It could very easily go the other way. So, you know, I don't know. All I know is that it is such a minefield. It's so complicated to think about, and I wish I were asking you the questions about it, because I don't know the answer to that one. Well, it's why I get to ask the questions, because we're all trying to figure it out, right? But at the same time.

You know, some of the stuff that I struggle with, honestly, is this. Let's just take a look at Google's Project Maven, which you referred to before, which is also about surveillance. On one side, in these surveillance systems, you have a human who's sitting at a screen trying to track a person. And if they get distracted because somebody asks them something, and now they're following the wrong person, that may result in somebody deploying, you know, a drone or something that takes out somebody who's innocent. So is the computer there also to take over the jobs that we find so boring that we are bad at them and make errors? I'm thinking about the nurse who mistypes the insulin number, or fat-fingers it, just because she's so tired from taking care of 50 COVID patients, and now this person is getting the wrong thing. Yeah, yeah.

You know, for me, I think all things being equal — I was talking to an organizational psychologist, and I was asking her, are you in favor of using automated systems to find candidates for jobs and screen them and the rest of it? And she said, you know, if it replaces the racist instincts of some longstanding hiring manager who's been bringing his racism to it for years? Yes, absolutely. Unfortunately, she was saying, I don't think it is as simple as swapping the judgment of one out for the judgment of the other, because along with this comes this blindness to all of these different things — and this is what I mean when I talk about the unique human vulnerability to believing in systems we do not understand. And there was a recent study of all of these high-level CTOs at some of the biggest companies in the world. And they were asked, you know, how often are you relying on A.I.

to make important decisions? And they're like, all the time — 75% or something crazy. And then they were asked, how many of you can actually explain how the systems work? Almost nobody has any idea how these systems are arriving at the decisions they make. And to what extent are you worried about that? Not even slightly — less than a quarter of them, you know, care. So the problem, I think, is that in the case of the exhausted nurse who's going to be replaced by the system — excellent, that would be great if the pattern of history were that we then gave more time to that nurse to do her job better, to do his job better.

That's not how we tend to use this stuff. We cut that nursing staff down to two, right? And then we use those automated systems. So I think we have to build some values and emotions into how we make these decisions, as opposed to making them just for efficiency and better accuracy. Because, you know, it is going to be necessary.

And I don't want to take up too much more time here, but I will say there have been some instances in which we have done that as a country. So for instance, you know, backup cameras. There was a physician who accidentally backed over his child — a most terrible story — and he gave all of this congressional testimony about it.

And eventually, after a long battle, it is now the case that if you buy a new car in the United States, the car has to have a backup camera, and you're paying about $100 more per vehicle for it. And that is because about 60 kids a year were being run over, typically by their parents. The efficiency model — you know, the math — would suggest, well, that's not very many kids. But we as a country can agree that that is totally unacceptable.

We should solve that problem. And the senators, for whatever reason, got together — it was a bipartisan effort — and they made it happen. And today, you know, those numbers have dropped to almost nothing. So we can make decisions on the basis of things other than efficiency. And we're really going to have to, because the temptation to just do it because it saves you a few nurses is going to be the thing that drives the value proposition for so long, and we need to stop thinking about it that way.

I think it's so good. It brings up two memories of mine. One is President Obama's Precision Medicine Initiative, and the argument that I think really got President Obama over the line on it, and realizing why he wanted a data scientist to really run it.

The only way we're going to go after these long-term genetic issues that crop up — what we call n-of-1, these rare diseases — is if we make this a data problem, so that instead of just having hypotheses out there and going and building these very complex studies, we're able to go: wait, isn't that interesting? We're seeing this correlation of cancers here —

a population just having cancer — we should go study that. You know, the other one that comes to mind is early in my career, when I was working in national security, and I was talking to this general. We were building these detectors after 9/11 to basically tell us if somebody was bringing in something like a dirty bomb. And, you know, we had all this pretty sophisticated stuff — graphs and charts and everything.

And the general pulled me aside and said, Son, you ever worked with a Marine? I was like, No, sir. He said, The light is either red or green. And he just turned and walked away. Now, my apologies to my brother and my Marine friends.

But it raises this other point that you're talking about. And so I want to use this as a jumping-off point, because we're getting a number of really great questions here. Taking those two examples back to the issue of racism and the racial reckoning we're having — because we've got a couple of dimensions there that I'd love your take on. On one side, we have some communities that are saying, yeah, put license plate readers all over my neighborhood, because what do I have to hide? What do you have to hide?

But there's clear bias bending the moral arc there. We have people using facial recognition at protests. We also have the A.I.

systems — and who designed these? — and somebody has commented on one of these issues. Someone pointed out here: thank you so much for speaking on this issue. As a lawyer, I notice that most judges are irritated when the defense counsel even suggests that our systematic way of doing these things should be challenged, and yet the judiciary is starting to use bail calculators, as you talked about, that have been proven to be racist. So I bring up all of these things because it's such a wide area. How do you get your head around this? Yeah, yeah.

So there are so many dimensions to it, as you say. And thank you so much for that question, whoever sent it in. So the law is such an interesting one, because there is no better example of that slow-thinking brain, right? The slow-thinking brain in which we are policed by people we've never met, and we try to create these laws we all agree on. And, you know, our instinctive snake-detection systems were not capable of that. And so it is a credit to our species that we're able to do that.

My dad always makes the point that when we drive on the highway, the fact that we manage to keep our lane and not kill each other is just a miracle — and he's absolutely correct. So the fact that we can think this way in the law is astonishing. There's a judge named Tino Cuéllar, a very smart guy who was at Stanford and is now running an institute.

When I was doing a fellowship there, he said to me: you know, you could make the law so much more efficient — it would be possible to do it. He said, for instance, entering a guilty plea is such a pain. You've got to fill out all these forms, and you've got to do this and you can't do that — you've got to think it through. And we could make it a swipe-left thing. You could make it, you know, such an easy thing, he said.

But we have a principle that we call weak perfection, and it is the idea that you build a system to be intentionally awkward so that people have to think about it. Because with a guilty plea or a not-guilty plea, that's going to change your life. You don't get to take it back. You know, it's one of the biggest decisions you'll ever make. And so we cannot make it as simple as ordering from Grubhub.

We need to keep it hard. We need to do that so that people bring their best selves to it. So on this particular question — you know, judges don't like having something like systemic inequality brought up.

I mean, I have this whole section of the book that for me was the really mind-blowing one, which was getting into the world of online casinos — which, similarly, I've mentioned a few times here, because they're the big bugaboo for me. These absolutely cynical, predatory companies use the circuitry of addiction, and geofencing, and targeting, and, you know, distressed lines of credit, and all the other stuff you can find through data, to pinpoint the people they think are going to get addicted. And what's so interesting is that for years, I was talking to this one lawyer who has been trying to sue on behalf of these people who lose their life savings to these companies, like $4.99 at a time. And he was laughed out of court for years, because the judges would say: what are you talking about, losses? These people are suckers. And I've had many people in my reporting on this for NBC also say, you know, these people are suckers and they get whatever they deserve — they're playing a fool's game. But recently, they've started winning these cases.

This law firm has begun winning. They just got a $150 million settlement with one of these companies, because they were able to show, in the data, that these companies know exactly how addicted these people are and are finding them on that basis. And so I think if we can, as a society, get better at deploying the same pattern recognition systems that are being used to ensnare people to instead prove the existence of systemic racism — the way that redlining and systemic discrimination have made their way into all these other things, right? — I think we could get to a place where you could actually make a case in court. You know, I know people are very cynical about the role of trial attorneys and litigation in this stuff, but that's why we don't smoke cigarettes anymore, you know? And I think that suing the bejesus out of people is part of how we're going to get through this.

For me, I'm clinging to that one, because I think that's going to be a big part of this. So please make sure to get your questions in here in the chat. This raises another good one that was on my list of things to talk about, and I'm glad a number of people have brought it up also, which is misinformation — where we are as a country, with people going down into conspiracies and getting sucked into things, whether it's vaccinations or January 6th and the insurrection, and people's heads getting turned around. What's the role of technology in this? And, you know, one side of it is people saying: you know what, we have this culture in America of, if you're not tough enough, addiction is your fault. And then we have this other side of, you know, oh, it's not —

It's not them, right? Yeah. Yeah, yeah, that's right. You know, so I've been so lucky to be in touch with people I work alongside, and to have interviewed for this book people who really specialize in thinking not just about misinformation, which is itself such a problem, but also about the grift — which is how I've been taught to think about what's involved in misinformation: the small-time grifters who make money off of it.

So around January 6th, for instance, I spent that day monitoring all of the online streaming folks, typically on YouTube, who were streaming from the Capitol, pulling in the live feeds from people's phones — which, of course, have gotten so many of those people arrested. And at the top of the YouTube screen, if you have a certain number of subscribers, you're allowed to turn on what's called a Super Chat, which allows you to charge money for pinning somebody's comment to the top of the window for a few minutes, and you can set whatever price you want. YouTube, of course, takes a percentage of that — and it's 20 bucks, 50 bucks, you know, 100 bucks.

And I'm just watching people — bing, bing, bing — making, you know, a few thousand bucks a minute, live-streaming an insurrection against democracy at the Capitol. You cannot make it up, because you would not believe it if you saw it. So it is.

It is. Yes — Idiocracy, but not funny. Right. And so that profit incentive is a huge driver of this. When I was covering a lot of the Stop the Steal protest rallies for NBC, I would go see these people there who were streaming live. And it's so insane, because to a space alien, they might look like their job is the same as mine.

They've got lights, their hair is done, they're doing their makeup right, and then they go live. But these people are leading the chants. And when you look at their Instagram feed, or you look at their Super Chat on YouTube, you can see they're making money in that moment.

Now, I am also being paid to cover this, but I don't get paid more per comment. You know what I'm saying? Like — you have an industry behind you, ideally, called journalistic ethics. Yeah, that's right. That's right.

And I get fired if I make it up — if I lie, I get fired. So there are some guardrails around it. Anyway, for me, learning about the grift was really powerful. And then there's a very brilliant woman named Nandini Jammi, who runs something called Check My Ads, and she was the co-founder of Sleeping Giants — you may know both of those. She has been doing all of this research about the ways in which online advertising funds all of these very scary publishers of all kinds of scary stuff.

And she turned me on to the research that really set her on her path, which showed that there are all kinds of blacklist services that will spike certain published articles — basically make it such that advertisers who don't want to appear next to sensitive topics won't be advertising next to certain news articles. And she discovered — she and the researchers discovered — that it is people covering really important stuff who are being blacklisted, such that for some Pulitzer Prize-winning, brilliant people at the New York Times, for instance, no online advertising was appearing next to their work. So it's actually costing the New York Times money to run that kind of really important journalism. So when I think about misinformation, again, I'm thinking: OK, there is a system here — both dumb pattern recognition that nobody is equipped to question, and incentive structures — that is fueling this stuff. You know, I also blame, as much as the next person,

our tendency to just try to be tribal and crass and get attention. The attention economy is a really important part of it — all that stuff.

But there's some specific machinery in there, and I think we should start thinking about how we're going to take a hammer to it. I mean, one of the things I will highlight — because this is how much I enjoyed the book; here it is — I think it's in Chapter Two: you talk about the experiments on what happens when kids are just basically told they're on the green team versus the orange team, and what affiliation does as a powerful psychological motivator. And it gets me to one of these big questions that's in here.

The way I almost want to describe it is: you talk about the addiction of people who have too much time on their hands because technology is freeing them up — they can take drugs, get into a stupor, and detach from the world. The same thing happens with gambling. And you see a version of this with people who don't have alternatives to spend their time on — work or other things — getting into these forums where they get radicalized, not just here in the United States.

We see it around the world now. And it's almost a version of: we're using technology to free ourselves up from time, but then, you know, when it comes together — the note I wrote is, free time plus despair equals opportunity to take advantage of people, and technology accelerates it. Yeah, it does.

I'd love your reaction to that. Yeah. So, yeah — I absolutely hear what you're saying. I'm not sure that I blame free time as much as I blame despair in the equation that you have there. And I also think that social isolation is a huge part of that as well.

So one of the common threads — you know, in the book, there was a 14-year-old kid in Florida who lost his mom and was deeply isolated, who wound up going down this rabbit hole of, quote-unquote, race realism and all of this stuff, and wound up adhering to all kinds of white supremacist ideas. He's a Muslim, right? That's right, he was.

His parents were Bosnian Muslims who escaped genocide, and he nonetheless wound up down this rabbit hole and became, you know, somebody who believed in white supremacy. So that kid couldn't have been sadder or lonelier than he was. He was deeply looking for connection and was not able to find it. Another character, who fell prey to an online casino simulator, is also a deeply lonely and sad person. And here's the thing: what I'm starting to understand is that there are marketing mechanisms out there that find people who exhibit those conditions.

You know, I don't know about you, but for a lot of my pandemic — as soon as I turned in the book, anyway — I went hard at TikTok for a while and would doomscroll my way through hours of it. And here's what happens when you get to a certain point in TikTok: a video comes up that says, you've been scrolling really fast, you should slow down. There's a little warning that says you've been going too fast, slow down.

And then eventually it'll say, you've been looking at videos for quite a while — do you want to take a break? Meanwhile, every ad I get is for ADHD medication.

Right? And I'm sure anyone out there listening to this who's been on TikTok recently has gotten these ads as well — huge amounts of ADHD medication. Now, maybe everybody's getting that; maybe that's just a blanket kind of advertising campaign. I don't think so. I think that inside that company there is a pattern recognition system that says: this guy is exhibiting the classic signs of X, Y, and Z — serve him an ADHD ad.

You know, it is not just the affinities that we have, the hobbies we exhibit, and what we post about — it is the way we behave that is showing our inner state. And I think that we are being analyzed in that way. That is the loop, right? That's what's going to grab us, as they get better at noticing that I have ADHD — and I'm not actually a diagnosed ADHD person.

And there's a whole problem with advertising ADHD medication to people who have not been clinically diagnosed. But putting all that aside — they have spotted qualities in me and are feeding me information as a result. Basically, the way TikTok is for me, it's like doing drugs. I do it for a couple of months, and then I have to erase it off my phone, because I cannot control myself with that app. So, yeah, there's an inner state being analyzed here that I think is a huge part of this. And, you know, maybe it is extra time on our hands, but I don't know about you — like, half of Americans can't put together an extra $400 for an emergency right now.

I don't think time is our problem, you know. Well — I've tried to layer in a number of the questions. There are so many more that have come in, and they're amazing; I hope people will keep putting them in here. Maybe to capture a couple more of these — I might describe them as: are you an optimist or a pessimist? I know, I'm weird.

I mean, it's almost like: what do we do about this? What do we do, Jake? I know. Here's what we're going to do. First of all, I think we're going to need to look deep inside these companies and make them civilly, and maybe even criminally, liable for the ways in which they've tried to manipulate our behavior. I think it needs to start costing these companies money. Right now, human attention is treated as this kind of ephemeral thing — it's endless. But, you know, people smarter than me have been saying: no, no, no, it is like mining, and we need to regulate it.

Not that we do a great job of regulating mining, but we need to get into these companies, I think probably through lawsuits, and begin showing what they know and what they're doing. That, for me, is the first step. But I also think there needs to be a recognition — think of the Star Wars we're watching. You know, when I watch Star Wars these days, and Han Solo is being told the odds by C-3PO —

"Never tell me the odds," you know — and yes, I'm a little nerd, right — C-3PO is always saying, you know, "For Captain Solo, the chances of survival are 10,566 to one," right? And Han says, never tell me the odds. Listen to C-3PO! C-3PO should be the hero of that movie, because he's right — you should not do this. You know, our whole culture has been geared, since the 19th century, around this idea of rugged individualism. Growth at all costs is good.

We're going out to the West and, you know, pioneering our way to a better life, as opposed to thinking as a community about how we are going to support one another and what happens if it all goes wrong. And for me, a big part of that is going to have to be making it socially acceptable to say: here are my mental predilections. So for me — I tell anybody who wants to talk about it, like I am now — I no longer drink.

I think it's unfair to people who suffer from alcoholism to refer to myself as an alcoholic — I'm not sure I fall fully into that category — but I absolutely cannot drink. I've learned that about myself. And I have also learned, as a result, that when people say to me, hey, let's meet up and go to a bar —

I say to them: no, I would love to take a walk with you, I would love to do something, but I cannot go to a bar with you. I used to drink and I don't anymore, and that's going to mess me up, right? Being able to say, TikTok has got me. Right?

Being able to say, I'm having trouble with this thing — you know, making it socially acceptable to look at the odds, right, to listen to C-3PO — I think is going to be a really important thing. And then the last thing is, I think we need to stop letting culture — the modern culture, as it's being dictated by some of the biggest companies — tell us our norms. So for me, right now, I'm in the process, at the school that my children are at, of creating a pact with all the parents in the grades that we are in, to not give our children personal smartphones until they enter high school at the very earliest. And I can't tell you how complicated that conversation is.

It's a very hard thing to have that conversation, because it involves admitting to your own difficult relationship with smartphones. You've got to sort of admit, as a parent, that you don't have any idea what your kid is really doing with them and what that might be — and that you may not even know your kid, fundamentally, at all. You know, it's a really hard conversation. But we have managed to get through it. And in fact, I'm on the hook right now for being the guy who's supposed to write up the new, revised pledge after a huge amount of really smart input.

It makes my palms sweat to realize that I'm on the hook for that right now. But, you know, it's going to require communities coming together and saying: nope, nope, I'm not going to do that. Because, you know, the statistics show that the vast majority of parents get their cues about what's an appropriate use of technology from the ads for technology — from the cutesy Alexa ads in which the kid and the dog, you know, trigger Alexa by accident and it's adorable. You know, they're normalizing behavior that we have not actually signed off on.

And I think we need to start coming up with some civic structures for saying no. And we're too quick to say, oh, don't be a Luddite, right? As if that's some sort of terrible thing. You know, if you read up on the Luddites, they're a pretty interesting group.

That's pretty interesting. You know, and I'm not saying we need to kick it all out of our lives. I love being here with you tonight, DJ, in this way — this is an incredible empowerment of our slow-thinking brain, what you and I are doing right now. Fantastic. But we have to recognize the profit motive, the power, the way it's going to feel inexorable as pattern recognition systems make their way into our lives.

2022-02-01 19:15
