What can Mary Shelley's classic, Frankenstein, teach us today?
Welcome, everybody. I'm Jennifer Widom, the Dean of the School of Engineering. We have in the audience today graduate and undergraduate alumni, and we have members of the community. I also want to mention that we are live-streaming to thousands of alumni and friends all over the world. The School of Engineering has been co-hosting a series that we call Intersections. It brings together faculty from the School of Engineering and from Humanities and Sciences to share insights on a common theme or idea, and tonight is the third in that series. We have the series because we recognize that the world's most challenging and complex problems need to be addressed not by individuals working by themselves, but by people working together and talking together across disciplines. We also value an understanding of the relationship, the very important relationship, between humanities and technology, especially today and, I would say, urgently into the future. And the themes in the book being discussed this evening are an incredibly dramatic example of why having that understanding is so critical. I'm really pleased that Persis Drell will be the guest tonight. Persis is a colleague, she's a friend, she's the Provost at Stanford and the former dean of Stanford Engineering. She's a scientist, and she's someone who's thought very deeply about what we like to call the humanist engineer.

"Don't you see the danger, John, inherent in what you're doing here? Genetic power is the most awesome force the planet's ever seen, but you wield it like a kid that's found his dad's gun." "Dr. Waldman, I learned a great deal from you at the university." "I'm sorry, Dave. I'm afraid I can't do that." "I'll tell you the problem with the scientific power that you're using here: you read what others had done and you took the next step. You didn't earn the knowledge for yourselves, so you don't take any responsibility for it." "I've done things that nobody has ever done before." "Your scientists were so preoccupied with whether or not they could that they didn't stop to think if they should."

Please welcome your Philosophy Talk hosts to the stage, Professors Ken Taylor and Joshua Landy.

"Holding hands at midnight, 'neath the starry sky... nice work if you can get it, and you can get it if you try..."

Will new technologies like artificial intelligence and bioengineering be the salvation of humankind, or will they destroy our bodies, our democratic institutions, and even our planet? And who is going to control the technologies of the future? This is Philosophy Talk, the program that questions everything... except your intelligence. I'm Josh Landy. And I'm Ken Taylor. We're coming to you from CEMEX Auditorium on the Stanford campus, continuing conversations that begin at Philosophers' Corner, where Ken teaches philosophy and I direct the Philosophy and Literature initiative.
Welcome, everyone, to Philosophy Talk, and let's hear it for our musical guests, the Tiffany Austin Trio. Today we're thinking about monstrous technologies as part of Stanford University's Frankenstein at 200 project. Monstrous technologies? That's a strong word, Ken. Oh gosh, come on. Look, I love my iPhone, but you've got to admit smartphones are causing an epidemic of distraction and insomnia and depression. That seems pretty monstrous to me. And that's just techno-panic, Ken. Look, people are always freaking out about the latest technological invention, like the printing press, the mechanized loom, newspapers, electricity. It always turns out there was never a thing to worry about. Oh, come on. Tell that to the inhabitants of Chernobyl or Fukushima, or the victims of asbestos poisoning, or the thalidomide babies. You're the big literary guy; it's just like Mary Shelley says in the novel we're thinking about, Frankenstein: technology can be deadly, very deadly. Look, you've watched too many Frankenstein movies, Ken. Look at the novel. The novel's a lot more subtle and sophisticated than you or Hollywood are making it out to be. That novel isn't just some Luddite screed against the horrors of technology. It's a philosophical investigation into personal identity, it's a brilliant experiment with literary form, it's an exploration of deeply buried antisocial impulses. No, you're forgetting the main thing, Josh. It's also about a technological marvel that runs around killing people. All right, fair point. Touché. But remember that great scene in the novel where the creature learns about language? He calls language a godlike science. And writing: writing, he says, opens up a field for wonder and delight. You're waxing so poetic. Big deal. Well, it is a big deal. Writing is a technology, and it's among the greatest technologies ever invented. The novel celebrates that kind of technology, and we should too. Oh, look, Josh, I love writing too. That's why I spend hours and hours writing myself. It's not just because I have writer's block; it's because I love writing. I even love other people's writing. I love reading your writing, Josh. But you know, even a technology as glorious and as powerful as writing has its downside. Look: no writing, no Mein Kampf; no Mein Kampf, no World War Two. QED: technology is bad. Godwin's Law, Ken. Look, it's not the technology of writing that was responsible for Mein Kampf, it's the guy who wrote it, and it's the people who read it and believed it. You can't blame technology for what people do with the technology; you have to blame the people. But you're missing my point, Josh. Technology is often designed, explicitly designed, to exploit human weaknesses. Why do you use your iPhone so much? Because it's a drug, Josh. And so is social media. And social media, this addictive drug, is driving people to suicide and it's ruining our democracy. Sounds like you don't trust people very much to handle their own technology. Okay, what do you want, the government to intervene? That's paternalism. Look, if I want to waste my time watching cat videos on Facebook, that's my business. No, Josh, it's not just your business. My life is impacted by your choices. Our lives are impacted by your choices. Facebook and its addicted
adult users are destroying our democracy. My democracy! Okay, I'm not going to defend Facebook, that's for sure. But the question still remains: how can we prevent all the negative outcomes, the monstrous outcomes, without losing all the benefits? And how can we have the good outcomes without resorting
to paternalism and stifling individual freedom? That's a good question. Good question. But I think the answer is kind of obvious. Technology producers and designers have to take on some of the responsibility. They have to do a better job of predicting and compensating for the effects of their inventions. And they have to care. They have to care about more than just creating cool, fun gadgets and making a bunch of money. Josh? Yeah, and pigs have to fly. Well, what if producers refuse to regulate themselves? Well, then the rest of us will just have to make them an offer they can't refuse. What? Don't tempt me. I'm talking about changing the incentive structures. Okay, so taxes, regulation, or social shaming, maybe. Well, look, I agree we've got to do something. If we don't do anything, nightmare scenarios may well be around the corner. I agree with you that we're going to start seeing things like on Black Mirror. Right, that's a great TV show. But how much of that show is pure fiction, and how much of it is future reality? Well, as it happens, Ken, we asked our roving philosophical reporter, Liza Veale, to find out. She files this report.

So, if you've seen the show Black Mirror, you've wondered: how crazy is this stuff? Total fantasy? Fifteen years away? Ten years away? The interpersonal rating system in this episode seems familiar. "You want a cookie with that? On the house." "That sounds awesome." "Saw your boy in the fire there just now. He's really something." Let me introduce someone who takes the technology in Black Mirror very seriously: Dylan Hendricks. He works at the Institute for the Future, the oldest foresight institution in the country. They don't predict the future, but they anticipate it. I didn't know this kind of thing existed, but I'm glad it does. "We can only achieve the future outcomes that we can envision, and so if we can imagine more possibilities, then we have more choice in where we go." Hendricks says Black Mirror is, forgive the tongue-twister, the first futurist fiction for futurists. "It is something that internally, within our culture, everybody watches Black Mirror and has opinions about it, things we like and dislike, and we'll pick it apart. But we like that it's worth picking apart." He says the show has gotten people thinking about the implications of the technology we're beginning to use. Though it's set in the future, it's grounded in very real existing possibilities and liabilities. So how far are we from the futures depicted in Black Mirror? That's what Dylan Hendricks is going to talk about. In some cases, not far at all. This episode is pretty straightforward: it's basically forty minutes of this robot chasing this woman. Hendricks says this robot is based on an existing one developed by a company called Boston Dynamics: quadrupeds with machine-learning capabilities, similar to artificial intelligence. "They actually learn by being in the world, by actual feedback and experience of the world, so they're learning how to open doors, how to recognize objects, how to navigate terrain." Hendricks says it won't be long before this technology is used by the military, but it also won't be expensive to manufacture for personal, private consumers. "That is something that we are going to have to deal with in our lifetimes, very likely,
this idea of guard robots that are so capable that they're terrifying." So that's an example of something that's not at all far-fetched. Then there's "Arkangel." In this episode, a brain implant allows a mother to monitor her child: her location, her vital signs. The mother can see the world through her daughter's eyes; she can even censor what the child sees. Hendricks says this chip has a lot of realistic elements and some fantastical ones. "Obviously, already today, as soon as a child has a phone, the parent has a choice: do I just track my kid all the time? Because you've given them a GPS tracker, basically. And so the temptation of parents to be able to know everything a child is going through at every moment is more or less a realistic, even present-day, choice for parents." As a society, we have the technology to surveil and control our children more than we ever did. So we face a question: are our children allowed to make mistakes? Isn't that a part of growing up? And if you stop them from making mistakes, will they make bigger mistakes later? The way the mother sees through her daughter's eyes, Hendricks says, is the most far-fetched
aspect. But it's not crazy. "I mean, it's not impossible, actually. I will say that there have been studies done of reconstituting memory images from people's brains under CT scans. So the idea that we could eventually capture images directly off of the optical nerve and translate them and broadcast them, that's not insane to think that we could do that at some point." In the episode "Be Right Back," a woman uses a service that recreates the consciousness of her deceased partner based on the digital communications he made while living. Then that consciousness is imbued into a very lifelike physical form. In a way, he is brought back from the dead. "You could have left me some clothes. I mean, talk about an undignified entrance." "That's the creepy part, what you're doing." "At least give me a towel, I'm dripping everywhere." "Hello." "So when it comes to the idea of creating and imbuing machines with consciousness, for me this is always kind of a non-starter, because there is actually nothing close to an operational theory of consciousness possessed by anybody in the scientific community, right? Everything around consciousness is philosophical." Hendricks says that without a theoretical understanding of what consciousness is, there's no way to theorize a path for creating it. We're stuck. If we get any closer, it will be the kind of discovery on the level of discovering fire, something that changes everything. But what we do have is machine learning. Computers can cull enormous quantities of data and learn from them, so they can perform in ways that we don't program them to but that they've taught themselves to. The effect can appear to us as consciousness, but there's kind of a big difference there. It's a recurring paranoia in Black Mirror that machines or entities will be able to hold consciousness, ours or their own. It comes up in the episode "USS Callister." It's about a virtual-reality world that players can control and design. But, and this is a minor spoiler, the people in the simulations are actually sentient, and because this guy's a jerk and he's abusing them, the abuse is real. "All right, let's go. Check security protocols." "They've expired." "You're in trouble." "Yes, captain." "Dudani, recheck those probe results. No room for errors." "Of course, captain." "Packer, captain? Vanilla latte, skim milk." "At once." "Walton: exit game." But like I said, the question of creating consciousness in virtual spaces or for machines is not so important for Dylan Hendricks. "The viable part is this idea that as virtual-reality and mixed-reality technologies become more mainstream and accessible, which is sort of inevitable, because we've already reached a turning point where they're very compelling,
people will want to spend more time in simulated environments." Hendricks is an optimist. The main bone he has to pick with Black Mirror is how terrifyingly dystopian it is. As a futurist, "there's a strong desire for us to have more identification of what the positive futures are, right? What are the things where we use technology to actually solve problems in a real way?" So when it comes to virtual reality, Hendricks doesn't just see doomsday scenarios; he sees opportunities. Here's one. He says if we can't stop humans from doing bad things to each other in real life, then maybe we can channel that antisocial behavior into virtual reality. Here's an example that might be a little hard to swallow. "If sexual assault in the real world went down because of these kinds of simulations existing, is that simulation not then kind of a public good? Right? Like, if it turns out we can't fully deter a behavior, but we could channel it into something where it's less destructive to real people's lives." Hendricks says this question will only become more pressing, and it doesn't just apply to VR. Does this technology have more potential to encourage impulses toward antisocial behavior, or to mitigate the consequences of it? Hendricks is asking these questions. So are parents. So are some technologists. But who has the final say? For Philosophy Talk, I'm Liza Veale.

Thanks, Liza, for that tour of the dystopian possibilities for the future. I'm Ken Taylor, along with my Stanford colleague Josh Landy. We're coming to you from CEMEX Auditorium on the Stanford campus as part of the university's Frankenstein at 200 project. Our guest today is a physicist and former dean of the School of Engineering here at Stanford who recently became the thirteenth University Provost. Please welcome to the Philosophy Talk stage Persis Drell.

So, Persis, Josh and I were talking earlier about potentially dangerous, even monstrous, technologies. I know that's been a topic you've been interested in for a while. When did you first get interested in these kinds of questions? Well, I think the right answer to that is I grew up with it. My father was a theoretical physicist who spent part of his life pursuing the dream of understanding the natural world, and then spent a lot of his life attempting to preserve the world from the horrors of nuclear war, as an arms controller. And so it was in the house from when I grew up. So where did that leave you? Ken and I were arguing earlier about whether we should be optimists or pessimists. I mean, the threat of nuclear war, something is on the horizon. Do you think that there is now, or there is coming up, some Victor Frankenstein type who's about to unleash something really deadly on the world, or do you think basically we're going to be okay? Oh, I think there are all these engineering students out there who are working on things that could unleash something terrible on the world, but it could also be something that's wonderful for the world. That's the way technology and discovery work. Go ahead. Well, I'm just optimistic that the good will win. I mean, I want to take the world at large, because we can be focused on the American context. You know, if Hitler had won that war, if Stalin had not... You know, Stalin did prevail for a long time, and he visited
horror upon horror. Technology in the wrong hands; some terrorist gets a dirty bomb. I mean, looking at the world in total, what controls technology in the world, taken in total?
People. In the end, it has to be people taking responsibility for the technology that they create. The technology is going to be invented no matter what; you can't stop it. Can we keep... Germ weapons, nuclear arms, chemical weapons are all over the world; there are the cheap dictator's weapons of mass destruction. Can we keep problematic technology out of the hands of all problematic people? We can never do it perfectly, but we've had nuclear weapons since 1945, and they haven't been used since 1945. It's a great example. We have biological weapons that we so far have controlled. Does that mean we can stop working at it? Absolutely not. But you have reason to be optimistic. And on global warming and all this kind of stuff, you look out at the world and you don't look at it like those folks on Black Mirror? Well, I've never watched Black Mirror, so I won't speak about Black Mirror. Your world is more a Star Trek world, probably. But if I wasn't optimistic, where would that leave me? Well, maybe vigilant. Right, so even being vigilant... Okay, so what about things that are on the rise, things like, for example, video fabrication technology? It seems like, if I understand correctly, we may be on the verge of people being able to make you be saying, on video, anything they want you to. So who's going to control that kind of thing? Well, it's a really good question. Really good question. And more really good questions from our audience after this short break. This is Philosophy Talk, coming to you from CEMEX Auditorium on the Stanford campus. Our guest is the Provost of Stanford University, Persis Drell. In our next segment we're going to talk about how we can balance exciting innovation against social responsibility. How do we get the upside without the downside? Invention, attention, and prevention, along with questions from our technologically savvy audience, when Philosophy Talk continues.

"It's close to midnight, and something evil's lurking in the dark. Under the moonlight you see a sight that almost stops your heart. You try to scream, but terror takes the sound before you make it. You start to freeze as horror looks you right between the eyes. You're paralyzed. 'Cause this is thriller... Night creatures crawl, and the dead start to walk in their masquerade. There's no escaping the jaws of the alien this time. There's no second chance against the thing with forty eyes... thriller..."
As far as it could into commercial applications, because, they recognized there were potential dangers, and threats out yeah yeah, I mean you really are an optimistic. Down. River. Right. I just wouldn't but I got, if you say I'm much more of a downer I just, I'm not sure I believe in, the capacity of capitalist. Production of technology, to, always regulate, technology, for the human, good and here's why because the decisions, are made kind of locally. That, is I'm, going to automate and make my company more efficient, I'm gonna lay off workers. I don't, think about the aggregate effect of that I think about my competitive, advantage this person competing, with me and it's not just within this economy, it's around the world if I don't do this so the pressures, are all generated, bottom, up locally, but then they aggregate into something, a mess, right, we saw this in space with with with Facebook, and. Twitter I mean the way in which these social, media you. Know their their, profit. Is. Driven by clicks. And, like chairs, and those are driven mostly by controversial. News stories and so so conspiracy, theories and fake news well.
they're good for the bottom line. So I think what you're bringing up is a really good point, which is that with the two examples I gave, the threat was evident early, right? You knew there was a really serious threat out there. The threats from social media and machine learning, and they are very real, and we've seen them playing out, those we didn't realize were dangerous, and so now we're in a somewhat different situation. I would also say another stark difference, in my view, is that in those cases, again, let me use nuclear weapons or recombinant DNA, there were leaders of the field who stepped up and led. Where are the leaders now? Well, right, but that's... I mean, I think the threat from nuclear weapons is, you're right, it's dark, because people used to think about limited nuclear war and all that sort of stuff, and then all these studies came out about nuclear winter, and it's like, oh my god, we can't just trust this to the Soviets and the Americans fighting it out; the whole world has a stake in it. And I think global warming is the same way, but there's a who-goes-first problem. And the Paris Accords were cool, but look what our current president did, right? But still, there's a who-goes-first problem: I could free-ride off the rest of the world doing this. So I just think we don't really have mechanisms that force the balancing of innovation and responsibility. I just think we have precious few mechanisms. Well, in the case of global warming, I think capitalism will come to our aid when the economy starts to tank because of the effects. But I'd still go back to the internet and say the challenge there is, I think, that the recognition of the problem is now real, but I'm not yet quite seeing where the leadership is going to come from. If the field itself does not take some leadership, I think that's when regulation comes in, in a very... I could agree with you now, Ken, on this one. Yeah, when you see what's actually coming out of Facebook now, it seems as though either they're genuinely in denial or they're in some kind of... Well, wait a minute, wait a minute. Again, there is a logic of capitalism, and we are capitalist production, and markets are cool things; I don't want to deny that markets are cool things. And I think Facebook's business model depends on something important: that they're a platform, not a publisher, right? And their being a platform allows them to say, let all comers come. And we like that. We like that there's freedom there, accessibility.
If we force them to be a publisher, and to take on all the liabilities and obligations that a publisher has, well, I actually don't know if they survive economically. Right, right. So I think this is not simple. How was that decision made? How was that decision made, that they were a platform and not a publisher? Because they're good capitalists, right? What's happening to the publishers? They're getting hammered by this new technology, right? And if they take on this certifying and verifying and distributing and being liable and all that stuff... Would they have gotten all those investors who invested in them, and there might be some of them in the audience, if they had said, our business model is we're going to be a publisher? Maybe. But meanwhile, if we're relying on people within those industries to do self-regulation, it seems to me we're just putting the foxes in charge of the henhouse. You're giving them competing interests. Right, exactly. So the incentive structure seems to be wrong. And I don't know where that leaves us, Persis. I mean, I love it, you're an optimist even about global warming; I think you're a little more of an optimist than I am. Maybe capitalism will kick in after everything is too late. But anyway. So what do you think? Is there a way that we could, maybe without regulation, so change the incentive structure, at least at the level of the social, to make it uncool to destroy the planet for cash, stuff like that? Are we talking about global warming or the internet? I need to know which problem I'm talking about. Whichever one you can be optimistic about. Well, they're really very, very different. On the issue of the internet, I think the threat has become visible in recent years, and I actually see evidence that the thought leaders are starting to think about how to address this. For example, I'm seeing a lot more discussion, not my field of expertise, but let me just put this out there, around, say, the subject of machine learning and the dangers of machine learning than we heard even five or ten years ago around new social media platforms. Now, that just might be because the threats of machine learning are more obvious, but I actually think it might be that the thought leaders are starting to have much more of a sense of responsibility. The culture of Silicon Valley has been, as I think was articulated in a New York Times article quite recently, build it and ask for forgiveness later. I think we're starting to move away from that, and so I see, again, optimist that I am, the beginnings of that development of social responsibility. We see that among our students here as well. Right, so I want to ask you about the students in the next segment, but I want to go back to the thought leaders and the innovators. I mean, I grew up believing that science and innovation are good, right? But you said: innovate, build it, then worry about the consequences. Do you believe we should ever restrain technology and science from advancing?
Don't go there, don't go there. I don't think that works. I think certainly at the basic discovery stage, the scientific discovery stage, you don't know enough. And then even when you're going into the technology stage, I mean, you may discover a biological weapon; you may discover limited nuclear weapons and decide we're not going to build limited nuclear weapons. But I have to know what could be done there, because somebody else, that bad guy on the other side of the field, might not have had the same moral sense, and I have to be able to defend myself. And science is about dissemination. If we're talking science, right, in universities, for example, and we're not talking private industry, we don't keep our discoveries secret. We disseminate them; we publish them to the world. I mean, I suspect,
tell me if I'm right about this, that probably every college physics student knows how to build a nuclear bomb. No, no, it's actually not that easy. But they know what has to be... They could figure it out... They all know the basic physics, right. The technology to make it, to really be ready to make it in miniature, and to assure criticality and so forth, is hard. But I am told that if you search hard enough on the internet, you can find out how to make the smaller ones. So you control the fissile material. The Iranians that Trump is worried about, and the world is worried about, I mean, they're smart enough. Oh, sure, they're technologically advanced enough that if they set their mind to it... Now, the question is whether we would put a stop to it, but they're technologically advanced enough to do this. Right, but you can't take this knowledge and put it back in some bottle. Right. And that was why so many people worked so hard on the non-proliferation treaty. And that's where the focus now is: it's on proliferation, not so much on mutual assured destruction and worrying about the Soviet Union, which doesn't exist anymore. So it sounds like there are actually, in a way, stages here. There's the stage of invention, and I agree with you, I think we can't stop that. But then there's the stage of recognizing that something is potentially dangerous, and maybe we should be encouraging people to do that. And then, potentially, down the line, there are interventions that we can use: regulations or social models. Right, and I think that's a wonderful separation, Josh. And it's at that middle stage where it's absolutely critical for the technologists themselves to have a sense of moral and social responsibility toward what they see themselves developing. That is the critical moment, because if it comes later, and it's regulation, it's not so good. You're listening to Philosophy Talk. We're talking about monstrous
technologies, in front of a live audience on the Stanford campus, with our guest Persis Drell, and we've got questions from that live audience. I'll go from one side of the room to the other. I'll start with you. Welcome, step forward, tell us your name and where you're from. Don't tell us your last name, just your first name, because there are crazy people out there. Good evening, my name is Josh. I did my master's here at Stanford. Thank you for having me here. Very quickly, my question is this: you made a very good point about not being able to stifle or stop or slow down the invention phase of technology, but what about the aspect of influencing it? Because there's a lot of discussion around how, for example, all these things like Twitter, Facebook, Google were built by a select few of the world, and how that's affected the way those technologies have shaped our world. For example, Twitter is a great place where people are harassed and bullied all the time, and people have discussed how it has implications for people's lives, both positive and negative. What are your thoughts on how we can change that? How can we add more diversity, both in terms of the people who are inventing those technologies and in terms of the ideas that we use when we do that? Good question. So, Twitter, again, is a tool that has marvelous benefits, and it is used in really negative ways. Something I ponder a lot is whether there is a way of ensuring at least some accountability on Twitter. And my understanding is that many of these companies actually have rules that they don't enforce themselves, about fake accounts and so forth. So that is a place where, I believe, Twitter, let's pick on them, should actually enforce a little more accountability in the use of its platform. And along with that, it would be great for society to realize that just because speech is protected doesn't mean it's actually appropriate all the time. So what do you think about the following thought? I think in America we have, and I think this is really connected to technology too, although it may not sound like it, the wrong model of a corporation. We have a shareholder model of a corporation: the corporation is supposed to serve its shareholders. I think we need to advance to a stakeholder model, where the corporation should serve its stakeholders, and the stakeholders are lots and lots of people, so that the interests of lots and lots of people are somehow brought to bear. I know there's a debate about this in economic theory and all that sort of stuff, but it seems to me that until we
make the corporations accountable in a broader way, either we're going to have the heavy hand of government or something. I mean, what do you think of that? I'm a physicist. You're a philosopher. Redesigning the US economy as we speak. I don't know. I'm Carl; I was at MIT for a long time, but now I'm here at Stanford, and it has its advantages here, you know. But anyway, we're building the Internet of traitorous things, or devices. Traitorous: it works against the interests of its users. The biggest companies in the valley think that within ten years the majority of the people in this room will be wearing holo glasses. What are we going to do when everything that you do, see, and say can be recorded and owned by somebody else? Stay home and never go out? You don't have to wear the darn glasses. You don't have to give all your information away. Luddites aren't going to make it. Do you have a cell phone? Yes. Oh, you're in. When you responded to the last set of questions, you said, we're philosophers, you're a physicist, here we are redesigning the economy. But this is part of my point, and I wonder what you think about this. We all have to be in this making and remaking of the world together, and I don't want to train our students to say, hey, look, I'm just an engineer, I'm just a physicist. I totally agree with you. Okay, so how do we do that? How do we get this conversation to be a broad conversation involving all these people? Because it seems to me that's what we need. We need the physicist sitting with the philosopher sitting with the politician sitting with the journalist, right? And the economists, let's not forget the economists; they actually know what they're doing in this case. But there are some models. I mean, hospitals have ethics boards, and we're responsible for there being review boards, right, for experiments on human subjects, which came about because of some experiments that took place, like the Prison Experiment. Yes, the IRB, yes. Right, so it's not as if everything's the Wild West and we just have to throw up our hands. It seems to me we could shift; we could decide as a society that we want to shift some of these areas of technological innovation more in the direction of things that have a little bit of inbuilt policing and responsibility. Yes, we could, and it wouldn't be that hard to do. The first thing, though, is to acknowledge the threat, and I think that's really what's just happening. So I agree with where you're going, but I would point out that it probably is a little unrealistic to think it would all be in place now, because I just don't think that until a few years ago we recognized the magnitude of the threat. Welcome to Philosophy Talk, ma'am. What's your comment or question? Who are you, and what do you ponder? Hi, my name is Victoria, and I'm currently an undergraduate student on campus. So I guess my question is philosophical, perhaps, in nature. Structurally, the rise of artificial intelligence, or by extension the aspiration to develop it and research it, implies the displacement of conventional workforces, right? The ability of computers to start taking on complicated tasks goes beyond even the monotonous ones. We've seen cases where they're starting to take on diagnosing
diseases, or producing legal documents, or even creating poems, et cetera. So in that case, of course, these abilities get magnified, and the workers who are most vulnerable in society are harmed most. So do you think there's a moral responsibility of companies, or people who participate in AI development, to compensate for that in the form of taxation or otherwise? And how do we go about thinking about that conceptually? Do you have a view, or do you want to punt this one? Well, I do, but I actually would love to hear from the philosopher first. Well, I think this is a really hard question, and I think it's a huge question, because, speaking of the economists, there's a disagreement about this. People used to believe that in the old days technology destroyed jobs but produced compensating jobs. That's actually not as true as you might think; it's a complicated thing. But some people believe the day is coming when technology is just a net destroyer of the demand for human labor, and that we could see that in the next ten or fifteen years. We could see the demand for human labor decrease by forty percent; by the next century we could see the demand for human labor almost go away. How do we live in such a world? Of course there's a moral responsibility, but who does it fall on? It's really hard. I mean, that is among the hardest questions we face. I'm not saying it's an easy question, but I think we can apply similar principles here to the ones we apply elsewhere and just say: look, if there is something that a reasonable person or set of people could predict, then you should be setting about trying to predict it, and if you're not even trying, then I think you can be held liable for that. There are a lot of questions, but we've got to take a break. We'll start the next segment with a bunch of questions after some more music. But I remind you, you're listening to Philosophy Talk. We're coming to you from CEMEX Auditorium on the Stanford campus as part of the university's Frankenstein at 200 project. We're thinking about monstrous technologies with Persis Drell, Stanford's new Provost. In our final segment: how Persis thinks we should train the engineers of the future. Educating
for responsibility, plus more questions from our audience, when Philosophy Talk continues.

"The monster from his slab began to rise... it was a graveyard smash..."

Thanks once again to our live musical guests, the Tiffany Austin Trio. I'm Josh Landy, and this is Philosophy Talk, the program that questions everything except your intelligence. And I'm Ken Taylor. We're thinking about monstrous technologies with Persis Drell from Stanford University. So we've got a whole bunch of questions. Let's start with some. I think I was on this side. Welcome to Philosophy Talk. Hi, I'm Sarah, I'm an undergraduate here at Stanford. We live in a world in which there are huge socioeconomic disparities, and as newer technologies become available, they're often only available to those of higher socioeconomic status. So I was wondering what y'all's opinions were on how inventors and companies can ensure that those disparities don't become so big that they're unable to be overcome. A presupposition of your question is that it is the responsibility of the technologists themselves. Do you share that presupposition, or is that a broader societal responsibility? Well, I would like to say that I think broader society should take responsibility. I like it when the technologists take responsibility too, but I would also like to point out that in some ways certain technologies have been incredibly democratizing. So it has cut both ways. But ultimately, for me, and this probably reveals a certain amount about my political persuasions, I do believe society should be taking responsibility to ensure that it is available broadly. So how do we do that? Well, you're not a politician... We make you philosopher-king. What about something you said, though? You said technologies that are democratizing... Sometimes technologies that look democratizing... So the internet is supposed to be a great democratizing technology, right? Sometimes technologies that are democratizing just break down the public square, because they substitute noise for knowledge. One of the things that the old authorities did was certify stuff as legitimate, as knowable, as worth paying attention to; now everybody has access. But let me give a very specific example: theoretical physics. It used to be that if you wanted to do theoretical physics, you had to be in one of those pillars, like Princeton or Stanford or Harvard or Oxford. And if you wanted to learn about the hottest, latest thing in theoretical physics, you had to write away with a little postcard for a preprint. And now theoretical physics innovation comes from all across the world. It's been phenomenal. That's true. That's the upside. Right, I'm the optimist. What's your comment or question? I'm Wade from Portland, Oregon, and as an aspiring engineer in a very large organization, how do you feel one should navigate these considerations when you're just a tiny cog in a maybe much greater machine? There you go, Persis. Wow. That's real. Know your moral compass, no matter what level you are in the organization, and hold yourself accountable to it, and it will guide you well. Okay, those are inspiring words, and this brings me to asking you: are these words enough to inspire the next generation of budding
engineers? We've got a room including some current students here. So what do we do to try and make sure that the next generation are going to be helping the world, rather than being new Victor Frankensteins, creating
without the proper vigilance? So, I am a huge believer that engineers need to be educated not just in engineering; they need to be educated broadly, because they need to care about the impacts of the technologies that they're going to be involved in inventing. They don't get that by taking more physics classes or more math classes or more engineering classes. They get that by taking a philosophy course, or they get that by taking a literature course and being forced to think through the impacts of what they're doing, or, if they're really more directly interested, by taking a social science course. But I think that educating engineers to be engineers only is criminal. I totally agree with you. I say to students, and I put it starkly, and they sometimes gasp, I say: you know, Hitler had his technologists. Stalin had his technologists. It's not enough to be a technologist. If you're just a technologist, you're fit to be a tool of some broader social thing. But is that what you want? Do you want to just be a tool? And students are sometimes taken aback when I say this. But that brings us to: how do we make them not just be tools? How do we educate them to be technology leaders and thinkers? I know you would like them to take, say, a philosophy course, but you're not going to force them to take a philosophy course. You're not going to require them to do that. Well, I do believe that if you require things, people do them because they're required, but if they don't come to subjects willingly, they're not going to absorb and learn them and internalize them, and then it's just a waste of time. So they have to come willingly. At Stanford, and most other institutions, we have gentle ways of encouraging people to get breadth. They could be a little less gentle in some ways; they could be a little more prescriptive. Here I think university education is, and I do think it is, in a great crisis. I believe we're in a crisis state, and I think there are two sources of the crisis, but we're focused on one source. I think we have become too focused on imparting
to our students a narrow, technocratic education, right? And partly our students demand that of us, because of their parents, or because we have this silly reputation, which is undeserved, as a get-rich university and all that sort of stuff, and so they chase the brass ring and the hot job and all that. And I think we need to address this, and I don't think this is a small thing; I think it's a huge thing. Okay, but there's another piece of it. So I do think we have students majoring in technical subjects, computer science, whatever, for the wrong reasons, and helping them choose the subject they want to major in for the right reasons is obviously part of our responsibility, and we could be doing it better. But I also think, and here I'm going to speak as somebody who was only in the School of Engineering for two and a half years, I'm not an engineer, I've never taken an engineering course, that what engineering did, and is doing, which is really very impressive, is that they actually think a lot about not just what they need to impart to the students, but about what the students want to learn and how they want to learn it. And that focus has helped some of the majors, and CS is one of them, be incredibly attractive, with really good on-ramps. I think other subjects, certainly my own subject, could learn a few lessons from that. Welcome to Philosophy Talk, sir. What's your comment or question? Okay, I want to defend some economists now. I teach economics; my name is Mark, and I teach economics, business, and computer information systems. One of the things I want to presuppose is that economic growth is good. First of all, can we agree upon that? Okay, I think it's good. One of the leading sources, and this is just principles-level economics, I'm quoting here Robert Hall and John Taylor, because that's the textbook that we use for principles of economics, says that most of the growth that's taken place in the last fifty years in the modern world, in the first world, has been due to technological increases, so increases in productivity, not increases due to more people working or working harder. And economic growth also brings about good things. We have longer life expectancy, and you can't just say people live longer: what happens when people live longer is that the useful life is a lot longer. I think fifty is probably the new thirty now. I like things like that. So, as an economist, which side are you on? You mentioned productivity, yeah: one person producing more, more production per capita, or something like that, right? But what do you think about this debate over whether technology is going to diminish the net demand for human labor? Exactly my next point. What am I going to tell my macroeconomics students when we're talking about economic growth? Because that's a large focus of a macroeconomics class. Where are people migrating to in the world? Are they migrating toward the robots or away from the robots? Are they migrating to where the factories are, where the computers are, or away from them? So we see that, at this point in human history, we
probably have more people working as a percentage of the population, and especially adult-age people, with opportunities, and you have to say that there are more people being employed. One thing you have to take a look at is that people will be displaced. So to a certain degree technology displaces people, but those are often the people who don't have the education to adjust. So you have to be malleable. And to a certain degree machines, or robots, are substitutes for humans, but to a greater degree they're complements: machines and people work together, and that's why you see increasing growth. Yeah, okay. So that's my question: why is that bad, then? If we're having technology and it's increasing growth and good things are happening, why is that bad? Well,
I'm no economist, I'm no futurist. The fear is this. Take driverless technology. There are three million people, I think, in this country who make their living off driving things, and they're going to be displaced pretty quickly. That's a good, stable, middle-class job, and those people are not malleable. I mean, you could say, well, be malleable, but people aren't that malleable. A fifty-year-old truck driver in Pennsylvania who gets displaced is just displaced; he's not going to do anything else. How do we deal with that? And that's not to mention climate change, right? I mean, we're making all of these incremental advances in life expectancy and things like that, but this is coming at the cost of future generations, and we're not thinking about that. Ultimately all these advances are just going to be completely dwarfed by the challenge we're going to face in the future. So, do you want to respond to that? No, I just want to end on an optimistic note. It's all going to be okay. No, it's not going to be okay if we don't work at it. Right, but we have to work at it, and we cannot give up and cede that responsibility to anyone else. Okay, so this is what we're going to do. This is basically the end of the show, except there are people standing in line with questions, and if you're standing in line with a question, we're going to take it. I'm going to make a clean break; you probably won't get on the air, but you'll get to talk to us anyway. So we'll take these people who are standing in line; come up to the mics, and then I'm going to stop, and you're going to say something wise, and then we'll say goodbye. Right now we're going to take these three, even wiser, in back. Okay. He needs a clean break for editing. Welcome to Philosophy Talk. So, what's your comment or question? One area I know about is self-driving cars, and there's a big push for that, and the technology that's required is so complex, so a lot of people, scientists and engineers, are working on this. But all the scientists and engineers who are working on this don't really ask about the future implications of self-driving cars. And a lot of the burden is not on them; it's on the capitalistic system, the heads of Google and Ford, who actually want to make a lot of money out of the self-driving car business. So how do you think we should push back on self-driving cars, for example, as consumers?
How. Do you think we should push back on the self-driving, cars for example as, consumers. Well. If. We don't like, self-direct. Driving cars no one's gonna force us to get them the fact is that if we have self-driving, cars it will be extremely. Attractive it, will make commutes more attractive, it will make probably. Make highways, safer. Notice, so. I think you could argue that the, technology. Is good, you, worry about the loss of jobs. And, that, is then a societal. Responsibility, for. Either. Retraining. Or. Oh. Evolution. I think actually. One. Thing that we can do I don't know that we will do it is that I. Think the transition between, every. Vehicle. Having a person, in it driving. Actively, driving, and a. Fleet. Of vehicles with, no one in it is actually a slow, transition and, so, you could do it in an evolutionary way, if you really thought about it and and planned, it that's definitely, right uh I, mean I think the deeper point is that, we've, got to get past thinking, of a technology. And I think this in. Educating, our students we have to do this too we, think we have to thinking, of the technology, and technological, innovation, as just a thing unto itself not, just well, we produce it it changes the world we you know oh my god we, didn't have any agency in changing the world that way how does technology change, the world by being deployed, by human, beings in a context. All right spelling much your sister right okay it's part of a much bigger system yes, and so we have to think about the whole system-wide, thing and even, a young designer, can. Do that and can be, alive, to the fact that I'm entering a large complicated, system, and I'm and I'm and I'm a thought leader and I went to a place like Stanford and, I should be reflective, and I should be a citizen and. All that I mean gets back to it versus earlier point right to you know don't just study your. To your particular field but. Learn, about human psychology you learn about macroeconomics. Learn about. Dance. First. First, thing. What's. Your comment request so I'll just go back and then we'll have. A clean break okay good evening my. Name's Shelby and I have engineering. And business degrees, from here at Stanford, and. There's. Two points. I I. Want. To be an optimist, I am. An optimist, somehow or other but. Logic, gets in the way.
And when we talk about people using technology, people being responsible, the problem that I see there is that it has to be every person, responsible, forever. I mean, we've got things going on in North Korea, and they're doing CRISPR all over the world and creating stuff. It's just out there. That's the one thing. And the other thing is that we have AI now. AI is a different monster from all the others, because it can have a will of its own, philosophically, potentially, and we don't know what kind of will it might develop. So those are... So what's the question? How can we not be pessimistic, even though... AI is coming like gangbusters? Yeah, AI is coming like gangbusters. And the promise... You know, John McCarthy, our former colleague, our late colleague, they had this meeting back in, I think it was 1956 or something like that, and they thought they'd have it cracked within a couple of decades. Yeah, okay, that was a little premature. But AI is coming like gangbusters, and the day is coming when anything a human can do, some AI software will be able to do better. That's just coming. I'll take that bet. Yeah, I will too. It's coming. I want to see someone design the software that... Look, it's already the case, and this isn't even super-intelligent AI, it's just machine-learning techniques: they can already out-diagnose your average doctor, right? Diagnosis, fine: anything that involves pattern recognition. But we do a lot of things that are beyond pattern recognition. I know about today and what's coming next. It's coming. Okay. Well, you'll sleep better. Welcome to Philosophy Talk. Yeah, hi. I'd like to just compliment the last speaker, because he usurped most of what I wanted to say. You talk about technology, but there isn't "the technology," and I'd like to at least have you distinguish, as he was trying to say. The world of medicine has done magnificent things; you cannot deny that. On the other hand, in areas where they have begun to encroach on truly dangerous things, the world of medicine seems to have taken it seriously, and they are doing something about it, and I'm presuming it's because the government has been bothering them; it's really a serious issue. Somehow the world of technology is focused on kiddie things and so forth that are fun. And there is this issue of the fact that somebody like Foer writes a book, and he's obviously a bit of an extremist, and I haven't finished the book, to be truthful, but the reality is that he's saying good things about the fact that the owners of these magnificent six companies, or whatever they are, really think they're very godlike, and they are determined to guide us. Well, I mean, they are the gatekeepers. We learn what we want to learn from them; they can model it, they can modify it. It's a dangerous issue, and if you think that they're going to ultimately come forth and say, well, we're going to drop all this, forget it. To be pessimistic is one thing, but to do something about it is going to require, I think, ultimately, that there's
a stand-up sort of thing, sort of like the Florida youngsters getting up and saying, enough already. There's going to have to be some group that stands up and gets some action out of a Congress or someplace that says: look, you can't let this go on. You've created a division of labor, I mean a division of money, that's unacceptable. You've created the ability to take everybody's privacy and destroy it. That's what they were saying: you have no privacy on the phone, you have no privacy. This goes back to your "where are the grown-ups." Yeah, it goes back to where the grown-ups are. And what I think is that it will be fascinating when the history of this is written, because I'm not sure even the owners of those companies had a clue what was going to be coming. They have a choice to start taking some responsibility, or government regulation will come in and break them up the way AT&T was broken up. I mean, there is an ultimate authority there, we think, and I'm not sure that's the right answer. So I actually do hope the grown-ups stand up and start working at it. Okay, the last question. Thanks. I wanted to ask about the discussion you had about encouraging humanities education. My husband and I are both alumni here; he was a philosophy major and I was an engineering major, and we discuss this a lot, about our own children now. He hopes they get a strong humanities education, and I say, absolutely, it's good to get some. But I find it's more employable to get the engineering degree, and I do believe you're a better person for having a humanities education, but then what happens after they get out of college? They need to pay their rent; they need to do all these things. So how do we close the gap there? So, I take it Persis wasn't necessarily recommending that people major in a humanities subject. It could just be: look, if you're going to be an engineer... I think there's a good compromise. Well, I think if I were to lay my bet on the table for the degree of the future, it is not the pure engineering degree. It's going to be a social science degree with computational
literacy, because you need to know the questions to ask, and there are these huge societal challenges, and I do believe the social sciences help you understand what questions to ask. But you can't understand them without computational literacy. So I really think that's the magic combination that I would put my money on. So, mainly, you want the kids to get a job. We're on the radio, so I can say this: the degree of the future is the Symbolic Systems degree. I want to address this briefly, and then we'll give you a clean break. I think, look, we are educating the makers of the world, the remakers of the world. Human society is constantly making and remaking itself, and these Stanford students that we educate are going to play a role in making and remaking the world, and we need to do multiple things. We need to turn out excellent students, above all. We need to turn out great engineers, great scientists, great artists, great literary thinkers, great philosophers, great social scientists. But we also need each of them to understand that making and remaking the world is a deeply collaborative thing that requires multiple disciplinary talents. There is no competition. We have this stupid thing where the students say, well, you're either a techie or a fuzzy. You'd better be both, right? You'd better be a techie-fuzzy or a fuzzy-techie, or whatever that is, because if we don't produce people in whom all these things live simultaneously, the making and remaking of the world will be a disaster. Of course it does. Yes, I will. I am deeply, deeply committed to that. Okay, so now we're going to pretend none of that happened. So what I'm going to say is: Persis, have you got one last bit of wisdom for us? Right, okay. He needs a clean break; pressure's on. So, Persis, have you got one last bit of wisdom for us? I think we have to hold on to our optimism despite the challenges ahead. Well, on that optimistic note, I'm going to thank you for joining us. It's been a great conversation. Thank you. Our guest has been Persis Drell, former dean of the Stanford School of Engineering and recently appointed as our university's thirteenth Provost. Now, this conversation continues at Philosophers' Corner and our online community of thinkers, where our motto is cogito e...