Jamie Susskind: "Future Politics: Living Together in a World Transformed by Tech" | Talks at Google

Jamie Susskind:

Please join me in welcoming Jamie Susskind.

Thank you all very much. There's a story that's told of an encounter that took place in the nineteenth century between the great prime minister William Gladstone and the scientist Michael Faraday. Faraday was showing Gladstone the invention of electricity, which Gladstone hadn't seen before, and the Prime Minister was looking at it curiously and said to Faraday, "Well, what does it do? What use is it?" Faraday gave an explanation as to what he thought the scientific implications of it were and how it marked a great advance in that respect, but Gladstone wasn't convinced, and he kept asking, more and more rudely, "Well, what use is it? What use is it?" Eventually Faraday turned round to the Prime Minister and said, "Well, sir, in due course I'm sure you'll find a way of taxing it."

What that story shows, to my mind, is a phenomenon that's the same today as it was in the nineteenth century, which is that there are lots of Gladstones in the world who know a lot about politics but not much about technology, and equally there are a lot of Faradays in the world who know a lot about science and technology but don't immediately see the social implications of their work. To my mind the Gladstones and the Faradays are remaking the world that we live in; they're the most important people on the planet just now when it comes to politics. I want to start, if I may, with just four examples of simple emerging technologies that we'll all have heard of.

The first is a self-driving car. I want you to imagine you're taking a journey in a self-driving car and you ask that vehicle to speed up, to go over the speed limit. The vehicle refuses. You ask it to park illegally on a double yellow line, just for a moment so you can nip into the shops. The vehicle refuses. In due course a police car comes along, sirens blaring, asking you to pull over. For whatever reason you don't want the car to pull over, at least not yet, but it does, against your will.

Now I want you to imagine that you're using a virtual reality system, one which enables you to experience things which would otherwise be inaccessible to you, and you ask that system to let you, for whatever reason, experience what it was like to be a Nazi executioner at Auschwitz, or to perform a particularly depraved sexual act which society would condemn by and large as immoral. The system refuses.

Now let's think about a development which took place just a couple of months ago in relation to chatbots, where a Babylon system was said to be able to pass the Royal College of General Practitioners' exam with a better score than the average of its human practitioners. Imagine living in a world where chatbots are not just better at talking about medicine and diagnosing conditions, but better at talking about politics than the rest of us as well.

And finally, think of the stories that we've all heard: the soap dispensers that won't dispense soap to people of colour because they've only been trained on white hands; the voice recognition systems that won't hear women because they've only been trained on men's voices; the passport recognition system in New Zealand that declined to issue a passport to a man of Asian extraction because it said that his eyes were closed in his photograph.
These were previously, and too often still are, seen as technical problems, the ones that I've just described. But to my mind they're political. The self-driving car example is an example of power, plain and simple: a technology getting us to do something we wouldn't otherwise do, or not to do something we would otherwise have done. The virtual reality example goes right to the heart of the question of liberty: what is permitted in society, what should be permitted in society, and what should be forbidden. The chatbot example goes to the heart of democracy.

In a world where deliberation takes place increasingly by machines, or could do, what place is there for us in the systems that govern our lives? And finally, the examples of the soap dispenser, the voice recognition system, and the passport system go to the heart of social justice, because they deal with how we recognize each other in society, how we rank and sort each other, and how we place each other in the great chain of status and esteem.

Power. Freedom. Democracy. Justice. These concepts are the currency of politics and, increasingly, I argue in Future Politics, the currency of technology. And what I say is that, like it or not, social engineers (forgive me, software engineers) such as yourselves are increasingly becoming social engineers. You see, the two are just the same in my mind now; I struggle even to distinguish them. Technology, I say, is transforming the way we live together, and what I hope to do in my talk today is briefly sketch out how I think it might be doing that. But the overarching thesis is clear: the digital is political. We can no longer be blind to the social and political implications of stuff which in the past was seen as consumer products, or as commercial or technical matters only.

I thought I'd begin by outlining the three main trends in technology which lead me to the conclusions that I reach in respect of politics. You don't need me to spend much time on these, but I'll rattle them off anyway. The first is increasingly capable systems. In short, we are developing systems, call them artificial intelligence or call them what you will, that are increasingly able to do things which we previously thought only human beings could do, and they can do them as well as us, and in some cases better: whether it's lip-reading, transcription, mimicking human speech, or detecting lung cancers and diagnosing and predicting survival periods. Almost every game that we've invented, computers now play better than or equal to human beings, and the simple thesis is that progress isn't going to slow down anytime soon; some people say it's increasing at an exponential rate. So: increasingly capable systems.

The second point of importance is not just that our systems are increasingly capable, but that they're increasingly everywhere. We live in what's been called the era of the glass slab, where principally our interaction with technology takes place on computer screens or iPads or phones, through the medium of a glass slab. But what's said is that in the future technology will be dispersed around us: in our architecture, in our utilities at home, in our appliances, in our public spaces, even on our clothes and inside our bodies. This is the so-called Internet of Things, or ubiquitous computing, which means that increasingly sensors, processing power, and connection to the Internet will be distributed all around us, in items and artifacts that we previously wouldn't have seen as technology.

So the idea of the glass slab will gradually fade away; the distinction between online and offline, real and virtual, meatspace and cyberspace will lose some of its meaning and certainly lose a lot of its importance. So put together increasingly capable systems and what I call increasingly integrated technology.

And finally, we have an increasingly quantified society. Now, what's said is that every two days we generate more data than we did from the dawn of civilization until 2003, and it's predicted that by 2020 there'll be about three million books' worth of data for every human being on the planet. This is obviously unprecedented, and what it means is that increasingly what we say, what we think, how we feel, where we go, who we associate with, what we like and dislike, almost every aspect of our lives in some sense, will be captured, recorded as data, stored in permanent or semi-permanent form, and made available for processing.

Looking at the crowd in this room, a lot of this may seem natural and normal to us because it's what we've grown up with, and all of these trends have been increasing through our lifetime, but to my mind it marks a pretty substantial shift in the state of humanity. It could be as profound for us as the Scientific Revolution or the Agricultural Revolution, because it's only just started; we're only five or ten seconds into this stuff in historical perspective. And if you think about what might be around the corner ten or twenty years down the line, then it would be mad to assume that the consequences for politics, for how we live together, wouldn't be profound, because we've never had to live alongside non-human systems of extraordinary capability before. We've never known what it's like for digital technology to be integrated seamlessly into the world around us. There's never been a human civilization where every facet of its social and private life has in some way been recorded and stored as data. And our duty, whether we're Gladstones or Faradays or just citizens, is to try and understand what the implications of that are for the future of politics. So what I'll do today is go through four of the most basic concepts in politics, power, democracy, freedom, and justice, and say how I think the digital is political, and how your work as software engineers will increasingly make you social engineers too.

People often say that big tech companies have a great deal of power, and it's true, they do, and that's only likely to increase in the future. But I think there's often a conceptual confusion here, which is that people mistake purely economic power for political power, and I don't think the two are the same thing. In politics and political science it's said that a very basic definition of power is the ability to get people to do things they wouldn't otherwise do, or not to do things they would otherwise have done. Let's adopt that as our working definition for a moment. Now, I suggest that digital technology is capable of exerting power in one of three ways. The first is in the way that we saw with the self-driving car example at the beginning, which is basically that whenever we use a technology, whenever we interact with it, we are subject to the dictates of the code of that technology. So when you use an online platform or a piece of software, you can't ask it to do something that it's not programmed to do.
It can only do what it's programmed to do. To take another prime ministerial example that often springs to mind: when Gordon Brown was prime minister, he went to the US and President Obama gave him 25 DVDs of classic American films. This was for some reason seen as a great insult to the British people in and of itself.

But if that was insulting, what then happened when the Prime Minister went and sat down at home, popcorn in hand, was that the DVDs wouldn't play, because they were coded for US DVD players and the digital rights management system on those DVDs simply forbade it. Now, we know about that technology and we understand why it happened, but to the untrained eye it looks like a glitch. It's not a glitch: with technologies, we can only do what the people who were the coders, the programmers, say we can do with them. It's a very simple fact about technology, and it was acknowledged very early on, when we started using computers and the internet, that people started saying, well, that means code is law, or at least code is like law.

But things have developed since then. The first change is that whereas we used to think of the code inside our technology as a kind of architecture (people used to talk about software architecture, and the language we use reflects that: platforms and portals and gateways, as if it were a metaphor for physical architecture), that's no longer going to be the case in the future. Increasingly capable systems mean that the code that animates our technology is likely to be dynamic: it might be capable of learning and changing over time, it might be remotely changeable by its creators, or it might change on its own basis. So the code that used to control us in the early days of the internet, in cyberspace, was more like dumb architecture, but in the future it's likely to be more dynamic. The second big change is that code is no longer just power or law in cyberspace; it's in real space too, and that's because of increasingly integrated technology. When we go about our daily lives and interact with technologies, we can't shut down and log off like we might have been able to in the past. If that distinction between real and virtual, online and offline, cyberspace and meatspace does dissolve, and if people are right about that, then code, and technology, is going to be able to exert power on us all the time, and there's no getting away from it. So that's the first way in which, I would say, technology can simply be used to exert power.

The second and third ways are more subtle. The first of them is through scrutiny. The more you know about someone, what they like, what they fear, what they hate, the easier it is to influence them. That's the basic premise behind all online advertising, and all political advertising as well. If it's the case that society is becoming increasingly quantified, that all of our thoughts and feelings and inner life are becoming better known to those who make and govern technologies, then it'll be easier to influence us, because they'll have more information about us. It's a simple point. There's a deeper and more subtle way, though, that people gathering information about us allows them to exert power, and that's the disciplinary effect. When we know we're being watched, we change our behavior. We police ourselves. We're less likely to do things that society would think are sinful or shameful or wrong, or that might land us in hot water. Google's not a bad example, because one of the things that Google apparently does is that if people search for things related to child pornography, they're reported to the authorities. That fact in itself, the dissemination of it, is likely to change, and does change, the way people behave.
So the second way that technology exerts power is by gathering information about us, which can be used to influence us, or by causing us to discipline and police ourselves because we know that information is being gathered about us. And the third is the most subtle of all, and possibly the most powerful of all. I call it perception control. All of us rely on other people or other things to gather information about the world, distill it into something sensible and comprehensible, and present it to us in a digestible form.

It's a work of filtering; otherwise all we'd know about the world is what we immediately perceive. Now, increasingly we rely on technologies to do that work of filtering for us, whether it's when we go out and look for information, as in a search function, or when information is gathered and brought to us, as in a news function. Increasingly we're subjecting our immediate sensory perception to technologies as well, with augmented reality over our eyes, over our ears, over our bodies in haptic form, or in virtual reality too. And those who control the flow of information in society exert a great deal of power, because the best way to stop people from being upset about something is to stop them knowing about it at all, and the best way to get people angry about something is to tell them over and over that it's disgusting and wrong and has to be punished. And the work of filtering, of presenting to each of us the world beyond our immediate gaze, is increasingly done by technologies. So when I say that technology is powerful, I'm usually referring to one of those three things: the ability to force us through code to do something, the ability to gather information about us, and the ability to control the way we perceive the world. There's nothing necessarily nefarious or wrong with any of these; it's just, I think, a helpful way of thinking about how technology can change the way people behave, how it can exert power.

The other important implication of technology, flowing on from how it exerts power on us, is how it affects our freedom. Now, the great debate that we've all had for twenty years is how increasing technologies of surveillance will potentially lead to states, and maybe tech firms, having too much power over people, because they watch us the whole time and are capable of regulating us. That's an important debate, but it's not the one I necessarily want to talk about today, because I think the effects of technology on our freedom are actually a little more subtle. So I would ask the people in the room to ask themselves: have you ever streamed an episode of Game of Thrones illegally, or gone to take a second helping from the Coke machine even though you've only paid for one, or dodged a bus fare here or abroad by jumping on a bus, not paying for a ticket, and jumping off again? Seventy-four percent of British people admit to having done these things. It's not because they're all scoundrels. It's because there is this hinterland in the law where people are allowed to get away with things from time to time without being punished, as long as it's not constant, as long as it's not egregious. That's why so many people do it. I suggest that in a world of increasingly capable systems and increasingly integrated technology, those little bits of naughtiness will become much more difficult: whether it's because your smart wallet automatically deducts the bus fare when you jump on the bus, or the Game of Thrones episode just becomes impossible to stream because the digital rights management technology becomes so good, or because you need face recognition software to get that second helping of Coke.
And if you think that's petty, you should know that in Beijing's Temple of Heaven Park, facial recognition software is already used to make sure that people don't use more than their fair share of toilet paper. If that's the world we're moving into, then that hinterland of naughtiness, the ability to make little mistakes around the edges, like getting a self-driving car to go over the speed limit or park illegally, becomes a lot more difficult.

I think that has implications for our freedom. The more profound implication for our freedom, though, is what I call the privatization of it. Increasingly we use technologies to do the things that would traditionally be considered freedom-making. Whether it's freedom of speech: an increasing amount of important political speech takes place online, on online platforms. Whether it's freedom of movement, in the form of self-driving cars or whatever comes next. Or whether it's freedom of thought, the ability to think clearly and rationally, which is obviously affected by the systems that filter information for us. The good news, obviously, is that our freedom can be enhanced by these technologies. The interesting point, though, is that whereas in the past, for most of human history, questions of freedom were left to the state and were considered political questions to be decided on by the whole of society, nowadays they're increasingly privatized. What you can do on a political speech platform, what you can do with a self-driving car, how Facebook or Twitter filters the news that you see: these aren't decisions that you and I (well, maybe some of you) take; these aren't decisions that most of us take. They're made privately, by tech firms, often acting in what they perceive to be the best interests of their consumers, but they're ultimately just a matter of private decisions taken by tech firms and their lawyers. And I think we need to think through quite carefully what the implications of this are in political terms, looking at the long run of human history. Because what it first means is that tech firms take on quite a significant moral burden when they decide what we can and can't do with their technologies, something that was previously a matter of societal debate. The VR system, I think, is a good example. When you get a virtual reality system that is supposed to be customizable in some way, or to give you lots of different experiences, should it be up to you, the individual user, to decide which experiences you want, depraved or otherwise? Should it be up to the tech firm? Should it be up to society as a whole? The traditional answer given by human beings is that society as a whole sets the limits of what is right, what is moral, and what is forbidden. Right now we don't live in that world. The second thing is that, obviously, through no wrongdoing of their own, tech firms are not answerable to the people in the same way that the governments that set laws are. The third difference between a tech firm and a state is that in the state the law develops over time in a public and consistent way that applies to everyone, whereas tech firms do things differently: Google might have a different policy towards hate speech than Twitter, which has a different policy than Facebook. Some people would say that's a good thing, for reasons I'll come on to in a second, and others would say it's a challenge to the overall moral development of society, of shared values between us all. Just to take two examples that have troubled political philosophers since time immemorial: one is the question of harm to self. Should we, as grown-up adults, be able to harm ourselves?
If I ask my self-driving car to reverse over my head, should it do that, because it's my autonomous decision that I'd like it to? Or if I want my automated cooking system in my kitchen to make a curry for me that's so spicy it's likely to send me to hospital, but it's my choice, should it do it? Or should systems be designed to protect us? The idea that systems beyond our control should be designed to protect us might seem anodyne in this room, but to John Stuart Mill and Jeremy Bentham and other philosophers like them, on whom our legal system and its principles are often based, that idea would have been anathema, for the same reason that suicide stopped being illegal not so long ago: because people are generally thought to be able to do things which harm themselves, and should be free to do so. Even more so the question of immoral acts. There are very few laws left on our statute books which stop us from doing things which are considered immoral or disgusting.

In the privacy of your own home you can do almost any sex act, apart from something which causes very serious harm to yourself or to others. As for free speech, you can anticipate that in the future free speech and free action campaigners are going to say: if I want to simulate sex with a child on my virtual reality system, in circumstances where it causes no harm to anyone else, I should be allowed to do that. And actually a governing principle of English law for eight centuries has been that if something doesn't harm other people, you should be free to do it. Now, there might be disagreements in the room about whether that's right or wrong. The interesting point for me is that right now that decision is not going to be taken by the state; it's going to be taken by companies. And that marks quite a profound shift, I think, in the way that politics is arranged and in the way that political theory needs to proceed. Now, in the book (I won't bore you with this too much) I try to outline a series of doctrines, of ways of thinking, that can help us to think clearly and crisply about what's at stake when we limit and don't limit people's freedom. So I've got this idea of digital libertarianism, which some people are going to adopt, which is the idea that basically freedom is freedom from any form of technology: if I don't want to have technology in my house, I should be free not to have it, there should be no requirement of smart devices, and any piece of code that restricts my freedom is unwanted. More likely is that people will adopt a position of what I call digital liberalism, which is that the rules coded into technology should try to maximize the overall freedom of the community, even if it means minimizing the freedom of some. A particular doctrine which I think will appeal to free marketeers I call digital confederalism, which basically means that any company should be able to set its own rules so long as there is always a sufficient number of different companies so that you can switch between them according to your choice; people will say that's the way to maintain freedom, lots of different little subsets. Digital moralism is the idea that technology should encourage us to be better people. Digital paternalism is the idea that technologies should protect us from ourselves and our own worst instincts. Or digital republicanism: for centuries humans have demanded that when power is exerted over them, that power should not be unaccountable, that it should be answerable in some way, even if it is exerted benevolently. It's why the American and, to a certain extent, the English revolutions both happened: it wasn't just people's frustration that the monarch was behaving badly, it's that the monarch could behave badly at any point. A freedom which relies on the benevolence of someone else is no kind of freedom at all, and digital republicanism therefore means that in any technology, whenever power is exerted over you, you should be able to have a say in it; you should be able to customize it, to edit it according to your principle of the good life, your vision of what's right for you. These are all ideas that are new and strange, but I think we're going to have to grapple with them, whether we're Gladstones or whether we're Faradays, if it's right that so many of our freedoms are now going to be in the hands of technologies and the people who make them.

Democracy.
We all know the ways in which technology has affected democracy as we currently experience it.

It's changed the relationship between citizens and other citizens, allowing them to organize; look at the MoveOn or the Occupy or the Arab Spring movements. In some places it's changed the relationship between the citizen and the state, enabling a more collaborative form of government: petitions, online consultations. It's definitely transformed the nature of campaigning, between party and candidate, between party and activist, and between party and voter. Activism is obviously almost entirely done online now, the organisation of it, the central organization, and Cambridge Analytica and the Brexit referendum and the 2016 American election show that increasingly big data and the technologies surrounding it are used to pinpoint each of us, based on psychological profiles or profiles of what we like, in order to influence us in a particular way. Now, everyone gets very upset about this stuff, very excited about it, and I think it's right to, but it's ultimately an example of what I call faster-horses thinking. The reason I call it that is because when Henry Ford, the inventor of the automobile, was asked, "What did people tell you they wanted?" he replied, "Faster horses." It's sometimes difficult for us to conceive in politics of systems that are radically different from our own, and instead we just think of technologies as augmenting or supercharging what we already have. So the changes that I've just described to democracy are all profound, but they don't change the nature of democracy itself; they work within the system to which we are presently accustomed. And I wonder whether that's going to be sustainable or true within our lifetime. I suggest there will be four challenges to the way that we currently think about democracy. The first is the one that I described in the introduction. If bots get to the stage where they are good enough to debate in a way that is more rational and more persuasive than us, or even if they don't, and a lot of political speech takes place on online platforms, how on earth are we supposed to sustain a system of deliberation in which you and I have a meaningful say, when every time we speak we're shot down or presented with fifty facts to the contrary? And remember that in the future bots aren't going to be disembodied lines of code: they'll have human faces, they'll be able, if the sensors are there, to detect human emotion, they'll be persuasive and real-seeming. So deliberation, which has been part of our concept of democracy since Greece, could be completely disrupted by a technology that's already underway.

No one really talks about that very much. I think it's something that could be a problem within 10 or 15 years, and that's pretty profound. The second big challenge is that we're now entering a time where it's easily foreseeable that we could have full direct democracy, where basically, using a smartphone or whatever replaces it, we vote on the issues of the day directly, with no need for politicians, or wiki democracy, where we edit the laws ourselves, or some model of that. It's absolutely not technically infeasible in the course of our lifetimes, so we need to have the debate about whether it's desirable. How much democracy is too much democracy? Why is democracy valuable in the first place? I don't think we're ready for that debate; I don't think it's one we've started having. It wouldn't surprise me at all if a natural offshoot of the populist movements that we see just now is a demand for more direct accountability for political decisions, for people to vote using the stuff in their pockets. Then there's data democracy. It's going to become increasingly weird that we consider a system legitimate on the basis that we put a tick in a box once every five years; an almost inconceivably small amount of data is used to constitute the government of the day. I think there's a theoretical and philosophical case to be made about a system that uses the abundance of data which really reflects the lives that we actually lead, and about the role that data should play in legitimizing governments. That is to say, if a government doesn't pay attention to the data that actually exists about its people, how can it really be said to represent them? It's an interesting question, one that we haven't got to yet; I suspect it will rise in salience. And the final question is going to be about AI democracy. As we entrust artificial intelligence systems with more and more valuable things (trading on the stock market, robots conducting operations, one was even appointed to the board of a company in Singapore), it's not at all nuts to ask what role AIs should play in public policy, in the decisions made by public policymakers. Which areas of politics would be better served with systems taking the decision on our behalf, perhaps according to principles that are agreed democratically? Or should we each have an AI system in our pocket which votes on our behalf ten thousand times a day on the issues of the day, based on the data it has about us and what it knows about our preferences and our lived experience? We're just at the cusp of these questions, but the system of democracy that we have is a very old one, and it would very much surprise me if faster horses was all we got, if the disruption we've already seen to democracy was the last we saw of democratic disruption. That would seem to me to be against the grain of how the digital really is becoming political.

The final concept: social justice. When political theorists talk about social justice, they tend to mean one of two things. The first is distribution: how should benefits and burdens be distributed in society? Equally? According to some principle of merit, to the best? Disproportionately to the most needy? These are all arguments that philosophers have had, and politicians have had, for generations.

In the past they were settled by the market, which distributed goods among us, and by the state, which intervened and regulated the distribution of those goods. Increasingly it's algorithms that are being used to distribute goods in society. Seventy-two percent of CVs, or resumes for an American audience, are never seen by human eyes. These systems that make decisions about who gets jobs have profound distributive consequences for who does well and who doesn't in society. Mortgages, insurance, and a whole host of other distributively important things are affected by algorithms. For example, the fact that algorithms now trade on the stock market has caused a ballooning in the wealth that flows to the people who use those automated systems, mostly banks. That has distributive consequences. So what political philosophers typically thought of as a question of political economy, of the market and the state, that question of social justice, is increasingly entrusted to the people who write those algorithms. That's the first way that technology is going to affect social justice.

But there's more to justice than just the distribution of stuff. When we see the slave kneeling at the feet of the master, or the woman cowering before her husband, or the person from a black or minority ethnic community having insults hurled at them, the injustice there is nothing to do with the distribution of stuff. It's what's called an injustice of recognition, where we fail to accord each human being the dignity and respect that they deserve. Now, in the past it was really only other people who could disrespect us in this way. In the future, as we've seen, it can be systems as well. If you think of the frustration you feel when your computer doesn't work today, imagine what it's going to be like when one doesn't even recognize your face because it's the wrong color, or doesn't hear your voice because you're the wrong gender, or doesn't let you into the nightclub because your face doesn't meet the specifications that the club owner has set. Technology is increasingly used in questions of recognition, and I think that's of profound importance for social justice. The other way that technology affects justice is that it ranks us. Today we all know what the currency of social status increasingly is: it's likes, it's retweets, it's followers. People who half a century ago would not have held high status in society now hold high status in society, and the reason they do is because a particular set of algorithms, designed by people like you, has been set which decides what the key factors are: who's in and who's out, who's up and who's down, who's seen and who is unseen, who's great and who's nothing.

There's nothing inherently nefarious about this, nothing inherently wrong with it. But it used to be that only people, and our social norms, and occasionally laws (like the Nuremberg laws or the Jim Crow laws, which specifically discriminated against people) were the things that decided the politics of recognition. Now that's done by technology, and it's increasingly in the hands of people who aren't politicians and who aren't necessarily philosophers either.

So, just stepping back: power, democracy, freedom, justice. These used to be words that just politicians and political philosophers used in their day-to-day discourse. I say that they have to be words that software engineers use in their day-to-day discourse too, words that the tech firms know and are familiar with and understand. I'd like to close with two quotes that have always stuck out to me. The first is this, and you might have heard it: "The philosophers have only interpreted the world in various ways; the point is to change it." The second is this: "We're not analyzing the world, we're building it." Essentially they mean the same thing. What they say is: you can talk, and you can think, and you can debate, but the real people who create change are those who go out and do it. The first quote is from Karl Marx; it's from his Theses on Feuerbach of 1845, and it was a rallying cry for revolutionaries for more than a century after it was published. The second quote is from Tim Berners-Lee, who couldn't be more different from Karl Marx in his politics, his temperament, or indeed his choice of facial hair. But the point is the same. The digital is political. Software engineers are increasingly social engineers. And that's the message of my talk. Thank you very much.

Thank you very much indeed. We do have time for questions.

So what do you think of the increasing tendency of governments to abdicate responsibility to tech firms to make decisions? The classic example in the last week, I think, is that the EU has said we want tech firms to make the decision and take things down within an hour. Do you think that's a good trend?

I'm not sure what you mean by the abdication of responsibility, or the delegation, if you want.

Where the government could choose to regulate but instead chooses to say, you must decide.

The message I have is this. If it's the case that tech firms are going to be taking decisions that are of political significance, in due course people are going to expect to know what those are, to demand transparency, to demand accountability, to demand regulation. Tech firms essentially have two choices, which are not mutually inconsistent. They can try to get it right themselves, and articulate why they think they're trying to get it right: to set out clearly the way their algorithms work, insofar as that's possible in the market system, and to justify them by reference to principles of justice or principles of democracy or freedom. The more of that that is done privately and willingly by tech firms, the less likely it is that the state is going to come barging in and start regulating. We've actually seen that, I think: tech firms are increasingly becoming answerable to the unhappiness, or the perceived unhappiness, of their consumers about the way things are working. But if the state just came trundling in and started regulating, then tech firms would say the same as any private corporation has said since the invention of the state,
which is: I can't believe these fools at the center of government are trampling all over matters that they don't understand, these Gladstones. But we have to find a compromise between the Gladstones and the Faradays, the people who know a lot about politics and the people who know a lot about tech, and I think if tech firms assume responsibility, they're less likely to face regulation which they consider to be ill-informed.

So, when you said, and I think it was in the first quarter or half of the talk, regarding the privatization of policy through the use of tech in these firms: where does open source fit into this, and free software and that whole movement? Because one would argue, and I think a lot of people would probably agree with me, that open source was a political movement in tech before it was really known in the private world that tech would become political. So where does that fit into this whole picture, and how does it change the equation?

It's a great question, and the answer is that it obviously doesn't fit into the very simple dichotomy that I gave. But I think it's also fair to say that although the open source movement has become incredibly important in many respects, most people don't know what it is. Most people, when they use technologies, don't have the opportunity to customize or edit those technologies, or to understand the rules that govern them. If more tech were open source, that would definitely resolve some of the tension between what appear to be private entities exercising a kind of public power, if they're using code that can at least be seen and understood by its consumers. I just don't think it yet characterizes a lot of the technologies of power that I describe.

Okay, thank you.

Hey, thank you. I'm also going back to the issue of privatization, and I think in some ways we could argue that there's a benefit here: that with an increased number of actors making decisions we get pluralism, and that's not a terrible thing. But I wonder if you could reflect on whether this claim of privatization is as solid as you suggest. A lot of these technologies were funded by public bodies, by the state, and I wonder if we need to revisit the genesis of a lot of these technologies, because we often forget that they were funded by taxpayers and that they're not strictly private architectures or private systems.

I think that's a really valuable and important point, and there are two reflections I would make. The first is that the fact that a technology derives from public investment doesn't necessarily mean that the public retains a degree of control or transparency or accountability. It's the use, the application, of the technology that matters for the political purposes that I'm describing, rather than its genesis. The second point, which maybe I didn't make strongly enough in my speech, is that a lot of the time the alternative to technologies being controlled privately is technologies being controlled by the state, and there are huge, enormous risks with that. The modern state is already the most powerful and remarkable system of control that humans have ever invented. The idea of endowing the state, through regulation or nationalization or whatever it is that some people suggest, with further power in the form of awesome technologies of surveillance, of force, and of perception control is not something that I would inherently welcome. So actually the big political tension, I say, for the next half century or so is going to be how much of this stuff is best left to custodians in the private sector acting responsibly, and how much should be brought under the aegis of the state.
But it's certainly not a dichotomy of state and regulation good, privatization bad, and I'm not just saying that because I'm at Google. I think the argument often forgotten by those who criticize tech firms is that the state can act in a pretty heavy-handed way when it comes to technology as well. There's a balance to be struck.

I guess you probably partly answered my question just now, but my question, in a similar sense, is: what option does the regulator even have?

And I'm thinking now on a global scale. The status quo as I see it, and tell me if you disagree, is that regulation is always playing catch-up with technology, and the question is, if the regulator wants to turn that round, what option would they even have? Because if one country started trying to invert this, and basically tried to make regulation the default, so that as a technologist you would have to seek an exception for every single thing you wanted to do, rather than what happens now, where technology companies invent new paradigms that affect society and then regulation catches up, then obviously tech firms would probably move away from that country and do their innovation elsewhere, and there would always be islands of deregulation, just as there are islands like tax havens and that kind of thing. So if you think about it from that point of view, what's your view on that?

Well, you've identified two problems that the regulator faces. One is that you're always behind: the technologies are invented first, and then you're playing catch-up to try and understand their implications and, if necessary, regulate them. The second is the problem of multinational corporations: if you're just one country, it's very hard to set a rule that others don't follow, which might place you at some kind of disadvantage economically or commercially and incentivize that firm to leave. There are other problems too, like the problem that regulators sometimes don't have the best people, because the best people are all in the private sector; I know with the regulation of finance, for instance, that's a consistent problem. So there's no doubt that the task for regulators is formidable. What are their options? Well, they've got to do their best. Tech firms, I think, shouldn't just see it as a matter of "we'll do whatever we like until we're regulated." I think the whole system would function better if purely commercial considerations didn't just motivate the policies set by tech firms, and increasingly they don't; I wouldn't for a second suggest that they always do. The problem of the international movement of capital, or of competitive advantage, is a tough one. The EU is actually not a bad counterpoint to that: the GDPR, say what you like about it, is a kind of regulation, and it applies to every country in Europe, which makes it easier for them to act in concerted fashion. I would see technology, like climate change, as one of those issues that benefits from international collaboration and cooperation. Part of the trouble with the way we think about it, though, is that we think about it as an economic problem, as though the power that tech companies have, or the problems that can be caused by technology, are just matters of economics, and this is actually part of the mindset that I want to try and change. We have to start seeing them as political problems, and I would hope and encourage, on the part of states, that they don't deregulate or create Wild Wests out of a desire to attain an economic advantage. Countries do do that, though; there's just no doubt about it. So I hold my hands up and say the task of the regulator is formidable.
But I think there's so little regulation just now, and technologies are becoming so much more persuasive and so much more powerful, that something will be done. As I said earlier, the more that tech firms are involved in that proactively and sensibly, the better it will be for them, for states, and for the people who use the systems.

Okay. I have a question more about the concentration of power and the accountability which people demand after that. Increasingly, companies like Google or Facebook have become public utilities; we use search or a social network on a daily basis, and that is a concentration of power. Do you think that 20 years from now we will see a Google or a Facebook that's held accountable, maybe inside the state, where we actually vote on how it's regulated?

Well, I certainly don't think nationalization, public ownership of things like Google or Facebook, would be a good thing. I'm also not sure about "public utility"; I think it's the best word we've probably got just now to describe the kind of status that these companies have within our modern economy and our modern society, but I don't think it accurately describes it. Most public utilities don't exert power over us. We rely on them, we rely on the water company and the electricity company, but they don't get us to do things we wouldn't otherwise do, they don't influence elections, they don't determine matters of social justice or what is and isn't permitted. So I think the public utility analogy is helpful only up to a point. Do I think that in the future it's possible they would be nationalized, or become part of the state? I guess so, but I don't think it would be sensible. Again, I think the regulatory environment, the regulatory future, is up for grabs.

So you talked briefly about how people still have this mindset of faster horses when it comes to technology. What are the hallmarks, or what timescales do you expect, for this public mindset to shift from thinking of technology as just a step change to seeing the revolutionary aspect of technology?

It's a really interesting question, and I'm not going to give you a defined timescale, because I think, again, it's up for grabs. What I try to do in my book is to sound the foghorn and say we need to think about this stuff not just as consumers but as citizens. We need to not think about it like faster horses, but to see the fundamental, revolutionary change. Some people are going to disagree with that thesis, and a lot of people are just going to carry on interacting with technology as consumers, which is what most of us do most of the time ("that looks cool", "this is a cool new function"), without necessarily seeing the huge broader picture. So I don't have an answer to the question of when, if at all, I expect public perception of this stuff to change.

I do think that market forces are likely to result in the transformations that I described. So insofar as the political class is paying attention, I think easily within our lifetimes we're going to see the big question of politics change from what it was in the last century, which was: to what extent should the state be involved in the functioning of the economy, and to what extent should things be left to the free market? That was the big ideological debate of the last century. I think the big ideological debate of our lifetime is: to what extent should we be subject to digital systems, and on what terms? And I say the debate will shift that way over the course of our lifetime because I see it as almost inevitable, if the technologies develop in the way that people predict they will.

Yeah. You've said a few times you'd like to see technology companies and technologists get more involved in politics, and in a lot of people's heads that's equated with lobbying, which tends to be seen as a bad thing. Can you talk about some of the positive ways you could see technologists and technology companies getting involved in politics?

In fact, I think that analogy perfectly demonstrates the change in mindset I think we need. Powerful companies in the past, say the great monopolies of the early nineteenth century, had power in the political process, but they exerted it indirectly, through lobbying and through campaign finance. What's different about technology is that it affects us directly. If you're a tech firm, you don't need to go through the government in order to exert power over people, or to affect democracy, or to affect freedom or justice. That's what's so profoundly different about technology. And so I say that people who work in tech firms do work in politics, because their inventions, their algorithms, their systems are the ones that are actually changing the way that people live and changing the way that we live together. So it's not that Mark Zuckerberg should run for president; it's that Mark Zuckerberg is already, in a sense, some kind of president, because he affects all of us in ways that he should know more about, and so he should take that power, as I'm sure he does, responsibly and seriously. So I don't want people to go away thinking I'm saying that we need technologists to step into the political process more, although there should definitely be constructive engagement. The point is that if you work in technology, in a sense you already work in politics. The positive improvement I'd like to see is the Tim Berners-Lee idea of philosophical engineers; you know, he's the one who said we're not analyzing a world, we're creating it, and who called for philosophical engineers. Well, sometimes the arc of a computer science degree is long, but it doesn't necessarily bend towards justice.

Just as people who know a lot about politics shouldn't be assumed to know a lot about technology, I think the people who work in technology should have a good grounding in the values and principles that they are, whether they know it or not, embedding in their work. And that's why I wrote the book, in many ways: it's a book about tech for people who know a lot about politics, and a book about politics for people who know a lot about tech.

Hi. My question is this: considering that in private companies the end goal or the incentive is usually to make their users happy, which is starkly different from what the state cares about, which is to promote the general well-being of its citizens, it's hard for me to think of things like filtering content as an exertion of power rather than as an enabler for users to exercise their freedom as they would like. So I guess my question is, when you're saying that technologists should be these social engineers, do you think that requires a fundamental shift in what we're prioritizing, towards adopting a more paternalistic approach of "we think this would be good for our users" rather than "this is what the evidence shows our users like"?

Again, a great question. Do I think that requires changing priorities? My first answer would be to dodge and say I don't know, because most of the algorithms that you describe are not made public. If you look at what Jack Dorsey said to Congress the other day, and one can applaud him for bringing it to the public's attention, he basically said: we got it wrong; six hundred thousand accounts, including the accounts of some members of Congress, were wrongly deprioritized from political discourse at quite a sensitive time. The answer to that, to my mind, would be a Twitter algorithm that people are capable of understanding and critiquing, rather than a one-paragraph explanation from Twitter which sets out what the policies are and adds "and many other factors" at the end. We the users are not in a position to know whether the algorithm actually embodies the values that are stated, or even, to a certain extent, what the values are. So one of the things I talk about in my book is that the more transparent companies are, the more people will be comfortable, and justifiably comfortable, just as they are with governments that become more transparent, that the people who exercise power over them are doing it in a responsible way, even if it's just a small amount of power. The second thing I would say is that you correctly identify that the intentions of the state are different from the intentions of a private company operating within a market system. The difficulty with the market-system approach to tech, with just letting the market do its job, is, first of all, that you get monopolies. So even if I don't like Facebook, if I want to be on a social networking system there's no point moving to one which is just me and my mum, even if it's superior in loads of respects, because there's a network effect there and Facebook has dominated it. The second, and it relates back to the first, is that we don't always know the principles on which companies are competing. The difference between the way that news is ranked in one system and the way it's ranked in another is apparent only from what we see, but we don't always know what we don't see.
Decisions, like that if a, they, don't have a choice because there's a monopoly, and B they actually aren't showing the full basis, of their choice and that they, have to that.

I mean, you are right, though, that a pluralist system, where people have a choice of moving between systems of competing values according to their own values, would definitely be one solution to the problem of what might be perceived to be too much concentration of power or too much unaccountability. That's one answer.

We do have more questions, but we are unfortunately out of time. So thank you again very much, Jamie Susskind.
