All right, well, thank you everyone for coming. Today we're delighted to welcome Ramesh Srinivasan. Ramesh studies the intersection of technology, politics, and society. Since 2005 he's been a member of the faculty, and a professor, at UCLA in the Information Studies department. Prior to that he took his PhD from Harvard, his master's from MIT, and his bachelor's from Stanford, and then went on to become a fellow at the MIT Media Lab and a teaching fellow at Harvard's Graduate School of Design. He's here today to discuss his first book, Whose Global Village? Rethinking How Technology Shapes Our World. The book is a sort of call to action to include marginalized, non-Western countries in the digital revolution. So without any further ado, please join me in welcoming to Google Ramesh Srinivasan.

It's absolutely a pleasure to be here. I have a bunch of friends at Google, two of whom are here, and my new friend David. I want to thank you all for making the time to listen to some of these thoughts. What I'm going to do is set a timer and go about 45 minutes, maybe even less, and then we'll have a conversation about this material. If there are thoughts or reactions or comments you have to any of this material, by all means tweet about it and link it to me, and we can have a kind of asynchronous conversation afterward. You can also get in touch with me through my email address and my website, which are listed on this link right here.

Okay. So, many of us know what this diagram represents, but let me unpack it a little. This represents the fiber-optic cable infrastructure that provides the backhaul, if you will, the very basis of what the internet is. Many of us think about how this environment and this landscape are shifting, but it's very important to note, when you see diagrams like this, that the internet is actually not immaterial; it is material. It's rooted in the things we build and the places we connect. What's also striking when you see a diagram like this is how the networks actually drawn by fiber-optic cables, which represent the internet (there are of course also satellites that are part of the internet), trace the places that are already, in a sense, very well connected in our world today. If you look at a map of plane flights, for example, you'll see a pretty high correlation between plane flights, other forms of traffic and exchange, and these mappings as well. That explains why we see the West Coast of the United States well connected with Shanghai and Beijing in this map, and why we see New York City well connected with London and Western Europe. But what's particularly notable when you look at this diagram is what is not connected, and what is not connected to what.
Specifically, we're talking about just a couple of fiber-optic cables that connect the two major continents of the global South: Africa and South America. What that means is that this so-called global village, which is partly the title of my book, the term in my book, has not actually come to be. If anything, the vision that Marshall McLuhan and many others had, of an internet that brings everybody together in equitable, flattened, democratic fashion, has not really been the experience on a very material and specific level.

To continue, let's think about one of the major metaphors by which we think about the internet today, and about data more broadly, which is the cloud. As we all know, when we think about the role of metaphor in our lives, metaphors open up our understandings, but in some cases they also close them off. When you think about the cloud, you say: hey, who has a problem with the cloud? A cloud is immaterial, a cloud is made of water, our bodies are made of water, clouds are everywhere. Who has a problem with the cloud? But of course, as we all know here at Google, and as we know at other companies, the cloud is very much transacted and determined by a few companies and their terms of service in relation to personal data. The reason I have these three logos here on this map, including your own, is that in the country of Mexico, where I'm currently doing my fieldwork and which I'll speak about briefly today, there have been some informal studies done showing something really fascinating: emails and various forms of specific communication sent between one neighbor and the neighbor right next to them are often forms of data transacted on cloud-based servers through these three companies. We all know this, but if you send an email to someone who's right next to you, that email might actually go through a server that's outside your country, and that server is governed by a private, corporate, proprietary terms of service.
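To make that routing point concrete, here is a minimal sketch, assuming the third-party dnspython package and a placeholder domain, of how you can ask DNS which mail servers actually receive a "local" message:

```python
# A minimal sketch of the routing point above, using the third-party
# dnspython package (pip install dnspython).
import dns.resolver

domain = "example.com.mx"  # hypothetical placeholder; substitute a real local domain

# Ask DNS which mail exchangers actually receive mail for this domain.
for record in dns.resolver.resolve(domain, "MX"):
    print(record.preference, record.exchange)

# For many domains in the global South, the exchanges printed here
# resolve to Google, Microsoft, or Amazon infrastructure hosted abroad,
# so a message to the house next door crosses a border.
```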
I think it's important to read our experiences of the internet, and of data as well, through the specific mechanisms by which they are transacted and arranged. The reason I say that is that we're at an incredibly staggering moment, I would argue an inflection point, in how we think about everything from how money is made, to how labor is accumulated, to how services are drawn. We're at a point right now where platforms, and specifically the so-called looseness or flexibility of platform infrastructures, mean everything. Think about these particular examples; you can very easily substitute Alibaba, the retailer there in yellow, with Amazon. You realize that ownership and labor are, in a large sense, something of the past: incredible amounts of money are accumulated simply by being some sort of transactional agent online. I think that's really important because it is, of course, brilliant; it's disruptive; it's an incredibly creative and efficient technological advancement. But it also has massive social and economic effects, and we need to be responsible and aware of those effects. So I'm very happy to know that there's increasingly a movement toward ethics in AI, ethics in algorithms, and ethics in thinking about the effects of technology, not just when you click on something but as rooted in our societies, and we can see that in everything from rent prices in San Francisco to these questions about labor and the independent gig economy. I want you to keep this thought in mind as we consider how not only the cloud has very specific transactional arrangements associated with companies, but also how our experiences of taking taxis, staying in houses, and sharing media content via Facebook, for example, are all transacted by platforms that are accruing a great amount of money and power.

The reason I'm getting into all this is that we're building systems right now, here at Google and of course at many other companies, that are so-called learning from the world. As my colleague Cathy O'Neil says in her recent book, Weapons of Math Destruction, and generally speaking you all know this better than I do, when you create an algorithm you're basically deciding what the recipe is and what success is. You're writing the set of instructions by which some sort of deterministic or successful outcome can be programmed or created. The reason that's problematic, and I'll give a sad but kind of humorous example in a moment, is that we are building systems that people generally treat as neutral and natural, because none of us have any time anymore; we're all completely multitasking, and the technologies allow us to do that, but they also put us in a position where we blindly trust what we see. If those systems are learning from a world that already features various forms of bias, and there's quite a bit of science showing that, at minimum, racial implicit bias is the norm rather than the exception, then those systems are going to create outputs that we might treat as neutral but that actually reflect and normalize those biases.
So, generally speaking, and again you all know this: what we end up creating algorithmically is based on who we are, what we design, and what software we build, the kind of learning model we apply and our ability to design that learning model, the data sets from which that model learns, and the outcomes for which we optimize those algorithmic systems. Here is a weird and humorous example: a company out of Russia called FaceApp, from Saint Petersburg, was attempting to go through the internet, take people's faces, and make them more attractive. I think most people think President Obama is pretty attractive, at least as of 2008, before being president of this country aged him. President Obama, we know, is mixed-race, but his face was turned white by this algorithmic system. I'm not really sure he looks that much better thanks to the system.

Okay. Now I want to touch on other issues, to push these ideas and engage you in a conversation around them. I don't know how many of you read this article, but it was quite persuasive for me. It was published by the outlet ProPublica, and it covers a really interesting case in which a private company was charged with building a system that could predict, in theory, the rate of recidivism. What does recidivism mean? You committed a crime; what are the chances you're going to commit a crime again? What turned out in this particular case, and the company is called Northpointe, is that the Caucasian man on the right, who actually already had a felony conviction on his record, was scored far lower: the black man was predicted to reoffend at a 70 percent higher rate than the white man, and the black man did not have a felony conviction on his record. Now, the systems themselves are not directly trained on race, and it's important to point that out; this is a far more implicit and pervasive learning phenomenon than literally saying, oh, you're black, therefore you're going to commit a crime again. They were asking people other questions, like: was one of your parents ever sent to jail or prison? How many of your friends and acquaintances are taking drugs illegally? (I'm not sure why anyone would answer that question honestly.) How often did you get in fights at school? And we know that environmental factors are correlated with racial factors in ways that might lead the gentleman on the left to answer yes to some of those questions and the gentleman on the right to answer no.
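To make that mechanism concrete, here is a minimal synthetic sketch, with every number invented purely for illustration, of how a model that is never shown race can still produce racially disparate false-positive rates through a correlated environmental proxy:

```python
# Synthetic sketch: race is excluded from training, but a neighborhood/
# policing proxy correlated with race lets bias back in. All numbers
# here are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 50_000
group = rng.integers(0, 2, n)                                  # protected attribute, never shown to the model
neighborhood = rng.random(n) < np.where(group == 1, 0.7, 0.2)  # proxy feature correlated with group
reoffend = rng.random(n) < 0.4                                 # true behavior: identical rate in both groups
caught = rng.random(n) < np.where(neighborhood, 0.9, 0.4)      # heavier policing where the proxy is 1
rearrested = reoffend & caught                                 # what historical training data actually records

X = neighborhood.reshape(-1, 1).astype(float)
risk = LogisticRegression().fit(X, rearrested).predict_proba(X)[:, 1]
flagged = risk > 0.25                                          # "high risk" label

for g in (0, 1):
    fpr = flagged[(group == g) & ~reoffend].mean()
    print(f"group {g}: false-positive rate = {fpr:.2f}")       # roughly 0.20 vs 0.70
```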
This is a huge issue, because increasingly, as I'm going to show in just a moment, more and more states and cities are applying various sorts of technological systems, in theory, to overcome human bias. In many cases it is true that humans are biased. But the problem is that systems we're treating as neutral and natural are actually complex reflections of those forms of bias, learned inductively from the environment. So this is something that you at Google, the most powerful technology company in the world in my mind, have got to do something about; help us with these issues, because the impacts of what you do are profound. And I'm going to talk a little bit about some of Google's projects in relation to this space as well.

Here is another example that's made the rounds. This is research done by a colleague of mine in the UCLA anthropology department who built neural-network machine learning models for the Iraq battlefield; it was Pentagon-funded research. He was also building learning models based on archaeological data, and now there's an attempt to apply some of these, if you will, learning and ontological models to the theme of predictive policing. I think we all know Minority Report, so we know this idea: predicting whether a given crime, in this particular case, is gang-related or not. We all know that "gang-related" is itself quite a murky definition. What is and is not considered a gang? Is it simply a yes or no? Are there gangs that might be construed more as communities, versus gangs that are actually violent and a threat to society? This is potentially being implemented by the Los Angeles Police Department, in Los Angeles where I live: a predictive policing system that will be fed crime reports and will use what are called partially generative algorithms. In the absence of a written description, the neural network generates new text, an algorithmically written crime report that isn't actually read by anyone but is supposed to provide context to the police report. That is then turned into a mathematical vector that feeds the prediction of whether a crime is gang-related or not.
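Schematically, and this is only a sketch of the general shape of such a text-to-vector pipeline, not the actual LAPD or UCLA system, the flow from free-text report to binary flag looks something like this:

```python
# Schematic only: free-text crime reports become vectors, and a
# classifier emits a binary "gang-related" flag. The tiny training set
# and labels below are invented to show the shape of the pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reports = [
    "suspects fled in vehicle after altercation near park",
    "graffiti and assault reported, known associates present",
    "dispute between neighbors over parked car",
    "shots fired, victim affiliated with local crew",
]
gang_related = [0, 1, 0, 1]  # labels supplied by past human judgment, with all its bias

pipeline = make_pipeline(TfidfVectorizer(), LogisticRegression())
pipeline.fit(reports, gang_related)

# Every choice above (vocabulary, labels, threshold) silently encodes
# someone's definition of "gang-related"; nothing in the math audits it.
print(pipeline.predict_proba(["argument outside store, crew mentioned"])[:, 1])
```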
As you can imagine, and I don't even know if I'm implying it; I'll be pretty clear: this, to me, is problematic, because it doesn't really consider all these qualitative social issues in the construction of an algorithm, let alone consider how such an algorithm might be regulated, how there might be oversight, how it might be checked, the checks and balances, and so on. Often I get the response: well, people are biased, judges are biased, the police department is biased. No question about it at all. But for me, setting this up as algorithmic bias versus human bias is a false comparison. The question is: what kinds of technologies should we aspire toward? What kinds of relationships between humans and machines are ethical and healthy for us in our society, in everything from labor to questions of criminality to questions of racial bias, which I know none of us wants to perpetuate?

One of the junior authors of this research, with my colleague Jeff Brantingham, presented it before it made the rounds in the media, and he was asked quite a few critical questions, if you will. He left the room, kind of ran out of the room, saying, "I'm just an engineer." But as we all know, and this is part of our engineering education, we're seeing a movement toward ethics courses sitting inside our AI classes, to integrate the two. Other colleagues are saying we should integrate moral philosophy classes with computational science classes, or anthropology classes with computer science classes. I would love to co-teach classes like that at UCLA, and we struggle to do it, frankly; this is a pervasive issue. But we know that we're not simply engineers, that what we're building has massive impacts on society, and that what we build has a lot to do with the cultures not only of ourselves but of the organizations of which we're part. We all know that. So I want to use these examples to talk about the world we're constructing right now, and what happens when we give private organizations, which are quite secretive because of intellectual property and copyright concerns, all the power in public life. There's that blurring, and I think, to my next point, this is why so many people are upset at Facebook today.

Okay. This is research that actually informed some of the algorithms used by Cambridge Analytica; I don't think I need to introduce Cambridge Analytica to this crowd. I've been able to do some interviews with them for my new book; I'll talk about that in a moment. But this relates to research done by a scholar at the Stanford Business School, right up the street, whom I highly recommend you have here; he's a very nice guy named Michal Kosinski. Michal has made the argument, and it was a real barnburner when I brought him to my university for people to ask him questions, that with ten likes on Facebook, a computer knows you better than a colleague; with seventy likes, better than a friend or a roommate; with a hundred and fifty, better than a family member; and with three hundred likes, it knows you better than a spouse. But it's not simply about likes on Facebook, because, to be honest, I don't think my personality is that well revealed through Facebook; there are a lot of things I believe that I don't share on Facebook, and they may not even be that implicit there. It's the aggregation of data.
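As I understand the published likes research, it reduced a large user-by-like matrix with singular value decomposition and regressed personality traits on the components; here is a minimal sketch of that shape, with random placeholder data standing in for real likes and questionnaire scores:

```python
# Minimal sketch of the likes-to-traits method described above.
# Data here is random placeholder, not real user data.
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
likes = sparse_random(1000, 5000, density=0.01, random_state=0)  # users x pages, 1 = liked
trait = rng.normal(size=1000)  # e.g., an extraversion score from a questionnaire

components = TruncatedSVD(n_components=100, random_state=0).fit_transform(likes)
model = Ridge().fit(components, trait)
# With real data, predictive accuracy climbs with each additional like,
# which is the pattern behind the colleague/friend/spouse comparison.
```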
What we think of as disaggregated activities in our lives are being aggregated; they're being brought together. I never thought that what I might buy at Rite Aid would somehow have any relationship to what I post or like, or to the comments I make when I connect with Alex on Facebook, or with Matt, both my Facebook friends.
I think those parts of my life are independent of one another. As a human being, I want the right to be who I am, where I am. When I go home, I'd like to be who I am in a different way, in a different fashion, than I am here right now. Who I am in front of my mother is different from who I am in front of my friends; I'm kind of the same person, but still; my mom's right here. So these are things that are really important to consider: the power over our own agency as human beings, and more generally the idea of knowing someone. It's true that from some snapshot of a specific set of activities I might be able to predict someone's behavior. But shouldn't I have the power to change? Should I have systems that reinforce certain types of behavior, feeding me information that I then have no power to overcome? We are all conditioned by systems of all kinds, not just technological ones: school systems, policing systems, governmental systems that influence our lives. We are socialized by the systems of which we're part, and to think that people can somehow overcome those forms of conditioning is naive. So the question for me is really multiple questions: about morality, about ethics, and about our agency as human beings.

What I'm getting at is a set of issues that we're starting to normalize in our lives, and I'm going to give you one, well, two, very quick final examples. This is a story that's made the rounds, based on original research published in Science, which, you know, is the dream journal for all of us to publish in. It's related to the word-embedding algorithm; I don't know how many of you know this research, but it's very much worth knowing. This research has been built into automated systems that are being piloted by various companies for human-resources labor.
As we all know, automated, or at least semi-automated, labor through computational systems is going to replace, if not supplant, or perhaps, I hope, create new opportunities for people's jobs. At the end of the day we need to be building technologies that serve all of us as human beings, even at a company level; we shouldn't be losing people from our jobs unless the goal is to lose people and cut costs. What these systems were doing, word embedding in particular, showed a 50 percent bias in CV scanning between African-American names and Caucasian names, even when the CVs were normalized around other variables to be more or less equivalent in quality (how that's even done is itself a very interesting question). What is essentially happening, again, is not necessarily a problem with the system itself but with what it's learning from and the overall environment within which it's computing. So, for example, words in this system, and generally in corpora of the English language: "female" and "woman" were more closely associated with arts and humanities, surprise, surprise, while "male" and "man" were closer to math and engineering professions. European-American names, in the outputs of these systems, were more closely associated with words like "gift" and "happy," and African-American names were more commonly associated with unpleasant words. The issue is that unless algorithms are explicitly programmed to address some of these issues, they're going to be riddled with the same social prejudices. And so the question really is how we do better: eight hundred and forty billion words were used to train this system, and it was still producing these kinds of outputs.

So what's going on here? Four or five factors I want to identify. First, questions of diversity and inclusion, which are really central to this book you now have in your hands, around the design and engineering of technologies. Second, our data sets, as I mentioned already: what kinds of data sets are these systems learning from? Third, and this is really central to my work, the ontological learning models. That's a fancy term; what do I mean by it? When I say I believe something, that's epistemological; when I want to articulate that knowledge, how I express it, that's ontological. Similarly, the systems we build can be built with models that are more inclusive, where people who might be targeted or unfairly excluded by such systems can be part of the process of designing and developing them. And then I also really believe we need independent oversight, particularly when these are privatized services being used in public contexts. The best example of all is obviously our elections, and elections across the world. I'm writing in my new book about examples like this from Myanmar, Sri Lanka, India, and the Philippines with Rodrigo Duterte and the cyber-troops issue. A lot of this implicates Facebook, but of course, since you're building many technologies that are accessed by two-plus billion people (you should tell me the actual numbers; all I do is scrape journalism), it's pretty impactful stuff.
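You can reproduce the flavor of that word-embedding finding in a few lines. This is a minimal sketch that assumes the gensim package and its downloadable GloVe vectors; the specific word pairs are my own choices, not the paper's full test set:

```python
# Minimal sketch of the word-embedding association finding, assuming
# the gensim package and a one-time download of pretrained vectors.
import gensim.downloader

vectors = gensim.downloader.load("glove-wiki-gigaword-100")

pairs = [("woman", "art"), ("woman", "engineering"),
         ("man", "art"), ("man", "engineering")]
for a, b in pairs:
    print(f"{a:>6} ~ {b:<12} cosine = {vectors.similarity(a, b):.3f}")

# Nothing here was programmed to be biased; the associations are simply
# statistics of the corpus the vectors were trained on.
```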
So we're going to have to think about what the values are that influence how we build systems, and how we think about the other, especially because the places where Google and Facebook and other big companies are going to expand their reach are, guess what, of course the global South: places with more people, higher population density, and less connectivity. But who are those people? How do we understand them not just as users but as people who have voices and values and particular histories, and things to say to us, lives they can express and communicate to us, as we build and design such systems and as we think about the effects those systems pose?
The reason I show this next one, and I think it made the rounds, is that there's also a bit of a movement here in Silicon Valley, and I've interviewed Sam Altman from Y Combinator about this for my new book, along with a couple of other folks, concerned about a kind of superintelligence that might emerge on a recursive level, as various algorithmic systems start to learn not only internally, developing their own forms of complex adaptive behavior, but from one another. I think this particular story was totally overhyped: it was basically an interface language that developed between two AI bot systems. I don't know if you know the story, but I still think it speaks to all the different ways in which we talk and think about AI, because it means a lot of different things. Automation is not the same as AI; specialized AI is different from generalized AI; AI and bias, like the examples I gave earlier, is different from this kind of AI. These are all things worth thinking through, in my mind.

So, of all people, my favorite, Tucker Carlson (that's a joke; I'm showing my biases, and that's why we all should just show our biases) has actually been speaking to cognitive scientists. He is concerned about the effects of the centralization of power around technology on our political system, because he's such a great patriot. He interviewed a colleague of mine who has been quite critical of Google, full disclosure, named Robert Epstein. Epstein's research has been looking at search results in large-scale controlled studies in different parts of the world: Australia, India, and the United States. He's been doing really interesting work; he published one piece in the Proceedings of the National Academy of Sciences, and he was formerly the editor-in-chief of Psychology Today. What this research showed, and this might be very interesting to you, is that if you search for political issues, simply changing the ordering of the search results could flip undecided voters from 50/50 to 90/10. Let me explain what that means. Take Hillary Clinton and Donald Trump, and say I'm totally undecided between the two of them. If the results have Trump at positions one and three and Hillary at two and four, flipping that to Hillary at one and three and Trump at two and four could move undecided voters from fifty-fifty to ninety-ten. Did that make sense? Basically, how you order results, even if it doesn't seem that significant, can substantially affect people's leanings toward voting one way or the other. I think that's a significant issue because, again, it's a transgression into the public space.

One of the most famous examples of this, from a few years ago, and you all may remember it, was Facebook's A/B testing of the get-out-the-vote button. Do you remember this? They said: click on this to say "I voted," and share it with your friends, David. It turned out that this influenced the Facebook users who saw it, the A group, to vote at a 0.5 percent higher rate, which doesn't seem like a lot, but we all know that when we're talking about millions of people it's very significant. Simply showing "I voted" pushed voting to that extent, and we know those numbers are way higher than the margins by which Trump won Ohio, Michigan, and Pennsylvania put together; I just recently looked at those numbers.
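The back-of-envelope arithmetic shows why a fraction of a percent matters; the reach figure below is illustrative rather than from the talk:

```python
# Illustrative arithmetic only: the exposure number is a hypothetical,
# not a figure from the talk; the 0.5% lift is the one cited above.
exposed_users = 60_000_000  # hypothetical number of users shown the banner
lift = 0.005                # 0.5% higher turnout among those who saw it
print(f"{exposed_users * lift:,.0f} additional votes")  # 300,000: election-deciding scale
```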
To me that's not necessarily a critique of Facebook, but it's meant to show that these platforms have a profound impact on our behavior. And this is all to get at some of the work I've been doing in this space. When this book you have in your hands came out, I started to make the rounds, I think because I was concerned about some of these global and cultural questions, even as they blur into the political-economic questions I've already raised in this introduction. So, of all places, MSNBC's Morning Joe had me on two or three weeks after the election, and I literally saw Mika Brzezinski's jaw drop in front of me when I spoke about concerns with groups like Cambridge Analytica: how, when you create so-called open ecosystems for advertisers that are closed off for users, you can have these pernicious effects on our democracy, or on what we fight for and aspire to in a democracy. Of course this is not simply an issue with Cambridge Analytica; it's far more pervasive, with Russia and other examples like it. And really, it's not even about the effects these folks had on our elections; we're not really sure, with Cambridge Analytica. I have yet to see a solid piece of research showing that Cambridge Analytica actually affected the election, and Cambridge Analytica's content is not the same as fake news; it's more like framed news based on psychometrics. Michal's work, you see, was influential there; he's not happy about it, but still. This is also meant to make the point that as we design and build systems, we need to understand what we value in terms of advertisers versus users, how we balance the two, and how we think not just about the short-term effects of the technologies we build but also about these systemic effects on our larger political systems. I think that's really important, and I'm appreciative that a number of folks from Google worked with the Obama administration afterward, helping advise them on this. I hope there's a devotion to public and civic life and democratic ideals from our tech companies, because you all are really powerful.

All right. My colleague Zeynep Tufekci has been making the rounds; some of you might know her work. She's a quite public critic of what's going on these days. She, in particular, has been concerned about some of the algorithms populating YouTube, especially the recommendation systems. I've yet to see very strong empirical work showing that the auto-suggestion feature of YouTube (is it called auto-suggestion?) actually changes people's perceptions, but it's hard to believe it doesn't; I'll speak for myself, and for some cognitive science studies, in saying that what we see impacts what we believe. But Tufekci's critiques, which I'm sure have been heard here, or maybe not here but at YouTube, have been based on her own experiences; they're anecdotal rather than large-scale quantitative evidence, and I think that's important to note.
She said: let me watch Make America Great Again rallies on YouTube. And she makes the point that she started to see more and more radicalized content. What was suggested after watching a Make America Great Again Donald Trump rally might have been content that was a bit more alt-right, or white nationalist, or, heaven forbid, neo-Nazi. Of course this has nothing to do with Facebook or Google or any other company being pro-alt-right; of course not. If anything, as we saw with some of the interrogations of Mark Zuckerberg by our Congress, who don't seem to understand technology very well, at least the ones in the Senate, generally (I don't know why their staffers weren't schooling them on some basic concepts of computing), the Republicans really did not like Mark Zuckerberg. I think they admired his wealth, but it didn't look like they liked him very much, and that's because, generally speaking, Silicon Valley tends to lean more Democratic; of the two parties, that's just been the case historically.
So I guess the question is how we optimize recommendation systems, in this case YouTube's, to balance the goal of maintaining attention, as my colleague Tim Wu points out in his excellent book The Attention Merchants, which describes Google within this frame as well, and therefore the gathering of data and click-throughs and so on, against what might be, if you will, a vegetable. I like my information junk food; we all do. I like my french fries, but I also want that salad, or maybe I don't want the salad, but I need it after I eat. This is a point that's been built into a lot of our conversations about technology for quite some time, including work by my colleague Eli Pariser in his book The Filter Bubble several years ago, which even Obama referred to in his interview with David Letterman that was up on Netflix.
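One concrete way to think about that junk-food-versus-vegetables balance is a re-ranker that trades engagement against redundancy. This is a generic maximal-marginal-relevance sketch, not any platform's actual ranker:

```python
# Generic maximal-marginal-relevance re-ranking sketch: pick items by
# engagement score minus a penalty for similarity to items already
# chosen. Not any platform's actual algorithm.
import numpy as np

def rerank(scores, embeddings, k, trade_off=0.7):
    """Pick k items balancing engagement (scores) against redundancy."""
    chosen, candidates = [], list(range(len(scores)))
    while candidates and len(chosen) < k:
        def value(i):
            redundancy = max((embeddings[i] @ embeddings[j] for j in chosen), default=0.0)
            return trade_off * scores[i] - (1 - trade_off) * redundancy
        best = max(candidates, key=value)
        chosen.append(best)
        candidates.remove(best)
    return chosen

rng = np.random.default_rng(0)
emb = rng.normal(size=(20, 8))
emb /= np.linalg.norm(emb, axis=1, keepdims=True)  # unit vectors, so dot product = cosine
print(rerank(rng.random(20), emb, k=5))
```

Lowering the trade_off parameter in a sketch like this surfaces more varied content at some cost in raw engagement, which is exactly the tension the talk describes.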
Here are examples of these effects in different parts of the world; I won't go too deeply into this. In Sri Lanka, Facebook's similar techniques, I suppose, privileged more hysterical content. There isn't a strong independent journalistic media there, nor are there very strong governmental institutions in Sri Lanka that can regulate and push back, and this actually created real-world violence; this is a New York Times article that came out fairly recently. So these battles are very significant, not just about data and attention but about the internet itself. This is me on Joy Reid's show a few weeks ago, actually a couple of months ago, talking about net neutrality. I'm very happy that Google supports net neutrality, but there are conversations even about that, because at this point many different forms of our democratic activity, especially online, are under attack.

I wrote a piece when my book came out, concerned about some of these issues, and I made the very simple point that how we learn about others culturally is very much determined by the instruments of search, and by what we see in the ordering and filtering of search results, which then influence how we know and understand one another. We even know from research on Wikipedia, I don't know if you've seen it, that it tends to be very asymmetrically authored by men, and specifically by men from Europe and North America, and that's generally the story of technology at this point; but it doesn't have to be that way. So I give a very simple example in the article I wrote for Quartz, and also in the book itself, that we can do something better about this. The example I give is, somewhat at random, the country of Cameroon in West Africa. I was invited by UNESCO to visit that country, and I searched for Cameroon; of course I use Google like everybody. In the first couple of results, in fact in my entire first page of results, I didn't see a single webpage from Cameroon, which actually has a pretty high internet penetration rate and is quite educated, and is Anglophone as well as Francophone; it's not just a Francophone country. So why was I seeing that? I thought Google might know me better. In this particular case I was seeing content that likely correlated with what was more popular and validated, perhaps by PageRank and various forms of backlinking. Mass validation is not a problematic concept in my mind, but mass validation is not always what should count as knowledge, and we know that whatever comes up in our search results is essentially treated as truth and knowledge by a large percentage of users, and in many cases by myself.

So I'm concerned with these issues, and I write about them in this book, and in my second book, written with my colleague Adam Fish, where we cover hacker examples, the Silk Road, the Pirate Party, and Iceland; these are all things you can ask me about later. There are also examples of powerful uses of technology in the context of the Arab Spring. I did fieldwork, after I wrote this first book, in Egypt in the middle of the Arab Spring, looking at what people were doing with technologies old and new, not just new ones: using projectors and bed sheets, taking material they saw on YouTube and projecting it into public spaces. All of this is really powerful and holds great promise. One of the major arguments I make in the book, which I referred to earlier, is this concept of ontology. I describe it basically in chapter one, but I really elaborate on it in chapters three and four, based on fieldwork I did collaboratively, from about 2002 to about 2014, with various Native American communities, thinking about how to build systems, from languages to databases to algorithms to interfaces, that those communities, who are my partners, could help design with me. I'm not nearly the engineer most of you are, and I'm not nearly the engineer Google is as a whole, but I was attempting to grapple with these issues in the context of trying to support communities on the other side of the digital divide, where simply being connected to the internet is not good enough to actually support their voices and their agendas, especially for people who in many cases have experienced a great deal of cultural and political trauma.
One example I give in chapter 3 of the book is an ontology that I designed with these communities, something like what we in information studies call an information architecture. This was a project I did with 19 Native American reservations between 2003 and 2005, described in chapter 3 of the book. In that part of the book I describe how these communities attempted to take advantage of an internet infrastructure they had built and owned, with the help of Hewlett-Packard, to build a system for communicating with one another, building their own local economies, preserving culture, and so on. They were building and designing this system according to this architecture, again fairly hierarchical, but they were naming the categories, back when we used to think about tagging and Web 2.0 and the things Tim O'Reilly wrote about back in the day. They were contributing content and sharing it with one another, deciding what categories were relevant and how those categories should relate to one another. So what this is is the power of naming: giving people the opportunity to classify their own corpora, if you will, within their communication systems, to support, hopefully, one another.
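In data-structure terms, the idea is simply that the taxonomy itself is community-authored. The category names below are invented placeholders, not the project's actual ontology:

```python
# Sketch of a community-authored information architecture: the
# categories and their relations come from the community, not from a
# fixed universal taxonomy. Category names are invented placeholders.
ontology = {
    "community life": {
        "ceremony and season": [],
        "language teaching": ["stories", "songs"],
    },
    "local economy": {
        "arts and crafts": [],
        "agriculture": [],
    },
}

def paths(tree, prefix=()):
    """Yield every category path so content can be filed and browsed."""
    if isinstance(tree, dict):
        for name, child in tree.items():
            yield from paths(child, prefix + (name,))
    else:
        yield prefix
        for leaf in tree:
            yield prefix + (leaf,)

for p in paths(ontology):
    print(" / ".join(p))
```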
Another example I give in the book, in a large amount of detail in chapter 4, is work I did with the Zuni; this has all been funded by the National Science Foundation. In this work I describe what happens when a group of people in a Native American community in New Mexico is able to build and design, with my help, a digital museum system through which they can get access to images of objects that were taken from their communities and are now sitting in museums all over the world. This is another huge part and promise of the internet, the recovery and hopefully the promotion of cultural heritage projects, which I'm also very interested in. One of the coolest parts of the book is the story I tell toward the end of chapter 4 of the Zuni coming together to look at this system we built, where approximately 1,500 objects from different museums are now made available to them as images. And look at what's happening here: it's not one person per computer; it's not an individualized experience. To look at an object, the whole community, different folks within the community of different age groups and different kiva, or kinship, groups, come together. As they're looking at an object, in this particular case an Anahoho, an image of an Anahoho, which is a kind of kachina, some people are putting their fingers in their ears, other people are leaving the room, and other people from other parts of the community are coming into the room. That's because knowledge at Zuni is really based on who you are in the community. I make that point to make the larger point that as we think about Google in relation to diverse cultures in different parts of the world, we have to understand the cultural norms and values that shape what people know and how people share information; that's a really important ethical question as well. The reason people were putting their fingers in their ears is that they were not yet at the point in the community, in terms of a kind of ceremonial initiation, where they could hold that knowledge. Looking at one object, and getting people to share information about it for themselves and with the museums, took about an hour and a half, and they would change the languages in which they spoke about the object. This is really powerful to me; it's an example of technology as a catalyst for a culture as it is. And so my book is not just critical; it's really concerned with these questions of how we can develop and build technologies that serve people, that serve the economic and political and cultural interests of those communities.
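One way a system can honor norms like those is to make viewing permissions a function of who the viewer is in the community. This is a hedged illustration with invented role names and rules, not the actual Zuni system:

```python
# Illustrative cultural-protocol sketch: what a viewer may see depends
# on who they are in the community. Role names and rules are invented;
# a real system would encode protocols the community itself specifies.
from dataclasses import dataclass

@dataclass
class Viewer:
    kinship_group: str
    initiated: bool

@dataclass
class ObjectRecord:
    title: str
    restricted_to: set       # kinship groups allowed to view; empty = open
    requires_initiation: bool

def may_view(viewer: Viewer, record: ObjectRecord) -> bool:
    if record.requires_initiation and not viewer.initiated:
        return False
    return not record.restricted_to or viewer.kinship_group in record.restricted_to

record = ObjectRecord("ceremonial figure", restricted_to={"kiva-a"}, requires_initiation=True)
print(may_view(Viewer("kiva-a", initiated=True), record))   # True
print(may_view(Viewer("kiva-b", initiated=True), record))   # False
```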
So, just to quickly wrap up: this is perhaps the coolest project in the world that I'm working on; I'm really excited about it. This is work I'm doing with the group Rhizomatica, who, by the way, recently won a Google prize, so thank you for supporting them. They are the largest community-owned cell phone network in the world, in southern Mexico. They're in the mountains all around Oaxaca, which is a magical place, one of the most biodiverse and culturally diverse parts of the world, with dozens of Zapotec, Mixtec, and Mixe languages. These communities were not served by big telecom, specifically Carlos Slim, one of the richest men in the world, and Telcel. So they said: hey, we want these communication rights; our constitution legitimates communication rights for indigenous peoples in Mexico; we're going to build our own networks. And they are building their own collectively owned networks. We see examples of this in the context of net neutrality here in the United States too, in places like Detroit and Red Hook in Brooklyn, and in other parts of the world as well; the largest collectively owned mesh internet network in the world right now is a project called guifi.net in Catalonia, and "guifi" is how they say Wi-Fi, which I think is really funny. So this is a really amazing project. I've been doing ethnographic work out in these mountain regions, looking at how these networks are being built, what's produced out of collective ownership, whether it supports people's economies, and how it supports people in speaking languages that have never been written down because of colonial histories: what can emerge out of the rhizome, which is the reason for the name Rhizomatica. In Spanish they also call it Telefonía Indígena Comunitaria, indigenous community telephony. It's a really amazing example of how not just we as human beings but our communities themselves can take power over technology, and as I said, it's really cool that Google has been supporting this project without trying to own it or its data; I think that's to your credit.

The last thing I'll mention, really briefly, is what I'm starting to write for my new book, my third, which I'm working on right now and which will be a trade book with MIT Press. I think I'm going to call it We the Users. I'm writing about examples of how we can head into a technology future that is human and collective, that makes sure people have enough money and portable health benefits. In this context I've been talking to people like Sam Altman from Y Combinator about the universal basic income movement. I've looked at an example from Sweden, and I did a really cool interview with a Stanford graduate who is one of the youngest mayors of a major city in the country; I interviewed him yesterday: Michael Tubbs. He was on the Bill Maher show just two weeks ago. Michael is only 27 years old, a Stanford graduate and Rhodes Scholar who comes from a single-parent home, with a mother near the poverty line, and he's implementing a universal basic income project with the help of some folks connected to the Obama administration; Google has actually been involved on the side as well. The idea is to see what happens when you give people, especially poor people in Stockton, five hundred bucks a month: what do they do with the money? It's an experiment, not necessarily some legitimated path forward, but an experiment. These are a couple of the things I'm writing about in my new book: where we can go to balance flexibility, creativity, and innovation as it's defined here in Silicon Valley with innovation in relation to people and their lives, and, more than anything, the implications of all this for our world in terms of political equality and democracy, economic equality, and really allowing our diverse world to maintain its diversity through the technologies we build, rather than flattening or reducing that diversity. I'm trying to write about all three at the same time, so I'm kind of overwhelmed, but I've had the opportunity to talk to some really important and major figures: Vicente Fox, the former president of Mexico, and I'm talking to David Axelrod, from Obama 2008
and 2012 and now on CNN, next week; Van Jones; and other folks like Elizabeth Warren. I've had a chance to get these people's voices, very briefly, into the book itself, along with a number of folks here in Silicon Valley; I spoke to the head of diversity here at Google. You're the only major tech company that's talked to me so far, so thank you for that. I'm also writing about examples beyond Michael's project and the Swedish one, like the AI lab at Makerere University in Uganda, which is so interesting. It's an example of a company, sorry, an AI laboratory, that's attempting to build artificial intelligence models that are somehow supportive of Ugandan interests, that are seeded with Ugandan data, and that are built around learning models that are hopefully expressive of the cultures and communities of Uganda itself. So this is a whole body of work; I would love to come back and share more from the new book when it comes out, and
I'm really excited to get some of your feedback and thoughts on all of this. I tried to throw a lot at you, but that's because I'm really excited to hear your reactions. Thank you for having me.

Okay, let me ask you a question. You mentioned the power of these algorithms, because of the scale they have. When you take that into account, and also take into account the inevitable fallibility of the humans who are creating them, what can companies do to hold themselves accountable, ethically? Maybe by training employees a certain way, or does there need to be some sort of auditing process? Have you thought about what that should be? Should the onus be on the companies themselves, or should it be government? Maybe you could just talk a little bit about that.

Thank you for asking that; I appreciate the sympathy in that question for what I'm trying to say. Absolutely, I think there should be internal auditing processes, and not only that. One of the more brilliant design companies here in Silicon Valley is IDEO, and they were really smart to have anthropologists and sociologists in the room with their engineers and designers; having teams that are more inclusive and multidisciplinary in the design, engineering, and even evaluation process is really important. One proposal I've been talking about for the last year or two is giving folks an opportunity, on a heuristic, descriptive level, to understand why they see what they see. I understand you can't give up private software code, and I don't think people are interested in that, nor would they even understand it; I don't think almost any of us would. But at the same time, you can explain to people: this is optimized for this output, and here are some other options for what you could see, based on other sorts of, if you will, languages or values for which something could be optimized. I also believe that when we talk about technologies that are blurred into other realms, into our political lives, our educational systems, our criminal policing and justice systems, our economic systems, there has to be some sort of third party, not necessarily regulating so much as coordinating, to ensure there are checks throughout the process, so that we do not legitimate or naturalize the biases we have, especially not just on an individual level but on a level that is collective and institutional. That's why I tried to talk about the more collective examples first, like the policing example and the ProPublica article, before I talked about what Zeynep was saying about the YouTube algorithm: because I think these collective implementations are far more pernicious
than the individual radicalized content. I've seen radicalized content online, and I don't mind looking at it; I don't think it necessarily transforms me. But on a larger, systemic scale it's an issue.

I actually just wanted to build off of what's already been asked and answered. When Mark Zuckerberg first gave his interview after the whole Facebook, Russia, and Cambridge Analytica scandal, the summary was: who knew? I had no idea Facebook would ever do this. That seems to be a pretty common critique of technocrats and CEOs of tech companies: they break first and think about the consequences later. And I think that cycle is further exacerbated by the fact that a lot of these companies are going public really quickly and are essentially pushed to grow and grow and grow, so you'll see companies like Facebook and Apple and Google try to break into China for the sake of growth, for the sake of revenue, not necessarily for the sake of inclusion; for the sake of money, I think. So how do we impose some kind of, not necessarily regulation, like you said, but some kind of balance, some check, to guard against these biases and this desire for growth, as opposed to inclusion, diversity, and corrections for algorithmic biases in models and similar things?

What a great question, and it also allows me to mention one other idea I had, which is giving people the opportunity to visualize why they see what they see and to choose alternatives; a lot of us as designers used to build design systems that were visually based, multivariate, and scoped, and we could think about that. So let me answer your question. It's an incredibly difficult issue, because, you know, Tim O'Reilly makes the point that the master algorithm of them all is market-cap valuation. It's an interesting point for a father figure in computer science to make; he's made a lot of money off of his own publishing business. I think we have to figure out ways to experiment with other models, and we can do this in a small-scale, lightweight fashion, "we" being folks in companies like yours, to see if they actually create returns that are similar, or perhaps close, to the current model. I've been looking at some research showing that you can build very persuasive, very strong engagement products with more diverse design teams, and that diversity in VC investment could actually incubate highly lucrative and profitable industries as well. Of course it's difficult to want to break, or even extinguish, something that is so wildly successful, and I would say Google is nothing if not successful at that level. However: move fast and break things, right, the casual motto Facebook embraced. And we get it; I'm a former engineer; that's just meant in the playful way a lot of us talk, just like at MIT we use the word "hack" in a very loose sense. But I think what we're realizing is that we can't break other aspects of our lives and overweight only our economic bottom lines. Otherwise
the blowback, in terms of public PR, but also maybe in terms of our own internal notions of ethics and what we stand for, could compromise us. So I would just encourage folks to think about whether there are some different, lightweight models (it's not a very radical proposal, in my mind, at least) by which we could experiment with other ways of being inclusive, other ways of being transparent, and other ways of being accountable. And the last thing I'll say is about network effects; I didn't mention this. The network effects of having mass amounts of users, without necessarily being as accountable in the governance of those systems, make a lot of money for Facebook, and probably make a lot of money at Google. You have a much larger global governance team than Facebook does, which is about two dozen people. Two dozen people, for I don't know how many countries Facebook operates in, dozens of countries, two-plus billion users: if you only invest that amount of money, two dozen people to deal with all the global effects of your technologies, you're saying something about what you value. It means you are massively privileging the economic network effects of your technology without investing in trying to curb some of its potentially pernicious effects. So I guess all of this should go on the table. We can make very low-scale investments; Google's an experimental company; why not try it out? Let me help you.

Moving on from this idea of governance: I'm interested, from your travels, or from stories from colleagues and such, in the perception of governments, elected governments, whether at a city level like Stockton, or a country level, or even an EU level. What are their thoughts around the issues you've talked about, specifically the ontologies and learning models and the biases that might come from these huge private systems, which are then either incorporated directly into government or whose ideas are put into play there? I'm interested in governments because governments are ultimately, hopefully, theoretically, accountable to their citizens. What does that intersection look like, and what are some opportunities to do good that you see?

I think it's not that great right now. When I go to different parts of the world, the digital divide, which really never should have been framed as being about access to technology, turns out to be more about literacy and the opportunity to produce and create technologies within one's own ecosystem; that's what the ultimate divide was. And we shouldn't presume that the mere ability to create technology in a place is somehow beneficial to that place economically or politically. But I think there are two major strategies underway among the more, if you will, clued-in or ahead-of-the-curve folks I've seen in different parts of the world. The first is to piggyback off of these large-scale private technological systems and infrastructures and build local economic and political technological ecosystems on top of them. That's been pretty successful in places like India: you can say, hey, these Google services are there, but we're going to innovate on top of them and try to create systems and technologies and firms that benefit our constituencies. To be honest, that to me is not the full way forward. The other idea is to build locally, and this is really interesting; I'll be spending some time in Nairobi this summer, where Ushahidi was born; I think some of us probably know Ushahidi and other companies. It turns out that in many parts of the world, and I think you all know this, there are thriving tech-incubating communities, and Nairobi's is one of the largest. I'm seeing that happen with governments, less the Kenyan government than the local municipal authorities, which are supporting that form of growth; but the issue is access to capital and VC funding. That's been a big problem. So I think what we need to do is pay attention to local innovations
and support them, and think about the idea that innovation is not something that happens only when we have seemingly infinite resources. Innovation also happens when we have very little and we've got to hustle, when we have to adjust within constraints. You see that in all sorts of parts of the world, in what people do on the street. If any of you have been in various parts of the global South, in New Delhi, say, you'll walk through the middle of the old parts of the city and people will be resoldering and rejigging phones in front of you and building informal economies off of these systems. So my point is: as much as possible, and I'm not seeing a lot that's that promising, it would be great if local institutions and government institutions could support other kinds of tech incubation and economies in those parts of the world. I'm hopeful that that happens; I don't see a lot of it right now. And in general, a really quick point: I don't see government officials really understanding technology very deeply.

Thanks. Last week we announced Duplex, where the Google Assistant, through algorithms, will actually communicate directly with humans, and one interesting aspect of this is that they trained the algorithms, it seems, to actually say "um" and "mm-hmm," aspects of our culture here in the US when we talk to people casually, built in. So I'm curious what your thoughts are on this specific technology we're developing, how it will impact the world culturally, and what Google could be doing differently, from your perspective, to make sure our impacts are minimized.

I think in general it would be great if Google could be pretty transparent, and this allows me to say something I've wanted to say, about what data it's collecting and how it's using that data. It doesn't have to be too specific, but I think that matters in general as these sorts of useful technologies spread in their reach. Now, to answer your question directly: I see this as layered on top of a larger potential problem, one that call centers are deeply embedded within, which is the idea that, based on where the money and the customers are, we build protocols around technology that are forced into that logic. What I'm getting at is that I actually just assigned a paper in my graduate course this last week about the Americanization of call-center workers. In call centers, folks are taught to take on American or Western identities, to speak American English, and to learn the forms of speaking that are familiar to all of us. I think it's really important, obviously, that we don't do the same thing with the technologies that are going to be automated and spread to other parts of the world. But more generally, my larger point is about a sense of social responsibility that recognizes that the gathering and acquisition of data and attention is not the only way to produce value; there are other ways of producing value, and we can take a more balanced approach toward them. To be honest, I'm not an expert on this particular technology. I did see the video, and like everyone else I thought, wow, would I buy it; it went super viral; David and I were just talking about it. But I think the questions, again, of design and what it means are really central here. So feel free to ask me more about it once I learn more about it as well.