Hi there. I'm standing here with our keynote speaker, Alix Rübsaam from the University of Amsterdam. She was headlining the Frontline Summit 2018, where we started a discussion around how we use data, but also how we design behavior for better lives. You're a philosopher of technology, so I'm pretty sure you have a lot of interesting views around that, and your keynote speech, I think, left a lot of people wondering. I want to understand a bit more about what it means to be human in the digital age.

So, in your talk, what struck me the most was that we tend to think of humanity through the metaphors of the technology of that time. Could you give a one-minute rundown of how that happens? What's the mechanism?

Yeah. So in my talk I actually used the work of a Greek computer scientist who sketches this sort of pattern. He says that we always mirror what we think of ourselves in the technologies that are present and important at a certain time. He uses the example of Jewish folklore thinking of golems coming from the earth; of Greek doctors thinking of humans as hydraulic machinery with four fluids running through them; he talks about Descartes, speaking in a time when intricate mechanics like clockwork were very important; and of the time when Mary Wollstonecraft Shelley wrote Frankenstein, when electricity drove a lot of the technologies of the day. And in all four of those times he sees a pattern: that human beings tend to think of what they are through the lens of something that's familiar, specifically their technology. I will say that I buy a part of what he says, and I used it in my presentation because I think it's quite tangible and very visual. I'll add that I think this human tendency to think about even what it means to be human is something that we see very specifically
in the last, say, 400 years, but not as much before that. There is, especially around the time of Descartes and in the years after, a tendency to ask this question, what does it mean to be human, and equally a tendency to answer it by looking at something familiar, to use a known metaphor to explain it. And we see that currently, I think, in how we have a tendency to think of what it means to be human as being a computational entity of sorts: an information processor, a data processor, where our environment is made up essentially of information that we process, and that's how we move through the world.

I mean, a computer used to be a job title, right? You had people who were computing, and then computers, mechanical computers, or let's say silicon-based computers, took over that work. And now I think that's kind of set up this crisis for us as well, which you referred to as AI as the end of humanity.

Yeah. Well, I think what's important about what you point out about computers being humans is that data also used to be something quite smaller than we consider it now, because at the time data was just some numerical representation of some part of the world. Right now we have a tendency to think of the whole world as a potential source of data, or even as data altogether. So the way that we interact with people, like you and I are exchanging data right now, or the way that I interact with the world, with my health for instance, my diet, my exercise, et cetera, all becomes data that I process, and the outcome can be a healthier life. So we have kind of gone from humans being computers, actually computing data that was very specific, to all human beings being computers that process everything around them as if it were data. That shift has equally opened up the possibility for us to think of another computer, say an artificially intelligent computer, as a threat, because we have reduced all of what we do to information processing. Then along comes another information
processor that, well, it doesn't say so itself, but we know it is going to be a lot better at processing information than we are. And because we haven't looked at anything else that humans do beyond information processing, we're starting to freak out: if it's going to be better than us, then we must be in dire trouble.

I see that there's a lot of talk, a lot of people talking about the threat of artificial intelligence to humanity, to humankind as a whole, to specific human lives. And I think a lot of that discourse is driven by a notion that, at least for the important part, human beings are information processors, and a lot of our work now is centered around processing information. And because in the last 400 years we've become more and more defined by the kind of work that we do, be it industrial work or the service industry or, well, information work now, I think that also threatens a sort of core of identity for people. But at the same time we say, OK, let's have the machines do what machines do better and stick to what we as humans do better. But as you pointed out, computers seem to be better at reading scientific papers, or at designing art per se as well. So what room does that leave for our definition of humanity?

Yeah, well, I'll say that I don't think computers right now are necessarily better at designing art than human beings. But it is interesting that we like to hone in on these things: but we have art, that's what we're better at; but we're better at emotions. We look for this thing, our niche, that will still let us differentiate ourselves from computers. And I think, more than searching for what we are, it's interesting to see how this
threat is making us so stressed out that it seems we've unearthed this notion that we really, really want to know what we are, and to hold on to that. And we have a tendency to think about that through things that we know, things that we see, and then mainly to think of what we are as what we do, you know, like our jobs. There's a sense of identity that comes with that, on an individual level, and there's a sense of species identity that's also associated with it. What I would think is very interesting is that, if we have this discourse on the threat of artificial intelligence, the threat of our technology overtaking everything that we are, maybe this discussion can be so disruptive that we finally take a step back and say: you know what, perhaps there isn't a singular answer to this question. Maybe we can move beyond this stressful search for a singular identity, which is very much the way that we tend to approach this question. It's as if we have a Venn diagram of what is human, and we'd like to have a single word for it: in the time of Descartes it was reason and cognition, and so that's what it is; if you belong to this category, you're one of us, and if you don't, you're not one of us. Historically, obviously, this has posed quite a few problems. We can show that with this group definition, this is what it means to be human, and if you belong then you get all the rights associated with being human, and if you don't have the quality that qualifies you for the category, then we can really easily justify to ourselves treating you in a way that is, yeah, inhuman. And to dehumanize our
enemy, or our other, of course has a history of very, very problematic practices. And, I mean, there are many people that we now consider to belong to the category of human that we haven't in the past: let's say women, or anyone not from Europe, in a sense. So we can see how the practice of clinging to a single definition can cause quite a bit of hurt. And so a fear I have is that we are using this computational definition of humanity, hand in hand with our increasing technological presence, as another instrument of differentiation: those that compute well, with computers, will be whatever we consider to be human, and those that don't will just be dehumanized, in a sense. That is quite a serious threat that I see.

The other part of that, if I can go on a tangent, is that, equally, some of what is going on in how nervous we're becoming around these technologies can empower us to change these things, to say we're not operating on this notion of humanity anymore, so to let it go.
This, I think, goes into, for instance, algorithmic bias, which everyone is now talking about: oh, algorithms are bad because they're racist and they're sexist, et cetera. And they are. They are super racist and they're super sexist, even those that don't take race or sex as a marker. Take the New York housing example, where an algorithm divides housing on the basis of who goes where: it turns out the algorithm is super racist even though they never marked for race or ethnicity, because when the data is divided by neighborhood, for instance, people of different ethnicities end up sorted anyway. What's interesting about that is that it empowers a discourse. We have for a long time had a tendency to look at questions of ethnicity and sexism as if gender or ethnicity exists in some sort of political vacuum, right? As if it's just a category, and nothing about your life beyond that has anything to do with it. Now that we're looking at algorithmic bias, we're seeing that it doesn't: we have proof that these things don't exist in a political vacuum. So you could use that shortcoming in technology to argue for, I don't know, intersectional feminism, because you have proof that it doesn't exist otherwise. That's a tangent, but it's a fun one.

So basically what you're saying is that technology, even though, or actually just because, it has now shown that we exhibit interesting biases, and I use the word interesting loosely, even disgusting biases in our society, which we tend to ignore somehow as humans, through our upbringing or through our own biases; but actually technology, these algorithms,
could make them transparent, and would then actually force us to start asking what kind of society we want to build? Yes.
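The proxy effect described above, where a model never sees the protected attribute yet still discriminates through a correlated variable such as a neighborhood, can be sketched in a few lines of Python. Everything here (the records, the neighborhoods, the 0.5 approval threshold) is invented purely for illustration:

```python
from collections import defaultdict

# Hypothetical loan records: (neighborhood, group, repaid_loan).
# Group membership correlates with neighborhood, as in redlining.
records = [
    ("north", "A", True), ("north", "A", True), ("north", "B", True),
    ("south", "B", False), ("south", "B", False), ("south", "A", False),
    ("north", "A", True), ("south", "B", False),
]

# "Train" an approval rule that uses ONLY neighborhood;
# the group attribute is never an input to the model.
stats = defaultdict(lambda: [0, 0])          # neighborhood -> [repaid, total]
for hood, _group, repaid in records:
    stats[hood][1] += 1
    stats[hood][0] += repaid
approve = {hood: repaid / total >= 0.5 for hood, (repaid, total) in stats.items()}

# Measure approval rates per group: the bias reappears via the proxy.
rates = defaultdict(lambda: [0, 0])          # group -> [approved, total]
for hood, group, _repaid in records:
    rates[group][1] += 1
    rates[group][0] += approve[hood]
for group, (ok, total) in sorted(rates.items()):
    print(f"group {group}: approval rate {ok / total:.2f}")
```

Even though `group` never enters the decision rule, group A ends up approved at a rate of 0.75 and group B at 0.25, which is exactly the pattern described in the New York housing example.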
Yeah, I think that's exactly the case. I think some of our technological shortcomings, as we become more and more reliant on data-driven systems, are more and more exposed. So we see our own biases; we see how what we think is objective programming isn't anywhere close to objective programming. It is exposing some of the biases we have that have been very hard to make explicit before, and in that way it can be quite empowering. This all depends, of course, on how you choose to use the information that you get. Oh, it turns out my algorithm is biased: there are going to be people who say it doesn't matter, it does what it needs to do, and there are other people who will say, well, that's actually kind of a problem, maybe. And this is exactly coming back to the importance of multidisciplinary approaches, of interdisciplinary approaches. If you don't put people who can build an algorithm and people who say, hey, have you seen that that's actually a super racist algorithm, in the same room together, then we're just going to end up with a rampant algorithmic, data-driven society that further augments any form of inequality that we currently have. That is a real risk, but there's also a real opportunity for those kinds of people to work together.

Yeah. So either you could take the route of China, create a social credit score, and actually use the biases of algorithms to enforce certain behaviors, or you could start asking: hey, if we expose certain biases, can we actually reduce their impact, or even try to remove those biases, not just from algorithms but from society?

Yeah. I mean, I don't like pointing at other places; I like pointing at ourselves. I think we have enough algorithms that are steering us right now already that we might not even be super aware of,
in our society itself. I mean, Facebook is an obvious example of this, in that we are being steered; we're looking at politics right now, and we cannot deny that algorithms are currently playing a role in our politics. So we can point at China and say boo, but we do the same thing; we have the same thing. There's maybe less of a single human being behind the steering wheel, but that doesn't make it less bad. So that's a problem; that is a direction we're going in right now. And I would be the very last person to say, oh, it's all fun and we can lift bias from the world. I'm just seeing an opportunity to expose and make explicit some of the things that have been very implicit for a long time.

So when we discuss humanity and computers, or AI, or algorithms, we tend to position ourselves, and then the computers as the other. But I find it fascinating that, for example, if you look at the work of Garry Kasparov, the former chess master, who was my childhood hero: he lost to Deep Blue and didn't get disillusioned, but actually started playing chess together with computers against AIs. And it still turns out that humans and machines working together beat any machine, or any human of course. I find that fascinating. So could we also change our relationship with machines in some way, and actually try to augment our humanity instead of battling over the definition?

Yeah, I think we will have to, in the sense that I very much already believe that we, well, let me pause. I think we have a tendency to demarcate ourselves from the world in a way that, we may have to come to terms with, isn't very viable. We are at a point in time where we must be more aware of our influence, of our entanglement
with our environment. For a long time we have been able to say: I am my own island, I just live here, I'm a rational human being, I don't influence the world around me. And there's a certain privilege associated with
that perspective, where you can just say, oh, I can just live my life and that's fine. We are, and not just within a technological perspective, I think, coming to a time and age where we're becoming increasingly aware that we aren't islands, that we very much exist in a network of relations. I hate to have used the word network right now, because it sounds so computational, but we exist in entanglement with our environment. And this goes, to take the most basic example: as a society of people, we decide collectively what our collective norms and values are; we formalize those in the law; and then the law starts executing how we cope with those who don't fit into what we've decided. If we distance ourselves from that, if we say, well, that's just the law and I have nothing to do with it, then that's, I would say, not very realistic. Similarly, if we continue to identify artificial intelligence or autonomous robots as some other, some outside force that is just acting on its own, then at some point we will no longer have anyone taking responsibility. And I think it is very necessary that we start recognizing that, one, we created those systems; two, they are ours, they are an extension of what we thought, at least at the time, was needed and necessary; and three, they are doing what we told them to do at some point, and if we don't like it, we are the only ones who can tell them: you need to no longer do this.

That sounds like you're having an actual conversation with one.

And so we need to recognize our technology as a part of ourselves, and this has, weirdly, been super hard for us. We've struggled to say, yes, this is us, in a way that I think is quite similar to the way that we look at our society and say: I'm not sexist,
so why is the world sexist? Well, it must have nothing to do with me. And it's like, well, you know, we've all sort of tolerated that for a very long time. So if we don't start becoming responsible for our behavior, then we won't start becoming responsible for our technology, and both, I think, need to happen.

At Frontline, one of the speakers talked about how we move from design thinking to actual behavior design. So not just habit design, which is kind of trying to get people more addicted, to click more and more on Candy Crush or to open Facebook more often, but actually creating systems where we help people behave in a different way that fits their goals. And you spoke about goals, and that we're not very good at setting goals. So do you see hope that we could use algorithms to first understand ourselves, and then set goals and actually achieve those goals? I mean, again, I kind of look at AI as a crutch for humanity.

Yeah. Well, what I really liked about that story is that he talked about this fictional, not fictional, this imagined company that is a health tracker of sorts. And his approach was to say: you can look at just the health part, and that's one thing, but what I really appreciated about his approach is that he's putting it in a broader perspective. So he is being aware of effects; whatever you're trying to achieve doesn't exist in a political vacuum or in a personal vacuum. I, as a human being, have my health app and I have my, I don't know, calendar, and to pretend that those two never intertwine, are never interacting with each other, would be a very narrow way of approaching it. And so to say, well, actually, if you have a health app, you have to look at the larger picture of what
kind of role that plays in someone's entire life, not just the small slice. I think that is an approach, I mean, I'm not in the health app business, I don't know how building a health app works, but the approach is, I think, a smart one, because we are no longer looking at things in an isolated way. Whatever we do technologically, maybe, oh, I want to get, I don't know, your heart rate down, will have repercussions and will have an effect in a larger system, not just your heart rate going down. So that approach is a fruitful one, because it forces us to be aware of side effects that aren't necessarily the intended ones, and to look at things in that perspective is, I think, a very smart way to approach it.

So for business people, when we talk about, let's say, KPIs, key performance indicators, that we try to set for services, we usually look at very narrow sets of things, again, very narrow behavior. And I think the point was: if you run a health club, you should start thinking about how you change the behavior of a retired
office worker, to actually convince them to skip going to the pub and instead go to the gym, and to create sort of smart reward mechanics; not just make them addicted to going to the gym, but keep your eye on the far-off goal of actually, well, helping them live better. Yeah, I think it's always a tough challenge when I start preaching at a client: hey, do you really think about the end goal, what kind of impact you want to have in the world? But then again, because all these systems are becoming more intertwined, every algorithm that you let loose in the world seems to become much more powerful than its creator intended. It's kind of the story of the Golem that you spoke of, where the creation outgrows its creator's intent and slips out of control. So how can we become better at understanding the ramifications of what we do with technology?

I mean, this is a story as old as time, really. It goes back to that story; it goes back to the classic Frankenstein story: Victor Frankenstein testing whether his machine can create life, and oops, he's actually created life, and now... Because he rejects it and doesn't want to take responsibility for it, it goes into the world, it does damage, it hurts people, and then he realizes he should have been responsible. We know this story, and it's a scary story, and one that seems to be almost timelessly relevant, in that we have a real struggle to accept responsibility for our own creations. And I wish, man, I wish I had the answer, like, if you just do this, then everyone is happily responsible. I think there are so many things to point out there.
I mean, that would be the most important perspective for me: people tend to look at AI or machine learning and say, this is the first time that we are creating something that we have no control over, that may cause harm in the world that we never intended, that is now running rampant, and we have no idea how to take responsibility for it or take control over it. I think the perspective that says this is the first time is quite naive. That doesn't change anything about the fact that we are, you know, still doing that, but if you think of capitalism, or the war machine, or anything, those are all systems that are in place, that are causing quite real harm in the world, that we as individuals don't particularly have control over; nor can we point to a single individual and say it was their fault and they should have taken responsibility. And I think something similar is happening with our artificial intelligence systems and our machine learning systems: it is possible that they might cause a significant amount of harm in the world, and it seems to be very hard to point at someone and say, you should have done better. If I had a method of convincing people, it's going to be you, you're going to have to do better, I would love to.

It's interesting, you can even point to, for instance, the #MeToo movement. You have a system of sexism that is perpetuated by anyone who participates in it, anyone who keeps silent; anyone who in some way crosses a boundary is being kept safe by that system. It takes some very brave individuals to say, this is no longer acceptable, and then it takes a ton of people to say, I will no longer participate in this. It isn't a single human; it is the responsibility of all of us to make sure that whatever we create with our technologies, with our artificially intelligent systems, doesn't become something
that causes harm for which we have no way of assigning responsibility. And saying, well, I can't help this: that's pointless. No, it's just dangerous.

So if I hear you correctly, you're suggesting that we will not be able to say definitively that, hey, this technology will be harmful or harmless. I mean, if we look at Facebook, it's probably created social connections around the world and done some really good stuff, and at the same time it gave us Donald Trump and whatnot. Or if you look at Airbnb, it's changed the way people travel, probably in a lot of good ways, but it's also caused massive damage in places like, well, your hometown of Amsterdam.

Yeah. And it's fascinating, because I don't think anybody at first understands what they set loose with these systems or machines. So, but then, there
is no cure-all for it, but maybe one thing that helps is to actually become conscious of it and to start having this discourse, I think.

Yeah. Because I think it's maybe fun, but definitely lazy, to point at Mark Zuckerberg and blame him, to say, well, it was you who put Trump there. That's not it. I think the only thing that we can do is to claim our own responsibility and to say, I will no longer accept parts of this. What that looks like can be different for every person, and there might not be much that we can do; that is a real possibility, that on an individual level there isn't much we can do. But we mustn't underestimate the power, the social impact, that it may have to say, I refuse to be on social media. That is real. And we shouldn't judge anyone for saying, I choose to participate in this because it provides me with a social platform, or whatever. But I think it's very easy, and this is maybe the only concrete thing I have to offer: to continue to point at other people and say Mark Zuckerberg gave us Trump would, I think, be the opposite of helpful.

Yeah, yeah. So you caution against oversimplifying, by being open to understanding that there are very different mechanics at work, and by trying to get people engaged in looking at our own behavior. Yeah.

Hey, thank you very much. No, thank you, Alix.
2018-06-06