Future Ethics

Hi everyone, welcome — thanks for coming today and for dialing in. I'm really excited. I'm Josh Lovejoy, a designer on the Ethics and Society team in Cognition, and I'm super excited to welcome our speaker today, Cennydd. He's a designer and ethicist, and what I love so much about the stories Cennydd is going to walk us through today is how practical they are. Often when we think about AI and machine learning and design and product development, there's this big gap between the theory or the science and the application. What's crucial is how we can all learn to recognize that we each have a role to play in this process. It doesn't matter how technical you are, or what your perception of your expertise around ethics or around AI is: everyone has a role to play, and everyone has a responsibility to ask questions. So anyway, super excited to have Cennydd here. Take it away. Thanks.

Hi folks. Before I start: these are kind of jet-lag slides. I've been tweaking them, so they're not super polished — maybe a little rough around the edges — but I think we'll have time for Q&A afterwards as well, to chat about anything that's not clear or anything you'd like to go through in a bit more detail.

We've all noticed the contrition of the last 12, 18, 24 months. This industry of ours, so renowned for optimism, for seeing technology as an unalloyed good, has instead started to have this creeping recognition that we've taken some wrong turns. The founders and the influencers who previously exalted tech as something that would enrich humanity are now writing their remorseful Medium posts and forming nonprofits to express their regret for the decisions the industry has made, and they're begging for a second chance.

There's now an unprecedented and overdue focus on the harms that technology can bring. The press is as likely to call tech a danger as a savior, and that's understandable, because we keep serving up very easy examples. You're all perfectly familiar with the one at the top there: Facebook's emotional contagion study, which manipulated the emotions of 689,000 individuals without consent. Twitter — which I have first-hand experience of — is still grappling with, and frankly failing to get to grips with, long-standing problems of abuse on its platform. And Uber, of course, has been a whole soap opera, almost a poster child at times for unethical behavior.

There's now some evidence that this is having an effect on public attitudes toward technology. Doteveryone, a think tank based in London, ran a survey that found that 50% of people believe technology has made their own lives better, but only 12% say tech has had a very positive effect on society itself. That, to me, is a worrying trend, because over the next 10, 15, 20 years the tech industry at large is going to demand an enormous amount of trust from its users. We're going to ask people to connect more and more of their lives to the products and services we build. They'll need to trust us with their homes, their vehicles, even the safety of their families. The potential harms of emerging technology are greater than even those present in technology today. The stakes will only get higher.
This has been my focus for the last two or three years: the ethics of emerging technology in particular. So today I'd like to talk about some of the ethical challenges we have right now as an industry, but also the ones we're on the brink of encountering, the ones growing before us. Then, crucially, I'd like to spend most of the time discussing how we can react to that — how we as practitioners, and as an industry, can rise to those challenges.

I'll start with data, because it's one of the more familiar aspects; ethical issues in data are probably better known than they are in other fields. Something very interesting has happened with the way we talk about data in the last few years. Once we used the language of the library: data was catalogued, referenced, queried. It was neatly bundled and stored. But now we've had a subtle metaphorical shift: we say that data is the new oil. It's become a cliché, but it's actually quite a revealing metaphor. It suggests that data has become something we burn for profit, and that data is an uncontrollable force of nature — a liquid, a river, a deluge, a torrent.

It's no accident, of course, that liquids have a habit of leaking. Just like any valuable commodity, everyone wants data in some form or another. One of the mistakes we sometimes make when we discuss the ethics of data is to limit it to the ethics of advertising. But of course we know, in this room, that data has value far beyond targeting and tracking. We know the value of data for artificial intelligence: without data there is no AI. With sensors in our devices and in our cities, and our online behavior monitored and tracked, companies and governments alike now have vast reserves of information on individuals and their aggregated behaviors.

From an ethical point of view this poses challenges to privacy and justice. Any imbalance of data is also an imbalance of power. Companies can use customer data to improve services and to innovate, but also to discriminate and profiteer. Governments can use citizen data to plan resources and anticipate social change, but also to track and to persecute. And we could argue that the distinction between customer and citizen here is essentially meaningless; the concepts are rather interchangeable. Under the right legal and technical conditions, data that's available to companies is also available to governments.

Autonomous vehicles, for example: some estimates say they'll be gathering around four terabytes of data a day from their clusters of sensors, lidar, and cameras. That information will of course be terrifically useful to the companies that operate the vehicles or program their software, but the police will clearly want access to that kind of data too — for investigating road traffic accidents, yes, but it's also highly likely the data will be sought to try to solve crimes of other kinds, not just car crashes. It's this prospect that led the Economist — hardly, as you're aware, an opponent of technology or of corporate power — to label the autonomous vehicle a panopticon on wheels.

I think there are two common mistakes when we think about data ethics. The first is how we've approached the very topic within our design processes: we've chosen to hide data flows. Data has been something that's out of sight, out of mind, and I think that's mostly a mistake designers have made — I'm hopefully allowed to criticize designers because I am one. Under the mantras of user-centered simplicity — don't make me think — we've chosen to regard data as complexity, something best hidden under the hood, something the user really has no business poking around with.

The second problem is around security. In the race for ever-cheaper connected devices, manufacturers are increasingly omitting fundamental security practices. As one wag put it, the S in IoT stands for security. The Mirai botnet, which you're probably all familiar with — we now know it was created by three kids trying to hamper their rivals' Minecraft servers. In October 2016 it zombified 900,000 IoT devices, taking advantage of the lack of security protocols in place. So we have a perfect storm: a slew of new sensors and tracking technologies, data pouring invisibly — by intent or otherwise — between a host of devices, and companies and authorities incentivized
to collect what they can. That, of course, makes a breeding ground for potentially unethical behavior.

Another increasingly familiar ethical challenge is the idea of bias in algorithms. Almost all of us, I'm sure, have read the ProPublica exposé on the COMPAS crime-prediction system, which found that black people were tagged as future reoffenders more often than their white counterparts. Bias is a particularly hard thing to tackle because, to many people, data and algorithms are seen as clean, objective, neutral things. But of course that's not so.

Professor Geoffrey Bowker coined a memorable phrase: raw data is an oxymoron. All data and all algorithms carry the implicit biases of the people and cultures that make them. When you choose how to collect data and how to display it, bias leaks through the system. Data is never as pure as we think, because we can't separate it from its means of collection, analysis, and presentation.

Here we see a fantastic infographic from Simon Scarr for the South China Morning Post. It's a great piece of work — powerful, and clearly laden with meaning; indeed it metaphorically drips with meaning. On the right we see Andy Cotgreave, who works at Tableau, completely transform the message of the same data set with just three changes: a new headline, a flip of the vertical axis, and a hue change. The same data, with an entirely different connotation.

Meanwhile the algorithmic grip tightens all around us. The smart city is predicated on the idea of sensing what's happening in our environment and then responding to it. Already we have systems listening for, say, gunshots in our urban environments. We have facial recognition technology that's increasing in power, that can detect ages, ethnicities, emotions, and it's fairly clear that soon we'll be able to build systems that can track people's movements — with facial recognition alone, potentially, or gait recognition — through an entire city. By association we can then examine who their friends are, who they're hanging out with, who they're associating with, with very clear implications. Combine this deep data harvesting with opaque and unquestionable algorithmic decisions and you have the building blocks of authoritarianism.

Technologies are also becoming powerful manipulators of behavior. All technology has the power to persuade, but recently that's become an explicit focus of technologists. A lot of it is centered on nudge theory, particularly in the realm of health: Fitbits and the like to help you live a more active life, diet apps to help you choose healthier eating options. But of course you can persuade people to do any number of other things. These techniques can corrupt humanity just as much as they can ennoble it. We can create products that persuade or train users to keep coming back, and so we have this ongoing — almost a moral panic, I suppose — around tech addiction. Now, it's not entirely well evidenced at the moment whether it's true addiction, but it is clear that chasing engagement in a lot of these software systems has led to some fairly trashy and not highly valuable uses of tech. Again I'd question the role of designers in this system. I think we've perhaps failed to interrogate the line between designing for delight and designing for addiction; the techniques that happen to be particularly good for one also work quite well for the other.

Perhaps a greater concern is where persuasive technology goes next. We have systems that are networked, that can learn from the successes and failures of a million or a hundred million individuals; that are dynamic, so they can change their messaging and their approach on the fly; and if

they're personalized — based on what they know about you, and which buttons they can push within you — that could be an irresistible force of manipulation. It might be a system so different that it deserves its own name; Karen Yeung, a legal scholar in London, calls it the hypernudge. There are already crude examples of these systems becoming remarkably powerful. One study found a 50% increase in Facebook ad click-through simply by tailoring messages to inferred personality traits: looking at people's likes, running them through essentially the Big Five personality test, and then tailoring image and copy to what was believed about that individual. Fifty percent uplift. Amazon already sends billions of automated nudges through its online seller program.

Ethically, of course, the big question is: when does a nudge become a shove? When does this actually start to threaten our autonomy and our free will as individuals? And we know this stuff isn't limited to commerce; there's political persuasion, and maybe even an information cold war, that could result from these kinds of things. Again, the ethical concern is that these hypernudges are invisible, and therefore you get power imbalances. People can't organize against them, they can't collectivize, they don't know who's doing it, and they often won't even recognize that these techniques are being employed.

We also have significant questions around the sustainability of our industry. Victor Papanek, an industrial designer, wrote a famous book, Design for the Real World, and it starts with a hell of an opening line. He says: there are professions more harmful than industrial design, but only a very few of them. I tell this to people who work in software design and they say, OK, it's a cute line, but we're off the hook, surely — we're not the ones wrapping things in plastic, piling them high and selling them cheap. But look at the Volkswagen emissions scandal: that was a pure software cheat, essentially a hack to dodge emissions standards, which has now created extra emissions that will — by estimates I think from the Guardian — kill somewhere between 20 and a couple of hundred extra people. That was pure software; it was all within the existing capabilities of the hardware that was already there. And of course we are now stepping even further into a world of physical–digital hybridity, if you like, where physical products are run by software. I don't think it's a stretch to say that software is heating the world.

Crypto mining is a particular threat. I've been tracking this for a little while now as part of my research, and every time I look there's a new country being cited as the equivalent energy use. It started off as the same amount of energy as Paraguay within the Bitcoin network, then it was Iceland, then the Republic of Ireland; I think it's now Austria. I believe we're now at 0.5% of the world's entire electricity demand being spent on Bitcoin mining alone, let alone the other cryptocurrencies.

There may be an upside: positive things technology can bring, to disrupt the supply chain, to move people toward a right-to-repair or repair-at-home mentality, to
reduce the need for things to be created centrally and then shipped and distributed across the globe. So it's not all negative; there may be good steps we can take. But one thing is clear: the technology industry itself promotes some of the fastest and shortest upgrade cycles in consumer goods, and the pursuit of the new has consequences. Think of all the handsets and tablets we've had between us in the last decade, say. Where are they? What's happened to them, and to the rare earth minerals harvested and mined to create the screens and the gyroscopes within them? Helen Walters, a writer and design critic, reviewed the Consumer Electronics Show — CES — of 2011. Some 20,000 new products were unveiled at that event, including

some 80 tablets, I think it was. Walters was scathing: that's not innovation, she said, that's vandalism.

When it comes to ideas like machine autonomy, I think there need to be some profound shifts in the way we work and in the way we respond to the ethical challenges that are posed. The idea of user-centered design, for instance, starts to flounder a little in this sphere, because we're no longer talking about the use of these systems; we're talking about coexistence with digital systems. How do we live alongside them?

The cause célèbre of tech ethics, I suppose, has for quite a while now been the trolley problem. You know the drill: should we kill two people to save three, would you push a fat man off the bridge to stop the trolley, all that sort of stuff. MIT ran a study called Moral Machine where they essentially crowd-sourced this — I'll discuss the results a little later. For me it's been a useful entry point to tech ethics; a lot of laypeople say, oh yes, I know all about this. But in the words of Andrew Chatham, one of the lead engineers on the Waymo project: really, the intellectual intrigue goes out of the problem, because you just slam on the brakes — that's the answer in almost all of these situations. So for me it masks more interesting moral challenges around machine autonomy.

Technologies have a habit of distributing and diluting responsibility. If my autonomous vehicle hits a cyclist, who's liable? Is it me? The cyclist? The software engineer? The regulator? The car itself? Potentially some kind of mix. Some dilution and distribution of responsibility is inherent with these types of technology. Do we have to take decisions, say, to disable the vehicle if the user refuses to apply a firmware upgrade? Can police, for example, override — hijack — these machines?

Now, the Moral Machine study I mentioned before essentially found support for a utilitarian approach — I'll come on to ethical theory shortly. But that was made more complex by a study in Science, which found that if autonomous vehicles were programmed to follow that logic — essentially, harm as few people as you can — people don't want to buy them. They don't want to buy vehicles that would choose not to put their own safety first, that would risk the owner's safety over that of, say, a pedestrian. So there's a question of whether people will actually want moral machinery if it acts somehow against their interests.

What should we not automate? In 2016 the Dallas PD bombed a suspect, ironically using a bomb-disposal robot. That wasn't autonomous — they piloted it, so it's essentially like a drone — but of course these technologies will become more sophisticated, and autonomy will creep into these domains.

Many militaries are, of course, interested in lethal autonomous weapon systems. And there's a whole host of other, slightly more sci-fi, occasionally sexy topics we probably won't get into here: robot behavior, robot rights, legal and moral personhood — if you own a robot so sophisticated that it's deemed a person, well, that's slavery, right? We have rules against that kind of thing. Or the future of work: what's the impact going to be on the world of work, and how do we find meaning as individuals when a machine can do pretty much anything we can, but better? I'm not going to spend time on those — that's a whole other section — but I think it's very easy for our field to focus on these far-future topics, these distant sci-fi threats, at the expense of more pressing issues. What powerful and super-intelligent people do with technology is more of a threat than powerful, super-intelligent technology.

So I think it's clear the current climate is fertile ground for us to take ethics seriously within our field, to let new ideas take root. Sometimes there's resistance to this: ethics can be seen as a drag, something that will slow down innovation and hamper everything the industry holds dear. But I like the phrasing from Peter-Paul Verbeek, an ethicist of technology in the Netherlands: ethics should accompany technological progress; it doesn't necessarily need to oppose it, it should go alongside it. So I like the idea of taking ethics and infusing it into our work, of seeing it as a constraint. Designers know the power of constraints: they don't always inhibit, they often generate new ideas. We know that constraints are seeds as well as shears.

There's a lot of talk these days about the neutrality of technology, and this work requires that we accept that technology isn't neutral — and frankly it never was. The idea that technologies are mere instruments, tools built to fulfill tasks, doesn't really hold anymore. It was a common idea, but it's also a dangerous one; in ethics you sometimes hear that an "is" does not make an "ought" — just because something is common doesn't mean it's correct.

In 1980 Langdon Winner, a philosopher from around these parts somewhere, I think, wrote his now-famous article "Do Artifacts Have Politics?" His conclusion was: damn right they do. He gave the example of Robert Moses, the city planner of New York in the twenties and thirties. The account is slightly disputed, but it is clear that Moses was a racist, and the allegation is that he built the bridges over the parkways to the Long Island beaches intentionally low, to prevent minorities — who mostly traveled by bus — from being able to reach them, because the buses simply couldn't pass under the bridges. So even these bridges — inert, hulking, hundreds-of-tons structures — had moral and social and political impact. We can't separate tech and human capabilities anymore; these things act together, they're interwoven. Things fundamentally change what people can do and how they do it. So we have to overcome this belief in neutrality, this idea that we can wash our hands of the social, political, and ethical responsibilities of our work.

I might go further: I'd say that design is applied ethics. Allan Chochinov, a professor at SVA, says that design is doing philosophy with your hands — I really like that. Now, sometimes that relationship is explicit. If you're a designer of razor wire, you're making a clear statement that someone's right to personal property is so important that we should injure someone for choosing to contravene it. But even if we're not making weaponry, or anything as enormously ethically and politically laden as razor wire, we're still making a statement about the future. Every act of design makes a claim about what the future should be like. We choose one preferred future and we discard tens of thousands of alternative futures, so of course there's a clear ethical component within that: we're making a case for how we should all live.

I think we have to be skeptical, however, of people who claim easy answers to these problems. Our field tends to like fairly simple remedies: just don't work for that particular company, or just ban ads from tech, or just sign up to a new Hippocratic oath for design and technology. I'm particularly critical of all of those ideas — codes of ethics especially are a bit of a bugbear of mine. If another one were the answer, why didn't the previous twenty work? There's a risk that they become unenforceable and static, and lead to the kind of checklist approach we sometimes see with, say, accessibility.

I also think new people may be needed to address these problems. There's a fallacy, or a temptation, to believe that the people who led us into this mess are also the people to lead us out of it, and I'm not sure that's true. We won't redeem ourselves if we end up reimposing the same old power structures. It's likely instead that ethical answers are going to come from diverse and underheard perspectives.

The great news is that we aren't the first people on these shores. There's a sometimes arrogant assumption in tech that we're the first people pioneering this new future, but of course there are millennia of ethical theory and thought behind us, and it's not just dusty Greeks: it's contemporary academia, it's critical designers, philosophy of technology, science and technology studies. This is fascinating, deep work happening all across the world, but a lot of these academics sadly aren't taken seriously. Our field is very intelligent, but it's quite anti-intellectual, and I think that really needs to change, because there's so much we can learn from these academics and artists and thinkers to help put our work into context.

So I'd like to draw on a couple of key ethical theories and translate them into five ethical tests that I've tried to instill in the people and mentees I've worked with, grounded essentially in modern ethical theory.

One very simple test: what if everyone did what I'm about to do? Would a world in which my action was commonplace — if it were a universal law of behavior — be a better place or a worse place? This comes originally from Immanuel Kant; it's what's known as a deontological approach to ethics, one grounded in duty, in a belief in following certain rules and obligations of behavior.
Example for this might be dark patterns are all familiar I'm sure with dark patterns deliberately, misleading. Interfaces. Well. If we all shipped. Dark patterns and of course the sphere of Technology is diminished, and probably our, technological. Lives, are a bit worse so, that suggests if we apply this test, we, probably shouldn't ship that dart pattern. Another. One that also comes from Kant am i treating people as ends or means I'd particularly, like this one it. Takes a bit a little bit of unpacking, the. Question really becomes am i treating users. As means, for me to achieve my own goals or, am, i treating them as free individuals with, their own goals that are probably superior, to mine now. I don't think this is a question designers, necessarily, struggle with but. I do think data driven companies, tend, to struggle with this in aggregate, and I've seen from. The inside how some companies, essentially. They're framing of users starts to shift with time they, become not the reason, for existing. But. Experimental. Subjects, means for us to hit our okay ours essentially. People become masses, and when that happens I think ethical design is the natural result. Another. View this is essentially the utilitarian, question.

am I maximizing happiness for the greatest number of people, and by extension am I minimizing suffering? As I say, this is the utilitarian approach to ethics, sometimes known as a consequentialist approach, because it's all about the consequences — the impact, the outcome. We're not focusing here on rules and duty, but on something a bit more tangible. I think technologists generally look at this theory and say there's something in it, because it feels measurable; it feels like something we can calculate and come to a fairly definitive opinion on. Of course the downside is: do you really have to do this for every decision you take? Because that sounds like you've essentially become a number-cruncher rather than an ethicist.

The fourth question I tend to pose is based in the third main theory after deontology and utilitarianism: virtue ethics. Would I be happy for what I'm about to do to be a front-page story — front page of the paper, or the website, or whatever it is? Virtue ethicists aren't really interested in rules of behavior or in outcomes and consequences; they care about moral character instead. What does it say about me as an individual? Would I choose to be accountable for the decisions I'm taking? Would I sign my name to them? I think that's important because it speaks to our identity; it speaks to who we want to be as individuals.

The final test I'll offer comes from John Rawls, who wrote A Theory of Justice. This is the idea of the veil of ignorance: Rawls's conjecture that we should design as if we don't know our place in the system — that a fair world, or a fair system, is one designed behind a veil of ignorance, before we've even dealt the cards of who ends up where. So if, for instance, I'm designing a system of public welfare, I should design it so that it seems fair whether I end up as a taxpayer, a welfare recipient, or an administrator of the scheme.

So frameworks are useful, and I'll refer to those tests a little later on. But as I say, for me ethics is something that should be applied; it needs to be translated into action.
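If a team wanted to make these five tests easy to reach for during reviews, one option is to capture them as a small, structured checklist. The sketch below is purely illustrative — the shape of the data and the wording of the prompts are my own paraphrase of the tests described above, not something the talk prescribes.

```typescript
// A hypothetical sketch: the five ethical tests as a checklist a team could
// run against a feature during critique. Structure and wording are illustrative.

interface EthicalTest {
  id: string;
  tradition: "deontology" | "utilitarianism" | "virtue ethics" | "justice";
  prompt: string;
}

const ethicalTests: EthicalTest[] = [
  {
    id: "universalizability",
    tradition: "deontology",
    prompt: "What if everyone did what I'm about to do? Would that world be better or worse?",
  },
  {
    id: "ends-not-means",
    tradition: "deontology",
    prompt: "Am I treating users as free individuals with their own goals, or as means to hit mine?",
  },
  {
    id: "greatest-happiness",
    tradition: "utilitarianism",
    prompt: "Does this maximize happiness, and minimize suffering, for the greatest number?",
  },
  {
    id: "front-page",
    tradition: "virtue ethics",
    prompt: "Would I be happy for this to be a front-page story, with my name signed to it?",
  },
  {
    id: "veil-of-ignorance",
    tradition: "justice",
    prompt: "Would this still seem fair if I didn't know where I'd end up in the system?",
  },
];

// During critique, record the concerns each test surfaces and whether the
// team judged the feature acceptable against that test.
interface TestResult {
  testId: string;
  concernsRaised: string[];
  acceptable: boolean;
}

function reviewFeature(featureName: string, results: TestResult[]): void {
  const failing = results.filter((r) => !r.acceptable);
  console.log(
    failing.length === 0
      ? `${featureName}: no tests flagged concerns`
      : `${featureName}: ${failing.length} test(s) flagged - ${failing.map((r) => r.testId).join(", ")}`
  );
}
```

The value isn't in the code itself; it's in forcing each question to be asked, and its answer recorded, before a feature ships.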

So I've been thinking about the way we develop, design, and build products: what are the points of intervention we can take advantage of within that process? I'm a little wary of this framing, because it makes it seem like ethics is something that sits outside, something we have to inject into the process — it shouldn't be that way, of course — but I'm looking at it from that point of view just to see where we can tease out those moments.

The first one is right at the start of a project: I think we have to start defining our projects properly. One of the most important ways to do that is to think deeply about stakeholders. There are of course two categories of stakeholder. The most obvious is people who can affect the project, but the second is people whom the project can affect, and we often overlook those people. That creates a number of externalities — essentially the economist's way of saying "someone else's problem": effects that fall on people who haven't necessarily signed up to be part of the system. Passive smoking is the example usually given of an externality: the people next to you whom no one was designing for, who weren't considered as part of the cigarette manufacturing and licensing process, and who may yet have to pay the price.

Airbnb, I think, is a classic example of a company ridden by externalities. Airbnb is fantastic if you have spare property you want to let out, or if you're traveling to a city and want to hire a room — brilliant, and they've designed absolutely for those two large user bases. The externalities — the costs — fall on the community, though: the neighbors, the local shopkeepers, and so on. Suddenly there is no community anymore; you have a different neighbor every day, and tax is often not paid on those rentals.

Thomas Wendt, a designer in New York, talks about user-centered design being quite individualistic, to its detriment: it actually makes us too narrow. We focus too much on making things efficient and effective for the specific user we have in mind — the smiling persona, or however we've defined them — and we overlook broader societies and ecologies and environments.

So how do we widen the net? We can use a prompt list: have we checked whether this might have an impact on children, or unions, or whatever it might be? But I think we can even add abstract concepts to our stakeholder lists. Facebook is well aware, for instance, that some of its tools have — if not threatened, then chipped away at — the fabric of democracy, or the free press, things like that.

The persona non grata is a fairly straightforward idea; it's just my name for what's essentially an anti-persona — a persona who is a bad actor within your system, a troll or a hacker or a terrorist, something like that. The point is to give them space in the conversation: create the persona, put it up, and of course your job then is to hamper that individual, to stifle their intended goals within the system.

Another option here is bias bracketing. This is something that's fairly new to me but is apparently quite common in academia, particularly in the social sciences. Essentially, before they conduct a large piece of research, academics will
anticipate up front all the ways that bias might creep into their observations and conclusions, listing them out. Then, while conducting and evaluating the research, they note every moment where they think bias may be showing its hand, and they use those reference points when evaluating the work, checking whether bias could be skewing things as they go.
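As a rough illustration of how such a bias-bracketing log might be kept, here's a minimal sketch; the field names and examples are assumptions made for illustration rather than any established instrument.

```typescript
// A minimal sketch of a bias-bracketing log: anticipated biases are listed
// before research begins, and observations are appended whenever a bias seems
// to be showing its hand. Field names and examples are illustrative only.

interface BiasEntry {
  bias: string;                                       // e.g. "sampling skews toward existing power users"
  anticipatedAt: "planning" | "fieldwork" | "analysis";
  observations: string[];                             // moments where it may have crept in
}

const biasLog: BiasEntry[] = [
  {
    bias: "Recruiting only English-speaking participants",
    anticipatedAt: "planning",
    observations: [],
  },
  {
    bias: "Leading questions that confirm the team's preferred design",
    anticipatedAt: "fieldwork",
    observations: ["Session 3: moderator framed the new flow as 'the improvement'"],
  },
];

// When evaluating findings, revisit each entry where something was observed
// and ask whether it could be distorting the conclusions drawn.
function flaggedEntries(log: BiasEntry[]): BiasEntry[] {
  return log.filter((entry) => entry.observations.length > 0);
}
```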

Moving on a little bit in the process: a really important idea for me in ethics is moral imagination — using our moral intellects, I suppose, to think about different possible futures and to assess what their impact could be. But that's an abstract idea. It's easier if you have a tangible artifact, something to look at as a spark for creative discussion. Here we can borrow from speculative and critical design and create what I call a provocatype — some people call it a prototype, some call it a design fiction — but it's best if I just give an example.

This is by Marcel Schouwenaar and Harm van Beek, created for two Dutch energy clients of theirs. It's a prototype — a provocatype — of a public electric vehicle charging point in a scarce-energy future. Essentially it models how you handle charging infrastructure when demand exceeds supply: how do you allocate those resources? It's a large physical thing. You come along and plug your cables — they actually give you the cables — into the sockets there. Above those are RFID readers, so you tap an RFID card to authenticate, and then dials to request the energy you want. You can say, I want more power but for a shorter time, so there are trade-offs, and the system does the rest: the algorithm prioritizes. Above that is essentially the energy flow, starting at the present time at the bottom and stretching further into the future at the top. You can see whose turn is next and how much energy they're getting; at certain times the energy availability is compressed, and at certain times it's wider. For me this exposes what an algorithmically driven world — call it maybe an algocracy — might look like in a decade.

The really interesting bit is that they also prototyped the RFID cards, and this is part of the provocatype: people are issued with cards that allocate different social status, and therefore different energy status. A doctor — you'll see the one in the middle — gets top priority, but it comes with a caveat: unauthorized use is punishable by law. Underneath that is the probation ID: that energy is capped, and you're the lowest priority.
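To make the kind of prioritization the provocatype implies a little more concrete — demand exceeding supply, requests weighted by the status on the card and by the trade-offs people dial in — here's a toy allocation sketch. It is not the designers' actual algorithm; the tiers, weights, and cap are invented purely for illustration.

```typescript
// A toy sketch of scarce-energy allocation of the sort the charging-point
// provocatype hints at. Tiers, weights, and the cap are assumptions for
// illustration, not the designers' real algorithm.

type PriorityTier = "top" | "standard" | "capped";

interface ChargeRequest {
  userId: string;
  tier: PriorityTier;      // encoded on the RFID card, e.g. doctor = top, probation = capped
  requestedKw: number;     // the "more power for a shorter time" trade-off
  requestedMinutes: number;
}

const TIER_WEIGHT: Record<PriorityTier, number> = { top: 3, standard: 2, capped: 1 };
const CAPPED_MAX_KW = 3;   // illustrative hard cap for lowest-priority cards

// Share the available supply in proportion to tier weight, never exceeding
// what each user asked for, and capping the lowest tier outright.
function allocate(requests: ChargeRequest[], availableKw: number): Map<string, number> {
  const totalWeight = requests.reduce((sum, r) => sum + TIER_WEIGHT[r.tier], 0);
  const allocation = new Map<string, number>();
  for (const r of requests) {
    const fairShare = (TIER_WEIGHT[r.tier] / totalWeight) * availableKw;
    let granted = Math.min(fairShare, r.requestedKw);
    if (r.tier === "capped") granted = Math.min(granted, CAPPED_MAX_KW);
    allocation.set(r.userId, granted);
  }
  return allocation;
}
```

The point, as with the physical provocatype itself, is that writing the policy down makes its value judgments impossible to ignore.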

It's fairly obvious how social status and potential inequality are encoded and enforced through these technologies in that future. Now, the designers aren't proposing this as the best way to design — they're not saying this is the solution the city should implement — but it's the design that gets the right conversations happening, that sparks that moral imagination and helps us understand what this future would look like. It's this kind of weird artifact that creates a wormhole between design and research; you sort of pass between the two. What's really important is that you allow the time and space for that divergence to happen, for those conversations to happen, for prototypes that aren't necessarily good product per se but that get us closer to good product by asking the right questions at the right time — because the debate really is the thing.

In the design phase itself, I'm particularly interested in how we can alleviate some of the dangers of invisibility. A lot of tech's ethical dangers have come from hiding things that shouldn't have been hidden, so by shifting things like data flows and persuasion into the visible spectrum, people can start to make more informed choices. This doesn't necessarily mean a whole heap of extra complexity; we're talking about making things available rather than mandatory.

Probably my favorite example of materializing a fairly abstract idea comes from the Prius — from 2003, I believe, so it's quite an old display — and the idea of energy flows within cars. If you own this vehicle and see this energy monitor display, it gives you insight into a pretty complex flow of an invisible entity: energy within the vehicle. You can then learn how to adapt your driving style to get the best results — you know that if you push a particular pedal too hard, the energy starts to flow from the engine rather than the electric motor, and so on — and you can adjust your actions to improve the economy of the vehicle. By materializing the invisible, we can help people act more sustainably.

I took a stab at how this might look for a hypothetical home hub — something like a Nest or an Echo Show, something along those lines. By materializing the data flows within the system at any one moment, an individual can see what data is being collected about them, its current status, and where it's being transmitted, and can intervene in that system as well. My idea is that these flows could all be tapped on to be paused, withdrawn, corrected, or overridden, and so on.
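To suggest what "materializing the data flows" might mean in practice, here's a hypothetical sketch of the kind of records such a hub could surface on screen, and the interventions a tap might trigger. The fields and actions are my own invention, not any real product's API.

```typescript
// A hypothetical sketch of data-flow records a home hub could display so that
// people can see, and intervene in, what's being collected about them.
// Fields and actions are illustrative, not a real device's interface.

type FlowStatus = "collecting" | "transmitting" | "stored" | "paused";

interface DataFlow {
  source: string;        // e.g. "living room microphone"
  dataType: string;      // e.g. "voice snippets"
  destination: string;   // e.g. "vendor cloud - speech service"
  status: FlowStatus;
  lastActivity: Date;
}

// Actions a person could take by tapping a flow on the display.
type Intervention = "pause" | "withdraw consent" | "correct" | "override";

function applyIntervention(flow: DataFlow, action: Intervention): DataFlow {
  switch (action) {
    case "pause":
    case "withdraw consent":
      return { ...flow, status: "paused" };
    case "correct":
    case "override":
      // In a real system these would open a flow-specific editing view;
      // here the flow is returned unchanged as a placeholder.
      return flow;
  }
}
```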

I also think — referring back to this idea of pushing against some of the tendencies of user-centered design we've been taught — we should actually increase interface friction where it's relevant to do so. So here's my attempt to create a highly frictional interface for when highly sensitive data is involved. We've got theoretically de-identified information here, but as we know, any data that's de-identified bears the risk of re-identification in future. (I've made a typo on the slide, I think; that should read "re-identification.") We have a duty, I think, to alert users when that's the case, and to really pause: rather than "don't make me think," make me think. Engage people in that conversation; make them take that decision with the gravity it deserves. So I'm suggesting here that we actually get them to write it in with a stylus or a finger, so they really know what they're signing away.

Critique is also a particularly important moment for us to take ethical steps. The idea of a designated dissenter crops up in Eric Meyer and Sara Wachter-Boettcher's book Design for Real Life. This is essentially a role of antagonism — constructive opposition. This person is there to lob in a grenade of dissent every now and then, mostly in critique, and say: well, maybe I don't want to do that. Or: what if my name only has two characters when this field demands three? Or, more seriously: what if I'm going through a divorce, or I'm being harassed on this platform — how does this system affect my experience of the platform, how does it affect my vulnerable emotional state? Critique is also, I think, a particularly good point to apply those ethical tests I talked about previously — the utilitarian test, means-not-ends, all that sort of stuff. I think they sit particularly well within it.

Of course, testing has to come into this as well. Kate Crawford, whom I'm sure many of you know very well, talks about fairness forensics: ways to test algorithms for their suitability for the real world, if you like — whether they're ready to go, whether they've been de-biased as much as possible. I'm also a big fan of getting the qualitative experience looked at here: field research that takes in as diverse a range of participants as it can, to try to minimize the risks — because who better to act as a dissenter than someone who has a genuine cause to dissent? Get people in to test the provocatypes you've made. And then there's the idea of stress testing. This is maybe a little higher risk — again it's in that book I mentioned by Eric Meyer and Sara Wachter-Boettcher — and it means inviting people to look at interfaces when they've actually experienced the particular harm you're trying to reduce or mitigate, or are even currently undergoing it. This is something you need to do very carefully; you need properly trained researchers, and you might need counselors on staff. But if you really want insight into the effects on the individuals involved, it's clearly probably the most effective way.
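One very simple check of the kind this sort of algorithmic testing might begin with — and to be clear, this is an illustrative disparity metric of my own choosing, not Kate Crawford's methodology — is to compare favorable-outcome rates across groups and flag large gaps.

```typescript
// A very simple disparity check: compare positive-outcome rates across groups.
// Illustrative only; real algorithmic auditing involves far more than this.

interface Decision {
  group: string;     // a characteristic used only for auditing purposes
  positive: boolean; // did the system grant the favorable outcome?
}

function positiveRateByGroup(decisions: Decision[]): Map<string, number> {
  const totals = new Map<string, { positive: number; all: number }>();
  for (const d of decisions) {
    const t = totals.get(d.group) ?? { positive: 0, all: 0 };
    t.all += 1;
    if (d.positive) t.positive += 1;
    totals.set(d.group, t);
  }
  const rates = new Map<string, number>();
  for (const [group, t] of totals) rates.set(group, t.positive / t.all);
  return rates;
}

// Flag any group whose rate falls below a fraction of the best-treated group's
// rate (the 0.8 default echoes the "four-fifths rule" heuristic).
function disparityFlags(rates: Map<string, number>, threshold = 0.8): string[] {
  const best = Math.max(...rates.values());
  return [...rates.entries()]
    .filter(([, rate]) => rate < best * threshold)
    .map(([group]) => group);
}
```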
And then, zooming out a bit, I think there are things we can do collectively — as teams, as managers and leaders — to create what I would call ethical infrastructure. Diverse teams, of course, have ethical power. They act essentially as an early warning system, an alert that you might be heading down a road you don't actually want to pursue. I've sat in meetings and crits where I've seen people from different backgrounds say: do we realize that this thing we're considering doing would

actually harm us, or would lead us down a path we don't want to take — it's not right for this community, or this group that I know well.

Career ladders and expected behaviors should include some kind of ethical language, to make it clear that ethics is the responsibility not just of a particular anointed group but of everyone. We should know, for instance, how a senior product manager should behave compared to a PM II, or something like that. My old boss at Twitter, Mike Davidson, talks about how important it is to reward the right behaviors, not just the right results. If you reward just the results, you might get all sorts of behaviors that merely happened to be successful, and you may be setting yourself up for future ethical harm.

Core values, I think, are particularly important. Any large firm has these, and they vary from the trite to the actually quite meaningful. My main bugbear is core values that are too short. I see a lot of companies with a single word — "innovative" — as a core value, which is useless, because it can be twisted by anyone to serve their own purposes: how can you object to this tracking software I built? It's innovative. Even if the core values aren't ideal, you can create more localized variants: design principles that work for individual teams or projects. The great thing about those is that they act almost as ethics in deep freeze — a guiding star. When you're not sure which direction to take, consulting the design principles can be a really useful step forward.

Of course, morality is a muscle that needs exercise. The ability to do the right thing isn't simply granted to someone; it's something we have to work at. A good life is one that is actively chosen. So we have to ask ourselves some quite tough questions, as individuals and as practitioners. We have to ask: how might I be screwing up right now in my work? If we're using those ethical lenses, which one applies the most to me, or appeals the most to me? What

would my ethical limit be? Where would I draw a line and say, you know what, I'm not willing to do this — and how would I respond to that? Once you've asked yourself these questions, I think it's important to ask them of others and to involve other people in those discussions — your line manager particularly — just to say, hey, I'm starting to think about these questions. Because once you're primed for that conversation, it makes it so much easier to have the tough conversation with others a little later on, if required.

I think it's important, if possible — if it's safe to do so — to stand up for what's right, to occasionally increase the personal risks you take. Cass Sunstein, now better known as one of the architects of nudge theory, proposed what he called the norm entrepreneur. It's kind of an ugly phrase, but essentially this is someone fairly visible, with some power in an organization or a society, who takes a stand and says: you know what, I don't think we should be doing this anymore; the way we've chosen to do this is wrong; we need to try to change the norms of our community. Now, I recognize that's a privileged position to be in. Not everyone has that safety, if you like, to put themselves in that vulnerable position; people have different needs and circumstances. But if you feel safe and comfortable and respected in your team and your industry, then you're in a perfect position to use a little of that goodwill when necessary, to push for ethical change. And the nature of that disobedience can vary. It can be hard and fast: I refuse to do this. Or it can be gentle: hmm, I'm not comfortable with the direction we're taking this; are there alternative perspectives, are there different ways we can look at this problem? Proposing different solutions to achieve the same goals.

Finally, there's a need for us to collectivize our efforts as well. It's very easy to see an ethical campaign as something that needs someone strong to stand up — the norm entrepreneur I talked about — but it can't be sustained that way. A norm entrepreneur can start these things off, but you need allies; you need wider consultation if ethical change is really to happen. I'm particularly keen that we engage the public. There's something I call the technocracy trap here: it's very easy for us to say, well, only we speak the language of technology, only we know what's happening inside these devices, therefore it's probably down to us to decide the social impact of these things. I think that's a big mistake, and it will be a mistake for our field if we keep thinking that way. I also think we need to find moral allies internally, as I say, and not forget that we are consumers and voters, and we have power through those roles as well. We have the power to engage the public, to educate them about what's happening inside their technologies. We have the power to lobby for change at a political level, even through the ballot box.

So I think there's a host of opportunities for technologists to push for ethical change. It's not an easy job, for sure — I can tell you from my own personal experience it doesn't necessarily make your life smoother — but I do think there's something powerful in it: it helps you step
a little closer to the kind of person you want to be in the world. I think we need to start taking these steps now, to push for a more fair, inclusive, and thoughtful industry, and we're going to need all the people we can get. So I'm delighted to be with you today, and I look forward to seeing what you all come up with in the coming years. I'm not really here to plug the book, but I have written a book about this, should it be of interest, at future-ethics.com. Thank you for your time.

OK, so I think we have some time for questions, if there are any. I'm also happy just to chat one-on-one at the end if people would rather do that.

Also — ironically — I heard the stream has some major issues; it's delayed a little bit, so there are folks who are behind and there may be more questions later. Please use the mic, thanks, for the room.

As you've gone through talking about this, have you seen good examples of where people are taking some of these principles and applying them?

Essentially, no. I was talking to Josh about this just before

we kicked off: that hasn't really been a focus for me; I've wanted to tackle it from first principles. Probably the company that seems to have its act most together, at least from the outside, is Salesforce, which surprised me — I didn't really expect they'd be at the forefront of it. They've gone down the whole ethics-committee route, which I'm skeptical of, to be honest, but they've got the right people in, they've made it cross-functional and cross-disciplinary, they've engaged with academics — real, proper ethicists — and they seem to be putting together something quite interesting. They're investing behind it properly and getting senior support, so I'm watching their efforts.

Google has a whole bunch of efforts — some quite mature, a lot not, from what I hear. They've also reached out to academia: Shannon Vallor, a virtue ethicist and an extremely good one, has joined Google Cloud's AI team on something like a one-day-a-week consultancy basis, which I think is excellent. And DeepMind — yes, DeepMind is definitely a long way ahead of pretty much anyone, because the harms are fairly high with some of their work and they've been in trouble already: they misused NHS data in the UK and got a whole heap of trouble for that, so they're quite keen to be squeaky clean. Again, they've appointed good, respected individuals — academics and practitioners — to committees. But I haven't seen that many teams internalize and operationalize it that well yet. I'm not privy to the inside of a lot of these companies; hopefully that will change. That's why I'm actually excited about the model you have here, which seems to bring this stuff close to product and design, because that's the apex of where these decisions get made.

I think here I'd take it a level higher — we're talking about product and design across a lot of companies — but there

does seem to be these two polarities when it comes to data. On one side are services given away for free, where you are the product, not the customer — their customers are the people targeting your eyeballs, and they couldn't care less about ethics; that's a broad statement, I know. Versus companies like Salesforce, companies like Microsoft, who sell products, and with that comes a certain expectation of privacy, because you're buying the product rather than paying for a free service with your data and your privacy. So first: does the book go deeper into that? Because if you're talking about ethics and AI, you have to acknowledge that the motions around one business model are different from the motions around a business model at the other end of the spectrum. Putting all of these on the same playing field — they're playing two totally different games in terms of following the money and following the profit. Does the book go into more detail on that?

I'm going to disagree. I think they are fundamentally the same. I reject this idea that if you're not paying for the product, you're the product being sold — but not in the way you might expect. The way I reject it is: even if you're paying for the product, you're still a product being sold. I go to a cinema, I pay thirteen quid for the ticket, and I still have to sit through twenty-five minutes of ads. I sign up to Netflix, and they still advertise and preview their own things; they tweak their own algorithms to recommend things that maybe bring me back to the product more. I'm still being sold to in exactly the same way. So I actually reject the separation drawn on business models.

I think there's a potentially quite strong ethical harm that results from some of that discussion. People who like to use phrases like "surveillance capitalism" often end up going to a place that I don't think is very helpful, which is to conclude that essentially the only ethical business model is one funded by the consumer. I disagree, because that itself is discriminatory and inaccessible — a decision that takes the power of technology away from billions. You're then saying that technology should essentially be the domain of the rich.

So I don't see that fundamental distinction: every company is incentivized to get as much data as it can on individuals. There is, however, a bifurcation in terms of privacy being used essentially as a luxury good — there's a whole theory about this, that privacy is now being sold as an upmarket thing. To be very crude about it, you could say Apple's strategy and Google's strategy, when it comes to iOS and Android, bifurcate along those lines: iOS is positioned more as privacy-first but very much as a luxury product, with a selling price that keeps going up, whereas Android — which is far stronger on privacy than a lot of people give it credit for — is certainly aiming for reach, and of course they recognize there's an advertising model that needs to fund it. But

again, this is I think the potential harm of that separation: you start thinking that privacy is really only something that should be funded by the consumer, and as such it becomes kind of optional, if you let that split happen. So that's why I reject the premise. I don't mean to be hostile to your question, but I don't think it's an accurate reflection of what's actually happening inside industry.

I do see a difference here, because you're using Twitter as your example — a free service that sells eyeballs; I know that's where you come from. I don't pay for Twitter, and its business model is selling my eyeballs and selling its data to everyone.

That's fundamentally not true. Twitter, Facebook, and so on categorically do not sell data — if they did, they'd go bankrupt; the data is their competitive advantage. What they sell is access to the user on the platform, and that's a very significant difference, because they're then incentivized to treat that data with the respect it should have.

But if I'm selling a product to somebody, the ability to capture data is an additional value for me, and that data needs to be treated very ethically — that's where it's complex. If my business model is marketing, this kind of weaponized marketing, it's a different conversation around ethics, it would appear. To be honest, I think saying they're the same thing, and that we're just turning it into privacy or the business model — there are ethical conversations across the spectrum, but these two polarities have different sensibilities around ethics, especially because one of them deals with the true nature of marketing, which had been around for decades before the internet. Once you get to this weaponized marketing, versus something that says we're selling products that generate data — then how do you democratize that data by giving the people who bought the products access to the data itself?

I recognize the stereotype; I just don't view it as true. I haven't seen it play out as a real difference within industry, to be honest.

It's interesting — it speaks a little to the framework you described, of the user as the ends or the means, in that distinction. And I think to your point, Rick, about selling a piece of hardware or software such as Salesforce that has a specific use, and that you measure in a certain way, versus one that is essentially designed to keep you using it and that justifies a lot of its use by the very data you generate. There may be something that bridges the gap in our opinions, which is: I do think there are differences in how users are treated, and in the ethical responsibilities that come from that, where engagement is prioritized. A lot of Twitter's and Facebook's and Instagram's increasing failings have come from chasing engagement metrics.

So from that point of view, it's not an intentional decision — it's not that one company cares about privacy and another doesn't — but the business model predicates certain default decisions, I suppose, which make some things harder to challenge.

The reason I shake my head a bit when people say "they're selling eyeballs" and so on is that there's sometimes — I'm not saying this is necessarily the case here — a disingenuous argument that says, well, Google is just an advertising company. It's not. Google is a product company doing a whole bunch of very difficult technical stuff, funded by advertising. And you can bet that most of the people at that company don't care about the advertising; they're grateful it's there to pay their salaries, but they're there to make great technology. It's like saying ITV — a commercial network in the UK — is an advertiser network. It's not: they put on programs, they try to put on the best entertainment they possibly can. I'd like to assume positive intent. Of course it has to be funded, and that funding may have implications, but for me there's quite an important distinction to be made about the intention of the company. Anyway, sorry.

Sorry — I wasn't attributing that relationship as constant.

So, kind of back to the question about who's doing this well: how do you measure your ethical robustness in your organization? Because aside from anecdote, it's hard to tell whether or not your organization is making ethical choices.

Yeah. I'm always a little wary of that particular line, because you sometimes start to make business cases for ethics.

You start to say, well, if we do this we will have the following results, which will affect our OKRs or our KPIs in a particular way. When you do that, you're making ethics subservient to the profit imperative — and the profit imperative is what's causing pretty much all of the ethical problems the tech industry has faced. That said, there are things you can look at: user retention, customer satisfaction, NPS, those kinds of measures. But I'm always nervous about treading into that territory. Staff retention as well, I think, is quite a good lagging indicator, I suppose.

I guess part of my interest is: if you've prompted us to build organizations where we make ethical choices, how would I measure the success of my initiative to build that kind of organization?

I think it has to be experiential — it has to be what you observe within that company. I'm skeptical about abstracting it into something that's easily put on a slide; that's my difficulty with it. It has to be: what's the quality of the conversations happening around me, in the design studio, in critique sessions, demo days, internal messaging systems, whatever it is? As for how you turn that into something more tangible, I don't know — and I kind of don't want to know at this stage, to be honest. Again, sorry to be evasive.

Yeah, it needs to be embedded in the culture, and in certain ways it's a risk you can't mitigate, in a lot of ways — because the risk is that in the future I make a bad decision, and, well, I guess that's always possible.

Yeah. I mean, there are certainly places I've been where the employee satisfaction survey asks questions like "do you trust leadership to do the right thing," so there are other vehicles like that — HR and HRBPs love to talk about that kind of stuff — but I don't have a more sophisticated answer for you, I'm afraid. Sorry.

I think that's where we probably face a lot of this on the product side. We kind of look at ourselves — we joke — as the applied team: we take a lot of the theory and try to ground it for teams. I think the challenge is that product teams talk in KPIs, which is good in other ways, and you have to meet them where they're at; you have to put something like that in front of them and make the business case for why this is important. I think that's where we struggle: how do you translate something that's a gray area — where you don't want to just put it on a slide, but at the same time you have to convince someone that it is important, that it aligns with them, that it's not fluffy, not some nice-to-have feel-good thing? And I think that's where, to your point, it's totally experimental: we are in this experiment. But I think that, as an industry, is something we have to get better at — how we make this real for the folks who care. That's what we're trying to figure out now as a team.

It'd be interesting, as you work through companies, and especially companies like, like a sales perso…
