Trust and digital rights in learned systems (PAIR UX Symposium 2018)

Moderator: Well, I'm going to welcome Sarah. Sarah Gold is an expert in data management and privacy, she sits on the board of Tech for Good, and she's going to talk to us about trust and digital rights in learned systems.

Sarah Gold: At IF, we believe that services should respect our rights, so we work with the organisations shaping our digital world to show how they can be trusted with data. Over the last couple of years we've worked with a whole range of different organisations, and our work is increasingly taking us to quite a specific part of machine intelligence: looking at how people can trust services that are powered by machine intelligence technologies, but also at how those technologies can be trustworthy in the first place.

This is a really important area of work, because services are increasingly powered by machine intelligence, and those services are beginning to creep into really important areas of our lives, whether that's helping you find a job, manage your finances or, as we found out last week, decide who to vote for.

At this point I want to reframe what I mean by machine intelligence technologies. Matt Jones uses the phrase "learned systems", and I think it's really effective, because it helps us refocus the issue on people and society, which are the most important things really. It also reminds us that we actively teach these systems, and that they operate in a real-world context we must never forget.

So I'm going to talk to you about two things: trust and rights in learned systems.

To start, I'm going to talk about trust, and give you an overview of six insights from our work at IF on these issues. Whilst I can't show you exact examples, because I'd be in breach of confidentiality agreements, I have found examples from the public domain, and illustrations from IF's blog, to illustrate the points.

First, design for non-determinism. As designers we are used to designing for fixed circumstances; it's kind of "if this, then that". With learned systems that completely changes: as designers we don't necessarily know what a user will see once the system is showing them a service, because there's an infinite number of possible interfaces a user might be shown. As a user, that creates a real challenge for trust, because how can you have conversations with one another to check "is my thing working right? does this look good to you? is this what it should be showing me?" if everyone's service is showing them something different? That's a real challenge.

At IF we think designers should start thinking about designing frameworks for learned systems to express a range of opportunities within. That's why, as of today, we've launched a project with Alison Powell from the London School of Economics looking at just that: what do these frameworks look like, and how do you design transparency into them, so that users are able to understand, and challenge, what a learned system presents them with? If you'd like to follow that project, please do take a photograph of the blog post link there; we'll be blogging as we work on it over the next few months.

Design for legibility and understanding is another principle. Just like handwriting: if someone's handwriting is really messy, it's really difficult to read what they've written. In the same way, I think services are really hard for users to read, and not only that, they're super difficult to understand even if you can read them.

I'd like to ask the audience: how many of you use Spotify, or iTunes, or Google Play? Hands up. Now keep your hands up if you've read the terms of service update recently. No one kept their hand up. That's right: we know that terms of service, or terms and conditions, are not good enough for the digital services we use today, let alone learned systems.

At IF we think it's possible to begin designing terms of service out, by putting the information about when services are learning, what data is being used, and what powers a user has to change that, all at the point of use. You can see in the example on the left, a fictional benefits service that we prototyped at IF, that this individual has received a sanction on their benefits. What we were interested in doing here is showing why the individual received that sanction, giving them that information at the point of use, and giving them an option to appeal the decision if they think something's not quite right. This is all about the service explaining itself as it's being used, and we think that will be really important as we see more and more services powered by learned systems.

In the meantime, before we can ditch terms of service, we think it's possible to make terms of service more legible and easier to understand. We've blogged about some of our thinking on that, under "Understanding terms and conditions", and I'd encourage you to look there at some of the techniques we think are important for making terms of service more machine-readable and more understandable to you.
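To make "machine-readable" a little more concrete, here is one minimal sketch of what a point-of-use data declaration might look like, assuming each feature of a service carries its own small record of what it collects, whether it learns, and what powers the user has. The schema and field names below are invented for illustration; they are not a standard, and not something proposed in the talk.

```python
# A hypothetical, minimal "point-of-use" data declaration: instead of one
# monolithic terms-of-service document, each feature of a service carries a
# small, machine-readable record of what it collects, whether it learns,
# and what powers the user has. All field names here are invented for
# illustration; this is not a real or proposed standard.

POINT_OF_USE_DECLARATION = {
    "feature": "appointment_reminders",
    "data_collected": ["calendar_events", "location_at_appointment_time"],
    "learns_from_data": True,          # the feature trains a model on this data
    "retention": "90_days",
    "user_powers": ["view_data", "delete_data", "appeal_decision"],
    "last_updated": "2018-04-01",
}


def render_at_point_of_use(declaration: dict) -> str:
    """Turn the machine-readable record into the short, human-readable
    notice a user would see while actually using the feature."""
    learns = "learns from" if declaration["learns_from_data"] else "only stores"
    return (
        f"This feature {learns} your "
        f"{', '.join(declaration['data_collected'])} "
        f"(kept for {declaration['retention'].replace('_', ' ')}). "
        f"You can: {', '.join(p.replace('_', ' ') for p in declaration['user_powers'])}."
    )


print(render_at_point_of_use(POINT_OF_USE_DECLARATION))
```

The point of the structure is that the notice can be generated where and when the data is actually used, rather than buried in one up-front document.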

Design for override is another interesting area. This is because trust is a two-way relationship, so it really wouldn't be any good if a learned system simply presented a user with decisions it had made, without the user being able to give feedback or to take over. I think it's really important that users are given the ability to take over critical automated processes, and that means the user has to know they can take over, and understand how. This is important for a host of reasons, one of them being skills fade. There's a really nice parallel at Transport for London: every Sunday the London Underground turns off automatic braking, to make sure that tube drivers don't lose their skills to the automated safety system. I think that's a really nice parallel, and there are things we should draw from it in our work on services.

Design for recovery is another. Systems are going to go wrong; that will happen. It's really important for the user to be able to remove any autonomy the system has, or to revert back to a previous version that didn't fail, and that should be possible without systemic failure. This is a bit like cruise control in a car: you can have cruise control on for a long time and enjoy the car driving you somewhere, but as soon as you see something unusual up ahead, or a difficult junction, you should be able to take over. That's really important, because there are some situations that you'll be able to navigate better than any system.
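Here is a minimal sketch of what "override" and "recovery" could mean in code, assuming a service that keeps its previous model versions around: the user can take over any single decision, and autonomy can be reverted to a known-good version without the whole service failing. The class and method names, and the toy decision logic, are hypothetical.

```python
# A toy sketch of "design for override" and "design for recovery": the
# service keeps prior model versions, the user can take over a decision
# at any time, and autonomy can be reverted to a known-good version
# without the whole system failing. Everything here (class and method
# names, the decision logic) is invented for illustration.

class RecoverableService:
    def __init__(self, model, version: str):
        self.history = [(version, model)]   # known-good versions, oldest first

    @property
    def model(self):
        return self.history[-1][1]

    def decide(self, case, user_override=None):
        """Make a decision, but let the user take over (override) at any point."""
        if user_override is not None:
            return {"decision": user_override, "decided_by": "user"}
        return {"decision": self.model(case), "decided_by": self.history[-1][0]}

    def deploy(self, model, version: str):
        self.history.append((version, model))

    def rollback(self):
        """Recovery: drop the current model and revert to the previous
        version that didn't fail, without taking the service down."""
        if len(self.history) > 1:
            self.history.pop()


# Usage: an automated decision the user can override, then a rollback.
service = RecoverableService(model=lambda case: "approve", version="v1")
service.deploy(model=lambda case: "sanction", version="v2")
print(service.decide({"missed_appointments": 1}))                  # v2 decides
print(service.decide({"missed_appointments": 1}, user_override="appeal"))
service.rollback()                                                 # back to v1
print(service.decide({"missed_appointments": 1}))
```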
Design for inspectability is all about making sure that a user can see what a learned system is doing, and why. It's a bit like "view source" on the web that we have now, or, apologies for another car analogy, like checking the oil in your engine: you should have the ability to check what's happening.

And then there's design for collective action. Whilst more technology and more design are needed in this area, they're not the only answers; we have to think about the wider system, and it's really important that we think about the needs of different levels of society. We really can't fall into the trap of the Californian ideology, where understanding a learned system, or trusting it, is a purely individualistic action. I think it's really important to think about collective action, by which I mean things we can only do together, or with the help of organisations that can represent us.

At IF we've been looking at how consumer groups, unions or medical charities can help the people they represent understand whether a learned system can be trusted, but also how those organisations can help society as a whole course-correct.

You can see in the example on the left, going back to that fictional benefits service I showed earlier, that the individual who received the sanction was quite concerned that it was wrong, so they went to an organisation called Citizens Advice. Citizens Advice is an organisation in the UK that helps citizens navigate complexity and hard situations in government services. The adviser at Citizens Advice is able to see why the sanction was given, and whether anyone else in that area was given a similar sanction for the same reason. What's important here is the highlighted question at the top: a hundred and fifty-three other people in the same area have missed appointments in April; would you like to include Mary's case? Because maybe the bus timetable has changed, so no one in that area can make the right appointment time.

Letting organisations help us means we can have collective action, so we can course-correct: so we can see, as a society, where these systems might go wrong, hopefully early enough, before they cause huge problems. Of course, that means that if you're designing a learned system, you really need to think about where your learned system fits in society and with civic institutions. If you'd like to see more of what we've been thinking about at IF on this, we wrote a blog post last year on making it clear how machines make decisions.

This leads on to the next part of my talk, which is all about rights. These are three areas that I would argue urgently need addressing by us.

The first is GDPR for learned systems. GDPR is the General Data Protection Regulation. It comes into force in May this year, and it affects every company that holds any information about any EU citizen. Essentially, for us as individuals, it gives us a whole suite of digital rights, some of them really interesting. If you'd like to see what those rights could mean for services you might use every day, we did some sketching a couple of years ago; you can find it under "New digital rights" at projectsbyif.com.

GDPR's rights often directly challenge some of the ways that learned systems are built, and that's where these challenges, and the questions I have for you, come in. So: what is an individual's right to deletion in a federated learning model? Because people are complex. The data they give a service could be wrong, or it could be right at one moment in time and wrong now. For instance, if someone goes through a divorce, that will have an impact on how they use their bank account, perhaps on the kinds of invitations they make in their calendar, or simply on what images it would not be okay to show them in their social media feed. These are the kinds of everyday problems that everyone has, and these systems need to be able to respond to them. So it needs to be possible for this kind of ephemeral design to work, where a system can explain the learning it has accumulated to a user, and the user can choose to keep it or throw it away. My question is: how can we make ephemeral learning possible in a privacy-preserving way, and how can we make ephemeral design work?
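One way to read "ephemeral learning" mechanically: the device computes a candidate model update locally, explains it in plain terms, and the update is kept only if the user chooses to keep it; otherwise nothing persists or leaves the device. A toy sketch follows, with invented numbers and a trivial update rule standing in for real on-device training.

```python
# A toy sketch of "ephemeral learning" in a federated setting: the model
# update is computed locally, explained to the user, and only kept (or
# later shared) if the user chooses to keep it. This is illustrative code;
# it is not how any particular federated learning system actually works.

from typing import Dict

def local_update(weights: Dict[str, float], example: Dict[str, float],
                 lr: float = 0.1) -> Dict[str, float]:
    """Compute a candidate update on-device (here: a trivial step towards
    the user's observed behaviour, standing in for real gradient steps)."""
    return {k: weights[k] + lr * (example.get(k, 0.0) - weights[k])
            for k in weights}

def explain(old: Dict[str, float], new: Dict[str, float]) -> str:
    """Tell the user, in plain terms, what the system just learned."""
    changes = [f"{k}: {old[k]:.2f} -> {new[k]:.2f}"
               for k in old if abs(new[k] - old[k]) > 1e-9]
    return "This update would change " + "; ".join(changes)

weights = {"shows_finance_content": 0.8, "shows_family_photos": 0.9}
todays_behaviour = {"shows_finance_content": 0.2}   # e.g. life has changed

candidate = local_update(weights, todays_behaviour)
print(explain(weights, candidate))

user_keeps_it = False          # the user throws the learning away
if user_keeps_it:
    weights = candidate        # kept: may later be aggregated by a server
# else: the candidate update is simply dropped; nothing leaves the device
```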
Another question is: how can you give a user the right to understand, in the non-deterministic world I explained earlier? I think this is a really tricky question, and I think we have to think about the way we deal with data permissions, and about collective action, to answer some of it. Because I think data privacy is just far too complex a thing to be regulated by individuals.

People are really busy doing important things, like making their money stretch or finding food at a food bank. They don't have time to tweak data permissions, let alone the skills to check that their choices are in fact being respected. This is where I think we have to think about groups that can represent many of us, to help us understand whether certain data should be given to a service, and whether that service is respecting the choices we made.

Testing in learned systems is the second area. Any system of any complexity in society has required testing. Take the example of a fridge: the fridge in your home will have been tested by consumer rights organisations to make sure it doesn't set your home alight; the manufacturer will also have tested it to make sure it keeps your food at a temperature that's safe to eat at; and the manufacturer will have designed it to standards that have been iterated on and agreed as "this is how we build a fridge". That testing ecosystem will be necessary for learned systems too. But how can you create a testing ecosystem that doesn't just test an inanimate object like a fridge, whose performance only changes with age, but a learned system that can change in minutes? How can we create a testing ecosystem that keeps pace with the speed of change of software?

Within this, we'll have different people running different models on devices. So how do we verify the authenticity of different versions of models, so that people can make sure the model they're using is trusted? DeepMind are developing Verifiable Data Audit, a technology that essentially logs all the different kinds of health information DeepMind use to provide their services, services that are used by clinicians to make better decisions about people's care or treatment. Verifiable Data Audit is essentially a log of the information you'd need to look back at to understand how a clinician made a decision at the time, and what information was available to them to make it.

Verifiable Data Audit is just one kind of technology that could help us verify complex systems, but I think there will be other things we need too: open registries, perhaps, of verified models; or maybe we need to start verifying the training data we use in the first place. So how do we publish that data? How do we verify the authenticity of different models? What training is needed for the people who will run those testing institutions and organisations in the first place? And something I'm really interested in: how do we give testing suites to individuals, so they don't just have to rely on organisations but can self-test things too?
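Verifiable Data Audit, as DeepMind have described it publicly, is an append-only, cryptographically verifiable log. The sketch below shows only the general pattern behind that idea, a simple hash chain where each entry commits to the one before it, so later edits to history are detectable; it is not DeepMind's actual design, and the record contents are invented.

```python
# A deliberately tiny sketch of the general idea behind an append-only,
# verifiable audit log: each entry commits to the previous one via a hash,
# so any later tampering with "what information was available at the time"
# is detectable. This is the textbook hash-chain pattern, not DeepMind's
# actual Verifiable Data Audit design.

import hashlib
import json

def append_entry(log, record: dict) -> None:
    prev_hash = log[-1]["hash"] if log else "genesis"
    body = json.dumps(record, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + body).encode()).hexdigest()
    log.append({"record": record, "prev": prev_hash, "hash": entry_hash})

def verify(log) -> bool:
    """Recompute the chain; any edited or deleted entry breaks it."""
    prev_hash = "genesis"
    for entry in log:
        body = json.dumps(entry["record"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + body).encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_entry(log, {"accessed": "blood_test_results", "by": "clinician_42"})
append_entry(log, {"accessed": "model_v2_prediction", "by": "clinician_42"})
print(verify(log))                       # True

log[0]["record"]["by"] = "someone_else"  # tamper with history...
print(verify(log))                       # ...and verification fails: False
```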
The last area I want to talk about is something that's really close to my heart: ownership and control in learned systems. These systems don't operate in isolation; they exist as part of a wider system, and we've seen with Facebook and Cambridge Analytica what happens if the wrong parts of a learned system, and the platform it operates on, are closed, obfuscated, illegible, or have perverse incentives baked into them.

So when you use a service that learns from your behaviour and makes decisions on your behalf, whether that's deciding what brand of loo paper to buy or maybe which school to send your child to, how can you know that that system is working in your best interests? Then, as those decisions are made not just about you but maybe about your community, how can you know that the system understands right from wrong? What is right, anyway? As humans we have morality, but when we're talking about what is essentially a bunch of numbers, how do we make sure it can understand, or know, what's best? When you use a doctor, or an engineer, or a lawyer, they have a professional code of ethics they must work to. So how can we make the intent of a learned system clear to a user? And where should a system's ethical promise be lodged?

The other question I have is: which parts of a learned system, and the platform it operates on, can be part of the market, and which parts have to be open, cooperative, commonly owned and not part of the market? That's really important, because if there's not enough transparency baked into the way we design and build these systems, how can society ever course-correct? You'll remember the financial crisis: that was a systems failure of huge proportion that affected everyone around the world, but it affected those individuals who had the least most severely.

I think these areas are the most pertinent issues to be investigating in design, because what you design affects the understanding and access that an individual has to their rights, and their ability to act on them. It is our responsibility to make sure that we design learned systems so that they can be trusted, and so that they give people digital rights. Thank you.

Moderator: Questions?

Audience: I share probably most people's intimidation at legal terminology and contracts, and it strikes me that perhaps they are, as you say, something that can be parsed or made simpler to understand. But at the same time, a big part of what I think makes them complex is that all of that language is there for specificity; it's details, basically. So in a certain way it strikes me that it's kind of like saying, "why do we need all this code in this computer program? Can't it just do the thing it's supposed to do?" I don't know if that's a fair analogy. So my question is basically: can you elaborate a little on how to bubble that up? And I guess the final point on this is that it seems like often the thing that screws a person over with legal language is not the top-level point, but the thing that's buried deep in there.

Sarah Gold: So I think the first part was about the relevance of terms of service to a designer, and the second part was about ways to design out terms of service. Is that right?

Audience: Sorry, my question was basically how you envision clarifying this kind of language, or this kind of system, to lay people, without losing the details in those documents that are crucial.

Sarah Gold: I think that having terms of service that sometimes you can't even find very easily on a service just doesn't work at all, because even if you wanted to go back to understand something, you can't, or you're not even sure which version it was that you saw when you signed up to the service. I think it's really important that an individual understands what information is being collected when a service is learning, and what that data is used to do, with all the details, actually, when it's relevant to them. So I think that as you go through a service there are points for asking for consent as the service is running; that's a good idea, as opposed to asking for all the consent up front, which is what currently happens. So, getting smarter about how we can bring consent in throughout a service. But we have to do that with a lot of research, because a speaker earlier mentioned getting in the way, and that's often part of the tension here: you need to add friction points to help someone understand, but at the same time someone may just fundamentally not understand why that data could be relevant to them. And I think that's part of the problem, actually: with learned systems, the quantity of data being extracted is so huge that I think it's going to be almost impossible for the majority of people to understand it. So looking at how the terms of service get designed across the service is one way to look at it.

But ultimately, I think what we should be looking at, and Tom Steinberg blogged about this recently, is what the collective action points are for understanding this stuff. You just want to be able to use a service and receive the utility it provides, and not have to worry; the majority of people just don't have time to check. So I think we have to look to that, but also I think there's a question about how machines can help us check, because of the quantities of information we're looking at.

Audience: To extend this notion of legal systems as code, or code as legal systems: as you well know, regulatory environments and laws are often in conflict within a nation, and so, for example, a program would crash at that point through a memory error, right? But legal systems keep churning on, and of course a lot of lawyer activity is applying the correct regulation for the correct circumstance and ignoring the one that's in conflict. Now, of course, we are developing technology that has to live in an international environment, where it's not enough to have consistent laws within a nation; we, the technology providers and builders, have to build things that are consistent across international boundaries. Copyright, for example, varies tremendously from country to country; Australian regulations are very different from US ones, and so on. Tell us what we should be thinking about when one country demands that you delete this data and another country demands that you maintain that data. Suggestions?

Sarah Gold: So this is an area that I'm really, really interested in. There's a question about what your experience is as you go from country to country. As I came here, there was a point in time when there was a lot of concern that at an airport you could be asked to hand over your passwords, and there was no way of me going dark on my phone, so to speak. Maybe that should have been an affordance that an organisation should have given me. So this is a question I'm not going to say I have an answer to, but I think it's somewhere we need to be designing with the legal teams. When you see design and development teams, they're often in completely different buildings to where legal sits, so I think we need far more collaboration between those groups to understand these issues.

Audience: Although there's a bug with that, right? I mean, the legal universe that I deal with is way behind the technology we're developing. I don't know how many times I've had to explain how HTML works, and how a web browser works, that it actually makes a copy on your local device. That's a crazy hard notion for lawyers to understand.

Sarah Gold: I think that's where we have to look at this as being a cultural question. When you look at programmes like Oxfam's data responsibility programme, which they brought in with the Engine Room, that wasn't about more design; it was actually about facilitating workshops, bringing different parts of the organisation together to discuss really knotty issues, things like consent somewhere like Yemen. So I think it has to come from a place of bringing together people who have not been together before, and as designers, remember, we're really good at doing that, right? We are facilitators. So I think that has to come first, but I'd love to do more work on it, and I'd love to talk about it afterwards; it's not a question I can answer super quickly.

Audience: Hi, I'm from DeepMind Ethics and Society. I was very interested in your idea that certain parts of these systems should operate outside of the market, and I was wondering if you could flesh that out for us a little bit more.

Sarah Gold: Yeah, I think this is one of the big research questions that we need to explore, and it certainly needs academic research as well, so that this can happen in the open, because I think it's a really important one.

I think it's all about incentives. If you have all parts of learned systems, and the platforms they operate on, built to one model of capitalism, we know what happens with that model. We also know what happens with parts of the infrastructure that are critical to running it, the open source infrastructure that may not receive any funding, or that has security issues in it because it's not well resourced. We're going to see those issues again, but at a much, much bigger scale, with learned systems. I think we have to learn from what we know, and think about which parts of these systems have to be what I would call "of the commons", and I hope that Dan Hill, later, might speak a little more to some of these issues of ownership. That's really important because when something is of the commons, or cooperatively owned, there are rules that you set as a group of individuals, that you all get to decide on. So it really goes down to the fundamentals of power in these systems, and by power I mean what kinds of rights and capabilities these systems give the majority of us. I think that's incredibly important to look at, particularly at the speed of software change; we just won't be able to keep up, we really won't. And I think there's a real risk that we will see huge systems failures again and again and again if we don't give society the ability to see ahead and course-correct. That's only going to be possible if we make the important parts of these platforms and systems open. That's what I mean.

Audience: More of a comment, sorry, or a conversation I'd like to provoke. I think there's a really connective thread through the three talks we've heard today, and I would characterise it as the re-emergence of ego, or self-perception, in UX and UI. A lot of the threads, especially about what it means to be user-friendly, and what it means to have consent: for a long time there was a lot of incentive alignment between the technology, the UX and the business model, which was essentially to obfuscate all of this, right? We all get the same UI no matter who we are, we all click on the things, and in return we get things that are really fast, that get cheaper over time, that I don't actually have to give as much information to use. And what we're seeing now is the role of identity: a lot of times the reason a system is getting rejected is because it's trying to replace part of what people consider their core identity. For me to have representation in these systems, I have to give some kind of feedback. I liked your comment that when you have a certain result and I have a certain result, as UX designers we need to design for us to be able to compare them, and ask what that experience is like. But it's also moving a lot of work that was obfuscated, and was taken care of by the technology, to people. And I think of Cliff's comment on "user-friendly": user-friendly used to mean getting as much of it out of the way as possible, and I think there's a shift going on where people are raising their hands, perhaps, to say "I want to participate more, and I actually want more friction, because of what I'll get in return". I just wanted to connect those, and if anyone wants to talk about that topic, I'd love to talk more about it.

Sarah Gold: I think it's one of the places where user interface design has kind of stayed the same whilst the technologies underneath these services have radically changed, and that's what we're facing: your big questions are centred around that tension point, really, that we've not changed the way that we design.

Audience: A revision of "Don't Make Me Think", entitled "Make Me Think a Lot, Make Me Think All the Time".

Sarah Gold: Yeah, I think that's where there's a real tension, which is that I guarantee 99% of people just won't have the time. And so then you get a new design challenge, which is: okay, how do we start to look at cooperative models, which historically are places that have been really successful at helping us do stuff together? So I think we should be looking to that as well.

That's not something I think has been part of the UX language, but it should be.

Moderator: Thanks a lot. Time for a break. Thanks to all our speakers; we'll start again at 11:45 sharp. Thank you. Snacks are on the sides, and there's planning afterwards, yes.
