DTA Chats With Professor Genevieve Bell
I'd like to start this afternoon session by acknowledging the traditional custodians of this land, the Ngunnawal people, and also extend my respects to all Aboriginal and Torres Strait Islander people here today. So welcome everyone, and welcome to those online joining us via webcast, to the first of what we hope to be many DTA Chat events that we're going to be holding over the course of the year. For those of you that I haven't met, I'm Gavin Slater, and I'm the chief executive of the Digital Transformation Agency.

Well, we have a wonderful guest today, Professor Genevieve Bell. I think she's well known to most people; I think everyone in the room here today would have at least checked out her profile, followed her Boyer Lecture series, or checked her out on social media. These are my words, but I reckon she's the rock star, the thought leader, in thinking about innovation, technology, disruption and everything else that goes with it. So personally I'm really rapt to have Genevieve here. I saw her speak early on in my tenure in the APS, probably about six months ago, and really felt totally switched on by what Genevieve had to say and to share, so it's great that we all have an opportunity to learn and broaden our horizons and our perspectives.

As I said, she's a renowned anthropologist with many interesting experiences and insights, and I pulled a couple out. I mean, she's worked in Silicon Valley, worked at Intel for a couple of decades. But a couple of things I thought you might find interesting as well: one is that you were brought in to bring the stories of everyone outside the building inside the building and make them count, and I think that really resonates. You have to understand people to build the next generation of technology.
And I know you'll be chatting about human-centred design as opposed to just user-centred design. They say you hold 13 patents; I checked you out on LinkedIn again this morning and I could only count four, so I don't know if there's a disconnect between the four and the 13, but perhaps that's for another day. You've been inducted into the Women in Technology International Hall of Fame. You were named one of the 25 women in technology to watch, and you were on the 100 most creative people in business list in 2013. Genevieve was named the winner of the Anita Borg Institute Women of Vision Award in Leadership in 2014, and was included in Elle magazine's first list of influential women in technology. So, as I said, an amazing individual.

One of the other things I thought was really interesting, and I think it was in the New York Times, and you just mentioned this a moment ago: you learned Aboriginal survival skills, such as how to squeeze a drink out of an Australian water-holding frog, and you were quoted in one interview saying it's a one-way ticket for the frog to non-frogness. So, said another way, I think once you squeeze the water out of a frog, the frog's not going to hop around anymore; that's probably the way I would interpret it, anyway.

Genevieve, it's great having you here today. I'm going to hand over to you, and I'm really looking forward to hearing what you've got to say, and then we'll have an opportunity for about 20 minutes of questions, both from the audience and those that are streaming in live as well. So let's give Genevieve a great warm welcome. Thank you.

All right, that's a daunting and intimidating introduction.
I have to give the usual set of caveats in advance: I'm jet-lagged, I have a cold, and I've just taken Sudafed and Strepsils. So if you thought I was speedy usually, I'm now on all of my drugs of choice and preference, plus three coffees, because I'm back in Australia, which means there is no slower speed than this; it's only going to get worse.

But listen, I'm incredibly excited to be here. I'm always very happy to be back in Australia and to be welcomed back onto country. I want to acknowledge that I am back on country, and to remind all of you in the room that sometimes that feels like a ritual, but there is nothing like it anywhere else in the rest of the world, and it is remarkable that it gets to be part of our heritage and who we are. So I want to say I'm happy to be home on Ngunnawal country, and I'm happy to be here to get to talk to all of you. The Digital Transformation Agency has been on my radar for a long time, back when you had a different acronym, so I'm excited to get to see you again, and I'm happy to be part of this whole conversation.

I'm a little frightened by that introduction, Gavin. I didn't realise, a long time ago, that if I told Americans I knew how to get water out of frogs, nothing good would come of it, and that continues to be true. I also realise that you might have walked off with the clicker to move the slides forward. There we go, see that? Look at that: digital transformation, right here in front of you.

So, I don't know what to do with that introduction. I do know how to get water out of frogs; it's only one particular kind of frog, and it will kill them, that is true, and I'm sure I said that in America at some point to point out to them that I wasn't like everyone else in the room. I realise now there are plenty of other ways I could introduce myself.
And I might do that now. But I thought what we'd do for the next thirty minutes was talk a little bit about where I think the future is going, and I want to locate this as a couple of things. This is not a set of predictions; this is a conversation. These are not the only answers; these are some of the things that are top of mind for me. And I really want to be clear that I'm coming at this from an intersection: someone who spent time in the Valley thinking about technology, but as an anthropologist who spends all of her time thinking about human beings.

So there are lots of ways to introduce myself, and indeed it's been 25 years in Silicon Valley, but long before that there are a couple of other things I always need to say about myself. I'm the child of an anthropologist. That means I grew up in my mother's field sites in central and northern Australia, and briefly in Indonesia, in the 1970s and 1980s. I spent my time with Aboriginal people at a time when those settlements were still new; some of them were less than 20 years old, when people still remembered their first sight of cattle and Europeans and fences, and when it was a very complicated time to be in the Northern Territory. So my childhood is the Land Rights Act, and that's a different moment than the one we find ourselves in now.

It also means that I spent my childhood in Aboriginal communities where I went to bilingual schools. I spoke Warlpiri. I ditched class a lot to go hunting and gathering with people; I got to kill anything around me that moved and then eat it. I didn't have to wear shoes, and it was arguably the best childhood you can possibly imagine. It was also not one without its complications. When we came back from Central Australia the first time, I remember fighting with my teachers in class. I am a Canberra girl, after all: Turner Primary, Lyneham High School, Dickson College and, wait for it, the Department of Local Government and Administrative Services. But in those early days coming back, I remember saying to my mother that I couldn't work out how
to square the conversations that were happening around me in Canberra about Aboriginal people with the Aboriginal people I'd spent my childhood with. And one of the things she impressed upon both my brother and me was that the world as it was wasn't the world we had to have, and that as a result you had a moral obligation to make things different. That if you had any commitment in your world, it ought to be to making things better and making things different, and that meant you should put your time, your energy, your intellectual efforts, everything, on the line. And she lived her life that way, and she made it quite clear to my brother and me that we would have to do that too.

And so I've pursued an odd career path as a result of that. I left Australia in my early 20s to go to university in the US. I found myself, after one thing led to another, at Stanford. My PhD is in Native American studies, feminist and queer theory. You can see how I would end up at Intel almost immediately with that kind of background, because obviously that's the kind of people they were hiring in the late 1990s. That is, of course, not true.

I was on the faculty at Stanford after I finished my PhD. I was a professor of anthropology, I was perfectly happy doing that, I was in a tenure-track job; frankly, I was in one of the better places you could be in your life at that moment in time. And yet I met a man in a bar in Palo Alto, about this time 20 years ago actually, and he changed my life. In America, that always means I need to add that I didn't marry him, because they're convinced that could be the only way he would change my life. But the reality is he asked me one really simple question. He said, 'What do you do?' And I told him I was an anthropologist. He said, 'What's that?' I said, 'I study people for a living.' He said, 'Why?' At this point I probably should have worked out he was an engineer. I said,
'Because they seem interesting.' And he said, 'But what do you do with that?' And I said, I'm sure not without the right degree of hubris, 'I'm a professor.' And he looked at me and said, 'Couldn't you do more?' And I thought, yes, I could stop talking to you, because you're kind of irritating. And so I wandered off and stopped speaking to him.

So it was a surprise when he called me the next day at my house, because we're talking twenty years ago, and I hadn't given him my phone number, because my mother was also very clear about the other piece of advice, which is that you don't give your number to strange men in bars. And he conceded I hadn't given him my number. And we're talking before Facebook, before LinkedIn (and yes, there are 13 patents; there are just four of them on LinkedIn), before Twitter, before Tinder, before any way you could have found me. You couldn't type 'redheaded anthropologist' into the internet and get my name, though if you do type 'redheaded anthropologist' into the internet now, I am the first result that turns up. I have done something with my life, and that is it. But at the time, he did it the old-fashioned way: he called every anthropology department in the Bay Area looking for a redheaded Australian, and the anthropology department at Stanford said, 'Do you mean Genevieve, and would you like her home phone number?' Yes, these were the days before data privacy.

And he called me. He offered me a job; I turned him down. He offered me lunch; I accepted that. It turns out, being a graduate student, you never get over the prospect of free food, and I was willing to do almost anything for lunch. I met him, ultimately; I met the people at Intel. Through a very complicated dance, I ended up at Intel. On my second day of my new job, my boss sat me down and said, 'We're very excited that you've come to Intel.' At this point I was their first anthropologist.
And my new boss said, 'Here's your job description. There are two really important things.' I'm like, excellent, I like the clarity of this. So I opened my notebook and I wrote down numbers one and two. I said, 'Okay, what am I doing, Chris?' And she said, 'Well, the first thing is women.' I'm like, 'Which women, Chris?' She's like, 'Oh, all women.' 'What do you want me to do with all women?' She's like, 'Oh, if you could tell us what they want, that'd be great.'

So I write down 'women, all', and underline it a few times, and try to work out what is the research project you could do that would have some necessary characteristics: one of explaining that 'women, all' isn't actually a meaningful category, and the other of actually telling a semiconductor manufacturer what women want, because that seemed like possibly a bad place to go. But I realised you could spend the rest of your life doing that, so it was troubling to me to imagine that this boss thought I should do a second thing as well, because if the first thing is 'explain women', it's a little frightening to imagine what job number two might be. And so I said, 'Oh god, what's job two?' And she said, 'Oh, that's really easy. Here at Intel we have an ROW problem.' I'm thinking, yeah, I have one too; I don't know what ROW stands for. And my new boss says, 'Ah, that's rest of world.' I'm like, 'Okay, so for the sake of clarity, where is "world", such that you have something you describe as "rest of world"?' And my new boss said, 'That's really simple. We have America, and the rest of the world, and we're so excited that you're here, because you come from there.'

I'm like, okay, good to know. So I looked at my job description and I went, women, and the rest of the world: I think I'll be busy. And I went back to my desk, and I had an 18-and-a-half-year career where my job was really to make sense of women and the rest of the world, and over time I added American men to that, because that seemed like an act of kindness and kind of rounded out the set.

And my job was really to put people back into the conversations about making new technology; maybe not even back, maybe just put them into the conversations about making technology. How
do we think about innovation where the innovation isn't just about what's technically possible, but about what people want? How do you work out people's pain points and their aspirations, and how do you build things that map to those rather than just solving technical problems; how do you solve the right technical problems? My job, and the job of all the teams I built at Intel, was to stubbornly and persistently and doggedly insist that the things we did needed to touch human beings. That the measure of success wasn't just the units we sold but the things we made possible, and that the conversation should be as much about what human beings want to get done, and how we make that possible, as it was about megahertz and microprocessors and nanometres and all of those things. And you know, I'd give myself a B, sometimes a B+, on that effort; it's fairly hard to change a large company like that. But in doing so I learned a tremendous number of things about how you drive the right conversations and, perhaps more importantly, I think more relevantly to the DTA, how you ask the right questions. Because it turns out that's what it really comes down to: it's not about the answers, it's about whether you can ask the right questions that unpack what the genuine challenges are, and that will open up that space. And so, in the spirit of that notion of not the answers but the questions, I want to walk you through what I think the big questions are that we're facing as we move forward into the world today.

And I want to start with this quote by William Gibson. It gets out a lot, and it is an excellent quote. William Gibson is a science fiction writer, and he was interviewed in the Economist back in 2003 and asked about the future. 'What will happen in the future, Mr Gibson?' he was asked, and his answer was pretty straightforward. The future is not somewhere else; it's not a thing over the horizon waiting for us to arrive in it. What he said was: the future is already here, it's just not evenly distributed. By which he meant the future is already around us; it's just not always where you were standing at that moment in time. I always think that's an interesting provocation.

And as I've thought about this quote over the years, I started to think about, well, back in 2003, what would we have been looking at? This photo is from 2003. It was taken by a remarkable colleague of mine at RMIT, a woman named Larissa Hjorth, and it's a photo she took on a train platform in Tokyo. Now, the thing about that photo is you don't have to look at it too hard to think it could be right now. It doesn't look like 2003; it looks like 2018, or close. A little daggier, but we still recognise it, right down to the phones, the body language; pretty much everything about it looks like now. It doesn't look like 15 years ago. It looks like the present. But I can tell you, when Larissa took that photo, and we took photos like it and brought them back to big American tech companies, what they said was: that's just Tokyo. That's just DoCoMo, the Japanese telco. That's just local stuff; that's only happening in Tokyo because of the telco. And what they meant was that if you were actually to ask people what they were doing on that platform in 2003, you would have found location-based
services. They were chatting with people near them, they were swapping photos, they were swapping what were basically early emoji, they were dating using location-based services; they were doing a whole lot of things that we would recognise today. But in 2003 it was dismissed out of hand as simply a Japanese phenomenon, made possible by a local telco and by what were seen as local idiosyncrasies. Now, of course, the reality is we look at this now and it doesn't look that strange at all; in fact, other than the fact that it's more public transportation and fewer cars, it looks remarkably familiar.

So the question then becomes: what is around us right now that we are writing off, that 10 or 15 years from now we're going to look at and go, ah, the future was right there, we didn't bother looking at it properly? And yes, I did just swear; I'm sorry about that. I'm blaming the Sudafed for that one.

So: seven things I think you could look at right now that would in fact let us know what the future is going to look like, if we were willing to ask some hard questions. None of these will come as a surprise to you; frankly, I think they're part of our conversations, but we're not very good at interrogating them as critically as perhaps we should. And the first one is to talk about data, and all of its, you know, discontents. There's been an enormous amount of time spent over the last five years talking about big data, about a data-driven economy, about waves of data. Of course, what we haven't done in those conversations is parse what's new and what isn't. I mean, frankly, anyone who's been in government for more than ten minutes knows that data is what government is built on, and governments have been built on data for a very long time. The
Domesday Book forms the basis of the British government's theories about regulating the environment and regulating taxation, and that was built in 1086 AD. So data has been around for a really long time. The current pieces of data I think we should be paying attention to actually come more from the commercial world than anywhere else, and there are two instances I'm thinking of here in particular that are relevant for thinking about notions of data, its ownership and its consequences.

So, at the moment, one of the ways we as human beings might encounter our own data is data that's generated by our bodies. Those of you who have a Fitbit or an Apple Watch, who've had some kind of tracker on you to track your activities, know that that's not an uncomplicated space. About two years ago in the United States, a court case commenced where a Fitbit, a fitness-tracking device, was being used to effectively assert that the person who wore it was committing perjury on the stand. The person on the stand says they were at home, asleep in bed, not committing the crime at hand; the Fitbit says they were somewhere else, engaged in vigorous activity, co-present with where the crime was committed. It turns out that in the United States, at law, the person wearing that Fitbit has no rights in the data their body generated; that Fitbit data is owned by the company. And so when the company was subpoenaed, they turned over the data to the court, and the data was then used against the individual. That's an interesting notion about what our bodies generate, and where that data might go, and who begins to own it.

It's not the only one. I'm sure most of you saw the news over the last two weeks about Strava, the fitness-tracking service, and the enterprising young researcher here at the Australian National University who
managed to use the fitness-tracking data and people's leaderboards to determine where secret American bases were, based on where people were running in places where no running should have been happening. Now, of course, what that starts to mean is that thinking about data, our bodies, and how that data is used and by whom is a complicated morass, and starting to think about where the challenges come with that is complicated too.

There's another instance, again in the United States, of a managed healthcare service provider with about 20 million subscribers, if you want to think about them that way. So they have all of their healthcare data. They purchased all the credit card data relating to those same 20 million people, and merged the datasets.
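A merge like that could, at least technically, be made consent-aware. Here is a minimal, entirely hypothetical sketch in Python of provenance-tagged datasets whose join only includes people who have opted in to that specific combination; the tag names, consent registry, and records are all invented for illustration, not a description of any real system.

```python
# Hypothetical sketch: datasets tagged with a purpose, and a join that
# only merges records for subjects who opted in to that combination.
from dataclasses import dataclass, field

@dataclass
class TaggedDataset:
    purpose: str                                 # e.g. "healthcare"
    records: dict = field(default_factory=dict)  # subject_id -> attributes

# Consent registry: subject_id -> set of approved purpose pairs.
opt_in = {}

def grant(subject_id, purpose_a, purpose_b):
    """Record that a subject allows these two purposes to be combined."""
    opt_in.setdefault(subject_id, set()).add(frozenset((purpose_a, purpose_b)))

def join(a, b):
    """Merge records, but only for subjects who consented to this pairing."""
    pair = frozenset((a.purpose, b.purpose))
    merged = {}
    for sid in a.records.keys() & b.records.keys():
        if pair in opt_in.get(sid, set()):
            merged[sid] = {**a.records[sid], **b.records[sid]}
    return merged

health = TaggedDataset("healthcare", {"gavin": {"er_visits": 1}})
cards = TaggedDataset("credit-card", {"gavin": {"ikea_spend": 300}})

print(join(health, cards))  # {} -- no consent recorded, nothing merges
grant("gavin", "healthcare", "credit-card")
print(join(health, cards))  # now gavin's records merge
```

The point of the sketch is only that the gate is cheap to build; as the talk notes, nothing like it is required of anyone today.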
Again: not illegal, and not, at this point, technically impossible. So now your credit card data is co-present with your healthcare data. So now your healthcare company knows how many times you go to Maccas, or whether you purchase furniture from IKEA, because it turns out purchasing furniture from IKEA is a sure sign of emergency room admissions. Flat-pack furniture is, in fact, every bit as dangerous as we always imagined.

But it also means that there is no way of thinking through the social compact about data collected under one set of circumstances being married with data collected under another. Now, it is perfectly possible, in a technical sense, to stop that happening. You can watermark data; you can meta-tag data. Gavin's healthcare data and Gavin's credit card data could be held separately, and at the point that his healthcare data finds itself near his credit card data, notification could happen: 'Hey Gav, I'm hanging out with your medical data, how do you feel about that?' You know, we could have an opt-in, opt-out mechanism. There are plenty of ways of doing that technically, but no one's been asked to make it happen yet.

Now imagine: how long does data live for? We have rules at law about certain kinds of data. How long does taxation data survive? How long does medical data survive? Digital data in some ways mostly survives by the transparency of the platform and how long you can access it, but those aren't good rules to move forward on. So how do we think about what data exists, what data it's co-present with, what perceptions get built on the basis of that, and what the consequences of that are? These are not just issues that are going to get played out in the commercial realm; they have enormous implications for government. How do we think about certain bodies of data sitting with other bodies of data, and the interpretations that get based on top of that, and the judgments that are rendered thereon? On the
one hand, it's easy to argue that efficiency should win there; it would be efficient if that could happen. The reality is that there are certain kinds of judgments that will get made that cannot be undone, and that may not be about efficiencies; they may be about notions of fairness, about judgments about equity, and that all gets pretty complicated.

The second thing to think about here is how the data was collected, under what circumstances and under what conditions. So it was remarkable last year to listen to a conversation about what had happened to data that Norman Tindale collected. Tindale was an anthropologist who collected genealogical and genetic data from Australian Aboriginal people back in the 1930s and 1940s, under what you wouldn't want to call an informed consent protocol. And before people here were willing to analyse that data and make determinations on it, they actually went back and got consent from all the families and descendants and living people from whom that data was collected. It took a really long time; it was an incredible investment. But it also said that just because the data has been collected doesn't mean it should be used. That's a complicated thing.

Now imagine that we are going to have a world where algorithms get built on top of that data. If algorithms are simply automating a task using a data set, and we want to buy those algorithms, do we want to know what data they were trained on? Do we want to have some visibility into whose data was used there? Do we want to think about whether that data was collected to standards we would approve of, here in Australia or anywhere else? Do we want to think about what those data sets bring with them? That means looking inside them. Do
we want to think about how all of those things function? And then, last but by no means least, in the world of data, one of the challenges here is of course that data only represents the world as it has been, not the world as it will be. The thing about data is that it is always retrospective; it's always the past. And the thing about the world most of us are committed to building is that it often doesn't look like the past. If you were going to build a pay equity tool, for instance, inside the APS, you probably wouldn't use existing APS salary data to do it, because if you built a pay equity tool based on the salary data that exists, you would build into it salary inequities, because that's what the past data looks like. So how then do you think about where an intervention comes? How do you get to agreement about that? What do those things look like? I have a colleague who says that more data equals more truth, and I always think that more data just equals more data, and being careful about how you want to use it requires asking a whole series of questions about it. But those questions are already around us; these are not ones we need to wait five years for. We could just mine the world around us and see there's already a whole lot of questions we might want to ask.

Which leads neatly into the next point: whether AI, or algorithmic living, has already arrived. We spend a lot of time talking about AI and algorithms. They're not exactly the same terms, but they have a relationship with one another. As I said, an algorithm is merely an automated set of tasks, usually back-ended by data: if this happens, then you do this; if this, then that. They're relatively straightforward things that way. And we've been in a world of algorithms for a very long time, certainly ones that operate in a computational world. Recommendation engines, everything from Google to Amazon to Netflix to Tinder, are all versions of this.
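The pay equity example can be made concrete with a toy model. The figures below are invented, and the "model" is deliberately the simplest thing imaginable (predict each group's historical average), but even that is enough to show data's retrospective pull: whatever gap the past contains, the tool recommends.

```python
# Toy illustration (invented figures): a salary tool fitted to historical
# data reproduces the historical pay gap rather than correcting it.
from statistics import mean

# Past records as (group, salary in $k). The inequity is already in them.
history = [("m", 100), ("m", 104), ("m", 98), ("f", 88), ("f", 90), ("f", 86)]

def fit_group_means(records):
    """Simplest possible model: recommend each group's past average."""
    by_group = {}
    for group, salary in records:
        by_group.setdefault(group, []).append(salary)
    return {g: mean(vals) for g, vals in by_group.items()}

model = fit_group_means(history)
# The 'recommendation' for one group is lower purely because data is
# retrospective: it describes the world as it was, not as it should be.
print(model["m"] - model["f"])  # a positive gap, inherited from the past
```

Any real tool would be more elaborate, but the intervention question is the same: where, if anywhere, do you overrule what the past data says?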
They don't do a lot; they merely say, if these things are happening, these things are likely. Now, of course, what's built into all of those are judgments and notions about how things connect to other things, and people's determinations about which sets of data connect to other sets of data. We have a great deal of evidence coming out of the United States and a few other places at the moment about the challenges with algorithms, about how much human activity is built into them, and how much you need to think about data sets when you think about these things.

Google has been wonderful about making clear their experiments with machine learning and some of the challenges therein, and those are instructive for what they start to say. So, last year and the year before, Google put together a series of algorithms to do shortcut activity, one of them around labelling photos. I'm sure most of you know this experiment: go create an algorithm; the algorithm is to label photos. It turns out that in order to make algorithms, one of the ways you do this is you hive off a certain percentage of a data set, about 10 per cent; you build the algorithm based on the activities inside that data set; and you then test it against the rest. Test and control: a good mechanism. So they tested it on the 10 per cent, put it out to the hundred per cent, it produced the same thing, they released it to all of us, and it turned out this particular algorithm labelled black faces as gorillas. That's not good. Google immediately went, not good, pulled the thing off, fixed it, put it back out, and had to do an analysis of why they had gotten to that place. And that was a really interesting moment, because it turned out the data set they were using, the ten per cent of the hundred per cent, was a fair and reasonable sample; the hundred per cent was not a reasonable sampling of the world. It was a sampling of the images they had within their organisation. It was also the case that when they ran an eyeball test on it, no one caught it, because the design team wasn't sensitive to that set of issues.

So now you have two challenges, right? How do you think about your data? What is your data representative of? If it's not representative of the world, how do you get it there? And if it isn't representative of the world and you can't get it there, how do you determine that the people who are looking at it have enough capacity to go, oh, this is probably not representative of the world? That's a whole set of steps that aren't usually inside the way we talk about machine learning or deep learning or, frankly, what it means to think about a set of algorithms. But as soon as we have these entrenched and making decisions, those decisions embody both human biases and data biases, and then they compound, because they scale out over time.
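The test-and-control step described above can be sketched to show why it missed the problem: a held-out slice is drawn from the same pool as the training slice, so the two agree with each other even when the pool itself is unrepresentative. The data here is synthetic and the group labels are stand-ins, not a reconstruction of Google's actual pipeline.

```python
# Sketch with synthetic data: a hold-out test split checks internal
# consistency, not whether the whole data set represents the world.
import random

random.seed(0)

# A skewed collection: 98% of examples come from group "A".
population = ["A"] * 980 + ["B"] * 20
random.shuffle(population)

test = population[:100]   # the ~10% hived off for evaluation
train = population[100:]  # the rest, used to build the model

def fraction(xs, group):
    return sum(x == group for x in xs) / len(xs)

# Train and test agree closely with each other...
print(fraction(train, "B"), fraction(test, "B"))
# ...but both under-represent "B" in exactly the same way, so no amount
# of testing against the hold-out can reveal the sampling problem.
```

Catching that failure takes an outside check against the world, which is exactly the "what is your data representative of?" question.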
We have plenty of examples of this not working. We have examples of it working, too, and then we have some that sit somewhere in between. How we think about reviewing these objects, how we think about unboxing them and unpacking them, are really complicated pieces. So the court case I would point to, in the United States again, that I think is instructive has to do with a sentencing guideline tool that's been developed by a commercial company and is used to help judges render sentences more effectively. So it's a sentencing guideline tool: it takes the guidelines, you plug in a series of attributes about the defendant, and it gives you the parameters by which you should sentence them. It turned out this particular tool was more likely to sentence African-American defendants to harsher sentences than white defendants. Now, you might argue that's because African-Americans reoffend; the data actually doesn't bear that out. It turns out this tool has, sitting inside of it somewhere, a determination. The challenge, it turns out, is that that determination is commercial-in-confidence, because it was produced by a company who are not willing to unbox the determining weighting features inside of it.

Now imagine: every time, as a government, we purchase a piece of software, we purchase a service, we purchase a good, how is it that we know what's inside of it? We have a capacity to do that already with other things. Think about Australia's rules about biosecurity in particular, or, for those of you who are old enough, think about Australia's rules about literature and censorship. We have had very clear notions about what was going to be appropriate inside this country and what wasn't. This is the next place we need to think about how you scrutinise these objects. But asking these objects to unbox themselves is very tricky, and asking
the people who build them to unbox them is tricky as well, and working out what your parameters are for how you feel about that is actually really interesting. But we already see the cases unfolding here, right: we've already seen the court cases about bias. We've seen them about unintentional and intentional harm. We've seen ones that were simply quirky and odd. So one of the very first website dating companies in the United States had, sitting inside its matching algorithm, a rule that would only match men with women who were three to five inches shorter than them. Apparently blokes only like shorter girls; that's really what you should take from that. They built it into the algorithm and no one noticed for a while, until someone went, why would it match this way and not that way? It took a while to sort out, and it turned out the person who built it just figured that's how desire went. Now imagine all the places that is true, right? Imagine all the places where we will normalise things that are one person's idea, whether it's notions about relationships, notions about appropriate behaviour, about parenting, about savings rates. Think about all the stuff that gets built into that and all the ways in which that is intensely complicated, and imagine that those complications are already all around us, and that we're moving into a world where there will be more of them. One of the other things we heard a lot in the field work we were doing at Intel, and that I've continued to hear since, is this tension between the notion of technology that's always on and always sensing, and the notion
that people want time without technology. So think about it: I'm willing to bet that in many of your lives now there's a moment every weekend where you think, I just don't want to answer any of those things anymore. I don't want to look at my inbox, I don't want anyone to call me on the telephone, I don't want to be texted, I don't want to look at Facebook. You have a moment of going, I've had quite enough of all of that. I'm willing to bet that in some of your households there are complicated conversations that still go on about whether laptops come on vacation, about whether you can look at your phone for work, about whether the phone is allowed in the bedroom. Those are not uncommon conversations. They get more complicated when we start to think about all the other objects that come into your world that you don't get to switch on and off, that are always on and always sensing. Whether it's things like Amazon Echo or Google Home, there is a whole series of products that are always on and always sensing because they're always listening. That creates some really interesting challenges in people's homes, and really interesting challenges, again, about what it looks like to have a world where you cannot get away from technological activity. We know that there are certainly cultural moments where people want to create spaces where this isn't happening; it's hard to think about how you do that when you're not always the one who gets to opt in. So what does it mean when you come to my house, where there's an Alexa sitting on my kitchen countertop and she's listening to you, not just to me? What does it mean when you're in any number of world cities where you are being surveilled, and where your opting in is simply that you've turned up in that city? So how we think about some of our notions here, about presence, about permissions, about opting in and opting out, is really tricky. And we
were used to it when it was about our activity: you picked something up, you logged into something. Now it's simply your appearance, your body, your face, your voice, your movement that all become part and parcel of a world of things being collected. And how you think about that is both culturally complicated and personally complicated, and has to do with all sorts of features; how you unpack it turns out to be really tricky. We've already started to see people moving to different kinds of mechanisms around this, such as turning everything off. I can tell you some of it is certainly gendered: Batya Friedman at the University of Washington has done a lot of work looking at how women perceive surveillance technologies and pseudo-surveillance technologies, and the gendering of putting anything over the camera in your laptop and over other pieces of technology in your life. Those things turn out to be quite complicated, because the consequences of being surveilled turn out also to be quite complicated. And even if you don't think of it as surveillance, how do you feel about having something always on and always listening in your home? Because it turns out there are objects that are now doing that. And then think about when those objects are not even visible on your countertop but are in fact your electrical meter, which is now engaging in constant sensing and sharing that information, and where we already have enough technology, if you have a smart grid and a smart electrical box, to shut down individual pieces of technology in your home down the wire. So now, rather than load-shedding your suburb, they will just load-shed all of your air conditioners, or all of your televisions. That gets complicated in this world, right? But it is increasingly possible. I was telling Gavin right before I got here that this sign here turns up on most trash cans outside of American chemists. So
on the garbage bin is a sign that tells you not to discard your personal information there. That's an interesting warning sign on a garbage can, at multiple levels. We also know that we're coming out of a period where there is an enormous amount of anxiety about notions of surveillance, about how we think about who is listening and under what circumstances. I've just said there are a bunch of commercial products that live in your home that are listening to you; people don't yet think about that as surveillance, but they do think about it as listening. How we then think about notions of privacy, of security, of trust and of risk is all intensely complicated. We were tracking information about people's perceptions of privacy, and there's a watershed event about three years ago, which is of course Snowden. Interestingly, it changed individual consumer and household activities around what information is being shared. It took a little while to happen, but we actually saw an uptick in all kinds of technologies: everything from Tor, which you have to be kind of committed to using, to people being much more attentive to the security protocols of their chat tools, so people trading Signal for WhatsApp and vice versa, and people starting to think about what the other ways of thinking these things through were. It's also the case that what this started to do was make people think about who they did trust and who they didn't
trust, and what they trusted them with. Unsurprisingly, trust in government has gone down in this period, as has trust in most other social institutions. This is not just about notions of who's listening, but why people are listening and to what end. And frankly, when we've talked to human beings about this, both as citizens and as consumers, the thing that is always deeply concerning is the evaluation being placed on top of this information. It's not that I don't expect you're going to know things about me; I expect you will. What I'm more concerned about is: what do you think of me on the basis of those things, are you changing how you treat me, and are you basically judging me through these things? So, you know, the first time we explained to people that their smart electrical meters were making assessments and sharing them back with the grid, multiple households said to me, wait, so my electrical devices are, like, gossiping with the utility about me? And gossip is an interesting word that way, right? It has a moral judgment attendant to it, and a notion of assessment. So when we think about these things, on the one hand we see them as efficiencies; on the other hand they are perceived as being something altogether different. Which raises, of course, the issue of trust. Who do you trust, under what circumstances, what does trust look like? Is it selective, is it brand-driven, is it about what information you know about me and how you've chosen to use it so far? It's been interesting to look at which brands have moved on the trust spectrum and which haven't. So, you know, have you been willing to defend your customers against a particular set of players? How have people thought about co-locating brand and trust? There was a lot of debate a year or two ago in the Valley about why it was that Apple was willing to stand up against the FBI, to basically go to court about cracking their security software
and their security systems, because their argument was: if we do this, then it's open slather, and we're not willing to have that happen. That actually means different technical solutions have been implemented, data stored locally rather than in the cloud. How
you think about all those things turns out to be critical. How we think about signalling that to human beings turns out to be really quite hard: how do you say this is trusted versus this is not trusted? You know, I don't think anyone has gotten this one particularly right, but I know it continues to be an interesting question. There is also an argument here that says people are willing to trade off certain amounts of privacy for certain amounts of efficiency and gains. It turns out that's not always true, and it's not, you know, linear. Just because you've done it once doesn't mean you'll do it again; just because you trusted that group of people in this moment doesn't mean you'll trust them later. And the one thing we do know about trust is that it's really hard to get it back. Part of the reason it's really hard to get back has to do with this: people's fear about where all this technology is going is all about the robots, and all about artificial intelligence, and all about what the consequences of that may be. I'm willing to bet most of you who work anywhere in the tech field get asked a great deal about when the robot apocalypse is happening. It never happens soon enough for me, because I always have to go give another talk. And the reality is, if we look at the robot apocalypse and the notion of the robots taking over, it's a complicated set of fears. The reality is that the most widely deployed robotic objects in the world, there are ten million of them, are Roombas: robotic vacuum cleaners. So if you can climb stairs, you're probably safe, at least in the short term, from the robot apocalypse. Of course, the challenge with the Roomba is that while there are ten million of them, what we didn't know about them was that in addition to sucking up our dirt they were sucking up the footprints of our homes, and now their maker is willing to sell that data. So
suddenly, with robotic objects, it's less about whether your life is in peril and more about what they know about you and who they're going to tell. What we've already seen is an enormous amount of activity around robotic objects and around notions of job replacement. There's certainly the Oxford report that came out, now two years ago, with the much-touted figure of 40 per cent job loss. If you drill down, that's not actually what it says; what it says is that there will be 40 per cent task replacement, and that some of those tasks ladder up to whole jobs. The reality is that looking at those tasks and looking at those jobs is a very interesting exercise in terms of what sorts of activities can be automated and why, and what sorts of tasks can't be. We've been tracking this kind of stuff for a long time, and one of the things that's really interesting to me is which categories of work feel more replaceable this time around than last time. One of the interesting things, if you spend any time looking at these kinds of objects, not physical robots but certainly AI-like objects, is that the easiest tasks to automate are, in that sense, the stationary, rule-based and data-heavy ones. That turns out to sound remarkably like many of our jobs. These are white-collar tasks: paralegals, certain kinds of diagnostics in medicine, certain kinds of other tasks where the rules are clear. Things that involve more ambiguity are much harder, and things that involve a great deal of physical dexterity are really quite difficult indeed.
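The "stationary, rule-based, data-heavy" point can be made concrete with a toy sketch. The task, the function name and the thresholds below are entirely hypothetical, invented for illustration: the point is only that a task reducible to explicit rules over structured data is trivially scriptable, in a way that folding laundry is not.

```python
def triage_invoice(amount_aud: float, has_purchase_order: bool) -> str:
    """A stationary, rule-based, data-heavy task: the easy kind to automate.

    The rules and dollar thresholds here are hypothetical, for
    illustration only; they are not from any real system.
    """
    if has_purchase_order and amount_aud < 1_000:
        return "auto-approve"
    if amount_aud < 10_000:
        return "manual review"
    return "escalate"

print(triage_invoice(500, True))     # auto-approve
print(triage_invoice(5_000, False))  # manual review
print(triage_invoice(50_000, True))  # escalate
```

Ambiguous judgment calls and physical dexterity have no such short rulebook, which is why they remain hard to automate.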
Carnegie Mellon University has a wonderful robot that folds laundry. I know, good, right? It takes about 13 hours to fold a basket of laundry, it can't match socks, and it costs about a million dollars. At this point it's starting to sound like people I've dated. And it's mostly useless. Now, the thing about that is that over time the price will come down and the efficacy will go up, but not inside an envelope where you could imagine deploying it in any standard kind of way, because it turns out there are certain kinds of tasks that humans are still better suited to, and will be over a surprising period of time. So how we think through, and do a better job of having, this conversation is interesting. I think this is a place where knowing our history is good too: why is it that stories about robots feel so fearful? It's because they're tied up with our history, and that's a literary and cultural history, not a technical history. But being clear about what's going on here is also really useful, and thinking about where efficiencies can be gained and where they can't is also really helpful, and how we think about that discourse of what it is that we are automating, and why, turns out to be useful. Which just leaves me on this last note. One of the things I'm always really struck by in conversations about the future is how the sole metrics we have for thinking about the future are efficiencies and productivities. We talk about how things will make us more efficient and more productive, and the reality is that most really great technologies of the last thousand years also had another set of consequences: they made us better storytellers, they let us create magic and wonder and remarkable things. This photo comes from an experiment Intel did, nearly two years ago now, with the Royal Shakespeare Company, where we instrumented Ariel inside The Tempest and did real-time augmented reality. There
was enough computation in that room to keep the Space Shuttle in space. It was really quite something. But what it did was make possible an experience that was utterly transformative for everyone who was there. Did it make anyone more efficient? No. Higher productivity? Absolutely not. Did it give people a moment of splendour and wonder and magic? Yeah. And I sometimes think there's an interesting conversation to have about, as we think about the future of technology, what the other things are that we might be looking at that aren't about productivities and efficiencies. For government, maybe that's about wellbeing, maybe that's about engagement, maybe that's about people feeling better connected as citizens, maybe
it's about feeling more empowered as citizens, maybe it's about feeling more celebrated as citizens. I can think of lots of language there that isn't about more efficient engagement with government. So that might be nice too. But there's something here about how we think about magic. That isn't as scary as it might sound. But thinking about a future of technology where we aren't just thinking about efficiencies is really important. And so that leaves Gavin and me to have a bit of a talk about how you put it all back together again. I'm going to stop there.

Well, thanks Genevieve, that was great, and I do love the storytelling, so thank you, thank you for not bombarding us with a host of facts and figures and everything else; that really captured the imagination. And about the robots: that is a fact, it is a fact that is in fact true. So, we have an opportunity for questions, both here in the room and on the webcast, but just to get people thinking I thought I might start with one and then we'll hand over. Government has this thing where it wants to digitally transform government for the betterment of individuals and businesses. A question for you, perhaps reflecting on why you joined Intel and what you experienced, and it's a question around culture: how much of this, and I'm leading the witness here, is an issue around culture rather than, call it, ability or the technology itself?

I think in order to have organisations that are able to reimagine themselves, there are a couple of features you need to have. I think it's about how you have trust in your leadership, how you have a culture that celebrates risk, that celebrates people actually trying things and seeing what happens. I think it's about how you tell stories about moments
of transformation. So how do you celebrate as your cultural heroes not the people who had the traditional careers, but the ones who did interesting things? And then I think it's about how you create processes that make it easier to do things differently. So, one of the first things we did at Intel when I moved back into the R&D labs: I was working for a new boss, and one of the things he was concerned about was that the labs had started to feel a bit stale. He was really concerned that people didn't want to take risks anymore, because they were so concerned about shepherding their careers; effectively, they didn't want to try new things. He implemented a prize called the First Penguin prize. For those of you who don't know penguins: the thing about penguins is that they all sit on the ice floe for a really long time, and then one penguin jumps off, and then the rest follow. Now, the thing about that first penguin is that sometimes they're first to the food and sometimes they're first into the whale. So it's a high-risk proposition, right, being the first penguin: your willingness to get off the ice floe, an uncertain future. And so my boss at the time said we should celebrate being first off the ice floe. I said to him, do you care if they succeed or fail? He said, that's not the point; the point is jumping off the ice floe. And so for the first two quarters we had it, no one wanted to get this award. It was like a bad award, because it looked like you were an idiot, you were reckless, you had jumped off the ice floe while the rest of us were sensibly staying put. And then after a while it changed, because part of what we made him do wasn't just give people the award but tell the story of why. And so part of it was: how do you create alternate myths, how do you create alternate stories about what a career looks like, about what success looks like? How do you take some of the sting out of risk by knowing that even if you don't succeed, you're going to be lionised for trying? So for me, as a cultural
anthropologist, one of the things I know is that when you are making culture, it's not just about the processes and the rules; it's also about the stories you tell and the symbols you make meaningful.

I couldn't help but think that it could require a different approach to Senate estimates.
They might all be first penguins. Yep. Okay, questions: are there questions in the room, anyone got a question? Anything coming through on the webcast? Right, did I awe you all into silence with my Sudafed-and-Strepsils rant? Okay, all right, we've got one over there in the front, great. It'd be great if you just mentioned who you are. And yes, you're the first penguin.

Well, I'm jumping. David Pitts, from the Department of Jobs and Small Business. Hi, big fan.

We all are.

I just wanted to ask a bit about what your thoughts are on the current trend around design thinking, and, from your vantage point over the last couple of decades, what does it mean for organisations now? Is it a good thing, is it a bad thing, is it diluting something, is it augmenting something? So: design thinking, discuss.

Does everyone in the room know what he means by design thinking? Okay, good. Listen, I think design thinking is part of a long and honourable tradition; I would point back to participatory design as, in some ways, its ancestor: a European notion of how you bring all the stakeholders together and imagine a future collaboratively. I think in that sense design thinking has been an incredibly useful exercise in retooling people, again back to saying: how do we think collectively and collaboratively about an outcome, and how do we get better at imagining that the outcome is going to have multiple points of view on it? Insofar as it has frequently created spaces for people to be a little less rigid about the ways in which they make knowledge and engage with each other, I think that's always a good thing. I think it is a tool, no more, no less. I know some people think it is a sort
of an epistemology; I don't. I think design thinking is part of the toolkit of a healthy organisation. I don't think it works for everything, any more than participatory design worked for everything. I do think the notion of how you create the spaces for that to happen is harder than just having a nice workshop. I think, you know, how it is that you make room for people who don't imagine they're your stakeholders, and for people who don't feel empowered to be in the conversation even when they're there, is actually really hard work. And I think those are incredibly useful exercises for getting all of us more sensitive to the broader context in which we work. I think one of the other challenges sometimes with design thinking is that it becomes a checkbox exercise. So: did you have a design thinking workshop? Good, yay, go you, and now we will move on. As opposed to saying: what did that surface, and how do we think of it as part of an iterative process? Because I think, for me, the most useful thing about design thinking is its iterative nature, and the notion that an answer isn't fixed; it needs to be constantly renegotiated and updated. So I think it's been a useful thing. I've watched it evolve in the Valley; there were certainly other things we did before it came along that looked like it, and it's definitely having a kind of a moment. But there are pieces of it that I think ought to be in any healthy organisation, and it requires, and in some ways this is the most interesting thing about it, a decentering of power and authority, which is often really hard. So, you know, one of the challenges is frequently that you have a design thinking workshop and then you just go back to business as usual in the org chart, and that's a harder thing to manage. I don't know if you could do design thinking for Senate estimates, though; it would be interesting, it would be very interesting.

All
right, we've got a web question, is that right? Okay, I'll read it out: in what ways might we protect our work on digital services so that we are not building in poor outcomes for whole communities? That's from Gerardi, from Pilla.

Yeah, Pilla. That's always a really good question, right? I think one of the answers there is in how you think about balance: the notion of whole communities, not just a majority, is actually a really hard thing. One of the challenges is that, depending on what tools you're using, the majority is frequently what ends up dominating. So if you're using classic machine learning or deep learning and you're using the data to drive the outcome, the bulk of the data will get you to the outcome. So one of the challenges becomes: what happens to the long tail? What happens to the things that don't look normal, or normative? They either get removed from the data set, "oh, that's just an outlier", or you don't deal with them. And I think one of the challenges there is how we think about what our sampling methodology is, how we think about how
you talk about the whole community, and how we think about how you measure good outcomes. All of it becomes, in some ways, the hygiene you would have used for building government services historically, though frankly I think we could have asked those questions critically of those too. So the digital piece, for me, makes it both easier and harder. It is easier to find the outliers, in some ways, because you have a clearer sense of what the datasets look like. It is harder because sometimes it causes us to want to move more quickly, and the reality here is that good government actually takes time and engagement; it's not just "can you build the thing and ship it tomorrow". And finally, for me, one of the things that's complicated here, particularly as we move into a world of more algorithmic and artificial-intelligence things, is that the way of doing innovation in the Valley for the last, well, I want to say 25 years, but in reality it's 50 years, is that you build it in beta. You build a first pass of it, you throw it out, people iterate on it, and they do a lot of your testing work for you; i.e., all of us who tested, you know, Google's search engine and built Amazon's recommendation engine through our actions. Every time we did a search we built their database; every time something didn't quite work for us we were helping them do their testing. That notion of build-it-in-beta and iterate works really well if it's a recommendation engine for movies. I'd be much more concerned if it were something determining the level of your Medicare repayment or the service delivery to your community. So there's something there, and I think this is true for algorithmic stuff in general, about how we rethink the notions of algorithms that have particular consequences for humans, material consequences.
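The "bulk of the data will get you to the outcome" point can be sketched in a few lines. The numbers below are a toy, purely illustrative: a model that just learns the majority pattern looks accurate overall while being wrong for every member of the minority community, which is exactly how the long tail disappears.

```python
from collections import Counter

# Toy dataset: 95 users from a majority group "A", 5 from a minority "B".
users = ["A"] * 95 + ["B"] * 5

# A naive model that always predicts the most common pattern...
majority = Counter(users).most_common(1)[0][0]

# ...scores well overall, yet fails the entire long tail.
overall_accuracy = sum(u == majority for u in users) / len(users)
minority_accuracy = sum(u == majority for u in users if u == "B") / 5

print(overall_accuracy)   # 0.95
print(minority_accuracy)  # 0.0
```

A 95 per cent headline accuracy can therefore coexist with a service that fails a whole community, which is why sampling methodology and outcome measures matter more than aggregate scores.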
How do we think about different models and methodologies for innovation that don't involve letting human beings wear the consequences, when those consequences can be really quite devastating?

Yeah, there's an interesting thing about how you test and iterate without letting it into the wild, and there, for me, I'm thinking very much about Australia's history with things going feral.

Yeah, well, it doesn't go well, we know. A few domestic ones, you know, you send them out there for a bit and see what happens: think bunnies, camels, frogs, olive trees.

It's interesting, just that one around the potential impact on people. I think in the political context that's probably even harder, in a sense, because one disenfranchised taxpayer or community member just dials up the political rhetoric around it.

So, yes. But I think we have been guilty of being lazy about certain kinds of technology development, right, where we took shortcuts, we did the easy thing, we bolted some more code onto the product rather than going back to ground zero; we weren't willing to have the hard conversations about why this piece of code was being privileged over that piece of code. And the reality is that perhaps the move from talking to consumers to talking to citizens is a moment where we actually have to get our stuff together, where you actually have to say: no, in fact, it requires a higher bar. We ought to be more disciplined, we ought to be asking the hard questions about what we are building and why, and about what something being good looks like, where good enough isn't actually appropriate. I don't think that's unreasonable.

Great. A question from the audience again, over there on the left, the gentleman with your hand up.

Hi, Nicholas from the Department of Industry. I'm just asking, sort of, what your thoughts are on how we get to a bit more of a democratic and human-friendly internet, especially
with the Mueller stuff just recently, and astroturfing, and being able to spoof a grassroots organisation, and also, sort of, how do you get around the fact that there's culture built into some of the platforms? There's a fairly libertarian culture built into a lot of the big tech companies, and therefore people get to say whatever they want without there being... yeah.

That's a good question. And it was really striking: I was in the US until last Friday, I was there over the last fortnight, and it was an interesting time to be there for all kinds of reasons. So listen, I think those are all hard questions to which I don't think there are easy answers. I do need, parenthetically, to make an advertisement here for my current employer, which would be the Australian National University: the Crawford Forum is coming up in June, and Vint Cerf is here, and he and I are doing a conversation on exactly that topic, so please come. I know he'll be appropriately dogmatic about this in ways that will be useful and instructive, and I can't quite do him justice. Listen, there are a couple of pieces there that are really important. In the last, I would say, two to three years, a number of companies have scaled to a point that I don't think even they imagined they would, and in that scaling they have had to come to terms with the fact that certain of the things they thought were technologies are actually ideologies. And in the process of doing that, I think it has put an enormous amount of tension on a couple of systems. One of them is state-based regulation. I think we are in an interesting moment intellectually, where there was, for at least the last 80 years, a kind of very intertwined relationship between capitalism
and democracy; they kind of went hand in hand. I think in some ways those are fragmenting now, and it's creating interesting moments to think about who is going to regulate some of those transnational platforms, and how we think about what the regulations might look like. Are there bright spots to look to? I certainly think where the EU is going, in terms of talking about notions of individual ownership of data and regulation of speech acts on those platforms, suggests that it is possible to regulate them. What Twitter looks like in Germany is very different from what it looks like in the US. So it is possible to imagine that just because these are platforms of scale doesn't mean the scale makes them universally the same everywhere. I think a number of the people running those companies are certainly at a moment of critical reflection slash
existential crisis about what it means to be at that scale, and I think the conversations inside those places... it's not that it's comforting, exactly, but they're at least having those conversations now, which was certainly not the case two years ago. As for how you unpack all that stuff: listen, I think one of the hard things here is that human societies take a really long time to adapt to new technologies, and most of us in the room were early adopters, I'm willing to bet, and we forget that most of the rest of the world wasn't. I sometimes use the example of television as a parallel. I mean, you know, when I was a child, television was already 20 years old in Australia, and my family was still arguing about how far away you sat from it, how much you watched, whether you turned it off