Keynote: Intentional Approaches to Human-Computer Collaboration
Good morning. I have to say I'm incredibly honored to speak to you today. I've always viewed MSR as the place that disrupts and inspires Microsoft, and while I'm not a researcher, my team works incredibly closely with MSR to funnel as many good ideas as we can into the depths of the product teams. So today I'll walk you through some of the things we're doing and how we're thinking about approaching responsible innovation around these emerging technologies.

First, I thought it might be good to go back in time and turn back the clock a little. What did the world look like ten years ago? Do you remember what it looked like in 2009?

In 2009, the financial crisis was at its tail end. By March of '09, the Dow Jones Industrial Average had lost over 50% of its value since October of 2007. That was the economic context back then.

The iPhone was just a year old. Do you remember that? It was wild, and every other phone was pretty much outdated by the end of 2009. Those phones just weren't good enough anymore.

The App Store was five months old. Remember that slogan, "there's an app for that"? It was home to about 25,000 apps. The App Store now serves as one of the most important software stores on the planet: there are over 20 million registered developers and two million apps, and in that decade the App Store has generated over a hundred billion dollars in revenue, which is astounding.

The apps were weird, kind of goofy. Developers and users were enjoying the power of the iPhone's multi-touch display and motion sensors. This was the app that everyone had to download; it was the hook to get you into the App Store. Really sweet.

Amazon had just announced its Kindle 2, which looks dramatically different now. And Google had introduced an app that was meant to transform the communication space. It was magical and new, and it was called Google Wave, and we didn't quite understand it at the time, but
it combined email, messaging, social media, and event planning into something that kind of looks like Slack now, so maybe Google was way ahead of its time.

Early in January of 2009, Facebook surpassed MySpace. At the time Facebook had around 68 million monthly active users in the US, and the trend has continued: Facebook now has over 2 billion monthly active users. Just incredible.

And then there was the video that shocked all of us and rallied the world around a movement. Neda Agha-Soltan was 26 years old when she was shot dead. Her death was captured on a cell phone, uploaded to Facebook and YouTube, and quickly went viral on Twitter. Media outlets like CNN grabbed it and ran with it. Neda's death set the stage for social media activism. Never before had citizens in crisis been able to disseminate their stories so easily to millions of strangers around the world, and this set the stage for the Arab Spring that happened two years later.

Then Kinect was first announced, under the codename Project Natal. It brought multi-person skeletal tracking, face recognition, and voice recognition to the mainstream, making you the controller. It left a lasting impression on the technology world, and these features soon began to percolate throughout the tech industry.

And this is kind of sweet: Minecraft first became publicly available. It led to a major cultural transformation and changed the way video games are perceived in the classroom. Teachers began using Minecraft to teach lessons from circuitry to math to history to language. That's been really interesting for teachers, because having a customizable 3D environment gave them the opportunity to teach lessons in a novel, interactive way. And Minecraft still kind of looks like this, which is interesting.

So I would say that at that time we were really early on. We were probably a little blind to the impact of technology, maybe a little naive, and really optimistic as well.

Since then, some would say we've come to live in an age of magic. The AI revolution has occurred. We have Seeing AI, these assistive apps that help people with low vision navigate the world. We're able to use vision technologies and drones to check on power lines. We can track the workforce and how they're operating in their workflows in factory settings. We can use AI to track what's happening with weather patterns in the environment. Project Emma, in the top right over here, is helping individuals with Parkinson's disease by reducing their tremors. This era is a really interesting one, and it's happened so quickly. If you haven't been watching closely, or you've been heads-down doing your work, you might look up and find a world that is quickly changing its shape around you.

Even for artists like myself (this is my artwork), I can take original artwork like this, transform it, and create novel new artwork using machines. This is an experiment where we took 75,000 images of my art (I create video art), sampled it, and trained a neural net to mimic and mirror my style of artwork. And I thought: how interesting, I can collaborate with a machine.

We can take words, like "a bird with wings that are blue," and generate
completely novel synthetic images. This bird doesn't exist in real life; it's a synthetic image, generated from that text by a neural network trained on images of birds. "A bird with wings that are black and a white belly": this bird also doesn't exist in real life. "A bird with wings that are red and a yellow belly": again, another synthetic image. It's really quite amazing when you think about the power of using language and converting it into imagery.
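To make the idea concrete, here is a minimal sketch of text-conditional image generation. This is not the model behind these demos: the architecture, sizes, and tokenization are illustrative assumptions, and the network is untrained, so it emits noise rather than birds. The point is only the shape of the idea: a caption is embedded into a vector, combined with random noise, and decoded into pixels.

```python
# Minimal sketch of text-conditional image generation (illustrative only).
import torch
import torch.nn as nn

class TextConditionedGenerator(nn.Module):
    def __init__(self, vocab_size=5000, text_dim=128, noise_dim=100):
        super().__init__()
        # Averages token embeddings into a single caption vector.
        self.embed = nn.EmbeddingBag(vocab_size, text_dim)
        self.net = nn.Sequential(
            nn.Linear(text_dim + noise_dim, 256),
            nn.ReLU(),
            nn.Linear(256, 64 * 64 * 3),
            nn.Tanh(),  # pixel values in [-1, 1]
        )

    def forward(self, token_ids, noise):
        caption = self.embed(token_ids)           # (batch, text_dim)
        z = torch.cat([caption, noise], dim=1)    # condition on the caption
        return self.net(z).view(-1, 3, 64, 64)    # (batch, 3, 64, 64)

# "A bird with wings that are red and a yellow belly" as stand-in token ids.
gen = TextConditionedGenerator()
tokens = torch.randint(0, 5000, (1, 8))           # hypothetical tokenized caption
image = gen(tokens, torch.randn(1, 100))          # one synthetic 64x64 image
print(image.shape)                                # torch.Size([1, 3, 64, 64])
```

A real system in this family would train the generator adversarially against a discriminator on captioned bird photos; this sketch only shows the flow from language to a novel image.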
And so we've seen massive shifts across the perception and cognition services, the ability to reach human parity in certain categories, and this has really been quite incredible. If you think about it, we've spent the last thirty years teaching people how to use computers, and we're now teaching computers how to understand the world. This is an amazing field that is transforming and changing lives and transforming how we think about technology, and Microsoft, and Microsoft Research in particular, has been leading the industry in creating some of these breakthroughs.

Look at mixed reality: this is an innovation that's transforming how people collaborate and work. It allows the merging of the physical and digital worlds, overlaying virtual objects in physical spaces, and it lets you collaborate over time and over spaces. Here's an example of field technicians working with engineers, using HoloLens to collaborate with one another.

There are assistive apps like Seeing AI, built out of my group, that help individuals with low vision narrate the world around them. Here's an example of the currency-reading experience in Seeing AI.

As we think about how technology is transforming the world, we're starting to see it transform huge industries: in medicine, from skin cancer diagnosis to AIs that help detect when a heart attack might occur to collaborating with radiologists; in agriculture, creating more efficiency through real-time crop monitoring and predictive analytics; in pharma, designing drugs by predicting how chemicals and compounds might interact and how they work to target different molecules; in autonomous vehicles, the idea of taking raw pixels and translating them into steering commands, which is really kind of profound; and in data centers, thinking about server workload optimization, how you reduce downtime, how you allocate staff. These are just some categories; there's a huge number of industries being transformed. So this is a massively transformative set of technologies we're dealing with now, very different from ten years ago.

And the flip side of this technology is equally transformative. We're no longer looking at technology with that blind, simplistic, utopian view; the debate on the dystopian side is equally real. There are discussions about addiction maximizers; we see individuals who are hooked on technology, and we're worried about that. We have questions around automation: how do individuals work in a space where technology is quickly automating jobs? How do we think about autonomous weapons, when we don't even have a definition of autonomous weapons? We think about bias, black-box algorithms, deep fakes, fairness, ethics, corporate responsibility, environmental impact, human rights, privacy, this notion of meaningful human control, sustainability, transparency, surveillance, and, importantly, the unintended consequences. This is just a small list; there's obviously a much larger one behind it, but these are some of the big topics coming up in this category of work. And we can't really look away right now.

So we've come to recognize the incredible power of technology, even the pervasiveness of social media. We're
starting to look at the power of face recognition technologies. In a political climate where both sides can't seem to agree on anything, they do seem to agree that face recognition is a technology we need to regulate, with San Francisco going so far as to ban it.

Synthetic media this year was huge. There's been a huge leap in synthetic media techniques: being able to create volumetric versions of people from just images, being able to clone voices using small samples of speech and recorded audio. These technologies are massively transformative. I'm going to go out on a limb right now: I'm pretty sure Bill Gates didn't consent to his voice being cloned. And these technologies can be used in deeply harmful ways, such as revenge porn. So we're seeing a massive shift in the capabilities of tech, and also in the general societal reaction to, and perception of, how technology fits into our lives.

Okay, and principles: everyone has them, right? Every corporation, every industry group, every association has principles around how we should think about emerging tech. And it's important that this conversation is happening. It wasn't happening ten years ago; I don't think anyone had principles, people weren't talking about this, leaders
weren't talking about it. That's important. So there's this recognition across the board that we're creating massively complex systems. They're impacting cultures, transforming the way we interact, transforming landscapes and the interactions between all of us, and that acknowledgement is an important one in the industry. I think our language is poorly suited to talking about the complexity of technological interactions; these normal cause-and-effect metaphors don't quite work here. What we're understanding is that we have to look at technology in context, because most technology is dual use and it's difficult to separate the good and the bad. We need to understand how it's being used, by whom, in what context, where, and what the effects are.

So this is, I think, a really pivotal moment for us in technology, and you're hearing our leaders talk about it quite a bit. Satya has been incredibly bold in shaping how Microsoft approaches artificial intelligence and emerging technologies. Some of you might recall that at Build a couple of years ago he referenced 1984, had it up on the big screen, and talked about his concerns about dystopian futures. Satya talks a lot about building trust: building trust in technology is crucial, and it starts with us taking accountability for the algorithms we make and the experiences we create, and ensuring that there is more trust in technology each day. That was two years ago, and every time he speaks he talks about ethics and responsible innovation. So at Microsoft this is an important concept.

At the corporate level we have something called Aether, our corporate ethics advisory board, and I'm a member of it. We look at big topics the company ought to consider and come up with recommendations for the company. But we've also recognized the need to embed people who think about ethics and responsible innovation deep in the engineering stack, and that's where I sit, in Cloud and AI. We work alongside researchers, engineers, designers, and developers, and we look at how we can augment those folks with responsible thinking, techniques, and methodologies, so that we build this from the bottom up. I think you heard this in the intro, but my team is a guide: we help guide the technical experience and innovation within Cloud and AI.

We have a mindset that is really about stewardship and a conserver society, and the idea that we have real, deep concerns for the future, knowing that the decisions we're making right now have potentially disruptive, destructive impact in the medium to long term. That is the mindset we bring to this group, and what I'd like to do is share with you a little more about how we think about responsible innovation deep in the tech stack. The idea is that this is about stewardship: taking care of the future through the decisions we make around research, innovation, and development in the present. And there are
a number of facets we think about when we say responsible innovation: reflexivity, inclusivity, anticipation, responsiveness, and intentionality. I'm going to walk through each one of these, share how we think about it, and break it down a little.

A big caveat first, because I think it's important to say this: we don't have all the answers, but we can't let that stop us from starting. We are going to iterate quickly, so we think about starting somewhere, learning as quickly as possible, and iterating: iterating on methodologies, iterating on mindset, innovating on tools. We optimize for creating a system where we can learn as quickly as possible. In the machine learning space you often hear developers talking about their inner loop, and what they mean is the tight loop between developing, experimenting, and testing as they iterate on their models, because the more you can iterate and turn the crank on AI models, the better your models get. We think of this as the ethics inner loop: how do we, together with engineers, designers, and PMs, create an inner loop that helps us learn as quickly as possible around these methodologies?

I think it's also important to recognize that with any new tech, huge domains of ignorance come up: things we cannot possibly understand, know, or anticipate ahead of time, all the new ecosystems and unique circumstances surrounding a technology's development. We know that with every new technology these domains of ignorance pop up, and we need to create the space to understand, excavate, and learn more, and also create mechanisms for listening to signal and finding canaries in the coal mine, so to speak. So I want to caveat all of this with that.

And lastly, we're building on amazing work from other people, other disciplines, and other industries. I take a lot of inspiration from Ursula Franklin, a Canadian physicist, and her thoughts on the way technology intersects with society. We look at Wendell Berry, whose work around sustainable agriculture made him a thought leader in his time; we can apply a lot of those techniques and ideas in the technology space. And in Science and Technology Studies there are people and disciplines whose work it is to understand the impact of tech on society.
We want to leverage that thinking and bring it to the forefront.

So, the first category: reflexivity. The idea here is really about examining our moral character and holding a mirror up to what we're doing and why. What are our motivations? What are our commitments? What assumptions are we making? Where are some of our biases? Where do we think our blind spots are? What we're trying to do is have an honest conversation about what we're doing and why, and to avoid any sort of moral distancing that might occur: the desire to look at a certain piece of technology and say, you know what, I'm uncomfortable with it, but I'm just going to turn my head and pretend it doesn't exist. We don't want that to happen. So this is about honesty and openness. It's also about acknowledging that there are very different views and value systems around the world. We're not a homogeneous world; we have diversity, and there are very valid viewpoints that are not based in Redmond, Washington, in the United States. We have to acknowledge that and bring that thinking into the way we work.

We spend a lot of time looking at organizational culture, because we recognize that organizational culture is all about the values and behaviors that contribute to the environment in which you build software. We want to influence that: how people interact, the way they create technology, the way they share information. Ultimately, some of our most important work is about shaping the mindset of our workforce, because it's people who are building the technology. So I'll share a few things we're doing here.

The first is an experiment we're running right now, where we're taking our principles, the things Satya is talking about, the things Brad Smith is talking about at the highest levels, and incorporating them into our personal commitments and reward system. What does it look like to tie principles of fairness into your Connect, into your rewards? What does it look like to train the workforce to understand and internalize those principles and translate them into how they do their day-to-day work?

So we've been embarking on a number of workshops where we work with groups to help them write out their actual accountabilities: here's how the principles translate into accountabilities. We've worked on training managers to recognize the work people are doing and translate that into how they reward individuals. This exercise of translating principles into day-to-day actions is an important one, because principles can feel quite abstract, and it's sometimes hard to look at them and ask, what does this mean for me? What does it mean for a build engineer working on the system for that system to be fair? What does it mean for someone working in the design discipline? For those closest to the tip of the spear it's actually easier, because they're thinking about stakeholders
and who might be impacted, but as you go down the stack it becomes increasingly difficult to take those abstract ideas and apply them. So we work on breaking that down, helping people think through those types of things, and then looking at how we can change the whole system. This is an experiment with a few thousand people going through it right now. We're learning a lot; it's been a tremendous growth and learning experience for all of us. And as we start scaling this type of practice out: again, this is the way we innovate. We do small experiments, we iterate really quickly, we learn, and then we start scaling out based on what we learn.

The second piece is inclusivity, and the idea here is that we want to vet all the projects we're working on against competent and diverse individuals, against people who bring perspective and who might even object to the work we're doing, because we deeply believe in the value of diversity and inclusion. We want to hear from people who don't believe in the work we're doing, because we want to understand what's happening there. We want to create the space for introspection, the space for challenging dominant views. We want to bring in experts and leverage those capabilities, and we want to understand who's impacted and start talking to those impacted groups. And I'll say this: dominant systems maintain themselves. Without diversity in the room, without people challenging the system, they don't change; they reproduce and maintain themselves. And dominant groups are rarely challenged to even think about their dominance. So this idea of listening to others, of incorporating feedback even when it's uncomfortable, is a key element of how we think about responsible innovation. It's important that we create the space for this in our organizations: things like open town halls, where we can have honest discussions with leaders and ask pointed questions, and forums where you can give feedback anonymously and have people listen to it and take action on it. We've created those types of systems.

Another important one we've created is around how we do research: inquiry, research, and validation. Almost all of the research my team does is with stakeholders who are typically forgotten, individuals who are often excluded from mainstream research. They're members of social groups like the LGBTQ community, minority groups, women, introverts: individuals who are at risk of harm in social situations. What we want to do is listen to these groups that are typically excluded; children and the elderly are often excluded too. These groups are important because we understand that if we listen to marginalized groups, we're better able to address the needs of a broader range of people. So we intentionally recruit these individuals to listen to their feedback. We want to hear how they think about privacy: I guarantee you that women think about privacy differently than men do, and if you design for women in certain scenarios, you actually address the privacy needs of everyone. We take this mindset and start applying it to how we do product development, and this
sort of intentionality around research, around who gets to come into the room, give feedback on what you're building, and help shape it, is an important attribute of how you design responsible technology.

Psychological safety is something you hear everyone talking about, and I think it's one of the critical pieces of creating a workforce and an environment where people can speak up. It's the shared belief that the team is safe for interpersonal risk-taking: you're able to bring your real, authentic self to work without any fear of negative consequences, and team members feel accepted and respected. Creating this type of environment is an ongoing task, and it's easy to destroy, so we're very thoughtful about how we create it.

An example of what we've done here, and this might seem a little silly but it's actually not: we've created a card game that we call Judgment Call. This card game has been interesting because it lets you exercise your moral imagination and test your analytical thinking in a really safe and inclusive environment. If anyone's interested, I've got a few of these and you can come by and pick one up at the end. This card deck has been unique because it really alters the
relationship and the environment for you, because when you enter into the context of game play, you suddenly open yourself up to conversations and dialogues you might not normally have. You give yourself the space to open up issues and concerns and talk about them, because it's under the guise of gameplay; it's part of the game. By creating these types of tools and mechanisms, you start to alter the relationship between individuals and give people the space and freedom to have the conversation.

The game works something like this. There are three types of cards. There's a rating card, from one, meaning the product is extremely poor, to five, fantastic and exceptional. There's a stakeholder card, where you have to enumerate all the different stakeholders who might be touched by your product, the direct ones and the indirect ones. And I'll tell you, it's really hard for people to think about stakeholders and the vast spectrum of people who might be involved; just that exercise alone, of enumerating all the different individuals who might interact with your system, not just the ones paying for it but the bystanders, the advocacy groups, the different types of individuals, is valuable. And there's a set of cards representing the ethical principles that Microsoft has publicly talked about. Now you pair these cards up, and you get dealt a hand that says: you have one star, your stakeholder is an elderly individual, and your ethical principle is fairness. Then you write a review from the mindset of that individual, talking about the system or product.

This is not a replacement for doing the real research to understand how these people actually feel and how the product works for them, but it puts you in the mindset. This sort of activity pushes you to think: what would they say in a one-star review? Say it's a face recognition system in an airport. You might say, "I keep getting pulled out of the lineup. It never really works for me. I feel targeted, it feels really uncomfortable, and I really don't like this system. It seems to take longer to go through than the old one." Well, that's an interesting thing to capture, right? Writing down the review gives people the space to articulate some of the concerns they might have about what they're building, because it is hard to bring up negative things. Everyone is interested in shipping, everyone is interested in moving the ball forward; stopping, pausing, and having a dialogue around what might be problematic in what you're building is hard, and we want to create the space for that. So we put all of our new employees through this boot camp and have them go through a set of exercises.
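As a rough illustration of the dealing mechanic just described, here is a toy sketch. The card contents below are invented stand-ins, not the actual Judgment Call deck.

```python
# Toy sketch of the Judgment Call dealing mechanic (card contents illustrative).
import random

RATINGS = [1, 2, 3, 4, 5]  # 1 = extremely poor, 5 = fantastic/exceptional
STAKEHOLDERS = ["traveler in the security line", "airport security officer",
                "bystander", "advocacy group", "system operator"]
PRINCIPLES = ["fairness", "reliability and safety", "privacy and security",
              "inclusiveness", "transparency", "accountability"]

def deal_hand():
    """Deal one prompt: a rating, a stakeholder, and an ethical principle."""
    return (random.choice(RATINGS), random.choice(STAKEHOLDERS),
            random.choice(PRINCIPLES))

stars, stakeholder, principle = deal_hand()
print(f"Write a {stars}-star review as a {stakeholder}, "
      f"through the lens of {principle}.")
```

The randomness is the point of the exercise: you don't get to pick the comfortable perspective, you have to write from whatever hand you're dealt.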
Have Them go through a set of exercises, we. Find lots of different forums for us to share this type of activity, and this is really one about building. That muscle around thinking about stakeholders. Creating. That safe environment to talk about things that might not be ideal and perfect and, to be able to internalize what do those ethical principles really mean what does it mean to be fair what does it mean to be transparent, what does it mean to be accountable here that, sort of thing. Okay. So then the third category is, about anticipate. And, looking around the corner thinking ahead and what, are some of the likely, outcomes not just what are all the outcomes, but what are the likely outcomes, what. Are the likely futures, what are some of the unintended consequences. And I. Think. It's important to recognize that there's no such thing as just introducing, technology everything. Changes, when you introduce something you even, the introduction, of a dishwasher, changes. The routines the, patterns, and the relationships, within a family and so we have to think about how we introduce, technology in, a really thoughtful way, intentional. Way and, it's. Important for us to be able to talk. About harm and break. Down this notion of harm because we often say oh we don't want to create technology that causes harm nobody does alright, nobody goes out there and says hey I want to create tech that's going to hurt harm someone but, we need to be able to break down what we say when we say harm what does that mean, and. For. Us what, we did was we created a set of operating foundations, and obviously. There's a lot more than this but three, important, ones that we wanted to articulate, as a company, was, first upholding, the Universal, Declaration of Human Rights, second. Upholding the principles, the underpinnings. Of democracy. And third. Upholding, the mechanisms, of an informed, citizenry, and the, third one's actually a component of the second one but, given some of the technologies, that we've seen enter the space we actually wanted to pull this one out and elevate it to a high level and I'll show you in a second what I mean by that.
So we broke harm down into four major categories: risk of injury; denial of consequential services, which covers things like education and access to public services; the erosion of democratic and societal structures, and that's why I showed you that third foundational pillar, because we're seeing a lot of technologies having an impact on societal structures and we wanted to call that out as a top-level category; and the infringement of human rights.

We broke these down into subcategories, with questions underneath each one. The idea is to create a framework for how we think about harm, because harm is a complicated topic and it's difficult for people to say what they actually mean by it. This is probably going to get tweaked; if you come back to me in three months it will look different, because we're going to learn, and we're going to find things we've missed and nuances we wanted to capture but haven't yet. But the idea is this: under risk of injury, for instance, we're thinking about the emotional and psychological distress that technology can cause (think about the effect of deep fakes on people) and the physical and infrastructure damage that technology could cause. Then there's opportunity loss, economic loss, loss of agency, manipulation, social detriment, loss of liberty, loss of privacy, loss of dignity. And like I said, underneath this is an entire set of categories that breaks down what we mean by each of these larger ones.

Then what we've done is taken some of our tech and started applying a lens on top of this, asking: what does this tech actually do? If you were to put your tinfoil hat on and think about the worst likely things that could happen in the worst hands, the different ways you could exploit, abuse, or manipulate this tech, what could happen? Here's an example of one project where we said, look, this has potential for psychological distress, or for manipulation; it can affect how people make money; we think it might affect the freedoms people have. This is an example of how you might layer on a lens and some level of analytical thinking. By no means is this perfect. There is no rubric in the back that gives you a score at the end; this is a judgment call, and it all depends on who's in the room. It's about making sure the right people are in the room having the dialogue, because some of this is judgment, and about leveraging research and the latest knowledge in developing this kind of framework.
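As one way to picture the framework, here is an illustrative encoding of the four top-level harm categories with a few of the subcategories mentioned above. The grouping of subcategories under categories, and the review questions, are assumptions for the sketch; the full framework has many more subcategories and guiding questions.

```python
# Illustrative encoding of the harm framework (grouping is an assumption).
HARM_FRAMEWORK = {
    "risk of injury": [
        "emotional or psychological distress",
        "physical or infrastructure damage",
    ],
    "denial of consequential services": [
        "opportunity loss",
        "economic loss",
    ],
    "erosion of democratic and societal structures": [
        "manipulation",
        "social detriment",
    ],
    "infringement on human rights": [
        "loss of agency", "loss of liberty",
        "loss of privacy", "loss of dignity",
    ],
}

def review_questions(category):
    """Turn each subcategory into a prompt for a project review discussion."""
    return [f"Could this system contribute to {sub}? For whom, in what context?"
            for sub in HARM_FRAMEWORK[category]]

for question in review_questions("risk of injury"):
    print(question)
```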
There are multiple characteristics to weigh: how severe is the impact, what's the scale, is there disproportionate impact, what's the likelihood of exposure? If you've done any sort of threat modeling, this will seem very familiar. Ease of updating the system is an interesting one: some systems are much harder to update once you've discovered that they cause some harm. With hardware, for example, updates have a longer life cycle and lead time, so we need to be extra careful around hardware. So we think about these various facets and characteristics of systems and then start applying this lens; there's some rigor we want to apply here.

Then there's also understanding around infrastructure, because infrastructure emerges all around us, and that infrastructure lends itself to certain types of technologies growing and perpetuating. Once a technology is widely accepted and standardized, the relationship changes between the people who use the system and the system itself: you suddenly have less and less power as a user once a system has been broadly standardized and widely accepted. That's not the type of relationship we want to design, so we want to be really intentional about understanding what kind of infrastructure is starting to pop up around some of these technologies, because that gives us a clue about the direction the technology is heading.

Okay, the fourth one is responsiveness. The idea here is that new knowledge is emerging all the time. Norms are changing, technology is changing, and people's perspectives and views on technology are changing, and we have to respond. We can't carve things in stone and say, this is how we think about harm and it will never change; we need to change as things change. And this changes disproportionately around the world, so we have to be aware of the worldwide impact and view. The landscape is shifting dramatically, and we're actively adjusting and responding likewise.

A few years ago we did some research. It's old now, but I thought I'd share it here. We started looking at the socio-political environment and how it affects belief systems and how people feel about technology, where they feel fear and where they feel security. We looked at the intersection of the kinds of relationships people want to have with technology, the kinds of experiences they have had with technology, and then context: how these technologies pervade people's lives, and the situations people find themselves in day to day. We know norms are built on more than just these three circles, but these are some important ones to consider. We looked at different countries and different variations on trust, because we wanted to understand the differences in how people looked at technology and how they thought about technology and the government. In the US we found a lot of skepticism, both around corporations and around the government. People felt like no one was looking out for them, and there was a lot of suspicion that the government supported businesses over citizens' interests. I bet some of this is still accurate. In China, and I'll caveat this, for the people who chose to speak with us, there was higher trust in corporations than in government, because they tended to work a bit more in tandem, and people
were starting to see the benefits of government and corporations working together for the collective citizens. Germany is quite different: there's tougher regulation in Europe, so there was more trust than in the US in the government's ability to regulate corporations. I think this has changed a little too since we did this ethnography. But the idea is that around the world there are different relationships that people have with technology, with government, and with corporations, and we need to factor that into how we think about rolling out new technologies, because it is not just the US.

And so, lastly, I want to talk a little about intentionality: how do we design with principles, with intention, with appropriate technologies? This is where Satya, I think in 2016, shared these principles around ethical AI development. The principle of fairness: that all stakeholders should be treated equitably and that we want to prevent undesirable stereotypes and biases. The notion of systems being robust and reliable: that they perform well in worst-case scenarios and across different environmental factors. Privacy and security are a big underpinning at Microsoft: for all of our data, we want to prevent misuse and unintended access, and we
Think About privacy in. A pretty robust way at the company, inclusion. Is really about empowering. Everyone regardless. Of their abilities, making. Sure that people have ways to give feedback and. Transparency. Is all around these, black box algorithms, and how do we ensure that whatever. These algorithms are doing they, they're understood, by the stakeholders, and people that are interrogating, the system and then, lastly principles. Around accountability and, ensuring. That we as a company as. Individuals, take responsibility. For what we build and what. The impact is and that's why we're being so thoughtful and intentional especially. At the engineering level around how we develop, tech and. One. Of our researchers. Salima Amir, she has done an incredible amount of work where, she's gone and scoured, 150. Plus AI recommendations. Conducted, multiple. Rounds of iteration, and validation. And looking at what, are some of the guidelines we should have around human AI collaboration. And, and. Each of these has, examples, and proof, points, that we can point people to and say hey if you want to do X here's, some examples, here's some good examples of what you can do this is publicly, available her. Research papers out there and, and. Things, that we do here are you know how do we be super transparent about technology, so. We recently published, something, called a transparency, note for face recognition we. Recognize, that there's a lot of complexity, and face recognition and. We want to make sure that people understand, how the technology, works what the capabilities, are what the limitations are how, to achieve the best results, as a, company, as technology. Companies they rarely open, the lid and say here's where it just doesn't work you guys don't do it like this don't use it here what, we wanted to do is create a, mechanism for us to be able to share what. Is the most appropriate use, of technology and, we want you to understand the complexity, here I want to understand what can cause false, positives, false negatives what, can you do as a customer, to create. The most appropriate use, and deployment, of a piece of technology and so, this sits somewhere between. Marketing. Material which is often meant to make products look amazing, and develop. Your API is, which is really functional, it's somewhere in between and, it's about capabilities, how, something works and how to help guide the deployment, of this for responsible, use and this is a pattern that we're starting to use across. Other technologies. And. Then there's this concept of being seen full right, in technology. And in software, we're often. Striving. To create. A seamless as an, experience as possible and there's many benefits to this sort of seamless, philosophy. It's, easier, right, if it looks beautiful it's frictionless, but. Sometimes we want to insert, speed bumps they, want to insert friction, because. We want to encourage conscious, decision, making and so this. Is where we put controls, in place around privacy, identity.
and exchanges of information, and we intentionally show you where the seams are, because we want you, as an individual and as a customer, to stop, pause, and make a conscious decision.

I think AJ talked about this in the lightning round this morning: we've been doing a lot of work to put ethics in the code path. What we really want to do is take what someone says at a policy level and translate it into lines of code: what does it mean? This is an example of an automated tool we're building that extracts the conditions under which errors are happening in a model, as part of a developer's workflow. It's a prototype we're actively developing, designed to sit within the developer's code flow. As you look through the chart here, it breaks down a model's behavior. Here's an example of someone going through it: okay, for females who are not wearing eye makeup, who have short hair, and who are not smiling, we have a whole bunch of errors. So we don't just look at females, see a big bucket of errors around "female," and say let's go recruit more females. We can break down the model's performance and say, for this specific category of people we have problems, we have gaps, this model is not performing well. Then let's go understand that: let's look at whether we have enough data, whether we need to acquire more, how the model is performing. This sort of tooling and inquiry is what we mean by explainable AI: we want to be able to understand how a model is performing. There are a lot of things like this that we're introducing right into the infrastructure and the engine of how we develop code, and it's meant to just be part of your process; it's how you do work in the new world.
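The cohort breakdown described here can be pictured in a few lines of analysis. This is a hedged sketch of the general error-slice idea, not the internal prototype; the evaluation log and attribute names are invented for illustration.

```python
# Sketch of error-slice analysis: surface the cohorts where a model fails,
# instead of reading one aggregate accuracy number. Data below is invented.
import pandas as pd

# Hypothetical per-example evaluation log with annotated attributes.
df = pd.DataFrame({
    "gender":     ["female", "female", "female", "male", "male", "female"],
    "eye_makeup": [False, False, True, False, True, False],
    "short_hair": [True, True, False, False, True, True],
    "smiling":    [False, False, True, True, False, False],
    "correct":    [False, False, True, True, True, False],
})

# Error rate per attribute cohort; the worst slices rise to the top.
cohorts = df.groupby(["gender", "eye_makeup", "short_hair", "smiling"])
slice_error = (1 - cohorts["correct"].mean()).sort_values(ascending=False)
print(slice_error.head())  # e.g. female, no eye makeup, short hair, not smiling
```

From a report like this, the fix is targeted: gather or label more data for the failing cohort, retrain, and re-run the slices, rather than broadly collecting more data of every kind.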
And so as we examine our ecosystem and the technologies we're building, we ask: what are the attributes of good systems? What do they look like? Here I take inspiration from Wendell Berry and sustainable agriculture, because a lot of the patterns he described years ago are very similar to good patterns in technology. Good ecosystems function in harmony with the larger patterns around them. They're pragmatic: there's no science fiction, nothing far-fetched. They solve real problems, and there are reasonable limits to them. They solve more than one problem, not doing things in piecemeal fashion but solving complex problems in a holistic way. Well, what do bad solutions look like? They seem very similar to the way a disease or an addiction might behave within a body, and we're seeing addictive behaviors right now with social media. These types of bad solutions cause new sets of problems that are equally as bad as the ones they're trying to solve. They might worsen a problem, they might feel heavy-handed, they fail, they're really brittle. So as we examine good and bad systems, we want to be able to identify what they look like and what their attributes are.

Let me step back a little and talk again about the big picture, because technologies need not be used the way we use them today, and it's not a question of no technology versus putting up with the current ones. We should actively be pushing against technological determinism, and we should be asking questions about the big picture at the same time as we're looking at some of the details. Fundamentally, what I think we need to do, and what we're doing, is approach this type of thinking as innovation material. It's not a tax, it's not compliance, it's not after-the-fact; it's not something you tack on and say, okay, now let's make this responsible, let's put some band-aids on it. It's thinking about it right from the beginning, the way we think about privacy and security right from the start of development: as innovation material. It's an opportunity, a business opportunity. That is the mindset my team brings to this work. Like I said, we're embedded at the deepest part of the infrastructure. We look at all of the face recognition, text, speech, people technologies, ambient technologies, mixed reality, and these types of synthetic environments as well. This is where we see some of these technologies start to incubate, and we want to do that in the most responsible way possible. I think I have 10 or 15 minutes left, so I'll leave it at that and see if anyone has questions. Thank you.

Okay, so we have some microphone runners, and I see number two here.

Q: Hi there. Thanks for your talk. I think it's wonderful to see a company embedding ethics deeply in its process, and I'm enthusiastic about your efforts to try and do good. I'm more pessimistic about your efforts to not be evil, because you're only one company, and if you're not evil, then some other startup company will be evil instead with the same technology. I think the only way to prevent those sorts of outcomes is to legislate and actually have rules that apply to all companies. So I'm curious whether your ethics team at Microsoft is working with your lobbying team to try and influence legislation going forward.

A: Yeah, absolutely, and I agree with your sentiment that the thing that applies to everyone is our laws. There are two pieces here. One is: what can we as a company, a trillion-dollar company, do to raise the floor and the water level for the industry? I think we have the benefit of having been around the block a few times. We went through our own era of growing up, our own adolescent years, so we are a much more mature company than some of the newer ones that are very publicly going through some learning. We can set a lot of examples by talking about this publicly. We're setting examples by
talking to the industry and by participating in things like the Partnership on AI. We talk about techniques publicly so that we can give people ideas and tools they can go and implement as well, because a lot of companies do want to do good, and what they're looking for are examples. They're not quite sure how to approach it; they might be a little new; they're not sure how to situate themselves organizationally. So we want to give them examples. On the regulation side, you've probably seen that Brad Smith has publicly called for regulation around face recognition. My group works very closely with our legal team, and with our lobbyists as well, and what we think our role is, is to give proof points. We want to show people what it means when we say "here's a policy," because policy is kind of abstract sometimes, and the best policy is one where you can point to a proof point and say, here's how it's articulated. So we do work closely with them. Our focus is not
lobbying and regulation, but there are parts of the company focused on that, and we collaborate with them.

Q: An old cliche says talk is cheap; actions count. Apple has publicly said that what happens on your iPhone stays on your iPhone, staking its position with respect to privacy. What's Microsoft's position? We heard a talk yesterday about behavioral data and the wealth of behavioral data, and the question, of course, is who gets to see this behavioral data. So what's Microsoft's position on behavioral data, and would Microsoft step up to the same place: what happens on your laptop, or your device, stays there?

A: That's interesting. I don't know the talk you're referring to around behavioral data that happened yesterday, but I'll say that both Apple and Microsoft are some of the conscious leaders in the industry. When you look at the cases Microsoft has brought to court, where it has challenged the government, or the way it chooses where data centers exist and the evaluations that are done around countries and human rights, I would say there's a lot of very public discussion that does translate into meaningful actions, multi-million and billion-dollar decisions. Brad has talked about decisions we've made as a company where we chose not to sell technology to certain regimes, or where we've chosen not to sell technology to, for example, a California police department that wanted to deploy face recognition on body cams and dash cams. We said, no, we don't think this technology is fit for purpose; we have concerns about how you would use it. We've said no to countries that wanted to deploy technology across their capital cities, where we looked at their human rights records and the potential violations that might occur and said, no, we don't want to put technology there. But to come back to behavioral data: I'm not an expert on the behavioral data side, so we can talk about that afterwards and I can find you a contact, because behavioral data appears in lots of places, potentially Office, potentially LinkedIn, but those are just not categories I spend a lot of time in. I can find you a contact, someone to talk to around behavioral data.

Q: Hi. At the moment we see globalization and open source as two things that limit the impact of these choices. You might decide face recognition on body cams is not something Microsoft wants to sell, but at the same time Microsoft puts things out as open source, or as open-source libraries, and the level of knowledge required to put them into action goes down for somebody else. Coming from Germany, I think regulation is one of those things where we've seen it's really hard if you want to maintain an open market and you want your citizens to access everything around the world. We can do it from our own perspective, our own country's perspective, but if we take the global view, what we can agree on globally becomes very, very small, even with human rights. So how do you see open sourcing, which basically lowers the hurdle for others to change things,
as well as globalization? In order to have success with the ethical principles put out by some companies and some states, if the others don't join in, is it going to work?

A: Yeah, that's a tough question, and we talk about this quite a bit: if we choose not to go into a certain business and others do, what does that accomplish? This is where the question about regulation comes in again. And at the end of the day there's a distinction here, because we're talking about how we govern our own behavior; we're not talking about us governing the world. This is about governing our behavior as a company: where we choose to put our time, energy, and investment. We have to make that distinction, because it's easy to ask how Microsoft goes and shapes the world. This is primarily about how we shape ourselves first, and, at the same time, whether we can shape the world in a positive way. But
it's not our job to go and police the world, so we have to think of this first as a mindset about managing ourselves. The open source question is a really interesting one; we've had a lot of debate and discussion around it. I think you recently saw a dataset that Microsoft pulled offline. We said, look, we just don't feel good about this dataset right now; the researcher who was working on it is no longer at the company; let's pull this one offline, because we have concerns about what might be in it and we want to be really thoughtful about it. So we're actively going in and examining some of these, and that's an example of that.

Q: One addition: I think it goes all the way to the tools, because tools make it so much easier to do these things. You're conscious about the products you sell, but what would it really require not to sell the tools to people?

A: The angle we're actually coming at this from is the platform. I sit in the platform at Microsoft; I don't sit in an actual piece of software or a product that's shipped. We sit in the platform: we power Cortana, we power all of the speech, language, and vision services around the company, and then we provide that platform to developers. So the discussion is around how we provide some of these platforms, and which technologies are actually gated: which ones are gated, which ones need extra scrutiny. Face recognition is one of those technologies. So we think about this at the platform level as well, and I think that's the deepest level we need to think about it. And this is where it gets super messy, because you don't always know how your platform is being used; as soon as you open it up and it enters the market, you lose all control over it. So the dialogue is around which ones are the most problematic, whether we can articulate and identify them, and whether we have systems in place to manage who uses them and how they're used. That's the needle we're trying to thread right now.

Q: Hi, I'm Margaret Burnett from Oregon State. I really enjoyed your talk, and my question is about a trajectory, as you see it now, in Microsoft's organizational culture. It seems like this trend you're describing actually started several years ago, and over time it seems like it's grown: your reflexivity and your thinking about your products in this way. My question is, what kinds of changes have you seen that you can actually measure in Microsoft's organizational culture?

A: Well, organizational culture is hard to measure. Let me share some anecdotes instead, how about that? One thing we do is measure literacy around principles and values, just to understand whether people understand what they need and whether they can apply it; we measure that through surveys and things like that. But I want to talk a little bit about behaviors. What we're seeing is people starting to ask a lot more questions, and starting to say, hey, you know what, I actually want to change how we do budgeting around data acquisition, because what we want to do this year, for example, is go create a fair benchmark dataset, and creating
a fair benchmark dataset costs a lot more money than just creating any old dataset where you go and gather as much data as you can. So we're starting to see changes in behavior that affect the way you budget and the way you plan for technologies. We have people who come to us and say, I'm starting to articulate things I've never felt comfortable talking about. We have people who have come to us and said, I need to rethink what I'm doing in my career, Mira, I need to go quit my job, and I say, don't quit your job, change how you're doing your job. So there are these cultural transformations that are really behavioral, and you start to see them. As an ethics group, I mean, nobody wants an ethics group at the beginning, right? You've got engineers thinking, you're going to slow me down; that's the initial reaction. Now we get pulled into a lot of conversations where they say, hey, we want to think about this right at the beginning, or another
group will say, I heard about what you did with that group; can you come and help us? So the types of asks we get, the questions, and the way we get pulled in have shifted hugely, even in the last year or two, just in the way people approach us. Initially there's a fear of, oh, you're the ones waving a stick, you're going to tell me I can't do what I want to do, and then we hear, wait, you're actually my partner, you helped me do something better and faster when I didn't think I had time for that, thank you so much. It's those types of things we listen to, and that's how we understand whether this is actually producing the kind of cultural shift we want. This year, at least in this organization, we changed how we do budgeting: we forced groups to factor in their ethical needs, and I don't know if there's a better way of putting it. We forced them to think through all the things that come with responsible innovation, all the new data acquisition they have to do, any new benchmarks, any new tooling they have to build, and they factor that into their budgeting process. I found that really remarkable, because I haven't seen or heard about that elsewhere, so that, to me, is another signal that things are shifting and changing. Thank you.

Hi there. There was one up there; I think I saw you waving. Number three.

Q: Oh, hi. Thank you for the talk; this was really interesting. I particularly like this idea of ethicists embedded in product groups. One thing I'm wondering about: it seems like in a lot of industry, anybody can be an ethicist, in the sense that you have to have a computer science degree and go through an interview loop to become a software developer, but you don't really need any qualifications to be an ethicist. I wonder if Microsoft is thinking about this somewhat differently. Do you have to have a degree in philosophy or a degree in ethics, or do you have to have passed some bar, before you are leading the ethics decisions of a team? How does it work?

A: So the group is actually multidisciplinary, and there are only a handful of people on the team that I would say are the true ethicists. The organization I run has data scientists, creative coders, designers, PMs, and one or two people I would say are more ethics-oriented, because this isn't really just about ethics; it's about how you develop. We create proof points: we create through code and design and show you, here's another way to do it. So the thinking
on the ethics side, and which type of ethical philosophy you align to, isn't really the high-order bit. The word "ethics" is used so often that you might think the team is all ethicists; it's not. I think the trick here is, one, don't hire people who are just armchair ethicists; hire people who have some level of experience and background, so we make sure they have a degree in philosophy and have worked in the industry for a while. But these types of groups are not effective if they only talk; they have to do, and they have to show. So we staff the group in such a way that data scientists can go in and examine the data and its composition, and a coder can go in too: that error analysis tool I showed you was coded by someone on our team who built the front end of it so we could understand how it would manifest, and that wasn't an ethicist, it was someone coding in a responsible way. So I think the shift really has to happen away from ethics as the talking point to responsible innovation, which is an engineering function as well. Thank you.

There are two in the back.

Q: Thank you. I'm from UC Davis. I was really interested in your training program, and I was wondering whether you have any curriculum materials you could share, and, more broadly, what advice do you have for developing curriculum materials for undergraduate software engineering programs?

A: No curriculum that we're ready to share yet, but we hope to make things public. I'll say that curriculum needs to be tailored to your cultural environment and the context in which you're developing. If it's too abstract, it's just theory at that point, and theory doesn't land well with engineers; you want to show something real and practical: here's what you should do, and here's how you should think about it. So we've tailored our material to our environment, and the dialogue we have is around the technologies we're building, so it's not too far removed, because otherwise you have people saying, I just don't get what you're trying to say. I think it has to be really tailored, so when we share it, it will look tailored to what we're doing in our group. I'm happy to talk about education curriculum afterwards if you want, because I think that's a longer, deeper conversation, and it depends on the outcomes you're trying to achieve and things like that. Come up after this if you want to chat.

Okay, any more questions? One more up here.

Q: A lot of people seem to be talking about intentionality these days, but there's not that much agreement on definitions, and I was wondering what
you mean by intention, and why you think it's so important.

A: Yeah, it's a good question. I haven't heard a lot of people talking about intentionality; I just hear a lot of people talking about principles right now. But the way we talk about intentionality is that for every decision you're making, there's a reason behind it. You've thought about it; you've been able to think about who's impacted, how, and what; you've asked "why" five times and gotten to the essence of what you're doing. For example, we're doing some work around user flows right now, around some of our devices, and being really thoughtful about how people are going through that flow, how they're consenting and giving information. It's so easy to quickly go and draw golden-path scenarios, and