Defining and Deploying Responsible Technology (Cloud Next '19)


Hello, and good morning. Thank you for joining us today. I'm Nisha Sharma, a managing director at Accenture. I've been at Accenture for just over 20 years and have worked on all sorts of technology implementation projects for clients across several different industries. I'm currently part of our Accenture Google Cloud Business Group, which we formally launched last year at Next. I oversee all of our offerings and offering development for the partnership, and I'm based out of Miami, Florida.

Hi, I'm Deb Santiago, and I'm based in Chicago. I'm the co-lead of our Responsible AI practice at Accenture, as well as a managing director in the legal department. The way today is going to work is that Nisha will walk through our Tech Vision document, and I'll respond with some thoughts on what we're anticipating for the next five years, and what we can learn from the past five years that can help us navigate and manage our entry into these new worlds.

All right. Every year Accenture publishes a Technology Vision: our view of the top trends we see impacting businesses today. We cover technology trends as well as business trends, and how the use of technology is shaping what our clients and other businesses are doing. We've been publishing the Tech Vision every year for the past 19 years, and we published this year's 2019 edition in February. We thought we'd anchor this conversation around the responsible use of technology and the ethical considerations around these future trends, because that gives us an opportunity to weigh them as we deploy our technology solutions.

I'd like to start by asking you two questions, so just think about these. First question: is technology good or bad? Think about all the different technology you've come across and all the technology you hear about. Is it good, or is it bad? And the second question: what is your role, what is our role, what is our company's role in influencing and determining whether technology is good or bad?

Let's look at some examples to set the context for our discussion today. Technology can be very scary to people; there's a lot of negative news around technology and advancements in technology. Who should be trusted with a given technology, and who shouldn't? Is it ethical or not? It can be quite controversial: even when we see some really advanced implementations and advancements in technology, we're still questioning the ethics and the responsible use of that technology. Here are some examples of what we're actually seeing in the news. Over three billion identities were stolen online last year. We're questioning the geopolitical forces and influences in today's world: which countries do we trust with our technologies, and which countries are we willing to share our technologies with? Lots of questions, again, about who we trust and who we don't. Fake news is in the news every single day; there's a survey that says the majority of people around the world don't think they have a single source of truth for news these days.

I just came back from a vacation in India, where they're getting ready for their elections, and there's a lot of concern about fake news being spread and really influencing and affecting those elections, even after what we saw in our own elections here in the US. We're seeing questions around the use of technologies like facial recognition: we're constantly scanning everybody, all the time, and identifying who these people are. Who do we trust to see and act on this information? The bottom left there is about a researcher in China who was using gene-editing tools like CRISPR to modify human embryos. It sounds really advanced, but is it ethical? There are a lot of questions around this. We hear stories all the time about bugs, or software gone wrong, or something we've discovered in our applications, like the recent FaceTime issue, where you'd make a call through FaceTime and the camera turned on before the receiver actually had an opportunity to accept. We've got apps that are constantly requesting access to our photos, our contacts, our call logs and text messages, so we're constantly asking: what's happening with my privacy and my data? We're talking about inequality and inclusion; many, many questions. And then we're seeing this tech clash against the technology companies, whether they're in Silicon Valley or in China: they're constantly being asked, what are you doing with my information, what are you doing with this data, how are you using it, what are we really doing?

But there are also some really amazing things happening, a lot of positive news around technology. One example here is work we're doing on a program called ID2020, which is about helping refugees all around the world establish their digital identities, so that they can get access to services they might not normally have had access to because they can't demonstrate or prove who they are. We're seeing advances in the agriculture industry, where we're using technologies like drones and IoT sensors to improve the quality of the food we're producing and to help reduce food waste. Advances in healthcare and medical services: this is one of many, many examples, but we're using IoT devices and technologies to provide seniors and disabled people with new ways of getting exercise. Some fascinating advances in robotics, where we're teaching robots how to do backflips and to dance better; so we're not going to be doing "the robot" anymore, they're dancing like us. Pretty cool stuff. And then the other example there, with rockets: traditionally we've launched rockets as one-time uses of that technology, but companies like SpaceX are figuring out how to land those rockets safely back on Earth so we can reuse them. So there are some really fascinating and really positive uses of technology out there as well. How many of you heard about the 10 Year Challenge that was recently all over social media?

It was about taking a picture of yourself ten years ago and comparing it to a picture of yourself now, to show how much you've changed. Well, Bill Gates posted the world's 10 Year Challenge, showing how social factors have changed over the past ten years: life expectancy, extreme poverty, child mortality, youth illiteracy. We've seen really, really positive improvements over the past ten years, and technology has definitely had a role in all of that.

So technology can be good or bad based on how it's used, and it's really up to us to determine which it is. And the thing is, change is constant. It's not as though we didn't have technology change in the past, but there's something very different right now about the pace, the scale, and the velocity of change we're experiencing. Many of you have seen that S-curve slide about technology adoption in the US over the last 100 years. One of the often-cited examples in that slide is that it took about 45 years for the telephone to achieve mainstream adoption, whereas smartphones took only about ten. Before, societies had time to observe the impact of technological advances, but we don't have that luxury right now, because we are living in this time of rapid technology adoption. Regulators are playing catch-up, but they've also indicated that in the next five years they will be catching up with us all. And I think what we're seeing in this gap is that society at large is responding. We've got enormous public scrutiny, real-time feedback, and in some respects forced transparency of the activities companies are doing today. As new tech like AI becomes mainstream, it's important to consider the ethical implications right at the beginning, as a central aspect of how we develop and deploy new technologies. As recent events have shown, that's really hard to get right. And the thing is, technology alone cannot deliver on the full promise of new technologies like AI. An important conversation is going on right now with respect to external advisory councils and so on, facilitated by the community at large, and I would say it's helping the tech world understand how the public views the way companies should be interacting and engaging with us, and what they expect companies and communities to represent. But in the end, everything should not hang on one company, one person, one team, one regulation, one government. Rather, companies should be creating resilient and sustainable governance strategies that can help them act in an agile way to deploy technology responsibly.

So, let's take a look at this year's Technology Vision. What it's all about is the post-digital era, which we are now entering. In 2018, companies spent 1.1 trillion dollars on digital transformation projects. Ninety-four percent of organizations say they're doing digital transformation work today (I don't know if I believe that, but that's what they said), and 58 percent of those organizations say they're comfortable with where their digital transformation programs are going. So everyone says they're doing digital, and we need to figure out how to differentiate; we need to figure out what's next. In this post-digital world we're not saying that digital is done, not by any means. In fact, it's just table stakes now; it's required to be there; it's just the cost of doing business.

And, you know, I like how our CTO has put it: we don't say we're in a post-electricity world anymore, or a post-internet world; those things are just there. Very soon we're going to see digital the same way, and at some point we'll even stop using the word. So now we have a new set of characteristics that define what it means to be successful in this post-digital era.

We're going to touch briefly on some of these characteristics, and the first one is individualization. Individualization is not personalization; we're talking about hyper-personalization. It's not just about knowing your preferences and what you like, but about really understanding what you want and what you need. Instant on-demand is about being able to respond and deliver services to a customer exactly when they want them: they want it now, and we have to be able to deliver it. And momentary markets are these pop-up services where we respond to a customer's needs at exactly that particular moment in time. These markets form very quickly, provide an opportunity right then and there, and then go away just as quickly. If you can't serve a customer's needs exactly when they arise, you've lost the opportunity to engage with them. Now, you can imagine all the information being captured in order to provide these types of services. And customers expect trust and responsibility: they expect that you're using all the data they put out there responsibly, and that they can trust you to do so.

I'd like to do a little experiment with all of you. Take out your phone; you all have phones, so just take it out. OK, got your phone? Now unlock it. It's unlocked? Everyone's phone is unlocked. Now hand it over to the person next to you. Go ahead, hand it over, and do something on the phone. "Where's your camera? I don't know where your camera is. I'm going to send you an email. Let's take a picture here. Where's your selfie mode?" All right, give the phones back.

Now, all of you laughed, and all of you paused to think: oh my god, I'm handing this phone to a person I may or may not know. ("We're friends." "Yeah, I know that, but...") Do I really trust her? Do I really want her to see all the messages I'm exchanging with my sisters on WhatsApp, or all the photos I've taken that are stored on my phone? There's access to my banking applications, access to my emails; there's just so much information that I know is on this phone. We all paused and hesitated to exchange those phones, but we do this every day with data that is online and services that we use. We just publish things; companies have access to these things, and we don't even think about it. We just trust that they're using this information responsibly. It's quite fascinating.

So, as part of my responsibilities, I had a client reach out and say, "Have you ever done anything with a robot as an employee?" Where was this question coming from? They explained that they had introduced a robot to the workplace. The robot had eyes and was roaming the work floor, and they thought it was going to be enjoyable, but it completely backfired. People got very angry, and one of the things they said was, "This thing is recording me and it's invading my privacy." As we brainstormed about this, we pointed out: I'm sure you have CCTVs all over the place recording your employees' activities all the time. It was reducing it to physical form that made it come to life, where people really understood what was actually happening.

So here you'll see the principles we use internally for our use of artificial intelligence at Accenture, whether from an HR perspective or a CIO perspective. They're based on the common FATE principles: fairness, accountability, transparency, and explainability. I'm not going to go through all of it; you can see that ours stand for trustworthy, reliable, understandable, secure, and teachable. The one thing I want to highlight is the item around "understandable," because we were very intentional about taking it a step further than transparency. Why? Well, privacy policies are very transparent: pages and pages of transparency. Nobody ever reads them, and people don't really understand them, so you can't say that these privacy policies establish trust with the end user. We really wanted to think about how we stop putting the burden on the user to decipher what's going on, how we take responsibility for our actions and make the things we create understandable to the user. When we prioritize understandability, we allow the user to grasp the import of their actions, and we enable trust quickly. That will be a very important point when we talk about momentary markets later in this discussion.
So we've talked about the characteristics of the post-digital business. Now let's look at the five trends we've established as part of our Technology Vision for the post-digital era, and then discuss how we can apply those principles of responsible use and ethics that Deb just talked about.

The first trend is what we call DARQ power, and it's about the technologies we now see companies using in this post-digital era. The D stands for distributed ledgers: technologies like blockchain and cryptocurrency that we use to have more secure, protected, and trusted transactions. We're seeing companies use technologies like blockchain in their supply chains to establish a trusted set of supply chain activities. We're seeing carmakers use these technologies as a way to protect cars from getting hacked. And we're seeing delivery companies like DHL look at blockchain technologies to make sure that, for example, the medication or drugs you ordered are really what you asked for and not counterfeit.

The A is for artificial intelligence. Sixty-seven percent of businesses say that artificial intelligence is being piloted or adopted in their organizations today, and 41 percent of executives ranked artificial intelligence as the one technology they expect to have the most impact on their organizations over the next three years. You'll see all sorts of great examples of artificial intelligence here at the conference as well, as you walk around and see what other companies are doing.

The R stands for extended reality: augmented reality, virtual reality, mixed reality, assisted reality, and so on. These are all about new experiences. New ways of training: we can use virtual reality to simulate environments so that workers have a safe place to try, to learn, and to test new capabilities. New ways of shopping: new ways of interacting with products or getting information in a store environment. And new ways of exploring places: we can virtually visit a national park, a museum, or any other such place.

The Q is for quantum computing. Quantum computing is bringing about many new advances, and it's really allowing businesses to explore new ways of solving very difficult problems, or problems they weren't able to solve before. We've been working with a company called 1QBit, and together we've identified over 150 different use cases for quantum computing: drug discovery, where quantum could help us discover new types of drugs more quickly; fraud detection; route optimization; and many more.

Individually, each of these technologies provides an opportunity to differentiate, and we've already seen that 89 percent of businesses say they're experimenting with one or more of these DARQ technologies. But imagine them together: they're going to open some really new, previously unimaginable pathways. The DARQ technologies are built on the digital foundation that companies have invested in, and been investing in, over the past several years. Have you heard the term SMAC before? SMAC is what we've traditionally called the digital technologies: social, mobile, analytics, and cloud. These are considered the foundation of our digital solutions, and we're building on them as we move to the DARQ technologies. As an example, augmented reality and virtual reality solutions run on mobile devices. The big data and analytics capabilities we've been setting up over the past several years are now being extended to artificial intelligence and machine learning. The quantum computing services now becoming available are being made available through the cloud. So we're building on top of what we had already established, and that provides a foundation for companies to start exploring these DARQ technologies, which are really still in their early stages. Not only does that give us a head start; it allows us to create new value from these previous investments and to extend our digital business into the future.

We see enormous promise in extended reality. Training, as Nisha mentioned, can really increase people's ability to empathize with the plight of others. A really great example, which I encourage you to look for on YouTube, is from the NGO People for the Ethical Treatment of Animals, which decided to change its communication campaign (Accenture actually helped them with this). They used extended reality technologies to help people engage eye to eye with a rabbit; it's called the "Eye to Eye" experiment, and it had a profound impact on the public in terms of understanding and helping animal rights come to life.

But when I think about the promise, I also think about what happens whenever there's a huge institutional shift, going from SMAC to DARQ: what are the things we need to protect against, and what are the things we need to maintain? What shifts are we seeing? We're shifting from watching a video on your phone to having very intense, immersive experiences. And when you have these intense immersive experiences, it becomes very easy to come to a "truth for me" conclusion that can seem, at times, unshakable and permanent. We're also shifting from data simply being collected to data that can possibly be used for manipulation. There's a senator from Virginia who just today introduced a bill that proposes banning the use of dark patterns on online platforms, to prevent manipulative practices where people volunteer information about themselves without realizing how that information is going to be used. So we're shifting from data collection to possible data collection for manipulated experiences.

And think about the regulatory scrutiny that is coming with the combination of extended reality and, for example, deepfake technologies. Will the public start asking for guardrails, not just on fake news (this morning I counted about 13 countries that either have legislation on their books regarding fake news or are proposing it), but at some point also around online safety for children, and around the creation of fake memories?

Our second trend is called "get to know me," and it's all about the consumer. How do you reach the individual consumer, and how do you provide the right services to them? It's all about how we engage and interact. Think about all the SMAC technologies we said we've been deploying, social, mobile and so on, and all the information we've collected about each of us and all the users of these technologies. You may have seen those charts about what happens on the internet in a minute: 3.7 million Google searches are done every minute; 4.3 million YouTube videos are viewed every single minute; 2.4 million snaps; 38 million WhatsApp messages exchanged every minute; and 187 million emails sent every minute. You can imagine that all of these interactions and activities say something about us; they all provide information about us, and companies are able to use that type of data to provide new types of services to individuals. As an example, there's a company called SlicePay, a financial services company in India that serves unbanked customers: customers who don't traditionally use banking services.

What they did was use the pictures people were posting online, the messages they were sending, and their other social interactions to create a financial profile of these individuals, and now they can offer services to those individuals based on the profile they were able to build from that information. Keep in mind that customers are making this information available to us, to our businesses, and they are relying on us and the businesses to use that information responsibly.

So again, when I think about the promise of individualization, it is the promise of instilling a sense that you are not only known and recognized, but also understood. We delight whenever a recommendation comes and it's correct; we delight whenever it's thoughtful or insightful. But when data is being scraped, or used in ways the user didn't originally intend, as a society we should pause. Are we creating systems that are learning to penalize people based on activities and data they never intended to be used this way? Are we making inferences about users that are just not reasonable? Again, we're seeing increasing scrutiny in this space. The New York Department of Financial Services in January introduced a set of guidelines for insurance companies based in New York, saying that if you're going to use non-traditional data sources like social media posts, here are the guiding principles that should apply. Importantly, the burden is on the insurance company to show that it is not using this data in a discriminatory way against protected classes of individuals: race, gender, and so on. And by the way, just removing race and gender doesn't eliminate that problem; there are indirect ways of discriminating.
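To make that indirect route concrete, here is a minimal, hypothetical sketch in Python. Everything in it (the zip codes, group labels, and numbers) is invented purely for illustration; the point is only that a rule which never looks at the protected attribute can still produce very different outcomes per group when some other feature correlates with it:

```python
# Synthetic records: (zip_code, protected_group). The correlation
# between zip and group is the whole trick; no real data here.
RECORDS = [
    ("90001", "A"), ("90001", "A"), ("90001", "A"), ("90001", "B"),
    ("10001", "B"), ("10001", "B"), ("10001", "B"), ("10001", "A"),
]

def approval_rate_by_group(records, approved_zips):
    """Approve purely on zip code, then measure outcomes per group."""
    rates = {}
    for group in sorted({g for _, g in records}):
        members = [r for r in records if r[1] == group]
        approved = [r for r in members if r[0] in approved_zips]
        rates[group] = len(approved) / len(members)
    return rates

# The rule below never looks at the protected attribute...
rates = approval_rate_by_group(RECORDS, approved_zips={"10001"})
# ...yet the groups see very different approval rates, because zip
# code is a proxy for group membership in this synthetic data.
print(rates)  # {'A': 0.25, 'B': 0.75}
```

Dropping the sensitive column changes nothing here; the disparity survives because the proxy carries the same signal.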
Proxy variables like zip codes can get caught up in that, so companies need to anticipate and think through their use of these non-traditional data sources. The city of Los Angeles recently filed a lawsuit against The Weather Channel, claiming it had collected location data from users who did not know it was going to be sold to advertisers. The state of Utah recently enacted a law requiring police to have search warrants before they can collect and use any kind of electronic personal data, like social media posts. People are worried about everyday behaviors being penalized and criminalized, and they are especially concerned when the data sets behind these inferences, conclusions, or recommendations are false, faulty, or simply incorrect. And I'm not even going to go into the whole bias discussion; we could spend a whole hour just talking about how some of these systems are built with bias already embedded in them.

Our third trend is called "human+ worker," and it's about the workforce. Our research shows that more than 90 percent of the jobs we have today will change as a result of artificial intelligence, robotics, and other technologies. Jobs are changing fast, and each individual worker is empowered not only by their skills and knowledge, but also by the new capabilities that technology provides.

Some examples: in oil and gas, we have a company whose workers are now able to troubleshoot an issue a mile underground using game-like visualization tools. We have workers being trained as drone delivery pilots, jobs that didn't exist before. And we have factories where humans work side by side with robots, using artificial intelligence tools to help determine which jobs the humans should do and which the robots should do. So the workforce is evolving, and companies need to keep up to support these human+ workers.

There are three areas of focus that we have been exploring and working to improve. The first is hiring. The speed and constantly changing nature of these human+ careers are making it harder for businesses to acquire this talent, so companies are being forced to move away from the more traditional, reactive, skills-based hiring. You don't just put out an ad saying "I need someone with six-plus years of accounting skills." Companies like Unilever, for example, are now using games to screen candidates: games that assess your memory, your acceptance of risk, and whether you respond more to contextual cues or emotional cues. It's a very different way of screening candidates, and then they use artificial intelligence tools to match those candidates with open roles. These are just new ways of hiring.

The second is training. We can't necessarily hire people to do the work we now require, because people just don't have those skills.
What we've seen is that 43 percent of business and IT executives say more than 60 percent of their workforce will move into new roles over the next three years, and that's going to require substantial reskilling. We need to invest in reskilling and offer on-demand training opportunities for these employees.

The third area is knowledge management. We have more information available to us than ever before, but it's also harder to access and find, and we're exploring new ways to fix that. We're using technologies like natural language processing to capture and assess information and insights.
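As a toy illustration of that kind of NLP-driven knowledge capture, the sketch below surfaces the distinctive terms of one "incident report" with a plain TF-IDF score. The report texts and the function name are invented for this sketch; a real pipeline would use a proper indexing or NLP library, but the scoring idea is the same:

```python
import math
from collections import Counter

# Toy "incident reports"; the contents are invented for this sketch.
DOCS = {
    "rpt1": "pump failure caused pressure drop in line three",
    "rpt2": "sensor failure on line three triggered alarm",
    "rpt3": "scheduled maintenance completed on pump",
}

def tfidf_keywords(docs, doc_id, top_n=3):
    """Rank the terms of one document by TF-IDF against the corpus."""
    tokens = {d: text.lower().split() for d, text in docs.items()}
    n_docs = len(tokens)
    counts = Counter(tokens[doc_id])
    doc_len = len(tokens[doc_id])
    scores = {}
    for term, count in counts.items():
        # df = number of documents containing the term
        df = sum(1 for toks in tokens.values() if term in toks)
        scores[term] = (count / doc_len) * math.log(n_docs / df)
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:top_n]

# Terms unique to rpt1 float to the top; shared terms sink.
print(tfidf_keywords(DOCS, "rpt1"))
```

Even this crude score is enough to tag each report with the terms that make it distinctive, which is the first step toward searchable institutional knowledge.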

We're looking at indexing unstructured documents, like incident reports, as a way to collect knowledge and insights, and we're incorporating knowledge graphs to find information across a wide variety of data sources. So again: new approaches to hiring, training, and knowledge management.

There was a news story rather recently about a certain Seattle company that had an AI tool screening CVs, and if you were a woman, you were immediately downgraded. There was enormous backlash about what happened. But when we looked at it, we actually said: they had a really strong governance structure. They looked at this for two years. They recognized that there was going to be bias in the system, reflecting the bias in their existing hiring practices, and they tried to mitigate it and figure out ways to fix and adjust for that bias. At the end of the two years they couldn't fix it, and they decided to dismantle the program. That story told me how important it is to establish those governance strategies at the beginning, as you start experimenting with artificial intelligence.

At Accenture, we've spent a lot of time looking at some of the tools we use internally, but we also put a high level of importance on retraining and reskilling our workforce. Last year we spent 900 million dollars on retraining our people, and we use the money we save from automation and artificial intelligence to upskill people. For us that really matters, because we think it's important to democratize AI learning and, from an ethical point of view, to make it accessible to those who might otherwise face a high barrier to entry for some of these skills.
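One simple audit that can flag a screening system like the one in that CV story is the "four-fifths rule" used as a rough disparate-impact screen in US employment-selection guidance: if any group's selection rate falls below 80 percent of the highest group's rate, the pipeline deserves a closer look. A minimal sketch with made-up numbers:

```python
# Hypothetical screening audit; all applicant counts are invented.
# The 0.8 threshold is the "four-fifths" rule of thumb.
def adverse_impact_ratios(selected, applicants):
    """Each group's selection rate divided by the best group's rate."""
    rates = {g: selected[g] / applicants[g] for g in applicants}
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}

applicants = {"men": 100, "women": 100}
selected = {"men": 40, "women": 20}

ratios = adverse_impact_ratios(selected, applicants)
flagged = [g for g, r in ratios.items() if r < 0.8]
print(ratios)   # {'men': 1.0, 'women': 0.5}
print(flagged)  # ['women']
```

A check like this doesn't fix bias, but it makes the disparity visible early, which is exactly where governance needs to start.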
Trend number four is called "secure us to secure me." Security is no longer just about protecting us as individuals; it's not about one person, it's about protecting all of us. What we've seen is that a lot of companies still think security is an individual effort, that if they can just secure their own information and their own data, they should be safe. But that's not the case at all, because businesses are rapidly entering these ecosystems: they're working with technology partners and industry partners to create new services, new products, and new experiences

for their customers, and attackers see these ecosystems as an ever-widening attack surface. Only 29 percent of executives in our survey actually know whether their technology partners are being just as diligent as they are in implementing security processes and solutions. We continue to face the traditional risks we've always had, and one of them has always been the misuse of data. We've always thought of that as misuse of our own data and how it could provide access to our systems, but what we're seeing now is that misuse of other data can impact us as well. For example, data from business newswires has been stolen for illegal stock-trading purposes. There's a risk around aggregated data too: we think of aggregated data as somewhat anonymized, unable to identify us individually, but aggregated data from Strava was used out of context to identify secret US military sites. So we can't just assume that nobody can make sense of this other data. And in today's connected, ecosystem-dependent world, the impact of cyberattacks is exponentially amplified. There was the WannaCry cryptoworm, which you might have heard of, that exploited an operating system vulnerability and infected over 300,000 computers across 150 countries in a matter of days, bringing down businesses and disrupting work. And there was the Mirai malware, which was used to hijack over a hundred thousand IoT devices and then launch an attack on a provider of domain name (DNS) services.
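One small, concrete practice behind "securing the ecosystem" is verifying that an artifact received from a partner actually matches a digest pinned through a trusted channel before you use it. This is only a hedged sketch of that idea, not a full supply-chain solution (real pipelines add signing, provenance, and more); the artifact bytes here are invented.

```python
import hashlib
import hmac

def sha256_digest(data: bytes) -> str:
    """Hex SHA-256 digest of an artifact's contents."""
    return hashlib.sha256(data).hexdigest()

def verify_artifact(data: bytes, expected_digest: str) -> bool:
    """Check a received artifact against a pinned digest.

    hmac.compare_digest does a constant-time comparison, avoiding
    timing side channels when digests are checked.
    """
    return hmac.compare_digest(sha256_digest(data), expected_digest)

artifact = b"partner-supplied build artifact"
# In practice the pinned digest comes from an out-of-band trusted
# channel (a signed manifest, a release page); here we compute it inline.
pinned = sha256_digest(artifact)

print(verify_artifact(artifact, pinned))         # True: untampered
print(verify_artifact(artifact + b"x", pinned))  # False: contents changed
```

The design choice worth noting: trust is anchored in where the expected digest came from, not in the check itself, which is why the out-of-band channel matters.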
And as we've seen, hackers can spread fake news much faster than good news and real news have traditionally spread. So leading businesses are recognizing that just as they collaborate with their ecosystem partners on new products, new services, and new experiences, they also need to collaborate on security.

Yes, and I think for us it's really critical that we're building resilient and sustainable innovation, and that includes making sure we're building systems that are secure by design. I really love this point about making sure the whole ecosystem and the supply chain is actually secure, given how increasingly networked and connected our world is. I think it was the designer Bruce Mau who said something like, "everything is connected, so for better or worse, everything matters." To me that means you are only as strong, as transparent, and as secure as your weakest link in the chain, and you are as vulnerable as that link. We've seen regulators put liability not just on you as an individual, in terms of your cybersecurity and your secure environment, but also on your supply chain, and we're anticipating that trend as well.

Okay, our last trend, our fifth trend, is called "MyMarkets," and this is all about momentary markets. Eighty-five percent of executives agree that the integration of customization and real- or near-real-time delivery is going to create the next big wave of competitive advantage. So again: customization, and real- or near-real-time delivery. That's all about capturing moments, whether it's real-time views of operations, instant price quotes based on inventory or scheduling or pricing data, or the ability to immediately adjust
and respond based on the customer feedback you're getting, or the pop-up services we talked about. It's all about capturing that moment, and as we get better at capturing those moments, people and businesses are going to expect more convenience and immediacy from our technologies. Companies can capitalize on these moments by providing personally tailored products and services that go far beyond mere customization.

The example I'll share with you is what Carnival Cruise Lines is doing: they're transforming the entire cruise experience. Every guest will be getting what they call a Medallion, a wearable device that knows your preferences, knows where you are on the ship, and lets you make purchases. They're also adding IoT sensors, cameras, and analytics all around the ship, so that, for example, the system might identify that there's availability at an upcoming attraction your child might really be interested in, and it sends you a message offering you the opportunity to be part of that attraction. It's a very momentary opportunity: it's made available to you, and then it just goes away.

Given the criticality of these momentary markets, what I would say is that companies need to think about how to establish trust just in time, and how to establish responsibility, so that trust can be established and exchanged seamlessly and quickly. That necessarily means trust needs to be incorporated in the very early stages of design. For us at Accenture, these systems should incorporate the principles of trust so we deploy technology responsibly; I talked about trustworthy, reliable, understandable, secure, and teachable. Working through this beforehand really matters. In our global responsible AI survey, which we conducted last fall, twenty-four percent of respondents indicated that they had to undergo a complete overhaul of an AI system due to inconsistent results, a lack of transparency, and/or biased results.
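The Medallion example above can be caricatured in a few lines: when capacity opens at an attraction, match it against a guest's stated preferences and issue a short-lived offer. Everything here — the guest ids, preference sets, and the fifteen-minute window — is invented purely to illustrate the "momentary" shape of the interaction.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Offer:
    guest_id: str
    attraction: str
    expires_at: datetime  # the offer simply lapses after this moment

# Hypothetical preference data gathered with the guest's consent.
guest_prefs = {"guest-42": {"water-slide", "magic-show"}}

def make_offer(guest_id, attraction, open_slots, ttl_minutes=15):
    """Return a time-limited offer if capacity exists and the guest
    has expressed interest in this kind of attraction; else None."""
    if open_slots <= 0:
        return None
    if attraction not in guest_prefs.get(guest_id, set()):
        return None
    return Offer(guest_id, attraction,
                 datetime.now() + timedelta(minutes=ttl_minutes))

print(make_offer("guest-42", "magic-show", open_slots=3))  # an Offer
print(make_offer("guest-42", "rock-wall", open_slots=3))   # None: no match
```

The expiry field is the point: the offer is designed to disappear, which is exactly what distinguishes a momentary market from ordinary personalization.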
So what do we do, and what does deploying responsible technology look like? I've put together some points on this. I think it's really critical that we're building agile, multidisciplinary ecosystems: if you've got a group of individuals who look like you, act like you, and do the same things you're doing, it's very likely you're going to have some blind spots.

We think companies ought to be looking across the board; you may need to be engaging with human rights organizations for the first time, or with academia for the first time. Use good data hygiene: it's really important for people to understand the ways that bias can creep into systems. How do you build a governance strategy that anticipates downstream impacts? How do you create systems of constructive dissent and incorporate diverse perspectives? And how do you use this whole system to enable informed decision-making? There's no one tool. I wish I could say I've got a checklist, this great thing that's going to stamp you like organics and make you "ethics certified." There is no magic solution that will solve everything; instead, companies really need to invest in these governance strategies.

When we started this presentation, we said we wanted to apply what we learned in the last five years. In that time we've had this immense push forward; the old models of "disrupt or be disrupted" and "move fast and break things" were commonplace. At times that mindset served as a justification to innovate without considering the implications, without any desire to take compliance, trust, or ethics into perspective. If I may, for a moment, I'd point to the EU General Data Protection Regulation as a watershed moment, because it's being mirrored in California, for example. We are at a meaningful transition right now. I've already talked a little about the increasing level of activity we're seeing in the United States; in Europe and in other countries there is even more activity, and some of it is being demanded by the public. The next five years, we think, are about trust and responsibility and how we create sustainable innovation. But
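One concrete form the "data hygiene" point can take is a disparate-impact check like the four-fifths (80%) rule, comparing selection rates across groups in something like the CV-screening tool discussed earlier. The outcome data below is synthetic, and a real audit would go far beyond a single ratio, but it shows how simply bias can be surfaced once you look for it.

```python
def selection_rate(outcomes):
    """Fraction of candidates selected (1 = selected, 0 = rejected)."""
    return sum(outcomes) / len(outcomes)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of the lower selection rate to the higher one.

    Under the common four-fifths rule of thumb, values below ~0.8
    are a red flag for adverse impact and warrant investigation.
    """
    low, high = sorted((selection_rate(group_a), selection_rate(group_b)))
    return low / high if high else 1.0

# Synthetic screening outcomes for two groups of applicants.
women = [1, 0, 0, 0, 1, 0, 0, 0]   # 25% advanced
men   = [1, 1, 0, 1, 0, 1, 0, 1]   # 62.5% advanced

print(round(disparate_impact_ratio(women, men), 2))  # 0.4, well below 0.8
```

A check like this is exactly the kind of thing a governance process can run routinely; it doesn't fix bias, but it makes sure nobody can claim they didn't see it.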
in the end, we cannot, as we said at the beginning, go at this alone. Instead of separate businesses each trying to go their own way, we've really got to work together, collectively, to try to get this right, and pull in the different community interests and perspectives so that they are genuinely at the heart of the discussions happening today.

Okay, so we've given you a glimpse into the future, and we hope this presentation has provided you with some things to think about when it comes to defining and deploying responsible technology. If you'd like more information on our Technology Vision, you can visit our website at accenture.com/technologyvision, and you can also connect with both Deb and myself on Twitter and LinkedIn, so feel free to reach out with any questions or conversations you'd like to have. I'd like to end with the same question I started with: we've talked about how technology can be good or bad based on how it's deployed and implemented, so my question comes back to you: what is your role in influencing whether technology is good or bad? Thank you for your time, I hope you found this useful, and Deb and I will be here in case you have any questions. Thank you again, and enjoy the conference.

2019-04-11 18:14
