An Overview of Cloud AI (Cloud Next ‘19 UK)


Good morning, folks. I'm Levent Besik, director of product for the Cloud AI business and developer solutions. Super excited to be here today; thanks for coming and joining us here at ExCeL. Today I'd like to give you a quick overview of our Cloud AI product portfolio, and, more importantly, we're going to have three of our customers join me on stage to do a deeper dive into their use cases and talk about how they are using our products to add real value to their businesses. So let's get going.

I should probably say that your feedback is greatly appreciated, so please make sure to fill out your surveys after this.

A quick recap of our mission. Our mission as Cloud AI is to empower every enterprise to transform their business with artificial intelligence. We believe that machine learning is a fundamental new technology that is going to revolutionize everything in the enterprise, and we want to make sure it's accessible to every business in the world. To that end we've been very busy. We work on our products to make sure the portfolio is a comprehensive suite that meets all the needs of our customers, and just this year we launched 60-plus products and features in our portfolio, serving 26 verticals, because we believe those vertical, context-aware use cases and products are really important to add business value. On top of that, of course, we want to make sure we have a strong ecosystem of partners, so you never feel alone in your AI and ML journey. Last but not least, we have a deep and continuous connection to the data science community within and outside of Google through Kaggle, which today has more than 3.5 million members. We're really excited and proud of that.

All right. As I mentioned, customers come to us to use ML with varying levels of needs and requirements, and what

we find is that, in order to be a powerful platform, we need to serve each of these needs, and the way we do that is by providing different levels of abstraction in our products for anyone with a specific use case. At the very top we have our solutions. These AI solutions are basically plug and play, ready to go with our partners, so you can get immediate business value for your specific use case and your specific workflow without a deep investment of any kind in machine learning. One level below that are our APIs and AutoML, which are primarily geared towards our developer friends. Again, you don't have to be a machine learning expert, but you still get access to best-in-class tools and APIs, either with pre-trained APIs or with customizable tools like AutoML, to create a specific model for your needs. Underneath that is, of course, our powerful AI Platform. It's a robust and reliable platform, primarily for data scientists who want to create their own models, iterate on them, collaborate with their colleagues, and of course deploy them successfully. All of this is built on top of our purpose-built infrastructure, with GPUs and TPUs, to make sure you get the most efficient performance out of it. I'm going to go through these one by one, starting with our solutions, and I want to specifically touch on Contact Center AI. One thing we see quite often, especially from our B2C customers, is this really growing pain in their call centers, where they face a trade-off between operational efficiency and excellent customer service. Well, with the advances in language and conversational AI, that doesn't have to be the case anymore, and with Contact Center AI we basically want to solve the problem on three levels. First, with our Virtual Agent product, we want to contain your incoming calls and chat conversations as efficiently and as reliably as possible, getting to happier and quicker customer outcomes. If for some reason we cannot contain those calls, we want to make sure we empower your agents in real time with turn-by-turn directions and suggestions, mapping your enterprise's entire corpus of knowledge to their fingertips so they can quickly resolve the customer's issue, leading to better agent outcomes and better customer happiness. And finally, we want to give you tools so you can see what's happening in your call center with

our call center analytics tools, such as the topic modeler, so you can see what's happening, optimize for it, and feed that back into your call center. Now I'm going to quickly invite Akash Palmer to the stage, who's going to talk more about how they were able to use these technologies at Marks & Spencer for better outcomes. Akash.

Thank you. Morning, everyone. I'm Akash Palmer, an enterprise architect at Marks & Spencer. Established in 1884, Marks & Spencer is one of Britain's leading retailers, selling home, clothing, and food items in over 1,400 stores across 57 countries and through our 50 international websites. Marks & Spencer is undergoing a five-year transformation plan: everyone in the company is working towards restoring the basics, shaping the future, and making Marks & Spencer special again. So what does that mean for the customer care team I work with at Marks & Spencer? It means we have to provide the best customer experience to every caller, be it a colleague or a customer, and make sure that experience is memorable. One of the big challenges we had was how to effectively manage the 14 million calls that come into our contact center, stores, and head office. In the past these calls were managed by completely different platforms, each with its own IVR and its own routing and reporting solution. It was expensive, and the experience across them was very, very inconsistent: as a customer you think of it all as Marks & Spencer, but if you called the contact center you would have a completely different experience compared to what you'd get in the store. A very typical store experience, if you had called a store maybe 18 months to two years ago: you go to the website, you
pick up the phone number of your local store, you give it a call, and you get an IVR saying "press one for opening hours, dial an extension, press two," and after going through this IVR you probably press an option to speak to someone. The first person you get is somebody on the switchboard, who tries to send you to an extension in store. Most of the time it just rings and rings, because the colleague in store is busy dealing with a customer who has actually walked into the store. The call bounces back to the switchboard, the switchboard tries another extension, and this goes in a loop until either the customer gets frustrated and hangs up or, if you were lucky enough to get the call answered, you might have a query, say about buying something, and the store colleague then has to transfer the call to the contact center. A very, very inconsistent experience. The problem was very clear: an IVR where you have to press 1 or 2 cannot really capture the hundreds of reasons why a customer is calling us, and on top of that we were sending a lot of calls to stores which they couldn't even service. So we had to come up with a solution. When we thought about this, a very obvious and very clear solution was: why don't we just let the customer say, in their own words, why they are calling? We then understand what they are saying; if we find the intent, we try to self-serve using automation, and if we can't, we send them to the right place, first time. So here's what we came up with. We call this platform Ava, for Automated Voice Assistant. The fact that Ava is the name of our manager's granddaughter has got nothing to do with it, just to make sure. So how does this work? We moved the entry point of all our store and contact center calls onto the Twilio platform. It's a CPaaS platform, using which you can pretty much talk to any APIs you like. This

gave us complete control of our calls. So now what happens is a call comes in. We created an orchestration layer within our secure M&S network, so the call comes in and we take control of it. We ask the customer, "Please can you tell us why you're calling?" They leave an utterance, and the Google Speech API transcribes it for us. We take the text back and pass it to Dialogflow, and Dialogflow returns an intent. At that point we either do some automation or, if we can't, we have a destination where we can route the call, either to the contact center or to the stores. We also used the Text-to-Speech API, which was very useful for making it sound human-like and not too robotic, along with some slight integration with our existing Genesys and Mitel solutions in stores and the contact center. Overall it proved to be a big success. As you can see, the journey was very much simplified going forward. What happened here was we reduced calls to stores by 50 percent, because callers would either self-serve, drop out, or go to the contact center. We also replaced the DTMF IVR with natural language, and this allowed over a hundred colleagues, who were busy taking calls in the background, to be on the shop floor dealing with customers. This was a really great experience to go through, and one of the things we learned was not to get bogged down in making everything a hundred percent accurate; you need to let different AIs complement each other and work with each other. In our early days we were getting some transcriptions from the Speech API which were not what we expected, not word for word, and we had to train the model to make sense of them. So I've got a few examples of these transcriptions which, while not word for word, still
make sense within the M&S world. Have a guess and see whether you can work out what they mean. So, "Jamaican order": we were like, do we sell Jamaican food? What kind of Jamaican food do we sell? We're getting thousands of calls about it. It was actually customers calling to "make an order," so we just mapped that into the

"Place Order" intent; it goes to the contact center, and the next time any "Jamaican order" comes in, it goes to the contact center. We started to get calls about Brad Pitt, and we were like, we've really won the Oscar here, Brad Pitt is calling us. It turned out "Brad Pitt" was actually calls about bra fit, which is a big business for M&S, and we had to send those calls to the bra fit section in store. So next time Brad Pitt calls up, you know where he is going to go. This one here: I did say we want to provide the best experience, but we have to draw a line somewhere. This was a colleague calling to find out about their hours, and they had to speak to the ops manager to find out when they were working. So you have to draw the line, and you have to play with these. It's not the old world where you have to be accurate about everything; just let the different AIs work together, and as long as you get what you want out of it, it is perfect. So rapid experimentation is the best advice I can give. Some quick stats around it. This proved to be very, very successful. Initially we saw customers calling in and lots of calls where they were just hanging up, and we were like, why are we getting these abandoned calls? Then we saw those customers calling back again, and we worked out that customers were not used to saying something as soon as they call. They were waiting for the "press 1, press 2, and then press something," and suddenly they have to speak, and they go, "OK, I don't know what to do," hang up, and call again: "Oh yeah, I want this, I want that." So over time, now for 92 percent of the calls that come in, the customer leaves a good utterance for us; the transcription rate using the Google Speech API is really, really good. We no longer worry about getting it 100 percent correct either; we just train the model and take it from there. The
intent model has matured over time, so now almost 92 percent of calls are assigned the right intent. In the contact centers we got rid of "reason for contact," because we just put the intent down as the reason for contact; that is exactly what the customer called about, and we know it in their own words.
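The flow Akash describes (Twilio hands the caller's utterance to the Speech API, the transcript goes to Dialogflow, and the returned intent decides where the call lands) can be sketched roughly as below. This is a minimal illustration, not M&S's actual code: the intent names, destinations, and confidence threshold are invented assumptions, and the Dialogflow call uses the standard `google-cloud-dialogflow` client.

```python
# Sketch of an Ava-style call flow: transcript -> Dialogflow intent -> destination.
# All intent names and destinations below are illustrative, not M&S's real setup.

DEFAULT_DESTINATION = "contact_center"  # fallback when no intent matches confidently

# Map detected intents to where the call should be sent "first time".
INTENT_ROUTES = {
    "place_order": "contact_center",
    "opening_hours": "self_service",      # answered by automation, no human needed
    "bra_fit": "store_bra_fit_desk",
}

def route_for_intent(intent_name: str, confidence: float, threshold: float = 0.6) -> str:
    """Pick a destination for a call given Dialogflow's intent and confidence."""
    if confidence < threshold:
        return DEFAULT_DESTINATION
    return INTENT_ROUTES.get(intent_name, DEFAULT_DESTINATION)

def detect_intent(project_id: str, session_id: str, transcript: str):
    """Send the caller's transcribed utterance to Dialogflow and return
    (intent display name, confidence). Requires google-cloud-dialogflow and
    application credentials, so the import is kept local to the function."""
    from google.cloud import dialogflow  # pip install google-cloud-dialogflow
    client = dialogflow.SessionsClient()
    session = client.session_path(project_id, session_id)
    text_input = dialogflow.TextInput(text=transcript, language_code="en-GB")
    query_input = dialogflow.QueryInput(text=text_input)
    response = client.detect_intent(request={"session": session, "query_input": query_input})
    result = response.query_result
    return result.intent.display_name, result.intent_detection_confidence

print(route_for_intent("bra_fit", 0.92))     # -> store_bra_fit_desk
print(route_for_intent("bra_fit", 0.31))     # low confidence -> contact_center
```

The point of keeping the routing table separate from the Dialogflow call is that, as in Ava, the "brain" can then serve any channel: whatever produces a transcript (voice, SMS, chat) feeds the same intent-to-destination logic.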

Overall, a very positive experience for the customers. As for future use cases: as you can see, we created a platform which not only solved a problem but also opened up opportunities for the future. Now we know what the customer is calling us about, and we understand it; what we have to do next is solve the problem for the customers, and we want to do that through automation and personalized self-service. We also want to create an omni-channel experience, where the customer can call in or get in touch with us using any channel they like, because now we have the "brain," as I call it: you can push not just voice through it but SMS, chat, WhatsApp; any channel can go in and come out with an actionable intent. Overall, I wish Google had come up with Contact Center AI 18 months ago, because that would have made our lives a lot simpler. We are looking forward to experimenting with the new Cloud AI features to make every contact our customers have with M&S truly special. Thank you very much.

Thank you. All right, it was great to hear their success story. Next up I want to talk a little bit about our developer building blocks, and here I'm referring, of course, to our pre-trained APIs as well as AutoML. Our goal here is very simple: we want to provide a comprehensive set of capabilities across four major domains (sight, language, conversation, and structured data) to developers, so they can easily use the latest and greatest in ML development and technology in their applications and services and add that layer of intelligence for their purposes. We do that primarily through either our pre-trained APIs, which are simple APIs you can just call from your applications or services, no ML knowledge needed, using Google's data to train those APIs for common use cases, or, if
you have a custom, specific need that requires a custom machine learning model, we of course have our AutoML suite of products, which allows you to customize these APIs with your own data. The great thing about these capabilities is that, as Google, once we release them we don't forget about them; we continuously drive new improvements into those APIs. The Speech API, for instance, keeps getting better and better; you don't have to do anything, we just retrain our models and push those accuracies up over time. And we of course continue to release new products. As part of that, I'm actually very thrilled to announce here today that, as of today, we have three new products in general availability for your use: AutoML Vision, AutoML Translation, and the Cloud Translation API. I can't wait to see what you will do with those APIs and AutoML capabilities. Now, one thing I want to double-click into a little deeper is AutoML Natural Language, since we see a lot of excitement around this feature. Basically, AutoML Natural Language allows you to create your own custom machine learning model for classification, for sentiment analysis, and for entity extraction, and the experience is pretty smooth. It's primarily a UI-driven experience, although we do have APIs if you prefer APIs; we don't discriminate. In our UI all you have to do is upload your documents, your text snippets. AutoML on the backend trains multiple models, drawing on some of the latest technology available to us (multiple architecture types, feature engineering, and whatnot) and picks the best one. You get to evaluate which one you like the most, or you can just deploy the best one AutoML picked, and you can deploy it immediately behind a REST API that scales elastically. All of this is done through a UI-driven experience, with no deep knowledge of ML needed.
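As a rough sketch of that "deploy behind a REST API" step, here is what calling a deployed AutoML Natural Language model can look like with the `google-cloud-automl` Python client. The project ID, model ID, region, and label names are placeholder assumptions, and the thresholding helper is illustrative post-processing, not part of the product.

```python
def classify_snippet(project_id: str, model_id: str, text: str):
    """Send a text snippet to a deployed AutoML Natural Language model and
    return (label, score) pairs. Needs google-cloud-automl plus credentials,
    so the import is kept local to the function."""
    from google.cloud import automl  # pip install google-cloud-automl
    client = automl.PredictionServiceClient()
    # "us-central1" is an assumption; use whatever region the model lives in.
    name = automl.AutoMlClient.model_path(project_id, "us-central1", model_id)
    payload = {"text_snippet": {"content": text, "mime_type": "text/plain"}}
    response = client.predict(request={"name": name, "payload": payload})
    return [(p.display_name, p.classification.score) for p in response.payload]

def top_label(predictions, threshold=0.5):
    """Pick the best-scoring label, or None if nothing clears the threshold."""
    best = max(predictions, key=lambda p: p[1], default=None)
    if best is None or best[1] < threshold:
        return None
    return best[0]

# Local demonstration of the post-processing step (labels and scores invented):
print(top_label([("complaint", 0.12), ("delivery_query", 0.81)]))  # -> delivery_query
```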
So we're of course really excited about this, but I actually want to now invite Dan Gilbert up on stage, who's going to talk a little bit more about how they use these capabilities at News UK, and their use cases. Dan.

Thank you, Levent. My name is Dan, and I head up the team of machine learning engineers and data scientists at News UK. We are a media publisher; most of you have probably heard of us. We publish titles like The Times, The Sun, The Sunday Times, and the TLS. We also include the Wireless radio

group, which includes stations such as Virgin Radio. Much like Akash's Marks & Spencer, The Times is an old publication; it's been around for more than 200 years, and one of our roles is to guide the world, and the UK, through an increasingly confusing time in terms of politics and news and what's happening. Increasingly, a lot of the content we produce (and we produce a lot of content; it's always tricky to put your finger on a number, but somewhere between a quarter of a million and half a million pieces per year across our titles) is produced, served, and reaches its audience online rather than through print. As such, we've gone through a digital transformation, and much of that digital transformation has been accompanied by machine learning. Machine learning, on the rise since ten or so years ago, has helped us build better knowledge about our customers: how they behave, which features of the product they enjoy, how to improve the customer experience. But despite that, a lot of the benefit we've gotten has been through big data analytics on things like BigQuery; many of the advances from deep learning, in areas like image recognition, have not really helped transform our business. I think the real exception is natural language processing. Much as deep learning and image classification went through a moment of massive change seven or eight years ago, we're really in a golden age of natural language processing in terms of the ability to use machines and data to understand the semantic meaning of text, and as a news publisher we produce a lot of text, so this is incredibly interesting to us. We use it for a range of applications, from metadata,
which I'll talk about today, through to things like automating fact-checking within the newsroom to assist journalists and editors. Metadata is almost the baseline, core piece of NLP you tend to want to apply when you're in a content-rich business. So we have this article here: it's about economics, it's about UK politics, it's about Brexit. That metadata is absolutely critical to our success, particularly in a digital age. It helps us build better and new user experiences: when you have between a quarter and half a million pieces of content over the course of a year, that adds up to millions of pieces of content in your archive, and the ability of a user who is short on time to navigate to the content that is meaningful to them is really aided by our ability to put it into collections of topics, based on the metadata that sits behind the scenes.

Perhaps most importantly, it helps us understand the impact of our publishing decisions. In an age of free news, where you can read your news anywhere for free using search engines and other means, our ability to understand what is unique to us, what our customers value, and what drives engagement is ever more important. Through doing so we can reduce churn, by producing content that our readers engage with and value more, and we can create more relevant advertising. Metadata is not new; we've had metadata tagging along on content for years. What's really changed is the ability to put that metadata within the same infrastructure as the rest of our data, to drive better decisions and better user experiences. We've been using Google BigQuery for approximately six years now, and BigQuery is the home of our customer data (the people who pay for and subscribe to The Times), our clickstream (the behavioral analytics of what happens on our websites and our apps), and, most recently, the actual content itself: not just the list of content we produce, but the full body of the text, the links, and all the other information. When you're trying to deal with millions of pieces of content and understand what is working and what is not, the metadata is absolutely vital; you can't just rely on the headlines and things like that. So a really transformative moment was bringing the metadata into the same data architecture where this other data sits. Traditionally, keeping your content data in the same database or data warehouse where your customer data lives would just have been impossible, but BigQuery is an incredibly flexible tool that allows you to do that. And we use Google's AI APIs
Is to, derive, a lot of this metadata and, i'll talk about a couple of them today so we'll send the content, that lives at rest with in bigquery off to. The, google ml api's but. Then send that back into bigquery itself, so we have this kind of rounded view of the customer the content, and the behaviour with our digital products, so. Going back to that. This article again so we know it's about kind of. Bricks. It's of bricks it is an entity. It's. Kind of merged, into it kind of morphed over time into a topic but it starts out as an entity and things. Like UK politics, economics. That is a topic so, we're using Google's NLP API is to kind of systematically, when we produce new content send, it off to the api's take, it back we add a bit of kind of business logic and processing, on top before surfacing, it on the front end of the website but, within bigquery, we retain all of the kind of the raw data from the API for.

analytics and data scientists to use. But you can only get so far with these out-of-the-box entity and topic descriptions. As you can probably imagine, we've written hundreds of thousands of articles about UK politics over the years, and thousands of articles about Brexit, so how do you distinguish between one article about that topic and another article about the same topic, and which types of content really drive user engagement and help users understand the value of our product? That's where custom metadata comes in, and this is a very recent addition to our data architecture. Take another article like this one. Earlier in the year we employed a team of very skilled interns to look at a sample of our content from the previous year and tag it, not based on topics and entities, but using a much richer classification system developed by a company called KITT in Sweden. What that led to is, for a given article like this, the ability to get a much deeper understanding of the nature and tonality of the content. When you run analytics on top of this, it gives you a much more nuanced view of the type of content that works well and doesn't work well with our audiences. Rather than being restricted to a broad topic (in this case, music and books), you can get a sense of whether, say, a field-reporting style or an interviewing style performs better with certain audiences than other types. But obviously, doing that process once, using some very skilled interns, is not a sustainable, ongoing process, and we were scratching our heads for a while on how to scale it out. That's where AutoML entered the picture. AutoML is part of Google's suite of ML
API products, sitting alongside the Natural Language API. Whilst we have some very talented data scientists on the team with deep expertise in natural language processing, what this allowed was for someone who didn't necessarily have that background to upload the data on the content that had been manually labeled by people within the newsroom, and then train models to see whether we could systematically predict which label new content should have. Here you see one of ours. I think overall there were 10 to 20 new custom metadata types, within each of which you'd have ten or so individual values, so you'd have many dozens of potential metadata tags you'd want to assign. In this case, we uploaded the content with the tags, and then, without having a deep understanding of how NLP approaches work, you can look at the performance of the models. In a case like this, for things like whether the article was deemed authoritative or objective, the model performed really well. What this means is that we can systematically tag new content as it comes through and is produced, without having someone manually label it. As with all these things, there are nuances: if you're expecting the machine to tell you whether something is funny or not, it is pretty bad; it has about a four percent accuracy rate in determining whether a piece of content is humorous. So it's not a panacea for everything, but it massively advances our ability to apply this much more nuanced metadata to our content on a systematic basis.
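The round trip Dan described earlier (content out of BigQuery, through the Natural Language API, raw results written back) can be sketched along these lines. The table name, schema, and article fields are assumptions for illustration; the client calls are the standard `google-cloud-language` and `google-cloud-bigquery` APIs.

```python
def entity_rows(article_id: str, entities) -> list:
    """Flatten entity results into rows matching an assumed BigQuery schema
    (article_id, entity, entity_type, salience)."""
    return [
        {
            "article_id": article_id,
            "entity": e["name"],
            "entity_type": e["type"],
            "salience": e["salience"],
        }
        for e in entities
    ]

def tag_article(article_id: str, text: str, table: str) -> None:
    """Send one article's text to the Natural Language API and stream the raw
    entity results into BigQuery. Requires google-cloud-language,
    google-cloud-bigquery, and credentials, so imports stay local."""
    from google.cloud import bigquery, language_v1
    lang = language_v1.LanguageServiceClient()
    doc = language_v1.Document(content=text, type_=language_v1.Document.Type.PLAIN_TEXT)
    resp = lang.analyze_entities(request={"document": doc})
    rows = entity_rows(
        article_id,
        [{"name": e.name, "type": e.type_.name, "salience": e.salience} for e in resp.entities],
    )
    # Table name is a placeholder, e.g. "project.dataset.article_entities".
    bigquery.Client().insert_rows_json(table, rows)

# Local demonstration of the flattening step (values are invented):
print(entity_rows("a1", [{"name": "Brexit", "type": "EVENT", "salience": 0.9}]))
```

Keeping the raw API output in its own table, as Dan notes, leaves the "bit of business logic on top" as a separate, revisable layer rather than baking it into the pipeline.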

So we used AutoML text classification, which really put some of the latest capabilities in natural language processing into the hands of our analysts. The commercial benefit is really being able to identify exactly the type of content, and the parts of our journalism, that drive the most engagement, supporting acquisition and retention: making learnings around what type of content works well in home news and within business, understanding the breadth of content that interests people in things like world news, and understanding that it's actually local, more nuanced reporting, rather than informative pieces, that works better. Deriving metadata that would have been very difficult with traditional approaches starts to become possible with this new approach. The technical benefit is that applying this custom metadata works like a layered cake: AutoML on top of the more fundamental entities and topics from the NLP API, and these NLP capabilities put into the hands of an analyst. And the real benefit, often to the skepticism of both myself and people on the team, is the ability to label with a relatively small amount of input data. Traditionally, when you're trying to classify using NLP techniques, you may need thousands of examples of a particular label to assign it to a piece of content; in this case we've had success with as few as a hundred or so positive labels for a particular type of metadata. When you're in that world, it reduces the time to do this by hundreds and hundreds of man-hours, and what that allows is for the people with NLP expertise in particular to focus on other new, more difficult, challenging problems, such as fact-checking in the newsroom.

Thank you. I'll hand back to Levent.

Thank you, I think that's fascinating. You've got to watch out for this little gap in the stage here; it's easy to fall. Great. Moving on, last but not least, I'm going to spend a few minutes on our AI Platform. With our AI Platform, our goal is simple: we want to provide a robust, reliable, powerful platform for data scientists to create their own models from scratch, to iterate on those models, and to do their data science workflows effectively and efficiently. As part of that, we offer this end-to-end, code-based development environment specifically for AI inside of GCP. It's an integrated tool chain, from data labeling to built-in algorithms to training and prediction, and not only is it integrated within itself, it's also seamlessly integrated with the rest of the GCP products, such as Pub/Sub, Dataflow, and of course BigQuery. Again, our goal is to make it as simple as possible for data scientists to create the great models out there. Now, one thing I want to briefly touch on is the need for explainable AI. This has been a rising need, especially for our customers in healthcare and in financial services, for many reasons. First, of course, with the rise of deep learning and its multiple layers of neural networks, it is harder to understand why a model is doing what it's doing and why a prediction is happening. The first beneficiaries are the data scientists themselves, who develop a deeper understanding of their models so they can better debug them and better understand why the model is doing what it's doing. But it doesn't stop there; it naturally also helps you communicate the value of that model to other parties within and outside your company. Most importantly, it helps you build user trust, especially
if you're predicting things like credit scores or cancer diagnoses. You want to know how the model works, and of course your customers and users want to be able to trust that model. To that end, as Google,

we're trying to build multiple tools in this space, and we already have a few things in place, but there's more exciting news to come, so stay tuned for that. All right, now let me invite Jacob Eggers to the stage, who's going to talk a little bit more about how they've been using our Cloud AI Platform at Bayer to help with specific use cases in diagnostics. Jacob.

Okay, thank you very much. Wow, a lot of people. Welcome. I'm going to take you through our journey in the use of data in healthcare. I was asked to speak about AI in healthcare, and I thought, that's huge, AI in healthcare. So: I sit in Bayer, which is a pharma company; you probably know it as the aspirin company. And I want to take you through our journey through data and AI. First, one of the things Bayer does is radiology, and that sort of got us into this whole arena and field, because it's really big data; a lot of it is 3D images, which I'll show you coming up, as well as EMR data, which is the metadata. As we move forward here: we have a system called Radimetrics, which is our platform that allows us to connect to many different types of data in hospitals, and it's very disparate data, because some of it is metadata, like your EMR, your electronic medical record, and other data are images. If you've ever had a CT scan... how many people have had a CT or an MRI? A few hands showing up. Those datasets are often 512 by 512 by 512; they're 3D datasets. It's a lot of data, and through our platform we've currently had about, give or take a few million, 26 million patients come through it, so you're talking about petabytes of data. So as we move forward on this, and I've already mentioned this to you: one, there's on-prem data that we have to digest and understand in order to apply our models; there's
There's the integrity of the data and workflow: different hospitals have different systems, so we have to integrate with a lot of different, non-standardized systems. Data access can be an issue because of those non-standardized systems, and obviously data security is a big piece of what we do, because with healthcare data everyone is worried about where their data is, and they want to make sure it's secured.

Let me take you through our journey here. When we wanted to develop an intelligent data lake, we used a combination of work on Google Cloud with Quantify, and this is how we bring our data sources in: that's our ingestion layer. Then we have a de-identification layer. Google has a Healthcare API that allows you not only to ingest different data types but also to work on de-identifying that data. Some of the data, such as the imaging data, we de-identify through our own proprietary methods. Some of the other data can be very difficult to de-identify, like doctors' notes: they can write "Mr. and Mrs. So-and-so's husband" or whatever right in the notes, and that's personal information.
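To make the de-identification problem concrete, here is a deliberately simplified sketch. The field names and the regex are invented for illustration; real pipelines, such as the de-identification features of the Cloud Healthcare API mentioned above, are far more thorough than a single pattern:

```python
import re

# Hypothetical illustration: strip identifying fields from a record's
# metadata, and redact title-plus-surname patterns from free-text notes.
IDENTIFYING_FIELDS = {"PatientName", "PatientID", "PatientBirthDate"}

def deidentify_metadata(record):
    """Return a copy of the metadata dict with identifying fields removed."""
    return {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}

def redact_notes(text):
    """Redact 'Mr/Mrs/Ms/Dr <Name>' patterns from free-text notes."""
    return re.sub(r"\b(Mr|Mrs|Ms|Dr)\.?\s+[A-Z][a-z]+", "[REDACTED]", text)

record = {"PatientName": "Doe^Jane", "PatientID": "12345", "Modality": "CT"}
note = "Discussed results with Mrs Doe and her husband."

print(deidentify_metadata(record))   # {'Modality': 'CT'}
print(redact_notes(note))            # Discussed results with [REDACTED] and her husband.
```

The point of the sketch is the asymmetry the speaker describes: structured metadata is easy to scrub by field name, while free text needs pattern matching (or, in practice, NLP) and is where de-identification gets hard.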

You need to remove that from the data itself, which can be difficult. And then from there, there's our data mart, which we can use to build inference engines and make diagnoses.

As we move forwards, one of the big pieces of work in any sort of machine learning — unless it's unsupervised, and we're not talking about unsupervised learning here — is the annotation, or tagging, as you all know. Tagging data in healthcare is a little different. If you're working on an algorithm for a self-driving car, almost any of us can find the stop signs, whether they're on the right or left side of the road depending on whether you're in the US or England, and at the same time you can find the people and the cars. In healthcare it's a little more difficult: you need experts to annotate these things. We have a whole platform that allows us to annotate in 3D; otherwise we'd take forever. If you had to do 512 images for just one patient, one CT scan, you'd be there forever. When our data comes in, if we look at, say, 50,000 patients, that may turn out to be two, three, four million images, so you really do need to annotate in 3D. We have technologists annotate first for anatomy, then we have radiologists annotate for the abnormalities, and then we over-read. Quality is a huge issue, and it's not so much quality meaning people annotated it wrong. It's not like a stop sign, where I think all of us would agree it either is or isn't a stop sign. In healthcare, there's about a 20% discrepancy between what one doctor, or one radiologist, says and another. So there are discrepancies in the data because there are different opinions, and all of these are challenges.

Here's a quick insight into our annotation platform. There are semi-automatic tools — I'm showing a picture here of the chest and the heart — that allow you to segment the ventricles and then correct any of that by hand with 3D painting tools. And then there are clipping tools: you're not using scissors along a line here, you're using planes to cut things. So that's how we actually annotate the data.

There are some challenges, though, and some of them I've already mentioned. I don't want to take a long time here; we're not trying to turn everyone into a data scientist in the next half hour — that takes maybe 45 minutes. In healthcare, the data we're looking at from imaging is all 3D, and we're also looking at the EMR data, which is obviously not imaging data. One of the problems is that people come in all different sizes: you have pediatric patients, tall adults, short adults, so you have many different-sized datasets; it's not standardized. And on top of that, the scans can come in all different resolutions. Each hospital does its own thing, each doctor may do their own thing, so you have to deal with all these differences between disparate datasets.

So what do you do about that? Well, some of these datasets are so large they can't be analyzed in one inference, so you have to divide them up, for example with Kubeflow — I'll show you examples. But you also have to make — and this is where data science comes in, and where physicians come in — some common-sense decisions about resolution.
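That roughly 20% reader-to-reader discrepancy can be quantified. One common way to do it (not necessarily Bayer's) is the Dice overlap between two annotators' segmentation masks; here voxels are represented as sets of indices rather than full 512-cubed arrays, purely for brevity:

```python
# Hypothetical illustration of inter-annotator agreement on a segmentation
# mask, measured with the Dice coefficient: 2|A ∩ B| / (|A| + |B|).
def dice(a, b):
    """Dice overlap between two voxel sets; 1.0 means perfect agreement."""
    if not a and not b:
        return 1.0
    return 2 * len(a & b) / (len(a) + len(b))

# Two radiologists outline the same small lesion slightly differently.
reader1 = {(10, 10, 5), (10, 11, 5), (11, 10, 5), (11, 11, 5)}
reader2 = {(10, 10, 5), (10, 11, 5), (11, 10, 5), (12, 12, 5)}

print(dice(reader1, reader2))  # 0.75 -> 25% disagreement on this lesion
```

A score like 0.75 on the same lesion is exactly the "different opinions, not wrong answers" situation the speaker describes, and it is why the over-read step exists.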

For some things, you need full resolution, meaning the original 512 by 512 by 512 dataset; for other things, you can condense the dataset down, resample it to, say, 128 cubed, and you have to know whether your problem allows you to do that. Let me show you an example. In this example we're using a 3D U-Net, as you can see on the left side; on the right we're looking at a stent, a cardiac stent, and you can see all the little dots of the wires in the stent. At one resolution you can easily make out that there are wires in the stent; as the resolution decreases, it turns into something like a donut — you may have had one this morning. Your problem may be good enough just knowing it's a donut; other times we need to know exactly where the wires are in the stent. If you just wanted to measure the diameter of the vessel — the dark part in the middle of the donut, the donut hole — then the low resolution may be enough. You have to understand your problem before you can decide what tools to use; that's my point here.

If you need the full resolution — whoops, I think we went two slides ahead there — we do use Kubeflow, and that allows you to do your inference on each of the cubes at full resolution rather than resampling to a lower resolution. On the side here, I'm showing that sometimes we use SqueezeNet, on the upper image, and on the lower image that's DenseNet; we've used both to make inferences using the Kubeflow engine.

The real challenge is, once you've implemented this for each of the cubes, how do you put all the cubes together? It's like a flock of birds: do you need more than two cubes to be positive, or three cubes, in order to say that your ML engine should return an inference, or a classifier result, as positive? These are some of the questions that come up; the tools will need to be built in the future, and that's some of what we work on. So let's take an example — I think I skipped ahead again — a case here: lung disease classification.

When we're working on classifying diseases in the lung, we need to leverage the AI workspace. We develop radiology solutions for this that use both the EMR data and the imaging data, and then we try to classify the pulmonary abnormalities. As you step forwards, you have your image ingestion as well as your other data ingestion, then pre-processing, then training, then patient-level aggregation — that's where you have to put the cubes together — and finally a classification result. Let me show you a slightly better diagram of this; it's one of the key slides in how we approach things. You bring in disparate data, and we have a supervisory AI bot, or an inference manager, that manages multiple deep-learning algorithms, each of which looks at different features. The inference engine can then decide which AI to run — which of the different modules — and make a differential diagnosis at the end. And you can have more modules: I'm showing a cardiac, a pulmonary, and a vascular module, and you could add other modules for cancer, for breast analysis, and so forth as the system grows. So it's very scalable.

I just wanted to show you a few results. After you've annotated, or tagged, your data, you may want to do a segmentation, which is a little different from what you heard about earlier — a lot of that was classification or NLP. Segmentation, and this is an example of the heart, allows you to look at the ventricles individually, the atria, the aorta, the pulmonary arteries, in order to be able to make diagnoses of things like left heart failure, right heart failure, and so forth. This is one of those engines, built using a 3D U-Net.
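As one example of what that heart segmentation enables downstream, chamber volumes fall out of the masks by simple voxel counting, and ejection fraction follows from the end-diastolic and end-systolic volumes. The voxel counts and spacing below are invented for illustration, not clinical values from the talk:

```python
# Hypothetical post-processing sketch: turn ventricle segmentation masks
# into a functional measurement (ejection fraction).
def volume_ml(voxel_count, voxel_mm3):
    """Chamber volume in millilitres from a mask's voxel count (1 mL = 1000 mm^3)."""
    return voxel_count * voxel_mm3 / 1000.0

def ejection_fraction(edv_ml, esv_ml):
    """EF = (EDV - ESV) / EDV: the fraction of blood ejected per beat."""
    return (edv_ml - esv_ml) / edv_ml

edv = volume_ml(120_000, 1.0)   # end-diastolic mask: 120,000 voxels -> 120 mL
esv = volume_ml(50_000, 1.0)    # end-systolic mask:  50,000 voxels ->  50 mL
print(f"EF = {ejection_fraction(edv, esv):.0%}")  # EF = 58%
```

A number like this, derived purely from the segmentation output, is the kind of quantity that feeds the left- or right-heart-failure calls mentioned above.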
The other point I want to bring out — we're here about AI and deep learning, and we do a lot with it — is that at least some percentage still requires other algorithms. We still do a little bit of image processing on the back end in order to reanalyze these inference outputs. You saw the segmentation masks; we can then fit spheres, or in this case ellipsoids, to the ventricles or to the septum, to tell you more about how your heart is functioning.

What I want to end with is: what is the future? Imagining the future of AI in healthcare is very big and very broad, and there are a lot of people working in it. So imagine a deep-learning system where you did have enough data to build whatever you felt like. What if we could make a diagnosis as simple as a Google search, or we could eliminate diagnostic errors? From US data, we know we spend over three trillion dollars on healthcare, but almost a third of it, we believe, is wasted. These are big things we can bring to improve healthcare for every individual, not only in this room but throughout the world, and that's really where we dream about using the different tools we talked about today. So thank you very much.

Thank you so much. Thank you. All right, so today you heard from three of our customers about their use cases, and you heard a quick overview of our Cloud AI product portfolio. We have many other customers in EMEA specifically using Cloud AI, some of whom are actually here today giving sessions.

Make sure to check them out. I also want to do a quick shout-out to our growing ecosystem of partners. The AI journey can be simple, or it can be difficult, and we want to make sure there's this growing ecosystem of partners so that you never feel alone in the journey and you get the help you need, whether it's with integration or other aspects of bringing AI into your workflows. With that, I want to thank you. I hope you enjoyed our quick overview and our customer stories. I can't wait to see you use our tools and hopefully transform your business with AI. Thank you.
