Good afternoon, ladies and gentlemen. My name is Amogh Umbarkar, and I have here with me my colleague, Balaji Krishna. Since we are sitting in a movie theater, we will walk you through a movie of SAP's data management journey, and also how we are trying to bring the best of both worlds, from SAP and Google, into an offering for you.

A brief agenda of how we are going to walk you through this: what we see in enterprise data landscapes; what SAP Data Hub, the product we will walk you through, is; how we are working and collaborating with Google Cloud; and a few customer examples. And of course we want to keep it interactive, so I also look forward to questions from you. So let's go.

As you can see here, there is an evolution of enterprise intelligence on one axis against data diversity on the other. You all know that it started with OLTP, where transaction processing was the first step into the enterprise world; SAP started its journey from there as well. Then we saw a push towards enterprise BI and reporting, and that's when the OLAP world came into being. As we move towards more heterogeneous landscapes and more diverse data, we are seeing multi-faceted data management requirements, and technologies like machine learning, AI, and IoT being used as well. That is where it becomes more and more interesting: how can we address all of these data management challenges, now and in the future?

Just to bring a little context as to why and how data is changing the world: here is an example where, last year, a judge in Ohio ruled that data from a pacemaker could be used in a fire insurance investigation. You might also have seen the story of the father who reached out to the marketing team at Target when they were sending his household commercials related to pregnancy, and who later found out that their predictions were not wrong; they simply had more data points on his household than he did.

In terms of movies, we have seen data science teams collecting more and more data, for example on which scenes create which effect, when it gets exciting or when it gets scary, and bringing that insight into their next productions. From Spotify, we have seen data used to grow the ad business; after Brexit they sent out "end of the world" kinds of ads, and you can see on screen what they did around Valentine's Day. And of course in Japan, the land of earthquakes, we have seen that even apps on the iPhone are being used to gather data and then predict what can be done, or what the intensity of an event is.

But coming back to the corporate world, what we are seeing is a picture like this: there are a lot of silos, and data is separated. Mergers and acquisitions add another challenge, and the open-source world of big data and Hadoop adds further complexity. It becomes a real challenge for teams to avoid duplication of data, avoid unnecessary movement of data, and bring all these silos together for a single version of truth.
We are also seeing an explosion. Just to give you context: the amount of data created in the last 5,000 years was already exceeded in 2016-2017 alone, and by 2025 the volume is predicted to become humongous, an expected 163 zettabytes. Manually analyzing this data is simply out of the question; that's why we want to automate as much as possible and bring value from the data, to make sense of it and gain intelligent insights.

When this volume of data comes in, understanding it and finding patterns is equally important. What you see on screen is a picture where you can add context, some water, say, and you can see that maybe it's a seagull, once you build some context around the data and give it meaning.
But at the same time, if you add a different context, you are able to see a different picture and bring out a different meaning. Of course a skilled veterinarian can differentiate between the two, and that is exactly the kind of job that data management, data governance, and data science roles are facing today.

With this in mind, here is what we did on the SAP side. We gathered feedback from customers and analysts, and together with a study from KPMG we realized that there are at least five principles to think about on the data side. First, data visibility: almost three quarters of customers said this is definitely complex and limits their agility. Data quality is a big concern; most CEOs say this is what they are basing their decisions on. For innovation, they say they are not getting the most out of their data. And if you ask them how much of their data is relevant for their decisions, they are not even able to give rough percentages; this is a big problem that we sensed, and one where we could bring out some offerings. TCO is a factor. And compliance: we had GDPR coming in Europe, and we have the California privacy regulation, so all of these will definitely shape the challenges for our customers.

That is where SAP came out with a different thought process. We wanted to see how we can address all of these challenges together: how we can build a kind of 360-degree view across enterprise data, which can include the SAP world as well as the open-source world; how we can seamlessly integrate data quality capabilities; and how we can give customers all the tools available today to drive their innovation initiatives in a very agile manner. At the same time, we wanted to avoid data redundancy and the high cost of data, so we wanted to optimize with decentralized data processing as well. And of course, in the end, to give them confidence and get them compliant with their data sets.

So what SAP has come out with is its new offering, launched late last year: SAP Data Hub. We are trying to help our customers become more and more data-driven. What you see here on the left-hand side is the SAP world, with SAP HANA as the core of the digital platform, and on the other side the open-source world. We are trying to help everyone, from data discovery and metadata governance, to orchestration, to very agile data pipelining capabilities; you will see this in the demo later. At the same time we are trying to expand the reach of our connectivity, and also reach out to the cloud infrastructure providers. Google is a very close partner, so we are working with the Google Cloud team as well.
In terms of an architecture view, here is what you see. The layer below addresses all the data storage: it can be on premises, in a Hadoop store, in an object store, or in a cloud store as well. On top of that sits the SAP Data Hub runtime. That is where we use Kubernetes; you heard a lot from Diane Greene today about Kubernetes, and you will see how we integrate with it seamlessly to give you that elasticity. Then we connect into all the SAP applications, and we give you REST APIs to connect to any other application as well. On top, you see how we bring in capabilities for metadata cataloging and data discovery, how we give you a one-stop shop to monitor, schedule, and administer the whole landscape, and you will also see in the demo how easily you can build data pipelines and define how and where the data should flow.

In terms of flow-based big data management, for example, here is how it works. There can be multiple sources, which is where the heterogeneity comes in. You can have raw data that is increasingly important for your enterprise decisions, you can bring stream data in and combine it with your master data, and you can have your data provisioning done in a distributed environment. To do all this, you start with ingestion, the collection of the data; then you refine the data, bring in structure (especially for raw data), build a model, and expose it for consumption. When you deploy a pipeline, you can automate and reuse it; it's very easy to get started, as the conceptual sketch below illustrates.
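To make the flow-based idea a little more concrete, here is a purely conceptual sketch in plain Python, not Data Hub's actual operator API; the stage names, record fields, and wiring below are invented for illustration, showing how ingest, refine, and serve stages compose:

```python
from typing import Dict, Iterable

# Conceptual only: in SAP Data Hub these stages are graphical operators
# wired together in the pipeline modeler, not hand-written functions.

def ingest(raw_lines: Iterable[str]) -> Iterable[Dict]:
    """Collect raw records from a source (file, stream, message queue)."""
    for line in raw_lines:
        device_id, reading = line.strip().split(",")
        yield {"device_id": device_id, "reading": float(reading)}

def refine(records: Iterable[Dict]) -> Iterable[Dict]:
    """Impose structure and quality rules on the raw data."""
    for rec in records:
        if rec["reading"] >= 0:  # drop implausible values
            yield rec

def serve(records: Iterable[Dict]) -> None:
    """Hand refined records to a consumer (model, dashboard, table)."""
    for rec in records:
        print(rec)

# Wiring the stages together mirrors connecting operators in a pipeline.
serve(refine(ingest(["sensor-1,42.0", "sensor-2,-1.0"])))
```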
And without further ado, I would like to bring in Balaji, who can talk about what we are doing in the hybrid cloud and walk you through a demo. Thank you.

[Balaji] I just want to make sure the mic is on. Okay, perfect. Continuing from where Amogh left off: we're seeing this rise of hybrid cloud. As you heard in the keynote this morning, one of the speakers mentioned that eight out of ten customers still look at a multi-cloud and hybrid environment, where a lot of assets are on premises but are gradually migrating to the cloud, and we see this happening across different cloud environments, primarily GCP, with Google being one of the main partners for SAP on this journey. As part of this journey, what we've realized is that the main advantage for a lot of customers when migrating to the cloud is how easily they can migrate their workloads without having to worry about the whole DevOps aspect of it; it's moving from a DevOps to a no-ops kind of experience. That is what is enabling a lot of our customers. SAP is a company that has been in this business for around 46 years, so we have a lot of customers who are still on premises, but we see a gradual migration from our existing customer base to a cloud environment. Obviously, a lot of the new customers we are gaining are primarily cloud-only, but there are many on-premises customers, and for them we provide this choice, especially when they are trying to manage their data more efficiently while it is spread across cloud, on-premises, and multi-cloud environments.

What we try to provide is the best of both worlds. The emphasis here is: I still have my ERP system, which I'm in the process of upgrading and migrating from an ECC system, the SAP ERP system, to S/4HANA Cloud, our digital core. But in the interim, I want to be able to bring in the data that is on premises and combine it with other applications running in the cloud, whether that is SAP Concur or SAP SuccessFactors, which are cloud-native SAP applications. So how do I bring together the data sitting in all these different layers, make sense of it, and get insights? That is where Data Hub provides the capability to build these pipelines. Beyond that, it also talks to big data systems and data lakes: when you're doing something like data lineage or impact analysis, it's not enough to look only at your traditional relational data, which is primarily data at rest; you also want to look at data in motion, which could be data coming from an IoT stream, from web logs, or from social media. So how do I trace lineage across my different artifacts, starting from a BI dashboard or analytics report and pointing back to the source of origin, which could be: this data came from the SAP ERP system, but now it resides in an HDFS file in Parquet format? It's very important for us to combine these two worlds, on-premises and cloud, especially when we talk about multi-cloud environments, and to support different kinds of workloads, whether IoT streaming workloads or advanced analytics, which has now migrated to the whole machine learning and AI kind of workload.
And finally, basic cloud storage, where I'm looking at a data aging and data tiering strategy: I want to push my cold data into a cloud or frozen state. So how do I ensure that I can build a pipeline that brings in data from a frozen state and combines it with the hot, in-memory data we have in SAP HANA, the central core of the digital platform?

We have collaborated with Google on multiple aspects, but one of the primary ones is how we leverage GKE, the Kubernetes engine, and beyond that Kubernetes in general. We have a lot of customers who run Kubernetes on premises, whether with support from Red Hat or SUSE, and it was interesting to hear at the keynote this morning that GKE will now be available on premises as well; a lot of our customers are going to be excited, because they're already on the journey to using GKE. We've been working with Kubernetes for almost two years now, since before the 0.1 release was made available, so we've gone through the ups and downs, and it's interesting to see how Kubernetes is being adopted, not just across customers but across partner landscapes as well.

For us it's important that Kubernetes allows us to build more and more as we deliver Data Hub as a service in the future. Today it's available on infrastructure as a service, but we are looking at a managed service where you can come in, buy compute hours, and use that compute to build a pipeline that does the cleansing and refinery tasks for a data scientist's workloads. As you've heard in several talks on ML, AI, and data science, almost 70% of those tasks come down to ensuring the data is cleaned and cleansed, that I have the right kind of data, before I can actually hand it over to my data scientist. So data scientists are also working as data engineers, and with a tool like Data Hub we've simplified that, so a data scientist can go in and build those specific workflows; that is one of the use cases Amogh is going to talk about in one of the next slides.

I also want to bring out the partnership between SAP, Google, and Cisco; I think you heard some of this at the keynote as well. SAP has been working as one of the partners in this triangular initiative for almost a year now, and it's great to see that we already have a first set of customers looking at this and gaining a lot of benefit from enabling GKE along with Google Cloud as a hybrid cloud running on the Cisco platform. It is really great for us to share with our customers that Data Hub is one of the first applications from SAP able to take advantage of this partnership between Cisco and Google.
Finally, before I go to the demo, I want to set the stage for it. This is a collaboration we did with Google for one of our customers, where we take in point-of-sale data that is streamed in through Google Cloud Platform, specifically through Pub/Sub. The output of that Pub/Sub stream is written into Google BigQuery, with the whole pipeline managed using Data Hub, and the BigQuery data is then consumed in SAP HANA through a data virtualization layer. This is where we talk about HANA as that broader information fabric, an in-memory fabric or logical data warehouse: what we enable from Data Hub is, leave your data wherever it is, whether in your Google cluster or on premises; we push the processing down to wherever the data lives, so the heavy lifting happens there, and then we do the joins that are required before making the result available for visualization through your favorite BI or reporting tool.
So let me switch to the demo, before I hand it back to Amogh for the use cases. Actually, I wanted to show the BigQuery demo first. What we are seeing here is a big data analytics demo built on the partnership between Data Hub and BigQuery. As I mentioned earlier, we are streaming point-of-sale data from different retail stores across the US, and this data is streamed through Pub/Sub, with Data Hub used as the configuration management for connecting to Pub/Sub and streaming the data. (Oh, you're not able to see it? Sorry about that. Thank you, Amogh.)

Okay. This is the architecture diagram I showed earlier, and primarily what we're doing here is leveraging Data Hub; this is the Data Hub pipeline modeler. What you see is that I am reading the data from Google Cloud Storage and streaming it into a Google Pub/Sub topic, and the output of that Pub/Sub stream is consumed into Google BigQuery, which in turn is accessed through a HANA calculation view. For folks familiar with SAP HANA and HANA data modeling, you know calculation views: we consume this data as part of a calc view and then expose it for visualization using SAP Analytics Cloud.

Here I'm looking at the configuration for my file. I'm using a CSV file, located under the SAP Data Hub root path; now I switch to my Google Cloud Storage connection and just point to the location of the source file I'll be using for this demo exercise. In the next step, I point to Google Pub/Sub and the sink where I've entered the details, that is, the topic where this information is captured in Pub/Sub.

As data is being streamed in, I can go in and capture it. I also have a terminal here: the data streamed from the point-of-sale devices shows up in it, and within Data Hub I can add a terminal window to actually see what is being streamed. The idea here is not to really read through the stream, but just to show how you can look at the actual data being streamed in.
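For readers who want a feel for what this streaming leg does outside Data Hub's graphical operators, here is a rough sketch with the Google Cloud client libraries (recent versions of google-cloud-storage and google-cloud-pubsub); the project, bucket, file, topic, and subscription names are placeholders, and the actual demo wires all of this up through Data Hub's operators rather than hand-written code:

```python
from google.cloud import pubsub_v1, storage

PROJECT = "my-gcp-project"        # placeholder
BUCKET = "sap-data-hub-demo"      # placeholder bucket holding the POS file
SOURCE_FILE = "pos/sales.csv"     # placeholder object path
TOPIC = "pos-stream"              # placeholder topic
SUBSCRIPTION = "pos-stream-peek"  # placeholder subscription

# Publish: read the point-of-sale CSV from Cloud Storage and stream
# each row as a Pub/Sub message (what the pipeline's source operators do).
blob = storage.Client().bucket(BUCKET).blob(SOURCE_FILE)
rows = blob.download_as_text().splitlines()
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, TOPIC)
for row in rows[1:]:  # skip the CSV header
    publisher.publish(topic_path, data=row.encode("utf-8")).result()

# Peek: pull a handful of messages just to eyeball the stream,
# which is what the terminal operator does inside the pipeline.
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)
resp = subscriber.pull(request={"subscription": sub_path, "max_messages": 5})
for received in resp.received_messages:
    print(received.message.data.decode("utf-8"))
subscriber.acknowledge(request={
    "subscription": sub_path,
    "ack_ids": [m.ack_id for m in resp.received_messages],
})
```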
We also have an integration with Stackdriver from GCP for monitoring this information, so you can see how much data is being streamed in at specific intervals of time.

If I switch to the pipeline, I'm primarily looking at what data is being streamed in, and then how I capture it into Google BigQuery. Now that the data is streaming, I'm looking at Google BigQuery and at the data arriving in real time. I'm just running a query to show that the number of rows is increasing: it was around 8,000 earlier, and now it's at 16,840. This shows that I am streaming the data from Google Pub/Sub through the pipeline built with Data Hub. In the next step I'm going to consume this as part of a HANA calculation view, where I bring in the Google BigQuery table as a virtual table and make it available through SAP HANA Smart Data Access. So this is where I use the BigQuery table, consume it as a virtual table using that logical data warehousing concept, and then make it available for BI analysis through HANA calculation views.
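The growing row count in the demo is just a repeated COUNT(*) against the target table. As a small sketch with the BigQuery Python client (the fully qualified table name is a placeholder), you could watch the streamed rows accumulate like this, before HANA picks the table up as a virtual table:

```python
import time
from google.cloud import bigquery

TABLE = "my-gcp-project.retail.pos_sales"  # placeholder table

client = bigquery.Client()
for _ in range(3):
    # Re-running the count shows the stream landing in BigQuery,
    # e.g. ~8,000 rows on one pass and ~16,840 a little later.
    result = client.query(f"SELECT COUNT(*) AS n FROM `{TABLE}`").result()
    print("rows so far:", next(iter(result)).n)
    time.sleep(10)
```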
In the next step I do some visualization, but I'm not going to talk too much about the visualization itself, because the heavy lifting is in building the pipeline. That is where the meat of the work happens, where a data engineer comes in and configures the different sources, in this case Google Pub/Sub, makes the data available through Google BigQuery, and finally exposes it to a HANA table.

If we have time, I can do the other demo. We have 24 minutes, right? So let me switch to the other demo, where I can show a few more screens of SAP Data Hub and its capabilities.

All right, what you see here is the Data Hub cockpit. This is the UI for an administrator, who comes in and sets up the landscape: I want to create different systems, and I want to set up zones where I can categorize those systems. I'm providing a connection here to my HANA system, and it could equally be to my ERP system or a Hadoop system; I have options for setting up the systems depending on the source systems I have to connect to. Here you see a list of the different systems I've connected to.

As a next step, I can go in and browse through the different data sets we have. In my case I'm going to go through a data discovery exercise. I have the option of browsing my Hadoop connections, so I can browse an HDFS folder and look at the different files in there, Parquet files, ORC files, CSV files; and if I want to connect through GCP or any cloud storage, I have the same option. In this exercise I'm just going to go into a HANA system and browse to a HANA view that I'll then run through a data profiling task. I have information here for sales orders, so now I'm in the data discovery and data profiling space, looking at the data I have. This is just sample data, and this is where the data wrangling part kind of kicks in: before I do a data wrangling exercise, I want to understand the spread of the data. So I look at the profiling information and the fact sheet, where I can see the different values in the data, its spread, and the distinct values. Going to the fact sheet, I can see this data set has around 1.9 million unique sales orders, around 20,000 customers, and 5,000 products in terms of the master data for the customers and products involved, and I have data for three years. The data points you get depend on the source; with a Parquet file, for example, you'll obviously have a much larger data set, and a lot of our customers want to store their cold data in a Hadoop or data lake system, so the profiling tasks can be enabled for a Hadoop system or even a cloud system, similar to what you're seeing here for a HANA table.
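The fact sheet's headline numbers are essentially distinct counts and ranges. On a flat extract you could reproduce the same profile with a few lines of pandas; this is a sketch, and the file name and column names are assumptions rather than the demo's actual schema:

```python
import pandas as pd

# Hypothetical extract of the sales order view used in the demo.
df = pd.read_csv("sales_orders.csv", parse_dates=["order_date"])

profile = {
    "sales_orders": df["sales_order_id"].nunique(),  # ~1.9 million in the demo
    "customers": df["customer_id"].nunique(),        # ~20,000
    "products": df["product_id"].nunique(),          # ~5,000
    "date_range": (df["order_date"].min(), df["order_date"].max()),  # ~3 years
}
print(profile)
print(df.describe(include="all"))  # spread and per-column summary
```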
Now I enter the pipeline modeler. The idea here is that as a data engineer, a data modeler, or even a data scientist, you want to understand the data a little before you really play around with it. If you are a data scientist you might be using Jupyter or IPython notebooks; in our case we provide the Data Hub tool, where you have a UI-based way of dragging and dropping operators to build your pipelines. Here you're looking at a pipeline where we've taken the data from the sales order table and we run some predictive algorithms, initially using Python, and in the next phase using R. We'll show you how easily you can swap your predictive algorithms to suit whatever you're comfortable with; if you are a TensorFlow customer, you can simply rip out the Python algorithm, replace it with a TensorFlow one, and enable the prediction. We have two tasks here: one trains the algorithm, and the other is the serving part. Now I'm showing how we build the training algorithm using Python. I open my SQL generator, which is where I get the data from my HANA system, the sales order table you just saw; I bring in that data and convert it to CSV format so I can feed it to my Python model, which then does the analysis. The Python model leverages some open-source libraries like pandas and lifetimes.

Again, I can bring in any external algorithm, make it available as part of my pipeline, and leverage the insights I get out of it. What you're seeing here is called a graph, or a pipeline, within Data Hub, and each of those blocks is an operator. We deliver around 200-plus operators for different tasks: you may want to anonymize the data in flight, do data cleansing, do geocoding or reverse geocoding, enable location intelligence on top, or, as a partner, build your own operator. We provide all those capabilities as part of Data Hub, and extend that to machine learning and AI as well. I'm just showing some of the open-source libraries being imported, from pandas, and the Python code we wrote for doing the predictive analysis based on the sales orders data.

Just switching through the demo: when I execute this, I am training the model. There is first the training phase, and once I've trained it, I serve the model, so that I can apply it to new, real-time data coming in and get a prediction out of that data. Right now I have the trained model and the serving model: I've used the Python serve operator here, I load the model I got from the ML training, feed it through this prediction Python code, and get a prediction. Based on the inputs you provide, the model tells you, depending on the customer and their buying pattern, how likely the customer is to buy another, similar product within a given number of days. Here I pick the customer ID and the product, then I specify the day, and based on the number of days it tells me the likelihood that the customer buys the product, drawing on the historical information about what this customer has bought; there are certain recommendation capabilities here as well. When I punch in fifty days, the expected value is 1.51 times; when I punch in 500 days, it is 14 times.
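Balaji names pandas and lifetimes, so a plausible self-contained sketch of the train-then-serve logic is the BG/NBD model from the lifetimes library, shown below. The CSV path, column names, and customer ID are assumptions, the demo's actual operator code may differ, and the per-product dimension of the demo is omitted here; only the per-customer repeat-purchase timing is modeled:

```python
import pandas as pd
from lifetimes import BetaGeoFitter
from lifetimes.utils import summary_data_from_transaction_data

# --- training phase (what the Python "train" operator does, conceptually) ---
orders = pd.read_csv("sales_orders.csv", parse_dates=["order_date"])  # assumed extract
summary = summary_data_from_transaction_data(orders, "customer_id", "order_date")

model = BetaGeoFitter(penalizer_coef=0.001)
model.fit(summary["frequency"], summary["recency"], summary["T"])

# --- serving phase: expected purchases for one customer within t days ---
def expected_purchases(customer_id: str, days: int) -> float:
    row = summary.loc[customer_id]
    return model.conditional_expected_number_of_purchases_up_to_time(
        days, row["frequency"], row["recency"], row["T"]
    )

# e.g. ~1.51 expected purchases within 50 days and ~14 within 500 days,
# mirroring the numbers shown in the demo. "C-1001" is a placeholder ID.
print(expected_purchases("C-1001", 50))
print(expected_purchases("C-1001", 500))
```

In Data Hub terms, the training pipeline would serialize the fitted model and the serving pipeline would load it and answer requests, which is the train/serve split the two graphs in the demo implement.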
Now I switch from the Python algorithm to an R algorithm, and we'll show you how easy that is. We've already created a graph, so first we copy the pipeline; this is the whole reusability story, where a developer can reuse pipelines that already exist in the repository. I'm going to save it under a different name, "train for R". Then I go in and remove the Python operator, literally deleting it, and drag and drop the R operator; as you can see on the left side, it could just as well be a TensorFlow operator. Here I'm looking at the custom code we've written for R. I just have to connect my operator to the input data feed, and then I'll be ready to train this graph using the new R operator and the R code I just inserted into the pipeline. I make some changes, in terms of what the model name is, and go through that; it's a similar exercise to the previous one. Now I run the training, and once it completes, in the next step I bring in the serve model. I do the same thing: I take the Python-based serve pipeline, save it under a different name, literally drop that operator, add the R operator, and use it for serving the graph, so that you can see the prediction difference. This is something data scientists always want to do, because there is always something new coming in from the different libraries, and you want to get the prediction as close to real time as possible. So I drop the operator, bring in the R serve model, and connect it to the model I created with my R training. And then the same prediction: I switch from 50 days to 500 days, and in the previous run it was 14 times.
Now it's 17 times. So as a data scientist you have some tools to play around with: whatever your favorite algorithm is, you can run your analysis based on that.

I also just wanted to show the live system, so you don't think we're only showing you a recorded demo. What you're seeing here is the Data Hub home page, the latest version, which I think will be available to customers at the end of this week. We have multiple applications: what you're seeing right now is the modeler, the Data Hub pipeline modeler, and we've also introduced things like the Metadata Explorer. One of the big asks we're seeing from a lot of customers, and there are several startups trying to solve this problem, is the capability to provide metadata management, data cataloging, and lineage and impact analysis across different, disparate data sources. Again, it's not just limited to lineage on the data in my SAP systems; we're talking about lineage all the way from your BI report, whether it's a Tableau dashboard or an SAP Analytics Cloud dashboard, down to the SAP ERP system. Those are the things we can do from a Data Hub standpoint, bringing multiple capabilities into a single tool so you don't have to use different tools for different things. And all of this is available on Google Cloud Platform, and we have customers whose use cases Amogh is going to talk about. So I'm going to pause there and pass it back to Amogh.

[Amogh] Thank you, Balaji. One more thing: our board member Bernd Leukert and Diane have also announced that you are able to spin up SAP Data Hub on Google Cloud Platform in 30 minutes, using Kubernetes, if you want to try it out. Super. I'll just walk you briefly through a couple of use cases. Starting with one of our customers who has a parallel session running right now, ATB Financial: here is how they describe the benefits they see. Their goal was to have zero dissatisfied customers.
They have seen a huge increase in efficiency for their operations and delivery, by 30 to 40 percent; this is where they were able to bring out their data-driven approach. We have been discussing a couple of use cases with them as well. They are also working with the Google Cloud team, trying to set up a data science lab; they have an on-premises HANA and Hadoop kind of setup, and multiple use cases including customer satisfaction: how they can enrich the customer experience, and also model customer churn scenarios, so that they can predict which of their products has what impact.

Another customer of ours is now able to use machine learning as well as predictive analysis. We provide a lot of operators, as you saw from the demos Balaji gave; I think there are over 220 now, and as we go on we are embracing everything that's available. You can then include a seamless integration towards SAP BW/4HANA as well; you can set up different zones, one for data science and one for data management or data engineering, and have that seamless flow and orchestration of data as you wish. And of course, as Balaji already demonstrated, once the model is proven you can easily bring it into production and industrialize it.

Then there's another nice story from a customer. We have all seen the smart appliances and the connected world; here a customer gave us a challenge. They had six million devices generating 16 terabytes of data per day. Their first challenge was their data science team: they were paying top dollar for that team, and the team ended up using four or five different tools, shelling out thousands of dollars for those. Their idea was that, for whatever features were shipping in a new appliance, they wanted good production efficiency: they wanted to see the actual usage. They were able to bring all that data in, but there were a lot of issues: five different tools, a lot of manual coding, and no automation possible.
And there was no end-to-end visibility: they did not know where the data was stuck, or whether it was even feeding into a good model. With SAP Data Hub, here is what we changed. We refined the complete end-to-end process using one single tool. We gave them a nice visual modeling environment, we brought in governance, we understood their process and then helped them automate it, and from one region they were able to scale out into multiple regions; we were helping them with the end-to-end orchestration and refinement of their data. Of the ten features shipped in this new appliance, only three were being used; they pushed a campaign, and out of that came usage of four more. But they saw persistently that in all regions three features were not being used, and they stopped producing those. At the same time, they matched this raw streaming data with the enterprise data in SAP. If a customer has paid good money for an appliance, you don't want an issue to sit for three to four months until the service vendor comes by. So they were able to take that data, mash it up with the customer information and the vendor information, and based on the error code proactively reach out to the service vendor in that area and say: please carry part ABC. More than 85 percent of the time, they resolved the issue in one single visit, and kept a very loyal customer.

Another quick example is about fitness tracker data; most of us are using wearable devices. We were able to stream data from them across the globe, bring in the sales figures, and bring in, for example, what type of jogging or running shoe a consumer was using. Not only that, but once we refined all of this, we helped them deliver insights: we brought this data into an analytics dashboard, with a model based on the consumer's jogging pattern and the type of shoe they have, and they found upsell opportunities, more than 20 percent in some regions; in that way they were able to increase their sales as well.

In the end, very briefly: you also heard from our board member Bernd Leukert this morning how SAP is trying to show how simple data management can become in the future, together with SAP HANA; that is our SAP Data Management Suite. Data Hub is one of its significant components, and as you have seen, we are able to scale out to partners like Google Cloud and others. What we are also trying to deliver with our strategy is to gear all of you up towards the intelligent enterprise, where the digital platform is of course driven by SAP HANA and SAP Data Hub is the core of the data management, helping you across all the multi-cloud environments, with a seamless flow of data and a wonderful customer experience as well.

Just a brief takeaway, if you will, on how SAP Data Hub is going to help you: bringing in a consolidated view across all your sources of data, and enabling intelligent discovery.
As we move ahead, we will be able to help you more and more with improved data quality and cleansing, and with enrichment of the data; with distributed processing across these hyperscale architectures; and with scalable DataOps management. As you move into the cloud, as you also heard in the keynote this morning, the goal is to automate more and more. And last but not least, optimal and compliant data governance across the enterprise, even for the data that is becoming more and more relevant, the unstructured data, and how you can get a better grip on it.

Here are some of the resources available to you; please note the links. Of course you can reach out to SAP through them as well, and we will be happy to help you.
Thank you very much. Please do not forget to give feedback in the app, and we'll open up the round for any questions now. Thank you.