Good afternoon everyone, and thank you all for coming to this session. My name is Vinay Balasubrahmanyam; I'm a product manager on the BigQuery team. I'm joined by Ian and Carolina from King, who are going to share their experience of using Looker and BigQuery. King has transformed their business using Looker as their enterprise BI solution, and they're here to share that story shortly. Now, I work in the data space, and your feedback is super important for us (your feedback is our data), so please do share your feedback once the session is done.

So let's get started. If you look at organizations going through digital transformation, they're collecting a lot more data than they used to. Data is being generated from clickstream analytics, mobile data, social media posts, and sensor data. This particular statistic from IDC shows that over the next four to five years, more than a quarter of the data being generated will be analyzed in a more real-time fashion. If you think about it, in the next four to five years there will be about 175 zettabytes of data, and a quarter of that will be analyzed in a more continuous manner. So as an organization, you need to invest in a big data platform that can not only analyze the variety and volume of data, but can do so continuously. Google Cloud Platform's data analytics solutions provide real-time analytics as a capability that's already ready for you.

But what is driving the growth of real-time data? If you look at our personal lives, we leave digital traces everywhere on a daily basis. It could be a business looking at all of its sensor data and acting on it in real time, or a marketing executive looking at clickstream data to understand what ads to serve, or a game developer looking at all the different gameplays to figure out what levels to customize for each use case.

And if you look at this particular study from Forrester, insights derived from data have a shelf life, which means you need to analyze the data in a more real-time fashion, before it perishes. On the x-axis is the perishability of data as it ages, and on the y-axis is the time to insight. Not all use cases require real-time analytics, but some do, such as those where you take automated decisions. For example, you need to detect whether fraud is happening at a point-of-sale application.
Or it could be a consumer applying for a loan, where, as a financial institution, you need to decide what rate to offer based on their credit score. Those decisions are automated, and you need to act on them much sooner. Second are operational insights. These are more human-oriented, where people are looking at dashboards. For example, a consumer walks into a retail store, and as the merchant I need to look at their past purchases to understand what customized discounts to offer; there is a little more time to act on that. On the performance-insights side, you're looking at older data to understand what changes you need to make in your sales pipeline. And strategic insights are longer-term, where you look at quarter-over-quarter data to understand what changes to make on the business side. So there's a spectrum of use cases.

Our vision for real-time analytics is to be the platform that lets you derive always-fresh and always-fast insights, with no limitations as data volume grows. If you look at BI solutions out there, they make you trade off between freshness and speed: if you want the freshest data, it's not fast; if you want the data to be really fast, it's not fresh. We want to provide a platform that allows you to analyze both real-time and historical data.

If you map that same spectrum onto what BigQuery is today, BigQuery is widely used for strategic, performance, and operational insights. But where we are investing this year and next is pushing BigQuery into the real-time space as well, so you have one single platform for both real-time and historical insights. Often we have seen customers using two different systems for this.

And if we compare BigQuery with traditional warehouses: there, you need to manage servers, scale them, and patch them, and you need to manage complex ETL pipelines to load data in. With BigQuery, we make the overall infrastructure management invisible, so you have a lot more time to look at insights and ask questions, or even ask newer questions that weren't possible before. But the key differentiator for BigQuery, which is unique to it, is support for real-time insights. There are two capabilities where BigQuery already does this today. On the ingest side, BigQuery natively supports streaming ingest, whereas a lot of other data warehouses require batch ingest, which delays your freshness. The second, which we announced earlier this year, is an in-memory analysis engine built into BigQuery called BI Engine, which gives you sub-second query response times for dashboard-style use cases. If you combine those two capabilities, you really get to a point where you can derive continuous intelligence as the data comes in, versus waiting for data to be loaded over time.

In terms of streaming: when BigQuery launched a few years ago, we supported both batch and streaming ingestion, but we had certain limitations in how far it could scale, because it was built for what real-time insights meant a few years ago.
Since then, we have seen a lot of customers moving from batch to streaming ingestion, so last year we took the time to build a completely new backend for BigQuery streaming. That backend is in beta right now, and it can really scale; it removes some of the limitations we had with the current streaming system. The default ingestion quota we give for all your streaming with the new backend is 10x more than it used to be, about 1 million rows per second, and we have tested it up to 20 million rows per second. That's pretty massive in terms of how much you can ingest. And it's not only the ingest side: on the read side, read latencies have significantly improved, so as you stream data in, you can read it back at much lower latency. It's an improvement on both sides. Sometime next year we're going to have a new binary API for streaming with native Avro support, which makes the overall streaming platform much richer for building real-time insights on top of BigQuery.
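To make that concrete, here is a minimal sketch of streaming rows into BigQuery with the Python client library's streaming-insert method; the project, table, and event schema are placeholders for illustration, not a real pipeline:

```python
from google.cloud import bigquery

# Placeholder project, dataset, and schema for this sketch.
client = bigquery.Client()
table_id = "my-project.game_analytics.events"

rows = [
    {"event_time": "2019-11-20T12:00:00Z", "player_id": "p123", "event": "level_complete"},
    {"event_time": "2019-11-20T12:00:01Z", "player_id": "p456", "event": "purchase"},
]

# insert_rows_json uses the streaming-insert API; successfully inserted
# rows become queryable within seconds, rather than waiting on a batch
# load job.
errors = client.insert_rows_json(table_id, rows)
if errors:
    print("Row-level insert errors:", errors)
```

That near-immediate queryability is what enables the continuous-intelligence pattern described above, since dashboards and BI Engine read the same table the stream writes to.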
The second thing we announced earlier this year is BigQuery BI Engine. For those of you who haven't heard of it, it's an in-memory OLAP engine built into BigQuery. It's a column-oriented database on top of BigQuery storage, and it can scale horizontally. It's built especially for dashboard and reporting style use cases with lots of concurrent queries, and all of the analysis happens in the in-memory layer. Because it scales horizontally, as more queries come in from your dashboards, BI Engine can automatically scale to support the concurrency. And BI Engine natively integrates with BigQuery streaming, so as data changes, we can incrementally update that data in memory, making your data that much fresher.

If you look at traditional BI tools, they require you to move the data from your data warehouse into an in-memory layer for it to be fast. But that comes with a lot of architectural complexity in terms of managing an ETL pipeline. With BigQuery BI Engine we have taken the same OLAP idea and baked it into BigQuery itself: it sits on top of BigQuery storage, and we can move data from the storage layer into the memory layer seamlessly. That simplifies your overall ETL, and there's no need to move data outside of BigQuery. When we launched, we announced BI Engine only with Data Studio, but we are committed to delivering a SQL API to make it available outside of Data Studio, so that tools like Looker, Tableau, or any existing tool that can talk to BigQuery through JDBC or ODBC can make use of this engine. That is coming sometime next year.

BigQuery BI Engine also has a notion of smart tuning. It doesn't cache whole tables; it only caches the columns being used in your dashboards, it can automatically tune each column's encoding type, and it keeps the data compressed in memory. That way, even if your table is large, it can still handle a lot of data.

In terms of reservations, it starts as small as one gigabyte; you can create reservations in one-gigabyte increments and go up to 50 gigabytes. As we make the SQL API available, we're going to allow you to add a lot more memory, so if you have a bigger table you can pretty much keep everything in memory. And a lot of telemetry about BI Engine, such as how it's being used, what is cached, how long data is cached, and what kind of data is cached, is published to Stackdriver, so you can get visibility into how much more memory you need.
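For illustration, here is a sketch of programmatically sizing that reservation, assuming the BigQuery Reservation API Python client; the project and location are placeholders, and in practice this can also be done from the Cloud Console:

```python
from google.cloud import bigquery_reservation_v1
from google.protobuf import field_mask_pb2

# There is a single BI Engine reservation resource per project and
# location; its size is expressed in bytes (4 GiB here, grown in
# 1 GiB increments up to the current 50 GiB limit).
client = bigquery_reservation_v1.ReservationServiceClient()
reservation = bigquery_reservation_v1.BiReservation(
    name="projects/my-project/locations/US/biReservation",  # placeholder
    size=4 * 1024**3,
)
client.update_bi_reservation(
    bi_reservation=reservation,
    update_mask=field_mask_pb2.FieldMask(paths=["size"]),
)
```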
Here is a great example of a customer who has taken this journey with us: Zulily. For those of you who aren't aware of Zulily, it's an e-commerce company based in Seattle that uses AI and machine learning to offer a personalized shopping experience. When they started off, they were on-premises in a colo, using traditional data warehouse technologies and Hadoop. They were collecting data on a regular basis, but their whole pipeline was slow: data was analyzed every 24 hours, and that wasn't good enough to understand what daily deals to offer customers. Since moving to BigQuery, they stream data in and collect far more data points; today they collect up to 50 billion data points about different parts of the user shopping experience, and they can create new shopping experiences and new products dynamically based on the data they've collected. Since then, they have added a lot more users within the organization and a lot more customers to their site. So this is a great example of a customer who has transformed their business and stayed competitive with Amazon, and they were able to do that because they invested in BigQuery as their data platform of choice.

Earlier this year we announced the acquisition of Looker, which is a great complement to what we're already doing with BigQuery. The acquisition is still pending, so I cannot share a lot of detail about what we want to do jointly, but essentially the key point is this: we have a lot of data about joint customers who use Looker and BigQuery, and the customers who have taken this journey of using Looker as their BI platform love it. That's one of the reasons we felt there was really good synergy between what Looker offers and what BigQuery provides.

If you look at the overall portfolio of our stack, we have services on the ingest side and services on the analysis side, but what we were missing is the last mile of delivery. We rely on our great partners to offer the BI solutions, and a lot of our customers wanted an out-of-the-box experience from Google. Looker fits that category. We will continue working with all the great partners we have in the BI space, but we are also going to offer a first-party experience with Looker.

So what makes Looker highly differentiated, and how does it complement what Google already provides? With Looker, we provide an end-to-end data analytics platform, from ingestion to BI. Looker also fits very well with Google Cloud's multi-cloud strategy: Looker runs on GCP, AWS, and Azure, and we will continue to support that. Looker connects to all the different data warehouses, including on-premises databases, and we'll continue to do that too, so it fits well with our open cloud strategy.

The second thing is the common data model over your data. There are a lot of BI tools that provide self-service BI, which lets you connect a data source and create dashboards. The challenge with that model is that every department gets its own myopic view of the data, and there is no single unified source of truth. What Looker has done is invest in a modeling platform, so you can express your business logic and your KPIs in a centralized manner and then give your dashboards access to that model, so that when different teams look at churn data, revenue data, or net revenue data, it's all consistent. There is a little upfront work in creating those data models, but once created, they're widely used across the organization, and King is going to talk about how they have transformed their business using LookML.

The third thing, where Google complements Looker, is augmented analytics. This is where Google brings machine learning and AI technology right into your BI tool.
This could be using machine learning models to understand or predict how your churn will evolve over the next few months based on past data, or to predict whether a customer will buy a given product. This is where you blend AI seamlessly into your BI solution, and that's where Google Cloud Platform's AI capabilities come in.
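As a sketch of what that blending can look like, churn prediction of this kind can be expressed directly in BigQuery ML and surfaced through a BI tool; the project, table, and column names below are hypothetical:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Train a simple logistic-regression churn model in BigQuery ML on a
# made-up player-features table.
client.query("""
    CREATE OR REPLACE MODEL `my-project.analytics.churn_model`
    OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
    SELECT days_active, total_spend, levels_completed, churned
    FROM `my-project.analytics.player_features`
""").result()

# Score current players; a dashboard can then read these predictions
# like any other table.
predictions = client.query("""
    SELECT player_id, predicted_churned
    FROM ML.PREDICT(MODEL `my-project.analytics.churn_model`,
                    TABLE `my-project.analytics.player_features_current`)
""").result()
```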
And lastly, we want to help ISVs and SIs build data-rich applications, with Looker as the data platform and BigQuery as the data warehouse. So with that, let me invite Carolina, who is going to share how King started using Looker as their enterprise BI solution.

Thank you, Vinay. Good afternoon everyone. I'm Carolina Martinez, and I have been working at King for four years now; currently I'm working in the incident management team. Today, with my colleague Ian, we want to talk to you about how we use Looker at King. Ian will explain how we did it, why we migrated to Looker, and what benefits it has provided. Afterwards, I will show concrete examples of how we use Looker in the incident management team.

But first, for those who don't know who King is: we are better known as the makers of Candy Crush. We are a game developer; we have developed more than 200 titles, of which 18 are currently live. Our games are played all over the world by 247 million people each month. We are located in studios and offices all over the world, with more than 2,000 people, and since February 2016 we have been part of Activision Blizzard.

But what does King look like in terms of data? First, you can see here a selection of our most popular games; you might have played them at some point, or heard of someone playing them. Behind these, in terms of data, we have 7 billion events generated daily from our games and consumed into our data warehouse. At the moment we count 500 terabytes in fact tables and 60 terabytes in dimensions, which equates to 3 trillion rows and 500 billion rows respectively, on a daily basis. This data is systematically processed in BigQuery.

I can show you briefly how this happens behind the scenes. We have a combined batch and streaming infrastructure that gives us real-time capabilities as well as robust and complete analytics. Regardless of how we ingest the data, we can have real-time anomaly detection, we can explore the data using the typical data science toolkit with R or Python, and we can also explore it with Looker. We also feed this data into other analytical applications. In this environment, Looker has played a very interesting role and has expanded our capabilities for analysis. Next, Ian will talk you through how and why we moved to Looker.

My name is Ian Thompson. I've been working in BI at King for over five years now, and I'm currently looking after the BI platform. Google asked me to talk today about our journey with Looker, a little about how Looker works, and a use case we've built with Looker and BigQuery. We started with Looker about three years ago, when it won an internal BI evaluation for a complete BI solution. It became our main BI tool after just one year, and after two years it was our sole BI tool. King also completed its migration to BigQuery, from an on-premises cluster and MPP database, about 18 months ago.

But let's take a quick look back in time to see what our problems were before, and what we needed. Our biggest problem was a one-team bottleneck: we had one team in charge of reporting, with a huge backlog, which meant even minor changes had to wait. We had many reworked copies of all sorts of reports and data models, which led to lots of duplicated code and lots of wasted effort.
We had no single source of truth, really, and people were defining what should have been standardized dimensions and measures in varying ways, which resulted in different KPIs. That led to a lack of trust in the data and the tools, which in turn led to rogue behavior, with people spinning up all sorts of different solutions and software. Users found it hard to adhere to a single-source-of-truth model because the processes were complicated and the workload was heavy. And we had no traceability or responsibility: who owned these products, where were they running (in someone's personal schema?), and who was going to action bugs or edits if they came up?

So what did we want to achieve by integrating Looker? We wanted to disperse our engineers: to embed our BI engineers into the business as a dedicated resource. We wanted to empower those who fully understood the subject matter and devolve the building of analyses to them; the BI team understood best how KPIs worked, but not the intricacies of how the different games and campaigns worked, for example. We also wanted to empower the people who understood the data best: the BI team understood the core data model really well, but they might struggle with the intricacies of data from all sorts of platforms across the business.
We'd be better off assisting, rather than bouncing a solution back and forth until it's perfect. And we wanted to get the decision-makers closer to the right analysis, and they needed to get closer to the process: we need the decision-makers sculpting the landscape of what can actually be analyzed.

This is a useful moment to take a quick break from how we rolled out Looker and see how Looker works, and how simple its SQL abstraction language, called LookML, actually is. At the core of the Looker data platform, unseen by the majority of its users, is a query builder. You describe the attributes of your data model and any business logic to Looker just once, and it is then capable of building and running tailored SQL queries based on what the user wishes to know; this is known as exploring in Looker. The abstraction from SQL has many positives and is coded in a language called LookML. LookML is a language for describing dimensions, aggregates, calculations, and data relationships in a SQL database, and Looker uses a model written in LookML to construct SQL queries against a particular database.

For some context, I wanted to show you what an Explore actually looks like. This is the Explore page you're faced with in Looker: the user selects fields from the left-hand panel to create their own analyses and visualize the result set. Looker then writes the necessary SQL query, based on the LookML, to fetch those results from the database.

So what are the benefits of LookML? Looker says that LookML isn't a replacement for SQL, it's a better way to write SQL, and I agree. Some of the major benefits: reusability, because the majority of a given analysis isn't likely to be used again, yet a lot of the same steps will be, and with LookML you define a table, a dimension, a measure, or a relationship just once and build upon it, rather than rewriting the SQL every time. Version control, because ad-hoc queries and multi-step analyses are very difficult to manage and version, whereas with LookML version control is built right into the tool and integrates with Git. LookML is also architected to make collaboration natural and easy. And we love that it empowers others: there are no longer secrets about how and why things are designed and defined, and everyone can chip in.

So here is a very quick introduction and example of LookML, where we define the orders dataset. The majority of this LookML is auto-generated, and it's very quick and easy to enrich. In this orders dataset, known as a view in LookML, there is a dimension based on the ID field from the database; it has a name and a data type and is marked as the primary key. There are a few more dimensions; one of them, for example, has a formatting change, and we've also got the created-time dimension. In the LookML, someone has decided to expose the time, date, week, and month, and those will surface nicely and easily to the user in the Looker front end without you having to specifically code week, date, and month. At the bottom we have a measure, which is a sum of the amount field in the underlying database table.
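Pulled together, a minimal sketch of such a view might look like the following; the table and field names are illustrative, not King's actual model:

```lookml
view: orders {
  sql_table_name: demo_db.orders ;;

  # Marked as the primary key so Looker can aggregate safely.
  dimension: id {
    primary_key: yes
    type: number
    sql: ${TABLE}.id ;;
  }

  dimension: customer_id {
    type: number
    hidden: yes
    sql: ${TABLE}.customer_id ;;
  }

  # One dimension_group exposes time, date, week, and month for free.
  dimension_group: created {
    type: time
    timeframes: [time, date, week, month]
    sql: ${TABLE}.created_at ;;
  }

  # The measure at the bottom of the slide: a sum of the amount field.
  measure: total_amount {
    type: sum
    sql: ${TABLE}.amount ;;
  }
}
```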
That concludes our orders view; we can now see how it is actually utilized. This is the Explore definition, and an Explore is how you offer your users the ability to create an analysis; it's the playing field you offer to them. In this case, we have the orders dataset and any customers that may have made those orders. Depending on which dimensions and measures the user actually selects in the front end, Looker may or may not include the customers dataset in the underlying query; Looker will always create the most performant query to execute against your database for the results the user is requesting. The collection of all of this LookML is known as a model.
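A sketch of that Explore definition, continuing the illustrative names from the view above:

```lookml
explore: orders {
  # Looker only adds this join to the generated SQL when the user
  # actually selects a customers field in the Explore.
  join: customers {
    type: left_outer
    relationship: many_to_one
    sql_on: ${orders.customer_id} = ${customers.id} ;;
  }
}
```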
So that's a very simple example of LookML, and it needn't be any more complicated than that to add some serious value. I personally love LookML for its ability to deal with complex issues while remaining clean and simple, along with some of the other benefits I mentioned earlier, such as reusability, which is a big one, and how easy it is to collaborate when working with LookML. But let's get back to how we rolled out Looker inside King.

How did we integrate Looker? We ran POCs with teams disillusioned with the BI offering at the time. It took off, as shown by the rapid decommissioning of our other platforms, and we did this by running hands-on workshops where we'd start with an introduction to the tool and end up actually building a product, so the team left with a working solution. We always made sure we were on hand, and we proactively searched out issues, to make adoption as simple and satisfying for our users as possible. Our power users were also at the heart of our success; they did a lot of our work for us from inside their own teams. And we introduced a concept we call core models.

So what is a core model? For us, a core model is owned by a specialized BI team who best understand the KPIs inside King. Teams are then seeded with this core data model, which is a starting foundation for them to build on. For free, they get all the KPIs, plus the ability to add, edit, and remove bits for themselves. Because of this, there was a lot of buy-in from them as well: they owned something for themselves. They were free to express themselves and could truly tailor the experience for their own business areas. The resulting solution is, I feel, very robust, centralized, and controlled, yet adaptable. But this is only possible through extension in LookML.

So what is extension? In short, you inherit everything from the parent object, and you can alter it and include new attributes. When I was putting this slide together, I lazily googled the word "extending" and was faced with loads of extending tables, and I feel this is almost a perfect metaphor, so long as you are a carpenter, for example. You'd be crazy to discard a six-seater table to build a separate eight-seater table; you'd be much better off adapting your six-seater table into an eight-seater for the few times you actually need it, reusing your perfectly suitable table legs and perfectly suitable tabletop. Regarding King-specific use cases: we couldn't possibly cater for every single game nuance and all their different metrics, but we can cover the standardized ones and allow teams to extend our model to include their own.
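A sketch of what extension looks like in LookML, with illustrative names: the game team's Explore inherits everything from the core Explore and adds only its own join:

```lookml
# Core model, owned by the central BI team (names are illustrative):
explore: daily_kpis {
  # standardized KPI dimensions and measures live in the joined views
}

# A game team's model inherits everything and adds only its own nuance:
include: "/core/*.explore.lkml"

explore: candy_daily_kpis {
  extends: [daily_kpis]
  join: candy_levels {
    type: left_outer
    relationship: many_to_one
    sql_on: ${daily_kpis.level_id} = ${candy_levels.id} ;;
  }
}
```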
When we merge the concept of our core models with LookML extension, it's incredibly powerful. We seed teams with this model, and they add value by tailoring it to their precise needs. This results in a single source of truth, based on a secure code base and saved in Git. We now have traceability: we can track all models and analyses back to the data warehouse and the ETL. We can distinguish owners of parts of the model and assign responsibilities, and teams can really focus on the differentiation.

A few years ago, King started working in partnership with some small independent game studios. King supports these studios in many ways, one of which is offering analytics. Our partners can leverage this analytics offering through Looker to improve their marketing campaigns, improve gameplay and engagement, and, obviously, increase sales. We are currently refactoring the data ingestion process for our publishing partners, but this is a use case based on a proof of concept with a very light-touch ETL. So let's take a look at how we created this multi-tenant analytics service for our partners.

We ingest all of our publishing partners' data, combined, into one GCP project, and we create a useful data model from these events. Our internal publishing business performance unit, otherwise known as a BPU, may also wish to create some tables specific to publishing. We expose only an individual partner's data to that partner. To do this, we create a GCP project for each partner, and the datasets we want to expose are created with a one-to-one mapping to our source data. Each object is exposed via an authorized view over the source data, which allows us to do column- and row-level access and offer something different to each partner. These views are built programmatically in the partner-specific projects.
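A minimal sketch of that programmatic pattern with the BigQuery Python client, using made-up project and dataset names: the view filters the shared data to one partner's rows, then is authorized against the source dataset, so partner users never need direct access to the shared tables:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical names: one shared source dataset, one per-partner project.
source_dataset_id = "king-publishing.events"
partner_view_id = "partner-a-project.events.plays"

# 1. Create a view in the partner's project filtered to their rows.
view = bigquery.Table(partner_view_id)
view.view_query = """
    SELECT event_time, player_id, event_name
    FROM `king-publishing.events.plays`
    WHERE partner_id = 'partner_a'
"""
view = client.create_table(view)

# 2. Authorize the view to read the source dataset.
dataset = client.get_dataset(source_dataset_id)
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(None, "view", view.reference.to_api_repr())
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
```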
Inside Looker, it starts with the core model we looked at earlier. We allow our internal publishing BPU to extend this model to apply standardized publishing data additions, whatever those might be. This is then extended a further time for each partner, and that's where partner-specific additions are made. These models are configured to read from the partner-specific GCP project we saw before, using a partner-specific service account. Employees from these partners access Looker, and due to permissions they are only able to access their own models. There are a couple of other safety nets in place as well, not shown here, that stop accidental access to other partners' data. So I'm going to hand back over to Carolina, and she can talk to you about the incident management process.

Thank you, Ian. In the incident management team, we have benefited a lot from this transition to Looker. Let me briefly introduce how we work in the incident management team and what we are responsible for. Our process has four steps: detection, investigation, communication, and post-mortem. Detection: when an incident occurs, the first step is to detect it and determine whether what is going on qualifies as an incident; this needs to happen as fast as possible. Once detected, investigation comes in: our analysts jump right in to analyze the impact and the root cause of the incident. Communication: at the same time, the incident managers start and maintain conversations with all involved parties, teams, and potentially impacted stakeholders. When the root cause is found and the fix is put in place, a post-mortem is run, and this is all about retrospecting, defining, and following up on actions so that that type of incident does not happen again.

Swift detection and investigation are crucial for incident resolution, and they depend heavily on reporting; for this, Looker has become a must. Before having Looker, the data model belonged exclusively to the data engineers and BI developers. All investigations were limited by what was in the backlog of the data engineering and BI development team, and there was a really strong dependency on how well requirements were transmitted and understood by all involved parties. The fact that we had a central team, of which data engineers and BI developers were only a part, made it really difficult to have a dedicated person when an incident was happening; it depended a lot on workload. At the same time, for exploration, there was a need to delegate deeper investigation to data scientists, since it required specific skills in R and Python development. As this capability to dig deeper was limited by the technology, the incident managers were forced to go back and forth with developers and engineers for constant redesign. This had a very significant impact on resolution and limited our capacity to resolve incidents quickly.

Then Looker came in like fresh air. It allowed us to embed data engineers and BI developers inside teams, and the line between stereotypical roles got blurred. Iterating in Looker is very quick, through the flow you can see on this slide.
This empowered all users to own their data independently, whether they were developers or not. For the incident management team, it empowered everyone to tune dashboards, Looks, and Explores, and to add extra tweaks and extra logic to existing ones. Even the bolder ones ventured into how the data model works. All parties involved now feel included in the analytics process, and rather than being just an end customer, they can also become a creator. In the words of one of my colleagues, an incident manager: "With minimal knowledge I can just create dashboards and build my own personalized workspace for an investigation." This gives them the freedom to investigate and to load data as they need, with less dependency on the availability of engineers and developers. They can also easily interact and collaborate with other data users, sharing dashboards collaboratively with data scientists and engineers, so everyone can add their best to the current investigation. So more flexibility gives us better investigation capacity.

Next, I'm going to go through some use cases. We use Looker for traditional dashboarding, for loading events, for exploring those events, and for interactive dashboards.

Traditional dashboarding is very simple in Looker. It allows us to integrate different data sources, with different representations, into one single dashboard, and this helps us tell a story. More importantly, when investigating and analyzing, it allows us to correlate the different data sources at a glance and dig deeper when an anomaly is spotted. The way dashboards are built lets us click on each tile, explore from there, create a new Explore based on the same data, and start a whole new investigation based on what we saw in the dashboard.

Next, let me show how simple it is to add new data. Thanks to how Looker is structured, it is very easy for any user. They go into development mode, head directly to the model of interest, and click to add data. When they create views from tables, they can link in tables that exist directly in BigQuery, and with one click, voilà, they have a LookML view with all the information, ready to explore. As a next step, we add the new Explore to the model. It can be joined with other tables, or it can be an Explore depending only on the data we just linked in, and we can visualize it. Once this is done, whatever is of interest can be saved into the initial dashboard we were looking at, and someone who isn't deep into data development can just continue adding information, going around this loop of investigation.

Another very powerful tool we have at King is a Slack bot that allows us to integrate data and Looker dashboards into conversations. We have a bot which we can ask questions; the bot launches queries against Looker, and Looker comes back with an answer, be it in the form of a number or a graph. This is very useful while communication is going on, because it lets us quickly check something without having to change context or switch tools. It allows us to fuse together communication and exploration.
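The bot's internals aren't shown here, but a rough sketch of the Looker side of such a bot, using the official looker_sdk Python package with a hypothetical saved Look, might be:

```python
import looker_sdk

# Reads API credentials from looker.ini or environment variables.
sdk = looker_sdk.init40()

def answer_question(look_id: str) -> str:
    # Run a saved Look and return its result as JSON, ready to be
    # formatted into a Slack message (number, table, or chart link).
    return sdk.run_look(look_id=look_id, result_format="json")

print(answer_question("42"))  # placeholder Look ID
```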
Last but not least, we are now making use of interactive dashboards. We use Looker as a front end, thanks to actions, where we can ask the users for parameters for an investigation. These parameters are collected and sent to Cloud Functions, and in Cloud Functions we can execute Python packages developed by our data scientists. They work their magic, dump results into BigQuery, and from there the results can be explored, analyzed, and of course visualized back in Looker.
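A rough sketch of what such a Cloud Function can look like; the form field names, table names, and analysis query are all hypothetical stand-ins for the data scientists' real packages:

```python
import json

from google.cloud import bigquery


def run_incident_analysis(request):
    """HTTP Cloud Function triggered by a hypothetical Looker action.

    Looker actions POST a JSON payload; the form field names used here
    (game, start_date, end_date) are made up for this sketch.
    """
    params = request.get_json().get("form_params", {})

    client = bigquery.Client()
    job_config = bigquery.QueryJobConfig(
        # Results land in a table the incident dashboard reads from.
        destination="my-project.incidents.analysis_results",
        write_disposition="WRITE_TRUNCATE",
        query_parameters=[
            bigquery.ScalarQueryParameter("game", "STRING", params.get("game")),
            bigquery.ScalarQueryParameter("start", "DATE", params.get("start_date")),
            bigquery.ScalarQueryParameter("end", "DATE", params.get("end_date")),
        ],
    )
    client.query(
        """
        SELECT event_date, COUNT(*) AS events
        FROM `my-project.game_events.daily`
        WHERE game = @game AND event_date BETWEEN @start AND @end
        GROUP BY event_date
        """,
        job_config=job_config,
    ).result()

    return json.dumps({"looker": {"success": True}})
```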
This is a simple example of such an interactive dashboard: incident managers can select a game and enter the date ranges and other parameters they want to run the analysis on, and all this input is collected. Thanks to the LookML behind the scenes, with actions calling Cloud Functions, the data is processed and dumped back into BigQuery, and the user sees a completely new analysis that they ran by themselves. Next, I'm going to hand back to Vinay. Thank you very much.

Thanks, Ian; thanks, Carolina. The incident management solution is a good example of a data application: you can use Looker as a platform to build data apps powered by BigQuery and share them as a Looker Block within your organization. So, to summarize, a few calls to action. On BigQuery streaming, we have a new version, and there's a recent blog we have written on what the new streaming backend offers. Tomorrow there is a session that covers all the new enhancements happening in the data platform, including BigQuery; if you can attend, that would be great, as it covers the roadmap. And lastly, King has a very technical blog that describes the solution they built with Looker and, in general, what they're doing with BigQuery. With that, again, thanks for coming. Your feedback is very important for us, so please take some time to give us feedback, and enjoy the rest of the conference. Thank you.