DevFest 2020 Full Keynote



Hello, and welcome to DevFest 2020. I'm Jason Titus, VP of Engineering at Google, joining you from my home in California. Each year, we at Google truly feel the anticipation for DevFest. It's a one-of-a-kind series of developer conferences, and we're glad you can join us and the local developers within your community. Your local DevFest event this weekend is one of hundreds taking place all around the world right at this moment. What makes DevFest truly unique is that it's run by volunteer community organizers on a mission to help other local developers grow and share a passion for Google technologies.

Now, 2020 has been a year of radical change for people across the globe. But when difficult times came about, developers within this community came together online to plan a virtual DevFest where each local chapter can participate, helping each other teach and learn when we need to connect the most. And these connections can be so important. I can remember the first developer conferences I went to, where I met people who were already doing the kinds of things I dreamed of doing. They had started companies and built products, and I got to hear the stories of what had worked and what had gone wrong. We hope that all of you will get to have similar experiences at DevFest this year.

At Google, we've always believed that we are only successful when all of our users are successful. While Google and its products may plant the seeds, it's developers like yourself who make those seeds grow and thrive. This community of developers has shown us firsthand how you are using technology to help each other during times of need. Let me introduce you to my teammate David to share with you some great examples. Welcome, David.

Hi Jason, thank you, and welcome everybody to this year's DevFest season. I'm David McLaughlin, Director of Developer Ecosystems at Google. I've been with DevFest since the very first year in 2010.
Back then, when I was able to join a large percentage of them in person, DevFest was a series of targeted events in just a handful of countries. In the years since, we've seen massive growth to countries all over the world.

While I really do miss being able to join many of you in person this year, I'm happy to be able to continue the celebration of developers and tech in a new virtual format, and to bring people together to learn and to share what we're all building. Over the years, it's been great to see the apps and the solutions the broader community has built with the support of their local GDG chapters. We now have thousands and thousands of developers who are working to address real-world problems in their local communities, particularly during the challenging times of COVID. I'm impressed by how much your work has made people's lives better.

A few recent examples truly stand out to me. Community developers from GDG West Sweden recently worked with the Swedish government to create Hack the Crisis, an event focused on designing, testing, and executing ideas in response to recent challenges. One of the finalists, named Remote Gigs, helped modernize the Swedish government's employment website to match job seekers with remote work.

Nearby, in Romania, GDG groups organized a hackathon gathering 270 different mentors and students from across eight different countries. The hackathon brought together developers to save lives, to save communities, and to save businesses. Some of the winners included an online platform that connected volunteers with non-profits across the region, a system to help doctors work with each other and to ease online consultations, and a personal assistant for health and nutrition tracking. Another cool app that they developed was Risk Alert, an app that alerts emergency rooms that a patient with a very rare disease is about to arrive, so that the hospital can prepare for any special needs.

In Asia, GDG Tainan initiated the Mask Inventory Map at the start of COVID-19, which crowdsources mask inventory from local store data, combining it with the Google Maps API.
The project was so popular that it caught the attention of Taiwan's government, inspiring them to publish public data APIs on mask inventory.

These public data APIs are now managed by the government, and Taiwan's Digital Minister has since encouraged the GDG Tainan chapter to create more than a hundred different apps to help the local community.

In India, GDG Cloud Pune used machine learning with TensorFlow and GCP to help their community with an app that remotely analyzes dental health and helps patients book at-home dental services. They have over 300 training images so far, and the developer community is working with local universities to gather more than 50,000 dental images to improve the application.

In the Middle East, the region's largest International Women's Day event was held in April in Turkey by multiple GDG groups working together. They had over 2,500 attendees at the event, which included a raffle for online courses and talks on artificial intelligence and data mining. The event helped women across the community and across the region learn and apply their skills.

And of course, here in the United States, the GDG Memphis chapter has joined GiveCamp, where software developers in the area donate their skills to non-profits, as well as helping kids learn to code on the weekends.

These are just a small sample of the ways that Google Developer Group chapters and communities like yours, all over the world, are learning together and giving back what they've learned to really help make people's lives everywhere better. I encourage all of you to continue to be in community with your fellow developers. Think about how you can use the skills and the tech that you're learning this weekend at DevFest, as well as the networks and communities that you form and the new friends that you make. How do you combine all of these to make a difference in your own area?

Now let's take a look at a really inspiring story from Uganda. It shows how a community member started learning about machine learning and ended up building a really nifty app using TensorFlow.
It detects diseases in plants and helps farmers reduce crop devastation. Take a look.

Growing up in the city, I never expected to work in agriculture. But when fall armyworm attacked, it affected us all. Since its arrival in 2016, this crop pest has caused massive devastation. I've met farmers who have lost everything. Suddenly, they couldn't rely on maize, their staple crop. So I wanted to use my skills as a software developer to help. At a Google Study Jam, we taught ourselves TensorFlow. We started by building an Android app on top of an open-source API. The app allows farmers to spot infestations early, far beyond the capability of human eyes, and suggests an effective treatment. When I was younger, I didn't know what I wanted to do. But then I discovered app development, and I was excited to show people something they never thought possible. Machine learning gives us the advantage against fall armyworm, ultimately saving harvests and reducing pesticides. We are just getting started, and there are so many other sectors, like health and education, that machine learning could really improve. Farming is a crucial aspect of life in Uganda, and I feel proud to be part of a team driving to ensure our agriculture can continue.

Earlier, you saw a beautiful, inspirational video about how machine learning and Android were used to create an app to detect crop diseases.

So for DevFest, I wanted to get together a few of my friends from Google and beyond to show you how you could get started in building something just like that, from scratch, in a few minutes. We'll build an Android app and a web app. To get started with Android, let's see what Chet can teach us.

I want to create an app that's able to recognize information about plants. It's going to need camera functionality as well as machine learning inference. Let's see what that looks like in code. The app is written in Kotlin and uses CameraX to take the pictures and ML Kit for on-device machine learning analysis. The core functionality is in takePhoto, where we take a picture, analyze it, and display the results. First, we call takePicture on a CameraX ImageCapture object that was created earlier. One of the parameters is a callback object, which has this onCaptureSuccess function. We get the received image into the format we need for ML Kit, then we create an ImageLabeler object and process the image. When this succeeds, we receive a collection of image labels, which we turn into text strings and display in a toast with the results. Let's see what the demo looks like. We'll take a picture, and it says, "I see an insect and a plant." So that was pretty easy, rigging up CameraX and ML Kit to detect arbitrary objects in the camera view. But the results were pretty generic, because the dataset didn't have enough information about our domain. So let's dig a little deeper.

Okay, let's go deeper. Now we need a model for something very specific: detecting diseases in bean plants, instead of cassava. Let's explore how to build it. Along the way we'll use some of the great TensorFlow tooling available. Let's start with Colab. You can think of Colab as a cloud-hosted development tool; we will do all our coding in it, and you will not need to install anything on your machine. Let's start with a new notebook and set the title to Beans.
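As an aside, the label-to-toast step from the Android demo above can be sketched outside of Android. Here is a minimal Python sketch of that logic; the `(text, confidence)` pair shape and the 0.7 threshold are illustrative assumptions, not ML Kit's actual Kotlin API.

```python
# Sketch of the final step of the ML Kit demo: turn the labeler's
# (label, confidence) results into the message shown to the user.
# Shapes and threshold are illustrative, not ML Kit's real API.

def labels_to_message(labels, min_confidence=0.7):
    """labels: iterable of (text, confidence) pairs, e.g. [("an insect", 0.92)]."""
    # Keep only the labels the model is reasonably confident about.
    confident = [text for text, score in labels if score >= min_confidence]
    if not confident:
        return "I'm not sure what I see."
    if len(confident) == 1:
        return f"I see {confident[0]}."
    # Join multiple labels into one natural-sounding sentence.
    return "I see " + ", ".join(confident[:-1]) + f" and {confident[-1]}."
```

For example, `labels_to_message([("an insect", 0.92), ("a plant", 0.88)])` returns "I see an insect and a plant."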
We will need to install some packages that we are going to use later. These packages are not installed on your machine; they are on a cloud machine that was created for your Colab session. Nice, it finished installing the packages. Let's download the data and do some visualization to understand how our data is organized. Perfect, we downloaded the data. Let's take a look at some of the images so we can have a better understanding of what we are working with. Here they are: these are some of the images that will be used for training our model later.

Now that we have the data, we need to create a model. We are not going to create one from scratch; we are going to use a technique called transfer learning. TensorFlow Hub is a repository for TensorFlow models. You can find all kinds of models there; let's start with this one. Let's go back to our Colab and define a model handle. Nice, now we have the data and the base model. How can we do transfer learning? To do that, we are going to use one of the tools that I mentioned before, called Model Maker. Model Maker makes your life way easier when you need to do transfer learning. Let's create the spec for our base model, create our training variables here using the Beans dataset that we've just seen, and now we're going to put everything together with Model Maker by defining a model with the training data and the spec that we got from TensorFlow Hub. This will take a couple of minutes.

It finished training, and as you can see here, our accuracy is at 87 percent. Of course, let's evaluate the model with some data it hasn't seen yet and see how good it is. Nice, 95 percent. The TensorFlow Lite model we just created contains all the metadata Android Studio needs to recognize it and automatically build classes for it. To get started, you can update your build.gradle file to include the TensorFlow Lite dependencies.
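As an illustration, a module-level build.gradle for Android Studio's ML Model Binding usually looks something like the fragment below. The artifact versions here are assumptions for the sketch; check the current Android Studio documentation for the versions that match your setup.

```groovy
// Module-level build.gradle (sketch). Enables ML Model Binding so
// Android Studio generates classes for an imported .tflite model.
android {
    buildFeatures {
        mlModelBinding true
    }
}

dependencies {
    // TensorFlow Lite support and metadata libraries used by the
    // generated model classes. Versions shown are examples only.
    implementation 'org.tensorflow:tensorflow-lite-support:0.1.0'
    implementation 'org.tensorflow:tensorflow-lite-metadata:0.1.0'
}
```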
Then you'll want to import your generated TFLite file into the ml folder of your project. Let's check out the details of our imported model. From here, we can see an example of how to use the model in our app. Let's move over to the MainActivity class to take advantage of it. Inside of our image capture callback, here on line number 78, we create an instance of our model. Next, we use it to process the captured image, here on line number 84. And finally, here on lines 92 through 98, we display the results of consuming the output inside of a toast message. Let's run our app. Now, instead of telling us it's looking at a leaf or a plant, it can actually tell us if it's looking at a bean leaf and give a diagnosis. Sweet.

So this concept works, but it's very much a raw demo. What if we want to make this a more successful app? Well, we'd probably need to add services like authentication so our users can sign in, analytics and A/B testing so we can find out how our users are really interacting with our app, and some crash reporting or performance monitoring.

And an easy way to save our users' data to the cloud. Luckily, that's where Firebase comes in. The new and improved Firebase plugin in Android Studio makes this simple. I'll start by adding some analytics so I can find out exactly how our users are interacting with our app, and the plugin does most of the work to get the library integrated into my project. Now that I've done that, we can get an instance of the library up here, and then we can log what kind of results we're getting from ML Kit. Once we've done that, there are a lot of ways to get at this data; it'll start showing up here in the Firebase dashboard. But one really fun way of viewing this data is to use StreamView, which gives you a near-real-time sample of the analytics results we're seeing. Looks like I've already recorded several of these select_content events, and I can dig into these event properties and see what kinds of results our users are getting. I could start using that information to refine my ML Kit model, or to A/B test different alternatives. Firebase helps you build better apps, and analytics is just the tip of the proverbial iceberg. Maybe we could let our users upload their own pictures and store them in the cloud using Cloud Storage for Firebase. There are so many possibilities.

This is a sample app, but if we were to productize it, it's important to keep in mind how our AI design decisions impact our users. For instance, we need to consider whether, and how, it makes sense to display confidence intervals to help your users interpret the ML model output. Or how the design of the onboarding experience sets user expectations for the capabilities and limitations of your ML-based app, which is vital to app adoption and engagement. For more guidance on AI design decisions, check out the People + AI Guidebook at pair.withgoogle.com/guidebook. This use case focuses on plant diseases.
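The confidence-interval guidance above can be made concrete: rather than always asserting a diagnosis, the app can vary its wording with the model's confidence. Here is a minimal Python sketch; the thresholds and the message wording are illustrative assumptions that would need real user testing, not recommendations from the People + AI Guidebook itself.

```python
# Sketch: choose how to present an ML prediction based on confidence,
# so low-certainty outputs are hedged instead of stated as fact.
# Thresholds and copy are illustrative; tune them with user research.

def present_prediction(label, confidence):
    """Return the message the app should display for one model output."""
    if confidence >= 0.9:
        return f"Likely {label} ({confidence:.0%} confidence)"
    if confidence >= 0.6:
        return f"Possibly {label} ({confidence:.0%} confidence). Consider retaking the photo."
    return "Not enough confidence to suggest a diagnosis."
```

The exact cutoffs are a product decision, not a property of the model, which is why this belongs in the design process rather than in the training pipeline.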
But for other use cases, where our ML-based predictions intersect with people or communities,

we absolutely need to think about responsible AI themes like privacy and fairness, which you can learn more about at tensorflow.org/resources/responsible-ai.

And don't forget about the web. I've built a PWA that can be installed across all your users' platforms. It combines the web camera with TensorFlow.js, and by integrating machine learning, we can make an amazing experience that runs across all browsers. Let's take a look. We have our standard project layout, with an HTML file, a manifest, and a service worker to make it a PWA. We have some styles to make it look good, and a data folder that contains our TensorFlow configuration and the trained model that we're going to use in the app. Now to the heart of the project. Let's go back to the HTML file and see what's happening: we're also loading the webcam object. This is just a class that wraps some boilerplate logic to make it easier to pass camera data from getUserMedia to TensorFlow. Now let's dive into our app logic in index.js. I'm just going to use Chrome and the debugger here, so you can see how easy it is to integrate machine learning into your application. Let's get started by clicking the classify button and getting the machine learning gears into action. Immediately, we break into the TensorFlow tidy function; this is just there to help you clean up any of the memory that TensorFlow uses while it makes a prediction. We get our image from the web camera, and then we pass that image into the machine learning model to make a prediction. Once we've got the prediction, we access the data, and then we can use that data to update the user interface based on any application logic that we want. And that's pretty much it.

Great, so now you have the platform for building a real app, with the tooling from Android Studio, and the APIs from CameraX, Jetpack, ML Kit, Colab, TensorFlow,
Firebase, Chrome, and Google Cloud. You have a lot of things that just work better together. This isn't a finished project by any means, just a proof of concept for how a minimum viable product with a roadmap to completion can be put together using Google's developer tools and APIs. You might also want to open source this project too, so developers can suggest improvements, optimizations, or even additional features by filing an issue or sending a pull request. It's a great way to get your hard work in front of even more people. We'd love to help you with this, and you can learn more about the process at opensource.guide, in its guide on starting an open source project. Indeed, we've already open sourced the bean disease sample we discussed in this video, so you have a great place to start.

Thanks, Puja. As you mentioned, open sourcing a project is a great way to make it grow and inspire people to adopt and extend it. If you want to learn more about what you've seen in this video, please visit us online.

Hi developers. My name is Annie Jean-Baptiste, and I'm the head of product inclusion at Google. The demo you just saw shows how Google's products can come together to create an amazing app. But what about product inclusion?

You may be wondering, well, what is product inclusion, and why is it important? At Google, we believe that giving power to new voices is the core of innovation. When we bring an inclusive lens to the product design process, we amplify underrepresented voices and allow all users to feel seen and validated in the moments that matter for them. We look beyond ourselves and seek out diverse voices to shape the products that we build. We also believe that we have a responsibility not to disappoint our users, no matter who they are, what they look like, how much money they make, who they love, how old they are, or anything that makes them them. And when we make difference the new normal, we will usher in new opportunities to grow our business by earning the love of all of our users. So whether you're 1 or 105 years old, live in a city or a remote village, on Wi-Fi or cellular service, Google is there for you to make sure you have the answers you need, when you need them.

Product inclusion is exactly what it sounds like: bringing an inclusive lens throughout the entire product design process to create more inclusive products for our users. So when you're developing your own apps, I challenge you to incorporate the principles of product inclusion into the design process, because we believe that you can do well and do good by being intentional about including underrepresented voices at key points in the product design process. Remember: those closest to the problem are also closest to the solution. You can learn more about product inclusion at Google online. Thanks.

Hi everyone. My name is Jen Cole, and I'm the Global Developer Communities Program Manager at Google. Now it's time to meet and hear from your local developers. So, what can you expect next from your local DevFest? We'll have technical talks, breakout sessions, networking opportunities, and more. These sessions will cover a variety of technologies, such as Android,
Google Cloud Platform, machine learning with TensorFlow, web, Firebase, Google Assistant, and Flutter, with speakers from Google, Women Techmakers, Google Developer Experts, and your local community. Be sure to claim your badge for participating in DevFest 2020 on the DevFest 2020 site. You can earn even more badges by mastering different Google technologies, available on Google Developers Learn. As a reminder,

One of the most important parts of DevFest is providing an inclusive and harassment-free experience for everyone. As active participants of DevFest today, we can all agree to treat everyone with respect and to speak up if we see or experience harassment of any kind. Together we can create an environment that is welcoming and inclusive to everyone with us here today. Follow @gdg on Twitter for highlights from DevFest around the world, and try out the DevFest AR filter on Instagram. Share what you're learning, or your favorite part about the event, on social media with the hashtag #DevFest. And lastly, don't forget: community is all about getting to know one another. Use the virtual breakout rooms and chat feature to connect with other developers near you with shared interests. We hope you enjoy DevFest 2020.

2020-10-24 08:42

