ML/AI with Zack Akil: GCPPodcast 206



Hi, and welcome to episode number 206 of the weekly Google Cloud Platform podcast. I'm John, and I'm here with my colleague Gabby. Hey Gabby, how's it going? Doing well, John, how are you? Pretty good, just getting over the Thanksgiving dinners and regretting how much food I ate. How was your Thanksgiving, how were your holidays? You know, it was nice because I got to rest: I got a blessed week off and strung all the days together, but by the end of it I was tired of resting, which is weird, but it's a good problem to have, right? Well, it's not a bad problem to have a lot of time to rest. I barely got the chance, I was driving everywhere for the holidays, but glad to hear that you had a great time resting.

On this episode we're going to be talking to developer advocate and machine learning engineer Zack Akil, and learning from him what you can do with ML and creativity. He's very creative when it comes to building things from scratch with his hands, and you'll hear from him how he takes a lot of inspiration from the things he sees around him to build really cool demos. Yeah, he's very passionate about that, and it shows; it's really nice. We're also going to get into our question of the week, which is how to run Cloud Functions in a local environment. I'll get to that later, but you'll learn how to test a function locally, say when your internet isn't working. But before that, let's get into our cool things of the week.

We now have Stackdriver logging in Google Cloud Code, which is our extension for Visual Studio Code and can also be used on IntelliJ. What happens now is that you can see the logs inside Visual Studio Code without having to go to the Google Cloud dashboard, and there is a link showing how to enable that on your favorite
IDE. Pretty much any extension for Visual Studio Code is so lightweight that I think a lot of developers tend to gravitate towards it, so that's pretty cool, and the performance is usually very good. My cool thing of the week comes from my recent travels to KubeCon: right before we did our workshop, we released version 0.8 of Open Match, and 0.8 is really focused on the developer experience, so there are a lot of getting started guides and tutorials. In the documentation you'll see they really break down the most common use cases, with sample applications that you can build using Open Match. So if you're interested in creating a scalable matchmaker, take a look.

The other one is, if you're familiar with what we call common table expressions, it's a way to structure queries and refer to subqueries without having queries nested inside queries inside queries. The WITH clause, which lets you do that, is now available in Cloud Spanner. So you will be able, John, to create a subquery and then reference it in the next query you write, without having to copy and paste code, which in most cases is not a good idea. It creates a temporary table at execution time to process that information for you, so you gain performance and more visibility into your code. That's going to do it for our cool things of the week. Let's go talk to Zack and see what he has to say about AutoML Vision, AutoML Video, and the cool demos he's been working on.

On this episode we're going to be talking to Zack Akil, a very interesting person and a very good friend of ours. Zack, can you tell us about who you are and what you do? Yes, thank you very much, John. I am a machine learning engineer and developer advocate for the Google Cloud Platform. And what specifically do you work on? My role and projects are all focused around trying to inspire developers to implement machine learning into their projects.
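To make the WITH clause from the cool things of the week concrete, here is a minimal sketch. It uses Python's built-in sqlite3 rather than Cloud Spanner, since WITH is standard SQL and the shape is the same; the table and numbers are invented for illustration.

```python
import sqlite3

# Toy data standing in for a real table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'east', 100.0),
        (2, 'east', 250.0),
        (3, 'west', 75.0);
""")

# WITH defines `regional_totals` once; the outer query then references
# it by name instead of nesting the subquery inside itself.
rows = conn.execute("""
    WITH regional_totals AS (
        SELECT region, SUM(amount) AS total
        FROM orders
        GROUP BY region
    )
    SELECT region FROM regional_totals WHERE total > 200
""").fetchall()

print(rows)  # [('east',)]
```

The same query without WITH would repeat the grouped subquery inline everywhere it is used, which is the copy-and-paste problem described above.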
What is your favorite machine learning tool on Google Cloud Platform, the one you recommend and like the most? The products that I focus on are the machine learning APIs and the AutoML products, which we like to call the building blocks of machine learning. They allow you to gain some power and value from your data without necessarily needing

to know all the in-depth machine learning knowledge. Awesome. So we've mentioned a lot of the AI and ML tools and AutoML tools in our cool things of the week when we do product updates, so can you give us your definition of AutoML Vision and AutoML Video, best practices, and maybe what they're best used for? The most common use case I always tell people about is this: if anyone's ever used Google Photos on their phone or on their Pixel, they're able to search through their own images based off of tags which they themselves didn't put in. So if you've taken a picture of the beach, you can search for the word "beach" and it will find pictures of beaches from your own photos. That's actually a feature that's built on the Vision API: you can upload any image and it will automatically annotate it for you with the objects that are inside the image, and it can even do things like OCR, or text recognition, within the image. We commonly see that used when people have a large store of images or videos, to automatically annotate them to make them easier to search through, or to gain some insight about what videos or pictures people are uploading into certain services. That explains why, when I look at Google Photos and search for "winter", I can see snow, or hoodies, or me with a beanie. Is that correct? Yeah, those are the kinds of tags that will surface when you search for those sorts of things. Awesome. That also explains why it's so easy for me to find a lot of my game images that I take screenshots of and hopefully try to recreate later on when I'm practicing on my own. Exactly.
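A toy sketch of that search-by-tags idea: in the real feature, the label list for each image would come from the Vision API's label detection, but here the labels are hard-coded stand-ins so the search logic is visible on its own. All filenames and labels are invented.

```python
# Labels per photo. In practice these would be the annotations the
# Vision API returns for each uploaded image; here they're invented.
photo_labels = {
    "IMG_001.jpg": ["beach", "sea", "sand"],
    "IMG_002.jpg": ["snow", "beanie", "winter"],
    "IMG_003.jpg": ["beach", "sunset"],
}

def search(query):
    """Return every photo whose auto-generated labels match the query."""
    q = query.lower()
    return sorted(name for name, labels in photo_labels.items() if q in labels)

print(search("beach"))   # ['IMG_001.jpg', 'IMG_003.jpg']
print(search("beanie"))  # ['IMG_002.jpg']
```

The user never typed "beach" anywhere when uploading; the model supplied the tag, which is what makes the search feel like magic.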
Now with AutoML Video, what is the most common scenario: is it streaming, or some other kind of annotation like AutoML image classification? What are the main use cases? For video, the main use case is video archive and search. A lot of the time companies, especially companies that have their own storage clients or are storing a lot of videos, things like media companies, want to store the videos and search through them quickly without having to go through and annotate them. A use case that someone came to me with recently was about video editing: especially if people are recording niche video scenarios where they're interviewing people, they maybe want to get all the clips where people are laughing. That's not a feature that's built into any API at the moment, but it is a feature that can be trained using AutoML Video. So if you want a special type of detection which detects when people are laughing, or maybe special camera movements, you can train your own AutoML model to detect those, so it will automatically retrieve them when you want. Awesome. So you really focus on AutoML Vision and Video. When I first met you, Zack, maybe earlier this year, you had a demo running on your desk where if you placed an item in front of a camera, it would tell you whether or not the item was healthy, and I think in the sample you were using fruits and candy. It seems like you have a maker's approach to all of the work and demos that you do, so can you tell us a little bit about how you come up with those ideas? Absolutely. It all goes back to when I first got into tech: I got an Arduino, which is a small prototyping electronics board for making projects, to make LEDs and buttons light up, and that was my first intro into programming. When I started learning with that, I found it a lot more rewarding seeing
even just a single LED light up compared to having some text on the screen move in a different way. Since then I fell in love with making physical things happen, with physical hardware demos. Then after university I looked into machine learning, and there were a lot of really cool machine learning applications that were running on UIs and desktops and exporting CSV files, and I really wanted to combine my newfound interest in machine learning with my old passion for hardware and electronics. So now when I come up with ideas, I'm always trying to combine machine learning and embedded technologies, just because I think machine learning is one of the most interesting new technologies that people are getting into, and hardware electronics is one of the most fun ways to interact with any kind of technology. So I just always try to combine the two. You also created a demo that I think won a prize, with a bike and ML. Can you tell us more about that? Absolutely. One of my more recent projects is a smart, AI-powered bicycle which performs real-time image classification of what's happening behind the cyclist. So as you're cycling down the road and, say, a large truck is approaching from behind, your handlebars will light up red to indicate there's danger, and they'll light up on, say, the right-hand side of your handlebars if the truck is on the right-hand side. And if the truck pulls from the right to the left, you'll see a kind of blob of LEDs move across your handlebars to represent the truck. And I

built this using the Coral Edge TPU, which is one of Google's products for running machine learning fast on embedded systems; that plugs into a Raspberry Pi, and I built it onto a physical bike. We recently had an internal hackathon, and it won first prize there as just a hacky project and quirky implementation of machine learning, but it's actually a project I plan to continue working on until I can get a more solid working concept going, and potentially work with local bike companies in London to see if we can get it onto real bikes for people. That's impressive. I would assume that you're a cyclist yourself and there's probably some real-life inspiration behind this project? Oh yeah, absolutely. There are some horror stories around being heckled on the road, like accidentally pulling in front of people without seeing that they're there, and knowing that I couldn't really do anything about it, because when you're cycling in London, and I'm sure it's similar for other busy cities like New York, you really can't turn around to check, because you're constantly having to look forward. It's just like an extreme sport, is the way I always classify it. And the current technology to help people be more aware on the road hasn't really gotten past mirrors. They're ugly, and they don't run machine learning. So that's why I came up with this. That's awesome. It's kind of funny, because in New York you get the opposite: it's the cyclists who heckle you when you're driving through the city. The funny thing is, I built a travel version of this bike, like a flat-pack IKEA model of a bicycle that I could take around and assemble on stage to show people. I took it down to Sydney and showed it to people, and they all loved it because Sydney has a similar problem, but they also notified me of a specific
regional problem in Australia, which is that for cyclists the biggest danger isn't necessarily trucks or cars, but actually territorial birds that will swoop down on cyclists. Yeah, and it's a very, very common issue in Australia. In fact, it's very common to see cyclists riding around with zip ties attached to their helmets to create spiky hair, to scare the birds away from attacking them. It's a crazy problem. The initial version of the bike was just using an off-the-shelf model that had traffic detection, but I was thinking of making an Australian version of the bike with a model trained in AutoML that covers both traffic and territorial birds, so that it can detect: there's a car, there's a car, there's a truck, and there's a falcon coming at the back. What's cool is that with AutoML it's actually quite straightforward, because AutoML now supports exporting object detection models to the Coral Edge TPU. So you can have it in your hand without having to connect to the internet or rely on 4G. Exactly. You can take a model that's trained in the cloud on AutoML and just download it to run locally on the Edge TPU, so you don't need a connection to the internet for it to work. And

about that, you also presented that demo at Cloud Next London. Yes, yes I did. How was the reception? I always start the talk by asking who in the room is a cyclist. Fortunately, it seems a lot of people in tech like cycling, because it's always the entire room that raises their hand, so they seem to really like it. I also ask, okay, who cycles in London? Usually it's about the same number of people, if it's a talk in London of course. Then I'll follow that up with: okay, who would recommend cycling in London to a family member? And then all the hands go down. So everyone's on the same page: we all cycle in London, and we all think it's very, very dangerous and wouldn't want anyone else to do it. I show them the bike and they seem to really like it, and then people always come up afterwards. Actually, every time I've given the talk, there's been someone who comes up afterwards with an injury they got that week from cycling in London, saying "I really could have used this this week." So it really resonates with a lot of people. And I do remember, when you were in New York, you were working on that image classification thing? Okay, I was doing a little side project, yes. It was actually my first time doing image generation, where you can get AI to generate new images for you. I used TensorFlow.js on my personal website, zackakil.com, to generate the actual title of the website in real time: as you're moving your mouse around, it's generating the title. It's completely useless, it doesn't solve any problems, it was just a bit of fun. That's how I have fun, experimenting with new AI. Beyond the demos that you've built, I think you have a lot of interaction with other developers, being a developer advocate. Did someone else come to you with a really cool, interesting implementation
from AutoML or the ML APIs, beyond that laughing one you mentioned? Do you have any other examples? Well, at Next San Francisco I gave a lightning talk about taking a video and using the Video Intelligence API and the Vision API to automatically bleep out any time a person says an acronym. Immediately afterwards, someone came up to me and said they wanted to use this kind of technology, with an AutoML Video model, to detect when someone is talking about a specific subject. At the time, I think Game of Thrones was still pretty hot, and they wanted to detect when, say, there was a video of someone being interviewed and they gave away a Game of Thrones spoiler, and bleep it out when they say the spoiler: "hang on a minute, I thought Ned Stark was the main character, and then..." bleep. So they were going to use AutoML to build a custom model to detect that, which was pretty cool. They should do that for trailers too. I can't stand it; I don't watch trailers because they spoil things for me, they reveal everything, and I want that everywhere. That would be perfect. I feel like it's gone crazy now with the amount of trailers that just give away the whole film for

free. That would be a good use case for an AutoML Video detection model: automatically trim down the trailer to just keep things interesting. So Zack, I know that you're quite the athlete; I know that you play rugby. Yes. When you came to visit New York, you just randomly decided to go play pickup dodgeball. I tried to find rugby, but they didn't have it, so I settled for dodgeball. So do you see yourself using ML in the sports and recreational activities realm? Can you tell us about some demos that you've done in that space? Absolutely, that was my original inspiration to get into machine learning. It was actually to build automated camera equipment for filming grassroots rugby matches, because at school level I didn't play rugby initially, but I recorded the matches, and then when I started playing I really wanted people to record my matches, but nobody wanted to do it, because they didn't want to stand out there in the freezing Irish weather holding the camera steady. So my first ever physical project was actually a camera that would automatically follow rugby matches and record them. I think I gave my first ever talk, at a PyData conference, about that demo. And any time AutoML releases a new feature, my first example demos are always to do with sport or sport tracking. I'm actually already talking with some of the lead coaches of the Great Britain tag rugby teams about using AutoML Video tracking and things like that, in combination with drone footage that's already been taken of training sessions, to try to track players and track their acceleration and things like that. So yeah, absolutely, expect to see quite a few sport-related demos, especially in summer time, not so much in the winter. Too cold. Except if it's curling; you could do something for curling too. That could be nice, like tracking the curls and stuff. It's not big, I
just, like I said, no one came at my right, I'm just perfect. So, what is the demo that you're working on now? Well, we're just finishing up our work on our most recent demo, which was in partnership with the Football Association here in the UK. It was actually the main demo of the Next London conference, and it involved three camera phones, actual Pixel 4 phones, recording a penalty kick at a full-size football (or soccer) net. It would upload all the video footage to the cloud using Firestore and analyze it with Video Intelligence and all of that, so that a person could go up, take three kicks, and immediately see a dashboard with the speed and accuracy along with the style of their kick, and get their own player profile card printed out for their sticker book. Oh, that's awesome. How long was the delay from the kick to having the card, from kick to card? The printer itself took about a minute. Getting the kick, with both the accuracy, power, and style calculations, took about 15 seconds, conservatively. Sometimes a person would forget to stop recording and it would take half a minute, but around 15 seconds to get all the data analyzed and into that image. Fifteen seconds, impressive, that's pretty cool. I'm kind of curious what models you were using when it comes to following trajectories, because I would assume that if you're bending a ball on a shot, there has to be some way of tracking the trajectory and then hopefully coming up with a more precise calculation of, I guess you could call it, the style. So, the way we calculated the accuracy, and this is probably the bit of the project I was most proud of, is that we used AutoML object detection. We initially just wanted to track a football moving across the screen, and there are
Already lots of public, data sets that, have footballs. And other sports balls labeled and, we actually use the public. Cocoa dataset which you can download and I actually took that and dumped, it into auto ml object detection scraping. Out just the photos, of balls and then, trained a model that would track the soccer. Balls and that. Was the initial model it, performed, all right until. The, balls were kicked too, fast because, then they'd stop looking like sharp soccer, balls and we're just like very smudges across the screen and, that's when we actually just collected a bit more data and labelled it in the autumn L UI, which, is a good use case for slightly, more customized, models because the, the, public data set didn't have blurry soccer balls, after.

that, it was performing at something like 99% accuracy tracking these blurry objects. Then, to calculate accuracy, we just recorded someone taking a kick and, for each frame, detected where the ball was. I have a bit of a data science background, so I played around with visualizing things like: okay, what's the speed of the ball in pixels, and what's the trajectory of the ball in pixels? Eventually I stumbled across a way of detecting when the ball actually made impact with the goal, by continuously checking the direction the ball is moving in: as soon as the direction changes suddenly, that's when it had an impact with something, and I would take that as the point of impact. We were also using AutoML to track the targets we had hanging in the goal net, and once we know the impact point of the ball and the positions of the targets, we just did a little distance metric to calculate accuracy. That's really cool that you found a workaround, because I know that when it comes to sports imaging, they typically use cameras with really high refresh rates so you can capture images really quickly, but ML can solve that issue with just your standard phone. I believe you said you used Pixel 4 phones? We used Pixel 4 phones, yeah, though when we were building the initial prototype we were using my own Pixel 2, because most Android phones have slow-motion capture of at least 120 frames per second. That was fast enough to capture even people really hammering these kicks, and it was tracking them no problem. It's pretty awesome that you're using machine learning to do the heavy lifting rather than getting more high-end technology. I think the original spec of the demo was going to go for these high-end, high-speed cameras, but my hackery nature kicked in. I was like, no no, I just want to use a mobile phone, because it's simpler and more people have access to mobile phones; and rather than having a model that only detects a sharp soccer ball, we'll just train the model on the blurry ones. That's a fun way to think as a developer, man. Yeah. Did you have to think of that under pressure, or did you have time to do it? I'm a notorious
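The tracking-and-impact logic described above can be sketched in a few lines of Python. The per-frame positions, the target location, and the 300-pixel scoring scale below are all invented for illustration; in the real demo, the positions would come from the AutoML object detection model.

```python
# Sketch of the approach: take per-frame ball positions, flag the frame
# where the ball's direction of travel flips as the impact point, then
# score accuracy as pixel distance to the nearest target.

def find_impact(positions):
    """Return the index of the frame where horizontal direction reverses."""
    for i in range(2, len(positions)):
        dx_prev = positions[i - 1][0] - positions[i - 2][0]
        dx_curr = positions[i][0] - positions[i - 1][0]
        if dx_prev * dx_curr < 0:   # sign flip: the ball bounced off something
            return i - 1
    return len(positions) - 1       # no reversal seen; fall back to last frame

def accuracy_score(impact_point, targets, max_dist=300.0):
    """Map distance-to-nearest-target into a 0..100 score (made-up scale)."""
    x, y = impact_point
    d = min(((x - tx) ** 2 + (y - ty) ** 2) ** 0.5 for tx, ty in targets)
    return max(0.0, 100.0 * (1 - d / max_dist))

# Ball flies right, hits the net around x=400, then rebounds left:
track = [(100, 200), (200, 190), (300, 185), (400, 180), (360, 185)]
hit = track[find_impact(track)]
print(hit)                                       # (400, 180)
print(accuracy_score(hit, [(410, 170)]))         # distance is about 14 px
```

At 120 frames per second, a single sign flip in the ball's direction is a robust impact signal, which is why the slow-motion capture mattered.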

procrastinator, so to answer both questions: yes and yes. I had time, and it was also under pressure, because I didn't utilize my time as well as I could have. So I'm pretty sure you had to take a couple of shots yourself; can you tell us a little bit about your card? I did check my card this morning, and I remember the scores exactly: I got 91 on power, 96 on style, and 84 on accuracy. Wow. Which was about average, because we set a bare minimum: it was impossible to get a score below 50 on any of the stats. That's super impressive, man. I think anything over 80 was good, you know. So, can you tell us about anything interesting that developers are using AutoML for, or maybe something you have in your pipeline for demos coming up? Absolutely. What I'm looking to do is use AutoML Video tracking to build sports recognition technology, so that we can track players on pitches and work out new tactics for players. Is there anything that we missed that you'd like to mention before we wrap things up? The only thing currently on the horizon for me is using object detection exported to Edge TPUs for version 3 of my Edge TPU bike. Where can people find you? Are you going to be traveling anywhere? Can we expect some cool demos that you're going to open-source, or maybe they can follow you on your social media? Absolutely. I am on Twitter at @ZackAkil, and you can find me around in London: I run a London meetup group for people who are looking to get into any kind of data science or practical machine learning, called Central London Data Science. So yeah, Twitter and London. No conference is coming up; well, I'm going to be at DevRelCon, I think. Yeah, yeah, for sure. So Zack, it's been an absolute pleasure having you on this episode. Really looking forward to seeing a lot more of your cool demos, and hopefully we can hang out a little bit more when you come back to New York City. Absolutely.
Thank you, Zack. Thank you so much, it's been a blast. Thank you, Zack, for talking to us. And now we're going to do our question of the week. John? So, our question of the week is: how do you run a Cloud Function in a local environment? Gabby, the cool thing about serverless architecture is that you don't need to worry about infrastructure, but sometimes you want to test something without deploying it first, which is a good practice. You can now use the Google Cloud Functions Framework, which is a package for Node.js, where you create a Cloud Function locally, it serves your Node.js application using Express, and you can hit the URL it's serving. That's your Cloud Function running locally, without necessarily having to connect to the internet. It makes your testing easier, and you don't have to wait for a deploy every time you change something in a function, so I think that's very useful. Awesome, serverless without the internet. Well, it's not quite the same thing, but it kind of helps with small-scale scenarios, you know? Right. So Gabby, where are you going, any future travel to wrap up the end of the year? No more work travel, because conference season is finally over. I'm going with my family to Brazil, so I'm going to spend the holidays there. It's going to be a family gathering, which is nice. And you? I just wrapped up the last of my conference travel and any other travel that I'm having, but what I will be doing is going to a bunch of baby showers; everybody just decided to have kids now, so I will be attending a couple of baby showers, one of them being for my twin

brother's. And at the wedding, we just found out that they are having a boy, which he's really excited about. So yeah, you kept that secret from me! I definitely did. But I'm very excited for them, and very excited for all of my other friends who are expecting kids. That's cute. Thank you all for listening to this episode, and we're going to be back next week. I will see you later. See ya.

2020-01-01 14:05

