Beyond the Hype: Real-World Edge AI Applications
(upbeat music) - Hello, and welcome to "insight.tech Talk," where we explore the latest IoT, edge, AI, and network technology trends and innovations. I'm your host, Christina Cardoza, Editorial Director of insight.tech. And today we're going to be exploring the embedded systems industry with two special guests. First, we have Brandon Lewis, longtime friend and contributor to insight.tech, who will be guest hosting the podcast today.
And joining him is Alex Wood, Global Marketing Director of Avnet. But as always, before we get started, let's get to know our guests. Brandon, I'll start with you. What can you tell us about what you've been up to these days? - Sure, so I've been doing a lot of coverage still of the embedded and IoT space.
I've been getting a lot closer with developers recently, seeing what they're working with, what they're working on, from a tools and chipsets perspective, which is always a lot of fun getting closer and closer to the action. - Yeah, absolutely. And looking forward to the conversation today with Alex. Alex, welcome to the podcast.
What can you tell us about what you do at Avnet and the company itself? - At Avnet, we've just launched our new compute brand, Tria, which is born out of the old Avnet Embedded business, which used to be Avnet Integrated, which used to be MSC Technologies. So it's a bit of an evolution over the last few years, but we've got a really good, strong brand now. So I've kind of been focusing entirely on launching that new brand for the last six months. It's nice to take a step back and have a cup of coffee with you and talk a little bit more about technology and less about branding guidelines. - Absolutely.
Before I throw it over to Brandon, Alex, I wanted to start the conversation, especially since you have this new product coming out. What are the technology trends? What's going on in this space that you guys launched a new line, especially with edge and AI becoming more prevalent in the industry? What are some things that you see going on? - I think we're at a nexus point, really, in the industry. With AI, there's a lot of emphasis on putting things into the cloud, and there's a lot of pushback from people that want to put things on the edge as well. So you've got one half going to the cloud, the other half going to the edge, and both of them have their own challenges and potential setbacks. So that's really what we're seeing at the moment: customers are saying, "We want to leverage this, but we're not entirely sure how we can leverage this."
And it really is a sort of, there's no perfect silver bullet. So we've got to find the right path for our customers. - It's interesting you bring that up because obviously a lot of people who are going to the cloud are really looking for things like more performance, working with bigger datasets usually. And on the edge, you tend to think that you have a need for lower power, quick inferencing. But then we see all these GPUs and stuff coming out, these huge, powerful GPUs. And I wonder, number one, what are some of the applications that are driving things at the edge, you see? And then do they really need, do they need the performance of a GPU? Can they get away with something else? Like, are you seeing more low power at the edge, more performance requirements? What is it? - Yeah, I think power is the key thing, right? That's going to be the make or break for AI.
At the moment, AI is super power hungry. It's consuming a vast amount of energy. It's making Bitcoin look almost power efficient right now, with the amount of power it's consuming. And I think for a lot of businesses, people don't realize how much power AI applications consume, because they don't see it.
They've sort of outsourced the demand. Like you run an AI application at the edge, it's hugely power hungry. And you have to deal with that problem, the power and the heat at the edge.
If you're sending it off to a data center, you don't see the challenges that it brings up. So it's easy for people to forget about that. So I think reducing the power requirements of performing these applications is going to be a key challenge.
And that's going to make or break whether or not AI sticks around through this hype cycle, depending on how you define AI and how it works. And accessing those large data models, being able to process things and also absorb data and process it in real time. The applications all require more efficient, more energy-efficient, more heat-efficient processing.
And I think that's going to be the challenge. - You're a marketing guy. By the way, thanks for the rebrand. - Sorry. (laughs) - No, no, no. I was going to say, I love Avnet.
It was sometimes confusing which Avnet I was referring to, right? So I think the rebrand was great with Tria. So this push of a lot of these, what we would consider embedded, super big or super high-performance processors, based on what you said, is this marketing? Are we just marketing to sell more units? Do we really need that? And what are some use cases that you're seeing, real-world use cases? We all hear about computer vision and stuff like that, but what's the reality like from your customers? - There is that speeds-and-feeds element of marketing. So it can perform an extra amount of TOPS.
It can clock at this frequency. It's got even more RAM. And if I'm building my gaming PC, then that's a sort of like, "Oh, this is great.
I want to be able to get this extra amount of frame rate. I want to be able to render videos much faster." But at the same time, you then have to deal with the trade-offs. Like when I upgraded my graphics card at my last upgrade, I had to get a PSU that was twice the size of the previous PSU.
And you're just like, "Wow, I'm pushing a thousand watts now to run a proper PC rig" when it used to be like 300 watts was a lot. That's triple the amount. For customers, that's the issue. We had the energy crisis recently.
That brought it to the top of the agenda. And now it's eased off a little bit for now, but it's not so long before I think it's going to come back up again. The energy consumption, the power is going to be critical.
So it's not so much about getting a more powerful processor, the most powerful processor. It's about balancing consumption, longevity, capability, specific to the application. For customers like that, okay, there is a marketing element.
You want to buy the absolute top of the range, the flagship processor, when actually you might not need that. But sometimes you do. And it depends on the application, what you're going to do. I'm the least marketing-y marketing guy in that respect. I'd rather sit down with the customer and say, "Okay, tell me what you're actually building," rather than just say, "Yes, you need the top of the range. You need the i9 immediately." - What are they building? What have you seen? - There's loads of different things that we're working with customers on at the moment. And a lot of the applications, I mean, there's a crazy amount. Everything from new farming applications. I was reading about an opportunity, mentioning no names. There's a lot of articles at the moment about more efficient farming and artificial intelligence being used as an alternative to things like dangerous forever chemicals that are being put into the soil.
So can you train an AI robot to move around fields and identify weeds, being able to tell weeds and pests apart from crops and non-harmful animals, and to act accordingly? One of my friends works in the farming industry here in the UK, in farm management. So crop checking. He has to walk through fields, taking photos of the different plants, and then educate people working in the fields to tell the difference between the different varieties of the plant and which one to select for breeding to build the best crop. And you can create an AI application in the field that does that for you. You don't necessarily want to put all of that content into a data center.
You want to be able to program the robot at the edge to be able to do that. So we're seeing applications like that in agriculture. And those are edge-based applications. You don't necessarily have a reliable cell data connection all of the time.
You want to be able to do that edge-based AI recognition. And then at the opposite end of that spectrum, so you've got the massive industrial agriculture use case, and then we've got automatic lawnmowers for people at home and being able to map the best path around the lawn, but then also being able to spot hazards and deal with hazards around the lawn as well. So one is a sort of great future-facing altruistic solution. The other one is a more practical real life solution, but it's those practical challenges in the real world that really put the technology to the test.
- Are both of those vision applications, I'm assuming? Like camera vision? - Yeah, yeah. Both customers, one is vision, one can be more of a radar sensor application, but vision is where the jump is in terms of the processing requirements. So that live vision AI: being able to understand what it's looking at as quickly as possible, identify it reliably, and act on that identification instead of having to send signals back to a data center for crunching and then get it back again. It's being able to do that in a short amount of space and a short amount of time.
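The edge-versus-cloud latency trade-off described here can be made concrete with a back-of-the-envelope sketch. The numbers below are purely illustrative assumptions, not figures from the conversation: a slower on-device inference can still beat a faster data-center GPU once network round trips are added.

```python
# Hypothetical numbers, for illustration only: compare end-to-end latency of
# running a vision model on-device versus round-tripping frames to a data center.

def on_device_latency_ms(inference_ms: float) -> float:
    """Edge path: only the local inference time counts."""
    return inference_ms

def cloud_latency_ms(inference_ms: float, uplink_ms: float, downlink_ms: float) -> float:
    """Cloud path: network transfer both ways plus (typically faster) inference."""
    return uplink_ms + inference_ms + downlink_ms

if __name__ == "__main__":
    edge = on_device_latency_ms(inference_ms=45.0)   # assumed modest edge accelerator
    cloud = cloud_latency_ms(inference_ms=8.0,       # assumed big data-center GPU
                             uplink_ms=60.0,
                             downlink_ms=60.0)       # assumed cellular link, each way
    print(f"edge: {edge:.0f} ms, cloud: {cloud:.0f} ms")  # edge: 45 ms, cloud: 128 ms
```

On an unreliable cell link, the cloud path also risks dropping frames entirely, which is why the field robots discussed above do the recognition locally.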
- So this is exactly where it's like, okay, well, you've got your trade-off time. It's like decision time, right? Because now you're saying, all right, well, you've got vision out there and these are probably both mobile, I'm assuming, or semi-mobile, right? And you have to send, at least in the industrial ag use case, you're sending that back somewhere, right? So is this the place where you're like, how many GPU execution units can I fit into this, or are you really, with Tria now, taking it case by case and saying, look, from a cost perspective, let's figure out form, fit, and function here? And it's not top of the line. Is that the case? - A lot of customers will have several different tiers of the product that they're creating. Especially for different markets where there's a different appetite and also different sizes of the amount of things that they need to crunch.
So for agriculture, you'll see that there's the top of the range, where they want to have mass-scale farming, say in the Americas with the giant fields, and they want to be able to do things at speed. They'll have the top-of-the-range solution. You buy something really big, it will work in the field.
It's going to cover a huge amount of distance in a huge amount of time for a giant farm. So they have the money, they have the ability to invest in that. And then you'll want to have a slightly slower, slightly cheaper mid-range application as well. And then you want the lower end range as well for the market and then let the consumer decide.
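The tiering described here, offering top-of-the-range, mid-range, and lower-end versions of the same product, can be sketched as a simple selection problem. Everything in this snippet, the tier names, throughput, and wattage figures, is invented for illustration; it just shows the idea of picking the cheapest tier that satisfies the application's constraints rather than defaulting to the flagship.

```python
# Invented tiers and numbers, for illustration: pick the cheapest module tier
# that meets a required inference rate within the device's power budget.

TIERS = [
    # (name, inferences_per_second, power_draw_watts), ordered cheapest first
    ("low-power", 5, 6),
    ("mid-range", 20, 15),
    ("high-end", 60, 45),
]

def pick_tier(required_ips: float, power_budget_w: float):
    """Return the first (cheapest) tier meeting both constraints, else None."""
    for name, ips, watts in TIERS:
        if ips >= required_ips and watts <= power_budget_w:
            return name
    return None  # no tier fits: relax the power budget or the throughput target

print(pick_tier(required_ips=15, power_budget_w=20))  # mid-range
print(pick_tier(required_ips=50, power_budget_w=20))  # None: needs more power headroom
```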
Obviously you want to sell them the best solution, but sometimes it's not going to be an option. It's a balancing act: most customers will have various different levels of capability and sell that to the end user based on their application. And for me, that's where the industry is driven forward, by the actual application and whether or not the end user feels the need for that amount of capability. I'm always reminded of the picture that does the rounds on the internet of the field and the path that leads around the corner, like an L-shaped corner. And then there's a trodden path across the field where people have just walked across diagonally, and it's like design versus user experience.
And I think that, like the last cycle of AI, there was all of this sort of exciting talk about what was possible, but at the end of the day, what was and wasn't successful was defined by people actually using it and finding it useful. So of the applications that were created, some of them stuck around, some of them didn't. It was the same with blockchain when blockchain was skyrocketing, and the same with NFTs, Bitcoin, that kind of thing. People actually finding it useful as an application and being able to use it every day decided what stuck around and what didn't.
- The same thing seems to have happened in the IoT space. There were a bunch of different use cases that were really pushed hard, like smart home stuff. And there's a point at which as a consumer, not just like a B2C consumer, but any kind of consumer where you just either don't need any more of that or it's just not really practical. It was a great proof of concept, but it's not useful at the scale that it's being promoted. And I think we run into that danger zone here with AI too, where it's like there's a lot of vision type stuff that's getting pushed and it's cool.
And I know that the margins are bigger there, but ultimately, a lot of the actual deployments aren't going to be exactly what you see out in the media. And it sounds like you're talking about with the trodden path across the fields, right? It's like the use cases, the demand in the market is going to start driving exactly where this technology goes and then how it evolves. - Yeah, you knew that the IoT concept had reached the top of its hype cycle when there was IoT toasters on the market. And okay, like we were saying before, there's different tiers of the products that's available.
Some people will go for that top tier and some people will just be like, I want my toast to be slightly more toasted. I just turn a knob on it, same as I did back in the 1950s. It doesn't need to be any more smart than that. I do like, I recently upgraded my aging fridge to a semi-IoT fridge that tells me if the door's open or if the temperature's too high or too low.
And for me, like I don't need a fridge with a screen on the front that gives me information about the weather because I've got a separate display in my kitchen. I don't need something where you knock on the door and it shows me the products behind it. I don't need a camera in there, but I do like it if it warns me if the door's been left open and it beeps on my phone.
And that's usually because my partner's been loading food into the fridge and forgot to close the door. And then I'm in here in my room and I'm just like, you left the fridge door open. Those real life applications are what sticks around. So IoT is now quite a mature market where the businesses that are investing in that level of technology, they put all of the technology into the device.
The consumer demand for that sort of technology cools off a little bit to a level where the consumers understand what's beneficial to them in their everyday life. We've got another customer that we're working with that makes industrial cookers. So for cooking consistently huge amounts of the same identical foodstuffs over and over and over again. There's an IoT model there because you want to be able to control all of the different ovens, and also manage a hundred different ovens at the same time and know if one of them is over temperature or under temperature, that kind of thing.
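The fleet-monitoring idea, knowing if any of a hundred ovens is over or under temperature, might look something like this minimal sketch. The temperature band, oven IDs, and readings are all invented for illustration; a real system would pull readings from the connected ovens rather than a hard-coded dictionary.

```python
# Minimal sketch with invented thresholds: scan a fleet of connected ovens
# and flag any whose last reading falls outside the acceptable band.

def out_of_range(readings, low=180.0, high=220.0):
    """Return sorted oven IDs whose reading is outside [low, high] degrees C."""
    return sorted(oven for oven, temp in readings.items()
                  if not (low <= temp <= high))

# Hypothetical snapshot of three ovens' temperature sensors.
fleet = {"oven-01": 200.5, "oven-02": 231.0, "oven-03": 150.2}
print(out_of_range(fleet))  # ['oven-02', 'oven-03']
```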
There are applications there that we're working with where that is a requirement, where it might not have been 50 years ago when cookers were being used at an industrial scale like that. - So unfortunately what happens with this hype cycle, like you mentioned, is that everybody has these huge ideas, these grandiose visions of what the future is going to be like. With IoT, for example, it was: everything is going to be connected, and your toast is going to be ready in the morning, and your car is going to be sitting there waiting to drive you off to work, perfectly climate controlled. And by the time the market starts to mature and people realize it's going to cost a quarter of a million dollars per consumer to realize that vision and it's not going to happen, everyone experiences a sort of letdown. That's the trough of disillusionment, right? But that doesn't mean that the technology is actually dead or even unsuccessful. It just means that it's evolved in some different way. And I think what you're describing with AI here, and even that last industrial ovens example, is like, hey, there are a lot of use cases out there that aren't necessarily the biggest, baddest processor and RAM combination that you could potentially have, but the volume's there and it exists. - That connects up with what we were talking about with power efficiency. You get all of the innovation, the excitement, all the things we could add, and then you say, yeah, but realistically, practically, I need to run it with this amount of power draw in order to get what I want. So you've got to sacrifice something in order to get something else.
Sort of like with an electric car: you add loads of bells and whistles to it, it gets heavier and heavier to the point that the range drops, and then you're like, well, I want a long-range model, so I've got to improve the aerodynamics, which means making it look a little bit less attractive, and strip out things like power seats in order to reduce the weight as well. So you've got to find that middle space, that sweet spot, in these sorts of applications. - How does the portfolio, Tria, expand or develop in order to meet that range of requirements? - I think we've got a pretty good range that goes from tiny little low-power compute applications all the way up to the COM-HPCs with the server-grade Intel processors in them. The COM-HPCs with the Intel processors are designed for edge-based image processing and AI applications, but they're larger as well. So you have to have a balance between size and power consumption and what they're capable of. A lot of the larger COM-HPC modules are sort of motherboard sized, which means that you've got to put them inside a dedicated case.
You couldn't just embed them directly into a product unless it was a really big product. So for things like edge security or public transportation. AI applications in public transportation are another thing that we're working on at the moment: being able to take data from a huge number of sensors on a train or other vehicle or in a train station, analyze them all, react to them in real time. That pretty much requires an on-location server, because sometimes you can't rely on the data network. And that means that we're using those for those sorts of applications in standalone servers.
But a lot of the requirements, we've got ones for industrial automation. So again, we're working with Intel on cobotics with one of our customers, building real-time image sensors into a cobotics, a collaborative robotics, environment. So a robot can operate in the same space as a human safely.
So if the human moves into that space, the robot arm stops moving, can move around. If the human picks something up, the robot knows where it is and can take it off them again. We were demonstrating an early example of that at Embedded World in Nuremberg this year. That was built around a combination of our Intel-based SMARC modules and then our Intel-based COM-HPC modules for the image processing. And those two things communicating with each other. So getting the signals from the cameras analyzed and then communicating with the robot in real time as well.
So there is that sort of, how useful is the environment that you're creating, the application that you're creating there versus the amount of power, the amount of processing that you need, the amount of space you need in that environment as well. For some customers, that's a pinnacle. So it's giving them the option to say, okay, well, I need cobotics, I need to have a reliable environment and therefore I need that extra processing power and the associated costs that come with setting that up and developing and installing it. Whereas other manufacturers, they might want to just have an enclosed robotic space, no cobotics required.
Like you were saying before, you have to create the potential for innovation. So you have to inspire the customers with that new technology, that new possibility, and then let the customer build that application around it. And if it works for them, then that creates the foothold for that technology to develop further. And sometimes you'll create that new technology, like a lot of the AI applications that we're seeing at the moment, where the user can't really find that killer app that becomes a foothold for the technology to develop. - Tria has almost used the bad A word now, the old A word.
(laughs) Within the Tria portfolio, obviously it's pretty expansive, right? I mean, there's lots of options. What does the Intel portfolio look like? I mean, are you offering Atom, Core, Xeon, you know, sort of the gamut, or what does that look like in terms of scale? - Yeah, pretty much the full gamut. I think within the mobile processor space, like I said, up to the COM-HPC level, we can put server-grade processors onto those. But at that point you may as well have an actual server. So it depends on the size, the shape that you need to put it into.
So yeah, we typically offer the Atom and the Core series, and the Xeon series at the server end. We have those, and it's really cool to see what the product team does, putting things into such a small space. I've been working with motherboards and processors for motherboards for years and years and years. So to see that sort of computing application in such a small package, with heat management, thermal management, is a fine art. And watching the team develop those sorts of applications in the environment that the product's going to be used in is a fascinating challenge. So being able to deploy an Intel processor and its capabilities, and the new AI-based processors we're working on as well, to bake those into a small product to be able to use at the edge is pretty exciting.
- Well, cool. I mean, it's really exciting to see more of the AI in action than AI in advertisement, right? So really looking forward to seeing how this continues. - Brandon, I actually wanted to ask you, because you covered Embedded World for us this year, which feels like it was last year at this point already, but there were a lot of next-generation edge processors that came out. Intel launched its new Core Ultra processors with the Intel® Arc™ GPU.
So I'm curious, what have you been seeing around the industry, especially as we've talked about all these use cases and constraints, of how the latest processors and technology advancements are helping some of the partners in this space? - There's obviously the software side and the silicon side. On the software side, you've got DevCloud, and OpenVINO™ has got a really good foothold, really helping streamline and accelerate the development of models. And there's even Intel® Geti™, which is even further back on the training side, just making it easier there. On the silicon side, man, the Core Ultras, the AI PCs, I think that they're a really nice fit in this sort of spectrum that Alex is talking about, because they enable you to scale up and scale down even within the same SKU, right? Because you've got a lot of different compute that's available to you.
These heterogeneous processors where you can say, look, I want a performance core, I want an efficiency core from a CPU standpoint, but then also they've got graphics execution units, integrated GPUs, where you can do acceleration there. And then you bring in neural accelerators. So you can get this sort of ability to move your application in one direction or the other based on what is available on the SoC or chipset. And that just gives you so much flexibility, because at that point, to Alex's point about efficiency and power consumption, you're using the right core for the right workload, right? And that's really ultimately what it's all about, because that allows something that would traditionally have been a smaller or less expensive processor to accomplish more. And really that's kind of the name of the game here. - That's a really good point, Brandon.
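The "right core for the right workload" idea on heterogeneous SoCs can be illustrated with a toy dispatcher. The workload classes and routing table below are hypothetical; in practice this mapping is handled by the OS scheduler and inference runtimes such as OpenVINO, not by application code like this.

```python
# Toy sketch (hypothetical workload names): route each job class to the most
# power-appropriate unit on a heterogeneous SoC, in the spirit described above:
# neural nets to the NPU, data-parallel pixel work to the integrated GPU,
# latency-critical logic to P-cores, background tasks to E-cores.

def dispatch(workload: str) -> str:
    """Return the compute unit this (made-up) workload class should run on."""
    routes = {
        "nn-inference": "NPU",        # neural accelerator
        "image-preprocess": "iGPU",   # data-parallel work
        "control-loop": "P-core",     # latency-sensitive
        "telemetry-upload": "E-core", # background, efficiency first
    }
    return routes.get(workload, "P-core")  # default to a performance core

print(dispatch("nn-inference"))  # NPU
```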
I was at Intel's AI event recently. They had that global event where they showcased all of their latest AI technologies. The applications there, from some of the partners that were showcasing, were fascinating for how you can take AI to accelerate an application at the edge.
There were things like supermarket checkout applications, automatic checkouts that recognize what it is you're holding, and queue management, automating supermarket management as well. But it was really cool to see the applications that Intel was developing at the Olympics. The athlete applications that they developed there, that's a really great way of taking the technology and showing a real-life use case to capture the imagination of potential developers of the technology. And the case study video that they showed of the technology being used in Africa to scout a huge number of potential athletes and then find potential future Olympians based on image processing using that platform, that was really cool. It really captured my imagination.
It really stuck with me. But being able to take that AI processing to the edge, and in a laptop as well, it goes back to what we were saying at the beginning: today they're large units, they're very powerful, and it's about compressing that, making it smaller, making it more energy efficient. Being able to put an AI application into a laptop, a laptop-sized device that can be used in the field, is really exciting. I think it was Dell that was up on stage showing the laptops that they're going to be releasing with built-in AI applications. So it's an AI device instead of a computing device, really leaning into that collaborative AI application environment. You've got a great example from the Olympics that Intel's done, but it's a blank slate.
I'm really excited to see what developers do with that amount of AI processing technology at the edge, instead of having to depend on sending stuff back to a huge data center and back again. And I think that's going to be a turning point, really, for the future of AI at the edge. - Honestly, I think a lot is going to be about sustainability, and something I forgot to bring up was, man, have you ever put a farmer's market piece of produce next to a supermarket piece of produce? It's weird. - I don't know where you're going with that, Brandon. - Well, you were talking about not using chemicals, right? Not having to use as many chemicals. And when you put the farmer's market one next to the supermarket one, something is not right here.
But sustainability in the future, I think is really important and I think all the things that you've been talking about and we've discussed today will help us on that path. - Yeah, for sure. - It's amazing, all the different use cases and everywhere you can go with these AI applications. I can't wait to see where else we go, especially with partners like Avnet. So it's been a great conversation, guys.
Thank you for joining. Before we go, Alex, I just want to throw it back to you one last time, if there's any final thoughts or key takeaways you want to leave with us today. - Like I said at the beginning and like we were kind of leading back into at the end there, I think that AI is at a nexus point at the moment and I think edge computing is a nexus point as well.
So that's the advancement in edge-based AI applications. Being able to take it away from the cloud and onto the device, that's the nexus point. If you're watching this, find those applications and tell us about them if you've got them. I think it's a really exciting time to be working in computing on a small form factor with AI in this space.
- Yeah, and I invite all of our listeners to visit the Avnet website, check out their new product line, and see how they can help you get some of your AI efforts and initiatives off the ground. So thank you both again. Brandon, it's always great connecting with you. You've always been our embedded systems expert. And thanks to our listeners, and thanks, Alex, from Avnet.
Until next time, this has been "insight.tech Talk." (upbeat music)
2024-08-29 13:33