Manufacturers Unlock AI at the Edge: With Lenovo and ZEDEDA

(upbeat music) - Hello and welcome to the IoT Chat, where we explore the latest developments in the Internet of Things. I'm your host, Christina Cardoza, associate editorial director. And today, we're talking about edge computing in industrial environments with Jason Shepherd from ZEDEDA and Blake Kerrigan from Lenovo.

But before we jump into our conversation, let's get to know our guests. Jason, I'll start with you. Welcome to the show. - Thanks for having me. - Yeah, thanks for being here. What can you tell us about ZEDEDA and your role there? - So ZEDEDA's, you know, we're all focused on orchestration of edge computing.

So management and security remotely of assets, you know, out in the field, deploying applications, understanding the state of the hardware, taking the data center principles and extending them out as far as you can into the field to enable cloud native development while also supporting legacy assets. I lead our ecosystem, so I work a lot with strategic partners. I work a lot with industry consortia, you know, and serve as our field CTO.

And one of my mottos is if it's fuzzy, I'm on it. You know, I always find myself on the front end of emerging technologies. So, you know, hence, edge right now. - Great, can't wait to dig a little bit deeper into that. And Blake, thanks for joining us today. - Yeah, thanks for having me.

- So what can you tell us about Lenovo and what you're doing there? - Well, look, I think most people know who Lenovo is as, you know, one of the largest personal compute and mobile compute and data center compute hardware providers in the world. But my role essentially here at Lenovo is I manage our edge computing practice. So here at Lenovo, we're hyper focused on digital transformation as a whole for most enterprises. And we feel that edge computing is, you know, essentially core to our customers' journey. And so I've been here for about three years. I'm based in Raleigh, North Carolina and me and my team are uniquely focused, not just on edge computing, but also, you know, defining what is our strategy as a company, you know, how do we develop products differently for use cases outside of traditional data center or personal compute.

So mainly go-to market, product development, and product strategy. - Perfect, I love how you mentioned how edge computing is part of a manufacturer's digital transformation journey. I think that's the perfect place to kick off this conversation today.

No surprise to you two that the manufacturing space has been rapidly evolving over the last couple of years to keep up with the demands of the digital era. So Blake, I'm wondering if you can walk us through what some of those transformations in manufacturing have looked like recently. - Well, I think, you know, recently, they look a lot different just even in the last two years.

Things have had to change quite a bit, you know, Lenovo being a large manufacturer. This is a space that's very close to home for us, you know. Probably some of the largest trends that we see is around, you know, computer vision and AI use cases. So, you know, for the last, probably 15 to 20 years, I think most industrial customers have been uniquely focused around automation. You know, whether it's a simple process around manufacturing or some sort of a logistics auto optimization or automation process.

And today, what we're starting to see is the use of AI in a more binary state in terms of, you know, how do you create more efficiencies in some of those processes that already exist? But when you lump computer vision applications and solutions on top of that, we're starting to see the unlocking of all sorts of new insights that beforehand, you know, we didn't really have a way to capture with some of the, you know, the sensor technology that existed in the world. So some of those trends that I see a lot, you know, in manufacturing and even in distribution are things like defect detection, and there's all sorts of different safety applications. Usually, these were done, you know, as kind of point solutions in the past, and with the, you know, adoption and transition from more purpose-built compute to general-purpose compute for AI and computer vision, we start to see, you know, a lot of unique types of solutions that we've, you know, never seen before, and they're getting easier and easier to adopt for our customers. - That's great. We've definitely been seeing

all of those use cases and the opportunity with computer vision and AI just expanding those opportunities for manufacturers. Traditionally, they've been taking all that data and processing it in the cloud, but what we've been seeing is even that's not enough or not fast enough to get that real time insight and to make informed decisions. So, Jason, can you tell us more about why edge computing is playing a role in this now? - Well, I mean, the only people that think that sending raw video directly to the cloud is a good idea are people that sell you internet connectivity. It's very expensive to stream, especially high-res video, you know, straight over a wide-area connection.

So clearly with computer vision, the whole point at the edge is to look at live camera streams or video streams. It could be thermal imaging, it could be any number of things and look for an event or anomalies, you know, in the moment and only trigger those events over those more expensive connections. I mean, it goes from, you know, manufacturing through, of course, all different types of use cases, but, you know, it used to be where you got someone just sitting there looking at something and then call somebody if something happens and now you can have it being continuously monitored and have the intelligence built in to trigger it. So, you know, edge is key there.
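The pattern Jason describes, analyze streams locally and send only events over the expensive uplink, can be sketched in a few lines. This is a minimal, illustrative stand-in: the "anomaly check" here is just a brightness threshold, not a real computer vision model, which in practice would be an inference call into something like an OpenVINO or ONNX runtime.

```python
def frame_mean(frame):
    """Average pixel intensity of a frame (modeled as a flat list of values)."""
    return sum(frame) / len(frame)

def detect_event(frame, threshold=200.0):
    # Stand-in for a real inference model: flag unusually bright frames.
    return frame_mean(frame) > threshold

def process_stream(frames, send_upstream, threshold=200.0):
    """Inspect every frame locally; only events cross the uplink."""
    sent = 0
    for i, frame in enumerate(frames):
        if detect_event(frame, threshold):
            send_upstream({"frame": i, "mean": frame_mean(frame)})
            sent += 1
    return sent

# Simulated stream: three normal frames and one anomalous spike.
events = []
normal, spike = [120] * 16, [250] * 16
forwarded = process_stream([normal, normal, spike, normal], events.append)
```

Only one of the four frames is forwarded; the raw video never leaves the box, which is the bandwidth win being pointed at here.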

I mean the same thing with, like, you know, 5G is a big trend in manufacturing. I know we're talking about computer vision now, but you know, with every new technology it's like, oh, you know, this is short-lived. Well, 5G actually drives more edge computing too, because you've got a super, super fast local connection, but still the same pipe upstream.

And so we're gonna see more use cases too, where you mash up, you know, these kind of private 5G small cells in a factory with computer vision. And then of course, other sensing technologies. But yeah, we're just kind of at the beginning of it as it pertains to edge, but there's just so many possibilities with it. - It's funny that you mentioned the only people that talk about processing in the cloud are the people that it would benefit most, but I think that's also a real issue in the industry is that there's so many people telling you so many different things and it could be hard to cut through all of the noise. So Jason, can you walk through, you know, what are some of the challenges that manufacturers are facing when going on an edge computing journey and you know, how can they successfully move to the edge? - Yeah, I mean, there's kind of in general, like you said, it's the hammer nail syndrome.

Everyone, you know, tells you that they can do everything. I mean, edge is a continuum from some really constrained devices up through kind of on-prem or, you know, on the factory floor, say the shop floor up into sort of metro and regional data centers. Eventually, you get to the cloud and where you run workloads across that continuum is basically a balance of performance, costs, security, you know, et cetera, latency concerns.

And I think people are, first and foremost, are just confused about what is the edge. There's a lot of edge washing going on right now. And whoever the vendor is, what they sell, and where they sell it, that's the edge.

So I think for manufacturers, first and foremost, it's kind of understanding that it's a continuum. Understanding there's different, you know, trade-offs inherently. If you're in a secure data center, it's not the same as if you're on the shop floor, even though you wanna use the same principles in terms of containers and VMs and things like that. Security needs are different.

So it's concerns around getting locked in. You know, everybody loves the easy button until you get the bill. You know, so the whole thing with the cloud model is, you know, make it really easy to get data in, but then very expensive to get data out or send it somewhere else. That's another reason why we're seeing this shift.

It's not just about bandwidth and latency and security and all the reasons you see edge. So long story short, just navigating the landscape is the first problem. Then when you get into actually deploying things, I mean, these always start with a use case. Say, I'm trying to do quality control or improved worker safety or whatever, using computer vision.

It always starts with, you know, a POC. You know, I figure out a use case, then I'm doing a POC. At this stage, people aren't thinking about management and security and deploying in the real world.

They're thinking about an app. And we see a lot of experimentation with computer vision applications and there's really cool innovation happening, but to take the lab experiment into the real world, that's also really challenging. You know, camera angles change, lighting changes, contexts switch, just getting, you know, the infrastructure out there and the applications and continuously updating those models remotely. These are those infrastructure things that I think are, you know, really important.

I think that the main thing is to break down the problem, separate out your investments in infrastructure from the application plane. Invest in, you know, consistent infrastructure like we're doing with Lenovo and Intel and ZEDEDA, we're obviously focused on infrastructure, and build it in a modular way, where as you kind of evolve and build new applications, you can take in different types of domain expertise. Eventually, it's about domain expertise with consistent infrastructure. And so I think the key for manufacturers is to break down the problem, work with vendors that are architecting for flexibility, and then you can evolve from there 'cause no one knows all the answers right now. You just wanna build in that future-proofing.

- That's a great point that you make. Edge is a continuum. There's no one-size-fits-all approach. There's no one way of doing manufacturing. Everyone's building different things and, you know, applying technology in different ways. So on that note, Blake, can you talk about how manufacturers can approach this, what they need to be looking at, and how they decide what technologies or path is gonna be the best for them? - Yeah, you know, I mean, the first approach, I think, is, well, even before you get to the POC, I think the biggest challenge is just understanding what kind of business outcome you wanna drive with the particular POC, 'cause you also have to scale the business case.

And one of the challenges is, you can build something in a lab, and typically the last thing, you know, an engineer's gonna think about is cost when they go to, you know, develop or deploy the solution. You know, it's an exponential factor and, you know, in my opinion, and I'm sure Jason would agree with me, the biggest inhibitor to scale is deployment and management and life cycle and end of life and transitioning from one silicon to another, you know, over time as products come in and out of their own life cycle. So I think the first, you know, the first step is making sure that you understand what kind of business outcome you wanna drive and then keeping a conscious understanding of what the costs are associated with that. And that's something that we at, you know, we at Lenovo, we work with people more on, you know, solution architecture and thinking about what type of resources do you need today. And then how does that scale tomorrow, next week, next year, and the next five years.

So that's, you know, critical. I also think it's, you know, it's important to understand that, at least in this edge computing spectrum today, there's a wide array of different types of resources or hardware platforms that you could choose from, some of which may have better performance. Others may have better longevity or reliability in some terms, but you know, I think it's important for a customer to understand that in order to select the right hardware, you kind of have to understand what are the iterations of the program throughout the life cycle of whatever solution you're trying to implement. So those are the first things and what I would call more fundamentals when you approach some of these new solutions. You know, there's a lot of tools out there because, you know, if you think about it, the PCs today, the personal computers that maybe Lenovo sells in our core commercial business, are based on user personas.

So, you know, if you're an engineering student or professional, you may use a workstation machine, you know, with great graphics, some good performance. If you're a mobile executive, you're probably using a ThinkPad and traveling around the world, you need that mobility. Or if you're a task-based worker, you might have, you know, a desktop computer. In edge computing, you know, there are no personas and the applications are endless.

And I would say, you know, I think ZEDEDA is proof that, you know, there is no standard ecosystems of applications. So you have to be able to build in that elasticity and you can do that with ZEDEDA and Lenovo, frankly. - Now I wanna expand a little bit on some of those, you know, the hardware and software platforms that you just mentioned.

Jason, can you talk a little bit more about how you deploy AI at the edge and how you approach edge computing? What tools and technologies are you seeing manufacturers using to approach this? - There's obviously a lot of, you know, kind of special-built, purpose-built solutions, you know, vertical solutions. Any new market, you know, I always say, goes vertical before it goes horizontal. As I mentioned about domain knowledge, you know, it's attractive upfront to buy, like, a turnkey solution that has everything tied together, from someone that knows everything you need to know about, you know, quality control, and does everything for you. And there's been computer vision, you know, solutions for a long time that are more proprietary, kind of closed systems for things like, you know, quality control on the factory floor. That's not new, but what's new is everything becoming software-defined, where you abstract the applications from that infrastructure. So in terms of tools, if you look at, historically, I mean, constrained devices, really, really, you know, low-end compute power, sensors, just kind of lightweight actuators, things like that.

Those are inherently so constrained that they're embedded software. In the manufacturing world, control systems have historically been, you know, very closed. And that's a play to create stickiness for that control supplier. And of course, there's implications if, you know, if it's not tightly controlled in terms of safety and process uptime. Okay, so that's kind of like the world that it's been for a while. Meanwhile, in the IT space, we've been kind of, you know, shifting and the pendulum swings between centralized and decentralized.

Over the past, you know, 10 years, we've seen the public cloud grow. Why do people like public cloud? Because it basically abstracts all of the complexity and I can just sign up and I'm looking at resources, compute, storage, networking, and just start deploying apps and go to town. What's happening with edge and the way we've evolved technologies and the compute power that's being enabled, of course, by Intel and all the portfolio from Lenovo, is we are able to take those public cloud elements, this platform independence, cloud native development, you know, continuous delivery of software, always updating and innovating. And we're able to use those tools and shift them back to the edge.

And there's a certain footprint that you can do this with. And it goes all the way to the point where basically we're taking the public cloud experience and extending it right to the process, right above the manufacturing process that's always been there, to where now we can get that public cloud experience, but literally on a box on the shop floor. I don't need to, you know, bootstrap everything one box at a time. I'm looking at, I wanna deploy an AI model. You know, I wanna assign it to this GPU. You know, I wanna add this IoT protocol normalization software. I wanna move my SCADA software and my historian onto the same box.

It's this notion of workload consolidation. It is using these tools that we've developed the principles from the public cloud, but coming down. Now, what we do at ZEDEDA, what's different is while we help expand those tools from a management standpoint, a security standpoint, we have to account for the fact that even though it's the same principles, it's not in a physically secure data center.

We have to assume that someone can walk up and start trying to hack on that box. When you're in a data center, you have a defined network perimeter. We have to assume that you're deployed on untrusted networks. So the way that our solution is architected, and you know, there's a bunch of different tool sets out there, is take the public cloud experience, extend it out as far as you can, to where basically it starts to converge with historical process, you know, stuff in the field, but you build a zero trust model around it to where you're assuming that you're not locked up in a data center.

When you're outside of the data center, you have to assume you're gonna lose connectivity to the cloud at times. So you gotta be able to withstand that. So this is where the one-size-fits-all thing doesn't, you know, doesn't come into play. There's great data center tools out there for scaling. They're evolving, you know, with Kubernetes coming, you know, out of the cloud and down, but they start to kind of fall apart a bit when you get out of a traditional data center. That's where, you know, solutions that we're working on with the broader community pick up. Then eventually, you get into constrained devices and it is inherently death by a thousand cuts. Everything's custom.
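Withstanding lost cloud connectivity, as Jason notes, usually shows up in edge software as a local store-and-forward buffer: events queue on the box while the uplink is down and drain in order when it comes back. A minimal sketch, where the `uplink` callable and the bounded queue size are illustrative assumptions rather than any particular product's API:

```python
from collections import deque

class StoreAndForward:
    """Buffer events locally while the uplink is down; drain on reconnect."""

    def __init__(self, uplink, capacity=1000):
        self.uplink = uplink                  # callable, returns True on delivery
        self.queue = deque(maxlen=capacity)   # oldest events drop first when full

    def publish(self, event):
        self.queue.append(event)
        return self.flush()

    def flush(self):
        while self.queue:
            if not self.uplink(self.queue[0]):
                return False                  # still offline, keep buffering
            self.queue.popleft()
        return True

# Simulate an outage: the uplink fails, then comes back.
online = {"up": False}
delivered = []

def uplink(event):
    if online["up"]:
        delivered.append(event)
        return True
    return False

sf = StoreAndForward(uplink)
sf.publish("defect@line3")   # buffered, uplink is down
online["up"] = True
sf.publish("defect@line7")   # drains both events, in order
```

The bounded queue is the design choice worth noting: on a box with finite storage, dropping the oldest unsent events is a deliberate trade-off against filling the disk during a long outage.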

And so those tools there and then of course, there's, you know, we'll talk a little bit about some of the frameworks and, you know, kind of the AI tools, but as Blake mentioned, as you know, I'm very much stressing when you get to the real world, this foundational infrastructure, this notion of how do you manage the life cycle, how do you deploy it? Oh, and how do you do it without a bunch of IT skillsets running around everywhere, 'cause you don't have those skills everywhere. It's gotta be, you know, usable and definitely secure. And security versus usability is another big one.

'Cause if you make it too locked down, no one wants to use it or they start to, you know, bypass things. And you know, a lot of stuff, but I think the key is the tools are there, you just need to invest in the right ones and realize it is that continuum that we're talking about. - Now I wanna touch on some points you made about the cloud. The cloud isn't going anywhere, right? And there may be some things that manufacturers want to do that may not make sense to do at the edge. So Blake, can you talk a little bit about the relationship between cloud and edge computing, what the ongoing role of cloud is in edge computing and what sort of makes sense from a manufacturer's perspective to do in the cloud and to do at the edge? - Yeah, I mean, you know, in line with what, you know, Jason was just talking about. I mean, we kind of see, you know, ultimately, edge will essentially, in a virtual world, become an extension of the cloud.

You know, the cloud means a lot of different things to a lot of different people, but you know, if we're talking about, you know, a major CSP or cloud service provider, I think the central role that they'll play in the future is probably more around, I mean, obviously with edge computing, it's all about getting meaningful, insightful data that you would want to either store or do, you know, more intensive AI on, which may happen, you know, in a hyperscale data center when the data gets so big it can't be crunched locally. But essentially, what we are doing is trying to cut down the amount of uneventful or uninsightful data. But I do think once you get the, you know, the meaningful data in the cloud, you know, as an example, we were talking about defect detection. You know, once you have enough information from, let's say you have 50 different plants around the United States and every single one of them has a defect detection computer vision application running on the factory floor, well, ultimately, you wanna share the training and knowledge that you have from one factory to another. And the only real practical way to do that is going to be in the cloud. So for me, there's really two main purposes.

The first one is really around orchestration. So how can I remotely orchestrate and create an environment where I can manage those applications without being onsite, you know, managing them not at the edge but from the cloud. And then the other one is, you know, in order to make these models better over time, you know, you do have to train them initially.

That's a big part of AI and computer vision that, back to our earlier point, is probably woefully underestimated in terms of the amount of resources and time that it takes to do. One of the most effective ways to do that is in collaboration in the cloud. So I do think there's a place for the cloud when it comes to edge computing and more specifically AI at the edge, in the form of crunching big data that's derived from edge-computed or edge-analyzed data. And then the other side of that is training of AI workloads to be then redistributed back to the edge to become, you know, more efficient and more impactful, more insightful to the users. - And definitely one way I would summarize it is, you know, there's kind of three buckets.

One is cloud-centric where maybe I'm doing light pre-processing at the edge, you know, normalizing IoT data and then I'm kind of, you know, so I'm doing a lightweight edge computing, so to speak. And then I'm doing a lot of the heavy crunching in the cloud, so that's one. Another one, you know, Blake mentioned, it's where I'm using the power of the cloud to train, you know, models. And then I'm deploying, say inferencing models to the edge for kind of local action. You know, that's kind of like cloud-supported or cloud-assisted model.

And then there's like an edge-centric model, where I'm doing all the heavy lifting on the data. Maybe I'm even just keeping my data on-prem, you know, I might still be kind of training in the cloud or whatnot, but maybe then I just do orchestration from the cloud because it's easier to do that over wide areas and remote areas, but the data still stays in location so that, you know, maybe I got data sovereignty issues or things like that. So it's, you know, exactly what Blake said.
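The cloud-assisted bucket, train centrally over pooled data and push a small inference artifact to the edge, can be sketched with a toy model. Everything here is illustrative: the "training" is just fitting a decision threshold, and the JSON blob stands in for a real model artifact (an ONNX file, an OpenVINO IR, and so on) that an orchestrator would distribute to the nodes.

```python
import json

def train_in_cloud(labeled_samples):
    """'Cloud' side: heavy lifting over data pooled from many sites.

    Toy stand-in: learn a single decision threshold separating two classes.
    """
    positives = [x for x, label in labeled_samples if label == 1]
    negatives = [x for x, label in labeled_samples if label == 0]
    return {"threshold": (min(positives) + max(negatives)) / 2}

def export_model(model):
    # Serialized artifact an orchestrator would push out to edge nodes.
    return json.dumps(model)

def edge_infer(model_blob, reading):
    """Edge side: cheap local inference that keeps working offline."""
    model = json.loads(model_blob)
    return 1 if reading > model["threshold"] else 0

# Pooled training data, e.g. sensor readings labeled defect / no defect.
blob = export_model(train_in_cloud([(1.0, 0), (2.0, 0), (8.0, 1), (9.0, 1)]))
```

The split mirrors the trade-off in the conversation: training is compute-hungry and benefits from pooled data, while the deployed artifact is small, cheap to run, and independent of the uplink once delivered.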

It's, you know, not kind of one-size-fits-all, but that's kind of one framework to kind of look at, you know, where is the centricity in terms of the processing, and you know, of course, the cloud helps support it. I mean, we always say the edge is the last cloud to build. You know, it's basically just the fringes of what the cloud is. It's a little abstract, or just becoming more gray. - Now going back to a point you made earlier, Jason. Manufacturers don't always have the IT staff on hand or the IT expertise to do all of this.

So I know there's no silver bullet tools out there, but are there any tools and technologies that you can mention that may help them on this journey, especially if they're lacking the dedicated IT staff that it takes to do all of this? - Is a fair answer, ZEDEDA? (Jason and Blake chuckling) - That's what I was gonna say. (Jason laughs) - I mean, you know, let's face it. So like, you know, again, there's a lot of people that have the domain knowledge.

You know, the experts are the folks on the floor and whatnot. It's not the folks that do the data center. I mean, everyone's experts in their own right. And that's why a lot of these different, you know, tool sets as they become more democratized, I mean, you look at public cloud, it's attractive because I can sign up and I might not know anything about IT but I can start playing with apps. And, you know, maybe I start getting into the OpenVINO community and you know, working with that community.

I mean, there's a lot of resources out there, you know, for just kind of those initial experimentations. But when you get into trying to deploy in the real world, you know, you don't have the staff out there that's used to scripting and, you know, doing data center stuff and all that, plus the scale factor is a lot bigger. That's where tools like ours come in, you know, why we exist, to just make that much easier. And again, give you the public cloud experience, but all the way down, you know, out into the field, delivering the right security models and all that. You know, there's a lot of other tools, you know, just in terms of, we'll talk more about OpenVINO, but you know, there's the whole low-code, no-code platform, you know, solutions. It really is about finding the right tools and then applying domain knowledge on top.

A friend of mine used to work, you know, on factory floors, kind of coming from the IT space. And you bring all the data science people in and the AI frameworks and yada yada, and then you've got, like, you know, the person that's been on the factory floor for like 30 years that knows, okay, when this happens, yeah, that's cool, don't worry about it. Oh, that's bad. And so literally they brought these people together and the data scientists had to be told by the domain expert, well, you know, here's how you program it, because they don't know about the domain stuff. And literally at the end, they called it Brad-alytics.

You know, the guy's name is Brad. And so we got Brad-alytics on the floor. It's important to bring those right tools that simplify things with the domain knowledge. - Now you mentioned OpenVINO, I should note that

IoT Chat is an Intel publication. So Blake, I wanna turn the conversation to you a little bit since Jason mentioned ZEDEDA, to learn a little bit more about where Lenovo fits in this space, but also, you know, how you work with Intel and what the value of that partnership has been. - Yeah, look, the value of the relationship goes beyond just edge computing, obviously. I mean, you know, Intel is our biggest and strongest partner from a silicon perspective when it comes to edge computing.

It's interesting because, you know, Intel holds a lot of legacy ground in the embedded space, the industrial PC space, which edge computing is more or less just a derivative of and an evolution of. But you know, working with, you know, Intel, a couple things come to mind. One of which is compatibility, right? So most applications, most ISVs, most integrators are familiar with x86 architecture and have worked with it for years. So that's one thing. The other side of it is, you know, Intel continues to be at the cutting edge of this.

They continue to make investments in feature functions that are important at the edge and not just in data center and not just in PC. Some of those are silicon-based, you know, whether we're talking about large-core, small-core architectures or we're, you know, we're thinking about integrated GPUs, which are extremely interesting at the edge where you have constraints on cost, more specifically. Some of the other areas where, you know, I feel like our customers understand that better-together story is really around, number one, OpenVINO. So if you are trying to port maybe AI workloads that have, you know, been trained and developed on, you know, some sort of a discrete GPU system, which isn't really optimized to run at the edge, you can port these AI applications over and, you know, optimize them for maybe an integrated GPU option like you have with Intel.

So that's, you know, very important from a TCO and ROI perspective. I talked earlier about, you know, what kind of outcome you wanna derive. That's typically driven by cost or increase in revenue or increase in safety. And in order to do that, you have to be extremely conscious of what those costs are.

You know, not just with, you know, the deployment, but also in the hardware itself. And another, you know, part of this is that OpenVINO sits within this larger ecosystem of tools from Intel, and one of the ones that I really like, because it helps our customers get started quickly, is Intel DevCloud. And what that essentially allows us to do is instead of, you know, sending four or five different machines to a customer, we let them get started in a development environment that is essentially cloud-based.

This could be cloud or it could be on-prem, depending on what type of sovereignty issues you might have or security requirements. But this allows a customer to basically emulate, if you will, and do almost, you know, almost real-world scenarios. So they can control all sorts of different parameters and run their applications and their workloads in this environment.

So, you know, obviously that creates efficiencies in terms of time to market or time to deployment for our customers. You know, once our customer can use some of these tools to become, you know, ready for that first POC, they go through the POC and they realize those objectives. You know, the Lenovo value proposition is pretty straightforward. We provide very secure, highly reliable hardware in over 180 markets around the globe. There are very few companies in the world that can make that statement.

And that's what we're trying to bring, you know, to the edge computing market, because we understand our customers are gonna wanna deploy systems in unsecure or, you know, very remote places. And that's why, you know, in edge computing, Lenovo's DNA lends itself to be a real player in this edge computing space. So when you think about, you know, when I get to scale and I wanna deploy hundreds of factories, you know, thousands of nodes, hundreds of different AI models, you're gonna want partners that can, you know, provide things like complete zero-touch provisioning with a root of trust, all sorts of, you know, they're gonna wanna make sure they have a trusted supplier program or transparent supply chain, in other words.

And then you're also gonna want a partner that can help you with, you know, factory imaging, making sure that we can provide the right configuration out of the factory so you don't have to land products in, you know, a landing zone for the imaging of the product, either within your own company as a manufacturer or maybe some third party who is gonna, you know, want to create a business around just imaging your machine. So with Lenovo, we wanna be able to create the most frictionless experience for a customer who is trying to deploy infrastructure at the edge, which is why Lenovo and ZEDEDA really complement each other in our alignment with Intel. - Yeah. I'll say that, you know, we're basically a SaaS company. You know, it's all software, but coming from the hardware space, I can be the first to say hardware is hard.

And so partnering with, you know, Lenovo, I mean, especially now with the supply chain things that we've been going through and all that, you gotta find a trusted supplier that can help simplify all that, you know, complexity. And of course, make things that are reliable. I mean, we see a lot of people throwing Raspberry Pis out in the field, but it's like, sure, it was 100 bucks, you know, but once you drive a truck out, you just spent 1,000. But yeah, I think it's important to work with people that are building reliable infrastructure. - Now, a big point you made at the beginning, Jason, was customers are afraid of getting locked in to a particular technology or a vendor.

So when you're choosing tools and partners, how do you make sure it's gonna not only meet the needs you have today, but be able to scale and change as time goes on? - Yeah, I mean, I think that's kind of going back to some of the things we've touched on, you know, this shift from proprietary systems to more kind of open systems. We've seen this throughout technology. I mean, it used to be, you know, plain old telephone systems, then all of a sudden we get VoIP. You know, that's that transition from sort of facilities-led to more IT-led, but working together.

CCTV to, you know, IP-based cameras. We're in that transition period now where we're taking, you know, kind of these proprietary, purpose-built technologies and we're democratizing them, kind of opening them up. And so one way to avoid getting locked in is, as we've been saying, to separate the infrastructure plane from the application plane. You know, once you get tied into a full vertical stack, it sounds great, you know, they're doing everything for me from analytics to management and security and whatever, whatever, whatever.

You just got locked in. But if you decouple yourself with edge infrastructure as close to the data as possible, and this is why ZEDEDA uses an open foundation, and of course, you know, the Lenovo portfolio is agnostic to application stacks, super flexible. If you decouple yourself from any one cloud as close to the source of data as possible, you're a free agent to send your data wherever. If you decide to go to one public cloud, great.

You know, have at it. But once you get the bill, you're probably gonna wanna figure out a multi-cloud strategy. And so that's one key. The other thing is community. So, you know, we mentioned OpenVINO, this notion of democratizing technologies by working in communities, you know. So the OpenVINO community, of course.

Then there's ONNX, which I know the OpenVINO community is working with. ONNX is about standardizing how AI frameworks work together, you know, like a TensorFlow model running on OpenVINO, et cetera. At the root of our solution, we sell a SaaS orchestration cloud for edge computing, but we use EVE-OS from LF Edge. The Linux Foundation's LF Edge community is democratizing a lot of the kind of middleware, the plumbing, for edge computing.

Investing in those technologies not only reduces the undifferentiated heavy lifting that so many people too often do, it helps you focus on value. So as all of these technologies start to converge, you know, we're gonna see more and more acceleration and transformation. And, you know, the key is, don't feel like you should be inventing the plumbing. The worst thing you could do right now is to be focused on trying to own the plumbing. The plumbing is going to be commoditized, and I always say you have to democratize the south, like towards the data, to monetize the north. And that's where the real money is.

And so we work a lot with Intel, you know, just another quick point is we really like, you know, Intel's infrastructure because I mentioned this whole notion of moving the public cloud experience as close to the physical world as possible. We leverage all the virtualization technologies within the silicon. You know, we can basically abstract, you know, all of the application layer using our solutions, so that as a developer, I don't have to have a lot of skills. I'm gonna deploy that AI model and assign it to that GPU. I want this, you know, data analytics or data ingestion stack to be assigned to those two, you know, Intel CPU cores. And so it gives you that sort of, again, that public cloud experience.

All I care about is compute, storage, networking, and just give me the easy button to assign stuff. So we see that, you know, OpenVINO as mentioned, but it really is important to do all, you know, the abstraction, but also invest in communities that are doing the heavy lifting for you so then you can focus on value. - Great. Unfortunately, we are running out of time, but before we go, I just wanna throw it back to you guys one last time for any final key thoughts or takeaways you wanna leave our listeners with today. Blake, I'll start with you.

- I think that, you know, the key takeaway for me, and it goes back to maybe some of what Jason said and some of what I've said, is, you know, selecting hardware is hard. And I think a lot of people start there, and that's probably not necessarily the right first step. You know, it's interesting, me saying that coming from a hardware company, but you know, at Lenovo, what we want to be a part of is that first step in the journey. And I would encourage, you know, all of our customers, or even folks that aren't our customers, to reach out to our specialists and see how we can, you know, help you understand what are these roadblocks that you're gonna run into.

And then also open you up to the ecosystem of partners that we have, whether it's Intel or ZEDEDA or others. You know, there's all sorts of different application layers that run on top of, you know, these fundamental horizontal hardware and software stacks, like ZEDEDA's as well as our, you know, hardware at Lenovo. My takeaway, or I guess my leave-behind for this, would be: bring us your problems, bring us your biggest and most difficult problems, and let us, you know, help you design that, implement it, and deploy it, and realize those insights and outcomes. - Yeah, I mean, I would just add as we close out, totally agree. It's all about ecosystem. You know, invest in community so you can focus on more value.

You know, it takes a village, as the mantra goes, and for us, if you do all the abstractions and you create this more composable, you know, software-defined infrastructure, it's like, another mantra of mine is, I'm all about choice, but I'll tell you the best choices, you know? So then it's like, okay, if you come to me, if we work together to architect it right, then we can kind of focus on what are the right choices, both open source and, of course, proprietary. This isn't about a free-for-all, you know. This is about, you know, making money and helping customers and new experiences and all that. But very much it's about partnership, you know, like we're talking about here, and also to navigate the crazy field out there and, you know, focus on real value versus reinvention.

- Well, with that, I just want to thank you both again for joining the podcast today. - Yeah, great. Thanks for having us.

- And thanks to our listeners for tuning in. If you like this episode, please like, subscribe, rate, review, all of the above on your favorite streaming platform. Until next time, this has been the IoT Chat. (light music)

2022-06-26 12:17
