Dell Tech World 2025 Day 1 Keynote Analysis | Dell Technologies World 2025


>> Welcome back everyone, theCUBE's live coverage here in Las Vegas for Dell Tech World 25. I'm John Furrier, host of theCUBE with Dave Vellante and Savannah Peterson, Kristin McCall-Martin, the whole team is here. Team coverage, SiliconANGLE, theCUBE Research on the ground, getting all the real-time insights for you.

Thanks for watching. Dave, keynote analysis is my favorite session. It's like a mini CUBE pod in and of itself, but it's really targeted around Dell Technologies. We're at a historic inflection point in the industry, not just for Dell but for everybody. Michael Dell gave the keynote, a lot of content today, and Michael Dell came on theCUBE.

Jensen Huang did a video with Michael because he was in Taipei for the big event there. Michael Dell and Jensen Huang said something that I wanted to kick off the keynote analysis with, something that struck me, and you're like a historian of the industry. We both are. Jensen said, "Michael, we were together for many years, decades, from the PC revolution, mobile, web, cloud, and now AI, and it kind of hit me, those waves." And then Jensen said this is an unprecedented shift in history. Steam engines have been referenced, the printing press, so a very important time for Dell Technologies.

As we've said many times on theCUBE, it's a transition. Michael's done these before. He's actually built factories, like real factories; a legend back in the day, supply chain management. He knows the game, and now they're AI factories.

Just a remarkable moment. What's your reflection on that, and your reaction to where we are in this historic era of the industry, what we've seen, you and I, personally, together and individually? I mean, unbelievable. I say it feels like the nineties all the time, but it's like 100X the nineties. What's your perspective on the historic moment that is now? >> John, I feel like it's almost the combination of every wave ever in the history of the technology industry put together with a new accelerant built in. So I think about it: the mainframe was automating a lot of back office capabilities. The PC was all about, you remember, personal productivity, and it changed our lives.

You look back on it now and you're like, wow. Floppy disks and slash-File-Retrieve, which is how you used to interact with spreadsheets. And then of course the internet was just an amazing, euphoric moment in our industry. I remember for years after the dot-com bust, we stopped using the term paradigm shift. Remember? Because we were all so enamored with the internet. Oh, it's going to change the world, and it ultimately did change the world.

And then cloud, mobile, social, big data, those are all like mini waves within this big wave. I think those all come together now, and it's a combination of re-architecting your back office productivity, re-architecting your personal productivity, re-architecting the way you work, the way you communicate, re-architecting the way you think about data, getting massive value out of data. All of that builds together to create what you've said will be potentially 100X more than we've ever seen. You used that as an example around SaaS, that the new software industry would be 100X what the traditional SaaS industry is, and I think that's how I sum it up. It's just all those eras combined with a new lever.

>> It's almost intoxicating at many tech levels. I'm super excited. Today alone, I did three blog posts; I published three stories.

I've done more videos in the past eight months. We're at the NYSE now with NYSE Wired; a trust network is forming. All kinds of new patterns, and this came up in the research note I posted. A significant trend is happening: AI model efficiency and new usage patterns with reasoning and inference. Token demand is going to surge, and what that's going to do is create a flywheel effect where all the old-school IT and all the systems, and now AI factory developers or operators or architects, have to scope and build systems. So you have this flywheel, what Michael Dell calls the AI factory.

I call it the intelligence factory. I'm so used to IoT; no, it's the intelligence factory. So these are large-scale systems. There are huge implications. This isn't just racking up a server and connecting a PC to the network. There are huge factors, and I outlined eight trends that will be my research agenda as a research analyst.

I'm going to dig in deeply. >> Really, what are some of those? >> Okay, first trend that Michael Dell talked about, and that's going to be on our research agenda: GPU and XPU adoption. Why acceleration is the linchpin of modern computing, and how mixing GPUs, basically NVIDIA and non-NVIDIA clusters, is going to reshape CapEx and supply chain strategies. I could have NVIDIA, I can mix and match with an AMD, open versus closed.
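To make that mix-and-match question concrete, here is a minimal, hypothetical capacity-planning sketch. The accelerator classes, prices, throughputs, and power figures below are invented placeholders for a what-if exercise, not vendor numbers, and the cost-per-throughput metric is just one way an architect might frame the CapEx comparison.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Accelerator:
    name: str
    est_unit_cost_usd: float   # assumed, illustrative only
    est_tokens_per_sec: float  # assumed sustained inference throughput
    est_power_watts: float     # assumed board power

# Placeholder figures for a what-if exercise, not quotes or benchmarks.
NVIDIA_CLASS_GPU = Accelerator("nvidia-class GPU", 30_000, 9_000, 1_000)
ALT_XPU          = Accelerator("non-NVIDIA XPU",   18_000, 6_000,   750)

def capex_per_mtok_per_sec(mix: dict[Accelerator, int]) -> float:
    """Rough CapEx per million tokens/sec of aggregate throughput for a given mix."""
    capex = sum(acc.est_unit_cost_usd * count for acc, count in mix.items())
    throughput = sum(acc.est_tokens_per_sec * count for acc, count in mix.items())
    return capex / (throughput / 1_000_000)

homogeneous = {NVIDIA_CLASS_GPU: 256}
mixed       = {NVIDIA_CLASS_GPU: 128, ALT_XPU: 128}

print(f"homogeneous: ${capex_per_mtok_per_sec(homogeneous):,.0f} per Mtok/s")
print(f"mixed:       ${capex_per_mtok_per_sec(mixed):,.0f} per Mtok/s")
```

The point isn't the invented numbers; it's that once clusters go heterogeneous, procurement becomes a modeling exercise where software portability and interconnect compatibility, not list price, dominate the real decision.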

That's going to be a huge, huge topic area. Number two, the integrated AI factory solutions. Our AI lab that we're building out is going to open up how startups fit in. You and I have talked about Apple versus Android; NVIDIA's kind of closed, and by design, they're intentional about it. Other AI factories are open systems, huge. Hyperscalers love open.

I mean, they're buying NVIDIA, but they're not buying NVIDIA factories. AWS is not buying AI factories. >> You're saying they're open? - No, they're heterogeneous. >> Well, NVIDIA would say they're heterogeneous too, with NVIDIA APIs and Ethernet. >> NeMo and NIMs, yeah, they connect to your data.

>> Well, it's funny, just to interrupt. What Jensen said last night is, I'd love it if you buy all of my stuff, but at least buy some, so there you go. They're trying to interoperate with everything.

>> See, we can't even agree. We're going to put a marker down for theCUBE Pod on that one on Friday. >> What is open? - What does open mean? >> Unix. - Okay, I'll say homogeneous versus heterogeneous. Like AWS has everything. Would you agree? >> Yeah. And you got one of everything.

>> And they stitch it together. - Yep. >> Okay, so a lot of NVIDIA, big customer. Okay, number two. >> Well, was Intel open? I mean, x86 all those years. Well, you could say that's open. It became a de facto standard. >> Well, underneath, the processor was part of the hardware conversation. Who cares what's underneath if it works?

>> It worked with everything. They were open. >> Again, a huge research area. Number three, portfolio diversification and market recovery.

This might be a little bit of a short-term bump, but the supply chain, tariffs, the geopolitics are going to come in here. How do broad technology portfolios, like what Dell has, like Lenovo, play in this industrial revolution? How is automotive going to recover with tariffs? What do the industry verticals look like? Life sciences. Will there be diversity in the portfolio of products and workloads? A good area to look at. Number four, edge AI, telco, sovereign cloud.

Number five, AI model positioning, new patterns, revenue growth and operating performance. NVIDIA says it's a revenue-producing factory: buy more, make more. Operating performance, productivity, and then tariffs, supply chain agility, and then emerging areas to watch.

Data center as the computer. How do you deal with packaging, like silicon photonics? These are areas that are going to impact the agenda. So again, this all falls under the covers of what Michael Dell talked about: how do you actually roll out an AI factory? Okay, Kubernetes is working, platform engineering is working in the enterprise. That's good. What does the stack look like? Where's the clarity? Are there de facto standards? Is it a land grab right now? These are open questions. >> So Dell, I mean, you're touching upon something that I talked about in Breaking Analysis this week, which is Dell's dual mandate, right? They've got to run fast and capture the AI opportunity, and specifically the enterprise AI opportunity.

I mean, they have some examples here. We saw JPMC, we saw Lowe's, we saw Sloan Kettering, but these are few and far between in terms of enterprise AI adoption in the mainstream. So they've got to run fast, they've got to be number one, and they've got to take a lead in that, or else the market's going to go to the cloud. I don't think it's necessarily going to go to the competitors. I think that Dell's got an opportunity to go hard after that. So that's one mandate.

The other mandate, to your point, is they have to transition their massive install base, we're talking about millions of PCs and desktops and storage and compute engines, that they now have to move to this new AI era, and customers aren't just going to rip and replace. They're not going to just move overnight. So Dell has to figure out, okay, how do we service the existing install base and at the same time capture this new one. Now what they've done, whether it's by luck or by design, is they've been servicing these service providers, call them tier two service providers, the neoclouds, xAI, guys like CoreWeave and others, and so that is a bridge to enterprise AI, but they've got to move fast. I would say they've got 12 to 18 months to prove that they can get enterprise AI solutions in market easily and cost effectively for customers with clear use cases and ROI.

That's going to be critical to watch. >> Yeah, my big takeaway was the Jensen keynote in Taipei, Taiwan last night. I watched the live stream, and then Michael Dell's keynote today; it all points to Dell's challenge. Can they run at the speed of NVIDIA? NVIDIA's marketing is running so fast- >> No is the answer. You cannot run at that speed. Nobody can.

>> They've got to just hang on. Now, what I like about Michael Dell and his keynote is that he's actually singing the same notes. I loved it. Outside of the hokey Dell Street mainstream bit, and he's kind of making a good point there, he opened up and said data and AI is running at Mach 3. He said 75% of enterprise data will soon be live and processed at the edge. He led with that, Dave.

>> First thing I said, I was like, wow, he's leading with edge. >> He's leading with edge. >> But edge is PC. So that made sense. >> And smaller; remember the old Intel mantra, smaller, faster, cheaper. Now it's smaller, faster, smarter, so it's getting better, and some say not cheaper, but okay. The data unlock is a huge trend. That's a huge term I've heard at every CUBE event I've been at in the past eight months.

>> Unlock. Well, it is cheaper; it's cheaper on a price-performance basis. >> Yeah, absolutely. Okay, now, I got the edge, I got Mach 3, they got the huge AI factories. They want to democratize AI, he said, from Colossus to the edge, Colossus meaning a big, large thing. Not everyone needs that. Basically, not everyone needs to be a hyperscaler is what he's saying.

Okay, it's a good narrative, pure intelligence factories powering that. So basically he's selling a hyperscaler in a box.

What's your read on this whole "from Colossus to the edge" theme? >> Well, I come back to something that George Gilbert and I wrote, I don't know, six, seven months ago; it was last year. Why Jamie Dimon is Sam Altman's biggest competitor, and it's all about leveraging the data that you have on-prem and building your own AI factories in-house on top of infrastructure and other stacks that Dell provides.

And today, let's face it, AI factories are mostly hardware. >> Well, speaking of Jamie Dimon, JP Morgan Chase was one of the guest keynoters. >> Now, JP Morgan Chase has enough engineers and developers and data scientists and AI experts that they can actually be up on stage saying, "Hey, these are the successes we've had." >> 60,000 technologists, 44,000 engineers.

They got the chops. Two exabytes. Now you quoted 800 petabytes. >> No, I think we... Yeah, 800 petabytes.

And we knew because that was an Alex Wan quote, and we knew you knew. You found out, you said, "Dave, by the way, it's higher than that." I said, I believe you. It's two... What'd you say? Two exabytes? >> Two exabytes. They say exabyte scale, but I heard this in the hallway, scuttlebutt back channel from someone- >> Two exabytes. - ... from an inside source, it's two exabytes.

I said they were a $10 billion IT technology spend one year. It's actually 18 billion. >> 18 billion, just an IT budget. >> They have LLMs, they've got stuff in production. So really JP Morgan is a data company that happens to be a bank.

>> So how does a mainstream enterprise that's not JPMC, with those types of resources, take advantage of AI? This is where Dell's massive opportunity is, if they can actually build solutions bundling in AI technologies from startups, from partners, from existing partners like Microsoft and others and new partners like... Well, I don't want to say because I don't think it's been- >> Here are my notes from this, on how they're going to do it. They're data center focused, kind of like the data center is the computer: a disaggregated architecture, open pools of compute, storage, and network for any workload, mixed hyper-converged and three tier, and we'll come back to three-tier flexibility in a second. PowerEdge 17G servers, networking, storage.

Storage: PowerStore, PowerScale, PowerFlex, PowerProtect, a lot of storage. Dave, a lot of storage. >> Yeah, I mean- - EMC? >> No, it's different now. I mean, what Jeff Clarke did to EMC and that messy but awesome portfolio that kept them alive for a long time: Jeff Clarke came in and streamlined it dramatically, which was smart. He basically said, look, we've got compute and storage needs, all these processors, so we're going to use our servers. So they have storage servers now.

So they've dramatically simplified the portfolio and are leveraging their own internal sort of supply chain. It has affected margins negatively; I mean, they've traded gross margin and IP content for scale.

And so if you look at Dell's... I did an analysis the other day in my Breaking Analysis of Dell's financials, and look, they're almost a $100 billion company, but their gross margin is in the low twenties. Whereas EMC had gross margins in the mid-sixties all those years, now you blend the two businesses together and they're still at a 20 to 24% gross margin, with single-digit operating margin. You think about AWS: AWS's operating margins last quarter were in the high-30-percent range.

I mean, that's like software margins. Dell's running a single-digit, six to 8% operating margin. >> Huge services opportunity. >> And it is a big services opportunity. I guess my point is Michael Dell and Dell Technologies are very comfortable with that low-margin, high-volume business model because they know they can beat anybody at that game. And it's a really weird kind of thing to say, but when you get sucked into that, okay, where can we focus on the value add, drive margins up, you're actually going to negatively affect your volume game and your supply chain leverage. So he's clearly winning at that game.
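To make the blended-margin arithmetic concrete, here is a toy calculation. The revenue split and segment gross margins are illustrative assumptions, not Dell's reported figures; the point is simply how a high-margin storage business averaged with a high-volume, low-margin server and PC business lands in the low-twenties range being described.

```python
# Illustrative assumptions only, not reported financials.
segments = {
    "servers_and_pcs": {"revenue_bn": 80.0, "gross_margin": 0.15},  # high volume, low margin
    "storage_and_sw":  {"revenue_bn": 20.0, "gross_margin": 0.55},  # EMC-style margins
}

total_revenue = sum(s["revenue_bn"] for s in segments.values())
blended = sum(s["revenue_bn"] * s["gross_margin"] for s in segments.values()) / total_revenue

print(f"blended gross margin: {blended:.1%}")  # ~23% with these assumed inputs
```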

Who can beat Dell at high volume as the lowest-cost producer? >> And with the AI factories, they're getting such a sticky footprint. If this hits, and this is where the next big topic in the keynote was, this AI Factory 2.0 with NVIDIA, this product has legs, if they can get the adoption up in the enterprise. There are still some holes in the stack on the AI side: what do you run on it? >> But really, what's 2.0? 2.0 is faster hardware. >> New Blackwells. - It's faster. >> Eight Blackwells- - It's better NVIDIA stuff. >> Eight Blackwell 300s, 256 GPUs per rack.

Okay, with the RTX Pro 6000 servers for agents, the thermal innovations, the enclosed rear door heat exchangers on Dell's racks, a huge selling point for Dell, they tout that around. NVIDIA Spectrum-X switches. That was a big- >> Yes.

>> I mean, I was like, whoa, PowerSwitch. That's NVIDIA Spectrum-X; that's going to get attention. That's not Ethernet. >> What I'm saying is, what I want from 2.0, or maybe it's 3.0, is a fuller stack. I want to understand more about the data stack, the governance layer, the agentic orchestration framework, all those things that are ultimately going to make a true AI factory. Because an AI factory is not just the hardware; it can't just be the hardware.

It's got to be the other intellectual property that the company has, that it can exploit to actually do what it's intended to do: create outcomes for that business, whether it's financial services or healthcare, like we heard from Sloan Kettering. And it's all going to happen; it's just taking some time, and it takes ecosystem. You've got to have more ecosystem participation and integration, and that just takes time. It can't be all services having to cobble this- >> Project Lightning was mentioned. I wrote in my post, before we came out, that bulk parallelism at scale is a big factor in our research and what we're going to be covering.

That's high-performance compute, high-bandwidth fabrics, low latency, custom optics. Okay, parallelism: that's a parallel file system, okay? The token surge is coming, Dave, and there was obviously the object store as well.
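As a rough illustration of the bulk-parallelism pattern being described, here is a generic sketch using Python's standard library. This is not Project Lightning's actual API; the shard directory, file extension, and worker count are hypothetical. The idea is that when data is sharded and workers pull shards concurrently, aggregate bandwidth, not any single node, sets the ceiling, which is exactly what a parallel file system is built to feed.

```python
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

def process_shard(path: Path) -> int:
    """Toy per-shard work: count bytes. A real pipeline would tokenize, embed, or transform."""
    return len(path.read_bytes())

def process_dataset(shard_dir: str, workers: int = 16) -> int:
    """Fan shards out across worker processes; throughput scales with workers
    until the storage fabric, not the CPUs, becomes the bottleneck."""
    shards = sorted(Path(shard_dir).glob("*.bin"))
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(process_shard, shards))

if __name__ == "__main__":
    # Hypothetical path; a parallel file system or object store would sit behind it.
    total_bytes = process_dataset("/mnt/dataset/shards", workers=16)
    print(f"processed {total_bytes:,} bytes")
```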

The rise of multi-step reasoning is a huge trend. If you haven't heard of multi-step reasoning, check it out. It's an on-the-fly decision that branches chain-of-thought processing, and that changes everything. So a 10-million-token question can balloon into a hundred million tokens. Okay? Chain of thought is the core of reasoning.

So this multi-step process will be the key to agents. Agents equal more token demand. That token demand equals more, bigger, faster systems. This is going to be a gravy train, very much a Moore's Law-like thing. We've talked about this on theCUBE many times: Intel would come out with a processor, 286, 386, 486, and add performance.
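Here is a back-of-the-envelope model of that token math. It is a minimal sketch; the baseline tokens per query, the reasoning multiplier, the branching factor, and the agent count are all assumed for illustration, chosen only to echo the roughly 10x-to-100x inflation described above.

```python
def tokens_per_query(base_tokens: int, reasoning_multiplier: float,
                     branches: int, steps: int) -> float:
    """Assumed model: each reasoning step re-expands work across `branches`
    chains of thought, multiplying token consumption."""
    return base_tokens * reasoning_multiplier * (branches ** steps)

# Illustrative inputs only.
plain_chat  = tokens_per_query(2_000, 1.0, 1, 0)   # no reasoning
with_reason = tokens_per_query(2_000, 10.0, 2, 2)  # multi-step, branching chain of thought
agent_fleet = 1_000 * with_reason                  # 1,000 agents running the same workload

for label, toks in [("plain chat", plain_chat), ("with reasoning", with_reason), ("agent fleet", agent_fleet)]:
    print(f"{label:>14}: {toks:,.0f} tokens")
```

Whatever the exact constants, the shape is the same: reasoning and agents multiply rather than add, which is the flywheel argument for ever bigger, faster inference systems.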

We hear Michael Dell saying it. We hear Jensen Huang: we will always build a roadmap, we will increase the performance. We are seeing a generational shift at the scale of AI that looks like the PC revolution come full circle. Here we go. >> He says a million X performance every 10 years. A million X in 10 years.

I mean, what are we going to do with that? I don't mean that in a negative way. Imagine what's going to be done with all that performance. >> All right, I'll put this in perspective.

>> Imagine what stuff developers are going to do. >> Put this in perspective. If you're listening and watching: you're in ChatGPT and you type in a query, a question with no reasoning, that's 10 million tokens.

You type it in with reasoning, to get smarter, and it's a hundred million tokens. Now that's a human. Imagine apps and agents, machine to machine, generating tokens. Everything is tokenized, so a token tsunami is coming to the world. Dell, I mean, this is just the beginning. Again, I'm very bullish on this sector because the fact of the matter is the tide is not stopping.

It's going to rise. All boats will float. If people's anchors are still on the ground, the boat will capsize. So you got to ride this wave. I mean, this is a big 100X thing. Cloud is going to be hot.

Edge is going to be hot. The on-premises data center is now the computer. It's as clear as day. Okay, where are we, Dave? Not even the first inning? >> Well, so where are we? Frankly, we're still in the cloud. I mean, that's where all the action is.

If you just look at the numbers, the vast majority of AI is still happening in the cloud from a spend standpoint. I mean, it's not even close. And so you map Jensen's three vectors of AI opportunity: AI in the hyperscalers, AI in the enterprise, and AI in robotics. Robotics doesn't even show up yet.

It's really still largely in the experimentation phase. And there are broadly two types of robotics. One is the kind that does the same thing every day, right? Up and down, up and down, the same thing. That's versus a humanoid robot.

That's going to take a long time to emerge. Okay, so back to where the action is. Today, it's in the cloud. I think in the next 12 to 18 months, you're going to start to see real traction in the enterprise, but it really won't start to spike in a big way, in my view anyway, until the end of the decade. It's going to grow very rapidly, but from a very small base, and I think by the end of the decade, early next decade, it overtakes the traditional general-purpose marketplace. But I do think it just doesn't happen overnight.

IT doesn't move that fast. >> Here's my summary, a hot take to kind of wrap up. Obviously, Dell has huge ambitions. They are betting the company and its future on a data-first AI strategy. Very clear. My takeaways: hyperscale to microscale,

everything is going micro at hyperscale. So hyperscale, like AWS, is going into the enterprise, to your point. I see a huge surge going into 2026 and beyond for enterprise AI because of microscale and all the things they talk about. Banking on AI: the financial markets sector is going to boom, life sciences as well. Every vertical is reinvented: life sciences, healthcare, every app.

A platform for every scale: the infrastructure message is there's a suite for everything. You want LLMs? You get a long tail, like we predicted. You want some PowerEdge servers? Buy racks or factories. And finally, the modern architecture will be re-platformed. There's no way a company can survive this new token economics, this token technology wave, without absolutely re-architecting their enterprise.

No doubt in my mind, this is going to be a massive CFO problem to solve, a spend problem, and ultimately a bet-the-company, generational decision. Once you make your decision, you're going to be... It's hard to walk that back. It is going to be a big call. >> It's interesting to hear what John Roese was saying about the approach he used internally to figure out where they could get the most leverage. He said, look, if you're not cutting costs and you're not driving revenue,

you're not getting on the short list. And there are so many opportunities to do just that. I think David Floyer is big on this. He's like, new companies are going to... And I think he's right on: new companies are going to emerge that are greenfield, and they're going to build processes with no legacy process baggage.

We talk about technical debt all the time on theCUBE. There's process debt, big time, and that's a lot of inertia. So those companies that can figure out how to re-architect themselves are going to win, but new players are going to emerge. >> And I think the fearless mindset is going to be critical.

It's daunting cost and complexity if you let it be. So I think startups are more eager to run through a fire. Greenfield is, hey, greenfield. >> Hey, if it fails, it fails. Start another one.

>> We've got two days of great coverage of Dell Tech World, our 15th year covering this sector. >> 16th year. - 16th year, right? >> Right. You've got to count 2010.

>> As Brian Bowman would say, "I had black hair then." I want to thank everyone for watching, and Savannah Peterson for being here. Kristen Nicole Martin and our whole team are here. Check out SiliconANGLE.com, thecube.net, and of course thecuberesearch.com; we've got posts on theCUBE Research.

Dave's got a post up on thecuberesearch.com, of course Breaking Analysis, and it's on SiliconANGLE as well. Thanks for watching.
