Intel's CES 2022 News Event: 12th Gen Intel Core, Mobileye Advances and Intel Arc Updates (Replay)



(bright upbeat music) - [Narrator] It starts with a spark, a spark that creates momentum. (bright upbeat music) Momentum that allows us to reach new heights and better the lives of every person on earth. Momentum that creates transformative experiences, that fuel connections, creation and play, momentum that helps us break through existing architectures and transcend what's possible, momentum that drives evolution in safety, mobility, and how we feel about how we travel and how we think about intelligence. At Intel, we create the momentum that will reshape our future and create the spark that changes literally everything. (bright upbeat music) - Welcome to Intel's CES press conference, we hoped we would be together in person today with a live audience, but in the interest of health and safety, and through the power of some technology, we're thrilled to be here with you virtually today from Las Vegas.

The technology shown at CES every year always amazes me, and this year, more than ever, it makes me grateful for the technologists who are dedicated to advancing innovation. We are living in a world where the digitization of everything has become accelerated, transforming how we live, work and play. This acceleration has been driven by what we call the four technology superpowers: Ubiquitous Computing, Cloud Edge Infrastructure, Pervasive Connectivity and AI. Each one of these technologies is powerful on its own, but together, they reinforce and amplify each other, and at Intel, we are harnessing these technology superpowers to deliver world-changing technology. We're making ubiquitous computing a reality, helping people to interact with the constantly changing face of technology. We're creating infrastructure that spans the cloud to the edge, to help process the massive amounts of data while addressing the demands for lower latency and higher bandwidth.

We're driving pervasive connectivity, enabling technology to seamlessly communicate with everyone and everything, and, we're bringing intelligence to it all with our advancements in AI. Today, I am gonna be joined by leaders from across Intel, including the vice president of the Visual Compute Group, Lisa Pearce, and the CEO of Mobileye, Professor Amnon Shashua. Together, we'll highlight how we are unleashing these superpowers in three specific areas. First, the incredible advancements we're making with industry leading PC experiences. Second, the momentum and progress in our graphics business, and, finally, the new advancements in automated driving solutions coming from Mobileye. Our time together is gonna be action-packed with announcements every few minutes, so, let's just dive in, and let's start with a technology that's closest to my heart, the PC.

Now, the PC is one of the most essential tools of modern times, and we're committed to advancing client innovation further to deliver purposeful computing experiences to unlock people's potential and bring that idea of ubiquitous computing to life. Just last quarter, we launched our 12th Gen Intel Core desktop processors, headlined by the world's best gaming processor, the Intel Core i9-12900K, and the response has been just incredible. We are on pace to have our fastest enthusiast desktop ramp of all time, and it's no surprise, we created the 12th Gen Core family of processors with superior performance to unlock the experiences that matter most to people, from gaming to creating, communication, collaboration and more. 12th Generation Intel Core represents our most significant breakthrough in x86 architecture in more than a decade. Built on our Intel 7 process node, these processors are Intel's first performance hybrid design featuring two core architectures, performance cores and efficient cores, and with Intel Thread Director built in, we can ensure that the right workloads move to the right cores, to deliver the best experience possible, and today, we are going to expand the 12th Gen family further.
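The "right workloads to the right cores" idea behind Thread Director can be illustrated with a toy model. Everything below is hypothetical for illustration (the `Task` type, the `demand` scores and the 0.5 threshold are invented); the real Thread Director is hardware telemetry consumed by the operating system's scheduler, not a user-facing API:

```python
# Toy sketch only, NOT Intel's interface: it just models the routing idea of
# sending demanding work to performance cores and light work to efficient cores.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    demand: float  # hypothetical score: 0.0 = background, 1.0 = latency-critical

def assign_core(task: Task, p_cores: int, e_cores: int) -> str:
    """Route high-demand tasks to P-cores and light tasks to E-cores."""
    if task.demand >= 0.5 and p_cores > 0:
        return "P-core"
    return "E-core" if e_cores > 0 else "P-core"

tasks = [Task("game render", 0.9), Task("file transfer", 0.2),
         Task("video encode", 0.8), Task("web sync", 0.1)]
placement = {t.name: assign_core(t, p_cores=6, e_cores=8) for t in tasks}
```

In this model, the game render and video encode land on P-cores while the file transfer and web sync stay on E-cores, which mirrors the scheduling behavior described above.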

Let's start with the announcement that we are in production of our brand-new 12th Gen Intel Core KS-series processor. This processor takes performance to all new heights. It has a whopping 5.5 GHz single-core turbo right out of the box, and with optimizations for performance cores, we can get above 5 GHz on multi-core performance, and I've got Chuck here with me today to show you that processor in action.

Hey, Chuck. - Hey, thanks, GB. Here I have the game "Hitman 3", which has been optimized for our new performance hybrid architecture. Now, the high frequencies and the intelligent use of P-cores and E-cores seamlessly working together result in unrivaled gameplay. So, you can see the game here, and GB said you would get those P-cores at above 5 GHz, but you'll see, those are hitting 5.2 GHz across all cores right out of the box. - Wow, that's incredible, Chuck, and we'll ship this new enthusiast desktop part to OEM customers by the end of this quarter. The platform technology we've showcased today isn't limited to just our desktop line up, I'm thrilled to announce that starting today, we're bringing our new hybrid architecture to performance laptops with the launch of eight brand new 12th Gen H-Series mobile processors.

With up to 14 cores and clock speeds up to 5 GHz, our 12th Gen H-Series delivers up to 40% higher performance than our prior generation. The result is the world's best mobile gaming platform, period, and we've tested it against everyone, and I've invited Chuck back up to show it to you in action. - The new H-Series processor delivers incredible real-world gameplay experiences, and we're not just talking one or two games, you can see here we are running the popular game "Hitman 3" against the best the competition has to offer in the market today.

When we tested it in our lab, we saw that our system got 49% higher frame rates, and we have "Riftbreaker" at 27% higher FPS, "Mount & Blade II: Bannerlord" at 23% and "Total War: Three Kingdoms" at 47%. As you can see, across the board, we beat the competition on those top games available today. - Now, thanks, Chuck, I love seeing the breadth of performance leadership here and we expect to maintain that leadership through 2022 and beyond. Now, the 12th Generation Core H-Series isn't just good at one thing, though, we designed the processor family to conquer multiple high-intensity workloads at the same time, meaning that people can game, create and record, without compromise, and no one knows this better than Tokki, a content creator and Twitch streamer, who's been putting our Core H-Series processor through its paces, and we're not just joined by Tokki here on stage with me today, we're joined by all of her fans around the world because she's right in the middle of a match so, thank you very much for being here with me today. - Yes, GB, thank you so much for having me. - [GB] So, I know you've been putting the processor through its paces so, tell us about your experience so far.

- Absolutely, so, I understand that I'm one of the few who has had the opportunity to test out the new 12th Gen device, and, I have to say, seriously, wow, like I genuinely didn't know what I was missing until now. I mean, it's no secret that I love gaming. - Yeah (chuckles). - But to be totally honest with you, my real love, my real passion is streaming itself and connecting with my community and by the way, look at how many of them are here. - Yeah, I can see you're blowing up over here on the screen, that's great.

- I've been doing this for some time, I've been streaming professionally for about three years now and to be honest with you, in the past, there was always some element to my stream that was compromised, whether that was some frame rate issues or just a laggy stream in general, which no one likes, and I have to say that with the new 12th Gen device, all of those issues and worries are a thing of the past, I'm so excited for it. - No, I love it, I love gaming, streaming, all without compromise, it's a huge thing for us, so, thank you and thanks to all your fans around the world for joining us here live at CES. - Yes, air high fives.

- High five virtually in the air, okay, great. Well, hey, we've shown that 12th Generation H-Series processors are great for gamers and streamers, so now let's also show the experiences they unlock for creators. Whether you're a hobbyist, a studio animator or doing other kinds of creative tasks, we built 12th Generation Core for you, and with our performance and efficient cores, you can execute performance-heavy tasks like encoding or rendering, while working on background tasks like file transfers, web browsing and more, all at the same time without compromise. But to truly appreciate it, you've got to see this in action, so, in our ongoing work with our creator community, we know that virtual production in real-time 3D engines is one of the hottest trends in this growing industry.

It reduces the lag time between creation and finalization without sacrificing quality, and I'm thrilled to share that we have Rich Hurrey, an expert in the field of Animation and Visual Effects and the co-founder and President of Kitestring, to show you this process in real time. So, welcome, Rich, thank you for joining me. - Thank you, GB, I'm super honored to be here today.

It's funny seeing all this innovation, it really is, for me personally, a driving force for my creative journey as a character technical artist, and really to that point, all the advancements we're seeing in the industry now were the inspiration for my team to create Ozone in the first place, which is our technology for creating high-quality characters for film, TV and games. Ozone Rig Studio allows for a rig-once, animate-anywhere approach to production, allowing artists to work in Maya, Modo, Unreal, Unity, and really any application over time, so, anyway. - That's great, I know you brought us some things to show us in action. - I do, I do, I do. So, this scene I was working on last week on the road and honestly, I gotta say, with this laptop with these new processors, I'm getting workstation-level performance on the go. - That's great. - It's awesome.

So anyway, let's take a look at this scene here and I'm gonna show the process of taking animation from one environment to another really quickly. So, let's do this, I'm gonna hit play on this and it's pretty easy, so, what we do is we'll just export an Ozone clip here out of Maya and then we'll take that and then we'll bring it into Unreal and let it play, we'll see that we've got the same animation here running in the real-time environment. Now, if I wanted to make a change, in the past, I'd have to go back into my animation environment and I'd have to make a change, I'd have to re-export and bring it in and it would really take me totally out of my flow, but with Ozone Rig Studio, I can come in here and I can make changes live. So, we're gonna select the Mushroom Puppy here, take that thing away, we call it...

- I like Mushroom Puppy. - Creepy cute, and I type in cute instead of jaw, so, we'll do that, but you can see that now I can literally change this on the fly in context. I don't have to kind of jump out of my flow, and if I go here, we'll do it on our other character Zoe and I can do the same thing with her. So, I'm gonna go ahead and set off her jaw, we're gonna look at her brow controls and I'm gonna add a little bit of concern.

I can do this and I can just bring up her brows. - [GB] A little more, oh, yeah. - Right, and it really shows some concern, but again, I didn't have to come out of my flow, I get to work in context, and on this laptop, I can work in real time. - No, I love it, I love that we're able to keep you in the flow with that performance and you don't have to jump out and then come back, it seems like a real game changer.

- It is a fantastic way to work and Ozone Rig Studio is designed to enable feature film quality characters, whether you're in games, virtual production or in a traditional animation pipeline, it really allows artists to work in context, in real time, and honestly, just like your new family of processors, all without compromise. - That's fantastic, well, Hey, Rich, thank you very much doing this live, it's fantastic. It's very, very exciting.

So, from gaming to creating, our 12th Generation Core H-Series processors deliver superior laptop performance, and as part of one of the most open and robust ecosystems in the industry, we're proud to be working with our partners to deliver choice in design and experience. (bright upbeat music) Now, today, I'm pleased to introduce more than 20 new designs powered by the H-Series processors, from partners including Dell, Gigabyte, HP, Lenovo, MSI, Razer, Republic of Gamers, and more.

(bright upbeat music) Ah, I love it, and this is just the beginning, in total, we will deliver more than 100 new designs this year, and, Chuck, I believe you've got one of Acer's newest designs with you here today. - That's right, GB, this is the new Predator Triton 500 SE. This device features the perfect mix of technology in an incredibly sleek design to fuel your gaming from anywhere.

This includes 12th Gen Core, up to 32 gigs of DDR5 memory, high-speed PCIe Gen4 storage, and Intel Wi-Fi 6E, which is built into every 12th Gen platform, so you can have confidence in an uncompromised on-the-go experience. - We're not only bringing this architecture and performance to the enthusiast and gaming space, we're also bringing it to thinner and lighter laptops, and with that, I'm very excited to announce we are in production and shipping a new product line called the 12th Gen Intel Core P-Series. This family includes six brand-new processors, delivering up to twice the performance of our 11th Gen U-series systems in market today.

So, Chuck, let's show everyone what they can expect from the P-series. - Happy to, GB, the new P-series devices are tailor-made for the performance needed in a variety of thin and light laptops and cutting-edge designs like these from Acer, Dell, HP and this system here from Lenovo. So, the new ways we are working, collaborating and creating can be done anywhere without missing a beat. - Well, thanks, Chuck, those are some great moves there. As you've heard today, our Intel execution is back and we're moving at a more accelerated pace than ever.

In total, we're delivering 28 new 12th Generation Intel Core mobile processors across our H-, P- and traditional U-series product lines, resulting in more than 350 mobile designs this year. And with our next-gen processor, code-named Raptor Lake, on track and already booting Windows, you can expect even more advancements from us in performance and choice coming later in 2022. Now, the innovation we're driving with our CPUs is just part of the story.

As you know, we've been on a mission to deliver the best PC experience with our Intel Evo platform. Since its launch, we've delivered more than 100 Evo-verified designs from our leading OEM partners. We co-engineer and test each design against real-world conditions to ensure these are the best experiences on the planet. To date, we've gone through two generations, and today, I'm pleased to announce the third generation of our Intel Evo spec. In this spec, we are adding new technologies to the platform including Intel Wi-Fi 6E, background dynamic noise reduction using the AI engine built right into the platform, and for people looking for additional performance, we are extending the spec to include select 12th Gen H-Series designs and an option that includes our new Intel Arc discrete graphics that you're gonna hear more about later. Now, Chuck, over to you to show how Dell leveraged all of this new technology in one of their upcoming Evo designs.

- So, GB, with these new technologies added to the platform, we are able to engineer incredibly powerful devices like this brand new Dell Precision mobile workstation. This is a gorgeous ground-up redesign that features our H-Series 45-watt processor and Intel Arc discrete graphics, alongside a powerful platform that supports consistent responsiveness, all-day battery life and more, and really, you know, to have a 45-watt processor and a discrete card in something that looks like a U-series chassis is incredible. - Yeah, I love it, I love that design, it's absolutely stunning, and we're adding more choice so people can pick the right experience that meets their needs, all backed by the Intel Evo promise of delivering the best experiences on the planet, and you can expect us to continue advancing and evolving the spec.

Now, one of the things that jumped out at us was this need for a better, more unified cross device experience. According to a recent study, 70% of people access the internet across multiple devices, and 90% of those people use multiple screens to accomplish a single task. At Intel, we wanna unify that experience and make it easier, and with that, I am pleased to announce that we've acquired Screenovate, a pioneer and leader in delivering technology for advanced interaction between multiple devices based on different operating environments. Now, by integrating and building on Screenovate's technology we intend to enable new interactions with full flexibility across ecosystems, operating systems and form factors, and most importantly, it will be built on top of that Intel Evo promise I just talked about. So, Chuck's gonna give us a glimpse of what this experience will look like. - Hey, GB, this new Intel technology will break through the communication barriers, no matter what your devices are and no matter what OS you're running.

So, let me give you some examples. The first thing we have is my iPhone here, and you can see my iMessages and SMS texts are showing up on my Evo laptop. Not only can I see them, I can easily reply. Now, there are also devices that are attached to our phone, such as our watches.

So, let's go ahead and make sure we can gather everything in here, we'll go to our health section and you can see I've got my heart rate and my oxygen rate right there, and I can also take a look at my recent ECG captures. So again, all my health data, right here on my Evo laptop. Now, many people also love second displays when they travel, so, let me show you how I can take my Android tablet and turn it into a second display.

We're gonna click on our tablet, we're gonna hit extend screen and you can see how my workflow is already moving across both screens. Now, let's take a look at those devices that everybody has in their house, like a smart TV. So, I'm gonna go ahead and take a picture real quick. Going to my TV, let's open the camera. (camera snaps) Oh, gee, there you are, GB. (both chuckle) - I was photo bombing a little bit there, sorry, Chuck.

- I'll just go ahead and project that up to our TV and you can see it ends up right on my TV so I can share it with everybody. See, you caught me by surprise, GB. Now again, here are all our devices: we've got Chromebooks, Android phones, Android watches, smart TVs, Android tablets, iPhones and Apple Watches, all seamlessly connecting to our Evo platform. - Wow, I love it, Chuck, thank you so much. This is really gonna be a game-changing experience, and I love to see that the PC remains the center of it all, enabling all those devices to work seamlessly together, and I am excited to announce that this experience will be available on a select number of Intel Evo platforms for Holiday 2022. Now, the PC really is one of the most innovative and open platforms ever invented, and we are committed to advancing the experience to ensure it remains the human touch point in a world of ubiquitous computing.

To do that, we will deliver new technologies to the platform just like I showed, and beyond, and with that, I am thrilled to welcome Lisa Pearce to share the advancements we are making in our discrete graphics business. Hi, Lisa. - Thanks, GB. To build on the momentum you just shared, I am excited to announce that we are now shipping our Intel Arc discrete GPUs for 12th Gen Core H-Series mobile designs to our leading OEM customers. With our long-standing partnerships with key OEMs, we've enabled rapid integration of Intel Arc into their next-generation platforms, in fact, some of the designs are with us today. This is Alienware's x17 that will enable a premium laptop gaming experience powered by 12th Gen H-Series, Intel Arc and Alienware's Cryo-Tech cooling.

It's also Alienware's thinnest 17-inch gaming laptop to date. Another example is Lenovo's ultra-portable Yoga. This enables high-performance mobile content creation and enhanced gaming in a multi-form-factor device. Together with our partners, we will launch more than 50 mobile and desktop designs, all using Arc. In fact, many of the H-Series systems we showed already have Arc integrated. One of the advantages we can offer our customers and gamers is the enhanced experiences that come from combining Intel Arc GPUs and Intel Core platforms.

Which brings me to Deep Link, a collection of technologies where our CPU and GPU architects have collaborated at the platform level to deliver better experiences with Intel Arc graphics and Intel Core platforms. Our first technology in the Deep Link portfolio is called Dynamic Power Share, where the processor and graphics communicate to find an optimal set of performance levels depending on the workload. When graphics needs more power, it can shift power from the processor, and vice versa. We'll also introduce new gaming features that enable more user choice between performance and battery life, and performance while streaming.
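The power-sharing behavior described here, shifting a shared platform power budget toward whichever side is under load, can be sketched as a simple proportional split. This is a hypothetical toy model: the 80-watt budget, the load scores and the split policy are all invented for illustration, and the real logic lives in Intel's firmware and drivers, which are not public:

```python
# Illustrative sketch of the Dynamic Power Share idea, NOT Intel's algorithm.
def share_power(total_watts: float, cpu_load: float, gpu_load: float):
    """Split a fixed platform power budget in proportion to current demand."""
    demand = cpu_load + gpu_load
    if demand == 0:
        return total_watts / 2, total_watts / 2  # idle: split evenly
    cpu_watts = total_watts * cpu_load / demand
    return cpu_watts, total_watts - cpu_watts

# A GPU-heavy game shifts most of the hypothetical 80 W budget to graphics.
cpu_w, gpu_w = share_power(80.0, cpu_load=0.2, gpu_load=0.8)
```

The key property is that the two allocations always sum to the fixed budget, so power freed on the processor side becomes headroom for graphics, and vice versa.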

But today I wanna focus on our next Deep Link technology, Hyper Encode. With the rise of social media, video editing has become a major workload for PCs, and it's very demanding on GPUs, which are responsible for both the decoding and encoding of media. Depending on your content, the video export process, where modified videos get encoded for external sharing, can be very time consuming. This diagram shows the encode process today, without Hyper Encode: each sequence of frames is processed in order, one block after another, on one device.

Now, with Hyper Encode, we leverage all the technology in the platform and automatically divide the encoding work between the CPU's integrated graphics and the discrete Intel GPU. This speeds up the encoding by 1.4X. Now, let's take a look at a demo. (bright upbeat music) This is DaVinci Resolve, a popular video editing program from Blackmagic Design. We're showing the export process where an edited video is encoded for sharing.

You select Render All to start the export. On the top of the screen, you can see the frames that are currently being encoded. In the middle, we are using our free public tool, the Intel Graphics Performance Analyzer, to show activity on both our integrated and discrete GPUs, and at the bottom, you can see the progress bar blown up from the DaVinci Resolve UI. As you can see, a traditional system would only utilize one of the devices during the encode process; in contrast, on the right side, the system has Hyper Encode active. Also notice how quickly the progress bar is moving on the Hyper Encode side. You can see both GPUs are active in encoding frames.
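The split behind this demo, two devices encoding different parts of the same video concurrently instead of one device working serially, can be sketched with a thread pool. The "encoders" below are simulated stand-ins invented for illustration; Intel's actual Hyper Encode implementation lives in its media drivers:

```python
# Simulated sketch of the Hyper Encode idea: divide the frame sequences
# between two devices and encode both halves concurrently.
from concurrent.futures import ThreadPoolExecutor

def encode_chunk(device: str, frames: list) -> list:
    """Pretend-encode a run of frames on a given device."""
    return [f"{device}:frame{f}" for f in frames]

def hyper_encode(frames: list) -> list:
    """Split the frames between an integrated and a discrete GPU worker."""
    mid = len(frames) // 2
    with ThreadPoolExecutor(max_workers=2) as pool:
        igpu = pool.submit(encode_chunk, "iGPU", frames[:mid])
        dgpu = pool.submit(encode_chunk, "dGPU", frames[mid:])
        return igpu.result() + dgpu.result()

encoded = hyper_encode(list(range(8)))
```

Because the two halves are independent, the wall-clock time approaches that of the slower device rather than the sum of both, which is the intuition behind the speedup reported in the demo.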

By combining the power of our integrated and discrete GPUs, the Alchemist Hyper Encode system completes the encode 1.4X faster than a discrete GPU alone. Now, of course, for many discrete GPU users, it's all about gaming, and to deliver the best experiences on Arc, we partner with major game developers to optimize for Intel devices. As part of that work, I'm excited to announce an exclusive partnership with Kojima Productions and 505 Games. "Death Stranding Director's Cut" will be coming to PCs this spring. This is the definitive version of the game, with additional game modes, a new infiltration mission and extra content.

We've been working with Kojima Productions on the integration of many key Intel technologies. For example, the game will deliver optimal core utilization on 12th Gen Intel Core processors and will support XeSS, our AI-based image upscaling technology for our GPUs. With the "Director's Cut", gamers can explore a vast world at ultra settings with high performance enabled by XeSS, and we've seen rapid and eager adoption by the gaming ecosystem. Our initial titles include "Hitman 3" and "The Riftbreaker", which will both benefit from the enhanced performance and improved visuals of XeSS.

We've also been working with 10 leading studios on multiple engines and multiple titles, enabling a broad collection of XeSS games throughout 2022. Shipping Intel Arc to leading mobile OEMs, and broad adoption of our new technologies like XeSS and Hyper Encode, marks an important milestone on our discrete graphics journey. So, stay tuned, more excitement is ahead, back to you, GB. - Wow, thanks, Lisa, that's really exciting stuff, this is such a huge step forward on our XPU journey. We are delivering the computational horsepower our customers demand in whatever architecture or form best suits their application, and, you can count on Intel to continue to partner closely with ISVs to ensure their games and applications run best on Intel, and now, for the final announcements of the show, I'd like to take everyone on a journey from the client to the edge, to show how Intel and Mobileye are going to drive us into the future.

To share more, I'm happy to introduce Amnon Shashua. - Thanks, GB, it's truly a great time to work on autonomous driving. 2021 was a really meaningful and a record year for us.

Our revenue rose 40% year on year to $1.4 billion, and we had 41 new design wins, which again is a record number for us. Those design wins represent 50 million new cars on the road going forward. This year, we also launched 188 new car models with our technology, with our chips, and we had four industry-first launches. One was with Honda, the first Level 3 system launched in Japan, where we were responsible for the computer vision. The first eight-megapixel camera with a 120-degree field of view launched this year with BMW. Volkswagen, we'll talk about this later, launched Travel Assist 2.5, with a cloud-based enhancement using our crowdsourced REM technology, this year, and our biggest effort in driving assist is SuperVision: 11 cameras around the car, powered by two EyeQ5 chips, launched a month ago with Zeekr, a brand of Geely. A few thousand cars have been shipped already, and through over-the-air updates, more and more advanced features will be delivered to customers.

Now, one of the reasons for our great success is Ford, and to help us learn more about our longstanding and now expanding relationship, I'd like to welcome Jim Farley, Ford CEO. Jim, welcome to CES, I'm delighted you can join me on stage today, even though we are having to meet virtually. - Amnon, thank you, such a privilege to represent all the team members at Ford. We've been working on safety technology for like a century so, we love working with Mobileye.

Your technology is incredible, your vision sensing technology is fundamental in all of our ADAS systems, especially your EyeQ sensing systems, so, congratulations on all your success. - Thank you, Jim, it's truly a privilege to work with you guys. Our relationship goes back more than a decade together, bringing advanced technology to millions of cars globally. It's a relationship that we are expanding, can you give us some more color on what's new? - Well, when we first got into ADAS systems, and now into Level 2, you know, the mapping technology was something that was a concern, and your REM mapping technology in our future versions of BlueCruise is really important for hands-free driving solutions and a lot of applications where there aren't concrete lanes, and it's really safety-critical for us to have lane centering in those situations, your technology is second to none. - I agree as well, very excited to deploy maps across the Ford family. You know, the maps are cloud-based enhancements that provide value to end users, so, Jim, what will the Ford-Mobileye future look like? - Well, further out, Ford and Mobileye are working on a number of innovations, you know, since it's been a long relationship and we trust each other, but what I'm most excited about as a CEO is working on your open platform on a whole new generation of autonomous technologies that are gonna really change customers' lives, moving from safety to doing all sorts of new things inside the vehicle, and it's thanks to your technology.

- That's right, our new EyeQ 6i and EyeQ Ultra, which I'll talk about tomorrow during the technical deep dive, give Ford and the industry a platform for innovation that's directly tied into the EyeQ functionality. - All of the progress is great, I love how the teams are working together, Amnon, it's been great to work with you specifically, and your team of technologists is really so broad. I mean, we just couldn't offer the systems that we do at Ford without you, and we're betting on Mobileye for our future. - Thank you, thank you, Jim, for joining us, it's truly a privilege to work with you and your team, and here's to a great 2022 for all of us, thank you. Now, I'd like to turn to another one of Mobileye's longstanding customers, Volkswagen.

Volkswagen recently debuted a state-of-the-art driver assistance system called Travel Assist 2.5. It is the first system to widely apply Mobileye's unique REM crowdsourced mapping technology. Production vehicles equipped with Mobileye technology are gathering road segment data via their driving assist systems; that data is harvested and sent to the cloud, where automatic map-creation algorithms that we have been developing over the past five years build the maps. Those maps are sent back to vehicles, and onboard the vehicle, there's a localization process where the car localizes itself in the map, and all the map data serves as cloud-based assistance to the driving assist. Of course, it's also powering our AV technology, but in the context of what we'll be seeing right now, it's a cloud-based enhancement for driving assist. I was in Munich two weeks ago and met with the Volkswagen Group chairman of the board, Dr. Herbert Diess, and we went for a drive to see how it works. I would like to share with you some of our conversation.

(bright upbeat music) - I'm really excited to be here because we need a good understanding to make the right decisions. - You know, we've been working for so many years together and we have been doing great things with Volkswagen over the years. When you talk about autonomous driving and driving assist, there's lots of evolution for driving assist that'll give lots of value to customers. It's really adding a very unique element, which is now cloud-based assistance, but here we're doing something together.

Actually, we had started working on it back in 2017, I think. - I think even in '16, we started the first discussions. - Yeah, yeah, and now we see it really in production, lots and lots of data about the world is injected into the driving assist, so, even if you drive in an area where you don't see lane marks, the cloud knows about it and, you know, helps the lane centering, that's great. (bright upbeat music) - Yeah, you can feel that it's REM data because, you know, we're not in the middle of the lane, we're not to the right, we're just doing a natural pass now. - And it's continuously on. - It's continuously on and it's working really well.

- [Amnon] You see road boundaries, there are two lanes here and you don't see a lane boundary marker. - Now it's working well. - So, I think the next step of using REM is to start using traffic lights.

Today, REM is only the drivable path. Once we're using traffic light information, then we have data about road priority, who has priority over what, and then it creates a very, very smooth control in areas where you have splits and merges, so that you know who has priority. (soft serene music) - Do you think it will be possible at some stage that we can update the AI part of the camera? - I think that the AI part has two parts. There is one, let's call it the pattern recognition: the algorithms for knowing that there is a road user, you know, a horse, a car, a motorcyclist and so forth. And then there's the part that understands a more and more complex surrounding, and sometimes it becomes difficult for the onboard processing to really understand, in a fraction of a second, what's going on there, and this is the second part of the AI, which you can do through swarm, the cloud processing. - But it remains cloud knowledge then. - It's cloud knowledge, and the beauty of the cloud is you have much more processing power.

For example, you know, in the ID.4, we're talking about 2.5 million kilometers of road in Europe from scratch. - 2.5 million kilometers. - So, we take all the swarm data, and it takes us a week, and there is no reason, you know, with the advancement of computing, that it won't take us only a day, and then later, it'll take us half a day. So, everything that happens in the world can then be updated and streamed to cars.

- Are you happy with how we can work together now? - I would like to say, in terms of our relationship, we have been doing things that are very, very innovative. For example, the swarm data: Mobileye and Volkswagen were the first, as we talked about, in 2016, it's very, very innovative, but I think there's lots more to do. - I fully agree. - Very good. - Okay.

(bright upbeat music) - So, REM is truly a game changer: it's lean data, automated map creation in the cloud. Let me show you some of the "Under the Hood" of what we've just seen while I was driving with Herbert. You see here in this clip, as we are driving, we're going to take a right, and on this rural road, there are no lane marks. Now, the two magenta lines are the central drivable paths; you see there is one oncoming and one for the path that we are driving through. So, without the map data, it would not have been possible to do lane centering on this kind of road: either the technology would not be available, or the car would center itself in the middle of the road, not being aware that there is also an oncoming path. Another piece of "Under the Hood" while we're driving there: you see the traffic lights, and the REM map technology can associate traffic lights with drivable paths. This is a very important feature, not yet available in Travel Assist 2.5, where you can provide a very powerful customer function to prevent cars from running a red light. It's not enough just to detect a red light; one needs to know the relevancy of each traffic light to a drivable path, and this is also provided by the map data.
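The relevancy idea described here can be sketched in a few lines: the map stores, for each drivable path, which traffic lights govern it, so the car only reacts to lights associated with its own path. The data structures, IDs and function below are invented for illustration and are not Mobileye's actual API.

```python
# Illustrative sketch: use map data to decide which detected traffic
# lights are relevant to the path the car is actually driving.
# All structures and IDs here are hypothetical.

# REM-style map fragment: each drivable path lists the IDs of the
# traffic lights that govern it.
path_to_lights = {
    "path_straight": {"tl_1"},
    "path_right_turn": {"tl_2"},
}

def relevant_red_lights(ego_path, detections):
    """Return detected red lights that govern the ego vehicle's path.

    detections: list of (light_id, color) pairs from the camera.
    """
    governing = path_to_lights.get(ego_path, set())
    return [light_id for light_id, color in detections
            if light_id in governing and color == "red"]

# The camera sees two red lights, but only tl_2 governs the right
# turn, so only that one should trigger a stop.
detections = [("tl_1", "red"), ("tl_2", "red")]
print(relevant_red_lights("path_right_turn", detections))  # ['tl_2']
```

The point of the sketch is the lookup step: detection alone cannot tell the two red lights apart, while the map association makes the decision trivial.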

You can see here just a top view of one of the sections in that area, the richness of the data, the accuracy of the data; all of this is really a very, very strong amplification of what you can do, both with driving assist and later with autonomous driving. If we look at the coverage, we have 2.5 million kilometers of road covered in Europe. In terms of the amount of data collected in 2021, it's around 4 billion kilometers, and today, we have about 25 million kilometers of data collected every day, so going forward to 2022, it's going to be around 10 billion kilometers of data. You know, this kind of data allows us to, you know, efficiently expand our footprint in terms of autonomous vehicle testing into many different territories. We're testing in Israel, we're testing in Munich, Detroit, New York City, Tokyo, Paris.

Tokyo and Paris are new; I'll show a bit of what we have. This is in Paris. Before I run the clip: this is a joint cooperation with RATP Group, the major public transport operator in France. Galeries Lafayette employees can use the autonomous test car, with their safety driver of course, to go from the offices to their home, and the application, the top-layer application, is powered by Moovit, which is a company that we acquired about a year ago. So, let me run this clip. It's a bit fast-forwarded, but you can see the richness of the driving situation. After all, it's Paris; Paris is a very, very difficult scene to drive in, even for a human driver, and you can see the kind of testing we can do, you know, the smoothness of the drive.

Here is another testing site in Tokyo, I'll run this clip. Again, the richness of pedestrians, the roads are narrow, obstacles, pedestrian crossing zones and so forth. So, you can see that our REM data provides us geographic scalability to allow us to do testing really, really worldwide in a very, very efficient manner.

Talking about driving assist: if we take premium driving assist with map data to its fullest extreme, you have what we call SuperVision. The first launch is with Zeekr, a brand of Geely. It was launched about a month, a month and a half ago; there are a few thousand vehicles already delivered to customers, with 11 surround cameras at eight megapixels, powered by two EyeQ5 chips. The ECU, that entire board, is designed and provided by Mobileye, and the kind of driving that you saw in the previous two clips is what this car will be able to do with an over-the-air update delivered throughout 2022. Just to show you an example, this is in Israel: we're testing the vehicle with the kind of software that will be delivered in a few months in China, and you can see the traffic lights, taking turns; it's all the same kind of performance that you saw in the previous two clips, in Paris and in Tokyo, that the Zeekr car will be able to do.

So, a few thousand vehicles have been shipped, and over-the-air updates during the next few months will gradually bring the capabilities of this vehicle to navigate-on-pilot, point-to-point, hands-free driving, but as a Level 2 system. Together with Zeekr, we also announced just this week the first design win for a consumer Level 4 platform. It's going to start production in early 2024, it's going to be powered by six EyeQ5 chips and an ECU that we also designed, and it will be ready for a 2024 launch. Going back to our chips, our EyeQ chips: we have been designing and manufacturing them for the past 16 years. We just announced 100 million chips delivered since 2004.

EyeQs are very, very efficient. It's a combination of hardware and software, it's purpose-built, it has very, very low power consumption and very high performance computing, and there are 100 million cars on the road powered by EyeQ chips. And I'm happy to announce we have three new next-generation EyeQ chips. In the "Under the Hood" session, I will provide more details; it's EyeQ6 and EyeQ Ultra, and I'm showing here the crown jewel. EyeQ Ultra is an AV-on-chip; it is roughly equivalent to 10 EyeQ5s, and it's on a five-nanometer process. There are four families of accelerators that Mobileye has designed over many, many years. There are 64 accelerator cores in this chip, divided into two parts of 32, such that we can provide, with an external MCU, an ASIL-D system.

There is a GPU and an ISP for visualization. Power consumption is very, very light, way below 100 watts, and it's 176 TOPS, though it's important to mention that not everything is about TOPS. The fact that we have a very, very diverse set of cores allows us to be very, very efficient. Take, for example, the clips that I've shown you with our two-EyeQ5 system: two EyeQ5s are only 30 TOPS, and we can do end-to-end perception, driving policy and control. EyeQ Ultra will also have 12 RISC-V cores, 24 threads, so it's a very, very powerful chip.
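The TOPS figures quoted here can be cross-checked with simple arithmetic (my own estimate, not Mobileye's): two EyeQ5s at 30 TOPS imply 15 TOPS per chip, so "roughly equivalent to 10 EyeQ5s" lands in the same ballpark as the 176 TOPS quoted for EyeQ Ultra.

```python
# Back-of-the-envelope check of the chip figures quoted above.
two_eyeq5_tops = 30                  # "two EyeQ5 is only 30 TOPS"
eyeq5_tops = two_eyeq5_tops / 2      # 15 TOPS per EyeQ5
ultra_equiv = 10 * eyeq5_tops        # "roughly equivalent to 10 EyeQ5s"

print(ultra_equiv)  # 150.0 -- same order as the 176 TOPS quoted
```

The gap between 150 and 176 also supports the speaker's point that raw TOPS is not the whole story; the "equivalent to 10 EyeQ5s" claim is about delivered capability, not a TOPS multiplication.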

We'll be able to provide the full electronics of an autonomous car, in a 2025 timeframe, for way below $1,000. Not just the chip, the full electronics, and this is, you know, propelling us to a 2025 timeframe for a consumer AV, which I think is going to be a very, very exciting milestone. So, there are many more details to talk about, many more things to show; I'll do that tomorrow with the "Under the Hood" session, an hour-long "Under the Hood" session. So, thank you, and back to you, GB. - Thanks, Amnon. Today, we gave you just a glimpse of the advancements we're delivering from the client to the edge, and we'll continue to harness the superpowers and drive this accelerated pace of innovation to deliver world-changing technology throughout 2022 and beyond. And now for the final announcement of the day: I am excited to share that Intel will host our second Intel On event, called Intel Vision, on May 10th and 11th.

Alongside our customers and partners, we'll showcase how companies are leveraging Intel technology to help solve some of the world's most pressing business challenges. We can't wait to see you there. And with that, thank you, stay safe, and let's make 2022 a great one. (bright upbeat music)

2022-01-08 16:38
