>> Hey. Good afternoon. Thanks for coming to hear us talk about holograms. Holograms? For those of you who don't know me, I'm Charlie Fink. I'm a former tech executive who now covers AI, XR, and the metaverse for Forbes.
I'm also a professor at Chapman University and at ASU. And also, if I might plug my own podcast, The AI/XR Podcast, which I do with Rony Abovitz and Ted, which just celebrated its millionth download. Woo! Um, I first heard about John when I read a Wired cover story about the Lytro camera in 2016, and it caught my attention because I did not know about volumetric photography, and it wasn't really a thing in the public eye.
You know, there weren't apps that would let you make Gaussian splats or anything. And I thought, oh, this is really interesting, because for the first time we're capturing not just images, but space and the space between images. That's the way we see the world. And of course, because it was kind of a Hollywood thing, I was intrigued, because it suggests you could film a movie from multiple angles at the same time, as if it were a play, and then in a game engine, essentially technology from the game industry, you could go in and set up shots, extract them, and actually shoot and edit your movie again.
And so I got to meet John in person five years later, which was really a treat. When you read about somebody and they're kind of famous to you, and then you get to meet them and... >> You're going to make me blush here. Wow.
And John had just started a company called Light... I shouldn't say that. You were five years into it already. Yeah.
When we met, and I started to learn about what Light Field Lab does. And this incredible idea that we could see naked-eye holograms in the real world. And there are lots of systems that approximate this or suggest it, but no one has really done it. So that's what we're going to be talking about today. But before we jump into Light Field Lab, I want to start with your journey, all the way through your experiences in the entertainment and special effects business, computational photography, how all that led you to Lytro and then eventually to starting Light Field Lab. I would love to, and I cannot beat that intro. That was awesome.
Um, we figured for this audience it would be kind of fun to do a little Wayback Machine and show some of the content from Lytro, back when the founding team of Light Field Lab was the leadership team behind this program. Now, this has no audio.
I'm going to just talk over it here. But this was from that big launch that Charlie was mentioning, when we were working with some of the biggest studios, some of the most talented minds out there. And there's Brandon, one of our co-founders, and he looks... Oh my God, what ten years will do, I guess. Uh.
Tell me about it. Oh, just wait until you see me. I feel like I've aged in dog years. But what we were doing is capturing effectively the inverse of a hologram, right? What we were doing is giving the ability to extract.
Yeah. See, I heard some oohs and ahs when you saw my picture. You're like, oh dear. Yeah, yeah, that's what being a CEO will do to you.
Right. Um, I got a couple of laughs. That's good. So the thing from that experience that was so exciting is the reaction, the response from the entire industry. Everybody was asking us, from this capture technology, is there a way not to flatten it to two dimensions, and to be able to use the entire scene, the entire volume? And the answer was, well, not yet.
So this is something, to your question, that I have been deeply passionate about, making holograms, for well over 20 years now. And it's been the underlying thread that ties my entire career together, going from visual effects at Digital Domain, through RealD, where we were creating all the capture and the cinema technologies for projection, and then had the opportunity to run product for Lytro. And now we're here with Light Field Lab, making actual true holographic displays. Great segue. Let's talk about Lytro a little bit before we jump into Light Field Lab.
Um. You said you were a product manager. Product manager? Um, and I just thought that your transition from being essentially in the trenches with the guys writing code and doing research to being in the executive suite, it seems like very left-brain, right-brain. And then, of course, you became CEO shortly after, so I thought that was a pretty big leap. You want to talk about that a little bit? Sure. How many people here are CEOs? Oh, wow.
A couple of people are admitting it. There's actually a lot.
That was way more than I was expecting, like one, and being like, yeah, see, the rest are still, you know, busy in meetings. These tickets are expensive, right? That's even funnier. Of course they're CEOs. So for me, I've always been an entrepreneur.
And I was just sharing earlier today the story my parents would always laugh about. I was a little kid, probably five years old, building some shit and selling it to my next-door neighbors, having my own business cards. What little kid thinks of getting their own business cards?
It actually said Monkey Business on it, and I wish I was making that up. I'm not. So it's been a deep passion of mine to not only be building the technology, but also to have the ability to lead the technology. And that really is, again, everything that we were doing from the old days all the way through to the later days.
And now today, there's really no difference. It's just a lot more investor management nowadays. So how much of your time as CEO of Light Field Lab is spent doing the executive stuff, you mentioned managing investors, and how much science do you get to do? You guys have a lot of patents. We do. So there's a lot of research and science going on there. Uh, my name is on every one of our patents, if that tells you anything. But my average workday is not the average workday for most people, so I'll call it a 50/50 split between the tech and the executive C-level stuff.
But that's based on 20-hour workdays. Yeah, don't tell my wife; I'm supposed to take time off. So what happened with Lytro? Can we talk about that? I don't want to put you on the spot. I mean, Lytro was going to save the entertainment business and kind of change the way we watched movies.
And then it kind of didn't and got sold in what appeared to be, you know, a friendly sale to Google, let's say. So we left two years before that acquisition occurred. So I can't speak to the specific details, but what we saw was a massive, much larger opportunity in the display space because what we were doing for capture was a little like having a color camera, but only having a black and white TV to see it on. And that's where we saw the market opportunity being much larger.
And that speaks to the business side, the other side of the brain, looking at where do we want to take this technology. And for this type of capture technology, all the light field types of things, the holographic capture that is out there, to make that successful you really do need a way to visualize it. And that's what we set out to achieve. Um, and so Google took the Lytro technology, and they've combined it into other things and used it for their... I mean, they have an XR operating system and... Certain things I can't speak to, of course, but if you look at all the photo apps, you already have refocusing capability, all those things. Everything is part of that suite of tech.
All right. So let's get on to the holograms, because that's why everybody is here. I think Microsoft probably did holograms dirty when they called the HoloLens a holographic display.
And of course HoloLens suggests a hologram. But it's not that at all. Correct, correct. Now, there are holographic processes. There are holographic optics. One of the things that we find we are always having to help educate is what is a holographic technology and what is not.
Now, this is just to be very clear, not saying something is good or bad. This is just helping to articulate what a hologram can achieve and what the other things that are marketed as holographic are actually doing. And most notable is Tupac.
If I say Tupac, how many people think of Coachella and they brought him back from the dead, or how many people don't know what I'm talking about? Okay. All right. There's a couple of hands. All right. I'll call it enough hands. Yeah.
All right. So not a hologram. If you look here, and for anyone that is viewing remotely, I'll try and use my little cursor here. This top installation that you see is kind of a marketing kiosk.
You'll see that in retail stores, in different airports. And what they're doing is they have a monitor that is pointing down, and you're looking at the reflection of a 2D image off of that monitor. So it's cool.
You're seeing the reflection of that, whatever that digital 2D image is. But it's still two dimensional. And you can see the larger formats with Michael Jackson and Tupac and whoever. They do that with the Haunted Mansion, right? That's Pepper's ghost; John Henry Pepper popularized it in 1862 as a theater trick to do Hamlet.
That is Hamlet's ghost. I can move on, because I can't say it any better. I'm sorry. That's theater trivia from another level.
Yeah, yeah. So when you've got the new 2D digital versions of it, it is just two dimensional. Haunted mansion. They're using mannequins. That's right.
They're using little models, and they're doing a reflection of that. That's right, that's right. So one of the other really frequently marketed holograms, and we say "holograms" because that's how they market them, in air quotes, is what I like to call the volumetric, or the wonderful world of spinny things.
And when you see these technologies, they are... They're on Amazon for $79.95. He's bought one. No, I'm kidding. No, they're really cool.
I mean, again, this is not to say something is good or bad, it's just, let's define what it is doing. So it is a literal fan, and there's a strip of LEDs.
And it spins fast enough that, through persistence of vision, you will see a 2D image, and if you reach out to grab it, it will take your hand off. So don't do that, please; stand back. But they are very cool. And in the volumetric terminology, the way we define it is: I have a 2D pixel and I place it in space. So in that case you have a 2D fan and you can place it in space.
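The arithmetic behind these spinning-LED displays is simple persistence-of-vision math. A minimal sketch in Python; the 60 Hz flicker-fusion figure and the blade counts are illustrative assumptions, not specs for any particular product:

```python
# Back-of-the-envelope math for a persistence-of-vision fan display:
# a strip of LEDs sweeps a circle, and every angular column of the
# image is redrawn once per blade pass. To stay above a common
# flicker-fusion rule of thumb (~60 Hz), the rotor must complete at
# least 60 passes per second over each column.

def min_rpm(refresh_hz=60.0, blades=1):
    """Minimum rotor speed (RPM) so each angular position refreshes at refresh_hz."""
    return (refresh_hz / blades) * 60.0

def angular_step_degrees(updates_per_revolution):
    """Degrees swept between successive LED strip updates (the angular 'pixel pitch')."""
    return 360.0 / updates_per_revolution

print(min_rpm())                  # single blade at 60 Hz
print(angular_step_degrees(720))  # 720 strip updates per revolution
```

Adding a second blade halves the required rotor speed, which is why many commercial units use multi-blade rotors.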
And in the bottom middle here, you see a different, more interesting version of volumetric, where they are using rear projection and multiple LCD screens, and you form a volume. So when you go off axis, you can see those little slats of different planes. It is cool, but you'll always see the foreground and the background; everything will look kind of glowy. But it is volumetric, so you can actually see something that is forming into an object.
And then the other category is autostereoscopic, or glasses-free, of which there are many versions. Oop, it jumped ahead of me, I apologize. All right, we're going to go back, because this is live TV, guys.
Looks like the hand is coming out of the blue there. Apparently if you hit it too often, it's going to not cooperate.
That one's a graphic design trick, that is. We'll get to that. I'm getting ahead of myself. Oh, dear. Okay. So I'll just tell you.
You see these on bookmarks, movie posters and cereal boxes, frequently using something called a lenticular. And it gives you a left eye and a right eye, which is giving you stereopsis not actually forming an object. And we'll talk about why that's important in a moment. And because we already kind of foreshadowed on this one, I'll just blow through these really quickly.
You may see these things go viral on social media. And people talk about hands coming out and lions and things jumping off screen. And it is a two dimensional screen.
There is nothing at all that is even stereoscopic about it. It's a really cool technique, breaking the frame. If you were there in person, however, it would be like if Charlie and I are on screen and you see us being projected on a 2D screen; you're not going to confuse it for being actually us. And then you may have seen some of these: they are two-dimensional screens, and you can see through to a booth behind them. Again, they're 2D, they're very cool, but not a hologram. And then there are other things like AR and VR, referring also to the HoloLens.
This is a version where they put the tracker onto the camera and they're just re-rendering, and then sometimes people will call that holographic. So that's a Proto box there. The one in the middle. Yes.
Yeah. So what's going on there, John? So what you have there. And again, it's super cool.
You've got a transparent LCD screen, and then you can see through to what's behind it, which is like a phone-booth-looking thing. So a two-dimensional screen in front of something like a shadow box. Yeah, it's very convincing in video. It is. Yeah.
In person it is two dimensional. Yeah. Again, not bad. It's just... it's not bad. It's not bad. That's cool. They had one on the show floor here at South by Southwest last year. You see them at shows all the time. It's definitely a very cool technology. So these are the typical classifications for the not-holographic things. Now, that doesn't mean you need or want a hologram.
But if you did want that, we can dive into what that means. Well, yeah. What is a real hologram, and what is the product? What is Light Field Lab making? Well, let's talk about some physics here.
And I promised Charlie I wouldn't talk about physics, so I'm going to go really quickly here. Because what's more important than looking at math is understanding what you get with different types of display technologies, ranging from two dimensional all the way through to full wavefront displays. So we created this, and we find it's a helpful way to set up what you will be able to achieve and what the visual result is. If you have, for example, a monoscopic or a 2D regular display, so the type of display that you're looking at right now.
This is setting up a television-looking screen. You've got a piece of content on it.
The things I'll point your attention to, and I'm using the mouse here just for people watching remotely: you can see that you've got butterflies, and obviously you don't have anything changing as the camera is moving around. It looks like it's just flat on the wall, as you would expect. And you can also see down here an actual mirror that is in the room, and it's reflecting, as one might expect, something two dimensional. So cool.
Right. So we're going to use this as a way to run through various display technologies. So if we compare that now, same content, same room, same camera move, same focus cues, to stereoscopic, like you went to a movie theater and you put your glasses on, what's happening is you're getting stereopsis. But if you're going off axis on a fixed screen, things warp. You're going to have the disparity between the eyes, something that your brain has to work very hard to put together.
It is decoupling vergence and accommodation if you're familiar with that terminology. But you're getting two different views trying to showcase that in a 2D screen. Of course, you're seeing what's happening in your brain as things are warping, but you're not getting reflections, refraction, all those other things. Okay, now let's go to volumetric displays.
And this one is more akin to the rear projection, the multi-planar methodology. And what you will get is an actual volume as you're moving to the sides, left and right, forwards and backwards. You won't see the reflections and refractions, you don't see the things changing in focus, but you do see something that is moving as you move in space. And then we'll compare that now to autostereoscopic.
And that's either lenticular or some kind of a headset, anything where you get two discrete viewpoints, or something that has multi-view from a lenticular-oriented display. You will see some of the motion parallax, which is cool, and you will be able to see some of the things changing for reflections and refractions, but you're not getting anything that is physically present in the space with you. So think of this as a lot of the state of the art in terms of what other display technologies are doing. And if we go all the way to a wavefront display, which is another way of saying you're actually creating the full holographic projection of the object, based upon what the product specifications are, you can actually achieve true focus of the real thing. It is as literal as having the real object, as if it reflected the light itself. And this is the way to then think of and look at how the reflections in that mirror would behave differently.
Where you see the glass, it is responding correctly to the reflections, the refractions, the diffraction, everything that would have happened in the real physical world. Or you can create some uncanny worlds that go beyond the laws of physics, which we always like to try and bend. So two years ago, you invited me up to the lab and you showed me finally what a solid-light hologram looks like. So when I was prompting him with "what is the product?", that's really what I want to talk about, because what I saw was an animated character floating in space in front of me.
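The five display classes John walked through (monoscopic 2D, stereoscopic, volumetric, autostereoscopic, wavefront) can be condensed into a small lookup of which visual cues each class reproduces. A sketch; the cue names and boolean assignments are a simplification of the discussion, not formal specifications:

```python
# The five display classes from the comparison, mapped to the visual
# cues each one reproduces. A simplification of the discussion above.

CUES = ("stereopsis", "motion_parallax", "physical_volume",
        "view_dependent_shading", "true_accommodation")

DISPLAY_CLASSES = {
    "monoscopic_2d":    set(),                       # flat on the wall
    "stereoscopic":     {"stereopsis"},              # glasses in a cinema
    "volumetric":       {"stereopsis", "motion_parallax",
                         "physical_volume"},         # spinny things, multi-planar
    "autostereoscopic": {"stereopsis", "motion_parallax",
                         "view_dependent_shading"},  # lenticular, headsets
    "wavefront":        set(CUES),                   # full holographic projection
}

def missing_cues(display_class):
    """Cues a given display class cannot reproduce."""
    return set(CUES) - DISPLAY_CLASSES[display_class]

print(sorted(missing_cues("autostereoscopic")))
```

The wavefront row is the only one with an empty "missing" set, which is the point of the whole comparison.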
He's testing my muscle memory on SlideShare. And the other thing that they've done more recently: Light Field Lab partnered with SETI, and they've created an alien character that also floats there in front of you, seeming to be a physical object, but it is actually made out of light. So that is a perfect cue to talk about simplified physics. The thing that we have built gives you the ability to form the actual object as if it was truly there. And as you scale larger and larger, as I will show you here, it will give you the ability to form entire environments.
So about five years ago we had what we call our submodules, and we build our displays very much like a traditional 2D video wall, where you go from little PCBs, which get formed into panels, and panels go into walls, like any of the big video structures that are out there. But each one of our submodules is 160 million pixels. Now, these are wavefront samples, not pixels anymore.
But to put that in orders of magnitude: the entire Sphere in Vegas, if anyone's familiar with that, the awesome huge sphere that they have there, this is already higher resolution than the entire Sphere, just for that one submodule. So what Charlie saw when he came up, what we call our Aztec character, was one of our very first alpha panels, which takes multiples of those submodules and forms an entire single structure. That's about a 28-inch diagonal, and it's modulating 2.5 billion pixels.
And again, pixels become... See, I wanted to bring this to South by Southwest and have that guy with us right here, and I still don't know why we couldn't do that. It comes down to customer priority. I'm not paying you enough to do this.
We'll check for the wire. Okay. But this is what you saw. It was 2.5 billion pixels.
That's all in full real time. And you go from a little seahorse, when it's the smallest submodule, into larger scenes like the Aztec that you were seeing, and I'll show a video in a moment. And then what we're doing as we continue to grow is build multi-panel, large wall environments, and you can make things like holographic dolphins, and who doesn't like a holographic dolphin? But it gives you the ability to go from that now to 10 billion pixels per square meter. And that's the type of density that's required.
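The two figures quoted here, 2.5 billion samples on a roughly 28-inch panel and 10 billion samples per square meter, are consistent with each other, which is easy to check. A quick sanity check in Python; the 16:9 aspect ratio is my assumption, since the panel's actual aspect ratio isn't stated:

```python
import math

# Sanity check: do "2.5 billion samples on a ~28-inch panel" and
# "10 billion samples per square meter" agree? The 16:9 aspect ratio
# below is an assumption, not a stated spec.

def panel_area_m2(diagonal_inches, aspect_w=16, aspect_h=9):
    """Panel area in square meters for a given diagonal and aspect ratio."""
    k = diagonal_inches / math.hypot(aspect_w, aspect_h)  # inches per aspect unit
    return (aspect_w * k * 0.0254) * (aspect_h * k * 0.0254)

area = panel_area_m2(28)      # ~0.22 m^2
density = 2.5e9 / area        # ~1.2e10 samples per m^2
print(f"{density:.2e} samples per square meter")
```

The result lands right around the quoted 10 billion per square meter, so the numbers hang together.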
So you could conceivably put panels on the floor and put panels on the wall and put panels on the ceiling, and now we're, you know, in the holodeck for real. No glasses required. That was our fundamental premise when we were getting funded as a company, and this is showing effectively that. So I'm going to dive in.
These are some really nerdy tech specs; happy to dig in if we have the time here. But the modularity of the system allows you to configure it with floors, with walls, with ceilings, and then you can interact in full real time. I'm going to kind of geek out here and just...
Kind of, yeah. Here we go. As you can see it.
Now, how big is this display that we're looking at? That's 28 inches. 28 inches in diagonal. That's the size of the Aztec that you saw, you know, the beach-ball-type size.
We've got a lot of patents, as I had mentioned. I've written, I'd say, 99% of them, at least as a significant contributor. My hand is really tired. No kidding. But I'd like to highlight this because it's not a derivative technology, and we are proud of the IP portfolio; it's a ground-up new way of looking at how you build a technology for visual display. And now, getting back to your exact question in terms of what type of configurations you could envision: you can do corners, you can do floors, you can do things that then give you the ability to have a light source no matter where you are in relation to it. Because with anything holographic, you need to be able to see the light to see the hologram.
And I know that sounds kind of like obvious when you say it out loud, but it's one of the fundamental rules of any holographic or any display technology is you can't freeze a photon in mid-air. You always have to have a direct line of sight between your eye and wherever the light is originating from. Otherwise, the light doesn't exist, and we make sure we abide by the true laws of physics. So in the example you showed us with the dolphins, it's a little bit like, um, you know, the head tracking on 3-D displays.
If you move out of the field of view, now the hologram disappears. Correct. And you'll actually see here, we outline it from the viewer's standpoint; every single thing would be within the field. This would be the exact experience. When the camera is now at this oblique angle, you'll see we outline everything that doesn't have a light source.
You can very easily solve that by adding a larger display or putting another wall behind it. But if you did the corner there, a person would be able to view it, have a wider field of view. That's correct, that's correct.
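The line-of-sight rule John describes is pure geometry: a projected point floating in front of a panel is only visible if the line from your eye through that point traces back onto the emitting surface. A minimal 2D illustration; the coordinate setup is mine, purely for demonstration:

```python
# The "can't freeze a photon in mid-air" rule as geometry. The panel
# is a vertical segment at x = 0 spanning y in [0, panel_height]; a
# holographic point floats in front of it at some x > 0, and the eye
# is further out still. Coordinates are illustrative only.

def holo_point_visible(eye, point, panel_height):
    """True if the sightline eye -> floating point lands on the panel."""
    ex, ey = eye
    px, py = point
    if ex <= px:  # the eye must be in front of the projected point
        return False
    # y-coordinate where the eye->point line crosses the panel plane x = 0
    y_at_panel = ey + (py - ey) * (0.0 - ex) / (px - ex)
    return 0.0 <= y_at_panel <= panel_height

# A head-on viewer sees the floating point; a steep oblique viewer's
# sightline traces back past the panel edge, so for them it isn't lit.
print(holo_point_visible((3.0, 1.0), (1.0, 1.0), 2.0))  # head-on
print(holo_point_visible((3.0, 5.0), (1.0, 1.0), 2.0))  # oblique
```

Enlarging `panel_height` or adding a second panel (the corner configuration) widens the region of eye positions for which the check succeeds, which is exactly the trade-off being discussed.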
So that really is the question of where do our customers need to have objects placed? What kind of field do they want? Do they want to create multiple fields, so that different people at even different heights see different things? Think of different ratings of a movie for short people and tall people.
Right? Okay, okay, so I kind of digressed on that point here, but let's talk about what you actually saw, because we had a really, really exciting guest also interact with our Aztec deity, as we called it. This is one of our largest investors, Bill Gates, through the Gates Frontier Fund. And we got to geek out with him on site here. >> Hi, Mr. Gates, it's an honor to meet you. >> 99% of the time, the things that you see in the media that are called a hologram are actually just a 2D technique. This is a submodule that is a holographic surface.
Each one of them, meaning over 160 million pixels, has got the true reflection, the true refraction. >> That's why... Mr. Gates, could I ask you a favor? Sir, could I get a selfie with you? I can't believe he actually let us do that.
That was awesome. So to that end, you were just asking about our recent launch. So in Q4 we launched our actual dev units, a multi-channel holographic system, and a new configuration where we have not only the full holographic suite but a volumetric configuration, so different amounts of bandwidth depending on what customers need. Our goal is, whatever a customer application requires, we make sure we can supply it.
And this was a teaser trailer that I'll go ahead and show, because it's... The SETI project. Yeah. Correct, correct. So let me place the context.
We sent out these extraordinarily cryptic invitations that looked like something from the 1960s that's been redacted, government top secret, and you are part of a secret mission. And we had our customers coming out believing, at least I think they believed, that they were part of this secret mission. And it was in order to finally realize and showcase communication with an extraterrestrial being. And this is when you go down into the bunker, many hundreds of feet under the earth.
I'm not going to tell you how you get there. This is what you see. >> 252 is rattling. Can we make sure that is locked down, please? >> 252, check.
We'll see. Three is holding. Thank you for taking care of that before. Photon matrix, adjust that 20%.
That's incredible. Wait. >> This is impossible.
I always leave him wanting more. So I'm going to show another piece that we haven't actually made public yet. And I promised Charlie we'd find some things that are a first at South by Southwest just for this audience. This is another teaser that you'll get a bit more sense for the full experience. You're going to walk much closer up to the alien. And keep in mind, the alien is projected well over two feet in front of everything.
It's hovering in mid-air, and you would be able to fully interact and communicate with it like that. >> That's incredible. Wait. This is impossible.
Hello. Again, just teasers, making sure everybody still comes out. Um, one of the other pieces that we have here, so we can just talk about how you create media for it. How do you create something that is a hologram? Because when you're talking about that kind of data rate, it's sometimes mind-boggling, right? Talking about 10 billion pixels per square meter.
It's a lot of data. So we can do that in real time. And we put together this little behind-the-scenes piece of content that will show you how you go from actual real-time motion capture into the final thing that you can interact with, doing that with what we call our wave tracer, which is, think of the analogy with ray tracing, but now for wavefronts, and doing that all in real time.
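To get a feel for why rendering in real time matters here, consider the raw numbers behind that density. A back-of-the-envelope estimate; the refresh rate and bits per sample are my assumptions, chosen only to show the order of magnitude:

```python
# Order-of-magnitude data rate behind "10 billion pixels per square
# meter" in real time. The 60 Hz refresh and 24-bit samples are
# assumptions for illustration, not product specs.

samples_per_m2 = 10e9
bytes_per_sample = 3     # 24-bit color, assumed
refresh_hz = 60          # assumed refresh rate

raw_bytes_per_second = samples_per_m2 * bytes_per_sample * refresh_hz
print(f"{raw_bytes_per_second / 1e12:.1f} TB/s per square meter, uncompressed")
```

At terabytes per second per square meter uncompressed, streaming raw frames is impractical, which is presumably why the content is generated on the fly from a compact scene description rather than shipped as pixels.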
So this is the holographic projection, including all of the little designs and all the other things, all the way back to the internals of that collider. This is part of the effect, the reveal of going from, call it, fuzzy bandwidth to full fidelity. And now this is going into an Unreal environment. You can think of having Unreal or Unity or any other 3D DCC (digital content creation) solution, and you can control it with all of the same types of tools that you would normally expect. One of our ultimate goals for our customers is not to change the way they work, not to change the way that they create content, but to be able to take the things that they've already done and place them onto the display in real time.
So this is then showing you how you can create a true live, interactive motion capture experience. The actor can be anywhere else on the planet, frankly. But through all of these tools and techniques that already exist out there, in combination with our plugin for the wave tracing that does everything in real time, we can then have a direct dialogue with the people in the room. There's Ryan, our wonderful creative director, and you're seeing how they're doing everything with the blendshapes and all the real-time animation that's occurring. And then you'll see Trevor, our amazing architect.
And he is human. They made sure we labeled him as such. But then you can see they've gone through a couple of different iterations into different things to compare to the ground truth versus what's actually happening.
But when you're in the room, in the experience, you are able to have that direct dialogue, even though it is something that is holographic. Now, you can attach it to an AI very easily. We couldn't find the language library for an alien interactive AI communication system, but if you have one, let us know. For us, it really is no different. It's just a question of what customers are looking for and what kind of experience you want to generate. Well, you've now mentioned customers like five times in the past five minutes.
Yeah. So which customers, and what applications? Let's talk about where you should start to see holograms first getting into market, and then ultimately where we see the entire market going. The entertainment applications, large-format things, experiences where you're trying to create a digital likeness: that's among the fastest-growing markets for us in the video market.
Corporate spaces, lobbies, hospitality, anywhere that you have high foot traffic and you're trying to differentiate: that's known as the largest segment in the 2D video market, and that is another high-interest area. Now, we're looking longer term, because of the sizes and scales, at going into full holographic cinemas, full theatrical exhibits where you can have both real things as well as holographic projection. Those are our aspirations. That is where we want to head, ultimately.
Also doing live broadcast sporting events; you already have capture systems that are able to support that. And then over Covid we added this piece, because we get a lot of requests for holographic telepresence, where you can have what we call bidirectional communication with the systems and look through the wall as if it's right on the other side, but it's 3,000 miles away. So it's a little funny story.
That's a really interesting application. Super interesting. Now, you had to light up a lot of bandwidth in order to do this, with the real-time encode and decode.
In real time, I don't know about... Yeah, yeah, yeah.
Well, it's all possible with dynamite and a lot of cash, right? But ultimately, this is the vision. What I mentioned about working on this for 20 years: it's being able to record your memories and project those memories not through a 2D illustration, but actually having the hologram in every home, on every device. And that's the ultimate vision for Light Field Lab. So when do you think that would happen? I mean, as a futurist, we're always looking like five years ahead, because every three years we can move the goalposts. Yeah, but in this case, no moving the goalposts.
Do you really think we'll all see this in my lifetime? Because I only got, like, 25 years left. I don't want to ask. I'm like, well. How long are we talking here? Uh.
I'll put my CEO hat back on and say, absolutely. Uh, what it will take is mass production, custom silicon, to get to the types of production volumes and price points that the consumer market would require. So you're going to start seeing these in more professional applications in the near term, going into some of the biggest of the marquee types of experiences, things that are highly differentiated. Theme parks, you know, those types of areas. Yeah.
But even beyond that, areas that just have very high foot traffic, where somebody from a brand standpoint is looking to further differentiate, to show you something that you've never been able to see before. Location-based entertainment. Yes, that would be correct.
Those are the hottest areas for some of the first applications, but ultimately getting into the consumer market. Some of our largest investors also include Samsung and LG and Corning, the largest in the display space itself. And that really is the path. So LG sees this as a potential technology that they could commercialize.
Well, I can't speak directly on their behalf, but they did invest publicly, and they wouldn't invest in things that they don't find to be very compelling. Gotcha. Um, is it going to get out at a smaller scale, do you think? Because there's other technologies. We were talking about Leia's system.
Leia has. I'm not sure. Did you call that a lenticular? A diffractive backlight? Yeah, a diffractive backlight, but autostereo. So. And they were.
For anybody who remembers the trivia, the Hydrogen phone used a screen from Leia, and I think they're putting them in high-end cars now, like the $300,000 Mercedes. The dials pop up with real definition; if you reach out for them, you see them with your naked eye, floating in front of you. It's a really, really powerful illusion. Cool. Yeah, yeah.
So it's autostereo, where you're able to get a depth effect, and then you can do it for different passengers, give them different types of content. And we wouldn't call that a hologram. It would be autostereo. Autostereo. Nobody's going to call it that. Uh, they'll call it whatever the marketing team tells them to call it.
Uh, well, no, I mean, it's a great technology. It really, really is interesting. And you see, we were just talking about the telepresence system, and you've got Google Starline. That's with HP, right? Right.
A version of autostereo, but now for, uh, telepresence. Well, virtual presence was a big deal once upon a time, three South by Southwests ago, when the metaverse was still a thing.
Yeah, yeah. But that whole idea that we could be co-present with someone, you know, that we needed to be with, even though we had to be apart. Absolutely. Actually, it occurs to me that we kind of skipped around, and we didn't really differentiate why a hologram is different from something autostereo. So let's actually take a big step back and look at how you think of it from a non-math standpoint. Because I promised Charlie I'd stay away from 17-dimensional functions. You're welcome.
But I like to use... He fell asleep while I was explaining, and he took that as a sign. And I woke him up, and now we're here. I mean, for those of you in the audience, John and I are friends and we've talked a lot, and I understand about 75% of what he just said. So I know I'm sitting up here shaking my head and nodding, but I'm hearing a lot of this for the first time, too.
And some of it I'm hearing for the third time and comprehending for the first time. So most people are probably feeling like I do. And I knew that there was a cliff, that if he took us off that cliff, we would all be asleep. It's too much information. Let's just say eyes glaze over, not asleep.
But, you know, tomato, tomato. Eyes glaze over. Yeah. That's true.
There's a famous, famous phrase in show business called MEGO. Yes. My eyes glaze over. Yes. Hopefully.
Hopefully they have not. And hopefully we're giving you some really cool new content that we've never done. You're doing.
Great. But I was mentioning this to Charlie yesterday when we were looking at the content here. And the easiest way to understand how a hologram works is if you think of a magnifying glass. So show of hands.
Who here has never used a magnifying glass? Okay, we dodged a bullet, because if anybody had never used one, this was going to be lost on you.
Okay, good. So if you know how to use a magnifying glass, and by "how to" I mean you hold it up to something and it looks bigger, right? But if I ask you how it actually works, why is it that when you hold it up, something looks bigger? It's really hard to articulate the science and the physics behind it.
So we're going to simplify that and then use that as the mechanism to show you how to think of a hologram. And then that'll hopefully become very easy to then differentiate that from other things that are out there. So geometric configuration and thinking of angles and rays. That's the old way of thinking. Move your thinking a little bit more towards wavefronts, the propagation of light. So now you have wavelengths.
You have phase, you have amplitude, which are all really fancy words for "light is moving through space," which I think we'd all agree is occurring. And this magnifying glass has a prescription, just like a pair of glasses, that will change the propagation of light. And that prescription defines how light is going to be received by the person looking through from the other side. In this case, the prescription is going to change the phase, which is literally changing the propagation speed of the light itself.
It's actually making that change, which will then bend the path, create something that is now diverging. And by having that phase change, that diverging signal, what would have originally been a butterfly wing up here is now further away. It is a virtual image, in the optical, technical sense, and you will see it as a larger thing. So that's all the physics of it. Okay.
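The virtual-image behavior described here can be checked against the standard thin-lens equation. This is a hedged back-of-envelope sketch, not anything from Light Field Lab; the 100 mm focal length and 50 mm object distance are made-up illustrative numbers.

```python
# Thin-lens sketch of the magnifying glass described above, using the
# textbook relation 1/f = 1/d_o + 1/d_i (illustrative numbers only).

def thin_lens_image(f_mm: float, do_mm: float) -> tuple[float, float]:
    """Return (image distance, lateral magnification) for a thin lens."""
    di_mm = 1.0 / (1.0 / f_mm - 1.0 / do_mm)  # solve 1/f = 1/do + 1/di for di
    m = -di_mm / do_mm                        # magnification; positive = upright
    return di_mm, m

# A butterfly wing 50 mm behind a 100 mm focal-length magnifier: the object
# sits inside the focal length, so the image distance comes out negative.
# That is a virtual image, the diverging wavefront being described, which
# appears farther away and larger.
di, m = thin_lens_image(f_mm=100.0, do_mm=50.0)
print(di, m)  # di is about -100 mm (virtual), m is about +2 (upright, 2x)
```

With the object inside the focal length, the negative image distance is exactly the "virtual image" case from the talk: the wavefront leaving the lens diverges as if it came from a bigger object farther away.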
Well, let me see if I can paraphrase that. And therefore we call it junk. Junk. Yeah. Junk. So I'm looking at a diverging wavefront. And what you have managed to do is take, on an atomic level, those diverging rays that make up the wavefront, and bend them, making the object appear to have definition in space. More or less. Now doing it digitally.
But this is just one of the functions. Okay. So he's getting ahead of me here. So here's a 20. Thank you. What you see here is then one of the components.
This is then just bending your ability to see the object, and that makes it look larger. So now we apply this to the hologram. You have multiple functions, because the entire visibility of any object out there is made up of phase and amplitude. And if you modulate these things, you can actually recreate anything, just to your point.
So if I create two magnifying glasses, because what's better than one is of course two, and I have one that is addressing the amplitude component of an object and one that is addressing the phase of the object, you can then define the inverse and opposite reflection of anything. Now, to do so, and to make this digital versus encoding it as a reflection hologram or in some other medium, you need a ridiculous amount of resolution in order for the wavefront to interfere and create the actual focused object. So if you apply this logic, where you have the ability to create these wavefronts to actually modulate something in mid-air, this is why it's so different from other display technologies. Because with the things you would see for autostereo, or whatever you want to call that or market it as, you get a left eye, you get a right eye, but you're not getting something that literally has an x, y, z coordinate in space, with reflections and refractions, such that if I photograph it with a camera lens, you'll get the correct depth of field.
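The "two magnifying glasses" idea, one layer for amplitude and one for phase, can be sketched numerically. This is a hedged illustration under assumed parameters (wavelength, pixel count, pitch, and focal length are made-up values, not Light Field Lab specifications); it also checks the pixel pitch implied by the 10-billion-pixels-per-square-meter figure that comes up later when pricing is discussed.

```python
import numpy as np

# Sketch: a complex wavefront built from separate amplitude and phase layers.
# All parameters below are assumed illustrative values.

wavelength = 633e-9              # metres (red HeNe line, assumed)
k = 2 * np.pi / wavelength       # wavenumber
N, pitch = 256, 10e-6            # 256 x 256 modulator with 10 um pixels (assumed)
f = 0.1                          # focal length of the phase "prescription", metres

x = (np.arange(N) - N // 2) * pitch
X, Y = np.meshgrid(x, x)

amplitude = np.ones((N, N))                 # amplitude layer (uniform here)
lens_phase = -k * (X**2 + Y**2) / (2 * f)   # phase layer: quadratic lens profile

# The modulated wavefront: controlling both quantities is what lets a display
# reconstruct the full field rather than just a left/right view pair.
field = amplitude * np.exp(1j * lens_phase)

# Back-of-envelope on the density quoted later in the talk: 10 billion pixels
# per square metre implies a pitch of sqrt(1 m^2 / 1e10) = 10 micrometres.
implied_pitch = (1.0 / 10e9) ** 0.5
print(implied_pitch)  # about 1e-05 m, i.e. 10 um
```

Propagating such a field (for example with an angular-spectrum method) would show it converging toward the focal plane; that step is omitted to keep the sketch short.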
You'll actually receive all of the things that would have occurred had the real object been right there. So now everybody is a physicist. You now all understand how to create a hologram, albeit not digitally quite yet. So let's pivot away from Light Field Lab for a second.
Sure. Just speak broadly about the future, because that's what it's really all about. So we'll take some questions, by the way, in about five minutes.
So if you're using Slido, bring it on. Bring it on. So, you know, we have a convergence of a lot of different kinds of technologies now. And I know that, like me, you speculate; you are in the business of the future. Have to be. Well, we're going to spend all our lives there, so we should be very interested in it.
So it's going to be made up of a lot more than holograms. What other technologies, these kinds of science fiction technologies that companies like Light Field Lab are bringing into existence, what other ones are on your radar as you think of the media environment your technology is going to be maturing in? It's tough to have a conversation nowadays without bringing up AI, of course, and you get a very wide gamut of belief systems about what it will do and how it contributes to all of our collective futures. As it pertains to visual media, you can think of having things that are generated by AI or augmented by an AI.
Well, sure, the Aztec guy or the alien, or any of these things. Yeah, absolutely. So that is an area that I think most people in this audience have probably already been introduced to, because so many of the tracks, even here, are talking about AI technologies.
I always like to look at science fiction movies as the catalyst for what has not yet been accomplished, like flying cars. I know there's a bunch of companies working on it. Flying cars is the worst idea I've ever heard. It's not out yet, though, and everybody talks about it. I know. Well, they said they're going to start with taxis. Sure.
Yeah. What could go wrong? Nothing. Absolutely nothing. That's why you have the flying Ubers, right? Um, I would be interested in where you've seen things, because I know that you're seeing and literally talking to everybody. Where do you see it going? Well, I've been focused most recently on the end of the entertainment business and the beginning of a new entertainment and media ecosystem that is essentially based on algorithms. The algorithm is the network now.
You know, an NBC is not going to decide what you're going to watch. You decide, by interacting with the technology, what you're going to watch. And that's going to have profound effects on our society, as social media already has. Other big trends: it's hard today to separate technology trends from political and cultural trends. Sure.
Yeah. So, um, I think there's not going to be any regulation to speak of, which, you know... Technology today is at a critical point where it really does require mindfulness. And we're living in a world where we've raced here over the course of the past 35 years, heedless of the social, economic, and personal consequences. This is the most massive social experiment in the history of man. And I don't know what's going to happen.
I mean, here's a good one, and it takes us right back to solid light: what happens when we don't know what's real anymore? So, you know, I think technology has a big role to play in that. And then my final thought on it is much broader and more economic, which is: because of the power of technology and its influence over our lives, these companies get more and more valuable every day. And when AI agents come out, it's not just going to be on your screen for the 400 times a day you look at it; it's going to be in your head.
So, you know, it is. It's an incredible world, but also one that is going to change us. And we have very little control over that because it activates our animal brain. You know, that's still there. So, you know, when I think about the future, I think of all those things. When I said convergence, which was the title of my second book, it's not about any single technology, but it's about all these technologies happening at the same time, right? For example, self-driving cars.
Right. I mean, that is the amount of technology and spatial computing technology in a self-driving car is quite extraordinary. And by the way, those self-driving cars are also making a 3D map of the world, which is incredibly valuable because we're going to have robots rolling around in that world, and we're going to be putting more augmented reality into that world. Right. It's going to enable a combination of the physical and digital world, because the physical world will be machine readable and searchable. So, you know, there are incredible things ahead.
And display technology, it's so funny, because we're always leapfrogging ourselves, right? Because it was always about 8K. Oh, we're going to have 8K, you know, an 8K screen. If you haven't seen one, it is better than human vision. It's like you fall over when you see it. And if you wear a headset that is 8K, you really will fall over. You know, you lose your balance, because it's too much.
Um, so I think that display technology is going to make some huge leaps. Do you think that we could really have a television, like the size of the monitor we're looking at down here, that would be projecting the content into our living room? Absolutely. Absolutely.
Ten years? Twenty years? Well, whenever you say "ten years" in Silicon Valley, it means forever.
So. Ten years. Okay. We'll move the goalposts in seven years.
I mean, if you look at how many orders of magnitude, and to your point about leapfrogging other technologies, things are moving at a faster pace than was ever projected. Right. The ability to get to this kind of density of resolution hasn't ever been possible before, but it is now, because of the fabs, because of the convergence. To your point about bandwidth, about the ability to move data around, with different networks, with storage, with all the other devices. It takes all of these things in order to do something that would project in your living room.
So we've got some Slido questions up here. Some of them look like they're for you. Yes, I think I think so. Tackle them in any order. All right. So I see a couple of questions about price here.
And whenever you're in a public venue, you always have to be a little bit careful how you couch some of the things that you... It's freaking expensive. It depends on which version.
So volumetric technologies start in the low hundreds of thousands per square meter, and that is for professional applications. And then you've got the full holographic suite, which is a large multiple above that, because you're dealing with 10 billion pixels per square meter. And then you'll even see higher-density technologies that we're bringing to market. So that gives you a little bit of an idea. It's not your consumer display today, but if you compare it to the world's first OLEDs, for example, which I think was almost 40 years ago, those types of displays were millions of dollars.
I like Bruno's question because it goes to telepresence, and we were hitting on that. Right. How close are we to having a holographic phone call, like in Star Wars? I assume that's like having Palpatine's head floating in the room telling you what to do.
So if you remove the violation of the laws of physics from Star Wars for a moment. Because in Star Wars they show it where, oh, the star chamber, where it's just hovering in mid-air, which would have to have a light source.
We used to have this whole thing, taking some of those scenes and showing where the display would need to be. So as long as there is a light source that is able to create the hologram, we can actually already do telepresence, even today. Well, what's interesting is you're saying there has to be a light source. But in the old way of thinking about it, there had to be a projection surface, whether it was mist, whether it was some kind of curtain. If you want to talk volumetric, yeah, there are really cool light-trap displays, like the one from BYU; there was a big Nature article where you can take a particle of dust and move it around with a laser really, really, really fast.
And then you can create something that is a little volume. You could do something like that, if that is what you want to create. But if you want to create something that is as real as the real person, that's where you would have to have the actual light source that's holographic. Okay. So here's a good question from Moritz. What's the North Star here? Is it the holodeck? I mean, the holodeck is one of our biggest inspirations.
It's something that the entire leadership team, the entire company, frankly, believes in and wants to bring into the market. But ultimately, that is a question of who is going to be the first mover that wants that installed. And I can't go into any details, but there are aspirations that exist to bring that into market. We're starting with things at a smaller scale, smaller walls and then larger walls, but ultimately getting to the holodeck. I mean, that is what everybody wants. But the North Star, just going back to that exact point: our goal is to make every device holographic, every display.
It's to have the natural way that you see the world around you be your interface, be the visual way that you can interact with the media, interact with the convergence of all these other technologies. So here's another use case question, about applications in health and health care. The ability to simulate and use holograms for medical devices is a really interesting application opportunity. You get into a lot of regulatory things.
So it is definitely one of the areas that we are planning to move into, but that would be a later vertical, after we get the first products out into the market in a much bigger way. But if you think of the ability to do surgery on something that is now x, y, z, perfectly oriented in space for everybody in that space, with no headsets, the only way you can do it is by projecting something holographic. Otherwise, for somebody trying to show how an incision would work, if it's autostereo or one of the other types of technologies, every single person will see something that is skewed and different. So I wouldn't want anybody practicing with a razor who isn't seeing exactly what is in the real space.
Um, here's a... oh, a new question from Jonathan, and I like this question. Do you have any active clients where the technology can be watched already? And if not, when do you think it would be ready? Right now.
It's exclusively available to see at Light Field Lab. Where? In San Jose, right in the heart of Silicon Valley. As for the things that our customers are doing with it, I cannot give you any confidential information, but you can think of future large-scale things. Many of our customers are crazier than we are, and we love that, because we think we're pretty crazy.
Let's say a giant soccer stadium somewhere. I don't know. You know, where you might, you know, have a couple of thousand of these panels and, you know, project the billboard right over the field. Could be something like that. Okay, okay.
Now here's one that is more of a sciencey one. Do you want to answer the one from George? I have never heard of freezing light. Is that a thing? All right.
Before you answer it, we should say, with a PhD in the audience here... So if you want to get down to the atomic level, right, you want to talk about photons. I mean, the practical way to freeze light wouldn't be in the literal sense; it would be to project onto something that is a particle. So I'd need more context from this individual.
Exactly which... Well, maybe he'll follow up on that, I don't know. We're going to hang out afterwards. Freeze light.
I don't know what that means. Here's someone... wait. Here's one that I do know the answer to, from Alessandro. What is the process to create the content? It's whatever content creation software you want to use, so long as you're in a 3D environment. Just like what we were showing with the creation of the alien, that was all driven by Unreal, using motion capture.
So you don't have to do anything different. You just attach the plugin for the wave tracer, use that as your renderer, and then everything else is computationally put onto the display. So someone working in TV production is thinking about holographic monitors and wondering if that is going to change the way we create content. There's a lot of opportunity when you think about sports broadcast, or even in volume; they have the volume stages, like for The Mandalorian, etc., where
you could do multi-camera shoots, versus a volume stage where you have a 2D perspective and have to do the rendering exactly for that camera. You can have the correct focus, and the actors could even see and interact with the things that you are now projecting. So there's a lot of opportunities there.
If it's more than just putting a two-dimensional graphic behind somebody, which there's still opportunity in, but we always look at it as: what's the most interesting thing for our customer? This is another really good question, from Jenna. Thank you, Jenna. How do you imagine people will interact with your displays? Are they going to be interactive? Right. Can you touch things inside of the environment? How is that going to work? So it's an interesting question, because that can go in two different directions.
One, and I think it's part of the question: can you feel the object, in addition to, can you move and interact with those objects? So right now you can already do things; if you're familiar with any of the real-time engines, if you imagine having an object, you can move that in 2D. Now we can do that with sensors, and you can literally move the object around and it responds. That's relatively straightforward.
If you want to do anything where you want to feel the object in mid-air, that's going into a class of technology called volumetric haptics. We are working deep in the R&D space on things that would directly use mechanical energy from a nanoparticle polymer surface, so that you could actually match the object's light with the ability to touch and feel. That's Gen 2, not Gen 1. But to give the direct answer to the question: generally speaking, the smaller the display, the more it encourages interactive experiences.
The larger the format of the holographic experience, the more there is a natural expectation to be further away from it, so you're not directly interacting; you're more immersed. And those really seem to be the two different categories that we have very clear application requirements on. Okay, 36 seconds left. I don't think we're going to be able to get to this question about defense tech, but it's an interesting use case. I don't think we've ever really talked about that. Not at this conference.
We haven't. I'd like to thank South by Southwest; it is one of the highlights of my year, every year. I want to thank John, and everybody who sat through our panel, and the great questions we got that helped us wrap things up.
And I hope to see you back here next year with John or another interesting guest. Thank you everybody. Absolute pleasure.
2025-04-10 13:51