I Tried Real Augmented Reality Glasses!
(relaxed electronic music) (pool ball clacks) (birds chirping) (electronic chime) - This might be the best look into the future of tech that I've ever seen. (upbeat electronic music) So I am one of a very, very small number of people who have now tried both of the new and very rare real augmented reality glasses of 2024: the Meta Orion Smartglasses and the Snapchat AR Spectacles.
Neither of them is available to the public and you'll see why in a second. But also they're both pretty incredible in very different ways. I made a whole video a few months ago about this idea but in case you missed that, VR headsets are over here, smartglasses are over here and they're both racing towards this Goldilocks zone somewhere in the center, which would be AR glasses. Like, virtual reality headsets are incredible. They have tons of tech, amazing immersion, super wide field of view but they're also absolutely massive and you don't really just go walking around in public with them, at least most people don't want to. But then smartglasses are the exact opposite.
They look like something you might just wear out in regular everyday life but they can't really fit that much tech in them. So you're limited to maybe a camera and some batteries and speakers and a little computer inside and that's about it. So VR headsets want to shrink down more and more until they can compact all the tech and actually look like regular glasses, while smartglasses want to cram in as much tech as they can to get better while still looking like regular glasses, and somewhere in the middle is this fantasy product called augmented reality glasses. But what if we could pull back the curtain a little bit and see what that looks like with today's tech? That is what these are.
So Meta and Snapchat have taken two very, very different approaches to bringing these creations to life and neither of these will be sold to the public and I think that's probably a good thing. Neither of them is exactly ready yet but I still think they're both super cool and now that I've used them both, I kinda can't help but compare them to each other. So let's start with Meta's Orion project. So they unveiled these on stage at their Connect event a few weeks ago and they've let a few people try 'em since then and they're actually a three-part system.
It is the glasses that you wear on your face, a wireless computer puck that must be within about 15 feet of the glasses at all times, and a wrist strap that measures electrical impulses through your arm and is used as an input device. Yes, you heard that correctly. Combined, these three things form an augmented reality experience unlike anything I've tried before. Now yes, there was Magic Leap and yes, there was HoloLens and things like that, but wearing what feels like just a pair of transparent glasses that's actually overlaying tracked digital things onto the real world in front of me kind of feels like something out of science fiction. The main challenge with making a video about these things is that there isn't really screen recording. 'Cause like I said, I'm literally looking through glass and seeing things overlaid onto the real world.
Which is crazy but the best we can do is take a first person video and then overlay the graphics from the glasses on top of the video to sort of give you an idea of what it looks like to my eye. But it's really hard to do it justice. But either way, with the Meta Orion glasses, I got to walk through three basic demos here. So the first one was just a kinda basic usage.
So just imagine sitting down in some coffee shop or somewhere on a bench and just scrolling through Instagram, which was a window floating in the middle of the room that only I could see and then I did a little bit of multi-window here and there. So I had a video call going in one spot, some other floating windows with messaging and Instagram floating around me. Pretty basic but still pretty cool.
The glasses, they're pretty light on my face. They weigh around 100 grams. The audio from Instagram, since I'm watching Reels, it was playing through the built-in speakers that were right above my ears and I scrolled through them by making this gesture with my thumb and swiping on my own hand. So now seeing that, you might believe that the cameras and the sensors on the front of the glasses are picking up my hand doing this gesture and then doing the scroll in sync with it. There is hand tracking but it's not for that.
This gesture would be picked up anywhere, whether my hand was in my sweatshirt pocket or behind my back, because I'm wearing that wristband, and this thing may be the coolest input device, the coolest piece of tech, I've tried in a long time. This is the EMG wristband that they've built. EMG stands for electromyography. It's about the size of a WHOOP, as you can see. It has electronics built into the textile weave.
It has an onboard machine learning computer that connects via Bluetooth to the puck and is able to measure the electrical signals being sent from your brain to your fingers, and it's pretty great actually. If you think about it, the muscles and tendons in your arm are connected through your nervous system all the way back to your brain, so the pattern of electrical impulses that fires when you do this gesture is very distinct from the pattern for this gesture, and different again from this one. The wristband can measure those impulses on the way to your hand and map them to the controls. Even in this prototype version I'm using, it felt like it was getting about 80% accuracy, and it also had haptic feedback to confirm when it was getting things right.
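To make that idea concrete, here's a minimal, hypothetical sketch of the general pipeline: window the raw EMG signal, reduce each window to per-channel energy features, and match against per-gesture templates. None of these names or numbers come from Meta, and their actual on-device model is certainly far more sophisticated than this.

```python
# Hypothetical sketch of EMG gesture classification, NOT Meta's pipeline:
# just an illustration of mapping muscle signals to discrete gestures.
import numpy as np

GESTURES = ["pinch", "swipe", "rest"]  # made-up gesture set

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square energy per electrode channel.

    window: (n_samples, n_channels) raw EMG voltages over ~100 ms.
    Each gesture activates the muscles in a distinct pattern, so
    per-channel energy is already a usable (if crude) feature.
    """
    return np.sqrt(np.mean(window ** 2, axis=0))

def train_centroids(labeled_windows):
    """Average the feature vectors for each gesture label into a template."""
    centroids = {}
    for label in GESTURES:
        feats = [rms_features(w) for w, l in labeled_windows if l == label]
        centroids[label] = np.mean(feats, axis=0)
    return centroids

def classify(window: np.ndarray, centroids: dict) -> str:
    """Pick the gesture whose template is nearest in feature space."""
    f = rms_features(window)
    return min(centroids, key=lambda g: float(np.linalg.norm(f - centroids[g])))
```

The haptic confirmation I mentioned would then just fire whenever a new window's nearest match is confident enough.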
The people I talked to at Meta, including CTO Boz, say they love this as a new input method with a really high ceiling, and they could see it developing massively over time, potentially even getting to the point where it could measure you drawing letters in midair with an imaginary pen and map those electrical impulses to real letters as handwriting text input. It's crazy. Either way, it's working for scrolling through Instagram here. There's also eye tracking, so I'm looking at the Instagram app to make sure that's what I'm controlling, and scrolling with the gesture just works. That by itself is pretty cool but they weren't done. The second demo was walking up to a table with a bunch of ingredients on it, looking at it, doing a gesture and then asking the built-in AI what type of smoothie I could make with this stuff. And of course, there's cameras on the front, so Meta's AI looks at the camera feed, sees a bunch of clearly-labeled ingredients and the most recognizable fruits of all time sitting on a contrasty table, and decides oh, you could make a pineapple smoothie with matcha, great.
But that's not what was most impressive to me. What was impressive was the nice little touch of AR: these little blue dots that appeared and then tracked onto the ingredients on the table, labeling each thing and then staying on those things as I moved around and looked around the space. It's such a little thing but it made a big difference. Again, you have to realize it doesn't look amazing through this video, but in real life, just picture yourself looking at objects through your glasses and seeing labels pop up over them. That was like sci-fi, it was amazing.
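For a sense of what keeping those labels pinned actually involves: a detector only gives you 2D boxes in the camera image, so to keep a label stuck to a pineapple while you walk around, the system has to anchor it in 3D and re-project it every frame. Here's a toy sketch of that math, assuming a pinhole camera model, a depth estimate, and a tracked headset pose; this illustrates the general technique, not Meta's actual implementation.

```python
# Toy world-locked label math (assumed inputs, not Meta's actual code):
# K            : 3x3 pinhole camera intrinsics
# cam_to_world : 4x4 headset pose when the object was detected
import numpy as np

def pixel_to_world(u, v, depth, K, cam_to_world):
    """Back-project a detected pixel at a known depth into world space.
    This runs once, when the blue dot first locks onto the ingredient."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # ray with z = 1 in camera frame
    p_cam = np.append(ray * depth, 1.0)             # homogeneous 3D point
    return (cam_to_world @ p_cam)[:3]               # fixed world-space anchor

def world_to_pixel(anchor, K, world_to_cam):
    """Re-project the stored anchor into the current frame on every redraw,
    which is what keeps the label glued to the object as you move."""
    p_cam = (world_to_cam @ np.append(anchor, 1.0))[:3]
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]
```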
So then the third and final demo was probably the coolest because it was also a shared spaces demo. There's two people with glasses on, both people walk up to this QR code in the middle of the room, stare at it for a few seconds and then that becomes the anchor point for a shared experience in 3D space, which in this case was a game of 3D Pong. With the sensors at the front, now it's just visually tracking my hand through the air and mapping that to a paddle, and it let me hit this ball back and forth. So I started getting kinda good at it, not gonna lie. Sorry Ellis, competition is competition.
Going hard in the paint. But yeah, this is just Pong in real life that only the people wearing the glasses can see. It's also funny because we felt kinda cool playing this game.
But yeah, this is how it looks to people not wearing the glasses. Not so fun.
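Under the hood, that QR-code trick is a coordinate-frame handshake: each headset tracks the world in its own private frame, but once both can estimate the pose of the same physical marker, anything expressed in the marker's frame lands in the same real-world spot for both players. A hypothetical sketch of the idea, not Meta's actual API:

```python
# Hypothetical shared-anchor math, assuming each headset can estimate the
# QR code's 4x4 pose in its own tracking frame (not Meta's actual API).
import numpy as np

def device_to_shared(marker_in_device: np.ndarray) -> np.ndarray:
    """Transform from this device's private frame into the shared frame
    that the marker defines. Both headsets compute this independently
    from the same physical QR code."""
    return np.linalg.inv(marker_in_device)

def to_shared(point_in_device: np.ndarray, T: np.ndarray) -> np.ndarray:
    """Express a point (say, the Pong ball) in shared coordinates, so the
    other player's headset can render it in the same physical spot."""
    return (T @ np.append(point_in_device, 1.0))[:3]
```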
There's a lot of complicated technology and material science that goes into making these glasses work, from the sensors at the front pointing at the real world and at the back pointing at your eyes, to the micro-LED projectors inside, to the waveguides made of silicon carbide, the material that allows the light to be refracted at an extreme angle without distortion. I went way into the weeds on this with the CTO of Meta, Boz, and I'm gonna put that whole segment on the Waveform podcast. It should be out by the time you watch this, but I'll leave a link below to subscribe to Waveform so you can get into the weeds with us this week on that stuff.
But I think what you should take out of this is this thing is packed to the gills. They have seven small sensors and cameras that are custom-designed to do the eye tracking and environment tracking. There's custom silicon in here to bring all of the data together.
There's batteries split up to balance the weight evenly across your face. There's speakers. The frames themselves are made of magnesium, both because they had to be rigid enough to keep the lenses in alignment and because magnesium is a good thermal conductor, acting as the literal heat sink for the entire computer inside. They actually made a working see-through version to give us a better idea of how tightly everything is squeezed in there, and it overheats faster because the transparent plastic is not as good of a heat sink. So yeah, these glasses are absolutely thermally constrained using today's technology, they have a battery life of about two to three hours, and that's not even to mention the completely separate compute puck with a co-processor where they're offloading the app logic. It is truly a technical marvel.
But all of that adds up to a set of glasses that was mostly pretty transparent, pretty lightweight, and comfortable enough to wear for the two hours before they started to get a little warm and a little heavy on my ears. The graphics they overlay were tracked pretty well. Not the highest resolution I've ever seen, but pretty respectable, and with a 70-degree field of view, which meant I could look around a bit and the graphics would mostly stay in my line of sight. You can see 'em start to get cut off at the edges a little bit and that's actually how I saw it in real life. But in total, it felt like wearing slightly thicker, slightly heavier than normal glasses with a little tint and a little bit of flare, but delivering the most convincing demo of a post-smartphone augmented reality future that I've ever seen.
Of course, none of that matters right now because Meta is not shipping this, ever. It's kind of a weird move for a tech company to announce and show off and demo a new product but then never actually plan on selling it to people. But think of it as PR. Again, you can watch the entire chat with Boz, but basically what I got out of talking to him was that they believe continuing to iterate on this thing behind the scenes, without all of the extra attention to packaging and marketing and selling the first version, will let them make something even better in a second or third iteration that may be good enough to actually sell. Ideally it gets brighter, gets better battery life, maybe higher resolution, and works its way towards being a real, shippable, deliverable thing for early adopters.
Especially with those waveguides and the silicon carbide, there's just an immense assumed price tag for this low-volume prototype that they've made. But that's the idea and I think I actually agree with it. So these, these are the Snapchat AR Spectacles and right off the bat, they look dramatically more like a piece of technology on my face. But fundamentally, the same thing is happening in here. I am looking through these glasses at the real world. I can open this up.
In front of the lens, there is a menu right now that you can't see, stuff overlaid on top of the real world, and it's actually incredible that it works. But there are a few fundamental differences between these and Meta's that I think are really interesting. So difference number one, there is no separate computer puck. Everything is in these glasses.
So I think we can safely assume that's why there's so much more mass here in general. Meta's glasses looked much more like normal glasses but had an entire separate smartphone-sized computer paired to them the entire time, basically tethered. These you can connect to your phone, but otherwise they operate without any additional hardware.
Everything's in the glasses. So that's definitely way more hardware on your face. But then difference number two is materials and build.
Most of what you're looking at here is just black plastic all the way around, and that's actually kinda interesting. That's not to say they're built worse or poorly at all. They're still actually very rigid, but I think this is more of a weight-saving measure. They still needed a high quality heat sink though, which is why there's this metal band on each side, so they can dissipate heat from the hottest components into the cooler air outside of the glasses. But in total, these still weigh 228 grams and when you put them on, they are, I mean they're huge, they are massive.
You can see the arm extends way back beyond my ear. There's a lot of mass on the front of my face. They're trying to be more balanced. I mean, there's some level of appreciation for the fact that there is no separate computer puck of course but like yeah, this is today's tech we're talking about. So there's a lot going on here.
But then that brings us to number three, which is when you actually turn them on, and there's two immediate differences with the display: the resolution and the field of view. Again, I'm not just looking at a lens right now, I'm looking at a menu floating in front of the lens.
Now this is obviously really hard to explain on video and show you what I'm seeing. There's no eye tracking here, by the way, it's all hand-based and gesture-based. But on the Snapchat glasses, the resolution of the menus is dramatically higher. Everything is much sharper than on the very pixelated-looking Meta glasses, honestly close to what a Vision Pro looks like when I'm wearing those. But the field of view is significantly smaller. I think the number is 46 degrees versus Meta's 70 degrees and candidly, you notice that a lot.
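To put rough numbers on that, treating both specs as a horizontal field of view (an assumption, since vendors often quote diagonal): the width of the virtual canvas at a given distance grows with the tangent of half the angle, so the gap is bigger than the raw degrees suggest.

```python
# Back-of-the-envelope check on why 46 vs. 70 degrees feels so different.
import math

def visible_width(fov_deg: float, distance_m: float) -> float:
    """Width of the displayable region at a given viewing distance."""
    return 2 * distance_m * math.tan(math.radians(fov_deg) / 2)

for fov in (46, 70):
    print(f"{fov} deg -> {visible_width(fov, 1.0):.2f} m wide at 1 m")
# 46 deg -> 0.85 m wide at 1 m
# 70 deg -> 1.40 m wide at 1 m
```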
Now, the field of view is normally not so narrow that it cuts off the thing you're looking at straight ahead of you, but it's close, and this is one of the biggest differences between them as someone who happens to have used these and the Meta glasses back-to-back. I don't know if immersion is the right word here, but my main way of describing it is: when you're actually using the glasses, doing whatever with them, whatever software or overlay you have pulled up, you're not really thinking about field of view. You're just immersed in the thing, using the app, looking forward. But the second you start poking around or moving your head a lot or observing more of what's around you, things get cut off. You can actually see this a little bit at the edges of some apps with the Meta Orion graphics.
Right here, right there. At the edge, you saw the UI kinda get clipped off a bit, and that's how it looked to me too, and that was with Meta's, which have the 70-degree field of view. This is even more restrictive. It's really only going to overlay things directly in front of you, and if you even turn a little bit, it gets clipped. There's a golf game in here which I was playing for a little bit and it's fun, but you have your phone mapped to the golf club, and you look down and you can see the golf ball but you can't see anything else.
And in golf, your peripheral vision is very important. So you look up to see the hole and then look down to see the club, and you keep having to look back and forth, and when you hit it you have to track the ball with your eye just right so it doesn't leave the field of view. It's just much more noticeable with these. But probably my favorite difference with the display, or whatever you want to call it here, is the electrochromic tint. The Snapchat AR Spectacles have a built-in tint made from the same tech that switches the sunroof of the new Rivians from clear to tinted and back. It's built into these glasses as well, so they can go from clear to tinted, and that helps dramatically with the readability and contrast of the things that are overlaid on top of your real world.
So I guess immersion is actually the right word here. Maybe it's just a scoreboard floating in place or a social media feed or maybe you're on an airplane and just want to watch a couple of videos. You can add the tint and then it kinda feels a little closer to a VR headset where the background is faded away and you're just looking at something in front of you floating.
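The reason the tint helps so much is simple arithmetic: what you perceive is the display's light competing with whatever real-world light leaks through the lens behind it. A toy calculation with made-up but plausible numbers:

```python
# Toy contrast arithmetic with hypothetical luminance values,
# not Snap's actual display specs.
def effective_contrast(display_nits, background_nits, transmission):
    """Ratio of virtual-image luminance to the see-through background."""
    return display_nits / (background_nits * transmission)

bright_room = 250.0  # hypothetical background luminance in nits
display = 500.0      # hypothetical display luminance in nits
print(effective_contrast(display, bright_room, 0.85))  # clear lens  -> ~2.4:1
print(effective_contrast(display, bright_room, 0.10))  # tinted lens -> 20:1
```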
It is crazy how much tech is built into these glasses. And then probably the last major difference here is on the subject of developers, developers, developers. See, neither of these glasses is a real product that's gonna ship and be in stores ever.
But one of them, the Meta glasses feel like a super high-end tech demo and it's incredible that it works but the other one, these, these are actually a developer kit. You could theoretically get your hands on these today and start making apps for them. I mean, you'd have to spend 100 bucks a month and be locked into the developer program with a minimum one-year commitment.
But then you get these and you're off to the races building lenses for Snap OS, and there's a bunch of them already available. There's that golf app I told you about. There's also a browser app and a music creation app and a "Beat Saber" equivalent called "Beatboxer" and dozens more that I can already play with, and there are even more really impressive shared-space experiences. Snap's glasses don't even require you to scan a QR code. They will actually just map the room you're in actively, as long as you and the other person are looking around, and it immediately matches you up and you can both see and manipulate the same floating object in 3D space. It's kind of awesome.
So if the golden question with AR glasses, at least for now, is "what can you even do with these things?", then I do like Snap's strategy of getting these out to people as early as possible and then just seeing what they make. Now me personally, I actually have two dream use cases for AR glasses. The first one is instruments: being able to learn any instrument by having the visual, like the "Guitar Hero" overlay.
Piano, that's the one you've probably already seen, where the piano notes come down and you can just sort of match it up and play piano. Any instrument, I think that would be incredible. But my other one is, you know how when you're on an airplane and you look out and you see like a cool monument or some visual thing that you recognize, how super cool that is? What if, hear me out, you've got the glasses on, it's like a glasses AR airplane app, whatever.
You look out the airplane window and it draws the territory outlines or the state outlines or whatever, and you can see the scale of things, and it pops up little monuments and landmarks and things that you may recognize just by looking out the airplane window. That, I think, would be awesome. So clearly more development is needed for both of these glasses. The Meta glasses have a separate computer in your pocket, right? They only have a two-hour battery life and an estimated $25,000 retail price thanks to materials, and the Snapchat glasses have a 45-minute battery life and they look like this. So yeah, the tech is clearly not ready to start selling to real people yet.
But you almost can't help but picture the future that we might have, that we could have maybe even looking at our phones less if we have this AR glasses thing actually come to life. Maybe. Someday. It could be cool.
Thanks for watching. Catch you guys in the next one, peace. (upbeat electronic music)