The Metaverse and How We'll Build It Together -- Connect 2021

Show video

Hey, and welcome to Connect! Today, we're going to talk about the metaverse. I want to share what we imagine is possible, the experiences you'll have, the creative economy we'll all build, and the technology that needs to be invented, as well as how we're going to all do this together. The basic story of technology in our lifetimes is that it's given us the power to express ourselves and experience the world with ever greater richness. Back when I started Facebook, that mostly meant text that we typed on computers. Then we got phones with cameras and the internet became much more visual. And recently, as connections got faster, video has become the main way that we experience content.

We've gone from desktop to web to phones. From text to photos to video. But this isn't the end of the line.

The next platform and medium will be even more immersive, an embodied internet where you're in the experience, not just looking at it. And we call this the metaverse. You'll be able to do almost anything you can imagine, get together with friends and family, work, learn, play, shop, create, as well as entirely new categories that don't really fit how we think about computers or phones today. Now, since we're doing this remotely today, I figured let's make it special. So we've put together something that I think will really give you a feeling for what the future could be like. We believe the metaverse will be the successor to the mobile internet.

We’ll be able to feel present, like we’re right there with people, no matter how far apart we actually are. We’ll be able to express ourselves in new, joyful, completely immersive ways. And that’s going to unlock a lot of amazing new experiences. When I send my parents a video of my kids, they’ll feel like they’re in the moment with us, not peering in through a little window.

When you play a game with friends, you’ll feel like you’re right there together in a different world, not just on your computer by yourself. And when you’re in a meeting in the metaverse, it’ll feel like you’re right in the room together, making eye contact, having a shared sense of space, and not looking at a grid of faces on a screen. That’s what we mean by an embodied internet. Instead of looking at a screen, you’re going to be in these experiences. Everything we do online today, connecting socially, entertainment, games, work, is going to be more natural and vivid.

This isn’t about spending more time on screens. It’s about making the time that we already spend better. Screens just can’t convey the full range of human expression and connection. They can’t deliver that deep feeling of presence. But the next version of the internet can. That's what we should be working towards, technology that's built around people, and how we actually experience the world and interact with each other.

That's what the metaverse is all about. The best way to understand the metaverse is to experience it yourself. But it's tough, because it doesn't fully exist yet.

Some of the basic building blocks are here though, and others are emerging as we speak. We're starting to get a sense of how it could come together and what it could feel like. So, today we’re going to do something a little bit different. Rather than just focusing on this year’s products, like a normal keynote, we're going to talk about the future.

So let’s start by exploring what different kinds of metaverse experiences could feel like, starting with the most important of all, connecting with people. Imagine you put on your glasses or headset and you’re instantly in your home space. It has parts of your physical home recreated virtually. It has things that are only possible virtually. And it has an incredibly inspiring view of whatever you find most beautiful.

Hey. Are you coming? Yeah, just gotta find something to wear. All right. Perfect.

Oh, boy! Hey, Mark. Hey, what's going on? Hi. What's up, Mark. We're floating in space? Who made this place? It's awesome. Right? It's from a creator I met in LA. This place is amazing. -Boz, is that you? -Of course, it's me.

You know I had to do the robot, man. I thought I was supposed to be the robot. I knew you were bluffing. Hey, wait. Where is Naomi? -Let's call her. -Yes. Naomi!

-Hey, should we deal you in? -Hey. Sorry, I'm running late, but see what we're checking out. There's this artist going around Soho hiding AR pieces for people to find. 3D street art, that's cool.

Send that link over so we can all look at it. -This is stunning. -That is something. -It's awesome. -I love the movement. Wait, it's disappearing. This is amazing.

Hold on, I'll tip the artist and they'll extend it. -Wow, brilliant. -Did you guys like it here? I have another one that you're gonna love. Check out this forest room.

Come on, let's see it. Koi fish that fly? That's new. This is wild.

Hey, one sec, Boz. It's Priscilla. Hey, you have to see this. Beast is going crazy. Oh, I love that guy. We've got to show that to the kids.

Can you also send that to my dad? I'll message him. All right, see you at home. This place is great, Boz, but there's something I got to get back to. All right, so that’s a glimpse of a few ways that we're gonna be able to get together and socialize in the metaverse. It’s a ways off, but you can start to see some of the fundamental building blocks take shape. First, the feeling of presence.

This is the defining quality of the metaverse. You're going to really feel like you're there with other people. You'll see their facial expressions, their body language, maybe figure out if they're actually holding a winning hand, all the subtle ways we communicate that today's technology can't quite deliver. Next, there are avatars, and that's how we're going to represent ourselves in the metaverse. Avatars will be as common as profile pictures are today, but instead of a static image, they're going to be living 3D representations of you, your expressions, your gestures, that are going to make interactions much richer than anything that's possible online today. You'll probably have a photo-realistic avatar for work, a stylized one for hanging out, and maybe even a fantasy one for gaming.

You’re gonna have a wardrobe of virtual clothes for different occasions, designed by different creators and from different apps and experiences. Importantly, you should be able to bring your avatar and digital items across different apps and experiences in the metaverse. Beyond avatars, there's your home space.

You'll be able to design it to look the way you want, maybe put up your own pictures and videos, and store your digital goods. You'll be able to invite people over, play games, and hang out. You'll have a home office where you can work. Your home is your personal space from which you can teleport to anywhere you want. Now, speaking of teleporting, there are going to be all kinds of different spaces that people make, rooms like we just saw, but also games and whole worlds that you can teleport in and out of whenever you want. Teleporting around the metaverse is going to be like clicking a link on the internet; it will be an open standard.

In order to unlock the potential of the metaverse, there needs to be interoperability. That goes beyond just taking your avatar and digital items across different apps and experiences, which we’re already building an API to support. You want to know that when you buy something or create something, that your items will be useful in a lot of contexts and you're not going to be locked into one world or platform. You want to know that you own your items, not a platform.
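To make that idea concrete, here is a minimal sketch of what a portable digital item record could look like. This is an illustration only, written under assumptions: the field names, values, and format below are hypothetical, not the API being built. The point is simply that an item carries its own identity, ownership, and asset data so any app that reads the format can use it.

```typescript
// Illustrative sketch only: one way a portable digital item could be described
// so different apps can interpret it. Field names are hypothetical, not a real API.
interface PortableItem {
  id: string;                 // globally unique identifier for the item
  owner: string;              // account that holds the entitlement
  kind: "avatar_outfit" | "world_object" | "effect";
  assetUrl: string;           // where the 3D asset (e.g. a glTF file) can be fetched
  license: "personal" | "resellable";
  createdBy: string;          // creator attribution
}

// A hypothetical jacket that an avatar could wear across different apps and experiences.
const jacket: PortableItem = {
  id: "item-7f3a",
  owner: "user-1234",
  kind: "avatar_outfit",
  assetUrl: "https://assets.example.com/jacket.glb",
  license: "resellable",
  createdBy: "creator-5678",
};

console.log(`${jacket.owner} owns ${jacket.id}, usable in any app that reads this format`);
```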

This is going to require not just technical work, like some of the important projects that are going on around crypto and NFTs in the community now, it’s also going to take ecosystem building, norm setting, and new forms of governance, and this is something that we’re really going to focus on. Privacy and safety need to be built into the metaverse from day one. You’ll get to decide when you want to be with other people, block someone from appearing in your space, or when you want to take a break and teleport to a private bubble to be alone.

You’re going to be able to bring things from the physical world into the metaverse, almost any type of media that can be represented digitally, photos, videos, art, music, movies, books, games, you name it. Lots of things that are physical today, like screens, will just be able to be holograms in the future. You won’t need a physical TV, it’ll just be a $1 hologram from some high school kid halfway across the world. And you’ll be able to take your items and project them into the physical world as holograms in augmented reality too.

You're going to be able to move across these different experiences on all kinds of different devices, sometimes using virtual reality so you’re fully immersed, sometimes using augmented reality glasses so you can be present in the physical world as well, and sometimes on a computer or phone so you can quickly jump into the metaverse from existing platforms. There will be new ways of interacting with devices that are much more natural. Instead of typing or tapping, you're gonna be able to gesture with your hands, say a few words, or even make things happen just by thinking about them. Your devices won’t be the focal point of your attention anymore. Instead of getting in the way, they're gonna give you a sense of presence in new experiences that you're having and the people who you’re with. These are some of the basic concepts for the metaverse.

And while this may sound like science fiction, we're starting to see a lot of these technologies coming together. In the next 5-10 years, a lot of this is going to be mainstream, and a lot of us will be creating and inhabiting worlds that are as detailed and convincing as this one on a daily basis. So even though it's still a long way off, we're starting to work on some of these foundational concepts today. Horizon is the social platform that we’re building for people to create and interact in the metaverse. One part of this is Horizon Home, which is our early vision for a home space in the metaverse. Horizon Home is the first thing that you'll see when you put on your Quest headset.

Today there are already a bunch of options to choose from, and in the future, anyone will be able to create one. We’ve just called it Home until now because it’s been missing something very important, people. Soon, we’re going to be introducing a social version of Home, where you can invite your friends to join you as avatars. You’ll be able to hang out, watch videos together, and jump into apps together.

Then there is Horizon Worlds, which is where you can build worlds and jump into them with people. Horizon is designed to make it possible for everyone to create, and we're already seeing people build some interesting experiences, from creating new games together to throwing surprise parties in VR that family and friends around the world can join. We started rolling out Horizon Worlds in beta last year, and we’re adding more people and more worlds every day. And we just launched Horizon Workrooms earlier this year for collaboration.

Beyond Horizon, we're also making it easier to communicate with your friends across different layers of reality. This year, we're bringing Messenger calls to virtual reality. You'll be able to invite your friends to a Messenger call and soon you’ll be able to go explore somewhere together or join a game.

These are the kinds of tools that need to get built so that you can jump into the metaverse with your friends from anywhere. And they're going to unlock some unique experiences. Now, let's move on from some of these basic concepts and people just connecting, to a completely different set of experiences around entertainment and gaming. Imagine your best friend is at a concert somewhere across the world. What if you could be there with her? -Yo. -You're here! Oh, yeah.

METAVERSE AFTERPARTY FREE PASSES Afterparty passes? -Yes. -Yes. Jumping in now. What if there was an afterparty that anyone could go to no matter where they were? This is wild. Hey, check this out. Charity auction happening.

There's swag! And when you go to the afterparty, you could connect with other fans. Hear new versions of your favorite songs. And check out the merch that just dropped. Ooh, I like that. Well, now you've got to get it. You can start to see how the metaverse is going to enable richer experiences, by letting us add new layers to the world that we can interact with.

Creators and artists are going to be able to connect with their audiences in new ways, and really bring them into these shared experiences. Now there's a lot that needs to get built to create experiences like this, but we're working on some of these pieces right now with Spark AR. First, we're building tools that creators can use to place digital objects into the physical world, for people to interact with. And, rather than just simple visual effects, new creator capabilities will support 3D objects that can respond and react realistically.

Including a realistic sense of depth and occlusion. In the next year, we're also adding the ability for creators to connect different physical locations into cohesive augmented reality storytelling experiences like guided tours or scavenger hunts. We're also building a Horizon marketplace where creators can sell and share 3D digital items. And our hope is that this will enable a lot more commerce, and help grow the overall metaverse economy.

Because at the end of the day, it is really the creators and developers who are going to build the metaverse and make this real. And to make sure there's an ecosystem that can sustain hundreds of thousands of people working on this, which is what we're going to need to bring this to life, it's critical that creators and developers can make a good living doing this work. Now, if you asked people today what they thought the metaverse was, a lot of people would probably say it was a Spider-Man movie. But the people who actually follow the space would say it's about gaming.

And that's because gaming provides many of the most immersive experiences. And it is the biggest entertainment industry by far. Gaming in the metaverse is going to span from immersive experiences in fantasy worlds to bringing simple games into our everyday lives through holograms. Maybe you'll play old games in new ways. So, Barcelona, huh? Well, it's not New York, but I like it... New York misses you.

What's that? I said, "Let me put my game face on so I can beat you." Oh! Okay, check. I got to try another game face next time. Maybe you'll go head to head with players from around the world. Lucky shot! Some call it skill.

Keep talking! Or maybe you'll do things that aren't even possible in gaming today. Hey, Mark. Down for a VR Foiling session? Now, this is more my style. Classic look. All right. Nice. Oh, nice choice, Mark. Ready to shred.

Okay, here we go. Hang in there, Mark. -That's not good. -Hit this section. Boom! All right, backflip.

-I've got an idea. -Where are you going? -Got to pump it to jump it. -What? I didn't know that was an option. All right, you're not gonna catch me now. Take that, unicorn. -Tube City. -God, you're out of control. Don't worry, I'll let you win next time. All good.

Well, that was a close one. -You want to go again? -Maybe later. I'm gonna need a lot more sunscreen though. Man! Well, gaming is how people are going to step into the metaverse for the first time. It already has some of the most fully built-out digital goods, the most active creator and developer communities.

And major platforms like Epic are working to build up the metaverse, starting with gaming. For our part, we're heavily investing in building a healthy VR and AR ecosystem, so that game studios can keep building and gaming creators can keep creating. Now, Deb from our studios team is joining me. Deb, can you take us through some exciting games in the pipeline for Quest? Absolutely.

Over the years, we've had the opportunity to work with incredible developers like Vertigo Games, the studio behind fan favorite Arizona Sunshine. I love Arizona Sunshine. That game basically got me and my friends through the first months of the pandemic. That's awesome.

If you enjoyed that, Mark, you'll be excited that we're partnering with Vertigo on five more great games from Deep Silver and others. We'll share more about this lineup very soon. Nice. What else is coming? Well, the metaverse is constantly evolving.

So one of the most important aspects will be live-service games that launch updates and new downloadable content regularly, like Echo VR, Beat Saber, Onward, Pistol Whip, and more. We're focused on this a lot right now, making sure games can build out active communities. -Beat Saber has a passionate community. -I love Beat Saber. So do I, and Beat Saber just passed $100 million in lifetime revenue on Quest alone. It's a great example of a game that keeps releasing fresh content.

They’ve actually been working on evolving the way you interact with the tracks and feel the music. Also, the team has been working on something really cool. Check this out. I can't wait to play this.

And they keep partnering with incredible artists to release new music packs all the time. Did you play the Billie Eilish music pack last month? A little more than I should. I probably should have been working on this metaverse presentation.

Well, they have a great lineup of artists for 2022. And there's one more epic surprise before the end of the year. So stay tuned. Okay? Have you played POPULATION: ONE? -I mean, yeah, I love the game so much. -Well, for those who haven't.

POPULATION: ONE is a thrilling battle royale that is only possible in VR. Since its launch at Connect last year, it has become one of the highest earning games on Quest, and the biggest multiplayer FPS on the platform. You can have up to 24 people in at once for a match. We're super excited to keep launching big updates, like an all-new Autumn event later in November, and a Winter Wonderland Update in December. Okay, here's something I know our community has been waiting for. -Lay it on us. -All right.

This is a title from the WarpFrog team. This is the team that set the standard for VR combat physics when it launched on Rift in 2019. That's right, Blade and Sorcery: NOMAD. The built-for-VR medieval fantasy sandbox that pairs magic with melee is launching on Quest later this year. For more gaming updates, look out for the 2022 Oculus Gaming Showcase.

It's going to be loaded with news you won't want to miss. But Mark, I believe you have some news for us as well. Yeah, you know, I have to say. It's really impressive to see this line-up come together over the last few years, but there's one project that I'm really looking forward to.

Yeah, this is one of the all-time greats, and we've been working for years to bring it to Quest. I'm excited to announce that the Rockstar Games' classic, Grand Theft Auto: San Andreas is in development for Quest 2. This new version of what I think is one of the greatest games ever made will offer players an entirely new way to experience this iconic open world in Virtual Reality. That's it, Mark.

I'm moving to the metaverse. Right, thanks, Deb. It's gonna be amazing. Now, a lot of the most interesting games out there take advantage of how you can move around physically.

Being able to look anywhere, move freely. It's just a fundamentally different experience from staring at a screen. This quality of being physically embodied, able to interact with the world and move around inside it, opens up some completely new experiences that didn't really make sense before on 2D phones or computers.

And one example that we're seeing take off is fitness. And speaking of that, I think it's time for my workout. A lot of you are already using Quest to stay fit. It lets you work out in some completely new ways. It's kind of like Peloton, but instead of a bike, you just have your VR headset, and with it you can do anything from boxing lessons to sword fighting to even dancing. You'll be able to work out in new worlds.

Even against an AI. Maybe you'll get some friends together for some three-on-three. Maybe play pickup with people on the other side of the world. Or imagine your Facebook cycling group does an AR charity ride. -Let's go! -Complete with a leaderboard. Maybe you'll even train with the best, like you're right there. Like Lee Kiefer, Olympic gold medalist! En garde. Fence.

-Don't be scared to stab. -All right. You seem like a natural. Whoa! All right, that's a little too realistic. -See you later, Lee. That was fun. -Good job, Mark. And that's what fitness will be like in the metaverse. Now, I think we're going to see a lot more unique experiences emerging around fitness that take advantage of the full immersion and interactive training.

Speaking of which, Supernatural just added boxing to its lineup. FitXR has new fitness studios coming next year. And Player 22 by Rezzil, which is currently used by pro athletes, is adding guided and hand-tracked bodyweight exercises soon. We're making a fitness accessories pack that makes Quest 2 more comfortable, with controller grips for when things get a bit intense, and a facial interface that you can wipe the sweat off of, making your sessions more comfortable. And that's all coming next year. But enough with the fun and games. It's time for everyone's favorite: work. Over the last year and a half, a lot of us who work in offices have gone remote.

And while I miss seeing the people I work with, I think remote work is here to stay for a lot of people. So we're going to need better tools to work together. Let's take a look at what working in the metaverse will be like. Imagine if you could be at the office without the commute. You would still have that sense of presence.

Shared physical space. Those chance interactions that make your day. All accessible from anywhere. Now, imagine that you have your perfect work setup, and you could do more than you could in your regular work setup.

And on top of all that, you can keep wearing your favorite sweat pants. Looking good. Let's get together real quick for a debrief.

I'm free now. Let's jump in. -Hi! -Hey! So what do we think? I think it's ready. Great. I’ll prep it for the presentation. All right. Good luck.

Imagine a space where you can tune out distractions, and focus on the task at hand. And when you're ready to share what you're working on, you can present it as if you're right there with the team. Wait, where's Mark? I think he's in the middle of something.

You can already see some of these elements in Horizon Workrooms, which we launched a couple of months ago. Later this year, we plan to introduce room customization, so you can put your own logos and posters in your workrooms.

We're also introducing a new office space in Horizon Home for when you want your perfect workspace to do some focused work or just cross a few things off your to-do list. We're also announcing 2D Progressive Web Apps for the Quest Store, along with a new developer framework so they're easier to build. So you can drop in and check on a work project while you're in VR using services like Dropbox and Slack, or stay connected with Facebook and Instagram.
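For a rough sense of what shipping a 2D Progressive Web App involves, here is a minimal Web App Manifest written as a TypeScript object. The field names follow the W3C Web App Manifest standard; the app name, URLs, and colors are placeholder assumptions, not a real Quest Store listing or the new framework itself.

```typescript
// Illustrative only: a minimal Web App Manifest for a 2D Progressive Web App,
// expressed as a TypeScript object. All values are hypothetical placeholders.
const manifest = {
  name: "Example Task Board",
  short_name: "Tasks",
  start_url: "/index.html",
  display: "standalone", // run as its own window, without browser UI
  background_color: "#1c1e21",
  theme_color: "#1c1e21",
  icons: [
    { src: "/icons/icon-192.png", sizes: "192x192", type: "image/png" },
    { src: "/icons/icon-512.png", sizes: "512x512", type: "image/png" },
  ],
};

// In practice this object is served as manifest.json and referenced from the page:
// <link rel="manifest" href="/manifest.json">
export default manifest;
```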

This starts bringing more of your 2D internet services into the metaverse. And as we’ve focused more on work, and frankly as we’ve heard your feedback more broadly, we’re working on making it so you can log into Quest with an account other than your personal Facebook account. We’re starting to test support for Work Accounts soon.

And we’re working on making a broader shift here within the next year. I know this is a big deal for a lot of people. Not everyone wants their social media profile linked to all these other experiences.

I get that, especially as the metaverse expands. I’ll share more about that later. I'm genuinely optimistic about work in the metaverse. We know from the last couple of years that a lot of people can effectively work from anywhere. But hybrid is gonna be a lot more complex, when some people are together and others are still remote.

Giving everyone the tools to be present no matter where they are, whether as a hologram sitting next to you in a physical meeting, or in a discussion taking place in the metaverse, that's gonna be a game changer. I think this could be very positive for our society and economy. Giving people access to jobs in more places, no matter where they live, will be a big deal for spreading opportunity to more people. Dropping our daily commutes will mean less time stuck in traffic, more time doing things that matter, and it'll be good for the environment.

Actually, if you travel for work and working in the metaverse means that you just take one less flight each year, that’s better than almost anything else that you can do for the environment. I think working in the metaverse is going to feel like a huge step forward. And these dynamics, like the ability to teleport places with people, and interact around shared projects in virtual space, they're going to be valuable in a lot of other categories of experiences too. So now, Marne, our Chief Business Officer, is going to take us through some of them. Thanks, Mark. We'll see you in a minute! What if you could learn about anything in the world just by bringing it closer to you? So we're going to have an astrophysicist in the family.

Actually, I have to write this paper. Will you help me? Let's take a closer look. What part of the solar system are we talking about? Saturn. If you were taking astrophysics, you could study in the metaverse.

Did you know the rings are made up of billions of icy particles? -Really? -Look at this. You're ready to do that paper now, right? Yeah. In the metaverse, you'll be able to teleport not just to any place, but any time as well. Ancient Rome. Imagine standing on the streets, hearing the sounds, visiting the markets, to get a sense of the rhythm of life over 2,000 years ago. Imagine learning how the Forum was built by actually seeing the Forum get built, right in front of you.

Hi everyone! I'm Marne Levine. In the metaverse, learning won't feel anything like the way we've learned before. With a headset or glasses, you'll be able to pull up schematics for your studies, or maybe even the service manual for a vehicle you're learning to repair.

Let's say you're a med student or a doctor. With apps like Osso VR, you can learn new techniques in surgery firsthand, practicing until you get it right. Or if you're studying earth science, you could swim through the Great Barrier Reef. Get up close to Earth's mightiest insects with your instructor, David Attenborough, whose VR documentary is playing now on Oculus TV.

This is a world of intrigue, a world of wonder, a secret world hidden to the human eye. Explore the extraordinary world of Micro Monsters 3D with me, David Attenborough. This is just one of the ways we’re going to learn in the future. But in order to get there, we’re going to need to help build the skillsets of the people who build these experiences.

So we're setting aside $150 million to train the next generation of creators to build immersive learning content and increase access to devices. And to help more creators make a living building AR effects using Spark AR, we’re going to establish a professional curriculum and certification process, make it easier to monetize, and put our Spark AR curriculum on Coursera and EdX. Thanks, Marne.

So that's a glimpse of the kinds of experiences that you might have in the metaverse. From connecting with friends, to gaming and entertainment, to work and education, to creation and commerce. Now let's talk about how creators and developers are going to build all of this, and about the economy that we all need to create. The last few years have been humbling for me and our company in a lot of ways.

One of the main lessons that I’ve learned is that building products isn’t enough. We also need to help build ecosystems so that millions of people can have a stake in the future, can be rewarded for their work, and benefit as the tide rises, not just as consumers but as creators and developers. But this period has also been humbling because as big of a company as we are, we’ve also learned what it is like to build for other platforms. And living under their rules has profoundly shaped my views on the tech industry. Most of all, I’ve come to believe that the lack of choice and high fees are stifling innovation, stopping people from building new things, and holding back the entire internet economy. We’ve tried to take a different approach.

We want to serve as many people as possible, which means working to make our services cost less, not more. Our mobile apps are free. Our ads business model is an auction, which guarantees every business the most competitive price possible.

We offer our creator and commerce tools either at cost or with modest fees to enable as much creation and commerce as possible. And it's worked. Billions of people love our products, we have hundreds of millions of businesses on our platform, and we have a rapidly growing ecosystem and a thriving business. That’s the approach that we want to take to help build the metaverse, too. We plan to continue to either subsidize our devices or sell them at cost to make them available to more people. We’ll continue supporting sideloading and linking to PCs so consumers and developers have choice, rather than forcing them to use the Quest Store to find apps or reach customers.

We'll aim to offer developer and creator services with low fees in as many cases as possible so we can maximize the overall creator economy, while recognizing that to keep investing in this future, we'll need to keep some fees higher for some period to make sure we don't lose too much money on this program overall. After all, while a growing number of developers are already profitable, we expect to invest many billions of dollars for years to come before the metaverse reaches scale. Our hope, though, is that if we all work at it, within the next decade the metaverse will reach a billion people, host hundreds of billions of dollars of digital commerce, and support jobs for millions of creators and developers. We'll try to lay out how some of this commerce model will work, but I also want to be upfront about the fact that there's a ton that we just don't know yet.

What I can tell you, though, is that we're fully committed to this. It is the next chapter of our work, and we believe the next chapter for the internet overall. And our strategy and track record show that we will do everything we believe is sustainable to grow the community, the creator economy, and the developer ecosystem. And to tell us more about all of that, here's Vishal, our Head of Metaverse Products. Thanks, Mark. As you said, our approach is to serve as many people, creators, and businesses as we can by keeping fees as low as possible and offering choice.

As a company, we have been investing in Commerce and Creators for many years, but let me take a moment and explain how this will come to life in the metaverse. For people, the metaverse will offer more choice than we’ve ever seen before. And most importantly, there'll be a real sense of continuity where the things you buy are always available to you. Today, much of what you buy on the Internet is inside a single app, website, or game. You might buy a custom skin for your gaming avatar, but you can’t take it with you when you move to a new space.

In the physical world, that would be the equivalent of buying your favorite sports team’s jersey and never being able to wear it outside of the stadium. In the future, you'll be able to buy digital clothes for your avatar and then wear them into the metaverse more broadly. For creators, our goal is to provide a way for as many creators as possible to build a business in the metaverse.

One thing we’ve learned from today’s digital platforms is that we can’t artificially limit innovation. We need to enable as many different types of creators as possible to unlock the best ideas. There will be many kinds of creators in the metaverse. Creators who make digital objects, creators who offer services and experiences, and those who build entire worlds like game creators do today. We also want creators to have the biggest possible audiences. If a creator designs a signature effect to surround avatars, their fans should be able to buy it and then visit different spaces to show it off to their friends.

Frankly, if a creator builds any experience, all of their fans should be able to enjoy it. Businesses will be creators too, building out digital spaces or even digital worlds. They'll sell both physical and digital goods, as well as experiences and services.

And they'll be able to use ads to ensure the right customers find what they have created. Speaking of buying things, we’re also exploring new types of ownership models and entitlements, to ensure people feel confident that they actually own something. This will make it easier for people to sell limited-edition digital objects like NFTs, display them in their digital spaces, and even resell them to the next person securely. In short, the metaverse will remove many of the physical constraints we see on commerce today and make entirely new businesses possible. While this is very exciting, the metaverse is all about co-creating.

We’re building this together with creators, developers, and entrepreneurs. So today I’ve asked Jackie Aina to join me to start a conversation about what might be possible. Jackie is a beauty creator who launched her own lifestyle brand on Instagram called @forvrmood, where she sells candles with scents like 'cuffing season' and partners with major brands on makeup drops.

So let’s meet Jackie in her space. Here we are in this digital space, totally inspired by you and all the things that you love. Jackie, I've been such a big fan of yours for such a long time.

Tell me a bit about how you got started in social media. So rewind back to 2009. As it pertains to beauty, I would go to a makeup counter or go makeup shopping, and I always felt rejected, always felt othered. So I got so frustrated that eventually I was like, you know what? Not only am I gonna not listen to them, but I'm gonna show other people how I'm making it work with my skin tone, too. How did you go from creating content, creating videos, to a product? You have a brand called Forvr Mood, it's huge on Instagram, Instagram Shopping.

So Forvr Mood was not only a bit of a love letter to me and my relationship with my mom. That's how we bond, it was through fragrance ever since childhood. It's been doing amazing and we've been getting a lot of support. Jackie, this space is pretty magical.

And I know it kind of reflects the stuff you love. Tell me a bit about what you love and why this represents you. Okay. So first thing, color is everything. I love pastels. You can see it in the brand, you can see it a lot of times in what I wear.

With the metaverse, when we're curating collections or brainstorming, this would be a great way to gather us all together and create atmospheres and environments and moods to enhance creativity. And then if I can invite people, it would be like inviting them into my home. It'd be like bringing them into a piece of me. So I would want them to feel some of the things I feel when I step into my house or when I light my candles. Interacting with people in comments and having a conversation is a way to get your fans excited about interacting with you, but also you feel like you get...

I get excited too. Yeah. So this idea of reaching into the comments section and pulling a fan in and having a conversation, -is pretty powerful. -I love that. And it humanizes that person on the other end of that comment. There's an interesting opportunity to think about how we build community and how we engage with our fans in a really deep way, as opposed to just through a comment box. Imagine, for your biggest fans, you could have an exclusive launch party where anyone could visit no matter where they were in the world. I love that because, when we first launched the brand, everything was shut down.

Obviously, quarantine. Couldn't go anywhere, and I had this idea of doing, like, pop-ups for Forvr Mood where we would have a room inspired by each scent. And now, anyone from anywhere in the world could come and feel what you just described, and have that feeling, have that sense, really get into that mood. Also, you have your greatest fans, people that care a lot about you, but you'll also need to reach new customers and use advertising to keep growing your business. You could also drop an exclusive product in the metaverse, available only to your most ardent fans who paid for special access to get that product.

That's dope. Commerce is going to be a big part of the metaverse. You'll be able to sell both digital and physical products.

The Butterfly Effect transports us to someplace magical. So beautiful. So, Jackie, as we walk through this amazing world, what does the metaverse mean to you? I just feel like this is like endless possibilities of my imagination. I can't even begin to imagine how meaningful the metaverse will be, thanks to creators like you. There's a lot here to be excited about.

Just like the internet today, more people are going to have the freedom to find a business model that works for them, whether that's custom work, tipping, subscriptions, ads, or other monetization tools that may only make sense in the metaverse. Now, think about how many people make a living on the internet today. How many of those jobs just didn't even exist a few years ago? I expect that the metaverse is going to open up lots of opportunities for people in the exact same way. Now, we have confidence in this general direction, and we're investing significantly in building for this future.

But the reality is that no one knows exactly which models are going to work and make this sustainable. So we're going to approach this with humility and openness, and we're going to work with anyone whose efforts will help bring the metaverse to life. Now, Boz is going to join us to talk about what people are already building for the metaverse and what kinds of tools we can create to help out.

Hey, Boz. Still kind of jealous of your robot avatar. You got to have the robot when you're working your dance moves, right? I mean, of course. Look, Mark, we've been working on Facebook Reality Labs for a long time, but we always kept the same vision, which is, we wanna build tools that help people feel connected, anytime, anywhere.

And that vision's what drives us today. To help people get together in virtual spaces. We've come a long way since the early days of VR, thanks to the developers and the creators who see the potential to entertain and educate. But we still have a lot farther to go. So, to keep the pace of innovation moving faster and empower more use cases, we do have a few exciting announcements. Today, we're introducing the Presence Platform, which is a broad range of machine perception and AI capabilities that empower developers to build mixed reality experiences on Quest 2.

Tell us more about the Presence Platform. We've said before that realistic presence is the key to feeling connected in the metaverse. And the Presence Platform's capabilities are what's gonna deliver on that promise.

Things like environmental understanding, content placement and persistence, voice interaction, standardized hand interactions. In fact, let's start with hands. I mean, the human hand is an engineering marvel.

And bringing hands into VR was no easy feat. It required a lot of collaboration across product, design, and research. And we continue to improve that product, finding new ways to navigate with gestures and interact with VR. So, today, we're introducing the Interaction SDK. A library of modular components that'll make it easy to add hand interactions to your apps. That's pretty exciting.

Yeah, we really think this is gonna help people accelerate the build time for new and existing titles that allow people to use their hands more naturally in more virtual experiences. Now, in addition to hands, there's another exciting interaction space for VR. That's your voice. The Voice SDK lets developers integrate voice input into their games and apps so they can create new gameplay and navigation. So, soon, you'll be able to jump into a FitXR workout, and shake up your routine just by saying, "Surprise me."
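To give a feel for that pattern, and only as a hedged sketch, here is the general shape of voice-driven input: speech becomes a transcript, the transcript resolves to an intent, and the app maps that intent to an action. Every name below is hypothetical; these are not the actual Voice SDK calls.

```typescript
// Hedged sketch of voice-driven navigation. All names are hypothetical,
// not a real SDK API. The pattern is: transcript -> intent -> action.
type Intent = "surprise_me" | "pause_workout" | "next_track" | "unknown";

function resolveIntent(transcript: string): Intent {
  const t = transcript.toLowerCase();
  if (t.includes("surprise me")) return "surprise_me";
  if (t.includes("pause")) return "pause_workout";
  if (t.includes("next")) return "next_track";
  return "unknown";
}

function handleIntent(intent: Intent): string {
  switch (intent) {
    case "surprise_me":   return "Loading a randomly selected routine";
    case "pause_workout": return "Workout paused";
    case "next_track":    return "Skipping to the next track";
    default:              return "Sorry, I didn't catch that";
  }
}

console.log(handleIntent(resolveIntent("Surprise me")));
```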

Yeah, I've tried this one before. -This one's really tough. -Me, too. It is. Since we launched the Passthrough API, we've already seen breakthrough experiments from developers that blend the virtual and the real world. Soon, with our next SDK release, developers will be able to ship their mixed reality apps on the Oculus Store and in App Lab. Now, of course, access to Passthrough by itself is not enough. To achieve that rich mixed reality experience, apps also need to be aware of things in the room and, you know, blend the virtual objects with the physical environment around them so they coexist in the same space.

So, developers wanna be able to place persistent world-locked content like animated holograms or your Instagram feed in your real space. And tools like spatial anchors and scene understanding capabilities will help make these mixed reality experiences feel seamless. Now, I'd gladly nerd out with you -on the technical details of this. -I know. But I think it's better to show you.

Load The World Beyond. This is Oppy. Oppy, come, come on, Oppy. Who's a good, persistent-state virtual object laid on an interactive pass-through environment? Oh, you are. Yes, you are. Oppy, sit.

Pets in the metaverse. Just as stubborn as they are in the physical world. You know, Boz, I always really wanted a forest in my living room. -Did you? -Not really. I'm looking forward to how developers are gonna be able to build a new generation of mixed reality experiences on Quest 2.
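Under the hood, persistence like Oppy's comes down to bookkeeping: save the pose of an anchor under a stable ID, then re-attach content to that pose in a later session. The sketch below is only a hypothetical illustration of that idea, not the actual spatial anchor API, which also handles tracking and relocalization against the real room.

```typescript
// Hedged sketch of persistent, world-locked content: store an anchor's pose under
// a stable ID, then re-place content at that pose next session. Names are made up.
interface Pose { position: [number, number, number]; rotationQuat: [number, number, number, number]; }

const savedAnchors = new Map<string, Pose>();   // stand-in for on-device persistent storage

function saveAnchor(id: string, pose: Pose): void {
  savedAnchors.set(id, pose);                   // remember where the content lives in the room
}

function restoreAnchor(id: string): Pose | undefined {
  return savedAnchors.get(id);                  // later session: look up the saved pose
}

// Pin a virtual pet's bed to the corner of the living room...
saveAnchor("pet-bed", { position: [1.2, 0.0, -0.8], rotationQuat: [0, 0, 0, 1] });

// ...and on the next launch, place the content back at the same spot.
const pose = restoreAnchor("pet-bed");
if (pose) console.log("Re-placing content at", pose.position);
```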

And that demo also gives us a glimpse of the kind of worlds that people are gonna be able to build. Let's talk about augmented reality. With Spark, we really focused on those use cases that allow people to stay engaged with the world around them, but also stay connected to the people who aren't around. Creators are really leading the way here. We're doing everything we can to support them with our know-how, with resources, and, of course, with tools. And for more on that, I'd like to say hi to Sue, the product director at Facebook Reality Labs and the head of Spark AR.

Thanks, guys. Spark AR is focused on building AR experiences that empower people to be more connected, and at the same time, more present with the world around them. Creators are helping us build these experiences of today, while exploring the possibilities of tomorrow. Fundamentally, we're democratizing AR creation and enabling a global community with the tools and knowledge necessary to develop the AR content and experiences that people love to use.

Programs like our AR Curriculum have reached over 22,000 creators around the world in less than a year, and we're adding additional programming to support more advanced creators. The depth and breadth of content would not be possible without these creators. In one week alone, you'll see effects that bring moments like landing on Mars or the launch of a new album to life, all by tapping into everyone's imaginations. We're also powering experiences across Facebook. So, imagine playing that AR battleship game from your phone with a friend in the metaverse on an augmented call.

How cool, right? Right now, we have over 600,000 creators in our community with diverse needs and motivations for using AR to engage with their fans and audiences. We know that more than 700 million people a month across the Facebook family of apps are already using AR a ton. We see over 80 billion effects applied per month, and we want to make it even easier for anyone to find their spark. So, we created a tool code-named Polar that makes AR creation possible for novice creators who have no prior experience in art, 2D or 3D design, or programming. Think paint by numbers.

So, creators whose first love might be photography, video, or dance can enhance their content with their own special effects. And that's an important piece of this vision and this community. It's open to anyone and everyone who's curious. And the metaverse is well positioned to be a strong digital economy for creators from all walks of life.

This is fundamental to a functioning metaverse. And we want to make sure creators are ready to share their creativity and capitalize on this emerging opportunity from day one. Thanks, Sue.

So, that's some of the software platform that needs to get built to deliver the metaverse. But there are other important considerations here, too. Hey, Nick.

Hey, Mark, I hope I'm not interrupting. You got a sec? Have you got Oppy with you? I think Oppy's still in the virtual forest, but I always have time for you. What's going on? I just love the presentation so far. It's such visionary stuff. But as you mentioned early on, with all big technological advances, there are inevitably gonna be all sorts of challenges and uncertainties, and I know you've talked about this a bit already, but people want to know how we're gonna do all this in a responsible way, and especially that we play our part in helping to keep people safe and protect their privacy online. Yeah, that's right. This is incredibly important. The way I look at it is that, in the past, the speed that new technologies emerged sometimes left policymakers and regulators playing catch up.

So, on the one hand, companies get accused of charging ahead too quickly. And on the other, tech people feel that progress can't afford to wait for the slower pace of regulation. And I really think that it doesn't have to be the case this time round because we have years until the metaverse we envision is fully realized.

So, this is the start of the journey, not the end. Like I said earlier, interoperability, open standards, privacy, and safety need to be built into the metaverse from day-one. And with all the novel technologies that are being developed, everyone who's building for the metaverse should be focused on building responsibly from the beginning.

This is one of the lessons that I've internalized from the last five years. It's that you really want to emphasize these principles from the start. So, at Connect last year, Boz outlined our Responsible Innovation Principles, and the first one was, "Never surprise people." Right, and that means being transparent about how things work, what data is collected, and how that data is used over time. It also means giving people easy-to-use safety controls as well as age guidance and parental controls for when youngsters are using these products. And we're spending a lot of time talking with experts and getting perspectives from outside the company on what we're building even before we build it.

And this is about designing for safety, and privacy, and inclusion before the products even exist. And one example is what we're doing with Project Aria, our research device that helps inform the AR glasses that we're building. We're also funding external research on this.

You know, last year, we announced grants for research on the impact of AR, VR, and smart devices on people who aren't currently using them, especially communities whose perspectives have often been overlooked, as well as best practices for creating inclusive environments in virtual spaces. And, this year, we're opening up support for even more research because we need those independent perspectives to make sure that we're living up to another one of our principles, "Consider everyone." And that point is so important because this is a collaborative exercise, and in particular, we need to make sure the human rights and civil rights communities are involved, so that these technologies are built in a way that's inclusive and empowering. One of the advantages of starting right now is that we can collaborate with people at the very early stages of development, like what we're doing through the new fund you announced a few weeks ago.

Yeah, the XR Programs and Research Fund. It's a two-year, $50 million investment in programs and external research with organizations all over the world like Howard University, Women in Immersive Tech, Africa No Filter, and universities in Seoul, Hong Kong, and Singapore. And this is crucial because, as we've said, the metaverse isn't something we're building so much as it's something we're building for.

Across the industry, we need to bring that same imagination and commitment to building for interoperability, openness, safety and privacy as we do for all the other product aspects of the metaverse. These have to be fundamental building blocks, just like the other software and experiences that we've been talking about. -Right. -All right. Thanks, Nick. So, one thing that we haven't talked about much yet is the future of hardware that will help bring the metaverse to life. We're working on multiple new products to advance the state of the art, unlock richer social interactions, and make it a lot easier to be productive.

That must be Angela, our head of VR devices. -Hey, Angela. -Hey, Mark. -Is that what I think it is? -I sure hope it is.

All right. Come on in. So, what do you think is the most exciting thing we're working on right now? That's hard to say, you know, with such an exciting pipeline. But next year, we are releasing a new product that'll push the boundaries of VR even further.

We've codenamed it Project Cambria. So this isn't the next Quest. It's gonna be compatible with Quest. But Cambria will be a completely new, advanced, high-end product, and it'll be at the higher end of the price spectrum too. Our plan is to keep building out this product line to release our most advanced technology before we can hit the price points that we target with Quest.

All right. So, let's talk about some of the new advances here. Yeah, sure. There's a ton of new tech going into Cambria. For example, your avatar will be able to make natural eye contact and reflect your facial expressions in real time. This way, people you're interacting with will have a real sense of how you're actually feeling. It does mean building more sensors into a form factor that’s comfortable to wear for a while.

And because we want VR to be for everyone, we also have to make sure avatars represent a diverse set of human facial features and skin tones, as well as paying attention to things like glasses and beards that may get in the way of some sensors. So, that’s going to be a big step forward for social presence. And I'm really glad we're focused on making it inclusive from the start. Now, what about unlocking more mixed reality experiences? I mean, imagine working at your virtual desk with multiple screens while seeing your real desk so clearly that you can pick up a pen and write notes without taking your headset off.

Or you're doing a workout with a virtual instructor in your living room. It's going to be so cool. We're already seeing the potential for these kinds of experiences today as people are building for our Passthrough API.

But with Cambria, we'll be taking this to the next level with high-resolution, full-color mixed reality passthrough. We essentially combine an array of sensors with reconstruction algorithms to represent your physical world in the headset with a sense of depth and perspective. Now, we're still a ways away from exactly matching what our eyes see in the physical world, but we're pretty encouraged by how far we've been able to advance the passthrough experience. Definitely, but we also need to push the visuals to the next level.

So, let's talk about the progress that we're making on optics. Yeah, we’re pushing the limits of display technology and form factor with something called pancake optics. They essentially work by folding light several times over to achieve a slimmer profile than current lenses. Now with several optical layers, we'll need to precisely control every aspect of design and fabrication to achieve that high-quality, artifact-free display and really deliver the best optics ever in one of our headsets. That’s pretty awesome. But let’s make sure that we leave some of the good stuff for next year’s release too.

All right. Now, I'm excited to keep building out this new product line over future generations, so we can keep getting our most advanced technology into people's hands even before we can get it into our Quest product line. That's our goal. We're starting to work with developers to build experiences for Cambria as we speak, and we're really looking forward to sharing so much more with y'all next year.

Sounds good. Now, beyond virtual reality, we're also focused on the hardware to make true augmented reality possible. In a lot of ways, augmented reality is even harder, not just because we need to invent a completely new optical stack that's not based on screens, but because we basically need to fit a supercomputer into a pair of normal-looking glasses. So we're approaching this problem from two directions. First, how much technology can we pack into a pair of normal, great-looking glasses today? And second, how do we take the long-term tech stack to do everything and keep miniaturizing and improving it until it fits into a pair of normal, good-looking glasses? On the first path, last month, we launched Ray-Ban Stories, our first smart glasses in partnership with EssilorLuxottica.

They’re not full AR glasses yet, but they let you take pictures and videos, listen to music, and take phone calls while you're out looking at the world instead of down at your phone. We built leading privacy features into the glasses, like the LED light whenever you’re recording, which phones don’t even have. And we delivered this in the iconic Ray-Ban style for just $299. These are all steps along the path to an embodied internet. But the ultimate goal here is true augmented reality glasses. We’ve been working on that too, and today, I want to show you an experience that we’ve been working on for Project Nazaré, which is the codename for our first full augmented reality glasses.

Here, you'll see you’re chatting with friends on WhatsApp and planning a game night. You can select a game and then, as you walk over to your kitchen, you can easily just put your game onto the table and you're off. That’s the kind of experience that augmented reality will unlock. There's a lot of technical work to get this form factor and experience right.

We have to fit hologram displays, projectors, batteries, radios, custom silicon chips, cameras, speakers, sensors to map the world around you and more into glasses that are about 5 millimeters thick. We still have a ways to go with Nazaré, but we’re making good progress. I'm excited about our future roadmap. But even that's still early in a journey that’s going to go on for decades. Immersive all-day experiences will require a lot of novel technologies.

And for the last seven years, our research team has been working on a broad array of technologies that are necessary for these next-generation platforms. Michael Abrash leads this team, and he’s going to join me to talk about some of the future technology that we’re developing. Hey, Michael. Do you want to take everyone through our roadmap?

Sure! That covers a lot of ground though. It’s going to take about a dozen major technological breakthroughs to get to the next generation metaverse, and we’re working on all of them, displays, audio, input, haptics, hand tracking, eye tracking, mixed reality, sensors, graphics, computer vision, avatars, perceptual science, AI, and more. You're right. That's a lot.

We probably only have time for a couple right now. You want to pick a few? Well, I think the metaverse is really going to be first and foremost about connecting people. So, I'd say, let's start with avatars. Yeah, I agree. The goal here is to have both realistic and stylized avatars that create a deep feeling that we're present with people.

Exactly. And we've shown Codec Avatars before. Well, what can your face do? Can you show us? Well, I've always hoped you would ask me that question. And it's a remarkable experience to see Codec Avatars in VR, and photorealistic avatars will be a huge breakthrough. But they're only part of the picture. Yeah, because you're not always going to want to look exactly like yourself. That's why people shave their beards, dress up, style their hair, put on makeup, or get tattoos, and of course you'll be able to do all of that and more in the metaverse.

So this next video shows how we're making progress on part of that with hair and skin rendering and relighting for 3D avatars. Now, I want to point out this technology is still very definitely research, but it shows the really high level of fidelity that avatars are ultimately going to need to reach. Notice that you can see the individual pores on his skin. Now, clothing is another important way to express yourself, and whether you want to simulate your actual garments, say, when you make a call to your grandmother, or maybe you just want to be wearing a stylish blue Oxford shirt like me, you're going to need the ability to simulate clothing.
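For a sense of what cloth simulation involves at its simplest, here is a toy mass-spring sketch: particles are stepped forward with Verlet integration, then neighboring particles are pulled back toward a rest length. This illustrates the general technique only, under made-up parameters; it is not the research system shown in the video.

```typescript
// Toy cloth sketch: a single strand of particles under gravity, integrated with
// Verlet integration and relaxed with distance constraints. Illustrative only.
type Vec3 = [number, number, number];
interface Particle { pos: Vec3; prev: Vec3; pinned: boolean; }

const GRAVITY: Vec3 = [0, -9.8, 0];
const REST_LENGTH = 0.05;   // assumed 5 cm spacing between neighboring particles
const DT = 1 / 90;          // one assumed 90 Hz frame

function step(strand: Particle[]): void {
  // 1) Verlet integration: move each free particle using its implicit velocity.
  for (const p of strand) {
    if (p.pinned) continue;
    const next: Vec3 = [
      2 * p.pos[0] - p.prev[0] + GRAVITY[0] * DT * DT,
      2 * p.pos[1] - p.prev[1] + GRAVITY[1] * DT * DT,
      2 * p.pos[2] - p.prev[2] + GRAVITY[2] * DT * DT,
    ];
    p.prev = p.pos;
    p.pos = next;
  }
  // 2) Constraint relaxation: pull neighboring particles back toward rest length.
  for (let i = 0; i < strand.length - 1; i++) {
    const a = strand[i], b = strand[i + 1];
    const d: Vec3 = [b.pos[0] - a.pos[0], b.pos[1] - a.pos[1], b.pos[2] - a.pos[2]];
    const len = Math.hypot(d[0], d[1], d[2]) || 1e-6;
    const corr = (len - REST_LENGTH) / len / 2;
    if (!a.pinned) a.pos = [a.pos[0] + d[0] * corr, a.pos[1] + d[1] * corr, a.pos[2] + d[2] * corr];
    if (!b.pinned) b.pos = [b.pos[0] - d[0] * corr, b.pos[1] - d[1] * corr, b.pos[2] - d[2] * corr];
  }
}

// Pin the top of a short strand and let the rest hang under gravity.
const strand: Particle[] = Array.from({ length: 5 }, (_, i) => ({
  pos: [0, -i * REST_LENGTH, 0], prev: [0, -i * REST_LENGTH, 0], pinned: i === 0,
}));
for (let frame = 0; frame < 3; frame++) step(strand);
console.log("Lowest particle after 3 frames:", strand[4].pos);
```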

So, here's an early look at hand-cloth interaction. And you can see how the movement of the cloth, when the hand touches and stretches it, is really remarkably accurate. Now, to use your avatar to teleport to a meeting or sit down for a chat in the metaverse, you'll also need realistic virtual spaces to be with people in. So, let's look at that. Contrary to how this looks, this is actually not a video of the physical world. It's a 3D reconstruction of a mock apartment, which, I guess, just became clear when that bowl took off by itself.

Exactly. So, what you can see here is that a researcher is moving around various objects on the right, and then on the left, you can see the high-fidelity, real-time rendering of the space and the moving objects without the researcher. So, what's critical here is that this is all happening in real time. That's what's novel here. And that's what differentiates it from CGI.

So, what we've seen so far is pretty awesome. But we all know what we want. We want the whole package. Full body avatar in a virtual environment.

So you're just there. Last year, we showed our very first full body Codec avatar at Connect. Here to tell you more about our progress since then, is Yaser Sheikh.

Hi, there! My name is Yaser and I direct the Facebook Reality Research lab in Pittsburgh. Well, that's not completely true. What you're seeing is actually a rendering of my pre-recorded codec avatar. Over the past couple of years, we've made quite a bit of progress with Codec Avatars. My avatar can now look left, look right, look up, look down. Make facial expressions...

Here you can see Elaine in VR on the right. On the left, you can see her view of Yaser's avatar and the reconstructed real-world space that it's in, as she moves around the avatar. What you see on the left is an entirely virtual reconstruction. This gives you a sense of a future in which you'll be able to interact with another person's realistic avatar in virtual reality in real time. Now, preventing others from using your avatar will be critical. That's why, even though this is early-stage research, we're already thinking about how we could secure your avatar, whether by tying it to an authenticated account or by verifying identity in some other way.
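As one hedged illustration of what tying an avatar to an authenticated account could mean, the sketch below signs an avatar record with a key the account controls, so other parties can verify it later. This is an assumption-laden example using standard Node.js crypto, not the actual mechanism under research.

```typescript
// Hedged sketch: bind an avatar to an account by signing the avatar's asset hash
// with a key the account controls. Illustrative only; not a real product mechanism.
import { generateKeyPairSync, createHash, sign, verify } from "node:crypto";

// Key pair held by (or issued to) the authenticated account.
const { publicKey, privateKey } = generateKeyPairSync("ed25519");

// Hash of the avatar asset (placeholder bytes here) plus the account it belongs to.
const avatarRecord = JSON.stringify({
  account: "user-1234",
  assetHash: createHash("sha256").update("...avatar bytes...").digest("hex"),
});

// The account signs its avatar record; Ed25519 takes a null digest algorithm here.
const signature = sign(null, Buffer.from(avatarRecord), privateKey);

// Any app rendering the avatar can check the signature against the account's public key.
const isAuthentic = verify(null, Buffer.from(avatarRecord), publicKey, signature);
console.log(isAuthentic ? "Avatar is bound to this account" : "Signature check failed");
```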

All right, so we've talked about what we're going to be able to see. But, of course, we also want to be able to actually do things in the metaverse. And while seeing hands in VR and manipulating virtual objects is a big step forward towards that, we're going to need an even easier and more intuitive way to interact with virtual content when you're on the go wearing AR glasses. And we believe that neural interfaces are going to be an important part of how we interact with AR glasses, and more specifically, EMG input from the muscles on your wrist combined with contextualized AI.

It turns out that we all have unused neuromotor pathways, and with simple and perhaps even imperceptible gestures, sensors will one day be able to translate those neuromotor signals into digital commands that enable you to control your devices. It's pretty wild. So, let's take a look at what EMG input is going to be able to do and where we are with it today.

This video shows some of the ways that we think wrist-based neural interfaces can provide an important input tool for AR glasses. So, at first, that input is just going to be about enabling basic gestures: click, scroll, select. But as the technology evolves, EMG input could potentially unlock full-speed typing, and it could give you subtle, personalized controls that you can use in any situation. This is genuinely unprecedented technology, so let's look at an early prototype that shows how it could be used in AR. This experimental UI shows for the first time how EMG input could one day allow you to send a message in AR with your hand resting comfortably at your side without ever having to look at a screen.

Notice how the researcher used an overt and highly visible click gesture to send the first message. We've made a lot of progress over the last year, and we're now able to do the same thing with the smallest of movements of the hand. EMG enables this by picking up subtle neuromotor commands with remarkable precision.
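As a rough, illustrative sketch of the decoding problem, the example below turns a raw stream of EMG samples into discrete "click" events by thresholding the RMS energy of short windows. Real wrist-based input uses far more sophisticated decoding; the signal, window size, and threshold here are made-up assumptions.

```typescript
// Illustrative only: detect "clicks" in an EMG-like signal by thresholding the
// RMS of short windows. Parameters and the synthetic signal are assumptions.
const WINDOW = 32;           // samples per analysis window
const CLICK_THRESHOLD = 0.4; // normalized activation level that counts as a click

function rms(window: number[]): number {
  const sumSq = window.reduce((acc, x) => acc + x * x, 0);
  return Math.sqrt(sumSq / window.length);
}

function detectClicks(samples: number[]): number[] {
  const clickTimes: number[] = [];
  let armed = true;          // debounce so one burst of activity produces one click
  for (let i = 0; i + WINDOW <= samples.length; i += WINDOW) {
    const level = rms(samples.slice(i, i + WINDOW));
    if (armed && level > CLICK_THRESHOLD) {
      clickTimes.push(i);
      armed = false;
    } else if (level < CLICK_THRESHOLD / 2) {
      armed = true;
    }
  }
  return clickTimes;
}

// Mostly-quiet signal with one brief burst of muscle activity around sample 64.
const stream = Array.from({ length: 256 }, (_, i) => (i >= 64 && i < 96 ? 0.8 : 0.05) * Math.sin(i));
console.log("Clicks detected at samples:", detectClicks(stream));
```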

Basically what you're saying is that you're going to be able to send a text message just by thinking about moving your fingers. That's going to be pretty amazing. But I guess it's just one part of the equation because the other is AI that understands your context and can give you a simple set of choices based on that context. So let's take a look at an AI interaction model that we've built using Project Aria, our initiative to accelerate research into AR glasses and LiveMaps that we talked about last year. Remember the mock apartment we saw earlier? Well, we've indexed every single object in it, including not only location, but also the texture, geometry, and function of each one. We fed all that information into the Project Aria glasses Mingfei is wearing in this next demo, and the reason is to simulate how AR glasses will ultimately be able to access data from a 3D map that will help them identify the real-world objects in the space around you and better understand your context.
