Build immersive apps & experiences with OpenXR & Unity


Hi, my name is Ryan Bartley, and I'm a product manager for XR at Google. Hey, my name is Spencer Quinn, and I'm an engineering manager for XR at Google. Hi, I'm Trisha Becker, and I'm a technical product manager at Unity. Hello, my name is Luke Hopkins, and I'm a developer relations engineer for XR at Google.

It's so great to be able to announce that we're releasing the developer preview of the Android XR SDK. We have a lot to share and talk about today, but first off: who here has built an XR application before? Lots of hands! That's great. If you're porting or building immersive experiences using OpenXR, then this is the talk for you. We'll learn about Android XR as an extension of the Android ecosystem. Next, we'll talk about how Android XR is not only adopting industry standards like OpenXR, but also expanding their capabilities with new vendor extensions. We'll also share how we're partnering with Unity to bring Android XR to Unity developers. And we'll end the presentation with a Q&A, which gives you a quick chance to ask those burning questions you may have. We can't wait to show you more. With that, I'd like to pass it over to Ryan.

Thanks, Luke. Android XR is the newest Android platform, alongside our expansive device ecosystem of mobile, wearables, TV, and auto. Built on the foundations of Android mobile and large-screen development, you can build experiences that span devices and push boundaries when powered by Android XR. Bring your apps to the next generation of devices by going all in on the expansive Android ecosystem and the global reach of the Play Store's trusted, secure infrastructure. Android XR offers simplified development by using open standards such as OpenXR and familiar frameworks like Unity and Android Studio, standards and frameworks that let you unlock your creative potential and bring your visions to life across the XR landscape.

In the keynote session we announced the Android XR SDK. We've designed the SDK using
familiar Android APIs, tools, and open standards, with the goal of reducing the learning curve and leveraging your existing skill set to the fullest. This SDK gives you multiple ways to build an experience for Android XR: build native applications using the new Jetpack XR SDK, build immersive experiences directly with OpenXR, or use the Android XR SDK for Unity. If you want to build apps using the Jetpack XR SDK, we recommend tuning in to our session called "Develop and adapt apps for Android XR."

You've heard us mention open standards today, but I want to share more about what that means, why we care about it, and why I'm excited that we can finally talk about everything we've been working on in this space. Google has a long history of supporting open standards, and we're actively helping to push the XR industry forward through standards such as OpenXR, Vulkan, and WebXR. These standards provide the best compatibility for your existing applications and ease the burdens and cost of porting. Whether you're porting an existing app or creating a new one, we want to make the Android XR platform a natural fit for all your XR projects.

We're really excited to announce that Android XR is fully conformant with the OpenXR 1.1 specification. In addition, we support a long list of highly used third-party vendor extensions. Using these extensions offers you a familiar experience when developing for XR. If you've already built an XR application with support for the common extensions used in the XR space today, you can port your application to Android XR with ease.

Here's a short list of some of the vendor (EXT) and Khronos (KHR) extensions that are supported on Android XR. We have support for tracking and interaction extensions such as hand tracking, hand interaction, hand aim, and palm pose, so you can create experiences that let users poke, pinch, interact, and use gestures with commonly used XR APIs. There's also support for eye gaze interaction, which can be used to help give input a more natural
feel for your users. For performance, there's support for advanced extensions like eye-tracked foveation, which allows an app to render high-resolution content only at the eye's focal point, and the space warp extension, which uses velocity vectors and depth texture information to synthesize frames between rendered frames. This reduces rendering load and increases the overall compute budget.

This is so great. As an XR developer, knowing that the OpenXR specification and common vendor extensions are a core part of the Android XR platform means a lot of the knowledge I have about building experiences with OpenXR applies to Android XR. And even better, Android XR is contributing its own Android vendor extensions. Here are some examples: face tracking, which brings facial expressions to your avatars or skinned meshes, and passthrough objects, which allow physical objects in the real world to come into your virtual scene. For example, you can bring your physical keyboard, laptop, and other tracked objects into your virtual scene. And one of Spencer's favorite new extensions: hand mesh.

Yeah, this one is really super useful. The hand mesh extension provides a machine-learning-derived representation of the user's hands as a mesh. This gives you a model that is physically very accurate to the user's actual hands, as an alternative to extensions that use bind poses and blend weights. We see here a short code snippet of how to use the extension: this creates a hand mesh tracker, which can then be queried for mesh information for both the right and left hands, complete with normals and UV coordinates.

With Android XR's multimodal input system, you can design experiences that let users engage naturally and interact with apps however they choose. Users can interact using their hands, eyes, and voice, and of course they can use peripherals like controllers and game controllers. Android XR uses hands as a primary way to interact with the platform, which means users do not need a
controller. As mentioned earlier, we support a wide range of hand tracking extensions, which means that if your app already supports this input method and these extensions, your work's already done.

Some of these capabilities require runtime permissions. That's because scene understanding, eye tracking, face tracking, and hand tracking provide access to data that may be more sensitive to the user. In keeping with Android's security and privacy policies, Android XR has permissions for each of these features. This is so important for giving users control, so there's transparency about what permissions are required for using your experiences. For more information on the list of extensions that require permissions, check out the Android XR developer website. Speaking of which, we recommend exploring our developer site for more information. We have content on designing and planning for Android XR, as well as an overview of OpenXR extensions, getting started guides, and a comprehensive list of Unity packages to use.

And with that, we'd like to start talking about Unity. We've partnered with Unity to bring the Unity engine to Android XR. You can now harness the power of Android XR for Unity to create immersive experiences that captivate and inspire users. We've loved every moment of our partnership with Unity, and with this I'd like to pass it over to Trisha to talk about OpenXR support and the features found in Unity 6.

Thanks, Luke. What a day! I am so excited to be here and talk about Unity's day-one support for Android XR. Whether you're porting an app or building something new, the Android XR platform lets you create immersive experiences using the Unity tools that you know and love, like AR Foundation, the XR Interaction Toolkit, OpenXR, XR Hands, and, starting today, OpenXR: Android XR. With these tools you'll have everything that you need to support VR and MR experiences, including foundational features like occlusion and persistent anchors that seamlessly blend virtual elements
with the real world. The XR Interaction Toolkit lets you use natural hand poses and gestures to interact directly with the virtual space, and with XR Hands you can go beyond basic gestures too: our custom gesture detection supports actions like thumbs up, thumbs down, pointing, and more, so you can add extra personality to your interactions. On top of that, Android XR introduces additional features like light estimation and performance metrics, giving you platform-specific tools to optimize and enhance your applications. To help you get started, we have MR and VR templates available today through the Unity Hub, where you can try out many of the features that we just talked about, with more updates on the way.

Now let's dive deeper into the new features that Unity 6 brings to the XR ecosystem, showcasing how Unity is evolving to meet the needs of XR developers. Unity 6 introduces some game-changing features that are specifically tailored to XR development, from enhanced rendering capabilities to optimized performance on XR devices. One of the most exciting advancements for XR developers is eye-gaze foveated rendering. This feature leverages eye tracking technology to dynamically adjust rendering focus; it optimizes for both visual clarity and GPU performance for smoother, more immersive XR experiences. Another rendering feature we're super excited about is composition layers. Here we've got a side-by-side comparison: on the left, a regular texture; on the right, a composition layer. As you can see, the use of composition layers significantly reduces artifacts, giving us clearer text, sharper outlines, and an overall better appearance. In Unity 6 we also introduced Space Warp support in URP, a feature that reduces your application's rendering workload, freeing up resources for higher-quality graphics and improved performance. You gain additional compute, meaning you can accomplish even more with your render budget.

We've got even more coming soon. In Unity 6.1 we'll roll out new features for Android XR
developers, from improvements in developer workflows to advanced rendering options that are designed to meet the demands of next-gen XR development. In 6.1 you'll see templates for creating VR multiplayer experiences, and for mixed reality we're adding a new tabletop game example that combines the best of our XR support and end-to-end multiplayer solutions. We're also bringing our VR multiplayer template to Android XR; it's packed with everything that your application might require, like networked interactions, voice chat, lobbies, and more.

Now, we know that building for XR devices can be complex, and manual configuration leaves a lot of room for error. That's why in Unity 6.1 we're adding a dedicated build profile for Android XR, making it significantly easier to configure projects with optimized settings for Android XR development. You can also create your own build profiles: the new build profile workflow allows you to create a profile that you can customize based on your project's needs, and since build profiles are project assets, you can save, manage, and share them with your team, ensuring a consistent build experience across the board.

To wrap up, we've covered how Unity 6 works hand in hand with Android XR, providing everything that you need to get started on this exciting new platform. We hope you'll join us on this journey. I can't wait to see what you all build, and I'll hand things back to the team at Google. [Applause]

Thank you, Trisha, and thanks to Unity for all this amazing work. Unity's day-one support for Android XR and the exciting new features in Unity 6 will make sure developers can get started developing today. We recommend upgrading to Unity 6 and URP, along with Vulkan as your rendering API, as this will put your games and applications on an optimal path for the best experience on Android XR. In our next OpenXR session you will hear from partners that have already started porting their experiences to Android XR, and you will hear firsthand about their developer journeys.
Now, in addition to the great packages from Unity, Google has been working on some features in our own package, called Android XR Extensions for Unity. In this package you will find early-access features and APIs that are generally backed by our new Android OpenXR extensions. Our strategy for this package is that, over time, popular features will be migrated to Unity's OpenXR: Android XR package, to reduce the number of required packages and enhance API longevity.

Today, some of the features you'll find in the Android XR Extensions for Unity package include the following: hand mesh, for a system-provided hand mesh that is a highly physically accurate representation of the user's hands, complete with UV coordinates and normals; object tracking, for objects such as a mouse and keyboard, to make it easier for users to locate them while in fully immersive mode; passthrough layer, which allows an application to create a polygon passthrough composition layer cutout of an immersive scene (useful for creating areas in your experience that link back to reality); and face tracking, for a blend-shape representation of a user's face that can be mapped to the user's facial movements and is useful for avatars. We have many more exciting features coming to this package, so be sure to stay tuned for updates. I'll hand it off to Luke now to talk a little bit more about his experience implementing the face tracking extension.

Thank you very much, Spencer. I cannot tell you how easy it was to use face tracking in my project. I'd like to show a quick demonstration of how face tracking works within Unity, using the samples that we have provided in the Android XR Unity package. This sample comes with a mesh that is exported with blend shapes, meaning the skinned mesh renderer component has all the blend shapes correctly configured for you. This gives you everything you need in order to modify the head's mesh. Next, we'll get
the information from the XR face tracking manager to send to the skinned mesh renderer component. Now, in order to use the XR face tracking manager, we first must declare the permission for face tracking, and then we'll request it at runtime. Again, working with permissions in Android XR is exactly the same as working with permissions on any other Android form factor. We can check in our Update function whether the user has granted permission to use the feature, and after confirming that we have permission, we can then check whether the extension is enabled.

Great, now we know the XR face tracking manager is ready, and we can start reading values. For this example, we're going to loop through all the face parameters, from the jaw to the lips, all the way to the user's eyebrows. This API provides a weighted value, which maps to the blend shape weight method, and this weight defines how the skinned mesh will be modified in that particular area. For instance, if I have a fully raised eyebrow, this will result in a weighted value of one. This updates the mesh and morphs the eyebrow to its maximum position. Remember, your mesh doesn't have to be a face: for this example, weighted values could just as well drive cushions on a couch. Have fun playing with this feature; I know I did.

So, we've talked about space warp and eye-tracked foveation, but there are many other performance features also available. Some of these include Vulkan subsampling for enhanced GPU performance, display refresh rate to request the optimal frame rate for your application, and optimized buffer discards to reduce unnecessary cached information. These tools can be used to tune your application to get the highest frame rate and quality. The source code for our extensions, as well as samples showing how to implement them, is currently available on GitHub. We encourage you to explore the code. You can also check out our API references and getting started documentation on our developer website.

We have also been working hard to create an amazing
set of samples to inspire and educate you about Android XR. These simple but playful mixed reality samples showcase the power of Android XR. Now we would like to show you a video demonstrating all these features. [Music]

It's so great to see so many of these OpenXR extensions working in Unity. A few of the examples use different extensions. For instance, we showcased plane detection, so that virtual objects can interact with the physical space; gaze and pinch as a way to interact with virtual elements by looking at them; face tracking, so that you can capture facial expressions and objects can talk and make faces; and a tape measure, which measures real-world distances using depth estimation. Next year we will be publishing these amazing experiences for you all to see and explore.

Once your app is ready to share with users, you can use the Google Play developer console to distribute it to users everywhere. For experiences that are only suitable for Android XR devices, Google Play is launching a new dedicated release track for Android XR. This enables you to deliver specific builds to Android XR devices without needing to do things like publish your app under a separate package name.

As of today, you can access all the developer documentation, which covers everything we've discussed, along with instructions on how to get started with Unity 6. Early next year we'll be hosting boot camps around the world so you can get hands-on access to pre-release hardware, ask more questions, and have more technical sessions like this with the Android XR team. Apply to our developer boot camps and get hands-on with the latest technology in 2025; the spots are very limited, so start now. We can't wait to see the immersive experiences you build. Thank you very much for your time, and we can't wait to see you again soon. [Applause]

2024-12-22 16:07
