The Nano Summit: Immersive Data Technology


On to the next session: immersive data technology. My name is Brian Anthony. I'm the director of the Immersion Lab and the associate director of MIT.nano.

Immersive data technology — I think of us as sitting at the top of the technology stack. We think about and facilitate work that has the human immersed between hardware, data, and experience: the hardware meaning compute and sensors; data meaning data and the information we get out of it; and the human experience — which could be the person who needs to work inside MIT.nano, the person who needs to learn how to use an instrument, or the person who becomes a specimen as we observe them interacting with technology.

So what is the Immersion Lab? It's the third pillar within MIT.nano. This is the Immersion Lab, the space. We 3D-scanned it, and you can see a bunch of cameras inside where a person can stand and we can do motion capture. And this is the Immersion Lab in use: a flamenco dancer with markers on his body, while the cameras around the lab capture his motion in three-dimensional space. We have wireless physiological sensors monitoring his heart rate and his respiration rate, and accelerometers in the floor monitoring how his feet strike it — we'll see the data here in a second. You can see the camera crew watching him, the 3D points being tracked, and all the wireless signals being transmitted — his muscle activation, his heart rate, his respiration rate — and then the floor signals, so that we can understand how he interacts with the floor.

That's one example of being immersed between hardware, data, and experience. We're trying to understand what it means to be a professional, expert dancer — what we can learn by studying the motion, the kinematics, the way that the body responds.
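(A practical aside: a capture like this only becomes analyzable once the streams — motion capture at over a hundred frames per second, heart rate at roughly one sample per second, floor accelerometers far faster — are aligned on a common clock. Below is a minimal sketch of that alignment step in Python; all of the rates and signal names are illustrative assumptions, not the lab's actual pipeline.)

```python
import numpy as np

def resample_to_clock(timestamps, values, clock):
    """Linearly interpolate an irregularly sampled stream onto a shared clock."""
    return np.interp(clock, timestamps, values)

# Illustrative, assumed rates: mocap 120 Hz, heart rate 1 Hz, floor accel 1 kHz.
duration = 10.0                               # seconds of capture
clock = np.arange(0.0, duration, 1 / 120)     # align everything to the mocap clock

hr_t = np.arange(0.0, duration, 1.0)          # heart-rate sample times
hr = 60 + 5 * np.random.randn(hr_t.size)      # stand-in heart-rate values (bpm)

acc_t = np.arange(0.0, duration, 1 / 1000)    # floor-accelerometer sample times
acc = np.random.randn(acc_t.size)             # stand-in acceleration values

# One row per mocap frame: (time, heart rate, floor acceleration)
fused = np.column_stack([clock,
                         resample_to_clock(hr_t, hr, clock),
                         resample_to_clock(acc_t, acc, clock)])
print(fused.shape)  # (1200, 3)
```

Linear interpolation onto one stream's clock is the simplest defensible choice; a real pipeline would also have to handle dropouts and clock drift.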
The Immersion Lab supports multiple communities. We support the human experience, both quantified and subjective, and the intersection with art — that can be dance; it can be asking how we use immersive technologies like AR and VR, motion tracking, or head-mounted displays to rapidly visualize how you're interacting with your physical or virtual objects. It can be rapidly spinning up high-performance compute, so that I'm not just doing machine learning on that wonderful cryo-electron-microscope data I'm imaging, but can manipulate it rapidly, in real time — not waiting several minutes for a computer to give me an answer, but actively working with my data to understand it and manipulate it. So again: the immersion between hardware, data, and experience.

The tools we have: the ability to do motion capture; wireless physiological monitoring, wearable or embedded in the environment — cameras on the walls, radar sensors in the ceiling, accelerometers in the floor; the language and tools of gaming, AR, and VR; compute and GPUs; virtual environments. And we take the approach of the artist, the approach of the scientist and engineer, and the approach of the clinician, and bring these together to form little special-interest groups with industry communities interested in, say, healthcare, sports, and digital therapeutics — asking what, collectively, the market opportunity is for these types of immersive technologies in that space. Or within design and manufacturing: collaborative design, or how we use these tools on the factory floor to better monitor the person, or to better interact with the technology we have — to rapidly open the manual and figure out how to debug. Then there are the tools and technologies themselves — the sensors that let us monitor the world, both the human and the environment, the built and the natural world — and how we use these tools for education and research, and what it means to quantify, and to improve, the artist.

So today, to celebrate the users — the community of people who think about immersive technologies as sitting at the intersection of hardware, data, and experience — we'll have three speakers. JJ will speak first, about hardware; then Deirdre Logan will speak about applications in medicine; and we'll close with the opera for the day, talking about opera with Jay Scheib — again celebrating the various communities we support within the Immersion Lab, and how we take everything from the novel sensors and electronics being developed and researched inside MIT.nano up to the human scale and interact with them in ways that are changing how we think about applications and uses. With that, I will first invite JJ to come up; we'll go through the series of talks and take some questions at the end. JJ, take it away.

[Applause]

Thank you for the introduction. It's certainly my privilege to address this audience and discuss some of our work in the area of hardware, especially for immersive technologies. Because we're talking about emerging technologies, let me start with state-of-the-art AR/VR modules. What I'm showing you here is a Meta Quest Pro, which is of course a very powerful AR/VR headset. What a lot of people probably have not realized is that this module has a ton of sensors integrated into it: if you count, there are a total of thirteen imaging-based monochromatic sensors integrated into the headset and the handsets, performing a wide array of functions ranging from head tracking to depth sensing, eye tracking, and hand tracking.

One thing I should emphasize is that these are monochromatic sensors, meaning they have an active illumination source — an LED or a laser — and a single-color camera then captures the reflected image and extracts the desired information. The challenge, however, is that with this kind of sensor, the optics can currently be one of the major bottlenecks. What I'm showing on the left-hand side is a dissection of one of these sensors as used in consumer electronic devices, and you can clearly see that the optics require multiple stacked refractive elements just to suppress optical aberrations. This of course increases the complexity of the whole system, increases its volume, and adds to the overall cost. So what we're trying to do is find ways to break this kind of tradeoff — to enhance the performance while also giving a more compact footprint at lower cost.
The technology solution we have come up with is called optical metasurfaces. To introduce how it works, let's step back a little and look at how a conventional lens operates. With a conventional lens, light comes in, and the curved surface geometry bends the propagation direction of the light to focus it. Another way to look at it is as a device that introduces a spatially dependent optical phase delay: in the center, the light has to travel through a thicker slab of material with a high refractive index, which introduces more phase delay. In the end, you bend the wavefront of the light, which creates converging rays that meet at the focal spot.

Metasurfaces operate under exactly the same principle. What we have is a set of microscopic structures fabricated on a flat, transparent substrate, and by engineering the size and shape of these subwavelength-scale structures, you can impart a phase delay that depends on location. If the features are much smaller than the wavelength, the incident light cannot tell the difference between this staircase-like phase profile and a conventional continuous phase profile, and what you end up with is a metasurface lens, or metalens.

The first challenge we wanted to tackle: can we use a metalens to simplify optical design — to reduce many optical elements to maybe one single optical element? One case in point is the fisheye lens. For those of you familiar with photography, it's the kind of lens that lets you capture very wide-angle images, right up to 180 degrees. To get that, however, you typically need to stack somewhere between ten and twelve individual lens elements, which makes the lens complicated and difficult to assemble. A few years back, we came up with a way to transform a flat piece of thin glass into a fisheye lens that can capture 180-degree panoramic images with very high optical quality.

What I'm showing on the left is the architecture of the lens. We have a piece of transparent substrate; on the front side there is an aperture, and light coming in at different angles refracts at the front surface and travels to the back side, where we engrave the microstructures — the metasurface. We have shown that if you design the metasurface properly, all of the light gets properly focused onto a flat image plane with very high quality. On the right is the first prototype we demonstrated in the lab, with measured focal spot profiles: at zero degrees you see a nice sharp focal spot, and all the way out to 85 degrees of incidence — where the light comes in almost parallel to the surface of the lens — you still get a very nice, sharp focal spot.

The takeaway message is that we now have a technology that offers an ultra-wide field of view and very high resolution. We have proven experimentally that it can be diffraction limited, which is essentially the theoretical limit for this kind of lens, and you can manufacture it at scale and at low cost using standard CMOS microfabrication technologies.
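(For readers who want the "spatially dependent phase delay" made concrete: an ideal flat lens must delay light at each radius r so that every path arrives at the focus in phase, which gives the standard hyperbolic profile φ(r) = (2π/λ)(f − √(r² + f²)), wrapped into [0, 2π). Here is a small sketch; the wavelength and focal length are made-up illustration values, not the parameters of the device described in the talk.)

```python
import numpy as np

def metalens_phase(r, wavelength, focal_length):
    """Phase (radians) a flat lens must impart at radius r so that all paths
    arrive at the focal point in phase, wrapped into [0, 2*pi)."""
    phi = (2 * np.pi / wavelength) * (focal_length - np.sqrt(r**2 + focal_length**2))
    return np.mod(phi, 2 * np.pi)

# Illustrative values only (not the device in the talk): 940 nm light,
# 2 mm focal length, sampled out to a 1 mm lens radius.
r = np.linspace(0.0, 1e-3, 5)
print(metalens_phase(r, 940e-9, 2e-3))
# A designer then picks a sub-wavelength pillar geometry at each location
# whose measured phase delay matches this target profile.
```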
Now, just to show you that the last statement is not just a claim on paper: over the past few years we have worked very hard with our foundry partners to realize scalable fabrication of these metalenses on big wafers. What I'm showing here is a picture of an 8-inch glass wafer with a whole array of these metasurfaces fabricated on it — you can see the very nice quality — using essentially silicon-on-glass technology.

If you want to look at performance, we can integrate one of these metalenses on top of a commercial off-the-shelf CMOS image sensor to form, essentially, a fisheye metasurface camera. If you position the camera to image this semicircular target, this is what you get. As you can see from the image, we achieve very close to a 180-degree field of view — about 170 degrees. You may notice that the image shows some distortion at the edge, which is characteristic of any kind of fisheye lens: you are trying to map an essentially hemispherical field of view onto a flat surface, so inevitably, at the edge of the field, you have to introduce some distortion. The key point, however, is that despite the distortion we get very nice, sharp images both at the center of the field and at the edge — resolution is not compromised by the distortion — and, as I said, we were able to achieve diffraction-limited performance across the entire near-180-degree field of view.

So what can we do with this kind of wide-field-of-view metasurface lens technology? I'll give you a few examples. The first is that we can use it to create a stereo camera: by positioning two of these lenses side by side, through triangulation we can get not only the shape of the surrounding objects but also their depths, or distances. In this demonstration we placed a little pumpkin in the scene — a very timely experiment, given that Halloween is right around the corner — and through our algorithm we capture not only images of the pumpkin but also how far away it is from the camera.
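(The depth recovery here rests on ordinary stereo triangulation: for a rectified pair of cameras separated by a baseline B, with focal length f expressed in pixels, a feature that shifts by d pixels between the two images sits at depth Z = fB/d. A toy sketch with assumed numbers follows — and note that real fisheye images would first need their distortion model applied before this formula holds.)

```python
def stereo_depth(disparity_px, baseline_m, focal_px):
    """Depth from disparity for a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: mismatched or infinitely far feature")
    return focal_px * baseline_m / disparity_px

# Assumed, illustrative numbers: 4 cm baseline, 600 px focal length; a feature
# shifted 12 pixels between the two (already undistorted) images:
print(stereo_depth(12, 0.04, 600))  # -> 2.0 (meters)
```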
Another example I want to mention is a zoom lens. This is useful because, even though people want wide-view lenses to see the overall scene, sometimes you also want to zoom in and look at a specific section at much higher resolution. Conventionally this is done with mechanical moving parts: a lens element translates with respect to the other elements to produce the zoom, which complicates the overall system and decreases its robustness. In our case, we can use metasurface technology to realize an optical zoom lens without any mechanical moving parts. We take advantage of the fact that metasurfaces can be designed to be sensitive to the polarization of the incident light: with two orthogonal polarizations, we can encode two different optical designs into the same metasurface, realizing two different optical zoom settings. The specific design is illustrated here: in one polarization, the first surface diverges the incoming light and the second surface focuses it; in the second polarization, the two surfaces essentially swap roles, giving you much higher resolution.

From the experimental results you can clearly see that we were able to realize a 10x optical zoom, consistent with the design: this is the wide-angle view giving you the overall image, and here you zoom in and see much more of the detail at much higher resolution. By integrating a switchable, chip-scale polarizer directly onto the image sensor, we can capture both images with one image sensor, without introducing any mechanically moving parts.

The last thing I want to mention is a different technique to enable 3D depth sensing. Previously I talked about stereoscopy, which is nice — you can obtain the depths of objects — but it always requires two different cameras juxtaposed with each other. That brings an issue: for accurate triangulation you need some spacing between the two optical apertures, so there is a tradeoff between accuracy and the footprint of the overall system. It becomes desirable to use one single lens — one single optical aperture — to capture not just the scene but also the depth information. We were able to do that. I won't go into much technical detail, but the idea is that you can engineer the focal spot of the lens so that its shape depends not only on the polarization but also on the depth, or distance, of the object. More specifically, look at these two examples, which correspond to the two orthogonal polarizations imaged with our metalens. You may notice that the MIT pattern appears blurred — that's intentional, because the focal spot in this case is not a single spot but two spots that rotate around each other, and the rotation angle depends on the distance of the object. By combining the information from both polarizations, and with some post-processing, we can get the distance of different objects with very high accuracy.
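(In other words, the engineered point-spread function turns depth estimation into angle estimation: calibrate how the two-spot rotation angle varies with distance, then invert the measured angle. The sketch below assumes, purely for illustration, a calibration of the form θ = θ₀ + k/Z; the real mapping would come from the metasurface design and a calibration measurement.)

```python
import numpy as np

def psf_rotation_angle(spot_a, spot_b):
    """Angle (radians) of the line joining the two focal-spot centroids."""
    dx = spot_b[0] - spot_a[0]
    dy = spot_b[1] - spot_a[1]
    return np.arctan2(dy, dx)

def depth_from_angle(theta, k, theta0):
    """Invert an assumed calibration theta = theta0 + k / depth."""
    return k / (theta - theta0)

# Invented calibration constants and spot positions, for illustration only:
k, theta0 = 0.5, 0.1                       # radian-meters, radians
a, b = (100.0, 120.0), (112.0, 128.0)      # spot centroids in pixels
theta = psf_rotation_angle(a, b)
print(depth_from_angle(theta, k, theta0))  # ~1.0 (meters)
```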
So, simply put, what I hope I have convinced you of is that this metasurface technology can be very versatile: it not only allows you to replace conventional multi-element refractive optics with a single flat piece in a robust system, but at the same time it enables advanced functionality such as optical zoom and 3D depth sensing. With that, I'm going to conclude my talk. Of course, I want to acknowledge all my hard-working postdocs and students, as well as the employees at the startup company we work with to realize these devices. In particular I want to acknowledge Dr. Tian Gu, our optical designer, who came up with all these ingenious structures, and Dr. Mikhail Shalaginov, Dr. Fan Yang, and Dr. Hing — the real heroes who work in the lab day and night to realize all of these devices. I also want to thank MIT.nano, because a lot of my students and postdocs basically live in MIT.nano, especially the packaging facility — that's the only way to get all this nice work done. With that, I'll conclude my talk, and I'll be happy to take questions. Thank you.

[Applause]

Otherwise we'd hold the questions to the end, but let's take one question now, because I know JJ has to run after this. Okay, there's a question over there.

[Audience question, partially inaudible: "... block the light in ..."] Could you repeat that, sorry? So, like spectrally selective reception? Well, yeah, that's actually possible. You probably don't even need to use a metalens: you can use different kinds of microstructured optical filters — multilayer filters — to realize some of that, and that's actually quite critical if you're trying to design, for example, a combiner. So that's definitely a functionality I think we can realize using metasurfaces. Okay, thank you very much.

So next up, now we're going to use the technology, and we'll see some of the novel things we can do for our youngest patients.

Hi. I feel like I am coming at you from a completely different universe, and I'm only hoping that you will understand what I have to say better than I have understood what anyone else has been talking about this afternoon — we'll start with that and see what we can do. I am a pediatric psychologist. I work with children at Boston Children's Hospital, specifically in our pediatric pain medicine program, so most of my time is with children with chronic pain problems — a whole range of different chronic pain problems that are disabling and that keep these kids from participating in their normal lives. We have a whole range of treatment approaches, including some very intensive programs where kids are with us for weeks at a time, working with a number of disciplines, and that is where this collaboration has come in.

What I hope to talk to you about briefly is how we have started to integrate virtual reality into pediatric pain management. I'll talk a bit about a collaboration we've started across a number of institutions, as well as my collaboration here with the immersion team, and then about the actual vision project that brought me here. But my major point is that this is not work one can do alone; the ability to collaborate and to access resources such as the immersion team here has been absolutely instrumental in bringing these kinds of treatments to the kids we work with.

A few years ago, a group of experts was assembled to take stock of the field of pediatric pain management and think about where it needed to go, and they came up with four goals: making pain understood, making pain visible, making pain matter, and making pain better. What's striking is all the different ways in which technology factors into this vision of pediatric pain management going forward. You can see all sorts of ways that data and technology come into play, particularly this idea of digital therapeutics — using things such as virtual and augmented reality applications in the treatment of these problems. And this is really an idea whose time has come: there is now a growing body of evidence showing how effective virtual reality can be in the treatment of a variety of pain issues.
We know it's very effective in acute pain settings — kids who are getting procedures, distressing needle sticks, things like that — where it can offer a lot of distraction. Increasingly, we're understanding other ways it can be used in more chronic pain situations, such as in physical therapy, to help kids engage in movements that are otherwise really challenging and to address the fear piece of that. In addition, these products have become increasingly affordable: when we started, this was all very expensive and very complicated; it's now affordable and portable, and we can bring headsets right into almost any clinic setting. There is now a proliferation of healthcare products and companies; recently there was the first FDA-approved VR treatment for chronic pain — that's for adults, but we're making progress. When the pandemic hit, VR gave us a way to bring some interventions into the home setting: these were things we could send some patients home with, extending treatment that way. And, as I said, there is a growing evidence base showing both the efficacy of these approaches and the mechanisms through which VR helps with these problems.

So then I had this idea — which I'll come back to — about what I wanted to do, and I set out to ask for some funding. My funders said: first you need to survey the landscape, so go bring together everyone who's doing this kind of work and figure out what's been done and how you can all work together. This was a philanthropic funding organization that really liked to foster these kinds of cooperative efforts. So that's what I did — I set out to bring this team together. We met in January of 2020; for the longest time, it was the last time I had been anywhere, for years. We brought together a number of different children's hospitals, along with some representation from software engineers; the philanthropic organization, helping us think through how to fund these things; another group whose mission is to use VR across a whole range of pediatric medical settings; and some folks from outside the pain world, who helped us think through what needed to happen to bring this technology into children's hospitals.

We also came up with a logo — so it was a good two-day meeting, because we got a great, cool title and a logo out of hanging out for two days. The turtle is emblematic of how things move in the medical setting, where there is a lot of bureaucracy, and the hope is that the VR will get us there faster. We came up with a number of goals that we as a group wanted to work on — very modest: advance all of our own projects through this network, establish best practices, identify the gaps, and develop a repository of resources. We really wanted our high-end institutions to be able to make these products accessible to smaller hospitals, to get them into the hands of the kids who need them, and we wanted to disseminate the knowledge. If you're interested, we did generate a paper out of our consortium that you can read about what we hope to do in this field. But that brings me to my project.
Here's what I wanted to do. As I said, I work with kids with chronic pain, particularly in this intensive program where kids are with us for several weeks, really working on the goal of returning to function. The biggest area of function for kids is going to school. We have a lot of kids with chronic pain who just flat-out stop going to school, or who miss a ton of school because their pain is really getting in the way, or who are there but not really mentally engaged. We have them in this clinic setting and we work on going back to school, but it's a little hard, because they're in a hospital and it's not quite the same thing — the extent to which we can really practice these things has its limits.

So what I wanted to do was use virtual reality to create a simulated school situation that kids could engage with in the clinic setting. This would be a way to address both the physical challenges of school — for kids who have trouble navigating hallways, or kids with headaches for whom the bright lights are really painful, all of those physical pieces — and the psychological pieces: the fear of re-entering that situation, the challenge of trying to talk to kids who are asking questions about where they've been, of trying to interact with teachers. All of these are huge stresses that create real barriers to school re-entry. I wanted to address this in a realistic way, and for it to be something that could be individualized, so that every kid could really tackle their own challenges in this kind of simulation. I also wanted it to be something where we could be monitoring their physiologic responses and responding to them: if a kid's heart rate is really up because they're engaged in a really stressful piece of this, there can be some prompting to use the coping strategies they're learning, and they can employ them right there in the simulation — and ultimately, hopefully, we can coach patients within the simulation to use the skills that they will bring back to school with them.
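(That closed loop — watch a physiological stream, prompt a coping skill when arousal spikes — can be reduced to a simple trigger rule. Here is a hypothetical sketch; the rolling-baseline logic, the threshold, and the prompt text are all invented for illustration, not the project's actual design.)

```python
from collections import deque

class CopingPromptTrigger:
    """Fire a coping prompt when heart rate runs well above its rolling baseline."""

    def __init__(self, window=60, elevation_bpm=20):
        self.recent = deque(maxlen=window)   # rolling window of recent samples
        self.elevation_bpm = elevation_bpm   # how far above baseline counts as a spike

    def update(self, heart_rate_bpm):
        self.recent.append(heart_rate_bpm)
        baseline = sum(self.recent) / len(self.recent)
        if heart_rate_bpm - baseline > self.elevation_bpm:
            return "Pause. Try the paced-breathing exercise from your session."
        return None

trigger = CopingPromptTrigger()
for hr in [72, 74, 75, 73, 76, 104]:        # stand-in samples; the last one spikes
    prompt = trigger.update(hr)
    if prompt:
        print(prompt)                        # would be shown in the headset
```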
Again, there needed to be a lot of collaboration, because I have a lot of other things to do, I can't do all of this, and I'm also not very technologically savvy. So the collaboration opportunities have been huge. I mentioned Invincikids before — a group of anesthesiologists, actually, who are fairly tech-savvy and who have been working to bring VR into a range of pediatric settings, a lot of it around surgical settings, so post-operative and pre-operative kinds of applications; they had a good sense of the landscape and what was possible in the medical setting. And then there's the collaboration here with the immersion team — they've been my go-to people for really realizing this vision, creating the technology and testing it out. I will say this is very much a work in progress: I'm going to show you the beginnings of it, but hopefully it's a collaboration with a good ways still to go before we get to the final product. It's also been really helpful to have those other hospitals where people are doing similar kinds of things, so we can all bounce our ideas off each other.

The other piece of collaboration is that when you develop something like this, you have your great idea, but you really need to make sure the patients for whom you're developing it are bought into it and would find it useful. So our first step was to go to patients and talk to them, first and foremost, about what school was like and what the challenges were — making sure we were designing something that really hit the challenges they're experiencing — and also to float the idea by them and see what they thought. We did that through a series of focus groups; it was COVID, so they were all virtual. We threw out questions around what's challenging in school, what was helpful in the program — these were kids who were either in the program when we did it or had finished it — and what else could be helpful. We ended up with four groups of kids from the program. They represent our typical chronic pain demographic, in that they're mid-adolescent, predominantly female, with a range of different pain presentations.

From the information they gave us, we came up with these initial scenarios, most of which we have developed; a few more are still in the works. The concepts: entering the school; navigating the hallways, having to get to their locker, get materials, and get to class on time while engaging with peers — that time-pressure piece of it; then getting into classrooms and really trying to do work, sit at a desk, tolerate the physical environment, talk to teachers, talk to classmates, get through the day. And importantly, the kids also said: we really need the bigger settings, too — the stressful cafeteria with all the pushing and shoving, and the gym where we have to run around and do things; these are really stressful parts of the day. So we're still working on creating some of those other parts of the school day, but these are all scenarios we plan to expose kids to through this simulation.

Here's some of the work that Brian and Talis and their team have done so far on the scenarios the kids experience: entering the school; then the hallway, navigating that piece; then getting into the classroom, where they have to find their desk and settle in to work, and where we can integrate the various stressful situations. We've done some piloting of this work at this point, with nine patients in our intensive rehab program and, importantly, with a number of clinicians as well, because that's the other audience for this — we also want clinicians to feel this is a tool they want to use. Physical therapists, occupational therapists, and the social work and psychology team in our rehab setting have all been involved in giving us feedback.

So far we've been somewhat wired-in in the application — as you can see, that looks really complicated and limiting — but the end point will not involve all of that; this is really, as I said, a process we're going through. The kids went through it, we asked them a lot of open-ended questions, and we got a bit of more structured data as well. You can see there was a lot of positive feedback: kids found it a realistic situation, fairly easy to use, and felt it addressed goals they can't otherwise really work on in this setting; and on our questionnaire measure, their scores suggest a fairly immersive experience — they really could buy into the experience and feel like they were in a school setting.

Both clinicians and patients also gave us some areas for improvement. So far, the movement is a little wonky to them: we've been using teleportation, and they really want to be able to actually move — which would help with some of the physical therapy goals as well, so that's where we're hoping to head with it.
They wanted it to be more interactive, meaning really interacting with the other avatars they're seeing. They wanted more tasks and scenarios, which is of course part of what we're hoping — that it can be varied by level of challenge, so that once a kid makes some progress, that hallway has fifty kids in it, whereas when they're just starting out maybe there are five. That's downstream. They want to be able to increase the stress of the demand — same thing; we want this to be a progressive kind of experience. And the clinicians warned us that we also have to think about the space the tool will be used in: we have to be a little mindful that they're in a clinical setting, with only so much room to move around.

We did try to get some former patients to come to the Immersion Lab to test this out, to see what we could do — and you bump up against the real world of working with humans: none of these kids wanted to get to Cambridge, so we had to stick with the clinic, which is probably just as well, because that's where we want to use it. In terms of relevance, they all felt it was very relevant to treatment. They talked about the exposure to both physical and psychological triggers, the chance to develop coping strategies, and working through some of the challenges; and the clinicians really felt, yes, this is something that can be used across disciplines — we see ways to use this in physical therapy as well as in psychology. We've written up a little bit of that, mostly focusing on those initial stages of getting the information from the patients, and we hope also to look at some of the physiologic data we were able to collect; that's forthcoming.

In terms of next steps: we're hoping to continue this software development partnership and expand the experience to really hit all of those markers that patients and clinicians have told us would be so helpful. We have also applied for a multi-site feasibility trial, to bring this to a few other hospital settings and make sure it really is helpful across clinical sites and isn't something tailored only to our setting. Among the planned features, we want to have a coping library, so that if a kid is feeling stressed in the situation, they can go to the library, access some of the coping strategies they've worked on in their therapy sessions, remember how to use them, practice them, and use them to get through the school situation. And, as I mentioned, we want it to be responsive to the physiologic monitoring, so that if a kid seems really stressed, there are reactions that can happen. Ultimately, it would be great if clinicians could really interact with kids within the simulation — inhabiting other characters — and do some of their treatment through that level of reality.

This is what we hope our feasibility trial will look like — again, three sites — testing this out and continuing to advance the actual software so we can integrate all of those pieces. That's out under review at the moment; hopefully we'll get some more support and can keep moving forward. Just a last word: if you're interested in our Innovate Pain Consortium, we actually have a talk on Friday. We've been trying to move toward dissemination, involving more of the people out there doing this kind of work.
So we have a panel of different kinds of clinicians who use VR in pediatric medical settings, and they're going to talk about a variety of challenges they've faced trying to use VR, as well as some of their successes. It should be a fairly interesting hour if you're free. And that is what I've got — thank you.

[Applause]

Thank you, Deirdre. So, next up — because every Nano Summit should be closed with a little bit of discussion about opera — we'll have Jay Scheib, and then we'll take the questions. So please hold your questions; we'll take those with Deirdre and Jay at the end.

Hi, thanks for having me. So: augmenting opera — Parsifal. One might ask why you would need to augment opera, because opera tends to be augmented music already. But in this particular case, I was invited to direct Richard Wagner's opera Parsifal at the Bayreuth Festival — it premiered this past summer; it was planned before COVID and then postponed. One of the challenges in making it was that Katharina Wagner, the great-great-granddaughter of Richard Wagner, asked me if I thought we could integrate VR, or maybe AR, into the performance — this being one of the most important opera festivals, but strictly for the music of Wagner, and known for its very, very special acoustics.

When Wagner first designed this theater, he designed it with an orchestra pit so the orchestra would effectively be hidden, whereas in other opera houses of the time the orchestra was visible. He referred to this as the machinery of the music, and he chose to try to hide the machinery that would produce the music in order to create a more immersive experience for the audience. He designed this strange curve over the pit, so you literally can't see the conductor, can't see light from the orchestra pit, and then duplicated the proscenium out into the house — so that sitting in the theater you really begin to feel as though you were almost on stage, even though you aren't.

The NCSOFT MIT.nano Immersion Lab Gaming Program gave us the opportunity to take a really big, scary leap at trying to do something which seemed, at least at the time, relatively not doable. So we set out to incorporate AR into the performance of Parsifal. We made a digital scan of the entire space and built a digital twin of the theater, in order to figure out whether we could place individual AR headsets in particular seats so as to give each of them the correct perspective throughout the performance. It was a nightmare that turned into less of a nightmare. We ended up working on the Android platform, in a Unity environment, and chose the Nreal headset — which I think is now called Xreal. The reasons for that were various, but basically some of the other headsets were too expensive, and would mess with people's hair — and a good hairdo is important at the opening of an opera. We used a WebSocket-based control system — that's what we eventually arrived at, because we needed to be able to cue each headset to within milliseconds of the others.
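(A sketch of what "cueing each headset within milliseconds" can look like: a WebSocket server broadcasts each cue together with a shared future timestamp, and every client waits until that wall-clock moment before firing — which works to the extent the headsets' clocks are synchronized, e.g. over NTP. This uses the third-party Python websockets library; the cue name, port, and lead time are invented, and this is not a description of the production's actual system.)

```python
import asyncio, json, time
import websockets  # third-party: pip install websockets (version >= 10)

CLIENTS = set()
LEAD_TIME_S = 0.5  # send cues half a second early so all headsets fire together

async def handle_headset(ws):
    """Register a connected headset and keep the connection open."""
    CLIENTS.add(ws)
    try:
        await ws.wait_closed()
    finally:
        CLIENTS.discard(ws)

async def fire_cue(cue_name):
    """Broadcast a cue with a shared future timestamp; each headset waits
    until that wall-clock time before playing its animation."""
    msg = json.dumps({"cue": cue_name, "at": time.time() + LEAD_TIME_S})
    websockets.broadcast(CLIENTS, msg)

async def main():
    async with websockets.serve(handle_headset, "0.0.0.0", 8765):
        await asyncio.sleep(2)          # wait for headsets to connect
        await fire_cue("swan_flight")   # hypothetical cue name
        await asyncio.Future()          # keep serving

asyncio.run(main())
```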
Boom — a hole bigger than the thing the hole is in. The opera Parsifal concerns itself with a wound: there's a king, a guy, Amfortas, who is suffering from a wound that will never heal. Parsifal shows up — the holy fool, basically the dummy who asks the obvious questions — and is able to find the solution that heals it. The Parsifal story takes place at a time when myth and magic and mythical things are still somehow possible — the magic of pre-Christianity, early, unusual rites. And we were coming out of COVID, coming after the murder of George Floyd, and we came up with this idea — a hole bigger than the thing the hole is in — which seemed like a phrase that captured our times, and that became the guiding principle for most of our design approaches.

So we began by developing a lot of motion graphics, constantly bumping up against the polygon count possible on a headset — bearing in mind that it's a four-hour opera, so it's four hours of content. I would make these crazy drawings, and then we would work with our team of folks to create 3D animations. In this particular scene — Wagner always liked writing stage directions that are difficult to produce on stage: a swan flies above the lake, Parsifal shoots it with an arrow, it falls to the ground. So while we were using traditional staging, in a sense, in this particular scene on stage, we were also putting a live virtual swan flying past your head in the headset, arrows whizzing by, coupled with the scene on stage. It was actually kind of a wild experience, and you would hear people gasping.

Interestingly, only part of the audience wore the headsets. Our original design brief was to figure out how to do it with 2,000 headsets — that's a really big logistical challenge — and we ended up with only 330. I don't really know why it was 330, but we ended up with 330, partially for financing reasons, but also in order to limit the headsets to an area of the opera house where, if things went sideways, it wouldn't destroy the performance. Fortunately, nothing went sideways, which is unbelievable to me still. Here's the swan on stage, and then the swan in the air. This is another scene: for the folks who didn't have a headset, this is what they would see, and those with headsets would experience a much bigger environment.

We worked with four use cases in the design process: atmospherics — rain, fog, smoke, wind; extending the stage architecture — expanding the actual design on stage into the audience and out, basically bursting through the walls; and character animations — we added animals, foxes, plastic bags, because we figured the plastic bags will definitely still be here once civilization is gone. And — boom — there's just another example. I was asked many, many times why we were even bothering to do this, and I think, somehow, here at MIT in the Theater Arts program we spend a lot of time looking at various ways to extend the experiential possibilities of live performance. That has included the incorporation of media and new-media technologies in pretty much any way we can imagine. We test these things and work on them and work on them, and we kind of run around with this motto: either we figure out how to use these tools to better express our contemporary situations, or — I don't know — these tools will just express us,
and things will just get very flat somehow in the world. So coaxing bits to speak with atoms — it's an old Media Lab thing, but it's something we take quite seriously, and also not so seriously. These are flower people floating through the space, which is a scene in the opera — the magic garden, the flower maidens, as it were; so we have flower boys and girls that float through the auditorium. And I have just a very short video — twenty seconds. This was shot with a telephone, because we also loaded the app into a phone, and we were just testing some of the materials.

[Video plays]

Cool. So this project was developed by myself and my colleague Joshua Higgason in the Theater Arts program, with our friends at MIT.nano — thank you so much. In the next years we will continue to develop it. Whether we go with a situation of 2,000 actual headsets in the auditorium remains unclear; we're going to pursue a system by which people can actually bring their own headsets, which I think will be a quite interesting event in Bayreuth. Thank you so much.

We're good with time, Vladimir? We're good on time — okay, so let's take a couple of questions here. Deirdre, come back up; Jay, won't you stay up — everybody on stage. We're at the end of the day, so yell out your questions.

[Audience question for Jay, about what might come in the future] Yes, I think the answer is absolutely yes. Actually, before we did the AR piece, during COVID we produced a VR version of Siegfried, where you could play Siegfried. It was a kind of first-person-shooter scenario: you could pull the sword out of the thing and, on cue, stab the dragon. We showed it in Bayreuth, and people had a ball, actually.

Okay, so this one's not for you, Jay — you can try if you want. [Audience question about why the patient population is predominantly female] So this is very stable in our population in adolescent chronic pain, and it's also fairly true in adult chronic pain: the vast majority, at least of healthcare-seeking people suffering from pain, are female. We don't fully understand that — there are genetic issues involved, and there may be other layers involved in terms of who comes to a pain clinic — but that's who we've got. There's lots of sex-based research in pain. And then someone asked about the panel on Friday: I don't know if you can go back to my next-to-last slide, which had a QR code; if not, if there's some way I can send it to any kind of group of people, I'm happy to do that.

[Audience question: what are the possible uses of generative AI for AR and VR in engaging people with technology?] So, it's everything from generating environments to making sense of people's interactions. And this is actually — Bruce, thank you for the question — one of the things that we're doing within the Immersion Lab: we are, in some ways, the interface between the data, the hardware, and the human, and that shows up in different ways in art, in different ways in medicine, in different ways in sports and health. And certainly there are the technology underpinnings — the novel sensors, the novel displays, and certainly compute and algorithms.
So we're looking both at the cross-cutting foundational technology and at how we use these things to make better use of, and better interaction with, technology — and then at how that maps onto these different domains. That's one of the exciting areas where we're defining these little communities of companies and users: we can talk a lot about healthcare and medicine, a lot about art and entertainment, and sports, and what have you, and then have a cross-cutting set of enabling tools and technologies that raises the water level all around.

I think with that — I don't want to keep going; I think we're probably good with the time here. So Jay and Deirdre, thank you again, and JJ, I know he had to take off. Thank you.

[Applause]
