IMMERSED IN—The Metaverse, Part 3: Haptics

[Show video]

HOST: Welcome to IMMERSED — Immersed in the Metaverse. Today is our third in a sequence of four on the on-ramps to AR and VR, the on-ramps to the metaverse. Our monthly seminar series is focused on immersive technologies — new modalities for manipulating and interacting with the virtual world — and on how those innovations are impacting science, engineering, and art. We try to have a mix of lectures, demonstrations, and tutorials to give you a deep dive. IMMERSED is sponsored by the Immersion Lab, a central facility at MIT that provides the space, tools, expertise, and facilities to bring together researchers and artists on creative projects that benefit from applying AR, VR, and gaming technologies to education, research, art, and science. We're joined today by Talis Rex, the AR/VR/gaming/big-data technologist in the Immersion Lab, and by our special guest Lucas De Bene, a freshman at MIT, who will talk later about VR hardware and content creation — he has a YouTube channel, Lucas VRTech. Today we're going to get a very tangible feel for what haptics is all about, what we're doing in the Immersion Lab, and, quite excitingly, what Lucas is doing. Participants, please feel free to chat your questions — we'll try to answer them in real time as we go, and we'll have Q&A at the end. Talis, take it away.

TALIS: All right — hello, everybody, and welcome to Immersed in the Metaverse. My name is Talis Rex; I'm the technologist here at the Immersion Lab at MIT.nano. Welcome to everyone joining us for the first time, and welcome back to those who have been following this series. We are now on the third part of our four-part series, continuing our conversation about the metaverse, some of the development tools, and ways of
creating our own content, and thinking about what this means for the future. Today's discussion is Immersed in the Metaverse: Haptics — getting a sense of feel for the metaverse, and looking at the tools and devices currently being developed to make that happen. This discussion will be set up a little differently than the previous ones, since we have an awesome guest speaker who will share his thoughts, development, and vision of haptics as we move closer to the metaverse and to interactions within virtual space.

A quick overview of what we'll discuss today: we'll do a quick recap of the previous seminars, in case anyone missed them or wants a refresher, and then dive right into haptic technology. I'll share some of the use cases and some of the devices we've investigated, both here in the Immersion Lab and from other companies driving their innovations forward. I'll then hand it over to Lucas, who will talk about his experience and his development, and hopefully we'll have enough time at the end for questions and discussion with you, the attendees.

With that said, let's start right in. As some of you may already know, this is a four-part series in which we journey through just a sliver of the metaverse and what is possible and exciting as we look forward to its creation. We started with avatar creation — the fabrication of photorealistic avatars using free online tools that anyone can use and customize. We then talked about motion and animation for those avatars, using open animation packages on Unreal Engine's Marketplace, and how here at the Immersion Lab we use motion tracking and our OptiTrack system to expedite that process and
customize our own animation sets. Today we'll talk about haptics — its current state in today's technology, as well as some novel developments shared through the eyes of Lucas. And next month we'll discuss virtual worlds, the research behind all of this, and what it means to put all of these pieces together toward the metaverse and game development.

Before going straight into haptics, a quick highlighted overview of what we've covered. In part one we used MetaHuman Creator for content creation. We then discussed NVIDIA's Omniverse platform, using the Audio2Face application — the application we use to formulate facial animations, using artificial intelligence to infer facial characteristics from audio alone. We glanced at USD, the Universal Scene Description file format, for collaborative and custom workflows for content developers. And we discussed Unreal Engine, which we chose as our primary game engine for this series, to put all of these pieces together and create a virtual scene. Here are some images of things you may have missed, with a link at the top left to our YouTube video if you're interested in diving in: the USD format in that top-left description; the fidelity of the avatar from MetaHuman Creator, developed by Epic Games, on the bottom left; and on the right, the Omniverse package — the top right is the Audio2Face interface, and the bottom right shows the different applications and connectors that can be used to expedite and collaborate among developers using different software solutions in their workflows.

In part two we discussed motion, and how to apply these custom animations to the avatar to create more realism, both in the behavior of the avatar as well as the
movement. We talked about the OptiTrack motion-tracking system we have in our lab, which we used for our custom animations; we learned about MotionBuilder and how we use it in our workflow for retargeting, editing, and interpolation; and we looked again at Unreal Engine and how we added these to our Sequencer, where we could apply the animations and blend the spatial animations gathered in the previous seminar with the motion tracking. If you missed it, there's a quick recap video linked at the top left. This is the kind of data we get from the OptiTrack system — the skeletal mesh, or reconstruction, if you will. The top right is MotionBuilder, which lets us use the motion-tracking data on top of our avatar's mesh and drive it forward. On the bottom left is the free animation starter pack we went over, which you can use to experiment. We also talked about rigging — what it means, and how we can use skeletal reconstructions to drive a mesh forward and add behavior and animation to it. Definitely check those two sessions out if you missed them; they're online on our website.

So this is our third part, and we'll be talking about haptics. Haptic technology, for those who don't know, refers to 3D touch: technology that allows users to physically interact with the virtual world — to interact with virtual assets, or to add to an immersive experience. Some of you may already know haptic technology from a gaming controller with vibration, or from your smartphone; haptics has already embedded itself in today's technology, whether you're aware of it or not. But not all haptic technology is the same. There are a number of different ways that creators have developed to interact with
virtual worlds and assets, beyond standard vibration. Force feedback, for example, which we'll see a little later today, has a lot to do with the haptic gloves we'll discuss in a bit. Air vortex rings are donut-shaped air pockets made up of concentrated gusts of air; both Microsoft and Disney have taken part in this endeavor to deliver non-contact haptic feedback. And there are unique technologies and devices being incorporated now, like ultrasound — focused ultrasound beams that can create a localized sense of pressure on a finger without touching any physical object; I'll show you one of those devices in a second — and electrical stimulation, which you'll also see shortly. There's a lot more investigation happening here, and these approaches are constantly growing, so it's very interesting to watch where the development is taking place as we think about interactions within virtual worlds.

Here's a quick glimpse of some of these technologies in case you're unfamiliar. At the top left you can see haptic gloves; these include force feedback on the fingers, restricting the movement of your hands to convey the shape of the virtual objects you're interacting with. These can be very pricey, so while a lot of companies and developers are starting to use them, at the consumer level they can be difficult to acquire — which is where Lucas may be able to shine some light. On the bottom left are haptic suits. We actually had a demo here in the Immersion Lab using a suit called the Teslasuit — not related to Elon Musk in any way — which lets you simulate, through electrical stimulation, certain objects or environments in a scene. For example, I was in a simulation
where rain was falling, and I could feel almost a sense of raindrops on my shoulders, which really added to the immersive experience. They've been building a lot of training modules with this technology, and it's incorporated across the whole body — I think the Teslasuit has over a hundred sensors, from the shoulders to the bottom of the feet — so paired with virtual reality it can be very immersive; these technologies can be really compelling. Then there's a product at the top right that just came out, called Emerge, which uses ultrasound to drive virtual objects and novel ways of interacting with each other. It uses focused ultrasound beams, as I mentioned, to create localized pressure on the fingers. The focal points that create the sensation of pressure are generated by individually controlling the phase and intensity of each transducer in an array of ultrasound transducers; the beams can deliver sensations of vibration and give users the ability to feel virtual 3D objects. It's a very unique way of interacting with your virtual worlds, and it has allowed for some really cool content — Braille, superpowers, that kind of thing — so it's a very interesting space to consider as we think about haptics. And finally, there are the typical hand controllers you may be familiar with, such as the Xbox controller, as well as new, novel devices. I do want to share one video of a new device in this space — a one-handed handheld controller developed by researchers from Korea that produces skin slip around the fingertips. I'll let them explain it in this video.

[Video] How can you feel virtual objects in VR sliding through your fingertips? We developed a skin-slip haptic feedback device using pivoting
spinning discs, and first evaluated user haptic perception of the device. We then investigated visual-haptic congruency perception of skin-slip direction and grip in VR. We found that in VR, visual feedback has a strong influence on how the haptic feedback is perceived. Please refer to our paper for details. [End of video]

TALIS: I wanted to show that because it's an interesting development — how these new technologies and devices are being thought about as our devices become smaller and more portable, what we can do with them, and how to use them. That brings me to our use cases, where we think about where we can go with these tools and how they can benefit us in the different industries capitalizing on them. These are just four examples — there are a lot more — but I wanted to highlight them because aviation, for example, was where haptic technology began: it was first pioneered through aviation development and training exercises. One of the earliest applications was in aircraft training: as an aircraft approached a stall, vibrations were felt in the pilot's controls — a useful warning of a dangerous flight condition — and many of these practices are still used today. From there, of course, it branched into the consumer market, where we have vibration feedback in our pockets via our mobile devices, and this is still being developed; Apple has released new patents on different ways to do multi-touch vibration, so even though the technology has been around for a while, the way vibration is delivered keeps changing. And along with mobile devices, there's the gaming community, where simple haptic devices are common: gaming controllers, joysticks, and even steering wheels for
immersive driving. This can also include full-body suits, like the one you saw a moment ago, along with feedback for the hands and feet — very compelling for a lot of game developers and gamers. Another example is medicine and surgery, where training modules have been designed with force feedback and vibrational cues to guide trainees toward best practices. A surgeon can make an incision and feel tactile resistance feedback as if working directly on the patient; this can be used remotely, for solo training, and it has even been used in veterinary clinics. So there are a lot of different avenues here, and medicine and training simulation seem to be where much of this technology is pushing forward.

With that said, I'll transition to Lucas, our guest speaker for today. As mentioned before, Lucas is a pioneer in the haptics field, building open-source hardware and software that you, I, and other haptics enthusiasts can participate in. In a lot of our workshops we want you to do it yourself, so I thought this was a great opportunity for all of you to see what you could do in this day and age with the technology available. So, without further ado — Lucas, feel free to take it away.

LUCAS: Thank you so much, and thank you for the opportunity to be here; I'm really excited to share all of this with you. My name is Lucas. I'm a freshman here at MIT, and VR and haptics have completely taken over my life in the past year. About a year ago, in December of 2020, I was a high-school senior taking a gap year, stuck at home during quarantine, with pretty much no way of seeing my friends — all of them were either also in quarantine or in college — and
I really felt isolated, physically. That's where I fell in love with VR, as a way to extend beyond the physical boundaries of where I was. In doing that, I found an amazing community of people who pretty much just exist in VR, which is fascinating — being able to live your life in a place completely separated from our physical reality. From there I got really interested in the technology that allows us to do that. I'd seen YouTube videos for a long time theorizing about what the virtual human of the future would be like, and one of the things that's really interesting, as Talis mentioned, is those VR gloves that actually let you feel virtual objects. I had always wanted to try them, because current VR headsets require you to use bulky controllers that sit in the middle of your hand: you can't open and close your hands all the way, you're always aware the thing is in your hand, and you're not really interacting with virtual objects. So I really wanted to try VR gloves — but being just a high-school senior, with no way of getting or affording that enterprise-grade tech, I was pretty much out of luck. The only solution I had was to build them myself, and that's pretty much where I started.

I'll share my screen here. This was the absolute start of the project: a couple of badge reels — the kind you'd attach lanyards to — and some rings on my fingers, basically just to see whether strings could be used to measure and affect your fingers. The biggest challenge for me was that before you can have a fully working VR haptic glove, you have to have the position of your fingers, the position of
your hand — there's a lot of data and motion capture you actually need, and of course I didn't have an OptiTrack system or access to that kind of thing yet, because I was just a high-school senior. My solution was to take strings and attach them to potentiometers, which let you measure the rotation of a shaft; I eventually added a spool and a spring. This was a super cheap way of measuring the position of a finger using technology that has existed for a long time. I think that's a really interesting pattern in this kind of VR hardware area: a lot of the really cool innovations aren't coming from brand-new, groundbreaking technology — a lot of it comes from taking things that have existed for a long time and trying them in ways nobody has done before. From there I started adding more of them, to see whether I could actually use them to track my fingers, and I was posting videos on TikTok as a way to share this with my friends, who were all already in college — "hey, here's something cool I'm working on." The really interesting thing was that the internet kind of fell in love with it. My first video had about 10 views; the second had a couple hundred; by the third it was at around 400,000 views — I don't even know what kind of exponential curve that is; that's beyond exponential. By the fourth video I had started adding electronics to actually measure the positions of your fingers, and that one got about 8.4 million views, with a lot of people expressing excitement about what this means for the VR community and industry. At that point I was hooked.
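As a rough illustration of the string-and-potentiometer idea — this is not Lucas's actual firmware; the function name and calibration values are invented for the sketch — the core of the tracking is just a linear map from the pot's ADC reading to a normalized finger-curl value:

```python
def adc_to_curl(raw: int, open_raw: int, closed_raw: int) -> float:
    """Map a raw potentiometer ADC reading to a 0..1 finger-curl value.

    open_raw / closed_raw are calibration readings taken with the finger
    fully extended and fully curled; the spool-and-spring rig makes the
    pulled string length (and thus shaft rotation) roughly linear in curl.
    """
    span = closed_raw - open_raw
    curl = (raw - open_raw) / span
    return max(0.0, min(1.0, curl))  # clamp: real readings can overshoot calibration

# Example: 10-bit ADC, calibrated so 100 = open hand, 900 = closed fist
print(adc_to_curl(500, 100, 900))  # halfway between the calibration points -> 0.5
```

On the actual glove this logic would run on a microcontroller reading the analog pin; the point is that a decades-old part plus two calibration numbers is enough for per-finger tracking.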
I was going to do anything I could to make these gloves really work in VR. What I ended up doing was starting with Blender — which isn't really meant for this kind of work, but at the time it was the closest fit to the software knowledge I had — to get finger tracking working. You can see right here it's tracking my thumb based on just that one sensor, and later I added more of them. It was originally very buggy, but over time I was able to improve it a lot. At this point, the electronics for just the finger-tracking portion cost about $11 per hand, which is significantly cheaper than most motion-capture solutions for fingers — and because it doesn't use cameras, you don't have to worry about occlusion: you could put your hand behind your head or behind your back and it would still track your fingers. Blender was really cool as a demo, but it didn't actually let you use the gloves in virtual reality, and this is where the community came in extremely handy. To get this to work, I had to write what's called a SteamVR driver — an OpenVR driver — which lets external hardware work with OpenVR, one of the most readily accessible platforms for PC VR games. My big goal was to make the gloves work so I could feel and interact with objects in the video games I was already playing in virtual reality all the time anyway. I was able to get the driver to work, but it was extremely buggy; it took me a really long time, and the setup process was not reliable at all. The really cool thing about this project is that because it's open, and there's a really big community behind it, there were people who were willing to
help me out, just for the sake of liking the project so much. I really learned: don't underestimate the power of people who do unpaid work because they love what they're doing — myself included, I guess. A high-school student named Dan reached out to me, and we decided to create a big open-source project for a VR glove driver that we eventually called OpenGloves. It's now available — you can download it for free on the Steam store — and all of the source code is online as well, so anybody can contribute and make it even better. The project is now really stable, and we're talking with Valve and getting support from them on how to make it even better.

From there, the biggest challenge was adding the haptics. I had to compress all of the electronics down as much as I could to make the motors fit, and then I started adding these force-feedback motors. They're just nine-gram servo motors, which cost about a dollar to a dollar fifty each if you buy them in bulk. What I ended up doing was making it so that the spool that rotates when you open and close your finger can be limited by the servo motor. It was very hard to fit it all in one package without being super bulky, but eventually I got it to look like this — right here. This was my fourth prototype of the VR glove, which now had force-feedback haptics: you could actually feel the resistance and the shape of the objects you hold in VR. The strings pull your fingers back so you can feel the curvature of an object — or more or less its size and shape: big things feel big, flat things feel smaller. Of course, it doesn't give you every sensation.
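The spool-limiting idea can be sketched in a few lines. This is an illustration of the concept, not the actual OpenGloves or glove firmware; the 180-degree servo travel and the names here are assumptions:

```python
from typing import Optional

SERVO_MAX_DEG = 180.0  # assumed travel of a small hobby servo

def servo_limit_angle(object_curl: Optional[float]) -> float:
    """Compute the servo angle that blocks the spool at a given curl.

    object_curl is the normalized curl (0 = open hand, 1 = fist) at which
    the finger should hit the virtual object's surface, or None when the
    hand is touching nothing. The servo arm acts as a hard stop on the
    spool: parking it at the angle matching object_curl lets the finger
    close freely up to the surface, then the string stops it.
    """
    if object_curl is None:
        return SERVO_MAX_DEG          # park out of the way: finger moves freely
    curl = max(0.0, min(1.0, object_curl))
    return curl * SERVO_MAX_DEG       # simple linear spool-angle model

# Grabbing an object whose surface sits at 40% finger curl:
print(servo_limit_angle(0.4))   # -> 72.0
print(servo_limit_angle(None))  # -> 180.0 (nothing held, no resistance)
```

The design point is that the servo never pushes against the finger; it only sets where the string stops, which is why a one-dollar motor is enough to fake the rigidity of a held object.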
Obviously it doesn't give you texture; it doesn't give you complete, granular touch. But VR is one of those really interesting things where any amount of extra immersion goes a long way when it's paired with the visual stimulation from the headset. Not only did we have to use Unity code and things like that to make the haptics work, but a big part of this project was making it work inside of VR games — which is really difficult when we aren't the developers of those games. This is another area where the community came in really handy: we can modify VR games by making mods that take the in-game data about the things you touch and send that data to the gloves, so you can actually feel the shape and resistance of the objects. That might not be the easiest thing to do for every single VR game out there if you're just one person, but with this community I don't have to work alone — there are loads of other people willing to dedicate their time to help add haptics to different games. This is one that Dan and I worked on — a mod for the game Half-Life: Alyx — and I believe I also have a video here of Boneworks, another very popular VR game. That one is totally community-made; I didn't even touch the code for the mod itself. All of these mods take that data and send it to the OpenGloves driver I talked about, which then sends the data straight to the glove — and I have one right here.
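The data path the mods use — game collision data, to the driver, to the glove — can be pictured as a tiny per-finger encoder. Note that this wire format is invented for illustration; the real OpenGloves/firmware protocol differs:

```python
FINGERS = ("thumb", "index", "middle", "ring", "pinky")

def encode_force_feedback(limits: dict) -> str:
    """Serialize per-finger curl limits (0..1) into a one-line command.

    A game mod computes, from the held object's collision geometry, how far
    each finger may close; the driver then forwards a compact line like this
    to the glove's microcontroller. The A..E letter scheme and 0..1000 scale
    here are a made-up example, not the actual protocol.
    """
    letters = "ABCDE"
    parts = []
    for letter, finger in zip(letters, FINGERS):
        value = int(round(1000 * limits.get(finger, 1.0)))  # 1.0 = no limit
        parts.append(f"{letter}{value}")
    return "".join(parts) + "\n"

# Holding a mug: index and middle fingers stop at 35% curl, the rest are free
print(encode_force_feedback({"index": 0.35, "middle": 0.35}), end="")
# -> A1000B350C350D1000E1000
```

Keeping the glove-side protocol this dumb is what lets independent modders support new games: each mod only has to produce five numbers per hand, and the driver handles the rest.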
The really cool thing is that the project is now totally self-sustainable. Content for the project continues to grow, and beyond that, there are people out there building their own versions of the gloves — the hardware is open source too, so anybody can take my VR glove parts, 3D-print them, and build their own glove. Right now, with the haptics, it's about $30 in parts per hand, so $60 total. Other people have taken those parts and modified them to fit whatever they want to do; here are a couple of examples of people from our community who have made their own gloves. They're growing every day, and people are contributing software every day too — any time someone has a feature on their gloves that isn't supported by our software, they can write the software themselves and contribute it back to make it even better. It's yet another example of how, if you find the right community and the right people who are really interested in what you want to work on, you can push that kind of development much faster than if you were working on your own. So that's the timeline for the project itself; from here I'm happy to jump right into Q&A.

HOST: One of the questions: what's been the most surprising community use? You had a certain set of things you were trying to do — what's the surprising thing the community decided to try?

LUCAS: That's a great question. Right now the majority of the community is other VR gamers trying to make this work in their games, but that's not to say that's everyone.
One use that's been really interesting: there's a group of high-school students in Barcelona who built their own robotic hand, took my VR glove design, and wrote software to combine them, so you can control the robotic hand with the VR glove remotely from anywhere in the world. They also want to add touch sensors, so you can feel the objects that are in the robotic hand — with the goal of being able to shake someone's hand across the world. A couple of weeks ago we actually got on a Zoom call and shook hands remotely, which is really amazing, and probably one of the cheapest applications of a telepresence robot I've ever seen. Real kudos to them; it was really exciting.

HOST: That's very cool. Participants, please feel free to chat your questions — just don't chat Rick Astley; I'm not sure whether he has any haptic content in his prior videos or not. Which game do you enjoy interacting with most with a haptic sense?

LUCAS: Boneworks — the second game I showed in that video — has a really amazing physics engine, so you can do a lot of really powerful things with it, like climbing and actually feeling the handholds you're grabbing as you climb. The developer who made the mod for it put a lot of time into making sure every single object you pick up matches the exact geometry it has in the game. So that's probably one of the biggest ones for me. That game is a little tricky because it causes a lot of motion sickness for me, so I wish I could play it more, but I'm hoping that if there are more VR games in the future that
implement similar kinds of physics engines but without the same kind of motion sickness, I'll be able to enjoy that a lot more.

HOST: You're here at MIT, immersed in — as we like to say — drinking from the fire hose of the rate at which information comes at us. What do you see as some of the interesting opportunities for education, whether at the college level or the elementary-school level? How can AR and VR, and haptics in particular, enable accelerated understanding for learners?

LUCAS: Totally. A big thing I struggled with in math and physics classes was spatial reasoning from what's really a 2D projection of a 3D sketch on a paper test. It would definitely help if you could just put on an AR headset and see, say, the Lorentz force law in three dimensions, as opposed to two dimensions plus a fake, skewed third. Even just having animations of processes would help people who really need to learn visually; and adding haptics, in a tactile way — being able to actually feel how two surfaces slide along each other with variable friction, or other concepts like that — I feel that kind of technology would have a huge impact.

HOST: Here's a question: are you planning to eventually make a basic kit for people to buy — already assembled, maybe with injection-molded parts instead of 3D-printed? What are your commercial plans, I guess?

LUCAS:
great question. It's definitely one of those things where, if I'm able to do it, that's the dream, because right now the only people who can have these gloves are people who either have a 3D printer and the time to go and build them themselves, or can ask someone else to sell them a pair, which happens a lot on our Discord server. If that's something I could do in the reasonable future, I would be ecstatic. It's also one of those things where, while drinking from the fire hose at MIT, it's very hard to find the time to fit all of that in, but there are also a lot of really good resources here as far as entrepreneurship and getting funding go, so it's definitely something I'm looking into.

To piggyback off that, the Immersion Lab is definitely open to hosting workshops for this kind of creation and to developing kits. If there is a lot of interest, I strongly recommend that those of you on the call today email me or Lucas and share your enthusiasm, and we will do our best to put together some kits. Once the COVID pandemic begins to lighten up a little, we definitely have plans to run an in-lab workshop building one of these gloves, so stay tuned.

It's a very nice, elegant approach, and Lucas, there are probably additional ways the community can imagine using this in education: conveying geometric concepts, but also relative force. You're not going to reproduce the extreme forces of picking up something heavy, but you can add a layer of intuition that was previously inaccessible without haptic feedback. Let's see,
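As a concrete aside, the "see the Lorentz force law in three dimensions" example mentioned above boils down to a simple vector formula; here is a minimal sketch of it in illustrative Python (not anything shown in the talk, just the formula the speakers reference):

```python
# Illustrative sketch of the Lorentz force law, F = q(E + v x B),
# the 3-D relationship that is hard to convey on a flat page.
# Pure Python; vectors are (x, y, z) tuples.

def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def lorentz_force(q, E, v, B):
    """Force on a charge q moving at velocity v through fields E and B."""
    v_cross_B = cross(v, B)
    return tuple(q * (e + c) for e, c in zip(E, v_cross_B))

# A positive charge moving along +x through a magnetic field along +z
# is deflected along -y, exactly the kind of spatial fact an AR overlay
# could make obvious.
print(lorentz_force(1.0, (0, 0, 0), (1, 0, 0), (0, 0, 1)))  # (0.0, -1.0, 0.0)
```

An AR visualization would simply render the three vectors v, B, and F as arrows in space rather than as a skewed 2D projection.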
here are some more questions from the audience. Which haptic components need to be improved the most, both in terms of performance and in terms of cost?

Yeah, that's actually an excellent question. Right now all of the haptics, which for the moment just means force feedback, is done via 9-gram hobby servo motors, which is great because they're extremely cost-efficient; they're super cheap. They are not the most durable, though. I'm using MG90S servos, which at least have metal gears, so you don't have to worry about gear stripping most of the time, but one of the problems is that if you lose even one tooth, then every time the servo tries to close around an object it feels clicky rather than like a hard object. So adding injection-molded parts and having dedicated fabrication for all of those pieces, while it would carry a much bigger overhead cost and require a lot of time to pay off, would certainly make the gloves more durable and reliable over time.

There was a comment from one of the community members: "I could see them being used for organic chemistry or chemical engineering to explore molecules in space," that is, their geometrical arrangement and how molecules fit together. Very nice. Another question, which speaks to the community assembling these gloves and perhaps asking for a pre-assembled version: how difficult would you say it is to make a pair? I guess it depends on the skills and the tools you have, but how difficult is it?

I would definitely say the tools are a big part, because you can just do it with DuPont wires and breadboards, basically anything that would come in a basic Arduino kit, and it'll probably work great. The
one problem you'll have is that the wires might fall out of the pins on the potentiometers and things like that. If you have a nice crimp set, or you use a soldering iron, it makes the build significantly easier, as long as you know how to use those tools. As far as our community goes, we don't have specific data on how many people have built the gloves, but we do have data on how many people have downloaded the driver, which is required to use the gloves in VR. We have a few thousand downloads of the driver, which means some subset of that, up to a few thousand people, have built the gloves themselves. And the great thing is that, because we have this community, any time you're building your own gloves and run into a hiccup, you can pretty much just go on our Discord server and say, "Help, I'm stuck on this part," and there will be multiple people who are really excited to help others build the gloves.

There are some comments here as I read through them: some people are highlighting, for example, electroactive polymers and shape-memory alloys as other potential ways of doing both measurement and actuation in smaller form factors that might be interesting to explore. So maybe one last question, to get your thoughts: in robotics there is a very rapid evolution toward flexible and soft robotics, things that aren't just rigid, fixed structures. Do you see haptic devices in a similar way? There are certainly some rigid components in your glove, but you really want it to be a soft, flexible thing. How are you informed by the things
that may be happening in soft robotics, or in other areas where flexible materials are used for actuation and control?

Yeah, so interfacing with the human body is, I think, one of the biggest challenges of this project, because humans aren't rigid objects that you can just snap things onto easily. Humans are squishy; they have stretchy skin, and every human is a different size. If I were to take the gloves I built for my hands, especially the original prototypes, which were pure plastic, and try to fit them on somebody else's hand, it was a really big challenge to get them to fit, because everyone has different sized hands and fingers. So I've been experimenting with TPU pieces, so you have flexible rings and things like that. I also know a lot of companies, like HaptX and even Meta now, are experimenting with microfluidics for touch haptics, so you have these little squishy bubbles that expand and let you feel textures. I think that's definitely going to become more common in the future, but because that tech is still really big and hasn't been refined to the point where you can just put it in your pocket or wrap it around your hand, it's still going to take time to get there.

Well, Lucas, I very much look forward to supporting you at the Immersion Lab and figuring out ways to help you connect further with the community on campus and off. Talis, any other comments you want to make before we close out for the day?

Yeah, I'll just finish up. Lucas, thank you so much for joining and sharing your experience. The idea is that this is a workshop designed to
let all of you create and experience the metaverse together with do-it-yourself technology. Just like the first two sessions, where we had a take-home lab so you could go do the experience yourselves, I strongly recommend the link I posted in chat to the LucidVR gloves documentation and all of the parts that are needed; feel free to experiment if you're ever interested in this space. Let me share my screen real quick to close this out. Our next IMMERSED session will take place in the summer; once we finalize the date, we'll post it on our website. It will be all about virtual worlds: the VR experiences we can create, how they relate to the metaverse, and specifically the current research being developed both at other academic institutions and at Meta and all of the labs that are looking to embed these tools in our everyday lives. How does this shape our lifestyles, and is it here for the better? Hopefully you'll be able to make it. Thank you all for following along for the third part; we look forward to seeing you.

Thank you again, and we'll talk soon. Lucas, thank you again for your engagement with the community, your willingness to come today, and your phenomenal work; I'm looking forward to continued interesting and great things from you. Talis and Lucas, have a great rest of your day. Participants, have a great day; thank you, and stay safe. Thank you. Thank you.
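For anyone who wants to tinker after the talk, the servo-based force-feedback idea Lucas describes, stopping a finger at a virtual object's surface, can be sketched in a few lines. This is illustrative Python with made-up angle constants, not the actual LucidVR firmware:

```python
# Illustrative sketch (not the LucidVR firmware): force feedback with a
# hobby servo amounts to capping how far a finger can curl once the
# virtual hand contacts an object. The angle constants are assumptions.

FINGER_OPEN_DEG = 0.0      # servo angle with the finger fully extended
FINGER_CLOSED_DEG = 180.0  # servo angle with the finger fully curled

def curl_to_servo_angle(curl):
    """Map a normalized finger curl (0.0 open .. 1.0 closed) to degrees."""
    curl = max(0.0, min(1.0, curl))
    return FINGER_OPEN_DEG + curl * (FINGER_CLOSED_DEG - FINGER_OPEN_DEG)

def force_feedback_angle(desired_curl, surface_curl=None):
    """Servo angle to command for one finger.

    surface_curl is None while the hand is empty; when an object is
    grabbed, it is the curl value at which the finger meets the object's
    geometry, so the servo physically stops the finger there.
    """
    if surface_curl is not None:
        desired_curl = min(desired_curl, surface_curl)
    return curl_to_servo_angle(desired_curl)

# Empty hand: the finger is free to close fully.
print(force_feedback_angle(1.0))       # 180.0
# Object contacted at half curl: the servo holds the finger at 90 degrees,
# which is what makes the object feel solid (or "clicky" if a gear tooth
# is missing, as described in the talk).
print(force_feedback_angle(1.0, 0.5))  # 90.0
```

On real hardware the returned angle would be written to the servo each frame, with the surface-contact curl supplied by the game's physics engine.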

2022-05-04
