Hello and good evening everyone, I am your host. Welcome to the fifth season of IST Practicals by Internshala Trainings. IST Practicals is a series of free online masterclasses delivered by industry experts. These 45-minute sessions are all about teaching students important concepts, software, and tools, and every IST Practical is an opportunity to experience the power of practical learning. The topic of today's session is AR/VR technologies and the metaverse, and our teacher today is Ashray. He is an XR developer at Immersive Insiders, where he does research and development of XR experiences. Apart from that, he writes blogs and teaches students the fundamental concepts of creating AR/VR experiences using the Unity engine and C# scripting. I hope you will enjoy this webinar with him. We have a very special gift for you at the end, so stay tuned for that. Now, let's welcome Ashray.

Hello, hi guys, how are you doing? I hope all of you are doing well. Let's begin. Before we start, I just want to know how much knowledge you have of AR/VR, so let's see how many of you know what AR/VR is. If you know what it is, give me a yes; if you don't know what it is, comment no. This will give me an idea of how many of you here know it and how many don't. Okay, so I think the majority of us here don't know what it is. All right, then the next question for you: how many of you know what the metaverse is? Okay, and it's not exactly Facebook's Meta, but you'll get to know what it is by the end of this webinar.

All right, let's start by seeing what AR is. AR stands for augmented reality. Basically, it's a technology that lets developers superimpose digital content, such as images, sounds, and text, over a real-world environment. Take a real-world example: you have a phone, and from it you're getting the live feed of the camera; then you have some digital content, say images or text, and you superimpose one on the other. That is what AR is: your real-time camera feed with digital content superimposed on it.

The best example I can give you, one most of us have tried, is Pokémon GO. I'm pretty sure all of you have tried it. (I hope my audio is better now; I can see comments saying it was low.) In Pokémon GO you have your environment, and when you open the AR mode you can see the Pokémon right in front of you, and then you can catch it. The second best example, trending right now, is the real estate and home industry. Big companies like IKEA have started selling their furniture through AR as well: you select a chair, a table, or any furniture of your choice, then scan your room and place it at an exact location. You can change its color, change its size, see whether it matches your room or not, and then decide whether you want to buy it. That's one advantage of AR technology. And there's another advantage: it does not require external tracking, add-on wearables, or extra utilities; a small smartphone is enough.
Now, almost all of us have smartphones, though not all smartphones are capable of experiencing AR; some lower-end smartphones don't have that capability, but the majority do. Earlier you needed a smartphone that supports ARCore, the AR software for Android. Nowadays there is also something called WebXR: you go to a website, enable permissions for your camera, and then you can scan your environment and start placing your objects.

Someone in the chat mentioned Lenskart; that's right, and that's the next thing I was coming to. Lenskart is another super example of AR technology. For those of you who don't know, Lenskart is a spectacles company: they sell sunglasses and prescription lenses and all those things, and you can try out the frames on your face, and as you move around, the frame moves along with you. So that is what AR is.

Now, before we move on, how many of you know the difference between AR and MR? Do you think they're the same, or do you feel mixed reality is something completely different and not related to AR at all? Yes, Snapchat filters too, that's right; those are AR as well. Cool. So AR and MR are not really different; MR is more of an extension. AR is where you place your digital content on top of the world, and mixed reality is when that digital content blends with your physical world. Here you can see an alien object that is blocking the furniture behind it, and on the right side, which is mixed reality, that same digital alien is behind the furniture, so it's blending in. That's just one example of mixed reality. Another would be this: suppose you have an augmented reality app where you can shoot a ball. Generally, you'd see your environment, you'd shoot the ball, and it would just keep going until it became smaller and smaller. In mixed reality, suppose you shoot a ball and there's a tree in front of you: although it's digital content, the ball will interact with the tree and come back. So mixed reality has something like depth perception; it knows where objects are in your environment.

All right, we'll have a quick quiz before we proceed. The first question for you: which of the following is true with respect to AR? Option A, it provides complete immersion; option B, it requires a projector to see the augmented content; option C, it requires external tracking devices to detect position and rotation; option D, none of these. Which do you think is the right answer? You can comment your answers below. Okay, most of you are saying C, that it requires external tracking to detect position and rotation. I don't think so, because the correct answer is D. AR is not complete immersion, since you can still see your physical environment; it does not require a projector, as all you need is a small phone; and it does not require any external tracking devices to track position and rotation. No problem if you got this wrong; we have one more question for you. Is it possible to augment 3D objects behind the real world? There are just options A and B. Yes, that's correct: it is possible to augment 3D objects behind the real world, and it's actually called occlusion.
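(For reference, looking ahead to the Unity demo later in this session: in Unity's AR Foundation, this kind of occlusion is typically requested through the AROcclusionManager component on the AR camera. Below is a minimal sketch, assuming AR Foundation 4.1+ and a device that supports environment depth; the script name EnableOcclusion is just for illustration.)

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Attach to the AR camera. Requests depth-based occlusion so
// virtual objects can be hidden behind real-world geometry.
[RequireComponent(typeof(AROcclusionManager))]
public class EnableOcclusion : MonoBehaviour
{
    void Start()
    {
        var occlusion = GetComponent<AROcclusionManager>();
        // Fastest trades depth accuracy for frame rate on mobile.
        occlusion.requestedEnvironmentDepthMode = EnvironmentDepthMode.Fastest;
    }
}
```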
All right, moving on to the next slide, we'll see what VR is. Basically, VR is an entirely simulated, artificial digital experience: everything is computer generated. Once you wear your headset you're inside a 3D environment, and you cannot see your physical world; you're completely disconnected from it. Virtual reality technology has the power to deliver experiences that you cannot normally have in the real world. Say you want to experience what it feels like to be Superman or Spider-Man, or you want to know how it feels to fly. Those kinds of experiences are not practically possible in the real world, but in virtual reality you can simulate them. So what virtual reality does is make you feel like you're mentally and physically there when you're actually not.

Now, this tech needs external tracking devices and software that can track the input. For example, for my VR setup we have the Oculus controllers here. To track a controller, the headset needs some kind of device that can see where the controller is and map it exactly. So, to repeat: for AR you do not need any external devices, and for VR you do, because you need to track the different kinds of inputs. Recently Facebook released hand tracking as well, so on the same hardware you can now track your hands and you don't need the controllers anymore, which is really cool.

What devices do we have for VR right now? We have the Quest, the HTC Vive, the Valve Index, the Pico, and many other companies are building VR tech, so it's growing really fast. The next part is the uses of VR. We can use VR for socialization, for training, and for games, and its biggest advantage is that it is very cost-effective when it comes to training. Right now the US military is using VR to train soldiers, and so is the US Air Force, because flying real jets for practice is a huge cost; they spend a lot of money on training. VR technology is getting so advanced that you wear your headset and you're inside a simulator that makes you feel as if you're in the aircraft, but it's just a simulator, so you save a lot on fuel and other training expenditure.

All right, let's have a quick quiz for this one as well. The Oculus Quest can be considered as: A, a heads-up display; B, an augmented reality display; C, a head-mounted display; D, a small computer. (For those of you who don't know, the Oculus Quest is a VR headset; I'm sure everybody knows, but anyway.) You can choose more than one answer, so I'd like to see what you choose. C and D, yes; and someone said A and C, heads-up display. No, it's actually C and D; those are the correct answers. It's not a heads-up display and it's not an augmented reality display. It's a head-mounted display, because you wear it on your head, and it's also a small computer, because it needs the processing power to generate and render these 3D experiences.
Moving on to the next question: which of the following can be used as an input to virtual reality? A keyboard, your eyes, gamepads, a mouse, or controllers? We already know controllers, so I'm going to check that one. Can we use a keyboard? Yes, of course we can. Say your virtual reality application is something like multitasking, where you're wearing your headset and you have three or four screens you want to work across; in such cases you can connect your keyboard, and a mouse as well, and use them as input. Can we use our eyes? Yes: Facebook is coming up with a new headset called Project Cambria, and it can track your eyes and tell exactly what you're looking at. Can we use gamepads? Yes, we can use those too. So those of you who said all of them, you're right: all of them can be used as input devices for the Quest.

All right, going back to our presentation. In the next section, let us see what the metaverse is. The metaverse is a highly interactive, three-dimensional virtual world. Think of it as exactly like the real world we're in, but virtual. In the real world you can buy land, buy buildings, buy assets; you can go to a shop and buy a painting, or go to a clothes store and buy clothes for yourself. You can do all these things virtually as well, inside that universe. In a metaverse, you, your friends, and all the people connected to it can go inside and buy themselves some land, buy clothes, buy digital assets (the NFTs people talk about), and those things belong to you and nobody else. Then we have something called avatars. Avatars are nothing but replicas of us, the users, and they can do exactly the same things you do as a human. That's what a metaverse is all about. And the coolest part is how much you can do right now: there's one metaverse where you can buy land and build any building of your choice, say a theater, and then people can come inside the theater, watch a movie, and pay money for it, and you earn from that.

So now we know what a metaverse is, but not many of us know what goes into it, how a metaverse is built up from scratch. That's exactly what we're going to see now: the anatomy of a metaverse, how it is built from its base. At the base we have the infrastructure, and the most important infrastructure we need is network and hardware. Without these two you cannot build a metaverse, because you need a network infrastructure that can transfer high-bandwidth, decentralized data in real time. It has to be very fast: if there is latency and lag, your experience is completely broken; you don't feel like you're inside the metaverse, you just want to come back.
Okay, so you need a very good network infrastructure, and right now we have companies like AT&T and Jio working on 5G networks and those kinds of things. Next is the hardware infrastructure. If you want to access the metaverse, you need hardware, right? A headset, AR glasses, some device. For that hardware you need high-performing processors, high-capacity batteries, and various other components, so that's the other infrastructure: manufacturing that can produce those processors, batteries, and small components. That's the first layer of building a metaverse.

Let's see what the next layer is: the software. Once you have the infrastructure, we need the software. We have companies like Google and AWS providing software for cloud computing, so that you don't have to do everything locally. Then we have software like Unreal and the Unity engine, which is meant for creating the 3D content that goes inside the metaverse. Then we have platforms like AltspaceVR and VRChat, virtual platforms where you can go meet your friends or hold your meetings. And last, we have software for making avatars, which is Ready Player Me. There aren't many tools that create really good avatars, and my personal favorite is Ready Player Me, so you can use that to create yours.

The next layer on top of that is the interface. We have the infrastructure and we have the software; now there has to be some way for you to interact with the metaverse, and this is where headsets and other interfaces come into the picture. We have the VR interfaces, headsets like the Oculus and HTC Vive; then we have AR glasses, another type of interface to the metaverse; then we have holograms, which are also an interface; and finally we have haptic gloves as well. Companies like Apple, Oculus, HTC, and Google are providing these devices so that we can access the metaverse.

On top of that comes the economy. If you think about it, nothing will sustain without a good economic system around it, right? Right now the metaverse economy runs on three major things. The first is the asset store. Everything you see inside a metaverse is created by somebody: the assets, the images, the environment, the audio. All these creators are inside this economy; they create, they supply the people building metaverses, and they earn money from that. The second is ads. Suppose I'm a developer and I've made a nice application, a game or any application of my choice, and I spend some money to put it up inside the metaverse as an ad. That helps me as a developer showcase that my app is available to try and experience, so it gets a wider reach, and once people start using and buying my app, I start getting revenue from it. So that's another part of the revenue system.
And the third one is the service providers. Any transaction you do here is not done through regular internet banking, and you cannot use your actual money; the whole concept of the metaverse is based on decentralization, blockchain, and those kinds of things. Service providers like MetaMask and Uniswap act as the financial service providers: they help you with these transactions, and in return they take a small fee, which adds to the metaverse economy as well.

And at the top we have the use cases. Based on different use cases, there are different metaverses. We have something called Decentraland, a place where you and your friends can go play games together, collect different things, and showcase them. Then we have something called Horizon, which is more of a collaboration metaverse: you and your colleagues come together, discuss things, hold meetings and presentations. And finally we have OpenSea. OpenSea is more of a platform, not exactly a metaverse but connected to it; it's where people can go to buy, sell, or trade all their digital assets.

So now we really know what a metaverse is. We knew the word before, but none of us knew what actually went into it. If you think about it, it's really vast, and you can imagine how many resources are needed to create one metaverse: somebody provides the infrastructure, somebody provides the software, then come the interface, the economy, and finally the use cases at the top. It's not easy to make a metaverse of your own, is what I'd say, and it's really cool how fast it's evolving.

Now it's time for a quick quiz on the metaverse. The metaverse can be accessed only by AR and VR tech: true or false? False. Yeah, that's right, false is the correct answer. We saw that it can be accessed through other things, like holograms or haptic gloves, and it's not only VR; you can even access it from your PC. Take AltspaceVR, for example: you can access it from your laptop or computer as well as from VR. The only difference is the experience; it's more immersive when you're wearing a VR headset and less immersive on a 2D screen.

Next: which of the following describes the metaverse accurately? A, a virtual world that works only with 5G technology; B, a virtual world meant exclusively for virtual reality gamers; C, a virtual world where users can interact with each other. Is it option A, B, or C? Option C is what the majority of you picked, so let's check. Yes, that's exactly right. You don't need 5G technology; we don't have 5G everywhere right now and we can still access the metaverse. And it's not meant only for virtual reality gamers, because you can access the metaverse through other devices, like AR glasses or your PC.

Okay, let's go back to our presentation. So far we've seen what AR is, what VR is, and what the metaverse is. Next we'll see the importance of AR for the metaverse. We can use AR separately and we can have the metaverse separately, but when you bring them together, AR technology has a certain degree of power that can convince your brain that those elements really exist in your environment.
And that's exactly the moment when you start realizing that the world becomes a lot more interesting. What I mean is this: look at the image I've placed here, and imagine you're walking down a street full of stores and people. You put on your AR glasses, and all of a sudden you see cool things augmented over everything: what the stores are selling, information about people, when the signal is going to turn green or red and when the vehicles will move. So even though you cannot change the world you're living in, augmented reality makes it possible to add an extra dimension to the existing physical world. All you need is images, sounds, text, and GPS data, and with all of these in place you can enrich the experience of wherever you already are. That's the importance of AR for the metaverse.

One more important thing to remember here: the key element that really matters is the spatial effect. You need to make sure the AR content is placed in such a way that you can perceive depth. For example, look at this person standing here in black, with arrows moving around him that pass under his shoes: that makes us feel the image is below him, and we sense depth there.

Cool. Now we know how AR and the metaverse will work together; next we'll see how VR and the metaverse will work together, and why that matters. With VR we can have a fully immersive, dynamic 3D environment where it's possible to do almost anything, so it has the potential to substantially change the way people go to work or school, attend concerts, or go shopping. With the metaverse and a VR headset, you'll most likely perform the same tasks you do in real life, but from your home: you'll go shopping for your avatar and buy a few clothes for it, you'll have your own home, you can invite your friends over, and so on. Like I said, this has the capability of changing the way we interact and the way we work: you can meet virtually and feel as if your colleague is actually there in your room, sitting next to you. It's a completely different experience.

All right, let's move on to the next section, which is the live demo. I'm going to show you very quickly how to create an AR application where you tap on the screen and an avatar is placed: basically, you scan your floor or any horizontal surface, and once it's scanned, you tap and the avatar gets placed at that spot. Let's begin. There are certain prerequisites, which I've already set up. Manpreet, could you please share the document with the attendees? There we go, thank you. If you click on that link, it will take you to a Google Doc that lists all the requirements: you need to have the Unity software installed, and you need to have the project set up. For example, if you go to File and then Build Settings...
...there are quite a few settings you need to configure to make sure the project runs for AR. All these requirements and references are in the doc, so I'm not going to go through them right now; I'm going to jump straight in and show you how it's done, and you can look through the doc later.

Here you can see I have an empty scene, and the first thing to do is remove the Main Camera, because we don't need it; we need something called an AR camera instead. You get that by right-clicking in the Hierarchy window, going to XR, and clicking AR Session Origin. Inside it there's something called AR Camera, and that's what we need; it carries the MainCamera tag. You can see it's positioned at some other coordinates, so we'll set them to zero. It's not strictly required, but it's good to start with your AR camera at the origin.

Now that we have the camera, there's another component we need to add, called AR Session; it comes with an AR Session component and an AR Input Manager. The best thing about developing with Unity is that integrations like AR Foundation ship with pre-built components that you don't have to write yourself, so it's really simple: all you need to do is right-click and select the component you need, exactly as we did here.

So we have the camera and the session components needed for everything to run. The next thing is plane detection: we want to scan the floor, and when we do, a horizontal plane has to be detected. For that, click Add Component on the AR Session Origin and select AR Plane Manager. This manager takes a plane prefab and a detection mode; it can detect vertical or horizontal planes, whichever we choose. Now let's add the plane prefab. To create one, we first need a plane game object: right-click in the Hierarchy, go to XR, and click AR Default Plane. When you select it you can see it already has the components it needs: the AR Plane, the AR Plane Mesh Visualizer (which helps you visualize what the detected mesh looks like), a collider, and the renderer with a material attached.

At this stage it's a game object, but the AR Plane Manager is asking for a plane prefab. So how do you convert a game object into a prefab? Just select it and drag and drop it into your Project window, and it's converted into a prefab. Prefabs are nothing but pre-configured, reusable game objects. Pre-configured, because as you can see its components are already set up and it keeps them when you drag it into the Project window; and reusable, because even if I delete it from my scene, I can drag and drop it back in as many times as I want and it never changes. So it starts as a game object, and once you drag and drop it into your Project window it becomes a prefab that you can reuse any number of times.

Going back to the AR Session Origin: we have the Plane Manager, so we drag and drop the plane prefab into its slot. Next we choose the detection mode. Do we want to scan vertical planes? No, we want to scan just the horizontal plane, so we remove Vertical, and now our detection mode is set to Horizontal. Perfect. With this we have finished the initial scene setup: we have the AR Session Origin, we have the camera, and we have added AR plane detection.
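(Everything above is done through the editor UI, but the same configuration can be expressed in code, which makes the moving parts easier to see. Here is a minimal sketch, assuming AR Foundation 4.x; only ARPlaneManager and PlaneDetectionMode come from the AR Foundation API, while the script and field names are just for illustration.)

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Attach to the AR Session Origin next to the ARPlaneManager.
// Mirrors the editor setup: assign the plane prefab and
// restrict detection to horizontal surfaces only.
[RequireComponent(typeof(ARPlaneManager))]
public class HorizontalPlaneSetup : MonoBehaviour
{
    [SerializeField] GameObject planePrefab; // the AR Default Plane prefab

    void Awake()
    {
        var planeManager = GetComponent<ARPlaneManager>();
        planeManager.planePrefab = planePrefab;
        planeManager.requestedDetectionMode = PlaneDetectionMode.Horizontal;
    }
}
```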
Now, with just this much done, if you build and run the application, this is how it would look. The texture in this clip is slightly different (for us it's going to look orange), but generally this is how it works: you hold your phone and slowly move around scanning your area, the planes start getting detected, and they keep showing up on your device. How plane detection works is that ARCore, the underlying AR software, looks for something called feature points. For example, here we have a table which is brown in color, and below it a floor which is white, so it's able to distinguish that these are two different features and generate a mesh on top of each. That is how plane detection works. So we have the AR Plane Manager, we have the plane prefab, and we have the detection mode set to horizontal.
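(If you want to confirm in code that planes are actually being found while you scan, AR Foundation raises an event as planes are added. A small sketch of listening to it; the event and types are from the AR Foundation 4.x API, while the logging script itself is just illustrative.)

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Attach to the AR Session Origin. Logs every plane the
// plane manager detects, to confirm that scanning works.
[RequireComponent(typeof(ARPlaneManager))]
public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager planeManager;

    void OnEnable()
    {
        planeManager = GetComponent<ARPlaneManager>();
        planeManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        planeManager.planesChanged -= OnPlanesChanged;
    }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
            Debug.Log($"New plane {plane.trackableId}, alignment: {plane.alignment}");
    }
}
```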
The next thing to do is create the Ready Player Me avatar. If you click on the link we've shared, it will take you to the Ready Player Me website. You'll have to log in first, and then you can enter your hub and create your own avatar. I've already created mine; this is how I feel I look, but I'm not really sure. If you're new, go to My Avatars and you'll be able to create a new avatar, but I have mine already, so let's continue. Click on the three dots, and there's an option called Copy .glb URL; copy that URL and we'll go back to Unity.

Now, there has to be some way to import a Ready Player Me avatar into Unity, and luckily they have given us a Unity package, which I've also shared with you through the link. Opening the link takes you to Google Drive; right-click the file and choose Download. Once it's downloaded, drag and drop the file into your Project window, and another window pops up saying Import Unity Package. Normally these four items will all be checked (since I've already imported it, it's not showing anything for me), and you need to make sure that Newtonsoft Json is unchecked, because it's already in your project. If you leave it checked and click Import, you'll get errors saying it's already there, so make sure you uncheck it. And don't worry if you're not able to follow everything going on; I've sent you all the references. This is just a demo to show how quickly you can build an AR application, and all the resources are in the Google Drive and Google Docs links we've shared.

So, where was I? Yes, we downloaded the SDK. Once you download and import it, you'll see a Ready Player Me menu at the top of the editor. Click on that, then click Avatar Loader, and, remember, we created an avatar and copied its link, so paste the URL there and click Load Avatar. Give it a couple of seconds and we should see my avatar. There we go. Right now it's at 1:1 scale, which means it would be about our actual height if we ran the app, and we don't want that, so let's delete this instance. If you go inside the Avatars folder (and the avatars folder inside that), you'll find your avatar, and you can drag and drop that prefab into the scene. Now I want to configure it to my liking: I don't want 1:1 scale, I want a small avatar, so I'll change the scale to 0.3 in all three directions. There you go; now my avatar looks pretty small. One more thing you might notice: the avatar is facing away from the camera, and I want it to face towards the camera, so I'll rotate it by 180 degrees. Now it's facing the right way; let's put its position back to zero. Next, I'll rename it to something like MyAvatar and drag and drop it back into the Project folder, and here you need to click Prefab Variant, because it's a variation of the original prefab: the original has a scale of 1:1:1, whereas my variant has a different scale and a rotation as well. Once you've dragged it in, it's a prefab, and you can delete it from the scene; we don't need it there anymore.

So with this, we've set up our entire scene. The next thing is to detect the plane and then, when you tap on your phone, have the avatar get placed exactly there. That has to be done via scripting, and I've shared the script with you as well; it's the tap-to-place script. You can download it, and once you have it, drag and drop it into the Project window. Now I'll explain how the script works, so you get an idea of how scripting is done.

At the start we have the libraries: we need to declare all the libraries we'll be using. For example, if I want to call Input.GetTouch, that API lives inside one of these libraries, so for each API I use, I need the corresponding library declared. Once we have the libraries, the next thing is something called RequireComponent of type ARRaycastManager. It means that if I want to use my AR tap-to-place component, the ARRaycastManager has to be present as well. To give you a better understanding, I'll go back to Unity: I can add a component called ARRaycastManager and then drag and drop my script onto the object. But since AR tap-to-place has that RequireComponent(ARRaycastManager) attribute, watch what happens if I remove the raycast manager and drag and drop the script in directly: it automatically adds the ARRaycastManager. That's exactly what the attribute does. It says the script requires the ARRaycastManager component, so when you add the script, that component gets added automatically too, because we want to use it in our script.
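(In code, that opening part of the script looks roughly like the sketch below. The namespaces are AR Foundation's usual ones; the class name ARTapToPlace follows what the component is called in the demo, though the exact file shared in the doc may differ.)

```csharp
using System.Collections.Generic;      // List<> for raycast hits
using UnityEngine;                     // MonoBehaviour, Input, Touch
using UnityEngine.XR.ARFoundation;     // ARRaycastManager
using UnityEngine.XR.ARSubsystems;     // TrackableType

// Unity adds an ARRaycastManager automatically whenever this
// script is attached, because the script depends on it.
[RequireComponent(typeof(ARRaycastManager))]
public class ARTapToPlace : MonoBehaviour
{
    // variables and placement logic are explained next
}
```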
So we've declared our libraries and told Unity what components we need. The next thing is to declare the variables we'll use inside the script. The first variable is avatarPrefab. This is the avatar prefab we created earlier, the MyAvatar variant, and this variable stores it. Then we have something called spawnedAvatar: later in the script we'll be making a duplicate instance of the prefab, so we assign the prefab to the first variable and store the duplicate in spawnedAvatar. That way, any changes we make affect the duplicate, never the original. Then we have a variable called raycastManager, which simply stores the ARRaycastManager component. Then there's a variable called touchPosition, which stores the position of a touch: if I touch somewhere on my screen, the x and y values of that point get stored in this variable. And at the end we have another variable called hits; I'll come back to why we use it a little later.

Now the first method, private void Awake. This method is called once, when the application first runs: it gets the ARRaycastManager component and stores it in the variable. That's all it does. The second method is the Update function, which is called every single frame. What it does is check whether there is a touch: Input.touchCount > 0 means that only if I'm touching the screen does this whole set of steps execute; if I'm not touching the phone, these steps are skipped entirely. Why greater than zero? Because I can touch with one finger or several; there are various ways to touch the phone. If the count is greater than zero, I get the first touch and store it in a variable called touch. Once I have that, I check whether touch.phase equals TouchPhase.Began. If you expand TouchPhase you can see there are various enum values; if you're wondering what an enum is, it's basically a predefined constant, and enums are used when we know all the possible values. Here the possible values are: Began, when your touch has just started; Canceled, when the touch is interrupted; Ended, when you lift your finger completely; Moved, when you're touching and moving; and Stationary, when you're touching and holding still. There are different possible scenarios and we know all of them, and we want the moment the touch has just begun, so that's the condition we check. If it's met, we store the touch's position in the touchPosition variable.

A quick recap of what we've done here: first we check whether there is a touch; if there is, we store it in a variable called touch; then we check if the touch has just begun, and if so, we store its position in touchPosition.
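(That touch-handling pattern is standard Unity input code and can be tried on its own before wiring up the raycast. A self-contained sketch, assuming the old Input Manager that the demo uses; the class name is just for illustration.)

```csharp
using UnityEngine;

// Standalone demo of the touch check used in the tap-to-place
// script: runs every frame but reacts only to a touch that has
// just begun, storing its screen position.
public class TouchProbe : MonoBehaviour
{
    Vector2 touchPosition;

    void Update()
    {
        if (Input.touchCount > 0)                 // is the screen touched at all?
        {
            Touch touch = Input.GetTouch(0);      // the first finger on the screen
            if (touch.phase == TouchPhase.Began)  // only on the frame the touch starts
            {
                touchPosition = touch.position;   // screen-space x, y in pixels
                Debug.Log($"Touch began at {touchPosition}");
            }
        }
    }
}
```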
Once all this is done, the next thing is to use the ARRaycastManager component and its Raycast method. This method needs three things. First, the screen point, the point at which I'm touching. Second, a variable to store the hit results: when the raycast goes and hits something, it generates results, so there has to be a variable to store them, and that's our hits list from earlier. Third, what we are tracking, the trackable type. What a raycast basically is: when you touch the screen, it shoots a ray, like a laser (it doesn't actually shoot a laser, I'm just explaining), and it sees whatever that line hits. The ray is shot from the touch position, so wherever I touch, a ray shoots from there, and once it hits something, all the information gets stored inside the hits variable.

Now, the trackable type: what are we tracking right now? If I open the dropdown, am I tracking my face? No. Am I tracking an image? No. I'm tracking a plane, but what type of plane? Am I tracking PlaneWithinBounds? That basically means a rectangular bounding box, and that's not what we're tracking. Are we tracking PlaneWithinInfinity? That means tracking the plane extended from its position and orientation, beyond its detected edges, but that's also not what we're doing. We are tracking PlaneWithinPolygon, which is basically a 2D convex shape, so that's what we'll select. If you remember the plane-detection image from before: the detected plane is not a rectangular shape, it's a convex shape that keeps building up, and that's what a trackable type of PlaneWithinPolygon means.

So, if this returns true, that is, if the raycast I shoot from the touch position hits a plane within a polygon, then the next block of code executes. What that block does is store the first hit's pose. Imagine you have a table with a floor below it, and you tap your phone: the raycast shoots, hits the table first, and then hits the floor as well, so it hits two points. We want the first point it hits, which is the table, so hitPose stores the first hit's pose when you touch the screen. Once we have hitPose, we know the position, and the next thing is to spawn the avatar. When you open the application and scan your floor, there's nothing there yet; the avatar hasn't been spawned. So the first time, when spawnedAvatar is null, we instantiate the avatarPrefab at the hit position, wherever the ray hit, with the rotation of the prefab so that it faces the camera. And in case the avatar already exists, because you touched once and it spawned, then the next time you touch somewhere else we just want to move its position, not create a duplicate: you touch one place, it spawns there; you touch somewhere else, it jumps to the new position. That's done using the statement spawnedAvatar.transform.position = hitPose.position.
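(Putting the whole walkthrough together, the tap-to-place script looks roughly like this. This is a reconstruction from the explanation above rather than the exact file shared in the Google Doc, and it assumes AR Foundation 4.x with the old Input Manager.)

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

[RequireComponent(typeof(ARRaycastManager))]
public class ARTapToPlace : MonoBehaviour
{
    [SerializeField] GameObject avatarPrefab;  // the MyAvatar prefab variant

    GameObject spawnedAvatar;                  // the duplicate placed in the world
    ARRaycastManager raycastManager;
    Vector2 touchPosition;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake()
    {
        // Grab the raycast manager once, at startup.
        raycastManager = GetComponent<ARRaycastManager>();
    }

    void Update()
    {
        // Only run when the screen is being touched.
        if (Input.touchCount == 0)
            return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began)
            return;                            // react only when a touch starts

        touchPosition = touch.position;

        // Shoot a ray from the touch point against detected plane polygons.
        if (raycastManager.Raycast(touchPosition, hits, TrackableType.PlaneWithinPolygon))
        {
            // hits[0] is the nearest hit, e.g. the table above the floor.
            Pose hitPose = hits[0].pose;

            if (spawnedAvatar == null)
            {
                // First tap: create the avatar at the hit point, keeping
                // the prefab's rotation so it faces the camera.
                spawnedAvatar = Instantiate(avatarPrefab, hitPose.position,
                                            avatarPrefab.transform.rotation);
            }
            else
            {
                // Later taps: just move the existing avatar.
                spawnedAvatar.transform.position = hitPose.position;
            }
        }
    }
}
```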
So that's what the script does. To give you a small summary: it scans a horizontal plane, and once the plane is scanned, you tap; if you've scanned a proper plane, the avatar gets spawned where you tapped, and once it's spawned, you tap somewhere else and it jumps to that place. So we have the script ready, and all you need to do is drag and drop it onto the AR Session Origin, then add the avatar prefab we created by dragging and dropping it into the script's slot. Then go to File, Build Settings, and click Build And Run. It will ask you for an APK name; you can name it something like tap-to-place, click Save, and it will start building and generate the application for you. Once it's built, it should look something like this: when you tap for the first time, the avatar appears, and when you tap somewhere else, it skips to the next position. And that's how you make an AR application. This is a really small, simple demo that you can build really quickly, and it's pretty cool as well. This is just my avatar; you should definitely check it out yourselves: go to Ready Player Me, create your own avatar, download it, and use Unity and C# scripting. You have everything you need; all you have to do is put the pieces in place and hit Build And Run, and you'll have your first cool application that you can go show your friends: hey, see this very cool application I built.

All right, I think I'll hand it back over to Manpreet.

Thanks, Ashray. The whole webinar was really amazing, and I'm sure we all got to learn a lot of new things. Now we'll be opening the floor to our audience for their questions, but before we start taking those up, I have a question for you, and I can see that a lot of people are asking it too: for somebody who is eagerly interested in learning more about this domain, what is the right path, and what are the future prospects? We'd love to know.

Yeah, definitely. I'd like to answer that from my personal experience. I started off as a mechanical engineer and then got into this space, and back then it was kind of difficult for me, because I had to learn C# from one source, Unity from some other source, and then development and all these things on top. I really wished back then that somebody could put all of these things together as a package: start with Unity, then learn C#, then learn to develop this and that. And those would be the right steps: the first step is to learn Unity, then learn how to code in C#, then start learning AR and how to develop AR apps, and then you can learn VR and how to develop VR apps as well.
But lucky for all of you, we already have this package on Internshala Trainings, and I think you should check it out.

That's right. And with respect to the future prospects?

I can say that right now we are at the very beginning. Not many people know about this field; if you go and tell somebody, hey, I work in AR and VR, they ask, wait a second, what is that? It's at a nascent stage, so this is the right time to start exploring it, because in the next three or four years it's going to blow up, and when it does, you're going to reach great heights along with it. That's what I feel.

Definitely. And for all the keen learners here today, Internshala Trainings is offering a special 10% discount: click on the link in the description and use code PRACTICALS10 to enroll now in our AR/VR training and 70+ other practical-based online trainings. We can now open the floor to our audience, so I'd request all of you to type your questions in the chat. Could you take up a couple of questions from the chat?

Yeah, sure. For those of you who joined from LinkedIn, I think you'll find the link in the video description. What is Unity? Unity is basically a development engine, the software I used just now; you can create all kinds of content in Unity, and that's what it's used for. Oh, there are so many questions coming in; let's see, where do I start? What are the ways to get into this career? There are different paths; it's not like you have to become a developer to get into the XR community. You can be a 3D artist as well, someone who knows how to make models and do proper lighting and those kinds of things, because AR and VR need models too. There are some differences: for example, if you're developing a mobile application, your models can't have a lot of polygons, because then your phone will start lagging. So if you're a 3D artist, you need to learn how to create good models for AR and VR, and that's one way in. The second way is definitely through developing: if you're a developer, that's how you get in, and you need to know programming. Actually, to start with XR there are two routes: one is C++ with Unreal Engine, and the other is C# with Unity. The difference between Unreal and Unity is that Unity has a huge community, so if you get stuck anywhere, you just google your problem and the solution is already there, because a lot of developers use it and know the answers. Unreal is not as strong there; if you get stuck somewhere, the help and support isn't as extensive, but in terms of visuals Unreal is definitely better, and you can make better-looking games with it than with Unity. And yes, you need to know coding to make even small things work. For example, as we saw, if you're making an AR app and you want a tap to do something, it's not like somebody has pre-written the code for you. You as a developer need to know how to code and make things work. Suppose that instead of having my avatar jump to a position, I want to make him walk: then it's my job as a developer to write the code that makes sure that when I tap, he walks in that direction. It's these kinds of things.
All right. Actually, I think that would be all, since we are short on time. Thank you so much for answering all these questions and for the awesome webinar. And to all of you, thank you so much for joining. Don't forget to like this video and share it with your friends, do subscribe to our YouTube channel, and press the bell icon so you never miss an update about our upcoming events. You can also join our Telegram channel and get updates about our events there; the link is in the description. We'll see you soon. Bye!
2022-06-25