Mixed Reality Utility Kit: Build spatially-aware apps with Meta XR SDK

Captions
Hi XR developers! Today we're going to talk about the new tool from Meta called the Mixed Reality Utility Kit, or MRUK for short. Before, when we wanted to test features like the Scene API or anchors, we always had to build an APK and test it on our device. MRUK now allows us to test everything directly in the editor when we use Quest Link. It even comes with room prefabs, which allow us to test different rooms to better fit our application to the different room sizes of our users. Trust me, this is a really great tool for developing mixed reality experiences, and I hope you're as excited as I am. If you like this type of video and it is helpful to you, please take a second to like and subscribe to this channel; it helps me a lot. If you'd like to get the source code for each video, please consider subscribing to my Patreon, where you can find all the source code. If you have any questions, please join our Discord community; we are a community of over 200 XR developers at this point, and we are happy to help you with any questions. And now let's get started with MRUK.

To begin, let's cover the requirements for using MRUK. To use MRUK with Unity and Meta Quest, ensure you have Unity 2021.3.30 or newer and a Quest 2, Quest Pro, or Quest 3 with firmware version 60 or newer. On PC, use Quest Link, but remember that the passthrough image is headset-only, and set up your room scan before connecting. On Mac, you will have to build and deploy an APK. Keep in mind that you do not need an OVRSceneManager in your scene; MRUK serves as a replacement for it. Lastly, it is advised to familiarize yourself with Meta's Scene API first.

With that out of the way, let's set up a new Unity project. We first want to install the Meta XR SDK from the Package Manager; you can simply install it by its name, which is com.meta.xr.sdk.all. Next, we install the Mixed Reality Utility Kit from the Package Manager by its package name as well.
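The two installs above can also be declared directly in the project's Packages/manifest.json instead of typing names into the Package Manager. This is a minimal sketch: the MRUK package id shown here (com.meta.xr.mrutilitykit) is cut off in the captions and is given as it appears in recent Meta XR SDK releases, and the version numbers are placeholders, so verify both against your installed SDK.

```json
{
  "dependencies": {
    "com.meta.xr.sdk.all": "60.0.0",
    "com.meta.xr.mrutilitykit": "60.0.0"
  }
}
```

After editing the manifest, Unity resolves and downloads both packages the next time the editor gains focus.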
Also make sure to install the samples, which we will look at in a second. Next, let's create a new room scan inside our Meta Quest. The more accurate the scan, the better our experience will be later, so take your time to walk around your room and look at your surfaces from different angles. Also take enough time to add new anchors for your furniture with the correct labels, such as table, couch, or bed.

Now, in order to test any of our scenes later, let's set up our project with Meta's Project Setup Tool. Just apply all the suggested changes, which will set up everything for us, even our XR plugin for testing the scene directly on our device. Do this for both Windows and Android, and lastly, switch the platform to Android in the Build Settings in case you want to deploy the app to your headset later.

We are now finally ready to try some of the samples and examine the components and functions that come with the Utility Kit. First, let's look at the MRUK base scene. We can see that we have a regular OVR camera rig with the tracking origin mode set to Stage. Under Quest features, we can see that we don't need to enable support for spatial anchors, scene understanding, and passthrough like we normally have to when we don't use the Utility Kit. However, we still require an OVRPassthroughLayer to actually be able to start our experience in passthrough mode. Keep in mind, however, that if you would like to build your app to your Quest device, you will still need to enable anchor and scene support as well as passthrough.

Let's now look at the MRUK prefab, which contains the main component, MRUK. It is a singleton and therefore should only exist once within our scene. First, we have a scene loaded event, which lets us easily execute any public method once our scene has loaded. Keep in mind that you don't have to use this component to reference all your methods; the Utility Kit also comes with an MRUK Start component, which you can find on the EffectMesh game object,
for example. This component is no different from the MRUK component but simply exists for drag-and-drop ease of use, so you don't have to use the event on the MRUK component to reference all the methods you want to execute when your scene has loaded.

Now let's look at the main component in detail. The first checkbox we enable is called World Lock. World locking is a new feature that makes it easier for developers to keep the virtual world in sync with the real world. Previously, the recommended method was to ensure that every piece of virtual content was attached to an anchor. This meant that nothing could be considered static, and everything had to cope with being moved by small amounts every frame, which can lead to a number of issues with networking, physics, rendering, and so on. So we definitely want to keep this box checked.

Next, we look at the scene settings. Our data source can either be a room prefab, which is provided by the Utility Kit, or the scene model that we have created inside our headset. There is a third option called Device with Prefab Fallback, which means that if we haven't set up a room scan inside our Quest home, our application will make use of the room prefabs that the Utility Kit provides. This is beneficial not only when we cannot scan our own room, but also when we want to test our app in a variety of rooms that could be similar to our end users' rooms.

Next, we have a room index and a list of room prefabs that can be loaded. Setting the room index to minus one means that a random room prefab will be loaded; setting the index to zero means the first room prefab will be loaded every time, and so on. The list below comes pre-filled with some room prefabs. If we play the scene two different times in the editor, we will see that Unity randomly loads two different rooms.

Next, there is the Load Scene on Startup checkbox.
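The scene loaded event described here can also be registered from code rather than wired up in the inspector. This is a minimal sketch, assuming the MRUK singleton and the RegisterSceneLoadedCallback hook used in Meta's samples; verify the exact API against your SDK version.

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

public class SceneReadyLogger : MonoBehaviour
{
    void Start()
    {
        // Register a callback that fires once MRUK has finished
        // loading the scene model (or a room prefab fallback).
        MRUK.Instance.RegisterSceneLoadedCallback(OnSceneLoaded);
    }

    void OnSceneLoaded()
    {
        // At this point the current room and its anchors can be queried.
        MRUKRoom room = MRUK.Instance.GetCurrentRoom();
        Debug.Log($"Scene loaded: room '{room.name}' is ready.");
    }
}
```

Registering from code keeps the dependency explicit in your scripts instead of hiding it in a serialized UnityEvent.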
When enabled, the scene is automatically loaded if it exists, and the scene loaded event is fired with no further action; when disabled, you can manually control the scene initialization behavior.

Lastly, we can specify the width of a seating area within our scene model. This means that if we have set up an anchor with the couch label and it is at least as wide as specified on our MRUK component, it can be queried from code. For example, we could call TryGetClosestSeatPose, which returns the closest seat pose on any couch object, or we could call GetSeatPoses, which simply returns all seat poses in the room; it returns zero results if there are no couch objects in the scene.

Let's now look at the EffectMesh component. This component allows us to easily render our scene model with a different material, which we can assign under Mesh Material. It also allows us to enable colliders, which lets us interact with physics in our scene, like bouncing objects off our models. We can also allow the casting of shadows on our surfaces, which lets us see the shadows of other objects as they move within our scene. Lastly, we can decide which labels to apply this scene effect to, for example just floors and walls, or all the objects in our scene. I will leave a link in the description that explains the other properties on this component, which we won't look at in more detail in this video.

Finally, let's look at the Room Guardian game object. It contains the EffectMesh and MRUK Start components like the EffectMesh game object, but this time we also have a Room Guardian component where we can set a distance; this is the distance at which the guardian is activated when the player moves within it. On the scene loaded event, we call the GetEffectMeshMaterial method from the Room Guardian component. This method finds the mesh from our EffectMesh and fades the guardian material depending on our distance to the guardian. We can see that right here in the code as well.
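The seat pose queries mentioned above can be sketched roughly like this. The method names come straight from the video, but the exact signatures and return types (here assumed to be a ray-based query with an out parameter, and a pose list) differ between SDK versions, so treat this as a sketch rather than the definitive API.

```csharp
using System.Collections.Generic;
using Meta.XR.MRUtilityKit;
using UnityEngine;

public class SeatFinder : MonoBehaviour
{
    // Hook this up to the MRUK scene loaded event in the inspector.
    public void FindSeats()
    {
        MRUKRoom room = MRUK.Instance.GetCurrentRoom();

        // Closest seat pose along the user's gaze (assumed ray-based signature).
        var gaze = new Ray(Camera.main.transform.position,
                           Camera.main.transform.forward);
        if (room.TryGetClosestSeatPose(gaze, out Pose closestSeat))
        {
            Debug.Log($"Closest seat pose at {closestSeat.position}");
        }

        // All seat poses on labeled couches; empty if the room has no couch.
        List<Pose> seats = room.GetSeatPoses();
        Debug.Log($"Found {seats.Count} seat poses in the room.");
    }
}
```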
Let's take a look in the editor: you can see the guardian being simulated very accurately, as if we were wearing our headset right now and walking towards a boundary. You can play around with the distance and figure out which value is best for your game.

The last thing we want to look at in this basic scene is the Scene Debugger component. This component offers us a menu with a variety of tools for getting anchor and surface information. It also allows us to shoot a projectile into our space if we enable collision on our EffectMesh. Let's press play and see what kind of functions are available to us from the MRUK singleton class that comes with the Mixed Reality Utility Kit. We can get the key wall, which is the longest wall in the room with no other room corners behind it, or we can request the largest available surface or the closest surface position, which can be great for placing content in our own app. We can also query the closest seat position or visualize where our raycast is hitting the model, for example to get a better understanding of how users are interacting with our app.

Now let's look at some code, and for that we open the SceneDebugger script. I want to show you how easy it is to query all this information yourself and create your own unique gameplay with it. We can use three main classes: MRUK, MRUKRoom, and MRUKAnchor. They all come with a bunch of methods that provide us with a lot of information about our room; I will leave a link in the description to all of these methods. Now let's quickly look at how Meta has used some of them for the debugging functionality. Let's check line 156, for example. As you can see, the GetKeyWallDebugger method retrieves the key wall and applies the debug visuals to it. To get the key wall from our room model, we can simply use the MRUK singleton by calling MRUK.Instance; then we need to get the current room and get the key wall from it by calling GetCurrentRoom() followed by
GetKeyWall(). As you can see, our key wall has the type MRUKAnchor. Let's also take a look at another method, at line 197. We can see that we are doing the exact same thing: we declare another local MRUKAnchor variable and get the largest available surface by calling the MRUK instance and then getting the current room we are in. We can then simply call the FindLargestSurface method and provide it with the surface type parameter so it knows which surface we are looking for. Meta really made it super easy for us to query a bunch of scene data, but let's look at a few more samples to cover the most important features of this amazing Utility Kit.

We open the FindFloorZone scene and its module, because the next component we will look at is FindSpawnPositions. This is an excellent tool for when we have our own prefabs, say a model of a small building for an architecture application, and we want to check where we can place it without overlapping our furniture. We reference our prefab in the Spawn Prefab field, and depending on its size, the component decides where and how often it can be placed on our surfaces. If we look at this floor spot prefab, for example, we can see it is two units long and wide, so it will only fit in places of that size or bigger. We can set the number of prefabs we want to place, as well as the number of times to attempt spawning or moving an object before giving up. We can also specify where on the surface we would like to spawn the objects and for which kinds of labels, for example just on the floor. If Check Overlap is enabled, the spawn position is checked against colliders to make sure there is no overlap. Lastly, we call the StartSpawn method from FindSpawnPositions on the scene loaded event. Let's give this scene a try: we can see how the different prefabs of different sizes are spawned onto our floor surfaces. Great!
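The two SceneDebugger lookups just described can be condensed into a small helper. The call chain (MRUK.Instance → GetCurrentRoom → GetKeyWall / FindLargestSurface) follows the video, but the out parameter on GetKeyWall and the label argument on FindLargestSurface are assumptions about the exact signatures, so check them against your SDK version.

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

public class RoomQueryDemo : MonoBehaviour
{
    public void LogKeyWallAndLargestSurface()
    {
        MRUKRoom room = MRUK.Instance.GetCurrentRoom();

        // Key wall: the longest wall with no room corners behind it.
        // (Assumed: the wall's size comes back through an out parameter.)
        MRUKAnchor keyWall = room.GetKeyWall(out Vector2 wallScale);
        if (keyWall != null)
        {
            Debug.Log($"Key wall '{keyWall.name}', width {wallScale.x:F2} m");
        }

        // Largest available surface for a given label (label value assumed).
        MRUKAnchor largest = room.FindLargestSurface("TABLE");
        if (largest != null)
        {
            Debug.Log($"Largest TABLE surface: '{largest.name}'");
        }
    }
}
```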
There's one very cool component left that I want to show you before we close off this video, and that's the AnchorPrefabSpawner. Let's open the VirtualHome scene and look at the FurnitureSpawner game object. The AnchorPrefabSpawner allows us to effectively replace existing anchors, such as beds and tables, with virtual objects, or in other words, with prefabs that we prepared beforehand. If we open up Prefabs to Spawn and then open one of the elements, we can see that we first specify the label of the anchor we want to replace, in this case the walls, and after that we assign the prefab that our walls should be replaced with. Let's give this last scene a try and see our room with the walls we added as prefabs. We can see that our real room has turned into a completely virtual room. This allows us to modify our users' rooms in the style of our game, which opens up a huge variety of gameplay, and I can't wait to see what all of you build with the Mixed Reality Utility Kit.

All right, and that's it for this video. I hope you learned a lot about MRUK today. If you're enjoying this content, please take a second to like and subscribe to this channel. Consider subscribing to my Patreon if you want the source code of each tutorial, and feel free to join our Discord community if you have any questions. Thank you so much for watching, and see you in the next one!
Info
Channel: Black Whale Studio - XR Tutorials
Views: 5,671
Keywords: meta quest passthrough, develop app for meta quest, unity vr, unity meta, unity vr tutorial, unity tutorial 2024, meta quest tutorial, oculus integration, unity installation, meta vr, meta mixed reality, meta quest developer tools, mruk, mixed reality utility kit, meta utility kit, meta mixed reality utility kit, meta quest, meta quest 3, meta quest pro, meta quest 2, unity mixed reality, meta mixed reality utility, meta mixed reality toolkit, meta mixed reality kit
Id: n6YZlp4yMwM
Length: 13min 59sec (839 seconds)
Published: Sat Dec 30 2023