Apple Vision Pro Input & Object Manipulation with Unity PolySpatial | Connect to Unity and Xcode

Video Statistics and Information

Captions
Hi XR developers! In today's video I'm going to teach you how to use the 3D touch input for spatial pointer devices like the Apple Vision Pro. I prepared a bunch of input scripts for you which let us manipulate GameObjects during runtime in different ways, such as changing their color, moving them around, or rotating them. Next to 3D touch there is also something called skeletal hand tracking, which is available to us through the XR Hands package from Unity; we're going to look at that in a different video. If you enjoy this type of content, please take a second to leave a like and subscribe. If you want to get access to all the source code, please consider supporting me on Patreon, and if you have any questions, feel free to join our growing XR developer community on Discord. And now, let's learn about 3D touch input, or spatial pointer device input.

We start with a fresh Unity project; you can watch my previous video to find out how to install all the necessary packages. First I make sure to select the right target SDK, which is Device for me. Then we go to the Package Manager and make sure that we install the Play to Device input settings as well as the Unity PolySpatial Samples from the PolySpatial package. Let's go ahead and add all the scenes to our Build Settings so we can later test all the scenes without missing any.

First, let's open the Balloon Gallery sample. To test the scene we can use the Play to Device feature directly from our Apple Vision Pro device. Open the Play to Device window, then head over to the website in the description to join the TestFlight beta for the Play to Device app from Unity. On your Apple Vision Pro, go ahead and open the TestFlight app, open Unity's Play to Device app, and install it. After the installation, click on Start Testing. The app will now open and display a local IP address. We can enter this IP into our Unity editor window and click Add Device. After that has been done, we are ready to hit Play in the Unity editor, which will automatically open the scene directly on our device as well. We can now manipulate the scene either in our editor or on the actual device, and the changes update almost instantly on both ends. For example, I can now look at the balloons and pop them by pinching, or by touching them directly.

We can also exit this scene and go to the selection menu, where we can find many other sample scenes from Unity. One sample we are especially interested in in this video is Object Manipulation: as you can see, we can simply move the cubes by looking at them and then pinching. Go ahead and try out all the scenes that Unity provides for us; it is a lot of fun, and it works great on the device.

Let me now show you another way of testing your Unity scenes on your Apple Vision Pro: building an Xcode project. Once the build is finished, go ahead and open it in Xcode by double-clicking on it. Before we can deploy this app, we need to actually connect our device. Go over to Window and select Devices and Simulators. As you can see, my device is already connected, so let me show you how to connect yours: on your device, go over to Settings, and under General select Remote Devices, where you should see your MacBook. On your MacBook you should now be able to see your device; click on Pair. Then it is crucial to enable Developer Mode. We can do this by going to Privacy & Security and scrolling all the way down, where we can toggle Developer Mode. Fantastic! A last important step is to go over to the Signing & Capabilities tab in Xcode and make sure you don't have any errors there. I usually check Automatically manage signing and select my team, and after a few seconds the errors should all disappear. We can then finally deploy our project. Once the build is successful, we are able to test our scenes just like before, but this time it is a 100% true representation of the actual operating system and not our Unity editor.

Awesome! Now let's go over to the core part of this tutorial, which is the spatial input. Let's open the Input Data Visualization scene. There are multiple ways of reading input from your spatial input device. Here Unity used input action references, as we can see by clicking on this manager; it is a very sophisticated implementation, which might not be the best sample to get started with. If we click on an input action reference, we can find the input action map. Developers that are used to working with Unity's new Input System can simply define actions here. Unity uses the Spatial Pointer Device for 3D touch input, which can be found under Other and then Spatial Pointer Device. There is also a VisionOS Spatial Pointer Device; the primary difference between the two is that its interactions don't require colliders, and thus the VisionOS Spatial Pointer Device is missing the input controls related to the interaction target. There is a primary spatial pointer for detecting the primary interaction, and a spatial pointer 0 and spatial pointer 1 for the first and second interactions respectively. A small sketch below shows what reading this control from code could look like.
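As a quick illustration of the action-based route, a minimal sketch of reading the primary spatial pointer from code might look like this. The binding path, the SpatialPointerState value type, and the PrimaryPointerLogger class name are assumptions based on the action map described above, not a verbatim excerpt from the sample:

using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.LowLevel; // SpatialPointerState (assumed location)

public class PrimaryPointerLogger : MonoBehaviour // hypothetical helper
{
    InputAction m_PointerAction;

    void OnEnable()
    {
        // Bind straight to the primary spatial pointer control; a real project
        // would typically reference an action from an action asset instead,
        // as the sample's manager does. The binding path is an assumption.
        m_PointerAction = new InputAction(binding: "<SpatialPointerDevice>/primarySpatialPointer");
        m_PointerAction.Enable();
    }

    void OnDisable()
    {
        m_PointerAction.Disable();
    }

    void Update()
    {
        // Poll the full pointer state each frame.
        var state = m_PointerAction.ReadValue<SpatialPointerState>();
        if (state.targetObject != null)
            Debug.Log($"Primary pointer is interacting with {state.targetObject.name}");
    }
}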
Let me now show you an easier way to poll input directly in our code. For this, let's create a new scene. For a mixed reality app we need a regular camera as well as a volume camera. Let's add the Volume Camera component and decide whether our app should be bounded or unbounded; I'll go with bounded and therefore assign the bounded volume settings. We also add the Volume Camera Resize Listener; this script for Unity's visionOS support adapts the dimensions of a volume camera to maintain the desired content appearance. Let's also make sure we go over to the PolySpatial settings and assign the bounded volume camera configuration there.

Now let's start with our actual logic. We would like to manipulate a simple cube, so let's add a cube and resize it to fit our volume. We want to make sure that the cube has a collider to register our input. Also, let's add the VisionOS Hover Effect and the VisionOS Grounding Shadow; hover and shadow are both handled by the operating system and look amazing. We then add an empty GameObject called InputManager, where we will later add our logic. Lastly, to make the input work on our device, we add an Event System; we can disable the Input System UI Input Module here.

First we want to change the color of the cube when we make an input. For this I prepared the ColorChangeOnInput script. We use a new namespace called Unity.PolySpatial.InputDevices to access the spatial pointer support. We can also leverage the EnhancedTouch API for polling touch phases and counting our touches; up to two inputs can be registered at the same time. We lastly need the InputSystem.LowLevel namespace to access the state of our spatial pointer, which lets us detect the actual target object. In the OnEnable method we make sure to enable the EnhancedTouch support. In the Update method we first check if we actually performed a touch. Then, for every touch, we check if the touch phase equals TouchPhase.Began; this is the beginning, or in other words the first frame, of the touch. We do this so we don't perform an action every frame for as long as we are holding the pinch, for example. For this first frame we then get the spatial pointer state by calling GetPointerState on our enhanced touch. If we actually detect an object, we call a custom ChangeObjectColor method, where we fetch the object's renderer and assign a random color. A sketch of this script is shown below.
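A minimal sketch of what ColorChangeOnInput can look like, following the polling pattern described above. The exact source is available on the author's Patreon, so the class and method names here mirror the video's description but are an approximation:

using Unity.PolySpatial.InputDevices;
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using UnityEngine.InputSystem.LowLevel;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;
using TouchPhase = UnityEngine.InputSystem.TouchPhase;

public class ColorChangeOnInput : MonoBehaviour
{
    void OnEnable()
    {
        // Required before Touch.activeTouches can be polled.
        EnhancedTouchSupport.Enable();
    }

    void Update()
    {
        // Did we actually perform a touch? Up to two can be active at once.
        if (Touch.activeTouches.Count == 0)
            return;

        foreach (var touch in Touch.activeTouches)
        {
            // Only react on the first frame of the touch, so holding a pinch
            // doesn't change the color on every frame.
            if (touch.phase != TouchPhase.Began)
                continue;

            // Resolve the spatial pointer state for this touch; the collider
            // we hit with our gaze/pinch shows up as targetObject.
            SpatialPointerState state = EnhancedSpatialPointerSupport.GetPointerState(touch);
            if (state.targetObject != null)
                ChangeObjectColor(state.targetObject);
        }
    }

    void ChangeObjectColor(GameObject target)
    {
        if (target.TryGetComponent<Renderer>(out var objectRenderer))
            objectRenderer.material.color = Random.ColorHSV();
    }
}

The ColorChangeOnDirectTouch variant discussed below differs only by an extra guard on the touch kind, e.g. if (state.Kind != SpatialPointerKind.Touch) continue;.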
Fantastic! Let's go back to Unity and assign our new script to the InputManager. We can then hit Play and test the scene on our device. As you can see, our cube casts a realistic shadow in my room thanks to the Grounding Shadow component we added to it. We can now look at the cube and pinch to change its color, and we can even touch the cube directly to change its color. Awesome! As you can see, a touch can mean an indirect pinch from a distance, a direct poke, or even a direct pinch on the object.

Now, what if we only want to change the cube's color when touching it directly, but not when pinching? For this I made a few changes to our logic in the ColorChangeOnDirectTouch script. The only difference here is that we additionally check whether the kind of touch we performed is Touch, which means a direct poke; there is also DirectPinch, IndirectPinch, Pointer, and even Stylus.

Now that we change the color with our direct touch, let's add some new functionality to our pinch. For this I created the MoveObjectOnInput script. The logic here is largely the same, except that we are now explicitly excluding the direct touch. When a touch begins, this code selects the object being touched and records its initial position. If the touch moves, it calculates the movement delta from the last position, moves the object accordingly, and updates the last known position for future calculations. The last bit of code resets the selected object when there are no active touches. A sketch of this script appears at the end of the transcript.

Great, let's go back to Unity, assign our scripts, and test this new interaction on our device. As you can see, we can now pinch the cube and move it around within the bounded volume; as we approach the boundaries, the cube gets cut off. When we directly touch the cube, we can still change its color. Fantastic! You are now ready to implement your own interactions for your Apple Vision Pro application, and I am very curious to see what you will come up with.

All right guys, and that's it! Let me know if you want to see more videos about the Apple Vision Pro, and I hope you found this video helpful. Let me know what you want to see next, and as always, please take a second to like and subscribe, support me on Patreon to get all the source code, and feel free to join our growing XR developer community on Discord. Thank you so much for watching, and see you in the next one!
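For reference, here is a matching sketch of the MoveObjectOnInput logic described in the transcript, under the same assumptions as the snippet above (approximate names; interactionPosition is assumed to be the pointer's world-space interaction point):

using Unity.PolySpatial.InputDevices;
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using UnityEngine.InputSystem.LowLevel;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;
using TouchPhase = UnityEngine.InputSystem.TouchPhase;

public class MoveObjectOnInput : MonoBehaviour
{
    GameObject m_SelectedObject;
    Vector3 m_LastPosition;

    void OnEnable()
    {
        EnhancedTouchSupport.Enable();
    }

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
        {
            SpatialPointerState state = EnhancedSpatialPointerSupport.GetPointerState(touch);

            // Explicitly exclude the direct poke so it stays reserved for
            // the color change; pinches are what move the object.
            if (state.Kind == SpatialPointerKind.Touch)
                continue;

            if (touch.phase == TouchPhase.Began && state.targetObject != null)
            {
                // Select the touched object and record the initial position.
                m_SelectedObject = state.targetObject;
                m_LastPosition = state.interactionPosition;
            }
            else if (touch.phase == TouchPhase.Moved && m_SelectedObject != null)
            {
                // Move by the delta since the last frame, then update the
                // last known position for future calculations.
                Vector3 delta = state.interactionPosition - m_LastPosition;
                m_SelectedObject.transform.position += delta;
                m_LastPosition = state.interactionPosition;
            }
        }

        // Reset the selection when there are no active touches.
        if (Touch.activeTouches.Count == 0)
            m_SelectedObject = null;
    }
}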
Info
Channel: Black Whale Studio - XR Tutorials
Views: 962
Keywords: apple, applevision, applevisionpro, applexr, applevr, applear, apple unity, apple vision pro, apple vision, apple xr, unity vision os, apple vision os, apple xr development, unity apple xr development, apple xr glasses, apple xr headset, playtodevice, visionos simulator, develop for visionpro, visionos unity, polyspatial, unity polyspatial, polyspatial apple, unity apple vision os, apple unity development, polyspatial development 2014, apple volumes, SpatialPointerDevice
Id: k9cJfn4JcTA
Length: 11min 22sec (682 seconds)
Published: Mon Apr 08 2024