Unite Berlin 2018 - Getting Started with Handheld AR

Captions
The big focus of this talk is giving you a peek at how to start working with AR Foundation. If you've been following our blog posts and some of the work we've been doing, you know it was just released last Friday, so I'm excited to share it with you and show you how easy it is to get started.

I mentioned handheld AR: the focus of this talk is on the ARCore and ARKit SDKs, developed by Google and Apple respectively, and by that we mean augmented reality that exists on mobile devices such as phones and tablets. So how does it work? Handheld AR uses a technique called visual-inertial odometry: it combines the camera feed, analyzed with computer vision, with the IMU sensors on the device, such as the accelerometer and gyroscope. The IMU lets the device understand its own position and rotation, while computer vision detects feature points, and together they let the device build up an understanding of the world. One of the really unique things about handheld AR is that once you've established a plane and the device has begun to understand the world, it becomes a six-degrees-of-freedom (6DoF) device. You can move it through the world and it knows its own position and rotation, so as you spawn objects you can easily do distance checks against the camera, or record a motion path as you move through the environment.
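Since the tracked camera is just a transform in the scene, a distance check against spawned content is ordinary vector math. Here's a minimal sketch; the component and the `maxInteractionDistance` threshold are hypothetical, not from the talk's project:

```csharp
using UnityEngine;

// Hypothetical example: react when the AR camera (the user's device)
// comes within a few meters of a spawned object.
public class ProximityCheck : MonoBehaviour
{
    public Transform arCamera;               // the AR-driven camera in the scene
    public float maxInteractionDistance = 2f; // illustrative threshold

    void Update()
    {
        float distance = Vector3.Distance(arCamera.position, transform.position);
        bool inRange = distance <= maxInteractionDistance;
        // e.g. highlight the object or enable UI when the user is close
        Debug.Log(inRange ? "In range" : "Too far away");
    }
}
```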
Now let's talk about getting started with ARCore, Google's AR solution. We actually ran a couple of workshops here at Unite Berlin walking through the implementation at a high level. If you're looking to get started, you'll probably want to go to Google's GitHub repository, which I've listed here. The nice thing is that it includes some really basic sample scenes, such as HelloAR, which is pre-configured to find both horizontal and vertical planes and place little prefabs of Andy the Android on top of them. It also uses one of the features unique to ARCore: oriented feature points. There are a couple of other sample scenes included as well. If you're looking to recognize images, things like posters or some other static image, ARCore now supports augmented images, and there's a sample scene for that too. A lot of these scenes are fairly basic, so if you wanted to replace Andy the Android with, say, a 3D model of your own, you could easily swap out the prefab in the HelloAR scene. One thing to note about this implementation of ARCore, which is essentially the Google SDK: we've integrated it directly into the engine, there's a checkbox for it in the XR Settings, and it's compatible with 2017.1 onward.

A couple of unique features. First, Google has built what it calls Instant Preview. The GIF here shows me back in the office modifying a material in the Unity editor while the scene is running on my Android device: you hit the Play button in the editor, it pushes an APK to your Android device and then streams the AR data back into Unity. This is super helpful when you're trying to iterate quickly or do some debugging.

Another unique feature, one I mentioned earlier, is oriented feature points. This means finding a feature point in the world and getting a directional vector back from it, so if you want to place an object on, say, a slanted surface, or attach it somewhere you can't find a full plane, you can use this feature. And last but not least, Google also released Cloud Anchors, a solution for creating multiplayer experiences in the ARCore environment: you place an anchor, sync it up to the cloud, and resolve it on another device, so both devices see the same AR experience.

As I mentioned, the samples come with the little Andy prefab you see here. At a high level there's a HelloAR controller, just a component in the scene, with the Andy the Android prefab assigned on it. It has a couple of nice UI elements as well: the "snackbar" is what they've named the UI window that tells the user to start looking for planes and goes away once one is selected. Also note how ARCore handles its sessions, an idea you'll see again with AR Foundation: sessions are scriptable assets. If you want to take advantage of certain ARCore features, you need to make sure you're using the right session data and have configured your session accordingly. The augmented image database, for example, is also a scriptable asset that you assign in your session, alongside settings such as which types of planes you want to detect.

For augmented images, I'm really fond of this implementation; it's super straightforward. You can multi-select any images in your project and right-click them; Google has added a submenu inside the Create menu, and it packages all of those images into a single scriptable asset. You'll notice that it scores the images as it processes them; that's the Quality column. Each image gets a score based on how easily the computer vision can recognize and track it, and Google suggests using images with a score of 85 or higher. There's also the Width column: if you know the physical size of the image you're looking for, that helps the recognition algorithm narrow things down and recognize it a bit quicker. These databases can contain up to a thousand images, so if you have something like a magazine with many pages, you can add them all to a single database and ship it with your application. There's also some support for streaming in images at runtime; they do need to go through this compilation step to get associated with the database, but that's supported as well.

All right, that was ARCore: a high-level overview of how to get started. If you want to swap out Andy and start spawning your own objects, you can go ahead and do that.
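To make that swap concrete, the placement pattern in the HelloAR sample boils down to a raycast against trackables. Here's a sketch using the Google ARCore SDK for Unity, filtered to oriented feature points; the flags and calls follow my reading of the SDK's samples, so treat the details as assumptions:

```csharp
using GoogleARCore;
using UnityEngine;

// Sketch of HelloAR-style placement, filtered to oriented feature points
// so objects can sit flush against slanted surfaces.
public class PlaceOnFeaturePoint : MonoBehaviour
{
    public GameObject prefab; // swap in your own model instead of Andy

    void Update()
    {
        if (Input.touchCount == 0)
            return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began)
            return;

        // Only accept feature points that have an estimated surface normal.
        TrackableHitFlags filter = TrackableHitFlags.FeaturePointWithSurfaceNormal;

        TrackableHit hit;
        if (Frame.Raycast(touch.position.x, touch.position.y, filter, out hit))
        {
            // The hit pose is oriented along the feature point's normal.
            GameObject go = Instantiate(prefab, hit.Pose.position, hit.Pose.rotation);

            // Anchor the object so ARCore keeps it locked in place
            // as its understanding of the world improves.
            go.transform.parent = hit.Trackable.CreateAnchor(hit.Pose).transform;
        }
    }
}
```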
All right, now I'm going to go over ARKit. Unlike ARCore, whose SDK support was developed through our partnership with Google, our ARKit plugin is developed in-house by us. A lot of the heavy lifting has been done by an engineer back in San Francisco called Jimmy, who puts all of the assets up on the Bitbucket link here. Because he works so closely with Apple, he's able to put out new features and push branches right as they're announced. Google obviously does that too as they release features, but for example, Apple announced a bunch of new ARKit features at WWDC recently, and Jimmy had them ready as they were announced: the beta branch was pushed to our plugin the same day, so you could start taking advantage of those features right away, which is really exciting.

The other thing Jimmy and some of the other engineers at Unity have done is build up a lot more sample scenes. With ARCore you get the base functionality and it's super easy to swap out prefabs; with ARKit there's a ton of sample scenes specific to individual features and to the UI and UX you should consider when developing for AR. The ARKit plugin is structured so that it also encompasses face tracking, which is available on the iPhone X; if you want to get started with that, it's all included in the plugin. There are also some really nice features like shadows and occlusion. These are special shaders we've included in the plugin that you can apply to the planes you find. The occlusion shader lets you hide virtual objects behind real-life ones: if you found a plane here and gave it the occlusion shader, you could walk around and the virtual content would disappear behind it. The shadow shader, which I use in a lot of my projects, is basically a transparent shader that accepts shadows cast onto it; place it at the base of your objects and you get realistic shadows matching the exact shape of the object.
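The underlying trick for both effects is a material that participates in the depth or shadow passes without drawing visible color. Applying such a material to a detected plane is a one-liner; in this sketch, `occlusionMaterial` is a stand-in for whichever of the plugin's shader materials you pick, not a specific asset name from the plugin:

```csharp
using UnityEngine;

// Sketch: swap a detected plane's material for a depth-only "occlusion"
// material, so virtual objects rendered behind the real surface get hidden.
public class ApplyOcclusionMaterial : MonoBehaviour
{
    public Material occlusionMaterial; // stand-in for the plugin's occlusion shader

    void Start()
    {
        // Assumes this component sits on a generated plane with a MeshRenderer.
        GetComponent<MeshRenderer>().material = occlusionMaterial;
    }
}
```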
As I mentioned before, all the ARKit 1.5 features are mainlined on the master branch, and the 2.0 beta features live on their own separate branch, so you can get access to those as well. The way Apple has structured it, the newer features run on any Apple device on iOS 11.3 or later, and we've updated the plugin to fit into the 2017 release life cycle as well.

Now, some of the features unique to ARKit, many of which arrive in ARKit 2, the beta that just came out. Both Google and Apple are working on a lot of different features; some of them, as you'll see with AR Foundation, overlap as core functionality, but there are also things unique to each ecosystem. One of the newest ones I've been playing with a lot is environment probes, and I'll show a slide of that next. There are also world maps and trackable images. Similar to augmented images in ARCore, ARKit can recognize an image, but it can also track the image as it moves around, so if you had something like playing cards, it's able to lock onto them.

The other thing, associated a little with world maps, is that ARKit can scan and recognize 3D objects. If you have a unique object, say my mouse or a particular device, you can scan it; ARKit creates a point cloud from the feature points of that specific object, and then you can add augmentation on top of it, so every time your device recognizes the object you can spawn something there. As an example, I could scan my mouse and display particle effects coming out of it, and anybody else who had this specific mouse could use that application and have it recognized.

World maps are similar to scanning specific objects, except you scan specific locations. It's much the same mechanism: ARKit gathers all of the feature points into a point cloud and uses it to fingerprint a specific location. You could do that on this podium, for example, but as new presenters come up and things get moved around, it will start to lose the exact world map you created and it might be hard to re-recognize; with more static environments it works well. The cool thing about world maps is that they become a file on disk. That's an asset you could AirDrop to a buddy or email to someone, and they could dynamically add it to the application; you can either compile it in with your application or send it over the air as you see fit.
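For a sense of what "a file on disk" looks like in practice, here's a sketch of saving and reloading a world map with the ARKit plugin, modeled on its UnityARWorldMap sample; treat the exact method names, configuration fields, and run options as assumptions about the 2.0-beta plugin API:

```csharp
using System.IO;
using UnityEngine;
using UnityEngine.XR.iOS; // Unity ARKit plugin namespace

// Sketch: serialize the current ARKit world map to disk and reload it later,
// based on the plugin's UnityARWorldMap sample (details may differ).
public class WorldMapIO : MonoBehaviour
{
    string mapPath
    {
        get { return Path.Combine(Application.persistentDataPath, "myWorldMap"); }
    }

    public void Save()
    {
        // Asynchronously grab the current world map from the native session.
        UnityARSessionNativeInterface.GetARSessionNativeInterface()
            .GetCurrentWorldMapAsync(map =>
            {
                if (map != null)
                    map.Save(mapPath); // the world map is now a plain file you can share
            });
    }

    public void Load()
    {
        ARWorldMap map = ARWorldMap.Load(mapPath);
        if (map == null)
            return;

        // Re-run the session with the loaded map so ARKit relocalizes against it.
        var config = new ARKitWorldTrackingSessionConfiguration(
            UnityARAlignment.UnityARAlignmentGravity,
            UnityARPlaneDetection.Horizontal);
        config.worldMap = map;
        UnityARSessionNativeInterface.GetARSessionNativeInterface()
            .RunWithConfigAndOptions(config,
                UnityARSessionRunOption.ARSessionRunOptionResetTracking |
                UnityARSessionRunOption.ARSessionRunOptionRemoveExistingAnchors);
    }
}
```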
All right, so environment probes, which I think are really cool. A key idea in augmented reality is that the more you understand about the environment, the better you can augment objects into it, so a lot of these features are about understanding what's happening in the environment and then applying that understanding to your objects. Environment probes build a cube map inside ARKit based on the world it sees; we take that cube map and apply it dynamically to a reflection probe in Unity, and when you spawn a shiny object, it behaves just like normal reflection probes in Unity: if the object is close, within the reflection probe's boundary, it picks up those reflections.

One thing to note is that there are a couple of types of environment probes. In this example I'm using the manual mode: when I tap on the environment, it creates an environment probe at that specific location. There's also an automatic mode, where as you scan around, ARKit places and combines reflection probes as it sees fit in distinct areas. If an object is going to move around the environment, you probably want the automatic mode, so the object can pick up reflections from the different probes around it as it moves.

One last thing: the cube map is built from the features and planes ARKit finds in the environment. In a couple of demos I was scanning around on the floor, and then I started scanning up at the sky, and one of the engineers at Apple actually reached out to me and said: the idea is good, but scanning the sky doesn't do anything, because it can't find any feature points just looking up at the sky. From my personal experience, this works a little better indoors, where there are a lot more things with feature points; I can even scan a ceiling if it's fairly low and build up the cube map a little more. So that's something to keep in mind when you're using this feature.
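On the Unity side, wiring an externally generated cube map into the lighting system is just a reflection probe in custom mode. A minimal sketch, assuming you already have the cube map ARKit produced (`environmentCubemap` here is a stand-in, not part of any API):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sketch: feed an externally generated cube map (e.g. from ARKit's
// environment probes) into a Unity ReflectionProbe so nearby shiny
// objects pick it up like any baked reflection.
public class ApplyEnvironmentCubemap : MonoBehaviour
{
    public ReflectionProbe probe;       // a probe placed where the environment was sampled
    public Cubemap environmentCubemap;  // stand-in for the ARKit-provided texture

    void Start()
    {
        probe.mode = ReflectionProbeMode.Custom; // we supply the texture ourselves
        probe.customBakedTexture = environmentCubemap;
    }
}
```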
All right, this is what I'm really excited about, and what I'll be talking about for the rest of the presentation. It's super hot off the press: we released it literally last week, and it's part of 2018.1 onward because it's delivered via the Package Manager. AR Foundation, to us, is a way to bring the core functionality of ARKit and ARCore into a common, abstracted API inside Unity. If you've been following our earlier work on AR Interface, that was a proof of concept of whether this was possible, whether we could put an interface on top of these SDKs; AR Foundation is the implementation we're moving forward with as we continue to build on these SDKs.

One thing I want to note: right now, if you want to take advantage of things like environment probes or world maps from ARKit, or oriented feature points from ARCore, those are only available in the individual SDKs I just covered. At the moment AR Foundation covers the core functionality, which I've defined here: plane detection, both vertical and horizontal; light estimation; feature points; hit testing; as well as scaling and anchors. If you're after the latest and greatest, you'll have to use those up-to-date SDKs, but the goal with AR Foundation is to keep adding features and to support what all of these SDKs support out of the box. One more note on hit testing: each AR SDK has its own native implementation of it, because you're taking your device, casting a ray into the world, and potentially testing it against a lot of feature points or planes, which can be fairly expensive, so it's been optimized on the native side.

Now I want to walk you through exactly how to start using AR Foundation, with a little bit of live development, just to show you how easy and lightweight the implementation is. We'll see if this works. First, I'm literally starting a brand-new project in 2018; I'll call it "Berlin AR Foundation test" or something like that. I'm using the built-in render pipeline and the 3D template so it stays super lightweight, and opening the new Unity project. If you've started using 2018.1, you know the Package Manager is now included, and if you've used 2018.2, you know that packages become exposed in the project hierarchy. Since I'm on 2018.1, the packages are a little hidden behind the scenes, but I'll show you how this works.

First I go to Window > Package Manager. My project comes with some built-in packages that are already added, things like the UI system, which is how this window exists, and analytics; and under All I can see the list of all available packages, where you can see there's already an update for one of them. These packages live on your hard drive and get linked into your project depending on which ones you add. There's everything from TextMesh Pro to ProBuilder, but for this I want AR Foundation, and before that I want to grab ARKit and ARCore. I just select each one and hit Install; this links the packages directly into my project. So: ARKit, because that's what I'll be building to since I'm on a Mac, and then the last one, AR Foundation, the high-level abstracted API that's the glue between ARKit and ARCore.

One caveat, because the ARKit and ARCore implementations here are added via packages: I mentioned earlier that the full ARCore SDK uses the "ARCore Supported" checkbox down in the player settings, which you'd see if I were on Android. For the package implementation of ARCore, you have to make sure that checkbox remains unchecked. It's a slightly different implementation, and these things are still a little in flux; we're working on the UX and UI of how this will fit together in the future, but for the packages, right now, don't check the ARCore box that belongs to the full SDK. I'm also going to switch my project over to iOS; I forgot to do that earlier.
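Under the hood, installing through the Package Manager UI amounts to entries in the project's package manifest (`Packages/manifest.json` in recent versions). Something like the following; the package names match the AR packages discussed here, but the version strings are illustrative, not the exact preview versions:

```json
{
  "dependencies": {
    "com.unity.xr.arfoundation": "1.0.0-preview",
    "com.unity.xr.arkit": "1.0.0-preview",
    "com.unity.xr.arcore": "1.0.0-preview"
  }
}
```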
Once I have AR Foundation in place, this glue between the SDKs, I can start creating a couple of built-in objects, what we're just calling XR objects. One thing to note about the way packages work in Unity: there are files associated with them, but they're not directly accessible or mutable, if that makes sense; I'll show you what I mean in a moment. Because I've integrated those packages into my project, when I right-click in the hierarchy there's now an XR submenu. This menu comes from the AR Foundation package and lets you start adding these objects to your scene and building up how your AR experience will work.

The first thing I do is delete the two default objects; the AR camera we create comes with a bit of configuration already done for you. So I delete the existing camera, right-click, and create an AR Session Origin. This creates an AR camera that now exists in the scene, under the session origin's transform hierarchy. A couple of things to note: this camera has been configured to work with AR, and it uses our generic Tracked Pose Driver, an abstracted component that can be used with any XR device. Right now it's configured for a generic XR device and the color camera; we could also target something like an XR controller, and then pick right or left. This is how it becomes super easy to target multiple platforms: you're working at a generic, abstracted level. I'll leave it as is; just note that the camera carries this configuration, and the Tracked Pose Driver is what actually moves the camera through space, driven by ARKit or ARCore.

Now that I've created the session origin, I also want to create an actual AR Session. The session, somewhat like the scriptable objects you saw before with ARCore, has a couple of checkboxes. I should note that although AR Foundation has been released, it's still considered a preview package, so expect some things to change; the APIs I'm about to use seem fairly fleshed out and decided on, but some of the UX, like these checkboxes, could deviate a little in the future. Here we have options like "Attempt Update", which will try to update the device by downloading the latest ARCore runtime, available through the Google Play Store, and we also have the ability to do light estimation. For now, these two components are basically all we need.

The next thing I want is my plane visualization. For that I create an AR Default Plane, and you'll see that by default it comes with components we've created for you: a line renderer, a mesh collider, and a couple of scripts. One thing to note, and this is what I was referring to earlier: the mesh renderer has a material applied to it called DebugPlane. So where does this material live? If I double-click it, I can get a reference to it and see what's going on, but I can't modify it in any way: the DebugPlane material exists inside the package, and the way packages work, the ability to modify their contents is restricted. You can inspect it, but you can't adjust the settings. So if I want the plane rendered a little differently, maybe I don't like this yellow-orange color, I just create a new material as normal. I'll call it PlaneMat, change the color, leave it a whitish gray, and turn off a few of these settings. For the plane, I override the material with the new one I created, and then drag the plane into my project so it becomes a prefab. Finally, for planes to spawn and be visualized, I go back to my session origin and add the AR Plane Manager component. You can see it's looking for a reference to the plane prefab, so I just drag that in.
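If you want to react in code as the plane manager spawns these prefabs, the manager exposes plane lifecycle events. A sketch, assuming the preview-era event and argument names (these have changed across AR Foundation versions):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: log planes as the ARPlaneManager detects them. The event names
// here (planeAdded / ARPlaneAddedEventArgs) follow the preview-era API
// and may differ in later AR Foundation versions.
[RequireComponent(typeof(ARPlaneManager))]
public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager m_PlaneManager;

    void OnEnable()
    {
        m_PlaneManager = GetComponent<ARPlaneManager>();
        m_PlaneManager.planeAdded += OnPlaneAdded;
    }

    void OnDisable()
    {
        m_PlaneManager.planeAdded -= OnPlaneAdded;
    }

    void OnPlaneAdded(ARPlaneAddedEventArgs eventArgs)
    {
        Debug.Log("Plane detected: " + eventArgs.plane.name);
    }
}
```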
If I built this out now, we'd have the session set up, which initializes ARKit, and the plane rendering configured so we can visualize planes. By default, because both ARCore and ARKit support it, AR Foundation looks for horizontal and vertical planes, so that's what we get off the bat. If I switched over to Android and built this out against ARCore, we'd get the exact same functionality, visualized exactly the same way, just running on an Android device powered by ARCore instead of an iOS device powered by ARKit.

Now, visualizing planes isn't that exciting, and the next thing I'll do isn't the most exciting thing either, but it shows a little of the API for placing an object on a plane. I'll call the script PlaceOnPlane. One thing I want to mention: the core functionality, and all the submenus and objects I've used so far, come with AR Foundation, but we're also creating an additional repository on top of it with all the steps I'm doing here pre-configured. That's a public GitHub repo under Unity-Technologies, and it will carry this sample scene and more, because right now we're not able to include samples with the packages themselves; the packages hold just the core functionality.

Let's open up the script, and I'll make the text a little bigger for you. What I want to do is start using AR Foundation to do some simple raycasting, and the nice thing is that it will work with both ARCore and ARKit. I add `using UnityEngine.XR.ARFoundation;`, which gives me access to the APIs we have available. First I want a reference to the session origin, so I can raycast against it: I declare an ARSessionOrigin field and fetch it with GetComponent in my Start function, since I'll attach this script to the session origin object. When raycasting, I also want to store the collection of hits I get back and use the first one, so I create a List of ARRaycastHit called hits. The last thing is a public reference to whatever I'm going to spawn in the environment: a public GameObject called prefabToPlace, something like that. In Update, I handle some simple Unity touch input: check that the touch count is greater than zero, store the first touch, and make sure its phase is Began.
Then comes the native raycasting. I call Raycast on the session origin, and we've built these APIs to be fairly similar, almost identical, to a normal Unity raycast, the way you'd do it outside of AR as well. I pass in the touch position, pass the hits list to receive the results, and finally specify the filter for what this raycast should test against; you can see my IDE auto-completing the options. For this one I use planes within polygon; if you wanted to cast against just feature points or the like, that's available here too. Then, if the raycast hit something, I store the pose that comes back from the first hit, hits[0].pose, and instantiate my object at exactly that pose: a normal Instantiate of prefabToPlace at pose.position and pose.rotation. So again, these are new APIs, but it's fairly straightforward, pretty simple stuff: take the session origin, do the AR-specific raycast. I save the script and add the component to the object with the session origin.
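Put together, the script looks roughly like this. It's a sketch against the preview-era AR Foundation API exactly as described above; the extra `UnityEngine.Experimental.XR` using for TrackableType is my assumption about where that enum lived in the preview packages, and signatures could shift while the package is in preview:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Experimental.XR; // assumed home of TrackableType in the preview
using UnityEngine.XR.ARFoundation;

// Places a prefab wherever a touch raycast hits a detected plane.
public class PlaceOnPlane : MonoBehaviour
{
    public GameObject prefabToPlace; // assigned in the inspector

    ARSessionOrigin m_SessionOrigin;
    List<ARRaycastHit> m_Hits = new List<ARRaycastHit>();

    void Start()
    {
        // This script lives on the same GameObject as the ARSessionOrigin.
        m_SessionOrigin = GetComponent<ARSessionOrigin>();
    }

    void Update()
    {
        if (Input.touchCount == 0)
            return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began)
            return;

        // AR-specific raycast, optimized natively by ARKit/ARCore,
        // filtered to hits inside a detected plane's polygon.
        if (m_SessionOrigin.Raycast(touch.position, m_Hits, TrackableType.PlaneWithinPolygon))
        {
            Pose pose = m_Hits[0].pose; // closest hit first
            Instantiate(prefabToPlace, pose.position, pose.rotation);
        }
    }
}
```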
For the object to place, I'll grab a little package I have here called ChairPackage. One thing to note with the chair, and whenever you're spawning objects on planes: you usually want the parent transform at the base of the object, and I'll show you how I've set that up with the chair. Most of the time, if you want to place an object on the actual floor, you'll be finding horizontal surfaces, and when you spawn the object you want its root transform at the base, with the model offset relative to it; we'll see that in a moment. The goal is then to build this out to my iPad so I can show you how it's visualized; I've also pre-built it on my Android phone here, you'll just have to believe me. And note, because I'm going to intentionally make a couple of errors, that we have a lot of checks set up to catch you along the way and make sure your settings are configured correctly.

For my chair, if I turn my gizmos down, you'll notice the root is at the very bottom, and again, that's because when I spawn it on a plane I want it sitting flat in the environment. By default this chair's transform is near the center of the model; if I took the asset as-is, it would end up halfway underground, which isn't going to look correct. So we just put an empty transform at the root, and that's the object we'll spawn. I drag it over to make the prefab, save the scene, and hit Build.

If I hit Build now, it complains right away that I haven't properly set up my camera usage settings. So I go back in and fill in the camera usage description, I'll just say "use AR", and the bundle identifier, something like "unite.berlin.arfoundation". Now I hit Build and Run, and from here it's just normal iOS development: it packages everything up for Xcode, builds it, and pushes it out to my device. The cool thing, again, is that to me this is super lightweight and accessible for the core functionality: you can easily build for iOS and then switch over to Android, and because of the way we've set up the packages and their dependencies, you can maintain the same codebase across both platforms. If you're developing on your Mac and also have a desktop PC, you can keep the same codebase and build for different platforms from each.

While this builds, let me switch to my Android device, which already has a build on it; let me replug this... there we go. So I look around here, you can see the plane being visualized, and when I touch it, now we have a chair. Again, the same functionality from basically the same codebase. I ran through this demo previously, and we can spawn chairs all around; and since I said it finds both vertical and horizontal planes, I can start putting chairs on the walls as well, getting a little M.C. Escher with it. The reason each chair locks and orients to its plane is that I'm using the pose I cached from the hits: the pose carries the position and rotation of how the ray hit the plane.

Now I switch back over here, fix my provisioning profile, and build it out; this takes a moment. Just to show you this is a real build, I have it configured for my iPad, and you'll see it launch automatically as the build finishes. And again, the PlaceOnPlane script you saw is shipped in that external repo; Unity, and myself personally, will be contributing to it a lot, building up sample applications and adding functionality so you can get started right away, check out the APIs, and see how to use them. The build is almost done... and if I switch back, you'll see it deployed right there, and it should launch in a moment. There we go. Now let's see if we can reproduce exactly what we just saw on the Android phone, this time using ARKit. I need to accept the camera permissions... there we go, and here are our chairs: the same sort of thing, the same codebase, using AR Foundation, easily built up in a very
lightweight manner using the packages. That's pretty much it for my presentation. Thanks a lot for coming; I think we have a little time for questions, and I'll be around afterwards. Thanks! [Applause]
Info
Channel: Unity
Views: 10,795
Keywords: Unity3d, Unity, Unity Technologies, Games, Game Development, Game Dev, Game Engine
Id: MqA0XhfKIE0
Length: 38min 24sec (2304 seconds)
Published: Mon Jul 30 2018