Introducing the FaceAR Sample Project | Live Training | Unreal Engine Livestream

Captions
Hi, and welcome to our Unreal Engine livestream. I'm your host, Amanda Bott. With me I have Zak Parrish, our Senior Devrel Tech Artist, which is the short version of the title. ("That's the short version.") So thank you for joining us. And Tim Slager, our other community manager. ("Much shorter title." "Yeah, very much so.")

Today we're going to be taking a look at the Face AR Sample that was released with Unreal Engine 4.20. A few people have already seen it; I've seen comments popping up in the forums and on the Marketplace saying, "hey, this is cool, what is it, how does it work?" We were saving it so we could put together a nice presentation and some documentation and show everybody all at once what this is, and we're going to cover all of that today.

Real fast, what I'm going to do is open up my iPhone X (we'll talk a lot about why that's important, because it's not any other iPhone and won't be), and I'll open up the app itself. Here's what we see: you might recognize this guy, it's the Kite Boy from the kite demo we released about, what, four years ago. You're going to see me do this a lot today; I will be making a lot of silly faces, because it's kind of mesmerizing. It's pretty obvious what this demo actually is, right? It's facial capture. I can turn my head, and it can actually lose face tracking if we mess that up. We'll talk about all the features, how it works, and how you can use this tech in your own projects.

Let me start off, and this is totally unlike me, but I'm going to do it anyway: I'm going to start off with a slide deck. I know, I know. There's just so much to cover here that if I don't do this I will lose track of everything. What we'll talk about today: first off, what is the Face AR Sample? We'll talk about what's included, we'll talk about all the requirements (which are substantial, before you start asking), and we'll talk about how you can actually use what we've provided. Next we'll go into how it works: we'll talk a little bit about ARKit and its face capture, we'll talk about how you can set your own characters up to use this (because you have a lot of options), then we'll look at the Blueprints that are in the project, and we'll play around a little with other ways you can use this data.

So for starters, what is the Face AR Sample, what have we given you here? This is an example of facial capture using Unreal Engine 4, an iPhone X (or iPhone 10; I keep calling it one or the other, so you'll hear me flip-flop throughout the discussion), and ARKit. This particular sample was developed with the help of 3Lateral. Those folks are awesome; they are absolute masters of real-time facial animation, and we'll talk a little more about them in just a second. The sample demonstrates facial capture on a mobile device, and it can broadcast that capture data directly into UE4. That lets you do rapid prototyping for facial animation. It's going to be at least rough or prototype quality, but depending on your scope, how much facial animation you need, and what kind of project you're doing, you might actually get shippable results out of this, so it's super handy.

Now, a quick thing about 3Lateral if you've never heard of them: they actually worked with us to make the awesome Andy Serkis and Osiris demo we showed off at GDC. If you didn't get a chance to see that, where have you been? Just do a quick Google search for "Andy Serkis Osiris" and you'll see it; it's super awesome.
These are the same folks that helped us put together the rig for the Kite Boy that's being used in this project.

So, what's included with this sample? First off, and I'm going to mention this a few times, this is for Unreal Engine 4.20 only. Don't install it on 4.19 or anything previous, because we actually had to make some serious code changes to make all of this possible. The sample includes two different maps: there's a simplified map, and there's a regular map which still has a "2" on it. We'll talk a little bit about the cutting-room-floor stuff that's scattered throughout this project; sorry about that, I'll clean it up at some point. The pretty map, FaceTrackingMap2, which you'll see in this livestream, is one we generally don't want you to deploy to a device, though technically there's nothing stopping you. It's intended to run on PC so that you can send the live-streamed, Live Link data directly into Unreal Engine 4.

Also, if you get questions along the way: I tend to get on these rolls and talk really fast, and there'll be no way for you to get one in, so just smack me or something, it's fine. I'll practice. Wait, where's the stream delay at? OK, OK, I'll get on the ball.

Also worth noting that full documentation for the sample has already launched; it actually went up yesterday. I'm slowing down and keeping this on the screen for a second so that those of you following along can jot it down, because you can't click on the link and you'll need a second. If you jump onto our docs page and search for "Face AR Sample" you should find it somewhere in the list. Make sure you scan through it; there's a lot of really good stuff there, and I'm going to be making a lot of references to it throughout this livestream so we're not covering the same material twice. And I do want to give a big thanks to our docs team; they have been crushing it lately, turning out documentation on new features left and right. It's been amazing.

OK, so, requirements for this project, and they are significant. First off, obviously, Unreal Engine 4.20 or later; we talked about that. Next, an iPhone X. It will not work on an iPhone 8, 7, 6, 5, 4, whatever; it's only the iPhone X, and the reason is that only the iPhone X has the depth-camera hardware in it at this time. I have no idea what the future holds, but at this time, iPhone X only. You'll also need a Mac. If you've ever done any deployment to an iOS device you already know this, but you have to do all of your compiling, at least your shader compiling, on a Mac, so if you plan to deploy, you'll need one. Then you'll also need an Apple Developer account so that you can deploy to an iOS device.

Now, I know what one or two of you are thinking: "but that's a huge barrier to entry; it's a really expensive device and a really expensive computer, and then I've got to pay a hundred bucks a year just to use this." Yes, totally, that's exactly what you have to do. However, if you plan to use this for anything serious, like if you wanted to use it to start creating facial animation for a character, take a look at what it costs in terms of your time to do that facial animation, or what it costs to hire a facial animator to set that stuff up for you, and then come back to me.
Just some other notes; I think I've already covered most of these. (I actually had fun with footnotes; I figured out what a dagger symbol is and how to get it into my footnote.) Anyway: obviously iPhone X only, just to make sure everybody hears that, and you'll need all the other stuff, a Mac and a developer account, in order to deploy to an iPhone. And before anybody asks, because at least one person was already thinking it: this only deploys to iOS, because the facial-tracking tech we're using has thus far only been implemented in ARKit, which is an Apple product. So no, this is not going to be an Android thing. Cool, so we're all clear on that.

Next, how do you use this? Currently you need to download it, get 4.20 going, and deploy it yourself. At this time there is no app in the App Store for you to just download onto your phone. It's possible that might change in the future, but for this first run that was a whole lot of overhead (the certification process and whatnot) that we didn't want to go through. This is something we've specifically put together for developers, so we actually expect you to be ready to develop and do your own deployments. Next, if you've never done it before, make sure you check out our documentation on deploying to an iOS device from either Mac or Windows, or I think we might even have Linux, but don't quote me on that because I haven't used Linux in a very long time. And then, just a quick reminder: yes, you need a Mac.

Next, as you use this, you do need a very well-lit room. In fact, I can barely see anything right now because I have these huge, bright LED panels shining in my face, but it's helping the tracking system. You will probably have to tune your lighting a little in your own development environment and find the right combination, but the point is that the facial-tracking system needs quality lighting so it can track all the landmarks on your face. Next, for best results (you don't have to do this, but for absolute best results), have a dedicated wireless network with your computer and your phone on it and probably nothing else. Don't be updating Fortnite in the background on some other device or anything. And make sure the phone is on the same network as the computer; that should make a lot of sense.

OK, so now let's talk about the app. I'm going to open it up, and here we are. The first time you open it, this is pretty much what you're going to see, except the character might not be following you as well as he's following me, because I've already calibrated this. So let's talk a little bit about calibration. I'm going to turn so it looks good on the camera. First off, let's take a look at the settings: there's a little tiny settings button down in the lower-left corner, and when you hit it you get a cool list of stuff. First is calibration mode. We'll hit that, and it has instructions that tell you how to use it: basically, just make this kid's relaxed face and hit the calibrate button, and you're calibrated. It's as easy as I think I can make it. There is a note there that's worth pointing out: things like hats, glasses, poor lighting, and actually large beards can muck with this a little. That's just the way it goes; it's looking for landmarks, and if you hide those landmarks in some way, you could have problems.
I actually switched to contacts just for this demo, because I have really thick-rimmed glasses and ARKit has a tendency to read those as my eyebrows, so I'd get spurious eyebrow readings all the time. Know that that's a thing; if you have a facial performer, either get them contacts or you'll figure something out.

Also, you'll notice that when I look away, the app loses track of my face and actually tells you, "hey, we're still searching for a face to track." Give it a second and it comes right back. OK, so we're now calibrated: if I close my mouth, his mouth is closed too. Actually, I'm going to do something a little crazy, because we have a couple of extra seconds: I'm going to shut the app down and relaunch it. It takes a second to relaunch (there's our funny splash screen). Now watch: when I relax my face, you see he's got that little gap where you can see his teeth. It doesn't look very good, and that's why we calibrate. So yes, you have to calibrate each time you launch it. We played a little bit with saving calibration data, but it ended up causing more hassle than anything else, which is why I dropped it.

I'm going to skip the Live Link connection button; it's cool to show at the end because it's very dramatic. Next is the debug mesh. I'll turn that on and you get Apple's debug mesh. This is the mesh being used under the hood to track what your face is doing. The material on it is actually something we made; you have to make your own. Sorry, I get kind of mesmerized: I move, it moves. It's also worth pointing out, and it's pretty obvious on the screen, that this debug view is mirrored: if I look left, it looks right. But when you're actually puppeteering and driving the kid, he's not mirrored; he's doing exactly what you do, and that's on purpose, because our goal was to give you something that does exactly what you do in case you're puppeting a character on a screen. Also, you'll notice that if I talk really, really fast, he maybe has a tricky time keeping up. Actually, that was pretty good, if I'm honest, but you can get ahead of it. You'll do better if you speak clearly and don't talk at nine hundred miles an hour, and there are some tweaks in the animation Blueprint that will make that a lot better.

("So when you're calibrating, is it grabbing the light in the room as well?") No, just your facial structure. What it's doing specifically is, actually, hang on, I'll get to that. The next checkbox is Show Debug Values. Check this out: when we turn this on and close the window, it gives you real-time feedback for all of the curves Live Link is picking up as it tracks the face. It's going to be really hard to read on the stream, but on the device you can actually see it; there's one for jaw open and so forth, so you can see in real time what's going on. The calibration system, the moment you hit the calibrate button, reads where all of these curves are when your face is relaxed, and then we simply remap each curve with that value as its new zero, which gets you back to a neutral face. OK, let's turn that back off. That's a lot of data, but it can be really useful if something doesn't appear to be responding the way you want.

Next is Stat Units. If you've ever used Unreal you've probably seen this before; it's just the engine statistics, so you can see how fast your GPU is clocking, what your draw calls are, and so forth.
I had it there because when I was first getting this set up I needed to know what my performance levels were, and then I just left it, so there you go. Next there's a Help and About screen. I'm not going to read it to you, but once you get the app deployed you should probably check it out; it's kind of a miniature version of the documentation to help you understand what this is supposed to do. Also, if you hand the phone off to a friend after it's been deployed, this will help them get an idea of how it works.

Now, the last thing I want to show off is the Live Link connection. Real quick, I'm going to tab back over to Unreal and open up a different map. Actually, let me set the phone down and let it cool; I'll talk about why that's important in just a second. Let's open FaceTrackingMap2. Here we go: this one has some HDR lighting and some depth of field, it's a little fancier, and the kid looks like he's falling asleep, which is pretty great. What I'm going to do is hit Simulate. He's stuck in an even different pose; when you see that, all it means is there's some leftover data from when I was testing this earlier.

So now let's switch back over to the phone and go to the Live Link connection. If we can bring the phone up on the screen for just a second so everybody can see this: we have our IP address field. Hang on, I don't remember the IP for this computer, so I'm going to look it up real quick; I've got it on a different phone. Here it is: 10.1.1.1. I'm so bad at typing with my left thumb. Next, there's a checkbox to save your IP address; for a wide variety of reasons we didn't save the state of that checkbox itself, so every time you relaunch the app the checkbox is going to clear. That means pay attention, because it will try to flush your IP out of the phone. So now we hit Connect, and if we take a look back over in Unreal, let me position myself so that he's looking back at us, yeah, that's pretty cool.

Now here's the thing. Right now the phone is passing all of that animation data back into Unreal by way of Live Link, which is cool. The catch is that it's sending the original, uncalibrated data, so the character on the PC is not calibrated even though the character on the phone is. What you can do is select the kid, and over in the Details panel there's a "Calibrate In Editor" button. I'm going to square myself up just like for calibration and click the button. There we go; now he's calibrated there as well. I could just host the rest of the show from here. ("This is how we're going to do future streams, just like this.") Totally.

All right, let's go ahead and stop that, and I'm going to put the phone down. A couple of things I want to point out if you're doing this on your own (if we can get the phone back up; I know I'm giving our video team fits right now): as you use this app, your phone is going to heat up. That's important to know, because eventually the phone gets so hot that it starts to throttle itself. We're using the lowest rendering quality we can get away with and still leave you something nice on the screen, so it lasts a pretty good while, but you do want to keep a finger on your thermals. Basically, keep your hand on the back of the phone, feel how warm it's getting, and monitor that, because performance will drop and you'll notice frame rates on the phone starting to decrease.
You can mitigate that a little bit by switching over to the debug mesh (and not putting your hand in front of your face); the debug mesh is much lighter to render, so you'll get a lot more performance for a lot longer.

("They were asking about the performance of the capture, like what an average FPS is.") So the phone is running at, well, we went back and forth between 30 and 60, and I actually don't remember where we left it. I think we left it at 30, but I'd have to check the project settings; I can do that. ("And forgive me if I'm wording this incorrectly: if you're in this debug-mesh mode, can you still send Live Link data to Unreal?") Yeah, totally. Actually, we do need to re-establish our connection, so let's go back over to Unreal, I'll click Simulate, and then click Connect, and wow, that looks so cool on the final output. So all I see on the phone is the really simple Apple face mesh, but my character in the editor is moving.

OK, so we talked a little bit about thermals; that's going to be important. I'm going to close the app down and let the phone chill for a second, and let's jump back over to our PowerPoint presentation. We have calibration, we talked about the settings flags, we talked about Live Link. It's great, but how does all of this work? Because that's what you really want to know, right?

First off, let's talk a little bit about the ARKit face capture. I'm not going to go into a whole lot of depth on this because I'm really not the guy; I don't code this and I didn't write our implementation of it, but I can tell you, from the artistic and usability standpoint, the parts you need to care about. The ARKit SDK tracks 51 (at least at the time of this development) unique blend shapes, or morph targets depending on your DCC app religion, on your face at all times, and it just pipes out the value of each one of those blend shapes on a zero-to-one basis (there's a tiny sketch of what that data looks like at the end of this bit). You can read up on what all of those shapes are at Apple's developer site; the link is on your screen right now. It gives you a list of all the individual blend shapes as well as an image of what each shape expects your face to do. "Eye blink left," for example, is just the left eye being closed; there's also a right-eye blink, smile left, smile right, and so forth. There are 51 of those being tracked simultaneously, and if you're designing your own characters you'll need to have read up on them, so make sure you do.

All we're doing is taking that zero-to-one data and passing it into Unreal using Live Link, and from there you can do pretty much whatever you want. In our case we're using it to drive the animation of a character, but as you'll see by the end of this livestream, there are a few other things you could do.

So let's talk a little bit about creating your own character. All you really need to do, if you're making your own character for this, is implement some or all of those 51 shapes. If there are some you don't need, you don't have to use them. I can't think of a great example off the top of my head, but maybe there's some extreme pose you're never going to hit, or maybe you've just got a really simple game where folks smile, frown, and their jaw wags, and that's all; you have a pirate, right? It really depends.
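Since people always ask what the data actually looks like: here's a tiny sketch, not code from the sample, just a mental model written as UE4-style C++. Every frame, ARKit reports roughly 51 named blend-shape values, each a float from 0 to 1, and those are what get broadcast into the engine as Live Link curves. The handful of curve names shown are from memory of Apple's list, so double-check them against the developer page mentioned above.

    #include "CoreMinimal.h"

    // Illustrative only: roughly 51 named values arrive per frame, each 0..1.
    TMap<FName, float> MakeExampleFaceCurves()
    {
        TMap<FName, float> FaceCurves;
        FaceCurves.Add(TEXT("eyeBlinkLeft"),    0.02f); // left eye nearly open
        FaceCurves.Add(TEXT("eyeBlinkRight"),   0.03f);
        FaceCurves.Add(TEXT("jawOpen"),         0.15f); // mouth slightly open
        FaceCurves.Add(TEXT("mouthSmileLeft"),  0.40f);
        FaceCurves.Add(TEXT("mouthSmileRight"), 0.40f);
        // ...and so on for the rest of the shapes Apple tracks.
        return FaceCurves;
    }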
If you don't need all the functionality, you don't have to implement every single shape. On the other side of that, you can actually implement even more. You might want to consider some corrective shapes: for example, if you're looking down and to the left and you blink, your face actually takes on a different shape. Or, the example we have in the documentation for correctives is opening the mouth, which conveniently covers two topics at once. Let me jump back over to PowerPoint for just a sec. (Also, a quick side note as I look down my list: that note was there to keep me from doing exactly what I just did.) This Kite Boy, as a reminder, was all rigged up by the folks at 3Lateral. They did an amazing job, which is why I keep showering thanks upon them; this project probably wouldn't have gotten off the ground without their expertise and amazing help.

Next, you have special cases, which I guess really aren't that special in game development, like jaw open. Usually the jaw opening on a game character is not going to be done with a blend shape; you're going to rotate a joint somewhere in the jaw. So how do you handle that? There's an animation Blueprint node called Pose Driver that handles it, so let me jump back over to the docs. There's a setup explained there for building a Pose Asset for facial animation, and it shows how you could set up something like jaw open so that when Apple reports the jaw-open shape, instead of just driving a blend shape (and only a blend shape), it rotates the joint for you as well. It's driving bone rotation and blend shapes at the same time, and at that point the blend shape becomes a corrective shape that gives the face its proper form.

For example, on the left-hand side you see the Kite Boy with his jaw just rotated open but no corrective shape, and it looks like his head is being stretched. It looks really awkward, like he has an air compressor hose blowing into his mouth and his cheeks are a little puffed out. (I've done that before, so I don't recommend you do it at home.) On the right-hand side is just the corrective blend shape by itself: you can see it's sucking the cheeks in and bunching up the flesh underneath the chin, which is what your chin does when you open your mouth. (It's hard to talk and do that at the same time.) Then you run them both at once, rotate the jaw down and engage that corrective shape, and you get something a lot more natural, which is what we're doing here. So keep corrective shapes in mind. Again, depending on your scope and the level of quality you need, you may or may not need them, but know that they're supported and fairly straightforward to set up. We won't say easy; facial animation is not easy. Also, another quick thanks to the docs team, because I'm able to flip back and forth between the stream and those great docs for all of this.

OK, let's talk a little bit about blend shape names. If you've ever done facial animation, those blend shapes (or morph targets, if you're a Max person) each have to be named. The cool thing is, if you name them exactly what Apple says you should name them, like jawOpen, eyeBlinkLeft, and so forth, if you name them precisely, everything will just "work." (You know, finger quotes. That's my insurance policy.)
If Live Link reads the exact same names that ARKit is supposed to report, the connection will be made automatically and you'll start seeing animation. However, if your blend shapes are named something different, there is support for that: there's a Live Link remap asset, and you'll use a tiny bit of Blueprint logic to remap the names to something else. This is also explained in the documentation; there's a section on remapping curve names in a Live Link asset. I think in that example all we were really doing was removing a prefix. Maybe you named your blend shapes the right thing, but Maya or Max or pick-your-poison has done you the fantastic favor of prepending your character's name and an underscore to all of your blend shapes (I've dealt with that before); this is how you would fix it. It's all covered in the documentation to try to make your life a tiny bit easier. Again, thanks, docs team.

All right, so now we get to talk a little bit about setting up your project, because it doesn't all just work; it does require a few little button presses on your end. First off, in your DefaultEngine.ini (which I am not going to go through the rigmarole of opening on this stream, because I don't know where it lives on this PC), you need to add the settings you see here, which, again, are in the docs. You don't even have to type them: go over to that documentation page and copy-paste, keep your life easy, and don't type any keys you don't have to. You add this little line for bEnableLiveLinkForFaceTracking, which is a fairly straightforward, easy-to-understand setting (there's a rough sketch of it just below). Get that into your DefaultEngine.ini and restart.

Then you're going to need to create a config data asset specifically for AR. You'll go to your Content Browser, right-click, go under Miscellaneous, choose Data Asset, and then choose ARSessionConfig. These are the settings for it; they're up on your screen and they're also in the documentation, so don't worry about writing them all down right now, just do the same thing I did and alt-tab over to the docs like a good little student. (Sorry, that probably came out wrong.) You'll see there's World Alignment, which we set to Camera. If you don't set this, by default it tracks to the world, which means your AR app always knows the direction it was facing when you started it. The upshot is that if you don't change this to Camera and you start looking at your face and then turn 360 degrees, the kid's head will rotate like The Exorcist, and it's really scary. Or amazing; it could be what you're going for, especially if you had a really nice particle simulation coming out of the neck. Next we have Session Type: obviously this needs to be set to Face to engage the facial-tracking system. Everything else is really just stuff we didn't need and therefore turned off. We didn't need horizontal plane detection or any other type of plane detection, and we don't care about light estimation mode because we're not trying to blend anything into the real world; we're just tracking a face. So we turn all that other stuff off. You do want auto focus enabled, though.
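For reference, the DefaultEngine.ini addition looks roughly like this. The section header here is from memory of the docs page, so copy the exact lines from the documentation rather than trusting this sketch:

    [/Script/AppleARKit.AppleARKitSettings]
    bEnableLiveLinkForFaceTracking=true

And to recap the ARSessionConfig data asset settings described above: World Alignment set to Camera, Session Type set to Face, the plane-detection and light-estimation options turned off, and auto focus enabled.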
Next we have the animation Blueprint. This next part of the discussion goes into all of the different Blueprints found in the sample and explains what they do, so there's going to be a lot of alt-tabbing back and forth between the slides and our project, sorry.

First off, real quick: in the Blueprints folder (where it probably doesn't really belong, but there it is anyway) there's the FaceARSessionConfig. This is the config asset I was just talking about, where you right-click, go under Miscellaneous (I can't believe I can read that from here), then Data Asset, and when it asks you to choose a class you search for ARSessionConfig and you'll see it right there. I'm not going to make a new one, because we've already got one, and if I open it up you can see all the settings I screen-grabbed to make that slide.

So let's take a look at the relevant Blueprints, in no particular order. A quick word of warning: you will see some extra nodes and little tidbits, like snipped wires from experiments I tried and meant to clean up and maybe didn't get around to. Sorry about that in advance; at some point I'll probably go back in, clean it up, and we'll re-release, and you'll never notice. ("We also have to experiment and try to figure things out and ship things at the very last minute, so yeah, it does happen.") You'll see this throughout; I mean, source code occasionally has experiments that didn't quite work out. We try to keep them as cleaned up as we can; sometimes we miss a beat.

Starting from the top, let's begin with the animation Blueprint, because it's probably the most important one. I'll open that up and we'll do a little back and forth. Here we are, loading, loading, loading. Oh, we're actually playing right now; let's stop doing that, that's probably less than ideal. I'm also going to dock this up here to make things a little easier.

OK, first off, let's start with the Event Graph. The Event Graph has a lot of functionality we don't need; sorry about this, but basically everything after this Delay node was part of an experiment to see if we could get head turning working, and I forgot to take it out after we went a totally different direction. If you look at this yourself, everything after that one-second delay you could just delete and everything will still work, but I'm going to leave it there for now so you can see that it's there and we're all looking at the same thing. The only really important thing the Event Graph is doing is grabbing a reference to the pawn so that we can bind to what happens when we fire off the calibration function, because the animation Blueprint is going to need access to all of those calibration offsets we talked about earlier. (In case you missed that part, or you weren't paying attention, or your Doritos were too loud: when we hit calibrate, we store the existing values coming in from Live Link and use those to remap the curves. So if your jaw-open curve sat at a default of 0.2 when your jaw was at rest, that 0.2 becomes the new zero and you get a whole new range between zero and one. That's how calibration works.) Because we need access to that data, there's a function that has to be called on this side, so we connect to it by way of an event dispatcher.

OK, the next bit is the AnimGraph. The good news is the AnimGraph, which is the most important bit and does all the heavy lifting, is actually pretty well commented; I did remember to do a cleanup pass on this one. Oh, also, while we're here, you'll notice the kid looks like he's sneezing right now. Hang on, I've got to tell it to do a Live Link connect one more time.
There we go. The Live Link connection also works in Persona, displaced head and all. It's pretty simple, but it lets you do really fast tests to make sure everything is working as it should. That odd pose, by the way, is what happens when the app reorients; it gets a little crazy. So we'll just connect while I'm making a really, really funny face and go from there, not while his neck is broken, because that's fundamentally disturbing.

OK, so the first thing we're doing is getting information from the Live Link Pose node. The default subject name you should feed it is iPhoneXFaceAR; that is actually hard-coded. If you're doing this and reading data from an iPhone X, use that subject name, don't make up your own. If you had to remap any of the names of your blend shapes, this is where you plug in the retarget asset you created; again, that whole process is covered in the documentation. In this case we didn't have to do it, because I named everything exactly what the Apple blend shapes are named.

Next, after we get that data from Live Link, we denoise it, and this is done with a Modify Curve node. As of 4.20 the Modify Curve node has some additional functionality: some different apply modes were added, and one of them is Weighted Moving Average. This grabs the average of the values over the last n frames (I think it's ten frames, but don't quote me on that), and then we blend between the value where we are right now and that average. That's your weighted moving average ("moving" because we're marching down the timeline).

Next we apply our calibration. This is fairly straightforward: we grab those calibration offsets, which is really just a struct of floats, with each float named for one of the curves Apple is tracking, so you can march down them all. (It was really fun wiring this up in Blueprints, I just want to say, because you have to do it one at a time.) You'll notice there are a few extras in here that aren't really being calibrated, for head roll, pitch, and yaw. We take that struct and feed it into another Modify Curve node, the exact same node type, just with a different setting; this time it's set to Remap Curve, and the description basically says it's going to remap whatever incoming value it gets back into the zero-to-one range. That's essentially the heart of calibration. By default, before you've calibrated, the calibration offsets are all zero, so you just run a standard zero-to-one range with nothing changed. The moment you feed in a value, like a jaw-open offset of 0.2, suddenly 0.2 becomes the new zero, and as you open your mouth it blends back out to one. The idea is that you subtract out that offset and then rescale the entire result so you get your full range of motion back. I'd actually written all of that in Blueprint at first, but it was a tad slow, so we moved it back over to C++.
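To make those two Modify Curve steps concrete, here's a minimal sketch of the math being described, written as plain UE4-style C++ rather than the actual anim Blueprint nodes. It assumes the weighted moving average is a simple blend toward a running average and that calibration is a straight linear remap that treats the captured neutral value as the new zero; the real behavior lives in the Modify Curve node's source.

    #include "CoreMinimal.h"

    // Denoise: blend the raw Live Link value toward the average of the last few
    // frames. A weight around 0.8, as in the sample, biases toward the smoothed result.
    float DenoiseCurve(float Raw, float RunningAverage, float Weight = 0.8f)
    {
        return FMath::Lerp(Raw, RunningAverage, Weight);
    }

    // Calibration: treat the value captured with a relaxed face as the new zero,
    // then rescale so the full 0..1 range of motion comes back.
    float ApplyCalibration(float Value, float NeutralOffset)
    {
        const float Range = FMath::Max(1.0f - NeutralOffset, KINDA_SMALL_NUMBER);
        return FMath::Clamp((Value - NeutralOffset) / Range, 0.0f, 1.0f);
    }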
Next we have some corrective blend shapes that 3Lateral put together for us. These are a little easier to explain if you look at the names of the nodes. We have, for example, eyeLookDownLeft: the actual Maya rig has a special shape for when you're looking down and to the left at the same time, but that's not supported by the Apple shapes; there's no look-down-and-left shape from Apple. So what we do instead is grab the values for whether we're blinking and which direction we're looking, and blend them together. We basically say: are you looking down, and are you also looking left? Cool, then let's blend in this other shape as well. That's the heart of how correctives work, and it's also being run through a Modify Curve node; in this case it's set to Blend with an alpha hard-coded at one, so we're essentially replacing the data with the new blended result.

Next we have the joint rotation. When you do jaw open, you actually need to rotate a joint, and when the eyes look around there are joints in the eyes that need to rotate as well. That's all being done by way of a Pose Driver asset, and again, the documentation tells you exactly how to create one of these. The bare-bones, elevator-pitch version: you create an animation in Maya, Max, or whatever, where each frame of that animation represents a different pose. In my case I had 51 individual keyframes (you could have more). If the first pose is, say, eyeBlinkLeft, that doesn't require any bone rotation, so it's just a keyframe where nothing happens. As you march down the list, eventually you get to a keyframe where you should be rotating the jaw, and on that keyframe you'll have a pose where the jaw is as wide open as it gets. When you import that as a Pose Asset, the system goes through and says, "OK, cool, here's the list of all the different poses we need, here are all your keyframes," and matches them up. So keep them in order, and everything will just work.

OK, the last thing we do, and I'm going to give myself a little more room here, is handle the actual head rotation. Again, there are some old experiments for how we were going to handle this floating around inside the animation Blueprint, but here's what we ended up with: we added our own curves on top of the SDK for tracking the yaw, pitch, and roll of the pink Apple face, then we take those curves, make them into a rotator, and drive bones directly with them using a Modify Bone node. Super straightforward. In fact, what I'm doing is taking the total rotation, dividing it in half, and putting half the rotation on each of two different bones, the base of the neck and the top of the neck, so you get a little bit of a blend: the bottom of the neck rotates 50% and the top rotates the other 50%, and you get the full rotation that way. That's why you see two Modify Bone nodes. And that's it; that's your final pose. In a nutshell, it took a second to go through, but it's not really that bad.

OK, so next is the Level Blueprint. This is so basic it's almost not worth showing, but, you know, since we're here. ("Before we dive into this, a couple of questions have come in. Going back to deploying this AR sample: people want to know, if they put this through TestFlight, could it be deployed out to other users?") Theoretically. We didn't try it. The way we were doing this was very fly-by-night: I was working with another one of our engineers who's based up in Salt Lake City, so there was a lot of remote work, and we had a pretty good system of him putting something up on Perforce, me syncing, and just deploying locally. Your mileage may vary, but it should just work if you put it up on TestFlight, because at the end of the day it's just an app; there's nothing really special about it.
So I can't imagine why that wouldn't work, but I make no promises, because I didn't try it. Anything else? ("Yeah, they're wondering how they could turn up the dampening that the Modify Curve node creates.") Oh, sure. The full answer would be: if you have the chops for it, take a look at the Modify Curve source code and at the new weighted-moving-average code, and see what it's actually doing, because you might be able to make tweaks at that level. But essentially all you really need to do is adjust this weighting value. If you're talking specifically about the denoising, you can adjust this weight to bias it a little more toward the smoothed result or a little more toward the raw result. Right now we're at 0.8, which is most of the way toward the smoothed result, but you could fluctuate that a little. If you need to go further than what the alpha allows, then you probably want to crack open the source code and make some changes, which shouldn't be that hard.

What else? ("They're wondering if the tracking could improve with facial markers.") It's possible, but honestly that comes down to exactly what Apple is looking for under the hood in terms of landmarks for how it tracks your face. I would try it; I didn't for this, and it turns out it's tracking relatively well, so I haven't really found a need for it. But it's possible that if I grabbed a dry-erase marker and started drawing dots all over my face, it might get better, which sounds fantastic; I just never got around to trying it.

("Could you use multiple phones?") There's no reason why you couldn't, but you'd probably have to make a minor modification to the engine source code right now, because we're listening for the hard-coded iPhoneXFaceAR Live Link subject name. You might be able to get around that; we didn't test for it, but it's possible, and it should be a minor change. Probably the better way to do it would be to set up different clients, so you basically have a multiplayer game where each player drives a separate client, and it all ties together that way. I don't see why it wouldn't work. ("He said, not being an engineer, 'it's a minor change.'") Yeah.

("And then, using the phone on your face: is there an ideal range? You're holding it about there.") Yeah, it's going to be around a foot, give or take. Honestly, the best answer I can give you is deploy it and give it a shot, because it's going to depend greatly on the ambient lighting, possibly the shape or size of your face, and the natural angle at which you hold a phone. I found that holding it low causes you to look down a little, which can be good, but it also changes how you tend to animate; you move your face a little differently when you're looking down as opposed to when you're looking up. So keep these things in mind. Probably the best-case scenario is to be like the hardcore folks and build a head rig with an arm that holds the phone right in front of you, but again, that's something you should play with; we kept it fairly simple for this first release.

("And then, we kept talking about how a Mac is needed; is that only for releasing the project?") No, no, no.
That's for compiling and deploying at all, because iOS requires your compiled code to be compiled by a Mac; it has to be compiled with Xcode. Just to get the project from whatever your computer is onto your fancy iPhone X, you have to have a Mac somewhere in the pipe: either deploy directly from the Mac, which is actually pretty easy, or there are instructions in our documentation on how to set up remote compiling, so that you basically have a Mac on your network, your Windows machine pipes the information it needs out to the Mac, the Mac compiles it and sends it back, and you deploy that way. Right now we're actually running on a PC: I'm on a PC here, my workstation at my office across the hall is a PC, and we have a central Mac on our network that everybody compiles through. It's not that hard once you get into it. I think right now we're using an older Mac Pro and it's very slow, but it's not that bad. The first deployment, in my case, is usually the slowest one, and that can take anywhere up to thirty minutes, sometimes a little more, because it depends on network speed, how much it has to send over to the Mac for compiling, whether the Mac was already busy, and so forth. There are variables. After you've built, cooked, and deployed once, the next ones average out to something like ten minutes. It's still a bit of a slow process, but that's deploying to device for you. ("All right, that covers the questions for now.") Cool, then I'll jump back to what I was doing.

So, we were talking about the Level Blueprint, and like I said, this is super basic. All it's really doing, if you have to sum it up, is checking to make sure we're on a device that handles facial capture: Is Session Type Supported, Face, yes or no? If so, start the AR session; if not, just throw a string up on the screen and do nothing. (There's a small sketch of that check a little further down.) So when we're running this on PC, like when I'm out here and I hit Simulate and send in data from the phone, this is not an AR session on the PC; there's no AR happening here, it's just a regular app receiving data from an outside source. Only on the actual device do we start up AR. That's all the Level Blueprint is really doing.

Next is the FaceARSave. Open it up if you want to; it's mega boring. All it's really doing is saving a single IP address. That was it. I probably should have added the ability to save the state of that checkbox too, but somebody was like, "so this thing is just going to hold my IP address forever? What if I forget to turn that off?" And I was like, OK, whatever, I don't argue with the security-minded folks.

OK, so next, the FaceARPawn. This is where things start to get a little bit funny, because this actually holds most of the app's functionality; most of what the phone is doing is set up in the FaceARPawn. If I did this all over again from scratch, I would probably put all of this in the Game Mode or some other overarching class, but for a fairly simple one-off example it wasn't the end of the world. Let's take a second and look at it: go to Blueprints and open up the pawn. The pawn is really just our camera. When we Begin Play, we grab a reference to the Game Mode and send ourselves up to it (again, we probably should have stuck most of this logic up there). The important bits are commented, so you can follow along.
As soon as we fire off our main setup event, we check what device we're on: are we on Windows, Mac, or Linux? Because if we are, we do none of what we're about to do. All of this stuff really only matters if you're on an iPhone, and specifically, right now, an iPhone X, which we catch here by asking, "is face tracking supported, yea or nay?" In the Unreal Engine 4 source code we've already set it up so that, at this time, only the iPhone X reports face tracking as supported. On any other device, your Android device, your iPhone 8, whatever, this will reply no, and you'll just get a message on screen saying facial capture is not supported on this device, and that's all the app will do at that point. You can still deploy it; it does work; it's just very boring.

Next, once we know we're on the appropriate device, we set up a whole bunch of UI; that's the settings window I was digging through, and we just create all of that. Next we run a loop that constantly looks to see if ARKit has found your face, because once it has, we basically attach some objects to it, and a lot happens under the hood, so we need to know when that happens. We do this on a loop every few fractions of a second; in fact, I think it's five times a second that we check whether we've seen a face yet. As soon as we have, we set up our geometry: we keep a reference to the face-tracking geometry so we can use it later if we need to. That's how we're actually turning the pink face on and off in the settings window. And it's not checking five times a second forever: the moment it finds the geometry, it stops checking and clears itself out.

Then there's a different function we'll go into now: a constantly looping event that checks every two seconds (so, very low performance hit) to see if your face is still visible. That's why, when you look away, there's a brief delay before it loses you and says, "hey, I don't see you anymore." You could probably vary this a little; I felt that five seconds was too long and one second was too short, but you can play with it. And then the last thing is just setting up our save game, which was super basic. OK, back to the presentation.
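Put together, the startup logic described above (the Level Blueprint's check plus the pawn bailing out on unsupported devices) amounts to something like the following. This is a hypothetical C++ rendering of the Blueprint nodes the stream shows ("Is Session Type Supported", "Start AR Session"); double-check the exact signatures in ARBlueprintLibrary.h for your engine version.

    #include "ARBlueprintLibrary.h"
    #include "ARSessionConfig.h"

    void StartFaceTrackingIfSupported(UARSessionConfig* FaceARSessionConfig)
    {
        // In 4.20, only the iPhone X reports Face sessions as supported.
        if (UARBlueprintLibrary::IsSessionTypeSupported(EARSessionType::Face))
        {
            UARBlueprintLibrary::StartARSession(FaceARSessionConfig);
        }
        else
        {
            // On PC (or any unsupported device) skip AR entirely; the map just
            // receives Live Link data from outside instead.
            UE_LOG(LogTemp, Display, TEXT("Facial capture is not supported on this device."));
        }
    }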
Next is the Kite Boy head prototype. If we go back over here, this is just the Kite Boy. He's not really super important on his own, but he does have some things that are vital to the project. First off, he holds the Apple ARKit Face Mesh component. This is the thing you're looking at when you see the pink Apple face mesh; it is this component. If you have any need for it, if you want to see it, or attach things to it, like if you wanted to build your own app of the kind you're starting to see in the App Store where you can put a hat on your character or stick a mustache on, this is the mesh you would use to collide against and check. It has a lot more functionality than we've covered here, but this is where it lives. In this case it lives on our pawn; it technically could live anywhere, but I wanted to put it in the same location as our skeletal mesh just to be safe.

The other thing he does, which is fairly important, is he has a call-in-editor event, a Blutility, which allows us to calibrate in the editor. The reason we had to have two calibrates is that Live Link is just sending over the raw data; it's sending exactly what ARKit is seeing, not the data after it's been calibrated. So we have to calibrate it on the far end as well if we want the kid's face to appear calibrated when we're puppeting on the PC. All we did is create a simple event that calls the exact same calibration logic, the exact same function call that happens on the device; we're just calling it from here as a call-in-editor event. That's why we're able to select the kid (he looks so much like he's about to sneeze), click Calibrate In Editor, and have it all just work. ("Does that calibrate based on what the phone sent?") No, it calibrates based on the current data it sees; it runs the exact same calibration logic the phone did, just a local copy of it. That does mean you'll probably have tiny variations between how the phone is calibrated and what the PC sees, but if your facial pose was relatively the same, you're fine. And correct, the phone is a one-way connection: the phone is just sending Live Link data, it gets nothing back. That's why everything over here is kind of its own little sandbox, and we have to do some of the work twice.

All right, those are really all of the major Blueprints. There's probably a little more that could be gleaned if you dig through them on your own, but that's where all the heavy lifting is taking place.
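If you were wiring up a character like this in C++ rather than Blueprint, that "Calibrate In Editor" button is just an editor-callable function. A hypothetical equivalent of the Blutility event might be declared like this (the sample itself does it in Blueprint, and depending on engine version you may need the CallInEditor meta flag rather than the specifier):

    // Inside your pawn or character class declaration:
    UFUNCTION(CallInEditor, Category = "Face AR")
    void CalibrateInEditor(); // runs the same calibration logic the phone uses,
                              // storing the current Live Link values as the neutral pose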
Some things people will probably want to know how to do (as he checks his watch): recording with Sequencer, because you want to record this facial animation. Let's give a quick example of what that looks like. I'm going to open the app back up on the phone and just leave it on Apple's pink face (well, it's not pink by default, I made it pink), then turn on the Live Link connection and hit Connect. I'll turn myself so I'm kind of looking back at the screen. Now we need to record this in Sequencer, so we go under Window, down to Sequence Recorder, and I usually dock this over here someplace. There we go. Now we'll just add the Kite Boy, because he's selected, and do a quick recording. OK, cool; I probably should have been saying something while it recorded, but I didn't. Let's stop. Great, we can see that got recorded, bam, and we can open it up.

Let me put the phone away for a second, because at this time there's a tiny quirk you should be aware of. Let me dock Sequencer down here so you can see what I'm talking about. Here we have the Kite Boy proto, we have his animation that was just recorded, and if we scrub, nothing seems to happen. It looks like it's broken. It's not broken; there's just a little idiosyncrasy here that, if you're watching this in the future, might be fixed by the time you try it (which is a fancy way of saying this is probably a bug). If we select the Kite Boy and go over to the Details panel (I know this is hard to make out on the livestream, and I'm using this monitor from seven miles away), his Animation Mode is currently set to Use Animation Blueprint. He's using his animation Blueprint, which is not receiving any data, so he's frozen. We can set that to Use Animation Asset, and take a look in the new Cinematics folder that was just created under Sequences; actually, just hit the drop-down. OK, it went the long way and tried to create a Pose Asset, which is not what I wanted. What I'm used to is clicking on this and it becomes a drop-down; what actually happened is it wants me to hold the mouse button down. It's just some setting on this PC. This is fine. I'm great. Oh, you can also see both of the kids at once, which is really disturbing, so let me hide the one that's natively in the scene: go to the World Outliner, and there's the boy head proto. Cool.

Now here's the thing that will throw you off. If we jump back over to the Details panel with this kid selected, we've set him to use a specific Animation Asset, the one we just recorded, and that all works. The problem is that he's a Spawnable now, so as soon as you scrub out of the Sequence and back in, the setting clicks back. That's the part that's a little wacky. The way to fix it is to come back over here, set your Animation Mode back to what you want, and then, since it's keyable, rewind to the beginning of your timeline and drop a keyframe. Now he's fine: even if he despawns and comes back, he's still using the same setting. The long and short of it is that, at this time, he will lose that setting unless you key it.

("We've got a question real quick: they've been asking if it's possible to hot-swap facial meshes, in case they'd like to rapidly switch between different characters." "Zak, you just talk so much you wore out the microphone and it couldn't keep up with you.") Swap out different faces? I don't see why you couldn't, but again, it's something I haven't tried. Any animation Blueprint tied to that Live Link node listening for iPhoneXFaceAR should just be receiving that data, so you can drive anything you want. Actually, in a second I'll show you how we can drive multiple things at the same time. What you're asking should just come along for the ride; as long as they're all using a similar setup in their animation Blueprints, you would just hide one, show the other, boom, it just "works." (You know, with finger quotes.)

OK, so we did the Sequencer bit. It's also worth pointing out, as a matter of course, that over in Sequences, under Animations, we can now open up that recorded animation, which looks a little... that is really creepy. I'm not sure what caused that specifically; it looks like it's actually being driven by two things at once. There was some funny error with the way that recorded; I've never seen that before, so I'm going to assume it's not necessarily a problem. ("A livestream is probably the place to get live bugs.") Totally; every time you give a demo, something has to act a little quirky. But it does work; I think it's just because of the way we have the capture set up right now.

OK, the next thing I want to cover real fast is: what else can you do? At the end of the day, this is just zero-to-one float curve data coming in, and you can do anything you want with it. In this case we're using it to drive facial animation, but you can reconnect those curves to pretty much anything. Real quick, before I show an example, I want to mention something, though.
So okay, we did the sequencer bit. It was also pointed out, kind of as a matter of course, that over in Sequences, under Animations, we can now open this up, and I have that animation, which is... that is really creepy. I'm not sure what caused that specifically; it looks like it's actually getting driven by two things at once, so there was some funny error with the way that recorded. I've never actually seen that before, so I'm gonna imagine it's not necessarily a problem. A live stream is probably the best place to get live bugs, yeah, totally. Every time you give a demo something's gotta act a little quirky, but that one does work; I think it's just because of the way we had the capture set up right now.

Okay, so the next thing I wanted to cover real fast is what else can you do. At the end of the day this is just zero-to-one float curve data that's coming in, and you can do anything you want with that. In this case we're using it to drive facial animation, but you can reconnect those curves to pretty much anything. Real quick, before I show an example, I want to mention something though: if you need the face tracking data or the curve values on the device, you can use the Apple ARKit Face Mesh component and grab them that way. The face AR mesh component can actually grab the blend shape values; that's something you can play with if you take a look at that component in Blueprint. If you need them on the computer, though, the Apple ARKit Face Mesh component is never created and doesn't exist, because you're not actually running AR there. So what you'll have to do is use Live Link: grab them from an animation blueprint that has that Live Link node, and then you can remap them and do whatever you want.

So here's a really quick, sort of hacky example. Here I have the skybox selected back behind the kid, and I'm gonna add some Blueprint script to this; I don't really care what it's named. Let's go over to the construction script. I'm gonna grab the static mesh for the light dome and do a Create Dynamic Material Instance for the material I happen to know is assigned to it. Was it the HDRI skybox instance? I think it's this one... yeah, okay, it's this one, that's good. You'll notice that this instance has an HDRI tint property which, if we play with it, you'll see what it does: it just tints it. So let's tap into that, just as a test to make sure we can do something with the data. First off, let's promote our dynamic material instance to a variable; I usually call these "DynaMat" or something similar, because it sounds like Fotomat. I am that old.

Then we need to get a reference to our animation blueprint, so I'm gonna do something that I typically don't do and don't necessarily recommend you do, because there are more efficient ways to do it, but because we're only doing it once and it's a really light scene, we're going to use Get All Actors of Class. It's okay, I'm a professional, this is fine. We're gonna search for "proto" and there's the kite boy head proto, so we're gonna get one of those. From the result, let's just Get a copy; we'll leave the index at zero because there's only one in the scene and we know that. Since we know that's what we're holding, let's just get his anim instance, which is a fancy way of saying let's get his animation blueprint, and that automatically creates a reference to his skeletal mesh, which is a pretty nice shortcut. Then let's promote this to a variable and we'll call it the AnimBP. Boom. Okay, so now we have access to his animation blueprint, where all of that Live Link data is coming through and stuff is happening to it. Awesome.

Now what do we do? Well, let's write our own custom event, which I will now rename. Just a quick aside to the audience: I desperately want to be able to type anything I want and then see an option for "create custom event named whatever I typed" right there inside the drop-down. I've only been asking for that since the first day I used UE4. All right, so we will call this CheckJawOpen, which is gonna be the easiest thing for us to check, or at least one of the most obvious ones anyway. As that receives updates, what it's gonna do is grab the AnimBP and Get Curve Value; specifically, the curve name we're looking for is jawOpen. Let's compile real quick, check my viewport, everything still seems good. Let's grab our DynaMat, and from this we will Set Vector Parameter; the name we saw earlier, in case you need a refresher, was HDRI_Tint, like so. And then, boom, awesome. Now what do we do with numerical data? Well, let's just grab the curve value and do a lerp, a linear interpolate, between two linear colors.
The first one will be white, which is no tint whatsoever, basically the default state. And then for color number two, what's a good color? Is it green? It's still a good color, green means go... no, it's still a good color, you're right, but no... all right. So now we'll just take the output of our curve value and use that as the alpha to drive a blend between these two colors, compile and save. Now... fuchsia? All right, that'd be terrible and I might feel bad. Oh, and I've got the kid from Sequencer actually in here as well, so let me just make that go away, I don't want you here. Yeah, he actually does... you can control that one... oh yeah, come back... wow. I'm just gonna restart this. Oh dude, I've been really naughty with that app, turning it on and leaving it off. All right, come on. I love that splash screen, makes me smile. I know, right? That was me making that cheesy smile, too. Okay, so we'll turn on the debug mesh, set up the Live Link connection, and connect.

Okay, so this is not working yet, and that's okay; the reason is pretty obvious, I just didn't follow through. We set up this really cool event, but nothing is driving it right now, so let me fix that real fast, for those of you who are like "how is this supposed to work?" You're right, it's not gonna. Let's do a Set Timer by Event; the event will be our little CheckJawOpen guy over here. Let's do, looks like, ten times a second, which is pretty extreme but that's okay, and we'll loop. Compile, come back out here, and test again. Oh my god... ah, oh, he's still working. So this is not that interesting, but it does show that you can do whatever you want with this data, so this could be driving anything.
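For anyone who prefers reading this as code, here is a rough C++ equivalent of the Blueprint we just built. It is a sketch rather than anything shipped with the sample: AJawTintSkybox, SkyDome, and FaceMesh are made-up names, and the HDRI_Tint parameter and jawOpen curve are the ones used in this scene, so swap in whatever your own material and Live Link remap expose.

```cpp
// Hypothetical C++ version of the Blueprint above, not code from the FaceAR sample.
// Assumes FaceMesh points at the head whose AnimBP is fed by Live Link, and that the
// skybox material has a vector parameter named HDRI_Tint and the curve is jawOpen.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "Components/SkeletalMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Animation/AnimInstance.h"
#include "TimerManager.h"
#include "JawTintSkybox.generated.h"

UCLASS()
class AJawTintSkybox : public AActor
{
    GENERATED_BODY()

public:
    UPROPERTY(EditAnywhere, Category = "FaceAR")
    UStaticMeshComponent* SkyDome;

    // Point this at the head's skeletal mesh component in the editor.
    UPROPERTY(EditAnywhere, Category = "FaceAR")
    USkeletalMeshComponent* FaceMesh;

    UPROPERTY()
    UMaterialInstanceDynamic* DynaMat;

    FTimerHandle JawTimerHandle;

    AJawTintSkybox()
    {
        SkyDome = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("SkyDome"));
        RootComponent = SkyDome;
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // Equivalent of the Create Dynamic Material Instance node in the construction script.
        DynaMat = SkyDome->CreateDynamicMaterialInstance(0);

        // Equivalent of Set Timer by Event: poll CheckJawOpen ten times a second, looping.
        GetWorldTimerManager().SetTimer(JawTimerHandle, this, &AJawTintSkybox::CheckJawOpen, 0.1f, true);
    }

    void CheckJawOpen()
    {
        if (!DynaMat || !FaceMesh)
        {
            return;
        }

        UAnimInstance* AnimBP = FaceMesh->GetAnimInstance();
        if (!AnimBP)
        {
            return;
        }

        // The 0..1 blend shape value coming through Live Link into the anim blueprint.
        const float JawOpen = AnimBP->GetCurveValue(TEXT("jawOpen"));

        // Lerp from white (no tint) to green and push it into the skybox material.
        const FLinearColor Tint = FMath::Lerp(FLinearColor::White, FLinearColor::Green, JawOpen);
        DynaMat->SetVectorParameterValue(TEXT("HDRI_Tint"), Tint);
    }
};
```

Polling on a timer mirrors the Set Timer by Event node used here; in a real project you might read the curve every tick instead, or push the value out from the animation blueprint itself.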
And actually, on the device, you could have it where the game or experience or whatever is constantly tracking the expression you're making, maybe recording it every few seconds, and then you could change things about the game. There actually is, on my computer, a very rough prototype that I have tentatively named "Keep Smiling and I Won't Kill You"; use your imagination. It's actually kind of a cool test. You could make like a mood ring. Totally, yeah. So what I did was I made changes to an AI behavior tree based on whether or not I was smiling, so we had a character that was just patrolling, minding his own business, but the moment you stopped smiling he starts hunting you. It's really, really easy to set up, and it was a lot of fun. Yeah, though if somebody who's really grumpy played it... exactly, you'd probably be accused of discriminating against a certain group of people, and who wants that?

So anyway, let's review real fast. First off, we talked about what the FaceAR sample is; we talked about the app and the project. We talked about the requirements, which, again, I fully recognize there are a lot of: in terms of hardware, in terms of software, in terms of the developer account that you have to maintain, which is 99 bucks a year. But if you're really serious about this and you need some facial animation and you want to get this going in a project, it's probably cheaper than a certain list of alternatives, so it's probably worth checking out. We talked about how it works and how you can use it. We talked about the blueprints and the setup, at least at the 10,000-foot view, to give you an idea of what they're doing; by all means dig into them and you'll see all the fun little idiosyncrasies and random tests that were left inside. And then we talked about other things you can do with the data: we showed you how you can record it into Sequencer, and we talked about how you can drive other blueprints with it.

Final thoughts: make sure you read the documentation. I was supposed to put the address for that doc at this point, but yeah, it'll show up at some point, it'll probably get overlaid on the video. Do a search in the docs for "Face AR Sample" and you'll find it real fast; again, thanks to the docs team for knocking that out. Next, another tremendous thanks to 3Lateral; this project rocks thanks to you guys, because you guys rock as well. And with that, we're done. Are there any questions? Very quickly, yep: have you tried this with full-body capture streaming at the same time? You absolutely can do it with full-body streaming, because that would just be a separate Live Link node listening to anything else; it could be an Xsens suit, a Vicon setup, anything you like, I guess maybe a fancy Kinect-like setup. It totally works. In fact, if you saw our Siren project we showed, was it at SIGGRAPH last year, that's exactly how that worked: we had one device capturing the face, another device capturing the full body, and the animation blueprint just has two of those Live Link Pose nodes. It's super easy.

It looks like you addressed most of the questions. If you have remaining questions, throw them into the forum thread for this livestream and I will peruse through them and answer what I can over the next few days. Perfect, this is amazing. Now that we're done, it kind of does make you want to get an iPhone X and start doing wacky faces. I've been doing it for like the last couple of months; people walk by my office and I'm just like... wow, that explains that, I walked by one day and you were... okay, yeah, I was like, I don't want to bother him, just keep moving... you might have been right. Anyway, thank you all very much. If you have feedback for us or topic requests, I did toss a survey link into the chat, please fill that out; it lets us know what you'd like to see and how well we're doing, and we're also giving away t-shirts, so we're raffling those to all who fill it out, provided you agree to share your email. We have the best t-shirts, by the way. That's pretty ridiculous.

Also, always check for your local UE4 meetup at meetup.com/pro/UnrealEngine; there are a lot of great devs, hopefully in your area, making cool stuff, and if not, you can start one, so there's more information there on who to contact if you're interested in starting one. As always, submit to the NVIDIA Edge program; I love getting to send out NVIDIA graphics cards to really, really sweet projects, so make sure to follow up at unrealengine.com slash NVIDIA Edge on that. I want to talk about the countdowns, of course: each week we start the stream with a five-minute countdown, and the one that we showed today was Kindred Games' Swords and Magic and Stuff. We have four or five of these different projects, so if you're interested in doing one, take something like an hour of footage, compress it into five minutes, and send it over to us with a PNG of your logo and a short description of your game. You can send that to community@unrealengine.com; we'll take a look at it, and if we decide to use it, we'll make sure you get a shout-out on the stream and it'll just be a permanent part of that rotation. I totally wish I'd known that, because I would have sent you one for this, for assembling all of this, and it would have been me at high speed just going... can we still do that?
I mean, sure, why not? Let's do that, maybe next week. And make sure you guys follow us on social media; we're super active on Twitter, Facebook, Instagram, LinkedIn, and obviously follow us here on Twitch. Oh, and since we're talking about Twitch: if you do stream, make sure you select the Unreal Engine category, because now you can stream under it. Really? Yeah, we're a non-game category. That's pretty cool, right? Pretty awesome. We try to do cutting-edge things, you know. Yes, only every day. Every day. Thank you so much for joining us. My pleasure, great to have you. Well, we'll see you all next week. [Music]
Info
Channel: Unreal Engine
Views: 56,131
Keywords: Game Development
Id: AIHoDo7Y4_g
Length: 68min 47sec (4127 seconds)
Published: Thu Oct 04 2018