MetaHuman Animator Tutorial | Unreal Engine 5

Video Statistics and Information

Captions
Can you believe how realistic facial animation has become? This is MetaHuman Animator, the absolute best way to capture your facial animation. It uses your phone's camera to reconstruct a 3D model of your face and then processes that into a super realistic animation. This wasn't possible until just a few weeks ago. You probably want to know how much it is. Absolutely free. It's free! How long does it take to set up? Only 20 minutes. And guess what, you can do it too: just record your face with your phone, process it in Unreal Engine 5, and boom, just like that you have Hollywood-level face animation. In fact, it's so easy to use that for the rest of this video we're gonna turn into MetaHumans. Nice, Paras, let's go.

Here's what we're gonna do today. We're gonna teach you how to set up your iPhone with the Live Link Face app to capture facial motion, then bring the footage into Unreal Engine 5 to calibrate, process, and transfer that facial animation to your MetaHuman. But we're going to take it a step further: once you get the animation working, you're going to end up with a super realistic result on a floating head. Nobody wants that, so we're going to show you how you can connect the neck back to the body. On top of that, we're going to give you more tips, because a moving head on a static body ain't gonna cut it. We're gonna use the default body animations available in Unreal Engine to drive the body animation and blend that with the head to get full-body motion.

Let's begin by capturing our footage. First we need an iPhone so that we can capture depth data. Download the Live Link Face app from the App Store and select MetaHuman Animator as your capture mode. You need to record two takes here: one is the calibration take, and the other is the main animation, which will fall onto your MetaHuman. Before we start, always name your takes by clicking the text and changing it. For the calibration take we need a clear static shot of you looking straight, at a slight left angle, at a slight right angle, and lastly you're gonna open your mouth wide and show those beautiful teeth. Once you are done, stop recording. Now go ahead and change the name of your take before you record the main take. Press record again to do the main take. You can always have more than one main animation, so record as many as you like.

We're done with our recordings, so let's hop onto our computers and open the Epic Games Launcher. Make sure you've already installed Unreal Engine 5.2 or above. Go to the Marketplace and search for MetaHuman; install the MetaHumans and MetaHuman plugins to your latest version of Unreal Engine. If you've already downloaded these plugins previously, go to your Vault and Install to Engine. Go ahead and open a new project. From Quixel Bridge, add a MetaHuman to your project and set up your scene with some sexy lights. Before we start: some of the processing that we're going to do might be heavy, so if you've got a beefy computer, go to the Project Settings, search for caching, and copy the following settings. Now go ahead and create a new folder in your content browser, because we're going to stay clean and civilized here. Within the folder, right-click and hover over MetaHuman Animator.

Before we get into it, let us clearly explain what we're about to do. First, we're going to use the Capture Source to import the footage recorded on the iPhone. Then we're going to use the MetaHuman Identity to calibrate a new MetaHuman face based on the actor; you only do this part one time per actor. Lastly, we're going to use the MetaHuman Performance to process the actual take and bake it into a usable animation sequence that can be transferred to any MetaHuman face.

Let's get started. Create a Capture Source and double-click it. For the Capture Source type you have two options. With Live Link Face Archives you can select the files locally from your computer; that's for when you want to manually transfer the files from your phone to your computer. We are going to go with Live Link Face Connection, which uses your network to transfer the footage. All we need to do is fill in the device address. First of all, make sure both your iPhone and your computer are on the same network. Remember the gear icon on the top left of the Live Link app? Click on it, choose OSC server, and copy the IP address into the device address. Save, and we are done here. Now go to Tools, then Capture Manager, select your capture source, and you should see your takes. If a notification pops up on your phone, allow it. Add your takes to the queue and import them all.

Back to the content browser, it's time for calibration. Right-click and select MetaHuman Animator, then MetaHuman Identity. Select Create Components from Footage and choose the calibration take. Here's why naming your takes when you're recording them is important. Down here you can use a slider to move across your video. Remember the poses we did earlier during the recording? We now need to show UE where the front, left, and right poses are in the video. Choose a keyframe where you're looking straight and promote it by clicking the plus sign. It will give you a warning; click OK. You should now see the landmarks correctly fitted on your face; most likely you don't need to change them. Once you promote a frame, your timeline is locked; to unlock it, click the camera icon on the bottom. Then repeat the same process for when you're looking to the left and to the right. After the markers are created, click MetaHuman Identity Solve. Click on Body, select the body type, then click on Mesh to MetaHuman and choose to auto-rig the identity's skeletal mesh as a full MetaHuman.

Now we need to show UE the teeth. Click on Poses, then Add Pose, and add a teeth pose. Use the same calibration video, find a shot in which your beautiful teeth are clearly visible, and promote it. Here you can see four landmarks to define your upper and lower teeth. In this case both the upper and lower teeth are visible, so we're going to use them. If your actor's upper teeth are covering the lower teeth, the lower-teeth landmarks are going to be messed up; in that case you can hide and disable them by clicking on these two icons. You will get a warning, but don't worry about it; even the official UE video disabled them. Once you're done, click Fit Teeth, then click Prepare for Performance and wait for the solve to finish. This is going to take a while, so go make yourself a cup of coffee while you're waiting.

Back to the content browser again. You should now see a MetaHuman Identity for your actor. It's time to process the main animation to be used for other MetaHumans. Right-click and select MetaHuman Animator, then MetaHuman Performance, and double-click it. Choose the main take for the Footage Capture Data, and for the MetaHuman Identity select the identity of your actor. For the head movement mode we are going to choose Control Rig. Click on Process to automatically solve the facial animation. Once the animation is done, play it back and see the magic of MetaHuman Animator.

Here you have two options. You can either export the animation and simply add it onto a MetaHuman head, which is faster, but your head will be detached from the body, and when you attach it back it will lose the neck movements and you're left with a static animation. We don't want that. We're going to go with the second approach, which is inspired by ibrews (shout out to him). Here we'll be able to attach the head to the body, give the body its own animation, and preserve the neck movement. This is the most realistic and ideal outcome, so let's go. Select Export Level Sequence and click Create; this way it sets up the entire sequencer for us.

Back in the content browser, you should now see a level sequence. Right now in our scene we have the take video, the floating calibrated MetaHuman on the right, and our main MetaHuman, Professor X, on the left. Our goal is to bring the animation from the calibrated MetaHuman to the main MetaHuman. In order to do that, we have to bake the animation so that it's transferable. However, we can't just go and bake it from the calibrated MetaHuman; we need to first switch the MetaHuman head to match the main MetaHuman, in our case Professor X's head. So go ahead and click the calibrated MetaHuman, then in the details panel, under Mesh, Skeletal Mesh Asset, go ahead and select your main MetaHuman head. You should now see the switch. Play back the animation to make sure it's working. But before we bake, let's make sure the main MetaHuman is in our sequencer so that we can animate it: click on the main MetaHuman, then in the sequencer click Track, Actor to Sequencer, and add the MetaHuman blueprint.

Now it's time to bake. Click the calibrated head; it should be highlighted in your sequencer. Go ahead and open it up, right-click Face, and select Bake Animation Sequence. Name it whatever you want and export it as an animation sequence. Select your main MetaHuman, and in the sequencer, on the Face track, click the plus icon, go all the way up to Animation, and look for the baked animation you just created. By now you have a fully functioning animation, but a head that is detached from the body. So now let us show you how you can add an animation to the body, connect the head back to the body, and blend the neck and body animation together. For the body animation we're going to be using the default animations that come with the third person pack in Unreal Engine 5.
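Before moving on to the body, the facial side of the workflow covered so far can be summarized as a toy sketch. Everything here is illustrative (the function and take names are made up for the example); the real processing happens inside Unreal Engine's MetaHuman plugin, not in code:

```python
# Toy model of the facial-capture workflow described above.
# All names are hypothetical; the actual work is done by UE's MetaHuman plugin.

def metahuman_face_pipeline(calibration_take, main_takes):
    # 1. Capture Source: import the takes recorded on the iPhone
    #    (Live Link Face Connection over the network, or local archives).
    footage = {"calibration": calibration_take, "mains": list(main_takes)}

    # 2. MetaHuman Identity: calibrate the actor's face, once per actor
    #    (promote front/left/right frames, fit teeth, prepare for performance).
    identity = f"identity_from_{footage['calibration']}"

    # 3. MetaHuman Performance: process each main take against the identity
    #    and bake it into a transferable animation sequence.
    return [f"anim_{take}_via_{identity}" for take in footage["mains"]]

anims = metahuman_face_pipeline("calib_take", ["main_take_01"])
```

The ordering is the point the video stresses: the identity solve happens once per actor, while performances are processed per take and can then be transferred to any MetaHuman face.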
In order to add that, open your content browser, click Add on the left, and go to Add Feature or Content Pack. Click on it; here you're given a bunch of options. We're going to choose the Third Person pack and Add to Project. In your content browser, if you go to Characters, which is a newly added folder, you will see the default mannequin characters that come with the Third Person pack, and they come with their own set of animations like jump, idle, and walking. We want to retarget these animations to our MetaHuman skeleton so that we can use them across all of the MetaHumans in the project. To do that, go to the Characters, Mannequins, Animations, Manny folder, shift-click all of the animation sequences, then right-click, hover over Retarget Animation Assets, and click Duplicate and Retarget Animation Assets/Blueprints. On the top right, for the IK Retargeter, choose RTG_MetaHuman. For the source we're going to choose the mannequin, in this case Manny, and for the target we're gonna go with the basic MetaHuman body. Click Retarget. What's great is that you now have all these extra animations that you can put across all of your MetaHumans.

So let's go ahead and do that. Back in the sequencer, go to the body of your main MetaHuman, click the plus icon, go all the way to Animation, and choose one of the newly added animations; in this case we'll go for the idle. At this point we have an animated body and an animated head, but they are detached from each other. You might also encounter the same issue as us, where the body animation is not showing in the sequencer. If that happens, close your engine and reopen it. Now that we are back and the body animation is visible in the timeline, I want to go ahead and switch it with something a little bit more obvious, so we're gonna go for the walking animation.

First thing we're gonna do is attach the head to the body. We're gonna do that by clicking the body in the sequencer, and on the right side, in the details panel under the Animation tab, go ahead and keyframe the Animation Mode and turn on Disable Post Process Blueprint. We're gonna pretty much do the same thing for the face: go back to the sequencer, click on the face, and on the right side make sure you keyframe the Animation Mode and turn on Disable Post Process Blueprint, the same thing. At this point you have connected your head to the body and the animations work together. However, if you've noticed, you've lost the original neck animation that you recorded with your iPhone, and right now the body animation is driving the neck. The most realistic outcome is to bring back the neck animation we recorded and blend it with the body animation. So if you're down to be the best, let's move on.

In order to do this, we need to go to the face animation blueprint. Click on the face of your main MetaHuman in the sequencer, and on the right side, under the Animation tab, within the Anim Class, click the search icon so that it pulls up in your content browser. Double-click the Face_AnimBP. On the left side, double-click the AnimGraph. Once the tab opens up, zoom in and look for the Layered Blend Per Bone node. Click on it, and on the right side, in the details tab, change the Blend Mode to Blend Mask. We need to define where we want the blending to happen, so on the top click on the Skeleton. On the left side, click the gear icon, and under Blend Profiles select Add Blend Mask. In the search bar, look for the head and click on it; you should see the head bone highlighted on the right side. Now it's as simple as dragging the slider for the head bone all the way to the right until we get to the value of one. Go ahead and save. Then, on the top right, let's go back to the blueprint. On the right side, expand the Blend Mask, and in the index you should be able to see the blend mask that you created earlier, so select it. Once you do this, compile, and we are done. Go back to the sequencer and see the magic of the head animation blended with your body animation.

And that's it for today's video. See you guys next time. Make sure you hit that subscribe button; if you don't, I'm going to haunt you in your dreams as a floating MetaHuman.
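The Layered Blend Per Bone setup described in the captions can be sketched numerically. This is a simplified model, not Unreal's implementation: each pose is reduced to one float per bone instead of a full transform, and all bone values are made up for illustration.

```python
# Simplified model of UE's "Layered Blend Per Bone" node with a Blend Mask.
# Poses here are one float rotation per bone; the real node blends full
# bone transforms. All bone values below are invented for the example.

def layered_blend_per_bone(base_pose, layer_pose, blend_mask):
    """Blend layer_pose over base_pose, weighted per bone by blend_mask.

    base_pose:  bone -> rotation from the body animation (e.g. walking)
    layer_pose: bone -> rotation from the baked facial/neck capture
    blend_mask: bone -> weight in [0, 1]; unlisted bones default to 0
    """
    out = {}
    for bone, base in base_pose.items():
        w = blend_mask.get(bone, 0.0)
        layer = layer_pose.get(bone, base)
        out[bone] = base * (1.0 - w) + layer * w
    return out

body = {"pelvis": 5.0, "spine_01": 2.0, "neck_01": 3.0, "head": 4.0}
face = {"neck_01": 8.0, "head": 12.0}   # recorded iPhone neck/head motion
mask = {"neck_01": 1.0, "head": 1.0}    # the video drags the head weight to 1;
                                        # neck_01 is included here so the sketch
                                        # keeps the recorded neck movement

blended = layered_blend_per_bone(body, face, mask)
# head and neck follow the capture; pelvis and spine keep the body animation
```

A weight of 1 on the head means the facial capture fully overrides the body animation for that bone, which is exactly why the recorded neck motion survives the blend while the rest of the skeleton keeps walking.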
Info
Channel: Bad Decisions Studio
Views: 127,211
Keywords: bad decisions studio, daily vlog, vlog, bad decisions vlog, #youtubevlog, vlogger, vloglife, startup, vancouver, canada
Id: hZ2mkcd4C7M
Length: 14min 1sec (841 seconds)
Published: Tue Jul 04 2023