Insanely EASY Face Performance Capture for METAHUMAN ANIMATIONS in Unreal Engine

Video Statistics and Information

Captions
Good morning! Today we will have a look at how easy it is to use real-time performance capture directly in Unreal Engine using your Apple devices. So let's have a look.

First we need to download the app. It's a free app called Live Link Face, for Unreal Engine, so simply download it and open it up. Go into the settings and head to the first menu, called Live Link. Here we can connect the iPad to the computer: simply hit Add Target and type in your IP address. If you don't know your IP address, on Windows you can press the Windows search menu, type in CMD, and then type in ipconfig; down here you can see your IP address, which is obviously blurred for security reasons. Simply enter that IP address into the app. You can leave the port as it is by default. Next you want to enable Stream Head Rotation, and then you can head over to Unreal Engine, where we'll continue.

Here we are in Unreal Engine. This is a simple scene I've set up with a MetaHuman walking towards the camera. You can drag and drop your MetaHuman into the level, and that's all you need to do; I've made some extra steps, but these aren't necessary at all. You can go ahead and create a sequence and then drag and drop the MetaHuman into it.

We will need some project plugins, however, so open up the Plugins window and search for "live link". Here you want to enable the first two: Live Link and Live Link Control Rig. Then search for "apple" and enable Apple ARKit and Apple ARKit Face Support. You must then restart the engine to make sure these changes apply.

If you are experiencing issues with your MetaHuman's hair, this is probably caused by the LOD system, so go into your MetaHuman blueprint and, under the LODSync component, change Forced LOD from -1 to 0.
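The connection step above needs your computer's local IP address. Besides running ipconfig by hand, a quick cross-platform way to find it is a short Python sketch (this is my own illustration, not part of the tutorial; the 8.8.8.8 address is only a routing hint, no packets are actually sent):

```python
import socket

def local_ip() -> str:
    """Return the machine's LAN IP address (the one to enter in Live Link Face)."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # "Connecting" a UDP socket transmits nothing; it only makes the OS
        # pick the outbound network interface, whose address we then read.
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    except OSError:
        # No route (e.g. offline): fall back to resolving the hostname.
        return socket.gethostbyname(socket.gethostname())
    finally:
        s.close()

print(local_ip())  # e.g. 192.168.1.23
```

Whatever address this prints is the target you would enter in the app's Live Link settings, leaving the port at its default.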
This will make sure that the MetaHuman is always running at the maximum resolution and LOD level. Then, finally, select your MetaHuman and enable Use ARKit Face, and in the dropdown menu for the face subject select your Apple device, in my case the iPad. Now you can see that the facial capture is translated from the iPad over to the MetaHuman in Unreal Engine. Simple as that, which is very pleasing: oftentimes things in Unreal are very difficult, but this particular thing is quite easy and surprised me with its efficiency.

Every face is different, so you need to make sure that you calibrate for your face. You do this by pressing the small icon in the lower right corner and pressing Recalibrate. Here you want to make a very neutral face, and you need to compensate for your lips: if you notice that the MetaHuman has an opening in its mouth, you need to counter it with your own mouth. So make the most neutral face you can, then look at your MetaHuman and try some movements, and if it doesn't look good, simply recalibrate.

The final question is, of course, how do you record what you're doing, and how do you combine it with your body movement? In my example I've already applied a Mixamo animation to the body, and I want to combine it with my facial performance capture. Go to Window, then Cinematics, then Take Recorder; this is where we can record our actions. Under the small arrow menu here, select the sequence that you want to record your actions to (you need to already have the MetaHuman blueprint in the Sequencer for this to work), and here you can change the take. Then it's only a matter of hitting Record, and it counts down from three so you can prepare before the take.

Once the take is done you will see something very creepy, namely two duplicates of your MetaHuman. This is just a preview of the recording, so you can either temporarily toggle off the visibility of your original MetaHuman, which is the one not in
the Sequencer, as you can see here, or you can disable it entirely so it's not visible in the render, if that's the take you choose. You do this by searching for "visibility" or "visible" in the details panel and then unticking the Visible setting for each body part of your MetaHuman. Doing this will leave only the recorded MetaHuman visible, which you can then render. This step is a bit tedious, but you only have to do it if you're rendering.

And here you can see the final output. It's not a perfect facial performance capture by any means, but considering you can do this with only your iPad or iPhone, it's pretty remarkable in my opinion. You will find the saved sequence in a separate folder called Subsequences, and if you want to re-record, you can simply select the sequence once again and change the take number.

Thank you so much for watching, and goodbye!
Info
Channel: tiedtke.
Views: 5,641
Keywords: social media marketing, red camera, tiedtke, content creator, videography, influencer, komondo, tutorial, blender, vfx, editor, short film, films, director, Unreal Engine, facial performance capture, digital humans, real-time animation, 3D modeling, motion capture, facial animation, artificial intelligence, metahuman animator, 5.2, ue5, animation programming, computer-generated imagery, CGI, 3D rendering, special effects, digital compositing, digital doubles, virtual production
Id: y3wzvMGMbFE
Length: 7min 5sec (425 seconds)
Published: Thu Mar 23 2023