Unreal Engine Metahuman Facial Motion Workflow with Faceware

Video Statistics and Information

Captions
Hi, I'm Gabriela, and I go by Feeding_Wolves. I am an all-in-engine virtual production artist specializing in motion capture with MetaHumans. I've been using Unreal Engine for a little over a year, entering this world without any experience in 3D applications and having been a bartender for the last decade. I am currently working on a short film in Unreal Engine in which the facial performance of my character is extremely important, especially with dialogue.

The character I am using was originally created by Michael Weisheim. I had the face portion removed with the help of Blender artist Pixel Urge so that it could be replaced with a MetaHuman face; he also taught me how to customize the face textures in Substance Painter. The body textures were completely remodified by talented 3D character artist Daniel Kadena, and the hands were re-rigged by character artist Constantine Lacrua. For all of the lighting, I received assistance from Bernard Ryder.

For anyone that does not know what a MetaHuman is: it is a high-fidelity, fully rigged character that can be created and customized in MetaHuman Creator. MetaHumans can be accessed using Quixel Bridge and then easily imported into Unreal Engine, with all of this being completely free. "Hi, I'm a MetaHuman, and I am fully rigged and ready for motion capture. I am made by 3Lateral and Epic Games. A ton of effort was put into making me look as realistic as possible. Look, look at my eyes, for example. They look super real."

My experiments with facial motion performance began with syncing: in the story, my character speaks in Greek and discovers a mirror image of herself, so the Greek MetaHuman was one of the first tests I did in order to try syncing audio with body and facial performance, as well as mirroring the movements.

OK, let's see. My current setup consists of the Mark IV HMC by Faceware, which streams live RGB data into Faceware Studio and is recorded in Shepherd. I'm also using the Xsens Link suit to animate the body and the Manus Prime II gloves to animate the fingers. Then, for the face, I am using the Mark IV head-mounted camera by Faceware, which transmits video data to Faceware Studio, which uses the Live Client plugin by Glassbox to send this information into Unreal Engine. Oh yeah, all of this is being run on a Puget Systems workstation with an NVIDIA RTX A6000. And that's my phone, so I gotta go, bye!

I'm able to integrate all of this data and record it, while also streaming it directly into Unreal Engine, by simply pressing record in one piece of software, which then gives me two animation files: one for the face and one for the body. I then record my audio separately and will do something like a clap and open my eyes wide, using these as markers to sync everything up in Sequencer later.

The steps involved in setting up facial motion to work with MetaHumans and Faceware are simple. With my MetaHuman imported into my Unreal Engine project, I enable the Live Client plugin by Glassbox, or, if you are working in Unreal Engine 4.27, Faceware now has a free Live Link plugin available. I then assign the motion logic blueprint to the MetaHuman face. This motion logic blueprint sits between the raw solver data from Faceware and the MetaHuman pose asset. A special thank-you goes to Norman Wang from Glassbox, as there are only a few people in this world who can tailor such an intricate motion logic blueprint. Combined with Faceware's tracking system and solver, this will give you the most natural-looking expressions.
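Editor's note: the clap and wide-eyes markers described above come down to computing a per-track frame offset. The sketch below is a minimal Python illustration of that arithmetic; the frame rate and marker times are hypothetical, not taken from the video.

```python
# Illustrative sketch: aligning separately recorded audio, body, and face
# takes using clap / wide-eyes markers. All numbers are hypothetical.

SEQUENCE_FPS = 30  # assumed Sequencer display rate

def seconds_to_frame(t: float, fps: int = SEQUENCE_FPS) -> int:
    """Convert a marker time in seconds to the nearest frame."""
    return round(t * fps)

def sync_offset(reference_marker_s: float, track_marker_s: float) -> int:
    """Frames a track must be shifted so its marker lands on the reference.

    A positive result means the track should be moved later (right)
    on the timeline.
    """
    return seconds_to_frame(reference_marker_s) - seconds_to_frame(track_marker_s)

# Hypothetical marker times found by scrubbing each take:
audio_clap_s = 2.40   # clap heard in the audio recording
body_clap_s  = 1.20   # clap seen in the body animation
face_eyes_s  = 0.90   # wide-eyes pose in the facial animation

print("shift body track by", sync_offset(audio_clap_s, body_clap_s), "frames")
print("shift face track by", sync_offset(audio_clap_s, face_eyes_s), "frames")
```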
I begin my facial motion workflow by first capturing ROMs, also known as ranges of motion, as this allows me to retarget my facial motion directly onto my MetaHuman, make adjustments using the multipliers in Faceware Studio, and get my expressions to match the MetaHuman I am using as closely as possible. This is all done in real time. In this example, I show what the facial expressions look like with and without a profile. Once my expressions approximate my MetaHuman's, I stream in the actual performance.

MetaHumans are heavy assets, which is why optimizing involves lowering LODs on the MetaHuman and paying attention to frame rates in Unreal so that no data is lost.

Once my facial performance is captured, in Sequencer I add my audio, the body animation, and the facial animation; then, using the sound of the clap, the wide eyes, and the clap itself as my markers, I sync everything up. Before I begin fine-tuning the facial motion, I like going through all of the controls on the face control rig board of my MetaHuman, in case there are any areas that might need to be adjusted in the Creator before continuing. I then bake my facial performance to the control rig and go through the performance looking for areas that might need adjusting. Since MetaHumans have over a thousand bones and blend shapes, Faceware will get you 90 percent of the way there, but in order to fully utilize the face I use an additive backwards solver to add on top of my current animation. I'm currently learning how to use curves to smooth out any jitter, and since I am new to working with curves, this is still a work in progress. I highly recommend you watch the video shared by Unreal Engine on the face control rig board featuring Adam Walton. I also have a video library inspired by that video, where I've organized all of the face control rig board controls using timestamps and go over the process of creating an additive backwards solver.

The next thing I am planning is moving my entire motion capture workflow with MetaHumans into Unreal Engine 5, so that I can redevelop my pipeline utilizing the new animation tools and take advantage of Lumen. This gives me an opportunity to see where the new features take my story and allows me to continue to learn, because the process of creating is when you learn the most: you find solutions to problems and create new pipelines. As a virtual production artist, I am still learning.
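Editor's note: for the LOD point above, here is a minimal Unreal editor Python sketch of one way to pin a selected character to its highest-detail LOD during capture. MetaHuman blueprints normally drive LODs through a LODSync component, so treat this generic skeletal-mesh route as an assumption about setup, not the canonical MetaHuman method.

```python
# Illustrative sketch: force the highest-detail LOD on every skeletal mesh
# component of the currently selected actors, so facial detail is not
# dropped while capturing. Run from the Unreal editor's Python console.
import unreal

for actor in unreal.EditorLevelLibrary.get_selected_level_actors():
    for comp in actor.get_components_by_class(unreal.SkeletalMeshComponent):
        # set_forced_lod is 1-based: 0 restores automatic LOD selection,
        # 1 forces LOD 0 (the highest-detail level).
        comp.set_forced_lod(1)
        unreal.log("Forced LOD 0 on " + comp.get_name())
```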
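Editor's note: the jitter-smoothing step is described but not demonstrated. Below is a minimal Python sketch of one common approach, a centered moving average over per-frame curve samples; the values are hypothetical, and in Unreal the equivalent would be applied by editing keys on the baked control rig curves.

```python
# Illustrative sketch: smooth a jittery animation curve with a centered
# moving average. A larger radius removes more jitter but also softens
# intentional motion, so it is a trade-off, not a fix-all.

def smooth(values: list[float], radius: int = 2) -> list[float]:
    out = []
    for i in range(len(values)):
        lo = max(0, i - radius)
        hi = min(len(values), i + radius + 1)
        window = values[lo:hi]          # neighbors within the radius
        out.append(sum(window) / len(window))
    return out

# A jittery brow-raise curve sampled once per frame (hypothetical values):
raw = [0.10, 0.14, 0.09, 0.16, 0.40, 0.36, 0.44, 0.41, 0.12, 0.15]
print(smooth(raw))
```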
Info
Channel: Feeding_Wolves
Views: 96,794
Keywords: unreal engine, metahuman, metahumans, virtual production, facial animation, facial motion capture, body motion capture, ue4
Id: RLajzvIh95E
Length: 6min 52sec (412 seconds)
Published: Mon Apr 18 2022