Artist to AI? Custom Metahuman Unreal Engine 5 Tutorial

Video Statistics and Information

Captions
Jason, is that [Music] you? [Music]

I just want to make something really clear before we get started with this video: the AI tools used to assist this pipeline are FaceBuilder by KeenTools, Mesh to MetaHuman (I can't actually confirm whether AI is used in the mesh generation, but I'm assuming it is), and MetaHuman Animator. Why do I feel like I need to mention this? I personally feel there is much more to understand here than a step-by-step process to reach a cool result. I by no means aim to contribute to the automation of creative processes, nor to the theft and homogenization of art. The majority of this video is actually my creative process, informed by my authorial vision, if you will. But if you're interested in the overlap between AI and art and the ethical implications of under-legislated development, I highly recommend you watch Jimmy McGee's video; the link is in the bio. Okay, on with the video.

Let me walk you through the process I took to turn myself, a CG artist here at Inadar, into AI with MetaHuman for Unreal Engine 5. If you are a junior like myself, or an indie film developer, or just somebody who wants to immortalize themselves in the digital realm and is that level of vain, like me, then this is perfect for you. Crafting realistic characters has been a challenge for people in the CG industry for years; a question that has been battled from the very start is how, and to what extent, artists can emulate life in their work. Today I'm going to be exploring the fascinating world of transforming yourself into AI. AI can now create 3D models, rig them, animate them, and even trick your family into believing it's you. In our industry we're taught that two major features of classical animation, whether 2D or 3D, are the endowing with life and the endowing with motion (I don't think I'm saying "endowing" correctly... endow?). Now, with AI in the pipeline, we have another approach to the process of endowing with life, and making this MetaHuman really made me think about the wider conversation around new forms of digital life, mind and agency. As an artist who still has issues with how unlegislated AI is, and who is watching the consequences firsthand in terms of job loss, it feels like we can barely scratch the surface of what the actual implications are, because it is such a new and rapidly evolving field. This project actually surprised me: I watched the lines between animate and inanimate gradually dissolve. I do stand by the notion of "feed artists, not the algorithm"; however, artists like myself can hugely benefit from learning and optimizing combined workflows. When we see posts shouting about how AI is going to replace artists, the death of human culture, the AI revolution is nigh, it's not exactly inspiring. Stop, AI. I said stop. Whoa.

So let's begin with stage one: shape. I used a combination of photogrammetry and the FaceBuilder add-on by KeenTools for Blender. I found FaceBuilder useful for texturing later in the pipeline (more on this later), but photogrammetry produced a more accurate representation of my facial structure, so I'm using a combination of both in this video; I'm more than happy for you to choose just one. I scanned myself in and took picture references with flat lighting. Make sure the FaceBuilder add-on is enabled in Blender. In FaceBuilder, create a new head, add the photos you've taken, double-click on the front view and hit the Align Face button. When the little red dots pop up, you can tweak the pin points to make the fit a more accurate representation; you can see we're deforming the mesh it has generated, which I think starts out as just some generic dude. Drag those pin points onto your face, honey. After you've aligned that view, press Escape to exit it, double-click on the next one (side left, side right) and repeat the process until you have pinned this mesh onto your face using AI. This saves so much time. Obviously, if you're a high-level modeler you'd want to do this stage with a lot more detail, but for what we're doing, all we need is the basic structure in order to grab the textures.

Next, hit the UV butterfly option, as you can see here, and generate the texture using only the right, left and front views; don't bother with the three-quarter views, as they can create problems with the texture mapping. Basically, it goes through the cameras and projects the images you fed it onto your little face. You can see it's been projected, yay; they're having a little dance, we're having a little dance, we can go on to the next stage. Hit export; PNG is perfectly fine, you can do JPEG, but I just prefer PNGs. You'll be given a butterfly-unwrapped texture base. We're not going to use this as the final texture, but it will be super useful to layer on top of our textures later on. When you export your FBX, make sure you select Copy for Path Mode and hit the little tray icon next to it (the embed-textures toggle); basically you're telling Blender to export all of the files associated with the FBX, so the textures come along without you having to manually relink them.
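If you'd rather script that export than click through the UI, here is a minimal Blender Python sketch of the same settings; the output path is just a placeholder, and it assumes your head mesh is the selected object:

```python
import bpy

# Export the FaceBuilder/photogrammetry head to FBX together with its textures.
# path_mode='COPY' copies the image files alongside the FBX, and
# embed_textures=True (the little "embed" toggle next to Path Mode)
# packs them into the FBX itself, so nothing needs relinking in Unreal.
bpy.ops.export_scene.fbx(
    filepath="//exports/my_head.fbx",  # placeholder path, relative to the .blend file
    use_selection=True,                # export only the selected head mesh
    path_mode='COPY',
    embed_textures=True,
)
```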
This leads us on to stage two: Metafy. Head into Unreal, set up a new folder (because we are civil) and title it Meta-whatever-your-name-is, then import your asset. Because we embedded the associated files, it comes in with all of its textures. Lovely. In this stage, make sure all the plugins you need are installed: the MetaHuman plugin, MetaHuman Animator, and Live Link. Grab them from the Epic Marketplace and restart your engine if you haven't got them installed. Create a MetaHuman Identity asset, rename it whatever you want, double-click it, and create Components from Mesh, selecting the FaceBuilder/photogrammetry mesh you imported. Switch the viewport to Unlit, since we don't need any shader information, and line up the camera (okay, yeah, I'm going to give it a dance first). Line up the camera, wiggle it forward, trying to get it as front-on as possible, then hit the plus button at the top to Promote Frame. After you've promoted this frame as the front one, hit Track Markers on the active frame. That does something similar to the FaceBuilder pinning we did in Blender: it lays out all of the markers MetaHuman needs, using the FBX and this promoted frame, to generate our MetaHuman. If it isn't perfect (most of the time it's pretty good), you can select the green markers and reposition them into the correct locations, but for me, as you can see, it's pretty much spot on. If you're happy with all the markers, hit MetaHuman Identity Solve. This usually takes the longest amount of time, but after it has finished, hit the split view and you'll be able to see: wow, we've got a MetaHuman, and a little friend whispering in our ear.

After that, hit Body and select your body type for the MetaHuman. A little criticism: they haven't got the most options for body types. You've basically got tall, tall and skinny, medium, and larger, so maybe in future iterations of MetaHuman we could get some more inclusive body types; that would be really good. Then select Mesh to MetaHuman, and this is where the magic happens. Head over to MetaHuman Creator at metahuman.unrealengine.com and we'll see our MetaHuman solve looking, blinking and staring at us. This is the blank canvas MetaHuman has created for us based off the photogrammetry and FaceBuilder information we sent over. It doesn't look anything like me at this stage, because we haven't done any customization, but if you strip everything back, maybe remove the hair and the textures, you'll be able to see a slight resemblance in the profile, and this is where it's really important to match up your profile.

In Custom Mesh we can enable editing, which allows us to push and pull this template mesh MetaHuman has very kindly created for us into something a little more recognizable, and also fix any errors that happened in the automatic mesh generation. For me, there were some strange hairline issues that often happen with photogrammetry, so if you've got big hair, facial hair, or something wrong with the original mesh, you can slide Region Influence down to zero and that will reset the area to something MetaHuman recognizes: a bit more clean, tidy and organized. You can also hit the Sculpt button to push and pull selected parts of the mesh into something more recognizable. This is incredible because it happens in real time; you can even have the textures applied while you're doing it. In a classical pipeline you would typically complete the modeling, move on to texturing, and never touch the modeling again, especially once you have a rig, because if you deform the mesh or change the topology you may not be able to rig it properly. That's not the case here: you can push, pull, edit and remold this MetaHuman at any point, and if you spot something even after you've exported it, you can go back, edit, and re-import your MetaHuman. There's a continuous feedback loop you can enter, which is fantastic for an industry where there are always going to be edits needed from directors and unhappy clients. But that's okay; that's where this becomes so useful.

So hit Sculpt and load up PureRef with the reference photos you took for the FaceBuilder stage. You can hit Ctrl and plus or minus to adjust the transparency and Ctrl+T to make PureRef transparent to the mouse, so you can push and pull the mesh with the reference directly in front of it and get a perfect alignment. This is a really fun process; I've just sped it up a little here to show you how easy it was. I'm pushing and pulling; I had some issues with the jaw, as you can see, really trying to match up that profile. One thing I will say is that MetaHuman is a bit limited in how much sculpting can actually be done. If you want to change the jaw, for example, you can either slide it forward or backwards; you don't have the degree of freedom you have in, say, ZBrush.
If I were to do this again, I would export the base mesh into ZBrush (see how I switched out there) and make sculptural changes to really get that level of precision, always trying to push for more; maybe that would be another episode. Look from every single angle, and consider the lens lengths used in your reference photos, because focal length will distort what the face actually looks like. The biggest pro I've found with this stage is that it has actually created a rig for you inside your character and made it ready to be animated, along with real-time renderable materials and a groom. Groom is something I haven't touched; groom actually scares me a little bit, but obviously I'm going to be learning it soon, so thank you, Epic, for doing that. It's honestly incredible, and because of the LOD systems set up in Unreal, the quality means you get to where you want to be faster. One of the biggest criticisms I've seen of MetaHumans so far is that, because of how long it's been out, you can kind of start to tell where a MetaHuman is being used: "oh yeah, that's an off-the-shelf MetaHuman." I disagree, because if you treat MetaHuman as a base creator for your characters and then add all of your quality iterations, such as in Substance Painter and what we're going to be doing later in this video, you end up creating something very bespoke, very individual, and in my opinion AAA quality, in a very short amount of time. But hey, I might just be impatient.

Trying to achieve likeness is an artist's biggest challenge, I'd say, if you're into replicating life and realism, and even in characterization: trying to get some form of likeness, figuring out what makes a face recognizable as somebody you know, or just as a living human, and surpassing the dreaded uncanny valley. What makes a face recognizable as your friends, your lovers, your family? How do your actions and your personality shine through outwardly? Everybody knows the eyes are the windows to the soul, but also, because I stare at myself a lot in the mirror and have spent a lot of time in front of cameras and screens, I know my face very well: you know exactly what makes your face yours. For me, I have freckles in very distinct locations, which I won't be able to reproduce inside the MetaHuman Creator, but what it does have in the Skin tab is the ability to customize certain areas of redness. You can see I add a bit more yellowness to my forehead, because that's what I have; I have redness and an almost residual tan over my nose bridge from when the sun was out in the UK; under my chin I've got some redness from where I touch my face; and I've got acne scars that I definitely wanted to keep in, because how often do you see characters with actual acne or scars? It makes me very distinctive; it's something I've done to my face. I pick at my face, I get a little marking on it, and it shows. It's visual storytelling. It's silly, but it's about trying to imbue life in every little detail. At the end of it, you'll have something that is not 100% you; we're aiming for around 70%. Even if from a certain angle it looks nothing like you, from other angles it does; it's whatever you play to camera. I found myself giggling a lot throughout this process, because my face was just being distorted. I spent a week staring at my own face; that's self-care, that was meditation. What is self? See what I mean: it goes down into this rabbit hole of who am I staring at, are they staring back, what is going on, what is mine, why is that not mine, why does it look weird?
And everybody can have an opinion on it, because it's hardwired into our brains as humans, in our amygdala, our lizard brain, our facial recognition. That's what causes the uncanny valley: our precise ability to recognize faces and distinguish between what is alive, friendly and familiar and what is potentially dead, ill, diseased or otherworldly.

That takes us into stage four: makeover. I brought my MetaHuman in through Quixel Bridge, and in order to give this girly a makeover, we're going to find her head mesh inside Unreal and export it as an FBX without levels of detail (LODs) selected in the export settings. Take that into Blender, back into Blender, and clean it up so it's just the skin of the face, if that makes sense: get rid of the eyelashes, get rid of the eyebrows, hide the rig (we're not going to be able to export it), and just get rid of everything so you're left with a skin mask of your face. God. We're also going to go into the head folder inside Unreal and find all of the face base colors. We're just going to work with the base colors today, but you can also export the normals and roughness to adjust those; I was pretty happy with the level of oiliness I got inside MetaHuman Creator, so I didn't really touch the roughness. You get the gist: export all the face base colors, and export the head as an FBX. That is the most important part.
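I did this clean-up by hand in the Outliner, but as a rough sketch of the same idea in Blender Python (assuming a hypothetical object name for the skin mesh; check the Outliner for what your MetaHuman head actually imports as):

```python
import bpy

# Hypothetical name for the face/skin mesh -- yours will differ.
HEAD_NAME = "head_lod0_mesh"

# Select everything except the skin mesh (lashes, brows, eyes, teeth, rig, etc.)...
bpy.ops.object.select_all(action='DESELECT')
for obj in bpy.data.objects:
    if obj.name != HEAD_NAME:
        obj.select_set(True)

# ...and delete it, so only the "skin mask" of the face is left.
bpy.ops.object.delete()

# Export just that mesh as FBX for texturing in Substance or Quixel Mixer.
bpy.data.objects[HEAD_NAME].select_set(True)
bpy.ops.export_scene.fbx(filepath="//exports/head_skin_only.fbx", use_selection=True)
```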
I developed two approaches for editing the textures as efficiently and cheaply as possible. The first is Photoshop. Now, Photoshop is not free, but there are plenty of similar programs; you can even use Sketchbook for iPad, really any layer-based software where you can edit pictures and draw over them. You load in your base color texture and then add all of your edits on top, along with adjustment layers so they can be turned on and off, because we want to import and export each version of the base color layer that we exported from Unreal. That might sound confusing, but essentially, instead of making one texture for one base color, we're going to make five textures. This is because of the facial texture blending happening in MetaHumans, which really brings out the next level of realism. It's a bit hard to describe, and I don't actually know exactly why or how it does it, but all I know is you need to do this process five times. Photoshop makes that pretty easy, because you just swap out the base layer with the next version. You do have to kind of guess where textures will fall; for example, with my pretty precise eyeliner, my eyebrows and my freckles, I know they have to be in very specific places, and trying to map that out on a 2D file like this is quite hard. I tried it with the eyebrows first and got a pretty good match, but as you can see, the left eyebrow was a little bit janky. So I ended up using the second approach: importing everything into Substance, which, to be honest, I would now just do straight off the bat. Substance does require a bit of understanding, and again it's not free, but you can use an alternative such as Quixel Mixer, which works in a very similar layer-based way.

Why did I move into Substance? You know exactly where you're placing things and what it will look like on the model; you can be very precise, export 4K textures, and have completely free rein over what your MetaHuman will look like: adding the eyeliner, of course, and the eyebrows, but also adding my specific pores over my nose. The way I did this was by importing literally just a front-on picture of my face, which looks absolutely ridiculous, like a temporary tattoo being slid across my face. It's like a mug shot on a mug shot on a mug shot. Set the projection setting to UV, which means it projects the image onto the UVs rather than the model, and in this way you can effectively place, remove and adjust all of the textures you need to. So this is it without the edits, and one by one I can show you the changes I made: left eye, right eye, adding to the eyebrows so they look more equal, and a little bit of contouring for my makeup girlies. It really feels like I did my own makeup, but in CG. I had a lot of fun with this stage, and I can imagine myself doing loads of different makeup experiments on this meta-Lily. This is where the butterfly map we created at the start of the video with FaceBuilder really helped: grabbing the eyebrows and pasting them on top in Photoshop, because everything's flattened already; even just having it inside Substance was really useful as well. If you're going the pure photogrammetry route, don't worry: as you can see, you can just take a front-on photo of yourself and load it straight into Substance, Quixel Mixer or wherever you want to texture.

Once you're happy with your adjusted textures, export everything out of Substance as 4K texture maps, but keep the same names as the originals (just these ones, sorry, I can't be bothered to do them all). As you can see, if I load it in now, instead of just the base mesh, ta-da, she's looking a lot more like me. This is what I was talking about earlier: facial texture blending, triggered by blend shapes, I believe, inside Unreal's MetaHuman blueprint. Just for demonstration purposes I applied only the main texture to my MetaHuman, and when I scrunch the face up you can see the transition between the original texture without any of the edits and our updated, edited MetaHuman; when you move the face, the freckles, the slightly paler skin and the different lips appear. You've just got to plug them all into the correct slots and they should work fine.
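I just dragged the exported maps back into the Content Browser over the originals, but if you want to script the round trip, here is a rough sketch using Unreal's editor Python API; the file paths, texture names and content folder are placeholders for wherever your MetaHuman's face textures actually live:

```python
import unreal

# Placeholder paths -- point these at your exported 4K maps and at your
# MetaHuman's face texture folder inside the project.
EXPORTED_MAPS = [
    "C:/exports/face_basecolor_main.png",
    "C:/exports/face_basecolor_animated_01.png",
    # one file per base-colour variant, keeping the original asset names
]
DESTINATION = "/Game/MetaHumans/MetaLily/Face/Textures"

tasks = []
for path in EXPORTED_MAPS:
    task = unreal.AssetImportTask()
    task.filename = path
    task.destination_path = DESTINATION
    task.replace_existing = True   # overwrite the original base colours in place
    task.automated = True          # suppress import dialogs
    task.save = True
    tasks.append(task)

# Import all the edited maps in one batch.
unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks(tasks)
```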
So what's the final step to bring it alive? Well, it's animation. Look, it's moving, it's [Music] alive. Reality is not achieved and life is not emulated unless the fidelity of motion matches the fidelity of the asset; if your beautiful asset, which looks like you, moves like a robot, it's not going to feel realistic. MetaHuman have done it again: around three months ago they released a plugin called MetaHuman Animator, which is absolutely mind-blowing, so I'm going to run you through the process of making your MetaHuman move like a human. We're going to record a performance using the Live Link Face app, as well as a calibration take of your face. That involves again looking front-on and to each side, and also doing a cheesy teeth pose, so that the engine can work out what is going on with your face, what its proportions are, and what it looks like when you expose your teeth; that way, when it animates you talking, it looks like you talking rather than a puppet. This is crazy. There are many videos documenting how the actual mechanics work under the hood of this pipeline, but I'm going to take you through the process of getting it to work.

I recorded my calibration take and my animation take. Before you hit record for either of these, make sure you rename your videos to something recognizable, like calibration_01 and test_animation_01. For each one, create a Capture Source in Unreal Engine: right-click, head to MetaHuman Animator and select Capture Source. Then, making sure that your phone (or whatever you're using to record your takes) and your machine are on the same network, connect them up and import your footage. The next step is to create another MetaHuman Identity, but when we hit Create Components we select Create from Footage instead of Create from Mesh like we did earlier, and this is all done on the calibration take. We're doing a similar process: line the camera up with the front-on pose and select the plus button to Promote Frame, and like it did with the mesh earlier, it will create little green markers to detect where your face is. Promote the frame, then do the same after unlocking the view for the side profile and the other side profile; you have to unlock the camera each time to promote extra frames, so you'll have three frames promoted to start with. Hit MetaHuman Identity Solve at the top, and the AI will create another base mesh for the animation takes. To make this animatable, select the Body again and then hit Mesh to MetaHuman. After that takes a little while, we prep the teeth: select Poses, Add, and Add Teeth Pose, go to the frame where the teeth are exposed in a smile, promote that frame, and hit Fit Teeth. Just a tip I've read somewhere: if you have your mouth open it doesn't work all that well; you can end up with a weird overbite and sometimes even teeth poking through the lips mesh, so to avoid that, just do a normal smile. After we've set our teeth pose, select Prepare for Performance.

Next, create a MetaHuman Performance; this is the facial performance where we load in the non-calibration video, so just the take of you talking and doing your thing. For the MetaHuman Identity, select the one we've just created, and then all you have to do is select Process. Now, this takes a long while; well, it depends on how specced-out your computer is, but generally speaking this takes the longest of all the processes, because it's motion capture: automated, AI-based motion capture using a phone. You get to watch it process and do its magic, and by the end of it you'll have a fully animated identity. Watch your back; be creeped out by it, because you didn't expect the animation to be this good. If I were to hand-animate all of this, oh my lord, I would not know how long it would take, but it certainly wouldn't take the fifteen minutes it took to process. Whilst this may seem like a lot of prep work and clicking buttons without quite knowing what they're doing (at least that's what I was feeling when I was learning this process), this is where the AI does the rest of the work for you. Export the animation onto the head using Export Animation, or create a level sequence, connect it to the body with the blueprint and add mixed emotions, and you have a fully animated, developed MetaHuman character that you have uniquely customized.
If you've made it this far, congratulations: you have now immortalized yourself in the digital realm using AI-assisted workflows, traditional art customization, and the power of Unreal Engine 5. I'm extremely proud of the results I managed to create in such a short amount of time. Pygmalion moment: falling in love with my own image. I invite you to share your thoughts and opinions on the conversation surrounding AI entering the world of art. I am conflicted, intrigued, excited, and looking forward to seeing what else I can create using this tech; cautious is the right word. AI has been a part of the human psyche for centuries and is something we have continually been afraid of, yet we seem to be embracing it now, and I'm curious to see what the percentages are among the viewers attracted to this video. This episode is my debut in the Alphabet Superset challenge made by Struthless, so A is for Artist turning into AI. I'll be going down the alphabet and posting, hopefully every week, a different letter, exploring where this channel will grow. If you're intrigued to see what else comes out of my mind and what other work will be experimented with, I invite you to join me and we can help teach each other. If you have any tips for smoothing this pipeline out, or corrections if I've generalized or made any mistakes, feel free to leave them down below; I welcome everything. I hope you learned something, or at least thought something new, today. Thank you for joining me on this philosophical deep dive and workflow tutorial; I'd love to see what work is being created out there using such crazy new technology. For now, that's everything. Thank you so much for watching, and I will see you guys [Music] later.
Info
Channel: loelfolio
Views: 21,876
Id: T9eYbeYvO3M
Length: 31min 46sec (1906 seconds)
Published: Fri Sep 29 2023