MetaHuman

Captions
MetaHuman came out three or four years ago now; I still have the scars from the first year we decided to use it in production. MetaHuman is an incredible piece of software that lets you design human beings, well, MetaHuman beings; they're not real human beings, not yet. It lets you design a wide variety of characters that you can then bring into Unreal or many other pieces of software, and it really does give us a way to build lots and lots of unique characters. We've talked about it in the past, but we thought it would be good to bring it back and have another conversation, because it's changed, it's updated, it's gotten better over the past couple of years. So we brought back Nick, who I've actually done MetaHuman and Unreal work with, to talk about where we're at. Nick, take it away.

Thanks, Alex. MetaHumans are a good three years old now in terms of public availability; of course they'd been in development for longer than that, but I believe they became publicly available at about this time in 2021, and they've matured quite a bit in the last three years. It's in keeping with Epic's overall mantra: it's very easy to get started and there's no cost, and then if you want to dig deeper and deeper there's a whole rabbit hole of advanced features you can get into. The exciting thing is that there's no cost to get started, and you can immediately create very photorealistic digital characters with these tools and bring them into Unreal Engine. Unreal Engine has gone through a number of revisions over those same three years and has dramatically improved things like the animation tools and motion capture retargeting. It's now fairly straightforward to use almost any motion capture system to capture body movements and apply them to MetaHumans natively within Unreal's own ecosystem, and that's a really big improvement. Another thing that has improved over time is the availability of alternative clothing, uniforms, and things like that. The MetaHuman tool natively has a small selection of clothing to choose from, but now that MetaHumans have been in the wild for so long, there are plenty of 3D modelers and fashion designers creating alternative costumes and such. So that's the overview of where we're at.

Let's dig into it and show people what it is. It's been a long time since we've talked about it, so a lot of people watching won't have seen it; let's give people an introduction to how you put something together. One thing you do need is an Epic account, so you're going to sign up for your Epic account and then join. It's kind of a funny thing: it spins up a server for you when you jump into MetaHuman, so it takes a couple of minutes. Sign in to your account, go get a tea or coffee, and come back; it takes a little time because it's actually spinning up an instance, on AWS I believe, to serve you. You really have your own server over the web to work with. Go ahead, Nick.
Do you prefer I share my screen through my camera feed? I can hopefully switch. There you go. Now you're seeing, through my camera feed, what the MetaHuman Creator interface basically looks like. If you search for MetaHuman Creator you'll probably land on the Unreal Engine MetaHuman page first, and if you scroll down there's a "launch the app" button. As Alex was just saying, you do want an Epic account, which is free to create; there's a button for that on all of their websites. The launch button opens a web app, so you can use this on any device that supports Chrome or another modern browser, and on the other end of the browser there's an AWS server running the interface.

I'm going to show the most basic, straightforward way of editing a MetaHuman. This one I've created before, so I've got it selected and I'll edit the selection. Again, the app is running on a server somewhere, so this could be on a Mac or a PC and it would be equally effective. I can choose Blend here, and what's set up is a number of template characters to choose from: existing characters with a variety of ethnicities and facial features. From them, I can put characters into what's essentially a painter's palette, except where you might normally mix colors, we're mixing physical traits.

And you're not mixing the whole face; you're using these as seeds and mixing different attributes: I want eyes that are a little like this one, or a little like something else, and you move it around and mix for the eyes, or the nose, or the forehead. Correct. If I were to begin this process, the first thing I'd do is choose the base MetaHuman, the initial template. In this case it's Vincent, and I chose him because his overall facial and head structure is closest to mine, by my estimation, versus any of the other options. Then I looked for useful traits in other characters. In blend mode we can see little circles on different portions of the face; if I click the circle on the nose, I'm choosing between Natalia and Wallace here. I've got all of the different characters in my palette to choose from, and I can blend between essentially three at a time: in the center is Vincent, in the upper left is Wallace, directly up is Natalia, and I can fall in between until that nose looks kind of like mine. Same thing with the mouth; in this case I was blending between Miles and Aoi, and for the eyes I was using Zachary. So all five or six of these sets of facial features are being used in different ways in different regions of the face I'm creating, to approximate something that looks a little like me. Once I've finished with the blending and used it to the best of my ability to get close, I can move on.
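MetaHuman Creator's blend palette is purely a GUI tool with no public scripting API, but the idea it implements can be sketched in a few lines of Python: each facial region is a weighted mix of up to three neighboring template presets. The region names, feature vectors, and preset structure below are hypothetical, purely for illustration.

from dataclasses import dataclass

@dataclass
class FacePreset:
    name: str
    features: dict  # region name -> feature vector, e.g. {"nose": [...], "eyes": [...]}

def blend_region(presets, weights, region):
    """Weighted average of one facial region across up to three presets,
    mirroring how the palette blends between neighboring templates."""
    assert abs(sum(weights) - 1.0) < 1e-6, "weights must sum to 1"
    dims = len(presets[0].features[region])
    out = [0.0] * dims
    for preset, w in zip(presets, weights):
        for i, v in enumerate(preset.features[region]):
            out[i] += w * v
    return out

# e.g. a nose that is 60% Vincent, 25% Wallace, 15% Natalia:
# blend_region([vincent, wallace, natalia], [0.6, 0.25, 0.15], "nose")

The point is only that each region gets its own set of weights, which is why the nose can lean toward one template while the eyes lean toward another.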
The next level of detail is sculpt mode. In sculpt mode I can have symmetry on or off; I'll leave it on for now. This gives me an opportunity to use smaller circles: maybe I pull the sides of the nose in a little (I'm just right-clicking to look from a different angle), maybe the tip of my nose goes out a bit but the bridge goes in, and you can see each of these adjusts the face in a slightly different way. We're symmetrical right now, but if I felt like one side of my mouth should move differently than the other, I could click in and drag it down a little. There's also a move selector that lets us select a group of these controls at once, and this lets us roughly sculpt a face based on those various templates.

One thing worth saying: you may want to make someone look exactly like somebody else, but there's also the ability to create lots of characters that look roughly like what you want. For previz, a game, or other uses, you may not need or want them to look like a specific person. We're talking about trying to match somebody here, but what's really cool is that you can output many different characters very, very quickly.

Exactly. So, say I wanted to make a digital child version of myself. I'll select a slightly different hairstyle, back when I had hair. For body proportions, one thing I can do is increase the head scale relative to the rest of the body, which makes the character appear younger, and I can also make the body a bit shorter. I originally chose facial features that matched something close to me, but now I'm trying to make this a little younger. Then I can go into the skin for the face; I'd chosen a texture somewhat similar to my own, but I can choose a different face texture, something smoother and younger, maybe more freckles. I'll stop right there. With the eyes, maybe I push them to be a little larger and wider, again making the overall look a bit younger.

What I will say is that this is endless hours of fun, outside of anything you may actually need to do. You can export these and use them in production, but it's also just fun; you're working on the look of something, and I find it a really enjoyable experience, like another way to doodle. Absolutely; you have a lot of room to operate within the scope of this software itself.

One other thing: we can put a graphic onto this sweatshirt. What's nice is what's really happening: this is a PNG file with three masks, red, green, and blue, so once this is over in Unreal Engine I could replace it with any other set of colors, or any other logo if I wanted. Essentially I'm just saying "a logo goes here"; the logo itself can be swapped out completely. It's a logo placeholder. Exactly: the fact that I've got this here means this texture will go over to Unreal Engine when I send the character into a project, and then I can look at the three colors, two different shades of gray and a blue in this case, and all of that is reconfigurable in the engine. I could make a wardrobe of several different hoodies, for example, just by changing the colors and fabrics after export.
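Because the mask texture splits the garment into red, green, and blue regions driven by material parameters, that recoloring after export can even be scripted in the Unreal editor. Here is a minimal sketch using Unreal's Python editor scripting (it needs the Python Editor Script Plugin enabled); the asset path and the parameter names ColorA, ColorB, and LogoColor are assumptions, so check the actual material instance for the real names.

import unreal

# Hypothetical path; browse your MetaHuman's clothing folder for the real one.
mi = unreal.EditorAssetLibrary.load_asset(
    "/Game/MetaHumans/Common/Clothing/Hoodie/Materials/MI_Hoodie"
)

lib = unreal.MaterialEditingLibrary
# One vector parameter per mask channel (names assumed for illustration).
lib.set_material_instance_vector_parameter_value(mi, "ColorA", unreal.LinearColor(0.30, 0.30, 0.30, 1.0))
lib.set_material_instance_vector_parameter_value(mi, "ColorB", unreal.LinearColor(0.60, 0.60, 0.60, 1.0))
lib.set_material_instance_vector_parameter_value(mi, "LogoColor", unreal.LinearColor(0.10, 0.20, 0.80, 1.0))
unreal.EditorAssetLibrary.save_loaded_asset(mi)

Duplicating the material instance and re-running this with different colors is one way to build the "wardrobe of hoodies" described above.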
There's also quite a bit of functionality in the details, like the eyes: in addition to a lot of variation in color, there are different patterns for the iris, the cornea, and so on. Like you said, endless hours of fun.

And when you throw these all together, beyond matching specific people, think about filling a scene: having this level of control over NPCs, non-player characters, inside an event or a show is really useful. In one afternoon you can build 20 or 30 different characters to throw into Unreal or something else, so you have people doing things in the background and they're not all the same.

Absolutely. The other thing you can do, because of the level of detail these have, is bring them into Unreal Engine and test potential lighting setups. That's one of the things that's exciting for me: even without direct access to a given person, you could style a MetaHuman to be similar to somebody you're going to interview, and if you had a chance to stop at the interview location and do a lidar scan or photogrammetry capture, you could digitally create a twin of that location and start trying out lighting setups. There's a limited number of lighting setups built into Creator already, so I can audition an indoor setting versus the studio setting or an outdoor setting, but you can also imagine, in Unreal Engine, practicing different light placement and really getting a sense of what might work well for a particular person.

And again, think about the time: what you can do in MetaHuman in a matter of minutes would really be weeks of modeling, of building skinning envelopes and all the other bits and pieces that used to be required for every character. Here you can just throw things together and have something you can actually use. Now, I know questions are starting to stack up, and in the future we're going to talk about building a MetaHuman directly from photos; that'll be another hour, but maybe we can get a little preview before we jump into that.
Sure, I can show it really quickly; we'll talk about this in more detail later. If you don't use the photogrammetry approach, there's also something called MetaHuman Animator that's optimized for iOS devices for capturing a real-world face. Talk about that a little. What we're describing is that you can generate these MetaHumans, send them to Unreal, and then an iPhone app can drive the face. What's important is that all of the linkages between your facial motion and the face that MetaHuman generated are already done for you, which is again days or weeks of work to get just right. Exactly. This is something we'll probably need a few episodes on down the road.

What we're looking at here is the scary version of you; we've gone from "it looks kind of like you" to "it looks a lot like you." Right. This is the raw geometry that comes from the scanning approach, and this is the result after I've applied skinning and gotten the hairstyle about right, plus hair and eye colors and so on. This is much closer to me than what I could do with the push-pull of blending, because it's based on a photogrammetry (RealityCapture) 3D scan of my head; I had a bunch of students, I think Laura was involved, photographing it. The output really matches the actual structure of my face.

Then we can try a few different animations. Right now this is the idle; we can change the body poses, here's an emotional body pose, and I think there's a sport one where I'm going over a hurdle (there's an interesting lighting artifact there). There's also a facial range of motion. I'll note that a lot of the expressions in the out-of-the-box MetaHuman rig are pretty close, but the smiles don't match my smile; that's not how I smile, so I'd probably need to adjust how the facial rig works. Honestly, though, the blank face and a lot of these other silly faces do not look entirely unlike me. This is my morning routine, I guess. And there's a body range of motion where we see the full body working through its paces.

One thing that's a dramatic improvement to the MetaHuman system in general: the reason we log in is that the data about each MetaHuman we create is saved under our individual accounts, and it used to be that the only way to share a MetaHuman with somebody was to create an empty Unreal Engine project, export the MetaHuman into it, zip up that project, which might be gigabytes of data, and send it over. But now there's an import and export feature. I can hit export right now and it will collect a MetaHuman binary file, an MHB file, and it's only about 2.5 megabytes. This is not the geometry; it's just the data that defines what the character looks like.
Exactly, it's not the XYZ of every vertex, because the UVs are standard and the texture is a selection; it's just the data that defines what this MetaHuman is. Now I could share this file with anyone else who has an Epic account; they could go into MetaHuman Creator, import the file with the import button, and they'd have this exact MetaHuman to use in their own Unreal Engine projects. You can also export a Maya project file so you can go further down the rabbit hole and do deeper customizations to the rigging; they call it the MetaHuman DNA, and it defines the facial performance parameters.

But even without going into Maya, I'll switch over to Unreal Engine here. (That's the most serious I've ever seen you. He looks so serious. Like Neo Nick.) So I'm in Unreal Engine now. If I set this to the beginning, I've applied a video game idle scenario, so there I am doing a standard animation taken from a game character. But the face can also be animated. I can activate a control rig; the face control rig should be here; I'm used to seeing it show up, and I'm not sure why it's not, but we'll give it a try. There we go. This is one of two control rigs that let me hand-animate the character, and I'll deactivate my snapping for a little more flexibility.

For those who haven't done this before: this used to be a lot of setup. It was bad enough that somebody had to build the whole 3D model, but then when it came through, all of these facial motions and everything else had to be defined by someone, painstakingly. Sometimes you had rigs that made it a little easier, but this kind of work wasn't trivial four or five years ago.

Let me delete this for a second and show the full-blown facial control rig. In the level of control you have, this is every bit as good as most feature film characters, and certainly most video game characters. Each of these yellow circles is a discrete control over some aspect of the face: this is the eyes, and the rig is defined such that as the eyes move, the eyelids animate with them, but I still have direct control over aspects of the forehead, and there are very fine-grained controls, like how sticky the lips are and how much they stick together. Another set of controls comes from a pre-built library of poses built into the MetaHuman system; let me navigate to the pose library. I'm trying to get to it; ah, I can't get to it easily right now. Here we go: face, poses and expressions.
Let me just open Expressions. Great, so there's already this library of expressions. Here, I'll put on my angry eyes: I select the controls and paste that pose, and immediately I've got it. Working with my timeline, I can select my blank face, select controls, put that pose in, and set a keyframe for the face. Let me go back to the beginning and set a facial control rig keyframe: at the start of this animation I'm pretty calm, then maybe right here I get a little perturbed and disgusted, and then I get really angry about what's going on. I apply that and, I guess I forgot to apply my keyframes, so let me activate auto-keying, collapse this a little, put a keyframe there, and right here I'll be a little saddened. There we go; it's automatically setting keys for me now. So now I have a little animation of changing expressions, very quickly, with just a few clicks.
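What Sequencer is doing under the hood with those keys is ordinary keyframe interpolation: each facial control stores (frame, value) pairs, and an in-between value is evaluated on every frame. A toy Python sketch with a made-up brow_down control and made-up frame numbers shows the idea; Sequencer's real curves also support easing, not just linear blends.

def sample(keys, frame):
    """Linearly interpolate a control's value between surrounding keyframes,
    holding the first/last value outside the keyed range."""
    keys = sorted(keys)                      # list of (frame, value) pairs
    if frame <= keys[0][0]:
        return keys[0][1]
    if frame >= keys[-1][0]:
        return keys[-1][1]
    for (f0, v0), (f1, v1) in zip(keys, keys[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)

# calm at frame 0, perturbed at 48, fully angry at 96:
brow_down = [(0, 0.0), (48, 0.4), (96, 1.0)]
for f in (0, 24, 48, 72, 96):
    print(f, round(sample(brow_down, f), 2))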
Because all of this is provided as a free kit, I can manually animate as an animator, or I can apply facial motion capture; again, there's a free app for iOS, for iPad or iPhone.

How hard is it to map the iOS app to the rig? It's already mapped, so that part isn't hard. The hardest part is making sure your iOS device is on the same network as the computer Unreal Engine is on and that they're communicating: you do some network pings, double-check your IP addresses and ports, and then usually it just works.

Exactly. And a reminder that everything we're showing right now is free: Unreal is free (if you're not making more than a million dollars), MetaHuman is free, the iOS app is free, and even the capture tools to make it look just like you are free. None of the tools we're talking about will set you back thousands or tens of thousands of dollars; you can just start downloading and using them, which is one of the reasons we wanted to show this to folks. Down the road we hope to do more and more digital production, and there are a couple of different places we look at it: obviously being able to tell cinema stories matters, but so does previz, being able to very quickly throw things together. As a storyboarding tool, a previsualization tool, and of course for games and other things, it's important. If you get good enough in Unreal to do layouts, maybe you're not even animating; you just want someone singing, or doing one thing, or dancing or moving around. Just for laying out your shoots, this is very valuable.

I was going to share where things can start getting expensive, where we become expensive friends: the accessories. Isn't it always the accessories that get you in trouble? "This thing only cost me $50, but then I had to buy a $150 device to go with it." So this, for example, is a set of outfits. The thing is, things go on sale: I probably got this at 50% off or more at one point, and when it was on sale I bought it. It's a collection of alternative clothing outfits for both male and female MetaHumans, and it's actually independent; the face here is the 3D human model that comes with it, but the entire kit is set up to work with MetaHumans. I just click a button to add those assets to my Unreal Engine project, and now here's the same MetaHuman in an outfit that doesn't exist in MetaHuman Creator per se. I could put that on, or this really detailed firefighting uniform; honestly, that's part of what sold the collection for me, because if I'm going to do a demonstration for first responders, I want a plausible firefighter's outfit. Here's more of an outdoor exploring outfit with a backpack. These are all commercially available. This is a $600 set, but there are many less expensive options; this one author, for example, sells a pair of shorts for $15, or a jacket, and so forth. And that's just one author in the marketplace; there are plenty. Actually, on next week's show we're going to have the folks from Zwe on; they provide software where you can create your own fashion that works with MetaHumans.

Now you can see why, when you look at all these accessories and their cost, Epic doesn't want to pay Apple 30%: they want to create marketplaces where you create your own characters and then buy all the stuff, without paying extra on top to Apple. If you're wondering what the source of their upset is, it's not so they can sell more of their games; it's that they want to sell all these little things to everyone and not pay a third of it back to Apple. And we're seeing that play out with the Fortnite version of Unreal: one of the new developments in the last month or two is that it supports MetaHumans now, and Disney is allowing its IP into Unreal for Fortnite, so folks can go in and start playing and creating their own worlds, sharing them with friends, buying and selling assets. There are brands creating their own destination locations to show off their brand identity. And again, there's really no cost barrier to entry, no licensing dongles or anything like that; all the software is downloadable for free. Even motion capture: there are a lot of simple video-based AI motion capture tools, Move AI for example, and Radical and a few others, where all you need is a video camera. Perform for the camera, and the AI will at least approximately create that animation as an FBX file, a full-body skeleton animation, that you can then retarget in Unreal Engine without any other software. So there's a lot you can do with this without a significant investment.
Other than that, you probably do want a PC with an RTX card; it's going to make things much smoother. But once you've crossed that hurdle you can get started without further significant investment, and then it becomes practice and time, and maybe adding a few things here and there; next thing you know, you're buying the motion capture fashion, a motion capture suit, whether it's a nurse outfit or otherwise, and those add up. Absolutely.

Let's jump into the questions. The first one comes from Rick Markley in Wisconsin: "What are some of your suggestions for the best budget-conscious face scanning apps? I'm currently using Kulog with mixed results." Go ahead, Nick.

I've just started using RealityCapture entirely for that; I used Metashape before, and RealityCapture works great. All you need is a camera, and bonus points if you cross-polarize your lighting; you can search "Pixel Prof cross-polarized" and there's a whole video about it. The advantage of cross-polarization is that it gets rid of all the shine on a face and lets the photogrammetry software see the skin color itself without interference from reflections, and that works gloriously. Unreal Engine then has its own plugin for transforming that photogrammetry output model into a MetaHuman. It sounds like you might have already tried that, but I would use RealityCapture from a photogrammetry standpoint.

By the way, this is our introductory session on MetaHuman, but we're going to do a whole session on using RealityCapture to build a MetaHuman and bring it in, so stay tuned for that. Go ahead, Nick.

There's also a free app called Live Link Face for iOS, and it operates in two modes. The live mode, the traditional mode, is just for animating an existing MetaHuman, but the MetaHuman Animator mode will actually step you through capturing images of your face, using the depth sensor of the iOS device, from a couple of different angles, even capturing your teeth. All of that data then gets loaded into a module in Unreal Engine, which does a bit of AI processing and feeds it to MetaHuman Creator to create a MetaHuman identity. I'd definitely give both of those a try; they're all built into the Unreal ecosystem.

Let me back you up for a second: what's the app again that uses the depth sensor? It's Live Link Face, the same one; you're not getting a different app, it's the Animator mode. There are two modes. The live ARKit face mode is the traditional, legacy mode, and it still works fine; it's literally what they used to drive Patrick and SpongeBob for the Super Bowl, so it's not some non-commercial toy; it's the real thing in that mode itself. But there's a new mode, added about a year ago, called MetaHuman Animator, which captures stills with the depth sensor as well as the photo camera. It prescribes certain angles to capture those stills from, and it specifically wants a frame where it can see your teeth.
It uses those images to create the MetaHuman identity, the facial structure itself. You can use that identity to eventually create a MetaHuman, but the reason it builds it is that it will then record the facial performances that get applied to the MetaHuman, comparing all of the deformation of the facial performance against the calibration poses, the blank-face poses, at all times, and doing so three-dimensionally. And it does that in post, not live on the iOS device; you upload the data to your computer, and as a result you get much more accuracy out of the performance. So yes, free stuff; we'll play with that some more.

Again, this isn't just previsualization for TV or film: think about layouts for your corporate events, and examples of what you want. A lot of these things can be downloaded. We downloaded a whole music stage because we needed a couple of pieces from it, for $30 or $40 or something; it had DMX in all the lights, it had a lot of layout. All that stuff can be pulled down, and there's an enormous amount available. Also think about it as educators: this could be the next generation of some of the content you build for your classes, to talk about history or other subjects. Next question.

The next one comes from Paul Wallace in Austin, Texas: "Can MetaHumans be used to create digital doubles of real people, and what are the implications of this? For example, the Scarlett Johansson brouhaha." Go ahead, Nick.

Of course, the brouhaha with Scarlett Johansson was the AI voice that sounded very much like her voice, and clearly that had not been done with her permission. So absolutely, you can get yourself into difficulties with folks who are very conscious about how their identity and likeness are used, because that is their livelihood, and those folks have lawyers whose livelihood is to pursue any of those infringements. It's their livelihood to protect their livelihood, exactly. I imagine there is room for fair use, though I am not a lawyer: if you're creating a parody, there are lots of examples like Robot Chicken that leverage existing likenesses and IP, so there are gray areas, but obviously it's in your general interest not to tick people off. If you want to work with somebody specifically, there are entire companies now that work with celebrities, both living and deceased, including the estates of celebrities who have passed, to create digital doubles and let their legacies live on. It was pretty big news a few months ago when the rock band Kiss officially transitioned to becoming avatars; they worked with ILM to have digital doubles created so their brand IP can live on. I just think the best part is that Kiss went to see ABBA and then decided that's what they wanted to do too.
There's something about the Kiss guys going to see ABBA that is awesome. Go ahead, Bill.

I just wanted to ask Nick, since you're so immersed in this and keep close track of it: is there a suspension bridge across the uncanny valley yet? We're all wondering if it will ever reach the other side. Where do you think we are in eliminating that feeling of otherness when we watch these creations? I know it's not quite here yet, but what's your feeling about when and how we might cover the span?

There is a lot of artistry that goes into crossing that bridge. I had the good fortune to work with Digital Domain on The Curious Case of Benjamin Button, and that's well over ten years old now; it's been a long time. One of my greatest points of pride is being on the team that worked on the digital double there, and seeing digital-double shots pointed to when the Oscar for best makeup was being awarded: "here's an example of why it won for best makeup." That's not makeup, but okay, thanks. So it's possible to cross the uncanny valley, but the number of incredibly highly skilled people involved is huge. Just the other week we were talking about how the Corridor Crew folks were working with Debevec on the sodium vapor process; well, if you're working with Debevec, of course you're going to do well. So a team like that can cross it. Can you, as an individual, cross it? That's going to take a lot of practice. Motion capture helps, because you're capturing natural human motion rather than attempting to keyframe-match it, and facial motion capture gets you a certain distance; it captures the eye darts and things you might not think of as a newcomer to animation. But it also has what the industry often calls digital sandpaper: it doesn't catch every single nuance, so you go in and fix things; there might be a B or a P or an M in the dialogue where the mocap didn't quite close the lips, and you need to make sure all of those things get done.

So my answer to your question is: it's absolutely possible today. It's been done; there are things that folks didn't even realize were digital, and they've already happened. And with this toolset I think it's possible to get really, really close. Meanwhile, the general population has become more and more accustomed to watching digital characters, and more accepting of them. If you say "here's a video game version of myself," suddenly folks don't worry about how far across that uncanny bridge you've gotten: "oh, this is a video game character, I can accept that." Sometimes it's in the presentation.

And I would say there are so many opportunities that just don't require you to get over the valley. On our side of the valley there are cartoons, historical recreations, education, tons of all kinds of cool videos and everything else.
And when you think about the quality, here's how I once tried to explain it to somebody: picture quality along a curve, with the uncanny valley living somewhere near the top. The level of effort these days stays low and nearly flat across most of that range; what Unreal gives us is that all of this takes very little effort, then it gets a little harder as you push further, then harder still, and then it climbs into almost-impossible territory. But living in that easy region is still better than almost everything we saw ten years ago in cartoons or anywhere else, and that region is constantly moving upward. So think about all the fun things we can do with these characters, previsualization, education, games, corporate videos, and so on, if we're willing to say we're not going to try to cross the valley. That valley is really big and deep and dangerous; stay on this side of it and have fun, and there's an enormous amount you can do with no cost for the hardware and software, and so much of it used to take us weeks or months. I think that's super exciting. Crossing the valley is dangerous.

Is it fair to say, Nick, that it's possible, but it still requires so much effort that you can still kind of unspool things and know something was created rather than natural, unless you're working at the very highest level, and even then there are probably some tells? This goes back to Paul Wallace's question from Austin.

I'm going to jump in because we have to keep moving, but what I will say is: look at Rogue One. ILM had all the money in the world, and it still didn't quite work; they didn't quite get over the uncanny valley even with all the money in the world. The deepfake stuff is actually getting closer than using geometry; deepfakes are doing a better job at some of those things. And the reality, as it turns out when we talk about Scarlett Johansson, is that there's a high probability, not a guarantee, that the voice was created long before they talked to Scarlett Johansson, and that it got them thinking about having her do it, without any attempt to copy her. That's the danger we get into: the woman who sounds like Scarlett Johansson just sounds like Scarlett Johansson; that's what she sounds like. We're going to be building characters, and people will say "it looks just like this person," and maybe it does. That's going to be the hard part. Next question. Oh, go ahead, Rick, real quick.

I was just going to mention: if you recall, a couple of years ago I was demoing where I would take a MetaHuman and then apply a deepfake on top of it, using the MetaHuman to drive the basics and then putting an actual deepfake model on top.
Yeah, absolutely. Next question: Dave Troutman in Edmonton, Canada asks, "Could a designer copyright a MetaHuman design or creation?" Actually, I think we jumped one in our system; let's go back and do this one first: "Is there any way to have a MetaHuman track live video input?" Go ahead, Nick.

You would use multiple solutions. Off the top of my head I can't remember which of the AI video tools offer live video transformation, generating live motion capture for the body, the arms and legs and all of that; I think Move AI is one, and I think there are a few solutions that will take full-body video, ideally from multiple cameras, and give you approximate full-body motion. There are also systems like Vicon and OptiTrack that can absolutely do it; under the hood those are video systems too, so they can track a person moving around live, and that can drive a MetaHuman. Then there are separate solutions for the face. Faceware is one: Faceware Studio can take a live feed, usually from a little camera, ideally SDI or similar, track it live, and immediately manipulate the facial control rig of the MetaHuman. So you can do that. Of course, if you're doing it live for a live broadcast, you eliminate any opportunity for an artist to get in and tweak the performance in that live setting, but all of it is recordable as well, so you can edit it further in post.

Next question, Dave Troutman from Edmonton: "Could a designer copyright a MetaHuman design or creation?" Go ahead, Nick.

I believe you should be able to; again, I am not a copyright lawyer, or any kind of lawyer. There is the export of the MHB, the MetaHuman binary file, and technically you as a human being authored that file using the tool. I don't see that as particularly different from taking some digital photos, blending them together, and using Photoshop to create a new digital file. There are enough variations, certainly billions if not more possibilities, within the scope of that configuration, so I think you should be capable of copyrighting that MHB file. Could someone get that MHB and alter it a little? Probably. Could you take the MHB and put it on an NFT? Probably. So that would be my answer.

And I think copyright is going to get really complicated as we move forward. You could try to copyright it, but what you really want to look at is all the things you can use whether you need the copyright or not. There's an enormous amount of content that can be created where copyright is not that important, and that's like 99% of the market. Next question.

Douglas Carmichael asks: "How can you synchronize the movements of a MetaHuman's mouth to live or recorded speech?" Go ahead, Nick.

For live speech, you would want to use the Live Link Face software: you have a microphone, and you might need to add a little audio delay, because the iOS device is translating the information into controls, those controls go to Unreal, and then Unreal renders.
But in that way you can have the facial movements of a MetaHuman in sync with live audio, and the same is true for Faceware Studio, where Faceware Studio gets a live feed from a video camera and translates it into MetaHuman controls. That handles the live part of the question.

Recorded is a lot easier. Again, Faceware only monitors video, so you'd have another system for capturing the audio, but when you animate MetaHumans you're using a piece of Unreal Engine functionality called Sequencer; it's basically the timeline, like Final Cut or DaVinci Resolve inside Unreal Engine. You can put your audio file on it, set markers on the timeline for the different syllables, and synchronize by moving the captured facial performance controls along the timeline. Once that's set up, you can edit and everything stays in sync.

The next question comes from Nick in New York City, one of our QR code submissions: "What's the process for using Live Link to receive real-time motion capture data from a live person?" Go ahead, Nick.

The process is, number one, your iOS device needs to be on the exact same network as your Unreal Engine machine. You don't strictly need the IPs; the devices will discover each other on that network, but I personally like to make sure I can see the two IP addresses and that I can ping one from the other. I have a network tool on my iOS devices so I can ping the PC, and vice versa. I've run into issues where it's usually the firewall on the PC blocking communication in one direction or the other, so I always make sure they can ping each other. Then, once they're both on the network and Live Link Face is running on the iOS device and it sees your face (you can tell by the display in the Live Link Face software that it's tracking), it should show up in the Live Link window in Unreal with a green light: there's this iOS device, it's streaming Live Link data, and the green light means I'm actively receiving data from it. Finally, there's a checkbox setting on the MetaHuman character itself that you want to activate so it responds to the incoming Live Link data. You might have multiple performers, each with their own iOS device, so you want to be able to assign "this iPhone goes to this character and this iPad goes to that character." That's a rough walkthrough; we could probably do at least an hour on that alone.
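The network checks described here can be done with any ping utility, but as a small plain-Python pre-flight sketch you could run on the Unreal PC: the device IP below is a placeholder, and 11111 is only the commonly used default Live Link UDP port, so verify the port against your own project settings.

import os
import socket
import subprocess

IOS_DEVICE_IP = "192.168.1.42"   # placeholder; use your phone's actual IP
LIVE_LINK_UDP_PORT = 11111       # assumed default; confirm in Unreal's settings

def can_ping(host: str) -> bool:
    """Send one ICMP echo request; the count flag is -n on Windows, -c elsewhere."""
    flag = "-n" if os.name == "nt" else "-c"
    return subprocess.call(["ping", flag, "1", host],
                           stdout=subprocess.DEVNULL,
                           stderr=subprocess.DEVNULL) == 0

def udp_port_in_use(port: int) -> bool:
    """If binding fails, something (ideally Unreal's Live Link listener) owns the port."""
    try:
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.bind(("", port))
        s.close()
        return False
    except OSError:
        return True

print("can reach iOS device:", can_ping(IOS_DEVICE_IP))
print("Live Link port claimed locally:", udp_port_in_use(LIVE_LINK_UDP_PORT))

If the ping fails in one direction only, that points at the PC firewall issue mentioned above.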
And we will come back to that. This is just the beginning; we're going to keep returning to it, because I think understanding how to use these Unreal tools is really important for a lot of us, whether you're doing corporate, educational, broadcast, film, or YouTube videos. It's important for our community to have a group of people who understand it. Maybe not everybody is going to do this right now, but my goal is to get 20, 30, 40, 50 people using it so we get good at it, and when you realize you need it, there are other people in the community who've already been playing with it. So expect us to talk about this fairly regularly: we'll come back with another hour on using photogrammetry to build the face, another dedicated just to animating, and we'll keep showing different parts of the pipeline, because it's really important for some of us, if not everybody, to understand how it works. Next question.

Nikhil Kimker from New Jersey is back again with a question for Nick: "I'm looking at Cascadeur for animating MetaHumans, but it seems not as automated as I'd like; it seems to require me to do rigging and so forth in Cascadeur before their prompt-like animation becomes possible. Thoughts, or other options perhaps?" Go ahead, Nick.

I haven't used Cascadeur myself yet, but my impression is that it should be able to take your prompts and perform animation, so you should be able to use any of the quote-unquote standard characters built into Cascadeur. I've used another AI tool called Mootion, which runs through a Discord interface, and my experience with these tools is that they're capable of animating with their own character sets and their own skeletons and exporting as FBX. As long as you can get an animated FBX character out of the tool, it should be readily importable into Unreal Engine. There might be a small bit of rig mapping: if the character's skeleton isn't something standard in Unreal Engine parlance, you just identify that this chain of joints belongs to the arm, this is the left arm, this is the right arm, and it goes very quickly. I have a tutorial video on that; within minutes you can define a brand-new character that Unreal Engine has never seen before, say "this is the left arm, right arm, spine, two legs, neck, and head," and at that point any animation on that character can be applied to any standard character, including MetaHumans, inside Unreal Engine. I'll probably have to look at Cascadeur and do a video about it, but it should be straightforward: my expectation from what I've seen is that it can export an FBX, you can bring that into Unreal, and you shouldn't need any customization on the Cascadeur side.
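The FBX import step described here can also be automated with Unreal's Python editor scripting. This is a minimal sketch, assuming a hypothetical file path and content folder, and assuming the joint-chain mapping (left arm, right arm, spine, and so on) is then done in Unreal's retargeting UI as described above.

import unreal

# Import settings: we only care about the animation on the incoming skeleton.
options = unreal.FbxImportUI()
options.set_editor_property("import_animations", True)

task = unreal.AssetImportTask()
task.set_editor_property("filename", "C:/mocap/hurdle_jump.fbx")   # hypothetical path
task.set_editor_property("destination_path", "/Game/Mocap")        # hypothetical folder
task.set_editor_property("options", options)
task.set_editor_property("automated", True)   # no import dialog
task.set_editor_property("save", True)

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])

Once imported, the animation can be retargeted onto a MetaHuman skeleton through the same retargeting workflow as any other standard character.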
Next question. Courtney Gooden in Hollywood, California: "How do you think new tools like VASA-1 from Microsoft Research, which can realistically animate faces and head movement from any audio track, will affect productivity in character animation?" Go ahead, Nick.

This is great. NVIDIA also has a tool called Audio2Face that has been around for a year or two at this point, and it's also free; it's part of their whole Omniverse toolset. It's great because you can just give it an audio file and it will generate all the control patterns, and it actually has interfaces for MetaHumans already built in: you can feed NVIDIA Audio2Face audio of dialogue and it will create an animated set of controls that operate MetaHumans. It even does swallows and all kinds of things like that. So absolutely, I think this is all going to, as we call it, democratize this toolset: these kinds of things used to be possible only with multi-million-dollar budgets, and now you can get started for free and get a result you can work with. Anything that can take audio and give you facial performance control data gets you that much further down the road in production. When we teach facial animation, we start with "a Muppet is kind of facial animation, and all it does is open and close its mouth," and then we add more expression: our lips go wide and narrow, open and closed, and we work through teaching manual animation step by step through all of those. A tool like VASA or Audio2Face does several of those steps for us and gets us to a point where I just want to tune things up a little, add a tweak, or put a glimmer in the eye. It makes all of us much more productive.

And I think it's really good. When we get into "is this going to replace actors," that's going to take a while. It's hard to find actors who are good at what they do, let alone something that can copy them; there's only a handful of actors who do a really, really good job, and they get paid a lot of money to do it. That's other human beings doing it, so getting to the point where AI does it as well as they do is probably a ways out. But there are all these opportunities to use it in so many other places.

I was going to note, too, Alex, you used the Rogue One example earlier, and one of my favorite characters is K-2SO. There was an actor behind that character, with his own performances on set and his voice, who helped bring it to life, and then a whole team of animators animating the CG robot itself. There's some core part of a person in that performance. We think we like the character they're playing, but really we see Tom Cruise playing this character, then this character, then this character; three different names, but it's still Tom Cruise, and people have decided that's something they like. That comes from a lifetime of experience that shows up as whatever that is, whether it's Tom Cruise or Meryl Streep or whoever. When people worry about being replaced: I think extras could be replaced relatively soon, and digital doubles could replace some of the stunts that actually hurt stunt people; those kinds of things make more sense sooner. But actually replacing the A-level and B-level actors is probably a little ways out. Still coming, but a little ways out.

Next question: Josh Koffman in Pittsburgh, Pennsylvania has our last one. "What are the tools and workflow for animating the body movements of MetaHumans along with the motion capture of the face?" Go ahead, Nick.
Again, we could do multiple episodes on this, but my own workflow here in our studio: we use a Vicon motion capture system, and that feeds the body movements, including the head movements; then we usually use Faceware Studio or Live Link Face to drive the face. Those can run simultaneously on a single network, and one instance of Unreal Engine can receive both, routing the facial performance to the face and the body performance to the body. We can capture within Unreal Engine or on those independent tools. In fact, Unreal Engine comes with a tool called Switchboard that lets you hit a single record button with a single slate name; it populates the controls for all the various heterogeneous recording systems, and when you hit record, all of them start recording at once, and you can hit stop on all of them at the same time. That's one workflow. We also have OptiTrack here, and there are inertial suits like Perception Neuron, Xsens, Rokoko, and so on. There are a lot of tools that can capture the body and a lot of tools that can animate the face, and they're fully interchangeable with one another; Unreal has been designed to be very accommodating of all of these systems, whether the data comes in live or is imported as a file. This is certainly material we should cover in future episodes.

And we definitely will. Thanks, Nick, for taking so much time to put this together and be ready for it; we really appreciate your time, and we appreciate all the panelists' time, both for the first hour and the second hour. We thank all the great producers who asked all the great questions in both hours. And finally, thank you to the incredible team on the back end that makes this possible every single day: the teams developing the tools we use, the people figuring out what we'll talk about next week or tomorrow, the people cutting the show, putting together the emails, the website, the thumbnails. There's a huge village that makes this show possible, and we appreciate all of your contributions. Today, if we had walked around instead of having the Slack traversal, we would have traveled 55,000 miles; that's 88,000 km, and that's 432 million bananas for scale.

All right, let's jump into after hours. I feel like we could just talk about MetaHuman once a month or something. It is a very deep rabbit hole, though honestly one of the less expensive rabbit holes we dig into. I know; we're doing so well, everything was free today, you don't have to sell your house. You might need a PC; I've got a PC sitting here. So just add time and effort. Exactly. All right, thanks, everybody. See you later. Bye.
Info
Channel: Office Hours Global
Views: 853
Keywords: office hours global
Id: ZGrK153Oec0
Length: 64min 35sec (3875 seconds)
Published: Tue May 28 2024