SUITLESS Real-Time Face + Body Mocap in UE5 | New VTuber Workflow with the Coil Pro

Video Statistics and Information

Captions
Hey, what's going on everyone? I'm Sam Lazarus, creative director for Rokoko, and I've got a very fun and special little workflow that we're going to go over today. The point of this video is to show off one of the potential uses for the new Rokoko Coil Pro. You might notice I have seemingly full performance motion capture happening on this character: my head is moving, my arms are in the right spot, and look, I can grab this controller. This might need to get tweaked a little, but it looks pretty darn good. It's just parented to my left hand, so I can turn it off when I put it down. So I've got full performance motion capture happening here, and I'm not wearing a Smartsuit; I'm just wearing gloves. As much as I love wearing my Smartsuit Pro II, for a use case like VTubing, like the one we're demoing right here, it's really cool to be able to just hop into your gloves and still get the full movement of your character, especially for the upper half of the body.

In this video we're not going to do a deep dive into how I got this character set up. Instead, we'll go through how to set up this system using a MetaHuman, which is a little more automated and makes it easier to get everything running together with the Live Link app. I'm not a very talented Unreal dev, so I'm sure people who really know what they're doing in Unreal could get some really interesting use cases running with this setup, and again, you don't have to hop into a suit for it.

So what is actually happening here? What's allowing me to do this? As I said, I've got my Smartgloves, and the Coil Pro lets us track the Smartgloves in absolute space, so they're actually in the right spot; that takes care of my hand positioning. I'm then using an IK system to drive the rest of the upper body, so the rest of my body moves too. And then I've just got the Unreal Live Link app set up on my
iPhone right here. That transmits the facial motion capture data, but it also has an option to send over my neck and head movement, so that takes care of my neck and head. That makes this setup very low-lift: I can hop right in, I just have to throw on my gloves, turn on my Coil, and turn on my smartphone, and I can be doing this for hours and hours, super reliably. I'm not going to drift anywhere. Again, there's no lower-body motion capture; you can see if I pan down here that my legs are just static, but that's fine, because for a VTubing-type setup we just need the upper body. Please put any questions you might have down in the comments below and I'll get to them. Without further ado, let's jump in and set this up for a MetaHuman.

Okay, so here I've got a fresh new Unreal project, and I've already imported my MetaHuman; you can watch our other tutorials for information on how to get a MetaHuman into your project. We're going to be working with Rokoko live-streaming data, and when we do that, there are a few things we always do in our project before we even get going. First, and I've got this linked down in the description below, we have this Rokoko Resources folder. It's free; you can go download it, keep it, and use it for whatever you want. It contains a bunch of things, including bone maps and a T-pose asset for your MetaHuman that we've already made and given you in this folder, and it's going to make our life a lot easier. To add this folder (it's just a ZIP), I copy the Rokoko Resources folder, then, if I hit Show in Explorer, here we go, this is our Unreal project. I go to the Content folder in my Unreal project and paste the Rokoko Resources folder, and you can see, ooh la la, this little
guy popped up right here. The next thing we do after adding the Rokoko Resources folder: go to Plugins, search for Rokoko (since we're going to be live streaming data from Rokoko Studio), enable the plugin, and restart.

Okay, there we go, we've restarted, and now we're going to set up this MetaHuman for some live streaming. This really just follows the typical workflow we go through whenever we live stream to MetaHumans; we have tutorials on our YouTube channel about this process, so there aren't going to be a ton of things different from what we normally do, and therefore I might move a little quickly. Put questions down in the comments below if you have them. Whenever I'm working with a MetaHuman, the first thing I do is open up its blueprint, go to the LODSync component, and change the Forced LOD to 0. This just means the MetaHuman will be forced to use the highest possible level of detail, and it won't change dynamically based on where my camera is, which I don't want. Then I select the body of our MetaHuman and open up the body mesh (that's the way I like to get to it), and we go to the blueprint of our mesh; this is where we'll build out all of our nodes to actually do our live streaming. It opens up in the Event Graph, but we want the AnimGraph, and that's where we'll build everything.

Now I'll do the basic workflow here. I'll dock this so we get a little more space, but we're just going to do what we normally do for live streaming from Rokoko. I'm going to add a Rokoko Body Pose node and wire it into my Output Pose. Then I'll create two variables: the first I'll call Rokoko Actor Name, and the second Rokoko Bone Map. We'll make the actor name variable a Name type, and for the bone map we're
going to search for the Rokoko Body Map Data class reference. Then I drag both of these variables into the graph, select Get, and wire the bone map into the Retarget Asset and the actor name into the Rokoko Actor Name right here. If I select my Body Pose node and hit Compile, you'll see we get a couple of new options now that we've wired in our variables. One of them is our Rokoko actor name: this is the actor profile that you're using in Rokoko Studio, which in my case is Sam. We also need to add the bone map, which tells Unreal which bones are which for our character, and thankfully, in that Rokoko Resources folder, I've already included a MetaHuman bone map for you. Note that these changes will not be applied unless we actually hit Apply and Compile, so we want to do that.

Now, we're not done; we're going to make a couple more changes here. We did just wire in this bone map, but as you might remember from the intro of this video, we're going to use the Live Link app to control the head and neck rotation for our MetaHuman, so we don't want our motion capture driving the neck and head rotation. Go into that Rokoko Resources folder, then Bone Maps. When I open up the bone map now, because we have the Rokoko plugin loaded, everything shows up, and you can see I've gone in and defined all the different bones for our MetaHuman. I'm just going to delete the neck and head bones, so no data from Rokoko Studio will go to those two bones on our MetaHuman. That was the first difference from our normal live-streaming workflow.

Next, I'm going to undock our blueprint here, because as you can tell, our character is not in a T-pose, and we need our MetaHuman to be in a T-pose in order to properly receive Rokoko motion capture. Again, in that Rokoko Resources folder I've got this T-pose pose asset. We'll drag that into our blueprint, and I'm going to right
click on that and choose Convert to Pose by Name. The name of our pose is Pose_0, which, again, you can see if we open up this pose asset: Pose_0 is the one that puts our MetaHuman into a T-pose. As soon as we do that, wire it into our Component Pose, and hit Compile, look at that, our MetaHuman is now in a T-pose. That means this MetaHuman is ready to receive our mocap from Rokoko Studio.

Okay, so now I've jumped into Rokoko Studio, and let's set up our gloves. We have full tutorials on our YouTube channel about how to work with the Coil Pro, so again, I'm not going to go super deep into it, but there are some different things we'll be doing for this workflow, which we'll go over in a second. So, gloves on, and I've basically just got some USB-C cables running through the sleeves of my sweater, so they're out of the way and not getting tangled, and they run to a battery sitting right in my pocket. This is a scene I've actually used my Coil in previously, so we have a Coil in here. Normally the Coil will not show up until you actually power on your Smartgloves; the Coil only appears once it detects Smartgloves near it. So if you've opened a new scene and plugged in your Coil but it isn't in the scene yet, that's fine; you just need to plug in your Smartgloves and power them on, and your Coil will appear. You can see I've got disembodied hands here, and I'm going to add my actor profile; again, my actor profile is named Sam, the same name we used in our blueprint for our MetaHuman, and I'm going to drag on my gloves. Normally these wouldn't be calibrated already (I've used this scene before, so they are), but I'm going to do a calibration anyway, and you can see that when you're using gloves only, the calibration pose is a little different. There we go. This looks great; my gloves are working
perfectly. Looks awesome. But let's dive in here and show you what is happening to my full body. You can see that when we've got gloves only enabled, we still have body movement for the rest of our Newton body, but it's all frozen, and you get this kind of wonky situation where, again, we only know the position of the gloves, so some of the data might look funny for the rest of the body. That's especially true if I'm sitting down. Right now I've got my Coil set to a manual distance; if I turn that off, this is the actual height of my Coil. When I'm standing, this looks perfect, but when I'm sitting down we get this weird hunched-over look for our body, which does make a difference in our Unreal scene, because it means the rest of the body, the part that isn't being controlled by the Live Link app, is hunched over. To jury-rig a fix, I can manually set the distance of my Coil from its height to the floor, which fakes the Coil being higher up in the room, so that when I sit down I'm no longer hunched over. You can see that if I change this to something like 80, my hands are really high up, and if I change it to 50, I'm hunched over, so I just played with the number until I found a spot that worked for me sitting down while still having my shoulders look good. The rest of this data is going over into Unreal as well. I'm going to turn that off, though, because I just want to see these cool hands; this looks great.

At this point we're ready to actually send this data over to Unreal. I'm in the live-streaming section of Rokoko Studio, and I'm going to hit Activate. There we go. Now, if we jump back to Unreal, I'm going to grab my MetaHuman (the body: torso, legs, and feet) and make sure it's using the anim blueprint that we set up, that retarget MetaHuman base one. As soon as I hit that, you
can see, oh, look at this, now we're in the T-pose, so our blueprint is working. If I go over, turn on Live Link, hit plus, and add Rokoko Studio as a source, here we go: we now have our actor body data coming through into Unreal. Before we get all this data onto our MetaHuman, I'm going to open up my Live Link app, which should come up on the screen. I'm going to select Live Link (not MetaHuman Animator), then Continue, Continue, and as soon as I do that you can see our Live Link is working. If I touch this Live button up top, you can see it turning green and white; when it's green, it's actually live. In Settings I've got head rotation turned on, because we want that head rotation, and if I go to my Live Link panel, because I've already set up my phone to work with the Apple ARKit Live Link app, it just pops up automatically. So this is ready, and now if we go to our MetaHuman and hit Play, look, it's all working, except for the Live Link app part. If I actually want this working in my editor, what I can do (and this is a little jury-riggy) is turn on the ARKit Face Subject here and Use ARKit Face. Look at this: we now have a completely live MetaHuman working in Unreal Engine using only my gloves, which is crazy. This is so cool.

Now, the other thing you can see going on is that we have a lot of body movement in our scene, which I actually don't like; I want my guy to be absolutely still. It is cool that I can move around my scene and my MetaHuman follows me, because, again, the Coil Pro actually knows where my Smartgloves are in space, which is very different from the non-Coil-Pro workflow. But if I want my hips locked into place, we can select our MetaHuman, hit Ctrl+E to open up the blueprint, and go to the body. Again, we need to get back to that
animation blueprint that we created for our MetaHuman, which is right here. What I'm going to do is add a Transform (Modify) Bone node, wire it into my chain, select the pelvis, and then on Translation choose Replace Existing with a value of 0, 0, 0. When I hit Compile, it gets a little weird for a second, because the body drops to the origin, but I'm just going to move it back up. So now on my Z axis I'm at a value of 83, and my body is no longer moving in space; it's just rotating in place, which is what I want, because I'm on my swivel chair: I want to be able to rotate, I just don't want my body moving around in space.

So there you go, it's all up and running, and this looks pretty good. Of course, the hands aren't perfectly touching; that's because of subtle differences between my proportions and the MetaHuman's proportions, and you could get this a little better with tweaking, but it already looks really good. You can see there's basically no intersection of the hands, which is all very different from how it used to be with our inertial-based motion capture system, using a suit alone without the Coil.

So that's how you get this all set up with a MetaHuman. Now, MetaHumans are built by default with the ability to take in Live Link data for the neck and head rotation; that is not the case for a custom character, which is why I chose to do this full setup with a MetaHuman, because it's already built in. Now let's jump back over to the monkey character that we started the video with, and I'll talk a little about how you can set up Live Link data to drive the neck and head of a custom character, where it isn't just a matter of clicking this on and having it all work automatically. For most custom characters that have facial blend shapes, the facial
motion capture will work pretty much automatically with your iPhone, but the head and neck movement will not. So let's jump over to that character and I'll show you how to get that all set up and working. But this is so cool; I think it's amazing that we can have this kind of performance just using gloves. I just hopped into my gloves, I can be here for hours streaming like this, and I don't have to wear a suit. As much as I love wearing a suit, it's pretty nice just having my comfy little gloves on and still getting all of this expressive, accurate motion capture data looking good.

Okay, let's hop over to the other character now. Here I am, back in that first character setup that we started the video with, and you can see I've got my Unreal open, and I've got my controller model that's just parented to my left hand, so I can turn it on and off. It's really cool to be able to have little props, and I'm sure you could gamify this or hook it up to a Stream Deck or something like that. But let's go over a little bit about how this system is set up. I'm going to open up my blueprint for this monkey character, and you can see we have a slightly more complicated setup here, but it's also very familiar from what we just did with our MetaHuman. Here I've got the Rokoko Body Pose node, my bone map, and my actor name. In my bone map, I went ahead and removed my head bone and my neck bone, again because our Live Link is going to take care of those. I've got my T-pose asset (the T-pose for this character), but then I've got a bunch of other Transform (Modify) Bone nodes and other variables. So what is happening here? I'll put a link to some of the videos that really helped me understand how to do this; it's a pretty simple setup to get the neck and head movement coming in from Live Link and use it for your character. I'm not going to go
over how to build all of that; follow the videos I have down in the description below for more information. But here's how it essentially works. Hop back over to our Event Graph (you need to set this up in the Event Graph). This character actually came with a little pre-made blueprint that did some of this translation from the Live Link app into head movement, so I just copied it from the blueprint that was included and added it to this one. What's basically happening is we have our Evaluate Live Link Frame node, and you can see the subject is just my iPhone. This pulls in data from our Live Link app, which carries all of our neck and head rotation and all of our blend shape animation. Then I'm pulling out the data, grabbing the different properties from that Live Link frame, and clamping those values within a certain range of head movement, because we only want our head to move in specific ways. Then I connect all of that into a rotator variable; you can see down here I've got this HeadRot rotator variable, and it essentially contains all of the head rotation data we're pulling from the Live Link app.

Then, when I jump back into my Animation Graph and cruise down here, look at what we've got: we have that head rotation variable (Get Rotation), which is essentially the live data coming in from Live Link. I'm plugging that into these Transform (Modify) Bone nodes, and I'm replacing (Replace Existing) the rotation data; you can see that this head rotation variable eventually plugs into the Rotation input of this node. Now, what I'm actually doing here is I've got a couple of multiplier nodes, because I don't need all of the data. I actually thought that for this character, it would be better to use some of the data to
rotate the head bone, then use the same data, but less of it, to rotate my neck bone, and then plug only a tiny fraction, something like 0.1, of the rotational data (the degrees of rotation, essentially) into my upper spine bone, so I'm getting some movement on my upper spine too. That means my head rotation from Live Link is affecting the head bone of my character, the neck bone of my character, and also, slightly, the spine bone of my character, and that lets me get a little more movement out of my Live Link app than we got by default from the MetaHuman workflow, where all of the data was just used to rotate the head.

So that's the very bare-bones overview, without getting too deep into what I did here. Again, I've got my normal Rokoko live-streaming setup going to all my bones except for my head, neck, and upper spine bones, and then I'm pulling in this head rotation data via our Event Graph, live, frame by frame, from Live Link, and using it to drive my head bone, my neck bone, and my spine bone. That's the full setup. Then I've got my Unreal scene set up with a little green screen, and I've actually got some weird stuff happening behind my character if I want to go fully non-green-screen. But yeah, this is a really clean, easy workflow, and again, put any questions you might have down in the comments below. This was not intended as a full workflow video; I really just wanted to show you what was possible using the Coil Pro and just your gloves to get this full performance motion capture result. Lots of things can be tweaked, and this is very imperfect, but it's really cool as a brief demo to show that this can work. I hope you found this video useful. We'll be doing lots more stuff like this on our live streams, Thursdays at
11:00 a.m. PST; you can come join me there if you have more questions. Other than that, I will see you in the next video. Thanks, everyone. Bye-bye!
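The bone-map edit described for the MetaHuman (deleting the neck and head entries so Rokoko Studio data no longer drives those bones) can be sketched conceptually like this. This is plain Python with illustrative bone names, not the Rokoko plugin's actual data format:

```python
# Illustrative sketch only: a bone map, conceptually, is a mapping from
# Rokoko Studio bone names to your character's skeleton bone names.
# (These names are examples, not the plugin's real data format.)
metahuman_bone_map = {
    "hip": "pelvis",
    "spine": "spine_01",
    "chest": "spine_04",
    "neck": "neck_01",
    "head": "head",
    "leftShoulder": "clavicle_l",
    "leftUpperArm": "upperarm_l",
    # ...remaining limb and finger bones omitted for brevity
}

# Delete the entries we want the Live Link app, not Rokoko Studio, to drive.
for live_link_bone in ("neck", "head"):
    metahuman_bone_map.pop(live_link_bone, None)

# Studio data now has nowhere to send neck or head rotation on this character.
assert "neck" not in metahuman_bone_map and "head" not in metahuman_bone_map
```

The same idea applies to the custom monkey character, whose bone map also has the head, neck, and upper spine removed so the Live Link data can own them.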
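The pelvis lock built with the Transform (Modify) Bone node can be sketched the same way. The `BonePose` type here is hypothetical (Unreal's node operates on transforms inside the AnimGraph); the Z value of 83 is the one used in the video:

```python
from dataclasses import dataclass

# Hypothetical pose type mirroring the node's logic, not Unreal's API.
@dataclass
class BonePose:
    translation: tuple  # (x, y, z) in centimeters
    rotation: tuple     # (pitch, yaw, roll) in degrees

def lock_pelvis(incoming: BonePose, height_cm: float = 83.0) -> BonePose:
    """'Replace Existing' on translation: discard the incoming position
    every frame, but let the mocap rotation pass straight through."""
    return BonePose(
        translation=(0.0, 0.0, height_cm),  # pinned; Z = 83 as in the video
        rotation=incoming.rotation,
    )

# A frame where the mocap pelvis has drifted sideways and turned 45 degrees:
frame = BonePose(translation=(12.4, -3.1, 80.7), rotation=(0.0, 45.0, 0.0))
locked = lock_pelvis(frame)
# The character stays planted in place but still swivels with the chair.
```

Replacing only translation is what lets the swivel-chair rotation survive while the hips stop wandering around the scene.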
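The Event Graph clamp-and-multiply chain for the custom character can likewise be sketched in plain Python. The clamp range and the head/neck/spine weights below are placeholder numbers chosen in the spirit of the video (full data to the head, less to the neck, a small fraction like 0.1 to the upper spine), not the blueprint's actual values:

```python
# Placeholder clamp range and weights; the video's exact numbers may differ.
def clamp(value: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, value))

def distribute_head_rotation(yaw: float, pitch: float, roll: float) -> dict:
    # Clamp each axis so the head only moves within a plausible range.
    clamped = tuple(clamp(v, -60.0, 60.0) for v in (yaw, pitch, roll))
    # Scaled copies of one Live Link rotator drive three bones down the chain.
    weights = {"head": 1.0, "neck_01": 0.5, "spine_04": 0.1}
    return {bone: tuple(w * v for v in clamped) for bone, w in weights.items()}

rotations = distribute_head_rotation(yaw=80.0, pitch=10.0, roll=0.0)
# Yaw is clamped from 80 to 60, then scaled: head 60, neck 30, spine ~6.
```

Spreading the rotation across three bones is what produces the extra upper-body movement compared with the default MetaHuman setup, where all of the data rotates the head alone.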
Info
Channel: Rokoko
Views: 2,821
Keywords: mocap, motion capture, smartsuit pro, motion library, 3d animation, 3d character, vfx, unity dev, game dev, unreal dev, unity3d, unreal 4, motion, film animation, review, demo, handson, blender, maya, autodesk, cinema4d, maxon, behindthescenes, education, student, studentwork, analysis, tracking, capture, mograph, motiondesign, learning, treadmill, drift, fix, magnetic, interference, xsens, noitom, perceptionneuron, awinda, vive, htc, ikinema, nansense, inertial, sensors, ik, locomotion, loop, loopable
Id: uvLt6v6bV88
Length: 26min 40sec (1600 seconds)
Published: Fri Feb 23 2024