AMANDA: Hey folks! Take a trip to the virtual
outback in Rural Australia, a new high-quality
environment pack now available on the Marketplace. The assets are from
the spectacular Memories of Australia short film created
by Andrew Svanberg Hamilton. Explore the collection of
photogrammetry-based content, captured on
location in Australia, and download it for free
from the Marketplace. And speaking of free Marketplace
products, in honor of Earth Day and this beautiful
planet we live on, we've released five
new Quixel MegaScans collections featuring foliage
from a variety of biomes. Download them all and start
planting your own environments. When pandemic restrictions
forced a planned exhibition to go digital,
Kengo Kuma and Associates teamed up with game
dev company historia and rose to the occasion with
"Multiplication," a physically impossible, interactive
architectural visualization that anyone can
experience online. Head to the feed to uncover
how they invited the whole world to their stunning experience. Did you miss our webinar on
the new automotive configurator sample project? Now you can learn how to
build your own interactive digital showroom on
demand - the full video is available now on
our YouTube channel. Explore, adapt,
and survive in Returnal, a narrative-driven
roguelike from Housemarque. Read our interview
on the feed to discover how the studio's
history of creating frenetic,
arcade-inspired games has been key to creating
the next thrilling AAA exclusive for PlayStation 5. Digital twins are shaping
the future of architecture, engineering and construction. Explore what they are and
how they'll help us build better cities with AEC influencer
Fred Mills--together with experts from Microsoft, Buildmedia,
and Epic Games--in the next episode of The Pulse on Wednesday,
May 5. Register today! And next week, from April 26-28,
many folks from the Epic team--including
Epic CTO Kim Libreri, VP of Digital Humans
Technology Vladimir Mastilovic, and VP & GM of Unreal Engine
Marc Petit--will be speaking at the RealTime Conference
on a variety of topics, from MetaHumans and virtual production to collaborative workflows and more! Pop over to
realtimeconference.com to see the full line-up
of sessions and register. Now to give thanks to our
top weekly karma earners. These rockstars are: Everynone, ClockworkOcean,
ugmoe, Micky, chrudimer, Shadowriver, Makigirl,
Chief Man, Nachomonkey2, and Jack.Franzen. First up in our
community spotlights - you may remember 2019 Epic
MegaJam winner Escape Velocity. The team has gone on
to refine it into Orbiterrion. Match against
your friends or team up in a fast-paced,
action-packed space fight set in orbits across
the Solar System. Use skills,
collect shards and kick your friends to outer space. Download Orbiterrion from Steam! Next up, enjoy a short
film inspired by the Sicilian legend of Colapesce,
the son of a fisherman who disappeared into the sea to
prevent Sicily from sinking. Visit crisodora.com
to watch the full film and see more of the
gorgeous work from their team. Last up, enjoy Northern Lake
by Mateus Kurzhals--their first personal project
in Unreal Engine! It's an impressive start
and we look forward to more creations from Mateus. Head over to their
ArtStation page to get all the details
on the project. Thanks for watching this week's
News and Community Spotlight! VICTOR: Hey, everyone. And welcome to Inside Unreal. A weekly show where we learn,
explore, and celebrate everything Unreal. I'm your host, Victor Brodin. And today I have members of the
MetaHuman Creator team with me. Let me please introduce
Aleksandar Popov. ALEKSANDAR: Hey, everyone. VICTOR: We got Chris Evans. CHRIS: How's it going? VICTOR: I should
probably get my notes out. So that I can probably
read your titles as well. Alongside Chris Evans,
we also have James Golding. JAMES: Hi. Good evening. VICTOR: He's the
technical director. And then Nina Ward,
product manager. NINA: Hi, everyone. VICTOR: And just to make sure that we get everyone's
introduction correct, Aleksandar Popov
is the art director. And Chris Evans is the
lead technical animator. Welcome to Inside Unreal. ALEKSANDAR: Thank you. CHRIS: It's great to be here. JAMES: Thanks. VICTOR: Today, we're going to talk a little bit about
MetaHuman Creator, which is now available out in early access. If you haven't seen it yet,
the link is pasted in the forum
announcement post. You can go ahead
and sign up for-- or request early
access to use the demo. I'm not going to do
much of the talking today, so I would like to
hand it over to Chris. CHRIS: Hey,
we just wanted to go over the initial inception, if you will,
of MetaHuman Creator. I'm going to spend a
couple of minutes on that. We've been working together
with 3Lateral, and Aleksandar, and Vlad, and others there,
for many, many years, across many different projects. There was the
projects like working on Senua with that team. And working on
Siren and Andy Serkis, and a lot of the digital
human demos that we've done with Ninja Theory. So we have really
created one of the, I think, best teams in the
industry for digital humans. But it takes all of us
working in the same direction to create one of these
high fidelity humans. And a goal that we
really wanted to hit was to allow anybody
that uses the engine to create a super high
fidelity digital human. And that was kind of
like a moon landing for us. A couple of years ago,
we set out. And we said,
what do we need to do to try to enable all
UE4 users to create? We were doing Andy
Serkis's performance at the time for this GDC demo. I was talking with
Aleksandar and others. And I mean we're rigging
like individual nose hairs and things for this super high fidelity character. But we really wanted to
enable anybody to do that. And 3Lateral has really
focused over the years on kind of a scan to rig pipeline. How can we create
rigs as fast as we can at the highest
fidelity possible? So we were all talking and
we'd worked on so many projects together. We said, well,
we think it's possible that we can take the existing
infrastructure at 3Lateral, and wrap it in a way that
can allow Unreal Engine users to really create a super
high fidelity character very, very easily. And at the time,
Vlad was talking about these markers to move the face around, and fit the face to what you want. And the call and response of when you pull on an ear--it anatomically forms some kind of plausible ear. That was kind of magic. I remember sitting in the room
when he was describing that to execs and they were just like,
oh, so it's kind of like putty. It's just very hard
to understand that you're going to
make a movement, and it's going to look up into
a really huge learned database of different anatomical
facial features. And give you some kind
of anatomically plausible, believable response. But yeah. And now years later,
it was really great to be able to
show the world what the teams have been working on. And there was a time when
a little bit of the background was shown when Vlad worked
with James on a GDC demo. He could talk
a little bit about just getting the groundwork
in the engine for-- JAMES: Yeah. This was a while ago. And it was interesting going
back and looking at that video. Actually I hadn't
watched it in a long time. But I worked with Vlad
on a demo for GDC 2017, which was just
very early showing this sort of technology of like
combining regions of faces. There was a very
small number of faces. There was no on surface editing. But you know,
Vlad had some things that they needed in the engine. I added some features,
and we worked together on the demo with a UI artist
at Epic to show something. And so that was just the
beginning of this idea of like, oh, you can make a unique face,
and then animate it. And make a whole--
get an idea of how you can take even
a small database, and by using this
sort of mixing tech that 3Lateral had
been developing. But it's really
interesting to see how long it takes to go from
a prototype into something that you can really give to
people and give to everyone. And it's been interesting
to have been there at the beginning
of that journey. Working with 3Lateral on
demos like the Kite demo that we did many years ago. They were involved in that,
and so forth. So all the way through to
a design we can give to everyone. It's been a very interesting,
sometimes challenging, journey. But it just goes to show
that none of this stuff is easy. There's been a lot of
interesting challenges along the way. But it's good to look back
and see the overall arc of it at this point. CHRIS: And I would say at
all times the 3Lateral team, especially since they
have the expertise when it comes to creating digital
faces and things like that. Aleksandar has been pushing
for how does the user use the UX, how does this workflow work. I remember the 2017 video. I think Aleks you had called it
like shopping for body parts. It's like little icons
and you blend in a little bit of this ear. And it was really the
vision of the 3Lateral side to have something
where you can really feel like you're sculpting,
feel like you're working. JAMES: You have to keep
taking your eyes off what you're making. And you do a bit more of that,
and then all the other ones go back. It's not a good way to work. But it just goes to
show that it's not just about having really
interesting technology, and the ability to
render that technology, but you've got to think
about the whole pipeline. I think that's where 3Lateral
really put their experience. And not just being a
technology company, but being a content
creation company. And a company that's worked
with so many different other game studios. They sort of understand
the diversity of needs. And that there was
a clear need for this. But in order to actually
meet those needs, there was a lot that
needed to be done. So it's been really interesting. I think the other thing is
it's been very collaborative. Just like we were saying, it's a mix of a lot of
different disciplines. So I've had to
learn an awful lot about both the artistic side,
all these terms like lacrimal fluid. And all these really
technical terms, which are really interesting,
like the details you have
to get into around how the meniscus of
fluid in the eye moves, as the eyeball moves around,
just real detailed stuff. All the way on the other side
to sort of cloud technology of how do you scale in the cloud
and manage a fleet of servers. That stuff, coming from developing games for console, is a very different
experience. So I think everyone on
the team has learned a lot over the last couple of years. NINA: It's really fun to see
the reaction from everyone as well. You guys say technical terms
like meniscus fluid and stuff, but there are people who
aren't even particularly familiar with sclera,
or some kind of slightly more basic terms. But it just shows how broad
the audience has actually been with this as well,
which I think really speaks to how intuitive
the whole tool is. ALEKSANDAR: And it is only
just the beginning in a sense. NINA: Should we jump into it? JAMES: Yeah. So I think, Aleks, are you
good to give us a bit of a demo? ALEKSANDAR: Sure. Yep. JAMES: So Aleks has probably been using this tool
longer than anyone, so we'll let him show
how a pro does it. You good to go, Aleks? ALEKSANDAR: Yeah. We can switch. So I'm running a local
build for the stream. And I was already fiddling
in the background trying to create something
of an [INAUDIBLE] but that's actually
one point that I would like to connect
with Chris and James said. At this point in time,
we don't necessarily see MetaHuman Creator
as a tool where you try to hit one-to-one likeness. So it's more like getting a
plausible, realistic result out of the already mentioned database,
which is sitting as sort of like this
main aspect of the MetaHuman Creator tool. And yeah,
let's go into a bit of a deeper dive. So the way we envisioned-- I'll leave it at
the face for now. So the way we envisioned how
you would approach modeling-- because,
as Nina mentioned, the idea was that we wanted to cover
a really huge range of users, from really entry level
people who just want to play with this
without being maybe an artist who already
worked on faces or whatever,
all the way to industry grade users and artists. So to do that,
we developed a system which we call the direct manipulation tool, DMT, and that is actually the
core of how you edit the face. And we separate this
into like these three panels that a lot of people
have played with. So I'm just going to
go quickly through it. So the way we envisioned
that was you either start with a preset. Or you can go into blend space,
and select a couple of presets to use like a minimum
to enable this. Or you can go a
higher number than that. And then, once you do that,
you enable basically blending per region
with the characters that you have selected. So you can get really quickly-- you can quickly blend
between the characters and get the starting point
that you want to continue. Or you can use the Move here. Let me move that, one second. So the Move is
essentially comprised-- it's like Sculpt would be the
most granular type of editing that you can do in the
MetaHuman Creator. And the Move would be
simply like grouped markers so that you can also do quicker movements, and get the results
quickly before you actually go into sculpting. And then you can refine
at the very detailed level with your mesh. But the thing to understand I
guess about the sculpting that might be one of the
most important points is,
unlike ZBrush and other software, you're not just
moving a part of mesh. And in a linear way, you're actually,
what Chris mentioned, you are moving to
this area of space, which means you're moving
to the data points of all the characters in the database. And geometry is trying
to actually hit that spot. And that means this is
something-- sometimes you will see, especially with
the eyes, when you move, you're not actually
moving only that marker--the surrounding area moves as well. And this is because
you're getting the-- at an instance, you're
getting full rig, full deformation from that particular
character that you're actually selecting that area from. So this means it would
be very difficult to, I guess, localize it to a
really small spot. So that is why you will notice
some other regions move as well as you move the markers. And essentially,
it comes down to just-- we feel that it
is very intuitive, but you'll need some
time to just like play around and notice how this works. And going back to the
likeness that we talked about. So yes, for now,
we don't intend for the user to be able to take a photo
and hit the likeness one-to-one. It's more like we want
to get a plausible, realistic digital human. You can go like-- even though all of the markers
that I was mentioning, whenever you move them,
it is actually data that exists in the database,
meaning it's from a plausible
position on the geometry from the database. But still you can get some
implausible looking characters. Because essentially,
like if you-- right now, there are all
various ethnicities and genders in the database. And so you're moving
through all of them. And so you can,
especially with the eyes they're very sensitive,
you can create like maybe Asian looking eyes. But your eyebrows would
be from a completely different character. And you'll have a big
difference between the two. So we have to mindfully
move the other one as well to avoid
potential possibilities, unless you maybe
want to do that. And then that is also
an artistic choice I guess. So yeah,
that would be a brief cover of the way you edit the face. And I can start going
maybe category by category and getting a more in depth-- CHRIS: We would love to hear it,
Aleks. ALEKSANDAR: Yeah. OK. So I'm going to actually
go right from the beginning, select the preset,
and just start from there. OK. Let's see. NINA: Aleks,
do you have a favorite preset? ALEKSANDAR:
That's a tough question. I guess I really do
like this character. I felt like when we
were creating him it was of a Genghis
Khan looking character. That was the idea. So I like him. And I like Hudson. I think he's is very
charismatic character. NINA: Hudson's quite lovable,
isn't he? I kind of like Skye. Skye is one of my favorites. ALEKSANDAR: Oh yeah. Skye,
she's pretty awesome as well. Actually I like Seneca as well. He has an interesting
head shape. But OK. Let's maybe take, I don't know, Tori. It doesn't matter. So this is my
personal preference. I like to set up the background
color, which you can find here in the settings. So this just might actually
even be missed by people. I'm not sure whether on
your side if it's clear enough. But so you can
select the background, and find a complementary
color like a painter would do. And it makes, I guess,
the experience a bit more enjoyable. And OK. So let's start doing something. I'm going to maybe try
and age this character. So I'm going to select
some of the older characters from the presets. Let's see. Maybe take this lady. And take her as well. Who else? Maybe a male character as well. All right. So let's see. I'm going to do
something about the eyes. So maybe select some
point between here. And I'm going to
deflate the lips a little bit. Let's see the nose. I'd like it to be bigger,
a little bit bigger. But I may do that in sculpting. Enlarge the ears a little bit
as the character gets older. Let's see. Yeah. Like this and like that. The jawline, now I think
I'm going to keep it like this, support it. All right. I'm going to switch
back to the Move. Actually,
I can go directly to Sculpt. I want to address
the jaws a little bit. VICTOR: Aleks,
while you're operating here, I'm going to take
the opportunity to ask a couple of questions. We're getting a load of them. So we're going to have to start
a little sooner in Q&A if that's OK. ALEKSANDAR: Sure. That's fine by me. VICTOR: MR3D-Dev
was wondering, will there ever be an
offline MetaHuman Creator? JAMES: That's an
interesting question. I think we can certainly see
why that would be desirable. Obviously,
what this looks like when you first see it is a character
creator from games. Now what it's doing is
quite different than what normally a character
creation in a game is doing. Because we're
creating brand new rigs, and mixing from this database,
and so forth. So it's a very
different proposition to put this into a game. So I think it's a really
interesting question. I think it's something
that I can understand why people would want to do. It's something that
we have been talking about, but there are some big problems.
based application right now is we have this big
database of our faces, as we've talked about. We also have a big
database of the facial textures. Those are big and
they're going to get bigger. So that's really
the big challenge. How could we
possibly in the future make something
which is small enough that would ship with a game? So we just don't
know at the moment. It's something that we're going to look at. We hear that question. But it's not going to be happening,
you know, next week or anything. So we've got a lot
of R&D to do for that. VICTOR: Next question comes from Will Voss, who's wondering,
how do you paint textures for MetaHuman faces? There are three albedos. How is that supposed to work? JAMES: Chris,
can you take that one? I'm guessing it's three
albedos for the blood flow map. CHRIS: Yeah. So there are a lot
of different textures that comprise a MetaHuman face, but there are kind of the main ones. So what Aleksandar
was just showing, we call that kind
of texture synthesis, when you're building the
skin that you want in the tool. And when you're creating your
character in MetaHuman Creator, you are so-- I'll get a little bit technical,
because I think that there's a lot of
game devs in the audience that might be interested. But basically, we have a
kind of machine learning back end that breaks out
the high frequency from the low frequency detail. What that allows you to do is
completely change and rework complexion. But still,
you'll notice when he moved through the high frequency details,
that allows us to use those normal maps and things. But mix them to make a new skin. And allow you to do some
pretty big changes to it. The reason why we kind
of keep that high frequency is because it's a real challenge
to use machine learning to generate new normal maps. A lot of character artists
know you can't even really transform a normal
map 90 degrees if you're a purist. So having the high frequency
and low frequency broken apart, that then gets sent down
to you as a new albedo that's synthesized from that. But then it knows the high
frequency that it picked. And the tool does
some special things like it will
automatically change the tangents of the face
you've made to better fit with a high frequency normal
map that has been chosen. Because we saw kind
of early on that if we tried to place a normal
map-- or people know, if you use a normal map on a face shape that it wasn't baked against, it will start looking puffy
because the mesh tangents are now different. And the normal map is
expecting the mesh tangents that were there when it was baked. So we go ahead and we
change the high frequency detail of the face mesh as well
to match the high frequency detail of the normal map. And then all of the
driven wrinkles use those same high
frequency details and what we call kind of blood
flow maps, which are very, very tiny driven diffuse. So I think that's probably
what you're talking about is the driven diffuse,
which is kind of like the
redness of the face. But there's also
multiple normal maps and those are driven
by the rig as well. These masked regions in the
face shader that we're talking about, they're driven by
the face rig live. Because the face rig
runs live in the engine and is not baked out. The face rig talks to that
material for the face shader and it blends the
appropriate wrinkle areas from 96 different zones in
as you play an animation, or create your animation,
or interact with the face in Unreal. So that's a quick overview of
why there are so many maps.
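[Editor's note: to make the high/low frequency split Chris describes a little more concrete, here is a minimal, hypothetical sketch. It treats the albedo as a single-channel float image and uses a box blur as a stand-in low-pass filter; the types, names, and filter choice are illustrative assumptions, not the actual MetaHuman texture synthesis back end, which uses machine learning.]

```cpp
// Illustration only: split an albedo into low-frequency (complexion) and
// high-frequency (pore-level) detail, then recombine a new complexion with the
// original detail. A box blur stands in for whatever filter the real pipeline
// uses; none of this reflects the actual MetaHuman back end.
#include <algorithm>
#include <cstddef>
#include <vector>

struct Image { int W = 0, H = 0; std::vector<float> Pixels; };

// Simple box blur used here as the low-pass filter.
static Image BoxBlur(const Image& In, int Radius)
{
    Image Out = In;
    for (int y = 0; y < In.H; ++y)
    {
        for (int x = 0; x < In.W; ++x)
        {
            float Sum = 0.0f;
            int Count = 0;
            for (int dy = -Radius; dy <= Radius; ++dy)
            {
                for (int dx = -Radius; dx <= Radius; ++dx)
                {
                    const int sx = std::clamp(x + dx, 0, In.W - 1);
                    const int sy = std::clamp(y + dy, 0, In.H - 1);
                    Sum += In.Pixels[static_cast<size_t>(sy) * In.W + sx];
                    ++Count;
                }
            }
            Out.Pixels[static_cast<size_t>(y) * Out.W + x] = Sum / Count;
        }
    }
    return Out;
}

// newAlbedo = newComplexionLowFrequency + (originalAlbedo - originalLowFrequency)
static Image RecombineDetail(const Image& Original, const Image& NewComplexion, int Radius)
{
    const Image LowOriginal = BoxBlur(Original, Radius);
    const Image LowNew = BoxBlur(NewComplexion, Radius);
    Image Out = Original;
    for (size_t i = 0; i < Out.Pixels.size(); ++i)
    {
        const float HighFrequencyDetail = Original.Pixels[i] - LowOriginal.Pixels[i];
        Out.Pixels[i] = LowNew.Pixels[i] + HighFrequencyDetail;
    }
    return Out;
}
```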
VICTOR: Thanks, Chris. Next question comes from leftwitchde, who's asking, will the clothing section
be improved as well? CHRIS: I can
talk about clothing. Yeah we'd like to add clothing. The clothing that you see
in MetaHuman Creator-- so there are 18
different body types. But then across those
different body types-- so that's masculine, feminine,
three different heights, and three different BMIs-- and across those there's
then four different LODs. And because we don't really
know what LOD you're going to see, some games can auto LOD
out and copy over skinning. Because they know the distance
that you'll see a character at. Because we have
no idea how you're going to use these
characters in your game, we really tried to hand author
all LODs and skin all LODs. And that was a real challenge. All in all,
the very limited clothing set that you're seeing is over 1,000 different skins and assets imported
into Unreal Engine. As you start downloading
different characters, if you look at the
folder structure that's being built in your project,
you'll start noticing some
common items. And which items-- you'll
start noticing all of the things that I just spoke about. You'll notice that
there's per BMI clothing, per height clothing. We tried to SIM all
of the clothing folds for most of the garments
across every different BMI. So I think,
with the exception of the hoodie, the clothing shouldn't look
like it's just the same clothing warped to fit a new character. It's important to us that the
clothing feels good and fits the different characters. So there's more clothing
to be coming in the future. And the great thing
about MetaHuman Creator is that we're not really
tied to engine releases. You download your
characters through Bridge. We're going to figure out what
our next update cadence is, but you won't have to
wait for a new version of Unreal to get new clothing. It should just-- you
should just see it show up. And I'm sure we'll put
some messaging out there. JAMES: We've been patching MetaHuman Creator, I think,
every day since we released it. It's just been small things,
small fixes, and stuff. But yes, it's really great that we can get updates out really fast and do more than just bug fixes. VICTOR: And Aleksandar, feel free to interrupt us whenever
you would like to comment. We're watching you here. ALEKSANDAR: Sure. All right. I'm getting a feel
for this livestream, so sorry for being quiet. VICTOR: No, you're good. Next question-- Yeah. Go ahead. ALEKSANDAR: Oh. I'm sorry. Please proceed. VICTOR: Next question
comes from Ntrillion1. Will the 3D models be
compatible with other engines, for example, Godot and Unity? CHRIS: Nina,
do you want to do that one? NINA: Yeah. I can take that one. So MetaHuman's are
for use in Unreal Engine. We'll allow you to take them
into other places like Maya, but you will need to
render them or publish them with Unreal Engine,
in order to comply with our end user license agreement. VICTOR: Easy enough. SirVoxelot is wondering,
is it possible to transfer a DNA file from one mesh to another? JAMES: I'm not quite
sure what that means. VICTOR: I don't know either. Verbatim.
Perhaps SirVoxelot can detail. Also, for all you out
there that are watching, we're getting a
lot of questions. Unfortunately, we will not
be able to cover all of them. If you want to continue the
discussion once we go offline, the forum
announcement polls that exist on the Events
section of the Engine forums is where you can continue
the conversation afterwards. And we can potentially
chime in with some answers there as well. CHRIS: I can take a
quick swing at that one. But so basically, when you
build your character in the engine, it depends how
advanced this guy is. But so there's the DNA
found in the engine, but it's built into
the file structure. So maybe you saw that
on GitHub or something. With Maya,
we send the DNA file down. The DNA file basically says,
for this configuration of joints and blend shapes on this face,
when I move this control,
what's the response? When I do this,
what's the response? Because each character has
a different blended response. So it wouldn't make
sense to use the same DNA file on a new character
unless that new character had the exact same face or so. So I'm not super sure if
I answered the question. But that's the relationship of
the DNA file to the character. JAMES: I can just add
a little bit more detail. That's a good point,
Chris, just in terms of the technical details. So when you download a
MetaHuman skeletal mesh from this tool,
embedded into the skeletal mesh, just like Chris says,
there is this DNA data which controls how,
just like Chris said, the controls affect it. That's used by what's
called Rig Logic. This is a new
plug-in that shipped as a standard with 4.26. And this is based on all
of 3Lateral's experience around building face rigs. That's what drives these
rigs live in the game. And so you place a rig
logic node inside Control Rig. And that looks up in the
skeletal mesh what the DNA--the mapping, the driving data--is, and then we'll use that data. So, generally, the assets we give you are just sort of plain Unreal assets--materials, textures. It's very intentional
that there's no magic in these materials. You can open them all
up and mess with them. The only sort of special
thing is this DNA data inside this skeletal mesh. And currently the
only way to alter that is using MetaHuman Creator,
because, just like Chris says, it's very tied to the face. If you change the face,
you'd have to change that data. It's all coupled and not an
easy thing for people to do, because it needs
all of the pipeline. VICTOR: Thank you all. Aleks,
are you good with us continuing through some questions here? Or is there anything
you want to go over? ALEKSANDAR: If
you guys are fine. I'm good for you to
go with the questions. And then I'll chime
in at some point. NINA: I'm enjoying
you kind of go-- JAMES: Yeah. Yeah. It's great. ALEKSANDAR: Yeah. I'm finding it a bit hard
to focus on the questions and what I'm doing. So I'm just trying
to get a hold of it. And then once I'm good,
I'll tell you guys. If that's fine by you. VICTOR: Yeah. Sounds great. Thank you, Aleks. Next question comes
from Player Chass, who's wondering about body sculpting. JAMES: Oh. That's a really interesting one. It's obviously something
that we would like to do. Something we've talked a lot about,
again. There are some really big
challenges with body sculpting. Like Chris said, right now,
we have these preset body types. And we've manually
gone and created the clothing and
all the correctives for each body type. There's a huge amount of work.
way that we can make it really hit the
quality bar we want for all the different body shapes. Probably the biggest
challenge with sculpting a body-- well first of all,
we need to build a database. And figure out,
similar to what the face took, it took a long time
to get to this point with faces, a lot of research,
a lot of iterations. So there's all that work to
do to figure out if and how this would apply to bodies. But then clothing is a huge
challenge on top of that. If you've got complete
control over the body, how do you make sure the
clothes look good on that body? And that's just something we
simply don't have an answer to at the moment. So it's a really
interesting question and something we'd love to do. But again some
really big questions that we're going to
have to get into there. And on top of that,
we also need to think about retargeting animations. We're going to have to make
sure the same animation can run on all of those
variable body types. So that's something that's
being worked on for UE 5. But those are--
clothing and retargeting are probably the big topics
that we're going to need to tackle, even once we've solved the
big challenge of sculpting a body and what they would
actually look like. Would it be handles
like the face? So yes,
some really big questions there. But I think it's something
that we will be thinking about. CHRIS: And then,
I don't know if the question-- I just want to chime in on that. Because we're
delivering source files, you can download
the source file. You can make
changes to the body, because it doesn't
have as many deformers and things as the face. Basically, if you were
to click off of the clothing, you could download
your MetaHuman, and it's going to
have the source files. And you could build a spacesuit
around your MetaHuman, or that's why we give you
the deforming skeleton that's needed to export
your body and bring it into the Unreal Engine. VICTOR: Next question
comes from leftwitchde again, who's asking, will we be able
to make colorful hair like purple? Right now, we're restricted to
the blonde and black spectrum. NINA: Yes. It's a very short answer. We're going to-- VICTOR: I like it. Next question. Let's go. NINA: Yeah. Let's keep going. JAMES: I don't know when. NINA: Who can assume? But it'll be-- JAMES: On a big, long list. NINA: --on our growing
list of things to add. VICTOR: I guess,
not a question from chat, but something that we
might want to-- Is there a public road map? Or a plan for a public road
map in terms of MetaHuman? NINA: No. We might at some point,
but we kind of want to see what
people are telling us first, and get a sense of what the
bigger buckets are for what we could be looking at. But once we get those,
we might show something, but it will be all hypothetical. So none of it will be
definite until it's actually out into the world, I think. JAMES: Something so
exciting about this product versus other
things is like there's nothing really
like this out there, so there's not like a
set of expectations. We could take this in so
many different directions. And so that's why
getting to this point was such a big
deal for the team. Because now we get to see
what are people doing with it. What are the most
important things people are finding and needing? And then we can
kind of respond to that. So we're still very
much in the kind of learning phase and
listening phase at the moment. Because there's
just a lot we can do. VICTOR: Next question comes from Benbentwo, who's asking,
will there be any ability to extend the MetaHumans
system to create human-esque creatures,
like elves, dwarves, maybe even orcs? NINA: That might be
one of those big buckets that we don't
want to talk about. It is a really popular request
like we see a lot of people are really interested
in humanoid characters, like the orcs and the elves,
and I mean everything in between. JAMES: I think there's-- oh, Nina, you just muted. NINA: No, go ahead. JAMES: OK. I think there's a
big range there as well between characters,
which are just a small tweak on humans,
maybe a different shape ear or something, all the way
to just being the same topology, but with some adjustments.
orcs or something, where it's a totally different topology. It's such a broad question. Again, it's one of those
things where, yes, we definitely hear the interest,
but there's certainly some challenges
for us to figure out, in terms of how we
might even tackle that. NINA: I think, ultimately,
we're still pretty focused on humans right now. We kind of want to get to
that specific MetaHuman so you can have final control over what you end up with. So that's still going
to take a while yet. CHRIS: That's always my quick answer,
Nina. It's MetaHuman Creator,
but we heard the-- NINA: It's in the name. CHRIS: Yeah. It's going to take
a while for us to really nail this and get it. This is just the
initial release. And there's a bunch
of stuff we really want to do to
make these not only usable on a lot more platforms,
and more performantly, and things like that. JAMES: I saw we announced MetaPets
Creator like three weeks ago. So I guess we should
probably get on with that. NINA: Have you started it yet,
James? JAMES: Right now. I'm typing right now. NINA: OK. Get coding. Faster. CHRIS: The next question,
I see: you can do dogs, but can I do a dragon? It's just a dog with wings,
you know. NINA: Remove the feather. VICTOR: If anyone watching knows any orcs or elves out there,
let us know. So that we can use
them for references. JAMES: That is
the other challenge is finding enough orcs
to scan for our database. That's definitely a problem. NINA: I think they have
a few in New Zealand. VICTOR: Next question comes from Formula_E,
who's asking, can we use MetaHumans to build
character creation systems in our games? JAMES: That's kind of
the point we covered earlier. I think it's a really
interesting idea, but there's definitely some big,
technical challenges to taking this from
an online sort of tool that you use outside the engine
to build assets for the engine, versus something that
you could bring online. I think we'll look into it. But we won't know yet how feasible that is until we really get into solving some of these problems
around data size. VICTOR: We're
definitely receiving more specific questions
in terms of like eye colors and different shapes. And I think we've sort
of covered that, right, in regards to where
we're going in the future. All of that is sort of
on the potential road map in terms of
customization that is available. And remember the tool's
early access-- what is it now? A week, I think. A week and one day that it's
been available to the public. So tune in to the Unreal
Engine Twitter handle, as well as Unreal Engine forums. We have created a
MetaHumans section there. If not,
it will be online real soon. JAMES: We have one. NINA: It's there. VICTOR: OK. Awesome. I've been busy doing the livestream,
MetaHuman. I haven't had enough time to
play with the tool myself yet. All right. Let's move on. Next question comes from
Nauman Arshad, who is asking, will support for 3D
softwares other than Maya be introduced any
time in the future? CHRIS: I could take that. Uh no. So our company has used Maya
to develop all of the back end pipeline, everything. Not hating on other softwares,
but it's just there's a lot going
on under the hood with the Maya plug-in
that we released. And it would be a real
challenge to figure out how to get that working
in other software. So a great example
is if you've done a lot of rigging and
animation in Blender, Studio Max, or Maya. Maya has like a
joint orient matrix. And that doesn't
even exist in Max. So there's just some stuff-- and like Blender requires a
bone length, which Maya doesn't. There's quite a few things
where the softwares just do them completely differently. And we'd be chasing
ourselves and we'd be spending all of our time
trying to get that working, instead of actually
improving the tool. But you know
it'd be great to see in the future what people do. If people figure
out ways to author. JAMES: Chris,
do you think that's something a community effort could do--make a Blender version or something? Would that be plausible? Or is that just
not going to happen? NINA: Anything is possible. CHRIS: Anything's possible. I have to check about
the actual plug-in code. But we're very
transparent with things. We could make available
what is needed to do it, but it would be
a real challenge. Because, like I said,
the softwares are just very, very different. We're not even talking
about handedness. We're just talking
about representation of blend shapes, joints,
different things like that. NINA: I think the other thing was like we wanted to open
up to Maya so that people can animate their MetaHumans. For most other stuff,
it's really kind of taking it into
engine and going from there. JAMES: Yeah. I think that's a good point,
Nina. And also, we would rather-- we're still a limited
development team. And it seems like a big company,
but the team writing the code for
this was surprisingly small. And we're trying to focus also
on animating inside the engine. We'd really like that to be
a plausible place for people to do a lot of their work. Rather than having to split
our effort between different 3D packages,
if we can do it in just one place, and make it
available to everyone, that seems like a
really exciting way to focus our efforts. CHRIS: And I also just want to say there's so
much that goes into it. Delivering source files was
really not the easiest thing. But it was very,
very important to the team. And there are a lot of
people involved, like Nikola, Vladimir, Kay, Voya. There's a team of
people really working to get source assets right. And when you're watching
this like this bar creep across the screen
for three minutes while it's generating a
Maya file from scratch, the team is
doing a lot of stuff. We're making sure DirectX
is the renderer that you've set in Maya,
we're building a DirectX shader that has all of the wrinkle map blending, and everything. There was so much. There was a lot that went
into setting up the Maya file properly to work with
all LODs and all Rig Logic and everything. But we're really proud that
we're able to give that out. Because that's really
powerful and will allow people to kind of reverse
engineer stuff if they want to. Or, like I said, build a space
suit around the character. There's also the UE4 RBF solver
plug-in that's installed by Quixel. So when you're
skinning your character, you're able to look at the
driven joints of that character for your clothing the same
way that the joints look in the engine using
our pose driver. So a lot of work went
into the source files. And it's really awesome
to be able to give it out. But again,
that's a whole other plug-in that would need to be
kind of written for these files to work in another DCC. VICTOR: Thanks
for the clarifications. Next question comes from Xenthorx,
who is asking, any plan to be able to import
scan data into MetaHuman? JAMES: That's
an interesting one. I don't know at the moment. I think it's another
one of those ones that we need to look into. There's nothing we're
announcing at the moment. VICTOR: Let's see. Next question
comes from SirVoxelot, who's wondering,
where are the dynamic material instances on the head
being created and controlled? It seems like it's ticking
since it works in editor even if there's no Live Link
skeleton in the Blueprint. JAMES: Very specific question. I don't remember exactly without
digging in, unless Chris does. But those MIDs are going to be
driven by the animation system. So one of the things that
curves in the Anim Blueprint can do is drive material parameters. So the Control Rig
node running Rig Logic is going to be driving the
wrinkle maps, the parameters for the wrinkle maps,
and the parameters for the animated albedo, blood flow maps. Those will then run as curves
through the Anim Blueprint and then drive
material parameters. So it'll look to see which
sections have the parameters, and then set those. So yeah, it'll be happening in the editor, but it doesn't have to tick. It's just any time the animation is playing, it's going to be pushing those
parameters into the MIDs. I hope that helps. VICTOR: I'm sure SirVoxelot will let us know if it did. JAMES: It's a standard feature. That's not a new thing. We've done lots of
demos over the years, and there's lots of places where
you will use animation to drive material parameters and MIDs. So we didn't write anything special there.
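[Editor's note: as a rough illustration of the general mechanism James describes--an animation curve value being pushed into a dynamic material instance parameter--here is a minimal, hypothetical C++ sketch. The curve name "BrowRaise" and parameter name "WrinkleBrow" are made up; the actual MetaHuman face wires this up through its own Anim Blueprint and face material.]

```cpp
// Minimal sketch, not the MetaHuman implementation: forward an animation curve
// value into a material parameter on a skeletal mesh whenever it updates.
#include "Animation/AnimInstance.h"
#include "Components/SkeletalMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

// Create the dynamic material instance once (e.g. on BeginPlay) for material slot 0.
UMaterialInstanceDynamic* CreateFaceMID(USkeletalMeshComponent* FaceMesh)
{
    return FaceMesh ? FaceMesh->CreateAndSetMaterialInstanceDynamic(0) : nullptr;
}

// Then, after the animation has updated, push the curve value into the MID.
void PushWrinkleParameter(const USkeletalMeshComponent* FaceMesh, UMaterialInstanceDynamic* MID)
{
    const UAnimInstance* AnimInstance = FaceMesh ? FaceMesh->GetAnimInstance() : nullptr;
    if (!MID || !AnimInstance)
    {
        return;
    }

    // "BrowRaise" and "WrinkleBrow" are hypothetical names for this example.
    const float CurveValue = AnimInstance->GetCurveValue(TEXT("BrowRaise"));
    MID->SetScalarParameterValue(TEXT("WrinkleBrow"), CurveValue);
}
```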
VICTOR: Next question comes from Skye, who's wondering what your thoughts are on the effect, or non-effect, this will have on artists and character modelers. There are a lot of folks in the
community and the industry who are sort of wondering about
what ripple effects MetaHuman will sort of set off. JAMES: Maybe Aleks
has thoughts on that one. ALEKSANDAR: Yeah. I can chime in on that. We've heard a lot of
different opinions on that. But the way we see
this is basically it's supposed to empower
artists and everyone else. So being a
character artist myself and working for a long
time in this industry, you're used to having a
lot of tedious processes while you create the character
right from topology to UE to everything. I mean this tool is
supposed to help you jump start these elements. And just allow you to create
more characters more quickly. And whether you use
it as a starting point and then, as already mentioned, build on top of it, or use it as a finished
character or whatever. It is essentially
supposed to empower you. And we feel that the
creativity will still be recognized. It kind of levels the whole field of users and elevates them with this tool. But you're still going
to see different artists using the tool differently. And some will get better results
and some different results. So I think that nothing
essentially changes. You're getting a tool that allows you to do things more easily and quickly than you could up until this point. So personally,
I really fell in love with the tool. It's a bit different. Well I guess what
I'm trying to say is like if you're a traditional
sculptor or anything, you can still do that
in any application. But if you're using it in a
production environment, and I believe a lot of
artists in the industry right now will agree,
when you're working on a project, you start from various assets
that were already prepped and then you try to optimize
the whole process anyways. So MetaHuman
Creator is actually supposed to do that in one place. And so I don't know why
I understand the concern, but I think that happens
with any new technology, and essentially, I think,
at the end of the day, it will only help
artists and not cause them to lose their jobs. NINA: You can definitely
spot the difference between a professional
and someone who is just kind of tinkering. When I get into the tool,
I can't get anywhere near what Aleks does. ALEKSANDAR: Yeah I
mean it definitely helps. If you have knowledge,
and if you do, all the skill that you gathered
while working in whatever way, it will all be reflected in
MetaHuman Creator as well. So it's no exception. Yeah. So basically you just get
a really good head start on a lot of processes. NINA: Yeah. On the other hand, it does mean
I end up with a digital human. It's just a digital human,
whereas Aleks' ones are a bit more special. CHRIS: I also feel like it's going to allow people
to be more artistic and focus on art. I
think that there was an entire kind of painting
genre of people, many more than nowadays,
that would really just try to paint an exact human. And then when the
camera was invented, that was the explosion
of modern art. It was kind of like,
OK, well, it's a bit easier to make an exact,
pixel perfect, reproduction of a person now. So maybe we can focus
on doing more artistic stuff. Or I've been on enough
AAA games to know that when the art
director wants to kind of noodle specific stuff like
a scar on a face, or anything, there's always going to be-- I think it'll allow
artists to focus more on maybe some of the
hero characters for now. And get really high
quality characters that are interacting and
telling the story, and filling in. But maybe that's
just my take on it. VICTOR: All right. Next question comes from
toddkleinhans7, who's wondering, does MetaHuman
Creator have the ability to import a reference image? JAMES: It does
not at the moment. That's something that-- uploading
stuff is surprisingly fraught, because then you
start having to worry about ownership, and security,
and things like that. So again,
not to say never, but it's not something that's
planned at the moment. VICTOR: So try to specialize in the second monitor talent? JAMES: I've seen people use, I think it's called
ghost or something, where they can bring an
image over the top, which seems to work pretty well. I don't know exactly
how that works. I just see it on a few videos. CHRIS: Yeah. And PureRef as well lets you
onion skin something on top. VICTOR: Next question comes from Odib7,
who's asking, can you clarify the symmetry options? ALEKSANDAR: Yeah. I can cover that. So just give me one second. OK. So when you go into Sculpt, you have the symmetry. So essentially,
I saw a lot of comments in terms of people commenting
on whether the characters are too symmetric
and stuff like that. So essentially,
the whole database that this tool is comprised
of is based, as mentioned, on scanned characters which are
digital doubles and completely like asymmetrical to the extent
that each individual character is. So the symmetry in the
tool is actually working only on markers, which means
if you turn the symmetry on, you're moving markers simultaneously. But still, the way we explained how the whole DMT, direct manipulation tool, works-- if I'm selecting and moving a marker at this point, and the symmetry is on, that just means that
the marker on the other side is also moving to
that same space. It's just like selecting that part of the character symmetrically, but it does not mean that the character in the database is not like-- he's
probably asymmetrical. So I'm trying not to
confuse too much. But the point is if you turn
off the symmetry, you can just simply move markers
independently, and get more
asymmetrical characters. But in essence,
with the symmetry on,
the characters will still be, to the extent of the digital
characters in the database, asymmetrical. And for the various modes,
they're basically relatively
self explanatory. So it means when you're turning the symmetry on and choose left-to-right, it will just copy the markers from one side to the other. And the same goes for the other direction, vice versa. And then the average will just do as it says: it will average and find a spot in between. And after that, the symmetry is activated and, in fact, it's the same for all three modes. I hope that clarifies it.
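[Editor's note: purely as an illustration of the three re-symmetrize modes Aleks lists--left-to-right, right-to-left, and average--here is a hypothetical geometric sketch that mirrors marker positions across the face's center plane. The real tool resolves markers against the scan database, so this is only the intuition, not its implementation.]

```cpp
// Hypothetical sketch of the three symmetry modes, assuming each facial marker
// has a 3D position and the X axis crosses the face's center line. Not the
// actual MetaHuman Creator code.
#include "CoreMinimal.h"

enum class ESymmetryMode { LeftToRight, RightToLeft, Average };

// Mirror a point across the X = 0 (center) plane.
static FVector MirrorAcrossCenter(const FVector& Point)
{
    return FVector(-Point.X, Point.Y, Point.Z);
}

static void ApplySymmetry(FVector& LeftMarker, FVector& RightMarker, ESymmetryMode Mode)
{
    switch (Mode)
    {
    case ESymmetryMode::LeftToRight:
        // Copy the left marker onto the right side.
        RightMarker = MirrorAcrossCenter(LeftMarker);
        break;
    case ESymmetryMode::RightToLeft:
        // Copy the right marker onto the left side.
        LeftMarker = MirrorAcrossCenter(RightMarker);
        break;
    case ESymmetryMode::Average:
    {
        // Average the left marker with the mirrored right marker, then mirror back.
        const FVector Averaged = (LeftMarker + MirrorAcrossCenter(RightMarker)) * 0.5f;
        LeftMarker = Averaged;
        RightMarker = MirrorAcrossCenter(Averaged);
        break;
    }
    }
}
```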
VICTOR: I think so. JAMES: Yeah, that makes sense. ALEKSANDAR: If you were confused by anything, we can follow up on the forum later, so no worries. VICTOR: Thank you, Aleks. Next question comes from Ason,
who's asking, are those characters
meant to be used for games? Like a multiplayer shooter? Or are they only for
cinematic purposes? JAMES: Oh no. Definitely. But the intention is that
they can be used in games. We've worked really hard
to make sure that there are LODs for all the clothing,
for all the hair. I mean Chris will know. I've been banging on
about this for months. But we tested the MetaHuman sample,
the original sample we put out,
across all the different devices, on mobile phones and on Switch. That's why we have
eight levels of detail. That's why the hair goes
from strands to cards to baked textures in some places,
or a sort of helmet representation. We really hope
that we're building characters that can
scale all the way up to almost movie quality. And all the way
down to the phone. Now, having said all that,
I don't think we're done yet. I think that the groom,
the strand-based hair is amazing. The team,
Charles and the rest of the team, we've worked on
the strand-based hair. It is phenomenal. And I think everyone is
really excited about using it, but it is still
really expensive. And he has tons of ideas to
make that cheaper and scale better. It does run on like a
PS5 and Xbox Series X now. But we can make that better
and that will get better in time. And I think the same with
the materials on the scale. Right now, we've had to make
a number of cuts in the material when we put it onto mobile. I think we can do
better in the way that we maybe prebake textures. And so that's again,
something we're going to be looking
at in the future. Not just getting it to
work on these platforms-- like it does now,
we've put a lot of work in to
get to this point-- but make it really work well,
and be really shippable,
and really efficient on all these
different platforms. That's going to be
one of the things we know we're working on next. As you download these
characters over the coming months and years, they're going to be
more and more optimized and tightly set up for
all the different platforms. But yeah,
we absolutely intend these to be usable across
lots of different devices. That's been a big
effort from the team. I don't know, Chris,
do you have anything to add to that? CHRIS: Nope. VICTOR: All right. Next question then. It comes from RCA
Film Productions. They're wondering, is MetaHuman
ever going to give you the ability to make a digital representation
of an actual person via a photo? NINA: Again,
another really big bucket. I don't think that's going
to come any time soon. I mean I don't even know
if we will consider that. JAMES: Yeah. We haven't really talked
so much about that. There's just not enough data
in a single photo I think to get-- I mean you could probably
come up with something. But that's not something that
we're planning at the moment. VICTOR: To clarify
a little bit on sort of-- not to go into too much detail,
but could you compare the difference between computing from a photo versus the actual scans that are being used for MetaHuman? JAMES: I mean the process for scanning someone that went into this database is pretty amazing. I've been lucky enough to
go visit the scanner in Serbia where they've built
all this database, and it is pretty phenomenal. But it's a lot of cameras. And a lot of processing
power that goes into it. So the idea is you want a
complete model of the face from all directions. So that you can capture
and you can see the detail on these faces down
to the pore level. So you know those
are reference images. Now, of course,
you could have a much lower res scan, but you still kind of want a
3D representation of the face if you're going to fit to it,
I would guess. But yeah these are
all open questions. But 2D image just
doesn't give you the shape I think to try
and build or fit a face to. But I don't know, Chris. Do you had any
further thoughts on that? Or if you had any thoughts? CHRIS: Yeah. What you said,
it would be much easier to fit to some new mesh
than just some photo. The photo stuff
is a real challenge. You're able to-- it's
one of these things where you can really get
80% of the way super fast. So you can take a front photo,
apply it to a face, and just use the photo texture. And because you have pressed
that photo texture onto the 3D model, if it's not moving,
it might look like, whoa, it looks
just like the photo. But the way that these
different technologies work like texture synthesis
and all of the shader stuff that I talked about. It just means that it would be very,
very hard to generate a
person of the quality that people expect from
MetaHumans and from Epic. So yeah, it's a data. JAMES: How you'd get
the nose shape and the chin and all that stuff from a
single front photo seems hard. VICTOR: Next question comes from Xeon Cat,
who's asking, are you planning on making
more traditional sliders for more direct control of
main features in the face, instead of always
relying on the database? NINA: I guess the question is,
do you struggle to use the markers? I mean like
personally using them, I don't find them too difficult
to kind of move around. There are maybe some
slightly more difficult areas like the eyes can
be a little bit tricky. But I don't think we've spoken
about sliders before, have we? CHRIS: Well we wanted
to get away from sliders. That was the
shopping for body parts. But that would be a slider
that blends in a specific nose. We call it the direct
manipulation tool because it's
already very direct. If you're not-- we're
growing the database, but if you pull in a direction
and there's no feedback, that means that hasn't been
observed in the database. So it's not just
like we're going to make a blend shape on the
fly that lets you pull that way. You are constrained
by the people that have been observed on the back end. VICTOR: Next question
comes from Retro Game Trip. Have you got a full release
target that you're aiming for? NINA: Sort of, but nothing
we can really share right now. VICTOR: Sounds good. Next question from
metagrafaux is wondering, will artists ever be able
to take the character down to a single material with
only one set of textures? JAMES: I suppose that
goes a little bit to what-- you could do that now,
right? We just give you all the base textures and materials. And you could take those
textures and composite them in Photoshop or something,
and then bring them back in and build your own material. So there's nothing to stop
you from doing that right now. In terms of doing that as part of the automatic process, that is something we are looking at. For lower end platforms
and just for efficiency, there are certain
textures right now which never actually change. If you look at the material,
things like the region tints that you can change,
those are still done in the material but we could prebake those. And just have a single
texture rather than a stack that's being processed
in the material. So optimizations like that,
we are working on. And the same-- like
I talked about earlier for mobile,
where we might want to bake some of those layers of
textures down to be more efficient. And because we are limited
by texture samplers on mobile. So I think there will be
some of that going on. But it was really important to
us that the assets that you get are just normal assets. And if there's things you
want to do for your project, the MetaHuman is
just the starting point. And you can take that further. And apply your own clothes
and your own, like Chris said earlier, your own body. And you can mess
around with the textures. And you know
it's really supposed to be this open system
where you can take this as a starting point
and bake things and manipulate
things as you need to. VICTOR: Next question
comes from Timo Helmers, who's wondering, do you have
plans to unlock the constraints-- this is a little bit on the same topic as the previous one. Do you have plans to unlock
the constraints a little bit so we could take it past
the morph targets, or whatever you're using here,
and make characters with more extreme features? JAMES: Nina or Aleks. ALEKSANDAR: Yeah. I was just going to add,
I guess a bit on what Chris was
saying previously. And also connected to this. So the idea is to keep-- at least for this moment-- MetaHuman Creator within a very realistic and plausible barrier. And so this means like,
yeah, we could easily do that and allow
the user to move the markers beyond the point. But then you get the potential
to break the rig and everything else in place. We have been
discussing previously about maybe having
some control where you just can switch off all
the mechanisms that keep you bound. But I think that's also a
discussion for later on. So far we want to
keep it plausible. And I guess one thing that
will be growing in the next year is the database itself. So as it grows,
you will actually be able to hit more
and more shapes. So this means-- I guess what I also wanted to add about the marker is that you can clearly visualize it. One thing that you need to understand is that this is not like a 2D volume that you're moving through. It's a 3D space. So it means, like right here, when you pull the marker to the edge, you can see the border being visualized. Beyond that, there is no data. But you need to keep in mind what I said. This is a 3D space, which means this is just a simplified representation-- a more in-depth representation would be a convex hull. So each point on this marker
has this distinctive shape, which describes where all the
points from the database are. So just keep that in
mind when you're planning to take all the axes into account. Because, like, you're essentially moving through the volume and searching the data with it. But going back to having those sort of boundaries switched off, I don't think for now that would be an option.
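As a rough way to picture the search Aleksandar describes, the sketch below (purely illustrative, not the actual MetaHuman implementation) treats the database as a cloud of observed samples in a 3D embedding and only accepts marker positions that fall inside the convex hull of that cloud, snapping anything outside back to the nearest observed sample. The random data, the dimensionality, and the snapping rule are all assumptions.

```python
# Illustrative only: constrain a "marker" to the convex hull of observed data,
# which is why pulling past the visualized border produces no new shapes.
import numpy as np
from scipy.spatial import Delaunay, cKDTree

rng = np.random.default_rng(0)
observed = rng.normal(size=(500, 3))   # stand-in for scanned database samples
hull = Delaunay(observed)              # triangulation used for the inside test
nearest = cKDTree(observed)            # used to snap back when a drag leaves the data

def constrain_marker(target: np.ndarray) -> np.ndarray:
    """Return target if it lies inside the observed volume, otherwise snap it
    back to the closest observed sample (the 'border' the UI visualizes)."""
    if hull.find_simplex(target) >= 0:  # inside the convex hull: data exists here
        return target
    _, idx = nearest.query(target)      # outside: nothing observed, clamp back
    return observed[idx]

print(constrain_marker(np.array([0.1, 0.0, -0.2])))    # inside: returned unchanged
print(constrain_marker(np.array([10.0, 10.0, 10.0])))  # outside: snapped to a sample
```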
JAMES: And we found when we were testing, during development, that-- we played with a few different options for moving past them. And you can make faces
that look quite interesting, but then as soon as they animate,
they break. I think one of the big challenges is the rig. Because this is not just making a neutral. It is building a whole rig in real time, with all the different movements that come with that. When you start pushing
outside where the data is, you can get some very
odd things with the rig. And we thought that would
just be frustrating to people. That you'd make the face
that you thought was great, and then it breaks down
as soon as it animates. CHRIS: And I wanted
to mention just about-- we've talked about
the database so much. So the database itself,
another one of the reasons why it's in the cloud
is it's not possible for you to create a
person that was observed by the scanner in the database. So every MetaHuman,
all the presets, anything you make-- if you use it in a game or something, you don't need to pay a likeness license, or anything like that. Also, like for GDPR,
for privacy laws and stuff, there's no way to
create one of the inputs in the database
using the outputs. So that's just a
good thing to know for anybody who keeps hearing
us talk about scanning inputs, and people for the database,
and observed noses, and things like that. VICTOR: The next question comes from A13 South,
who's wondering-- and I wanted to talk about this a little bit in general. The question is, can I still use Reallusion characters for Unreal? And I just wanted to
clarify that by no means with us releasing MetaHuman
Creator are we disallowing the use of any other software,
or models, or assets that you import
into Unreal Engine. There's absolutely
zero requirement or plans to make MetaHuman
the only character creator that you are allowed
to use in Unreal Engine. NINA: Sorry to interrupt you, but you can definitely
still use those characters. That's absolutely fine. You can even have them in
the same scene as a MetaHuman if you wanted,
but it might look quite different. VICTOR: The next question
comes from elprofesorwolf, who's wondering,
how long did the development of the MetaHuman
application take? And how many
people were involved? JAMES: Oh wow. That is a hard question. And Chris and Aleks
probably can talk about it. Like he said at the beginning,
this is based on technology and
ideas that 3Lateral have been developing for a long time. So they can probably
talk more about that. ALEKSANDAR: Yeah I do not know,
Chris. The conceptualization
that we had and everything, I think it's like three years
maybe or more than that. CHRIS: Well, you guys have
been working on the pipeline to really-- using all of the knowledge you
have over 10 years of rigging faces. What's the way that we
can rig faces the fastest? And given a new vision
for a character in a game or something,
make that face or get that face and make a rig for it
as fast as possible? But that was really
the secret sauce. That was the core. Rig Logic was
one of the core IPs, one of the core things
of 3Lateral as a company. So the turning point
was talking about, well, going from the best content
creators for animated faces in the world to using
our core secret sauce and giving it away to
everybody in the world for free with MetaHuman Creator,
that was a bit uncomfortable. Because the company,
over the years, has focused on doing this really,
really well. And by releasing MetaHuman
Creator and joining Epic, it really said like,
OK, we're going to focus on making a system
that allows anybody to do this, and really democratizes it. And that was a
couple of years ago. We talked about-- ALEKSANDAR: Maybe
even not two years. JAMES: And in terms
of size of the team, again, it's really hard
to sort of say exactly. We've had a core group
of developers working on this product for a while. But when you start taking into
account all the people working on, like Chris said,
the huge effort in clothing, and all the grooms
we did for the hair. And then the MIDs
for those grooms. But then when you
also think about people working on the engine team, who've worked on the groom technology or who've worked on the animation in-engine. The fact that we can build
they have a fully animated rig. There's really been
a cross team effort. Also we've had people working
on the back end systems. And adding the support
for Quixel Bridge. And you know it's really
been a huge collaboration between different
parts of Epic to do this. So I think it would be
very hard to draw a line and say this is the team. We've been hugely
grateful for the help we've got across the company. But a lot of folks
have worked very hard across a lot of disciplines. It would be a very hard product to make anywhere other than Epic,
just because of that depth of experience. VICTOR: Thank you all. Next question comes from
game_dev_onian, who's wondering-- I apologize if this has
already been answered. But for video and animation production, do we own the IP for video productions made with MetaHumans? NINA: So you'll have to
check the EULA language exactly, but the MetaHuman
assets themselves, Epic owns those. And you have the
right to use those. I think anything on
top of that is then yours. But I'm not a lawyer,
so I can't advise you there. VICTOR: Next question
comes from Autumn Palfenier. Hi, Autumn. I actually know this-- I used to work with this person. When will we be able to
import saved Live Link Face performance CSV files? Or is this potentially something you've thought of? CHRIS: I guess you
mean the CSVs that are saved to the phone if you're
not streaming into the engine and recording. I'm not sure. I believe those were
originally saved to the device to be for debugging. And I think loading
into packages to debug what was going on to load. Sometimes,
I can see you would probably want to record
directly to the phone, because then you're
not beholden to latency. Or anybody who's
using Live Link, you may know that when
the frame rate drops we kind of record data at that rate. So yeah that's a good one that
I can ask to the Live Link face team and get back,
in some fashion, through Victor. Yep, so they'll be recording
data onto the phone. Copying that data
onto your computer and importing it into
the engine in some way. VICTOR: Next user
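In the meantime, if you want to poke at one of those recorded files yourself, here is a speculative sketch. The exact CSV layout Live Link Face writes is not covered here, so this assumes a timecode-style first column followed by one column per blendshape curve, and both the file name and the "JawOpen" column are example names to adjust for your own captures.

```python
# Speculative sketch: peek at a blendshape CSV recorded on the phone before
# trying to bring it into the engine. Column and file names are assumptions.
import csv

with open("MySlate_take01.csv", newline="") as f:  # assumed file name
    reader = csv.DictReader(f)
    fieldnames = reader.fieldnames or []
    frames = list(reader)

print(f"{len(frames)} frames, {len(fieldnames) - 1} curve columns")

# Sanity-check one assumed curve column to see whether the capture has motion.
if "JawOpen" in fieldnames:
    jaw = [float(row["JawOpen"]) for row in frames]
    print("JawOpen range:", min(jaw), "to", max(jaw))
```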
VICTOR: The next user has a username that I won't read out in full. Because it's 0xb16f15h. OK. That wasn't too difficult. They're just asking, have you noticed, in terms of performance, what is the recommendation for the best browser, or best practices to use, while working with MetaHuman Creator? NINA: So I mean, performance-wise in the browser, I think we've
seen pretty good results across the board. I haven't seen too many
people struggling with it. I know in certain regions,
like depending on where you are in the world,
you might have a few
connectivity problems. But sadly,
that is just dependent on where you are in the world. In terms of best browser,
we recommend Chrome on Windows, but we do support
a few others as well. VICTOR: Check one. All right. Sorry, my computer just
went slow for a moment. Can you hear me? OK. Cool. I'll take that as a yes. KOHAKO is wondering,
are there any terms except the EULA that have to be applied? We have a game for PlayStation 4, can we just start using it? NINA: I'm not sure I
understand the question. VICTOR: I believe
they're curious if they're allowed to start using
MetaHumans in production. NINA: Yes, yes. I think there-- As long as you're
within the terms of the EULA, you can use them
in production for sure. JAMES: [INAUDIBLE] because we got to make this work
for that kind of thing. VICTOR: Forums. Good place for feedback. Sidian is wondering, could you
show us or write documentation on how to properly export
MetaHuman animation made in Maya back to Unreal Engine? JAMES: Chris, is that
something you prepared earlier? CHRIS: Yes,
there is documentation. We can post that through Victor. But basically, it really depends
whether it's body or face. So you may have
noticed in the engine that there's two
different skeletal meshes. If you're animating
the face in Maya, there is a selection
set in Maya. And actually,
I've got it up right now. I was going to be
doing a little demo, but we've had such a
good time watching Aleks. Do you guys want
to pop over and I can show what I'm talking about? VICTOR: Yeah. Let's go. Give us just a sec here. You're good to go. CHRIS: So this is,
I think, Hudson. And this is just the
source file that you'll get. Here, you have body joints. Whoops, that's just my Maya. So there's body joints
and there's face controls. If you select the controls
in the facial control selection set,
that is going to select all of the facial controls. My callback is
going to pop up here. But if I select all of
these controls here, it's selecting all of
the facial controls. And you know James had
mentioned the awesome work that the team has been
doing on making animating in the engine easier and better. The MetaHumans are one of the first characters where you export your sparse keyframe data on the animation controls-- the same controls you use as an animator. You save those out as an FBX,
and then you import those into Sequencer in the engine. You do that by right
clicking your rig in the engine and saying Import Animation. And that's in Sequencer. So it actually does not use the traditional FBX import options dialog, and things like that. This is really importing
rig animation directly into Sequencer. And it supports sparse
keyframes like you had in Maya. For the body, you would
go into this other selection set that I pointed to earlier. And this has all
of the export joints. Because there are
multiple skeletons-- and I've hidden this for
now and I've turned off joints. So there's multiple skeletons. There's a driving skeleton
and there's an export skeleton. So by selecting these joints,
these are all of the joints that
are in the export skeleton. And when you export
those through to the engine, that's how you will get the
body animation to come in. And once those
animations are in, you can plop the body
animation directly into Sequencer and import your facial
animation onto that. And I hope that kind
of covers the question. But that's why those two
selection sets are there to make it easier for you. If you are exporting with FBX,
kind of a handy little thing is if you scroll down to the bottom of the export options and turn off inputs. If you turn off exporting inputs, it won't traverse the entire DAG, or Maya graph. It will just export the animated controls that you want. If you leave inputs on, it will try to pull most of your Maya file into that FBX, and it just wastes everybody's time. It just wastes your time when you go to export.
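For reference, here is a hedged Maya Python sketch of the face-export step Chris walks through. The selection-set name and output path are assumptions (check the names in your own MetaHuman source file), and the FBX flags shown are simply the ones that match that description-- export only the selected animation controls and leave input connections off-- not a confirmed recipe.

```python
# Hedged sketch of exporting sparse facial control keys from Maya to FBX.
# "FacialControls" and the output path are assumed names for illustration.
import maya.cmds as cmds
import maya.mel as mel

# Selecting a set expands to its members by default, so this grabs all of the
# facial animation controls in the assumed selection set.
cmds.select(clear=True)
cmds.select("FacialControls")

mel.eval("FBXResetExport")
mel.eval("FBXExportInputConnections -v false")      # don't drag the whole DAG along
mel.eval("FBXExportBakeComplexAnimation -v false")  # keep the keys sparse, as authored

# Export only the selection; in the engine, right-click the rig in Sequencer
# and choose Import Animation to bring this FBX onto the control rig.
mel.eval('FBXExport -f "C:/exports/face_anim.fbx" -s')
```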
JAMES: Thank you, Chris. VICTOR: Thank you. Chris, was that it in terms of what you were planning to show from your end? CHRIS: I can show
some stuff in UE4. So I've got that
character here in UE4. Just let me know if
we're back to my screen. VICTOR: Yeah, you're good. CHRIS: So [INAUDIBLE] VICTOR: DMCA strike
hammer comes right now. CHRIS: Sorry. So as you can see,
here we've got the face and the body. If you drag a character into
Sequencer, the rigs pop in. So this is another thing that
the Control Rig team and the animation-in-engine team added, that's really awesome: the ability to-- when you just drag your character into Sequencer, the skeletal mesh now knows, like,
hey, what is my rig. And it automatically
adds the proper rig for you. Another thing that
is not so well known is that if you go into
the content browser. And you look under MetaHumans,
Common, and Utilities, we merge so many different
file structures together. That's why there's
two Common folders. But there's a
MetaHuman picker here. And our animators
really enjoy using the picker. And this is kind of a picker where you select which
MetaHuman you want to use, because you can have
multiple in the scene. And then you can come in
and use that as you would a normal picker. It's also a really,
really awesome example of the power of
Editor Utility widgets. Jared Monsen on our team
was able to make this picker using Editor Utility widgets. And I think the
Sequencer scripting library, and it just really
shows how you can make your own really
awesome UIs using utility widgets in the engine. And this picker
allows you to make large selections of entire
hands and things, flip IK to FK. A lot of the different
rigging features that Rahim Tullen had created. She created the MetaHuman
rig that you see here in the engine. Yep. VICTOR: Thank you, Chris. We're getting close to the
end of the stream here. And so I wanted to let everyone
know that your questions have been awesome. We really appreciate them. If you have more
and you would like to continue the conversation,
head over to the forum announcement
post for this livestream, as well as the MetaHuman
category on forums dot unrealengine dot com. I'll do a quick glance
before I completely get away from my question notes here. A lot of duplicate questions. I should say I've
seen a lot of questions that came in at the
end of the stream that we've covered earlier on. So make sure you go
ahead and watch the VOD. As soon as we go offline,
it will be available on both
YouTube and Twitch. With that, I should also
mention that a transcript will always be made available. But it is usually
available within a week of the livestream. We do a manual one. So in case you're looking
for any terminology, et cetera, you can go ahead and
download that transcript, Control F, look for it, it has a timestamp. Great way to go
back and study the live streams that we do here. Is there anything else from
Aleksandar, Chris, James, Nina, that you would like to cover? JAMES: I don't think so. There's some great questions. I think that it's
given us a chance to talk about a lot of stuff. NINA: I think we've
covered quite a lot today. I think the main thing
is beyond questions, if you do want to see more
content around clothing, and grooms, and stuff,
let us know what kinds of things you want to see. We would love to
see what you want, and we can try and
plan around that. JAMES: But for so
much of this project, Nina and all of us have
been in these conversations, like well maybe
people want this, and maybe people want that. It's very hard. Because,
like I said earlier, this didn't really exist anywhere before. And so there's going to be
people using this that we just haven't even thought of. So it's great to move past
the point of like, well, maybe. And into where we can
actually hear from people. So we really want
to hear from people. NINA: And we obviously
want to add tons more to the tooling,
and we will do it over time. So I think obviously we've
got a very limited set of clothing right now and some
of the bits as well. But as we increase that,
obviously we'll try and keep our
range in mind as well. So we're hitting a lot of
different marks there as well. VICTOR: I would
like to thank you all for coming on the stream,
but I'm not going to let you go just yet. I do want to let
everyone out there know that if you are interested
in game development, video production, or architectural visualizations, Unreal Engine
is the tool for you. And you can go to
unrealengine.com to download it for free. If you already have the
Epic Games Launcher, it is available in the
Unreal Engine tab. You can go ahead and go get 4.26.2,
which is the latest version of Unreal. It's right there. If you are more familiar
with the use of GitHub, the engine releases,
as well as the master branch, are available on GitHub. You can go there and
download the source and compile it yourself. You'll see some of
the latest and greatest, and potentially unstable
versions of the engine in the master branch. But if you go to the branch selector and pick release, that's where you
get the same version that we are shipping as a binary
version on the Unreal Engine Launcher. I already mentioned
the transcript. It's a good way to catch up and
learn some of the terminology. In about a week,
you can just turn on the captions on YouTube. That's the place
you go for those. Make sure you let us know
how we did today on the stream. We're going to go ahead
and paste a survey in chat. Let us know what
you think of the stream. And what future topics you
would like to see us cover, whether that's more
MetaHumans or something else. We also do have the entire
playlist of Inside Unreal episodes available on YouTube. You can also find it
in all of the future-- some of the past,
and all of the future, forum announcement
posts on our forums. There are no physical
meet ups going on around the world right now
because of the pandemic. But some of the
community groups are hosting virtual meet
ups using Discord and other various tools. If you're curious about getting
in touch with more people that are using Unreal Engine,
go ahead and head over to communities dot
unrealengine dot com. If there's no local
meetup group in your area and you're interested in becoming an organizer, there's also a
little button that lets you send a form to
us that we will go over. So if that's something
that you're interested in, feel free to go ahead
and get in touch with us. If you're interested in
information or any other updates when it comes to MetaHuman,
most of that will be released on
our forums as well as our Twitter accounts. We also have a Facebook,
LinkedIn. There's also an unofficial
Discord community known as Unreal Slackers. It is great for real
time conversations. They have channels
for everything. It's past 50,000 members. Some of our most active
community contributors are there. It's a great place to hang out. Just go to
unrealslackers dot org. There's also a lounge
channel if you just want to talk about, you know, everything
not Unreal with people who like Unreal. That's the situation I find
myself in a lot of times. I like hanging out
with game developers. Who would have thought? We do highlight community
spotlights every week. If you want to suggest
your project for one of those, you can get in touch with
us through any of the channels I previously mentioned. You can also go ahead
and send an email to community@unrealengine
dot com. We monitor that email closely. We have a countdown
video that we do every week. This is generally 30
minutes of development that is sped up to five minutes. But we've seen some really
good contributions lately that completely went
away from that format. So just go ahead
and submit anything that's five minutes long. And we will take a
look and potentially you might be one of our countdowns
that we put in the loop. If you stream on Twitch,
make sure that you add the Unreal Engine tag,
as well as the game development tag. Maybe not if you're
doing something in terms of virtual production,
but the Unreal Engine tag is important. And that way other
creators who are interested in like content
using Unreal Engine can go ahead and find you. So make sure you follow
the social media for all news. Hit that notification
bell on YouTube if you want to see when we go live,
as well as all the other non live videos
that we release to YouTube. There's some cool stuff being
prepared by the Evangelists. They've been doing a
lot more video content during the pandemic. And so it's exciting to see. Next week,
I remember what we have next week, because I will host the show. Next week, Alexander
Paschall is coming on stream, and he's going to
cover a little bit more in depth of some
of the learning-- one of the learning videos
he did on googly eyes. He hasn't responded to
me in the last couple of days. So I really hope
he's still on board. Because I'm excited to
get him back on the stream. It's been a while. And I'm sure all of you
who know who he is would be excited to
have him back as well. With that said, I would like
to thank Aleksandar, Chris, James, and Nina,
for taking time out of their day to come
here to talk a little bit about MetaHuman Creator. Chat,
please give it up for our guests today and wish them well. Or give them whatever
you want to write in chat. It doesn't really matter. I know some people say
give ones in chat, give twos. I don't care. Whatever you want to say,
let them know that you appreciate them
coming on the stream today. With that said,
is there anything else y'all want to cover? NINA: No. Thank you for having us. JAMES: Yeah. Thank you very much. It's been great. ALEKSANDAR: Thank you. Thank you. JAMES: Thanks, Aleks for
doing such awesome art where we can all-- NINA: Thanks. I was enjoying that. ALEKSANDAR: I think it was less,
I guess, verbal at that point. But I think you guys can shoot
any questions that you feel like it would be good to cover. And I think we could
arrange like a video where I can record and
give you more in-depth info. I find it a little bit
difficult to talk and work at the same time. But I think we can
do something if you feel that you need any
more info on the tool itself. VICTOR: And as always,
whenever there are plug-ins or new features in Unreal Engine that come out in early access, since we are still actively developing them, it is relatively rare that we produce sort of in-depth documentation, tutorials, or learning content on
those tools specifically. So do expect more
of that in the future, alongside the MetaHuman
Creator maturing and becoming more of a full-fledged
tool that the developers are planning for it to be. With that said, I want to thank
everyone for watching today. I hope you stay safe out there. And we will see you again
next week at the same time. Take care, everyone.