JEREMIAH GRANT: Hey, everyone. Welcome to Evolving Animation in Unreal Engine. I'm Jeremiah Grant, the product manager for animation in Unreal Engine at Epic Games. A quick background on me-- I come from a technical
animation background. I spent the first 15
years or so of my career rigging characters and
developing pipelines in both games and film. And in this role, I've
had the great privilege of working closely with
many animators, riggers, game designers, and other
disciplines dedicated to creating engaging
character performances. We really strive
to create something that goes beyond the screen,
something compelling, an immersive experience. But creating characters is
difficult and technical. That technical work often pulls you out of the artistic process. And you spend more time focused
on managing export settings or juggling expensive
applications or creating monolithic
rigs that never quite do what the specific job requires. It's often a process
of compromise. So what does the Unreal
Engine do to improve this? Well, we're going to talk
about three different ways to elevate the animation
process using Unreal. First, by animating directly
in context of the environment you're building, then by
using layered rig systems for more sustainable
and modular development, and finally by using
procedural animation to create more dynamic behavior. So first, what is
animating in context? Well, Unreal enables
us to work directly in context, animating in the
environments alongside all the other disciplines,
working together in parallel towards a
common artistic vision. Traditionally, everyone works
in a bunch of separate silos hoping that, later
on down the project, all those pieces come
together cohesively. I'll be focusing on
animation and rigging, but the same concept applies
to all disciplines involved. By leveraging Unreal's in-engine
animation and rigging tools, we're able to sculpt incredible
performances without leaving our creative environment. We're going to
take a look at how Sequencer, Unreal's non-linear
animation tool, and Control Rig, the in-engine
rigging system, can be used to change the way
we animate, rig, and build our characters. When combining these tools with Python and Blueprint scripting capabilities, we're able to create robust pipelines customized specifically to our unique production needs.
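For example, a minimal sketch of that kind of pipeline glue in Python, assuming hypothetical file paths and an existing skeleton asset, might look like this:

import unreal

# Batch-import FBX animation takes onto an existing skeleton, so nobody
# has to click through the import dialog per file. All paths are hypothetical.
skeleton = unreal.load_asset("/Game/Characters/Ancient/SK_Ancient_Skeleton")

options = unreal.FbxImportUI()
options.import_mesh = False
options.import_animations = True
options.skeleton = skeleton
options.mesh_type_to_import = unreal.FBXImportType.FBXIT_ANIMATION

tasks = []
for fbx in ["D:/Mocap/take_01.fbx", "D:/Mocap/take_02.fbx"]:  # hypothetical files
    task = unreal.AssetImportTask()
    task.filename = fbx
    task.destination_path = "/Game/Animations/Mocap"
    task.automated = True  # suppress the interactive import dialog
    task.save = True
    task.options = options
    tasks.append(task)

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks(tasks)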
So let's take a look at this in action in the engine. OK. I'm going to open
up UE5 Early Access. And we're going to take
a look at these concepts through the recently
released "Valley of the Ancient" project. First, we see we have
our environment here. This is the environment
that we saw in the video we showed for the UE5 Early Access reveal. And I'm going to open up the intro level sequence in Sequencer.
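If you're scripting that step, the same thing can be done from Python. A tiny sketch, assuming a hypothetical asset path for the intro sequence:

import unreal

# Open a level sequence in the Sequencer editor; the asset path is hypothetical.
sequence = unreal.load_asset("/Game/Cinematics/Seq_Intro")
unreal.LevelSequenceEditorBlueprintLibrary.open_level_sequence(sequence)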
If I scroll through Sequencer, you'll find BP Ancient One and
the Control Rig itself. Let's just click on a
control and see what happens. All right. Immediately, our editor
changed a little bit. We switched to the animation editing mode, which brought up this tab here. We have a couple of tools
available that we'll talk about in a sec. And the controls popped
up in the viewport. So this is Control Rig. Control Rig gives us the ability to create animation controls on top of the skeleton and animate them here in Sequencer. And Sequencer here is
down at the bottom. You can see our timeline. And we have
keyframes shown here. If I select a control
in the viewport, I can manipulate it directly
and adjust my animation. Let's go ahead and play a
little bit of this animation and see what's going to happen. All right. Let's just pause it there. So as I play through this
timeline in Sequencer, we see not only the animation of
the robot but also the effects, the lighting effects, and
the audio being triggered. By changing our camera
to the cinematic camera, we can get the full effect. Let's keep going. So this represents all of our
disciplines working together to create this incredible shot. As we designed this
moment, we were able to really explore the terrain and the character as we were introducing the robot and craft every element of this performance. This would have been
very difficult to achieve in such a short amount of
time without everyone working in context, especially with
everyone working from home, scattered across the globe. Traditionally, we would
have had to pass around versions of the geometry and
iterations of the animation back and forth through
different programs, praying that everyone was
referencing the correct files. Any change to the ground
or animation or the effects or other parts of the
scene could potentially cause cascading effects
through the other teams, resulting in both lost time and
a lot of frustrating rework. Originally designed as a
cinematic animation tool, Sequencer has
expanded with a couple of features that allow
us to easily create gameplay animation as well. When transitioning from
the reveal of the robot cinematic to the gameplay
portion of the battle sequence, we're switching from a level sequence being played through Sequencer to animation logic driven by state machines in Animation Blueprints. But the workflow changes very
little for the animator. So here, I have a scene up. Let's go ahead and
look at the contents and see what's happening. So these are actually a
bunch of level sequences that can be played and
animated in Sequencer. But every time I save
this, it automatically breaks out an animation
sequence, which is just skeletal animation. This allows me to animate
directly in Sequencer and automatically bake
out any animation sequence every time a level sequence is
saved, essentially replacing the process of exporting and
importing from external DCCs. And that allows the
iteration process to happen all in one place. So let's go ahead
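The same bake can also be triggered on demand from Python. A hedged sketch, treating the SequencerTools signature as an assumption to verify against your engine version, with hypothetical asset paths and binding name:

import unreal

# Bake one binding of a level sequence down to a skeletal animation sequence.
# export_anim_sequence and its argument order are assumptions to verify
# against your build; the paths and the binding name are hypothetical.
world = unreal.EditorLevelLibrary.get_editor_world()
sequence = unreal.load_asset("/Game/Cinematics/Seq_AncientBattle")
anim_sequence = unreal.load_asset("/Game/Animations/AS_AncientBattle_Baked")
binding = sequence.find_binding_by_name("BP_Ancient_One")

export_options = unreal.AnimSeqExportOption()
unreal.SequencerTools.export_anim_sequence(
    world, sequence, anim_sequence, export_options, binding, True)  # True keeps them linked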
So let's go ahead and give that a shot. I'm going to go to frame 45. And I'll select this control. Let's go ahead
and move this-- yeah, let's go ahead and
move his hand way up. There we go. So now, that's what our
animation looks like. I wouldn't recommend
shipping that, but that'll demonstrate
the process. So if I save this now, and I go
ahead and look at my animation sequence, his hand now
pops up at frame 45 and drops back down, just like in the level sequence. So this baked anim
sequence that can be used through the
runtime animation system is automatically
generated for you. You don't need to manage
exporting and importing that animation. Beyond just the
ability to animate in engine and in context, artists also need to be supported with
tools and workflow enhancements to be as effective
as they can be. In UE5 Early Access,
we've introduced the first of these tools
here in the Animation Mode in the level editor. We have a couple here-- pose library, a tween
tool, and a snapper tool. Let's take a look at
the pose library first. So here I have a couple of
poses I've already created. Let's create a
new pose just so we can demonstrate the whole process. I'm going to take a
couple of these controls. In fact, let's go ahead
and select them here. I'm just going to select
all the finger controls. And we'll find a good pose. Here we go. Got his finger kind
of pointing out. So I'm going to press
the Create Pose button. And we'll call this
pointing to the right. All right. And I now have a new
pose available for me. That pose is a little hard
to see in this thumbnail, so I'm going to scroll down
and hide my manipulators, so this checkbox here,
and try and frame up a better shot for this. Something like that will work. And I can just press
this capture thumbnail. And it regenerates
the thumbnail. There we go. Now that I have
that pose available, I can apply it anywhere
else in my scene. Let's go ahead and go to
the end of this animation and apply this new pose. I can do that by
selecting this pose and either pressing
the Select button here, which will select the controls
I want to apply it to, or I can right click on this
pose and select the controls. Now, I don't have to apply
this pose to the entire thing. Let's do that first. I just clicked the
Paste Pose button. And it applied that
pose to my fingers. Now, maybe I
immediately regret that, and I want to apply it just to the fingers other than the thumb. I'll just deselect the thumb controls here and press Paste Pose again. And it'll just paste to
everything but the thumb. Additionally, rather
than just pasting to the controls that
were initially saved, I can mirror this pose. So I'm going to press the Mirror
Pose button and click Select. It will automatically find and select the mirrored versions of these finger controls based off of our mirror settings here. Let's go ahead and
paste the pose here. There we go. And now we have the pose on
both sides of our character. In addition to the
pose library, we also have a tweening tool and a Control Rig snapper tool. These let you either pin your controls in place over a span of time, or adjust how the controls are animated based off the previous and next poses set in your timeline. As with the rest of
Control Rig and Sequencer, the pose library and animation tools are exposed to Python and Blueprints, allowing you to write custom tools specific to your workflow.
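As a sketch of what that exposure makes possible, a small pose-paste helper might take roughly this shape; the pose asset methods and the rig lookup below are assumptions for illustration, not a confirmed API:

import unreal

# Rough sketch of a custom pose tool. select_controls and paste_pose are
# assumptions to verify against your engine version; the asset path is
# hypothetical, and the rig lookup is left as a stub.
def get_active_control_rig():
    # Hypothetical helper: in practice you would query the open level
    # sequence for its Control Rig track.
    raise NotImplementedError

pose = unreal.load_asset("/Game/Poses/CRP_PointingToTheRight")
rig = get_active_control_rig()
pose.select_controls(rig)  # select the controls the pose was saved from
pose.paste_pose(rig)       # apply the stored transforms to those controls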
All right, with that, let's jump over into the Control Rig editor and see what
goes on in making this rig for our giant robot. All right, here we have
the Control Rig editor. Control Rig is a node-based rigging system where you can create
relationships between controls and bones or curves. In this case, we are getting
the transform of a control and setting the
transform of a bone. Now, for the Ancient, we used the MetaHuman skeleton as a base to start from, adding
on additional bones for armor, pistons, mechanics,
and other elements we wanted to be able to drive. By taking advantage of
Control Rig's setup mode, we were able to immediately adapt our rig design from our MetaHuman rig
to the proportions of our massive robot. Now, here's our setup event. And what we're doing is stepping
through all of our hierarchy. We're just getting the initial
transform of each bone, and setting the transform
of the controls. So this is actually placing
the control locations to align themselves
with the bone. So before any
animation takes place, the controls are in
the right location, and they're able
to drive the bones from their initial position. So if you're importing
a skeleton from an external source and you want to snap
all the controls to the correct locations,
you can use a setup event to do that. You can also use the setup
event to initialize variables or settings for your rig. In this case, I am just creating collections of items-- for instance, all of the finger bones and all the finger controls-- here in the setup event, so I can call those collections later on throughout the rig.
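In Python-flavored pseudocode, with illustrative names, the setup event boils down to something like this; the real logic is authored as Control Rig nodes, not script:

# Pseudocode of the setup event; the names here are illustrative.
def setup_event(hierarchy):
    for bone in hierarchy.get_bones():
        control = hierarchy.find_control_for(bone)  # illustrative lookup
        if control:
            # Snap each control to its bone's initial transform so the
            # rig starts aligned before any animation is evaluated.
            control.set_initial_transform(bone.get_initial_transform())

    # Build collections once so the rest of the rig can reuse them.
    finger_bones = [b for b in hierarchy.get_bones() if "finger" in b.name.lower()]
    finger_controls = [hierarchy.find_control_for(b) for b in finger_bones]
    return finger_bones, finger_controls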
Looking at the rest of the editor, we get a sense for how we
end up creating this rig. We have a rig
hierarchy here, which is a hierarchy of
controls of various types. We have a skeleton
hierarchy here. This skeleton was
actually imported in as our base skeletal mesh. So this sets the
geometry and the bones. We are able to import
in a skeletal mesh just through this right
click menu here. And I can create
a new control just by right clicking and
saying, New Control. There we go. So now I have a new control, and
I can look at the details here. By default, it's creating a
control type of Transform. But you can have controls
of float, integer, vectors, rotators, bools, any
number of types here. And all of these control types
can be animated in Sequencer. And in the case of
this rig, we actually have a bunch of
boolean switches that can be animated to turn your IK and FK on and off, toggling between those modes. And you also have a control here
for changing the body controls. I can toggle this right now
and see the results there. Let's go ahead and change
the IK and FK for the arm. Let's go ahead and clip
that a little bit more. So here I'm just going
to change my arm. And we'll go back
to my right arm. IK, FK. And as I toggle this,
it's automatically snapping all the IK
controls to match the pose of the FK control. All of that is happening
directly here in Control Rig. Now, as I zoom
out, you get a kind of full glimpse of this rig. There's a lot
going on, including a backwards solve graph,
a forward solve graph, and our setup graph. Now, Control Rig also enables
us to create a backward solve graph, allowing
controls to be driven by the bones themselves. This essentially is
baking skeletal animation to the controls. This is super helpful
in MOCAP scenarios where you record
Motion Capture, and you want to bring that
animation into the engine, and continue iterating on
it with the animation rig. Now, I can preview that here
in the Control Rig Editor by changing my mode
to backward solve. We see that this line
is now highlighted, and this is the event
that's being run. If I go into my preview
scenes and change my preview controller to use
specific animation, I can preview an
animation real quick and see the controls animated
along with my skeletal mesh. If I pause this and
I select a control, I won't be able to
move this control, because it's actually the
bone transform being evaluated in this graph to
set the transform of the control itself. Now, let's go back
into the level editor and take a look at how
we can use this backward solve functionality in Sequencer
to iterate on our information, whether it's MOCAP, or importing
animation using Control Rig. So here I've actually
just created a new level with nothing in it. So we'll be able to
start from scratch. I'm going to go to
the content drawer and go in to find our
BP Ancient One actor, and drag this into our level. Just going to frame it. And I'm going to
press G so I'm not seeing all the gameplay
and stuff going on. Let's create a new
level sequence. That's the perfect location. And let's drag in our
blueprint in here. I'm going to press
the plus on the track and add in the animation. Let's go ahead and do
the charge animation. Now, in this case,
this animation was created in engine. But if it had been
recorded elsewhere, this is how we
would iterate on it. There we go. So I just adjusted
the frame range so it's only for the
duration of this animation. And as I play
through it, we just have the animation playing. There we go. Now, this is baked animation. Again, this is the actual
animation sequence generated from our linked level sequence. But in this case, since it's
just baked skeletal animation data, I can right
click on the actor and say, Bake to Control Rig. It will automatically
find Control Rigs that have a backward solve graph using this skeleton. I can change some of the
settings for where to search and what to look for here. Now, I'm going to choose our
one available Control Rig here. It gives me some quick options. Do I want to reduce the keys? And what tolerance do I want
to use for reducing those keys? I'm just going to turn that off so we have a one-to-one solve. And then as I scrub
through my animation here, I now have my
controls animating. And I can select these controls
and see the animation here. You can still see the original
baked animation sequence here. And all we did is mute that sequence so it is no longer active.
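That right-click action can be scripted as well. A heavily hedged sketch, treating the bake call and its argument list as assumptions to verify against your build, with hypothetical paths and names:

import unreal

# Bake a skeletal animation track onto a Control Rig with a backward solve.
# bake_to_control_rig and its arguments are assumptions to verify against
# your engine version; the asset paths and binding name are hypothetical.
world = unreal.EditorLevelLibrary.get_editor_world()
sequence = unreal.load_asset("/Game/Cinematics/Seq_Charge")
rig_blueprint = unreal.load_asset("/Game/Rigs/CR_Ancient")
binding = sequence.find_binding_by_name("BP_Ancient_One")

export_options = unreal.AnimSeqExportOption()
unreal.ControlRigSequencerLibrary.bake_to_control_rig(
    world, sequence, rig_blueprint.get_control_rig_class(), export_options,
    False,   # reduce keys: off, for a one-to-one solve
    0.001,   # tolerance (unused when reduction is off)
    binding)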
If you ever need to get back to your original animation and not use your Control Rig animation,
selecting your control rig track, deleting it,
and right click, and activate your
animation again. And now you're back to your
original imported, baked skeletal animation data here. Now, let's pretend that
we don't have a Control Rig with that backward solve graph available for us. We can still edit our
animation here in Sequencer. We can do that by right
clicking on our actor and going to Edit with FK Control Rig. We get a dialog similar to Bake to Control Rig. But this, instead,
is going to bake to an automatically
generated control rig, which you can see here. It's going to have a control
representing every bone, essentially giving you a direct per-bone FK rig to manipulate. You may not want to manipulate
all of those things. So you can always right click
and toggle which controls and what tracks you want
available here in Sequencer. Control Rig is the
underlying framework that is enabling us to evolve
the way we rig and animate. It allows us to
feed in information about our environment,
enabling us to offer dynamic and
responsive performances. Traditionally, rigging systems
were built as modules and baked into a more
monolithic structure, before being handed
over to the animator. This means that any
change to the rig can potentially break
all the other animations. Additionally, the
animator and rigger are siloed from each other,
not adhering to our working-in-context theme. The Unreal Engine
gives us the ability to build rigs in layers. By building in layers,
we're able to add levels of complexity at
any point, working in small, bite-sized pieces. One way to do that in Unreal is
by using post-process graphs. So let's go back into the engine
and see what that looks like. So this is a post-process
animation blueprint. The only difference between this Animation Blueprint and a more commonly used anim BP is where it's being evaluated. The graph will run after
all other animation, state machines, gameplay logic,
or even sequencer animation has been processed. In fact, we can already
see this in action if we go back to
the level editor. Notice how the gears are
automatically spinning here. And if I grab this hand control, the pistons are going to
automatically compensate for the arm movement. All of that is being taken
care of by separating the rigs and using post-process graphs. And by doing this, the animator
can start creating performances faster. And the rigger can build out the rig's functionality as independent pieces.
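And because the post-process graph is assigned on the skeletal mesh asset itself, wiring it up is a short script. A small sketch with hypothetical asset paths:

import unreal

# Assign a post-process Animation Blueprint on the skeletal mesh asset so
# every instance of that mesh runs it after its main animation. The asset
# paths are hypothetical.
mesh = unreal.load_asset("/Game/Characters/Ancient/SK_Ancient")
post_bp_class = unreal.EditorAssetLibrary.load_blueprint_class(
    "/Game/Characters/Ancient/ABP_Ancient_PostProcess")

mesh.set_editor_property("post_process_anim_blueprint", post_bp_class)
unreal.EditorAssetLibrary.save_loaded_asset(mesh)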
Now, going back to the post-process graph, we can dig a little deeper
into what's happening. First, we're grabbing
the incoming pose and storing it as a cache. This is going to be
our Sequencer animation or our gameplay animation. This step isn't essential,
but in this case, it allows us to be more
specific about how we're applying our additional layers. Then we're feeding that input
cache into our Control Rig and into a rigid body node. We're able to blend
those two together using this Layered Blend Per Bone node before outputting our pose. Due to the way we
animated the robot, we wanted to be able to control
when the physics were enabled on the front skirt armor. By breaking out the rigid body
physics of that front armor, we're able to be very
specific about when, where, and how much was being applied. We're blending that
in with this Physics Enabled float value here, into the Layered Blend Per Bone node. The Control Rig also takes a
parameter called gear weights. When the robot is
destroyed, we want the gears to stop spinning. So we feed in a gear weight
value for the Control Rig to read. This value is being set in the event graph and is fed in by the BP Ancient One actor Blueprint here. This allows us to drive our animation rig directly from gameplay, something we'll talk about a little bit more later.
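Sketched as Python-flavored pseudocode, since the real logic lives in the actor and Anim Blueprint event graphs, the data flow looks roughly like this, with illustrative names:

# Pseudocode of the gear weight data flow; all names are illustrative.
class AncientActor:
    def on_destroyed(self, anim_instance):
        # Gameplay decides the gears should wind down...
        anim_instance.gear_weight = 0.0

class AncientPostProcessAnimInstance:
    def __init__(self):
        self.gear_weight = 1.0  # gears spin at full speed by default

    def update(self, control_rig_node):
        # ...and every frame the event graph forwards the value into the
        # Control Rig node's exposed gear weights parameter.
        control_rig_node.set_parameter("GearWeights", self.gear_weight)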
Drilling into the Control Rig here, we find the mechanics setup. We're controlling pistons,
gears, armor, and other things we don't want the animator
to have to worry about. Since we have fine control
over every element of this rig, we can apply additive
or absolute adjustments to any part of the character,
responding directly to the context of the
performance and gameplay needs. Another benefit of
working in this way is we're able to create
animation that automatically blends with other incoming animation, essentially removing part
of the baking process. This also enables
our character to be more dynamic and responsive. For example, if the spinning
gears were baked animation, then they would pop
depending on when we interrupt the idle animation
to transition to a fire animation. By using post-process
graphs and Control Rig, this isn't a problem. This brings me to using
procedural animation as a tool. In gameplay scenarios
we're used to adding in layers of physics simulation, IK
compensation for hip placement, and many more procedural
systems to make our content come alive and
feel immersed in the world we've built. In Unreal,
we're able to accomplish these things without requiring
you to write any code, and in a way that can
be shared and applied to any character on screen. Control Rig is unique in
that it is always live and running, allowing
you to take advantage of the rig evaluation to
create procedural and dynamic behaviors. In the Valley of
the Ancient project, we use full body IK to layer
on a full posture adjustment, depending on where the player
is in the battle arena. The position of the
player and the multiplier for how far we want
the robot to reach is passed into the Control
Rig, allowing us to drive this behavior during gameplay. By using a procedural
system to achieve this, we don't need to author
unique aim offsets for all conceivable positions
the player may be in. And the ancient feels like a
grounded and integrated enemy to encounter. So where do we go from here? Just because the experiences
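As one last sketch, again in Python-flavored pseudocode with illustrative names, driving that behavior amounts to writing two values into the rig every frame:

# Pseudocode of driving the full body IK layer from gameplay; the accessor
# and parameter names are illustrative, not the shipping project's API.
def tick(ancient, player, reach_multiplier):
    rig = ancient.control_rig  # illustrative accessor for the running rig
    # The full body IK layer reads these to lean and reach toward the player.
    rig.set_parameter("PlayerLocation", player.get_actor_location())
    rig.set_parameter("ReachMultiplier", reach_multiplier)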
So where do we go from here? Just because the experiences we create are complicated doesn't mean the process
has to be complex. By working in context, building
up small layers of complexity, and moving away from
baked, siloed workflows, we can spend more time creating and less time working. In short, you can tell
your stories faster. Thank you for watching. You can contact me on
Twitter @Jeremiah3D. I look forward to interacting
with you all and hearing your thoughts on how we can
continue evolving the animation process.