that are now more possible than ever in Unity
thanks to the Animation Rigging Package. Today let’s start learning all about animation
rigging, the animation rigging package and how we can use it to add procedural animation
to our games! As always, a huge shout out to the Patrons for helping to keep the lights
on. And now, let’s get started! A classic animation pipeline for Unity game
development starts with modeling and texturing a character based on concept art and turnarounds
using external software like Blender. That model then undergoes two processes called
“rigging” and “weighting”, which build out the skeleton of the model and determine
how the model can and should deform. The bones can then be moved, rotated and keyframed to
create new animations! When animations are complete, we can then take the ‘ready-to-be-used’
character and animations into Unity! This is a tried and true pipeline that has
gotten the job done for many years. And if we look at the vast majority of 3D games from
the past up to more recent games, it’s been used to create most animations. However, there
are now a few additional animation options that shake up this pipeline! And the option
we will talk about today is procedural animation. Procedural animation is generated in real-time.
While standard animation saves the rotation and placement of the character’s bones prior
to the game running and uses those saved values when the game is being played; procedural
animation is able to move bones based on what’s happening in the moment and in the environment
while the game is running. How much procedural animation is used varies between games and their use cases. A game like Uncharted uses procedural animation
for actions like characters reaching to touch a wall, foot placement, head and arm rotation
and a bunch of smaller details. Whereas games like Fall Guys and Gang Beasts are much more
noticeably procedurally animated as their character animation is almost entirely physics-based,
which is a form of procedural animation. To be clear, procedural animation isn’t
exactly new to indie games, or even Unity for that matter. There are asset store packages
that allow us to create procedural animation. Unity’s Mecanim animation system has offered
support for animation layers, animation blending, and inverse kinematics which can create some
similar effects to what we can do with the animation rigging package. And we can also program procedural animation ourselves using code. So then what does the animation
rigging package actually offer? The answer is: a helpful user interface in
the Unity editor, a set of constraints that can be used to achieve multiple types of procedural animation, and dynamic rigs. Let’s not forget to mention that we can access this package for the low price of “free”, and that bugs can be fixed by the Unity team themselves, which is always a benefit! Ok – what exactly are constraints and what
are rigs? Remember the animation rigging process mentioned earlier when describing the animation
pipeline: building and connecting bones of a model which can then be used to animate
the character. But to be more specific, “Rigging” is more than that: it’s not only adding
the bones, but applying “constraints” between specific bones. Constraints will determine
how bones can be moved and rotated by the animator and sometimes what happens when the
bones are moved. For example, a constraint may be placed on a hand so that when it moves,
the arm and shoulder bones move as well, like they do in real life. And “rigs” are the collections of the
bones and constraints. A rig might consist of all of the bones and constraints of a model,
sometimes referred to as an “uber rig” because it houses everything. Or a rig could
only focus on specific bones and constraints, like just a single arm. So now, let’s get a little hands-on. In
Unity we can download the animation rigging package from the package manager. By using
this package we can essentially move part of the animation pipeline into Unity itself. Here we have Jammo: a character asset from the hit YouTube channel Mix and Jam. The original bone layout and weighting for this character had to be done in external software like Blender, and that is still the case despite the animation
rigging package’s new features. As mentioned, this package doesn’t mean all of the animation
pipeline is now in Unity, rather, specific parts which will make procedural animation
easier. But, with those original steps of the animation
pipeline already complete for this character, we can now use the animation rigging package
to render the bones, create brand new keyframed animations and, of course, procedural animations
using runtime rigs and constraints. Let’s begin. The first step is to attach
an Animator component to the animated gameobject, in this case our character gameobject which
has the armature as a child. The animation rigging package is built on top of Unity’s
Mecanim animation system, so in order for it to work, the Animator component is required.
The Animator component allows any gameobject to be animated, and when combined with an Animator Controller, it can be used to play animation assets in the Animator window. The next required component is the Rig Builder.
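If you prefer scripts to Inspector clicks, the setup so far can be sketched in code. This is just a convenience sketch, not something the package requires, and “characterRoot” is a hypothetical name for your character’s parent gameobject:

```csharp
using UnityEngine;
using UnityEngine.Animations.Rigging;

// A script sketch of the component setup described above.
// "characterRoot" is assumed to be the gameobject that parents
// the bone hierarchy.
public static class RiggingSetup
{
    public static void AddRequiredComponents(GameObject characterRoot)
    {
        // The package is built on top of Mecanim, so the Animator comes first.
        if (characterRoot.GetComponent<Animator>() == null)
            characterRoot.AddComponent<Animator>();

        // The Rig Builder manages every rig layer we create later.
        if (characterRoot.GetComponent<RigBuilder>() == null)
            characterRoot.AddComponent<RigBuilder>();
    }
}
```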
As we create new Rigs for the animated gameobject, each will be added as a layer to this component’s
list property. Think of this as the controller for all of the rigs that might be added to
a character. Rig layers in the list are ordered by priority, and that order must be decided before the game is played. The Rig Builder interacts with the Animator component, which is why it must be added at the same level as the Animator, which, again, is the parent of the bone hierarchy. This next component is technically optional
in the setup process, but will help tremendously with creating animations and working with
the animated gameobject and character model. We are talking about the animation rigging
package’s bone renderer component which we attach, again, to the parent gameobject
of the bone hierarchy – same as the animator and rig builder. We can then select all of
the bones in the hierarchy window and drag them into the “Transforms” list. Due to a known bug, you may need to refresh the component for them to show up. Alternatively, if we
select the parent gameobject, the animation rigging dropdown menu has a button that will
set up the bone renderer automatically. The remaining properties of the bone renderer
allow us to customize the appearance of the stored bones. We can select from three different
options for the bone shape, increase or decrease the size, and customize the bone color. Enabling
and modifying tripod size will display the local axes of each bone, making it clear how
the bone is rotated. The bones are only visible in the Scene view,
which is where we would want to manipulate the bones anyway. But with it set up, we can
more easily create and manipulate rigs. In the context of the animation rigging package,
a Rig (otherwise known as a control rig) is a collection of constraint gameobjects and
the possible source objects of the constraint. Source objects are what the constraint requires
to be used properly. To create a rig, we just need to add a new
gameobject as a child of the parent gameobject that holds the Animator and Rig Builder and
add the Rig component. This is typically the same level as the bone hierarchy - so to be
clear, rigs and constraint components are to be used outside of the bone hierarchy,
not on the bones themselves. We can now add this rig to the “Rig Builder” as its first
layer. But to give the rig functionality, we create a new gameobject as a child of the
rig for each constraint that we’d like it to have. Let’s add our first constraint. We will
create a new empty child of the Rig, select the animation rigging option in the menu, and choose from the list of available constraints. Each constraint has its own set of properties.
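The steps above can also be done from a script. A rough sketch, assuming a “characterRoot” gameobject that already has the Animator and Rig Builder on it (the video builds all of this by hand in the editor):

```csharp
using UnityEngine;
using UnityEngine.Animations.Rigging;

// A script sketch of the rig setup: a rig gameobject under the character
// root, registered as a layer on the Rig Builder, with one child per
// constraint. Names are placeholders.
public static class RigCreationExample
{
    public static Rig CreateRig(GameObject characterRoot, string rigName)
    {
        // The rig lives outside the bone hierarchy, as a sibling of it.
        var rigObject = new GameObject(rigName);
        rigObject.transform.SetParent(characterRoot.transform, false);
        var rig = rigObject.AddComponent<Rig>();

        // Register the rig as a layer on the Rig Builder.
        var rigBuilder = characterRoot.GetComponent<RigBuilder>();
        rigBuilder.layers.Add(new RigLayer(rig, active: true));
        return rig;
    }

    public static GameObject CreateConstraintObject(Rig rig, string name)
    {
        // Each constraint gets its own child gameobject under the rig.
        var constraintObject = new GameObject(name);
        constraintObject.transform.SetParent(rig.transform, false);
        return constraintObject;
    }
}
```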
Now, there are many to cover, and I believe we’ll go over them all at some point on
the channel but if you’d like me to make a video of practical uses for each constraint
or a specific one, be sure to leave a like and comment below telling me which you are
interested in! For today, let’s start with the most basic
example by having the character’s head turn towards its target: for a better naming convention,
we’ll rename the rig from “Rig” to “TargetTracking” and rename the child to “HeadTracking”.
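As an aside, everything we are about to configure in the Inspector can also be wired up from a script. Here is a rough, hypothetical sketch using the package’s MultiAimConstraint API; “headBone” and “lookAtTarget” are placeholder names:

```csharp
using UnityEngine;
using UnityEngine.Animations.Rigging;

// A script sketch of the head-tracking setup done in the Inspector
// in the video.
public static class HeadTrackingSetup
{
    public static void Configure(MultiAimConstraint aim,
                                 Transform headBone,
                                 Transform lookAtTarget)
    {
        // "data" is a struct, so we copy it out, edit it, and assign it back.
        var data = aim.data;
        data.constrainedObject = headBone;

        var sources = new WeightedTransformArray();
        sources.Add(new WeightedTransform(lookAtTarget, 1f)); // weight 1 = 100%
        data.sourceObjects = sources;

        aim.data = data;

        // If constraints are changed at runtime, the Rig Builder must
        // rebuild before the changes take effect.
        aim.GetComponentInParent<RigBuilder>().Build();
    }
}
```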
Then let’s add the “multi-aim constraint” to “HeadTracking”. This constraint rotates
the constrained object toward a required target transform. Because we have used the bone renderer, we
can much more easily select the head bone, but before we do… let me show you a quick
Unity tip: we can right-click a gameobject in the hierarchy and choose Properties to open a new inspector window that never changes targets. What this means is that if
we are going to be working with a specific gameobject and its properties, we can now
add it as a separate window in the editor. This allows us to much more easily select
and drag a gameobject into a property field without worrying about the inspector accidentally changing to a newly selected gameobject from the hierarchy. Cool! Let’s drag the head bone from the hierarchy into the open “Constrained Object” property. The only other required step
for this constraint to impact our character is to add a Source Object, aka whatever we
want it to look at. Let’s create a new object titled “LookAtTarget” and then drag it
into the source object list. If we move the source target around, we’ll notice nothing is happening. This is because constraints only work when the game is running or when preview mode is active in the Animation or Timeline window. If we enter play mode,
we can now see that as we move around the target, the head follows! You may be wondering what the remaining properties
of the multi-aim constraint are for. They determine how the constrained rotation works
– note that other than weight, in order for changes to take effect, we actually need
to restart play mode. Weight is the percentage of influence we want
to apply to the constrained object. 1 meaning 100%. The Aim Axis should be set to the specific
axis that we want to point towards the target – for a character, this is most likely going
to be Z in Unity. And the Up Axis determines which axis we want rotated toward the upward direction. However, this doesn’t do anything without the World Up Type set. Specifying the World Up Type will determine
which direction the constraint will consider “upwards” and how the rotation handles
rolling. Here is the difference between setting it correctly and not setting it at all. We’ve seen one source object, but we can
also add many more! Playing with each weight slider determines where the constrained object will rotate. Having multiple sources at a weight of 1 will rotate the object towards the center point between
the source objects. Maintain offset will keep the object in its
original rotation until the source object is moved, but will then rotate to follow its
movement path. Offset is simply an additional rotation applied to the object. Selecting X, Y, and/or Z in the Constrained Axes property determines which axes are allowed to rotate in order to look at the target. And the Min and Max Limits determine how much rotation can
be applied to any of the available axes. For a character, we’re looking at about -90
and 90 for the min and max values. And that is working, procedurally generated head tracking! Part of the beauty of how the rigging package
is designed comes from the rig builder and its layers. We can have separate rigs for
different situations that the animated gameobject or character may encounter, and activate and
deactivate each accordingly. We can have a “TargetTracking” rig which has a constraint
for the character's head, but we can also have completely separate rigs for things like
“Climbing”, which has constraints for the hands and legs to properly position themselves
on a wall, or “PassiveInteractions” which could have constraints that handle interacting
with the environment. The point is, the layout provided to us as developers is super flexible! In the future, we’ll take a look at more
practical uses of each constraint, discuss some of the limitations of the package and
learn to bake new animations! Remember you can access this project file by supporting
the channel over on Patreon, as well as vote on future video topics, get early access to
videos and more! Thank you to all of the current Patrons for your continued support and guidance
of the channel. And a special shoutout to ___ for the top tier support! Hop into the
channel Discord to meet some awesome developers just like yourself, and follow me on Twitter
for updates on the next video. But that’s all for today, thank you so much for watching
and I’ll see you in the next video.