Active Ragdolls in Unity

Reddit Comments

Thanks for sharing 😁👍

👍︎︎ 6 👤︎︎ u/Ricardo_PL 📅︎︎ Dec 04 2020 🗫︎ replies

Quaternions are evil. Preach, brother!

👍︎︎ 3 👤︎︎ u/Im_Berji 📅︎︎ Dec 04 2020 🗫︎ replies

Wow, awesome video, Sergio! The simplicity in which you demonstrated the concepts was great. And the splices were perfect.

👍︎︎ 2 👤︎︎ u/VisionShift 📅︎︎ Dec 05 2020 🗫︎ replies

Enjoyed the vid! Lots of interesting stuff. I've subscribed to your channel :)

Thanks for sharing!

👍︎︎ 2 👤︎︎ u/736f6c7665 📅︎︎ Dec 06 2020 🗫︎ replies

Awesome content. It was enlightening to watch a video on a topic I had always wondered about.
Subbed !
And please do make more videos :)

👍︎︎ 2 👤︎︎ u/the_legend_01 📅︎︎ Dec 07 2020 🗫︎ replies
Captions
When we think about artificial intelligence, we usually think about natural language processing or image recognition. But AI has many other lesser-known applications. One that I'm very passionate about is replicating natural movement. I just love how some systems are able to move in a way that makes it look like they are alive. So, for the past months, I've been working on this character animation system for Unity.

You see, standard animations are played completely outside of the physics simulation. For example, if a character is walking and something collides with its arm, the other object may react to the collision, but the arm animation won't be modified at all. For animations to be responsive to the physical environment, we need to simulate them using what are called physical animations, also known as active ragdolls.

This type of animation demands more processing power than standard animations and is much harder to set up, which is why it isn't very commonly used. But in the last decade, it's been rising in popularity due to the increasing potential of tools like Unity and Unreal. Games like Human Fall Flat and Gang Beasts are some famous examples. It's also been used in big blockbusters such as GTA, which uses software called Euphoria that generates realistic animations in real time. But Euphoria uses much more than just physical animations, so it's not that relevant for us.

Let's get started. The first thing we need is a model. I made this one quickly; yes, I'm obviously not a 3D modeler. After that, I made a very simple skeleton along with some quick animations. Yes, I'm not an animator either. I pulled everything into Unity and got the standard animations working. If you look at how Unity handles characters with skeletons, you can see that it creates a GameObject for each bone; then the Animator component moves these game objects to play animations.
Some bones are children of others, which means that when you move or rotate the parent, the child follows it accordingly. If you want to move the arm, you just need to move the shoulder, and the positions of all its children will be calculated automatically. The process of finding how child objects move with respect to their parents is called forward kinematics. It may look simple, but it's the basis of classic robotics, and knowing it will allow us to understand a more advanced technique later in the video.

Getting back to physical animations, the first thing I tried was to put the joints and rigid bodies directly on those animated game objects. It was a stupid idea indeed: since the Animator is forcing their positions and rotations every frame, I can't control them with joints. Trying to physically move the animated body won't get me anywhere. Instead, I'll do what everyone suggests on forums and videos all over the internet: use one body for animation, and make a duplicate that tries to match it by applying forces.

Our character now contains two different bodies. The first is just a normal animated character, which plays the animation we're trying to achieve, but without any physics. The second is a copy of the first, but instead of being animated, it's made up of rigid bodies connected by joints; it's what's usually called a ragdoll. Joints are just physical entities that link two rigid bodies together, making them rotate and move around each other in a certain way we can configure.

But as it stands now, it's just that: a normal ragdoll. It does nothing but fall awkwardly. We need to bring some life into it with some movement. To do so, we'll be using a feature of joints called target rotation, which, as its name implies, applies a torque to reach a specific rotation.
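The parent-child bone relationship described above is all forward kinematics is. As a minimal sketch using Unity's math types (the class and method names are made up for illustration):

```csharp
using UnityEngine;

// Forward kinematics in a nutshell: a child's world pose is fully
// determined by its parent's world pose plus the child's local offset.
public static class ForwardKinematics
{
    // Compute a child bone's world position and rotation from its parent's
    // world pose and the child's local (bone-space) pose.
    public static void ChildWorldPose(
        Vector3 parentPos, Quaternion parentRot,
        Vector3 localPos, Quaternion localRot,
        out Vector3 childPos, out Quaternion childRot)
    {
        // Rotate the local offset into world space, then translate.
        childPos = parentPos + parentRot * localPos;
        // Rotations compose by multiplication, parent first.
        childRot = parentRot * localRot;
    }
}
```

Unity's Transform hierarchy does exactly this internally: rotating the shoulder implicitly recomputes the elbow's and hand's world poses, chaining this calculation down the skeleton.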
If we could just set this target rotation to that of the animated body for each joint, we would obtain a ragdoll that tries to match our predefined animation, which is exactly what we're looking for. It's actually not as easy as it appears: rotations are messy, and quaternions are evil. Making joints match the rotation of their animated peers is not as simple as writing 'targetRotation = animatedRotation'. But thanks to the advancements made in the 20th century, we've got this nice thing called the Internet, and someone has already done it. So thank you, Michael Stevenson; your efforts will be put to good use, hopefully.

By the way, I've hidden the animated model by disabling it, but doing so stops animations from being played unless you change the culling mode to 'Always Animate'. That's just in case someone is trying to follow this as if it were a tutorial, which it isn't. If you want an actual tutorial on how to implement this yourself, you can find it in the description of this video. I will also put a link to the GitHub repository of this same project, in case you want to test it yourself or use it as a basis; feel free to do whatever you want with it.

This is nice and everything, but it's completely useless. Yes, we have physical animations, but right now I'm fixing the hips in place to see it properly; this is what happens when I release it. And so we get into one of the most complicated subjects I've ever dived into: balance. We need to get this thing to balance itself without falling. Not only that, we need to make it stable enough that it can walk properly using the contact between the feet and the floor. This concept has been highly explored in robotics; it's absolutely fascinating, and I'll be working on it for the next few months as part of my bachelor's final project. Naturally, I will not be dealing with balance in this video; there are other methods that can be used.
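The matching step above can be sketched roughly as follows. This is a simplified version that assumes the ConfigurableJoint keeps its default axes; the joint-space conversion for arbitrary axes is what the Michael Stevenson extension mentioned in the video handles, and the component and field names here are illustrative:

```csharp
using UnityEngine;

// Drives one ragdoll limb toward the pose of its animated twin.
// Simplified sketch: assumes the ConfigurableJoint's axes are left at
// their defaults, so the target can be built from local rotations alone.
public class JointMimic : MonoBehaviour
{
    public ConfigurableJoint joint;  // joint on the physical limb
    public Transform animatedBone;   // matching bone on the animated body

    private Quaternion startLocalRotation; // limb's rest pose, cached once

    void Start()
    {
        // targetRotation is interpreted relative to the joint's starting
        // frame, so the rest pose must be captured before animation plays.
        startLocalRotation = transform.localRotation;
    }

    void FixedUpdate()
    {
        // Not simply 'targetRotation = animatedRotation': the target is
        // expressed relative to the initial local rotation, and the
        // quaternion order matters.
        joint.targetRotation =
            Quaternion.Inverse(animatedBone.localRotation) * startLocalRotation;
    }
}
```

Note that if the animated body is hidden, the Animator also needs `animator.cullingMode = AnimatorCullingMode.AlwaysAnimate` so its bones keep updating, as mentioned above.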
Instead of balancing with its own movement, we can just force it to stay upright with artificial forces. This may not work for real-world applications, but games are not real-world applications. Human Fall Flat does this very clearly; just look at how the hips tend to stay vertical even in unbalanced situations.

The most straightforward way to achieve this is by locking the rotation of the hips on the X and Z axes; that way, the ragdoll literally cannot fall. Unless you know for sure this is a good fit for your application, I believe this to be an awful idea: it looks rigid and can be a source of huge glitches.

The second method is using another joint. This one connects the hips to a new rigid body, which will be upright, and the hips will try to reach that rotation through the joint's target rotation feature. This new body will be kinematic, which means it cannot be affected by external forces, so it will always stay upright, since the joint's torque will only affect the hips. It has given me good results so far, but it produces a springy feeling I'm not a fan of. By the way, if you're wondering why it's always the hips, it's because that's where the body's center of mass is.

The last method is applying torque directly to rotate the torso towards an upright direction. This is the same as what the joint of the previous method does internally, so there are no apparent differences. But actually, there are: doing it ourselves allows us to decide how much torque is applied at any given point. The previous joint had a torque output that worked like a spring: the more you get out of balance, the more torque it delivers. But now we can choose that function, so we can change the way the torque is delivered and get rid of the springiness, at least to some extent.

Okay, so now we can sort of play physical animations with our character.
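The last stabilization method above, custom torque with a chosen response curve, might look something like this. The field names and default values are illustrative guesses, not the project's actual tuning:

```csharp
using UnityEngine;

// Keeps the hips upright by applying our own torque each physics step,
// with full control over how torque scales with tilt.
public class UprightTorque : MonoBehaviour
{
    public Rigidbody hips;
    public float maxTorque = 500f;

    // Replaces the joint's implicit spring: maps tilt (0 = upright,
    // 1 = lying flat) to a torque fraction. A non-linear curve here is
    // what lets us reduce the springy feel of a plain proportional response.
    public AnimationCurve torqueByTilt = AnimationCurve.EaseInOut(0f, 0f, 1f, 1f);

    void FixedUpdate()
    {
        // Axis and angle that would rotate the hips' up vector onto world up.
        Vector3 axis = Vector3.Cross(hips.transform.up, Vector3.up);
        float tilt = Vector3.Angle(hips.transform.up, Vector3.up) / 180f;

        // Unity returns a zero vector when normalizing a near-zero vector,
        // so a perfectly upright pose applies no torque.
        hips.AddTorque(axis.normalized * (torqueByTilt.Evaluate(tilt) * maxTorque));
    }
}
```

A real implementation would likely also damp the angular velocity to avoid oscillating around the upright pose. The first method from the video, by contrast, is a one-liner: `hips.constraints = RigidbodyConstraints.FreezeRotationX | RigidbodyConstraints.FreezeRotationZ`.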
It's not perfect, and it would require a lot more tuning to have something that could be used commercially, but as a prototype it works pretty well nonetheless.

But there's one more thing that we haven't done yet. What if, instead of playing a predetermined animation, we wanted to generate movement dynamically depending on the context? This is best exemplified by making the character reach for an object with its hand. We cannot use an animation, because the character will not always be in the same spot in relation to the object. And teleporting it, while it can look decent enough with regular animations, will look terrible combined with physical animations. We need a system that allows us to position the hand where we need it and let the body find the ideal configuration for that to happen.

With forward kinematics, we could calculate a child's position given that of its parents. Now we need just the opposite: given the child's position and rotation, find the parent's configuration that makes it possible. This can be done by a complex mathematical process called inverse kinematics.

The problem is, it can be done in many ways; you can reach the same spot with your hand in infinitely many different postures. To solve that ambiguity, we need what's called a hint: an object in space that tells a certain bone where to point. This adds a constraint to the system and reduces the infinitely many possible configurations to just one. In our case, we need a hint for the elbow; once the elbow knows where to point, our solver can solve the equations.

Unity has a simple inverse kinematics system built in. It's not perfect, and I've run into some problems while using it, but it works well enough for the project's scope. This is actually the part of the project I've spent the most time on. I think I might have overdone it for what I actually needed. But anyway, I learned some new things, so it's okay.
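Unity's built-in IK, including the elbow hint described above, is driven from the `OnAnimatorIK` callback. A minimal sketch, assuming a humanoid rig with 'IK Pass' enabled on the Animator layer (the target and hint objects are placeholders):

```csharp
using UnityEngine;

// Makes the character reach for an object with its right hand using
// Unity's built-in humanoid IK.
public class HandReach : MonoBehaviour
{
    public Animator animator;
    public Transform target;     // object to reach for
    public Transform elbowHint;  // where the elbow should point

    // Unity calls this during the IK pass, after the animation is evaluated,
    // letting us override the hand's pose before it is applied.
    void OnAnimatorIK(int layerIndex)
    {
        animator.SetIKPositionWeight(AvatarIKGoal.RightHand, 1f);
        animator.SetIKPosition(AvatarIKGoal.RightHand, target.position);

        // The hint resolves the ambiguity: it collapses the infinitely many
        // valid arm configurations down to one.
        animator.SetIKHintPositionWeight(AvatarIKHint.RightElbow, 1f);
        animator.SetIKHintPosition(AvatarIKHint.RightElbow, elbowHint.position);
    }
}
```

The weight parameters can be animated from 0 to 1 to blend smoothly between the underlying animation and the IK-driven reach.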
And this is all I had to show you. It's not a perfect solution, and there are a million ways to improve it, but this is not a commercial project, so it makes no sense for me to spend more time polishing it. It looks cool and it has taught me a lot, so it accomplished its objectives.

I hope you enjoyed this little journey of programming physical animations in Unity. I'm very interested in natural movement generation in real time, as well as other types of artificial intelligence. I'll be talking about similar topics on this channel regularly, just in case you want to hit a certain button. See you next time.
Info
Channel: Sergio Abreu
Views: 51,395
Keywords: programming, active ragdoll, active ragdolls, unity, unity 3d, gamedev, animations in unity, physical animations in unity, physics based animation, physics-based, unity animation, ragdoll physics, ragdoll animation, animated ragdolls, ragdoll movement, natural movement, movement generation, c#, AI, artificial intelligence, active ragdolls unity, active ragdolls in unity, unity animation tutorial, unity ragdoll, unity ragdoll movement, game physics, unity physics based movement
Id: HF-cp6yW3Iw
Length: 10min 1sec (601 seconds)
Published: Fri Dec 04 2020