Dear Fellow Scholars, this is Two Minute Papers
with Dr. Károly Zsolnai-Fehér. We showcased this paper just a few months ago; it was about creating virtual characters with a skeletal system, adding more than 300 muscles, and teaching them to use these muscles to kick, jump, move around, and perform other realistic human movements. It came with really cool insights, as it could portray how increasing the amount of weight to be lifted changes which muscles are being trained during a workout. These agents also learned to jump really high,
and you can see a drastic difference between the movement required for a mediocre jump
and an amazing one. Beyond that, it showed us how these virtual characters would move if they were hamstrung by bone deformities, a stiff ankle, or muscle deficiencies, and let us watch them learn to walk despite these setbacks. We could even have a look at the improvements
after a virtual surgery takes place. So now, how about an even more elaborate technique
that focuses more on the muscle simulation part? The ropes here are simulated in a way where the only property of the particles holding them together that matters is their position. Cosserat rod simulations are an improvement because they also take the orientation of the particles into consideration and, hence, can simulate twists as well. This new technique, called VIPER, adds a scale property to these particles and, hence, also takes stretching and compression into consideration. What does that mean? Well, it means that this can be used for a lot of muscle-related simulation problems that you will see in a moment.
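To make that particle model a little more tangible, here is a minimal sketch of what such a particle state might look like in code. This is not the authors' implementation; the class name, fields, and helper method are illustrative assumptions, showing only that each VIPER-style particle carries a position, an orientation, and a scale.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class ViperParticle:
    # Hypothetical particle state for a VIPER-style rod; names are illustrative.
    position: np.ndarray     # 3D position, as in a plain particle rope
    orientation: np.ndarray  # unit quaternion (w, x, y, z), the Cosserat addition that captures twist
    scale: float             # per-particle scale, VIPER's addition that captures stretch and compression

    def stretched(self, factor: float) -> "ViperParticle":
        """Return a copy of this particle stretched (or compressed) by the given factor."""
        return ViperParticle(self.position.copy(), self.orientation.copy(), self.scale * factor)

# Example: a rest-state particle at the origin, then compressed to 80% of its rest scale.
p = ViperParticle(np.zeros(3), np.array([1.0, 0.0, 0.0, 0.0]), 1.0)
p_compressed = p.stretched(0.8)
```

But before we get to those, an important part is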
inserting these objects into our simulations. The cool thing is that we don’t need to
get an artist to break up these surfaces into muscle fibers. That would not only be too laborious, but
of course, would also require a great deal of anatomical knowledge. Instead, this technique does all this automatically,
a process that the authors call…viperization. So, in goes the geometry, and out comes a
nice muscle model. This really opens up a world of really cool
applications. For instance, one such application is muscle movement simulation: when the muscles are attached to bones and we move the character, the muscles move and contract accurately. Two, it can also perform muscle growth simulations. And three, we get more accurate soft body
physics. Or, in other words, we can animate gooey characters,
like this octopus. Okay, that all sounds great, but how expensive
is this? Do we have to wait a few seconds to minutes
to get this? No, no, not at all! This technique is really efficient and runs
in milliseconds, so we can throw in a couple more objects. And by a couple, a computer graphics researcher
always means a couple dozen more, of course. And in the meantime, let’s look carefully
at the simulation timings! It starts from around 8-9 milliseconds per
frame, and with all these octopi, we’re still hovering around 10 milliseconds per
frame. That’s a hundred frames per second, which
means that the algorithm scales with the complexity of these scenes really well. This is one of those rare papers that is written
both very precisely, and it is absolutely beautiful. Make sure to have a look in the video description. The source code of the project is also available. With this, I hope that we’ll get even more
realistic characters with real muscle models in our computer games and real-time applications. What a time to be alive! This episode has been supported by Lambda. If you're a researcher or a startup looking
for cheap GPU compute to run these algorithms, check out Lambda GPU Cloud. I've talked about Lambda's GPU workstations
in other videos and am happy to tell you that they're offering GPU cloud services as well. The Lambda GPU Cloud can train ImageNet to
93% accuracy for less than $19! Lambda's web-based IDE lets you easily access your instance right
in your browser. And finally, hold on to your papers, because
the Lambda GPU Cloud costs less than half of AWS and Azure. Make sure to go to lambdalabs.com/papers and
sign up for one of their amazing GPU instances today. Our thanks to Lambda for helping us make better
videos for you. Thanks for watching and for your generous
support, and I'll see you next time!