Narrator: If you compare these underwater shots from 2009's "Avatar" to 2022's "Avatar: The Way of Water," the improvement is pretty clear. And a look behind the scenes can help explain why. The first movie was shot on a dry set, with actors mimicking what it'd be like underwater. But for the sequel, they actually filmed these dynamic scenes, well, underwater, creating some of the clearest animated water scenes ever captured for a film.

"Avatar" was a stunning visual achievement in 2009. But 13 years of technical advancements made the second film even more visually groundbreaking, and in few places is that more evident than in the characters' faces. VFX artists brought the Na'vi to life by taking the actors' real facial expressions, captured through motion capture, and applying them to a 9-foot-tall character. In the first "Avatar," the cast wore these helmets with one tiny camera on them. The initial system was effective, but it worked on a much more surface level.
Eric: You can capture the movement of mouth, nose, eyes, the general motion, in an almost 2D way, because you're not getting depth. If we have the corner of the mouth here, for example, you can see where it's moving side to side, but you can't actually tell how it's moving in depth.
Narrator: The tech advanced with 2019's "Alita: Battle Angel," where Wētā switched from one camera to two to capture actor Rosa Salazar's performance. The studio also used this two-camera system for "The Way of Water," and if you look closely, you can see the improvement in the facial expressions captured for the Na'vi characters. The extra camera adds a layer of depth and even more data for animators to perfect the final shots.
Eric: With two cameras separated like this, apart from one another, you can actually calculate all the different points in the mouth, using AI and all the new tech, to see how this point in the mouth is actually moving forward or moving back. You're not just getting planar motion, you're getting depth motion also. And we could've animated on top. We could've made something that looks similar. But with the two cameras, you just get much more accurate information from Zoë, so you get a much better performance from her. You can tell, the way her face moves forward and back and her mouth stretches and her nose moves around, that without the two cameras, you would never have gotten that.
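Wētā's facial solver is proprietary, but the principle Eric describes is classic stereo triangulation: with two cameras a known distance apart, the same facial point lands at slightly different positions in each image, and that difference encodes distance. A minimal sketch in Python, with every number invented for illustration:

```python
# Toy rectified-stereo depth calculation -- not Wētā's actual pipeline.
# With two cameras a baseline B apart, a tracked point appears at
# different horizontal pixel positions in each view; the difference
# (the disparity) gives its distance from the cameras.

def depth_from_disparity(x_left_px, x_right_px, focal_px, baseline_m):
    """Rectified-stereo relation: Z = f * B / disparity."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("point must sit further left in the left image")
    return focal_px * baseline_m / disparity

# Invented helmet-cam numbers: focal length in pixels, camera spacing in metres.
F_PX, BASELINE_M = 900.0, 0.06

# The corner of the mouth tracked across two frames: side to side it
# barely moves, but the disparity changes -- so the point is moving in
# depth, which a single camera could never tell.
z1 = depth_from_disparity(612.0, 396.0, F_PX, BASELINE_M)  # ~0.250 m
z2 = depth_from_disparity(613.0, 388.0, F_PX, BASELINE_M)  # ~0.240 m
print(f"frame 1: {z1:.3f} m   frame 2: {z2:.3f} m")
```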
[hissing]

Narrator: The character work doesn't end once the performance is captured. Animators can and do adjust the expressions on a CGI character's face in postproduction.

Why am I different?
"Avatar," VFX artists used a pretty common animation
tool called Blendshapes. Blendshapes allowed them
to adjust and heighten the facial expressions
of Na'vi characters. They'd previously used the tool on movies like "The Lord of the Rings" to animate different
expressions on Gollum, especially as he moved from
one personality to another. But Blendshapes only allowed artists to manipulate the surface
of a character's face, and Wētā wanted to go deeper. The studio made advancements with Thanos in the "Avengers" movies,
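Blendshapes themselves are a standard, widely documented technique: each expression is stored as per-vertex offsets from a neutral mesh, and the animated face is the neutral mesh plus a weighted sum of those offsets. A toy sketch, with a made-up three-vertex "mesh":

```python
import numpy as np

# Minimal blendshape mix: final = neutral + sum_i(w_i * delta_i).
# The mesh and shapes here are made up for illustration.
neutral = np.array([[0.0, 0.0, 0.0],    # left mouth corner
                    [1.0, 0.0, 0.0],    # right mouth corner
                    [0.5, 0.4, 0.1]])   # upper lip

# Sculpted target shapes, stored as offsets (deltas) from neutral.
deltas = {
    "smile":    np.array([[-0.1, 0.15, 0.0], [0.1, 0.15, 0.0], [0.0, 0.05, 0.0]]),
    "jaw_open": np.array([[0.0, -0.2, 0.0], [0.0, -0.2, 0.0], [0.0, -0.1, 0.0]]),
}

def blend(weights):
    """Weighted sum of shape deltas on top of the neutral pose."""
    out = neutral.copy()
    for name, w in weights.items():
        out += w * deltas[name]
    return out

# 70% smile with a slightly open jaw -- dialing weights like these per
# frame is how artists adjust and heighten an expression.
print(blend({"smile": 0.7, "jaw_open": 0.2}))
```

The catch is visible in the math: the weights only ever slide vertices of the skin surface around, which is exactly the limitation described above.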
The studio made advancements with Thanos in the "Avengers" movies, going down a few more layers. You can see the difference in this test footage. This is facial manipulation using the surface-level Blendshapes. This is the same thing, but on a deeper muscular level, to create more natural expressions. Still, Wētā wanted to push things further, moving away from Blendshapes and the surface of the face and looking at muscles, the muscles of the face, as a way to transfer a performance from the actor to the character puppet.
Water," Wētā was able to digitally reconstruct an actor's face based on the two-camera
performance-capture footage. From there, the animators could measure how the face muscles under the surface strained during their performances. They used these muscle strains as data to more precisely adjust the
Na'vi facial expressions. Dan: The data set was
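Wētā's muscle-based system is its own research effort, but the basic quantity it measures, strain, is just a muscle fiber's relative change in length: negative when the muscle contracts, positive when it stretches. A toy illustration, with invented fiber points:

```python
import numpy as np

# Strain of a muscle fiber modeled as a polyline of 3D points
# (invented numbers, not Wētā's rig).

def fiber_length(points):
    """Total length of a polyline through 3D points."""
    pts = np.asarray(points, dtype=float)
    return np.linalg.norm(np.diff(pts, axis=0), axis=1).sum()

def strain(rest_points, current_points):
    """Engineering strain: (L - L0) / L0. Negative = contracted."""
    l0 = fiber_length(rest_points)
    return (fiber_length(current_points) - l0) / l0

# A cheek-muscle-like fiber at rest, then contracted as in a smile.
rest    = [(0.0, 0.0, 0.0), (1.0, 0.2, 0.1), (2.0, 0.5, 0.1)]
smiling = [(0.0, 0.0, 0.0), (0.9, 0.2, 0.1), (1.8, 0.45, 0.1)]

print(f"strain: {strain(rest, smiling):+.3f}")  # ~ -0.100: shortened ~10%
```

A list of strains like this, one per muscle per frame, is a relative measure, which is part of what makes it attractive for transferring a performance onto a differently proportioned face.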
Dan: The data set was generated by James Cameron sitting down in a booth with an array of cameras in front of the actor, and the actor went through the entire film. It's quite remarkable seeing the actors emoting in the way that they do, just sitting in a chair with Jim.

I'll be nice, once.
Narrator: Working with facial muscles creates everything from more realistic blinking to the perfect eye roll and a new and improved smile. The animators could pull the corners back on the animated puppet's mouth, and it wouldn't stretch too far but could still stretch from cheek to cheek for a fuller, more realistic smile.

Picking up the subtle nuances of an actor's performance was crucial for filming underwater. That shot of Jake going through the rapids in the original film was shot in a dry-for-wet style, meaning the performer acted out the scenes on a dry set, so the reference footage wasn't as accurate.
in "The Way of Water" and the way the characters swim through it looks much more like the real thing. The first step to achieving this was filming hours of footage
of the cast acting underwater. The crew built a
performance-capture volume in a tank that was 120 feet long, 60 feet wide, and 30 feet deep and then built another
volume directly on top for shots where characters
pop in and out of the water. Having to build such an elaborate set wasn't the only reason this
Having to build such an elaborate set wasn't the only reason this had never been attempted before. Getting clear underwater footage is a challenge all on its own. Take lighting. Water creates all kinds of unwanted reflections. Wētā encountered the same issue while shooting the daytime performance-capture shots for the "Planet of the Apes" trilogy, so for those films, they switched to infrared light. That would work for surface shots, but infrared light is useless underwater, because water quickly absorbs red wavelengths.
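The physics behind that is ordinary exponential attenuation, the Beer-Lambert law: intensity falls off as I = I0 * e^(-αd), and water's absorption coefficient α is orders of magnitude larger in the near-infrared than in the blue. The coefficients below are rough approximate values for pure water, used only to show the scale of the effect:

```python
import math

# Beer-Lambert attenuation: I(d) = I0 * exp(-alpha * d).
# Approximate absorption coefficients for pure water, per metre --
# blue light barely attenuates, near-infrared dies off almost at once.
ALPHA_PER_M = {
    "blue (~450 nm)":    0.01,
    "red (~650 nm)":     0.35,
    "near-IR (~850 nm)": 4.0,
}

def transmitted_fraction(alpha_per_m, depth_m):
    """Fraction of light surviving a one-way path through depth_m of water."""
    return math.exp(-alpha_per_m * depth_m)

# After a 5 m path (plausible inside a 30-foot-deep tank):
for band, alpha in ALPHA_PER_M.items():
    print(f"{band:>18}: {transmitted_fraction(alpha, 5.0):.1%} remains")
# blue keeps ~95%, red ~17%, near-IR essentially 0% -- which is why
# infrared capture works above the surface but not below it.
```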
So to ensure visible lighting above and below the surface, the crew instead used ultra-blue light underwater and infrared above the surface.

But even the most up-to-date cameras struggled with water's natural qualities. For instance, surface reflections made it hard to separate the lighting conditions above the water from those below it.
Abyss" to find the solution. On "The Abyss," he filled
the surface of the water with black beads to keep the lighting as separate as possible. On "The Way of Water," he used these opaque
white Ping-Pong balls. And because the balls
could be easily separated, they wouldn't get in the way
of the actors' movements. You can really spot the
difference in this shot from the first "Avatar" where
Jake pops out of the water compared to this shot of Lo'ak and Payakan swimming right under the surface. There was another obstacle to getting underwater footage: bubbles. If the cast were to wear scuba gear, air bubbles would make it impossible for the performance-capture cameras to get a clear read of their faces. So they had to hold their breath
So they had to hold their breath for long stretches of time, and this required six months of intense training in diaphragmatic breathing. Increasing their lung capacity allowed them to hold their breath for at least five minutes at a time. By the end of that training period, Kate Winslet broke the world record for longest breath held on a movie set.

But it would take a lot more to make the actors swim as fast as their digital counterparts. Unlike other Na'vi clans, the Metkayina, who are introduced in the sequel, are a water-based tribe, meaning they were animated with features like larger chests for holding their breath.

Breathe in.
Narrator: And much larger tails to propel them forward. Whenever Kate and the other performers playing the Metkayina were swimming, the stunt team equipped them with these underwater jet packs. The VFX team would then use that swimming speed and swap out the jets for tails kicking the same way. They had to step it up a notch whenever the characters rode an ilu or a skimwing. While the actors used wire-rigged puppets to ride the flying banshees in the first film, these new creatures required them to be strapped onto these Jetovators, which could move them in and out of the water.
Once the underwater sequences were filmed, Wētā took that raw footage and extended it into an endless aquatic world. Because of the sheer volume and complexity of water, animators had to simulate its movement rather than animate it frame by frame. That included all of water's defining characteristics, such as aeration, splashes, droplets, waves, mist, and more.

It's a lot like facial animation. You can look at a face and look at its basic movements. But where you really sell it is in those tiny little details.
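That distinction, simulating instead of keyframing, is easy to show in miniature: you give the computer an initial disturbance and a physical rule, and every later frame emerges on its own. A deliberately tiny height-field sketch, nothing like a production solver such as Loki:

```python
import numpy as np

# Toy 1D height-field water: one "splash" plus the discrete wave
# equation produces all subsequent motion -- no hand-placed keyframes.
N, c, dt, dx = 64, 1.0, 0.5, 1.0
h = np.zeros(N)          # water surface height
h_prev = np.zeros(N)
h[N // 2] = 1.0          # a single splash in the middle
h_prev[N // 2] = 1.0

for frame in range(50):
    # Acceleration of the surface is proportional to its curvature.
    lap = np.roll(h, 1) - 2 * h + np.roll(h, -1)
    h_next = 2 * h - h_prev + (c * dt / dx) ** 2 * lap
    h_prev, h = h, h_next * 0.998  # slight damping so ripples die out

# Crude print of the final surface: the ripples have spread outward unaided.
print("".join("~" if abs(v) > 0.01 else "." for v in h))
```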
Narrator: When Wētā worked on this shot in "War for the Planet of the Apes," the animators had to account for everything from how Caesar's hair would look when it was wet to how water droplets would flow when there was air resistance. "The Way of Water" was a lot more work. Wētā was tasked with 2,225 water shots, including over 30 waterfall-interaction shots. Luckily, the VFX artists could now use a program called Loki, which Wētā created for the underwater scenes in "Alita." For "The Way of Water," they added a whole host of specific interactions, like what a boat looks like at high speed. Here, the red particles are spray, and the green are mist.
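How Loki categorizes its particles isn't public, but FX pipelines commonly split whitewater by physical behavior: larger, fast, ballistic droplets become spray, while fine particles that drift with the air become mist, and each category is simulated and shaded differently. A cartoon classifier with invented thresholds:

```python
# Cartoon whitewater classifier -- illustrative thresholds, not Loki's rules.
# FX pipelines often split secondary water particles by behavior: larger,
# ballistic droplets are spray; fine, slow particles that drift with the
# air are mist. Each category gets its own simulation and shading.

def classify(radius_mm: float, speed_ms: float) -> str:
    return "mist" if radius_mm < 0.3 and speed_ms < 2.0 else "spray"

# (radius in mm, speed in m/s) -- invented sample particles
particles = [(0.10, 0.8), (2.50, 6.0), (0.20, 1.5), (0.80, 9.0)]
for r, v in particles:
    print(f"r={r:4.2f} mm  v={v:4.1f} m/s  ->  {classify(r, v)}")
```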
Loki also handled how hair and cloth would move. In the first "Avatar," the animators could only move the characters' hair in one big clump. Now they could move it in individual strands. You can see the difference looking at Jake's wet hair in the first movie versus the way Lo'ak's braid interacts with the water. And because they nailed the tiniest details, they could now have high-definition close-up shots of water shedding off a character's hand.
Once the animation was done on the rough water surface, the effects team would apply simulations, and at times animators would have to go back in and rework them. You might have one boat next to another. One of these boats makes a wake. All of a sudden, the work that we did earlier needs to take that wake into account. So we had this sort of cyclical thing with the effects team.
Narrator: Some of the most challenging shots for the VFX artists were ones like these, where characters come up to the surface. They needed more than just a Jetovator for that. So they worked with a new tool that allowed objects or characters to float on the surface. You can see it at work here.

I have no idea what you just said.

Narrator: And here, elevating the film's action scenes.
Dan: The tool would help us with both buoyancy and perhaps boats getting air: if they were going fast enough, we would simulate their flight as they took to the air, going over the crest of a wave.
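Whatever the tool's internals, floating comes down to Archimedes' principle: the upward force on an object equals the weight of the water it displaces, so a hull gets pushed up while it's submerged and simply falls, as Dan describes, once it leaves the water. A toy vertical-motion step, with an assumed drag term:

```python
# Toy buoyancy step -- illustrative physics, not Wētā's tool.
RHO_WATER = 1000.0  # density of water, kg/m^3
G = 9.81            # gravity, m/s^2
DRAG = 50.0         # crude linear water-resistance coefficient (assumed)

def buoyancy_step(mass_kg, volume_m3, submerged_fraction, velocity_ms, dt):
    """Advance a floating object's vertical velocity by one time step."""
    displaced_m3 = volume_m3 * max(0.0, min(1.0, submerged_fraction))
    f_up = RHO_WATER * displaced_m3 * G   # Archimedes: weight of displaced water
    f_down = mass_kg * G                  # gravity
    accel = (f_up - f_down - DRAG * velocity_ms) / mass_kg
    return velocity_ms + accel * dt

# A 600 kg, 2 m^3 hull: airborne (0% submerged) it just falls, as when a
# boat launches off a wave crest; pushed deeper, buoyancy wins and it
# accelerates back up toward equilibrium.
v = 0.0
for fraction in (0.0, 0.1, 0.3, 0.6):
    v = buoyancy_step(600.0, 2.0, fraction, v, dt=0.016)
    print(f"submerged {fraction:.0%}: vertical velocity {v:+.2f} m/s")
```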
Water" showed great advances not just in how the Na'vi emoted and the world they explored, but how they interacted with one another and, more importantly,
the smaller-size humans. In the first "Avatar,"
there are only about a half dozen of these shots. For that film, the actors'
references included a tennis ball on a stick
and green-suit performers. Dan: You end up with
eyelines that don't work and with a depth problem. Narrator: For "The Way
of Water," real humans and CG creatures could coexist seamlessly thanks to football. More specifically, this eyeline system based off the SkyCam, which
is used to film NFL games. But instead of a camera,
the crew attached a monitor to a cable that could
lift it up into the air and approximate Na'vi height. The monitor following them around displayed rough performance-capture video to give the actors
something to play off of. The monitors created more dynamic shots, and so did improvements
The monitors created more dynamic shots, and so did improvements in virtual cameras. This was one of the biggest innovations of the first "Avatar." With this lightweight device, James could view a rough version of Pandora and move around within it, helping him frame shots as if it were live action. When shooting with the cast, the crew used real-time depth compositing to see the real and digital elements of a shot combined as it was filmed. This made shot composition less of a guessing game.
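The core z-compositing idea is simple to state: for every pixel, compare the depth of the live-action plate with the depth of the CG render, and show whichever element is closer to the camera. A minimal sketch with stand-in images:

```python
import numpy as np

# Minimal per-pixel depth composite -- the general z-composite idea,
# not Wētā's implementation. Each element carries RGB plus a depth map.
H, W = 4, 4
plate_rgb = np.full((H, W, 3), [0.8, 0.7, 0.6])  # stand-in live-action frame
cg_rgb    = np.full((H, W, 3), [0.1, 0.3, 0.9])  # stand-in CG frame

plate_depth = np.full((H, W), 3.0)  # actor ~3 m from camera
cg_depth    = np.full((H, W), 5.0)  # CG character behind the actor...
cg_depth[:, 2:] = 2.0               # ...except on the right, where it's nearer

cg_wins = cg_depth < plate_depth                        # per-pixel depth test
comp = np.where(cg_wins[..., None], cg_rgb, plate_rgb)  # pick the nearer element
print(comp[0])  # left pixels show the actor, right pixels show the CG
```

Doing this comparison live on set is what let the crew see, through the camera, whether a Na'vi was correctly in front of or behind a human actor.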
It helped Wētā seamlessly place Gollum in shots in "The Lord of the Rings" and let Mowgli walk alongside Bagheera in "The Jungle Book." It even helped the artists match Alita's digital robotic arms with Rosa's real ones. For "The Way of Water," the tech took a huge leap forward. Real-time depth compositing got more precise because the crew could see digital and real elements combined down to the pixels making up the shot. This meant the shots where Na'vi and humans interact were a lot better. Just compare this shot in the original to this one in the sequel.
Eric: We honestly got that working, oh, probably two weeks before we started filming. Being a stereo 3D movie, you always have to have everything exactly where it should be in space.

Narrator: And depth compositing is also why the water looks so good. For shots like this, the actor playing Spider stood alone in a wave pool, with the Na'vi characters visible in a real-time composite.
A weird side effect that we actually weren't expecting is that we were able to use that same data that we used for the depth compositing back in postproduction.

Narrator: Those pixels helped them digitally extend the water so seamlessly that it's impossible to know what was real and what was computer-generated.
Eric: We could get exactly where the water was. We could then plug our CG water simulation into the live-action water and make the two work together.

Narrator: After all that, those early waterfall shots look a lot less daunting now.

By the end of "The Way of Water," that shot was a doddle for us. Doing that was easy.