>> Hello, everybody.
My name is Chris Murphy, and I am an Unreal Engine
evangelist. I handle Australia, New Zealand,
and southeast Asia, and my job is supporting
developers across the region. So, I head between
different areas, and help wherever I can. For this presentation, we
are going to be focusing on high-end effects
using Niagara. Niagara is Unreal Engine's new real-time
visual effects system that we have been working on
for quite a while. It is currently available
in Early Access, and if you want to use it,
you actually can do it, even though it is not
officially released. You can go to
your plugins menu and literally just type
in Niagara and enable it, okay? Now, for today's session, I want to stress something
ahead of time, okay? And this is really,
really important. My aim for today is for you
to understand what is possible, why the tools are set up
in a certain way, and how to start
going about it. But I am not going to be
doing a very slow, step by step, 'Here is what you need
to do' kind of thing, because for me, it is much
better that you understand what you can achieve from GDC rather than just
an intro tutorial that you could probably look up
on YouTube at some point, okay? So, if I seem like I am going
at a million miles an hour, it is a combination of me
being both over-caffeinated and me wanting to kind of get
to the good stuff, all right? Now, for today's session,
we are going to be going ahead and we are going to be
creating a Niagara system. And my idea here is we are
going to get some particles spawning in a cylinder
around this, and eventually, they
are going to ... a little drone outside.
This guy. So, he is flying around, and I want to project a map from
him in particles in the world. So, this is going to become
a sick-looking holographic map. Now, to go about doing that, I am going to need
to do a few things. I am going to begin by
right-clicking and going to FX, and this is honestly one of those areas where people first make their mistakes. When they first load up the FX menu in the Content Browser, they see all of these things
when it comes to Niagara, all right? And this is very confusing
if you are first coming into it, and believe me, I
totally understand why. A lot of people here are coming
from a Cascade background, where you are used to everything
being contained in one little thing,
everything was Cascade. Niagara intentionally separates
these out into different things, so it is much easier to extend,
much easier to add new stuff, and much easier
to share resources between different
kinds of systems. Now, the things that we are
going to focus on initially are the bottom one,
Niagara system, and Niagara emitter. So, essentially,
a Niagara system refers to something that is
being added into the world and creating particles, okay? Now, a Niagara system
contains Niagara emitters. And what I mean by that is, imagine you had an emitter
for sparks, okay, and an emitter for smoke, and an emitter for, like,
the boom bit of an explosion. There is a better phrase
for that. A Niagara system would hold
all three of those emitters to create the explosion, okay? Now, this would be
really useful, because it means if I go
to create another explosion, I can take the smoke
that I have already got, I can place it in there
as fundamental smoke for what it looks like
in my game, and I can tweak those values
for this specific effect. But it means I end up with
consistency across the board in all of the things
that I am doing. So, I am going to create
a new Niagara emitter, and I am going to base it
on a fountain. Now, I am going to call
this 'circular spawner', because as I said,
we want to create something that spawns particles
around here. It is worth noting
that I do not need to do this as a
separate emitter. I could use the fountain
from the get-go, but for the sake
of demonstration, this is just going to be
a little bit easier. So -- oh, hang on, there we go.
So, this is Niagara. You can see here that we
have got the viewport on the right hand side.
We have modules. And down the bottom
we have a timeline. So, essentially
how this works is, over on the right-hand side, you will see that we have
emitter update, particle spawn,
particle update. If I scroll up,
we also have emitter spawn. Basically, every single one of
these blocks is called a module, and that is a series
of instructions that is getting run
at that point in the system. So, the very top,
emitter spawn means when the emitter
is first created, do this stuff. And then every single time
the emitter is updated, tick by tick
by tick by tick, those are the instructions
that are being run as well. When the particle
is first created, these are
the per-particle instructions that are getting processed, and then every single time
every particle is being updated, these are the specific things that are being updated
for those. Now, what I want
to focus on here is you can see that
when the particle is spawned, it is given a velocity,
which is that initial burst, and down the bottom, you can see
that we have gravity force, drag force, and Solve
Forces and Velocity. And this is called
a post dependency, because if I get any node here,
if I was to look at gravity, this is what the gravity node
is doing under the hood. And this is the first thing to
note when it comes to Niagara, and that is that every single
module that you are adding is something that you can just
roll your own Niagara modules and extend as you see fit.
But here is the cooler part. So, you know how
this is calculating force, and every one of those modules was calculating force over time
and working it out, what this node actually
does is it says, look for the variable
called physics force, and if that does not exist,
set it to 0,0,0. And this is the first
little cool thing, because everything
works in name space. Everything can be
arbitrarily created. If I wanted to start
adding some modules that just used
a variable called heat, there is nothing to stop me
from doing that, and then having a bunch
of other modules that are looking for the heat
value to do something else. I can also feed
in that heat value as an instruction
for other things again, which I think is
pretty exciting. Now, if I look at the
gravity force here, it has actually got itself
a post dependency. See these required dependencies?
This goes ahead and says, I need something that
solves forces and velocity. And down the bottom, you can see
that we have actually got that. Because basically, this node
says, look, I am adding force, but eventually I need
some sort of node that says, can you do me a
favor and add the forces to the object's
position itself? You know, work out
that velocity change. So, having pre-dependencies and post dependencies is a pretty cool thing. And you can make up your own chains to say, look, this is an important part, but it is kind of this much bigger system that I expect this kind of emitter to have.
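To make that concrete, here is a rough sketch in plain HLSL of how that pattern fits together. It is illustrative only, not the actual module source, and the function and parameter names are made up for the example.

    // Force modules only accumulate into a shared attribute...
    void AddGravityForce(inout float3 PhysicsForce, float Mass, float3 Gravity)
    {
        PhysicsForce += Gravity * Mass;
    }

    // ...and a single downstream solver turns the accumulated force into motion.
    void SolveForcesAndVelocity(inout float3 Velocity, inout float3 Position,
                                float3 PhysicsForce, float Mass, float DeltaTime)
    {
        Velocity += (PhysicsForce / Mass) * DeltaTime;  // integrate force into velocity
        Position += Velocity * DeltaTime;               // apply velocity to position
    }

That is essentially why the dependency exists: every force module assumes some solver runs after it in the stack.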
I am actually going to go ahead, and I am going to strip
out everything to do with force and velocity, okay? And you can see that we are left
with just a ball of particles. Now, I want to also get rid
of this sphere location, which is currently just spawning
them in the very tight sphere, and instead, I want to add
a cylinder location to my particle spawn. Now, it is a little hard to see
what is going on right now, so I am just going to crank
this number up. You can see here
that the particles are spawning inside of a cylinder
that we created, okay? Now, remember that
what I want to eventually create is something that is
going to fit in a ring around this mesh in
the world, okay? Now, to do that,
what I need to do is make sure that the measurements
that we have here are something that works. Now, I can go up to my viewport,
and a nice little trick that you can do that most people
are unaware of is, if I was to look
at the orthographic viewport and hold down the middle mouse button, this actually becomes a ruler. And I can see that
this is roughly 570 centimeters
in radius, okay? A lot of people are just unaware
that that is a thing, and that is fair enough.
It is a little hidden. So, I have gone ahead,
I have seen where that is, I see that it is 470, so I am just going to read that
in as my default value. I could override this later,
but I am just doing it. I am also going to go ahead
and shrink this cylinder down. Now, I do not want
these particles spawning across the tops
and bottoms. I only want them
spawning around the outsides. So, if I enable surface only, this will now only spawn them
on the surfaces of the system. And if I disable end caps, we are now left with something
that is doing this, okay? So, see how it is spawning all of those particles in a nice ring?
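Under the hood, that spawn shape amounts to something like the sketch below. This is illustrative HLSL only, not the actual Cylinder Location module, and the names are made up; with surface only on and end caps off, each particle just gets a random angle at the full radius and a random height along the axis.

    // Random point on the side wall of a cylinder (no end caps).
    // Rand01A and Rand01B stand in for per-particle uniform random inputs.
    float3 CylinderSurfacePosition(float Radius, float Height, float Rand01A, float Rand01B)
    {
        float Angle = Rand01A * 6.2831853f;       // random angle around the ring
        float Z     = (Rand01B - 0.5f) * Height;  // random height, never on the caps
        return float3(cos(Angle) * Radius, sin(Angle) * Radius, Z);
    }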
I am going to hit save at this point, and I am actually going
to go ahead and work on it from this point
as a Niagara system that I am going to call
'holographic map system'. Okay? Now, if I was to drop this
into the world, I would want to set its
position to the same as this. So, I am just going to
right-click, copy, go to you, paste. Okay? Now, if I open up
the holographic system, we can see that it is working. So, the first issue I have got is
those particles are pretty big, but that is okay for the time
being for what we are doing. Now, what I want to do
is eventually project a texture. I want to have these particles all representing a point
in a texture, okay? I want them
to all take that data. Now, to do that,
I want you to imagine that I have got a texture, okay? The coordinate of this point
in the texture is 0,0. And the coordinate in this point
of the texture is 1,1, okay? So, X and Y are from 0 to 1.
Everyone is with me? Now, I want every particle
to have a value. So, what I am going to do is, I am going to assign
every particle on spawn a random value
between 0 and 1 on the X axis, and a random value
between 0 and 1 on the Y axis. And to do that, all I need to do
is go to particle spawn and say, hey, I want to
have a new Vector 2D, and it is going to be called
particles.uv. And I want that to be set to
a random value between 0 and 1. Now, we cannot see
Now, we cannot see that doing anything yet, because at the moment, this is literally just an arbitrary value that is being stored. Now, one cool thing you can do is if you go to your attribute
spreadsheet and hit capture, this actually gives you a
listing of every single particle that currently exists,
and every single value that all of them are storing.
And if I were to scroll across, I would see that
the UVX and UVY values are actually being held
right there, which is fantastic. Now, with that in mind,
my next thing to do-- excuse me, my next thing to do
is to go ahead and just view them in the world, just so I can see
what is going on. And a nice way for me to do that is if I wanted to,
I can actually go ahead and I can set the position
in the spawn directly. So, if I was to set
the position, I actually can override this
with anything I want. But one really nice little
thing here is if you use HLSL, if you do not want to roll
a full-on node graph to do something very small, or, like, a dynamic input script
for something small, you can always just literally
type straight HLSL into this and say, I want this to be
a three-dimensional position built from particles.uv with 0
on the Z axis, multiplied by 400, and I can then assign
all of those values to their position in space. For anyone that is wondering
what just happened, what happened was,
if they have all got a value between 0 and 1, if I multiply
that value against 400, they now all have a value
between 0 and 400. And if I set that value
between 0 and 400 to be their position in space, we end up mapping them into a nice little square.
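In text form, the expression typed into that dynamic input is roughly the following, with the multiplier of 400 taken straight from the talk; the function name is just for the example.

    // Map the stored unit-square UV onto a flat 400 x 400 square at Z = 0.
    float3 UVToDebugPosition(float2 ParticlesUV)
    {
        return float3(ParticlesUV, 0.0f) * 400.0f;
    }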
So, that is the first thing that I want to do. And the next thing
that I am going to do is I am going to go ahead, and I am going to go
into my particle update, and I am going to sample
a texture, okay? Now, sample texture
requires two things. It needs to know the image
that is being assigned, and it needs to know the UVs of
that position in space. So, to do this, I am
literally just going to go, UV is the variable
that we just created. Particles.uv. And the texture
is going to be the-- it is going to be the texture of
whatever we want to have there. So, if I were to type in
demo, demo logo would do. Now, we are not going to see a result immediately
for two reasons. One is for anything
to handle texture data, we need to go ahead,
and we need to say to it, you are being processed
on the GPU. So, I am going to scroll up
to the top, and I am going to say,
you are no longer on the CPU, you are on the GPU, okay?
Just select that. Now, the next thing
that I need to do is I am going to scroll down,
I will just keep you running. The next thing I am going to do
is I need to scroll down and I need to actually set this
to be the color. This is one of the mistakes
that a lot of people have when they are coming
from Cascade into Niagara. What they often do
is they assume that if they drop a module, it will just do something
related to what it was, because that is how Cascade
had to work. Sample textures should just be
setting the color, right? No. Because we could use that data
for all sorts of things, and we do not necessarily
want it to override color. Instead,
if I open up sample texture, you can see
that what it is doing is it is setting the value
of sampled color and sampled UV. So, it is actually
getting that information and storing it inside the module so that anything else can
go ahead and interact with that. The reason that is really useful
is that if I get this, and I look at my system, if I move this above
the color setting, there is nothing to stop me on
the color from overriding that with the one
that we just brought in, right? So, if I said access
sampled color, this will now sample them according to the texture that we are looking at. It is going to bring the R, G, and B in.
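As a rough sketch of that pattern, in illustrative HLSL with made-up names rather than the module's real internals: Sample Texture just writes into intermediate attributes, and the Color module can then opt in to using them.

    // Sample Texture: store the result, do not force it onto anything.
    void SampleTextureModule(Texture2D Tex, SamplerState Samp, float2 UV,
                             out float4 SampledColor, out float2 SampledUV)
    {
        SampledColor = Tex.SampleLevel(Samp, UV, 0);
        SampledUV    = UV;
    }

    // A later Color module can choose to read that back as the particle's color.
    float4 ColorFromSample(float4 SampledColor)
    {
        return SampledColor;
    }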
All right, that is kind of cool, but let us do something better. First off, if I bring in any
colored texture as well, you can see that
it is going to bring in all of that color data,
too, right? Okay, that is
kind of interesting. But, remember that drone
that I showed you earlier? So, that drone that
I showed you earlier, this guy, actually has a render target
stuck to the bottom of him. And what that means is that
if I look at this, that is an image
that is constantly being created by the drone as it flies around. Come on, buddy. Start moving!
There we go. So, you can see here that that has actually got
a drone flying around, capturing that render
target, okay? So now, let us get A and B and, like,
put them together, all right? So, let us look at this
Niagara system that we have got, and say,
hey, what if the texture here was actually
the camera texture, okay? There, do you see that there?
I am going to enable real-time, so it is a little bit
easier to see. So, it is a little hard to see, but see the yellow dots
that are starting to appear? The blue dot here? As the drone is moving, it is
actually updating the texture according to the information
that we are seeing, okay? Now, this is going to be
more visible if I were to go ahead
and shrink these particles down. But for the time being, I just
want you to understand it. Because here is what
we have got now. We now have particles
representing a texture, okay? And RGB of this texture
are the color data. But, here is the cool thing. The alpha channel
represents the depth. So, the alpha channel actually
represents the distance that every pixel
is away from the camera. Here is why that is cool.
If I know the kind of lens
we are using, there is nothing that stops me
from re-projecting that color back into world space by saying,
you are at this angle, therefore,
you are this far away, therefore, your particle
is at this location. Everyone with me still? So, that is really cool,
because if I go ahead and do that
by going particle update, I want you to now re-project
the camera. First thing it wants to know
is the depth value, okay? Now, the depth value is going
to be how far away the pixel is. And we know this already, because we know
that there is a value that is being stored
in the linear color. We know that it is
in the alpha channel. So, I can say,
pull the alpha channel from this arbitrary thing
that we created, okay? The next thing it wants
to know is the FOV, which defaults to 45, and it also wants to know
the position in UV space, which is again, simple enough,
just the UV data that we created. Now, the final thing
that it wants to know, and you can see, it is starting to take effect in the
distance there, is which direction
to project it in. It does not know which way
the camera is facing. So, all I need to do is tell it
to project downwards, okay? And to project downwards,
I literally set the Z of this direction to -1.
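Putting those inputs together, the re-projection is conceptually something like the sketch below. This is an illustration of the idea, not the actual module source: it assumes the capture looks straight down, and it follows the talk's tangent-of-the-angle-times-distance convention, so depending on whether the FOV parameter is the full or the half angle you may need FOV / 2 instead.

    // Rebuild a world-space offset from the capture: depth comes from the alpha
    // channel, plus the particle's UV, the lens angle, and a projection direction.
    float3 ReprojectFromCapture(float Depth, float2 UV, float FOVDegrees, float3 ProjectDir)
    {
        float HalfExtent = tan(radians(FOVDegrees)) * Depth;  // half-size of the view at that depth
        float2 Planar    = (UV - 0.5f) * 2.0f * HalfExtent;   // centre the UV, spread it sideways
        return float3(Planar, 0.0f) + ProjectDir * Depth;     // push out along the projection direction
    }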
Oh, and I forgot to zero this out. Now, it might be
a little tricky to see, so I am just going to crank up
the size for the time being. I am going to turn this up to,
say, 20. Now, see down the bottom there? What we are starting to see,
here, I am going to make
you way bigger. There we go. What we are
actually seeing down here is the world is getting
recreated in particles that are equal to
their position in space, okay? Now, that is a really kind of
a weird way to show it, but we are going to
make it cooler in a second. I just want to start here. So, the next thing that
I need to do is this. I have got particles that are way
down below where my emitter is. Now, the reason
for that is this, is when I re-project the camera,
what I am actually doing is, I am putting the particles
all the way down to how far away the surface
was from the camera itself. So, I need to move it upwards. So, I need to offset that
position by a certain value. Once I have got it around
the emitter, I need to scale it inwards
by the size of the lens, and then scale it outwards
to the actual size that I want it to be, okay? Now, I could do this in a
number of different ways. If I wanted, I could just be
crafty with HLSL, but let us say I wanted
to use a node graph. All I would need to do
is right-click, go to FX, and create a new Niagara module that was called 'adjust
camera projection', okay? And this is where
I can just start to roll my own additional solutions
to this stuff. First thing I need to know
is some sort of vector that is storing
the position of the particle. The next thing I need to
know is another vector that stores
the offset direction. The next thing I need to know
is the distance away that that currently is, which is going to be
called 'offset distance'. Finally, I am going to need
to know the angle of the lens, and the size that we want this
to end up as, okay? Now, again, this is the most boring
section of this presentation, but just give me a second
and we will get there. So, we have got
all of these things. Now, if I want to offset
its position, all I need to do is multiply
the direction that we want it offset
in against the distance, okay? And I am going to put in
some default values here, just so that
they are seeded by default, and the direction -- well, it is currently offset downwards.
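The first step of that custom module is just this, sketched in illustrative HLSL with made-up names rather than the actual graph:

    // Push the re-projected position back along the offset direction.
    float3 OffsetProjectedPosition(float3 Position, float3 OffsetDirection, float OffsetDistance)
    {
        return Position + OffsetDirection * OffsetDistance;
    }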
Okay, now, if I was to plug this in, there is nothing that would
stop me from just getting this, putting it into there, and
putting it into there, okay? So, if I did this
and just quickly applied it, I am just going to add this
to the particle update at the end, okay? I am going to add
set position-- oh, no, I want to add
camera projection. So, adjust camera projection.
And if I move this, we are now going to see that
this is moved up, but it currently does not know
the camera's position. And the reason for that
is I need to set the default values
for here as 45, and the current position is just
going to be down to the variable that we have from that
camera projection. And when we do that,
we can now see that it is creating it
around the origin of the system. And because of that,
I can actually shrink these down to be a little bit more visible. And if I wanted to, I could
crank up this spawn rate as well to get
something pretty cool. Going to hit save. Now,
if I look at that in the world, let us go back to this a minute,
and I hit play ... Ooh. One moment.
Have I missed a variable? Sorry, one second. 45,
opposite direction, -1, 7,000 scale
is irrelevant right now, position is camera projection. Okay, I will work this out
in just a moment. I am going to move you outwards
so we can see you. Okay, that is working. The particles
are just too small to see. Good job, Chris,
you scaled them down too much. So, I am going to set these to 5
for the minute, just so we can see them.
There we go, maybe 20. Okay, we are getting something
a little more interesting. But see how I can actually
see the environment? And as I run that drone
so that it is driving around, I can see the actual system
changing to compensate for the surfaces
of that environment. Next thing I need to do
is scale it down into itself, and then scale it out again. And that is actually as simple
as getting this position and saying to it, I want to transform
this position into local space, so that you crush
in on yourself. So, we are going to get that, and I am going
to transform position. And I am going to say you go
from world to local. Okay? I am then going to divide this
by a size. Because if it
is 5,000 units across, and I divide it by 5,000, it is now a value
between 0 and 1. So, I am going to divide you
by a size, and the size I am dividing it by
is actually rather simple. At the moment, the distance
is the ground to the lens, okay? Which means that the distance
from here to outwards, from the lens angle, is equal to the tangent
of this angle multiplied by the distance, all right? I know I am busting out
the high school math, but let us just roll with it. So, I am going to get
the tangent of this -- oops, sorry, tangent in degrees -- multiplied by this distance; that is what we are dividing it by. And then we are multiplying it
by the scale of this to make it as big as this scale. Because if we bring it down
to a value that is between 0 and 1, and then I multiply it by a value of, say, 600, it is now going to be a value
between 0 and 600, okay? So just punching that in. Excellent, and now we need
to do one last thing, which is convert it back from
local space into world space. Plug you in. And in theory, in theory... all right, just a second.
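Before the debugging, here is the whole scale step in one illustrative HLSL sketch. It is not the actual node graph, just the math described above, and the names are made up.

    // Crush the offset position into local space, normalise by the area the lens
    // covers (tangent of the lens angle times the capture distance), then blow it
    // back up to the size we want the map to be and return to world space.
    float3 ScaleProjectionToMap(float3 WorldPosition, float4x4 WorldToLocal, float4x4 LocalToWorld,
                                float FOVDegrees, float CaptureDistance, float MapSize)
    {
        float3 Local       = mul(float4(WorldPosition, 1.0f), WorldToLocal).xyz;  // world to local
        float  CaptureSize = tan(radians(FOVDegrees)) * CaptureDistance;          // size the lens covers
        Local              = (Local / CaptureSize) * MapSize;                     // to roughly 0..1, then out to MapSize
        return mul(float4(Local, 1.0f), LocalToWorld).xyz;                        // local back to world
    }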
I am going to load this up, I am going to scroll down. Now, the scale is currently not set to its default -- there we go. So, I am going to delete this
and just look at the one inside. This is now what
we are looking at. It is super bright.
Give me a second. Let us go ahead and fix that
up, okay? Now, there we go. So, we can now see that we are
actually starting to create a pretty interesting
interior holographic map. You are not allowed to grin yet. We can make this look way
cooler, guys. Come on, bear with me.
So, we have got this spawning as particles that are just doing
their thing, all right? And that is kind of interesting. But remember the
first thing we did when we created that
cylindrical spawner? What about we go ahead
and do that? What about if I go here
and I say, be gone with you,
HLSL that we created. Instead, I want
this cylindrical location, and instead of saving this off
to particles.position, what if I turned this
into something called particles.projectedposition,
okay? Now, if it is a
projected position, this means it has just
automatically gone off and created a new variable to store that that
we can access. Now, the reason
I might want to do that is actually rather simple. If we stored
the projected position instead, and I disable the intrinsic
calculation over here, we can see that the particles are now spawning
as a cylinder, right? But see how some of them
have little red dots and blue dots in them? That is because they know
where they are going to go. I have just set their position
differently, okay? Now, here is the fun part.
If I go to my update loop, and I say, I am going to set
the position of the particle, but I want to set it so that over time, it smoothly -- whoops, sorry, wrong one. There we go. It smoothly lerps over time from
itself to its target value, which is the value
that we just made up, what we are left with is this. Now, that might be
a little bit weird, but what these are doing is, those particles are
trying to make a map, but they are not fast
enough to do it. They are dying
before they get there. But I could just crank up
my convergence rate. And now the particles are whisking away from where they start into their final position, all right?
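Per tick, that update is essentially the lerp below, sketched under the same caveats as before: illustrative names, not the real module.

    // Each update, move some fraction of the way towards the stored target,
    // Particles.ProjectedPosition. ConvergenceRate can later be driven by a curve.
    float3 ConvergeTowardsTarget(float3 Position, float3 ProjectedPosition,
                                 float ConvergenceRate, float DeltaTime)
    {
        float Alpha = saturate(ConvergenceRate * DeltaTime);
        return lerp(Position, ProjectedPosition, Alpha);
    }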
But we can do cooler than this. Because why do we not get this convergence rate and change it over time? Why do we not say
convergence rate, you are set to a curve? And when you are
first initialized, I want you to actually
have a convergence rate of near zero,
all right? That means that as the particles
create, see, they are now chilling out for just a second
before whisking away? That is because
when they first start, they are not allowed
to converge, and then they are. And what we end up doing
is creating something that, over time, if I add this and I can change their
convergence rate as I see fit, and I can scale the convergence
rate max and mins, say to 10, here
is the fun thing. Those particles are now going
to dynamically move up and down. So, when the texture moves,
they go, whoop, I better move downwards,
whoop, I better move upwards. So, see, you get that
really cool, like, flow between the particles
as they move up and down. And the other fun part here
is that if I were to throw this into slow-mo, you can physically see each and every particle -- let us follow this yellow guy -- move over to exactly where it needs to be in the map. And then of course, depending on how much you want
to tweak this over time, you know, we can do that
without too much drama. I can change this, for instance. You know the color value
that we set before? Nothing stops me
from getting this color value and saying, hey, you know what? When you start, you are actually
going to be green. And then I slowly want you to convert from green into the actual color that you are becoming, you know?
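That color reveal is, roughly, just another lerp. The sketch below assumes a green start colour; both the colour and the names are made up for the example.

    // Fade from a hologram-style green to the sampled texture colour over the
    // particle's normalised lifetime.
    float4 RevealColor(float4 SampledColor, float NormalizedAge)
    {
        const float4 StartGreen = float4(0.0f, 1.0f, 0.2f, 1.0f);
        return lerp(StartGreen, SampledColor, saturate(NormalizedAge));
    }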
Or, if I wanted, I could create a dual ring effect by saying hey, this
curve that we created, what if we stop converging you
at some point, so we do like a real close
secondary ring, and then you crank up again?
So, if I set this up so we do a little bit of a hop before eventually getting up there, those particles are now going to whisk out, slow down, and then whisk out again. I am just going to bring that
in a little closer, and bring you a little lower. So, we should create
a secondary ring in the process. And again, this whole
thing is acting live based on a camera
from this guy out here. Now, here is what I am
going to leave you with, as we are kind of
running out of time. What I am going to
leave you with is this. Look, Niagara is
a pretty cool system, and obviously you
can do some stuff that you could not
do in Cascade, but I really want you to consider
what that actually means, okay? I am going to turn up
the particle count so we have something cooler. What that actually
means is this. I am not excited
for the explosions that you will make in Niagara. I am not even excited
for the beams or anything else, because you could
make them in Cascade and you could do pretty well.
What I am excited about is this. In any other particle system
that we have ever had, or any other engines
really had on this front, typically when an
artist hit a roadblock in what they were
trying to create, they would either have
to get a programmer, or give up
and do something different. And what I love about this is, any time you want to
reach under the hood and change it, and warp it,
and, you know, extend it
to whatever you want it to do, and add any sub-systems
that you want them to have, you can do it now. It is
not actually a limitation. And what I am excited about is,
all of the stuff that people are going to do that we have
just never seen in games before. Because this took 25 minutes while explaining
what I was doing, you know? And normally, this would be
an effect that you would fake with a 3D mesh
and some transparency. And you would be like,
I guess that is good enough. But when you actually
have these things, there is nothing to stop me
from spawning these particles from a character's skin, and literally turning the skin
into a map, and then back
into a character again. That is actually only
a couple of nodes. So, I am super excited about
where people can take this all, and I hope you enjoy
the rest of GDC. Thank you very much for
coming to this presentation. Stick around. We have got
learning theater stuff all day.