>>Amanda: What's up,
Unreal Creators?! We're at the top of the month
and that means new free Unreal Engine Marketplace content! Get ready for the long haul with
game-ready trucks and trailers, effortlessly add
dynamic radial menus, easily extract root motion
data for your animations, venture into lush,
tropical jungles, and transform your towns
into entire kingdoms with this month's content. New to the permanent
collection, head off on an exhilarating safari with
a collection of African animals. Download these free products
from the Marketplace before time runs out! While you're there, shop
over 800 products at 50% off during the August Flash Sale! Propel your pipelines with
dazzling particle systems, advanced plugins and
Blueprint systems, and out-of-this-world
environments. The flash sale ends Friday,
August 7 at 11:59 PM Eastern. For our community
developers in Europe, there are brand new training
opportunities now available to you from our Authorized
Training Centers. These partners
will provide live, hands-on training in a
virtual classroom-like environment--available
in various languages. Visit the Unreal Engine feed
to find out more and sign up! As we eagerly
anticipate season 4, HBO's hit sci-fi
series Westworld explores the technological
landscape of the near future, from lifelike robots
and advanced AI to autonomous vehicles
and blockchain. In season 3, showrunners
Jonathan Nolan and Lisa Joy and VFX Supervisor
Jay Worth also turned to futuristic
techniques behind the camera, with help from Unreal Engine. And now to give a shout-out to
our top weekly karma earners! Everynone, TheKaosSpectrum,
ClockworkOcean, Tuerer, MMMarcis, Knapeczadam,
EvilCleric, G4mble270, Shadowriver, and Louap Olinad. Thank you all so
much for your help! First up on our spotlights is
the absolutely endearing Epic MegaGrant recipient
Mitzy Makes It. To be released as a children's
book and interactive app, Mitzy Makes It introduces STEAM
concepts to young children and aims to inspire
more girls to pursue STEAM fields later in life. Learn more about Mitzy and
Shazzy Gustafson's development on the project via
her website, mitzymakesit.com! And to totally change gears: released earlier this week, Hellbound is a classic FPS action game focused on speed, gore, and metal music from Argentinian studio Saibot Studios. Not for the faint of heart,
download Hellbound on Steam. Last up, take control
of nature in couch co-op puzzle platformer
Keepers of the Trees. Test your platforming
mettle, work with a friend to solve puzzles by
growing special plants, and explore a lush,
fantasy forest world. Available on Steam now! Thanks for watching our News
and Community Spotlight! >>Victor: Hi,
everyone, and welcome to Inside Unreal, a weekly
show where we learn, explore, and celebrate
everything Unreal. I'm your host Victor
Brodin, and my guest today is none other than Asher Zhu. And right as that
happened, we had a little bit of an
interesting camera flip that you might
happen to see again throughout the stream. Welcome to the show. >>Asher: Yeah. Hi, hi. >>Victor: Hope y'all out
there are doing all right. >>Asher: Yeah. >>Victor: Asher, what are we-- >>Asher: I need to mute the stream. OK? Wait, sorry. Again? >>Victor: There we go. I was asking what are we
going to talk about today. >>Asher: We're going to talk
about the various approaches you can do with volumetric
effects in Unreal Engine, including volumetric
fog, volumetric clouds, which is a new thing
that's upcoming in 4.26. And some Niagara particle effects built with the Niagara system that blend very well with the two systems I mentioned. So yeah. My camera will do that
thing every 20 minutes. And don't mind. It's just heating up. >>Victor: Well, I
think we'll survive. All right, whenever
you're ready, feel free. The screen is yours. >>Asher: All right. Cool, cool, cool. Wait, I'm on? >>Victor: Yeah. >>Asher: All right. So this is the approach I made for the previous thing I did. This one-- so this thing, I called it the dry ice demo. I used volumetric fog for it. It was pretty popular on Twitter. So I was asked to talk about it, and I was very honored and excited to be here. And I thought, because volumetric fog is a 4.16 feature, I decided to do
something new with it. And the hook, the
catch is I only had one week to do everything. And it was quite a ride. It was scary when I think back. But yes. I did it because the new sky was pretty popular on Twitter too. So that was this one. A cloud render, I call it. And I'm very impressed by Unreal Engine's current state because of its ability to assemble a wide array of features, and modules, and technology to create something new, something novel. And I could do that with a lot of ease, given that I know what I'm doing. And I'm just very happy with what I can do within one week. OK, I'll wait until
the video plays. Maybe some guys haven't seen it. >>Victor: It's beautiful. So we don't mind watching. >>Asher: Thank you. Guys, I'm a little bit scared. I'm nervous. Can we get people in the
chat, please, to ease my mind? >>Victor: For visibility, you should know that the stream was a bit of a last-minute call. So Asher definitely
put a lot of effort into being able to put this
together for me, or for us in such a short amount of time. >>Asher: Thank you. Well, that makes me very happy. I always wanted to do this. >>Victor: Yeah. And we're excited
to have you here. >>Asher: All right,
and another thing is because I have
limited time, I might not be able to prepare the
stream as perfectly as I hoped. I have my bullet points. But if I haven't explained something clearly enough and you have questions, feel free to type in the chat. And OK. Let's get into it. I'm going to talk about the-- Oh, this is the video. I thought it was the engine. That's why it's not moving. OK, I'm going to talk about the
dry ice demo for a little bit because it's-- wait. I forgot to put up my slides. Yeah. Because it's a default solution for volumetrics. And it's very stable. It has been out since 4.16. It's very efficient and scalable, and it has shipped in released games for two or three years now. And it's just a very good starting point to learn about the volumetric system and volume materials. Excuse me. I need to find a good place
to put the Twitch chat. Oh, there we go. OK. So I'm going to
open a sample map. The way volumetric fog works is it can work with light very well. What we have here is a kind of light shaft. And it can work with spot lights, local lights, and many other light types. Like this, you can see there is a cone here. And it works with shadow too. I am here. Yes. This is a light. It's actually occluded by the path here, and the volumetric fog will reflect that. And that's very nice. The way it works is the fog is voxelized and presented as a volume in front of the camera. So if we type in r.VolumetricFog.GridSizeZ here, and type in a small value, like 32-- the default is 128-- and disable reprojection, you can see the slices of the volumetric fog. It gives you a pretty good idea of how it works. And because, of course,
we have the option to change the view distance
because it's a volume. It's a 3D texture fitted to the camera frustum. So we can trade accuracy. We can trade distance for accuracy. This is at 6,000 centimeters here. If we pull that nearer to the camera, you can see the accuracy here is increased. But, of course,
then it doesn't have the range it needs to have. So that's a tradeoff
you might want to make if you are using
volumetric fog in your game. So the temporal reprojection, as I said before-- basically, this is it. And you can see it's much smoother, even with the same sampling count. That's because it's using the past few frames to kind of-- not blur, but filter the volumetric fog to make it look smoother. But there are still artifacts. That's because we have a very low sampling size-- the grid size, the 32 here. If I type in the default value, which is 128, then yeah. There is smooth fog. I had something here. Wait. Yeah.
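For reference, the console variables being adjusted in this part of the demo appear to be the standard volumetric fog ones; treat this as a hedged reading of the stream rather than an exact capture of what was typed:

    r.VolumetricFog.GridSizeZ 32             (number of depth slices in the fog grid; he compares 32 against the default)
    r.VolumetricFog.TemporalReprojection 0   (turns off the temporal filtering being discussed here)
    r.VolumetricFog.HistoryWeight            (how heavily past frames are weighted; it comes up again in the Q&A)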
This is the main thing that you can play with to learn more about volumetric fog-- take a screenshot, if you want. These are the parameters I used just now. >>Victor: We got some
questions in chat. They're wondering which
Engine version you're on. >>Asher: I'm on,
unfortunately, the dev-rendering branch-- that's the development branch of Unreal Engine. You can pull that off GitHub and compile that yourself. And the volumetric cloud feature we're using today is going to be released in 4.26, which is at the end of the year. Of course, the preview versions will be before that. But if you want to play with it now, you could just do that-- compile the engine from GitHub yourself. And I highly recommend learning how to do that because it's very fun, and it will make you feel like a pro immediately. So yeah. And next thing. Let me check my
bullet points. Nice. All right, so this works with
local light and translucency. Let me adjust the
global fog a little lower. So this is volumetric fog too. It just uses a local volume material to fill in the cube. The cube just acts as a container. So if I go near it-- you can see that because I turned the global volumetric fog down very low, there is not-- you can't see anything here. But if I go near it, you can see that. Wait. What's happening? You can see that the fog is affected by the light too. And I have a translucent object here if I drag it in. It blends with the translucent object pretty well. And that's actually pretty important, because translucency usually has some problems in the deferred rendering pipeline. And how did I make this? It's actually very simple. It's just a volume texture hooked directly into the extinction, which you can understand as density. And volume material is
something like this. This is a real 3D texture. Maybe some of you guys have known about the pseudo-volume, the sliced 3D texture, which is actually a 2D texture laid out in a way that you can treat as a 3D texture. But this is an engine feature now. It's actually just a volume texture-- a real 3D texture. And you can change the view mode to trace into the volume to see how it looks as a cube. So the UVW is hooked to the Absolute World Position and multiplied by a very small value, because world position is in centimeters and because this is huge. If we don't multiply that, we'll get something very dense. So this is just a way to present it better. Then the power is just to try to make it sharper. And the bias is just a value that's added to the final density. If we increase that, you can see that it just fills in. And if we decrease it, it slowly fades out. And it's pretty straightforward how a volume material works. And if you want to do that, it's very simple. Just select Volume as the material domain.
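In shader terms, the simple volume material he's describing boils down to roughly this; the parameter names are illustrative, not the demo's actual names:

    // Minimal HLSL-style sketch of the dry-ice volume material described above.
    // DensityVolume, UVWScale, Sharpness, and DensityBias are placeholder names.
    Texture3D DensityVolume;
    SamplerState VolumeSampler;
    float UVWScale;     // very small, since Absolute World Position is in centimeters
    float Sharpness;    // the "power" that sharpens the noise
    float DensityBias;  // added to the final density; raising it fills the cube in

    float SampleDryIceDensity(float3 absoluteWorldPos)
    {
        float3 uvw = absoluteWorldPos * UVWScale;
        float  d   = DensityVolume.SampleLevel(VolumeSampler, uvw, 0).r;
        return pow(saturate(d), Sharpness) + DensityBias;  // hooked into the Extinction output
    }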
And now, the nice thing about volume materials: volume materials were first introduced along with volumetric fog in 4.16. And that volume material was only for volumetric fog-- until now, until volumetric cloud is out. So we now have a unified volume material for both fog and volumetric cloud. If-- OK, I'll show-- I mentioned the dev-rendering branch, which is the development branch; if you compile that-- just for the record, I'm on changelist 13979994. I've been using this
version for a weekend. It never crashes. It has some small bugs, but it's definitely good enough to play with. And you can just type in cloud, and you will find the Volumetric Cloud here. I already put one in my scene so I'm not just dragging that in. If I set it to enabled, you can see. So this, this volumetric cloud actually uses almost the exact same volume material as the volumetric fog. This part is just-- it's the same except the scale is much larger, because the sky is big. Truer words have never been said. So yeah. And this is not directly
connected to the extinction, because we have an optimization: you can hook that into the Volumetric Advanced Output as a conservative density. What it does is-- this is used to accelerate the ray marching by early-skipping expensive material evaluation. What that means is, for the empty areas, we don't want to ray march in small steps over the whole range, because that's very expensive and a waste. So we can just hop quickly through those. And that loops back into the Volumetric Advanced Output. So this is basically the same as this, except this is an optimization. You can-- yeah. And I have another
layer of filter. I use Absolute World Position-- the B channel, which is just the Z of XYZ-- to filter the density. What it does is, if the position is higher than the height I'm setting here, it gets subtracted from the density from the volume texture sampling. So if I lower the value here, the clouds grow thinner.
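The height filter he's describing can be sketched like this, assuming a simple linear falloff above an artist-set height (the actual graph may differ):

    // Illustrative height-based filter: world-space Z (the "B channel") above a cutoff
    // is subtracted from the sampled density, thinning out the top of the volume.
    float ApplyHeightFilter(float density, float3 absoluteWorldPos,
                            float cutoffHeight, float falloffRange)
    {
        float aboveAmount = saturate((absoluteWorldPos.z - cutoffHeight) / falloffRange);
        return max(density - aboveAmount, 0.0);
    }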
This is a video presentation of how this works. So yeah. What you can do-- I will explain this later-- and what you can do is we can just put this into a material function and use a static switch to connect it to both the volumetric fog and the volumetric cloud. So the cloud will use the same material. That takes us to-- yeah. I figure I haven't explained
things super clear. If you guys have any questions,
just type in the chat. >>Victor: Someone is
wondering if it's possible to set a bounding
box for the volumetric clouds, for example, if you didn't
want the clouds to render beyond 1,000 feet. >>Asher: Yeah. Of course. That's building parameter, which I plan to
talk about it later. >>Victor: Sweet. Yeah. >>Asher: OK, I'll just
talk about it later. All right, so this
demo-- I will explain more of how to make this look beautiful. All right, so this is
actually just three noises stacked together. I could disable the panning, disable the two layers of detail. You can see it's actually just a pretty simple noise, filtered with the distance field. I could increase the distance field height here, similar to what I did just now with the cloud. As you can see, it works in a similar way. And if I pan this-- oh, wait. Sorry. No. Yeah. If I pan this, this is just a single layer of volume texture. So we can say it's not very interesting. But it's still a little bit interesting though. It's just a panning texture. And if we add another layer of panning texture on top of it, immediately, it has this kind of fluid-sim-like feeling to it. It almost looks like it has some character of real fog. And then we'll add another layer. That looks much better. And we can use curl, some curl noise, to distort the input UV to make it a little more interesting. It might be hard to see in the stream. And I'll just pump it up to 50. Now, this is what it does. You could make it super stylized, even. I just keep it at a small value. But yeah. This is basically how you make very nice flowy clouds.
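A rough HLSL-flavored sketch of the noise stacking being shown here: a base shape eroded by two panning detail layers, with a curl noise texture distorting the UVs. All of the textures, scales, and weights below are illustrative:

    Texture3D BaseNoise;
    Texture3D DetailNoise;
    Texture3D CurlNoise;
    SamplerState Smp;

    float StackedCloudDensity(float3 worldPos, float time,
                              float baseScale, float detailScale,
                              float3 panSpeed, float curlStrength)
    {
        // Curl noise offsets the sampling position to break up the repeating pattern
        float3 curl = CurlNoise.SampleLevel(Smp, worldPos * detailScale, 0).xyz - 0.5;
        float3 p    = worldPos + curl * curlStrength;

        float base    = BaseNoise.SampleLevel(Smp, p * baseScale + panSpeed * time, 0).r;
        float detail1 = DetailNoise.SampleLevel(Smp, p * detailScale + panSpeed * time * 1.7, 0).r;
        float detail2 = DetailNoise.SampleLevel(Smp, p * detailScale * 3.1 - panSpeed * time, 0).r;

        // Detail layers erode the base shape rather than adding to it
        return max(base - 0.35 * detail1 - 0.2 * detail2, 0.0);
    }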
So there are a lot of other parameters here. Most of them are art-directable parameters, like the curl scale. You could make this 25 or something. Yeah. This looks almost like the clouds in Fortnite. And we can change the height. I feel like I don't have
to talk for this part. You guys know. >>Victor: It Looks beautiful. >>Asher: Thank you. And yeah, of course, color. And the color here is actually
controlled by a color. >>Victor: What's that? >>Asher: A Curve Actor. Basically, what it does-- you have a curve
here, this curve. Then the curve is
automatically baked into a texture,
which then in turn can be read by the
material graph. So I have a-- Oh, here. Yeah. This is a curve atlas, atlas
in the way that a lot of curves are just compiled into
the single, one texture. And the reading and writing the
phrase is pretty convenient. You don't need to worry
about this texture part. It does that for you. What you can do is
basically just a change. The x-axis is a density,
I think, for this. For this one, the x-axis is
a distance to the ground. Wait. Let me turn that back. And if I pull this down, it gets darker near the ground. And the curve of it is
too fake ambient occlusion. All right. All right, so I'm not going
to talk about this one too much because I have read
a pretty comprehensive blog about it. You can type in Asher.gg. That's my website. And there is a post about that. I explained exactly how
to make these effects. A little bit about
volumetrics and local controls, 3D noise, and how
to make them, how to use distance function,
how to stack noises together to get good results. And how to make fake
shadows for volumetric fog, and some optimization tips. Yeah, check that out at Asher.gg. So let's skip to the cloud demo. Is it feasible to use it in
large open world environment? Yes, that's why I do it. That's not why I did it. That's why the cloud
component is developed. I keep calling it
cloud component. It's just the
volumetric cloud system. And this is the same-- just showing off a little bit. Wait. Let me type in a command to make it look sharper. And yeah. First, let me give you a quick tour
of the cloud, the volumetric cloud. So the layer section is just basic controls, like the question I was asked before. This basically controls the range-- I set this pretty low because I want the cloud to be close to the ground. If I change that-- yeah. You can see what it does. And we have the layer height-- we just enter the height. And this-- Tracing Max Distance-- to control the range. And the cloud tracing section is the most important to balance the visual and the performance-- the framerate is not good right now. And I used some values higher than usual to have a sharper picture for the stream. So the view sample count
scale is basically-- yeah-- the sample count. You can see what it does. And you need a higher count to have sharp image quality. But this has a linear cost. And reflection sample count-- I'm not using reflections. Shadow-- yeah, shadow is another thing that costs a lot. If I change that, I can see the shadow trace. It's very rough. You need to try what kind of value gets you just a good enough picture without too much cost. And we have a shadow trace distance, which controls the range of how far the shadow should be traced. So yeah. So if you don't-- so this is a ray marching approach, as opposed to the volumetric fog, which is voxelization. If you don't know what ray marching is, this should give you a pretty good idea of what it is. It marches rays-- spoiler.
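Since ray marching is being contrasted with the fog's voxel grid, here is a bare-bones conceptual loop; this is not the engine's actual cloud renderer, and SampleCloudDensity/LightAtPoint are assumed helpers standing in for the volume material and lighting:

    float  SampleCloudDensity(float3 p);  // assumed: the volume material evaluation
    float3 LightAtPoint(float3 p);        // assumed: light reaching this sample

    float3 RayMarchClouds(float3 rayOrigin, float3 rayDir, float maxDistance, int numSteps)
    {
        float  stepSize      = maxDistance / numSteps;
        float  transmittance = 1.0;   // how much of the background still shows through
        float3 scattered     = 0.0;

        for (int i = 0; i < numSteps; ++i)
        {
            float3 p       = rayOrigin + rayDir * (i + 0.5) * stepSize;
            float  density = SampleCloudDensity(p);
            if (density <= 0.0)
                continue;             // empty space: skip cheaply

            float extinction = density * stepSize;
            scattered       += transmittance * extinction * LightAtPoint(p);
            transmittance   *= exp(-extinction);
            if (transmittance < 0.01)
                break;                // effectively opaque, stop early
        }
        return scattered;
    }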
Yeah. I'll play with it a little bit, to show you what things you can do in video games now. Yeah. OK. It's all available on the dev-rendering branch right now. Yes. We should try that. I added a hotkey to change the time of day in the game. Let's just fly silently. >>Victor: Perhaps we need to make
some sound effects for Manny here. >>Asher: Can you do beat boxing? >>Victor: I can, but
I'm not sure I'm going to do it on the
Unreal Engine livestream. >>Asher: Do it. Do it. Do it. >>Victor: Maybe next time. >>Asher: All right. And the landing! >>Victor: And I just want
to apologize to chat. Why I'm so quiet here,
I have construction going on right outside my door. So I need to stay muted
for most of the time. >>Asher: All right, where is up? OK, this us up. All right, I have used a-- for what I call-- I will talk about that later. There is a plug in
the engine-- allows you to change the time of
day pretty easily and very conveniently. But, of course, it doesn't
do the night circle because for night, you really
want a completely different light setup than daytime. And you want different
particles and even different seeing objects. So yeah. I did that manually too. So I just have a
wonky switch here. Boom. This is very fun to play with. I feel like I could just do this for the entire stream and you all would be happy. Nighttime-- >>Victor: That's beautiful. >>Asher: Thank you. I was very happy making
this because I'm a hardcore DayZ fan and
I've been playing, so I will-- OK, I might do that again. And I've been playing
all the survival games before it was cool. And yeah. If this was DayZ or anything
like that, I would be so crazed. Yeah, OK. Don't forget about
the presentation. Yeah, so I need to introduce
you to a few features I made here. I forgot how to fly. All right, we'll look at what
happens when I get into the fog. Again, actually-- this is volumetric fog. But when I fly near it, it will be replaced by the volumet-- wait. Did I say volumetric fog? This is a volumetric cloud. And when I fly into it or near it, it will be smoothly replaced with volumetric fog. And you might be able to see the transition. OK, I'll make the-- this will be more obvious. This looks ugly,
but it will get the point across. Yeah. There is a small color
hue change up here. If you didn't see it, it's OK. I'll explain it a
little better later. I just want to show--
I've disabled the-- That's transitional
from cloud to fog. Is that before or after
the particle? I'm sorry. All right, I'll just-- oh, now with the-- weird. I honestly can't
remember where that was. Have I talked about the
particle? The Niagara-- Have I talked about the
ShowFlag.Particles 0? >>Victor: I don't think so, no. >>Asher: Yeah, OK. So for the near distance,
we are using volumetric fog, and it has different
layers of the fog to make the ground
look more interesting. Because without that,
it looks really dull. It doesn't feel like you are inside a cloud. And yeah. I'm pretty proud of this one, because it gives you the feeling that you are inside the cloud. If we disable the particles by
typing ShowFlag.Particles 0, we can-- OK, it's not a good
position to show that here. So this is with
particles, right? And this is without. Yeah, so the ground
part is another layer, which is added on top
of the big layer that replaces the volumetric
cloud in the near distance. And yeah. Let me explain the cloud
material a little bit. Question. Is the unified volume material already in the engine-- so volume materials for anything
other than volumetric fog? Yes. That's the idea,
although you need to do a little work to make the switch between the two work without problems. But yeah. That's what I'm going to talk about later, with a more practical example. All right, about
the Sun/Sky, you can enable this plugin just by typing sun. You will find the Sun Position Calculator there. You enable that. Then you can drag a Sun and Sky Actor into the scene. And you will have this Actor that you can take and put at any position on Earth-- latitude and longitude, time zone, and north offset. You have that set. And you can have the month
and day of the year. Then you can just
drag solar time to make the sun move and
the sky will move along. The sky intensity will
adjust with that too. This will be more clear. Yeah. What I did in game
is just setting this when I press a key, a
very nice thing to know about. And yeah. It helped me a lot to make the demo look more versatile in a very limited time. And yeah. Let me adjust this to
a nice looking angle. Here? OK. So to the cloud material,
it's a little bit similar-- not a bit. It's very similar to
the dry ice material I talked about before. And, in fact, it's
actually just-- I just made this on top of that. The important thing,
again, is stacking noises. If I disable the detail noise,
two layers of detail noise, it's just a shape,
general shape of the cloud on top of the mountain. And I can make that pan to-- I'm not enabling
the time, because I need to know where the good spots are to show off the features. But if we enable the time, it can actually move-- I'll set it to a higher speed. And if I enable another layer
of detail there to subtract the density from the cloud,
it will look much nicer, and then another layer
to give more details. Then this doesn't
look very good. That's because we need temporal
filtering to smooth out the shape. But that doesn't work very
well if the cloud is moving very fast. So 0.3 should look-- Yeah, this, now it
works pretty OK-ish. And, like I say, there is
a huge chunk of blue here. That, you could change by
using art-directable parameters like ambient-- multiscattering ambient occlusion. If we lower that value, you can see it looks much
nicer from this angle. But the problem is if
I go into the cloud, it doesn't have
too much contrast to make it feel believable. So that's a balance of
art direction that you will choose. And next thing-- next thing,
should I talk about this one? All right, yeah. This is the stacking noise part. And how can we control
how the clouds blow over the mountain? It looks OK. I can show you this blows over the mountain. That's according to the height map, which is similar to the distance field we mentioned in the dry ice demo. But because it's a height map, it can have almost unlimited range. Distance fields have a limited range, because they require every static mesh to have a pre-built mesh distance field, which can be super huge. And you can actually adjust the distance field-- as was just mentioned, in one sentence: you can choose how far the distance field will extend from the object, the mesh distance field range, as I've just mentioned. Yeah, that's not related to this material here. What I'm talking about is, because it's a height map, you can do this. Yeah, that's the main thing
that we can use to art direct where we want the cloud to go. We have the max distance and the clamp, the height map clamp, if we want some clouds in a uniform layer, a uniform height like this. All right, it looks good now. But if you want to have-- like, if you stand here and you want to have a huge layer of cloud below this, you can do that, of course. And yeah. The height map can be controlled by an editor widget I made and-- yeah, I know you guys are very curious about this. But I need to explain some other things before I can get into this. Well, it basically can add another layer of height onto the landscape. So we can control how we want the cloud to be. I'll just do a
quick, little example and switch to the next topic. Hey, yeah. Here. Yeah. That's all you can do
with an editor widget. Wait, this doesn't
look good on that one. I kind of feel like I need to
do a better example of that. Make smaller strokes. There we go. There, you can
click, click, click, make clouds around the
mountain, something like this. All right, I just
won't do all of that. And yeah. I'm going to talk about the other parameters of the cloud. All right, what else do we have here? Details, time, another round, actually. Yeah, OK. I'll get into the actual
material nodes a little bit. It's the main thing here. So the height map I mentioned just now is just assembled here. There is a landscape right here. It's all white because it's larger than 1. But this is actually the height data of the landscape. And combined with the mask we can draw with the editor widget I made, that's added on top of the landscape height. Then we do some filtering here, with parameters. Then that's subtracted from the sampled density here. And this is-- yeah. This is the part where we change the cloud shape using the parameters just now.
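A hedged sketch of that landscape height subtraction in shader terms; the textures and parameters below are placeholders for the nodes he's pointing at:

    Texture2D LandscapeHeight;    // captured height data of the landscape
    Texture2D PaintedCloudMask;   // the extra height painted with the editor widget
    SamplerState Smp;

    float ApplyLandscapeErosion(float density, float2 landscapeUV,
                                float heightScale, float heightBias)
    {
        float terrain = LandscapeHeight.SampleLevel(Smp, landscapeUV, 0).r
                      + PaintedCloudMask.SampleLevel(Smp, landscapeUV, 0).r;
        // Remap the combined height and subtract it from the sampled cloud density
        float erosion = saturate(terrain * heightScale + heightBias);
        return max(density - erosion, 0.0);
    }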
It looks like a lot of parameters. But actually, it's just doing the same thing four times. This is the same volume texture. We just sample it, sample different channels of it, with different UV inputs, so you can have different intensity and scale and stuff to control how you want the volume material, the volume textures, to affect the cloud shape. And yeah. There is the other curl-- wait, I feel
like I have changed something that I didn't want to. Wait. This looks weird. I can't tell why. There are other parameters
like curl strength. I make that pretty small. Yeah, a similar thing as with the fog. It has a kind of stylized outcome. Yeah, this kind of has an interesting stylized feel to it. I can definitely do different things with this. And yeah. Another thing that I should
mention along with this is the camera distance-based density reduction. That's just basically how I blended the cloud material with the volumetric fog. Wait for me to do 50 on the-- So if I disable the volumetric fog, when I go near the cloud, you can see it's actually just pulling back. And this is how I blended in the volumetric fog. And actually, if I enable the fog, you can see it almost works without a trace. And there are some visual artifacts here. They're being addressed by our graphics engineer, Sebastien Hillaire. It will be fixed soon. But yeah. That's basically how it does it. All right, cloud, fog, cloud, fog. And that part is actually pretty easy. Just get the camera position, and get the distance, how far you want it to fade out, and give it a blending range. Then subtract that from the sample, the sampled value.
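That near-camera fade is simple enough to sketch; a hedged version with made-up names for the fade start and blend range:

    // Illustrative camera-distance fade used to hand the cloud off to the volumetric fog.
    float ApplyCameraFade(float density, float3 worldPos, float3 cameraPos,
                          float fadeStartDistance, float fadeBlendRange)
    {
        float dist = distance(worldPos, cameraPos);
        // 1 right at the camera, falling to 0 past the blend range
        float nearAmount = 1.0 - saturate((dist - fadeStartDistance) / fadeBlendRange);
        // Subtract, as described above, so the fog takes over in the near range
        return max(density - nearAmount, 0.0);
    }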
OK, questions? The temporal filtering seems to struggle with fast-moving fog. The volumetric beams seem to lag behind as the light moves. Yeah. That one, you could-- there's a parameter for it. It's r.VolumetricFog.HistoryWeight. You can change the weight to make recent frames weigh heavier than older ones. So it has kind of a faster reaction. But the temporal smoothing effect will be less intense. It's a tradeoff to
what you want to do. Is this supported
by the ray tracer? If so, what's the performance hit there? I'm not sure about that. But from what I understand,
it should have already been working with ray tracing. But if you're just using
it for the vista, it shouldn't matter
because ray tracing-- yeah. It's very tricky
to make ray tracing with visual effects. So it's actually a
pretty big topic. And volumetric fog doesn't work
with ray tracing shadow either. OK, make use of volumetric
fog and virtual textures with internal level design,
a prison facility. Oh, virtual texture. What do you mean by
virtual texture? Question-- is it possible
to cast shadow from this? Yes. It supports shadow. The volumetric fog supports shadow and lighting from a lot of different lights. But the volumetric-- wait. All right, the volumetric cloud only has directional shadowing and sky light shadowing. And it can cast shadow from the directional light onto the world, and it can cast AO, very approximate AO, onto the world here. Question-- would it be possible
to cast shadow from these? I already answered that. Can it work on PS4 or X1? How costly is it? Of course, it will work. But I'm not sure currently. But, of course, this is a major feature; it will work with various consoles. How costly is it? Yeah. This thing will run on PS4, but it's still a balance between visual and cost. If your game is entirely about volumetric clouds, of course you can dedicate more resources to that. Can I get rid of the banding in volumetric fog, as we just saw on the screen as well? OK, if you use the temporal-- of course, you will be using
temporal reprojection. You can adjust the history weight to make it smoother, or you can increase the-- wait. What's it called? Grid Z? The fog grid Z, yeah-- r.VolumetricFog.GridSizeZ-- to give it more slices. But it's a linear cost, yes. Wait. I feel like I would
be answering question over the entire stream. >>Victor: Yeah. I've been picking the ones
that are sort of-- some of them are about the same. And so I've been
highlighting them. And we can get to them later
in the stream, if you want. >>Asher: Yeah, yeah, OK. OK. You are picking questions for me. It's not like I don't
want to answer them. >>Victor: Feel free to
pick them if you want to. But I'm letting you know that
I'm keeping an eye on them and making sure that we capture
as many of them as possible for the end. >>Asher: OK, OK, cool. All right, yeah. If I have energy left, I will
answer questions in the stream. "I really like your haircut." Thank you. It goes with my t-shirt. I have a bunch of these,
so I don't have to pick. All right, let's
push the progress along. I was talking about the material and the camera culling. And I explained the hybrid fog-cloud solution. OK, let me show a few slides here. I kind of mentioned it already, but I want to show a few slides to compare volumetric fog with volumetric clouds. So volumetric fog supports all
the light types, all packed in. And it's-- yeah, in general, it's very easy to play with. And you can get results, and it supports local complexity, like translucency and other volumetrics. But it has-- ah, but it has a limit because of range. Yeah. It can typically just go around 100 meters. And you have a pretty low performance-to-cost ratio. But that depends too. Volumetric cloud-- it doesn't support all the versatile light types. But it's designed for far distances and supports the main types of light and shadow, like the directional light with shadow, and skylight AO, and it can cast AO and shadow onto the world. So what we want to achieve for the solution-- for an efficient workflow, and for the final released game-- is, first, we want a unified material for volumetric fog and cloud. Otherwise, you need
to constantly sync the two materials, which
will be a huge pain. And yeah. I'll just show you a
bit how that works. It's actually pretty simple. And we've talked about
this material before. There's a camera part. There's a height map part. This is an assembly part, and this is a curl noise, which I use to make the kind of stylized cloud. So this is basically it. And this is the function. It's called inside this master material. Yeah. Unreal Engine has a very nice thing called-- [COUGH] sorry-- called a static switch. It means you can compile-- you can have a small checkbox. And that small checkbox controls which path you want to go down in the material. It makes the material much, much easier to manage. Yeah. This checkbox controls which path we want to go down. Here, if it is a cloud, we will use this. If it's not, we will use this. Then it's the same as below. And the nice thing, the best thing about it, is it's not like a lerp. It's actually resolved at compile time. So if you have it set to false in this material instance-- so for this instance, it wouldn't compile all the things that go along the false path. So everything here will not be compiled into this material instance. I'm not saying that's 100% technically accurate, but that's basically how it works.
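A loose HLSL analogy for the static switch behavior described here: it's a compile-time branch, not a runtime lerp. The names are illustrative, not the material's actual switch or functions:

    float SampleCloudShape(float3 p);  // assumed: cloud-scale sampling path
    float SampleFogShape(float3 p);    // assumed: fog-scale sampling path

    #define USE_CLOUD_PATH 1           // stands in for the static switch checkbox

    float SampleDensity(float3 worldPos)
    {
    #if USE_CLOUD_PATH
        return SampleCloudShape(worldPos);  // only this branch is compiled when the switch is on
    #else
        return SampleFogShape(worldPos);    // the false path never makes it into this instance
    #endif
    }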
And so this is the material instance for the cloud. And the volumetric fog material, as we can see here-- basically, it's just inherited from the cloud material. But instead, it has the checkbox unchecked. So for the cloud material, it will use this. But for the fog material, it will use this. But because these two are the same material function, the parameter names will be completely the same. So we just inherit everything that the cloud has. So when I go in, the shape is just 100% the same. Even though the light behavior is different. Yeah, so that gets us to
talk about the details of the near distance fog. We'll actually disable
the time first. I can disable the time! That's a super power. OK, so as I have
mentioned before, there are two main detail parts that are added at near distance. One is the ground fog. One is sprites. The ground fog-- I am thinking that I'm talking about this material too much. Is it getting confusing? Let me know if
it's too confusing. I will try to clear things up. But this was a big cloud. This is the big cloud. This is a small cloud. Is that simpler? So this is almost exactly
the same thing as this. But it uses
different parameters. Similar to the dry ice one,
I just kind of migrated it into the material and
added the density together. So we can have this
on top of this. And another thing that I'm
not sure if it's very useful, but it's generally useful,
but you can do it, is because this material
name is the same name as inside this one-- but I don't
want them to be mixed together. I need this to be different. So I added a-- what's it called-- a prefix, but at the tail-- a postfix? Anyway, I just want to append the _Fog to every parameter here. And I could do that with manual labor, but manual labor is getting more expensive nowadays. So what you can do is you can actually just select all of them, and inside a text editor, you can see the parameter names here-- wait-- or the node names here. This is Opacity_Fog. This is Scale_Fog. So this one was not Scale_Fog-- it was Scale before-- and I wanted to append this _Fog to every parameter name. So you could just do that with regex. It's regexp. Perfect. That's the one. OK. Thank you. I learned something today. And yeah. You can use a regular expression or-- I don't know-- there must be something very convenient for that, right? I replaced everything that started with a parameter name and appended this at the end. And then I copied this entire thing. And then I pasted it back in the material editor. So yeah. That's actually pretty useful. I can think of a lot of-- >>Victor: That's a neat trick.
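The exact pattern he used isn't shown on stream. As an illustration only: text copied from material nodes carries each parameter as ParameterName="...", so in a regex-capable editor a rename like this would do the same kind of bulk suffixing (the pattern below is an assumption, not his):

    Find:     ParameterName="([A-Za-z0-9_]+)"
    Replace:  ParameterName="$1_Fog"

Pasting the edited text back into the material editor then recreates the nodes with the suffixed parameter names.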
>>Asher: All right, that takes us to the editor widget. I need to enable ShowFlag.Particles. Again, this doesn't look very good, but it's good enough for a proof of concept that this will definitely work. And on to the editor widget. So this looks very fancy, works very fancy, and people will think you are a pro-- that you modified the engine to fit your needs. And it's actually very simple to add something like this. OK, sorry. I spaced out a little bit. You can right-click, go to Editor Utilities, select Editor Utility Widget, and double-click. It's basically just a UMG widget that you can use inside the Editor. We can drag a button here and make it-- this is a very important step-- make the background color darker; it will look much better. This will be the default. See? And yeah. You can select the button and click the plus button on the OnClicked event. And you can have all kinds of Editor Utilities here. For example, you can get the selected actors, do a for-each, and you can add a delay. I'm just doing
something simple here to demonstrate how it works. So hit Compile. And if you right
click on this, you can see a Run Editor Utility Widget button. And this is the button we got. Wait. It's a little bit too dark, is it not? I'm sorry, but I need
to get the color right. All right, so if we click
on our random victim here, and click this
Murder button, boom. It's gone-- magic. So yeah, that's all very
simple demonstration of how the edit widget works. And OK. Let me reset all of this. So for this widget, yeah. Yeah, I can drag this out so this looks normal. And you can append that to
every window app you want and hide the tab. So what happens when I
click Start Editing cloud is it caused the function
of another Blueprint I've put into the scene to
hide, hide the cloud layer. Wait. What it is? What? OK, to hide the
cloud, but show up the landscape I put there
before as a kind of-- I call it a cloud landscape
map, cloud map landscape. Yeah. I can draw cloud height
of that and cloud density of that on the
landscape, which you are seeing as a wireframe right now. And because I have the
landscape showing up, I can just edit it in-Engine with the landscape editing tools. It lets me draw the height and density, which is just a layer of landscape weight map, into this render target here. OK, why are you going from
this straight to this? Up to 50%. All right, sorry. If I paint something here and
click on Finish Editing Cloud, it will just capture the
weight we just painted onto this render target. And this is not a super robust way to do it. But because I had limited time-- it's a lame, makeshift way to do it. And because the landscape, the cloud landscape, is not going to be cooked into the final game, if it works, it doesn't really have too much of a performance conce-- how do you say that-- concern. Yeah. That gets reflected into the heightmap that we showed before in the cloud material. And another thing that's very
nice about the editor widget-- I think I need some
coffee-- is they can tick inside the editor. You don't need to run it or simulate it. It just ticks-- and you can use buttons and a gate to control whether you want to execute the node that does things for you. Basically-- I keep hitting the Finish Editing Cloud button. So because that's executed every tick-- well, not actually every tick; it actually has a kind of cooldown set on it-- it just keeps capturing the
height and the weight for me. Do we have any questions in the chat that I need to answer now? >>Victor: We can keep
going a little bit. I'm still gathering info. >>Asher: Cool, cool, cool. Would this widget
also be available? This has been available
since a lot of versions ago. You were not paying attention. Tahldon, I'll remember you. Yeah, you can paint the height,
the height offset for the clouds too. Yeah. And this is showing
wireframe because I have set a distance-- how do you say that-- a distance lerp, because the capture actor that helps me capture the landscape here is actually very far away. So obviously, this is very dirty. But it works for me. If I get near to it, it
will show the wire frame. If I get far away, it will show
the weight map and the height. And so when it
captures, it won't capture the mesh, the grid. Yeah, that pretty
much sums it up. Anything I'm forgetting about? Not quite. Niagara, next part. Now, this is very hacky,
but I have excuses. Do you have excuses? >>Victor: It works. >>Asher: It works. And the Niagara in-game-- I have shown that. I turned the sprites on and off. And another pretty
interesting idea is-- OK, let me go to the nice spot. >>Victor: So I was wondering
what the effect on Manny is when you're flying
through the clouds. >>Asher: I'm sorry. What? >>Victor: The sort
of effect that's on the edge of the
mannequin while you're flying through the clouds? >>Asher: Oh, that's
artifacts, similar to if we look at the depth. If you look at the depth buffer, you see this kind of outline. Yeah. This will be addressed
and solved, so don't worry. >>Victor: OK. >>Asher: All right. So you need to play with it a little bit. It feels very good. So yeah. This Niagara system is
added only to the daytime. So it doesn't look quite
good when the light is dark. It will look like this. And that actually fits
to a horror scene, maybe. Yeah. At night, actually,
I just use the-- I added the kind of wisps to the clouds. So when you enter, you will see this. I'm not sure exactly how it looks on the stream. Yeah. I can see that. >>Victor: Yeah. It's coming through. >>Asher: So it doesn't exist
outside of the cloud, and nothing here-- pure black. Oh, so the flying mannequin is
a plugin called dynamic flight. You can search that on
Marketplace-- pretty nice. Never did an Iron Man landing. Yeah, so how I did this is-- you can't actually just read back where the cloud is from the GPU, because that's not how it works. The GPU doesn't really know where the cloud is. It just calculates it. So how I did this is similar
to a demo I did before. I just did-- OK, I'll share that demo first. Yeah, so give me a second. I should have opened that
before I tried to explain. I was literally reviewing
my bullet points one minute before the stream. All right, this is
kind of wave splash. This is a wave splash idea
I tried when I was first starting to learn Niagara. It's broken, but it works for the current setup. I can't adjust the waves. So this is actually just Gerstner waves. If you don't know what Gerstner waves are, just search it. It's basically just five waves, five waves stacked together to make the ocean have kind of complex character.
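For reference, a single Gerstner wave looks roughly like this; the demo sums several of them, and evaluating the same math in Niagara is what keeps the particles in sync with the material (all parameter names here are illustrative):

    float3 GerstnerWave(float2 xz, float time,
                        float2 direction, float steepness, float wavelength, float speed)
    {
        float k = 2.0 * 3.14159265 / wavelength;   // wave number
        float f = k * (dot(direction, xz) - speed * time);
        float a = steepness / k;                   // amplitude tied to steepness

        // Points move in circles: horizontal displacement plus vertical height
        return float3(direction.x * a * cos(f),
                      a * sin(f),
                      direction.y * a * cos(f));
    }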
If we fully zoom out, you can clearly see the pattern. But from this angle,
it looks kind of OK. And yeah. You can see the wave
actually drives the splashes. The material is not as good as I hoped it would be. Can I grab a good
angle for that? Yeah. This is actually just, basically, mesh particles that are marching with the waves. I have set up a
debug view here. All right, yeah. That's not speeding up. >>Victor: Is that a
sphere particle right there? >>Asher: Yep. But why is it so slow? I'm not sure. It shouldn't be. Even there are 1
million instances. It should be fine. >>Victor: Well, you are
running another editor in the background. >>Asher: Yeah, but
my machine should be able to handle
two Unreal Engine. >>Victor: Discord
uses up a little bit of GPU as well, especially
when you're streaming at 1080P. >>Asher: All right, just give
me some time to adjust. This is actually 4.23. And let me disable a bunch of stuff. OK, the thing I can do is lower this. The editor just reset. What rig are you using? What kind of rig? Any resource recommendations for learning ray marching for Unreal? Ryan Brucks has some posts that helped me enormously. They're very good. Check him out. And yeah. He has the volumetrics plugin. Yeah, the engine really doesn't
want to work with me today. 4.23-- I'm looking at you-- you will not live
long on my hard disk. All right, yeah. I have a video for that. Let me just play
the video instead. Give me a second. I'll just answer some questions. >>Victor: I've gathered some
from earlier in the stream, if we can tackle those first. >>Asher: OK. >>Victor: So I was curious if it's
possible to set up bounding box for the volumetric clouds. For example, if
you didn't want-- >>Asher: Yeah, I answered that. It's in the cloud system. Of course, it can. That's a very important thing. All right, add this video. >>Victor: That was quick. >>Asher: So the wave-- I would look before-- actually, I said this
wave is just a wave. And I fit in exact
same parameters into the Niagara system. So the cubes can
float in on top of it exactly as they should be. Because this is not physics. This is not-- it's not reading
the height of the wave. These are just
marching using mass. And yeah. Because I'm spawning cubes
along the rocks here, using distance field
rejection sampling. And when the cube
hits, like here, when-- can you see the moss? I'm not sure. Yes. Yes, you can. >>Victor: Yeah. We see it. >>Asher: OK. I will make it larger. Wait. Why do I want to make my own. OK, so when this
part hits the rocks, the collision can be calculated from the distance field, because these are all GPU particles. When a particle gets into the distance field, it will kind of switch state. These are all just wave cubes; when one gets into the collision, the distance field collision of the rocks, it switches into the splash mode, and the velocity is calculated using the wave speed, the wave velocity, and the angle of how it's crashing onto the rocks. And these are all GPU particles. So it can be very fast. And you can have a
lot of-- lot of them. So that's the basic idea. Is my cursor big enough? And yeah. So that's a similar
thing we are doing now because I've said how the
clouds are actually just-- That's still not great. It takes milliseconds now because I'm having it on a very high, very high setting-- yeah. I'm using a very expensive setup. And my material hasn't been optimized at all. It could be-- I'm confident I could reach the same result within 1 or 2 milliseconds. Because this is
actually pretty blurry. And the renderer is
being improved too. So for the volumetric cloud, the
performance would be very good. I could make-- yeah. I think I should demonstrate how you could optimize that a little bit. Yeah. If I set this to 8, can you tell much difference? I might need to pump up the AO a little bit, because less sampling means that it will be brighter. And for the shadow distance, I can lower the shadow sample count a bit. Let me show how much it costs. Yeah. That immediately drops to 3.38. And I have some-- yeah-- unnecessary sampling
in the cloud material. So that could easily halve it. And the thing about it is, it's very scalable. If you are running on very low-end hardware, you can have some clouds, but not that sharp, not like this. Yeah, if you have-- you can have parameters like the volumetric cloud render target scale-- if we decrease this a little bit-- Why is it not working? It has similar parameters to the temporal weight of the volumetric fog. But I can't remember what it actually is. And yeah, OK. I didn't really exactly prepare for that part. But the idea is it's very scalable to different hardware, similar to volumetric fog. It will be blurry and low quality. But it will run on most modern hardware. But I don't think it will run on a mobile phone. Maybe it will. I really shouldn't talk
about something I'm not sure. Sorry. Forget what I said for
the last 20 seconds. Yeah. Let me set it back. So the texture-- So we used noise textures
to stack together to make the cloud shapes. And the thing I
did with Niagara is I basically duplicated
that cloud material and made a Niagara module. Yeah. These two do exactly the same thing, this and this. I just kind of put this on the other screen and made the same thing as a Niagara module. It sounds painful, but it's not that bad-- it took me about 25 minutes. And because the material is-- the cursor-- should
I send that back? Basically, the material gets translated into HLSL. And the Niagara GPU sim also translates the modules into HLSL. So if the logic is the same, and the input parameters-- like world position, basically, and time and the texture-- are the same as in the material graph, the output values, like the RGB here and the final density value here, should be 100% synced. So this is how the Niagara system can be aware of where the clouds are. And you use something called rejection sampling. If the density is lower, lower than zero, the particle will be killed by setting the DataInstance.Alive bool to false. So that means I have the debug view here. So that means the particles spawn around the camera. The particles are actually spawned uniformly around the camera in a sphere. But if one finds out, oh,
I'm outside the cloud, then it just commits suicide-- commits Sudoku. And yeah. That's what rejection sampling means.
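The rejection step he describes boils down to something like this sketch; SampleCloudDensity stands in for the duplicated cloud math, and the kill flag corresponds to the DataInstance.Alive mechanism mentioned above:

    float SampleCloudDensity(float3 p);  // assumed: same logic as the cloud material, ported to the module

    void RejectOutsideCloud(float3 particlePosition, inout bool alive)
    {
        // Particles are spawned uniformly around the camera, then culled here
        if (SampleCloudDensity(particlePosition) <= 0.0)
        {
            alive = false;  // outside the cloud: kill the particle
        }
    }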
Yeah. Another thing that I want to mention is the camera data interface. This should have been
in 4.25, but I'm not sure about versions before that. Yeah, because we want to spawn particles around the camera. And before, we had to have kind of like a BP setup to feed the camera position every frame into the Niagara system using a user variable. But now we have this
camera data interface here. It's still experimental, but
it works for me, at least. For any vector input, you can just type in camera, and you will have the camera properties returned. You can just get the camera position, the up, forward, and right vectors, and the vector to camera. And because this is HLSL, this vector-to-camera is actually exactly the same as the camera vector in the material graph. So that's very convenient. You don't have to have an extra BP setup. You can just use the camera position. And the camera position-- yeah. And that's the very nice
thing about this-- Niagara, once you've
set up the scene, you could apply any other
Niagara modules you want. I'm spawning the wisps in here. And-- I call them wisps. I just fat-fingered my mouse cursor. I could add a curl noise force in the particle update to make them kind of move around. If I make the strength higher, it's more obvious how it works. Yeah. Just throwing this around. Curl also works with everything. You can use it with gravity, and forces, and some drag too. Because these should be pretty lightweight and shouldn't have too much momentum. So I used a drag, so they float around nicely and slowly. And yeah. That's the thing about
Niagara-- it's modularized. Once you've made this, what I call the cloud rejection sampling, you can just insert it into anything that you have. You can have an explosion that's consumed by the cloud, that then explodes inside a cloud. I'm just making up examples. All right, yeah. So for sprites-- I've mentioned a lot
of times-- If I had time-- Where are particles--
where am I? Oh, wait. Yeah, I didn't
put anything here. All right, there
goes my sprites. If I had time, I would
use some techniques to make the sprite move-- oh, sorry-- make sprites
look more volumetric. And that's the same thing I happened to tweet a while ago on Twitter. And people loved it. I'll just play a video of that. Yeah, so this is called
six-point lighting, the six-point lightmap. It basically bakes the light scattering data into the texture-- six channels. So in the actual engine material setup, you can get the atmospheric directional light's direction and decide what kind of-- decide which channels, which directions, you want to use and blend together to get a convincing result. This is just a sprite.
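The blend he's describing can be sketched like this: six baked directional lighting values, weighted by how the light direction lines up with each axis. The packing into two RGB inputs is a common layout and an assumption here, not necessarily his exact setup:

    // rgbPos holds baked lighting for +X/+Y/+Z, rgbNeg for -X/-Y/-Z (assumed packing).
    float SixPointLighting(float3 rgbPos, float3 rgbNeg, float3 lightDirTangentSpace)
    {
        float3 wPos = saturate( lightDirTangentSpace);  // weights for the positive axes
        float3 wNeg = saturate(-lightDirTangentSpace);  // weights for the negative axes
        return dot(rgbPos, wPos) + dot(rgbNeg, wNeg);
    }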
And yeah. If I have time, I would like to do that. And you can do that without paying for anything, because there is an awesome plugin called Volumetrics in the engine, made by Ryan Brucks. By the way,
I'm a huge fan of Ryan Brucks. >>Victor: I think we all are. >>Asher: I learned
so much from him. Oh, is he-- Is he going to do a livestream? >>Victor: Yep, yep. That' the plan. >>Asher: Do you have a date? >>Victor: Do not have a date yet. >>Asher: Yeah. Do you have an approximated? >>Victor: In the next two months. >>Asher: OK, cool, cool, cool. I really look forward to that. He showed me a lot
of stuff that's very amazing. And yeah, volumetric
cloud stuff. And this plugin is
called volumetrics if you are on 4.25 or later. You can just search volumetrics
in the plug-ins window here and enable it. Once you enable it, you can
access the content here, the volumetrics content. Just search for fluid sim. You can see some
demo maps, like fluid simulation 3D. Is there a documentation page
link for the built-in plugin somewhere? I'm not sure, but
this it's actually pretty straightforward. For Blueprints, I feel
like the best documentation is comments in the Blueprints
because it's very easy to read. And I just messed with things. Have your way with it. And that's the best
way to learn, I think. Of course, documentations
is important. But because Unreal Engine
has evolved so fast, and there is something called opportunity cost. If we spend a lot of time on documenting tools in the engine, that's time that could have been put into improving the engine more. So there is always a balance. But yeah. But we are actively
trying to make documentation for important things. But for an in-development plugin like this one, it might be tricky
to document it. Yeah. OK. This is-- what is that? Actually, I should
start reading chat. OK. Yeah. If you select-- if you press
Simulate, it will do this. Even without documentation, it's pretty awesome. I'm just explaining
why it doesn't cover 100% of everything. >>Victor: And we usually ship-- with some of these new features, there are usually example projects as well that accompany them once a feature goes out of experimental and/or beta-- so when it actually ships. >>Asher: Yeah. I don't know. I don't know. It sounds like some
guys in the chat feel like I'm making excuses for it. But actually, there are so many features in the engine, I completely understand why
we can't cover all of that. And so yeah. This is a simulation,
and you can play around with it, the parameters,
like curl strength. This is all simulated
by the material. And there's some buoyancy. I just made a very short introduction
to it, I haven't used it too much. But this is very
fun to play with. And another thing I have
just learned recently is it can capture flipbooks
directly inside the engine. It's called the BPSubUVMaker. And it can capture curl, opacity, velocity, normal, and also spherical harmonics, which is just another fancy name for the six-direction maps I just mentioned. OK, cool. Yeah. And if we click-- oh, wrong Actor. Yeah, we can see that when the render target's here, it's not filled in completely, because it just ran for a very short time. This is opacity-- curl and opacity. This is the normal. This is a thing that I could use a bit. This is the six-point map that you could use with whatever material you've got. I feel like I should-- yeah. I'm going to release
something, including the six-point material I just showed. We can just add that on the forum page-- the stream's forum page-- for the six-point lightmap. We are planning to release something for clouds too. But currently, we are still talking with Sebastien, who is making the feature, in order to kind of-- I want to release something fun to play with. But they need to wrap some things up, and make things easier for me to understand, before I can do that. Yeah. This is the Volumetrics plugin. If you play with it,
it's pretty awesome. Yeah, there's another
third party software I want to mention to you
called EmberGen. It's pretty awesome for making flipbooks for video games. It runs in real time-- let me just do some really quick examples with it. So this could be baked into a flipbook and played as a particle-- and get some dry ice into it. And there is also a six-point lightmap export as well. So what I can do is make some generic simulation without too much gravity. I just go poof, poof, poof, and play the baked flipbook with the six-point lightmap in the cloud. So it will react to the light. And that's good. Yeah, explosion. Yeah. Another thing I wanted to do,
but didn't have the time to, is-- That's not free. This is a demo I did a
while ago just for fun. Like I said, the fire is slowly
creeping outside the circle. This has been used in
video games quite a lot. What I'm pretty proud
of is this sphere mask-- this sphere mask is actually not calculated every frame. When I shoot something on the grass, it actually records the time of the impact. And because we are recording the time of impact, we can smooth out that time, so the effect can slowly creep outward. And for the grass material, it can just read that texture to slowly drive a shader-based effect. But we can also see there is a forest fire here. This is generated by Niagara too, because Niagara can read the texture the same way as the material. So yeah.
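A rough C++ sketch of the recorded-impact-time idea described above; the parameter collection and parameter names are hypothetical stand-ins, not the actual Blueprint setup from the stream:

    // Minimal sketch: store the hit once, let the material grow the sphere mask.
    // Assumes a Material Parameter Collection asset with an "ImpactLocation"
    // vector parameter and an "ImpactTime" scalar parameter (hypothetical names).
    #include "Kismet/KismetMaterialLibrary.h"
    #include "Materials/MaterialParameterCollection.h"
    #include "Engine/EngineTypes.h"   // FHitResult
    #include "Engine/World.h"

    void RecordBurnImpact(UObject* WorldContext,
                          UMaterialParameterCollection* BurnParams,
                          const FHitResult& Hit)
    {
        UWorld* World = WorldContext ? WorldContext->GetWorld() : nullptr;
        if (!World || !BurnParams)
        {
            return;
        }

        // This runs once per hit, not per frame; the per-pixel work stays in the shader.
        UKismetMaterialLibrary::SetVectorParameterValue(
            WorldContext, BurnParams, TEXT("ImpactLocation"), FLinearColor(Hit.ImpactPoint));
        UKismetMaterialLibrary::SetScalarParameterValue(
            WorldContext, BurnParams, TEXT("ImpactTime"), World->GetTimeSeconds());

        // In the grass material (or a Niagara module), something like
        //   Radius = (Time - ImpactTime) * GrowSpeed
        //   Burn   = SphereMask(AbsoluteWorldPosition, ImpactLocation, Radius, Hardness)
        // makes the burn creep outward over time with no per-frame CPU update.
    }

Because Niagara can read the same parameters and textures, the forest-fire particles can be driven by the exact same recorded time.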
And another thing is-- let me draw this again. If I draw a line here, and I move over to here-- pay attention to this part-- I feel like I should use my cursor-- pay attention to this part. And then if I shoot again-- boom-- you can see that it kind
of travels through that part. That's because the render target space actually follows my character. So the center is always where I'm standing. That means I have virtually unlimited range. I could run into an open world map and bring the target with me-- it just follows me. I could just continue doing this until the end of the year-- and yeah.
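Here is a rough sketch of that player-centered render target trick; the splat material, the world size covered by the target, and the way old contents get shifted as the player moves are assumptions about the setup, not code from the stream:

    // Minimal sketch: the render target covers a CaptureWorldSize x CaptureWorldSize
    // area centered on the player, so any world position maps into 0-1 UVs relative
    // to wherever the player is standing ("unlimited range").
    #include "Kismet/KismetRenderingLibrary.h"
    #include "Engine/TextureRenderTarget2D.h"
    #include "Engine/Canvas.h"
    #include "Materials/MaterialInterface.h"

    void SplatImpactIntoFollowTarget(UObject* WorldContext,
                                     UTextureRenderTarget2D* FollowTarget,
                                     UMaterialInterface* SplatMaterial,
                                     const FVector& PlayerLocation,
                                     const FVector& ImpactLocation,
                                     float CaptureWorldSize)
    {
        if (!FollowTarget || !SplatMaterial || CaptureWorldSize <= 0.0f)
        {
            return;
        }

        // World -> UV, with the player always at the center (0.5, 0.5).
        const FVector2D Offset(ImpactLocation - PlayerLocation);
        const FVector2D UV = Offset / CaptureWorldSize + FVector2D(0.5f, 0.5f);

        UCanvas* Canvas = nullptr;
        FVector2D CanvasSize;
        FDrawToRenderTargetContext Context;
        UKismetRenderingLibrary::BeginDrawCanvasToRenderTarget(
            WorldContext, FollowTarget, Canvas, CanvasSize, Context);

        if (Canvas)
        {
            // Draw a small splat at the impact's position in the target.
            const FVector2D SplatSize = CanvasSize * 0.05f;
            Canvas->K2_DrawMaterial(SplatMaterial,
                                    UV * CanvasSize - SplatSize * 0.5f,
                                    SplatSize,
                                    FVector2D::ZeroVector);
        }

        UKismetRenderingLibrary::EndDrawCanvasToRenderTarget(WorldContext, Context);
    }

In the full setup, the existing contents would also be scrolled by the player's movement delta each frame so that earlier splats stay pinned to their world positions.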
What this can do with the volumetric fog, as you have already imagined, is-- of course, the volumetric effects, including the fog and the cloud, the materials, and everything, and the Niagara modules can all access these render targets. That means if I hit this,
it could disturb the fog and give it a curl noise look. It could have a tornado? What's the word for that? How do I say what I'm talking about? Tornado. Wait. Yeah, tornado. All right, sorry. I shouldn't be sorry. I just didn't know that word. Yeah. You can have turbulence. You can have a swirling tornado. You can have wind here to advect the volumetric fog and make it feel like very, very hot air-- for hot air buoyancy, because this is very hot, it could just clear the fog above it to make a hole or something. And yeah. You can even inject black, dark smoke into the volumetric fog to make it look more awesome. Yeah. Listen, that's
quite a lot of work, so I didn't have
time to do that. But it will definitely work. I think it would be awesome. You guys can look it up-- I think Uncharted 4 has a talk about it, or similar effects. So yeah. That's actually pretty much what
I prepared for today's talk. And that's nearly two hours. >>Victor: Nearly two hours indeed. >>Asher: Yeah. I was afraid I wouldn't
fill enough time. But it's just longer
than what I expected. >>Victor: Well, let's try to
take a couple of minutes for questions. And then, for the outstanding ones that are specific to some of the things you showed off, I'll make sure to pass Asher all the questions. And then maybe you can fill
in some answers on the forum announcement post
after the stream, if you're able to answer them. >>Asher: Yeah. Yeah. I have a lot of energy left. Challenge me. >>Victor: Across
the board, viewers are interested in whether it's possible to break down the performance impact they should expect when they're
working with volumetric clouds. >>Asher: Sorry. Can you read that again? >>Victor: Yeah. There were several questions
around the performance impact that you can expect when you're
working with volumetric clouds. >>Asher: Yeah, because
that's very new. And I just started playing
with it a week ago. I'm not super qualified to talk about that. But the way our engine is, scalability is always a very important feature. We have great scalability, which means you can just put whatever budget you have toward the clouds. And yeah. Because we have a lot of filtering options, just like volumetric fog, I think it will look very good. And the code is actually very fast-- faster than anything I've used before. And yeah. I'm super happy about it. The four or six milliseconds earlier is actually me abusing the system, because I have a 2080 Ti. So I could just do whatever I want to experiment with it. And I used kind of-- I don't know-- around 10 texture samples, which is crazy compared to what you should do. So yeah. I don't think performance
would be a problem. And as you can
see, a lot of games like Horizon, Horizon Zero
Dawn, and Red Dead Redemption have all put ray marchers
into their games to make awesome
volumetric effects. So yeah. It will work on PS4, no problem. >>Victor: Any way to reduce
the volumetric beam lag/ghosting
with fast moving lights without crushing performance? >>Asher: Crushing? Yeah. I've answered that before. Yeah, you've read the
question before for me. It's this volumetric fog history weight. You can search for "volumetric weight." There is a CVar, r.VolumetricFog.HistoryWeight, that defaults to 0.9. I can make a quick example of it. Can I get a link to your page or something? Someone is asking for my Twitter handle. All right, we can see the ghosting here if I just adjust r.VolumetricFog.HistoryWeight to something like 0.99. You can see the
ghosting is still very strong because
it's trying to make it as smooth as possible. So going the other way, we could do 0.8 or something-- or down at 0.1. You can see it flickering a lot. But the ghosting-- oops, sorry, not there. So it's kind of a balance between how smooth you want your light to be and how responsive you want it to be. And you can have 0.7, or 0.16-- 0.16 works for my eyes. And if you increase
volumetric fog grid size Z to something like 256. And now it looks
pretty, pretty good. But, of course, the
cost is doubled. It's-- well, that's a lot. Why is it so much? It shouldn't be. It shouldn't be 4 milliseconds. Yeah. That's an enormous number. Ah, it's 1 millisecond. And now, if your volumetrics only work on the near side, you could cheat by using a smaller view distance. And using a smaller grid pixel size would be even cheaper. Yeah.
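For reference, these are all console variables, so the same tweaks can be applied from an .ini file, from the in-game console, or from code. A minimal C++ sketch, using only the values tried on the stream (they are illustrations, not recommendations):

    // Minimal sketch: tweaking the volumetric fog CVars discussed above from C++.
    // The same names can be typed directly into the in-game console.
    #include "HAL/IConsoleManager.h"

    static void ApplyVolumetricFogTweaks()
    {
        // Temporal history weight (default 0.9): higher = smoother but more
        // ghosting on fast-moving lights, lower = more responsive but noisier.
        if (IConsoleVariable* HistoryWeight =
                IConsoleManager::Get().FindConsoleVariable(TEXT("r.VolumetricFog.HistoryWeight")))
        {
            HistoryWeight->Set(0.7f);
        }

        // More Z slices sharpen the light shafts, but 256 roughly doubles the cost.
        if (IConsoleVariable* GridSizeZ =
                IConsoleManager::Get().FindConsoleVariable(TEXT("r.VolumetricFog.GridSizeZ")))
        {
            GridSizeZ->Set(128);
        }

        // A coarser XY grid (bigger pixel size) is cheaper if the fog only
        // matters near the camera.
        if (IConsoleVariable* GridPixelSize =
                IConsoleManager::Get().FindConsoleVariable(TEXT("r.VolumetricFog.GridPixelSize")))
        {
            GridPixelSize->Set(16);
        }
    }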
That was the thing I wanted to talk about, but I forgot what it was. Just ask me the next question. >>Victor: We can do
the next question. >>Asher: I lost track of my-- my train of thought. >>Victor: I think this
question was in regards to the volumetric clouds. They were wondering if they
work with ray-trace shadows. >>Asher: Oh, I answered
that one too. >>Victor: The answer is yes? No? Someone tuned in a little later. >>Asher: It's complex. Yeah. The ray marching method,
here, it doesn't work, because it's kind of similar to ray tracing. But with ray tracing, it can't have ray-traced AO. And yeah. The engine's ray tracing is actually an array of features that work on top of the current default render pipeline. So if the ray marcher wanted to-- or if the cloud wanted to work with ray tracing, it kind of requires a completely different philosophy. I know I'm not sure about that. But currently, no. Currently, not really. The volumetric cloud and volumetric fog don't work with ray tracing, currently. And yeah. I'm not sure about
that question. >>Victor: We might be
able to bring it up when we get Ryan on the stream. >>Asher: Yeah. Hey, Ryan. >>Victor: So I was wondering if
the-- this is also in regards to the volumetric clouds-- if it's possible for them
to cast their own shadow. >>Asher: Volumetric. Yeah. Volumetric-- self-shadow? >>Victor: No. I believe they were
wondering if it's possible to have a shadow on
the ground from the clouds. >>Asher: Yeah, you can. Let me try. I haven't tried it. But-- >>Victor: Let's do it live. >>Asher: Sebastien
said it could be done. All right, challenge accepted. To the ground. Yeah. The thing I have tried is-- wait. What? I'll use the sun/sky. >>Victor: OK. >>Asher: But see, I have tried it on the directional light. You can enable cloud shadows on it. But that will result in a lot of shadow error-- this one, Cast Cloud Shadows. This actually is a-- yeah, from the videos I've seen, the cloud definitely can cast a shadow on the ground, and it looks very nice and has god rays and everything. But my use case here is a little special, because the cloud wasn't being used on the ground before. So that probably doesn't work for this. But it should work for this one. Wait. What? Was I asked? >>Victor: Sorry, just managing
chat for a little bit. Yeah, I think the question
was if they can do it. It seems like it's in the works. I haven't done that before. I think I just failed the challenge. I'm sorry, chat. Yeah, it's because I've only used it for a few weeks, so I'm not super familiar with it. But it definitely can be done. I've seen people do it. You might have seen videos of god rays and cloud shadows and everything-- that works. But yeah. We're not going to spend time experimenting with it on stream. >>Victor: Well, I'll
bring that up with Ryan. We can see if we can do it later as well, or if you come across it, we can just talk about it in the forum announcement thread-- which, if you are just tuning in through
Twitch, we do announce the streams on the Events page
of the Unreal Engine forums, as well as on the
schedule on Twitch. But all the information
exists on the forums. Let's see. You might have touched
on this as well. One of the questions
was, any way to add localized layered textures, like a supercell storm cloud blowing through? >>Asher: Localized what? >>Victor: Localized,
layered textures. >>Asher: Layered textures. >>Victor: So I believe
they're asking if they can layer textures
on top of the height map for the clouds to
sort of adjust the final effect of the clouds-- to, like, have a tornado come through. >>Asher: Yeah. Of course, you can do that. It's just a question of how you do it. Yeah, these are all
on the landscape. But if you want, we have a Niagara module currently called Grid 2D. You can do simulations entirely within Niagara. So if you stand here, and you want some wind, or a bullet, or an explosion to affect the fog or cloud, you could totally do the simulation in a Niagara Grid 2D and have a texture as the output, like a render target. And that render target could be put on top of the current cloud setup, like mine, or whatever you're using. It can be done. It's just a question of how you do it. By the way, I was writing about the Niagara simulation-- it'll be up in a few days, maybe a week. I had been writing it before you asked me to do the stream. So yeah. Go to asher.gg and
my Twitter handle. I typed that in chat. >>Victor: It's also on the
forum announcement post. >>Asher: Yep. >>Victor: Let's see. We can do a few more minutes here. >>Asher: Yeah. Someone asked about the demo-- the grass fire render? We'll upload it, if you guys want to look into it. I will tweet it, and I will put it in the forum announcement post. >>Victor: Yeah, you got it. They were curious what specifications your PC has-- you mentioned you have a 2080 Ti. What's the CPU? >>Asher: It's not as good. It's just an i9. >>Victor: Pretty good. >>Asher: No, that's kind of low spec by Epic's standards. >>Victor: Yeah, when you compile
the engine over and over and over, it helps
to have a little bit of-- >>Asher: It's so slow, guys. I'm like in pain. It's torture. >>Victor: See if we
can work on that. Does the cloud
shadowing only work for a main directional
light, like a sun or a moon, or would it also work for, say,
an extremely large spotlight? >>Asher: Not currently, no. It only works with skylight
and directional light. The Niagara tutorial write-up will be on my website in a few days. And I will tweet it. So whichever way you're following me, you will be able to see it-- I do pretty good Twitter marketing. >>Victor: Someone was curious how
the detail is getting better when you get into the cloud. >>Asher: How to make
the details better when I get into the cloud? >>Victor: Yeah. They're asking
how it gets better when you get into the cloud. Is that a mix because of the blend between the clouds and the fog? >>Asher: So he's asking how I could make that better if I had more time, something like that? >>Victor: No. I believe he saw that the detail got better when you were flying into the cloud. >>Asher: The actual shape is actually not better. The actual shape is still like this. And it got better because the additional volumetric fog can react to the lights. And because of the shadows, I just think
your brain tells you it looks better. And the details are actually-- yeah there are two layers of
the details I mentioned before. One is the ground fog, which is another layer of volumetric fog added on top of that, which the volumetric cloud doesn't have, and the other is a layer of these sprites. And now, have I answered this? Wait, I've answered it? I don't remember. The detail is mainly the two things I did on top of the volumetric cloud. >>Victor: There was
another question. How do you know when it would
be better for performance to use a ray marching shader for volumetrics rather than the exponential
height fog approach? >>Asher: Exponential height fog. Exponential height
fog is mainly for-- it's kind of a uniform height fog. And you can't really have shadows and stuff. The exponential height fog always has better performance, I think, because it doesn't really have to compute a lot of voxels. It's just there. And the volumetric fog is actually an option that's embedded into the Exponential Height Fog, because it makes sense when you have a global height fog like that. I have disabled it for
the sake of clarity. But if I set it to something-- this here-- you can see
this didn't have fog before. But if we set it to a small value, it will have volumetric fog when I'm getting close. The way you can tell it's volumetric fog is-- you can see the light shafts. And there is a maximum view distance on the Exponential
Height Fog Actor. And if this pixel is
beyond the distance, it will be replaced
by the height fog. Now you can see there is a
quite clear blending line here. Beyond that is exponential fog. And this is volumetric fog. I can scale it. If you are on mobile, you can disable the volumetric fog. And if we're on PS5, we can have super dense volumetric fog with high resolution, so the light shafts look much better and sharper.
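A rough C++ sketch of that Exponential Height Fog setup, assuming the standard UExponentialHeightFogComponent setters; in practice the mobile/console split would usually come from scalability settings or device profiles rather than code like this:

    // Minimal sketch: volumetric fog is just an option on the Exponential Height
    // Fog Actor; beyond its view distance, pixels fall back to plain height fog.
    #include "Engine/ExponentialHeightFog.h"
    #include "Components/ExponentialHeightFogComponent.h"

    void ConfigureFogForPlatform(AExponentialHeightFog* FogActor, bool bHighEndPlatform)
    {
        UExponentialHeightFogComponent* Fog =
            FogActor ? FogActor->FindComponentByClass<UExponentialHeightFogComponent>() : nullptr;
        if (!Fog)
        {
            return;
        }

        // Disable the volumetric option entirely on low-end targets,
        // or push the volumetric view distance further on high-end ones.
        Fog->SetVolumetricFog(bHighEndPlatform);
        Fog->SetVolumetricFogDistance(bHighEndPlatform ? 8000.0f : 3000.0f);
    }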
>>Victor: I see someone's curious if-- does this work-- [COUGHING] All right, we're all good? Yep. It happens. I know how that goes. Yeah. Let me just talk while I take a sip at the same time. I'm just going to wait for
Asher to be able to hear my question before I ask it. >>Asher: It's all right. >>Victor: Right, you with us? >>Asher: The volumetric fog is bad
for your body. >>Victor: Now, if
you add a filter-- no. They were curious if-- I believe this is
also in regards to the volumetric
clouds-- if it works with world composition and the
toolset succeeding it in 4.26. >>Asher: Sorry. Again? >>Victor: They were wondering--
and I believe this was in regards to the
volumetric clouds-- if it works with world
composition and the toolset succeeding it in 4.26. >>Asher: Toolset succeeding it. >>Victor: What are the changes to-- >>Asher: --how is that in relation
to volumetric fog? Is it about composition? >>Victor: I think
they're wondering if it applies to all of the
tiles in the world composition or if you need to implement it
specifically for each sublevel that you are loading. >>Asher: No. I don't think it needs to be. It's kind of a global
effect, like post effects. And yeah. There's only one Exponential Height Fog, and it's the same one everywhere. And you can have two layers of Exponential Height Fog by using the second fog data. But yeah. You shouldn't have more than one of these. But it really doesn't matter if you're using world composition-- whether or not you should have a-- you should have it in the persistent level, and-- [INTERPOSING VOICES] >>Asher: --and you'll be good. >>Victor: Yeah, just make sure
that that level is always loaded, whether you'd like to put your lights and all the effects into a sublevel that's always loaded or just in the persistent one. I recommend a sublevel, because then you can check it out and collaborate on it.
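If the fog and lighting do live in their own sublevel, a minimal sketch of making sure it is streamed in early might look like this (the level name L_GlobalFogAndLights is hypothetical):

    // Minimal sketch: force-load the sublevel holding the fog/lighting actors.
    #include "Kismet/GameplayStatics.h"
    #include "Engine/LatentActionManager.h"

    void EnsureFogSublevelLoaded(UObject* WorldContext)
    {
        FLatentActionInfo LatentInfo;
        LatentInfo.CallbackTarget = nullptr;   // no completion callback needed here
        LatentInfo.UUID = 0;
        LatentInfo.Linkage = 0;

        // Block on load so the fog exists before anything that depends on it.
        UGameplayStatics::LoadStreamLevel(WorldContext, TEXT("L_GlobalFogAndLights"),
                                          /*bMakeVisibleAfterLoad=*/ true,
                                          /*bShouldBlockOnLoad=*/ true,
                                          LatentInfo);
    }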
They were curious about the world-- the landscape and the scene. >>Asher: It's called skyscrap-- that's Stylescape. It's on the Marketplace. I just grabbed it and made some small adjustments to the clouds. It works. It's close. It looks great. And some small adjustments in here. >>Victor: I did see a couple
more questions coming in. Which Engine version? Just want to reiterate that. This is from the dev
rendering branch on GitHub. If you haven't downloaded and compiled the engine from source before, it can be good practice, especially if you're working on something where the fix only exists in one of the branches on GitHub. It can be good to know how to download that, compile it, and then switch your project-- [CHUCKLE] --over to, say, dev-rendering or another branch, to pick up, say, a bug fix that was shipped on GitHub but not in the binary version from the launcher, and then package your game from that. And then eventually, the fix will come back to the binary version in the launcher. And then you can continue iterating on that version. Someone was curious if
you have any resource recommendations for learning
ray marching in Unreal. >>Asher: Yeah. I learned that from
Ryan Brucks' posts, because he had demos. He had done all that. It was GDC '16, I think? I just searched for "Ryan Brucks ray marching." He has a lot of posts about it. And yeah. If you want to learn about that, fine. Of course, go ahead. If you are a tech artist and just want to play with it, I highly recommend you just use the renderer that's on dev-rendering. Because ray marching, if you want to do something simple, is very fun and very good for learning shaders. But the actual engineering-- embedding it with a lot of engine features and optimization-- is, yeah, very hard. It's very hard. And they did a good job. And it's been developed for a few months by a senior engineer, who has been working on this for a long time. So if you just want
results, just do this. If you want to learn, go ahead. >>Victor: All right, I think with
the construction continuing right outside my house, and
the fact that we are two hours and 20 minutes into the stream,
I think it's about time for us to call this a day. Asher, I want to give you a
big, super, super big thanks for coming on the stream
and showing off some of the cool tech. If people want to find out
more about you and your work, where would they go? >>Asher: First, check
out Ryan Brucks' work. It's a gold mine. You can learn a lot about volumetric ray marching and practical tricks. And if you want to find me, I'm on asher.gg. That's my blog. And my Twitter is @Vuthric. Yeah, V-U-T-H-R-I-C. >>Victor: It's also linked
in the forum announcement post in the Events
section of our forums. Awesome. I think with that said-- I'm still looking
for countdown videos. We are looking for
countdown videos. If you have been watching
this from the beginning of the stream, thanks
for sticking around. Hope you were
having a good time. For the first five minutes, we do a little countdown, which is 30 minutes of development recorded in the editor and then fast-forwarded down to five minutes. Send that to us
without any logos composited, but please
send your logo with it. And then we will add the
countdown as well as your logo there. And it might become one
of the countdown videos that we have in our rotation. Also, make sure that you let us
know about all the cool stuff you're working on-- in the released/work-in-progress channels on the forums, on Unreal Slackers. Hit us up on Twitter, Discord. We're pretty much
all over the place. Make sure to let us
know what you're doing, and you might be featured
as one of our spotlights. Make sure you follow us
on Twitter and Facebook, all the social media out there. Make sure you give
Asher a follow as well. And then next week, I
had a quick little-- oh, I remember what we're
doing next week. Next week, Andreas, one of our evangelists, is coming on. We're going to make a game
from scratch, something that he's been working on. I'm pretty excited about that. >>Asher: The boat thing? >>Victor: Don't give
the surprise away. It's not a boat thing. It's definitely
not a boat thing. >>Asher: Not a boat thing. >>Victor: Nope. That's not the way it works. That said, I hope you all tune
in next week, have a good time. Stay safe out there. And with that said, we are
saying goodbye from this time. Thanks again, Asher. It's been a pleasure. >>Asher: Thank you again. Thanks to the chat. I had a really good time. I was scared and very nervous. But yeah. I went with the flow. And you guys are awesome. I had a really good time. Thank you. >>Victor: I think we can
say you did fantabulously, which is a word I just made up. With that said, I'll
see you all next week. Bye, everyone.