[VIDEO PLAYBACK] [MUSIC PLAYING] - Hummingbirds. Fighter Jets. Giant hornets. Can they fly? Yes, of course they can. Eggs. Eggs. Eggs. Can they fly? No, absolutely not. [HEAVY BREATHING] - Hey, Kevin. - Yeah, Daryl. - What do you
reckon about flying? - Uh-oh. - Oi, Tess,
stop bloody bouncing your sister into the coral tree. Bloody caviar. Sorry. What were you saying? - I was, um-- I was just talking about flying. - Oh, right. Right. Why? - I don't know. I just think it might be
really interesting to see the world from that angle. - Wait. Are you saying you want to fly? - Well, maybe. - But you're an egg. - Yeah, I know. But I could wear a
helmet or something. - Oi, Tess, stop putting
the bloody decorative pebbles up your nostrils. Look, Dars, I just don't think
eggs are meant to fly, mate. Ow. Tess, what did I just say? - Oopsie. - Flight Simulator,
where you can fly anywhere you can dream of,
as long as you are not an egg. Please verify that
you are not an egg. - Melanie, do you reckon I can
have a go at flying your kite? - I don't know, Daryl. Isn't it illegal for
eggs to fly or something? - Well, yeah, maybe. I don't know, but I wouldn't
actually be flying myself. I would just be holding-- - Oi, babes. Is this egg bothering you
because I'll end him if he is. - No, sorry. I was just-- - Oh, my God. That's like so romantic. - Melanie, your kite. [MUSIC PLAYING] - All right, Humpty Dumpty. Do you know why you're here? - Um, uh, I tried to fly. - You tried to fly. And do you know what happens
when eggs try to fly? - Um, they have fun? - This is the third
time this year you've tried
something like this. What's going on, Daryl? - I-- [MUSIC PLAYING] - Look, you're a good
guy, Daryl. I'm going to let you
off with a warning. But this is the last time, OK? You should really get some
help, mate. [MUSIC PLAYING] - Egg, a vessel for housing
tiny embryonic lifeforms that have evolved
over millions of years to live in complete symbiotic
harmony with one another. But can eggs fly? Yes, I think they can. AMANDA: Hey, everyone! Haven't had enough new
features and updates to try out in Unreal Engine? Well, good news! Unreal Engine 4.27
Preview is now available, with new virtual production
and in-camera VFX workflows, updates to architectural
pipelines, and production-ready
Pixel Streaming. Visit the forums for a full list
of updates in 4.27 Preview 1, then download it from the
Epic Games launcher or GitHub! Also ready for you in
4.27 is RAD Game Tools' cross-platform
performance-oriented video codec Bink Video and their
performance-oriented audio codec, Bink Audio. Both of these are now freely
available via the launcher and GitHub--learn more
about them on the feed. If you're ready to get
hands-on with UE5 Early Access, we're hosting a
series of livestreams on Inside Unreal where
our dev teams go in-depth with UE5's latest features. Check out the past streams
on Nanite and Lumen via the Unreal Engine
YouTube Channel and join us in the coming weeks
on Twitch or YouTube live, Thursdays at 2PM Eastern,
for more UE5 content. Don't forget, Unreal Build:
Broadcast & Live Events is next Wednesday, June 16! The free virtual event
will feature sessions with deadmau5, FOX Sports
CEO Eric Shanks, The Weather Channel,
and more incredible innovators. Head over to the
Unreal Build event page to see the full
schedule and register! With digital fashion opening
up new doors for designers, strut over to the
feed to find out how Epic MegaGrant recipients
Gary James McQueen and Moyosa Media created Guiding Light,
their first all-digital fashion show, in Unreal Engine. Developer Sumo Newcastle
asked themselves, "What if Game of Thrones
did Payday in the Robin Hood universe?" Hood: Outlaws & Legends,
a class-based PVPVE multiplayer heist experience,
is their answer. Venture over to the feed to
learn how Sumo Newcastle built and are continually
balancing Hood: Outlaws & Legends
and its novel mechanics. And now over to this
week's top karma earners. Thank you to: ClockworkOcean, Mahoukyou,
Everynone, Micky, Tuerer, Shadowriver, Luos,
Andrew Bindraw, mindframes, and Bojan Novakovic. Get whisked away
into an enchanting world with our first
community spotlight, Palia, a community simulation MMO. Build a home, a
life, make friends, and explore the mysteries
of humanity's past. Learn more about
Palia and sign up for their pre-alpha
at palia.com. Next up, set in a
post-apocalyptic future, the short film AWAY follows "The Traveler" as she scavenges through a burnt forest and the "ghost city" she lives in. Made to explore storytelling
with LED volumes, the short was created in
a single production day and garnered the team
at ICVR 5 Telly Awards! See more incredible work
from the team at ICVR.io. And last up is a stunning
seaside setting, called Oblivion, by Thilo Seifert. Inspired by their
camping holiday memories, the scene started as R&D for
building an ocean shore shader. View more technical details
on Thilo's ArtStation page! Thanks for watching this week's
News and Community Spotlight. VICTOR: And we're live. Hi, everyone. And welcome to Inside Unreal,
a weekly show where we learn, explore,
and celebrate everything Unreal. I'm your host, Victor
Brodin, and today, we're here to talk about Lumen. To help me in this
endeavor, we have co-host and Senior Technical
Producer, Chance Ivey. CHANCE: Hey, how's it
going, everyone? VICTOR: And the star of the
show, if I may, Daniel Wright,
Engineering Fellow for graphics. Welcome. DANIEL: Hey, guys. VICTOR: And last but not
least, Galen Davis, evangelist for Quixel. Welcome back. GALEN: Thanks for having me on. VICTOR: As always,
I'm glad you're all here. I'm not going to do
much of the talking. And so I would like to
hand it over to Daniel. DANIEL: I thought you
had stuff you were going to show at the beginning. VICTOR: No, that's it. [INTERPOSING VOICES] DANIEL: You said there was
like seven things to show. VICTOR: We already showed
all of that before-- CHANCE: We showed the fun stuff. VICTOR: Yeah. DANIEL: OK,
we were busy chatting. VICTOR: We did that. It's your turn now. Lost track here. DANIEL: OK, it's my turn. CHANCE: Let's dive right in. DANIEL: OK,
so I put together a few slides to talk about Lumen. And early access has
been out for two weeks, so I had a little bit
of time to write up some best practices with the
way I see people using Lumen and maybe how you can
get better results. To start out with, Lumen is-- we have a team. You might see the guys on the
forums answering questions, Krzysztof Narkowicz
and Patrick Kelly work on Lumen along with me. So let me just give
you a quick overview of our motivation for Lumen. When we set out to
make this thing, we wanted to solve the problem
of dynamic global illumination. And instead of
making something that was targeted at
current gen consoles and trying to scale it
up, we just kind of bit the bullet
and targeted next gen consoles so that we could make
something really high quality. But we did also set out to
scale to high-end PC, especially enterprise use cases where
quality needs to be top notch. That's also part
of Lumen's goals. And we really wanted to provide
performant reflections together with dynamic GI because
it's really difficult to do dynamic global
illumination together with reflections. That's not something that
existed before in the engine. CHANCE: That's great. DANIEL: Lumen also needs to
work with large open worlds, which is a big focus
of Unreal Engine 5, particularly with Nanite. Nanite makes crazy
levels possible, millions of instances. Just ridiculous content, really. And Lumen needs to
work with all of that. It can't be the
thing keeping people from making awesome levels. While all running in real time. And that means all of Lumen's
algorithms had to stream and they all need
to be GPU based. But indirect lighting
can't just work outdoors. It's much more important--
indirect lighting shows up much more prominently indoors. And this also is by
far the hardest problem for dynamic global illumination
because the entire room can be lit by a very small area. And we have to find that
area reliably in order to give good image quality. And it's not just enough to
solve these individually, but we have to solve
them seamlessly so that you can walk into
a room that looks great and walk back out into
a complex open world. So these have been the
problems that we've been focusing on and
trying to solve with Lumen. CHANCE: Daniel, since I'm curious as a non-artist here-- you said indoor quality is by far the hardest problem in real-time GI. Why is that? Is it because of all of the different geometry that's in an enclosed space and trying to get the bounce right off of things? Or lights affecting more things that are closer to the camera? Is there a reason that it's, say, harder than an outdoor space? DANIEL: Yeah, it's that there are a lot more shadowed areas
where only GI is on screen. Like in this screenshot
here, the direct lighting is just in the bottom left. Everything else is 100% Lumen. Lumen skylight,
Lumen global illumination. So every pixel,
Lumen is providing all of the lighting for it. Whereas outdoors-- this is not a middle-of-the-day scene, but in what we shipped for Valley of the Ancient, the sun is what you see the most. So we can get away with a
lot lower quality there. And that's true of
most outdoor scenes. CHANCE: Got you. DANIEL: So that's half of it, basically indirect
lighting's more prominent. The other half of it is that
to solve indirect lighting with good quality,
we need to have very low noise. And that means finding where
the incoming lighting is. And the incoming lighting is in
a very small area for indoors. Like behind me,
the whole room is lit up just from this one window. Whereas outdoors, you can kind of look
at any direction and you'll find some lighting. But indoors,
most of the directions are just black,
except for that window. You'd better find it. CHANCE: I see. Makes sense. DANIEL: Some of
Lumen's features. Lumen provides dynamic
global illumination, which is lighting
bouncing off every surface onto every other surface. And in indoor scenes with
really bright base color, really reflective materials
like white paint, it needs to bounce
multiple times. And Lumen provides
multiple bounces. If it doesn't bounce
multiple times, you get to this kind of
like unnatural dark look. And artists have to compensate
for it with other techniques. This is showing how high quality
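To see why those extra bounces matter, multi-bounce lighting can be sketched as a toy radiosity-style iteration: each pass reflects the previous result one more time, so the total is a geometric series in the surface albedo. This is not Lumen's algorithm, just an illustration with made-up numbers; the `bounce_light`, `albedo`, and `transfer` names are mine.

```python
def bounce_light(emitted, albedo, transfer, bounces):
    """Toy multi-bounce model: each pass adds light that has
    reflected one more time off a surface with the given albedo.
    `transfer` is a made-up geometric coupling factor (0..1)."""
    radiance = emitted
    for _ in range(bounces):
        radiance = emitted + albedo * transfer * radiance
    return radiance

one = bounce_light(1.0, 0.8, 0.9, 1)    # single bounce: 1.72
many = bounce_light(1.0, 0.8, 0.9, 50)  # converged: ~3.57

# With a bright (0.8 albedo) surface, the converged multi-bounce
# result is roughly twice the single-bounce one, which is why
# stopping after one bounce looks unnaturally dark.
```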
Lumen global illumination can be if you crank everything up. This is not the
default settings. This is a scene
from Rafael Reis. Did I say that right? CHANCE: I believe so. DANIEL: You guys
corrected me last time. This is a project that
he's still working on. And I saw his
screenshots on Facebook. And I reached out to him and I
said, can I show these for the Lumen Livestream? Because it's just unbelievably
good quality that he's created here. CHANCE: I'd say that
he's still working, too. It's something we
discussed yesterday. To my untrained eye,
this looks amazing. Like really, really spectacular. Detail in here,
everything looks like you would expect a real room
to look like, right? DANIEL: And he did not
bake lighting for this. CHANCE: Yeah, 100% dynamic. Wow. DANIEL: Lumen is solving this in a handful of milliseconds,
whereas to bake lighting, you've got to wait for minutes
and can't move anything. So thank you to Rafael for
letting me show your scene. It's really cool. Lumen also provides
shadowed skylight, which allows indoors to be
much darker than outdoors. And when you place a movable
skylight in your scene, it just works. Lumen picks it up, shadows it automatically
and removes it from the interior areas. And Lumen does propagate
emissive, with a little asterisk. CHANCE: Yeah, we've seen so many photos online of emissive-only scenes and whatnot. I'm sure we can dig
into detail there later. DANIEL: Yeah,
I have a few slides on like best practices
with emissive. You can't just replace
your light sources with emissive right now. CHANCE: Yeah. DANIEL: It has its limits. Basically, you need to
keep the emissive subtle. But it does propagate emissive. And Lumen provides reflections,
it traces extra rays to solve reflections if the
material's smooth enough. And they integrate together
with the dynamic global illumination. So in the reflection,
you see GI, which is very important
because otherwise, it would just be black everywhere
except where the sunlight is. CHANCE: And a question on that. On the reflections, are those also casting the same
color light back out and influencing other
parts of the scene, too? DANIEL: No, the reflections are only influencing
what's on your screen. CHANCE: OK, got you. DANIEL: There's no
caustics, which is the sun comes in the
window, bounces specularly off the ground,
makes a pattern on the ceiling. Maybe that will be
Unreal Engine 6. CHANCE: Right on. DANIEL: Lumen supports
clear coat properly. So there's actually two
layers of reflections, not just one, which is really
important for car paint and some other materials that
have a lacquer over the top. Some of the text is
hanging off the screen. VICTOR: Yeah,
I see that, actually. DANIEL: I think that's
just this one slide. I think it just messed up. VICTOR: Everything else
seems to look good. DANIEL: OK, good. So Lumen does provide
dynamic GI and sky shadowing on translucency
and volumetric fog. But it is much lower
quality, which is something we hope to improve. But it is a much harder
problem than solving GI on opaque pixels. Because there's an arbitrary number
of layers for every pixel. So now, I'm just going to go
over some of the Lumen settings that you'll see in early access. If you make a new project,
Lumen's on by default. It's the default dynamic GI
method for the engine. But if you bring over
an existing project, then you just need to
go into Project Settings and set this one thing,
which you can find really quickly if you do a search. Just set Dynamic Global
Illumination Method to Lumen and it
will automatically set the other dependencies,
which are these three guys. These are the things
that we change to be on by default
in Unreal Engine 5, which is Lumen GI, Lumen Reflections
and Generate Mesh Distance Fields for
Software Ray Tracing. CHANCE: So that first
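As an aside, the three dependencies Daniel lists live in the renderer settings, so they can also be set in config rather than through the UI. A sketch of what this might look like in DefaultEngine.ini; the console variable names are from UE5 Early Access, so verify them against your engine version:

```ini
[/Script/Engine.RendererSettings]
; Dynamic Global Illumination Method: 1 = Lumen
r.DynamicGlobalIlluminationMethod=1
; Reflection Method: 1 = Lumen Reflections
r.ReflectionMethod=1
; Required for Lumen's software ray tracing path
r.GenerateMeshDistanceFields=True
```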
one would be more about if you're moving a project
over, and then if you want to take Lumen and
turn it off or toggle it back, this would be where
you would check? DANIEL: Yeah, you can do it
here for the whole project. You can toggle between
Lumen and not Lumen. CHANCE: Got it. And project wide settings, too? DANIEL: Yeah. CHANCE: That being in projects. DANIEL: Yeah. Yeah, all the project settings
are for your whole project. So that you can also change the
method in post-process volume for just one level. I'll show that in
a couple slides. CHANCE: Super neat. DANIEL: Here's some other
important project settings, which I'll talk
more about later. But these are the ones
that you need in order to get top ray tracing
quality from Lumen. You've got to turn on Support
Hardware Ray Tracing, which makes it so that we
compile all the shaders and build all the structures to
support hardware ray tracing. And then you have to tell Lumen
to use hardware ray tracing, which you can toggle
this one on and off and see the influence of it. CHANCE: Got it. DANIEL: Lumen also has
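Those two hardware ray tracing switches can likewise be set in config, or flipped at runtime from the console to compare results. A hedged sketch; again, the cvar names are from UE5 Early Access and may differ in your build:

```ini
[/Script/Engine.RendererSettings]
; "Support Hardware Ray Tracing": compiles the shaders and builds
; the acceleration structures (requires D3D12 and a supported GPU)
r.RayTracing=True
; Tell Lumen to actually use hardware ray tracing;
; can be toggled on and off to see its influence
r.Lumen.HardwareRayTracing=1
```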
a couple more settings in the post-process volume. You can change the
method per volume. So you can have one-- we use this in our QA
stuff all the time, where this level uses
Lumen and this other level uses something else. And then you can crank
up the quality of Lumen. This is mostly meant to be
like, in this one hallway I need higher quality. It's not meant to target
different platforms. That happens somewhere else. That happens in the
Scalability.ini file. CHANCE: So this would
be more about like, you're building your
scene, you're building maybe a game or something where you're
going to move through a space and you've got the
budget in this space. So things are closer
to the camera, you want to just right now
spend those extra milliseconds on bumping these numbers up
using a post-process volume? DANIEL: Yeah. CHANCE: Got it. DANIEL: And like ArchViz
projects that generally don't have a ton of different maps. Project Settings are really
powerful because you're automatically setting
across all the maps. If you're in a big
production, people are making test
maps right and left and you want to make sure they
all have consistent lighting settings. That's why you get
to have the project setting. But if you're not
in a huge project and you can just
control it all yourself, you can just do everything
through the post-process volume. So Lumen doesn't have that
many controls, you notice. And not that many
things to fiddle with. Actually, because Lumen
is a lighting simulation, most of the inputs to
Lumen are actually coming from your light settings-- your existing light settings,
your material values, and the exposure you're setting on the camera. And then Lumen is taking
those and propagating them through the simulation
and bouncing the lighting. CHANCE: That's great. DANIEL: So let's look
behind the scenes at how Lumen actually works. So just a high-level peek. From the high level,
Lumen uses software ray tracing by default,
which is something we've developed for Unreal Engine 5. And it can also support
hardware ray tracing, which I'll talk about a little bit later. First, Lumen traces
against the screen-- the depth buffer. And if that misses-- sometimes those rays
go behind something or they go off the screen-- then Lumen traces against
an alternate representation of the scene that we
maintain, which has signed distance
fields for meshes. And then once the ray hits
the signed distance field, then we get the
lighting for that hit point with a surface cache. You guys have any
questions about that? Does that make sense? CHANCE: As a non-rendering,
non-artist person, I think that it's a little
bit over my head for this one, but I get it. DANIEL: I have some
nice screenshots to show to hopefully
show it a little better. So Lumen doesn't just use
one type of ray tracing, it's a hybrid ray
tracing pipeline. So screen traces,
like I was saying, when they go off the screen,
they can't find anything. So then it switches
to the next method, which is tracing the individual
objects' distance fields, which is more reliable
than screen traces because they work off screen. But there is a cost
to those traces. So once the ray
gets even further, then we only trace against
this global representation where we've merged
everything down. And those traces
are extremely fast. And that's what's called
the Global Distance Field Trace here. So there's no way to
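The core trick behind tracing a signed distance field is that the field's value at any point is the distance to the nearest surface, so it always gives a safe step size along the ray. A minimal toy sketch of that "sphere tracing" loop, against a single analytic sphere rather than the engine's voxelized fields; `sphere_sdf` and `sphere_trace` are illustrative names of my own:

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 5.0), radius=1.0):
    # Signed distance from point p to a sphere:
    # negative inside, zero on the surface, positive outside.
    dx, dy, dz = (p[i] - center[i] for i in range(3))
    return math.sqrt(dx * dx + dy * dy + dz * dz) - radius

def sphere_trace(origin, direction, sdf, max_dist=100.0, eps=1e-4):
    # March along the ray; the SDF value is always a safe step,
    # so we can skip through empty space without missing the surface.
    t = 0.0
    while t < max_dist:
        p = tuple(origin[i] + t * direction[i] for i in range(3))
        d = sdf(p)
        if d < eps:
            return t  # hit: distance along the ray
        t += d
    return None  # miss

hit = sphere_trace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), sphere_sdf)
# hit == 4.0: the ray reaches the sphere's surface at z = 4.
```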
see this visualization. I just hacked the
code to show this-- to let you see
behind the scenes. But this is what ends up
happening for a single scene. Each pixel's GI is solved
through a combination of different types of traces. CHANCE: Are some of these
some of the dynamic settings you've built over
the years for UE4? You're just able to
take some of those-- some of the same
processes and apply them all together to get the
best rounded out result? DANIEL: No, we developed a
lot of new software tracing technology. Our screen traces are all new. They're hierarchical z-buffer
traces, didn't exist before. Our mesh distance field traces--
we had distance fields in UE4, but they were not sparse. This is my next slide. And our global distance
field is way better, too. It's also sparse now. So we've basically rewritten
global distance field and mesh distance fields to be way
better in UE5 for Lumen. Those were used in UE4
for distance field shadows and distance field
ambient occlusion. So our mesh distance
fields are way higher quality in Unreal Engine
5, which we rely on heavily in Lumen. We now have a sparse
representation. So we're only storing data where
we need to, near the surface. We generate mipmaps of
those distance fields so that we only need to load
the right version depending on how far away you are. And as a result, we were able to
double the default resolution, which since it's 3D, actually means eight
times more data. Because it's doubled in x,
doubled in y, doubled in z. And four times
higher max resolution per mesh, which is really
important to supporting those large meshes that
were failing with distance fields in Unreal Engine 4. CHANCE: I'm sure
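The "eight times more data" figure follows directly from the geometry: a distance field is a 3D grid, so doubling the resolution on each axis multiplies the voxel count by 2 x 2 x 2. A worked example with made-up round resolutions, not the engine's actual defaults:

```python
def voxel_count(res):
    # A dense distance field stores res voxels along each of x, y, z.
    return res ** 3

base = voxel_count(32)     # hypothetical old per-axis resolution
doubled = voxel_count(64)  # the same field with each axis doubled

ratio = doubled // base    # 2 * 2 * 2 = 8 times the raw data
# Sparse storage (only near surfaces) and mipmap streaming are what
# keep the real memory cost from growing by that same factor.
```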
that helps handle the hyper high-res Nanite assets
as well a little bit better. DANIEL: Yes, it makes the system
a lot more robust. If you just go and
place a mountain, it works much better now. In Unreal Engine 4,
there were a lot of artifacts if you placed a large object
with distance field ambient occlusion. Since we increased the
resolution so much, you'd expect that it would take at
least 10 times more memory. But actually, it uses about half
the memory of Unreal Engine 4 because of streaming and
the sparse representation and a bunch of
optimizations I did. CHANCE: Interesting. DANIEL: We also made the
distance field building 10 times faster,
which is kind of necessary. Because once we
increase the resolution, it became quite a pain point. But it's 10 times
faster now so that it's kind of like hidden in the
background for the most part. And we improved this mesh
distance field visualization so you can actually get
a better sense of what the distance field
scene looks like. It does some simple lighting now
using the distance field normal and the hit position. CHANCE: Got it. On those improvements there,
are those Lumen specific? Or would they affect
the DFAO as well? Or are they just
mostly for Lumen? DANIEL: They're
not Lumen specific. Distance fields are usable
by other rendering features. CHANCE: Great. Cool. DANIEL: Like I said,
they were very much developed for Lumen so the benefit isn't quite seen in other
features as much as in Lumen. So distance fields only give
us, basically, what's in this visualization. They give us a hit
point and a normal. But how do we actually know what
was the material color, what was the incoming
lighting there so we can bounce it to your eye? Well,
distance fields can't give us that so we had to come up
with a different solution. And what we came up with is
what we call the Surface Cache. And it's basically,
capturing meshes from different orientations and
storing them off into an atlas with all of the material
properties sorted out. And we maintain this as you fly around the scene.
We're recapturing all the time. I got to say something
about the Surface Cache, it has one big
content limitation. And this is the biggest
content limitation of Lumen, which is that
the Surface Cache can only work if your meshes
have simple interiors. The walls need to
be separate meshes, they can't all be in one mesh. And it's not
something we're happy with, obviously. But it is a current limitation. And I can't remember what else
I was going to say about that. CHANCE: It's all good. How big is that big
sheet there in memory? DANIEL: It is a 4096. CHANCE: 4096, OK. DANIEL: These are all cached
so we're not recapturing them all every frame. We're going to be recapturing
a very small portion. Basically,
what we can afford to cache-- to recapture. And they're kind of recaptured--
as you move around, we recapture them at a higher resolution. Or if you move away, we recapture at a
lower resolution. And these captures, these really
small, low-resolution captures are actually stupidly
slow without Nanite. Because if the mesh doesn't
have a really good LOD setup, it's going to process way too many triangles to
render at like 8x8. And also just based on the
limitations of the hardware rasterizer, you have to render
into one view at a time. But Nanite,
since it's a custom rasterizer, it has a multi view feature. So we can just
render it-- we just blast into all of
the different views for each different projection
of the mesh at the same time. And we measured, it's between 10 and
100 times faster, depending on how inefficient
the original content was to the point where
you basically need Nanite. If you're going to use Lumen
with any high-poly assets, and especially instanced static
meshes, they need to be Nanite. [INTERPOSING VOICES] GALEN: Oh, sorry. Go ahead, Chance. CHANCE: No, no, no. It's good. GALEN: I was just
going to say maybe it's worth lingering here for a
second, because-- I mean, we just had our livestream about
Nanite last week. And some of the limitations
there with the tech as it currently exists
for early access. So what if I'm
making a scene just hypothetically with
lots of things that are moving that are not Nanite? What would you
kind of prescribe, I guess, for a scene that maybe
has a handful of Nanite assets, but there's also a
handful of assets that you know for sure are
definitely going to be moving and therefore, not Nanite. DANIEL: The main thing we need-- that Lumen needs to be efficient
is to have LODs set up. If you have a million
poly mesh with no LODs, it's going to be slow. But if you use instanced static
meshes, it has to be Nanite. There's no way around it. The performance
increase for Nanite is the only way to make
it fast enough to use. GALEN: Makes sense. CHANCE: And you can use
Nanite on lower-poly meshes, too, as we discussed last
week. It doesn't have to
be a super high-- [INTERPOSING VOICES] DANIEL: You should
definitely use it. Because of the GPU driven
rendering that Nanite uses, it can blast through-- there's very little
overhead per instance. So after we've captured
the material properties for the surface cache,
we still need to light them. So we go through
and apply each light to all these little cards
in the surface cache. And we also do a GI gather
in the surface cache. This is how we provide
multibounce in Lumen. And that GI gather is not
particularly high quality right now. And I'll show you some
artifacts from that later, which we hope to improve. So you can see the combination
of the mesh distance fields and the surface cache. You can see what they look
like by going into the editor: Show -> Visualize -> LumenScene. And this is basically
what Lumen sees while it's doing a global
illumination or reflection trace. If the screen trace doesn't have
anything or it goes off screen, then Lumen picks up the ray and
traces in this representation. So this is like the most
important view mode for Lumen. Because if your scene
in the LumenScene doesn't match what's
on your screen, there's going to be
view-dependent artifacts in GI. So this is your number
one tool for seeing what Lumen is doing. Why is there an artifact? CHANCE: Yeah,
and you can A/B this with what your-- you can toggle
this back and forth to make sure that it looks roughly
the same color-wise and size-wise and
whatnot, right? DANIEL: Yeah,
it's really convenient because if you use the
G key in the Editor, you could just flip back
and forth between them. Because you can have
different show flags for like the G-mode and non G-mode. I'll show you some
of that later. So these mesh distance fields,
they're pretty accurate. Not as accurate
as the raw Nanite triangles,
but they're pretty accurate. But they do cost something
to trace through, because the cost of tracing is
dependent on how many meshes the artist placed. If they placed a
ton of meshes, especially overlapping meshes, each trace has to go
through all of them. And so we have an optimization
for software ray tracing, which is to merge them all together
into a global data structure. We merge all the
mesh distance fields into a global distance field. And in Unreal Engine 5,
the global distance field is double the
resolution of Unreal Engine 4's, with a very efficient cache implementation. Not something you need to
know but behind the scenes, it's better than what was
used in UE4 for distance field ambient occlusion. CHANCE: Right on. So when I asked earlier about
using these things for DFAO, it doesn't really matter as
much if you're just going to use Lumen going forward? DANIEL: Yes. Yeah. I do hope we find other ways
to leverage distance fields, but Lumen is the primary user. So on top of merging
distance fields, we also merge the
2D surface lighting into a voxel representation. And that's shown
here on the right. The left is actually tracing
against the mesh SDFs, getting a really accurate hit,
looking up into the 2D atlas. And on the right is the
merged Voxel version, which is so much faster
to ray trace against but more approximate. So that optimization is
always used by default after two meters. We trace against the individual
meshes for the first two meters,
and then we switch and we trace against only the
global merged version, which makes software ray
tracing very efficient. But in some crazy
scenes where you just have a ton of
meshes overlapping, with this project
setting, you can tell Lumen software
tracing, only trace against the
global version. And that's very fast. CHANCE: Got you. And that's also just
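That global-only switch is also exposed as a console variable, which makes it easy to A/B the performance difference in a mesh-heavy scene. A hedged sketch; the cvar name below is from UE5 Early Access and may differ in your build:

```ini
; In ConsoleVariables.ini, or entered in the editor console:
; 0 = skip per-mesh distance field traces and trace only the
; merged global distance field (fastest, most approximate)
r.Lumen.TraceMeshSDFs=0
```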
there in Project Settings? DANIEL: Yeah. Software tracing has
some limitations. It's kind of like to make
this stuff real time, we had to limit our
scope of the problems we were trying to solve. We only support static meshes
and instanced static mesh. Screen traces work
against anything, but the distance
fields only exist for static mesh and
instanced static mesh. And we don't have Landscape. We know we have to add that. That's not intentional, just didn't get to
it in the early access timeframe. In the 5.0 release, if you're building
an Unreal Engine 5 project, you can use
Landscape with Lumen. You don't need to
worry that like we won't add Landscape support.
We definitely will. I don't usually make promises
about future releases. But that's one-- Lumen has to have
that to be successful. World position artifacts-- world
position offset causes some artifacts with Lumen
because we cannot replicate per-vertex deformation
in the distance fields, which were built off
of the non-deformed version. And we don't have a good
solution for that right now. What we hope to do is
the same thing distance field ambient occlusion has
where there's just like a bias you can provide and say, don't self-shadow within
this distance. That allows you to do some
sway without having it self-intersect
and get all black. CHANCE: I see. DANIEL: And there are a
bunch more limitations. By the way, the Lumen
documentation is really good. Check it out. We spent a lot of time on it. We were very thorough with
all the limitations and stuff. And you can find the full list
of limitations in the docs. CHANCE: Well,
thank you for doing that. I'm glad we have
docs at early access. That's super great. DANIEL: Yeah. So that was all
software ray tracing. Didn't require any
special video card. It ran on any DirectX
11 hardware or greater. And has options
to run super fast. But supports limited
geometry types. So let's talk about hardware
ray tracing in Lumen. So here's how you turn on
hardware ray tracing for Lumen. You have to tell the project to
support hardware ray tracing, and then you have to tell
Lumen to actually use it. And you can toggle
this one on and off and see what it's
doing on the fly. So hardware ray
tracing, high level, it can provide the highest quality,
but it also costs the most. And you need a special
video card to run it. And on PC, it only will work
if you're running under D3D12. And the video cards that you
need that support hardware ray tracing are either
Nvidia RTX 2000 series or higher or the
AMD RX 6000 series, which are both very
recent video cards. And in early access, Lumen's use of hardware ray tracing is still partial. We support it for
reflections and for part of the final gather,
but we're still working on supporting it
in all of Lumen's facets. So you actually
need distance fields even when using hardware ray tracing in early access for Lumen to work. Nanite enables an order of
magnitude-- several orders of magnitude higher
detailed geometry. But it accomplishes this
through some rasterizer specific techniques and decompressing
the vertex format on the fly, which are not
compatible with hardware ray tracing's limited API
and representation. So we cannot hardware ray
trace against the raw Nanite triangles. Instead, Nanite provides
a proxy geometry version, which is a simplified
version of the geometry. And hardware ray tracing
operates on that. And most of the time,
we can cover over this mismatch with screen traces
because what's on your screen is the actual Nanite rasterized
geometry at full resolution. So the screen traces are tracing
against the full resolution version. And then once it goes behind
something, it gets blocked, then it switches to
the proxy geometry. But in some cases, where you
have something really important like a car, the proxy geometry
will cause some leaking. And then you can raise the
Proxy Triangle Percent to 100 to get rid of that leaking
and have the hardware ray tracing use the
full res version. CHANCE: And to be clear,
this would be Nanite settings to get rid of some of these
artifacts, correct? DANIEL: This is on the
static mesh under the Nanite category. CHANCE: Got it. OK. And this would be probably like
really case by case, something that you might
encounter at some points if you go asset by asset. Not something that
you should consider project wide as
part of a workflow. But just kind of as you find
things, just kind of-- OK. DANIEL: In fact, there is no project
wide setting. So you can't-- CHANCE: Well, I'm just saying-- DANIEL: You can't just
set it in project like-- CHANCE: Yeah,
as people build their pipelines, start importing assets and
say, oh, we heard this thing or
we found this one asset, there's a hole and we changed
this thing, so now every asset we want to change this to be-- up the [INAUDIBLE] proxies. It's not necessarily
something we would encourage. It'd really be more
about, hey, just using it like you normally would. And then as you find
issues, this is a good place to look
to try to overcome those. DANIEL: Yep. So you'll see in the
LumenScene visualization, you'll see what those holes are. And then as you increase
the proxy triangle percent, you'll see that it gets fixed. In this Lake House
level from Rafael, which I've been testing
with hardware ray tracing, I didn't have any issues with the Nanite proxy
geometry at all. So I didn't have to do anything
to fix any artifacts there, even though what we're
ray tracing against is a simplified approximate
version of the triangles. You'd expect on a curved
surface, some of the rays would self-intersect, but
actually, the screen tracing is doing a pretty good job of
escaping the proxy geometry. GALEN: A quick
question actually. So for the last two projects
that we've done internally here at Epic,
I'm just curious if you felt like you've seen
diminishing returns as far as increasing
the percentage of the proxy geometry for Nanite
assets with regard to Lumen? Is there some-- I know that we must have been
using kind of Megascans assets for those last few projects,
so I was just curious, do you feel like there's kind
of a sweet spot or some area where you feel like we're
seeing diminishing returns once we get past a certain
percentage for those setups? DANIEL: For the most part we just use the default
Nanite proxy triangle percent, which gives you
about 2,000 triangles per mesh. And it's been good
enough, except on cars. That's the main pain point. Because cars have
very thin surfaces, and especially if you start
from a very high-poly mesh, the Nanite proxy geometry
will be a heavily simplified version of that. And once it gets
simplified to that degree, some cracks will show up. And we may improve these
in the future, such that this is not even
an issue anymore. But currently,
we generally don't have to touch this, except on cars. GALEN: OK, yeah. Because I'm just curious
if we're saying it's mostly like 2,000 triangles per
asset, increasing it to say like 20,000 per Nanite proxy, would you expect
to see huge return from that being 10x or no? DANIEL: Do you mean
in terms of quality? GALEN: Yeah, in terms of quality for
Lumen, yeah. DANIEL: Generally, if you just-- I think the default is like 1%. The proxy geometry is
1% of the original. If you set it to 2% or 4%,
it will fix all the bugs. So you don't have to go
all the way up to 100, you just need to
go up a little bit. GALEN: Cool. CHANCE: So that's
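As a back-of-the-envelope sketch of those percentages (the 200,000-triangle source count below is an illustrative assumption; only the ~1% default, the ~2,000-triangle result, and the 2-4% fix-up range come from the discussion above):

```python
def proxy_triangle_count(source_triangles: int, proxy_percent: float) -> int:
    """Approximate size of a Nanite proxy mesh: a percentage of the
    source triangle count (the Proxy Triangle Percent on the static mesh)."""
    return round(source_triangles * proxy_percent / 100.0)

# Hypothetical 200,000-triangle source mesh: the ~1% default lands at
# the ~2,000-triangle proxies mentioned above.
source = 200_000
print(proxy_triangle_count(source, 1.0))    # 2000 (default)
print(proxy_triangle_count(source, 2.0))    # 4000 -- usually enough to fix leaks
print(proxy_triangle_count(source, 100.0))  # 200000 -- rarely needed
```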
a very little bit. I guess it's double. But I mean, still, 1 to 2 is not that
big of a jump. DANIEL: But it's just on
an individual mesh, yeah. CHANCE: Yeah. DANIEL: I mean,
people are not really going to have to
touch this setting. I just mentioned it
for full disclosure. And I think if you're making a
scene that has to be perfect, ArchVis, then this is a
setting you need to know about. But otherwise,
you pretty much-- it just works. GALEN: Cool. Awesome. DANIEL: Some other
things to know about hardware ray tracing
is that it does not scale to Nanite-size scenes. And in early access, you need
to keep your scene kind of small at less than 10,000 instances. But this is something we're
actively improving already. And actually,
in the main branch, we can already afford 100,000-- 10 times more. And hardware ray tracing
does support skeletal meshes, which is one of the big draws
of it over software ray tracing. However, those skeletal
meshes actually cost a lot. Because it's dynamic
geometry, we have to rebuild the ray
trace acceleration structure for every single frame. So it's very important to keep
what ray tracing is building for them, very low poly. And we're working on
a ray tracing LOD bias so that what's on your screen
can be very high quality, but what we're ray tracing
against can be a lower quality. That does not exist
in early access yet. CHANCE: And just to clarify, because as not an
artist, you say that it's more expensive
on skeletal meshes or skinned meshes. That's not because they move,
because they deform some, right? Because the amount
of surface area that a character might actually
show in a scene is more or less, depending on its state. Or is it more about movability? DANIEL: It's entirely
about deformation. CHANCE: Yeah, OK. DANIEL: So the ray tracing is accelerated by a
bounding volume hierarchy. And for static geometry,
just build it once at load and then you're done. But for dynamic geometry
and deforming stuff, you've got to rebuild
every single frame. And that is very expensive. The last thing about hardware
ray tracing performance is that if you want to use it,
if you want to build your scenes so that you can use it,
you cannot overlap your meshes to a crazy degree. You cannot copy/paste meshes
and have them all overlapping. Because hardware ray
tracing has to go through every single
one of those that's overlapping every single ray. And it becomes incredibly slow. So hardware ray tracing
overall, we see it's about 50% slower than software
ray tracing, while being more accurate. But once you start
overlapping meshes, then it can become
significantly slower than software ray tracing. So this is something
to keep in mind as you're building your scenes. If you plan on using
hardware ray tracing, this is something you
need to keep in mind. CHANCE: Kitbashing
has huge costs. Don't tell Daniel how we
built Valley of the Ancient. GALEN: Yeah, I've never seen this screenshot in my life. DANIEL: Hey,
I didn't want to out you. CHANCE: No, it's all good. DANIEL: You guys mad
about that screenshot? CHANCE: No. It's fine. We talked yesterday with-- or last week with Brian
about how much we learned on the project. Kind of taking a lot of
these new technologies and throwing them in and
seeing what we could actually do with them for early
access, right? And so just like with Nanite,
there's a lot we learned about what we can and cannot or should
and shouldn't do with Lumen to stay within
performance budgets. DANIEL: I think it's pretty cool that even though there's 15
layers underneath her foot, we have a software
tracing solution that gives good performance
in this situation. I have a pretty crazy
thunderstorm going on. VICTOR: There's a
thunderstorm going on, yeah. I was going to say, if I drop or
me and Daniel drop, that's why. CHANCE: Vic, Daniel,
and Chance... DANIEL: My plants are getting
a good drink right now. OK, so I talked about how Lumen
traces rays to find light. But I didn't talk about
how we actually provide global illumination
and reflections. So the Final Gather is
the name for the process of propagating the light
from the LumenScene to your final pixels
on the screen. And figuring out that light
with a very low amount of noise, because we can only afford to
trace about half a ray per pixel, which is a very
small amount of rays that we can afford. Whereas, to actually provide
noise-free GI, you would need about
200 rays per pixel. Because ray tracing
is so expensive, each individual ray costs a lot. You have to make
the best use of it. CHANCE: And the
difference there, too, just to understand the
half, we can only afford this, but you need 200 for these
things, again, is it indoor
because you have all that bounce from a
single kind of location? It's filling up all that space
and you're really close to it generally? DANIEL: When you go
to trace the ray, you genuinely don't know where
the lighting is coming from. Could be from anywhere. It's very different
than direct shadowing where the engine has a
list of all the lights and it just can go
through them and send a ray to every single one. With global illumination,
because it's the entire scene that's
bouncing light into you, you don't have a list of them. You have to just send rays
out and see what you find. So the smaller and
the brighter those areas are,
the harder it is to solve, the more noise there will
be in the final image. And that's why indoors
need a lot more rays to find that lighting
than outdoors. So we have this problem
where we need 200, we can only afford a half. How do we make this
400 times faster? And some existing-- [LAUGHTER] CHANCE: This is only, right? DANIEL: This is why dynamic GI is such a hard problem. And existing approaches
mostly fall into two buckets. Either people use
irradiance fields where they do ray traces from
a very small set of directions that are spread out far apart. And then they have to do
a whole bunch of stuff to hide the fact that
they're not actually tracing from the pixels on the screen. They're tracing from over there
somewhere, which might even be on the other side of a wall. They have to store
occlusion in the probes. They have to interpolate it. They usually require the
artist to set up volumes around the level manually. And if you have a
little complex area, the artist has to go and
put more probes there. They also usually have a very
slow lighting update speed. If you change the time of
day, they need to update very slowly
because, otherwise, dynamic-- because they're not tracing
for very many positions, a dynamic object going
through the probe would cause a lot of
artifacts that would just get spread all over the place. So irradiance fields are
great for performance. But they are very
limited in quality. Not what we were
looking for for Lumen. And the other dominant approach
to solving the final gather is a screen space denoiser. This is where you
just trace one-- which is all you
can afford-- one ray per pixel distributed randomly. And then you have an
extremely noisy image. And then you try to go and look
at the neighbors on the screen and reuse them to get
something that's more stable. But the high level
problem with this approach is that when you're
denoising after the fact, you've already traced the
rays, it can only do so much. I'll show you a comparison
in a little bit that's just showing how noisy the
input is with screen space denoisers, which makes it very
difficult for them to overcome to actually produce
good indoor quality. So for Lumen, we went in a
different direction. And we used screen
space radiance caching, which allows us to trace from
a very small set of positions on the screen. We generally downsample
by about 1/16 and do more ray traces
at each of those positions. So each probe in
the radiance cache has much more stable lighting. And then we interpolate
that to neighboring pixels. And that allows us to do
way more rays per point to get stable lighting. And then we do the integration
with the Nanite super-detailed normal at full resolution
so that the final lighting is super detailed. CHANCE: Another reason
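The budget math behind that design can be sketched as follows -- assuming the 1/16 downsample applies to each screen axis, which is my interpretation rather than something stated outright:

```python
def rays_per_probe(ray_budget_per_pixel: float, downsample_per_axis: int) -> float:
    """Redistribute a per-pixel ray budget onto a sparse screen-probe grid.

    One probe stands in for downsample_per_axis**2 pixels, so it can
    spend all of their pooled ray budget on stable lighting.
    """
    pixels_per_probe = downsample_per_axis ** 2
    return ray_budget_per_pixel * pixels_per_probe

# The gap from the talk: ~0.5 rays/pixel affordable vs ~200 needed for
# noise-free GI -- the "400 times faster" problem.
print(200.0 / 0.5)              # 400.0

# Pooling the half-ray budget over a 16x16 block of pixels gives each
# probe a far healthier ray count, which is why each probe is so stable.
print(rays_per_probe(0.5, 16))  # 128.0
```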
these work so well together every time. DANIEL: Yes. And it informed us on our
Lumen algorithm design. We knew from-- some lightning. We knew from day one
with Lumen that we had to support extremely
detailed geometry. And that's another thing
that screen space denoisers fall down at. You can only reuse other
rays on the screen that came from similarly facing geometry. You have to have a normal
weight in your filter. And that basically
rejects everything with Nanite geometry where
every single pixel has a different normal. Lumen also has-- to get really
stable distant lighting, we also have a world
space radiance cache where we placed a much
smaller number of probes and trace from those and
reuse those for many pixels. And you can actually visualize
that world space radiance cache,
see those probes in the world, and what Lumen is
gathering for them. So this is just for
distant lighting. A lot of people are used to
seeing a probe grid like this. But other techniques use
this as their final lighting. Lumen does not. This is just distant lighting. And we're still ray
tracing from the pixels on the screen,
which is how we're able to get such detailed and
direct shadows. Just to summarize
everything, why Lumen gets such great results indoor
quality versus a screen space denoiser is that
basically the input-- the raw traces that
come out of-- the result of the raw traces with Lumen
is already so much less noisy that once we go to
do our filtering, we don't have to
filter very much. So here on the
right, Lumen is doing a quarter of the traces
of a screen space denoiser and already has a much
more stable input. This is not the final image. The final image will
be after filtering. But you can see that
Lumen already has-- the Lumen filtering has
a lot less work to do. And I'm going to be presenting
on the Lumen Final Gather techniques this year at
Advances in Real-Time Rendering at SIGGRAPH. This is just super high level. VICTOR: And we'll make
sure to let everyone know when that's happening
so they can tune in if you're excited. DANIEL: Yeah. Another slide deck to make. CHANCE: It's OK. You do lots of them. DANIEL: I'll use some of these. VICTOR: I'll make
sure to send you the updated template with
the new logo for next time. DANIEL: Hey, you didn't
say anything last night. VICTOR: I didn't. DANIEL: I don't know
if you can complain. VICTOR: That was
not a complaint. Like I said,
I will make sure that I send you the updated template next time. CHANCE: Yeah. We just love that Daniel's
here, going legacy on us. DANIEL: They said it's
like every quarter, there's like a new template--
a new Epic template. And I hate 75% of them. So I keep coming
back to this old one because it's just like it
hides in the background well. It doesn't distract from
what you're trying to show. Some of the latest
ones, one of them was bright yellow text
on a black background. It's like offensive
to your eyes. VICTOR: Coming from
marketing here, I'll make sure to bring your
feedback back to the team. Because, honestly,
I do find it valid since you are one of
the programmers who are producing our tools here. CHANCE: Alan Noon in
the chat from our team, I feel attacked. DANIEL: Did he make it? CHANCE: I don't know. VICTOR: He's here. CHANCE: I do find
it a bit awesome that it's one of our rendering
engineers about lighting that's talking about
how bright the text is on a dark background. I think that that is-- DANIEL: And past a
certain contrast level, your eyes-- it just distracts you from the
content of the presentation basically. Anyway, I'll get back
on track now that I've finished ranting about that. So Lumen Reflections. So I talked about
the final gather. I talked about how
we do ray tracing, I talked about the final
gather, how it uses that ray
tracing to produce very low amounts of noise in our GI. And now, I'm going to talk about
how Lumen solves reflections. So Lumen Reflections,
we reuse the traces from the final gather if
the material is very rough, because we already did
that work to figure out what's the incoming lighting. We already did a bunch of
work to make it low noise. And if you think about it,
as the material gets rough, the specular lobe
gets wider and wider. And eventually,
it just converges on diffuse, which is a full hemisphere. So that's one of the reasons
why Lumen Reflections and Lumen GI go so well together is
that half the reflections on the screen are
actually being provided by Lumen's final gather. For the rest of those
pixels that actually have a very smooth material,
we trace extra rays to solve the reflections
that are distributed just in that reflection cone. And that does increase
the cost of Lumen. It shows up in
Lumen Reflections. CHANCE: Got it. DANIEL: And then
Lumen reflections has a spatial and temporal
denoiser that runs on it. And given only one ray per
pixel, it tries to provide you with like
very smooth glossy reflections. By default,
Lumen Reflections-- even if you turn on
hardware ray tracing, by default, when the ray hits something,
we get all of our lighting from the surface cache,
which is very fast. And makes Lumen
Reflections about twice as fast as if we had
actually evaluated lighting at the hit point. So it's a very good
scalability option. But it is on out of the box. And I saw some people
reacting to that. If you test Lumen
Reflections by default, it's not as high quality
as ray traced reflections. Well, if you set Lumen
Reflections > Quality to 4-- magic value-- you get-- on the hit point,
Lumen will go and actually evaluate the
lighting at that hit point, which gives a much
more accurate lighting and reflections. So if you are-- just to compare against
ray traced reflections, if you want to compare
between the two, make sure you set the Lumen
Reflections > Quality to 4. And the reason that's
a hidden setting is because it's early
access and it came in hot. GALEN: Did I remember
you saying something about reflections
related to HLODs also, or did I completely
invent that in my brain? DANIEL: It's a work in progress. GALEN: OK. I just wanted to check. DANIEL: It's not
in early access. GALEN: OK. DANIEL: Some other things
about Lumen Reflections versus ray traced
reflections, right now, Lumen does not have all
the same features as ray traced reflections,
but we're working toward that. And we're working toward Lumen
being the unified reflection method of Unreal Engine 5
that works with Lumen GI that works with Lightmap GI. Right now,
Lumen has a bunch of things that ray traced reflections
don't, especially in dynamic scenes. If you use movable lights,
ray traced reflections don't really work,
especially movable skylight because the skylight
will not be shadowed. Lumen has screen
traces integrated so things that don't exist
in the ray tracing scene will still-- Lumen Reflections will hide
that mismatch quite well, specifically Nanite meshes. Since we're not ray tracing
against the actual Nanite geometry,
ray traced reflections can have a lot of self-shadowing
artifacts with those. Lumen Reflections also
support software ray tracing. So it's not like you have to
have a special video card on PC. Any DirectX 11 hardware. And we have dynamic GI and
shadowed movable skylight in reflections through the
use of the surface cache. And Lumen Reflections
support clear coat properly with actually
two layers of reflections. The sharp one on the top
with the lacquer, and then the glossier one underneath,
which is not something that ray traced reflections supports. But ray traced
reflections supports lightmap GI in the reflections. So you can use
lightmaps for your GI together with ray
traced reflections. You cannot do that yet
with Lumen Reflections. That's something that we want
to support for 5.0 release. So long term,
looking to the future, ray traced reflections
are deprecated. And we're going to merge all of
their remaining functionality over to Lumen
Reflections, which is our scalable, unified reflection
pipeline for Unreal Engine 5. CHANCE: Right on. DANIEL: I wrote up
some best practices from observing how
people use Lumen and from our own projects
developed using Lumen before early access release. First thing is,
basically, the way that you can use emissive with
Lumen and expect good results. If you try to actually use
emissive meshes to replace light sources,
you're going to be disappointed with the results with Lumen. Or another way to look at
it, this is just a feature that
hasn't been implemented yet. Lumen does pick up emissive, but the smaller
and the brighter-- because Lumen is accepting
lighting from any direction, it doesn't know where
the emissive meshes are, the smaller and the brighter the
mesh, the noisier it will get. So you can use
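Why small, bright emitters get noisy can be sketched with a toy one-ray estimator -- the Bernoulli hit-or-miss model and the specific coverage/radiance numbers below are illustrative assumptions, not Lumen's actual sampler:

```python
def estimator_variance(coverage: float, radiance: float) -> float:
    """Variance of a single uniformly sampled ray toward an emitter that
    covers `coverage` of the sampled directions: the ray either hits
    (returns radiance) or misses (returns 0), a Bernoulli estimate."""
    return coverage * (1.0 - coverage) * radiance ** 2

# Same total light reaching the receiver (coverage * radiance = 0.1),
# delivered by a large dim emitter vs. a small bright one.
large_dim = estimator_variance(coverage=0.10, radiance=1.0)
small_hot = estimator_variance(coverage=0.001, radiance=100.0)
print(large_dim < small_hot)  # True: the small, bright caster is far noisier
```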
emissive with Lumen, but you need to kind of
keep that caster large. And you need to keep it dim. [INTERPOSING VOICES] DANIEL: Go ahead. CHANCE: You would still
want to go through and maybe put a point light or
two in those spaces to really get the lighting
you want in that space? DANIEL: That's the other way you can use Lumen emissive is
you actually place a spotlight or point light and you have
a little emissive light bulb, but you just keep it dim. You don't crank it up so
that it's luminous trying to propagate it
through the scene. Those are basically the
two uses of emissive that we see that are successful. CHANCE: Is there any problem with blowing it
over the value of 1 to get some bloom on that, too? DANIEL: No,
but as you crank it up, you'll start to see noise
appear in the scene. Like this, where-- I mean,
this isn't even the worst case. You'll start to see a
lot of noise appear. CHANCE: And even a
red X will show up. It's crazy. DANIEL: Right on. Yeah,
I programmed that in there. CHANCE: Yeah, cool. DANIEL: This is a
screenshot from the engine. Yeah, once you get to a certain
setting, it's like, no, no. Don't do that. CHANCE: That's great. DANIEL: BaseColor
has a huge impact on global illumination. So every time the light
bounces off a surface, it is attenuated
by that base color. So for the light
that's directly lit-- for surfaces that are
directly lit by the sun, it's only applying
the base color once. So if your base
color is like 0.1, you're getting 10%
of the sun's energy. But for indirect lighting,
it's applied every bounce. So if your base color's
0.1, you're getting a hundredth of
the sun's energy in the GI, which you're basically
not going to see. So your base color
needs to be bright if you want GI to
actually show up. And here's a comparison from
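That compounding is easy to sketch -- the albedo values below are illustrative, not taken from the project:

```python
def surviving_energy(base_color: float, times_applied: int) -> float:
    """Fraction of the light's energy left after the base color has
    attenuated it once per application (direct = 1, first bounce = 2...)."""
    return base_color ** times_applied

dark = 0.1
print(surviving_energy(dark, 1))            # 0.1 -> 10% survives in direct lighting
print(round(surviving_energy(dark, 2), 3))  # 0.01 -> ~1% in one-bounce GI, invisible

bright = 0.5
print(surviving_energy(bright, 2))          # 0.25 -> a clearly visible bounce
```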
one of our internal projects. The base color's on the left. The top is grayer and
darker base color. And the bottom is more
saturated base color. And you can see on the bottom
right, that's basically how much brighter
and more noticeable the global illumination
is than the top right with the dark base color. If you look at the
base color by itself, it doesn't look that
dark, right? It doesn't look like
it's going to have that big of an impact on the
GI, but it does. If you turn on
Lumen in your scene and you don't
really see anything, this is almost certainly why. And the single biggest impact
on having good lighting in your scenes is keeping
your base color bright enough to actually bounce the lighting
where you'd expect it to. GALEN: Yeah, I think it's
worth maybe lingering here for a second, too. So some of the
discoveries that we made specifically working
on Valley with this. One of the things
that we noticed in compensating for
this problem early on was that our guys were kind
of cranking the exposure to kind of compensate for just
general darkness in the scene. And that was
something that Daniel flagged to us immediately. And was this kind of
like, hey, look, this is something
that you all should be aware of as far as
how to sort of make this content look the best
that it possibly could-- or it can. And so I mean, we did go back and
brighten up albedos across the board for the assets
that are in the light world specifically. And as a result, I mean,
the bounces were so much better. We also increased the
value of the skylight. So I think the intensity
was set to 1 before Daniel took a look at it. And then we raised it to 1.3. There were some cases,
certain shots where we actually raised it to 1.5. And in some cases, even a little bit
higher than that to accommodate some kind
of cavernous shots and that type of thing. Yeah, I mean,
it was something that we learned through the process. And on a product
level for Megascans, we're definitely
hoping to make it so that Lumen is as compatible
with the Megascans calibration as we possibly can
out of the box. So something we're
working towards actively. DANIEL: And both of our
projects using Megascans ended up having to
boost the base color. So don't be afraid to do that. Don't be like, oh, it's scanned. It's got to be perfect. It's actually really
powerful to put a multiplier in your
materials that's driven from a material
parameter collection so all your rocks can
have the same multiplier. And then you can go in the
material parameter collection and just scrub it
for the whole scene and achieve your
artistic vision. And just see what the
impact of that is. And it can be really surprising
how much that one setting can add to the realism
of your scene. GALEN: Yeah, and so we're trying to address that on a
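A sketch of that idea in plain Python (this is the concept, not engine code -- the multiplier stands in for the one scalar you would expose in a material parameter collection and feed to every rock material):

```python
def boost_base_color(rgb, multiplier):
    """Scale an albedo by a shared multiplier and clamp to [0, 1],
    mimicking one collection-driven scalar feeding many materials."""
    return tuple(min(1.0, max(0.0, channel * multiplier)) for channel in rgb)

# One shared value lets you scrub every rock's albedo at once
# (illustrative albedos, doubled).
rock_albedos = [(0.10, 0.09, 0.08), (0.15, 0.12, 0.10)]
print([boost_base_color(albedo, 2.0) for albedo in rock_albedos])
# [(0.2, 0.18, 0.16), (0.3, 0.24, 0.2)]
```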
product level, like I said. We definitely want to make
it so that out of the box, you wouldn't have to worry
about something like that. But it's something that we
have to actively kind of comb through the entire library
to kind of get the result that we're looking for. So yeah,
our processing team is going to be communicating a lot more
with engineers going forward. And hope to have a much better
result out of the box soon. CHANCE: Awesome. DANIEL: So that's
all about BaseColor. Next best practices is about
Indirect Lighting Intensity. It's such a powerful setting. And it's so fun to set it just
to see bounce and then remove bounce. But we actually have a bug
with our implementation of it, which is that screen traces do
not know about this setting. When they hit something,
they have all of the lighting from all the lights baked
down into one value. And they don't know, oh,
that light was supposed to be five times brighter. Which is something we hope
to improve in the future. But what this will
cause right now, if you crank up
Indirect Lighting Intensity to a large value, you'll get basically
view-dependent GI from Lumen,
which looks like artifacts. As you look around,
the GI will be shifting. And then,
as I was saying before, the single biggest thing
to troubleshoot with Lumen is the surface
cache, where you can see Lumen's version of
the scene with the Show -> Visualize -> LumenScene. And a lot of times,
when it's black, it's because the mesh did
not have a simple interior. It was like a whole
room in one mesh. And if you want
good Lumen quality, keep an eye on this
view mode and make sure it stays in sync
with your main scene. We know all of these
content requirements suck. And it's not like it just works. But it's better than
building lightmaps, right? It's better than
authoring lightmap UVs and waiting 10 minutes for
the lighting build to finish. These are kind of
the things that allow us to make it real-time. CHANCE: Over night,
you mean for light-- DANIEL: Overnight. Yeah, depending on what you
have the quality set to. These are kind of
like the corners that we had to cut in
order to make it run in a handful of milliseconds. GALEN: Yeah,
I have lots of memories of working on projects where
we'd kick off light bakes, it takes several hours, you come back and
everything's broken. So I think it's a good tradeoff. CHANCE: That's right. Yeah. DANIEL: There's bugs that
only happen when you crank it to the highest settings. So everything was fine on
preview-- everything was fine. And then you put it on
the highest settings and you go to bed. Wake up the next morning
right before your deadline, and it's like-- CHANCE: Yeah,
designer editor door right there that opens
and casts shadow. DANIEL: Or like a door that
moves or anything dynamic. [INTERPOSING VOICES] DANIEL: OK, so Lumen platforms. Since we targeted
next-generation console and high-end PC,
that's all it runs on. And it is a bit of a
forward-looking technology that just like-- Unreal Engine 5 is not actually
released yet for shipping, so we hope to scale it down
better to more platforms that are out there today. But this is what we have
right now in early access. We do not support
Lumen on mobile. And we're not even
working on that because we don't think
that we can provide a version of dynamic
GI on mobile that will be Epic-quality basically, that will be good enough to use. CHANCE: I think on that note, is that a mobile hardware
kind of bottleneck or is it the rendering
tech like Vulkan or OpenGL? Or is it a combination
of all things? DANIEL: It's 100%
about the performance. Because we use
software ray tracing, we can run on a lot
of different hardware. But if you use dynamic GI on a
phone, it's going to get hot and your battery's dead. So it's just not a good
fit for the platform at this point in time. I mean, there are certainly
like probe-based approaches-- like irradiance field based
approaches that can work. But we're not currently
working on that. And similarly in VR,
VR has crazy high resolution, two eyes,
and usually needs to run at 60 or 90 frames per second. It's very difficult to give
acceptable dynamic GI quality under these constraints. So we're not developing
Lumen for VR. I know some people are
disappointed about that. I'm sorry. There are limits to
what can be done. CHANCE: On that note,
a question came across earlier and it had to do with deferred
versus forward rendering. This is a deferred rendering
technology, correct? DANIEL: Yeah. Yeah, so the performance
is just the first reason it can't work in VR. The second one is that
Lumen does require the deferred shading pipeline. And most projects
are relying on-- most VR projects rely
on forward shading for lower overhead and MSAA. And Lumen does require the
deferred shading pipeline. And then if you want to
use hardware ray tracing, that has some
extra requirements, which you can find fully
spelled out in the docs. CHANCE: Something I've
seen come across chat here. A lot of these things you said-- I know there's a lot
of caveats in here. If these things aren't
in the documentation, we can definitely get them
in there as well to just to make sure that people
can kind of use Lumen to the best experience
they can on the hardware that they've got. And that they don't paint
themselves into strange corners like we have a few times. DANIEL: Yeah, the only one that's
not in the doc is VR. I forgot to put that in the doc. But everything else is in there. So let's talk about Lumen
performance briefly. Lumen relies very heavily
on temporal super resolution to allow Lumen to render at a
much lower internal resolution than what's output
finally on the screen. And temporal super resolution
does an incredible job of getting close to the quality
of that final resolution while paying a tiny
fraction of that cost. And we feel like this
is really the best way to get the best final
image quality is not to actually render at 4k
native, which is a crazy amount of pixels. But to actually render at a
much lower internal resolution and then use temporal
super resolution to get an image that looks
much better than 1080. It looks close to 4k. However, consoles
have the upsampling set up very well by default. But PC does not. So if you load Unreal Engine
5 on a 4k monitor on PC, it's going to render
at 4k internally and it's going to cost a lot. So in editor, you can go and
lower the screen percentage and get much better performance. CHANCE: And it really comes
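In the editor, the screen-percentage experiment can be run from the console. r.ScreenPercentage is a long-standing renderer cvar; 50 is just an example value:

```
r.ScreenPercentage 50
stat gpu
```

At 50, the scene renders internally at half the output resolution in each dimension -- a quarter of the pixels -- and temporal upsampling fills in the rest; stat gpu lets you compare the GPU cost before and after.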
at very little visual cost. I mean, I think when we
were working on Valley of the Ancient we did that. A lot of our recordings
were done on 4k monitors. And we had to do a lot of
A/B testing to see the specific areas where we needed to
let TSR do its thing and bring it up and-- there's a barely
noticeable difference in so many different
things there. So I encourage you to
play with that setting, too, community, to find kind of what
you actually need and what you can get away with. It's a really powerful feature. DANIEL: The other thing
about Lumen Performance is that by default, you get Epic
scalability settings. And Lumen is,
by default, tweaked for 30 frames per second
on next gen consoles. If you want your
game to target 60, it's useful to look at the
High settings set up for Lumen. And those are the
settings we're expecting to use for 60 frames-per-second
games on next gen consoles. However, the whole Lumen
scaling down thing is super early at this point. We didn't get that much time
to work on it for early access. So we hope to have much better
results there for 5.0 release and later. CHANCE: Oh, right on. DANIEL: Yeah, which segues into things we want to
have for 5.0 release. I don't want to promise
very many things. And none of these are promises. But we do expect to have
a full hardware ray tracing pipeline in Lumen so that you
don't need to have the memory cost of distance fields and
ray tracing acceleration structures. And we expect to have
landscape supported. And we're looking for your guys'
feedback, which we've already gotten a lot of on the forums. Thanks. Thanks for that. We read all of that and it
helps inform what we're going to work on next. CHANCE: Yeah, I'd definitely encourage everybody to go hop in
there and take a look at that. Or add their thoughts, I guess I should
say, to that thread. Let us know what you think. That's why we do early
access, right? DANIEL: Yep. CHANCE: Before we hop
over into content, I just saw this [?
item ?] come up. Daniel, how long have you
been kind of working on Lumen? You and the team. I mean, you said there
was three or four people. I mean, how big is the
team size and how long have you all been at this? DANIEL: There's three of us. We've been working
on it for two years. We started in March 2019. And we've made a lot of
improvements in the last 12 months. Hardware ray tracing didn't
even exist 12 months ago. Patrick Kelly has been working
really hard on that hardware ray tracing path in Lumen. And having Lumen work in
these Nanite scale levels has been an incredible
amount of work. Krzysztof has done absolutely
incredible work to make that
actually happen, along with a ton of
other improvements. Since the Lumen in the Land
of Nanite demo 12 months ago, we've greatly
improved Lumen, which is actually what
I'm going to show you. CHANCE: That's awesome. DANIEL: So this is
internally called Reverb. Can't hide that
because it's up here. This is the Lumen in the Land of Nanite
demo, in-editor-- so the performance
is lower in-editor than it would have been in-game. I'm going to lock FPS to 30
so that the streaming will be smoother. And so in here,
Lumen is providing everything that's not this sunlit area. So if I go ahead and turn
off Lumen contribution, it switches to distance field ambient occlusion,
which because it's just ambient occlusion,
leaks skylighting. It's not sky shadowing,
it's just ambient occlusion. And it has some
self-shadowing artifacts. This was our Unreal Engine
4 dynamic lighting that can actually ship on consoles. And so this is a difference
between our Unreal Engine 4 best technology versus
Unreal Engine 5. And you can also toggle
Lumen Reflections. I'll show that a little later. So in this demo,
which is entirely Nanite meshes for all of
the environment geometry, Lumen is bouncing the lighting
from the directional light. Well,
this is actually a spotlight. But it's bouncing
the lighting and it's shadowing the skylight. So you can see
really detailed sky shadowing here where individual
meshes block the sky lighting. CHANCE: Right. So everything in here
casts a shadow also, right? Yeah. It's got a directional--
or spotlight and a bunch of-- just a ton
of super high-poly meshes that all cast shadows. DANIEL: It's really fun to
just start hiding stuff. Start moving stuff around. CHANCE: That's the first thing I noticed
in working in Valley of the Ancient was
deleting a mesh or moving a mesh and everything
just kind of like readjusting really quickly. And I'm like, what? I'm just not used to
seeing results like that. DANIEL: Not going to
get this with lightmaps. CHANCE: Yeah, not at all. DANIEL: That never happens. CHANCE: Yeah,
you'll get the "lighting needs to be rebuilt" message. DANIEL: That's right, actually. CHANCE: Then you
get your art team yelling at you because your
design broke the lighting. DANIEL: Yeah,
and in a production environment where there's a lot of
people working on the level, the lighting is basically
never up to date. The baked lighting is
just never up to date. It's always unbuilt.
And that's just gone. The problem is just gone. CHANCE: That's fantastic. And then in gameplay, you could destroy that
ceiling, right? Say that's a destructible
mesh, right? You just blow it out and then
you can walk through there. You don't have to have an
art team come and make sure that that visually, from a light
perspective, looks realistic. It's just going to do
what you expect it to do. DANIEL: Yeah. Yeah, usually,
you would like hide that mesh from casting a
shadow in Lightmass so that the bouncing-- so you
have to pick between two evils, basically. Do I want the sun to
bounce, where the bounce is there even while the rock is still there,
which looks wrong? Or do I want there
to be no bounce? You can only have one of
those scenarios look right. OK, so let's see. What else was I going to show? Let's look at the
LumenScene in here. Some black stuff,
which are artifacts. Oh, let me first show what
the distance fields look like. So this is the first-- this is the representation that
Lumen is ray tracing against. And it's not as detailed
as the Nanite geometry, but this is showing it
with simple lighting. So it's not supposed
to look good. It's just trying to show you the detail of the geometry. You can see that everything
is represented in a blobby form. So that's where we get our
software ray traces from. And then you can also look
at the global distance field, which shows our merged version. This is the super
fast software ray tracing, because there's only-- for any given point in space,
there's only one distance field that you have to check. You don't have to
go over all of them. And then we put that together
with our surface cache and get the LumenScene. This is what we're using to give
hit lighting when the screen trace didn't work. So it doesn't
match the triangles in a lot of different ways. But the screen traces
do a pretty good job of covering over
those mismatches. There is one thing
I want to mention: you see all these
little bright salt and pepper spots? That's noise in the
surface cache multi-bounce. And that's something that we
definitely want to improve. That's not the quality
that we hope to ship out. But didn't get to that
in time for early access. CHANCE: That's fascinating. DANIEL: See what else
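The distance-field debug views Daniel cycled through are also reachable from the console. These two show flags date back to UE4's distance-field features; the Lumen Scene view itself is under the viewport's Show > Visualize menu, and its console name may differ in early access:

```
ShowFlag.VisualizeMeshDistanceFields 1
ShowFlag.VisualizeGlobalDistanceField 1
```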
I'll show you here. This is kind of a
cool scene up here. Let me turn on the
directional light. And as soon as that comes
on, you can see all kinds
of bounce lighting. This is skylight only. And then turn on the
directional light, and as I move the directional
light around, the GI updates. Can get some pretty
dramatic lighting with that, especially if you make
it later in the day. CHANCE: And so this
is just the skylight and the directional light doing
all of this work in this space? DANIEL: Yes. Where did my
directional light go? It got lost. This widget is a little bit
of a challenge to work with. But you can make some really
dramatic sunset lighting. All bounce lighting,
this is all Lumen. It would just have been this
with distance field ambient occlusion on Unreal Engine 4. CHANCE: That's wild. GALEN: We didn't even
have the sky atmosphere stuff for this project,
which is kind of interesting. I'd be really curious to
see hooking that in seeing the difference there, too. DANIEL: Oh, man. And I love these
stalactite shadows here. These are so good. CHANCE: Yeah, really
rich, really detailed. DANIEL: Yeah,
we're doing our traces at such a low resolution,
you can't even tell. And in this case,
the geometry is so detailed but you still get really
accurate shadowing. I mean,
I didn't find this time of day before when I was testing. This is really cool. I got to look around. CHANCE: Chat's asking for a
blue, like a blue sun. DANIEL: A blue sun? All right,
let's go to an alien planet. That's cool. It looks like a horror movie. CHANCE: Yes. DANIEL: Where they just have a super bright blue spotlight
on the character's face. CHANCE: Star Trek or something. DANIEL: Green planet. Red planet. Blue planet. CHANCE: The room
right now is very-- the art director of this
project is probably like, what are you doing? DANIEL: Yeah. What are you doing to my baby? It was never meant
to be seen like this. VICTOR: On the topic
of playing around with light values,
Dylan Burrell had a question. They're asking if Lumen
is designed to play nicely with lighting in physically
correct units like 5,000 lux for sunlight and using
real world camera values? DANIEL: OK,
so this project did not use it. It had a sun value of 100. We do have some
internal projects that have a sun value of 65,000. And Lumen is working
well with that. It's not the most tested
path, though. It's something we
want to support. And I think is working,
but I'm not 100% sure. DANIEL: The big gotcha that we found with setting your sun
to physically accurate values is that your emissives
basically don't show up. It appears like they're
not working at all. Because the exposure is
adapted to the sun values, your emissives have to also
be 10,000 times brighter to show up. That can be a
little unintuitive, especially if you have some
test levels in your project that don't have that
65,000 sun light. CHANCE: Then you have to
change all your materials just
for those spaces, so they don't look
so out of place. DANIEL: Yeah, it's better to
just make sure all your test levels use the exact same
sun units as your main level. And then just
bypass this problem. But it's super unintuitive. You drop an emissive
material in and you're like, the render's broken. Where's my emissive? All right,
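The emissive-versus-exposure gotcha is easy to see with toy numbers. A minimal sketch -- not engine code -- treating auto-exposure as normalizing the image by the dominant scene luminance:

```python
def displayed_brightness(value, exposure_key):
    """Toy auto-exposure: the image is normalized so the dominant
    luminance in the scene (here, the sun) maps to roughly 1.0."""
    return value / exposure_key

emissive = 10.0

# With an artistic sun intensity of 100, the emissive reads clearly.
visible = displayed_brightness(emissive, exposure_key=100.0)  # 0.1

# With a physical sun of 65,000, the same emissive all but vanishes.
faint = displayed_brightness(emissive, exposure_key=65000.0)  # ~0.00015

# To read the same, the emissive must scale by the ratio of sun values.
rescaled = emissive * (65000.0 / 100.0)
assert abs(displayed_brightness(rescaled, 65000.0) - visible) < 1e-12
```

With these toy numbers the emissives need to be about 650 times brighter; the exact factor depends on how far apart your sun values are, which is exactly why mixing test levels with different sun units is painful.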
that's enough of this one. Now I want to show you
guys, this is the level that Rafael Reis sent me from-- his company is ue4arch.com. He sent me an early version
of his next project and gave me permission to show
it, which I really appreciate. Thanks, Rafael. CHANCE: Thank you, Rafael. That's great. DANIEL: This is a level
that we did not have before early access release. So there are some Lumen bugs
that we would have jumped on. But I think it's a
good example for me to show because it's basically
your guys' experience as you use Unreal Engine 5. So this level that Rafael's
made looks amazing. This is all using Lumen
lighting for all indirect. No lightmaps. So if I turn off Lumen,
only direct lighting's visible. Now, obviously, you would never
have shipped it like this. He would have shipped
it with lightmaps, which I can't show right now. CHANCE: You don't want to bake lightmaps over the next-- DANIEL: No. Yeah, actually, I was trying to
make the comparison and I was just too annoyed
by the hassle that I just, never mind. I don't have time for this. CHANCE: Go in the right
direction then, right? VICTOR: Daniel,
did you clamp the frame rate? DANIEL: Nope. CHANCE: Yeah,
and to clarify again, for folks that might
have just joined, we're doing that not
because we can't go over 30, but we're dropping it to 30
just because the streaming rate that's going out with
some of the compression looks way smoother. VICTOR: The frames
interpolate much better, yeah. CHANCE: Yeah, so much easier. DANIEL: So this is
Lumen cranked up. This is not the default quality. This is basically like
the best Lumen can look in the current settings. So Final Gather
Quality is set to 4, and reflections
quality is set to 4. And this is the
quality that we can do for really clean indoor scenes. And let me just go through the
different parts of the house and show you. Show you what's going on. So the bounce lighting's
working nicely here. I really like the
skylighting coming in this door from the side. You can't see the door,
but the skylighting's there and it's stable. We got the emissive television,
which is kept dim and subtle so it's not blasting artifacts. The curtains here are
actually emissive, too. That's how he's achieving the
subsurface look without it actually being subsurface. CHANCE: Nice. DANIEL: And if we go back here, this room looks really nice. Lit entirely by this
emissive curtain. CHANCE: Wild. Oh, yeah, because I guess the
directional's not getting back to that space, right? DANIEL: Yeah,
it's shadowed by that curtain. CHANCE: Yeah, right on. DANIEL: And the bathroom
here, you can look in the mirror. And we can look at the
difference between the default reflections-- just ignore that flashing arrow. Got to fix that. Default reflections and the
higher quality hit lighting. CHANCE: Yeah, that's awesome. DANIEL: Which is more
comparable with ray traced reflection quality. VICTOR: And you can see it
there in the tile, as well, not only in the mirror. CHANCE: Yep, right next to it. DANIEL: Also, the reflection off
the transparent glass here is blocked more
accurately when you set it to higher quality. So go into the next room. This room looks really nice. I forgot there was a
bathroom over here. There's another mirror. CHANCE: That's a nice
house, in general. DANIEL: It is extremely nice. It looks so good. I couldn't believe it when
I saw his screenshots. That's why I messaged him
on Twitter and asked him if I could show this. Because I mean,
we graphics programmers work on Lumen all day. But we mostly just see artifacts because our
job is to make it as high quality as possible
and fix every problem with it. And then it gets out
there in the community and we're just
cringing over here like, ah,
I got to find all the bugs. And then we see results
like Rafael's results and we're just amazed. CHANCE: Yeah, it's fantastic. DANIEL: It's one of the most fun things about being a
graphics programmer at Epic-- you get to see
people break your stuff. But you also get to
see people surprise you by making content that brings
out the best in the technology that you make. CHANCE: Yeah, I was just chatting
with Nick Penwarden and a few other folks about some
of the different screenshots or videos that are coming
out of the community. And it never ceases to
amaze me what people are capable of doing with the tech. That's really awesome. DANIEL: So Rafael has a
second lighting setup in here I shall show. This one is kind of like
the middle of the day. But if I turn off directional
light and the middle of the day skylight and I turn
on the evening sky light and some
spotlights, then voila. It is now sunset. CHANCE: Awesome. That looks so good. VICTOR: That is so good. I was just looking up from my notes here over
to the screenshare and I'm like, is that a picture? CHANCE: That's great. VICTOR: It's a really
nice sky sphere or a cube map that's out there, right? DANIEL: It is. It is very nice. Love it. So let's talk about some of
the Lumen artifacts in here. As I scroll side to side, behind things,
especially foliage, there's some flickering. Something that we'll
continue to work on. It's not too bad in
this level because I have the Final Gather
Quality cranked up to 4, which makes it
cost quite a bit more, which is acceptable for
ArchVis but not elsewhere. So something that
we're still working on. And then if you go and look
at the LumenScene version-- oh, this is something
I meant to talk about. OK, remember how I said that
we never actually tested in this level before release? So you see how these
walls are black here? That's the surface
cache-- it's black. And that means that any light
that bounces off this wall is going to only work
with screen traces. And when it goes off screen,
it's going to stop bouncing. Which is why as you look around,
the lighting in this hallway is not super stable. The reason for that is that
content limitation we have where meshes need to
have simple interiors. And this level--
this mesh does not. So the entire
interior of the level is all in one piece,
which is normal modeling pipeline for architectural
visualization. But Lumen has this
extra requirement that-- and this wall piece
needs to be one mesh and then this wall piece
needs to be a different mesh. CHANCE: So basically,
an easy way to tell, if you're seeing that
as you move around-- lighting kind of creeping in
and updating over time-- is flipping on that visualization to
let you know if you've got something that
might be malformed for how Lumen works today? DANIEL: Yeah. CHANCE: OK. DANIEL: Yeah, and if I go
back to the directional light, turn it on somewhere-- so
looking at the surface cache, see, it's correct on the
wall because the wall was modeled as a simple mesh. But it's wrong on this piece
because this piece is-- these are not in
separate meshes. That's why the surface
cache failed here, which is something
we hope to improve. But this is where we
are at early access. And as a result,
the bounced lighting off of this will go away when I look off
screen, which I'm not actually reproducing right now. Well, let's see. Yes. OK, you see it there. When the screen traces hit
the edge of the screen, the reflection changes. That is because the surface
cache failed on this mesh. So you can fix that by breaking
this up into three meshes. CHANCE: Got it. DANIEL: Let's see, what else did I
want to show here? Oh, yeah. Hardware ray tracing,
this is our hardware ray tracing showcase. Or the one I want to show
hardware ray tracing in. So if we go to Project
Settings and search for 'hardware ray tracing'. So Lumen is-- hardware
ray tracing is on and Lumen is set
to ""Use Hardware Ray Tracing when available"". And we can toggle that
and see the difference. So not a huge difference here. Let's go to another
part of the house and see what hardware tracing
versus software tracing is doing. Here, hardware ray
tracing, the curtains, especially are more accurate
with hardware ray tracing. If you go and you look
at the Visualize mode, then you can really
see what's happening. So in the Visualize mode,
it matches the setting. So right now, we're looking
at the hardware ray tracing. And if I turn that off, this is the software
ray tracing. And distance fields are
not great for thin stuff, so the curtains didn't become
solid in software ray tracing. And in software ray
tracing, we remove a lot of the little objects
for performance reasons. Whereas, in hardware ray
tracing, we don't. So all the little
objects are there. Let's see, what else? Oh, yeah. Let's go look at those
bathroom reflections. So this is our hardware ray
tracing reflections off screen and then software
ray tracing does not have nearly the
quality of reflections. That's the area where hardware
ray tracing really shines. You can't beat it
at reflections. But we find for
global illumination that software tracing
is pretty close. CHANCE: So you can not
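The toggle Daniel has been flipping lives in Project Settings. A sketch of the equivalent config entries -- cvar names as of UE5 early access, and subject to change:

```
; DefaultEngine.ini (illustrative fragment)
[/Script/Engine.RendererSettings]
r.RayTracing=True
; "Use Hardware Ray Tracing when available":
r.Lumen.HardwareRayTracing=True
```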
really worry about it as much if you don't have a
bunch of reflective-- fully reflective surfaces like
a mirror or like some stainless steel or something
that's going to have-- OK. DANIEL: Yes. And that's why software ray
tracing worked so well for us in the past two demos we made--
they didn't really have any mirror reflections. Whereas,
hardware ray tracing, it's pretty much a
requirement to make mirror reflections work well. CHANCE: So that
would be basically, if you had like a
mountain that was-- this was a question from,
I think, the forums. If you had a mountain that
was reflecting off of a pond and it's not showing up the
way that you think it would, would that be because
software ray tracing versus hardware ray tracing? DANIEL: Yes. Almost certainly, yes. CHANCE: And then
on that same note, was it planar reflections? Any support for that? Would that be a way to work
around that or would that be kind of left to the
same kind of limitations? DANIEL: We don't support
planar reflections. We could,
but we don't plan to do it because planar reflections
have a lot of overhead-- a lot of performance overhead. They actually
rerasterize the scene, including rerunning all of the
dynamic lighting like Lumen a second
time, often on a bigger portion of the scene than what's
actually on the final screen. So there's just a lot of
sources of inefficiency there. It's difficult to ever
make them anywhere close to as performant
as hardware ray tracing. So the path to getting
those mirror reflections is not planar reflections,
it is hardware ray tracing. CHANCE: Got it. VICTOR: Cool. DANIEL: Let's see. What else should we look at? VICTOR: There's a lot for sure. Daniel, just let me know when
I could start bombarding you with some of these questions. DANIEL: OK, I think it might have been in something
else I was going to show. Let me just mess around
with the level a little bit. Because it's cool to
watch dynamic changes. [INTERPOSING VOICES] DANIEL: If you want
to hide the sky, it gets all overexposed. So then you can just
compensate for that. And it's cool to see just
enormous lighting differences from scene changes as Lumen
shadows the skylight based on changes to the scene. And it actually looks-- so I lowered the
exposure greatly. It actually looks really
moody in here now. CHANCE: And so you would
just kind of light this scene like you're building a house. You would kind of light it the
same way you would put lights in your own home, right? DANIEL: Yes. CHANCE: So you go back to
that moody scene there. Well,
if you want that kind of thing, you could put a lamp in
the corner or some light that can kind of
fill in that space. And it should start
to feel closer to what you would
see in the real world as long as you're setting
the actual brightness and whatnot to kind of match. DANIEL: Yeah. Rafael has a bunch of
spotlights in here. Placed where you'd
expect to actually have lights in your house. And they bounce lighting
through Lumen as well. So this is with Lumen off--
it's just direct lighting, and then Lumen on. Let me just move the directional
light around a little bit because it's always fun. CHANCE: Oh, yeah. DANIEL: The skylight's
too bright right now. Let me make this a little
less-- a little more dramatic. VICTOR: Do you know
if there are any-- oh, yeah,
I can almost see it there. It looks like some of the
lights are using IES profiles. Oh, no, that's the skylight. DANIEL: I don't know if
any of those are set up. I could go look. But we do support IES
profiles in Lumen. Let me turn off these
spotlights that are flooding the area with lighting. I want it to be
really dramatic-- really dramatic sunlight. I even have the
skylight turned off. So it's just all sun bounce. Hell, yeah. Am I allowed to say that? CHANCE: That's awesome. DANIEL: This is like
if you're on the moon, there's no atmosphere scattering
so there's no skylights. So everything is
really high contrast. CHANCE: Yeah. DANIEL: Like the sun is
incredibly bright because it doesn't get filtered
through the atmosphere and it doesn't bounce
around in the sky either, so it's all bounce lighting. Yep,
so I think that's all I had. CHANCE: That's awesome. DANIEL: Victor? VICTOR: Yeah. You know what? This is one of the streams where
we don't need to talk as much. You can just sort of
play with the parameters while everyone's watching. DANIEL: It's so much fun that, dude, I just get lost. Just the editor controls have
a whole new life to them. Just being able to
hide and show things, it's so much more
fun when you can see the way it affects the light. VICTOR: Except
knowing that if you did any change
otherwise, you would just get an error saying
that lighting needs to be rebuilt and-- DANIEL: Yeah, let's go back to static
light, OK? Preview. And change to static. Where's the "lighting
needs to be rebuilt" message? It's not showing up. VICTOR: It's not. [INTERPOSING VOICES] CHANCE: Move the chair. Just move the chair or
something, right? DANIEL: Oh, you know what? It's because I
don't have lightmaps built in the first place. And the message only comes
up if you had any built and then you unbuild it. So it doesn't actually
care right now. So I can't show the message. As
soon as I came in here, I made everything movable and
I deleted all the reflection captures that had to be
manually set up for it. VICTOR: All right. CHANCE: That's wild. VICTOR: Yeah, thank you, Daniel. Before we get to
questions, we're going to take a quick
break so that we all can be with you all here
just for the questions. We will be back in just
a couple of minutes. Thank you, everyone. VICTOR: All right, we are back. Thank you very much,
everyone, for waiting. Hope you're sticking
around there with us. It is time for some
of the questions. And I would like to
start with a couple of the ones we received on the
forums prior to the livestream. Let's see. This one here is pretty good. From Eugene,
who's wondering, ""Is it possible to get more than three
lighting channels with Lumen?"" DANIEL: That's funny
that you ask that, because actually you get zero
lighting channels with Lumen. So OK-- CHANCE: Does that
mean it's global? The word global kind of-- DANIEL: Yeah,
the problem is that once the light hits the
surface and it bounces, we simulate all
those rays together. We don't keep track
of which light it came from anymore because it's
too expensive for real-time. I mean, you can do it. But we chose not
to do it because it would be one of those costs that
most users don't want to pay. So we don't actually
have any lighting channel support in Lumen and
it's not something we can really add either. CHANCE: Right on. Let's see. Oh, I have one here
from ReticentOwl. Lumen usage with light-based
particle effects. So if there's a particle effect
that's got emissive on it, how is that going to
influence the scene? DANIEL: It's not
going to work well. So the combination
of small and bright doesn't work well
with Lumen emissives. And then also, Lumen uses a
pretty strong temporal filter to actually make it possible
to be done in real time. And any kind of
fast moving lights are going to cause
a problem with it. And in fact,
if you have a muzzle flash, you probably want to set
Indirect Lighting Intensity to zero and remove it
from Lumen's consideration so that the lag is not visible. CHANCE: Got it. OK. VICTOR: Couple of questions that are in regards to
the possibility of being able to exclude certain
meshes from Lumen. And this is in regards to,
say, one of the suggestions-- or examples was
a sci-fi corridor with a lot of emissive
lights going on. Is it possible in
any means now or potentially or in the
future being able to exclude emissive primitives from Lumen? DANIEL: Yeah. We need to do something
about that because if you want to use Lumen
and you're trying to make a sci-fi map with lots
of small bright emissives, then there's going
to be a lot of noise. And you don't have any
control over it right now. We do not have that
ability right now because of screen space traces. They're going to hit
what's on your screen and they're going to
pick up that color. And so while we
do have like this separate Lumen representation that we could hide
emissive from, screen traces
don't respect that. So we need to do some work
there. CHANCE: Right on. I've got one from Laurish asking
about potentially making Lumen calculate and converge faster
for things like the Movie Render Queue since it's not--
it's happening kind of frame by frame. Is that something that is
possible at this point? DANIEL: Yeah, it's something we know we need to do,
especially on camera cuts. Since Lumen is all
targeted at real time and it needs multiple
frames to converge, when there's a camera cut,
we have a noisy image. And the image changes
after a couple of frames, too, which shows up as a pop. So what we need to do is do
some warming up, but also allow more budget for
those early frames to smooth it out and make Lumen
work better with cinematics. And if you do a full
offline render-- can't remember the name of that
in the Movie Render Queue-- I guess that is just the
Movie Render Queue, right? Then, yes, we should be using
some much higher quality setting since the performance
is not an issue anymore. But we don't have that
implemented right now. CHANCE: Got it. VICTOR: On the same topic,
GulloPBR was wondering, ""Are there any plans to transform Lumen
GI into lightmaps for ultra fast baking?"" DANIEL: No, there aren't. We actually have a really
good lightmap previewer through GPU Lightmass, which-- so Lumen is operating
on the screen. It's not going through the
lightmap textures, which makes it not a great
preview for lightmaps because it doesn't
accurately represent lightmap artifacts like UV seams-- lightmap UV seams. Or your lightmap resolution is
too low or you're just missing UVs on this mesh or they're
overlapping or something. You want to see those artifacts
as part of the preview so you know what to fix before
you click the long build. And Lumen doesn't
have the ability to reproduce those artifacts. But the GPU Lightmass
Previewer does. CHANCE: All right, got one from
PatientZero, I think, if I'm understanding
this correctly. So the surface
cache-- if you have a mesh that's got a
bunch of different planes and whatnot inside of
it, and we saw the black on the screen
earlier, that's basically because of the mesh
itself, not necessarily the layout of it? For instance, the question is
about building floors and walls inside of a Blueprint
and then putting the Blueprint in the
scene, it would still work great with that, right? DANIEL: Yeah. Yeah,
and both of our recent demos did a lot of that
where the meshes were assembled in a Blueprint
somewhere and then placed in the level. That works fine. CHANCE: Great. DANIEL: Hey, we win one. CHANCE: This is
really amazing tech. VICTOR: Yeah. VICTOR: There are some
other questions in regards-- I think you touched
on this a little bit in your presentation. But might be worth diving
into a little bit further. What are some things
to look at to optimize? You mentioned sort of
overlapping meshes. What else would one
like to start diving into if they're looking to
optimize their scene for Lumen? DANIEL: Yeah. So let's see. The main performance
stuff is the resolution. Making sure temporal super
resolution is set up, and that Lumen's rendering at a
much lower internal resolution than the output-- not at 4k. Content wise-- like the
way you build your content, if you get up to two
million instances, there's going to be quite a
lot of cost from Lumen trying to keep the surface
cache up to date, especially if you fly fast. The instance overlap is probably
the single biggest thing for Lumen,
especially if you want to use anything higher quality
than just the global traces. Like this scene that
Rafael made performs very well because
architecture stuff, you tend not to
overlap too much. But if you just copy paste
scanned rocks all overlapping, then the tracing cost
would be quite significant. Let's see what else? Reflective materials cost more. That's kind of unavoidable. Not reflective,
but smooth materials, roughness less than 0.4. It's kind of
unavoidable because we have to trace more rays to
give good quality reflections. But it is true that scenes
with more diffuse materials cost less. VICTOR: That's not
only true for Lumen. That's true for other
features as well, right? DANIEL: Well,
our reflection methods that worked with baked lighting,
like the reflection captures, are very fast. So you don't really notice that
reflective material costs more. It does cost a little bit more. But it's not enough
that you really notice. Whereas, with Lumen, it actually costs
significantly more. Because just like ray
traced reflections, Lumen needs to trace
a ray per pixel to give good
quality reflections. VICTOR: So resolution
is once again, probably one of
the most expensive. Yeah. CHANCE: Generally, yeah. VICTOR: Yeah,
it's just in general. But it's good to know that
it's specific here for Lumen as well. DANIEL: Yeah. CHANCE: I got one-- I'm sorry. Go ahead. DANIEL: With dynamic lighting, we have to do so much more
processing for every pixel. Dynamic lighting
pixels are not the same as static lighting pixels. It's significantly
more expensive. So temporal super
resolution is really the key to making all
of this real time. VICTOR: Go ahead, Chance. CHANCE: Got one from our
old friend Tom Looman. Hey, Tom. The question is
about, let's see, more improvements
during early access or is the next release
immediately 5.0? I don't think we've made
any noise yet about-- we haven't said anything
about any interim releases in between the two. But if you're looking
for some of the latest and greatest,
and Tom, I know you have the guts to go do this. Main is updated
pretty regularly. So if you watch out for
specific areas there, you might be able to cherry pick
some things down and try them out. DANIEL: We did fix
one bug in Lumen in early access, which was if
you had the software tracing set to global tracing, it would break
hardware ray tracing. That's fixed. Well, sorry, it will be fixed in
the next update. But no new features in early
access, no development. It's stable. Just fixing bugs and,
yeah, like Chance said, check UE5 Main if you want-- if you want to drink
right from the fire hose. CHANCE: If you want to go
through the pain of Main. I said this on Twitter
a couple of times. Main is generally not super
stable for a number of reasons because you're getting basically
the latest as we can check it into the engine ourselves. Having said that,
a number of folks do feel comfortable
in that space and want to kind of
see what's coming or grab a fix or
two that they might see that affects their lives. So if you-- yeah,
if you've got the guts for it and want to go fight the
dragons, go nuts. VICTOR: There was another
question here from unknown. I don't have the
name, unfortunately. But also, "Is it possible
to disable emissive from the computation entirely?" DANIEL: It is not because
of the screen traces. Because we're always going to
pick up what's on the screen. But that is a very
commonly requested feature to be able to control the
intensity of the emissives. So we'll see what we could do. VICTOR: Thanks. Another question here from
VirtualH-E who's wondering, "Can the distance
field mipmaps' distance thresholds
be tweaked?" DANIEL: Yes, they can. OK, so right now,
it's all in a bunch of CVars. I don't remember the
setting right now. But basically, with this one,
you can actually just force-- so you can force it
down to-- just here's the lowest mip of
the distance fields. And you can see what the
different mips look like. And there's the
highest resolution mip. VICTOR: Let's see. We still have the PIP up there. Can you repeat what the
console command was, Daniel? DANIEL: It was
r.DistanceFields.-- I crashed it. Good timing. CHANCE: That never happens. DANIEL: That was one
of my debug CVars and I didn't check the
bounds on that one. Nobody would ever input a
wrong value into that, right? VICTOR: Sure, no. Why would they? CHANCE: Yeah, and early access
doesn't crash at all. DANIEL: Except for
me in a livestream. CHANCE: I think back in the day when we were first doing DFAO,
way back in the Fortnite days, I think we had crashes back
then, too, if you recall. DANIEL: I crashed it? CHANCE: Oh,
I think everybody did. I think we all have. DANIEL: Well that's my
first crash for today. VICTOR: On the topic of
mesh distance fields, there's another question from
1Millinis who's wondering, "The limitations that
mesh distance fields had in UE4 with non-uniform scaling,
are they still relevant?" DANIEL: For the most part, no. We don't have
significant problems with non-uniform
scaling anymore. We've greatly improved that. If you go too far with it,
like let's say, more than like 4x scaling in one
dimension versus the other one, then you'll start to
see some artifacts. CHANCE: And you'd also
see that in your art to begin with, right? DANIEL: What's that? CHANCE: With scaling. You would see it
in other places, too, other than just lighting, right? DANIEL: Not always. A lot of times,
people make these really simple template levels with just a box. And they just scale
it like crazy. And it looks fine
because the triangles didn't have any detailed
texture map to them. CHANCE: Yeah, I was speaking
more about after grayboxing, but yeah. DANIEL: Yeah, real world assets, if it's not procedurally
textured with world space, if it has actual
texture map to it, then it's going to look pretty
terrible once you non-uniformly scale it by a crazy amount. CHANCE: Got one here
from RendermanPro, what about sub-surface
or translucent materials? I know you showed the
glass in there earlier. But what about subsurface? DANIEL: Yeah,
we want to add support for the subsurface
shading model. We haven't got around to it yet. And we also want to
add light scattering through using Lumen's
traces, but we haven't implemented that. VICTOR: Let's see here. We had a couple of
questions in regards to combining Lumen
and baked lighting. Lumen for some areas,
baked for others. DANIEL: So we looked
into the feasibility of say baking
lighting indoors where you have a lot of lights
that need top quality, and then dynamic
with Lumen outdoors. And we came to the conclusion
that it wasn't actually a useful combination. Because most of Lumen's
cost is screen dependent-- screen resolution dependent. And so you're paying that even
when you're in a baked lighting area just to have a seamless
transition between the two. So we decided not to support
that particular combination. But we do really want to have
Lumen Reflections supported with Lightmap GI. The same way that ray trace
reflections work today. That's something we plan
to add in the future. VICTOR: Cool. That's a good note. Thank you, Daniel. On the note of
Lumen reflections, we have a question
from CarloALT who's wondering,
"With Lumen Reflections, the reflective surfaces reflect
the distance field scene. In the future,
will it be possible to reflect the lit scene?" DANIEL: Yeah, that's what this
reflection quality is doing. The reflection quality of 4. So this is my bad for
not documenting this. Kind of came in
hot in early access where we were
trying to figure out what the scalability
settings should be. But the Lumen
Reflections > Quality 4 is a magic value,
which gets you the hit lighting, actually evaluating
the light at the hit point in the reflections
instead of using surface cache. VICTOR: So not in the future, it's right now
with the number 4? DANIEL: Yeah,
it's a hidden feature. VICTOR: And that's why
we have you here, Daniel. DANIEL: That's why you guys need to watch the livestream. To find the secret. VICTOR: I say that every week. DANIEL: We saved it
for the livestream. It was intentional. No, it wasn't. VICTOR: Another question. And also something that
we talked about a little bit before the livestream. This one comes from
Caradeco who was wondering, "Any tips on how to use
Lumen but also make our game scalable for low-end PCs?" DANIEL: Lumen doesn't
scale down to low-end PC right now. We have some prototypes
of basically generating irradiance fields with Lumen. And that can scale
down pretty well. But they haven't gone
further than prototypes yet. And I don't actually
know whether we'll be able to scale down further
than we are scaling down today. Can't make any promises. VICTOR: Galen, I saw some questions
earlier in regards to when we were talking
about changing the base color to better match Lumen. Is that something that
folks who are already using Megascans might
have to redownload some packs to change that? Have you talked about how
that workflow might look like? GALEN: We don't have an
official announcement on that today from a
product perspective. It's a complicated
issue, obviously. Yeah, we'll have a proper
announcement about that soon. VICTOR: For now,
you can go ahead and open up the material,
pop in a little scalar in there. And go ahead and
tweak it yourself. Most of them do have a
master material, right? So it isn't necessarily one-by-one
work that you've got to do. GALEN: Yeah. I mean, you can go and
literally just crank albedo tint to-- if it's at 1, 1.5-- [INAUDIBLE] --or
something like that. So you could do whatever. But yeah, I mean,
there's definitely more mathematical ways that
we want to kind of sync up with Lumen. DANIEL: You do
have to be careful when you're multiplying
albedo, which is base color, you don't want to get
anywhere close to 1. So if the input is 0.5,
you cannot scale it by 2. 2 works on the rock
textures because they're starting out like 0.05. But real world materials
like the brightest white wall is like 0.997. It's not 1. 1 means 100% of
the energy gets reflected-- a physically impossible material. And the multi-bounce will
just explode and wash out your whole screen. So while you're doing
this, look at albedo-- the diffuse and make
sure that it's not 1. GALEN: Also we have-- so
yeah, I agree with all that. So then there's also
just an albedo tint that we sort of pass in
through the master material that you can change the color of
a rock or something like that. And that by default, just passes in 1s across
R, G and B. And so if you were to
change those values, that can also get you there, too. But yeah, I mean,
we'll definitely look to get a mathematical
solution in front of the community soon. CHANCE: Awesome. Daniel, earlier you had
mentioned altering Lumen through post-processing-- Lumen settings through
post-processing. Do we have access there
to all the same checkboxes that we have in
Project Settings? Or is it just more of
the drop down to use it or to not use it? DANIEL: No, it's not the same. So the Project
Settings, we put stuff where we expect you'll
want it in every level. Kind of like a
project wide decision. And you don't want those test
levels to get out of sync. For example, the choice to use
Lumen or the choice to use hardware ray tracing. Because it greatly
affects the look, you wouldn't want
someone in a test level somewhere else that's
verifying your assets to not have it enabled. So the settings that are
on the post-process volume are things we expect
you want to change just in this one part of the
map or this one hallway. CHANCE: Can you query
that information from Lumen in a
post-process material? DANIEL: No. Post-process materials
don't have access to any of the Lumen internals. CHANCE: Trying to find
ways to really dig in. VICTOR: Another question
here from MallChad who's wondering--
a bit open-ended. DANIEL: MallChad? VICTOR: MallChad, yeah. I guess, Chad of the mall. A bit open-ended. "What considerations were
taken, flexibility-wise, for people who want to extend
the behavior of something as big as Lumen?" DANIEL: I'm sorry to say, we haven't got to any
extensibility yet. We've been just
absolutely hammered with just getting the
quality that we want out of it and performance and making
it work on all the platforms and shipping early access. We do really want users to be
able to extend the geometry types, especially if someone's
making a voxel plug-in or a procedural mesh
plug-in to be able to have-- to extend Lumen to be
able to ray trace that. But we haven't worked
on that at all yet. VICTOR: Another good question from Raiju who's
wondering, "Any thoughts as to the application
of Lumen with regards to a stylized art style
as opposed to realistic?" DANIEL: I don't have
any thoughts on that. We're pretty much
always just trying to match the path tracer when
we're working on this stuff. Basically, the path tracer
is the reference. And if Lumen doesn't match
it, that's a bug. I guess it mostly comes
down to our art knobs-- artistic controls,
which we're sorely lacking because of the
difficulty with screen traces making it hard for
us to have the GI not match the main screen. We're looking at what we
can add in the future. And we definitely know we
need to add more art knobs. But right now,
it's very limited. GALEN: Well, one of the things
that this brings up in my mind, too, is we've shown
some photorealistic examples, obviously,
in the last several projects that we've been working on with
regards to Lumen and Nanite. But it's actually really cool
to see now-- I've had several friends that
have actually reached out to me who do more of like concept
painting and environment design, where they rough environments
in, in 3D. And they're able to just
now use Lumen and not have to mess with a lot of
the settings they otherwise didn't really understand
in UE4 with regard to lighting in general. So it's really cool to see that. That's kind of like a
tangent off that question. But it is pretty awesome
to sort of see now that you have a lot of
different types of artists that are actually kind of engaging with the
tool and making just entirely different kinds of art. And Lumen is one of those
tools that just empowers artists in such a cool way. So I don't know. I just figured I'd throw
that in there on the side. VICTOR: And I'll say it's
great for those of us who aren't too good at lighting. CHANCE: That's right. Yeah,
that's what I was going to say. Yeah, for me, it's in many
ways, a workflow thing. The final output,
it's great and all. But kind of knowing what-- Daniel said earlier,
knowing what your scene's going
to kind of look like without having to wait
for somebody to come click the button and then get your
lighting scenario loaded up or that sublevel that has all
that lighting in the scene. You just make the changes
and they're there. And that happens regardless
of what style your game is. DANIEL: So I worked on CPU
Lightmass back in the day. And it was always
pain and suffering. I made it the best that I could. But at the end of the
day, it's baked lighting and it's going to
take time to build. And it has these pain points. Very poor artist iteration. It is definitely like
personally satisfying for me to now be part of
making this tool, which greatly just addresses all of
that poor artist workflow. CHANCE: Yeah. Well, and the same
thing for a designer. If you're gray boxing
and trying to figure out what the flow of something's
going to look like and then somebody
comes in and lights it and actually arts it up. And you need to make
a change to something, you can know what that
change is going to affect. And what other
problems that might pose for your lighters
or your artists without having to just do it. And then wait for them to come
back with a solution and say, hey, this will or won't work. Or a response that
this won't work. I mean,
I think it's just tightening all those iterative
loops-- that is something that really, really,
really helps iteration time. VICTOR: I've seen a couple
of questions in regards to-- oh, where did it go? Seeing a lot of questions. One particular that
I was looking for. Apologize for that. I'm scrolling too fast. Jesus was wondering,
do you guys have any example which plays with water? I would love to see how
this works with water. DANIEL: Lumen
reflections don't work on single-layer water yet,
which is the shading model you have to pick for water. But we're going
to make that work. I'm not sure if that
will be in 5.0 yet. But it's been on our minds. CHANCE: There's a bunch of
these questions we've answered, so I'm trying to pop through it. Are there scenarios where
you wouldn't use Lumen that you can think of? Other than hardware? DANIEL: So there's
the whole question of, what do I use for my project? Is it dynamic lighting
or static lighting? So if you have gameplay
reasons that you need to use dynamic
lighting, that makes the choice easy for you. And there's the platform
constraints of Lumen. The alternative is our
baked lighting pipeline, which is very mature
and high quality, especially on lower
end platforms. Platforms where battery life
and heat are a big concern, like handheld
platforms, lightmaps are a much better choice. VR, lightmaps are a
much better choice. And ArchVis, where the quality
has to be perfect, lightmaps are
extremely good at that. The diffuse quality at
least, together with ray traced reflections. And probably some of them
that I'm not thinking of. CHANCE: I think
that's good though. It's a lot of good options-- considerations. DANIEL: The important thing
is that in Unreal Engine, we have both of these paths
covered with extremely high quality tools. It's basically up to you
to decide static lighting-- static, global illumination
through lightmaps or dynamic through Lumen. GALEN: Boris. CHANCE: Or what? GALEN: Boris for now. CHANCE: Oh, yeah. VICTOR: I found a question
I was looking for. I had seen a couple of them. This one in
particular comes from Mohammed Edri who's wondering, "How can I render Lumen
in a big forest scene? By default, it disappears at a
certain distance. Is there a parameter to
tweak to solve this?" DANIEL: We have two problems with forests right now. First one is the range
of the LumenScene. There's no well-exposed
parameter. I mean, actually, everything in Lumen
is just like a CVar. So it's there. But you're going a little
bit off the beaten path if you start tweaking those. And the bottom line is that the
techniques just don't scale up to a giant open world. LumenScene,
the surface cache specifically, is limited to about 200 meters. So screen traces work past that. And in Lumen in
the Land of Nanite, we had a different technique
called the distance scene, which is Lumen working up to about a kilometer. But that is like an experimental
thing that's not on by default. So if you have a big
forest, for now, there's going to be noise
around all the leaves, which is something we're working on. And Lumen's only going to
be active to 200 meters. And there's no control
to extend that distance other than tweaking CVars, which I don't want to say which ones
they are right now because it's been a while since I tested it. GALEN: And that 200 meters
isn't as scary as it sounds, I think. I mean, the distance and then
switching to SSGI and stuff like that-- that was something that
when Daniel mentioned it to us, originally,
we're building that demo, it's like, oh, well,
we're building this giant world. You're going to see where
that bubble basically ends. We never really noticed that. I mean, Daniel and maybe some of
the other rendering engineers-- DANIEL: Yeah, I noticed it. GALEN: --could pick out
exactly where they saw it in certain screenshots. But for most people,
I don't think that that was a huge factor,
especially on the last demo. DANIEL: Yeah, so to correct myself-- when I said it doesn't work with
huge open worlds, what I meant was specifically
the surface cache. Because it's only simulating
what's around you. It works with large
open worlds, but it doesn't cover
the whole view range. Whereas,
basically Lumen falls back to screen traces
after the 200 meters. So the final result works
with large open worlds. It's just not this one
ingredient of the recipe. CHANCE: So it's coming
up quite a bit again. I know you covered
it in your slides, but I figure it might be
worth for the folks that might not have seen
that or just hanging on for a Q&A to cover again. What are the current
supported platforms? What are the planned
supported platforms? And kind of where are we
maybe not going with it at this point? Again, could you just
reiterate that one more time? DANIEL: Yep, sure. Let me go back and
find that slide. OK, so Lumen is next generation
consoles and high-end PC only. And we do not currently
have the ability to scale down below that. We might develop
that in the future. But it's kind of like
an area of open research because it's very
difficult to scale dynamic global illumination down
while keeping a quality that is actually acceptable. You pretty quickly
get to the point where you wouldn't have wanted
to ship with that quality level. For games that need to use
dynamic lighting and ship across platforms,
like Fortnite, the solution is to use our other dynamic
lighting technologies that do scale down like
distance field ambient occlusion and screen space GI for that
mid-spec PC and last generation console. So Lumen on next gen console,
distance field ambient occlusion and SSGI on
PlayStation 4 and last-gen consoles. And then unshadowed
skylight on mobile. And that way, the same set
of content and the same game can ship across a
wide range of hardware all with dynamic lighting. CHANCE: Got it. Cool. VICTOR: Chance, is there anything
else you found here? I feel we've-- CHANCE: Yeah,
I think we've wrung the rag. All the water that's
in it in many ways. VICTOR: Wrung the rag? CHANCE: Wringing a rag
out, like squeeze some-- VICTOR: Oh, ringing. All right. OK. CHANCE: Yeah. VICTOR: Yeah. Sorry. I don't-- not a
native speaker here. Some things are still new to me. CHANCE: It's OK. I forgive you. DANIEL: Can you switch back
to my screen real quick? While we were just
doing Q&A, I didn't even
notice these chairs under this table before. But it just looks really nice.
here looking at it. I never even looked
under the table before. But like this chair
looks real right here. CHANCE: Stream ended.
Are we good? Are we still good? VICTOR: Let's see. I think we're still
seeing Daniel. The TeamViewer instance died. DANIEL: OK, it's fine. It's not important. VICTOR: We're still live. Getting some artifacts. They can still hear us,
so don't say all the things. DANIEL: I'll try not to. Victor, what?
I can't believe you said that. VICTOR: To not say
all the things? DANIEL: I can't
believe you said that. CHANCE: It's OK. Oh, there we go. VICTOR: Let's see. It's going to be-- oh, I haven't had this
happen to me before. CHANCE: Yeah,
those are [AUDIO OUT] chairs. 9 or 10, what's it? DANIEL: We save our
technical difficulties for the end of the stream. VICTOR: If that's how
it goes every stream, that will be great. Trying to get back up. It's kind of interesting that
I'm not able to access the PC but we're still live. CHANCE: Yeah. And chat's talking
about the chairs. So I think they can see. VICTOR: No, and I can see
Daniel showing the chairs. It's just like 8
seconds delayed. GALEN: Look at that chair. It's hot. CHANCE: Look at that. It's a good chair. GALEN: It's a great chair. DANIEL: You model a chair--
a really high quality chair. You put it in a level
and it looks real. CHANCE: Yeah, that's fantastic. I can't wait to see
where we go from there. I mean, the tech's been out for
just a couple of weeks now, right? So I'm sure it's
going to be exploding with all kinds of
beautiful chairs. GALEN: Well, I mean,
that's one of the cool things, too. I'm sure you guys have
been doing the same. But just jumping on
ArtStation, jumping on YouTube and starting to see
people update scenes that they maybe made in UE4. Flipping on all the bells
and whistles with UE5. DANIEL: Yeah, how easy it is
to try out the UE5 features. Lumen is just one
project setting and you can really quickly
select all your static meshes in the Content Browser
and convert to Nanite. And you can be trying it out. And Virtual Shadow Maps
are enabled by default. And there's just one
project setting to turn it on if it's an existing project. It's like minimal
effort maximum result. CHANCE: And on that same note, too, there's so much free
content in the Marketplace. If you want to try
these things out, you can grab them into a UE4
project and then bump it to UE5 and take a look. This is so great. Yeah,
it's so fascinating to see what people are able to do here. As not an artist, this right here is
just magic to me. Or a rendering programmer. I'm just a guy in between. So this is really special. Well, I am good on questions
I think at this point. We've got other things. If there's anything
that didn't get answered that we missed on the doc here, I think we can take this
back to the forum. Right, Victor? VICTOR: Yeah,
the forum announcement post, which is where we announce
the streams every week. You can go to
forums.unrealengine.com. The Events section, that's where we
post all of them. And the forum announcement
posts for the episodes of Inside Unreal is where
we continue the conversation after we have gone offline. Daniel,
is there anything else you wanted to leave the audience
with before I do my outro spiel and we say goodbye? DANIEL: Nope. That's it. VICTOR: All right, well, I'm going to thank
you in just a bit. But before we get to that point, I would like to let everyone
know, thank you so much for joining us today. Again, it is my time to talk. Because I don't say
enough every week. If you tuned in and you just
found a channel here today, thank you for joining us. If you're interested in getting
started with Unreal Engine, you can just go to
unrealengine.com, download the Epic Games launcher. From there, you can go ahead and
download both UE4, as well as UE5. There are no reasons
why you shouldn't be able to start learning
how to use Unreal Engine. If you think, oh, but UE5, it's in early
access-- not ready. Just go to UE4. If you go to
learn.unrealengine.com, you can find a whole
wide array of courses available to learn everything
from lighting to Blueprints to programming to VR. We got it all. And if I'm able to
scroll up to my notes, I will be able to continue. Make sure you use
communities.unrealengine.com. That is where folks regularly host
physical meetups. But during the pandemic,
this has not been happening. However, folks are still getting
together virtually, showing off their projects, talking about what
they're working on, as well as helping
each other out. And so if you're interested
in finding a community that might be a little
bit closer to you than anywhere else in the
world, go ahead and go to
communities.unrealengine.com. You can also find others on-- I already mentioned the forums. We also have an
unofficial Discord channel unrealslackers.org, as well as a Facebook
group and Reddit. All good spots. Make sure you let us know
what you're working on. Clearly, it's super exciting
for all of us to see all the UE5 experiments
that you all are doing. Make sure you mention
us on Twitter. You can add us on the forums. Send us PMs. Or if you have something
that you'd potentially like to let us know but you
haven't let the world know about yet, go ahead and email
community@unrealengine.com. We're still looking for
more countdown videos. We've been going through some of
our short film Australia videos lately. But they're going to run out. And so we're still looking
for development videos. They're usually about 30
minutes of development. Fast forward them
to five minutes and send it to us
separately with your logo. And we will go ahead and
add the countdown to them. If you stream on Twitch, make sure you use
the Unreal Engine tag, as well as the game
development tag, if you're doing game development. If not, just use Unreal Engine. Maybe there's something
else there specifically for what you're doing. We used the creator one. That's a good one. And those filters
are the best way to find other creators
that are working live on stuff using Unreal Engine. CHANCE: Lots of them up
there, yeah. VICTOR: There are, yeah. And today is a
great time to do it. Because we try to raid someone
after we've gone offline. That is if you're
watching this on Twitch. If you haven't
already, make sure you follow us on social media. That is where all news related
to Unreal Engine and Epic Games go out. Or where you see it first. And make sure you hit the
notification bell on YouTube if you want to see when
we upload new videos. Most of them are not live. And so there's a
lot of content there that you might be
missing if you're only looking for our live content. Next week, Chance,
what do we have next week? CHANCE: I think next week, we're going to cover the open world
features with the tools team. So that ought to be great. I'll be chatting
with Jean-Francois on World Partition, Data
Layers, and One File Per Actor support. All the great
things that allowed us to really build
this expansive space in a short amount of time. VICTOR: Yeah,
we were hyping about this earlier this week. Just sort of talking about how
cool some of these features are. And while they're not
as flashy and pretty and we can't just
have Daniel sit here and turn a slider back and forth for everyone that's enjoying the content,
they're extremely useful for some workflows. So suggest you
tune in next week. CHANCE: I think this is the
one that on one of the YouTube comments was like, oh,
these folks are talking and Chance is just nerding
out about all the features. I think this is
the one I probably nerded out the most on
that livestream about-- because I worked on it
in the project myself and so I immediately saw
some of the benefits there. I'm really excited
to talk about it. VICTOR: Cool. And with that said,
my outro spiel is done. Daniel, Galen,
Chance, thank you so much for coming on this week. Hope you had a good time. DANIEL: Yep, thank you. [INTERPOSING VOICES] VICTOR: Oh, you're so welcome. I'll make sure you
get the updated slide template for next time. Or maybe we'll tweak it a little bit so that you'll
like it better. DANIEL: Do I-- do I get anything for being on the livestream? VICTOR: Send you a t-shirt? CHANCE: You like ice cream? DANIEL: You guys should
have a live stream t-shirt. VICTOR: We can take a
walk in the rain maybe. I don't know.
Socially distanced? DANIEL: I like the t-shirt idea. VICTOR: OK. What size are you? DANIEL: Large. VICTOR: All right. All right. I can make that work. CHANCE: It's happened. VICTOR: Yeah. Now, you all saw that I
promised Daniel that live, so yeah he'll haunt me for
that if it doesn't happen. Cool. Thank you all. DANIEL: I have witnesses. CHANCE: That's great. VICTOR: You do. About 477 plus 927
as of right now. Anyway,
thank you all for hanging out. And thank you all for
staying out there. Stay safe and we will see you
all next week at the same time. Take care, everyone. [MUSIC PLAYING]