>>Amanda: Hey folks!
Wait no more-- September's free
Marketplace content is here and we're offering double
the products this month! Launch your own on-rails space
shooter with out-of-this-world warp effects and
customized dialogue boxes. Space not your thing? Take your archviz
projects into VR, populate your world
with unique characters, take a swim with a dynamic
water system, and more. While these items are only
available for free this month, you can always add
effects to your footsteps with the now permanently
free Niagara Footstep system. And if that wasn't enough
content to get you started, in collaboration
with ArtCore Studios, we're excited to help you ramp
up production on your next project with a brand new
free Factory Environment Collection--the industrial
pack features all the advanced machinery, vehicles, props, and
fine refreshments you'd expect to find in any modern
large-scale manufacturing center. Download all this free
content from the Unreal Engine Marketplace. After those freebies, if
you find yourself still searching for just the thing
to accelerate your development, over 1000 items
are on sale at 50% off during the September Flash Sale. Featuring new
Marketplace arrivals, discover everything
from mountain meadows and dazzling
automotive materials, to dynamic sound effects
and high-quality props. The sale runs through September
7 at 11:59 PM Eastern. Selecting the proper
time and person to approach about
funding is just as important as having a
strong project to present. Author and co-host of the
Virtual Economy Podcast Mike Futter returns with the
next article in his "Finding Funding" series to explore
different types of investments and how to secure funding
for your company or project. The recently released
genre-blending No Straight Roads
harmonizes elements from traditional
rhythm games with action-platformer mechanics. Featuring a rock music
versus EDM motif, No Straight Roads is
Epic MegaGrant recipient Metronomik's first-ever game. Head over to the
Unreal Engine feed to hear from their
two founding members and discover not only how they
composed combat sequences that locked into the
soundtrack's rhythm, but also how they overcame a few
challenges during development. And now over to our top
weekly karma earners! Many, many thanks to: ClockworkOcean, Everynone,
Herb64, Okarii, Tuerer, T_Sumisaki, MMMarcis,
Shadowriver, Haojun_Dai, and staticvoidlol. Thank you all so very much! From around the community,
released on August 18, action-RPG Mortal Shell tests
your sanity and resilience in a shattered world. Your adversaries show
no mercy, with survival demanding superior awareness,
precision, and instincts. Possess lost
warriors, track down hidden sanctums of the devout,
and face formidable foes. Download and play Mortal Shell
from the Epic Games Store or Steam! This gorgeous scene is EPITAPH,
built by a student team from the NAD School in
Montreal, Canada. View the entirety of
the beautiful, yet somber short film on
Moufid's ArtStation. In Bagpack Games' Out
of Place, prove yourself in the mysterious
world of Mithem! Help Simon to find a way back
home, defeat your enemies, and restore a war-scarred world
in a tale of friendship, hope and courage. You can wishlist Out
of Place on Steam. Thanks for tuning in to our
News and Community Spotlight. >>Victor: Hi everyone, and welcome
to Inside Unreal, a weekly show where we learn, explore, and
celebrate everything Unreal. I'm your host, Victor
Brodin, and my guest today is RTX Unreal Engine Evangelist,
Richard Cowgill, from NVIDIA. Welcome to the show. >>Richard: Thanks. Hi, good to be here. >>Victor: It's good to have you. Would you mind giving us a
little bit of an introduction of your background,
and then what we're going to talk about today? >>Richard: Sure. My background is in
game development. I've done game design and
art for about 20, 25 years. I've done independent
development and AAA development, so
like every range in between. And our focus today
is on ray tracing, specifically in Unreal 4. This is my area of specialty. I actually made an
indie game before I came to work for NVIDIA last
year that was ray tracing only. But my background,
like I said, I've been developing games
for a long time. I've worked on the Battlefield
series, Borderlands. If you want to go way back,
I helped make Desert Combat, if anybody remembers that. So quite a range of projects,
and different sizes and scales, and things like that. Today is all about ray
tracing, and what we can do. >>Victor: So you've basically seen
and been a part of the progress from 720p, even 640, maybe. And now, until
today, where we're at and some of the new technologies
that you all are working on. >>Richard: Yeah, when I got
started in the industry, a 3dfx card was a
brand new thing. So you know, it's
like, wow, we can do accelerated rasterized
graphics for the first time. So that's-- it's pretty old. >>Victor: Yeah, very different. I'm really excited for everyone
to see what you have prepared for us here today. >>Richard: Sure. So, can you see my screen OK? >>Victor: Yep, yep, yep. >>Richard: OK. We'll just get right into it. This is an RTXGI scene. And I'll start playing this. I'll make this full screen. This is a fully dynamic
ray traced scene using RTXGI, which
we just announced, and is available for
anybody to download. And in fact, I could just
real quickly show you what it looks like with it off. So that's just
standard ray tracing. Oh, and let me show you
the frame rate delta. So this is using ray traced
shadows, occlusion, reflection, and translucency. All the standard
ray tracing effects.
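[Note: in stock UE4 those effects map to real console variables, so a sketch of the same toggles from C++ looks like this-- illustrative, not the project's actual code:]

#include "HAL/IConsoleManager.h"

// Stock UE4 cvars for the ray tracing effects listed above (each 0/1).
static const TCHAR* GRayTracingEffectCVars[] = {
    TEXT("r.RayTracing.Shadows"),
    TEXT("r.RayTracing.AmbientOcclusion"),
    TEXT("r.RayTracing.Reflections"),
    TEXT("r.RayTracing.Translucency"),
};

void SetRayTracingEffects(bool bEnable)
{
    for (const TCHAR* Name : GRayTracingEffectCVars)
    {
        if (IConsoleVariable* Var = IConsoleManager::Get().FindConsoleVariable(Name))
        {
            Var->Set(bEnable ? 1 : 0);
        }
    }
}

But when I turn on RTXGI,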
you can see what that does. >>Victor: Yeah, it clears it up. >>Richard: Yeah, and it
barely touches the frame rate. RTXGI has about a
two millisecond cost. And it's just like
anything else. You can abuse it, right? You could crank everything up,
and generate millions of rays. And it can cost more
than two milliseconds. But it's also very optimizable. You can get it down
to like a millisecond. This is going to be something
we constantly improve, too. So that's about where
it starts today. So here's this example
scene-- doing RTXGI over
everything in the scene here has a relatively low cost. I'll let it speak for
itself for the most part, but you can see what it does. Again, I can turn it off. That's just old ray
tracing right there. And you see it's got about
that two millisecond cost, and here I am at 1080p. Now this is also about
other technologies we're working on, like DLSS. I can turn that on. And you can see what that
does to the frame rate. So it's like the same
quality level or even better, and you can see it's a
significant boost in frame rate to do that. >>Victor: And you mentioned
earlier during our preparation that you actually saw
about a 25% performance reduction because of the
streaming over Discord. >>Richard: Yes, normally
this would be more like 90 or 100
frames per second. Hopefully everybody can see
the frame rate counter there. Right now, I'm running on
an i7 system with a 2080ti. So a performance level
that is considered now the low end of the new
generation of RTX cards. Normally I would
expect, like I said, 90 or 100 frames per second
out of a scene like this. But I can just walk
around a little bit. This is a fully dynamic,
interactive environment. It's got collision,
reflections are happening. >>Victor: There was a
question from chat, they were wondering if
RTXGI is ray traced? >>Richard: Yes. Yeah, it's only possible
with ray tracing. I can explain some of that,
how that actually works. Just so everybody knows,
you see this debug text at the bottom of the screen? That's just to let me
know that DLSS is active. That wouldn't be like running in
the application or a shipping game. That's just my debug
text, so I can see it. When I go back to the
old antialiasing method, the debug text goes away. If I turn that back
on, it's actually
what it's doing here. Because if you look, it
says the input resolution is 675 by 421, in
this case, and that is upscaling it to 1162 by 725. You can see what
it's doing there.
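[Note: to make the upscale concrete, here's the pixel math on those debug numbers-- illustrative arithmetic only, not part of DLSS itself:]

#include <cstdio>

int main() {
    // Input and output resolutions from the debug text above.
    const long inputPixels  = 675L * 421L;   // 284,175 pixels actually shaded
    const long outputPixels = 1162L * 725L;  // 842,450 pixels on screen
    std::printf("Shading %ld of %ld pixels (%.0f%%)\n",
                inputPixels, outputPixels,
                100.0 * inputPixels / outputPixels);  // roughly 34%
    return 0;
}

On the ray tracing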
aspect of RTXGI, this is what's actually
going on in the background. And I'll pull out. This test scene is an attic. It was originally OmniKit art. It was made by one
artist at NVIDIA. And see, I grab the DDGI volume. So this is a GI volume, much
like a Lightmass volume, where you would wrap it
around your gameplay space. And just like a
Lightmass volume, it's got probe points
for generating light. So this is the debug mode. I can see each RTXGI probe
point, and the lighting profile that it's generating
at that point. It's using ray tracing at each
one of these sphere points to generate what the probe sees. So each probe is casting
thousands of rays. And you can see
there's lots of them, but it's handling it just fine. But that's what's actually
going on behind the scenes. So I've got a volume here. We call them DDGI volumes-- dynamic diffuse GI. Because that's exactly
what it sounds like: it's fully dynamic,
diffuse lighting. And you have a lot
of configuration. You'll see that's a very
common thing with everything that I talk about. There's a lot of configuration
and options you have. But you can set how many
rays per probe per frame are being generated. How much sampling is it doing? The default here we have
is 288, but a lot of times, you can actually
go less than that. And like I said, this is
where things are at right now. This is basically the
1.0 of the product. But there's more
optimizations to come to speed this up even more.
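[Note: to put those sampling knobs in rough numbers-- the 288 rays-per-probe figure is the default quoted above; the grid size here is a made-up example, not the attic scene's actual probe counts:]

#include <cstdio>

int main() {
    const int probeGrid[3] = {8, 8, 4};    // hypothetical probes per axis
    const int raysPerProbePerFrame = 288;  // default mentioned in the stream

    const int probeCount = probeGrid[0] * probeGrid[1] * probeGrid[2];  // 256
    std::printf("%d probes x %d rays = %d rays/frame\n",
                probeCount, raysPerProbePerFrame,
                probeCount * raysPerProbePerFrame);  // 73,728 rays per frame
    return 0;
}

You might notice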
certain things in here. Like some probes are black
with red around them. That means that they're
not touching anything, and they're not relevant. So they've been put to sleep. It's sort of-- so if I'm
sloppy with my volume and I've made it too
big, that could be OK. I mean, at least to
a certain extent. Yeah, so I haven't done like any
real fine tuning of the scene. I just took a volume-- whoops. I moved it. Yeah, let me grab that. Oh, what did I do? I moved it way up. Yeah. So again, here's my volume. And you can see, as
soon as I moved it up here into non-relevant
space, it immediately put a whole bunch
of probes to sleep, because they're not
touching anything. They're not doing
anything important.
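[Note: a minimal sketch of that sleep heuristic in plain C++-- not NVIDIA's actual code; traceRay() is a hypothetical stand-in for the engine's ray traced scene query:]

struct Vec3 { float x, y, z; };

// Hypothetical stand-in for the ray traced scene query.
bool traceRay(const Vec3& origin, const Vec3& direction);

// A probe whose sample rays all miss geometry contributes nothing,
// so it can safely be put to sleep.
bool probeShouldSleep(const Vec3& origin, const Vec3* dirs, int numDirs) {
    for (int i = 0; i < numDirs; ++i) {
        if (traceRay(origin, dirs[i]))
            return false;  // geometry in range: keep the probe awake
    }
    return true;  // every ray missed: probe is irrelevant
}

>>Victor: That's a nice little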
automatic performance helper. >>Richard: Yeah, it's
a very smart system. It's got some neat
options to it. I mean, you might be saying
to yourself, oh, probes. If you look at the
current RTGI, that's just completely automatic, right? You switch it on,
but it's doing-- there's a lot of sampling noise. I think everybody considers it
to be, in its current state, like a beta product. It's not completely there yet. And that's understandable,
because it's just-- RTGI is basically beta. But it's got some drawbacks,
in that it doesn't currently run very fast. And it can be kind of noisy
because of the sampling. RTXGI doesn't suffer from
any of the sampling issues, because it's basically
not operating that way. It's using probe
points to generate spherical textures that
then get re-projected back into the scene. So ray tracing is used
in the generation, but not in the
projection, basically. And so therefore,
it doesn't suffer from any of the noise
issues that you normally get with current GI ray tracing.
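[Note: a rough sketch of the probe idea just described-- illustrative plain C++, not NVIDIA's implementation. Shading blends stored probe irradiance instead of tracing per-pixel rays, which is why there's no sampling noise:]

struct Vec3 { float x, y, z; };

// Hypothetical stand-in for reading the irradiance a probe recorded
// for a given surface normal (the "spherical texture" lookup).
Vec3 sampleProbeIrradiance(int probeIndex, const Vec3& normal);

// Diffuse GI at a point: trilinearly blend the 8 surrounding grid probes.
Vec3 blendProbes(const int probes[8], const float weights[8], const Vec3& n) {
    Vec3 result = {0.0f, 0.0f, 0.0f};
    for (int i = 0; i < 8; ++i) {
        Vec3 irr = sampleProbeIrradiance(probes[i], n);
        result.x += irr.x * weights[i];
        result.y += irr.y * weights[i];
        result.z += irr.z * weights[i];
    }
    return result;  // smooth, noise-free indirect light
}

And this system allows you to-- it does everything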
you'd expect GI to do. It will do colored bounce light. It will do soft,
ambient shadows. So lighting and shadow
is fully occluded. And you don't necessarily need
really dense probes in order to get that kind of result.
This is kind of-- yeah, see, generally it's pretty sparse. I've got some secondary
volumes in here where I want more detail, but
that's really not necessary. It's just that I wanted
to do the maximum quality on this scene. And so some of these
extra probe volumes are in here for that reason. But anyway, that's kind
of a long winded answer. But yeah, anything else
on that you wanted to ask? >>Victor: There are
plenty of questions, but I also know we have
a lot to cover today. So why don't we go through
some of the things that we had prepared, and then
we'll dig into questions either as they come up specifically,
or leave some of the more general ones till the end. Because I know a lot of the
questions that are coming in, Richard will be covering
throughout the presentation. >>Richard: Yeah. OK. There's lots of really cool
things you can do with RTXGI. And I'll-- let me turn off the
probes so we can see the scene. Well one of the benefits of
this method is mesh lighting. And this is really an
incredible feature. Outside the window here, I've
got large emissive surfaces. In this scenario with RTXGI,
every emissive surface is a light. And it's completely
performance-free. You get that for free. In fact, when you saw
I moved that back, it generated a little
bit of light right there on this wood plank. You see, I lit it up. So if I go back in here,
and I move that away, you see it gets dimmer inside. So if I make the
mesh more emissive, it effectively emits more light.
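[Note: one plausible way to drive that at runtime in stock UE4-- a sketch assuming the material exposes a scalar parameter named "EmissiveIntensity", which is our name, not the project's:]

#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

void SetMeshEmissive(UStaticMeshComponent* Mesh, float Intensity)
{
    // Swap in a dynamic material instance so the parameter can change live.
    if (UMaterialInstanceDynamic* MID = Mesh->CreateAndSetMaterialInstanceDynamic(0))
    {
        // RTXGI picks the brighter surface up on its next probe update,
        // so raising this literally pumps more bounce light into the room.
        MID->SetScalarParameterValue(TEXT("EmissiveIntensity"), Intensity);
    }
}

And in this case, I also wanted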
to be able to see the outside. So I set a depth-based
opacity on it, so that as you approach
the window, it fades down. >>Victor: That's a nice trick. >>Richard: Yeah, and it's
still generating light. It doesn't need to be
visible to generate light. In fact, over here I
have an invisible mesh. Easiest way for me to see
is probably wireframe. Where's that guy? There he is. So there is a sphere
mesh right here with the same material on
it, except in this case it's flagged to be invisible. And I did that because
in this branch-- we don't have caustics
in this branch. That's a whole separate issue. But one of the things
caustics lets you do, which we're going
to be getting to, is light transmission
through surfaces. So my translucent
shadows, basically. So if this had caustics and
GI and everything in here, this surface here could
be receiving some-- it could be semi-translucent. But I can fake it with a mesh
light, an invisible mesh light. And that's why I'm doing
some denser sampling in certain key areas. And you don't have to do that. But if you want GI
lighting in certain areas to pop more than other
areas, you can do that. And individually, the
probes are very inexpensive. Each one individually
is virtually free. They only cost something when
there's tons and tons of them. But there's even ways to
mitigate that, and pull that back in performance. This is why it
ends up being such a performance-efficient system. And also worth
pointing out, you might see a little bit
of artifacting here in the GI, that's generating
GI that appears to move. But that's because
I also have SSGI on. So RTXGI can work with SSGI. You can use both. That's just a choice, right? You don't have to. But sometimes SSGI can give you
just a little bit of extra lift in certain locations. This is with SSGI off. And it looks fine,
just doing RTXGI. But you see, SSGI created just
a little bit of extra lift at the bottom of
the couch there. That, to me, looks better. And just as an
artist, I'm saying OK, I'm willing to take some of
the screen space artifacts in order to get a little
bit of extra light boost. These are just-- That's part of the
flexibility of Unreal, right? We have these
choices that we can make to pump in more
detail or more lighting if we want to get
a certain result. >>Victor: And there
were a few questions coming in related to-- will this project,
or any new stuff you're showing today will it
be accessible for people to dig into and test
out at some point? >>Richard: Yes. The whole point of
this here is that it should be an example project
that everybody has access to. So all the content
will be downloadable. You can tear it open, take a
look at how the scene is built. There will be a
version of this that will be compatible with just
regular Unreal and our branch; we call it NvRTX. It's on our website. But that branch has-- it's got DLSS, ray
tracing specific optimizations that aren't in mainline yet. So you can get
either version of it. You can compare
the two you can see the performance gains
you get from DLSS, all that sort of thing. And sort of see what the effects
are and the differences are. >>Victor: And to be clear, you're
currently on the custom de-- your branch, right? NVIDIA's branch? >>Richard: Right. Yeah, because I wanted to show
RTXGI and DLSS both running in the scene together. In fact, just right here. Again, we're looking at the
current antialiasing method. If I switch to
DLSS, you can see, I got the frame rate boost. Quality still looks good. You can see what it's
doing behind the scenes by looking at the debug text. >>Victor: You want to
go full screen for us? >>Richard: Oh, sure. I can't remember if we
mentioned it already, but yes, I am losing some frame
rate from doing the stream. Like I said, this is normally-- 90 or 100 is more like
what we'd expect to see. This scene, by the
way, like I said, it was made by one
artist at NVIDIA. But it was intended for OmniKit. This was sort of like CGI level
or offline rendering level graphics. And so you're looking at a
two million triangle scene. It's very dense, very high poly. And it's been
brought into Unreal and sort of assembled
in the Unreal way. So there's individual
objects, and we can look at what the performance is. So it's got interactivity and
gameplay, proper collision on things. There's all kinds
of physics going on. So this is like a real scene. This could be a game. That's part of
the point of this, is it needs to be real, right? A tangible thing that you
can look at, and take apart, and know that it's practical. And also, completely done
with blueprints, by the way. But yeah, there's a lot of fun
little interactivity in here. It's still being
developed, but this will-- >>Victor: Here's the cool-- yeah. >>Richard: This is
your favorite, right? >>Victor: Yeah it's my
favorite part by far. >>Richard: Yeah, I mean this just
shows how accurate ray traced shadows can be. Because like I said,
it's fully dynamic. Ray tracing for everything. Reflections, shadows,
translucency. If you look really
carefully, you'll see the character
reflecting in the sphere. That's not screen space. And I kick it around. The great thing about
ray traced shadows is you can see how sharp the
shadow casting is at the base, like where the light's
coming out of the star. But then as it
gets further away, it's got that nice
penumbra effect. >>Victor: This definitely
makes me want to see like a ray traced disco
game in the next game jam. >>Richard: Right? Exactly. Yeah, lots of dynamic lighting. All dynamic. Dynamic mesh lights,
in this case. >>Victor: Right, it's just an
emissive material, right? On those green balls? >>Richard: Yeah Full transparency,
there are little very small, non-shadow casting point
lights also placed here. But they're not generating
the light, really. They're just for a
little bit of touch. They got a very
small radius on them. And it's just so that it
creates a little more precision. But it's not necessary at all. You could just go with GI. But you know, it's-- non-shadow casting lights
are very close to free. So it's almost, why not? But you wouldn't have
to do it that way. Like I said, I can turn
the GI lighting off. And you can see, so that's
the direct lighting. And everything else around it-- so there's some screen space. Yeah. I'm gonna turn the GI on. GI off. So the GI contribution
is significant. >>Victor: Yeah that's
stuff that you-- without any form of
dynamic global illumination, you would have to fake
with indirect point lights. And bake it out with-- play around with your skylight
and such to try to get that. >>Richard: Right, exactly. Yeah, I don't know
if what I just did there was completely
obvious to everybody. But that was--
you just saw RTXGI do occluded-- it's
doing occluded shadow and lighting in real time
with changing geometry. This is a fully dynamic system. So there's a lot of
advantages to this. And it produces a
very clean result, like-- I mean, that's one
of the key things. It doesn't have some
of the noise issues that we saw in the very first
iteration of ray tracing. This is significantly advanced,
very stable, very fast. So there's a lot of
good quality advantages. >>Victor: So we touch
quite a bit on GI here. Would you mind explaining, for
those who have never heard about DLSS? What it is, and what it does? >>Richard: Yeah. So DLSS is-- it's basically a
new form of image enhancement and antialiasing that
uses deep learning models. So we're using
artificial intelligence to take a lower resolution
image, like what I have here. Right now. Pretend like this
is your input image, and we're scaling it
up to full screen. The actual input
image is like this. It's smaller, low resolution. What the AI is doing
is as it scales it up, it's looking at all
the edge pixels. It's looking at all
the little details that basically get lost,
and reconstructing those details in real time. In fact, if you look down
here in the debug text, it's telling you what the
millisecond cost is in order to do that reconstruction. So in this case, that's
a half a millisecond. Only cost a half a
millisecond for an RTX card to do that real time
reconstruction of a lower res image to a higher res image. >>Victor: And compare that,
then, to actually drawing those pixels, or telling the
GPU to draw those pixels, that is significantly lower. >>Richard: Yeah, I mean,
fill rate issues, I suppose are a very
large bottleneck. The smaller an image is
the more postage stamps-- >>Victor: One moment,
there, Richard. You cut off. OK, I think you're back. Yes, yes all right, great. >>Richard: Yeah, hopefully
not too many of those. So yeah, it's not just
sharpening it and scaling it up. It's constructing detail,
and inserting that detail into the scene, and
making it look right. And it knows how to
do this because-- I guess the best answer
is artificial intelligence is just really good at
imagery construction. It understands images, and
understands how to do it. It's been talked about, but
the AI algorithm is trained. It's fed a ton of ultra
high resolution images. And it's basically taught
what things should look like, how light should
interact, how details should appear to the human eye. And it learns how to
do all those things. And then once it's been
taught-- and there's years of development that's
gone into teaching the AI how to do this sort of thing. It then inserts that detail
into basically an enhanced lower resolution image. And the fascinating
result with that is that a lot of times,
the lower resolution image that's scaled up looks
better than the native image. And I know that's been
talked about a lot, but it's a real thing. It starts pulling out text that
you couldn't read otherwise. Some of that just might be
current antialiasing methods are, while really, really good,
they don't do every detail perfectly. And DLSS just might. There's some interesting
discussion points there. But DLSS might be doing so. I mean it's definitely-- it understands what
images should look like, pulls them out, makes
them look better. So it's a very
impressive technology. And it benefits-- the great
thing about it is that it works-- if you have an RTX
card, it just works. Right? So DX11, DX12. Ray traced, not ray traced. Those details don't matter. But in the case of ray tracing,
it really helps there. Because ray tracing is doing
higher quality work, right? Higher quality lighting, and
shadow, real reflections as opposed to fake ones. So anytime you go
higher quality, there's a cost to that, right? So technology like DLSS gives
you your frame rate back. Any frame rate you might
have lost to ray tracing, to going higher quality,
DLSS returns to you. And I would just point out
that these are pure software advancements, right? Ray tracing is getting better,
DLSS is getting better. They're advancing. And they're not-- these
are not technologies that are holding still. There's going to be further
advancements coming out. We'll see more and more. But it's definitely
impressive stuff. I know this is my
job and everything, but I'm constantly
blown away by what I'm seeing being produced here. And I have to say
as you see, I'm turning on DLSS
with a console command. So antialiasing
method 2 is TAA. There it is, off. And you saw the frame rate drop. And 4 is for DLSS. And largely, there's very
little or no configuration that's really needed. You see I just doubled my
frame rate [INAUDIBLE] shot. And I don't have to finesse it. I can, there are options
to finesse it further. But it really is one of
those things where you just turn it on and get the results.
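[Note: the exact command isn't shown on screen; in stock UE4 the matching cvar is r.DefaultFeature.AntiAliasing (0=off, 1=FXAA, 2=TAA, 3=MSAA), and the NvRTX/DLSS branch appears to add 4 for DLSS. The same toggle from C++, as a sketch on that assumption:]

#include "HAL/IConsoleManager.h"

void SetDLSSAntiAliasing(bool bEnable)
{
    IConsoleVariable* AAMethod = IConsoleManager::Get()
        .FindConsoleVariable(TEXT("r.DefaultFeature.AntiAliasing"));
    if (AAMethod)
    {
        // 2 = TAA in stock UE4; 4 = DLSS in the NVIDIA branch (assumption).
        AAMethod->Set(bEnable ? 4 : 2);
    }
}

>>Victor: So does DLSS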
completely replace the need for any other form
of antialiasing? >>Richard: Yeah. It's a full on replacement. It does its own version,
its own way of doing that. And concepts of, like
temporal upsampling-- or prior to DLSS, if you're
trying to get maximum frame rate out of your software,
you might do something like turning on upsampling. You'll run it at a lower
screen percentage-- 83%, 75%. You might do
something like that. Run at a lower screen
percentage, upscale it, and then maybe run a post
filter on it, like a sharpen. And that's sort of
the standard way that we might do it in
professional development. And so on. But a lot of indies, too. Just everybody. That's how you might
get an extra 10 frames a second out of it, say.
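[Note: a sketch of that older recipe using two cvars that do exist in mainline UE4, r.ScreenPercentage and r.Tonemapper.Sharpen-- the specific values here are illustrative:]

#include "Kismet/KismetSystemLibrary.h"

void ApplyLegacyUpsampling(UObject* WorldContext)
{
    // Render fewer pixels and let the engine upscale to the output.
    UKismetSystemLibrary::ExecuteConsoleCommand(WorldContext, TEXT("r.ScreenPercentage 75"));
    // Post sharpen to claw back some perceived detail.
    UKismetSystemLibrary::ExecuteConsoleCommand(WorldContext, TEXT("r.Tonemapper.Sharpen 0.5"));
}

All that stuff is-- you don't have to think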
about that anymore. That's not necessary. Don't do any of that,
DLSS just handles it. It takes that lower res image,
and scales it up, sharpens it for you, makes it look better. >>Victor: There were a couple
of questions in regards to-- previously,
folks have heard that you had to train DLSS
specifically for your game. That's no longer
the case, right? >>Richard: That's right,
that was DLSS 1.0. 1.0 was-- I mean, it
was earlier software. DLSS 1.0 had to be
applied to AAA projects. Because we would get the
developer's binary, and train DLSS how to analyze their image. As you can imagine, it's
an artificial intelligence algorithm running in real time. In theory, it could do any
number of things to your image. It could manipulate it
in any number of ways. It could change all
kinds of details. So you had to make
sure it ran right. And it was kind of like
software-to-software. DLSS 2.0 is very generalized. It takes any kind of image
and puts in the right details. And just does it automatically. It'll work on any
type of environment. You know, photo real,
cartoon, doesn't matter. >>Victor: I have a lot
of questions coming. I'm going to try to
grab some of them here, while we're on
the topic of DLSS. One question coming
in, though, is does DLSS work with-- OK, there are a couple ones here. Main one, a couple questions. In regards to VR,
if it's possible or will be possible to use
it for stereo rendering? >>Richard: Yeah, that's
being worked on. I can't talk about
release dates, but it is being developed. I've seen it personally working. >>Victor: Do you know if it works
with-- another question was if it works with
the forward render? >>Richard: I don't know that. I probably can't get
into specifics anyway. >>Victor: That's OK. I'll bombard you
with the questions. And then perhaps,
if there's someone-- either you or NVIDIA-- who would like to
follow up later, you can just go ahead
and post the answers on the forum announcement post. Let's see. There are a couple of
questions, moving back a little bit to RTXGI in
regards to the light probes. Someone was asking if you
need to precalculate the RTXGI probes? >>Richard: No it's
all fully dynamic. But this is actually
another advantage of RTXGI. You could say that
we've attempted to think this one through. Because the question might
be, well, what about consoles? What about lower end systems? RTXGI today is fully
dynamic, uses ray tracing to generate the
light probes in real time. A future version
of RTXGI is going to break that down for
lower end systems, consoles. So it's fully
dynamic on RTX cards, but we'll have an update at
some point that will open that up a little bit more,
and provide, like I said, a baked down version of
the GI for older systems, or lower end systems. >>Victor: I had
another question come in that was, how
does the RTXGI handle small details or thin geometry? >>Richard: Very well. I wish I had some of those. I don't have the-- that was one thing
we were looking at. I mean, there probably
are cases where certain very thin geometry,
or maybe a certain geometry, can trip up a little bit. But I would say, in 90-some,
very high percentile of cases, it works extremely well. It won't have any leaking. Any issue you would
run into there is usually just a
matter of fine tuning. I can show you one
other thing about it. It might be
interesting to look at. Show the probes. So there is an option on
here, enable probe relocation. It's on by default. And there
I go, moving stuff again. So as I shift this around
in the environment, the probes will
intelligently figure out where they're supposed to be. Are they inside geometry? Are they inside gameplay space? And they'll relocate
themselves on the grid. So they won't end up stuck in a
place they're not supposed to, or generate light and
shadow in a bad way. And if it doesn't make sense-- you see this one up here,
in the top corner there, it's shifting its position. See that? Just at the top there. Because I moved
the volume around, it's going oh, I need to be a
little bit over here, or over there. Right? And sometimes, it'll go
oh, I need to be shut off. It'll just figure out where
the optimal position is. So you really-- you can,
in a lot of cases, without much fine tuning, just
drop a volume in the world and count on the probe relocation
to give you decent results. You see-- yeah, that's interesting.
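[Note: a sketch of the relocation idea in plain C++-- illustrative, not NVIDIA's code. The usual heuristic is that a probe seeing mostly backfaces is inside geometry and should nudge itself toward open space:]

struct Vec3 { float x, y, z; };

// Hypothetical summary of what a probe's rays reported this update.
struct ProbeRayStats {
    bool  mostlyBackfaces;  // inside geometry if most rays hit backfaces
    Vec3  escapeDir;        // average direction toward open space
    float maxOffset;        // clamped so the probe stays in its grid cell
};

Vec3 relocateProbe(const Vec3& gridPos, const ProbeRayStats& s) {
    if (!s.mostlyBackfaces)
        return gridPos;  // valid position: leave the probe on the grid
    return { gridPos.x + s.escapeDir.x * s.maxOffset,
             gridPos.y + s.escapeDir.y * s.maxOffset,
             gridPos.z + s.escapeDir.z * s.maxOffset };
}

>>Victor: And I'm not seeing a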
significant performance drop while you're-- >>Richard: No, no. Yeah, it's all real time. It's constantly
being recalculated. Yeah, we're looking at
further optimizations on that. Like I said, it's a two
millisecond cost today, but people can expect it
to get faster and better. Just because-- I'm so
glad Ampere came out. Because you might be-- someone might suspect
oh, well they just made ray tracing cards
that are twice as fast. So they don't need to try
to improve the software. That's definitely not the case. The software is advancing, and
we'll get faster and better. And that's over and above
what people are seeing with the hardware improvements. >>Victor: Yeah, you know, I
was saying this last year when we first released ray
traced lighting and shadows and reflections. I was like, it's the early days. But it's still
early days, right? >>Richard: Yeah. I mean, you might consider,
if you step back a little bit and look at what's
happening right now, there could be some debate
about this for sure. But you know, starting
with Unreal 4.22, you could look at ray tracing as
a very specific set of things. It was shadows, it was
translucency, reflection. I can't remember. But it was a very
narrow set of things. Reflection, translucency,
shadows, stuff like that. We're kind of getting
into the second generation of the software now. We're looking at real time
GI that is extremely fast. I mean, like I said,
I'm running on a 2080ti. No longer the high end, and I'm
getting very good performance with it, very stable image. And we're looking at
things like caustics, which is not hypothetical anymore. It's a real thing. You can look at
this next generation of ray tracing
enhancements almost like a second generation, to
go with the second generation cards. And it's like I said-- I feel like it needs
to be reinforced-- the software is evolving. It's getting better, and
faster, and higher quality. So it's not a static
thing where it was two years ago or a year ago. We're going to see
continual improvements over where we are now. >>Victor: Just a clarification. Someone was wondering
if DLSS blurs all text, but it's actually just that
debug text that is overlaid, right? >>Richard: Oh, yeah. You mean it gets
blurry in the scene? >>Victor: Yeah, I think-- yeah, so when it's full
screen and you fly around, you can see the debug
text, you know, moving. >>Richard: Yes. Yeah, that's nothing
to be concerned about. That's just-- I don't know
why it's not doing it now. >>Victor: Yeah, it's actually-- >>Richard: Yeah, that
effect right there, right? Where it gets strangely blurry? That's just the debug
text being debug text. Like I said, if I built an
exe, I wouldn't see that. And-- or I would, if
I had the debug DLL in. But that's just
part of development. Yeah, it's nothing to
be concerned about. >>Victor: Let's see, we have a
few more questions coming in. But I think we can leave a
couple of them for later. I know you're going to
cover a few more things. >>Richard: Yeah I have
other content I can show. I mean, this attic
scene is something I'm eager for everybody
to get their hands on. It's not quite ready yet
as an example project, but I feel like
the more examples we can get into people's hands, they
can load up the editor files and take a look at everything. That's kind of the best way to
really understand what this is and what we're doing. >>Victor: And to clarify, you did
mention that this scene will eventually be available in
the vanilla version of Unreal? >>Richard: Yeah, I
think we'll probably put out both vanilla and NvRTX
versions at the same time. >>Victor: And I did see
a question earlier. Someone asked if the NVIDIA
branch provided binaries. As far as I know, you have
to compile it from source. >>Richard: Yes. Yes. Well, I can just
show that real quick. We tried to make
this really easy. Understand, I'm not a coder. I don't write code. I have a design background,
an art background. But even I can do this. If you go to our Unreal Engine
page on developer.nvidia.com, all you got to do is
get GitHub access. And that's just creating an
account and getting approval. And then once you do that, you
can grab the branch. And we document all this. And once you're in, and you have
the code and you pull it down, here's the branch right here. And we have
versions going back, but this is
at parity with 4.25. >>Victor: Yeah, it's very recent. I'm super glad to see that. >>Richard: Yeah, and we
coordinate everything with Epic as far as that goes. We're in touch with
you guys constantly about when's the next
release coming out. Making sure our code
is ready to go when your codes are ready to go. So the delta between
Epic putting out a new version and us updating
our branch is very short. We're trying to shorten
it all the time. But we try to be days,
or weeks at the outside. But very, very quickly. But all the source is here. You just get the
code and build it. And we have installation
steps for everything. Like what you need to run
DLSS, and a pretty good guide for how to get it all
set up for yourself. So yeah, we're trying to make
that as easy as possible. And really, I'm serious, if
I can do it, you can do it. We have a lot of
non-coders at NVIDIA. Producers, and so on. Testers who are people who-- they learned to do this, too. They just pull down the code. You build it. It's kind of powerful
once you're in there, and you're doing that. Once you realize that it's
not nearly as complicated as it seems. All the quality enhancements,
the performance enhancers, especially. They're significant. So it's really good
to get. If you're serious about doing a
ray tracing project, you should get in our
branch and get that code. >>Victor: Before we
leave the attic scene, there was a question in
regards to the god rays that are coming in
through the window. The question was, how are
you getting ray tracing shadows to work with volumetric
lighting to cast those god rays? >>Richard: That's in our branch. Yeah, it's not
currently in the main branch, but it's in our branch. It's something we
recently addressed. >>Victor: Someone out there in
chat paying good attention. >>Richard: Yeah, sure. Yeah, there's lots of little
enhancements like that. That's a good catch. I mean, that was done on
purpose for the scene, too. Because I don't want
to just have this scene to test one thing. We need to look at everything. We need to look at how DLSS
works with volumetric fog shadows, and how that works
with ray tracing enhancements. Because that's what
developers are going to do. You're going to look at
everything together, and see the combined result. What's
the performance like? What's the quality like? It has to be there. Test scenes like
this are to make sure all that's working right. >>Victor: There's also
someone wondering if it's possible to turn the
input resolution right down, and see what we
can get away with. >>Richard: The input resolution? Oh, for DLSS? Yeah, I'm not sure about that. I mean, I'll be honest,
I'm not a DLSS expert. At a high level, there might
be tweaks in there to do that. But I'm not 100% sure. >>Victor: Why don't you
go download the branch and try to figure it out? >>Richard: Yeah, exactly. Yeah, go take a look. >>Victor: All right. Unless there was
anything more here, I know you have a couple more
levels that impressed me, so. >>Richard: Yeah. I just wanted to-- maybe I can speed
through some of that. Because I might be spending
too much time on that. >>Victor: Oh, we have--
you have plenty of time. >>Richard: OK. This is our basic
example project. As you can see, it's
NvRTX GI example. So this is based off
our NvRTX branch, so it has all of our RTX
enhancements plus RTXGI. So this is a custom branch. I mean, I put it together,
but anybody can do this. So this is just-- if people saw certain
webinar videos recently, you might have
seen this already. But it's just a good
dynamic ray tracing example. It shows fully dynamic lighting. In this case, ray traced GI-- RTXGI is on here. So if I turn that
off, you can see you lose a lot of the color,
the bounce lighting back here. There's a little bit
of GI if you look. Attentive people will
notice, right there. I can't see where I'm pointing,
but right here, there's a little bit of GI happening. But screen space. If I look away, it goes away. You know, it's good it's
good to have, right? If you want it, it's there. And RTXGI is compatible
with it, if you want that little bit of
extra lift in certain places. And there's RTXGI on. So you can see, there's
a lot more very subtle lighting and shadow
activity happening with that enhancement. This is an interesting-- this is another way
to look at what's possible with mesh lighting. Because I can take these
emissive materials here, and just go like that. And create lighting
in the scene. There's really no
limit on that, right? And I mean, that's
real light that gets bounced, and occluded, and
does everything you'd expect. It's possible to completely
light your scene this way. RTXGI is-- I mean,
by definition, it's not super precise. It's not super high
resolution, but it will cover most of your
lighting needs just like that. And those would be
mesh lights, then you're generating all
that diffuse, shadowed, colored bounce lighting in real
time, with no additional cost. And here, in a darker area. >>Victor: Yeah. Yeah, that little
alcove there was pretty cool to see when you placed it. Yeah. >>Richard: Starts to
make purple in between. That's just fun, right? That's cool. I don't know what's happening,
but it's pretty neat. >>Victor: Yeah, you
know, it's really about how caustics makes you
think in new terms. But even here, I can see
as an environment artist how, all of the
sudden, you can rethink how you would light stuff. >>Richard: Yeah, I think this
is one of the potential maybe not commonly
thought-about elements of RTXGI. Because I've been
a lighting artist. And typically, you do
things as a lighting artist to try to make your
scene look right. It can be a lot of
different adjustments. Everything from post-processing,
tone-mapping, or whatever. But you might flood your
scene with a lot of lights, or bake things, and
just maybe a combination of different elements. RTXGI eliminates all that. And in theory, you can have a
scene with a lot less lighting. You don't have to put down
lots of little point lights all over the place. Sort of the traditional
way of lighting a scene. You can-- well, I mean,
I'll just do this quick. Like I can grab-- this is just a bunch
of separate objects that have got different
materials on them. Very simple materials. But I could, you know, have
this guy generate light. Oh, whoops. And then, you know, make
it really physically big. And now there is light coming
up from underneath here. >>Victor: Now we
have the floor is-- not lava, but-- >>Richard: Right well
we can make it lava. >>Victor: There we go. All right, OK, floor is lava. Thank you. >>Richard: And it's
occluded light. I mean, it's popping up
through this window over here. You know, it's-- but it's
creating a subtle light and shadow, and then
bouncing around in the scene. It gets occluded
where it needs to. And that's a no cost light. That's a zero cost light. So a lot of situations
where you might put in rect lights
or whatever, maybe you still want to have them. There's no reason why you
can't combine methods. Again, that's one of the
great things about Unreal is the flexibility of it. But you can very conceivably
get away with less lighting. And then RTXGI lets you-- or will let you bake that
down on lower end systems. That's a bake, right? So lower end systems,
consoles, that sort of thing, they will lose the dynamic
element that RTXGI gives. But you can still
see the quality. If you're not
constantly changing what you're doing with the
lighting every frame, maybe baking down is OK. >>Victor: So thinking
about if we're sort of that kind of cross
platform development. What it means is that the
environment artists can light the scene and then-- unless there are dynamic
elements where you want GI-- the environment
artist doesn't have to go in and make a
completely separate level, or try to go and
fake all the effects. You can actually
bake all that down? >>Richard: Right. Right. Yeah. That will, yeah. And a lighting artist won't
have to really worry about that. You just bake it, and it'll
give you a very similar result to your real time look. You know, like I said, that's
compatibility with lower end systems. Oh, that's so interesting. Light's coming through
here because this is not shadow casting. This mesh. I'm pretty sure,
I could be wrong. >>Victor: Yeah, you did
mention that-- >>Richard: I did something wrong. >>Victor: --previously how
you used a specific node in the material to make the
object not cast any shadows? Or even be visible, right? >>Richard: Yes, yeah. Let me just switch
back to the attic. >>Victor: Someone was curious
if the bake feature is already there. Is it ready yet? Or is that something
that's coming? >>Richard: It's coming. Yeah, you'll notice
this is-- so this is a level with
volumetric fog. We have a blueprint. You know, it's a console command
to turn on ray traced volume fog. So when I press play, it
gets activated on script. Yeah, that's why
that's like that. And then you see how
it looks appropriately. But let's see. As you can see, this is
a very complex scene. These are all the assets
that go into making this. There's a two million
triangle scene, 4k textures, et cetera, et cetera. Very, very high end stuff. Oh, I should get
the master material. Yeah, so just using RayTracingQualitySwitchReplace-- in this case, I'm able to
have the visual opacity and the ray tracing opacity be different. So in this case, I
can use this trick. This is one way to go. Add an opacity mask
to make something visible to ray tracing but
invisible to direct rendering. All you need is this RayTracingQualitySwitchReplace node. I mean, in fact, the
pixel depth stuff here is just so I can do a fade. But you know, this is
all it needs to be: a value plugged into that, plugged into opacity. Right? Very simple.
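[Note: the material logic just described, written out as plain C++ for clarity-- the real thing is material nodes (PixelDepth driving Opacity behind a RayTracingQualitySwitchReplace), and the names and fade distances here are ours:]

// 0 when the camera is closer than fadeStart, 1 beyond fadeEnd.
float depthFadeOpacity(float pixelDepth, float fadeStart, float fadeEnd) {
    float t = (pixelDepth - fadeStart) / (fadeEnd - fadeStart);
    return t < 0.0f ? 0.0f : (t > 1.0f ? 1.0f : t);
}

// Invisible to normal rendering, but visible to rays, with a depth fade--
// so the mesh can cast light without ever being seen directly.
float finalOpacity(bool isRayTracingPass, float pixelDepth) {
    return isRayTracingPass ? depthFadeOpacity(pixelDepth, 50.0f, 400.0f)
                            : 0.0f;
}

>>Victor: Yeah, I thought it was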
plugged into opacity. Right? Very simple. >>Victor: Yeah, I thought I was
a super cool little trick. >>Richard: Yeah. Well it's just
something that wasn't-- a lot of stuff is not
super intentional, right? We're developing
this and going, oh yeah, there's this feature
in the material system where you can get a
different result for a ray traced result than a
direct rendered result. You can get different results
on the material depending on whether it's reflecting or not. So it's one way to create
an invisible mesh that generates light. I set that to vanish
when you get close to it. But this is a good debug
way to see what's going on. Right? So yeah. That's what it's actually doing. Yeah. And in fact, part of the way
this is designed for lighting, you have so many options
with how you use RTXGI. You don't have to
use emissive meshes. You could just go straight
off the direct light. Put the values in that you want,
it'll generate bounce light straight off the sunlight. And that's all you need. In this case, I wanted a
little bit of extra boost in the right places. Just sort of an artistic choice. But it's also a test
of what it can do. And I've got a slightly
sunny, tinted, emissive mesh on one side, and a bluish
tinted one on the other. It's meant to
match my sky color. So if I go like that it'll-- as I approach the
window, you know, then I can see stuff outside. You know, you don't have
to do something like that. But this is just yet another
way to pump extra light into the scene where you
want it by using emissives. Like I said, it's free. And you see it's
fully-- like here, where there's in-between,
it's you know, where there's holes in the
mesh, light's coming through. And then here, it's occluded
where it's supposed to be. And then here,
there's light coming through, et cetera, et cetera. I even put little
emissive meshes up here, even though you can never
see them in the scene. Just so that there would be
a little bit of extra bounce light coming in up
there where I wanted it. So it is very tunable from
an artistic standpoint. You can make it as
realistic as you want, or as sort of artificially
pumped up as you want. >>Victor: There's two questions
here I would like to address. Someone was curious if
there were any licensing fees in addition to the
Epic license agreement by the use of your technology? >>Richard: Licensing fees? No, not that I'm aware of. But there's just-- Yeah, there's
just-- we just want to know basically who you are. So we have a
development program. But we support
everybody in these. Individuals, filmmakers, AAA's,
anybody who's interested. We have an interest to know
who's using our software. That's really all it is. >>Victor: On the topic
of that, someone is mentioning they took a
look at the DLSS branch. And it says you need some DLSS
registry files from an NVIDIA contact to use it. Could you say
something about that? >>Richard: Yeah that's just-- I think it's just part
of what makes it work. They're just little
registry files that display debug text
to confirm it's working. But it's not-- it's
just something-- it's a very simple install. I'm not sure what the
question is asking, exactly. But it's just it's
very simple install. You drop in registry keys
on your copy of Windows, and then you get-- like you can see it
working, and the debug text, and things like that. It might have something
to do with making sure that from a
development standpoint, it is properly working. >>Victor: I think they were
wondering where they actually get the registry files. I haven't gone through
the process myself, so I'm not aware. >>Richard: Yeah when you sign up
for the development program, everybody-- like we
have bizdev people who help end users through that. So once they're signed up,
there should be a contact that they would be assigned
who can help them with that. >>Victor: And for a
little bit of clarity-- Yeah, I was just
going to go with that. You know this is still
early days for technology. NVIDIA is looking for
feedback, and for people who use it, know what
it is being used for. As we move forward,
similar with features that Epic is
putting out as well, you can access it early on the
source builds of Unreal Engine. But then later on, it will
be available for everyone with the vanilla branch. And that's a plug-in. And it won't be too long, as
far as what I've heard. It's not too far away. But no date yet, right? >>Richard: Yeah I
mean, like RTXGI, I couldn't talk about
even a month ago that it was coming
out on September 1st. But I said a month ago,
it's coming out soon. So keep refreshing that
page, it's coming right up. And no joke, it came out quick. You know, RTXGI is here. Anybody can get it. Just sign up for our
development program, and you got access
to the source. And you can plug that
right into Unreal 4 and go. You don't have to
be on our branch. You know, we've got
a version that will work with the mainline Unreal. So you don't have to. You know, you don't have to. It's totally up to you
how you want to use it, and what you use it for. But yeah. You know, caustics
is the other thing. It's in a similar boat,
it's coming out soon. It's going to be real quick. >>Victor: Will the
emissive lighting work with moving textures? >>Richard: Yeah. I don't have an example of that
here, but it is fully dynamic. >>Victor: Yeah, I don't see
a reason why it wouldn't. >>Richard: Right, it's just--
it's constantly resampling your [AUDIO OUT] frame. [AUDIO OUT] samples
that [AUDIO OUT] >>Victor: Let's see, can we
get a quick sound check from you, Richard? We heard you drop out there. OK, yeah, I think you're back. >>Richard: Yep. It's like every 30 or 40
minutes, it drops out. >>Victor: Trust me I'm
aware of audio issues. Thankfully, I had the
audio interface crash before the stream. So I'm hoping it won't
happen during the stream now. >>Richard: Right. There's just one more thing
I want to show with GI. Because I think
this is important. Because you see an attic, and
you see like a cube map example map. But I think the real test is
doing something like this. Where we have an actual
large open world for a scene. So it's not the biggest map in
the world, but it's fairly big. You know, and there's a lot
of foliage going on here. Oh and this, too. This is where it's using
volume fog, so let me start it. >>Victor: Wish we
had some epic music to go along with Manny's
adventure in the RTX world. >>Richard: I know, right? So here, I mean, this
is a great example of GI in an outdoor scene. And there's actually--
I'll just quickly explain-- there's two GI volumes. You can have
multiple GI volumes. I mean, there's a threshold. Where if you put in dozens
and hundreds and hundreds, it becomes abusive. But you don't need to do that. Or at least you shouldn't
need to do that. In this case this big
level only has two. It's got one big volume that's
static for the entire level. It just covers it end to end. And that's like a Lightmass
Importance Volume. Same thing. So you just wrap that
thing around your level. And it's got very sparse probes. So those probes are-- whatever, they are
like 30, 40 feet apart. They don't need to be
close together in order for you to get a good result.
That's generally true. But then there's a second volume
that's centered on the player. And it moves with him
everywhere he moves. That just shows how
dynamic these probe volumes are.
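[Note: a sketch of that player-following volume using stock UE4 Actor APIs-- the DDGI volume class itself ships with the RTXGI plugin, so the actor pointer here is just a placeholder:]

#include "GameFramework/Actor.h"
#include "GameFramework/Pawn.h"
#include "Kismet/GameplayStatics.h"

void UpdateFollowVolume(AActor* GIVolume, UWorld* World)
{
    // Keep the high-detail probe volume centered on the player each update;
    // probes relocate and sleep on their own as the volume moves.
    if (APawn* Player = UGameplayStatics::GetPlayerPawn(World, 0))
    {
        GIVolume->SetActorLocation(Player->GetActorLocation());
    }
}

And that's done because I want,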
basically, higher resolution GI around the player
than in the distance. And that could work for a
multiplayer scenario, too. That would scale
up with, you know, lots of players running around. Each one having
their own GI volume. Because the system has
additional smart elements to it where it puts them to
sleep when it needs to. And it can do stuff
based on distance. So it's very, very clever about
making sure that performs well. But here I can just turn off GI. You can see what that does. But this is a fully dynamic,
fully ray traced scene. But GI is contributing so much. It's quite a bit. >>Victor: Yeah, you
would have to bake that down if you weren't using this. >>Richard: Yes. Yes. And yeah, this just
eliminates all bake times. Workflow improvements,
I think, are a big deal. So there's GI back on. You know I think
that's a big part. A lot of artists who have been
doing this for a while, that's what they're thinking about. They don't want to
do overnight bakes. They don't want to wait forever
for a light match to finish, and then oh, they
missed an object and they got to move it again. And now they've got to
re-bake the whole thing. And it's a nightmare. And they're constructing
levels in a way where they're compartmentalizing
things very specifically. And so you go through
all these extra hurdles. And it becomes sort
of a workflow problem. If you can just dynamically
generate it and keep it fast and good
looking, then you know, get rid of all those problems. There's GI off. This is a real stress
test of the system. And by the way, these are
using the Kite Demo assets. So this is just the free
stuff off the Epic store. Kite Demo was made
four or five years ago. So nothing especially
special about these assets. Right? This is GI on. GI off. >>Victor: Yeah this
little part of the level is particularly
interesting, right? Because of how tight
the tree coverage is. >>Richard: Yeah, this
is not probably even realistic for games
because it's too dense. If you're actually
doing this for a game, you'd want to clean
this up a little bit. Not have so much
foliage overlap. This is a crazy
amount of overlap. This is like a real
insane stress test. I don't think most sane
developers would do something like this. But this that's part
of the point, right? Is to see what it's capable of. And this is so great,
because you know, we're getting some
skylight from above. And I come in here
under the tree canopies, and I'm getting
appropriately lit. As I can move into a dark areas. I mean, look at that. As I clear the trees and come
out here towards the water, I'm incrementally getting
brighter like I should. It's not so binary right? The shadowing is very
subtle, very dynamic. Oh and you know, this
is running without-- I mean I'm losing,
what did we determine? I'm losing 20 frames a
second or something because of the real time streaming. But yeah, if I turn on-- Oh I can't. OK. Yeah, I don't have
it on this one. But yeah if I turned on DLSS,
it would double my frame rate on these scenes. I mean, this is really pushing
it-- it's way too much foliage. >>Victor: Yeah. Yeah, the shader
complexity debug view would probably show us a lot
of red and white right here. >>Richard: Yeah it's
a pure red mess. >>Victor: Yeah. >>Richard: But, you know
you've got to stress test it, right? And we're trying to
push graphics forward. So that's all part of it. >>Victor: That's super exciting. >>Richard: Yeah. Here's some, I mean,
more mesh lights. And I left these visible
so that people could see. But you know, they
just basically match my sky color, more or less. But you can use them. I just want a little
bit of extra GI light being generated in one
area as opposed to another. You know, and you don't
have to do it that way. You could put in lights,
spotlights or whatever. You know you could
do it the old way. But you get these
for free, so why not? >>Victor: Right, and so when
you talk about an open world, for free is a lot better
than an incremental cost for every sort of visual
tweak you want to add. >>Richard: Right, there's a
whole discussion topic, here. Because if you're a
lighting artist, or just an environment
artist in general, you're often looking
at hotspots, right? I'm using too many shadow
casting lights over here, and maybe very few over here. So over here I'm
getting 70 frames per second, and
this one area you're getting 20 frames a second. RTXGI kind of flattens that out. Right? So if I'm putting in
shadow casting mesh lights wherever I need
them, and I'm not putting in tons of rect
lights and spotlights that are shadow casting
all over the place, I don't have to worry
about managing those lights or turning them off by distance. I don't have to try to optimize
the scene in any special way. I just get that flat
2 millisecond cost, and I get all the
lighting I need. And you know, you
still might want-- certain areas, you might say,
OK, I need a rect light here. But I would bet
that in most scenes, you can get away with
a lot less lighting than you used to in the past. And this scene here, I think,
is a good example of that. Because this lighting that's
coming in is pure GI lighting. So you know, if I turn that
off, it's very flat right there. It's reading the
normal map, it's doing everything you'd expect it
to do, just like a real light. And it's occluded so light's
pouring in down from below, but then it's not
coming right here. So this is like getting a
shadow casting light for free. Very powerful stuff. I probably can't
emphasize that enough. >>Victor: There was a
question about whether the shadows in this demo are ray traced. All of them, or specifically
for the foliage. >>Richard: Yes. Yes, this is all ray tracing. This is very much a
stress test scene. It's a fully dynamic
scene using, you know, basically modern assets. Right? These are-- they're
fairly high poly. I think they were meant-- like I said, they were
meant for the Kite demo. So it was meant for a
cinematic from four years ago. That was the intention
behind these assets. But you know, they're being
used here in real time, and ray traced. I'm not lowering the resolution
of any of the assets, or changing what
the material is. Like these are two-sided
foliage materials. It's using all the original
material attributes for the cinematic. But, you know, being
done here ray traced, with GI on. I can't emphasize that
any more than I already am. It's just very
much a stress test of what ray tracing can do. Especially,
you know, I think it was said a couple of years
ago by your lead engineer that doing a forest or a jungle
is going to be the worst case scenario for ray tracing,
because of all the ray traced translucency that
needs to happen, all the rays being cast against
the edges of leaves and stuff like that. But it is 100% handling it here. >>Victor: Yeah, get some
animals in there. >>Richard: Yeah. Yeah, exactly. I do want to talk
about caustics. >>Victor: Yes. I think a lot of people have
been waiting for us to actually get into the caustics. And I did intentionally
use it as the image because I find this to be
absolutely mesmerizing. >>Richard: Yeah. I'm right there with you. I don't know what to say
about caustics half the time because it blows my mind. OK, so you can see my screen OK? OK, cool. >>Victor: You want to
go full screen for us? >>Richard: Oh, sure. So, yeah. This is a different
branch that we're in now. Yeah, I did have-- you said you had
people on who were running five copies of Unreal
at once, and stuff like that. I was only running
two copies of Unreal. But so yeah. This is-- I mean, I'll just
start moving stuff around. These are actually some
improvements over-- if anybody had seen the previous
video that we've shown on this, there were-- I mean, this is
very beta software. It's still in beta. So if you see imperfect
things, don't be surprised. But it was actually-- some people pointed out it was
doing the wrong things before. Like, I think the red
blue rainbow effect was flipped the wrong way. Stuff like that. And this is supposedly-- I think this is the
correct and latest code. But it's more accurate, and
producing smoother images than before. What you're seeing
here is just the-- I've talked about this
before-- but it's just attempting to reproduce
Newton's prism test. Where he's like, oh, if I just
shoot some light at a prism-- which is right here-- that light should-- you know it
should refract and break apart into its rainbow. And then if I
create a lens, which is what this is here, and put it
somewhere to catch the rainbow, literally, it would reassemble
that back into white light. Which is basically
what it's doing right here. So this is simulating in
real time the very physically accurate, physically
based caustic effect.
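The physics being reproduced is ordinary dispersion: the index of refraction varies with wavelength, so Snell's law bends each color differently. A back-of-the-envelope sketch using Cauchy's approximation--textbook coefficients for BK7-like glass, nothing taken from NVIDIA's code:

```cpp
// Dispersion: Snell's law with a wavelength-dependent refraction index,
// n(lambda) = A + B / lambda^2 (Cauchy's approximation). Coefficients
// are textbook values for BK7-like glass, not NVIDIA's actual code.
#include <cmath>
#include <cstdio>

double RefractedAngleDeg(double IncidentDeg, double LambdaMicrons)
{
    const double kPi = 3.14159265358979323846;
    const double n = 1.5046 + 0.00420 / (LambdaMicrons * LambdaMicrons);
    const double ThetaIn = IncidentDeg * kPi / 180.0;
    return std::asin(std::sin(ThetaIn) / n) * 180.0 / kPi;  // air -> glass
}

int main()
{
    // Blue bends more than red, which is why the prism fans the beam out.
    std::printf("red  (0.70um): %.2f deg\n", RefractedAngleDeg(45.0, 0.70));
    std::printf("blue (0.45um): %.2f deg\n", RefractedAngleDeg(45.0, 0.45));
    return 0;
}
```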
And these Actors here are very simple. I can show you that there's
almost nothing to it. It's just the geometry. The prism geometry. Which has just got a little
bit of edge polygons on it, but that's even
completely unnecessary. It could be a very
low poly object. And then the material,
which has got some-- you know, it's a
very simple material. Anybody who's a material artist could just do this in
their sleep, right? And this is-- you
know, it's just got basic attributes for
simulating physical behavior. For example, if I look
here, you see how-- you know so here's
my light source. Right? And it's shooting--
this is a rect light. Very narrowly focused. And it's shooting
a white hot beam. But when it hits
the surface here, it's like it's bouncing
light back out. You can see that? Yeah. I mean, that's
sort of refraction. The fact that not
all light is absorbed physically into the object,
and is then sent out in other ways,
or other directions. That refraction is partly
based on its metalness. So as I ramp that down,
the surface gets-- I make it 0 on metallic-- and you know, it loses
that reflective behavior. But that completely makes
sense, because if it's a mirror, light will go in and
then bounce back out. One of the first things I wanted
to do with ray tracing, when I first thought about it, was like, oh, I
want a mirror that you can hold and I can shoot-- like if a beam of
light hits the mirror, I want light to come back out. And ray tracing
today can't do that. But with caustics, we can. You get that full sort
of physical behavior where it's literally
simulating virtual photons, and just doing all these
incredible dynamics with it. And it's very tweakable. So I don't know. If anybody has any
questions at this point, I should probably stop
talking for a second. >>Victor: There are a lot of wows. I'm still also just looking,
being a little bit mesmerized by it. Let me see. >>Richard: You see? I mean, look at that. The way the light
bounces back out. And this is also using-- these are like shadow
casting lights that also are generating caustics. Just doing that to give it
a little bit of extra oomph. You know? Lots of artistic choices. >>Victor: We do like extra oomph. >>Richard: I'm one of
those hack artists that likes to exaggerate things
so that people can see them. You know, if you want
more subtle effects, you can do that. That's totally possible. >>Victor: You didn't actually
show me moving that point light. And that is-- >>Richard: Isn't that crazy? >>Victor: Yeah. >>Richard: And it's very tunable. You have a lot of controls here. Whether or not a light
is caustic light or not, you have that. That's on the light
source itself. So you can say, I
want this light source to cast mesh caustics or not. So it'll stop generating them. Or I want it to cast
water caustics, too. Those are separate
concepts for us right now. Water caustics as
opposed to mesh caustics. So we know what that result,
what we want that to look like. Mesh caustics is
any arbitrary mesh. And a glass object, stained
glass, whatever, and how light transmits through
that physical shape.
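The per-light switches Richard describes live on the light source in NVIDIA's caustics branch; the branch's real property names weren't shown on stream, so the flags below are hypothetical placeholders for the idea, not the actual API:

```cpp
// Hypothetical sketch of the per-light caustic switches described here.
// bCastMeshCaustics / bCastWaterCaustics are illustrative placeholder
// names; the real properties in NVIDIA's branch were not shown.
struct FCausticLightSettings
{
    bool bCastMeshCaustics  = true;   // refraction through arbitrary meshes
    bool bCastWaterCaustics = false;  // the specialized water-surface path
};

void ConfigurePrismLight(FCausticLightSettings& Light)
{
    Light.bCastMeshCaustics  = true;   // prisms, lenses, stained glass
    Light.bCastWaterCaustics = false;  // no water in the prism scene
}
```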
>>Victor: There was just-- >>Richard: Yeah, go ahead. >>Victor: Oh, I'm not going to do this now. We had a question in
regard to whether this will make it possible to
build physically accurate optics like lenses. And you did talk
about this earlier, sort of, with the example
of a magnifying glass. >>Richard: This is a
lens, right here. That is a lens. Look at what it's doing. If I don't want it
to be so bouncy, I can change those attributes. It's almost like you get extra
light out of a light, right? It's making extra light
all over the place. It's pretty wild. I mean, this is a neat sort
of tech demo, but I mean, we get to-- I'll show one other scene here. Just because-- No, don't save. You know, more
stress tests, right? >>Victor: You're gonna have to
go full screen for this one. >>Richard: Oh, sure. It's so pretty. And so people can feel
assured about the performance, again, I'm on a Ti card. I'm not on-- I mean, it's a
very powerful card. It's a beast. But you know, it's not even
the fastest card anymore. >>Victor: And we're
also running Discord. >>Richard: Yes, I'm losing
frame rate from that. A significant amount. And if I had DLSS on, normally
for me this is around 90 frames a second without it really
dropping much of anything. So it's in that 80 to 90
range on this hardware.
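For anyone who wants to reproduce the comparison, NVIDIA's UE4 DLSS plugin is driven by console variables; a minimal sketch, with the caveat that the cvar name below is what the plugin exposed around this time and should be checked against your plugin version's documentation:

```cpp
// Toggle DLSS at runtime through its console variable. The cvar name
// is from NVIDIA's UE4 DLSS plugin of this era; treat it as an
// assumption and verify it against your plugin version.
#include "HAL/IConsoleManager.h"

void SetDLSSEnabled(bool bEnable)
{
    if (IConsoleVariable* CVar =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.NGX.DLSS.Enable")))
    {
        CVar->Set(bEnable ? 1 : 0);
    }
}
```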
>>Victor: Wow, there's one spot right there with this crazy blue. Little bit farther. That's beautiful. I don't know, you hit it. Yeah, there. >>Richard: Oh my
God, what is that? >>Victor: I don't know,
but it looks awesome. >>Richard: Yeah, it will do
things you don't expect. Yeah, disco balls are possible. Magnifying glasses,
or a magnifying lens. Yeah, the capabilities are-- I almost feel like it's
unknowable right now. Because it's so powerful. This fundamentally changes
what we do with light, and the way you use a
light source in Unreal 4. So that's, almost, I don't know. It's very hard to grasp
what that is, exactly. I do want to show-- let's see. Yeah. No, I don't want to save. Anyway, we'll switch over. >>Victor: Yeah. You mentioned before
our call that what's missing to make that
demo really cool would be a light that
is actually a laser. >>Richard: Yes. That's an interesting
thing, because what we have in Unreal right
now for light sources, you've got point lights,
spotlights, rect lights. We don't really
have a light that we can put in the scene that is-- that doesn't have a
width to it, right? There's no precise
laser point light where all the virtual photons
coming out of the light source are traveling in
the same direction. It's got an arc, a little
bit of a width to it. So even with a rect
light there, I really had to struggle to get
it very narrowly focused. And you can see, it's still got
a little bit of a width to it. But you know, the
caustics is almost begging for a new light source,
like a laser pointer type. Where you can-- I had to sort
of hack it a little bit, just with a rect light. But yeah, that'll be
really interesting. If someone can
come up with that, I'd be really appreciative. This one here, I'll just-- I'll play it. We'll let it speak for itself. So this is water caustics. >>Victor: And a nice little custom
wading animation on Manny. >>Richard: Yeah the artist who
put this together, I think he-- there's some free water assets. I think anybody can
get these water assets, and the way the character
moves through it. So he just grabbed that stuff. This is the seaside
town, I think those are free assets off the store. Oh, and so everybody knows,
like I was saying before, what you're looking at-- RTXGI is shipping. It's 1.0. A lot of the kinks
are worked out. It's very solid. Caustics is still being
worked on, it's in beta. So if you see any
little issues, just keep in mind,
that's a beta thing. And I'll just point it out. I don't like pointing
out mistakes, but you can see one right here. Where the character has
got a little bit of-- I mean, he's ghosting
in the water? That's just, you know, that's
something that'll get resolved. There's a duplicate of him. But here you can really
see the caustics working. If I jump in the
water, you know, all that light deformation? That's water caustics. I put my camera under the water. >>Victor: And there
are a lot of things here that you could tweak to
make it look more realistic. This is just a demo of
the caustics, right? >>Richard: Yeah, very
rapidly put together. Like I said, this is
the seaside town assets. Anybody could grab that. I think he dropped
the water in, and we just kind of fiddled with it. You know, I'm riffing on
it, he's riffing on it. We're just kind of messing
with what can be done here. But all this lighting here,
that's all caustic effects. And you have a lot of control. How intense do you want it? How far do you
want it to project? What's the resolution factor? So those are all
tunable controls. And this is a scene
running without GI. So I'm trying to show stuff
a little bit in isolation. And so on. So you can just get a look at it. Here's our ray traced scene,
fully dynamic ray traced scene. But it's got caustic
effects on it, and dynamics. You see there. This is one of the great
side benefits of caustics, and something
that's really sought after by a lot of artists. Which is-- well, you
can see it right there. Translucent shadows. Translucent, colored
shadows, for the first time in ray tracing. >>Victor: And no special trick to
achieve each shadow and color? >>Richard: Yeah, I
mean if I put a-- I don't know if I can nudge it. Probably creating
some purple there, I just can't get
close enough to it. With the red one
over the blue one. It's almost like
painting with light. I feel like I don't know what
is possible with technology like this. But here's sort of a stained
glass window prototype. You know, I can see the
character absorbing light differently depending
on where he's standing. And over here with these. Look at the way the
light is gathering in the spheres at the base. It's-- if I get
the camera down-- it's doing like a
magnifying glass. >>Victor: I don't think I've
seen such interesting spheres rolling around in
Unreal prior to this. >>Richard: Right. Some dynamics. Physics are always fun, right? >>Victor: Yes. Yes, they are. >>Richard: Just automatically fun. >>Victor: Yeah
[INTERPOSING VOICES] >>Victor: --area, see,
like at a church window. You know, that kind of tiled
colors, and the shadow of that. >>Richard: Exactly. Well, yeah. You know, I had a
friend ask me when I was getting into ray tracing. And he's like, can it do
stained glass windows? Because I think like every 3D
artist, they want to make that. They want to make that really
cool Diablo type scene, or whatever. Where it's like, the
old broken down church with broken stained
glass windows, and light pouring through it in
a really creepy, awesome way. You can do that with caustics. There seem to be so many side
benefits to this rendering technology. You get translucent
shadows as part of it. You get water effects. Caustics-- I don't
remember if I've said it-- but caustics reads both animated
mesh and animated normal maps in order to determine what
the caustic effect is. So you animated a normal map
on this, you know, it would-- here, I think this movement is
happening because the water, actually, it's a deformed
mesh that's animating. So that's why it
looks like that. So all the artist who put
the scene together had to do was drop in a mesh with
the appropriate animated deformation on it. The caustic effects you get
as part of the technology. Yeah, the transmission through
the water, the refraction back out of the water
up against surfaces. You can see it's, you know,
under this thing, here. >>Victor: Yeah. I feel like me and Andreas,
he was on the stream a couple of weeks ago, we need
to revisit his little sailing game prototype, so that we can
get some of these effects added to it. >>Richard: Yeah. Yeah. So I mean, yeah. I think it's been said,
we're going to do caustics as its own special branch. It's going to be a side branch. Not-- at least in
the initial release. And it will be out soon. So anybody can get it, and
anybody can mess with it. We're doing that because the
change to the material system and translucency
and reflection-- just basically how deeply
involved all those changes are-- it's a big code change,
in order to do this. No surprise, right? Caustics might-- you know, this
might be a generational leap in software. In order to do these effects,
it is a large change to Unreal. Well, what we're
trying to figure out is more efficient
ways to do that. And I think we'll get there,
but rather than sit on the code and not release it, we're
going to put it in a branch that anybody can get. And you can make a game with it. You can pull down that
code, compile a binary, and ship a game with that code. You can pull down
the GI plugin, and have caustics and GI. So you can do all
that if you can't wait for it to be more widely
released, or for us to work out, you know, wider distribution. If you want to jump
in early, we'll give you a way to do that. >>Victor: Let's see,
there's a question here that I'm not familiar
with the terminology, but perhaps you-- Would the caustics be able to
produce an interference pattern through a double slit? >>Richard: That's a good question. I don't know. That's a test I'd love to do. Thank you, whoever
asked the question. I'll tell you
what, I'll test it. We'll see what happens.
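For context on that question: double-slit interference is a wave effect, and a virtual-photon simulation like this one is geometric optics, so fringes would not be expected unless phase is modeled explicitly. The textbook far-field pattern such a test would need to reproduce is standard physics, not NVIDIA's code:

```cpp
// Far-field double-slit intensity, ignoring the single-slit envelope:
// I(theta) = I0 * cos^2(pi * d * sin(theta) / lambda).
// Geometric ray tracing carries no phase term, so it cannot produce
// these fringes without explicitly modeling wave behavior.
#include <cmath>

double DoubleSlitIntensity(double ThetaRad, double SlitSeparation, double Wavelength)
{
    const double kPi = 3.14159265358979323846;
    const double Phase = kPi * SlitSeparation * std::sin(ThetaRad) / Wavelength;
    const double CosPhase = std::cos(Phase);
    return CosPhase * CosPhase;  // normalized so the central peak is 1
}
```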
>>Victor: There was also someone asking about GI bounces, and maybe we should clarify that
there is no GI in this scene, right? >>Richard: Right, no
GI in the scene. This is just caustics. >>Victor: Here's another question. There seems to be some flicker
in the split light of caustics. Where's that coming from? >>Richard: The split light? >>Victor: Yeah. It could also just be
the screen compression. And there are a lot of-- I'm even seeing a couple
of artifacts on my end, before I'm streaming
it out to Twitch, so. >>Richard: Sure. Well, I mean, I can talk about
just how caustics kind of does its thing. You know, it's another
ray tracing effect. But in order-- I mean I
think I can say this-- how it actually works,
at least at a high level. And keep in mind, I'm not a
coder, so I'll do my best. But it's using ray tracing to
determine what the light should be, what the shadows should be,
to do all those calculations. But the actual
projection into the world is a volumetric screen
space effect, right? I mean, everything
about ray tracing, everything about Unreal
right now in ray tracing, is kind of a hybrid model. That hasn't changed yet. Ray tracing in
Unreal is-- you know, we use real ray tracing
for certain things. Like shadows, reflections. And we use rasterization
for other things. Like post-processing
effects, bloom. Because you don't need to-- it doesn't make sense to
ray trace those right now. The rasterization is
good for certain things, and ray tracing is
good for other stuff. And it all just works
together, right? So that gives us the capability
to do things like that. Where you know,
we're ray tracing how light should behave. But then, that ray
traced result is getting projected into
the scene like a three dimensional texture.
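That hybrid split is visible in stock UE4, where individual ray traced effects toggle independently of the rasterized ones; a small sketch with two standard cvars (the caustics projection itself lives in NVIDIA's branch and has its own controls, not shown here):

```cpp
// Unreal's hybrid model in miniature: ray traced effects are switched
// individually, while post-processing such as bloom stays rasterized.
// Both cvars below are stock UE4 ray tracing toggles.
#include "HAL/IConsoleManager.h"

static void SetCVarInt(const TCHAR* Name, int32 Value)
{
    if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(Name))
    {
        CVar->Set(Value);
    }
}

void ConfigureHybridRendering()
{
    SetCVarInt(TEXT("r.RayTracing.Shadows"), 1);      // traced
    SetCVarInt(TEXT("r.RayTracing.Reflections"), 1);  // traced
}
```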
So I mean, that means that there will be circumstances where-- like if you
get really close to it, if you got really
close to a surface, you see it just disappears. Because it's got screen
space aspects to it. But you know, you'd have
to get pretty close. Or be really crazy
on your scale. And there are controls for that. You can-- it is tunable, as
far as, you know, where it draws and what the resolution
is, and things like that. So there's lots of ways for
you to fine tune those results. And I'll be completely
honest with everybody, I'm [AUDIO OUT] caustics. I'm learning about this
just like everybody else is. You know, I'm getting a
chance to use it early, but I think there are
people in NVIDIA that know more about this than I do. >>Victor: Yeah. When we started talking
about doing the livestream, you just briefly
mentioned caustics. And we weren't entirely sure if
it would even be ready in time to show off at the stream. And then a week or two ago,
you were like, yep here. Check this out, got the demo. And here we are, showing it off. >>Richard: Yeah, this is
awesome stuff, right? I mean, it's really incredible,
what it adds to the scene. Yeah I thought--
gosh, I don't know. >>Victor: Another
question came up. Can it cast shadows
from reflected-- can it cast shadows from
reflected by water light? I think they mean if
it can cast shadows from the water caustics. >>Richard: Mm-hmm, I
actually don't know. I'm not 100% sure about that. I mean, there's the virtual
photons that get re-projected into the scene. They can be occluded. I know that, and they
can be colorized. OK, so these right
here, this might be-- I don't know if this
will answer the question, exactly-- but these are
translucent colored objects, right? And you see, there's
a colored translucency coming through them. And its colors are
doing what they do. The ones over here
are not transparent. These are just pure reflective. Solid, right? So this is generating light. That looks like it's
casting a shadow. There is green light,
or I mean the sunlight, bouncing off the surface. And you know, he's occluding it. That would technically
be a shadow. So I don't know if water
will do that, but at least the mesh caustics do. That, there's some
physics for you. Like I said, this is
not a fine-tuned scene. So, just so everybody knows, it's not tuned for
every consideration, it's just a quick example. >>Victor: Early preview. Interesting question
here, can you change the speed of each ray? Like how light goes
slower in water than air? >>Richard: That's an
interesting question. Not that I'm aware of. I'm not even sure if anybody's
considered that one, yet. But that's a really
interesting question.
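One note for that questioner: in geometric optics, the slower speed in a medium is exactly what the index of refraction encodes (n = c/v, about 1.33 for water), so a renderer that refracts by n already accounts for the speed change in the ray's direction; what goes unmodeled is the travel-time delay itself. A one-liner's worth of arithmetic:

```cpp
// n = c / v: "light goes slower in water" is already baked into the
// refraction index a ray tracer bends rays with; only the travel-time
// delay goes unmodeled.
#include <cstdio>

int main()
{
    const double c = 299792458.0;  // speed of light in vacuum, m/s
    const double nWater = 1.33;    // index of refraction of water
    std::printf("v in water = %.3e m/s\n", c / nWater);  // ~2.25e8 m/s
    return 0;
}
```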
I did see someone ask a question before about simulating black holes. Yeah, the answer
to that one is no. Because that would
mean that we would need to be modifying photonic
behavior based on gravity, which we're not doing. But you know-- and then
fully simulating that, right? So light is getting
bent according to the mass of objects. That's an incredible thought,
and it's probably possible. I don't know, maybe? But it's you know, yeah. There's lots of possible
real physical simulations. Can we do them in real
time at 90 frames a second? That's a good question. >>Victor: Will the caustics be
cast from translucent foliage? I don't see why not. >>Richard: Yeah, it should--
any translucent material. Could be foliage or anything. A Foliage Actor is just a form
of an instanced static mesh. It's batched a little
bit differently. But it's just
another mesh, really.
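For readers unfamiliar with the term: an instanced static mesh batches many copies of one mesh into a single component, which is what foliage compiles down to. A minimal sketch of the stock UE4 API, just to make the "just another mesh" point concrete:

```cpp
// Foliage under the hood: many instances of one static mesh batched in
// a single component, each instance just a transform.
#include "Components/InstancedStaticMeshComponent.h"

void ScatterRow(UInstancedStaticMeshComponent* Foliage, int32 Count)
{
    for (int32 i = 0; i < Count; ++i)
    {
        // A simple row; real foliage painting scatters transforms instead.
        Foliage->AddInstance(FTransform(FVector(i * 200.0f, 0.0f, 0.0f)));
    }
}
```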
>>Victor: I just caught-- or learned, yesterday, how you can replace individual
instances of the foliage with an actual mesh. Just using a Foliage
Instance Class Actor. Thanks, Brad. Once again, maybe some folks
joined a little bit later in the stream. A question came up in
regard to whether the RTXGI UE4 plugin will become available
on the Marketplace. In which way it
will be available, I don't think we
are entirely sure. But it will be available
for public download. >>Richard: Yeah. Right now today, it's
available to anybody through our development program. But getting on the development
program is pretty easy. You just go there, you sign up. It's not a painful process. I've done it myself. And then you have
access to the source, and you can download the plugin. That's how we're doing it today. I can't talk about what
our future plans are. I can say, though, that-- I mean we would like to be as-- we would like to go as wide
with the software as possible. We want to get it into
every developer's hands. So this is something
that we're constantly looking at and working on. >>Victor: They were curious. I don't know if there was
anything else in this scene, but they were curious if it was
possible for you to drop a-- let's see, what was
the question?-- volumetric fog in the
demo with the prisms? >>Richard: Oh, I'm not
sure if this branch-- I don't think it
has volumetric fog. >>Victor: It should if it's
anywhere recent in 4.x: you add a height fog, and then you turn on the
volumetric fog option.
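For the stock-engine path Victor is describing, volumetric fog is a setting on the Exponential Height Fog component (available since UE 4.16); a minimal sketch:

```cpp
// Stock UE4 volumetric fog: a toggle on the Exponential Height Fog
// component, equivalent to ticking "Volumetric Fog" in the Details
// panel. As Richard notes next, it doesn't combine with this ray
// traced caustics branch yet.
#include "Components/ExponentialHeightFogComponent.h"

void EnableVolumetricFog(UExponentialHeightFogComponent* Fog)
{
    Fog->SetVolumetricFog(true);
}
```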
>>Richard: Well, yeah, but in the mainline version, that does not currently
work with ray tracing. And this is a ray traced scene. So I'd need to make some
changes in order to do that. >>Victor: They just want
some more eye candy. Me, too. >>Richard: I imagine
it would look good, but that's a good
point to bring up. I mean, this is why this sort
of thing is important, right? Because that person is asking
the right kind of question. Which is, does your software
from one department of NVIDIA work with the other
department of NVIDIA? It should. We should all-- all the
software should work together, and create a seamless total
result that gets us closer to-- well, you know, closer
to the Matrix, I suppose. >>Victor: I guess so, yeah. Because you don't want
to make decisions where, oh, I want this cool feature. But then I can't use
this other one, right? As an artist you want to
have the power of all of it, and be able to combine the power
to make your ultimate creation. >>Richard: And if you're on
our development program, you have access to RTXGI, DLSS. You want to make sure
everything works together, and it produces a
very good result. So any area where maybe
something doesn't quite blend right with
something else, those are weaknesses that
we would be fixing. I'd say they're a priority. They're very important for us. >>Victor: All right. I think we've gone through a
good amount of the questions. Oh, this is a good question. The foliage in the
forest demo didn't seem like it was using
world position offset. Are there any updates on
when this will be supported? >>Richard: Not a specific update. That's correct, world position
offset is turned off there. Ray traced shadows and foliage
currently don't support that. But this is on our radar. And we're going to fix it,
and we're going to do it soon. >>Victor: I know it's
being worked on. It's not something
that's impossible. There are just some interesting
problems to solve, is basically what I would say. >>Richard: Yeah, I
mean, because ray tracing changed how
graphics are processed. And that-- literally,
every mesh in the scene has to be calculated
simultaneously in certain respects, where
it didn't have to be before. Because with rasterization, you're
only worried about what's in your frame. Ray tracing has to consider
everything behind the camera. Because it might be reflecting. So because of that sort
of fundamental change in how meshes are grouped
and loaded into memory and processed, that meant
new problems to solve. But yeah, this is a known issue. And it's going to get fixed. >>Victor: Awesome. Was there anything
else in particular you wanted to
demonstrate for us today? >>Richard: Oh boy. No, I think I've shown
everything I have. But I really appreciate
you hosting this. And this is really cool. >>Victor: It is really cool. >>Richard: Having a chance to show
everybody what we're up to. >>Victor: We can just sit and
keep watching for a bit. I think some of
people were like, can you please show
the prism scene, again? >>Richard: I can load it up, sure. >>Victor: Yeah. We have another good
10 minutes, here. >>Richard: OK. Yeah, I'm not sure how
we're doing on time. >>Victor: Oh, we're
doing good, actually. When we start hitting
the 2 1/2 hour mark, that's when I'm
like-- >>Richard: Oh, what did I just do? Hold on. Oh, I see. OK. Yeah, prism stuff is fun. In fact, I think you might
notice that the effect is kind of weak right here. But that's just a tuning
control that you have. There's lots of resolution
and intensity adjustments. You can make your
light brighter, and get more effect out of it. Or you can just pump up the
intensity of the effect. >>Victor: Oh, they're
saying the other map. >>Richard: Oh, the one
with all of them? >>Victor: Yeah. >>Richard: Yeah, there's Newton's
Law 1 and Newton's Law 2. I'm going to start
making sound effects. I probably shouldn't do that. I wonder if anybody else
is doing that right now. I know it's a lot of fun. It's just incredible. >>Victor: Yeah, just--
and the colors, and how dynamic they are. Just even trying to fake that,
you know, with any method, it'd be difficult to get
that sort of quality. >>Richard: Yeah, this
is literally where-- OK, so when I talk about
like generational ray tracing technology? That first
generation, first pass at ray tracing shadows
and reflections? That was really just, let's
just make ray tracing. The initial effects were
kind of better than raster, but not fundamentally
different, or something you couldn't do before. Caustics is probably something
you couldn't do before. Now, we're getting
into new territory. With lighting and
shadow and refraction, where it's very photo
real and doing things that you probably just can't
do with rasterization. This is where the technology
really starts to open up. >>Victor: Yeah, we
can just sit here. Just keep watching
you turn that light and keep being amazed
all day, I think. >>Richard: Yeah. I mean, there's a lot I haven't
experimented with here, either. Like what happens if I feed
a different colored light into it? I don't know. I just went straight
for the white light. Because I was like, you know,
let's split it apart, right? Let's turn it into a rainbow. Gosh, I don't know. >>Victor: It looks like it's still
actually splitting the light through that prism
there, to some degree. I'm not familiar enough
with Newton's Law to be able to tell whether it's
physically accurate or not. If you shine a-- in this
case a pink, violet. >>Richard: And those are
really good questions, too, that we might have
to vet at some point. I mean, it's just a fun topic. I mentioned it before, but
it's probably worth repeating. As an expression of
how powerful this is, the coder who wrote the
caustic system didn't know this was possible. He wasn't thinking
about, I'm going to take light and split
it apart with a prism, and then reassemble
it with a lens. He totally wasn't
thinking about that. He was just, what
is the math I need to figure out in order to
simulate photons in real time? And do all that correct
physically accurate behavior? And then it was the
thought that, well, what if we put a prism in there? What if we just simulate
the physical density of that prism with a refraction
factor, and it's translucent? And you put a light
on it, what happens? Is it going to behave like
an object in the real world? And this is the result. He
was as surprised as anybody that it actually worked.
happens in science, right? >>Richard: Yeah, but see,
what's important about that is that really reinforces
that he's got a good system. Or that this system is probably
the right one, or at least really close. If it's not exactly right,
it's in the ballpark. And, you know, talking
to programmers, they'll be the first
ones to tell you that what's being done
in ray tracing is not-- you know, this is not-- we're not at the
Matrix yet, right? We're not simulating
atoms and particles, but we're getting closer. You know? And any place where it
needs a little more work, those are areas of research
to figure out and take to the next level. So I mean, there's a lot of
work being done in that area. You know, some of the
results are just crazy. Like, what is that about? Somebody needs to get 40 prisms,
and a really hot light source. >>Victor: That was
my exact thought. We need to have a real
demo, and compare that. And see what-- what
actually happens here? Because I don't think I've
ever seen this experiment done in real life. And so, I can't tell whether
that's accurate or not. It's beautiful,
but is it accurate? I don't know. >>Richard: Yeah. You know this is a test
scene with just one light. I mean, I could just put
a really large attenuation radius on it, so it's just
infinitely casting, basically. And then you can
make it unrealistic, in the sense that it
won't cast shadows. So it'll just go right through.
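Both tweaks map onto stock UE4 light APIs: a huge attenuation radius so falloff effectively never ends, and shadow casting switched off so the beam passes through occluders. A minimal sketch:

```cpp
// The two "make it unrealistic" tweaks, using stock UE4 light APIs:
// an enormous attenuation radius and no shadow casting, so the beam
// effectively never falls off and passes straight through occluders.
#include "Components/PointLightComponent.h"

void MakeEffectivelyInfiniteLight(UPointLightComponent* Light)
{
    Light->SetAttenuationRadius(1000000.0f);  // 10 km in Unreal units
    Light->SetCastShadows(false);             // unrealistic, but sometimes useful
}
```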
Which is maybe a behavior that you want, maybe not. But I mean, this sort of-- I don't know how
realistic you want it to be versus how
simulated, or whatever. I mean you have those options. You can tune it. I think there's a lot
of potential, here. I mean, just for
academic purposes. Because you can make it very
simulated, very close to real, probably. Yeah, and then just test
these things virtually. I mean, without having-- just construct whatever
you want virtually. And then trying out
different physical results is very fascinating. >>Victor: I mean, using
high-powered lasers can be dangerous, right? Especially when
you are pointing it at prisms placed in
a random location. You never know where that
bounce is going to go, right? So. That's amazing. Richard, thank you
so much for preparing this little presentation for us. Or not so little-- I'd say
there's quite a lot to it. It's been amazing to see some
of the advancements that are being made. And having you talk
about and show us, as well, a little
bit, what you're able to do as an artist
using these tools. And sort of the
philosophy that you apply when you're one of the
first people in the world to get to play with
this kind of tech. >>Richard: Yeah, thank you. I appreciate that. It's a lot of fun. I mean, really I'm
just trying to-- for my part, I'm trying
to help NVIDIA figure out the technology right now. Because there are a lot of
simulated systems here. But you've got to put it
together with actual content. And does it really work? Can it work for game developers? Is it a real tangible thing? Everything I'm showing
today is a yes. Yes, yes, yes. You can do all that. I just can't wait to see
what people do with this. I know that's a little
cliche, but it's an unusually powerful technology
that we're heading into. >>Victor: All right, I
think with that said, I'm going to do my short
little outro spiel. Thanks, everyone,
for watching today. If you've been sticking around
with us from the beginning, it's awesome to
have you with us. If you are curious about what's
happening in the Unreal Engine world, make sure you follow us
on Twitter, Facebook, LinkedIn. And if you are new to
the world of Unreal Engine, and you are
curious where you can learn all these things-- I mean, we've
definitely mentioned a couple of terms
today that, if you are new to the world of game
development or real time graphics-- a good place to go is
learn.unrealengine.com, and you'll find our entire
library of courses related to Unreal Engine. And there's a lot
of good content there, so go check it
out if you're learning. Even though the world is in
a very different state today, there are still virtual
Meetups occurring. We do have over 100 physical
Meetup groups in the world. You can go to
communities.unrealengine.com to find where they are. Even though they're not doing
any in-person Meetups right now-- and we don't
recommend that they do-- some of them are still
throwing virtual Meetups. And it's a chance for
you to get to know people in your area,
or just join a Meetup in another part of the world. I don't think anyone would
not want to have you there. So go check the page out. If you are interested
in forming or hosting a Meetup in your city,
there's also a little "become a
leader" form there you can click. And then fill that out,
get in touch with us, and we will give
you the information that's required for you to
spin up your own Meetup group. We are still looking
for countdown videos for the livestreams. So if you've been in
from the beginning, we do a five minute-- it's
a sped-up version of about 30 minutes of development. Send that to us. So, 30 minutes of development
sped up to 5 minutes. Go ahead and send that
to us, and then you might become one of our
spotlight countdown videos. Don't worry about the music, we
take care of the music. Licensing rights
and all of that, can get a little tricky sometimes. We're always, every week,
doing our community spotlights. There are plenty of places we look,
but a really good place is the forums--the Released
and the Work in Progress threads. As well as the
unofficial Discord channel, Unreal Slackers,
which is unrealslackers.org. Great place to talk
about Unreal Engine development and
everything related to it. Like I said, make sure you
follow us on social media. That's where we announce all the
streams, as well as in the Events section on the forums. That's the first place
where you'll see which stream will come next in
a couple of weeks. I have a super old
note here: next week, it's animating with the
Control Rig sample project. That is not true.
have New World Interactive coming on the stream to talk
about Insurgency: Sandstorm. Now, there is
something else to that. It's not just about the
game, but they haven't done the announcement yet. So I'm going to keep that
a little bit of a secret. New World Interactive
will be on, talking about Insurgency: Sandstorm
and something that's going on in that realm of the world. With that said, special thanks
to NVIDIA for letting Richard come on the stream today, and
Amanda on the sidelines for taking care of all the
questions that came in today. There was definitely
quite a few of them. So I'm glad to have that help. And to all of you out there,
thanks to all of you for watching, and
hanging out with us, and being so
interactive in chat. It's great.
staying safe out there. Richard, anything else you
want to end the stream with? >>Richard: No, thank
you everybody. This is really great. It's fun for me to do. So I really appreciate it. >>Victor: Well, I think
since this is sort of-- you mentioned-- ray tracing
level 2 or the next era, I do believe there will be
more chances for you to come on and show off, continue to
show off the new technologies that NVIDIA is developing.
a constantly moving target. And you know, it's
not sitting still. >>Victor: Super exciting. With that said, once
again, stay safe everyone. We will see you again next
week at the same time. Take care.