AMANDA: Hey, have you heard? We are thrilled to
welcome Capturing Reality to the Epic family! Their software, RealityCapture,
is the state-of-the-art photogrammetric
software for 3D scans, yielding unparalleled accuracy
and mesh quality at speeds many times faster than
competing software. Together we will make their
world-class photogrammetry technology more accessible
and affordable, including direct integration
into Unreal Engine, and bring the real world to
even more digital experiences. With recent refinements
to Control Rig, you can create a range
of expressive animations in Unreal Engine 4.26, no
third-party apps required! See how these improvements
make crafting complex rigging solutions easier and start
bringing robots to life yourself--Watch the Using
Control Rig for Unreal Engine presentation on our
YouTube channel. And if you're still working on
those New Year's resolutions, march over to Unreal
Online Learning where we've added new free courses. Find out how to
prepare your 3D models for import, how you can use
Blueprints to prototype games, plus more! French fashion label Balenciaga
recently teamed up with Dimension Studio and Streamline
Media Group to create Afterworld: The Age of
Tomorrow--a cloud-based real-time experience designed
to showcase their Fall 2021 collection. Pop over to the
Unreal Engine feed and see why it was a
game-changer for the fashion industry. Beyond what you see,
real-time technology is being used to
enhance what you hear. Forward-thinking organizations,
like global consultancy Cundall, which offers engineering
services for sectors from residential to
healthcare to aviation, and Chalmers University of
Technology's Digital Twin Cities Centre, are
using Unreal Engine to improve urban planning
and the acoustic design of buildings. Find out how on the feed. Once your acoustics
are optimized, become a digital DJ and headline
your own virtual festival in FUSER--the latest
rhythm-sensation from the mix-masters at Harmonix,
the studio behind classics like Rock Band, Amplitude,
and Dance Central. Put your hands in the
air and learn more about how they created their most
robust and customizable music-creation tool yet. Training autonomous
vehicles, robots, and drones requires highly
realistic environments, which in turn require
huge amounts of data. So how do you solve
this conundrum? For Duality Robotics
the answer has come from the world
of animated films. Steer over to the
feed to learn about their innovative solution. If you're looking
for weekend plans, tune in for a live
performance of Dream, a virtual experience
inspired by Shakespeare's A Midsummer Night's Dream,
hosted by the Royal Shakespeare Company. With Puck as your
guide, you're invited to explore the forest from
the canopy of the trees to the roots, meet
the sprites, and take an extraordinary journey into
the eye of a cataclysmic storm. The show runs from March 12 to
March 20--visit dream.online to learn more! Now for our top
weekly karma earners! Many thanks to: Everynone, ClockworkOcean,
chrudimer, nasvic, Hakaishin0895, Shadowriver,
Grim ヤ, leomiranda518, ccddee60, and Unreal-Peter. First up in our
community spotlights is Promethean AI, which is a semantically aware AI created to help you build games and environments more quickly while removing some of the tedious bits. It learns your
assets and materials, then you can simply ask
Promethean to start building for you. Learn more about this Epic MegaGrant recipient and its free public beta at https://www.prometheanai.com/ Get ready to pew pew in
Aurelia: Stellar Arising. In this mobile, 3D turn-based
fleet vs. fleet space combat game, choose
between two factions and fight for control
of the Aurelia System in over 77 missions. Download on Google Play or
the iOS App Store today! Last up is a handy tool from
ryanjon2040, the Unreal Network Profiler. They've created an updated,
easy-to-use interface that's now available to
download from GitHub for free. Check it out! Thanks for watching this week's
New and Community Spotlight. VICTOR: Hey everyone,
and welcome to another episode of
Inside Unreal, a weekly show where we learn, explore, and
celebrate everything Unreal. I'm your host Victor Brodin,
and my guest today is Richard Cowgill, RTX Unreal Engine evangelist from NVIDIA. Welcome back to the show. RICHARD: Oh, thanks. Yeah, it's good
to be here again. I had a fun time last time. So I hope we can cover
all things DLSS related. VICTOR: Yeah. That's what we're doing today. Last time we went through
a couple of topics, but today the plugin DLSS
for Unreal Engine is actually available on the
Marketplace, and we're going to talk a little bit about
how to use it and the future. RICHARD: Yeah. So I should probably start
by explaining what DLSS is for anybody who's new to it. DLSS stands for deep
learning super-sampling and if you break that down, it
does exactly what it describes. So it's using deep
learning algorithms to supersample an
image, upscale it. And so essentially
take a lower res image and make it higher resolution,
but do it intelligently. Not just the traditional
way of upscaling an image and then sharpening it and then
hoping everything looks right. This is using intelligent--
artificial intelligence programming to
enhance the image, bring out details that would
otherwise not be visible, and at the same time give you
a real significant performance boost. Another way to look at it is
it alleviates a lot of load on the GPU. A lot of games, especially
in an advanced engine like Unreal Engine,
a lot of your time is spent trying to optimize
the scene for GPU tasks. Maybe you're doing a lot
of dynamic shadow casting, maybe you're doing a lot of
fill rate intensive effects. So anything where the GPU is
a bottleneck, DLSS can help. VICTOR: And you showed
me some impressive results, so why don't we go ahead
with the presentation. RICHARD: Yeah. So let's look at my screen if
we can switch over to that. VICTOR: Good to go. RICHARD: OK. So I'll show you the-- step 1 is when you-- first off, you just
go get the plugin. We have it up on our website. After you sign or after you
agree to an end user license agreement, you can download it. And DLSS, it works with
Unreal Engine 4.26. So when you get the plugin
you have everything you need. Let me just step through
that a little bit. You'd want to-- now there's
directions for all this but I'll walk you through it. What you want to do after you
get what you need here is-- this is like your plugin folder,
let's say I've got it for 4.26.1. So I would grab all this
content here, copy it, and you need to get to your
plugins directory in 4.26. So I would surf
to this location, go into my engine
plugins, runtime, NVIDIA. If you don't have an NVIDIA folder-- I think you should have one by default, but just in case, make one. And then make a DLSS folder, and just copy this plugin content into this folder here. And that's step 1. That should get you good to go.
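For reference, the destination he's describing would look something like this. The folder path follows his steps above; the specific files inside the DLSS folder are just an assumption here-- they'll be whatever ships in the plugin download for your version:

    [UE 4.26 install]/Engine/Plugins/Runtime/NVIDIA/DLSS/
        DLSS.uplugin
        Binaries/
        Content/
        Source/

VICTOR: Are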
you required to put it in the Engine Directory or
could it be on a per project basis as well? RICHARD: It could also
be in your project folder. I think the reason
why we document it that way, I believe, is
because somewhere, I think, it's been said that or written
that that's technically correct plugin installation. But I think with
plugins you can also make a plugins folder in your
project and put it in there. Either should work. But I'm just wanting-- that's what we document
and that's what-- I think that's official-- the official method. You're not allowed
to redistribute the source is the thing. So you just-- nobody should
be doing that anyway. They're probably not passing
their project around, but as a compiled binary there's
no problem with distributing your project with DLSS. So, yeah. Once you've done that-- let me load up a
basic project here. I've already enabled
DLSS for this, but I'll walk through all
the steps for setting it up. VICTOR: And we are
taking questions for our Q&A we might answer them in
the middle of the stream or at the end during
a Q&A session. So feel free to submit all
your questions in regards to DLSS and NVIDIA. RICHARD: Yeah. So here I've got a
basic test project. I'll explain more
about this later. I've set it up for ray
tracing, and what I've done here, even though this is a very simple project with simple graphics, is I have attempted to bloat this out intentionally so that DLSS has something to do, so that we can get a nice frame rate boost. And it'll just sort of be representative of how you might put a lot of
advanced graphics in your game so that the GPU has
something to chew on. Anyway, once you're all here,
just go to your plugins, and you can search for DLSS. And you would just
enable it, and then you would need to
restart the project. But once you do,
once you've done that and you've restarted--
oh, and I would just note to carefully follow
the PDF instructions as well because there's a lot in
there that's documented. If you run into any trouble-- if you hit any snags whatsoever-- that's all there. Common ones might be
you're not on the latest Windows 10 version, or you need
to update your GeForce drivers. But if you've done all that,
after you activate the plugin and restart the project, if you
go up here into your viewport, you'll see a DLSS settings menu. So it then becomes
active and built in. And that's how you
know it's working because it's right there. So let me set this up. OK. So this is with DLSS off. And if I'm skipping anything,
I mean, please, seriously just stop me and ask questions here. But-- VICTOR: I'll be on it. RICHARD: Sure. OK. So I've got-- yeah it's-- you can see what
the frame rate is. By the way I'm running
on a 2080 Ti right now. So this is-- it's fast hardware,
the 2080 Ti is no slouch. This is a very high end,
very powerful graphics card. But it's not the highest end. Faster cards would give
you better performance. And, like I said, this is a ray
trace scene where I'm doing-- you can see it's got the
nice penumbra shadows the ray tracing gives you. So sharp at the base, gets
softer as it goes out. Same thing over here. I've set the sampling
up on all the lights. So this is a ray trace skylight,
ray traced ambient occlusion, and a ray traced direct light. And all of them have
their sampling set up to either, a value of
eight or in the case of the skylight I set it to 16. So that's why it
looks so smooth. There's very little distortion,
there's almost none. I mean, maybe a little bit if
I turn the camera real fast, and you'll see the
denoiser catch up. But it's looking pretty good.
It's performing decently. Almost 60 frames
per second is OK. I'm running in
1080p on a 2080 Ti. That's all right. But you know we'd
like it to be faster. So that's where DLSS comes in. I go to my menu here, and
I'll send it to quality mode. You can go faster with
the lower quality modes. But if I set it to quality
that's what it does. So you see the frame rate there,
and very arguably image quality has not taken a hit. In some cases, it
might even improve. But here I've taken a ray-traced scene-- like I said, it's bloated,
I'm oversampling. I'm doing too much. It's not very optimized,
I guess, you could say. Even though it's
simple geometrically, in terms of how much ray tracing
activity is going on here, there's a lot happening. Because as you can see
it gets those nice shadow within shadows and soft
shadows where you want them. And it looks very solid. Yeah it's OK. VICTOR: Those
settings for the plugin, you can expose them
to the end user, the player as well, right? And they can adjust them
during run time in the game? RICHARD:
Yeah, absolutely. I can show that real quick. So there's some simple
Blueprinting here, but yeah, you're most-- once the DLSS plugin is active,
you have access to Blueprints to control it. You can still do console
variables those work as well. Excuse me, let me see. So all of your
console variables are going to be under RNGX.dlss. And there you can see the
relevant console commands. And there are some here
that aren't, I think, are not available or
maybe less available than in the Blueprint. But that gets very particular. So you can control-- you can
enable it and disable it. You can quality, you
can set sharpness. You can do all that
from here if you want. But if you prefer Blueprints
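As a rough sketch, the kind of commands he's describing would look something like this in the console. The exact variable names and value ranges here are assumptions-- check the plugin's PDF documentation for your version:

    r.NGX.DLSS.Enable 1        (turn DLSS on; 0 turns it off)
    r.NGX.DLSS.Quality 1       (select a quality mode)
    r.NGX.DLSS.Sharpness 0.5   (set the sharpening amount)

But if you prefer Blueprints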
we do-- we have that too. So here's a very, very,
very simple Blueprint where on level start
we can test for this. It's all under DLSS
so if I do a search-- context sensitive
--and type DLSS, I get my DLSS Blueprint scripts. And in this case, I can test
to see is DLSS supported? Is it supported
on this hardware? This function here will-- is looking at do you
have an RTX card? Do you have-- are you
on the latest drivers? And you can test for that too
using Blueprint functions. So you can test to see which
driver version a user is on, and then make your
code appropriate. But this would all
potentially feed into like a user interface. Where the user can go
into those preferences and turn DLSS on or off, set
the quality mode, et cetera, et cetera. So here I'm just doing
a very simple thing where it's checking
to see if I-- if DLSS works on this hardware. And if it does, put
it in quality mode and set the sharpness value. If it doesn't-- if it's not
supported-- just turn it off, and that's OK.
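In C++ terms, that level-start logic is roughly the following. This is a minimal sketch only: the library, function, and enum names here (UDLSSLibrary, IsDLSSSupported, SetDLSSMode, SetDLSSSharpness, UDLSSMode) are assumptions based on what the Blueprint nodes expose, so verify them against the plugin's headers before relying on them.

    // Hypothetical sketch of the DLSS setup check described above,
    // placed in a GameMode's BeginPlay (any level-start hook works);
    // AMyGameMode is a stand-in name for your own game mode class.
    #include "DLSSLibrary.h" // assumed header from the DLSS plugin

    void AMyGameMode::BeginPlay()
    {
        Super::BeginPlay();

        // Is DLSS supported on this hardware?
        // (RTX card present, recent enough driver?)
        if (UDLSSLibrary::IsDLSSSupported())
        {
            // Put it in quality mode and set the sharpness value.
            UDLSSLibrary::SetDLSSMode(UDLSSMode::Quality);
            UDLSSLibrary::SetDLSSSharpness(0.5f);
        }
        else
        {
            // Not supported: just turn it off (falls back to TAA).
            UDLSSLibrary::SetDLSSMode(UDLSSMode::Off);
        }
    }

So I'll just compile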
and save that. And I think I did another thing
up here just a quick thing. Yeah, where if I
press the Tab key, I do a simple flip-flop
where it turns DLSS on and off in real time. And, like I said, you can
use these menu functions to pick what you want. Or you can do it the
old fashioned way which is just execute
a console command. Maybe that's how you
prefer to set it up. So, yeah, getting back to this. Yeah, I'll run it. And I think yeah there we go. So that's DLSS off, running TAA,
so the standard anti-aliasing that most everybody uses today. And then if I turn
it on it almost, almost, doubled my frame rate. And like I said, most
of the time quality will improve especially if
you're in quality mode DLSS. There may be some
edge cases where it does-- it's not
as good as TAA, but I think 90% of the time,
it's equal to or better. And this is from a
visual standpoint and you get the significant
frame rate boost. And here, down at the
bottom of my screen, you can see it's got
some debug information. We supply with the
plugin the source there, some Windows Registry keys. So you can drag and
drop those into your-- or you can double click them and
they become active in Windows. So this is just for my
sanity as a developer. End users won't see this. Because this is just
on my local machine. But if I want to know what-- if DLSS is actually
working and if it's active, I have this ability to
enable a registry key, and then see the
output down there. So down here at the
bottom of the screen, you can see that it's
got-- in quality mode. Quality mode is-- well let me-- I'll step through each one. Ultra performance is
33% input resolution. Performance is 50%
input resolution. Balanced is 57%,
and quality is 66%. So I'm in quality mode right now; that means the original image it's rendering is 66% of this size. So you can see what that is. It's 1281 by 694. That's the input pixels, and then it scales that up to 1920 by 1040. So basically 1080p.
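Note that those percentages are per axis, not per total pixel count. A quick arithmetic sanity check-- plain standalone C++, nothing plugin-specific, with the scale factors taken from the percentages he just listed, give or take a pixel of rounding against the on-screen readout:

    // Input resolution per DLSS mode for a 1920x1040 viewport.
    #include <cmath>
    #include <cstdio>

    int main()
    {
        const int OutW = 1920, OutH = 1040; // output size from the demo
        const struct { const char* Name; float Scale; } Modes[] = {
            {"Ultra Performance", 0.33f},  // 1/3 per axis -> ~1/9 the pixels
            {"Performance",       0.50f},  // 1/2 per axis -> 1/4 the pixels
            {"Balanced",          0.57f},
            {"Quality",           0.667f}, // ~2/3 per axis; gives the 1281 x 694 readout
        };
        for (const auto& M : Modes)
            std::printf("%-17s %4d x %4d\n", M.Name,
                        (int)std::lround(OutW * M.Scale),
                        (int)std::lround(OutH * M.Scale));
        return 0;
    }

And I can swap back and forth. So there's standard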
anti-aliasing, full resolution basically, that's every
pixel being drawn. VICTOR: And that
should be temporal, right? By default. RICHARD: Yes. And so there's with
DLSS active, frame rate goes up considerably. And you can see the
resolution is 66% now and it's scaling it up. But I really want to emphasize this
point about the scaling thing. Because you'd say,
oh, well I don't want to draw 66% of my pixels. Well if that's all
it was doing, I would totally agree with
somebody who had that concern. Because that's how games have
been made to date, right? Like, I don't know when it started exactly-- probably around maybe 2000, with the original Xbox. And maybe it was done before then, maybe around Halo 1 days or something. But most games to date-- I shouldn't say most. A lot of them will render
the game at a low resolution, scale it up in real
time, and then do something on the back end
to try to clean that up. They'll do basically a
real time sharpened filter. This is standard practice. And some of the games
will, especially on the PC, they'll give you an
image scaling option. So maybe you're not
running the game at 100%, you're running at it like
80% or something like that. Most major titles do
something along those lines. DLSS seeks to improve
on all that behavior. Because now what
we're doing is we're using the Tensor
Cores of the RTX card. And that's part of the
reason why DLSS requires-- it requires an RTX card is
because only those cards have basically AI cores. The Tensor Core hardware
is making this possible. Makes it possible for us to feed
in a lower resolution image, scale it up in real time,
and then reconstruct missing details. And not just sharpen the
result but improve the result. Actually make the details
cleaner in some cases. And it does this because
the best explanation is it's a set of
AI algorithms that have been trained to know
what things should look like. It was trained on very
high resolution images, like 64,000 pixel images, very
high resolution photographs. Taught how light should behave,
how detailed it should look. And all this training was
done for a very long time to teach the code really
so it could teach itself how images should appear
perceptually to the human eye, and then fill in the blanks. So it's not just up
scaling and sharpening it. It's doing a lot more than that. And as you can see you
get a nice performance boost out of that. I think when we started
with this we had DLSS 1.0, and it was-- this is one of the
big changes now with the plugin is that when
DLSS was first invented, it was a slow start.
with very specific apps. We had to teach it like with
this particular software to make sure it would
produce the right details and things like that. But as time went on we've
been able to improve the software, make it more generalized so it can reconstruct those details from any kind of image. And it doesn't matter what it is-- even stylized, whatever. So that enabled us
to release the plugin where the plugin can work on
any kind of rendered image. And it will do the
appropriate thing. It'll know how to
take a lower res image of whatever it is you're
trying to draw and upscale it. And it's literally enhanced. Where the artificial
intelligence is inserting and
creating enhancing detail that's important to basically
how people perceive imagery. I guess it's like-- yeah,
that's how I might explain that. But, I mean, the
benefits are enormous. As I move around
everything looks great. Having a higher frame
rate in this case actually helps to stabilize
the ray traced image with the noised pixels. Because the faster it can update
when I whip the camera around, the quicker it can
produce a smoothed out image in this case. VICTOR: Jon Stoll
was asking, "Can you use this even without tracing enabled?" RICHARD: Oh, yeah it's-- the scene doesn't need
to be ray traced at all. It's for any kind of image. Raster, ray traced,
any mix in between. Yeah the only requirement
from a strict requirement side is that it requires
RTX hardware to run. There's not much we can do
about that because in order to reconstruct that image really
quickly, you need Tensor Cores. You need some type of neural
net hardware basically. Something that's really fast
and really powerful, capable of doing billions of
operations per second. But, yeah, that's
all part of it. But, yeah, it does not need
to be ray traced at all. I can get more
granular about what's going on here from
a cost standpoint. VICTOR: Yeah I think
we're all excited about that. RICHARD: So let's see. I'll show you a stat GPU. So let me turn it off. So, OK, this is with DLSS off so
I'm looking at just my GPU cost here. If you're familiar with
optimizing scenes graphically, you're probably used to looking
at the stat GPU command.
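For anyone following along at home, these are stock Unreal console commands, so they'll work in any project:

    stat fps     (frame rate counter)
    stat unit    (frame, game, draw, and GPU times)
    stat GPU     (the per-pass GPU timing breakdown being read here)

And you can see here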
my biggest cost, this is a pretty
outlandish cost, is almost eight and
1/2 milliseconds going to a ray traced skylight. That's a lot. But like I said I set the-- I turned ray traced shadow
casting skylight on, and then set the sample
count really high. So there's lots of pixel
operations happening in order to keep-- produce a nice clean
shadowed image. And then you see past that my
next cost is my lights cost, and ray tracing ambient
occlusion is very high. But this all makes
sense because, like I said, I cranked those values up. But all that sampling is really in your screen space. So if you had a way to reduce the amount of pixels that you're inputting into the system, and you could scale it up somehow, then, in theory, you could shave
a lot of cost off those things and make them more affordable. So that's where a
lot of the savings are coming in this case. So let's see if I do-- so if I turn-- when I
turn it back on you'll see the skylight lights and
the occlusion values all drop. And pretty much everything
will go down too, but I'll turn it on. And there you can see everything
went considerably down. And there is DLSS off. And you'll note there's
a line in there for DLSS. DLSS itself is not free. It takes typically about
maybe 1/2 a millisecond in that ballpark for DLSS to
do its real time operation. But, I mean, in this
case, I'm spending 1/2 a millisecond to shave
off like six milliseconds. So that's a good win. Maybe even more than that. What does that-- reduces from
20 milliseconds down to 11? But the DLSS-- there you
can see the DLSS cost is 0.64 milliseconds. VICTOR: The numbers
speak for themselves. RICHARD: Yeah. [INTERPOSING VOICES] RICHARD: I mean-- yeah. So, OK. VICTOR: No,
I didn't mean you. I meant-- RICHARD:
No, you're good. So, yeah. I don't know, it's-- you might be OK not doing DLSS. You might say like I'm good
with 40 frames per second I don't need more than that. But there's lots
of circumstances maybe where you want 120
frames per second or 200 frames per second. It can become very
important depending on the use, the application. Or like I said maybe you're
just trying to do something really expensive on the GPU. And you're struggling to hit 30. DLSS can get you over
that line in some cases. Maybe if there's no other way
for you to optimize your scene. I would point out,
though, that-- and I think this is
just generally good. Like, I'm not advocating
that, oh, you just put DLSS to make a slow thing run fast. Fundamentals are
still very important. Like, let me get
that back up, knowing what your base pass cost
is and what a base pass is. And there's lots of
good documentation out there about things
like your shader cost, your base pass, your pre-pass, your mesh draw calls. And you want all those
things to be in good shape. And when you do that, when you
have a fairly well constructed or fairly well optimized
or at least well set up project from the get go, and
then you put DLSS on top, that's where you get
the real benefit. Because if you're,
let's say, for example, if you have just a really
high draw call count and you haven't tuned
the amount of meshes that you're rendering, draw
calls affect the CPU-- that has a lot to do with
CPU and GPU performance. And you can end up with a
CPU bottlenecked project. So optimizing the GPU probably
won't help you that much. And in that circumstance if your
draw call count is really high, you put DLSS on, you may only
get a few frames per second. You're not going
to get the boost that you want because that's
not where your bottleneck is. So this is where it's just-- all the old rules still apply. You still want to
have everything that you would do for
ray tracing, raster, it doesn't matter. You want to have just a
well-constructed scene for all those basics. And then if you do that, like
I said, DLSS will really shine. So is there anything
else I could show with this basic project here? I guess I could show-- VICTOR: Yeah, we
have a couple of questions if you want to tackle some-- RICHARD: Sure. VICTOR: --right now. Yeah RICHARD: Go ahead. VICTOR: Some in
general, some stuff that you might be able to try in the scene. Atlas BayVR asked, "Does DLSS
work with scene capture actors? I only got it to work with
regular camera actors." RICHARD:
Scene capture actors. So like that sounds
like you want to capture something and put it to a texture? VICTOR: Mm-hmm,
like a render target. RICHARD:
Yeah, a render target. Well, so DLSS will work
on the final image. I mean, if you have a TV
screen and you're rendering your texture live to that
TV screen in the game, then DLSS will work on the
end result there, like, the whole scene. It's not going to work
on it individually before basically the
scene is constructed. VICTOR: Let's see. There's another question
here from Encanta Games. They asked, "How does
enabling DLSS for my project work for players
with non-NVIDIA GPUs? Do I need anything extra for those GPUs, like shipping a different binary?" RICHARD: No, you don't
need a different binary. We build this into the plugin. So there's failsafes here where,
like I said, you can just do-- wherever you set up
your game preferences. If that's in your UI Blueprint
or some other method. Or I believe we
supply the source code when you download the plugin. I'm not sure about that. But, I mean, if you have-- however you set up
your preferences I mean you can always test
to see if DLSS is currently supported and have
fallbacks to that. And it's never going to break a system, because the fallback is always going to go to standard anti-aliasing, TAA. You can change that default. There's commands and
options to do that. But with just simple
scripting, just even like this just to test
is it supported or not. If it's not, make sure it's off. This is probably even redundant-- I wouldn't have to hook up anything there-- but for sanity, maybe go with something like that. VICTOR: Yeah we
can grab another one here. Matthias Casagrande asked,
"I have been testing DLSS and noticed a lot of blur
when rotating the camera. How to fix it?" RICHARD: Yeah, there's-- so in the documentation there
are some console commands and some things you
can try that have to do with motion vectors
and a few other details. Now if you step through-- let me point you
to where that is. You go into the plugin,
the installation guide will get you going. Like this is everything
I stepped through at the beginning. Go to this folder, put
your plugins there, run it, you're good to go. But if you go to the Quickstart guide, the next document in the list, this gets
a lot more granular and will step through
complex issues. So you go down here
and you can see that you might have some
problems with motion vectors for example, which sounds-- it's
hard to say or it's hard for me to say, right now
exactly what's going on without seeing the project,
seeing the content. But most every kind of
solution you're going to want is in here. And you can-- we document what
the console variables are. This is the more advanced stuff. And give it a try and see
if it helps your project. Now if it doesn't, then-- this is constantly
improving software. It's constantly developing. So we want to make
sure that it's going to work with your content. If not right now
then in the long run. So just let us know. We have-- I believe there's
methods documented here to reach out to us. And we also have our
forums, where you could post-- on the Unreal forums. I check there and I would-- if it's something
we can address, I'll make sure an
engineer sees it, and we can try to get
it resolved for you. VICTOR: On that note
feel free to use the forum announcement post, where we
announced this live stream as well-- RICHARD: Yes VICTOR: --to discuss the topics we're talking about here today. RICHARD: Yeah VICTOR: Yeah. Why don't we continue? We're still gathering
all the questions and we'll go ahead and make sure
we get to all of them at least at the very end. RICHARD: Sure. So, yeah. I mean, as you can
see, a basic project, just getting going isn't that
difficult. Once it's on, you know it's there because you'll
get this menu option. I can turn it off, turn it back
on I could set different modes. And I'll show you those
real quick just so just so people understand. Let me max my frame
rate out because I think I'm screened locked
to 120 frames per second. OK there we go. So here I'm in ultra
performance mode. So this is, as you
can see down here from the debug information, this
is 33% pixel information being scaled up three times
in each direction. So that's a nine times scale up. That's a lot of-- that's like
a postage stamp size being blown up to a 1080p image. But you can see that it
doesn't look that good. And that totally makes sense. It just does not have
enough input pixels to generate a
really good quality image at this low
resolution at 1080p. And we outline that
in our documentation, this one right here: developer guidelines. And these are guidelines, these aren't hard rules; we're not forcing anybody to do anything. But if you-- these are strongly recommended, and we find that they've worked
for a lot of Triple-A developers and even some Indies. So if you step
through this document, you can see we document
what everything is. We say right here: ultra performance is really intended for very high resolutions. Above 4K. That's why it exists. So it helps to make 8K possible. And, I think, yeah, if you do the math on that, 33% of an 8K image is 1440p, I believe. So 1440p original image, that's a lot of pixels. So that's enough for it to work with in order to scale it up and make that 8K possible. So, yeah.
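(Quick check on that arithmetic: 8K is 7680 by 4320, and 33% per axis gives roughly 2534 by 1426-- just about 2560 by 1440, so yes, roughly a 1440p input image for an 8K output.)

But these other modes,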
performance, balanced, quality are all good for a range
of lower resolutions. You just have to
look at your content. It's just going to
depend sometimes. Like in the case of
a ray traced image, I would highly recommend
going with quality for almost any resolution. So you'll get less of
a performance boost, but you will get a performance boost. And the reason why I recommend that is because the ray-traced denoiser just-- it just
needs a bare minimum of input pixels right now in order for
it to produce a good image. If you go too low
quality on the DLSS it's not going to
look very good. But like I said, you've got
to look at your content too. Like you might put your
ray trace scene together and you're like, oh,
performance looks great. So these are really
just suggestions. And we talk about
what all these are and what the suggested
usages are. Do you still have me? VICTOR: Yes, sir. You're with us. RICHARD: OK. We got quiet for a second. I wasn't sure what
was happening. So, yeah, was there
another question after that or should I get back on track? VICTOR: We got plenty. Let's see. We can cover some
of the general ones. danielmn81 asked, "Will
the NVIDIA DLSS plugin utilize the DLSS reflection
smoothing, caustic meshing, and-- not sure what that word is-- GI volume into the plugin, like what we showed off a few months ago?" RICHARD: I'm looking
for the question in the chat. VICTOR: It's
way gone by this point. RICHARD:
Oh, it's way gone. OK. So what was the question again? VICTOR: Will the
NVIDIA DLSS plugin utilize the DLSS as reflection
smoothing, caustic meshing, and Sprars GI
volume into the plugin? RICHARD: Well,
I mean, graphically it should work with everything. Like there shouldn't-- there's
really no limits there. I mean it'll even-- the released version
even works in VR and it works with forward
rendering, and OpenXR. So there's, yeah,
there's really no limits that I'm aware of
as far as something it wouldn't work with. VICTOR: I think the
question was in regards if those features
would be implemented as part of the same plugin. RICHARD: But
there's something in there about caustics I'm not aware
of any DLSS and caustics issue. VICTOR: Not
[AUDIO OUT] I think the question is in regards
to will those features be implemented in the same
plugin as the DLSS one or will they be part of a
separate plugin or engine branch? RICHARD: Yeah. I think, yeah, that's
separate stuff. VICTOR: So you had
another question from-- a couple of
questions actually in regards to using DLSS with
the movie render queue. RICHARD: Oh, yeah. So I can jump to that project. This is a-- because this
is great to look at. We pulled down the
Meerkat Demo which-- yeah, sorry? VICTOR: Oh there we go. I think we're back. We had a little hiccup. RICHARD: Oh. VICTOR: Give
us just a moment to verify that we're
back live I think we are. Going to hit F5 right here. Twitch as well. Sorry, Richard. RICHARD: It's OK, VICTOR: We're
getting your stream and the [AUDIO OUT]. RICHARD: OK. You can hear me and everything? VICTOR: Yeah,
yeah, yeah you're good. RICHARD: OK. No whammies no whammies. VICTOR: Working out. Yeah, we have you
well on our end here. Seems to be only up to Twitch. Let's check YouTube. Seems to be YouTube as well. RICHARD: Oh, jeez. There we go. VICTOR: Yeah, I think, it's actually Twitch
only at this point because I'm seeing you
just fine on YouTube. RICHARD: OK. VICTOR: Give us
just a moment y'all. If you can hear us
we're just figuring out some technical
difficulties here. Getting the stream
live on Twitch. Yeah we have serious
issues on YouTube. All right. I think we're going to go
ahead and try to restart Twitch. Sorry, all YouTube watchers. Got to make sure that everyone
can enjoy the content. RICHARD: Yes. But they can-- people on YouTube
can still see me and hear me? VICTOR: Yep, we are loud and clear on YouTube. RICHARD: OK. Well, I don't want to
give anything away yet. VICTOR: Let's
go ahead and do this. We're going to try
a full restart here. So we all watching
on YouTube right now, we will be right back. Everyone is just
chilling at this point. RICHARD: Yeah. VICTOR: We have
excellent connection on YouTube. So it seems like we have
some issue with going through to the Twitch server. RICHARD:
Oh, my goodness. VICTOR: We're
back up on YouTube and I believe we are
back up on Twitch. Restarts good, Richard. RICHARD: Yeah VICTOR: Sorry about that. RICHARD: Well
it happens, right? Didn't you say
before that there's at least one major glitch at least once in a while, or
something like that. I don't know. Something happens. VICTOR: There's usually
something when we're live. Yes. RICHARD: Something. OK. So but I-- but you can-- everyone can hear me and see me? VICTOR: Yeah
you're looking good. Your stream is
coming through well, and we are live
on both platforms. So you're good to go. RICHARD: OK. So Yeah I just wanted to-- this is the Meerkat Demo and
I think I'll just play it. Yeah so this was-- this is-- it was put out by
Epic, made by some Weta artists. And, oh, I'm not-- nothing. Let me restart this. It's such a cool demo. But this is obviously
very high end graphically. This is very close
to CG quality stuff. I mean if you haven't
grabbed it and checked it out it's worth taking a look at. But anyway, so this is-- oh, let me-- jeez, I'm going
to try that one more time. Yeah, but here I've got-- so DLSS running on it. And it didn't come that way. I grabbed the project
and added DLSS on top. But you can see what this is
doing for this rendered scene because this was never
meant to be real time. This was intended to be a movie
file output using movie render queue or MRQ. And so a big question is, well,
can DLSS help to speed that up? Can it improve the workflow? I'll say that we're
still looking at this, we don't have clear answers yet. So where we're at is--
where we're at right now is DLSS works great
with Sequencer. Movie render queue, it's a
little bit of a question mark. I think it's not
fully implemented. But we're very interested
in this area and I don't-- I can't make any
promises about what might happen in the future. But this is an area of
investigation for us, and we want to
support it if we can. But anyway I've set
this up with DLSS, and you can actually see what the differences are visually, the quality differences. I've got a-- whoops, hold on. Yeah, I've got a shortcut
here for DLSS off and you can see that
on my 2080 Ti system. This is running at just over
30 frames per second, about 35 frames per second. And then with DLSS on, it's about 40-- it's hard to tell what that is. 47, 48 frames per second. So almost at least a
30% to 50% speed boost. Again, we can look at stat
GPU because this is almost all GPU cost going on here. And if I switch back to the
standard anti-aliasing, full resolution, you see
I got a lights cost. There's an RTX GI cost. There's hair strands,
post processing. All those things are
fairly expensive because we want this to look quality. But a lot of those are
screen resolution dependent. Not everything but
many of them are. So when I switched DLSS on
I get a significant boost in performance. And let's see, you'll notice it
even pulls out certain details. Like the hair strands,
in standard anti-aliasing the hair strands get
lost off the nose and on top of the
head and stuff. You can see them, they're
faint over the face and stuff but they're not
really showing up. But with DLSS-- this is an
area where DLSS can sometimes improve image quality in
ways you don't expect. Where it actually can almost
grab some sub-pixel detail and pull it out. I think we've shown
examples on our website from a game like Death Stranding, where there's some sign
in the distance and it's literally too small to
read anything on it. But you turn on DLSS
maybe in quality mode, and it pulls out
those words, sharpens them, and you can actually read it. I mean, we need to start using this for criminal investigations, like scanning license plates, right? Just like CSI
movies and whatever. But you can see what it does
the image here, speeds it up. And in this case that
actually improves the quality. Sometimes if you look at the
fur under his chin there-- so here's with DLSS off. There's like a shimmer to
it, like it's not stable. It's not able to, I don't
know, fully resolve it somehow. With DLSS on it not only
pulls that fur detail out it stabilizes it. Now, I don't want to get-- yeah, go ahead, go ahead. VICTOR: It also seems
like the depth of field is working as intended. We had a couple of
questions about that in chat. RICHARD:
Yeah, no, it is. Depth of field
will be different. If you look closely
at depth of field-- depth of field totally works. This is actually a
misunderstanding. Depth of field works with DLSS. All your post-process
effects are going to work. But in some cases-- namely depth of field and bloom-- they're going to work differently. And it may not give you the
result you want out of the box. So you might--
because when you-- let's say you're doing
TAA or TAAU which is the standard
anti-aliasing up sample. So you're doing one
of those methods. Like where you're
post-process effects sit in the chain is going
to be slightly different and give you slightly
different results. So between those two methods
today in Unreal Engine, you don't get quite the exact
same post-process result. Depending on how you-- whether or not you've
got an upsample flag on in your project, what
your input pixels are is going to determine how
your depth of field behaves. I mean, you can see this for
yourself even without DLSS. Just start changing resolutions
and change that upsample flag. And you'll see
different behaviors. So what's actually
going on is DLSS just sits in the existing post-process chain in a slightly different way. And so it's present, it's working,
but it's going to be-- it's going to give you a different
result than how TAA currently looks. And in some cases, it might
make it seem like it just goes away completely depending
on how your depth of field settings are set. Or in the case of
DLSS quality mode, the difference might be really
subtle almost nonexistent. So that's just a quality thing. And we also recognize too
that this is an area that needs to be improved. Like we're not happy with that. Like we would like
it if ideally as-- if you're setting up your
movie or sequence whatever or using depth of
field a certain way, you want it to be consistent
across all resolutions, all different anti-aliasing
methods, et cetera, et cetera. I'd say that this is
something that we just need to keep working on. And I would hope that a future
update will see improvements to this. But Yeah if I switch
back and forth. So here's standard anti-aliasing
and then here's DLSS. I'll just cycle back
and forth a little bit. You see is still doing depth
of field just like I said. Maybe not everywhere you
want or quite in the same way that you intended. But that's where you
can account for it. Like in the case of this
project here, the Meerkat, it's intended to be
rendered to a file. That's the whole point of it. Like I said, this was never
intended to be real time. I got into a discussion with
the Weta artists on that. They were like, yeah,
we didn't think that people would be running
this in a live application like it's going to
be part of a game. And it's a real-time cinematic. It's meant to be movie render queue output-- that's the whole point. And so in that case, you
could very easily just design your post-process
settings to be ideal for DLSS. If that's your
development machine, and you're spitting
out that movie file, then just build it
for that right there. But, yeah, in other
real time scenarios, this is something we
need to improve for sure. So, yeah, I get-- you can-- I can pause it, keep going. [INTERPOSING VOICES] And there's lot-- oh, yeah, sure
you want me to pause it again? VICTOR: Oh
I was just going to say the chat was referring to the typical scenario in a movie where
you just go enhance. RICHARD: Right. Exactly. Well maybe DLSS has other
applications beyond games. I mean, I'm largely
talking about games today. But there's lots of
potential for other things, including rendering video. Yeah, here this is-- I just paused it. So this is DLSS off, DLSS on. It's like a 50% speed boost, and look at all that fur on his back. Look at how it
gets lost right there. But then pulled
out and stabilized. VICTOR: And it's actually
coming through the stream pretty well too. RICHARD: Yeah. Yeah. It's so I don't know. I mean, we made
this application-- I mean well this variant
of the Meerkat Demo so we can start to
study these issues and understand it better. This also-- I mean just so
everybody is aware of this also has a ray tracing
off versus ray tracing on. So we can study those
differences too. So we're trying to learn a lot
about how DLSS behaves, how GI behaves, how ray tracing behaves versus raster. I mean, the project was
intended to be raster. So, yeah, we're just-- yeah, we're using this
as-- this as one test bed to try to understand some
of these differences. And then look for ways
we can improve it. Because, like I said, we
don't want the software to-- it's not like it's at version
2 and it's done forever. That's not the point. We know it has-- there's areas that
could be improved that can always be better. And it's also important
for us to understand what the differences
are too because we've looked at a lot
of content, but we haven't looked at everything. So there may be
interesting details that we don't expect
like I didn't-- I certainly didn't expect the hair to pop
out like that and stabilize. So that's interesting to me. Yeah this is all 4.26 stuff. So like the hair and fur
is brand new; we're seeing this for the first time. VICTOR: I had a question
from Hector Centeno "So enabling DLSS automatically
disables and replaces temporal anti-aliasing?" RICHARD: Yes. Yeah, it's one or the other. But, yeah. Let's say you do your
standard way of optimizing the rendered scene
where you maybe run the project at a lower
resolution, like 70% or 80% res, and then you upscale that. And then you do a tone mapper, sharpen filter on top of it all. That whole set of actions
gets replaced by DLSS if it's active. So, yeah, you're doing
one or the other. I should just let this play
because it's fun to watch. VICTOR: Yeah I
had another question from Wild Ox Studios. "Do you control the native
resolution manually or does DLSS do that?" RICHARD: Oh, well,
so you can specify the-- which or what resolution you're
using depending on the quality mode. So I'll just back
out here real quick. Let's look at our Meerkat,
he's such a cutie. VICTOR: Yeah. I've seen it a lot
in Zoom calls. RICHARD: So let's
see, if I set it to quality. Yeah. So, I mean, like I said it
tells you at the bottom-- if you've got the
registry key setup it tells you what's going on. If I set it to
balanced, that's 57%; this is performance, 50% resolution. Ultra performance
is 33% resolution. And I think I might
have just bit it. VICTOR: Ah, if we
don't have at least one engine crash during a live
stream, it's not live. RICHARD: The Meerkat
thing is very taxing. Like I said, I've got
a 2080 Ti in here. It's a good card but-- oh, wait. No, there it goes it recovered. I thought I was going to have
a GPU crash for a moment. Jeez, what's going on? I think I'm just going to-- VICTOR: Those settings
are predefined, right? But you also have-- RICHARD:
It's predefined. VICTOR: --to set
that percentage yourself? RICHARD: Yeah. You can-- Yeah there's no
way I believe to specify that or at least-- I'd say check the
documentation on that. We do have-- we are work-- I can say that we
are working on-- I mean, like I said you
can specify for an end user or you specify it yourself
where this is the quality mode I want to be in, and you
set that up however you like. But one thing we're working
on is a variable rate shading or a dynamic resolution. So it's partially
implemented right now. It's there, you
can play with it, but this is not a final
implementation at all. But I can show that real quick, because the future
of DLSS might be this, where you don't even
specify the input resolution. The system is making very
intelligent dynamic decisions about frame to frame,
what the resolution is. So the command-- and this is--
like I said, this is a beta, if you just want to experiment. But it's r.Test. And there's some fun stuff in the r.Test category. But this one is dynamic
resolution hell. And you turn it on. VICTOR: Always fun [INTERPOSING VOICES] RICHARD:
Yeah and then you see what's happening down there. Look at that. The input resolution
is changing rapidly. It's cycling between different--
like whatever it perceptually thinks is the right
resolution moment to moment. It depends on the application. That could be a small
boost to a big boost. And like I said,
I want to underscore: this is a beta feature right
now, we're working on it. But theoretically,
like I said maybe in the future you don't
even set the resolution. Maybe it all just
handles it for you. Giving you the
optimal frame rate based on what makes
sense in the moment. VICTOR: Cool. We did have some
questions about using DLSS with the metahuman project. I believe we were going
to cover that, right? RICHARD: Yeah. How are we doing on time? We're an hour into this already? Oh my goodness. VICTOR: But we
got plenty of time to go. RICHARD: OK. I'll load up metahumans. Yeah. I personally, I'm really
appreciative that you guys are doing all these
high end projects. Metahuman, Meerkat. Because I mean this is a
good way for us to make sure everything
is working with it, including, but especially, DLSS. We need to keep looking at as
much varied content as possible. But these high-end projects that are really GPU intensive are good test cases for us. VICTOR:
Waiting for this load. RICHARD: Yeah VICTOR:
Yeah, a little bit. Maybe we can do another
question while waiting. RICHARD: Sure. VICTOR: Let's see. We covered depth of field. Rafael Medeiros Lima
asked, "Will DLSS have problems with screen
space Reflections?" RICHARD:
No, it shouldn't. I mean, any graphical feature you can think of, it should operate on them really great. I mean, like I said, most of the time-- at least 90% of the time-- if it doesn't have visual equivalency to standard anti-aliasing, it might actually enhance your details. Like I said, it'll depend on
what your input resolution is. It will depend what
quality mode you're in. But the potential for it to
actually improve your details is there, all while giving you a better frame rate. I will note a couple of
areas where, yeah, DLSS could improve. I mean as long as we're on that. And we already talked about one
of them which is post-process. Specifically depth
of field and bloom. There are, I guess you could call them, asterisks-- things to be aware of that
might affect your project. It's not going to
affect every project. Another area is world space
post-process materials. And if you think through
the logic on that one it makes sense. Because if you've got-- you have some type of
post-process effect and it's specifically
tied to world space, the way it works currently
is it's counting on world coordinates to match
your screen resolution. And if you're working
with less input pixels in the case of DLSS world
space post-process materials won't render in
the correct place. It's really just a known issue. And if you're really
good with math you can actually work
out the differences and compensate for
that in the material. So if you are in
DLSS quality mode, you know that's 66% input resolution, and you could do some math on your material
and compensate for that so it looks correct when
you're in that mode. You just need a couple of
extra Blueprint functions or something like that. And then it should work OK.
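To make the idea concrete, here is only an illustration of the kind of correction he's describing-- not an exact recipe, since the right fix depends on how your material derives its coordinates, and both names below are hypothetical:

    InputFraction  = 0.66                          (quality mode, per the percentages earlier)
    CorrectedCoord = ResolutionDependentCoord * InputFraction

But that one's come up. And just because there's a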
way to deal with it right now, it doesn't mean that we're happy with that, right? Like, we would like
where we always want to try to improve it. Make it a little bit
better if we can. Fix anything up that we can. So I mean Yeah like it's
actually important for us to get that feedback. If people have a particular
project and something is just not playing right, and
you've done everything you can. You step through the
more advanced settings and nothing seems to fix it,
just get in contact with us. I'm on the Unreal Engine forums. You can find me there. I don't post all the
time, but I try to get in there and check it at least once in a while. And just flag me down. I will-- if it's-- like I said, if it's
something we can address then we will. VICTOR: Thanks, Richard. RICHARD: Sure VICTOR: Let's
take a look at this. RICHARD: Yeah. OK. So no DLSS. So obviously a very
GPU bound scene. Let's do a stat GPU. We can see exactly how
we're spending our time. A lot of time being
spent on lights. That makes sense. I think there's a number of
point and rec lights in here that are all shadow casters. So the DLSS will
more than likely have a good impact on that. Most of these other
things, like I said, it's going to improve
your GPU time. So I can just turn
on quality mode. And don't worry I'll take this-- or maybe I should just
move this right over here. So we can see the
quality differences. But you can see what that
did to the frame rate. I think I went from 20 some
maybe 25 frames per second to almost 40. Cut the lights time
in half that's huge. I'll swap back. Turn it off. Yeah in this case, I'm not sure
if it's improving any detail. I don't think it's
losing any detail. Like her eyelashes
and everything. So this is just-- I think this is just-- yeah, this is just standard anti-aliasing. I switched to quality mode. I don't know, that looks
pretty preserved to me. But it probably could be
gone over with a fine tooth comb for like image comparison. But, yeah, the performance
gain is definitely there. It affected almost
every category. It's not going to-- like I
said, if it's not a screen-space issue, if it's not a fill-rate issue or strictly a GPU-type issue, it's not going
to-- probably not going to impact it. So that's why it just depends
on what you're drawing. This gets to that whole point
is that depending on what scene you're seeing and how
you've constructed it, the gain can be a little bit. It might be 10% to 20%. It might be a lot-- it might double your frame rate. I think under very
ideal conditions it's more than double. But so that's why there's
some range there as far as-- it just, like I said it depends
on where your performance bottlenecks are at exactly. But going from 25 to
40 that's pretty good. This is a very intense scene. I can give it a play. VICTOR: And this
is with DLSS on right? RICHARD: Yes. You can see it's active
at the bottom there. I guess I'm running in
balanced mode at the moment. I didn't-- that is the default.
So it just went right to it. VICTOR: That shot right
there is my absolute favorite. RICHARD: Yeah,
it's killer, right? VICTOR: Yeah RICHARD:
It looks so good. And on the performance side,
it takes a big hit when it zooms in on
his face. But, yeah, I think before, for me, in regular anti-aliasing, this was no better than 20 frames a second. And here we're just above 30. VICTOR: Had another
question from Sintask in regards to using DLSS for
virtual production workflows. I don't think that's a one
solution fits all, right? Depending on what
your production cycle looks like there. But something that
would be of benefit is the real time iteration
benefits that you might see, right? if you have a-- RICHARD: Absolutely VICTOR: --very intense
and performance-heavy scene might not necessarily
need DLSS when you're outputting your images. Unless you're seeing a image
quality improvement that you like. But especially when
it comes to iterating, and this is one of the great
benefits of using a real time engine for movies is that
you have the opportunity of having it run that at least
24 frames, which is just movie while you're just working on it. It's just make a change hit
play look at it right away. You don't have to wait to see-- RICHARD:
That's actually-- yeah, any-- it's going to benefit you
on almost any GPU bottleneck task. So yeah, as you're working
with something in real time. And if you can get a
sense for that is my GPU-- what's costing me? Like here let me turn off DLSS. So turn it off. My frame rate goes down. OK. So very basically I
can do a stat unit and get a look at
what my cost is. But that tells me right there. Game time can be like your-- things like your CPU, your
Blueprint scripts or whatever, your game code basically. But here you can see my
GPU is my highest cost. And you're only going to be as
fast as your slowest component. So that's why DLSS
has an out-sized impact on this particular scene. Because the GPU cost, as you can
see right there is pretty high. So when-- yeah, if I put it
on, my GPU time went way down. I think it's shaved like
12 milliseconds off. VICTOR: Yeah, it did. RICHARD: Right
so that has a big effect. Not quite a doubling
but pretty good. And if it's a workflow thing, like you're just trying to get the most
performance out of it, you could potentially
go ultra performance. And it'll be blurry or I
should say blurrier, but-- now I just went past
50 frames per second at least momentarily. So that can be a big help. But, yeah. Especially when
talking about movie render queue, like I
said, this is something that we need to further
investigate but there are interesting ideas there. Because everything I've talked
about so far is DLSS in a-- totally in a real time. Typically a game but maybe
a visualization setting too, where you're trying to
maximize frame rate. Well, what if we took DLSS and ran it not at a lower resolution but a higher resolution, and used that
to enhance details that you might output in a movie? That's interesting, right? And at the moment
it's unexplored, but we're thinking about it. And I think we'd like to do
it, but we need to look at it. But that's especially tied
with movie render queue. That might be where
you don't run DLSS at a lower resolution you
run it at 100% or maybe 150%. It's like how you'd--
typically you'd go into the-- you go here into the
screen percentage, and you'd upscale it. set it to some higher value
200% or something like that. So it's not going to-- doing it that way
sitting at 100% or higher is not going to save
you any frame rate. But it might enhance
your details. It might improve your quality
beyond what you thought was possible originally. So that's an interesting area. Yeah, I wish I had a more
clear answer about that today but that's something
that we're looking at. VICTOR: Mippithedork asked "Does DLSS render
operations happen before UMG slate UI or after?" RICHARD: I think, well
that stuff can be completely resolution independent. So it leaves it alone
as far as I know. Like if you do up-scaling right
now, like the UI elements all project at 100% resolution. And that doesn't change. VICTOR: Can't Pronounce
That Name asked, "Are there any circumstances where DLSS
might become a downside? And what would be the
worst case scenario?" RICHARD: Becomes a downside. Well, like I said there's
the aforementioned world position offset material,
post-process material issue. But that's a known
one, and like I said, we would like to improve that. And, yeah, just in the
current render pipeline the way things are
currently set up it's just good to be aware about
how your post-process effects might change. That totally depends
on the project. How much bloom are you using? What are you doing
with depth of field? Some games don't use
depth of field at all. And some games turn their
bloom off or make it an option. So it's just going to depend
on what you're currently doing. But I think my
point there would be knowing about these
things ahead of time. These edge case issues
or known issues. If you're aware of
them, you can build it into your plan for DLSS. I mean, our experience generally
with working with developers has been, it's always much
easier to know ahead of time what the technology is and
what you're dealing with. What are the strengths. What are the weaknesses. Just what does it do? And the sooner you can
get that into the process, the better off you're going
to be with your project. If you've got something
that's basically done, your final
beta you're heading towards your gold master,
you're just about there, and you try to add
DLSS on top, you could run into some problems--the kind where you've got to go back and change content. So, yeah, the sooner
you can be aware-- and that's part
of the reason why we're doing this today, right? Is the more knowledge
you have on it, the better off your project is going to be.
VICTOR: And that doesn't
only apply to DLSS, right? There are always upsides--and, not always, but frequently, downsides--to the features you decide to use.
RICHARD: Oh, yeah,
absolutely I mean, we see this across the board
with just about everything. RTX GI, ray tracing in general. The sooner people can be made
aware of what the options are and how to make use of
them, typically, the better your production is going to be. The easier it's going to be. The hardest thing to
do is: the game is done, the project's done, and now we're going to try to take this advanced feature and go back and graft it onto that. Because by then, who knows what you've made? You've got lots
of different stuff, lots of different types of content, lots of different scenarios. And it's all been fine-tuned to
the previous way of rendering things, whatever that is. And so, yeah. Like I said that's part of
why we're doing this today: to help people be aware of exactly what DLSS is doing and, yeah, just what to watch out for. But as you can
see you don't have to watch out for very much. Like I'm not aware
of a ton of issues. And like I said
in a lot of cases, you might find that the
quality actually improves, which is crazy but true. So it's not every case. I mean like I said,
little details can come up where it's like, yeah, it enhanced this detail here and made it better, and I got an extra 20 frames per second. So that's wonderful. But maybe this other detail over in this part of the image feels like it took a step backwards. So we take those case by case--we look at it, and we want to try to make these improvements.
VICTOR: davig019 asked,
"Can you selectively use DLSS, for example, in a first
person shooter don't process the middle of the screen?" RICHARD: No but
that's an interesting idea. I'm not sure how
much performance benefit there would be if
it were possible to mix different modes like that. But it's an interesting idea. VICTOR:
That falls in line a little bit with how
variable rate shading works. RICHARD: That's true. Yeah, it doesn't work
that way currently but, who knows what's in
store for the future. VICTOR: Yeah it's
definitely an interesting idea RICHARD: Yeah VICTOR: Might
even be some cool effects and there's always so
much to think about. RICHARD: Well,
yeah, I mean, rendering today is a very complicated affair, because you have all your post-process effects. You have maybe different translucent layers, different Niagara effects, whatever you're putting in the scene. There's a lot of different kinds of content that all comes together. And it's not all rendered in one shot, right? There are different passes happening, and they're happening at different stages in the render queue. And, of course, we're trying to do it all really quickly--we want to do it in a tenth of a millisecond, or some number that's really, really low. So there are big challenges there; there are a lot of difficult things to solve. We want to do it as
fast as possible. Ideally everything
would be running at 1,000 frames per second and frame rate would never be an issue. It just runs no matter
what you throw in there. So I think this is where it starts
to get a little theoretical. But anyway who knows
what the future brings. VICTOR: Yeah we're still
gathering all the questions here. But why don't we move on
with the presentation? I know you have some cool stuff to show us.
RICHARD: Yeah, OK. VR comes up a lot, and I can't show VR in real time, so I'm going to show you a video. OK, hold on, let me do this--do a zoom to fill and a repeat. Anyway, I wanted to show a VR thing just to show everybody that this actually works, and it's potentially very good for your VR project. This right here is not a product. This is not anything--this is just me messing around with an idea I had at home, because in a former life I was a VR developer. And I have a strong interest in VR, and someday I'd like to build a home theater in my house. I'm not rich, but it's cool to dream, right? So I started to game it out. I was like, well, I could just use Unreal--I've got a VR headset--and I could build something to scale virtually and game out what it feels like, whether it makes sense, that sort of thing. So yeah, this is all
like a theoretical room space in my house. And I wanted to see if I could
make like a virtual reality theater room, just like I
said, so I could sit in it and feel it out and
see what this is like. But I put DLSS on this too. I hadn't tried it
before yesterday. Just to see what
sort of performance I would get out of it. And as you can see, this is how it performs as is. It's running on a 2080 Ti, so performance is really good, but--so now I set it for VR mode. This is with DLSS off, and you can see the
frame rate's variable. And just so everybody knows, this is deferred rendering. This is lots of
shadow casting lights. It's all dynamic. You can see the shadows
they're coming in from different directions. Way too many shadow
casting lights. This is not like a realistic
production VR project. You'd want to optimize this more if it were something you're putting out there. But maybe as you're developing
and working on it in real time, you don't want to have to optimize as you go. Maybe you just want a good frame rate while you're building it. Like I said, I think the scene probably
has 20 shadow-casting dynamic lights. It's too much. But it's VR. This is not ray traced, it's raster rendered. So, yeah, this is just--
you can see what-- I've got a simple
function in there where I press the Space
bar and the lights go down they come back up
the door opens and closes. VICTOR: Yeah RICHARD: That sort of thing. It's just like I said,
this is not a product and never will be a product. It's for my own purposes. But I felt like it might
be pretty instructive on the benefits of DLSS. So DLSS off and then
here I'm enabling it. That's DLSS on. And you can see the frame rate
jumped about 20 frames a second--in this case, it looks like the frame time went down about two milliseconds. It went almost straight up to 120 and it's pegged at 119 or close to it.
VICTOR: When
we're talking VR, two milliseconds is huge. RICHARD: It's huge. You're a VR developer,
you understand that you're fighting for every
fraction of a millisecond you can get in VR. And yeah, like I said, if you're fine-tuning the project, you can try more detail in VR than you previously thought was possible. I mean, DLSS might almost be
most ideal for VR applications. Like I said, it has a lot of applications across the board, but in some ways it's really meant for VR. Because here, in this case--so I turn it off. Look at the details right in the center: look at the line structure, the high-contrast area. Look how DLSS sharpens and cleans it up when it's on. So not only did I get
a frame rate boost, I got a detail enhancement versus standard anti-aliasing.
VICTOR: Yeah, and it's an interesting solution if you're required to use the deferred render path for whatever reason, instead of forward, right?
RICHARD: Right.
VICTOR: It probably doesn't look great in VR--we all know it makes things rather blurry--so you can instead use DLSS to enhance the image that way. That's a win-win.
RICHARD: Yeah,
and, of course, DLSS does work like it's
intended to work with VR. So it does-- it will work
with forward rendering. It works with OpenXR. Yeah. And I would encourage
anybody if you're developing VR apps that
you want to use DLSS, and you see any
issues, please let us know. This is an area that's
important to us. We want to improve it and
make sure it's rock solid. But as you can see
I didn't do any-- you can fine-tune
the sharpness factor. You can do a few
other details with DLSS. Like I said, I just set it to Quality mode, got a small but noticeable frame rate boost, and it pretty much just works. I didn't have to do any other fine-tuning to my project or worry about any other details or change anything like that.
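As a sketch of the kind of fine-tuning being described--picking a quality mode and adjusting the sharpness factor--something like the following should work. The cvar names and value ranges (r.NGX.DLSS.Quality, r.NGX.DLSS.Sharpness) are taken from the plugin's documentation as I remember it, so treat them as assumptions and check the PDF for your plugin version.

```cpp
// Hedged sketch of DLSS fine-tuning via the plugin's console variables.
#include "HAL/IConsoleManager.h"

static void ApplyDLSSQualityAndSharpness()
{
    IConsoleManager& CVars = IConsoleManager::Get();

    // Quality mode: per the plugin docs, negative values trade quality
    // for speed (e.g. Ultra Performance) and positive values favor
    // quality. Verify the exact mapping for your plugin version.
    if (IConsoleVariable* Quality = CVars.FindConsoleVariable(TEXT("r.NGX.DLSS.Quality")))
    {
        Quality->Set(1); // assumed: 1 = Quality mode
    }

    // Optional sharpening applied on top of the upscale, typically [-1, 1].
    if (IConsoleVariable* Sharpness = CVars.FindConsoleVariable(TEXT("r.NGX.DLSS.Sharpness")))
    {
        Sharpness->Set(0.2f);
    }
}
```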
And if you're an audio-videophile, you might notice in my hypothetical theater down here, I've got an ultra-short-throw laser projector positioned. I'm into that stuff, so I think it's pretty neat.
VICTOR: Yeah.
RICHARD: Getting
off topic, but anyway. Any other questions
at this point or-- VICTOR: Sure. Yeah let's go through
some of them. ForecsPC asked, "Is there a difference between the DLSS 1.0 of the 20 series and the DLSS 2.0 of the 30 series in this plugin? Or do both gens use the same DLSS in this version of the plugin?"
RICHARD:
Both generations use the same version
of the plugin. You might be playing a
game, though, that's using an older version
of DLSS, but that's just built into that game. But the plugin
that we've released is, I think, technically version
2.1.5 or something like that. And that's the latest
release and yes it's the same plugin for
any ray tracing card. VICTOR: Let's see
RareBirdGames asked, "What happens if you optimize
your game for DLSS but port it to console? It kind of throws me off."
RICHARD: Oh, sure. Well, yeah. I mean, DLSS requires Tensor
Core hardware right now. So that means it requires
an RTX card in a PC. It doesn't run anywhere else. That's how things are at-- that's where they're at today. So if you're running
on any other platform, you're going to use another-- you're going to use a standard
anti-aliasing or upscaling, or maybe some specific method that hardware offers. But that's really it; it's actually very simple.
VICTOR: Xenthorx asked,
"Can you combine temporal upsampling and DLSS?" RICHARD: No. It is a one or the other thing. VICTOR: That was easy. RICHARD: Yeah, VICTOR:
Gideon October asked, "Is DLSS a final
post-process or can you implement several layers
within the image?" RICHARD: Several layers. I'm not sure I understand
the question exactly. VICTOR: Perhaps if it's-- is it affecting
several of the buffers or is it after all the buffers? RICHARD: Oh, well,
I believe, as a plugin, the best way to understand it is this: there's a page in the Epic documentation that talks about the whole TAA and TAAU render chain, and you can see where the temporal anti-aliasing resides in that chain. If you go there--basically, DLSS replaces that. So you're doing one or the other, but it exists at that point. And it's going to depend on whether you're doing the upsample version or not.
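A small sketch of that mutual exclusivity: the engine either runs its own temporal upsampler (TAAU) at that point in the chain or hands the stage to DLSS. r.TemporalAA.Upsampling is a standard UE4 cvar; r.NGX.DLSS.Enable is the plugin's, so treat that name as an assumption and verify it against the plugin docs.

```cpp
// One or the other: TAAU and DLSS occupy the same slot in the chain.
#include "HAL/IConsoleManager.h"

static void UseDLSSInsteadOfTAAU(bool bUseDLSS)
{
    IConsoleManager& CVars = IConsoleManager::Get();

    if (IConsoleVariable* TAAU = CVars.FindConsoleVariable(TEXT("r.TemporalAA.Upsampling")))
    {
        TAAU->Set(bUseDLSS ? 0 : 1); // engine's temporal upsampler
    }
    if (IConsoleVariable* DLSS = CVars.FindConsoleVariable(TEXT("r.NGX.DLSS.Enable")))
    {
        DLSS->Set(bUseDLSS ? 1 : 0); // DLSS takes over that stage
    }
}
```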
"Will this be supported on mobile NVIDIA chip sets?" RICHARD: I don't have
any answer for that right now. VICTOR: Cool. Let's keep some of the
other questions for the end. I'm also getting some general ones that we can cover a little bit later on.
RICHARD: Sure. Yeah. OK. So I also wanted to
show this project here. This is a game coming out
called The Fabled Woods, and it's made by
an Indie developer. This is really a great-- It's not out yet, it
comes out March 25th. But it's really a
great success story from a technical standpoint. The developer is
actually a one-man team--
VICTOR: I think
we lost Richard for-- RICHARD:
--Pumpkin Jack as well. And so they end up-- Oh yeah, is everything-- VICTOR: Yeah let's just
wait for the project to load. I think your CPU's taking
a little hit right there. RICHARD: Yeah. [AUDIO OUT] I was launching it,
so that makes sense. VICTOR: OK. Check one, check two. I think we're good. RICHARD: Yeah. VICTOR: OK
Yeah we're good. RICHARD: It's doing it again. OK. VICTOR: Yeah. Well, at least we have you. RICHARD: Yeah. OK, so I don't want
to show the whole game or anything like that. But I just wanted to show this
is an interesting and really cool project because I mean
NVIDIA is a big company. We work with a lot of Triple-As,
we work with a lot of big games. But we work with Indies too. We work with a whole range
of different developers. And this particular
developer is really incorporating our technology. So they're using our NvRTX branch for optimized ray tracing.
they're using RTX GI. They're doing a
whole bunch of things that are really powerful. And they're using it to produce
a ray traced forest scene. This might be one of the
first to be done at this scale. And I did want to show you a
little bit of that content because I think it
speaks for itself. VICTOR: Yeah you
showed me this yesterday, and I was quite impressed. Especially considering
the difference between this scene
and the one you showed in the previous
stream we did. We can tell that Richard's PC
is once again loading a scene. I believe he'll be
back momentarily. All right scene
loaded into memory. Let's see if this could-- gets a little bit of that back. RICHARD:
This is [AUDIO OUT]. There it goes. VICTOR: All
right I think we can-- yes we're back, Richard. RICHARD: OK I
guess it needs to rebuild some shaders for some reason, even though I just loaded this right before the stream started. Can you hear me OK?
very interesting from a technical standpoint. I mean, I think this is going
to be a really cool game. This developer, they really
want to tell a story. It's a single player
story-driven game. But I wanted to talk
a little bit about it from a technical standpoint and
how DLSS is benefiting this. Because he got our plugin
and incorporated it. And, like I said, largely
it speaks for itself. And it's not necessarily-- this game is ray traced, but it also does raster. There's both. But this is a ray-traced forest
on a very large environment with a ton of foliage overdraw. And this is normally
a pretty killer workload. This is really difficult to do in any setting, ray tracing or not. But anyway,
this is without DLSS and it's hanging in
there on a 2080 card. It's getting a good frame
rate but we can do better. VICTOR: Enhance. RICHARD: Enhance. There we go. In this particular
case, it does great in terms of quality, and I almost--almost--doubled my frame rate. And real-time reflections,
not fake ones. Ray traced shadows
over everything. Over a very complicated
environment, I think, yeah, like
last time we talked, I showed a test forest
scene because this type of environment is
something we would like to make sure runs at a
really high quality and really fast all the time. And so we've spent a lot
of development effort to help boost that, optimizing
foliage and how it behaves, and all that sort of thing. And I want to know too
because sometimes it comes up people ask about our branch
the NvRTX branch which anyone can get. They get access to the
source code with a sign up. So you go to
GitHub, you sign up. Once you're in there, you have
access to the source code. You can see it all for yourself. But people ask NvRTX,
well, how custom is that? That sounds like it's only going
to work on NVIDIA hardware. That's not the case at all. NvRTX is it 100%
follows DXR standards. So that's Microsoft's
DX12 ray tracing standard. And NvRTX follows that 100%. The only thing that doesn't
follow DXR standards is DLSS. And that's because
it's not a ray tracing technology for one thing. And another thing is that it
requires Tensor Core hardware that's tied to the hardware. But everything we do would run
on any ray tracing hardware. It would run on a PlayStation 5. It would run on an AMD card. It doesn't matter. We'd like to think that
on our hardware runs-- it runs the best
but we're not doing any specific optimizations. Zero optimizations
for our hardware. And like I said the
proof is in the pudding. You can get the source code you
can go through it line by line. You can see for yourself. So it's-- yeah. VICTOR: In that
previous camera position you ran, could you just toggle it on and off?
RICHARD: Oh, the
ray tracing or the DLSS? We're talking DLSS, right? VICTOR: Yes RICHARD: OK. So, yeah, that's
DLSS off and DLSS on. VICTOR: Thanks Richard RICHARD: Sure. Yeah. If there are any differences,
they are often very subtle. I mean we can also compare
it to DLSS quality mode. There's still a significant
performance bump. And like you said, it's down to the preferences for your game--in this case, the user can choose what quality mode they want, or whether they even want DLSS, and set it up that way. And I'm pretty sure this
game is set up that way. So it's not required. But if you have an RTX card
and you want that boost, you get to choose
how you use it. VICTOR: Yeah. I guess something
that folks might be curious about
is are you aware, other than implementing
the plugin to this project, are you aware of
any other work that was done with DLSS to
get it to run this way, or was it more just an
implant and turn on? RICHARD: No. Most of the work is-- it was mostly just turn
it on or turn it off. Yeah a lot of-- I'd say most of the work
went into other areas. The reason why I talk about ray
tracing so much with this one is because we want
ray tracing to work with all kinds of content. Foliage and moving animated
shadows, and whatever. And so this takes
advantage of that. We did a lot of work to make
sure that part of it works. But, yeah, DLSS tends
to be very automatic. Unless you run into one
of the, like I said, the aforementioned maybe a
depth of field issue, maybe a bloom issue. Sometimes issues with motion
vectors--we've documented that in the PDF. But in this particular game, I don't think he's doing anything with depth of field--or very little; he's not trying to blur things at a distance. So, yeah, for the most
part, DLSS just worked. VICTOR: Awesome RICHARD: Yeah, VICTOR: Definitely a case of, let's just sit here and
look at pretty pictures. RICHARD: Yeah
it is pretty to look at. This actually has--it's come up occasionally, people will notice this--whoops, I'm selecting everything now--there are volumetric fog shadows going on. That's something we have fixed in our NvRTX branch, so you can get volumetric fog shadows in ray tracing. But, yeah, like I said, our code benefits everybody. People often have that question. There are really small one-
man teams like this project. There's big Triple-A games and
there's everything in between. We do a lot of interacting
with developers might be a 10 man team
or a 20 person team. And they've got maybe a PS 5
version of the game planned. It's a common question does-- will this technology
work on other platforms? Yes. Absolutely. DLSS says it won't
but that's not today. I'll say that. VICTOR: Yeah
I think you mentioned that the plugin that's available
is still in beta officially, right? RICHARD: What
are we talking about? For the-- VICTOR: DLSS
plugin for Unreal Engine. RICHARD: No. The dynamic resolution
feature is a beta. But, I mean, what
we have is, I think, considered release code.
VICTOR: OK.
RICHARD: Yeah, ready to be used in any project.
VICTOR: Enhance.
RICHARD: Enhance. Exactly.
VICTOR: Stuck on that now. Let's see. We've seen a couple
more questions here. Let's see if any of them are
related to the project here. RICHARD: Oh, sure. Yeah and let's see. Yeah, I did want to-- well I mean-- yeah
let's see if there are any other questions though. I don't want to miss
anything if people are wondering about something. VICTOR: I think you're good. Why don't you continue and I
can mark some of them here. Just to make sure that
we'll get to them. RICHARD: OK. I did want to take some time
to talk about some things we have coming up that aren't
specifically DLSS related. But if you're ready to move
into that I'm ready too. VICTOR: Yeah let's go. RICHARD: OK so if we
could pull up the RTXDI page or-- I don't remember what
we were going to do with that if I can get control. But-- VICTOR: We
got you, Richard. It's up. RICHARD: OK. Cool. So, yeah. If you-- RTXDI is a
ray tracing technology we have coming out very soon. And I really just want to put-- this is a public page
we have right now. So you can go to this web page
right now and read about it. This is going to be
coming to Unreal Engine. And this is a very big deal
from a rendering and lighting standpoint. RTXDI is just fundamentally going to change how we light scenes, using ray tracing methods--basically new denoising methods, new lighting methods, new shadow casting methods. It is going to allow you to
have what basically amounts to unlimited light sourcing. I don't know if this one can
be overstated--it's that big a deal. Not only are you going to
be able to have literally thousands of shadow casting
lights in your scene, but you are probably going to
be rendering the scene faster than you do today. And this is strictly software. These are software improvements
using the existing hardware to enable new
forms of rendering. I mean, I'm blown
away by this one because if you look back at
the history of modern gaming, the turning point for
modern 3D rendering was somewhere
around like Doom 3. Doom 3 comes out and they have
hardware-based shadow casting lights. And they were
working with pretty hard limits that we still work with today. Doom 3 was like, you can only
have like four shadow casting lights in your scene. That's it. This changes all that--it's going to change it radically. You could go crazy with shadow casting lights, and you don't have to worry about culling them by distance like we do now, where you set a max draw distance on them, or maybe put stuff in a different sub-level and stream it out. You don't have to worry about any of that stuff. And, yeah. I mean, we'll have more to
say about this coming up. I just wanted to put a highlighter on it and let everybody know this is coming. And it's a real thing; I've seen it working. It actually started as a technology under a different name, called ReSTIR, and as it gets moved into Unreal Engine it's called RTXDI--so, direct illumination. And, yeah, this is going
to change a lot as far as what people thought
ray tracing was all about and what you could do with it. VICTOR: That's very exciting. I showed a team
yesterday as well, and I think we
raised some eyebrows.
RICHARD: Yeah. I don't know, it's
probably not going to be any secret
that we want to-- not only do we want to get
this in everybody's hands and make it as
accessible as possible, we want to work with
you guys, with Epic so that this is a technology
that we're coordinating on now and into the future. And just like everything else
we do in the NvRTX branch this is not tied to our
specific hardware at all. It follows the same DXR
standards as everything else. The software that we're
developing, in theory, benefits the whole industry. And maybe in the future. It allows for some
future console or who knows what we're doing,
unlimited lighting there to maybe even on
your phone someday. So the potential is huge. I mean, this is like-- this
gets us potentially another step towards eliminating
baked lighting completely and just dynamic
everything all the time. VICTOR: Going
through our questions here. RICHARD: Oh, Sure. VICTOR: Do you have
anything else, Richard, or we're going-- RICHARD: Oh, yeah. Well, I just wanted
to let everybody know that GTC is
coming up next month almost exactly one
month from now. And GTC is our yearly
technology event, and this year, with everything that's going on, it's online, and it's free and open to everybody. So if you register, you can see all the conferences, you can see what we're talking about. There's going to be a lot of Unreal Engine-related stuff discussed there: ray tracing, DLSS,
other technologies. So Yeah I would just encourage
people to check out GTC. And also there's-- we just
opened up a really interesting contest with the
Marbles content. If you can load up that page. VICTOR: We
couldn't get that one. But I can go ahead and link it
in chat for everyone to view. RICHARD: Oh, sure. OK. That's good. Yeah, I just want people
to be aware of that. Because we have this other
rendering workflow system called Omniverse. And in previous keynotes, we showed a thing called Marbles--Marbles at Night--and it was fully path traced. The whole thing was rendered in Omniverse. But it gives you a glimpse of
future rendering capabilities. Ray tracing is where we're at
now; path tracing is the next step. If we can get it to where it's completely real time for everybody, path tracing is where you might want to go. And so we showed
the Marbles demo and it was a path trace scene,
completely photorealistic. And we're doing a contest where
we put out the Marbles content. You can download it. There's a download
link on the web page. And construct
something in Omniverse. But I would just point out, Omniverse has an Unreal Engine 4 connection. So you can actually hook into your Unreal 4 content and do path tracing of your own Unreal 4 content, using a different path tracer than the built-in one. But you can see all that. And we're doing a contest
with the Marbles content where you could win an RTX card. I think there's like a 3080
and a 3090 in the queue. If you want to put
something together, I just want everybody to
know the contest is there. That content, by the
way, is pretty cool. It's really high fidelity,
high detail stuff. We started to see
people in the community were reconstructing
it themselves--like, "I'm going to make a Marbles scene based on that Marbles at Night video"--and rebuild it in one engine or another. And we wanted to actually
put out the content and let people download it and
mess around with it themselves. So that's up there. And, yeah, so GTC, RTXDI,
and the Marbles contest--this is all stuff that's there now or coming soon, and we wanted people to be aware of it.
VICTOR: That's awesome. Cool. Are you ready for
a couple of questions? We can hammer out
some of the ones we received.
RICHARD: Oh, yeah.
VICTOR: Last time you
were on we showed off RTX GI and caustics and such. And I saw a few questions
and chats in regards to what the state of those
were and if you were still required to download
the NVIDIA branch source code of Unreal Engine to
be able to utilize those. RICHARD:
Yeah in the case of-- excuse me. In the case of caustics, yes. I think we're keeping
the caustics branch up to date with the latest NvRTX. I would need to double-check that. But I believe it's
been updated to 4.26. So that branch exists. And we still have goals
with all of this technology--RTX GI, caustics--to
make it more available. We would like to increase
that availability. We would like to make
it easier to get-- we're not there yet. I don't have any announcements
specifically on that today. I wish I did. But I can say that we're close
to having an announcement, and there'll be more information.
VICTOR: Another
question from MasterOne Gaming: "Doesn't NVIDIA DSR impact DLSS? Say you did DSR 1440p with a 1080p monitor--will DLSS scale the image up to 1440p?"
RICHARD: Is that-- is that DXR?
VICTOR: DSR. Might have been DXR. I'm not familiar with
the abbreviation. RICHARD: Yeah
I'm not familiar with it. I'm not sure I know what's
being talked about there. [INTERPOSING VOICES] RICHARD: Just so everybody's aware DXR is Microsoft's
DirectX ray tracing standard. So if we're talking about DXR, that's just a standard that we follow.
VICTOR: Had a
question-- another here from Mippithedork "Any plans on using the
underlying DLSS technology in other aspects
of game development, beyond final screen image output? Perhaps texture up-sampling
or such things." RICHARD: I mean Yeah. No plans specifically but those
are all interesting areas. And I think we're
very open to ideas. The whole Movie Render Queue thing isn't something that we initially thought about with the technology. But it's one of those things that come up once you start to prove that it can work. And that's really
how DLSS started. Because it was very
theoretical to start. If you went back
three years ago, Tensor Core hardware was being invented. What is possible with AI self-writing software, and what can it do to images? What is it capable of doing? These are very powerful ideas, but they needed to be proven--could it actually be done? And DLSS has proven something that, like I said, three years ago was just theoretical. So, yeah. Technology like this opens up our eyes
to even more possibilities: well, what else can we use this on? What are the other applications for it? So there's a lot of discussion about that. And if anybody has ideas about what might be good for it, please post about it. Please talk about it. We try to listen. We want to pay attention.
to execute on your ideas. VICTOR: Hector Centeno had
another question. "Regarding the NVIDIA
driver is there any performance difference
with DLSS between the Game Ready and the Studio driver?" RICHARD:
None I'm aware of. It should be the
same performance. VICTOR: SirVoxelot asked, "Does DLSS work on quadro RTX 6,000,
I can't get it to work." RICHARD: I'm
not exactly sure. I'm more familiar
with the 8000; I'm not so sure about the 6000. We can take that one offline
and I can get an answer to that person. VICTOR: And for
those who aren't aware, we usually follow up after the stream in the forum announcement post on forum.unrealengine.com, in the Events section. That's where you'll find all of the announcements for the livestream, and where we sort of continue the discussion around the topic that we presented on the live stream once we're no longer live.
RICHARD:
Yeah I mean what I can say is that,
like I said, DLSS works on Tensor Core based hardware. So if it's got a Tensor
Core in it, it should work. I'm just not familiar exactly
with the 6000 series.
VICTOR: Let's see. Navhkrin asked, "Does DLSS
use different model per GPU, or is it the same model under
the hood regardless of the GPU? And so"-- RICHARD: Go ahead. VICTOR: --"are you
planning to use better AI cores on Ampere? Ampere I'm probably-- RICHARD: Right. Yeah, well, I mean it's the same
under the hood fundamentally. But it does scale up as
the hardware scales up. So as we-- basically as
you add more transistors and you make the AI hardware
better, software gets better. And we will be continuously
updating software and improving it and maybe if little
optimizations can be done, we do that. In the most ideal
circumstances--if you look at
something like a 2060, and you want to get a really
good frame rate out of it, DLSS can do that. It can enhance the game's
graphics the same way it can for a high end
card and hopefully, do something like doubling
the frame rate even there. So, Yeah. I mean, we just want
it to be a really solid you know great
quality and performance enhancement across the board. VICTOR: Kentix95 asked, "Does UE4
handle the texture mipmap when DLSS is on? A lot of games using DLSS
have blurrier textures due to the smaller
internal resolution, and they forget to
change to mipmapping." RICHARD: Yeah it can
definitely help with textures. We didn't look at it too closely with the MetaHumans project, and it might be better for you individually to just pull down the MetaHumans project, put the DLSS plugin on it, and take a look at it yourself. But it looked to me like even details in the skin pores were getting just a little
bit sharper and a little bit better. And that's DLSS operating
on those pixels. So it's not just the edges of surfaces--it's the whole image.
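On the mipmap point the question raises, a common mitigation is a negative mip LOD bias that matches the lower internal resolution. r.MipMapLODBias is a standard UE4 cvar; whether the DLSS plugin already applies a bias for you is something to verify against its documentation, so treat this as an illustrative sketch rather than the plugin's behavior.

```cpp
// Illustrative sketch: bias mip selection to match a reduced internal
// resolution so textures don't pick overly coarse mips.
#include "HAL/IConsoleManager.h"
#include "Math/UnrealMathUtility.h"

static void ApplyUpscalingMipBias(float InternalScreenPercentage)
{
    // e.g. 66.7% internal resolution -> log2(0.667) ~= -0.58
    const float Bias = FMath::Log2(InternalScreenPercentage / 100.0f);

    if (IConsoleVariable* CVar =
        IConsoleManager::Get().FindConsoleVariable(TEXT("r.MipMapLODBias")))
    {
        CVar->Set(Bias);
    }
}
```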
VICTOR: Batou asked, "Is there any relationship between DLSS and the upscaling tech used in the NVIDIA Shield for video?"
RICHARD: I don't know. But that's something we
can take offline too. I would suspect not,
but I just don't know. VICTOR: Another
General RTX question here. Bob's Definitely Not Your
Uncle asked, "Pixel offset doesn't currently
work with RTX which makes high quality blended
scenes difficult to work with. Is there something
in the works to fix this or any recommendations
to resolving the conflicts?" I think this a general Unreal
Engine question as well. RICHARD: Yeah. Actually I got educated
on this recently myself because I was wrong
about a certain detail. I thought it didn't work either. What's actually true is it
works but only in one direction. So the problem is
you're trying to-- oh, sorry I got to drink
something before I answer this. VICTOR: It's all good. We've been going
for two hours here. RICHARD: When you're ray
tracing shadows and the problem specifically here is
with ray tracing shadows. So if you're not
ray tracing shadows there's no issue whatsoever. If you're just doing
reflections or maybe-- well, yeah, if you're doing
reflections or translucency, you're probably OK. And if you're doing
raster shadows with ray traced reflections
there shouldn't be an issue. But if you're ray
tracing shadows and you've done that
pixel depth offset, and it's pushed outward
away from the surface, the ray tracer doesn't know-- because it's tracing off of
the literal geometry in order to determine where the shadows fall. So that pixel depth
offset is just a material effect and the
geometry hasn't changed. So the ray tracer doesn't know
how to compensate for that. And there's no good
way to do it right now. But if you pixel depth
offset in the other direction in the negative
direction, it does work. The issue just tends
to be more limited. It can depend on how much
you pixel depth offset and in which direction. And then what types of
shadow casting lights are hitting the surface. Now, that's instructional,
that's informational. Are we satisfied with that? No. We would like to make it better. We're aware of the issue. But I don't have anything
to announce on that yet. VICTOR: Let's see. Your next question
comes from DISPLACE. "With RTXDI using
ReSTIR for denoising, do you think we could see ReSTIR for GI denoising in the NVIDIA builds?"
RICHARD: Yes, it's
possible. I mean, I don't know how much of
this is public exactly, but I mean it's well-known that
Fortnite now has ray tracing. And we helped with
those efforts in order to improve denoisers,
improve performance, all that sort of thing. And I believe Fortnite
has a version of RT GI currently implemented
with an improved denoiser. And it could be that that is not
out for general development yet. So it's being used in Fortnite. And you would see
this right now. I don't think that's a
giant secret or anything. You'd see this if you
load up Fortnite and turn on ray tracing--you'd see some GI; that's
what's going on. So if you look at the RTXDI page, you'll see that improved denoisers are in the game plan. That's part of what makes this possible. A technology like RTXDI, instead of doing it the way it's been done for the past 20 years--where we sample individual lights, cast shadows for individual lights, and process each light independently of every other light--that's a very expensive way to do it. It works, it's just expensive. Part of RTXDI is improved denoisers--we have a separate page. If you surf around
the NVIDIA website, you'll find a separate page
talking about Watch Dogs: Legion, I believe, and how they're
using a next-generation ray tracing denoiser. So our goal, at a very high level, is faster, better rendering and better denoisers. And it's all linked together: instead of processing each light and shadow independently of each other, you do them all at the same time. It's one cost no matter how many lights you're doing, and then you denoise it in a more optimized way, where you're not only getting better quality out of it, but you're doing it in a way that's faster as well. You've sort of changed everything as far as how we light scenes. So improved denoisers are in the future, absolutely. And, like I said, we've got public pages showing that, talking about that, right now.
VICTOR: Next question
comes from Bentraxx. "If I have DLSS version
0.2.11 2021 already installed and want to update to the new
version of 0.3.08 2021 for UE4, is it easy to update I am
a little afraid of it now." RICHARD:
Oh, yeah, it's easy. Just get the latest plugin and copy it over what you've installed before. You can manually delete the old plugin, put in the new one, and you're good to go. I've tested this personally: change the plugin version, load your project back up, and if DLSS was enabled before, it's still enabled, because that setting is stored in the Unreal project--the .uproject settings. Whether DLSS is enabled or not is in that text document. So, yeah, you load up and it should just work.
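For reference, the text document he's describing is the project's .uproject file, which is plain JSON; the plugin-enable flag survives a plugin swap because it lives there. A minimal sketch, with the plugin name assumed to match your installed DLSS plugin folder:

```json
{
  "FileVersion": 3,
  "EngineAssociation": "4.26",
  "Plugins": [
    { "Name": "DLSS", "Enabled": true }
  ]
}
```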
VICTOR: This is a pretty good question from Bassem Zammeli: "Is DLSS free to use
in a commercially-- in a commercial project or do
we need some sort of license from NVIDIA?" RICHARD:
It's free to use. It's against the end
user license agreement to redistribute our source. We want people to get our
source from our website. We do this really
just so we know who's using the software and how many people are getting the software. We're curious about
those details. But once you get it and
once you build your binary and you distribute it, we're
not asking anything from you at that point. You don't have to notify us; you don't have to do anything. It's your application.
VICTOR: I don't
believe that is entirely true. I believe you are required
to contact NVIDIA. There is a page. RICHARD: Is there? VICTOR: Mm-hmm I was
looking into this earlier too. RICHARD: OK. Yeah, please correct
me, please correct me. VICTOR: I'm
trying to find the link. For some reason the
Marketplace page is not loading for me right now. And that's the route I
remember taking last time. I can find it within
the next 10 seconds, RICHARD: But the
person is free to use it and I don't think we would-- there's almost no way we would
block somebody from using it. VICTOR: Number four. "You are required
to notify NVIDIA prior to commercial release
of an application including a plugin to a
commercial application. Please send notifications
to developer.nvidia.com/sw notification." So there are a couple of things--essentially a notification to NVIDIA that you're shipping, [INAUDIBLE], based on the end user license agreement. But that does not in any way disqualify you from using it commercially; it is just a requirement that you notify NVIDIA.
RICHARD: Thank
you for the correction. Thank you. VICTOR: Yes. Now, when it comes to releasing
games and when you're using-- with native Unreal Engine
it's fairly simple. You're still required
to notify us prior to release of your application. And the same usually
applies to other resources. And so before you release
anything make sure that you read through the end
user license agreement of all of the products that
you utilizing now, it's important because you're
using software and products that other companies
have developed, and you are trying to
make money using those. And so just do
your due diligence, read through the
agreements and make sure you don't have any
angry emails in your inbox after a couple of weeks
of releasing your game. RICHARD: That's
very good advice. VICTOR: And
that involves-- it's fairly straightforward
with the marketplace in general. All the products
in the marketplace go on under the same end
user license agreement. Do not redistribute-- RICHARD: Yes. VICTOR: Yeah
and a couple more things. It's just general advice to make sure that you cover your
bases when it comes to releasing your products. And it includes if it's released
for free, in some instances: even though you're not charging money for it, it is still a commercial application that you have submitted for
public consumption. RICHARD: Right. Yeah. VICTOR: Cool. Let's move on from
the boring legal stuff and continue with the
interesting technology stuff. RICHARD: That's
a lot of legal, yeah. VICTOR: Yeah. Yosimba asked, "Is there any
one"-- oh, funny question, "Is there anyone
credited with inventing the DLSS algos, or was there a team behind it with a name?"
RICHARD: I don't know the answer to that, actually. A lot of engineers, a lot of people have worked on it, and at this point it has become black-box software in the sense that--I mean, we're fine-tuning it and guiding it, if you will, and we make a plugin and so forth. But the core of it is self-writing software in many respects. So it's a brave new world, but a lot of engineers and a lot of work have gone into it, from a lot of people.
might be a follow up on our DSR/DXR question. But Felipe Magalhães asked,
"Could DLSS work together with dynamic resolution
scaling to achieve a target frames per second by
adjusting the render resolution on the fly?" RICHARD: Yeah,
theoretically, yes. We're not there yet, but like I showed earlier, we have a beta of dynamic
resolution scaling implemented. And you can try it for
yourself with a CVar. But, yeah, nothing else really
for me to say about that yet just that it's on our radar
and we want to see it grow. VICTOR: Next question
comes from vainsoftgames. "Any plans on bringing
DLSS as to web browsers? Could be a method to save
bandwidth on video platforms." RICHARD: Yes. That's an interesting idea. I don't have anything to
say about that right now. VICTOR:
The next question, which is our last question
if you have anything else--we've got a couple of minutes left. Make sure you shoot
your questions to us before we go offline here. But next question
from wildoxstudios is, "Will RTXDI run
with DLSS in VR?" RICHARD: Well in VR. Right now ray tracing generally
works pretty poorly in VR. It's not mature yet, I think
it works in some respects or in a limited way,
but it's not there yet. Speaking personally, it
would be great to see all these technologies
play together. But I guess that's a
future conversation. VICTOR: Lottalava asked,
"Am I required to buy an hour takes 3090 to work with DLSS?" RICHARD: Oh, no. No it'll work on any RTX card. VICTOR: Any
RTX card, yeah. RICHARD: Yeah. You can-- a 2060
is all you need. VICTOR: Awesome. I think in terms of
questions that was it so far. I will scroll through
this here make sure I didn't miss
anything relevant. I think that was it actually. Richard was there
anything else you want to cover before I
do my outro spiel here?
RICHARD: Oh, goodness. No, I think we covered
a lot of ground. And I'm happy we kept it-- I mean this is
two hours on DLSS. That's a lot to talk about
for just one technology. And we dipped into RTXDI and a few other things that are coming up. But, yeah, there's a lot that DLSS has to offer, and that's very clear.
VICTOR: I think so. I think so. Well, Richard, thank you
so much for coming out today and showing off
some pretty pictures and-- [INTERPOSING VOICES] RICHARD: [INAUDIBLE] mode. VICTOR: It is available. You can find a link to
the developer page-- NVIDIA developer page
where you can download DLSS on the UE4 Marketplace. But you can also find a link
on the forum announcement post. And I'm pretty sure
you can just google "NVIDIA DLSS Unreal Engine," and that will take you straight to the link as well
if you're interested in getting started using that. If you're new to game
development in general, and you've been enjoying
the stream today and are curious about
game development, then on unrealengine.com you can go ahead and download the engine for free--the launcher, then the engine. If you already have the
launcher just go ahead and hit the Unreal Engine tab on the
left and download the latest build of 4.26.1. I believe it's live. And you can also visit
learn.unrealengine.com for a plethora of courses
on how to do everything from Blueprints to
lighting to optimization. There's a great
library of content. And make sure you also
visit all of our community's previous content. There are thousands of tutorials on YouTube, as well as conversations happening around our forums where people like to post them. There are two good channels
there on the forums, Work in Progress as
well as Released. You can go ahead and
post your projects and let us know what
you're working on. We're also always listening on
Twitter, Facebook, and LinkedIn as well. Make sure you let us know
what you are working on, and then we can
potentially spotlight you as part of this live
stream at the beginning. And if you go ahead and
visit the launcher afterwards we have a nice little screenshot
there for your project. There are no physical
meet-ups going on right now throughout the pandemic. However some of the
community groups at communities.unrealengine.com are still organizing virtual meet-ups. If you're curious about finding other like-minded people--whether you're a game developer or someone who wants to dabble with film in Unreal Engine--there are plenty of other
like-minded individuals out there who are excited about
getting together and talking about these technologies,
showing off projects, potentially helping out. One of my best memories, actually, is from Seattle, when we were hosting the meet-up over there and we all got to show off our projects and give each other feedback. It's a great environment that
you can find yourself in, to get a little bit more of
a connection to all the stuff that you're working on. I mentioned the forums,
but there's also a community-run Discord channel at unrealslackers.org. There are 50,000 members all talking about all aspects of Unreal
Engine and game development. It's an exciting place to be. I also mentioned there's
another community group on Facebook, as well as Reddit. And if you are curious about-- well, I was going to say there; I started off the sentence wrong. If you would like to submit a countdown video for the livestream, please send us a capture of 30 minutes of development in engine, fast-forwarded to five minutes. Send it to us together
with your logo. I know another one was just submitted the other day; I'm going to go ahead and get that into the roll of countdowns that we do every week. But we're excited about
seeing more projects. And we're looking for
non-games projects as well, as Unreal Engine is now encompassing many more industries--it's
not just video games anymore. If you stream on
Twitch, make sure you add the Unreal Engine tag as
well as game development tag. That's the best way for
viewers to filter your content if they're curious
about watching Unreal Engine and
the development on Twitch which can be fun. We're going to go
ahead and raid someone, I think, as we're done with
the stream here today. Make sure you follow
us on social media. And if you're
watching on YouTube, hit that notification
bell and then you can see the next
time we go live, as well as all the
cool content that we're putting out on the channel. The evangelists are all-- since they're unable
to participate in live physical
events, they are working on awesome feature
videos for some new and some old features
in Unreal Engine. I know that with Andreas Suika, we just released yesterday the video on how to use Control Rig for a biped and a spider--a robotic spider--and how to use Control Rig to drive your animations dynamically at runtime. It's great stuff. Next week we're going to
have two of the developers from Mortal Shell on the stream. They're going to go behind the
scenes of Mortal Shell, which is super exciting. I've been talking to them for
a while, happy to have them on the stream. Make sure you tune
in for that if you want to see a little bit
of behind the scenes how Mortal Shell was
developed and published. They're also going to talk
a little bit about how you can be an Indie developer and actually publish your game. Which can be rather scary when you get to the moment where you've been working on it for a while and now it's actually time to release it.
RICHARD: Very
impressive game by the way. VICTOR: Yes RICHARD: I mean we
did a lot of work with them. So, yeah, I'm very
happy with how it turned out for them. VICTOR: Cool. Yeah. Tune in next week if you
want to check that out. And with that said,
Richard, thanks again to you and to NVIDIA for coming on
the stream today, showing us some of your new technologies. You have anything you
want to leave the stream with before we go offline?
RICHARD: No. Thanks, everybody
for being here today. Really appreciate it. And this was a lot of fun. I just hope we can do
it again in the future. VICTOR:
For sure, Richard. I would love to
have you back on. With that said I hope
everyone stays safe out there. Have a good weekend. And we will see you again
same time next week. Take care, everyone.
RICHARD: Bye.