DLSS 3 Frame Generation - You're Doing It Wrong

Captions
So a few months ago I did a video titled "Why Do People Hate DLSS 3 Frame Generation?", and while the video was mostly well received, it is also the video with the most dislikes on this channel. The comments were pretty funny, and they made me realize that maybe the point I was trying to get across got lost in translation somewhere. So today we're going back to DLSS 3 frame generation and talking about why you shouldn't hate it.

First up, let's address the differences between DLSS 2 and DLSS 3 and try to clear up some confusion. DLSS 2 is an upscaler from Nvidia that uses AI for more accurate upscaling. What this means is that if you use DLSS 2 super resolution, your game runs at a lower internal resolution determined by the DLSS 2 setting and is then upscaled to your monitor's native resolution by leveraging the tensor cores on the GPU. DLSS 3, on the other hand, is just a term: it's a combination of three things, namely DLSS 2, Nvidia Reflex, and frame generation. Nvidia Reflex is some trickery to reduce input latency and improve responsiveness in your games; it's similar to Radeon Anti-Lag, though I have no idea how either of these technologies works at its core. Frame generation is another AI-based feature used to increase frame rates by inserting AI-generated frames in between the real frames produced by the game engine, using the upgraded optical flow accelerator found on the RTX 40 series GPUs. This is where the "fake frames" term comes from, and it is technically not incorrect. All three of these features combined make up DLSS 3.
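To make the super resolution part a bit more concrete, here is a minimal sketch of how the internal render resolution relates to the output resolution. The per-axis scale factors below are the commonly quoted ones for the DLSS 2 quality modes and are an assumption for illustration, not figures taken from this video.

```python
# Rough sketch: internal render resolution for DLSS 2 quality modes.
# The per-axis scale factors are the commonly quoted ones and are an
# assumption for illustration, not values confirmed in this video.

DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(output_w: int, output_h: int, mode: str) -> tuple[int, int]:
    """Approximate resolution the game renders at before the tensor
    cores upscale the image to the output resolution."""
    s = DLSS_SCALE[mode]
    return round(output_w * s), round(output_h * s)

if __name__ == "__main__":
    for mode in DLSS_SCALE:
        w, h = internal_resolution(2560, 1440, mode)
        print(f"1440p + DLSS {mode:17s} -> renders at ~{w}x{h}")
```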
Now, these options can be toggled independently, with one exception. You can enable DLSS 2 super resolution without Nvidia Reflex, you can have Nvidia Reflex enabled without frame generation or DLSS 2 super resolution, and you can enable frame generation without enabling DLSS 2 super resolution, in most games anyway. What you can't do is disable Nvidia Reflex when you enable frame generation. The reason is that enabling frame generation does increase your input latency slightly. By how much, you might ask? Well, it depends on your base frame rate before frame generation is enabled, and we're going to have a look at that right now.

All right, this will just be a quick test to show you the difference in PC latency, or input latency. I know this method is maybe not 100% accurate, you really need the Nvidia Reflex latency analyzer and tools and so on, but I don't have those, so we'll just be relying on the GeForce Experience performance overlay in the top right-hand corner. Currently we're getting around 50 frames per second; this is at 1440p with the RT Ultra preset but no DLSS super resolution and no frame generation, with the average PC latency around 45 milliseconds. I'm going to enable frame generation here, leave super resolution disabled, and see what happens to the latency. It does take a little while to normalize, and you can see we're sitting at around 55 milliseconds, so that's a big increase in input latency, although the game definitely still feels fine. This is a single-player shooter, so this latency is definitely not the end of the world. Now let's see what happens if we disable frame generation and only enable DLSS super resolution. You can see our average PC latency has decreased from 35 milliseconds to 30 milliseconds, and our frame rate went up from 45-50 frames per second to 80 frames per second, so that's a pretty good increase in performance all by itself. Now we'll enable frame generation along with these settings as well; just make sure DLSS super resolution is set back to the previous setting, because it does default to Auto when you enable frame generation in Cyberpunk. Give it a little while to normalize, and now you can see we're back to around 40 milliseconds of latency. Sure, it's more or less the same as what it was at 50 frames per second, slightly lower even, but the motion fluidity of what is presented to me is much, much improved, and seeing that it's a single-player game, the slight increase in input latency is definitely not going to bother me. That said, another channel also did a few blind tests, and most of their staff couldn't actually tell the difference between frame generation on versus off. If you need a monitor or a tool to show you that there's an increase in input latency and you can't notice it yourself, then that speaks volumes, right? It's the same as going from 110 to 120 frames per second: you won't be able to see or tell the difference unless you've got a frame rate counter. All right, let's have a look at another game quickly.

Here we have Spider-Man Remastered. I just want to show you the settings quickly: we're at 1440p once again, frame generation disabled, upscaling disabled, and everything set to very high, with the highest settings for ray tracing enabled as well. Here you can see our average PC latency is around 30 milliseconds and we're getting around 87 frames per second. I'm just going to enable DLSS super resolution quickly and see what we get. This game is actually very nice because it supports multiple upscaling methods, but we'll stick to DLSS super resolution because that's what we used in our previous test, and remember we had around 80-85 frames per second and 30 milliseconds of input latency. Let's apply the changes, and now you can see that our frame rate actually did not increase. If we open up the MSI Afterburner overlay, you can see that the RTX 4080's usage is sitting at around 60 percent. This is because the game at this point is CPU bound; this is a 12700K CPU, so no slouch, but definitely an issue for the RTX 4080 in this game at these settings. We can change the DLSS setting as much as we want and we won't see any improvement in performance. We can even go to Ultra Performance; we might see a slight change, but not a lot, and we're still sitting at 87 frames per second. So that's just a perfect example of a CPU bottleneck, and at Ultra Performance the game doesn't look that good either. Let's move that back over to Quality and I'll show you another benefit of frame generation. Back at DLSS Quality we're still at 85 frames per second and 30 milliseconds of latency, and nothing we do actually improves our frame rate. So let's enable frame generation; you'll see that V-Sync now gets grayed out. Apply this, and now we're getting 160 frames per second, pretty much double what we had. The PC latency went up slightly, by 10 milliseconds; once again, that's a 33% increase, but 10 milliseconds is a hundredth of a second, so I highly doubt that you're going to notice it. With frame generation our frame rate improved to basically 160 frames per second, and the motion fluidity is much improved as well. So that's just another benefit of frame generation: even when you are CPU bound, you can still get a decent frame rate improvement.
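Just to put those Spider-Man numbers in perspective, here is a back-of-the-envelope sketch comparing frame time and PC latency before and after frame generation. The figures are the ones measured above; the helper itself is only illustrative.

```python
# Back-of-the-envelope comparison of the Spider-Man numbers above:
# frame generation roughly doubled the presented frame rate while
# adding about 10 ms of PC latency.

def frame_time_ms(fps: float) -> float:
    """Time between presented frames, in milliseconds."""
    return 1000.0 / fps

before_fps, before_latency_ms = 87, 30   # DLSS Quality, no frame generation
after_fps, after_latency_ms = 160, 40    # frame generation enabled

delta = after_latency_ms - before_latency_ms
print(f"Frame time: {frame_time_ms(before_fps):.1f} ms -> {frame_time_ms(after_fps):.1f} ms")
print(f"Latency:    {before_latency_ms} ms -> {after_latency_ms} ms "
      f"(+{delta} ms, {100 * delta / before_latency_ms:.0f}% relative)")
```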
All right, then on to our last game. The last game we're going to have a look at is Ratchet & Clank. It's actually developed by the same developers as Spider-Man Remastered, Insomniac, and it was ported to PC by Nixxes last week or thereabouts. It's actually a pretty good game; it's got its issues, it's crashing quite a bit for me, but anyway, let's go over the settings quickly. We start at 1440p, no dynamic resolution scaling, no upscaling, and we've got ray tracing enabled everywhere as well. We're sitting at around 78 frames per second with around 35 milliseconds of PC latency. Let's enable frame generation and apply that, and immediately we went from 80 frames per second to 130 frames per second, with our average PC latency once again increasing by around 10 to 12 milliseconds. Definitely not the end of the world, but now we have a high refresh rate experience, going from 80 frames per second to 130 frames per second, depending on where you look, obviously. Let's see what happens when we only enable DLSS super resolution, so I'm going to disable frame generation and use DLSS super resolution on Quality mode. Our average PC latency stays more or less the same as it was at native, and our frame rate did increase by around 25 to 30 percent, which is good to see. So just keep that in mind, 25 milliseconds, let's call it 30 milliseconds, and let's then enable frame generation as well. Give it some time to settle down, and we went from 80 frames per second, give or take, to 160 frames per second, with around a 10 to 12 millisecond increase in PC latency. I think that's extremely good. The game still feels very good, and if you play it with a controller like I do (I'm benchmarking with keyboard and mouse here), you definitely won't feel that 10 millisecond increase in latency, and even playing with keyboard and mouse, if you do feel it, you have to be extremely sensitive to it.

So basically what I'm trying to say is that if you have a high enough frame rate, I'd say 50-60 frames per second, and you then enable frame generation, the input latency is definitely not an issue, and it's there to give players like myself a high refresh rate experience with ray tracing and so on. It's not there to give you an end result of 60 frames per second; your starting point should be around 60 frames per second, otherwise the input latency will definitely become an issue. I actually do have a Cyberpunk video up where I tested the 4070 Ti and went from 30 frames per second to 50 frames per second, and the input latency was around 200 milliseconds, and that you can definitely feel. But back to my script: now we'll talk about when you should be using frame generation. This is not a magical fix that will suddenly turn your 4060 into a beastly 4K GPU. Sure, you can probably go from 20 frames per second to 40 frames per second by using frame generation, but the input latency will be horrendous and the game will basically be unplayable. I personally have found that as long as I have around 50 frames per second as a base frame rate, I can enable frame generation without the increased input latency being an issue. Sure, there are people who are a lot more sensitive to input latency, and you should probably avoid this tech if higher input latency bothers you.
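If you want that rule of thumb written down, here is a toy sketch of it. The thresholds are the ones I suggest in this video, roughly a 50-60 fps base before enabling frame generation; they are a personal guideline, nothing official from Nvidia.

```python
# Toy encoding of the rule of thumb discussed above: frame generation is
# worth enabling once the base frame rate (before FG) is already decent.
# Thresholds are the personal guideline from this video, not a spec.

def should_enable_frame_generation(base_fps: float,
                                   latency_sensitive: bool = False) -> str:
    if latency_sensitive:
        return "Skip it - the extra input latency will likely bother you."
    if base_fps < 50:
        return "Skip it - the latency penalty is too noticeable from this low a base."
    if base_fps < 60:
        return "Usable, but close to the limit; single-player with a controller is fine."
    return "Go for it - a high refresh rate experience for a few ms of extra latency."

for fps in (30, 55, 80):
    print(f"{fps:>3} fps base: {should_enable_frame_generation(fps)}")
```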
The real benefit of frame generation comes in the form of high refresh rate gaming. In Cyberpunk 2077, for example, at 1440p with the RT Ultra preset and DLSS Quality, I'm getting around 90 frames per second. When I enable frame generation, that becomes around 120 to 130 frames per second; well, it actually goes to around 160 frames per second when I'm not recording on this machine using OBS. Now, this is not just a number on a benchmark that is increasing: the motion fluidity the user experiences is much improved. It does indeed look like 160 frames per second, because the monitor is displaying 160 frames per second. Whether 33% of those frames are AI generated doesn't matter, because I'm still presented with an output of 160 frames per second. This is perfect for me, because my monitor is a 165 Hz G-Sync monitor, and now I get full use out of it at the highest possible settings in one of the most demanding games currently available on PC. You will see that the input latency is slightly higher than it was when we were getting around 100 frames per second, but it is still low enough not to bother me personally. That said, normally when your frame rate goes up, your input latency goes down; with frame generation the opposite happens, so it might not feel like 160 frames per second based on the input responsiveness. Once again, if you are at all sensitive to additional input latency and it bothers you, this tech is definitely not for you. Maybe we will see the input latency penalty reduced in future, but at the moment, unfortunately, it is what it is.

Next up we'll talk a bit about power consumption. The whole Ada Lovelace generation is very efficient when it comes to power per frame, but that might not mean much to most people. Frame generation can further reduce the GPU's power usage by quite a bit. I can, for example, run Diablo IV at 1440p Ultra with DLSS 3 enabled at 90 frames per second while the GPU is using around 100 watts. I personally use this when I'm running off batteries, as in South Africa we have rolling blackouts daily, sometimes even up to 12 hours a day. This reduces my system's overall power consumption and allows me to game through these blackouts. Sure, the input latency isn't great with these settings, but Diablo IV is not a latency-sensitive game, and it is definitely better to at least be able to play than not to play at all.
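Since the blackout use case really comes down to simple energy arithmetic, here is a rough sketch. The 100 W and 90 fps figures are the Diablo IV numbers mentioned above; the battery capacity and rest-of-system draw are hypothetical example values, not measurements.

```python
# Rough energy math for the battery/blackout scenario above.
# GPU power and frame rate are the Diablo IV figures from the video;
# battery capacity and rest-of-system draw are hypothetical examples.

gpu_power_w = 100        # GPU draw with DLSS 3 enabled
fps = 90                 # presented frame rate
battery_wh = 500         # hypothetical usable battery/inverter capacity
rest_of_system_w = 150   # hypothetical CPU + motherboard + monitor draw

energy_per_frame_j = gpu_power_w / fps          # joules the GPU spends per presented frame
total_draw_w = gpu_power_w + rest_of_system_w
runtime_h = battery_wh / total_draw_w           # how long the battery lasts at this draw

print(f"GPU energy per presented frame: {energy_per_frame_j:.2f} J")
print(f"Estimated runtime on {battery_wh} Wh at ~{total_draw_w} W total: {runtime_h:.1f} h")
```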
Next up we'll talk about which games you should use it in. I see many comments saying not to use it in competitive multiplayer shooters, and the good news here is that there aren't any competitive multiplayer shooters that support frame generation, as far as I know anyway. Even if there were, you shouldn't be using it there. It is perfectly fine in all single-player games and even some multiplayer games as well. Multiplayer games that support DLSS 3 include titles like F1 22 and 23 and Diablo IV, for example, and there is no issue whatsoever if you enable frame generation in these games. These games do run at high frame rates anyway, so what is the benefit of enabling frame generation? Well, I'm glad you asked. In Diablo IV, while it doesn't really increase the frame rate as much as in other games, it does help reduce stuttering. In the F1 games it helps a lot once you start enabling ray tracing. And since a few milliseconds of additional input latency is not going to make a difference in either of these games, they are perfect candidates for this technology.

Now you might be saying: if you need a high frame rate for frame generation to be viable, what is the whole point? You don't need more than 60 frames per second in AAA titles. Well, we already discussed this: it is not there to give you a playable 60 frames per second, it is there to give you a high refresh rate experience. Competitive players sure aren't the only gamers who enjoy a higher frame rate and a higher refresh rate. Sure, you don't need more than 60 frames per second, but then what is the point of buying a high refresh rate monitor if you aren't going to make full use of it? Many enthusiasts, myself included, prefer a high refresh rate gaming experience, whether it's AAA single-player titles or a competitive online shooter, and that is who DLSS 3 frame generation is actually for.

Lastly, we'll talk about the price premium. The current generation of GPUs is expensive, very expensive, and a lot of people put the blame on DLSS 3. One of the complaints I see the most is that the raw power of GPUs hasn't increased; instead, Nvidia is bumping up prices and relying on DLSS and frame generation to sell cards. While there is some merit to that, I don't agree entirely. Let me explain. The 4090 is the most powerful GPU we have ever seen, and by a long shot: compared to the previous generation, it is on average around 50% faster than the 3090 Ti. The 4080 is also a massive increase in performance and is on average around 50% faster than the previous-gen 3080. The 4070 Ti is about as fast as a 3090 Ti at 1440p and around 42% faster than a 3070 Ti from last gen. These are impressive gains, and they show that raw power has indeed increased quite a lot at the top end. Now, these cards aren't cheap, and along with the nice performance bump we also got a not-so-nice bump in price, but the point here is that the raw power of these GPUs has increased substantially, and they don't need to rely on frame generation to definitively beat their previous-generation counterparts. While the 4070 is still a decent GPU, being about 22% faster than the 3070, the 4060 and 4060 Ti have proven all the naysayers correct. Both of these cards sometimes lose to their previous-generation counterparts, with the 4060 also having less VRAM than the 3060. Now, I know there are two 3060s, a 12 GB and an 8 GB version, but we all know there's really only one version, and that's the 12 GB one. When running at higher resolutions or detail settings, the 4060 oftentimes only matches the performance of the 3060 and sometimes even loses to it. This is completely unacceptable, so in this case I agree that Nvidia is relying purely on DLSS 3 to sell these cards. That said, you can still use DLSS 3 on the 4060 and 4060 Ti to turn your 1080p 60 frames per second experience into a 1080p high refresh rate experience in games that support it. It should, however, not be the only selling point of these GPUs, and unfortunately it kind of is at this point, so I would highly suggest not considering the 4060 or 4060 Ti, especially at current prices. We all know that the 4060 is supposed to be a 4050, or a 4050 Ti at best, and the 4060 Ti should at best be a normal 4060. You would be much better off buying previous-generation AMD or Nvidia GPUs at this point, even if it means missing out on frame generation.
Info
Channel: Mostly Positive Reviews
Views: 67,987
Keywords: Nvidia DLSS 3, Cyberpunk 2077, Dying Light, Hogwarts Legacy, Hitman 3, The Witcher 3, AI upscaling, gaming performance, graphics comparison, gaming technology, frame generation, dlss 3.0, dlss 2.0, dlss frame generation, rtx on, rtx off, spider-man remastered, Nvidia, GeForce, RTX, RTX 4070, RTX 4070 Ti, RTX 4070 Ti Frame Generation, AI Frame Generation, fake frames, frame insertion, frame interpolation, higher FPS cyberpunk, higher fps games, reduce stuttering games
Id: 4YERS7vyMHA
Length: 17min 19sec (1039 seconds)
Published: Tue Aug 01 2023