CS2's Input Latency

Captions
Picture the scene: you're going pro in CS2 and you catch the final boss in your sights. With cat-like reflexes you prod at that fire button, and information about the shot is sent over to your PC. There, your hardware begins work on producing the image following the action. Once rendered, the resulting frame screams back through the cables to your monitor, where it's eventually displayed once the next refresh occurs and the pixels begin to change colour.

This video is sponsored by Nvidia, who are best known in gaming for their GeForce RTX graphics cards, but they've made a bunch of other latency-reducing features you might also have heard of, like G-Sync and Reflex, the latter of which is featured in CS2's options menu. Nvidia also provided me with some very precise latency measuring tools, and so hopefully this video will help answer any questions you might have about input latency.

That skit you saw at the beginning wasn't in real time. It was a demonstration of this graph, which is all the steps your hardware goes through whenever you perform an action in Counter-Strike. Only once all these things are done will it show on your screen. Now each and every one of these stages is done almost instantly, and so hopefully the entire process should only take a tiny fraction of a second. So this is what low input latency looks like: I move the mouse and it updates almost immediately on the screen. This is what you want it to be like. And just for fun, this is what bad input latency looks like. This here is 400 milliseconds of input latency - 0.4 of a second. I certainly hope that none of your PCs behave like this, because if they do then there is something very wrong with them. But no matter how your PC behaves, it will still have some input latency - every PC does. The key is to minimise it, which can be done by configuring your software correctly and also by buying faster and more responsive hardware.

So how do you measure input latency? The easiest way that you can measure latency is by using
software of some kind, like the FrameView software, which anyone can use, or GeForce Experience's latency tool, which shows a latency measurement on screen. If it says render latency then it's measuring just the graphics card bit of the process, but some games will show an average system latency reading, which means it's measuring this whole section. In other words, it's displaying all of the latency apart from your mouse and monitor.

Measuring your PC's latency is handy for a number of reasons. It covers any changes you might make to your computer, like to see if disabling your processor's multi-threading would make a difference, or if you're testing different speeds of RAM, or different graphics options in game to see if anti-aliasing increases your latency, for instance. You can check all these in real time in CS2 using FrameView or the GeForce Experience latency overlay, as you can see here.

It also covers the composite stage, which I know sounds like a boring word, but it's the source of a lot of myths and rumours I've heard surrounding Counter-Strike over the years. You know how people claim Counter-Strike has less latency when in fullscreen mode? There is some truth to this, and it all comes from this composite stage, which can add latency when, ironically, the game is in windowed mode. But if you're using software like FrameView and see an F or an I at the top of the screen, then you should be okay, because it's treating the game as though it's in full screen. I had no problems when running the game in borderless window or when in full screen, when my latency was consistently around 4 milliseconds. However, when playing in windowed mode, the I in the corner disappeared and my latency rocketed to about four times the amount. So this sort of thing can still be a problem, even on Windows 11.

So that's my PC stage of the process sorted, but what about the mouse and monitor? Understandably you can't test the latency of these things using software on your computer, so how are you supposed to measure them? You can do
this with your phone right now, though your success may be limited. Set your phone's video capture to the highest video frame rate you can, then point it at your PC and record a clip of you pressing the fire button in game. You now need to analyse that video you just recorded, counting the number of video frames after you press the mouse button before it shows on your monitor. You do some maths, and that's a rough estimate of the input latency. Ideally you'll want to capture this video at a frame rate that's at least as high as your monitor's refresh rate, though even higher is still better. Filming it at 1,000 FPS would be nice, because then you can count the milliseconds.

Now, I'm using a 360 Hz screen for this video, so even my super duper 240 FPS camera isn't good enough. But this thing here is. This is an LDAT device, which is Nvidia's own tool which they use to measure the latency of everything, all the way from monitor screen responsiveness right down to the buttons on the mouse, with results that are accurate to a fraction of a millisecond. Now I'm not going quite that far - I'm not stripping mice down to their bare components or anything like that - but I will be using the LDAT unit, and with some advice and tips from Nvidia engineers, I hope this video will get to the bottom of the mysterious topic of input latency once and for all.

How does this LDAT unit work? It's quite simple. This is the side that faces your screen, and this little sensor here scans for changes at almost 9,000 FPS, and it's triggered by the first sign that there is a change in brightness. On the other side of the LDAT unit is this button, which triggers a mouse click, which goes into my PC, out to my monitor, and in this instance is causing the screen to flash. Then the sensor on the other side of the LDAT unit is counting how many milliseconds this whole process takes, and hence how many milliseconds of input latency my PC has. I can trick it by covering the sensor on the other side with my finger, which totally defeats
the purpose, but I guess it's proving that it's working the rest of the time. But to ensure my saucy little fingers don't do this in proper testing, I'll be strapping this LDAT unit to my monitor. I have an immense phobia of touching computer screens, especially ones as expensive as this one is, and especially since it isn't mine. And I have even more of a phobia about a hard plastic device touching the screen, especially when it's attached to it with a glorified rubber band. But I'm overcoming all these phobias for you. For science.

For the clearest input latency results, you will want your screen to instantly shift from being very dark to being very light, like a gun's muzzle flash. However, even this in Counter-Strike isn't displayed instantly, due to tick rate and stuff like that. I initially tested all of this video with the game set to 16 times speed, and doing this got my muzzle flash results to within a millisecond of the actual system latency, so it's probably the best way you can test it using some kind of in-game effect. Better still would be to test a mouse movement of some kind, but it's very hard to consistently perform this kind of action in a way that can be immediately measured every time - and believe me, I tried. Luckily, it turns out that LDAT has an option for this. I was hoping for an instant 180° aimbot flick, but in reality the LDAT flick looks more like a casual aim adjustment that an easy bot would do. But apparently this is good enough and should yield instant results. The issue being that I still needed to somehow line up the sensor somewhere that's dark, but that will instantly go to light the next frame. In the end, I just wasn't satisfied with the speed of the flick, nor the consistency I got out of this testing, which involved positioning the unit so very precisely.

However, there is an even better way. Nestled within GeForce Experience's performance monitoring menu is this Reflex flash option, which creates a black box on the screen that flashes white every time
the mouse button is pressed. This works in a similar way to measuring the gun's muzzle flash, but it's totally instant - apart from your system's latency, obviously. Like I said, when the game was set to 16 times speed, the latency from this testing was within 1 millisecond of my results from testing the muzzle flash. But at just one times game speed, the muzzle flash rather predictably got more delayed, while this box testing method got faster - presumably because running the game at one times speed is less demanding than running it at 16 times, so as a result it increases your actual frame rate, which in turn reduces the absolute latency. So for this video I will be testing the box method instead of the muzzle flash. And the LDAT unit is scoring roughly 7 milliseconds of latency in this setup, which I'd consider to be optimal conditions. These are the best results I got, and I'd say it's about as close as you can get to a perfect response time using 2023's best equipment. But this process is still starting at the LDAT unit, instead of from a mouse of some kind.

How do you measure mouse and monitor latency? With great difficulty. I think most of us buy these things, use them, and think, yeah, this feels instant enough. But say you wanted to know the science. Nvidia have a system where mouse and monitor companies can send over their products and Nvidia will measure their latency in milliseconds. With monitors, that involves sticking an LDAT unit right at them and measuring their response times. When it comes to mice, though, they strip it right down to the raw components and directly hook their buttons up to an LDAT unit to measure everything to a fraction of a millisecond. They then document the results and add it to the RLA program. This means if you really want to measure your system's latency, you don't actually need an LDAT unit - you just need an RLA certified mouse and monitor. You plug the mouse into the monitor, then, using the on-screen Reflex tool, it measures with great precision the time it takes for a
mouse click to go into the computer, back out again, and to be displayed on the monitor, and thus will measure most of your system latency. This is how qk Norris did his testing, in case you were wondering. But how accurate is it? So I attempted the same tests with the monitor's built-in Reflex testing, and then again with LDAT. I was careful to configure my game and my PC in this manner, in order to minimise the latency readings. The monitor scored 7 milliseconds average latency versus LDAT's 7.4, which is a difference of 0.4 milliseconds - which is really accurate. Do note that no mouse is being used here. The LDAT unit is replacing the mouse in this instance, and is what is triggering the mouse clicks that are being sent to the computer in the first place, and using the monitor's Reflex tool, the figure that's seen on screen doesn't include the mouse latency. If you're using a mouse that Nvidia have tested, then its latency will be known by the GeForce Experience software, and will be added to any performance logs that you document using its performance monitoring feature. Looks like, from Nvidia's testing, the Razer DeathAdder has 1.4 milliseconds of latency.

And then I retested at these settings, to encourage a much higher and much more variable latency reading. Well, LDAT and the monitor's Reflex tool still match up: the monitor's Reflex tool measured 29.7 milliseconds of latency, while the LDAT unit measured exactly 30. So in these two extreme testing situations, RLA came within half a millisecond of LDAT's latency readings, which in my opinion makes it rather accurate, and a convenient way to measure your system's latency without requiring the hassle of a $1,500 LDAT unit strapped to your screen.

Your mouse does have input latency. Even the LDAT unit's button has 1 millisecond of the stuff, so it's not something that can be avoided, but it is just a tiny amount of latency in the grand scale of things. Like I've said already, Nvidia have measured the latency of some mice, and do store these results in their
logging software. But say my mouse isn't known, or I simply don't want to trust pre-recorded data. This means I need to do my own latency measurements, and that I need to somehow find a way of starting this super precise timing from the exact moment the mouse button is pressed in the first place. Yes, I could use my video camera, like so, but I currently have a way of getting an even more precise measurement, because LDAT comes with a microphone as a fallback testing feature. Instead of using the LDAT's built-in click feature, it instead uses a microphone to listen for the sound of a mouse click to start its latency measurement. This, to me, sounds like a very janky testing method, but that only makes me want to test it all the more. And to my amazement, it works brilliantly. Maybe a bit too brilliantly. The microphone measured the exact same 7 milliseconds of latency as LDAT's built-in click method had done. I was expecting 8 or even 9 milliseconds of average latency, given how the DeathAdder I'm using is supposed to have 1.4 milliseconds of extra latency, which should be added to these totals now that I'm measuring from the sound of the click of the mouse button itself. Nvidia does warn that this microphone method of testing is less accurate, but given that it's still within 1 or 2 milliseconds of what I'd expect to be getting, for using just the leisurely speed of sound emitted from a mechanical click of a mouse, it's not actually too bad.

So I extended my mouse testing to other configurations, to see if anything affected the latency readings. Polling rate is likely a feature you've seen in your mouse settings, and it's how many times a second it sends positional data to your PC. I imagine you see this, set it to the biggest value and think it's the best - and you're right. You can see here that the Razer software lets me pick between 125, 500 and 1,000, and the smoothness of my mouse's position every frame is improved the higher it goes, with 1,000 being a marked improvement over 500, despite me
using just a 360 Hz monitor. Would a polling rate of 360 make sense on a 360 Hz monitor? Apparently not, according to Nvidia's engineers. More is always better, because computing pipelines are complicated, and stuff doesn't just synchronise like the numbers would suggest they would. I was also told that the polling rate may or may not affect the mouse click's responsiveness, depending entirely on which mouse it is that I'm testing, because sometimes click information from a mouse is sent separately from positional data. Not that I didn't believe them, but I tested it all the same, and on the DeathAdder at least, this seems to be the case, as at 8 milliseconds of average latency it's too small an increase to definitively conclude that there is more input latency. Whereas if click information was only being sent to the PC 125 times a second, I'd have expected closer to 4 milliseconds of extra latency when testing this mode.

Wireless. Wireless gaming is something that I know many people still fear, because how could a wireless connection possibly be as good as a good old-fashioned cable? Well, for a start, you don't have a wire to get in the way of your mouse movements, but that aside, the big fear is latency. The DeathAdder mouse has two wireless modes: Bluetooth, and 2.4G, which they make out as being the pro setting. Bluetooth wasn't great. I no longer had a choice of polling rate, but I could feel that it was low - maybe as low as 125, possibly lower still. It was an inferior aiming experience, and the average latency of 16 was 9 milliseconds higher than in wired mode. So I'm not ruling out all Bluetooth mice, but with the DeathAdder V2 Pro at least, this wireless mode was inferior. But then I switched to its 2.4G mode, and much to my relief, the latency returned to being just 8 milliseconds again - close enough to being the same as wired mode was. But what if I move the dongle further away from my mouse? I put it in a worst case scenario by plopping the 2.4G dongle down the back of my desk, that was
across the room from where the mouse was. And the average latency remained at 8 milliseconds. All these years I've had my wireless dongle as close to my mouse pad as possible, just in case, but with the DeathAdder at least, it doesn't seem to matter where it's placed.

And now for the fun bit. This right here is a cheap wireless mouse. It isn't a gaming mouse by any stretch of the imagination. I could tell its polling rate was low, and I didn't have great hopes for its latency results either, but to my surprise, it wasn't terrible. Well, it was - it single-handedly doubled my latency, adding an extra 100th of a second of delay. But at the same time, it only added a 100th of a second of delay. Is it noticeable? Yes. Would I like to game like this? No. But I was honestly expecting a worse result. I expect there are some real stinkers out there, but this particular cheap wireless mouse I tested was a lot less bad than I had anticipated. I spoke with Nvidia about all these results, and they say the model of the mouse matters more than whether it's wired or wireless. Some wired mice have upwards of 10 milliseconds of latency, while there are some wireless ones with as little as one. As an example of how little difference wireless mode can make, the Logitech Superlight has 0.8 milliseconds of latency in wired mode versus 1.1 milliseconds in wireless - that's only 0.3 milliseconds of difference.

But which USB port should you plug your mice into? Even this isn't simple, because some of your USB ports don't go directly to their destination, and the ones in the back of your motherboard can sometimes go through multiple hubs first, adding several milliseconds to your latency. It all depends on which motherboard you're using, but to my surprise, the ones you find at the front of your case can often be better than the ones which plug directly into your motherboard. We've been doing it wrong all this time. I tested the LDAT unit plugged into three different USB ports: a USB 3 and 2 connection at the front, and a USB
3 connection in the back, and all the results were either 7 or 8 milliseconds average, which to me signals that there isn't enough difference to be able to measure accurately - not even with an LDAT unit. So the fear that I might have my mouse plugged into a laggier USB port isn't going to keep me up at night.

Now we get on to monitor latency, and this is a lot of fun, because the difference in responsiveness between a good and bad screen can be vast. Much like polling rate, when it comes to refresh rate, more is always better. It doesn't matter if your PC can't keep up - a more responsive monitor will still display more recent images sooner. Here is how the 360 Hz monitor's latency changes as I drop the refresh rate. Even at 240 Hz, which to me is still ridiculously high, the latency is measurably worse. I mean, it's still great, but not as good anymore, and as you can see, it gets progressively worse as I go down to 60 Hz. And I know there will be someone out there with a high refresh rate monitor that's still set to its default 60 Hz setting, so check to make sure you're not that person, because if you are, then you're missing out - and you're missing shots.

Monitors often come with an overdrive setting, which attempts to change the pixels faster by telling them to transition to a more extreme colour than what they're meant to display, and then, as they rush past the colour they should be, it stops the change. That's right - it involves tricking the pixels. I put it to the test on two of my monitors and found zero difference between overdrive off and on the most extreme setting. But then I am testing a section of the screen that's going from black to white, so maybe overdrive can't do anything to accelerate this kind of already extreme brightness change. This isn't a setting I feel comfortable testing using LDAT, and I feel a high-speed camera may be a better option to see the difference that overdrive might make.

This is my 10-year-old cheap 40-inch TV at 60 Hz, which is its maximum refresh rate. I got a total system
latency of 46 milliseconds, compared with my 360 Hz screen scoring just 19 under the same conditions and refresh rate. But why stop there? I dropped the TV down to a cinematic 24 Hz, and to my surprise, at 61 milliseconds total system latency, it still wasn't too bad. But if I move my mouse, you'll spot the problem: every refresh contains about 20 different images, all slightly staggered, resulting in a horribly wavy looking appearance. And that's when I had a great idea: let's enable vsync. Simply turning vsync on increased my latency to 202 milliseconds - one fifth of a second. That's abysmal. Even setting it up to 60 Hz again, vsync on still yielded an 85 millisecond total latency. That's worse than 24 Hz was without vsync - but at least now the screen isn't all wavy looking, I suppose. 85 milliseconds, for minimal latency? Never have vsync on. Never. Sure, this can result in rolling shutter vision like what's being seen here, but the higher your monitor's refresh rate is, the less of an issue this becomes, since less and less can change before the next screen refresh begins. So on a high refresh rate screen, disable vsync, but do consider features like FreeSync or G-Sync.

Which gets us onto the last bit: your PC. I've had so many people telling me over the years to test how much latency settings like anti-aliasing or anisotropic filtering add, and the answer is: it totally depends on your PC. On a PC that's fast enough, it barely makes any difference to the latency at all, even when I significantly up the resolution, the settings, and the complexity of the scene being displayed - because on a faster PC like this, you can get away with suboptimal settings, and the consequences on your frame rate and latency will be minimal. So for this bit of the video, I'm going to move to a more minimum spec kind of setup, to demonstrate three things that will make a big improvement to a slower PC's input latency. This is a side project I've been working on - it's a minimum spec PC for CS2. So I tested this at 720p at different graphics quality
presets, to see how the FPS and latency changed. And we can see here that different graphics settings now make a massive difference to the resulting frame rate and input latency. Even at this lowest quality preset, we're still getting 33 milliseconds of total latency, and there's not a lot more we can do to improve this, as FSR is already on and set to its fastest performance mode setting. Rather predictably, as the presets went up, the frame rate plummeted and the latency skyrocketed, and the very high preset felt unplayable. But it is a minimum spec PC, so what do you expect? With a slow system like this, you're best just sticking to the lowest preset. I've always felt like limiting the frame rate using the fps_max command can help further, and sure enough, limiting low preset to just 100 FPS dropped the latency from 33 down to just 29 milliseconds, confirming my suspicions - though not by much.

Which gets me on to the first improvement you can make: if nothing else works, and those settings and resolutions still don't yield an acceptable experience, then you can consider upgrading to faster components - namely the processor and graphics card. This video is sponsored by Nvidia, whose lineup of GeForce 4000 graphics cards can be seen benchmarked in CS2 just here, at 1080p maximum settings on Overpass - so plenty of leeway here, should you want to run at lower settings instead. But you can see that as the frame rates increase, the latency decreases, and that's because a faster graphics card shortens this part of the process, and thus reduces the total system latency. Now, the GeForce 970 isn't a particularly new or powerful card these days, but it's a sizable improvement over the minimum spec, and with it my frame rates were much higher and my latency a lot lower, even at very high preset. And with this new card installed, I can now test the Reflex option in the graphics menu. Because the second way in which you can reduce latency is to better synchronise the way your computer's components communicate with one
another. This is what Nvidia's Reflex option does, and it's available on GeForce cards. It carefully times when the processor starts work on a new frame, so that it can pass it straight on over to the graphics card and out to your monitor as quickly as possible. Instead of having new frames lingering around and queuing behind other ones, Reflex ensures that you're always seeing a very recent image, which - you guessed it - will reduce the latency. The Boost option times this even more aggressively, so you might see higher power consumption and lower frame rates, but with it, hopefully, further reduced latency, because every frame that is being generated is actually current and useful.

To see the difference it could make to the GeForce 970's results, I set the game to very high preset and upped the resolution to 1440p to really make it struggle. Due to older standards, this limited the monitor's refresh rate down to just 144 Hz on this card, further adding to the latency. And sure enough, the FPS was just 71, and the latency was a fairly horrible 61 milliseconds. Flicking Reflex on actually dropped the frame rate down to 67 FPS, but contrary to what you might expect from a lower frame rate, the average latency also dropped, down to 42 milliseconds. There may be fewer frames being displayed every second, but they all appear to be more recent, and it makes what would otherwise be an unplayable experience feel quite a bit more responsive. I tested Reflex plus Boost as well, but at 44 milliseconds, the average latency hadn't dropped any further. Nvidia said that they're still optimising this mode for Counter-Strike 2 - each game and game engine requires custom tuning for the best results with Reflex - so this may be improved with future driver updates. There's an ultra low latency mode in Nvidia's control panel, which I didn't enable for this video, and they say this doesn't need turning on if you're already using Reflex mode inside the game.

The last thing to consider is how and when the image goes from your PC to
your monitor, which is where features like vsync and G-Sync come in. If I pause that footage of my TV screen, then you'll notice these lines across the image. This is tearing, where a new frame reaches the monitor before the old one has finished displaying. You can see about five different images showing at the same time here. There are two ways of getting rid of this. One way is to enable vsync, which prevents tearing by displaying just one image at a time. But having vsync on introduces serious levels of input latency, by imposing a frame rate block near the end of this process, which causes a pile-up further back, and maximum queue levels at the CPU and GPU stages - which nobody wants. So unless you really can't stand the look of tearing, don't use vsync.

The other option is to use technologies like FreeSync and G-Sync, and they work by synchronising your monitor up with the new frames as they're being completed by your PC, which means it's kind of like vsync, but without the input latency. But here's the big but: it only kicks in if your frame rate is lower than your monitor's refresh rate. If it goes above that, then you're back to having either vsync on or off again. But the good news is, tearing becomes less obvious the higher your frame rate and monitor's refresh rate become. So for minimum latency, always keep vsync off. When the monitor's refresh rate was 144 Hz, G-Sync didn't drop the latency - in fact, it increased it by about 2 milliseconds on average - but in return, it does mean you'll get a tear-free viewing experience, which might be worth it to some people. With the monitor set to just 60 Hz, though, G-Sync's benefits were much more pronounced, and it greatly dropped the latency when vsync was also on. In short, G-Sync won't drop your latency, but if you don't like seeing tearing on your screen, G-Sync will remove that without the added latency that having vsync on can sometimes have. But G-Sync's benefits are reduced at higher monitor refresh rates, simply because higher refresh rates naturally
achieve a less torn looking image by refreshing it more frequently.

In conclusion: testing input latency is not like I expected it to be. I thought it would be a case of setting my PC up and testing a few different graphics settings in great depth. Instead, it seems like input latency is a very broad topic, and I've only briefly scratched the surface of each individual aspect of it, even after weeks of testing. This slide does sum it up well: it's just about ensuring that each and every one of these stages doesn't delay the process more than it needs to. Getting a good mouse, a good monitor and a fast PC takes care of half of it, and the other half is in the software, making sure all of it is set up optimally. I've been pleasantly surprised by how simple some parts of this are. I was half expecting input latency testing to involve rooms of scientists and elaborate custom-made machines, so to be able to speak with Nvidia and to understand how they test and document hardware latency has removed a large part of the mystery surrounding this topic for me. And although custom-made, this LDAT unit is a very simple bit of kit to understand and to use.

I think I was expecting higher frame rates to contribute more directly towards lowering the input latency, but having hundreds of frames a second most definitely is not a guarantee that your responsiveness will be good. I mean, more FPS certainly helps - you don't want low FPS if you can help it - but, and especially with vsync on, I have encountered situations where high and consistent FPS can still result in noticeably high input latency. Likewise, Reflex definitely made lower FPS feel more playable than I felt it should have done, and it highlighted to me how big an impact just one part of this graph can make, depending on how it's configured. As technical as bits of this video have become, at the end of the day, I feel like I could estimate with reasonable accuracy how much latency a computer has simply by wibbling the mouse about and by feeling how connected the
resulting actions on screen are. That's ultimately what input latency testing boils down to, even if it is hidden behind clever looking equipment and complicated sounding features. But if you've known nothing but a laggy experience, you won't know what low input latency feels like until you experience it. I was horrified when I went to a Counter-Strike tournament and came across a room full of cutting edge PCs displaying esports titles, but all of their monitors were still set to 60 Hz. Had none of the hundreds of other people using them that day noticed? I still don't know. There definitely must be people out there who perhaps feel like older games feel more responsive than newer ones, but who may suspect it's because it's using their mouse inputs differently, or perhaps they suspect frame rate difference to be the issue, without knowing that the majority of what they're feeling is a difference in input latency. Does Counter-Strike: Source feel noticeably more responsive to you than CS2 does? Because it doesn't have to. And for those people who think they have a problem, the solution is probably easy - but it isn't always easy to know what that easy solution is. Software like GeForce Experience and FrameView will help identify bottlenecks inside your PC. An RLA monitor will do that for your display. Your best bet might be to have a friend over who knows what low input latency feels like, and to get them to test your game out for you, and to try setting it up differently.

Nvidia ended by remarking that CS2 has some of the lowest system latency they've ever measured - so low, they initially thought it must have been a bug. It wasn't. They say there's no such thing as a stupid question, but I've definitely tested that notion with some of the things I've asked the team at Nvidia over these past few weeks. So I'm going to say, firstly, sorry to Nvidia, and thank you to Bill and Keith for being so patient. And second: yes, there are stupid questions, which hopefully this video has answered for you. But if you have any others, then ask in the comments section and I'll do my best to answer them.
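As an editorial aside: the "do some maths" in the phone-camera method above is just frames counted divided by the camera's capture rate. This is not tooling from the video - just a minimal sketch of that arithmetic, accurate only to about one camera frame either way:

```python
def latency_from_video(frames_counted: float, camera_fps: float) -> float:
    """Rough input latency estimate from a slow-motion recording.

    frames_counted: video frames between pressing the mouse button and
    the first visible change on the monitor.
    camera_fps: the capture frame rate of the phone camera.
    Returns the estimated latency in milliseconds.
    """
    return frames_counted / camera_fps * 1000.0

# At 240 FPS each video frame spans ~4.17 ms, so counting 10 frames
# between click and muzzle flash suggests roughly 41.7 ms of latency.
print(round(latency_from_video(10, 240), 1))  # 41.7
```

At 1,000 FPS each frame is exactly 1 ms, which is why the video calls that rate "nice": the frame count is the millisecond count.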
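The "closer to 4 milliseconds of extra latency" expectation in the polling rate section also follows from simple arithmetic: if click data only leaves the mouse every 1000/rate milliseconds, a click landing at a random moment waits half an interval on average. A sketch of that reasoning (my own illustration, not an Nvidia formula):

```python
def avg_polling_delay_ms(polling_rate_hz: float) -> float:
    """Average delay a click accrues waiting for the next USB poll.

    A mouse polling at polling_rate_hz reports every 1000/rate ms, so
    an event arriving at a random point in the interval waits, on
    average, half of that interval before it is sent to the PC.
    """
    interval_ms = 1000.0 / polling_rate_hz
    return interval_ms / 2.0

# 125 Hz polling -> 8 ms interval -> ~4 ms average extra delay,
# matching the figure expected in the video; 1,000 Hz adds only 0.5 ms.
print(avg_polling_delay_ms(125))   # 4.0
print(avg_polling_delay_ms(1000))  # 0.5
```

This is also why the DeathAdder result was informative: clicks at 125 Hz did not gain ~4 ms, suggesting its click data is sent independently of the positional polling.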
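The refresh rate comparisons in the video track the refresh interval: a panel can only begin showing a new image once per refresh, so 1000/Hz milliseconds is the floor on how stale the displayed frame can get. A small sketch of that maths, added editorially:

```python
def refresh_interval_ms(refresh_hz: float) -> float:
    """Time between screen refreshes, i.e. the minimum granularity at
    which a monitor can start displaying a newer frame."""
    return 1000.0 / refresh_hz

# A 60 Hz panel can only start a new image every ~16.7 ms, versus
# ~2.8 ms at 360 Hz - one reason the 360 Hz monitor measured so much
# lower total latency than the 60 Hz TV in the tests above.
for hz in (24, 60, 144, 240, 360):
    print(f"{hz} Hz -> {refresh_interval_ms(hz):.2f} ms per refresh")
```

Note this is only the display's share of the pipeline; the measured totals in the video also include the mouse, CPU, GPU and composite stages.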
Info
Channel: 3kliksphilip
Views: 378,417
Keywords: input latency, cs2, framerate, responsive, high refresh, better performance, nvidia, reflex, gsync, reflex + boost, rla, ldat, #FramesWinGames, #ad
Id: NE0qg_8k0BE
Length: 27min 51sec (1671 seconds)
Published: Thu Nov 23 2023