The Real Reason Why CRT Monitors Are Better

Captions
There has been a recent movement of not just retro gamers but also modern gamers going back to using CRTs, as people have started to realize that even low refresh rates of 70 or even 60 Hz can look smoother than high refresh rate LCD, OLED or other flat panel displays. And if there's a person out there who says that's ridiculous, that it doesn't make any sense, well, that's how I know that person has not used a CRT, at least not recently. It really is very apparent when you see one in action in person.

A lot of people assume that the reason there is no motion blur must somehow be tied to frame rate, and yes, that is true up to a point, but once you reach that magical 48 fps threshold, give or take, frame rate is no longer the issue. Now don't misunderstand: yes, we can tell the difference between standard refresh rates and high refresh rates, even though both are way above that 48 Hz threshold. People who say we can't see above 48 fps seem to be confusing it with the technical consensus that 48 Hz is enough to not perceive flicker in a theater using film projector technology. In fact, this is precisely why movies are shot at 24 frames per second: 48 would have been too expensive for a lot of production companies, so they halved it to 24 but kept the flicker rate at 48, which was very quickly raised to 72 to add some extra leeway for people who are sensitive to flickering light in their peripheral vision, even if it really isn't very perceivable for the average person. That's where the 48 fps misconception came from, but again, this was just the threshold where flicker starts becoming invisible. It's also why the AC electrical frequency of certain countries is 50 Hz: just like in a cinema, the flickering of light bulbs stops being noticeable above 48 Hz.

But all of this only applies to flicker, not frame rate. Humans absolutely have the ability to tell the difference between 50 and 100 frames per second or more. The 48 fps figure is just where persistence of vision is able to eliminate flicker, and even then it's only a threshold; sometimes it is pushed to 50, 60 or even 72 Hz for extra breathing room, because, and this is important, flicker at 48 Hz is only seamless and unnoticeable assuming the time the light is on is equal to the time it is off. If the burst of light is shorter than the time it is off, then the shorter duration of light stimulating the retinas in our eyeballs makes the persistence of vision dissipate a lot quicker, and as a result we still see flicker.
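To make the projector arithmetic concrete, here is a tiny sketch of the flash-rate math described above; the 24 fps figure and the two- and three-blade shutters come from the captions, while the code itself is only an illustrative back-of-the-envelope calculation.

```python
# Flash rate of a film projector: the film runs at 24 fps, but a rotating
# shutter with 2 or 3 blades interrupts the light 2 or 3 times per frame,
# so each frame is flashed multiple times on screen.
FILM_FPS = 24

for blades in (2, 3):
    flash_rate = FILM_FPS * blades  # flashes per second seen by the eye
    print(f"{blades}-blade shutter: {FILM_FPS} fps shown as {flash_rate} Hz flicker")

# Output:
#   2-blade shutter: 24 fps shown as 48 Hz flicker
#   3-blade shutter: 24 fps shown as 72 Hz flicker
#
# Caveat from the captions: 48 Hz only fuses into steady light if the "on"
# time of each flash is roughly equal to the "off" time; shorter flashes at
# the same rate can still be perceived as flicker, which is one reason
# displays often aim higher than 48 Hz.
```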
So after all that explaining, what actually causes motion blur in video games? Believe it or not, it's not frame rate: it's the lack of flicker that makes motion blur far more noticeable, and modern flat panel displays don't flicker like a CRT does. How flicker reduces motion blur is actually quite simple and intuitive with an analogy. Let's say you're standing on the side of a road and a sports car zips past you at high speed; if in that time your eyes have not been able to lock on to the sports car, it will appear blurry. But if you were in a car driving next to that sports car, it would be easy to focus on and there would be no motion blur, while the road and everything else in the background would have a lot of motion blur. So the reason moving objects on a modern display appear blurry, even when your eyes are trying to follow them, is because you're not looking at a moving object like the sports car in the analogy; you're looking at a series of still images that were never moving to begin with but appear to be moving through the power of frame-by-frame animation.

But wait, if the object on the screen isn't moving, then why is there motion blur? It's all about relativity. Whether it's a sports car that is moving while your eyes are not, or your eyes doing all the moving while the monitor sits still, the effect is the same: you see motion blur. A computer monitor cannot move objects the way a sports car moves down a road; instead, stationary frames of animation are progressively redrawn many times a second to give the illusion of motion. However, if those individual frames were visible for a shorter duration, your retina would have less time to absorb light, which means less time for motion blur to form. The secret sauce of no motion blur lies in the CRT's technological shortcoming, its inability to retain an image for longer than a literal millisecond, hence the flicker.

Now I will bring up the elephant in the room, because I'm sure a lot of people are readying themselves to comment that high refresh rate monitors reduce motion blur. Saying high refresh rate monitors reduce motion blur is like saying adding salt to a dish makes it taste better; it's a generalization to the nth degree. Would adding salt to a burnt steak make it taste better? Salt can work in a lot of situations, but it's ultimately only one ingredient of a great meal. A high refresh rate monitor running at 120 Hz doesn't get rid of motion blur; it reduces it by half compared to 60 Hz, and 240 Hz only reduces the blur by half again compared to 120 Hz, but it never gets rid of it. That's why display manufacturers try to woo people into spending large amounts of money on something that can do 360 Hz or 500 Hz: maybe then the motion blur is reduced to the point where it's not really noticeable. And yes, high refresh rates can eventually get rid of noticeable motion blur, but what's the difference between a 60 Hz monitor and a 500 Hz monitor plugged into a PC that can only do 60 frames a second? Nothing. An overpowered monitor like that is useless when paired with a PC or console like that. Will a high refresh rate monitor get rid of motion blur? Is salt good for food? In both cases, it depends.
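A rough way to put numbers on the halving argument above: a common approximation (an assumption of this sketch, not something stated in the video) is that perceived smear is roughly the object's on-screen speed multiplied by how long each unique frame stays lit. The 960 px/s tracking speed is an arbitrary example.

```python
# Rough sample-and-hold motion blur model: while your eye tracks an object
# moving at `speed` pixels/second, each unique frame stays lit for
# `persistence` seconds, so the image smears across ~speed * persistence
# pixels on the retina.
def blur_px(speed_px_per_s: float, persistence_s: float) -> float:
    return speed_px_per_s * persistence_s

SPEED = 960.0  # example: an object crossing a 1920-wide screen in 2 seconds

cases = {
    "60 Hz sample-and-hold":           1 / 60,
    "120 Hz sample-and-hold":          1 / 120,
    "240 Hz sample-and-hold":          1 / 240,
    "500 Hz panel fed only 60 fps":    1 / 60,    # unique frames still last 1/60 s
    "CRT-style ~1 ms flash per frame": 0.001,
}

for name, persistence in cases.items():
    print(f"{name:32s} -> ~{blur_px(SPEED, persistence):5.1f} px of smear")
```

Each doubling of the refresh rate halves the smear but never removes it, while a very short flash (the CRT case) drops it to roughly a pixel, and a fast panel fed a slow source gains nothing, which is the point the captions are making.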
Typically, the PC master race will have its usual knee-jerk reaction of "well, stop being such a pleb and get a better PC." Okay, so throwing money at the problem will fix it, right? Well, actually, not really. Has anyone, ever, heard of the most powerful consumer graphics card being able to run literally every game ever made while staying above 360 or 500 fps, all the time, always, without needing to drop the resolution and/or lower graphical detail? If your answer is yes, then I know you don't own a high-end gaming PC, or at least haven't seen an episode of NX Gamer or Digital Foundry. Look, we still have problems running games consistently even at 60 fps on beefy hardware, forget 120 or 144 and above. And 60 fps gaming isn't new: it was possible on the PlayStation 2, it was possible on the NES, Pong ran at 60 fps. I remember when people were saying the PlayStation 5 would run at 60 fps at 4K, and I just rolled my eyes, because of course it can; that's exactly what they said about the PS4 Pro. But did we always get 60 fps at 4K? Trying to hit that golden 60 fps benchmark is already hard enough, even on PC, and now you're telling me that the only way I can achieve no motion blur is if my games always run at 360 fps at the very least?

For decades there have been people thinking that 60 fps and above will eventually become the norm. Yes, a lot of modern systems can run games at 60, 144 or even 500 fps, but in the grand scheme it never quite happens, because whenever hardware becomes powerful enough to pull it off, there's always something else preventing it. The GameCube could do 60 fps with most of its games, but the more powerful Xbox 360 more often than not ran at 30, because now it had to push out more pixels for high-definition video and improve the graphics to accommodate those extra pixels. The GTX 1080 was really impressive, and then along came the RTX line of graphics cards that could do ray tracing at an impressive 30 fps in this game and 45 fps in that game. And now even a monster of a graphics card like the RTX 4090 still runs like garbage in places, because many AAA games are being held together with duct tape and chewing gum. Forget high frame rates; is this game even going to run well, or is it yet another rushed, unoptimized mess?

Guys, high refresh rate monitors are not the solution. Now, they're not a wasted technology; high refresh rate monitors are very useful, but they are not the solution to a problem that was solved decades ago. Sure, you could smash your way into a building with brute force, but sometimes it's simply better to use a key to unlock the door, which lets even the most frail person enter a building they never could have smashed their way into in the first place. That key to no motion blur comes in the form of the CRT's flicker, and it offers a way to accommodate even underpowered systems or extremely unoptimized games. We don't need more frames, we need better frames: frames that don't cause any kind of motion blur, even at rates as low as 50 Hz. How do I know this? Because I grew up in a region of the world that uses 50 Hz televisions, and if you run a game like Super Mario Bros. on the NES or Sonic 2 on the Mega Drive, the refresh rate is high enough that you don't perceive any flicker, while the frame rate is fast enough that you don't perceive any motion blur. 50 Hz on a CRT TV looks smoother than 144 on an LCD.

This is why hardware manufacturers have been trying to emulate this flicker using black frame insertion, where alternating frames are completely black in order to reduce motion blur. And although this is probably the best way to achieve it, the technology is just not quite there yet. In the words of John of Digital Foundry: "I don't care if you are using black frame insertion or any sort of ultra low motion blur, it doesn't match up to a CRT."
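To make the black frame insertion idea concrete, here is a small sketch using the same rough smear approximation as before; the speed, frame rate and on-screen fractions are illustrative assumptions, not figures from the video.

```python
# Black frame insertion (BFI): the panel alternates real frames with fully
# black frames, so each real frame is only lit for part of the frame period.
# Same rough rule of thumb as before: smear ~ tracking speed * time-the-frame-is-lit.
def blur_px(speed_px_per_s: float, lit_time_s: float) -> float:
    return speed_px_per_s * lit_time_s

SPEED = 960.0          # example tracking speed in pixels/second
FRAME_PERIOD = 1 / 60  # 60 unique frames per second from the game or console

for on_fraction in (1.0, 0.5, 0.25):  # 1.0 = no BFI, 0.5 = alternate black frames, ...
    lit = FRAME_PERIOD * on_fraction
    print(f"frame lit {on_fraction:.0%} of the time -> ~{blur_px(SPEED, lit):4.1f} px smear")

# The catch, as the video notes: the shorter the lit portion, the dimmer the
# picture and the more visible the flicker, which is why BFI on LCD/OLED still
# struggles to match a CRT's very short but very bright phosphor flash.
```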
How can such an old and obsolete technology do so easily something that not even the cutting-edge displays of today can handle? Well, first of all, we're comparing apples to oranges; the technology of a cathode ray tube is totally different from a liquid crystal display. But ultimately, the reason we stopped using them is the same reason you probably stopped using them. CRTs are huge and they use more electricity. Desk monitors could weigh as much as a human child, with large CRT televisions weighing as much as a human adult. They're ugly, they can seriously injure someone if you accidentally drop one, and even repairing a CRT can potentially give you an electrical jolt strong enough to stop your heart. LCDs didn't need to be better at everything to be a worthy replacement; they only needed to be better in certain key areas. More vibrant colors, better black levels and the lack of motion blur were simply not more important than being slim, light, convenient, using a lot less power and not being dangerous to carry down a flight of stairs.

But that's the thing: LCDs were convenient. In all honesty, they really didn't look better back when CRTs were still common. In fact, the only real competition a CRT has in terms of image and frame quality is the OLED, except that, because CRTs have not been manufactured or improved upon since the late 2000s, an OLED display is vastly superior in many ways, such as running at much higher resolutions, the implementation of HDR, and widescreen support. However, just like the LCD, OLEDs don't flicker, and so they too suffer from motion blur.

So, okay, if you don't mind how heavy or cumbersome a CRT can be, then yeah, absolutely get one. But what other benefits are there?

Number one: they cost very little, or nothing at all, depending on how you get your hands on one, unless you want to invest in some of the rarer widescreen displays that can be bought for a small fortune off eBay.

Number two: more natural, vibrant colors. It doesn't compare to a modern OLED, but boy does it give many LCD models a run for their money. Part of the reason is the same reason an OLED looks better than an LCD: it doesn't use a backlight. The scan lines themselves are what is illuminated, and because of this a CRT can offer better image contrast and deeper blacks, assuming your brightness setting is turned down enough while the contrast setting is turned up.

Number three: lower resolutions look way better than on any LCD or even OLED display. This is because a CRT has a resolution range, not a native resolution. An average 17-inch CRT from the 2000s could do between 400 and 1,024 vertical scan lines, and if you go bigger than that, it can be 1200p on average. There are even some CRT models that can support a vertical resolution higher than 1440p and yet can still accept an image signal as low as 200p; it would just have to double the number of scan lines for each vertical pixel so the signal is displayed across 400 lines. Because there is no single native resolution like an OLED or LCD has, a CRT's 720p image looks way better than 720p on, let's say, a 1080p or 4K flat panel display, which has to apply a blur filter to the image to get rid of uneven pixels. On a CRT, 720p looks just as good as a 720p image on a native 720p LCD, even though that same CRT can handle much higher resolutions. But here's the best part: even if your CRT can only manage 1200p or 960p, it will actually still look better than a 1080p image on a native 1080p flat panel, because the aliasing that flat panels are known for becomes basically invisible at a CRT's maximum resolution. And not really because it's more blurry, but because sub-pixel light scatter makes the colors of each pixel blend with neighboring pixels, giving a sort of pseudo anti-aliasing look. At those high resolutions we only really use GPU anti-aliasing to get rid of certain geometry jitters more than to remove jagged edges; it really is that seamless on a CRT. This is also why we didn't mind running games as low as 480p back in the '90s.
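One way to see the scaling point from number three: on a fixed-pixel-grid panel, a low-resolution image only maps cleanly when the panel height is an integer multiple of the source height; otherwise the scaler has to interpolate (blur) or draw uneven pixels, while a CRT has no fixed grid to map onto. The resolution pairs below are just examples.

```python
# On a fixed-grid LCD/OLED, a source image must be stretched to the panel's
# native pixel count. Non-integer scale factors force interpolation (blur)
# or uneven pixel sizes; a CRT simply draws however many lines it is sent.
def scale_report(source_lines: int, panel_lines: int) -> str:
    factor = panel_lines / source_lines
    clean = factor.is_integer()
    return (f"{source_lines}p on a {panel_lines}p panel: scale x{factor:.2f} "
            f"({'clean integer scale' if clean else 'needs interpolation/blur'})")

for src, panel in [(720, 1080), (720, 1440), (480, 1080), (540, 1080), (200, 400)]:
    print(scale_report(src, panel))
```

The 200-to-400 case at the end mirrors the line-doubling example in the captions: a 200-line signal drawn across 400 scan lines is a clean 2x mapping, which is why it looks fine on a CRT.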
Which brings us to number four: retro gaming looks much better on a CRT than on modern displays. I don't know if any of you, or maybe someone you know, has experienced playing an old game you haven't touched in a very long time on a modern display, only to notice how much worse it looks than you remember. Well, your memory, for once, isn't actually wrong: games did look better back then, because they were being played on CRTs. Again, it has to do with that color scatter I talked about. The differences are more apparent on a 480i TV than on an ultra-sharp desktop model, but even then the improvement is still noticeable.

Number five: even low-end CRTs can handle high refresh rates. This is specific to computer monitors, which are designed to go above 60 Hz. Because each scan line's exposure needs to be even shorter, to give the electron gun time to draw more lines than a television was ever required to, the time a single scan line emits light is even shorter than the time the screen is actually completely black. That's good, because again, this is exactly why there's no motion blur, but the flicker does become more apparent even at 60 Hz, which is why the minimum recommended refresh rate for these monitors is actually 70. A basic CRT from the 2000s can go as high as 120 Hz, assuming the resolution is low enough, but usually people like to find a sweet spot between the two. Back in the late '90s and early 2000s that sweet spot was usually around 1024x768 at 85 Hz, but that's what's great about CRTs: you can change the combination of resolution and refresh rate to suit your needs. Set it to 1200p at 72 Hz, which was perfect for watching movies, or to 600p at 100 Hz for gaming; it's all up to you (a rough sketch of this tradeoff follows at the end of the captions).

And number six: no lag. Now come on, surely you mean lower lag, not no lag? That's the thing: lag only really became an issue in recent decades, because things like a memory buffer, which is more or less required for processing digital data, didn't exist back then. Displays were fed analog video signals that don't need to be stored and then converted; the signal goes directly into the CRT circuitry that does the actual drawing of the image on the screen, and if the signal stops, that image is gone, because nothing remembers what it was. How else do you think TVs worked back in the 1940s, before consumer computers were even a thing? More recent CRT displays are built on that same technology, so there was literally no lag; not because CRTs were somehow more advanced than modern flat panel displays, but because of how primitive the technology of a CRT is. Yes, there has to be some form of latency, everything has latency, but the only latency of this analog technology comes down to how fast the electrical signal in the VGA cable can travel to the circuitry of the CRT, to then be shot out of the electron gun as actual electrons in a vacuum, moving at half the speed of light, to be absorbed by the phosphor of the screen, which lights up when excited by the electrons. So yes, there is latency: a few milliseconds at worst, or in other words, basically no lag.

I'm not the first or the last to give advice on the benefits of CRTs. However, some of the advice out there may make getting into them feel more difficult than it actually is. That will be explored in another video, but don't worry, you won't have to wait long for it. Until next time.
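As a closing note on the resolution-versus-refresh tradeoff from number five: a CRT's practical limit is its maximum horizontal scan frequency (scan lines per second), so lower vertical resolutions leave headroom for higher refresh rates. The 96 kHz monitor limit and the roughly 5% blanking overhead below are illustrative assumptions, not specifications from the video.

```python
# A CRT monitor is limited by its maximum horizontal scan rate (lines/second).
# Required horizontal rate ~= visible lines * blanking overhead * refresh rate,
# so vertical resolution can be traded against refresh rate.
MAX_H_SCAN_KHZ = 96.0  # hypothetical monitor limit (illustrative)
BLANKING = 1.05        # ~5% of each frame spent in vertical blanking (rough figure)

def h_scan_khz(visible_lines: int, refresh_hz: int) -> float:
    return visible_lines * BLANKING * refresh_hz / 1000.0

for lines, hz in [(768, 85), (600, 100), (1200, 72), (1200, 85)]:
    need = h_scan_khz(lines, hz)
    ok = "fits" if need <= MAX_H_SCAN_KHZ else "too fast for this monitor"
    print(f"{lines} lines @ {hz} Hz needs ~{need:5.1f} kHz horizontal scan ({ok})")
```

With these example numbers, 1024x768 at 85 Hz and 600 lines at 100 Hz fit comfortably, 1200 lines at 72 Hz just fits, and 1200 lines at 85 Hz does not, which matches the kind of mixing and matching described above.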
Info
Channel: MXDash2
Views: 41,039
Id: 7A7gjSaDaYw
Length: 21min 14sec (1274 seconds)
Published: Fri Nov 10 2023