Radeon RX 7900 XTX vs. GeForce RTX 4080, FSR vs. DLSS / Ray Tracing Benchmarks
Video Statistics and Information
Channel: Hardware Unboxed
Views: 234,361
Keywords: hardware unboxed
Id: YbKhxjw8EUE
Length: 26min 3sec (1563 seconds)
Published: Wed Oct 04 2023
tl;dw:
The RX 7900 XTX is:
- Raster:
- Ray Tracing:
- Upscaling + RT:
Both GPUs are good; it just depends on what you value more and what kind of games you play. Though both GPUs need to be cheaper.
Could someone explain to me how the RTX 4080's RT Ultra performance at 4K has decreased 20% since last spring, while the 7900 XTX's RT Ultra performance has increased? (A quick percentage-change sketch follows the links below.)
Back in February 2023:
https://www.techspot.com/articles-info/2627/bench/RT_2160p-color.png
https://www.techspot.com/articles-info/2627/bench/Hogsmeade_RT_2160p-color.png
vs. now:
https://www.techspot.com/articles-info/2746/bench/Hogwarts_RT.png
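The question above is just a percentage-change comparison between two benchmark runs. Here is a minimal Python sketch of that math; the FPS figures are hypothetical placeholders, not values read from the linked TechSpot charts.

```python
# Minimal sketch of the relative-change math behind the question above.
# The FPS values are placeholders, NOT figures from the linked charts;
# substitute the real numbers from the February and current benchmarks.

def pct_change(old_fps: float, new_fps: float) -> float:
    """Percentage change from an older benchmark run to a newer one."""
    return (new_fps - old_fps) / old_fps * 100.0

# Hypothetical example: a card dropping from 50 fps to 40 fps between
# two test dates has regressed by 20%.
print(f"RTX 4080: {pct_change(50, 40):+.1f}%")   # -20.0%
print(f"7900 XTX: {pct_change(40, 44):+.1f}%")   # +10.0%
```

One plausible reason the two charts disagree: game patches (Hogwarts Legacy received several) and driver updates both move these numbers, so runs months apart are not directly comparable.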
Always take AMD vs NVIDIA comparisons with a grain of salt
Seems like AMD is really not making much progress on optimization with the 7000 series. At least historically speaking, there has always been a lot of room for improvement with drivers, and the 7000 series hasn't changed all that much.
I'm looking at getting one of these two cards, so I wish I could trust these benchmarks, because everything I've seen outside of HUB has AMD really far behind in RT. With FSR 3 being decent enough and the 7900 XTX being significantly cheaper, RT is pretty much the only unknown.
His remarks on power consumption are total crap. The 7900 XTX consumes a lot more power at 1440p. For a lot of these titles it's more than a 100 W difference between the 7900 XTX and the 4080.
What about power consumption? The XTX wins in raster, but at the cost of significantly increased power draw. On the other hand, the 4080 wins in ray tracing while still sipping less power than the XTX. In Ratchet and Clank and Cyberpunk 2.0 the XTX looks a couple of generations behind; let's see if that changes when Alan Wake and others show up with their heavy RT features. (A rough performance-per-watt sketch follows this comment.)
Then again, Call of Duty could have been used to showcase a title specially optimized for the AMD architecture, whereas the new Counter-Strike performs much better on Nvidia (also, Fortnite DX11 shows Nvidia needs to improve its CPU overhead on DX12). All in all, at the end of the day it comes down to the games you're interested in playing and which upscaling method looks better to you. I don't think you can go wrong with either of these cards; it's just that high-end GPUs are a lot more expensive nowadays than they used to be.
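For the efficiency argument in the two comments above, the relevant metric is performance per watt. Below is a rough Python sketch; the FPS and board-power figures are hypothetical stand-ins, not measurements from the video.

```python
# Rough performance-per-watt (fps/W) comparison sketch. All numbers
# here are hypothetical placeholders, not measurements from the video.

def fps_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Average frames per second delivered per watt of board power."""
    return avg_fps / board_power_w

# Hypothetical 1440p raster scenario: the faster card can still lose
# on efficiency if it draws disproportionately more power.
xtx = fps_per_watt(avg_fps=160, board_power_w=390)
rtx = fps_per_watt(avg_fps=150, board_power_w=290)
print(f"7900 XTX: {xtx:.2f} fps/W")   # 0.41 fps/W
print(f"RTX 4080: {rtx:.2f} fps/W")   # 0.52 fps/W
```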
Thing here is, when you're spending sub-$1,000 to begin with, most will opt to pay the extra $100 or so for the feature set that Nvidia offers, along with lower power consumption. The $100 is not a wash, as you're not just spending more for no reason: you're getting something in return for the +$100 that you will not get with the AMD card in the software stack. Particularly DLSS, which performs AND looks better at lower presets than FSR does at its Quality preset, and performs considerably better in heavy RT workloads (see the render-resolution sketch after this comment).
Of course, if none of that matters and ALL you care about is running raster-only at native resolution, then you can save the $100. Though if that were the case, you were likely considering the AMD card to begin with, because that crowd seems pretty vocal about its raster-only preferences.
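On the DLSS-at-lower-presets point: upscalers render internally at a fraction of the output resolution, so comparing presets is really comparing internal pixel counts. The sketch below uses the commonly published per-axis scale factors for the FSR 2 / DLSS 2 presets; treat those constants as an assumption, not vendor documentation.

```python
# Internal render resolution per upscaler preset. The per-axis scale
# factors follow the commonly published FSR 2 / DLSS 2 presets; treat
# them as assumptions, not vendor documentation.

PRESETS = {
    "Quality":           1 / 1.5,   # ~0.667x per axis
    "Balanced":          1 / 1.7,   # ~0.588x per axis
    "Performance":       1 / 2.0,   # 0.5x per axis
    "Ultra Performance": 1 / 3.0,   # ~0.333x per axis
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Internal render resolution for a given output size and preset."""
    scale = PRESETS[preset]
    return round(out_w * scale), round(out_h * scale)

# At 4K output, Quality renders ~2560x1440 while Performance renders
# ~1920x1080, i.e. less than half the pixel count of native 4K.
for preset in PRESETS:
    print(preset, internal_resolution(3840, 2160, preset))
```

This is why claims like "DLSS Balanced looks as good as FSR Quality" matter for performance: they are claims about reconstruction quality from a lower internal resolution, which is where the extra frame rate comes from.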