FSR 3.1 vs DLSS 3.7 vs XeSS 1.3 Upscaling Battle, 5 Games Tested
- Published: 6 Jul 2024
- Support us on Patreon: / hardwareunboxed
Join us on Floatplane: www.floatplane.com/channel/Ha...
Buy relevant products from Amazon, Newegg and others below:
GeForce RTX 4070 Super - geni.us/wSqSO07
GeForce RTX 4070 Ti Super - geni.us/GxWGmYQ
GeForce RTX 4080 Super - geni.us/80D6BBA
GeForce RTX 4090 - geni.us/puJry
GeForce RTX 4080 - geni.us/wpg4zl
GeForce RTX 4070 Ti - geni.us/AVijBg
GeForce RTX 4070 - geni.us/8dn6Bt
GeForce RTX 4060 Ti 16GB - geni.us/o5Q0O
GeForce RTX 4060 Ti 8GB - geni.us/YxYYX
GeForce RTX 4060 - geni.us/7QKyyLM
Radeon RX 7900 XTX - geni.us/OKTo
Radeon RX 7900 XT - geni.us/iMi32
Radeon RX 7800 XT - geni.us/Jagv
Radeon RX 7700 XT - geni.us/vzzndOB
Radeon RX 7600 XT - geni.us/eW2iWo
Radeon RX 7600 - geni.us/j2BgwXv
Video Index
00:00 - Welcome to Hardware Unboxed
01:39 - Performance Testing
09:33 - FSR 3.1 vs FSR 2.2
11:57 - Image Quality: Ratchet and Clank Rift Apart
14:12 - Image Quality: Horizon Forbidden West
17:05 - Image Quality: Ghost of Tsushima
18:37 - Image Quality: Spider-Man Miles Morales
19:55 - Image Quality: Spider-Man
20:50 - 1440p Image Quality Comparisons
22:35 - Quality vs Quality vs Quality
25:23 - Final Thoughts
Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
FOLLOW US IN THESE PLACES FOR UPDATES
Twitter - / hardwareunboxed
Facebook - / hardwareunboxed
Instagram - / hardwareunboxed
Outro music by David Vonk/DaJaVo
FSR3.1: Ghosting in Tsushima
I laughed so hard. Nice one xD
even though ghosting is bad, ngl it makes the sword look cooler, stronger, faster
@@redbaronmars4176 hell nah
@@redbaronmars4176 and stop liking your own comment
@@lucaschenJC I liked his comment
I swear the video titles 5 years from now are probably gonna look like this: "DLSS 4.1.69 vs. FSRSS vs. XeSS vs. PiSS - Can Dr. Pepper's upscaling technique save us from the GeForce 7660's $2000 entry-level price tag?"
PiSS lmao
PiSS >>>>
Picture Intact Super Sampling? Give me 14 to-go!!!
Raspberry Pi Super Sampling?
That PiSS really streamlines the flow through the pipeline.
"FSR looking like my DMs: quality is decent, but in the end the ghosting ruins everything" - someone somewhere, probably
Good one bro🤣
😂 I'm taking that one.
Underrated comment
@@MattJDylan Did you use DLSS 2.0?
man, it's hard for me to tell the difference in the quality comparisons without being told. I wonder if I'm not alone
You're not; I can't see the difference, and the differences are overblown.
Same, I have a hard time telling the difference too
Because you're seeing a video that has been butchered by YouTube compression
I can see the differences, and they definitely help to understand how the tech stacks up. But I will agree that if I'm actually playing the game and not measuring my peepee with graphs and slowmos, I would find it borderline impossible to tell the difference between FSR 3.1 and DLSS, both at Quality. XeSS at Quality I could probably tell apart in certain games.
This being at 4K, of course; at lower resolutions I still think all of the technologies look a bit wack. I'd rather upgrade than use upscalers at lower resolutions.
As an IT professional for 25 or more years now, I agree. Well, if you go looking for the artifacts you can find them, but the quality difference of most games and tech these days is not like it was in the past. When you played Half-Life, or even the sequel, going from a low-end GPU to a mid-tier one was a pretty big change, and if you went from low/mid to top tier, again the performance and graphical changes were significant enough that most untrained or uninformed people would pick up on them.
I think on average the difference between low, mid, and high tier settings is much less dramatic in this current gaming era. When it comes to upscaling tech, most won't see the difference between 1080p and 1440p (okay, maybe some would), and if we talk 1440p to 4K, not many will see the change without slowdown or close inspection.
Performance, now that is probably more noticeable to everyone, as playing at sub-60 fps and jumping to a consistent 60 or above can be a game changer, and most will notice the improvement in fluidity, unlike extra grass or slightly clearer textures. So as long as these FG technologies can provide a smoother feel without degrading the image so much that the average layman can tell the difference, I think it is a big win, especially for lower-spec gamers!
The ghosting didn't really bother me, until you showed Ghost of Tsushima. My god, that's bad.
Ghosting in that game is a deliberate design choice
Did the frame-jumping DLSS walking animation also lead you to say "My god, that's bad"? It isn't common; DLSS *GENERALLY* is better, but it is forced to be a closed box so it can make you buy their graphics cards, so that should be a hard nope.
@@markhackett2302 Look, I have an RX 7800 XT PC and I don't use FSR Quality. I only use raster because I can't stand that upscaler.
@@kagander8619 no, it isn't 😂
Source
@@TheSometimeAfter A joke, that is. "Ghost" of Tsushima
it's a shame they waited until now to switch to DLL packaging, because now there are a few years of games stuck with an inferior upscaler unless the devs go back and patch them, but hey, at least that number won't grow anymore now
Thankfully, if the game does support DLSS 2.0+ then you can use OptiScaler to translate it to FSR or XeSS.
It only supports FSR 2.1.2 and 2.2.1 right now but there's a branch with FSR3 support for DirectX11 games.
You'd hope. FF14 just updated with FSR *1.0* last week. I was baffled when I found out since the option isn't labeled with a version number.
@@ironeleven considering the HW requirements for FF14, I wonder what's even the point of implementing FSR...
18:23 Ghosting of Tsushima
the best temporal AA method in Ghost is SMAA T2x
That surely is a deliberate design choice?
@@trulsdirio who cares if it is a design choice if I do not like it.
By the way, the sword swing effect is a feature, not a bug. Hope you like that feature then...
Ghosts of Fukushima
@@sudd3660 What are you going on about
If you need to zoom in 3x and slow-mo at 1/4 speed to elucidate the difference, is there really any tangible difference while gaming in real time, I wonder
The flickering, ghosting and smearing of FSR is extremely noticeable in many games while playing. I switched to XeSS 1.3 whenever possible because it doesn't have these issues anymore, and the shimmering of XeSS 1.3 is much less noticeable.
For me, the shimmering and flickering are the things I can always spot and it's really distracting. Also if there's overall blurriness. Ghosting is something I don't spot that much during gameplay, although the GoT example here was really bad for FSR.
It'd be nice to see closeup comparisons to the native and render resolutions as well.
DLAA, FSRAA, XeSSAA would be nice indeed!
Is it just me, or were all these games from Sony? Take that in for a moment... Sony is giving gamers the most flexibility for upscaling
Doesn't the PlayStation use an AMD chip for graphics? Surely that must be related.
All thanks to Nixxes!
@@JayMaverick As I remember, there were reports that PlayStation has its own upscaling software. I mean, we've been upscaling on game systems since at least the PS4 era, and not just scaling with pure math.
@@xxstandstillxx yeah, the PS4 Pro has been using upscaling since 2016!
@@xxstandstillxx that is checkerboard rendering. But FSR is being used in many titles this generation since all current gen consoles have AMD chips.
I do understand the "performance" settings compared, but it makes no sense (at least to me) to compare 1440p-to-4K upscaling with DLSS and FSR and then use XeSS 1.3 in Balanced mode, which is upscaling from 1080p... that's a huge difference pixel-wise, hence why XeSS is not delivering better results
I truly believe you're right. But with this new change Intel made to the scaling, I'm not sure Balanced isn't upscaling to 4K from 1440p.
Well, he normalized for performance in the initial comparison. It doesn't really matter how good it looks if it's not at the FPS you want, so he made sure that all the upscaling techniques were performing similarly first.
isn't this equal performance testing?
HUB have hated Intel Arc since day 1.
@@QuentinStephens Any citations for that? Weird opinion.
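For anyone puzzled by the "performance normalized" method: rather than comparing the modes that share a name, you measure each upscaler's fps uplift over native and compare the modes whose uplift matches. A minimal sketch of the idea in Python (the fps numbers here are invented for illustration, not HUB's measurements):

```python
# Toy example of performance-normalized upscaler comparison.
# All fps values below are made up for illustration.
native_fps = 60

measured = {
    "DLSS": {"Quality": 84, "Balanced": 92, "Performance": 103},
    "FSR":  {"Quality": 78, "Balanced": 87, "Performance": 99},
    "XeSS": {"Quality": 76, "Balanced": 84, "Performance": 95},
}

# Use DLSS Quality's uplift (40% here) as the common baseline.
target_uplift = measured["DLSS"]["Quality"] / native_fps

for upscaler, modes in measured.items():
    # Pick the mode whose uplift is closest to the baseline uplift.
    best = min(modes, key=lambda m: abs(modes[m] / native_fps - target_uplift))
    print(f"{upscaler}: compare using {best} ({modes[best]} fps)")
```

With numbers like these you'd end up pitting DLSS Quality against FSR Balanced and XeSS Balanced, which is why the named modes in the video don't always line up.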
The vibe in the industry is that it's AMD vs Intel for the upcoming generation. Nvidia is pretending they don't have competition.
Well..... do they?
RDNA2 was a magical generation, RDNA3... not so much, RDNA4 is an RDNA3 bugfix.
Well, if AMD is only making a cheaper-to-build part that matches around the 7900 XT/XTX, Nvidia won't have any competition in the high end, and Intel will need a massive power draw reduction to be anywhere near AMD's and Nvidia's previous-gen cards
I mean, Nvidia isn't going to fight until they have a reason to. Intel and AMD pose zero challenge at the top end of the GPU market, and the upcoming GPUs seem to be more of the same: AMD isn't going to try to fight the 5090 for supremacy, and has already said as much. So nothing is changing anytime soon, bud
What makes you think that's the vibe lol? Maybe you're into underdog competition. But Nvidia is doing better than ever and looks to take the next-gen performance and software crown with their AI advancements.
@@KrazzeeKane who cares about the very high end? Most people buy way cheaper GPUs
I'm glad AMD supports older Nvidia cards that Nvidia does not support
The cutoff isn't that different though; you aren't going to be playing any recent FSR 3.1 games on your RX 570 (which is no longer supported by AMD).
AMD abandoned the R9 390, meanwhile Nvidia only lately abandoned the GTX 700 series and still supports the 900 series
@@yarost12 Well, a $1200 GPU like the 3090 not being supported is like being ripped off, compared to a 10-year-old $160 GPU
@@user-wq9mw2xz3j We will see when Nvidia develops a new arch instead of using decade-old CUDA cores.
Especially since the 4070 used here might become "older" at the end of the year ...
I'll be honest, while you can find these artefacts when looking for them, all the technologies are good enough that I'd be able to just play the game without it distracting me
Yeah I agree. To me these upscaler comparisons read a lot like audiophile speaker reviews.
copium.
@@WrexBF you keep using that word, but I do not think you know what it means. (On a less memey note, I don't have a leg in this game, so why would it be?)
Why would you take something inferior when something superior exists? This is nothing like an audiophile snob problem; there's a reason each company, and now Sony, are all spending tens of millions to develop and market their own upscalers.
In the future you will be right but for now there are simply inferior and superior ones. They all have their own issues though.
copium.
Great comparison
I am confused when they explain XeSS Quality and DLSS Quality running at a higher resolution. If that's the case, why not compare DLSS/FSR at the same resolution that XeSS is using, which is Ultra Quality at 1080p? It doesn't make sense to use "XeSS Quality" when they are fundamentally using different data sets.
I miss the times when GPUs were able to run games well at native resolution...
they still do.
@@mryellow6918 at what cost, tho?
This is a problem, you're correct. Upscaling is a garbage bandaid tech.
They still do. Except now you can use mid-range cards like a 4070 or 7800 XT to play at 4K with decent frames. I'd sooner do that using upscaling rather than play 1440p native.
At least it's over. Next generation they have to give actual performance increases and can't hide behind software while charging the same price-to-performance at native. Hopefully.
Thank you for testing with the different cards. A lot of other folks who are doing these reviews only use an Nvidia card.
Been waiting for this video.
As the eye catches much less detail on fast-moving objects, increasing temporal sampling makes sense if it reduces shimmer, which is more noticeable.
While I understand the intent, I believe "performance normalized settings" was a mistake. The intent of the video was to compare the improvement of FSR 3.1: how it compares to older versions of itself and how it stacks up against its current competition. Showing us comparisons when we know that each upscaler is not using the same internal resolution almost defeats the purpose of the breakdown. Comparing one upscaler pulling from 1080p to one pulling from 1440p feels pointless. I understand your reasoning, but I think it would have been better to just show the improvements and compare them at each similar internal resolution, then afterwards highlight the different fps performance at each setting. Anyhow, love the work, long-time viewer.
Upscaler performance also depends on your GPU generation and GPU class.
You might get completely different performance results on a 4090, 2060 Super, 1080 Ti, 5700 XT, 2080 Ti, etc.
Yeah man completely agree
Yeah. Comparing DLSS Quality with FSR Balanced feels wrong. Of course DLSS is going to look better.
He basically did that near the end of the video, comparing the "Quality" modes across upscalers, did you actually watch the video? Even in that case, each upscaler uses slightly different internal resolution targets, so you're never going to get a true apples to apples comparison. Comparing them based on actual performance makes sense, as the whole reason these upscaling technologies exist is to provide improved performance while minimizing image quality loss.
Of course it makes sense, the only reason to use upscalers is for performance so normalizing for that is what makes the most sense.
XeSS XMX on Arc vs FSR 3.1 vs DLSS 3.7 would be interesting. With Battlemage just around the corner, this test would make sense to give some idea of what to expect. High-end Intel CPU and an A770.
Why would you need to use a high-end Intel CPU for this test? This is a GPU test, so a 7800X3D would technically bottleneck it even less. I do like the idea of more XMX XeSS tests though; everyone seems to just test the DP4a path
@@superamigo987 Because there is a performance gap when using an AMD CPU; there's a noticeable gain with an Intel CPU. Also, XMX is Intel-specific, for Arc.
Two different architectures. AMD/Nvidia, you could say, don't use the CPU much at all. Intel splits the tasks with the CPU and it matters, hence ReBAR for the data transfer and the use of E-cores. A lot less waiting and queuing means a lot better performance; the CPU is vital. Two pieces of silicon working together. That's why Arc is very affordable: it's a selling point for Intel CPUs, and also a reason I see for Nvidia looking into CPUs too. With Intel every FPS counts. AMD/Nvidia need almost twice the FPS for the same quality of performance. For example, 40 FPS at 4K is very playable with Intel; AMD/Nvidia need at least twice the FPS, because the waiting and queuing with a single piece of silicon creates gaps and stutter to be filled. FPS from Intel is not the same as it is from AMD/Nvidia, which share similar architectures. Then there is Intel Application Optimization (APO), paired with upcoming high-end Intel CPUs, promising 10-50% gains. So if you are looking for something that will age like fine wine, it is Intel.
@@superamigo987 It's because almost no one uses Arc GPUs. But it should be added as a 4th option here.
@@radosuaf I hope Battlemage will bring good performance so it isn't left out of these comparisons.
@@PeterPauls First people will have to buy these. Arc ones are not very well priced.
Pixel peeping is fairly pointless. It really only matters what actual game play looks like!
No one cares; they'll never do a proper scientific method to prove which is better. They're just going to keep doing this because it's all everyone cares about.
That thumbnail is hilarious.
I just hope Tim isn't peeping through _my_ windows with his 'binoculars'.
His 'binoculars' have interchangeable DLSS/FSR/XeSS upscaling, so you better have curtains on your windows lol
@@MuhlisErtugrul 🤣
That's actually a brilliant idea though - I'm sure someone could make digital binoculars where the built-in AI can magnify and upscale the image beyond the raw capabilities of the max zoom level of the physical hardware (lenses, mirrors, etc).
@@-T--T- Yeah you're right :D That's a cool idea. We have AI upscalers like Topaz AI but a real-time upscaling (with machine learning tech like DLSS) through lenses like binoculars would be very interesting.
Leave him alone. He's mastered. Aren't you TIroMe ?
Trossard
FSR seems to try to preserve a bit more detail, but the result is noisier. DLSS is better at hiding that noise, but at the cost of making some areas blurrier. If you pause, you can see that difference, but in motion and without zooming in, it's virtually unnoticeable.
I don't think reviewers should be focusing so much on 2D fakery (spatial and temporal interpolation) anyway. Manufacturers seem to have successfully diverted attention to that, and away from the fact that the current generation is overpriced and barely any faster than the previous one at *actually rendering 3D scenes.*
Likewise, in a lot of games the "raytracing" option seems to be just a switch that makes Nvidia cards slightly slower and other manufacturers' cards a lot slower, to change the "winner" of a benchmark with barely any noticeable change in image quality (in some games, RT actually makes shadows look a lot worse). Who wants to use 2x the amount of power, produce more heat, more fan noise, and _lose_ some FPS in exchange for (supposedly) more accurate reflections on irregular surfaces, that don't even look _nicer?_
And reviewers / journalists keep falling for it, and publishing two, three, or sometimes even more versions of the _same_ benchmark, which doesn't really help anyone except the manufacturers, by diverting attention from the lack of real 3D rendering performance improvements, considering the increase in price.
One question on RT is why the path-tracing result is not used to optimize the raster light probe positions in cases of light bleeding and artifacts.
It doesn't matter the upscaling method, none of them are any excuse for poor optimisation.
I would still like to enable XeSS in CS2.
12:30 Surprised you didn't mention that weird warping DLSS is doing instead of ghosting, like it's splotching the ghosting out. Oddly enough, it seems like FSR has a clearer picture here despite the ghosting, due to it not being blurred
I don't see any warping from DLSS 🤔
@@EveryGameGuru AMD fanboys are coping
@@GewelReal Nvidia fanboi is salty :P
@@seaneriksen2695 Salty for what? Being the best lol
@@joaquinbigtas1396 being the best at spending $2000 on GPUs that have the same performance as $800 ones
Nice zoomed in shots. This is the best video I've seen for a comparison so far.
Great deep dive on the comparisons, and I'm glad that the big three are pushing the longevity of cards further. I have noticed the ghosting with FSR 3.1 in HFW but was able to reduce its perceivability by reducing motion blur. Overall, I still feel hesitant to recommend graphics cards to friends based on image upscaling features, because not everyone is willing to test to find the most optimized settings possible. With the state and quality of implementations, it's still a nice-to-have but not a must-have. This may change as it gets implemented across more titles.
Good progress AMD but gotta fix the ghosting.
Me not being able to tell the difference between any of them even on still frames:
Whichever is cheapest.
🧠
must be watching on a 5" phone lol
That's just being a contrarian.
Maybe invest some of the money you save in an eye exam.
it kinda just seems like they slapped a layer of TAA over the top of FSR and called it a day.
Thanks for keeping up with the comparisons between the upscaling technologies it's extremely helpful when determining what to use and when. I would love to see an image quality comparison with XeSS in XMX mode to see what the differences are though I imagine it's best to wait for Battlemage to do that.
Thanks, great video. I hope there's a UE5 games test video soon.
I'm just wondering how long it's going to take nvidia to start charging a monthly subscription for dlss
When the 50 series launches.
It's very pleasant to see that the universally compatible offerings from red and blue are greatly improved compared to their initial versions. I believe that everyone's problem would be resolved if AMD released an XMX-equivalent version of FSR like Intel did to please their owners. Although my gut feeling is that they may first release something that will run on their shiny new NPUs. I really hope that that's not the case and they instead provide updates to FSR 3 that will make the 7000 series more compelling, at least in terms of upscaled visual quality.
DLSS also has different preset options you can swap between using DLSSTweaks. It's a great way to tune out ghosting or to increase the softness. In Death Stranding DC, DLSS 3.7.10 favors preset "C" due to the decreased ghosting, even though for most games I've tested or seen, preset "E" is considered the most performant and is the default.
I'm not sure if either FSR or XeSS have preset options, but it's another layer to team green's cake that I enjoy.
This video would have been perfect if you had included native AA.
The closest you're going to get to native AA in most of these games is DLAA; native plus temporal anti-aliasing is essentially native, as the temporal AA doesn't really incur a significant cost. What frustrates me is looking at any of these solutions as performance enhancements... if anything that's a bonus, as the real benefit of machine learning super sampling is that it does anti-aliasing in motion without many of the significant drawbacks of traditional TXAA, or temporal AA in general. This is why FSR is such a joke as far as I'm concerned...
Depending on the resolution you're upsampling from, you can end up with worse quality in motion than traditional TXAA, depending on how it has been implemented... and on top of that, the absolutely brilliant contrast adaptive sharpening they have, which is usually the perfect solution for TAA blurriness, is already integrated into the package, and usually in a terrible way.
I used to have issues with DLSS, but what I didn't realize is that in-game LOD settings were being hyper-sharpened so that you could see them in all their blurry glory, not dissimilar to how TXAA is implemented in a lot of "best practices", which is silly... nothing feels like gaslighting like the entire world being blurry until you stop and look at it...
Once I fixed the LOD settings, the performance in motion with transparencies and particle effects... it's like going back to the old days in terms of clarity in motion. Throw a light ReShade of CAS on top of DLSS or DLAA and you get about as close as a modern game can get to the clarity that we used to have as normal, depending on whether you can fix the LOD settings lol, as they are usually trash. I swear, developers... I don't think they have very good eyes lol.
@@JackWse nice post my dude.
At least at 4K high settings, across all games I have played so far, I never could tell the difference between 4K native and 4K with FSR Quality
I can easily tell the difference. Even DLSS at Quality looks bad to me at 4K. I play about 5 ft away from a 65" screen though. DLSS makes everything soft and blurry; FSR has ghosting and a LOT of visual noise and shimmering
@@Dempig I experience the same thing you mentioned. Once you see it, it can't be unseen. I would like to have a High Quality setting between Native and Quality. It's just not good enough to switch to DLSS / FSR / XeSS. Graphics is king, especially with solo games like Horizon or Ratchet & Clank where FPS and reaction time are less important compared to games like Call of Duty or other e-sports titles.
@@Dempig 4K at 65" is a huge difference compared to 4K at 27"
65 inches at 5 feet! That has to be really immersive. Are you able to see the edges of the screen at that distance without turning your head?
@@itisabird Yep, it about perfectly fits my viewing range. I will eventually go bigger; I love large screens.
Tried FSR 3.1 with Spider-Man on Steam Deck; it actually worked pretty well, you just have to manually lock the GPU at 1600 MHz and have a 60+ fps frame rate cap
Nice video; still, I would like to see a Native vs Upscaled battle at this level of detail.
I am more interested in native + frame generation, why is there barely any coverage...
I have no clue at this point how stuff is being tested, or any feeling that it is fair, because DLSS only runs on an Nvidia card (so any testing of it is exclusively on Nvidia). Is FSR in the same game then also exclusively being run on a Radeon 7900 XTX, since as we know by now it will show the best it can do on a Radeon card? While XeSS is run at the same time on the best Xe Intel GPU, as it runs best there? Or is Nvidia being given the best of both worlds and everything is run side by side on an RTX 4090???
FSR has no visual difference on an Nvidia card or a Radeon card.
Great review! Can't say I can see much difference between the three technologies now, which is nice. I only use FSR on the laptop for select games, so hoping this update can improve that experience a bit 😀
FSR, DLSS, and XESS are so different in how they tackle things that it would be interesting in the future to use them as image and video filters for a unique look or even combining them to make a custom filter.
DLSS and XeSS are pretty similar
What about "Lossless Scaling" available on steam? It claims to work on any game with any gpu, even older GTX cards.
it's good but it isn't perfect
Have been wondering about this a ton lately as I consider AMD vs NVIDIA to replace my dying 1080! Thanks guys!
W GPU! Keep it, it'll be a nice memory
Dying how? Does it just need a thermal paste refresher?
What resolution are you playing at? I'm at 1440p and went from a 3080 to a 7900 XTX and it's awesome
If you can afford the 5080, get it at the end of this year.
@@adi6293 I did the same and it was a great jump, but I still like the polish of Nvidia products when it comes to drivers and programs. I'll probably go back to Nvidia when the 6000 series comes out, or whatever is after the 5000 series this year.
23:16 I kinda like the FSR "light through the leaves" here; it's not a bug, it's a feature ;)
I promise you wouldn't like the look of it in game
All three look pretty good now. You might not notice without zooming in and slow motion.
I would love to see 4K native next to the three. Also, I wonder how much of a problem some of the FSR 3.1 issues really are, since I don't look at the monitor at 300% zoom and I don't play at 25% speed. I know you've added all that for the sake of the video; I'm not being harsh on you.
You're watching a compressed YouTube video here, which subdues most upscaling artefacts by turning them into compression artefacts. Zooming in is basically necessary to SHOW you how that'd look on uncompressed footage (i.e. your display's output).
I just finished playing Ghost of Tsushima at 4K with FSR enabled. During the whole 60 hours of gameplay, I experienced ghosting more than once per hour. Usually it was a relatively subtle artifact that disappeared in around 1 second. Sometimes, around 10 times in total across those 60 hours, the artifact was severe enough that I had to stop playing for a couple of seconds because it interfered with my vision. I knew it was caused by FSR, but it didn't bother me enough to turn it off. The game looks spectacular anyway.
I don't understand the performance-normalized idea. Generally, while using FSR, most will just use Quality for the little extra boost to FPS while maintaining image quality.
Yeah, that entire section seemed pointless. I was actually going WTF until I saw that comparing Quality across the board was in a section coming up.
It seems especially pointless since most will not notice the slight performance losses between the different upscalers thanks to displays with adaptive sync, but will most likely notice the big improvements in overall image quality.
It's not pointless, because it's inherently an fps-increasing technology, and one that games are increasingly relying on. A 40% performance boost is preferable to a 20% performance boost (5:39). Just going blindly with the same name across different upscalers and only looking at the quality is going to give you a skewed image of the value they provide. With 4K, which is largely what the video is aimed at and where upscalers are the most relevant, that additional performance can make a big difference.
The section comparing the highest quality of those upscaling technologies is in the video. So, what are you mad about?
@@Maxoverpower I don't have an issue with normalizing, it makes sense here, I am just confused as to why they decided to use FSR Balanced in Horizon Forbidden West since, by their own charts, FSR Quality performs as good as DLSS Quality.
It's at 4:35. The difference is one frame in the AVG framerates, and the 1% lows are identical. This is as within the margin of error as possible.
Shouldn't that particular example constitute the exact same performance uplift?
I was seeing some stationary luminosity noise in the FSR I hadn't noticed before.
It may sound weird, but using a controller on PC helps me in some ways to notice fewer artifacts, ghosting and shimmering when using FSR 3.1 and XeSS 1.3
It's not weird, especially with AFMF. AFMF is disabled with fast motion. Using a controller to move the camera alleviates that.
So what is XeSS Ultra Quality?
XeSS scaling factors in 1.3 versus earlier versions:

Preset            | XeSS 1.3 scaling | XeSS 1.0-1.2 scaling
------------------|------------------|---------------------
Native AA         | 1.0x (Native)    | N/A
Ultra Quality     | 1.3x             | N/A
Ultra Quality     | 1.5x             | 1.3x
Quality           | 1.7x             | 1.5x
Balanced          | 2.0x             | 1.7x
Performance       | 2.3x             | 2.0x
Ultra Performance | 3.0x             | N/A
🤔
@@Chasm9 Thanks! 2 questions: you wrote Ultra Quality twice (1.3x and 1.5x) - what's the difference between them? And secondly: what are the scaling factors of DLSS Quality and FSR Quality? Ty, my wholesome potato in shining armour :D
67% res scale, just like FSR Quality and DLSS Quality modes
Same render resolution as DLSS Quality and FSR Quality. They changed the render resolutions recently and so the old "Quality" became "Ultra Quality".
@@exscape Thanks, guys!
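Putting those factors into concrete numbers: they apply per axis, so each XeSS 1.3 preset implies the internal render resolutions below at 4K. A quick sketch (and if I'm not mistaken, Intel's name for the new 1.3x tier is "Ultra Quality Plus", which is why the table above shows "Ultra Quality" twice):

```python
# Per-axis scale factors for XeSS 1.3, from the table above.
factors = {
    "Native AA":          1.0,
    "Ultra Quality Plus": 1.3,  # the 1.3x "Ultra Quality" row above
    "Ultra Quality":      1.5,
    "Quality":            1.7,
    "Balanced":           2.0,
    "Performance":        2.3,
    "Ultra Performance":  3.0,
}

out_w, out_h = 3840, 2160  # 4K output
for preset, f in factors.items():
    # Each axis is divided by the factor, so pixel count drops by f squared.
    print(f"{preset:18s} -> {round(out_w / f)}x{round(out_h / f)}")
```

For reference, DLSS and FSR Quality use a 1.5x factor (67% per axis), so XeSS 1.3's Quality mode at 1.7x renders from a lower internal resolution than they do; its 1.5x Ultra Quality tier is the one that matches them.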
Glad FSR is making good progress. This tech works on virtually every (gaming) GPU, so it's already leagues better by giving almost everyone this tech, mate. Even if it looks less good in my opinion, it's already a more valuable tech to further develop, not some gimmick only working on the latest hardware. And DLSS 4 will probably work only on RTX 50, because Nvidia designs tech this way just to create an incentive to buy their newest GPUs; same story with the ridiculous VRAM amounts on GPUs costing north of 800 bucks. 16 GB on a 1100 euro GPU is a disgrace... CP2077 already (almost) uses that amount at 4K with all max settings. No matter how fast the VRAM is, if it's not enough you get stutters and low FPS. So let's focus on things that help gamers as a whole instead of bashing tech that gets better and better and is helping the entire (PC) gaming scene! It's just getting a tad boring, most channels bashing FSR/XeSS. You're paying 100-300 bucks more for DLSS, so you should EXPECT it to be better. At the same price point with AMD you get 2 to 8 GB more VRAM, and drivers and performance are actually really competitive.
Great breakdown. I thought it looked decent in my testing but then again I'm not pausing and zooming in 3X to look for issues. Though, I have a 3080 so I'll continue to use DLSS unless it's not an option in a game.
I look at this and I still say that upscalers are like televisions. The only way to make one look even remotely bad is to have it next to a superior model. If you only have one to use, you'll be perfectly happy with it because, at the end of the day, the impact that they'll have on your experience is very slight if they have any impact at all.
I remember I used to use XeSS on COD because FSR and DLSS used to have this annoying artifacting in the waiting lobby; it wasn't noticeable in game but it used to drive me crazy
Am I the only one who thinks that upscaling technology should be standardized for all?
yes by not existing.
No, it's just that good upscaling is highly dependent on vendor specific support for AI at the moment, so until the industry agrees on some common ground, we're going to be in this situation of vendor-specific upscaling solutions.
Nice comparisons; it would have been nice to see a bit more side-by-side of FSR 2.2 vs 3.1. A shame to still see so much ghosting and shimmering in 3.1. Hopefully they keep improving and get it implemented more in consoles. Will be curious to see if PSSR catches on in the PS5 Pro.
This is an upscaler that's constantly making progress and it's free. Not too long ago we were under the impression that Ampere and previous gen were not able to execute framegen. AMD exposed that. Now we have blockbuster titles that can utilize a mix of DLSS with FSR 3.1. I have friends with Turing and Pascal technology that are seeing their cards get a second wind which is awesome. On an OLED screen it still looks damn good with old tech.
At around 17:45, in the Ghost of Tsushima walking animation, see the jumping around of the DLSS version of his head, especially noticeable in the feather, compared to the far more flowing look from both FSR and XeSS. This is the problem of making the baseline "wot DLSS duz" instead of "no upscaling". He's peering intently at both XeSS and FSR to determine WHAT FLAWS CAN I SEE, and thereby ignoring DLSS here.
It's not a widespread problem for DLSS use, but that is Tim's problem with his focus on how DLSS HAS TO BE best.
16:18 the texture on the blade
@@brunogm Which Tim already pointed out as problematic, so at no point did I have to.
Your point seemed to have been inside your head only.
I bet that's just an animation glitch in the game, nothing to do with DLSS being the upscaler. You can see he briefly stops walking to reset the animation, and the movement becomes smooth again.
Because it is the best, so it's the benchmark the other two are aiming for.
It should be compared against native. Otherwise you compare errors to other different errors
What exactly would you say is a "night and day" difference here (11:10)? Good video, but you missed some issues, like the new moiré pattern issue in Ratchet & Clank with FSR 3.1, the aliasing on Clank when he's a backpack, or the disappearing confetti. I also think all the testing should've been done at lower resolutions than 4K to highlight the differences.
You can't catch everything when looking into this stuff, but what you mention about R&C were some issues I came across when looking into it.
Impressive given that XeSS is not running in its highest quality and performance XMX mode
Pretty happy with FSR 3.1 on Horizon Forbidden West. Got a new 7900 GRE and on 1440p ultrawide I get about 100-120 FPS with it. Used settings recommended in some Reddit post, looks fantastic and it's smooth. Balanced setting.
Ghosting go brrrrr
Hi, thanks for the effort. I have a question: I still have a GTX 1080 Ti, and in this case, is it better to use FSR Balanced rather than Quality if I want to get the best out of upscaling?
yeah, it doesn't make sense to me
Use the highest FSR setting, which is FSR Quality. Or use XeSS. You should only choose a lower FSR setting if you want more fps at a lower image quality.
@@WrexBF Yeah, but the video says that "on Nvidia GPUs, FSR Balanced seems the way to go, as Quality doesn't give enough frames to justify upscaling".
If you watched this without audio, you'd come away thinking that FSR has pretty much achieved parity with DLSS. But if you only listened to the audio, you'd think FSR isn't even close. Either my vision is going early, or what's-his-face has some personal preference influencing the interpretation. I say this having bought 6 Nvidia and 1 AMD GPU in my life. But I also have never bothered using upscaling. If I did, from just watching this video I would think FSR was close enough and go with the cheaper option. But again, if I only listened to it, I'd think, well, better not take a chance on FSR.
One of two things: AMD and Intel either need an equal upscaler, or they need to take a stand on RT being a useless tech and go all-in on rasterized performance.
Great video as always. Would have appreciated some handheld resolutions (800p, 1080p), given the increased issues seen at 1440p. My main issue with FSR 2 was the pixelation on foliage, so it's great to see the improvements.
I understand choosing the setting that has the same type of uplift, but at the same time, realistically speaking, I think it would have been worth it to do a comparison at similar fps instead, because someone with a 4070 and someone with a 7800 XT playing the same game would probably go with whatever gives them acceptable fps
So if on both cards you get 100 fps in a game with the Quality setting, both would go for that setting even if the uplift of DLSS is a higher %, making it a more realistic comparison. There's value in comparing at the same uplift, but the real-life comparison should also be based on similar fps (I know you mentioned it would be too many combinations, but I think that's the more realistic scenario: people choose based on fps, not on % of performance uplift)
You need to rewatch the video.
I would have put the scaling ratio next to each bar because they have different names and I always forget which is what
I also noticed ghosting with the FSR 3 TAA implementation. Wondering what would happen if it worked with other TAA tech.
Thanks, Steve!
5:15 Why do we want to match the uplift? We want to match the frames. Kinda a bad take here, but okay.
Agree
I second this. Like, I understand it from a "scientific" perspective, they want to know just how much better the methods are at improving performance, but for gamers this is an irrelevant stat. All that matters is that the resulting frame rate is satisfactory for the consumer.
Why are you comparing DLSS Quality against FSR Balanced (15:10) in Horizon FW when FSR Quality is just as fast as DLSS Quality?
And then DLSS Quality against FSR Balanced at 19:01 in Spider-Man? Your Spider-Man benches seem wonky, as the 1% lows are worse for DLSS Balanced than Quality.
Should be comparing Quality across the board, what a bunch of biased shit.
You totally missed the performance-normalized upscaling part (8:55), didn't you 💀
@@DrLogic_ The Quality mode for FSR in Horizon FW is just as fast as DLSS Quality - so no reason to be running an inferior Balanced mode. In Spider-Man: MM, FSR Balanced is faster than DLSS Quality.
Still should be comparing Quality vs Quality, Balanced vs Balanced, and let the user decide the level of performance. Didn't see any concern about performance matching in previous reviews, unless I missed something.
@@DrLogic_ He's right though, for Horizon Forbidden west, the numbers for DLSS Quality are 63 AVG, and 48 for 1% lows, and for FSR Quality it's 62 AVG and 48 for 1% lows. That's as neck and neck as you can get.
You can find that at 4:35 in the video.
IMO you should ignore the upscalers' naming and just compare them across base resolutions or % of screen resolution. Then you compare image quality: the ability of the upscaler to upscale or even enhance the image. You also get the performance of each upscaler as bonus info.
My favorite upscaler is to just play native tbh lol. Forget about Raytracing and stuff.
I was surprised to see how well the 4070 fares against the 7800 XT in those cases when upscaling (DLSS and FSR respectively) is in use.
The 4070 is a pretty competent GPU after all. If only its initial MSRP was $500, it would be a decent GPU.
enjoy using medium quality textures in 3 years if you buy a 4070/4070S
Oh yeah, most of Nvidia's cards are really great if we could just cut the pricing by like 30% lol
I think most people would agree that pricing is the worst thing about GeForce at this point. Unfortunate
@@thrafkroos would you mind having a conversation with Nvidia for me and asking them to put more than 12GB in a laptop under $2000?
Thanks
@@TheDarksideFNothing The pricing is the reason why I changed my 1080ti for a 6900xt when that dropped in price. I've used Nvidia since the 8800GTX but that current pricing is just... Fucking nuts.
@@thrafkroos I doubt any major changes will happen in 3 years. Next gen consoles will surely increase VRAM usage in games significantly. But they will not launch in 3 years, more like in 4-5 years.
But anyway in 3 years games will become much more demanding in terms of raw GPU power. By that time the owners of 4070 S will be lucky to run modern titles at 1080p/High settings. Thus 12GB will still be fine. Look at Hellblade 2, Banishers and Robocop - they use just about 8GB at 1440p. So 4070 Super runs out of power way before VRAM capacity becomes an issue. Since UE5 is the most popular game engine, most games will behave the same way. Only a small bunch of future console ports may become a problem for 12GB cards.
Here we go again.....
How does it make sense to test Quality vs Balanced vs Performance mode and compare image quality?
The section comparing the highest quality of those upscaling technologies is also in the video. So, what are you mad about?
nice video, more in line with a real user's PC... it's amazing to see scene by scene how the tone makes a huge difference
Think I'll still hold my view of avoiding all 3 of these techs unless absolutely no other option is available.
And I think that's the best option. If you can buy a card that plays games at the framerate you want at your desired resolution, none of this should be necessary. By the time it might be, it's likely all of these features will have reached parity.
Native rules.
DLSS at its quality settings looks BETTER THAN NATIVE.
Yup. I might think different if I had a 4K display, but 4k+high hz is still too expensive.
@@Galf506 in 5% of the games?
Ancient Gameplays beat you to the video.
"FSR 2.2 vs FSR 3.1 vs XeSS 1.3 vs DLSS 3.7 - Which one is BETTER and WHY?"
Tbf, I've never once considered them competitors. AG even said he watches this channel sometimes.
Yeah, and somehow his comparison has DLSS on Nvidia and FSR/XeSS on AMD. HUB is a big channel but only ran an RTX card for all three, and as we can see, FSR is worse on an Nvidia card since it has lower FPS, so they need to drop the quality setting to get better fps, which leads to a worse FSR image. This is no longer comparing the quality-setting image. HUB seems to have managed to pull a smokescreen of Nvidia marketing here. I rarely watch HUB anymore aside from CPU comparisons and news.
All of this is filler until the definitive DF video drops.
Geekslab / rtx raytracing training programs / memory tile training even over tile to get the chip to stimulate cells not normally used / then power and ai through power filter , it’s kinda why I like playing online with a battery now aka off grid
It depends on the implementation. Like in Forza Motorsport, using DLSS not only reduces visual quality but also manages to decrease performance. Just wonderful
Will you please always check the specs of the machines you use: while a 4070 is nearer the type of GPU most people use, only about 5% (a guess) use a 4K monitor for gaming. This vid gives little info on whether the 4K monitor has an "it matters" impact on the image as opposed to a 1440p or 1080p monitor
4070 actually isn't close to what "most" people use. Most people are a couple generations behind, and the 60 class cards have always been more popular.
@@matthews2243 Higher end GPU users are always delusional about the cards they own. lol.
According to the Steam hardware survey, around 7% of Steam users who use Nvidia GPUs have an RTX 4070 or better
Am I the only one that’s never touched any of these upscalers?
Got a mid-range board; when I played CP it was enabled by default. After some hours of gameplay I decided to turn it off, and man, even with half the fps the game was better than with upscaling...
@@cfzeroooUh-oh...
I didn’t use them yet, because I still use older gpu Vega 64, thats still roughly on par with RX7600 and thats on 1080p, so upscalers has little use there.
1080p is still the most used resolution and there upscalers have little use, so this whole technology is still pretty niche.
What I would like to use though is DLAA or FSR Native AA, but I can’t, because I don’t have newer gpu and very few games support it.
DLSS is often better than native TAA, so in those cases it would be foolish to not use it. I'm using a Mod in FO4 to enable DLSS in DLAA mode to get rid of TAA blur.
@@Littleandr0idman I can definitely see myself never touching it if I didn't have an Nvidia GPU although Intel's solution is getting pretty good even if it doesn't provide the same boost in performance.
Good evaluation. It's funny because with a few tweaks I was able to achieve 70+ fps in Dying Light with DX12 ray tracing, all settings maxed, with the 6800 and FSR Quality at 1440p-4K. But I just couldn't get past the lows; sometimes it would be under 55 fps and it was noticeable. What saved me was FSR 2.0 Balanced. I never dipped below 60 again, but it is much more noticeable, especially with fine-level detail like trees and gates etc. Ray tracing is something else, and after seeing Dying Light with it, it is well sought after for me. I am thinking of upgrading to a 4080 Ti or Super.
lol, Motion blur permanently enabled.
Had the same conclusions in my FSR 3.1 videos 😅
AMD did a great job. So far I've tested FSR only in Horizon and Ghost of Tsushima, but in both there is a big image quality step up; the only problems are much less visible weather effects with AMD's solution and a lot of ghosting in Horizon 😉
Thanks for video!
Don't take it personally: I've seen how the protagonist in horizon looks, she would be ghosting me too, no doubt 😔
@@MattJDylan 🤣
You should've reviewed every technique at the same or closest possible internal resolution. This benchmark isn't good because we aren't really getting a fair comparison of quality, but instead a comparison of performance-normalized quality.
Could you give us an update on XeSS at 1080p? Your previous video only had DLSS and FSR tested at 1080p.
Seems like the performance testing could have had an AA-off option bar chart as well.
I have the 7800 XT. I run most games at native 1440p. If I have to zoom in 200% and reduce the speed to notice the difference in render resolution or individual pixels, then I have a bigger problem than deciding which upscaler is better.
Man, I can confirm that as an AMD user I don't even touch upscaling technology, cuz we have monsters in games such as the RX 7800 XT and RX 7900 series, so no need to even think about upscaling. But 😂 on the other hand, Nvidia 😂 without using upscaling tech they would be equal to Nvidia's last generation, the 30 series 😂 plus it's too expensive 😂
@@2284090 My go-to now with a 7900 XT is native res with the Lossless Scaling app at 3x frame gen, and everything runs and feels extremely good
Bro thinks he smart oh nonono don't tell him
what does that have to do with the actual comparison of the feature? Typical AMD fanboy comment
Incredible how FSR performs, especially because it doesn't use any type of AI
The issue is not performance but image quality and stability. Why does FSR fall behind XeSS and DLSS at lower res like 1080p? Because it doesn't have the ML component to generate the missing data from the lower res.
"AI" is kind of meaningless, it's just a marketing buzzword. All these algorithms are trained using machine learning (basically, they render the game at two different resolutions and then train an algorithm to create the higher-res version based on the lower-res version), and then manually tweaked based on human feedback (because training for some types of image contradicts training for other types, so some choices have to be made - an issue known as "overfitting").
None of the actual "AI" is running on _your_ system while gaming, because to do the training your system would have to be rendering both resolutions, comparing them, and readjusting all the weights (which would completely defeat the point of interpolation, which is to improve performance by rendering only at the lower resolution). Your system is just applying a fixed algorithm, with pre-calculated (and pre-adjusted) weights.
The real issue is that both manufacturers now seem to be focusing on ways of faking resolutions and frame rates, instead of developing GPUs that can actually render 3D scenes noticeably faster. And the way media has been focusing on the interpolation "technology" (which isn't even new; my 14-year-old TV set has a Faroudja chip able to do most of this stuff) has diverted attention from real 3D rendering performance.
It wouldn't surprise me if Nvidia (or AMD) start adding two (or more) interpolated frames between real ones to claim they've "tripled" the frame rate.
@@RFC3514 you kinda need new ways to fake resolutions when you use ray tracing; we are still not there yet to fully ray trace a scene in 4K. As long as the fake resolution looks almost or even the same as native, where's the problem? And Nvidia is pretty close to that; AMD, on the other hand, is not
@@thepirate4095 - In motion, and after they've been adjusted for each game, they're practically indistinguishable.
And there's nothing wrong with adding the _option_ to use interpolated frames. The issue is pretending you need to buy new (overpriced) hardware to do it, or hoping that people will evaluate the performance of the new hardware based on the fakery, while comparing it with non-interpolated benchmarks from previous generations.
Also, "ray tracing" currently is a mess with a huge cost in terms of power consumption and heat production for very little benefit (in some games it actually makes shadows look noticeably worse). It's going to take at least two generations (of GPUs _and_ game engines) to make it worthwhile.
@@RFC3514 DLSS isn't trained for each game. It was in the beginning but it quickly changed to a general model.
What if the less shimmering and sizzling there is, the more ghosting, and the less ghosting, the more shimmering and sizzling? In other words, when you minimize one you maximize the other?
That's not how it works in general, that's just how it happened to work in the case of FSR
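For what it's worth, that tradeoff does fall out of how temporal upscalers accumulate samples: the history blend weight trades ghosting against shimmer directly. A toy sketch of the accumulation step (illustrative only, not any vendor's actual algorithm):

```python
import numpy as np

def temporal_accumulate(history, current, motion, alpha=0.1):
    """One toy temporal-accumulation step.

    history: previously accumulated frame, shape (H, W)
    current: new jittered sample upsampled to (H, W)
    motion:  per-pixel motion vectors in pixels, shape (H, W, 2)
    alpha:   weight of the new sample. Small alpha = stable but
             ghosty (old data lingers); large alpha = less ghosting
             but more shimmer and flicker.
    """
    h, w = history.shape
    ys, xs = np.indices((h, w))
    # Reproject: fetch where each pixel was in the previous frame.
    py = np.clip(np.rint(ys - motion[..., 1]).astype(int), 0, h - 1)
    px = np.clip(np.rint(xs - motion[..., 0]).astype(int), 0, w - 1)
    reprojected = history[py, px]
    # Exponential blend of reprojected history with the new sample.
    return (1 - alpha) * reprojected + alpha * current
```

Real upscalers add history rejection/clamping on top of this (and DLSS/XeSS use a trained network to pick the blend per pixel), which is exactly the part that decides where each one lands on the ghosting-vs-shimmer curve; so, as the reply above says, it's not a fixed see-saw in general.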
Everything looks good and everybody should be happy to enjoy upscaling.
Comparing DLSS Quality vs FSR Balanced is meh... a 5% difference in FPS is not important. AMD owners mostly use FSR Quality in 90% of cases and don't care about FSR Balanced
You're using a performance-increasing setting, so performance should be the most important variable to control
The thing is that even DLSS performance looks better than FSR quality in most games. So you have additional performance on Nvidia GPU.
@@imo098765 Yeah, but the reason why people use FSR/DLSS instead of just manually setting resolution to 720p is to get an fps improvement without losing too much quality.
@@imo098765 It is an anti-aliasing setting, not a performance-increasing setting. People are just using it wrong.
@@KrisDee1981 idk, maybe I'm blind but it looks really good to me. I give it an 8/10. I don't use upscalers very often, only in games that are kind of hard to run. I have a 6800 XT
Hello Everyone!
Hello 😀😄
Hey. How are your temps
Hi
My wife let me take a break off work so that I can reply to this comment. Hello to you too! (Please save me)
@@yourlocalhuman3526 So nice of her. And thank you as well.
(Type the address of pickup point and the time and I'll see what I can do).
Thank you for this work!
Seems like FSR still needs one more major update to get close to DLSS. But it's great to see that 4K Performance, which is perhaps the most important mode in upscaling, is looking way better! Ghosting is an issue, but I do think shimmering is probably the most noticeable thing while gaming, so they definitely need to fix that.
The problem for AMD is that Nvidia releases new DLSS versions very often, sometimes monthly; we are now at DLSS Super Resolution version 3.7.10, and their supercomputers are training even at this moment.
@@PeterPauls True, true. I love what Nvidia is doing to push the technology. I do think a lot of gamers just want AMD to get to a passable level, with very hard to see shimmering and ghosting, and they would be happy. If the pixel counters find issues with something, so be it. But the big noticeable things when gaming just need to become almost non-existent, and then gamers will probably be much more willing to go cheaper. Which comes down to: is this Nvidia's dream moment, where AMD just happens to be so far behind on really important 4K performance… but they are so close to closing the gap? Or can Nvidia come up with another must-have feature, like perfect motion clarity, to widen the gap again? Of course AMD still needs to fix their 2 major issues for this to matter.