To the people asking why this video is relevant - AMD is now ubiquitous in the gaming world. Look at how many companies are using their architectures in their PCs/handhelds/consoles. They fall into the perfect Venn Diagram of: Being a single source for both CPU and GPU hardware, having a very *mature* set of architectures (cough, Intel graphics), and actually being willing to be aggressive on price. Nvidia doesn’t care enough to compete much where AMD seems to shine, they’d rather make bank in the AI space. So, if AMD is the only real option for the majority of gaming machines outside the home built PC space, they’re the only company that needs to be convinced of anything. And a video like this is the only way to make the point. If it takes multiple reminders over the course of a couple years, so be it. Yes, Tim is beating a dead horse, but that’s AMD’s fault.
blows my mind that you had to say this because of how rampant AMD fanboys are especially in the youtube section its like an echo chamber thanks to channels like Niktek that only had the mindset of "Nvidia = bad guy, AMD = good guy"
Thanks for stating the obvious (unironically), whenever a video like this comes out you'll find a hilariously absurd numbers of AMD fanboys (who'll never describe themselves as such, because ofc not, they're all "pc enthusiasts") saying how it doesn't matter, how AMD magically released a patch just yesterday that fixes everything, how a user on reddit found a fix that makes a 7900xt perform better than a 4090, how Tim&Steve actually don't understand what they're talking about etc. etc. ad nauseam. It's really tiresome.
@@Chrissy717 Honesty we know that not to be true. At least in the AA and AAA space, games have become a bit more unoptimized despite the slight increase in visuals. This generation feels like a low increase despite the nice power increase in the move to PS5. Most games should look a bit better and run better than the previous generation.
@@geekmechanic1473Plus people tend to overuse the word "optimization" without actually knowing what it means. To paraphrase a classic movie: They keep using that word, I do not think it means what they think it means!
Yes, but developers also didn't have to deal with the added burden of resolutions above 1080p being available just over a console generation ago. It's much more complicated when they have to increase fidelity and features while also pumping up pixels.
Regarding whether DLSS is 'running a more taxing algorithm' to get better image quality, the video states it's not the case, but at the same time states that the only difference of DLSS and FSR is 'hardware acceleration vs generic shaders'. HW Acceleration is exactly that - it speeds up something that could be done in shader cores. It's very likely that the algorithm used by DLSS is more complex (thus it catches more of the corner cases that FSR misses), so it leverages HW Accel to reduce the negative performance impact. One can see something similar with XeSS's fallbacks - the best XeSS algo uses Intel XMX cores and may achieve visual results closer to DLSS (compared to FSR), but the DP4a fallback suffers from a significantly larger performance hit VS the XMX path. Speed and quality has always been inversely proportional, which is why DLSS was initially seen on launch as a kind of black magic because it got so close to native while still offering so much more performance. We now know it really isn't free - it truly had to leverage additional compute resources to make that a reality.
Yeah current DLSS and FSR ultimately don't make the same tradeoffs so it's not really an apples to apples comparison. It's unlikely one could make FSR equal DLSS in image quality without either needing the same hardware acceleration that may not exist in AMD's current architecture (unless the AI cores in RDNA3 can be used) but may be applicable in the next generations, and in the next console, or a significant performance hit.
Either DLSS is either very inefficient or it does not use tensor cores. Nvidia did this in the past with physX and borderlands 2 for example is not playable on FX4350 for example with physX enable because CPU is not fast enough.. Cyberpunk runs at 900p at roughly the same framerate as FSR rendering at 720p so performance cost is rather substantial. On top of it, UE4 (means, no TSR) exists and every UE4+ game has ability to surpass DLSS image quality with proper TAA and sharpening configuration. Upscaling algorithm is less relevant when you have choices for it as well, because cheaper (simple bilinear is almost free) can provide higher quality then more expensive 5-tap Catmull-Rom bicubic (not the most expensive) because it has performance advantage and therefore, higher render resolution can be used for same level of performance.
The case for DLSS (and XeSS) could simply be the fact, that higher image quality is a result of brute forcing the AI algorithm thanks to the dedicated HW acceleration and still providing better performance. FSR simply can't afford this, being multi-platform and thus relying on the lowest common denominator -- the pixel shaders.
@@CrazySerb I mean sure TAA sure has better image quality if implemented properly because it literally cost a performance hit, the opposite of what DLSS do.
This also served as a good example of the failings of TAA, which shows ghosting similar to FSR in some scenes like the stickers on the shop door in the Spiderman section.
I have never found a scenario where fsr had worse ghosting that temporal AA. Anything working temporally will have these types of effects.. even the latest dlss.
I have a cousin who works with a team on TAA improvement and did you know how complicated to improve such technology you are using for free? They don't get much funding and you expect them to work it to perfections?
TAA is much worse than FSR... and AMD has eliminated ghosting completely on FSR2.2. It's just that most developers haven't implemented the measures to combat such scenarios... CDPR added a reactive mask to the car in v.1.61, but then it was magically removed in 1.62 with the release of DLSS3. fan made FSR2 Injection mods also completely remove ghosting. So it's not a matter of tech, but rather the implementation in itself.
One interesting comparison would be between TSR and FSR. They are both temporal upscalers that don't use AI. TSR is only available on UE5 games, but there are already a few available to make a small comparison.
UE5 will be used on console so they can pick the builtin upscaler. UE5 one looks better, it goes from 1080p to 4k by design. Every time I read about TSR people state its better than FSR but that FSR has frame generation.
Where available please include DLAA / FSR native so you can compare the built in TAA to NVs TAA to AMD TAA at the same input resolution. It also allows you to see how much of the IQ delta is from the TAA method used and how much is from the upscaling.
I would also like to see this kind of analysis, but it might reduce the number of games that could be compared due to FSR Native not being nearly as common as DLAA, as all games from DLSS 1.9 can be very easily set to use DLAA even if the game doesn't expose the DLAA option. I'm not aware of a similar solution being available for FSR, but please correct me if I'm wrong.
AMD doesn't have its own TAA, FSR (all versions) use the games built in one (I might add that allot of games have trash TAA and we just blame AMD for it). NVIDIA DLSS has had its own implementation of TAA baked into DLSS from the start along with the use of AI compute requiring a Nvidia card. DLAA came after and maybe it's similar or the same as what's in DLSS and it's a good TAA filter 👍 but AMD has no equivalent to be compared against. Edit: OK AMD does have FSR native in 3.0 but not all titles that have 3.0 also have AA native aswell so it's even less available then the very few titles that have 3.0 doesn't make for a good argument yet.
I always consider upscaling from 1080p to be an odd choice. As I see it, if you're GPU isn't strong enough to game at 1080p, chances are your CPU is probably not strong enough to have notable gains from dropping to 1080p to a lower resolution. For the most part, while FSR may not be as good as DLSS, I've been happy with it from 1440p and up, and it's not the garbage fire the internet often tells me it is.
I think it kinda depends. There are a lot of people on AM4 that have a 3700x or they jumped with a 5600x and a GPU such as 2060 or 2070, it's already good for 1080p, DLSS just improves the situation and they have the headroom for that. Most of the time they should be GPU Bound in these games, so their CPU can at least push some more FPS anyway even if they would get even more with a more powerful CPU
I'm considered poor by American standards and have had 1080p monitors since 2007, 1440p since 2012. This comment is off topic but it's funny to me that people still roll with 1080p in 2024.
It's all relative. A lot of people upgrade there platform in chunks. Say mobo, ram, psu, cpu in platform upgrade, then they do not buy the cpu until 9 months to a year later. I.e. when I built my 5800x build in December 2020, I did not buy my Radeon 6800xt until September due to the gpu prices at the time. That entire time I was using a 4gb rx580. Of course fsr did not exist then, so I was stuck at 30fps in games like rust and other newerish games. If fsr existed then it would be a different story.
In Cyberpunk 2077 2:59 the ghosting is in the original image, it is of course worse in the upscaling versions. I noticed similar behavior with shadows in Days Gone, no upscaling. I'm not denying the upscaled versions look worse, I'm saying that any issues in the original image will be amplified by the upscaler. It seems like an engine issue that was well hidden but made apparent by the upscaler.
Is it in the image without TAA? FSR and DLSS replace the games own TAA solution, so they can help with or create new problems. FSR1 and DLSS1 would have preserved the original problems.
@@JohnSmith-ro8hk FSR1 relied on any ingame AA to create gradients which it used for edge creation. FSR2 and later are completely different algorithms and can be described as TAA which resolve to larger target buffer. (Like DLSS 1.8 and later or temporal inection, TAAU etc.) gpuopen.com/gdc-presentations/2022/GDC_FidelityFX_Super_Resolution_2_0.pdf
@@JohnSmith-ro8hk FSR1 relied on ingame AA to create gradients which it used to find edges. FSR2 and later can be described as TAA which resolve to larger buffer. (Like DLSS 1.8 onwards, TAAU etc.) gpuopen.com/gdc-presentations/2022/GDC_FidelityFX_Super_Resolution_2_0.pdf
DLSS Quality at 1080p on a 24 inch monitor looks good to me. I have since moved to a 27 inch 1440p monitor, and I can tell the difference more clearly between native and upscaled on this new screen. On smaller screens, the sharpness loss is not as apparent is what I'm trying to say.
But in case of Nvidia, if you have a 1080p screen, you can enable DLDSR (1,78x), set in-game resolution to 1440p and set DLSS to "Balanced". This way you will get better image quality than native 1080p (with TAA) and still noticeably better performance. It is a win-win situation for all owners of low end Nvidia cards paired with 1080p screens.
Agreed, but many of us don't have a choice, using older GPUs. The option is absolutely lowest graphics settings or straight up resolution reduction. Both are worse than Upscaling imho.
AMD has ALREADY delivered solutions fixing sub-pixel shimmering (for powerlines, fences, etc.), smearing, and disocclusion artifacts with FSR 2.2, you just assign reactive-, composition- or transparency masks to those objects, and you can also "pixel-lock" certain sub-pixel detail.... heck CDPR added a reactive mask to the car, fixing ghosting in Cyberpunk v.1.61 completely, but then it was removed and the motion blur re-introduced for FSR in the v1.62-DLSS3 patch. Community-made injected version of FSR fixes ghosting as well (you can assign a autogenerated helper function if you don't manually add "masks" to the game, it isn't as accurate, but still helps out), so really, it's more a matter of how much effort developers put into the FSR implementation compared to DLSS, rather than it being an indicator of the technology's capability in itself. I have played around with FSR SDK, and I would argue that the biggest difference is that it lacks the last temporal smoothing passes that DLSS has (due to AMD not relying on specialized cores for accelerated matrix calculations). Otherwise the techniques function similarly. Ultimately DLSS will always yield a more stable image, but in a scenario where FSR2.2 has a feature-complete implementation, it actually gets rid of many issues that still plague the DLSS's machine learned algorithm (like ghosting and oclussion artifacts). These videos ultimately compare "implementations" rather than "technology". The reason being that these games-, and side-by-side comparison doesn't offer a like-for-like scenario. However, if the video is looking at the current state of DLSS vs FSR in games I would still argue that that's a valid goal. Especially since it spotlights the current gaming landscape and how these technologies have been utilized by developers regardless of capabilities. I just think that the wording used throughout the video is kind of misleading, as this is NOT the state of the technology itself. And the headline about AMD "needing to fix FSR" is also kind of wrong since AMD already fixed most complaints, however developers have chosen not to anything about their current implementations. What AMD could do, and something I wish they did, is to offer a temporal smoothing pass in the temporal pipeline (like DLSS has) that can be toggled on/off. That way the quality can be more on-par with DLSS at the cost of performance. But then it will be more an issue of FSR performing worse, which isn't a good marketing point of course. But I think it would cool down the "quality debate" a whole lot
What is stopping the developers to add the smoothing pass themselves? Is it a known algorithm? Can it be implemented universally on shader cores even if it would be higher performance loss than through DLSS tensor implementation. Would it be similar performance if implemented on AMD RDNA3 equivalent hardware. I get you might not know this but this is what I wonder. After your explanation of what FSR is lacking and suggestion that AMD should implement the smoothing pass they are missing.
@@cajampaI mean, FSR is open source, so anyone could modify how the temporal pipeline works. However, it's A LOT of work, and require deep low-level technological skill, and architectural knowledge of how FSR2.2 is currently constructed. Most game developers don't have time for the technical investment it would require. But if they did, one would hope that the changes wouldn't be kept to their own implementation, but rather merged into a new version of FSR. I might've made it sound more trivial than it actually is in my initial post, it's really nothing you just "slap on" easily... Personally, I wouldn't have the technical expertise to do it, but I have a friend that is insanely talented that probably could. However, that friend works for Epic, so he isn't allowed to work on things outside of that, and I think that happens with a lot of talent. Companies eat them up, make them sign NDA and they can't help out elsewhere.
I'd love to see more games offer a 'render resolution slider', like we see in Starfield. This'll allow users to find their own balance between resolution and performance, while still being able to take advantage FSR2's upscaling features. I personally ran Starfield at 1440p + optimised settings + 75% render scale with FSR2 - and it looked great. 75% was above the standard "FSR Quality" (65% I believe), which allowed for cleaner image quality with only a minor hit to performance. At 1440p or lower, the standard resolution scale of "FSR2 Quality" might actually be too aggressive for most games.
For single player games it isn't really a problem (at least with nvidia) because with dlss tweaker you can set whatever internal render resolution you want, but with multiplayer games it's a big problem since dlss tweaker maybe just get you banned. RDR2 Online is a good example, the native taa is extremely blurry (especially at 1080p), but FSR is unusabably bad at 1080p quality, while dlss quality is probably marginally worse than native taa quality wise, but at least you get a small performance uplift.
Agreed. I use DLSS Tweaks to set the render resolution to 80% on 1440p. This makes the image quality almost indistinguishable from native res while giving a moderate boost to fps. Basically free performance.
For me the main perk in having RTX cards for 1080p rig is an unspoken DLDSR Playing Cyberpunk with DLDSR x2,25 + DLSS Balance on high preset was a treat. Much more crispier image than native 1080p
It's a bummer that we don't get more in-depth professional analysis of DLDSR, as combined with DLSS it makes for a very effective anti-aliasing solution.
@@kamilkornobis5585yep I still game on 1080p plasma, but thanks to DLDSR and DLSS (usually using 2880x1620 with DLSS quality, so around 900p) I get absolutely pristine image quality at fantastic performance.
my biggest issue with using that trick is that it completely messes up my second monitor...when I enable DSR/ DLDSR in NVCP it pushes all content of my second monitor almost out of screen. Might be a big win for single display users tho.
Ah, DLDSR and DLSS are the main reasons I went with Nvidia GPU last year. I was thinking about 7900 XT, but since it does not offer a straight alternative to DLDSR and FSR looked (and still looks) worse than DLSS, I chose 4070 Ti. Nvidia is selling tech this gen, not (only) compute power, like AMD.
Why didn't you compare it to an unupscaled 720p image? If your gpu isn't powerful enough for rendering a native 1080p image, is it overall better to use fsr or should we stick with a 720p native rendering?
I agree with you , on steam deck I found out using 960 x 540 p resolution more stable in some games than using native (1280x800)+ FSR Quality. The problem at least for me is many games don't allow that resolution
In my situation (1440p screen), I've found that the quality goes like this: 1080p rendering > FSR1 > FSR2 > 1440p native rendering I've personally found that FSR2 gets pretty close to native 1440p rendering, and IMO you can really only see the artifacts if you really look for it. FSR1 gets you like... 70% if the way to 1440p native rendering. But I would take FSR1 over 1080p rendering any day. FSR1 just does a way better job of smoothing out the jaggies and the pixels than if you didn't use it. So yeah, if you're running a game at lower resolution than your screen, IMO always use FSR if you can
I always have issues with any upscaling technique since I almost always tend to notice artifacts that I find really ugly. What I often do now when I use upscaling with my RX 7900XT to get very high FPS at (upscaled) 4K resolution is to use the driver integrated RSR (Radeon Super Resolution). While image quality might still be up to debate I found that surprisingly really annoying artifacts are less common and in some titles (Dead Space) it offers clearly better results than the in game FSR. In games like Cyberpunk and Remnant 2 it is also better in my opinion. This is also partially explained by the fact that the driver option allows upscaling from 1800P to 2160P (kind of ultra quality mode). I have consistently good to very good results with this. Maybe it would be interesting to test this as well for you guys at Hardware Unboxed :).
with RSR the temporal artifacts are less common or in fact completely missing because RSR is using FSR1, which is just a 2D spatial upscaler without any temporal component or even knowledge of the image content.
@@Sp00kyFoxNo idea, might be. I never checked it up. Still it is funny that this version within the driver can look better. For me at least it really yields often better results than the game implemented FSR.
@@Crysalis-jd1ub It could be that you're sensitive to certain types of artifacts but don't notice others, so spatial upscaling is more appealing to you.
When FSR 2.0 released I was ecstatic, and happy with the base quality for an initial release. However they haven’t meaningfully improved it since, even after launching frame gen, and the quality just isn’t good enough to justify using it.
They cant improve it without AI, FSR was never meant to be a true DLSS competitor, it was a boxcheck and an attempt to muddy the waters and say theyre the same. They arent
DLSS is genuinely important. Sony is making their own tech to compete with it. I was surprised AMD didn't figure out AI upscaling with RX 7000. They only recently announced they were even working on it.
AMD has just given up in their graphics department by the looks of it. NVIDIA cards offer so many software features that are too good to ignore. Their recently released RTX HDR is a game changer.
@msnehamukherjee I don't think it is that, more amd only has a certain amount of wafer allocation from TSMC, and they can sell AI accelerators based on their CDNA platform for far more than they can consumer GPUs. The main thing they need their gpus for is to maintain competitiveness in the console and laptop space.mainstream gpus do that well enough. If amd had infinite manpower and fab allocation I am sure they would go after all markets, but as it is they have to prioritise. Nvidia and AMD have about the same headcount, but for amd this must make both cpus and gpus.
I think them focusing so much on hardware agnostic fsr was to try to let older cards hold off on upgrading to 30 and 40 series GPUs sooner than they would have. With the newer consoles on the horizon its probably a feature that Microsoft and Sony demanded be included.
And people are assuming AMD was talking about FSR when they said "AI upscaling" (most media just ran with it which doesn't help). But AMD actually didn't mention FSR a single time and used "gaming device" instead of "gpu". In the last DF Direct Weekly, they talked about it and they don't think it's FSR. They think AMD might just be talking about something like a PS5 pro AI upscaler. Which doesn't mean FSR with Ai isn't coming at some point, but it might take even longer.
Another great option for people with RTX cards who plays in 1080p and the game's textures/antialiasing looks like shit is to use DLDSR (Nvidia control panel>Manage 3D settings>Global settings>DSR Factors - 2.25x DL / DSR Smoothness 15%). It will create a super resolution for all fullscreen applications (otherwise you will need to change your desktop resolution before launch, some dx12 only games don't have a real fullscreen mode) and together with DLSS - Quality it'll use much more detailed frames while FPS stays the same as native 1080p. It's pretty much DLAA but with better textures, antialiasing, and no FPS drops compared to native. Don't forget to disable Antialiasing, Sharpening, Motion Blur, Depth of Field in game settings.
5:06 Did DLSS messed up the rendering of the blue sign on the right hand side or is that down to a timing difference between the frames where the engine hasn't quite finished rendering the image?
What i would be very interested in would be a comparison/discussion of 1080p native vs 1440p DLSS/FSR. What i mean is a genuine discussion if it is better for the final gaming experience to spend more on a 1440p VRR Display and a GPU that is only able to deliver good FPS at 1440p utilizing upscaling or to get a 1080p VRR Display (which is cheaper, so you can get a better GPU) and render at 1080p native?
Same. I'd just put it off until either a patch improves performance enough or I upgrade my hardware. I have plenty of games in my backlog that I put off for the same reason that I can now play at epic settings and high framerates to keep me occupied in the meantime.
Completely agree with your POV. It's in a kick in the teeth when a new game come out (or even a sequel on the exact same game engine!) and the devs just _couldn't manage_ to get 1080p at a decent fps for your hardware that _was_ perfectly suitable for 1440p+ titles a year ago because of course the hardware is capable, it's just not optimised. And upscaling to 1080p? 1080p should become some sort of minimal standard to target, or the game needs to go back in the oven. IMO, it's not ready yet if (per one of the graphs in this vid), your 3060 12GB a ~£300 card) gets an average of 50fps at 1080p. 1080p60 ultra MINIMUM (I promise it can be done).
moral of the story JUST buy a good gpu or disable ray tracing which is still a gimmick (not properly used in a good artistic way ) , disable TAA , enjoy native , the way 3d rendering was intended from the beginning of time , no blur added or sharpness needed aka Vaseline with taa or dlss fsr , it's okey to have low fps or maxed out qames without rt on for eye candy , but it's not okey to play with VASELINE ON , take it from an old guy .
I play at 1080p and I highly prefer playing at native with reduced details over upscaled with more details. Maybe upscaling has its use where say you buy a 1440p gpu and play on a 4k monitor, but at 1080p I'm not convinced of its usefulness unless maybe you're playing on a 1030 and need upscaling in order to run games or something.
FSR doesnt look worse than native 6:30 (stairs) so not really a problem, if you're having to run FSR at 1080p then its probably worth buying a cheap used gpu for £50 so you can run 1080p native and leave the gt630 in a drawer. Edit: I have a titanV (not bought for gaming) and run FSR quality at 1080p where possible because whenever I've done so in COD: warzone and a few others is looks distinctly better than native with textures on the ground and walls being higher detailed and no its not a fidelity CAS thing.
Question Tim, does the amount of fps have an effect on the output quality of the fsr/dlss? For example does a game with fsr quality have a noticeable improvement in image quality of it were running at say 120fps vs at 60fps?
It does, and that makes sense with the core of these technologies being smart re-use of information from prior rendered frames. If there are more of those frames and they're temporally closer then there is more information to work with and the information is of a higher quality (less different than the target frame). It's not as pronounced as with framegen or making massive changes to the internal resolution, but it's there. You get quite notably better quality using DLSS-Q to go from e.g. 120 to 200fps than you do going from 30 to 50.
It does matter, but not as much. Technically speaking, both technologies depend on previous frame and motion vectors. Now if frametimes are shorter, that means motion vectors diffrence is smaller and if frametime is longer, motion vectors are bigger. So they are self correcting here. In case of FSR it only matters practically only in context of ghosting. Ghosting effect on higher FPS simply is shorter as trail lenght takes more frames and over many frames it vanishes faster. In case of DLSS, it matters more, because there is actual neural network trained that guesses data based on current and previous frame. On DLSS faster presets like ultra performance you can notice it a lot - if you move and take frame out of motion well it doesn't look that good. But if you stop, DLSS gathers data about relativly static image and better recreates text and fine details. If you had infinite FPS, every frame you would treat as static. FPS count also matters a lot more in case of frame generation.
@@piotrj333 Seems like something that could benefit more from high framerates by keeping a duration of frames instead of a single frame. If frames are quick more of them are going to be close to the current frame, if frames are slow then only the previous one will be valuable.
My uninformed and uneducated guess is that FSR has reached its peak as an upscaling, sharpening, and temporal vector feature. Development has probably shifted to tensor cores for both mainstream and future AI products. AI upscaling will probably be a byproduct benefit to gaming compared to what they likely want to sell in the datacenter. Though my hope is that Cerebras kicks GPU makers out of the AI space entirely so we stop getting wrecked on GPU prices.
I love all the movement ghosting echos using DLSS. Just finished Days Gone and DLSS looked absolutely awful coming off the back of the motorcycle. Both of these solutions are bandaids for low framerate, IMO. That's all they are. They work great if you take a screenshot. DLSS in Cyberpunk (when I played at launch haven't been back to check this now) The NUMBERS on the shipping containers in the parade float warehouse. 3 Digital numbers... every one of the containers looked like 888 with DLSS on. Turn off DLSS and you could actually see the number. This is without movement at all. What really sucked was one of the objectives was to "go to container #____" But you can't read the number on any of them.
I hate all this upscaling technologies, they're ruining gaming! AAA Half ass broken games will always come out with upscaling tech to cover the bad optimization of their games!
Lol, i played Cyberpunk with DLSS performance and FG averaging 60fps at 4K max settings, and couldn't be more happier, everything looks gorgeous. This won't be possible w/o upscaling tech. Imagine the raw gpu power needed and the cost as well to run my setting w/o upscaling. I always LOL at ppl who despise upscaling tech. Whether u like it or not, upscaling is the true future of gaming.
I played Cyberpunk in 4k dlss balanced and it was fine, ghosting problems were a problem and just like in this video streetlights and wires were sh**, overall it was "fine", meaning that I got to play the game, the other option was to go for native rendering at 1440p low-medium or 1080p high, both looked quite bad on a 32" monitor. I then used DLSS in some other games and had a much better experience (always 4k balanced/quality), for instance dlss does a really good job in Hogwarts Legacy. Unfortunately I can't say the same for FSR, I tried to use it in Horizon ZD and it was basically unusable even at 4k quality, it did do better in games like FC6 and Hogwarts legacy and Cyberpunk, but in all cases was particularly obvious that the game was running at a lower res (mostly because of foliage and general shimmering). I'm overall fine with upscaling, but its implementation needs to be consistently good on all games for it to be a viable option, and AMD really needs to do something to fix their FSR, because rn using FSR truly is the last resort to squeeze out some extra fps
@@maverichz to be completely clear, if for you that looked "gorgeous" is because you don't see all the problems, CP has suboptimal dlss performance. for instance Hogwarts Legacy is kinda the opposite
@@maverichz Yes, in comparison to a situation that would be even worse than the current one, upscaling is pretty good. What if we also weren't overcharged for what's essentially a software solution to the problem of "not having powerful enough hardware to achieve X settings at Y cost?" Actual native output is so bad TAA is applied by default, upscalers may have hurt as much as they've helped.
Why would you even want to use fake frame generation at 1080p though? Most newer, reasonable gpus can handle most games at 1080p at mid-high settings. I never use upscaling, unless my mediocre rx5700xt can't handle the game at all. I just prefer the image quality over hundreds of useless fps.
People have been demanding that, including reviewers (at least those I watch). That doesn't mean upscaling methods don't have a place in today's gaming and should be disregarded.
@@eliadbu All upscalers wether based on AI or not rely on temporal pixel interpolation and motion vectors from previous frames. Whenever temporal pixel techniques like temporal AA are used the picture will inevitably loose details and produce ghosting up to muddy pixels. You can fine tune upscaling but never prevent artifacts completely. Even DLSS 3 is still affected by that and even worse the AI does not always recognize objects properly and misplaces wrong patterns. Conclusion: Native frames still present the best image quality.
I'd like to see a comparison for 1440p and above. Thats where they are really useful. At low resolutions, chances are your CPU will be lacking and upscalers can only marginally help. I also want a cost comparison. Nvidia has an advantage with specialized cores, but sell this at a premium, so i'd like a comparison which takes into account the card required for DLSS and an alternative from AMD which might be more powerful for general compute and can choose a better quality mode for FSR. Just because both are called 'Quality' doesnt mean they are really equal and comparable.
Purely academic but I'd love to see what visual quality you could get out of DLSS/FSR with performance parity to native, how high does the resolution have to be to get down to native's performance and does it look better
@@Relex_92 I'm not sure that was the question, it was about comparing visual quality between upscalers and native (TAA) at performance parity, not comparing FSR with DLSS (I think we've established DLSS has better visual quality than FSR in most if not all games).
These artifacts are 100% easy to notice while gaming. I don't even use fsr at 1440p on my 6800xt thanks to these artifacts. Even at quality 1440p still looks noticeably worse
@@lilpain1997 Yes. For us. But this is niche for nerds. We know what to look for. Most people think it is just the way it is supposed to be, don't care, find it good enough or juts don't notice it. Don't forget, most of the people also think artefact-ridden smartphone photos are of great quality...
I have an RX 6900 XT and generally find that FSR 2.1 or higher has been better than DLSS on my brother's RTX 3090 there's a lot of DRAM stutter and artifacting particularly in no man's sky where the game is entirely unplayable using DLSS but even on the RTX 3090 if you put FSR on in no Man's sky is pretty acceptable Edit I forgot to note that I'm on a 1440p monitor so that might be why I have a different experience
I think fixing can go from both ways - from and and from game developers. I hope that developers testing FSR when imementing it, so when they see artifacts they can apply workarounds. But as I see common situation for developers is applying only settings that controls FSR quality levels and that's all. And if it works bad they transfer all negative feedback to AMD. My notice not about AMD only, it's about all upscaling technologies, that need to be applied not only as an option in graphics option. This option needs testing from developer and maybe some communication between developer and GPU vendor.
Something that this video does not adress is the fact that most TAA implementations in newer games make your game look like a blurry mess at 1080p. Even with added sharpening the image still looks like a valesine smeared mess. Introducing FSR to the mix fixes most of this since the AA used in FSR is much more sharp than native implementations. I can't recall the last AAA game with a good native 1080p TAA look.
I have to wonder though: if you have to use any upscaling tech with 1080p final resolution, aren't you likely to have other issues by being on very underpowered hardware for the game? How relevant is this comparison, really?
Isn't the purpose of upscaling to elongate the life of underpowered hardware? Did you see the performance numbers at the end? 20-40% boost in performance for not much worse image quality with DLSS. That's like buying a new GPU, which Isn't a good option in 2024.
@@DeepteshLovesTECH The question is, if they have a GPU weak enough they need upscaling at 1080p, what are the chances they have a strong enough CPU that it will actually give them a notable boost in FPS when they're dropping the resolution below 1080p with upscaling?
@@VelcroSnake93 There is a high chance the CPU is good enough. Something like a 3050 will also be pretty GPU limited in new titles. Also take for example: Laptops which have powerful CPUs but weak GPUs.
Ima just say this, FSR open source, DLSS limited to one brand. For something thats a free performance boost, i dont think nitpicking on whats better should sway buyers, when you are pretty much selling lesser VRAM and charging more for lesser raw performance, it should be shown more whats actually better. The video is nice to point out and make AMD improve, however many will choose a brand based off this video. A better test end result is to mention future titles more than current, as current is a benchmark for today and not tomorrow. In that logic, you cant just magically add VRAM, and anything under 20 gigs of VRAM for future titles will show much more worst performance and graphic quality. DLSS and FSR will improve over time, but it wont fix less VRAM, so i see absolutely no reason in Nvidia at all. Until the next gen GPUs release for Nvidia, which they can still scam you out somewhere as usual, anything under a 4070 ti super is going to be a waste later on. It was Nvidias plan to cheap you out so you have to upgrade. Meanwhile, AMD is future proofed fully, unless you invest in an overpriced 4080 or an overpriced 4090. I'll take current lesser quality FSR over paying an arm and a leg for something just to compete with the longevity of even a 7900gre or a 7800xt. AMD wins.
I know right? 😂 Jokes aside, for some people it’s really noticeable. I think that if you are an old gamer it’s easier to ignore because graphics used to be so much worse. 😂 But maybe it’s just a per person thing.
@@DavideDavini i remember when i got 3070 after many praises how good DLSS 2.0 looks, installed cyberpunk to check this out and my first reaction was how shit this looks.
@@DavideDavini it all depends on sensitivity to things each persons going to be different fsr is 90% of dlss most of the time. personally for me i dont notice anything unless i specifically looking. the main thing i notice on both dlss and fsr when using them is the softness of the image not the shimmer on a corner at a distance or on small things its the overall softness that gets me. for me i prefer native resolution all day and dont bother turning these things on and i upgrade enough to make sure i dont have to every 4-5 years on average. im also more sensitive to latency than i am to slight shimmer at a distance while gaming. like 20ms to 40ms latency change is night and day and feels horrible.
the quality of FSR is one thing I definitely noticed moving from my 2060 super to the 7800 XT. I couldn't even tell the difference between DLSS balanced and native, but anything below FSR quality just makes the game look terrible.
I think its funny that you see a game "looks awful" when I'm an old guy who used computer games of the 80's and 90's where the pixels were visible on almost every frame!
a different types of awful, back in the day graphics were simple but since crt's were a thing things like motion clarity were arguably better nowadays with lcd's and so many badly implemented temporal aa and upscalers your screen can end up looking like a smeary ghosty dipped in vaseline mess
Well this is not 80s or 90s. I played games from 90s and all the way to here. For a game to release in 2024. the pixelated look is not acceptable. Simple as that. Jedi looking like absolute shit on native is insane. Especially from AAA studio.
@@autumn42762 you could also see 50/60 Hz flickering on CRT screens all the time. And some distortions around the edges. Our brains don't appreciate visual inconsistency. We are trained to seek for typos in texts and rare gems/food in the world. We don't care as much for the average quality as for the flow of things going wrong. Be it 320*240 green (orange/gray/beige) pixels or 4K in 10bit HDR.
I'm from the same era of video games and played my fair share of 240P video games and still do. In that sense I'm not bothered by what's now considered low resolution, or in other worlds 1080P. Ugly upscaling, artifacts, shimmering, blurriness and latency don't fit well with my gaming experience though.
There seems to be a common sentiment in the responses to this video that upscaling 1080p is pointless, along with this HUB video. As of right now, Steam Charts show that 1080p is BY FAR the most commonly used resolution, with nearly 59% share. 1440p is next highest at 19%. To say that upscaling is of no relevance to that large segment is ignoring the most common user. Given that the days of budget GPUs are gone, for those that don’t want to spend $1000 to be able to enjoy higher quality settings, upscaling is the option. HUB, this is a highly relevant video. Well done, thank you. Please keep up the quality work.
"For example when swinging your light sabre around, the disocclusion artifacts with FSR are more noticeable than with DLSS, although they are present for both." People base their GPU purchases off this stuff? Urgh.
He's just using an example of where FSR falls short. And yes, upscaling techniques are a huge part of PC gaming now. It is definitely a consideration when purchasing a GPU.
@@jayg339Sure he is, but if you spent your time nit-pickicking over every little piece of the tech like this, you'd never buy anything. Neither will ever be perfect. I've played games with FSR 2 and 3 with my 7900XTX just to see what it's like, but am too busy actually playing my games to notice any of these "defects" for them to ever be an actual concern. Anyone else who was actually buying their GPUs to PLAY GAMES would notice the same thing.
@@lvmarv1 I literally don't get this response. "I'm essentially too busy gaming to notice things like graphics" - then why buy a 7900xtx at all? Just buy a 7600 and call it a day because you'd be too busy "gaming" to actually notice the graphical defects.
@@jayg339 I notice graphics just fine. Put it this way, looking at this review, these are obscure "defects" that are often highlighted by pausing a game, slowing it right down, or zooming in unnaturally on a specific spot on-screen. When playing a game normally, a lot of these things discussed are not things that Joe Bloggs playing a game would notice or care about. Oh, and why do I have a 7900XTX? Because I can.
In some games like Ready or Not, the FSR2 implementation is so bad that even at 1440p the FSR quality option looks much worse than its DLSS counterpart. Although I’m glad they created frame gen so RTX 30 users like myself can utilise it alongside DLSS upscaling
This video feels unnecessary, I dunno. The RX 6600 seems to be capable enough in 1080p without using FSR in most, if not all games. And successors to that card will presumably be better and not require it as well.
@@Dexion845 devs should make game more "lighter" to spec, not use upscale for their laziness excuse or conspiracy with nvidia to sell their high end card
tbh in jedi survivor most noticable for me was when running DLSS has stability issues with robot on his shoulder, it feels like Tim skips way more issues with DLSS compared to FSR, for example CP street scene, for me it feels like DLSS has less but more noticable issues compared to FSR having more less visible ones (that excludes green neon sign as DLSS probably has AI to stabilize it while FSR takes artifacts from native) is it bias or is it that he wants FSR to mature more, i have no idea
Let's be honest, Jedi Survivor is a poor game visual quality wise anyway. DigitalFoundry has found animation skipping in the game. The game runs poorly on PC. It's blurry and grainy even with native 4K TAA and it's not a visually impressive game either. Both FSR and DLSS just accentuate the artifacting the game natively has, but in different ways.
Or you could just...you know....play the game normally without all the upscaling crap. FSR and DLSS are the crutches of modern performance and its troubling to me.
To this day I still don't use any of these technologies. Native resolution, no ray tracing. I use AA and texture filtering and that's about it. Not sure if I'm behind or these technologies just aren't used all that often.
This here. I run 4k/120 48" OLED w/ 7900xtx, I absolutely avoid this crap as it looks bad. BLURLSS/FSR etc all look bad to me. RT, meh and not worth the downgraded quality with fake frames/upscaling just to get playable FPS. So you add lighting to make it look better, then mush it out with upscaling, what a waste of time. I don't see any CC's bringing up this point because it would make Nvidia mad and they won't get any samples.
You want a real eye opener? Open up an old game that your hardware can run easily and apply msaa and or supersampling in driver to upscale it. You'll be floored by the image quality even with inferior effects and textures. These newer engines are garbage, and having to use upscaling to even make them relevant makes it worse.
I find it absurd that XeSS isn't included in this comparison. I think the majority of pc gamers have understood pretty well that DLSS is better than FSR and that statement only gets more true as resolution decreases. But the relevancy of FSR comes from the fact that it is usable in more GPUs than just RTX cards. XeSS also does that. And in most of the games tested in this video XeSS is available. I would really like to see how XeSS on Quality matches up to these 2 techinques at 1080p
I'm actually impressed at how good DLSS is at 1080p, in a lot of scenes its actually a visual improvement and not just a framerate improvement over native
At these price points, they don’t have much incentive to improve it. What’s the alternative? Spend a ton more money. But if you are doing that, you aren’t in that price group any more.
I play at 4k so usually fsr quality is an easy option for some extra performance, but even then, there are some games with awful implementations (Darktide is the worst fsr 2 I've seen). Hopefully they get their "ai upscaler" working well soon. I just hope that isnt also increadibly inconsistent.
Am I weird for thinking that the answer to all upscaling is "Don't use upscaling"? Why does it seem like a minority opinion to say that video cards should just be capable of rendering our games at native resolutions like they have for the last 30 years.
@@MadridistaFrierenNot really, like holy moly. I think FSR2 can be good at higher resolutions, but there was a general perception where it was seen equivalent. I've seen this niktek fella, and it's one of the lamest fanboy things ever xd. XeSS is good on ARC as the DPA4 path isn't a proper solution, but from a technical standpoint it's a lot better than FSR. ML based upscaling can perform a lot better yep, but I think even UE5's TSR is doing better in some games compared to FSR. Also Frieren is goated.
@@MadridistaFrierenI love how massively pro AMD biased stuff (like Nik)gets pushed by RUclips's algorithm super heavily yet you won't see that happening on the other side (pro Nvidia or intel)
Unlike things like raytracing, upscaling has its place in all price ranges - it isn't just a "luxury" feature. A good upscaling implementation allows the user to get more graphical fidelity or performance than they otherwise would, regardless of price range. The main reason you don't see people using FSR/DLSS at 1080p is just that the results are quite poor due to having less data to predict from. That will eventually change as each implementation improves.
@@matthias6933 The difference is FSR is not and has never been a competitor to DLSS. It cannot hold a candle to DLSS without AI. FSR competes in the special Olympics and comparing it to an actual Olympian (DLSS) is distasteful and disgusting.
@@calvin659You seem so excited to let Jensen squeeze you out. Be a good consoomer to Senpai Huang and make sure to save for the next card soon, so you dont feel the emptiness of being arbitrarily left out out of feature sets once again.
I usually agree with a lot of the stuff the boys say. But seriously "awful"? "significantly worse"? "poor"? You've shown one game where you can tell that there is a quality issue - everything else requires side by side comparisons with wild zoom-in levels and pixel peeping on a frame by frame basis. Yeah FSR is probably worse. I'm also fairly sure that the damn thing is supported to run on 1440p and 4k and not below that (not 100% sure but I swear I remember reading that being the spec). But its genuinely so unnoticeable in actual gameplay (outside of some outliers) that a normal person wouldn't give two monkeys about it. I'm sorry but you've lost me on this particular topic.
TAA ruined graphics for me. I know why it's used, but back then 1080p was all i needed to play, and games looked great. Now even at 4k games are a blurry mess, and i just want sharpness. Rdr2 shouldn;t even be allowed to be released with amount of vaseline on screen it generates
I'm a budget (1080p) buyer, and this feature was extremely useful for me. *Also, I'd request your thoughts on 1080p to 4k 'integer' scaling vs 'FSR/DLSS' upscaling please.* Usecase is this: I'm playing 1080p games on a 4k screen. For a better image quality, should I 'integer' scale the image (using AMD/Nvidia control panel), or should I use FSR/DLSS? I tried searching online, but surprisingly there is no coverage on this. Even though this looks like a very frequent usecase to me. Maybe most please just let their monitors handle the scaling. So your views will be much appreciated, though a video will be lovely 😊
with The Talos Principle 2 is because it was certainly removed motion blur, that in that game is what serves as denoiser for FSR, I recommend to use half motion blur and reduce sharpening to even zero, that gives the best image. That way FSR ends up being the best of the non-dlss upscalers and with Frame Generation
@@TheGoncas2 yes it does lmao. All of these techs have visual issues and it's why I'd rather just shell out for a card where I don't have to 'turn it down'.
For every pro/con you stated for *whatever* tech, I'd see a clear failure in the alternate tech. In the end, if your PC can't do 1080p without scaling then you'll be happy with whatever tech you have available
I get what this video is pointing out but to be honest, and maybe it's just me, but I didn't notice any issues with either technology until they were pointed out. I think that when people are just playing the game, and again maybe it's just me, I don't think these issues are really going to get in the way of the overall experience.
I often felt that way too about this kind of video, but from experience I can tell that the shimmering/flickering can get really annoying really fast if you are actually playing a game. With the other stuff it is more like Tim says: you probably won't notice the specific things that are wrong, but just that something about the game doesn't look right.
yeah the problems feel extremely exaggerated and feel like momentary inferiority complex for shiny features that will be superseded by something else in the future anyway. Meanwhile Nvidia has been tightening the memory width and size in order to make these stupid features possible but somehow nobody criticizes that. The last thing I want from AMD is to build custom processors into the GPU so they can cheap out on other parts that truly count for a GPUs longevity.
Is it me or from just observing the thumbnail image alone, which can you can do by configuring your web browser to 'not auto-play video' , the FSR image of Ratchet actually looks better than the DLSS image. Despite the title claiming 'AMD Must Fix FSR upscaling'?
I think that that's just because of the extra sharpening with FSR. At 5:10 of the video is the same image comparison, but with native 1080p, as well, and you can see that the DLSS version looks closer to the native. That's supposed to be the goal of upscaling and FSR's sharp pixels there, while looking good in a freeze frame, may sizzle a lot in motion.
At 5:01. I think the rain in FSR is closer to the native representation. I don't know this scene, but is the blue sign in the top right supposed to be a garbled mess, as reconstructed by DLSS? And at 5:15 it just looks like DLSS uses filters the image much more to soften the edges.
TAA isn't native representation. If you want proof, if he turns up the resolution to 4k with TAA you'll see that DLSS is the one that is correct. As far as 5:15, well, this is literally just a case of FSR looking similar to it's internal resolution and not the upscaled resolution it's supposed to mimic.
Always the same story every few weeks. DLSS is visually better than FSR. That is now known. Nevertheless, FSR 2 and 3 are not unusable! But what the test also shows. DLSS is not fundamentally better than the native image. Because when playing normally, no one makes direct side-by-side comparisons. Or detailed comparisons in zoomed-in slow motion or on high-resolution still images. The comparison is also misleading because some titles compare older FSR versions against the latest DLSS implementations. The comparisons only fuel fanboy discussions, they're not good for anything more. In addition, there are already rumors online that FSR+AI will come in 2024. Hopefully the unspeakable, impractical discussions will finally come to an end.
DLSS is not always better than native, but it's actually way closer to native than FSR. The old versions of DLSS and fsr because that's what officially supported in the game.
@@notenjoying666 Most games have bad taa, and even if taa is decent it will always be an inferior sampling method to even fsr, especially at 1080p. You can blame the game to make excuses for amd all you want, but thats the reality.
I don't think using any form of upscaling is worth using at 1080p. Even dlss looks blatantly worse than native. I'm sorry but why do you want to run games at lower image quality than most PS4 games...
While probably not the intended use of the video, it does also give us an interesting look at what we can expect out of the Switch 2 now that it will have access to DLSS 2 (I seriously doubt it will have DLSS3). Its also unclear how many Tensor Cores it will have as well, but generally speaking its nice to see how well DLSS compares to native while still delivering a significant boost to performance from such low starting base resolutions. Hopefully most devs take advantage of it, since poor performance, really low resolutions and no AA were way too common on the Switch 1
Switch 2 will have access to Ray Reconstruction (since even the 20-series cards have it) but not Frame Generation since it lacks the Ada Lovelace OFA, but that won't be too much of a drag since Nvidia doesn't recommend using Frame Generation below 60 fps anyways.
Papermaster confirmed that FSR AI is coming. This is both good and bad news. The good news is that it will probably catch up with DLSS 3.5 upscaling. The bad news is that I suspect that it won't be on par with DLSS 4 upscaling which is incoming with the Blackwell release on the market... basically they will always be one or even two steps behind. AMD needs to start funding (and supplying human resources) to their software team more and they need to focus theose resources on specific useful techs rather than more side projects (AFMF) which can really be done if you are dominating and you have spare time and money to throw somewhere. And this latter part is definitely Nvidia's situation not AMD's.
worst thing about fsr for me is poor resolution in distances cuz im born to look ahead :D it destroys all vibe of playing CP with poor details in distance
I love how bad the rachet and clank run on my 6900xt while other games I out perform the rtx 3090. Please make it notice I want insomniac fix the rachet and clank..
@@dssd7685 Yes, yes my friend. Every example of AMD being worse is just "Nvidia shilling machine", "paid bastards" and so on. Every single one of them. Huh, it seems NVidia is sponsoring like 95% of games today...
@@gargean1671 from crysis to Alan wake its ngreedia tech only and yes its ngreedia shilling machine, holy sh!t this comment sections filled with nvidia bots, not to mention fsr 3 is the latest and implemented on only 3games and perfectly implemented on only one game and this video is comparing the ancient fsr 2. smh
@@H.S03Its still the exact same amount of pixels, just on a different space. What youre talking about is how well a monitor is handling the amount of pixels rendered. He's talking about IMAGE quality, not MONITOR quality.
@@Dajova The number of pixels are the same, not the pixels themselves as they are physically smaller in order to cover a smaller area. I didn’t reference any aspect of monitor rendering or quality. So long as the resolution stays the same, a smaller screen will have a lower pixel pitch (can be thought of as PPI), which will always lead to a more detailed image when all else is equal. The entire reason resolution increases image quality is because of PPI. The same screen space + more pixels = higher density of information which naturally allows for a greater level of detail since more visual information can be represented for any given area of the screen.
I don't like deep learning shit. But there was a very good deal on "4060". After playing reddeadredemption2 with dlss on quality, I could not tell the difference to native. But trees looked smeared with FSR quality.
As someone who still plays on a 1650, an unstable image is infinitely better than 20fps, but no change to the upscaling component from FSR 2 to 3 is ridiculous.
I believe AMD stopped working on FSR2 altogether and is currently focusing on the next iteration that uses machine learning cores as mentioned last week, probably named FSR4. They gave up on what was obviously not working and moved on. Tensor cores won.
Slamming calculations using specially made hardware acceleration cores is obviously superior. Also means you need that hardware. And Nvidia wont allow anyone else, because they dont care about spread of technology, they only want to sell.
Yeah, playing on a AMD GPU at 1440p myself, I try to use native resolution whenever possible. The hit to image stability is just too noticeable, I'd rather turn down stuff like shadows and reflections than use FSR. More importantly, XeSS and TSR both do a much better job imo, so I occasionally use those where possible.
It's funny because I experienced FSR well before DLSS and just accepted that upscaling comes with a certain amount of ghosting and artifacts. I didn't get why so many claimed DLSS was better until I got an RTX card and realized what _good_ upscaling was. Don't get me wrong, I'm still grateful that FSR exists because it can be used on my wife and kids' GPUs giving them a longer life with newer games, but it really doesn't compare with DLSS, no matter what the AMD fanboys pretend. The fact that _no one_ uses FSR if they have access to DLSS is all the proof you need that AMD still needs to do a lot of work with FSR.
DLSS seems to prioritize image stability and that was always one of my biggest personal annoyances before DLSS (damn you thatched roofs in Witcher 3). Which is a big reason I've been a big fan girl. I do like a clear image but I'm far more bothered by frame to frame flicker. Esp when standing still which FSR STILL has issues with.
it's funny that modded fsr version in cyberpunk actually fixes ghosting and did it even before official implementation
Passion will always be better than greed.
@@OnBrandRP but.... fsr is free
@@TheHighborn Greed in the sense that it costs AMD money to hire competent Software developers. They dont want to do that.
@@KS-tz9sg FSR1 showcase title btw. Also Steam Deck flagship title. Both AMD products.
@@haewymetal CP 2077 is an nvidia sponsored game. So of course they made bad implementation of an older version of FSR2.
To the people asking why this video is relevant - AMD is now ubiquitous in the gaming world. Look at how many companies are using their architectures in their PCs/handhelds/consoles. They fall into the perfect Venn Diagram of: Being a single source for both CPU and GPU hardware, having a very *mature* set of architectures (cough, Intel graphics), and actually being willing to be aggressive on price. Nvidia doesn’t care enough to compete much where AMD seems to shine, they’d rather make bank in the AI space. So, if AMD is the only real option for the majority of gaming machines outside the home built PC space, they’re the only company that needs to be convinced of anything. And a video like this is the only way to make the point. If it takes multiple reminders over the course of a couple years, so be it. Yes, Tim is beating a dead horse, but that’s AMD’s fault.
"actually being willing to be aggresive on price"
lmao no amd and nvidia are a duopoly. and amd havent done anything to dispel that notion
blows my mind that you had to say this because of how rampant AMD fanboys are especially in the youtube section its like an echo chamber thanks to channels like Niktek that only had the mindset of "Nvidia = bad guy, AMD = good guy"
Thanks for stating the obvious (unironically), whenever a video like this comes out you'll find a hilariously absurd numbers of AMD fanboys (who'll never describe themselves as such, because ofc not, they're all "pc enthusiasts") saying how it doesn't matter, how AMD magically released a patch just yesterday that fixes everything, how a user on reddit found a fix that makes a 7900xt perform better than a 4090, how Tim&Steve actually don't understand what they're talking about etc. etc. ad nauseam. It's really tiresome.
@@Eleganttf2i see you everywhere bro 😂.
Bruh really? Amd CPUs are better and consume less power than Intel CPUs, what are you talking about!
How about better optimize games so we dont freaking need to use upscaling methods to play in basic 1080p resolution?!
As long as Gamers are on 10 year old graphics cards with a fraction of the power of current gen consoles.
No.
@@Chrissy717 Honesty we know that not to be true. At least in the AA and AAA space, games have become a bit more unoptimized despite the slight increase in visuals. This generation feels like a low increase despite the nice power increase in the move to PS5. Most games should look a bit better and run better than the previous generation.
@@waterheart95I think it takes time for devs to learn and adapt to the new technologies especially unreal engine 5
@@geekmechanic1473Plus people tend to overuse the word "optimization" without actually knowing what it means.
To paraphrase a classic movie:
They keep using that word, I do not think it means what they think it means!
Yes, but developers also didn't have to deal with the added burden of resolutions above 1080p being available just over a console generation ago. It's much more complicated when they have to increase fidelity and features while also pumping up pixels.
Regarding whether DLSS is 'running a more taxing algorithm' to get better image quality, the video states it's not the case, but at the same time states that the only difference of DLSS and FSR is 'hardware acceleration vs generic shaders'. HW Acceleration is exactly that - it speeds up something that could be done in shader cores. It's very likely that the algorithm used by DLSS is more complex (thus it catches more of the corner cases that FSR misses), so it leverages HW Accel to reduce the negative performance impact.
One can see something similar with XeSS's fallbacks - the best XeSS algo uses Intel XMX cores and may achieve visual results closer to DLSS (compared to FSR), but the DP4a fallback suffers from a significantly larger performance hit VS the XMX path. Speed and quality has always been inversely proportional, which is why DLSS was initially seen on launch as a kind of black magic because it got so close to native while still offering so much more performance. We now know it really isn't free - it truly had to leverage additional compute resources to make that a reality.
Yeah current DLSS and FSR ultimately don't make the same tradeoffs so it's not really an apples to apples comparison. It's unlikely one could make FSR equal DLSS in image quality without either needing the same hardware acceleration that may not exist in AMD's current architecture (unless the AI cores in RDNA3 can be used) but may be applicable in the next generations, and in the next console, or a significant performance hit.
the power of AI Tensor cores
Either DLSS is either very inefficient or it does not use tensor cores. Nvidia did this in the past with physX and borderlands 2 for example is not playable on FX4350 for example with physX enable because CPU is not fast enough..
Cyberpunk runs at 900p at roughly the same framerate as FSR rendering at 720p so performance cost is rather substantial.
On top of it, UE4 (means, no TSR) exists and every UE4+ game has ability to surpass DLSS image quality with proper TAA and sharpening configuration. Upscaling algorithm is less relevant when you have choices for it as well, because cheaper (simple bilinear is almost free) can provide higher quality then more expensive 5-tap Catmull-Rom bicubic (not the most expensive) because it has performance advantage and therefore, higher render resolution can be used for same level of performance.
The case for DLSS (and XeSS) could simply be the fact, that higher image quality is a result of brute forcing the AI algorithm thanks to the dedicated HW acceleration and still providing better performance. FSR simply can't afford this, being multi-platform and thus relying on the lowest common denominator -- the pixel shaders.
@@CrazySerb I mean sure TAA sure has better image quality if implemented properly because it literally cost a performance hit, the opposite of what DLSS do.
This also served as a good example of the failings of TAA, which shows ghosting similar to FSR in some scenes like the stickers on the shop door in the Spiderman section.
I have never found a scenario where fsr had worse ghosting that temporal AA. Anything working temporally will have these types of effects.. even the latest dlss.
I have a cousin who works with a team on TAA improvement and did you know how complicated to improve such technology you are using for free? They don't get much funding and you expect them to work it to perfections?
@@christophermullins7163 I haven't either, but TAA looks worse than DLSS, and surely DLAA, Native, MSAA, and SSAA when in motion.
😢
TAA is much worse than FSR... and AMD has eliminated ghosting completely on FSR2.2. It's just that most developers haven't implemented the measures to combat such scenarios... CDPR added a reactive mask to the car in v.1.61, but then it was magically removed in 1.62 with the release of DLSS3. fan made FSR2 Injection mods also completely remove ghosting. So it's not a matter of tech, but rather the implementation in itself.
One interesting comparison would be between TSR and FSR. They are both temporal upscalers that don't use AI.
TSR is only available on UE5 games, but there are already a few available to make a small comparison.
UE5 will be used on console so they can pick the builtin upscaler. UE5 one looks better, it goes from 1080p to 4k by design. Every time I read about TSR people state its better than FSR but that FSR has frame generation.
@winj3r tsr looks better bro.
@@Scott99259 Yes, it does. But it would be nice to see a professional comparison between the 2.
TSR in Robocop was the best one to use even over DLSS, dunno what went wrong with DLSS fizzling on streets, but TSR does it way way less.
@@SKHYJINX yeah tsr looks incredible in robocop.
Where available please include DLAA / FSR native so you can compare the built in TAA to NVs TAA to AMD TAA at the same input resolution. It also allows you to see how much of the IQ delta is from the TAA method used and how much is from the upscaling.
I would also like to see this kind of analysis, but it might reduce the number of games that could be compared due to FSR Native not being nearly as common as DLAA, as all games from DLSS 1.9 can be very easily set to use DLAA even if the game doesn't expose the DLAA option. I'm not aware of a similar solution being available for FSR, but please correct me if I'm wrong.
Yeah, this!
It seems like crucial parameters to control were a bit put to the side.
Crazy how dlss looks even better/sharper than dlaa at slightly higher resolution.
Ex. 2160p dlss quality(1440p) > 1920p dlaa native.
AMD doesn't have its own TAA, FSR (all versions) use the games built in one (I might add that allot of games have trash TAA and we just blame AMD for it). NVIDIA DLSS has had its own implementation of TAA baked into DLSS from the start along with the use of AI compute requiring a Nvidia card. DLAA came after and maybe it's similar or the same as what's in DLSS and it's a good TAA filter 👍 but AMD has no equivalent to be compared against.
Edit: OK AMD does have FSR native in 3.0 but not all titles that have 3.0 also have AA native aswell so it's even less available then the very few titles that have 3.0 doesn't make for a good argument yet.
@@clem9808Imagine dldsr + dlaa.
I always consider upscaling from 1080p to be an odd choice. As I see it, if you're GPU isn't strong enough to game at 1080p, chances are your CPU is probably not strong enough to have notable gains from dropping to 1080p to a lower resolution. For the most part, while FSR may not be as good as DLSS, I've been happy with it from 1440p and up, and it's not the garbage fire the internet often tells me it is.
I think it kinda depends.
There are a lot of people on AM4 that have a 3700x or they jumped with a 5600x and a GPU such as 2060 or 2070, it's already good for 1080p, DLSS just improves the situation and they have the headroom for that.
Most of the time they should be GPU Bound in these games, so their CPU can at least push some more FPS anyway even if they would get even more with a more powerful CPU
The tryhards internet tech nerds love to act like their decade old knowledge is always applicable
I'm considered poor by American standards and have had 1080p monitors since 2007, 1440p since 2012. This comment is off topic but it's funny to me that people still roll with 1080p in 2024.
It's all relative. A lot of people upgrade there platform in chunks. Say mobo, ram, psu, cpu in platform upgrade, then they do not buy the cpu until 9 months to a year later.
I.e. when I built my 5800x build in December 2020, I did not buy my Radeon 6800xt until September due to the gpu prices at the time. That entire time I was using a 4gb rx580.
Of course fsr did not exist then, so I was stuck at 30fps in games like rust and other newerish games. If fsr existed then it would be a different story.
Bs, lots of people still have GPUs like the 2060 with AM4 CPUs
In Cyberpunk 2077 2:59 the ghosting is in the original image, it is of course worse in the upscaling versions. I noticed similar behavior with shadows in Days Gone, no upscaling. I'm not denying the upscaled versions look worse, I'm saying that any issues in the original image will be amplified by the upscaler.
It seems like an engine issue that was well hidden but made apparent by the upscaler.
Is it in the image without TAA?
FSR and DLSS replace the games own TAA solution, so they can help with or create new problems.
FSR1 and DLSS1 would have preserved the original problems.
@@pottuvoi2 wrong DLSS replaces TAA as its a requirement, FSR does not. It relies on the game's taa which is why it really suffers in some games.
@@JohnSmith-ro8hk FSR1 relied on any ingame AA to create gradients which it used for edge creation.
FSR2 and later are completely different algorithms and can be described as TAA which resolve to larger target buffer. (Like DLSS 1.8 and later or temporal inection,
TAAU etc.)
gpuopen.com/gdc-presentations/2022/GDC_FidelityFX_Super_Resolution_2_0.pdf
@@JohnSmith-ro8hk
FSR1 relied on ingame AA to create gradients which it used to find edges.
FSR2 and later can be described as TAA which resolve to larger buffer. (Like DLSS 1.8 onwards, TAAU etc.)
gpuopen.com/gdc-presentations/2022/GDC_FidelityFX_Super_Resolution_2_0.pdf
@@pottuvoi2 I'll check, I noticed it in Cyberpunk in this video, in Days Gone TAA is probably enabled.
I'd say any upscaling isn't worth using at just 1080p. At low resolutions the base res is just horrendous. It starts to look decent at 1440p.
DLSS Quality at 1080p on a 24 inch monitor looks good to me. I have since moved to a 27 inch 1440p monitor, and I can tell the difference more clearly between native and upscaled on this new screen.
On smaller screens, the sharpness loss is not as apparent is what I'm trying to say.
But in case of Nvidia, if you have a 1080p screen, you can enable DLDSR (1,78x), set in-game resolution to 1440p and set DLSS to "Balanced". This way you will get better image quality than native 1080p (with TAA) and still noticeably better performance. It is a win-win situation for all owners of low end Nvidia cards paired with 1080p screens.
@@stangamer1151 Hmm, I should try that. Let me get back to you
Agreed, but many of us don't have a choice, using older GPUs. The option is absolutely lowest graphics settings or straight up resolution reduction. Both are worse than Upscaling imho.
nonsense dlss quality is good for 1080p
AMD has ALREADY delivered solutions fixing sub-pixel shimmering (for powerlines, fences, etc.), smearing, and disocclusion artifacts with FSR 2.2, you just assign reactive-, composition- or transparency masks to those objects, and you can also "pixel-lock" certain sub-pixel detail.... heck CDPR added a reactive mask to the car, fixing ghosting in Cyberpunk v.1.61 completely, but then it was removed and the motion blur re-introduced for FSR in the v1.62-DLSS3 patch. Community-made injected version of FSR fixes ghosting as well (you can assign a autogenerated helper function if you don't manually add "masks" to the game, it isn't as accurate, but still helps out), so really, it's more a matter of how much effort developers put into the FSR implementation compared to DLSS, rather than it being an indicator of the technology's capability in itself. I have played around with FSR SDK, and I would argue that the biggest difference is that it lacks the last temporal smoothing passes that DLSS has (due to AMD not relying on specialized cores for accelerated matrix calculations). Otherwise the techniques function similarly. Ultimately DLSS will always yield a more stable image, but in a scenario where FSR2.2 has a feature-complete implementation, it actually gets rid of many issues that still plague the DLSS's machine learned algorithm (like ghosting and oclussion artifacts).
These videos ultimately compare "implementations" rather than "technology". The reason being that these games-, and side-by-side comparison doesn't offer a like-for-like scenario. However, if the video is looking at the current state of DLSS vs FSR in games I would still argue that that's a valid goal. Especially since it spotlights the current gaming landscape and how these technologies have been utilized by developers regardless of capabilities. I just think that the wording used throughout the video is kind of misleading, as this is NOT the state of the technology itself. And the headline about AMD "needing to fix FSR" is also kind of wrong since AMD already fixed most complaints, however developers have chosen not to anything about their current implementations. What AMD could do, and something I wish they did, is to offer a temporal smoothing pass in the temporal pipeline (like DLSS has) that can be toggled on/off. That way the quality can be more on-par with DLSS at the cost of performance. But then it will be more an issue of FSR performing worse, which isn't a good marketing point of course. But I think it would cool down the "quality debate" a whole lot
What is stopping the developers to add the smoothing pass themselves? Is it a known algorithm? Can it be implemented universally on shader cores even if it would be higher performance loss than through DLSS tensor implementation. Would it be similar performance if implemented on AMD RDNA3 equivalent hardware.
I get you might not know this but this is what I wonder. After your explanation of what FSR is lacking and suggestion that AMD should implement the smoothing pass they are missing.
@@cajampaI mean, FSR is open source, so anyone could modify how the temporal pipeline works. However, it's A LOT of work, and require deep low-level technological skill, and architectural knowledge of how FSR2.2 is currently constructed. Most game developers don't have time for the technical investment it would require. But if they did, one would hope that the changes wouldn't be kept to their own implementation, but rather merged into a new version of FSR. I might've made it sound more trivial than it actually is in my initial post, it's really nothing you just "slap on" easily... Personally, I wouldn't have the technical expertise to do it, but I have a friend that is insanely talented that probably could. However, that friend works for Epic, so he isn't allowed to work on things outside of that, and I think that happens with a lot of talent. Companies eat them up, make them sign NDA and they can't help out elsewhere.
These lazy CCs don't want to do that, just need to push Nvidia for free samples next launch.
@@N4CR Funny you say that when Nvidia tried to blacklist HUB not too long ago.
I'd love to see more games offer a 'render resolution slider', like we see in Starfield.
This'll allow users to find their own balance between resolution and performance, while still being able to take advantage FSR2's upscaling features.
I personally ran Starfield at 1440p + optimised settings + 75% render scale with FSR2 - and it looked great. 75% was above the standard "FSR Quality" (65% I believe), which allowed for cleaner image quality with only a minor hit to performance.
At 1440p or lower, the standard resolution scale of "FSR2 Quality" might actually be too aggressive for most games.
For single player games it isn't really a problem (at least with nvidia) because with dlss tweaker you can set whatever internal render resolution you want, but with multiplayer games it's a big problem since dlss tweaker maybe just get you banned. RDR2 Online is a good example, the native taa is extremely blurry (especially at 1080p), but FSR is unusabably bad at 1080p quality, while dlss quality is probably marginally worse than native taa quality wise, but at least you get a small performance uplift.
FSR 2 quality is 66 percent ultra quality is definitely more than 65
Agreed. I use DLSS Tweaks to set the render resolution to 80% on 1440p. This makes the image quality almost indistinguishable from native res while giving a moderate boost to fps. Basically free performance.
Yeah I would enjoy tweaking my upscale amount, as FSR2 tends to keel over at 1440p.
You mean fsr2 quality
Quality is around 67%
Ultra quality if that even exists is definitely above that
For me the main perk in having RTX cards for 1080p rig is an unspoken DLDSR
Playing Cyberpunk with DLDSR x2,25 + DLSS Balance on high preset was a treat. Much more crispier image than native 1080p
It's a bummer that we don't get more in-depth professional analysis of DLDSR, as combined with DLSS it makes for a very effective anti-aliasing solution.
@@kamilkornobis5585yep I still game on 1080p plasma, but thanks to DLDSR and DLSS (usually using 2880x1620 with DLSS quality, so around 900p) I get absolutely pristine image quality at fantastic performance.
my biggest issue with using that trick is that it completely messes up my second monitor...when I enable DSR/ DLDSR in NVCP it pushes all content of my second monitor almost out of screen. Might be a big win for single display users tho.
@@8Paul7 1620p with DLSS quality is 1080p. You are using DLAA in that case. 1440p with DLSS Quality is 960p.
Ah, DLDSR and DLSS are the main reasons I went with Nvidia GPU last year. I was thinking about 7900 XT, but since it does not offer a straight alternative to DLDSR and FSR looked (and still looks) worse than DLSS, I chose 4070 Ti. Nvidia is selling tech this gen, not (only) compute power, like AMD.
Why didn't you compare it to an unupscaled 720p image? If your gpu isn't powerful enough for rendering a native 1080p image, is it overall better to use fsr or should we stick with a 720p native rendering?
I agree with you , on steam deck I found out using 960 x 540 p resolution more stable in some games than using native (1280x800)+ FSR Quality.
The problem at least for me is many games don't allow that resolution
this
I don't know about fsr but with intel's XeSS it does look better
even better if you have a 1440p screen, allowing for the use of integer scaling instead of fractional scaling
In my situation (1440p screen), I've found that the quality goes like this:
1080p rendering > FSR1 > FSR2 > 1440p native rendering
I've personally found that FSR2 gets pretty close to native 1440p rendering, and IMO you can really only see the artifacts if you really look for it. FSR1 gets you like... 70% if the way to 1440p native rendering. But I would take FSR1 over 1080p rendering any day. FSR1 just does a way better job of smoothing out the jaggies and the pixels than if you didn't use it.
So yeah, if you're running a game at lower resolution than your screen, IMO always use FSR if you can
I always have issues with any upscaling technique since I almost always tend to notice artifacts that I find really ugly. What I often do now when I use upscaling with my RX 7900XT to get very high FPS at (upscaled) 4K resolution is to use the driver integrated RSR (Radeon Super Resolution). While image quality might still be up to debate I found that surprisingly really annoying artifacts are less common and in some titles (Dead Space) it offers clearly better results than the in game FSR. In games like Cyberpunk and Remnant 2 it is also better in my opinion. This is also partially explained by the fact that the driver option allows upscaling from 1800P to 2160P (kind of ultra quality mode). I have consistently good to very good results with this. Maybe it would be interesting to test this as well for you guys at Hardware Unboxed :).
with RSR the temporal artifacts are less common or in fact completely missing because RSR is using FSR1, which is just a 2D spatial upscaler without any temporal component or even knowledge of the image content.
@@Sp00kyFoxNo idea, might be. I never checked it up. Still it is funny that this version within the driver can look better. For me at least it really yields often better results than the game implemented FSR.
@@Crysalis-jd1ub It could be that you're sensitive to certain types of artifacts but don't notice others, so spatial upscaling is more appealing to you.
My question for you now is... Is it better to render natively at 720p or use FSR quality then, if you don't have the hardware to run at native 1080p?
540p integer.
@@enricofermi3471 yes (if you have a 1080p monitor, 720p if 1440p
Most console gamers use 4k tv you can't do 720p on that tv
It's always better to use upscaling.
When FSR 2.0 released I was ecstatic, and happy with the base quality for an initial release. However they haven’t meaningfully improved it since, even after launching frame gen, and the quality just isn’t good enough to justify using it.
They cant improve it without AI, FSR was never meant to be a true DLSS competitor, it was a boxcheck and an attempt to muddy the waters and say theyre the same. They arent
@@Angel7black I couldn't have said it better!
DLSS is genuinely important. Sony is making their own tech to compete with it. I was surprised AMD didn't figure out AI upscaling with RX 7000. They only recently announced they were even working on it.
AMD has just given up in their graphics department by the looks of it. NVIDIA cards offer so many software features that are too good to ignore. Their recently released RTX HDR is a game changer.
@@msnehamukherjeeabsolutely! The worst thing about an HDR monitor is non HDR content. RTX HDR is absolutely massive
@msnehamukherjee I don't think it is that, more amd only has a certain amount of wafer allocation from TSMC, and they can sell AI accelerators based on their CDNA platform for far more than they can consumer GPUs. The main thing they need their gpus for is to maintain competitiveness in the console and laptop space.mainstream gpus do that well enough. If amd had infinite manpower and fab allocation I am sure they would go after all markets, but as it is they have to prioritise. Nvidia and AMD have about the same headcount, but for amd this must make both cpus and gpus.
I think them focusing so much on hardware agnostic fsr was to try to let older cards hold off on upgrading to 30 and 40 series GPUs sooner than they would have. With the newer consoles on the horizon its probably a feature that Microsoft and Sony demanded be included.
And people are assuming AMD was talking about FSR when they said "AI upscaling" (most media just ran with it which doesn't help). But AMD actually didn't mention FSR a single time and used "gaming device" instead of "gpu". In the last DF Direct Weekly, they talked about it and they don't think it's FSR. They think AMD might just be talking about something like a PS5 pro AI upscaler.
Which doesn't mean FSR with Ai isn't coming at some point, but it might take even longer.
AI acceleration in FSR will be a blessing for lower resolutions, although it will probably require dedicated hardware like dlss or xess..
Another great option for people with RTX cards who plays in 1080p and the game's textures/antialiasing looks like shit is to use DLDSR (Nvidia control panel>Manage 3D settings>Global settings>DSR Factors - 2.25x DL / DSR Smoothness 15%).
It will create a super resolution for all fullscreen applications (otherwise you will need to change your desktop resolution before launch, some dx12 only games don't have a real fullscreen mode) and together with DLSS - Quality it'll use much more detailed frames while FPS stays the same as native 1080p. It's pretty much DLAA but with better textures, antialiasing, and no FPS drops compared to native.
Don't forget to disable Antialiasing, Sharpening, Motion Blur, Depth of Field in game settings.
So like Radeon super resolution
Not sure, DLDSR uses AI cores of the GPU@@NadeemAhmed-nv2br
Some of these issues with FSR can be mitigated somewhat by reducing the over sharpening that it applies by default.
@@kadupse True but the apparent fizziness can be lessened by having a smoother presentation.
5:06 Did DLSS messed up the rendering of the blue sign on the right hand side or is that down to a timing difference between the frames where the engine hasn't quite finished rendering the image?
Nope. Sign is animated and its changes like this.
@@kesselring22 thanks for clarifying thst
What i would be very interested in would be a comparison/discussion of 1080p native vs 1440p DLSS/FSR.
What i mean is a genuine discussion if it is better for the final gaming experience to spend more on a 1440p VRR Display and a GPU that is only able to deliver good FPS at 1440p utilizing upscaling or to get a 1080p VRR Display (which is cheaper, so you can get a better GPU) and render at 1080p native?
Ik watching this cos its you. I couldnt give a shit.Native is always for me. These upscaling things are to take away from pure rasterisation.
Id rather drop the game than have to rely on upscaling to get playble fps at 1080p medium.
Same. I'd just put it off until either a patch improves performance enough or I upgrade my hardware. I have plenty of games in my backlog that I put off for the same reason that I can now play at epic settings and high framerates to keep me occupied in the meantime.
Completely agree with your POV. It's in a kick in the teeth when a new game come out (or even a sequel on the exact same game engine!) and the devs just _couldn't manage_ to get 1080p at a decent fps for your hardware that _was_ perfectly suitable for 1440p+ titles a year ago because of course the hardware is capable, it's just not optimised.
And upscaling to 1080p? 1080p should become some sort of minimal standard to target, or the game needs to go back in the oven.
IMO, it's not ready yet if (per one of the graphs in this vid), your 3060 12GB a ~£300 card) gets an average of 50fps at 1080p. 1080p60 ultra MINIMUM (I promise it can be done).
moral of the story JUST buy a good gpu or disable ray tracing which is still a gimmick (not properly used in a good artistic way ) , disable TAA , enjoy native , the way 3d rendering was intended from the beginning of time , no blur added or sharpness needed aka Vaseline with taa or dlss fsr , it's okey to have low fps or maxed out qames without rt on for eye candy , but it's not okey to play with VASELINE ON , take it from an old guy .
I play at 1080p and I highly prefer playing at native with reduced details over upscaled with more details. Maybe upscaling has its use where say you buy a 1440p gpu and play on a 4k monitor, but at 1080p I'm not convinced of its usefulness unless maybe you're playing on a 1030 and need upscaling in order to run games or something.
FSR doesnt look worse than native 6:30 (stairs) so not really a problem, if you're having to run FSR at 1080p then its probably worth buying a cheap used gpu for £50 so you can run 1080p native and leave the gt630 in a drawer.
Edit: I have a titanV (not bought for gaming) and run FSR quality at 1080p where possible because whenever I've done so in COD: warzone and a few others is looks distinctly better than native with textures on the ground and walls being higher detailed and no its not a fidelity CAS thing.
Question Tim, does the amount of fps have an effect on the output quality of the fsr/dlss? For example does a game with fsr quality have a noticeable improvement in image quality of it were running at say 120fps vs at 60fps?
It does, and that makes sense with the core of these technologies being smart re-use of information from prior rendered frames. If there are more of those frames and they're temporally closer then there is more information to work with and the information is of a higher quality (less different than the target frame).
It's not as pronounced as with framegen or making massive changes to the internal resolution, but it's there. You get quite notably better quality using DLSS-Q to go from e.g. 120 to 200fps than you do going from 30 to 50.
It does matter, but not as much. Technically speaking, both technologies depend on previous frame and motion vectors. Now if frametimes are shorter, that means motion vectors diffrence is smaller and if frametime is longer, motion vectors are bigger. So they are self correcting here. In case of FSR it only matters practically only in context of ghosting. Ghosting effect on higher FPS simply is shorter as trail lenght takes more frames and over many frames it vanishes faster.
In case of DLSS, it matters more, because there is actual neural network trained that guesses data based on current and previous frame. On DLSS faster presets like ultra performance you can notice it a lot - if you move and take frame out of motion well it doesn't look that good. But if you stop, DLSS gathers data about relativly static image and better recreates text and fine details. If you had infinite FPS, every frame you would treat as static.
FPS count also matters a lot more in case of frame generation.
@@piotrj333 Seems like something that could benefit more from high framerates by keeping a duration of frames instead of a single frame.
If frames are quick more of them are going to be close to the current frame, if frames are slow then only the previous one will be valuable.
My uninformed and uneducated guess is that FSR has reached its peak as an upscaling, sharpening, and temporal vector feature. Development has probably shifted to tensor cores for both mainstream and future AI products. AI upscaling will probably be a byproduct benefit to gaming compared to what they likely want to sell in the datacenter.
Though my hope is that Cerebras kicks GPU makers out of the AI space entirely so we stop getting wrecked on GPU prices.
I love all the movement ghosting echos using DLSS. Just finished Days Gone and DLSS looked absolutely awful coming off the back of the motorcycle. Both of these solutions are bandaids for low framerate, IMO. That's all they are. They work great if you take a screenshot. DLSS in Cyberpunk (when I played at launch haven't been back to check this now) The NUMBERS on the shipping containers in the parade float warehouse. 3 Digital numbers... every one of the containers looked like 888 with DLSS on. Turn off DLSS and you could actually see the number. This is without movement at all. What really sucked was one of the objectives was to "go to container #____" But you can't read the number on any of them.
I hate all this upscaling technologies, they're ruining gaming! AAA Half ass broken games will always come out with upscaling tech to cover the bad optimization of their games!
Lol, i played Cyberpunk with DLSS performance and FG averaging 60fps at 4K max settings, and couldn't be more happier, everything looks gorgeous. This won't be possible w/o upscaling tech. Imagine the raw gpu power needed and the cost as well to run my setting w/o upscaling. I always LOL at ppl who despise upscaling tech. Whether u like it or not, upscaling is the true future of gaming.
I played Cyberpunk in 4k dlss balanced and it was fine, ghosting problems were a problem and just like in this video streetlights and wires were sh**, overall it was "fine", meaning that I got to play the game, the other option was to go for native rendering at 1440p low-medium or 1080p high, both looked quite bad on a 32" monitor. I then used DLSS in some other games and had a much better experience (always 4k balanced/quality), for instance dlss does a really good job in Hogwarts Legacy. Unfortunately I can't say the same for FSR, I tried to use it in Horizon ZD and it was basically unusable even at 4k quality, it did do better in games like FC6 and Hogwarts legacy and Cyberpunk, but in all cases was particularly obvious that the game was running at a lower res (mostly because of foliage and general shimmering).
I'm overall fine with upscaling, but its implementation needs to be consistently good on all games for it to be a viable option, and AMD really needs to do something to fix their FSR, because rn using FSR truly is the last resort to squeeze out some extra fps
@@maverichz to be completely clear, if for you that looked "gorgeous" is because you don't see all the problems, CP has suboptimal dlss performance. for instance Hogwarts Legacy is kinda the opposite
@@maverichz Yes, in comparison to a situation that would be even worse than the current one, upscaling is pretty good. What if we also weren't overcharged for what's essentially a software solution to the problem of "not having powerful enough hardware to achieve X settings at Y cost?" Actual native output is so bad TAA is applied by default, upscalers may have hurt as much as they've helped.
Why would you even want to use fake frame generation at 1080p though? Most newer, reasonable gpus can handle most games at 1080p at mid-high settings. I never use upscaling, unless my mediocre rx5700xt can't handle the game at all. I just prefer the image quality over hundreds of useless fps.
How about demanding more raw fps from gpu vendors instead of getting conditioned by them to upscaling.
Why not both
People have been demanding that, including reviewers (at least those I watch). That doesn't mean upscaling methods don't have a place in today's gaming and should be disregarded.
@@eliadbu All upscalers wether based on AI or not rely on temporal pixel interpolation and motion vectors from previous frames. Whenever temporal pixel techniques like temporal AA are used the picture will inevitably loose details and produce ghosting up to muddy pixels. You can fine tune upscaling but never prevent artifacts completely. Even DLSS 3 is still affected by that and even worse the AI does not always recognize objects properly and misplaces wrong patterns.
Conclusion: Native frames still present the best image quality.
@@thefirstone4864because they charge you more for the silicon that it'll cost.
@@aladdin8623 nah, DLSS (not talking about FG) is better than native in some scenarios/points, not that clean and cut as some may think
I'd like to see a comparison for 1440p and above. Thats where they are really useful.
At low resolutions, chances are your CPU will be lacking and upscalers can only marginally help.
I also want a cost comparison. Nvidia has an advantage with specialized cores, but sell this at a premium, so i'd like a comparison which takes into account the card required for DLSS and an alternative from AMD which might be more powerful for general compute and can choose a better quality mode for FSR.
Just because both are called 'Quality' doesnt mean they are really equal and comparable.
Purely academic but I'd love to see what visual quality you could get out of DLSS/FSR with performance parity to native, how high does the resolution have to be to get down to native's performance and does it look better
@@Relex_92 I'm not sure that was the question, it was about comparing visual quality between upscalers and native (TAA) at performance parity, not comparing FSR with DLSS (I think we've established DLSS has better visual quality than FSR in most if not all games).
Great to hear that 1080p is still an important resolution on 2024!!!!
why is this great? the industry should be moving forward and 1440p is not expensive anymore...
A lot of people won't even notice these artifacts. They are to busy gaming...
Nevertheless I totally agree: FSR needs work. A lot.
Too clear, especially flickering
Blind people don't usually play games
Alot of people dont buy amd so i guess you are right.
These artifacts are 100% easy to notice while gaming. I don't even use fsr at 1440p on my 6800xt thanks to these artifacts. Even at quality 1440p still looks noticeably worse
@@lilpain1997 Yes. For us. But this is niche for nerds. We know what to look for.
Most people think it is just the way it is supposed to be, don't care, find it good enough or juts don't notice it.
Don't forget, most of the people also think artefact-ridden smartphone photos are of great quality...
I have an RX 6900 XT and generally find that FSR 2.1 or higher has been better than DLSS on my brother's RTX 3090 there's a lot of DRAM stutter and artifacting particularly in no man's sky where the game is entirely unplayable using DLSS but even on the RTX 3090 if you put FSR on in no Man's sky is pretty acceptable
Edit I forgot to note that I'm on a 1440p monitor so that might be why I have a different experience
I completely agree with you. I've have a RTX NVIDIA card and FSR looks better than DLSS in any game, I don't know why
I think fixing can go from both ways - from and and from game developers. I hope that developers testing FSR when imementing it, so when they see artifacts they can apply workarounds. But as I see common situation for developers is applying only settings that controls FSR quality levels and that's all. And if it works bad they transfer all negative feedback to AMD.
My notice not about AMD only, it's about all upscaling technologies, that need to be applied not only as an option in graphics option. This option needs testing from developer and maybe some communication between developer and GPU vendor.
Ah yes, amd apologist, it is never their fault blame anyone else
@@samgoff5289 Well... Why is Avatar the best example of FSR so far, when plants are generally one of the worst things for FSR to upscale?
The role of developers is to report artefacts, bugs, and such, their role is not to work on getting around a tech bug/artefact. imho
It's always somebody's else's fault whenever AMD/Radeon can't produce a decent product.
@@janbenes3165 it still suffers the usual far artifacts though? It's better but still not good imo
Playstation & Nintendo own super sampling technology much more better in their own hardware compare to Nvidia, Amd & Intel solution
How does it look like when comparing FSR vs DLSS in 1440p ?
That would be really interesting for me
There's already a video for 1440p and 4K
@@thelegendaryklobb2879 really? Oh, thx for the information. I'm gonna search for it
Something that this video does not adress is the fact that most TAA implementations in newer games make your game look like a blurry mess at 1080p.
Even with added sharpening the image still looks like a valesine smeared mess. Introducing FSR to the mix fixes most of this since the AA used in FSR is much more sharp than native implementations.
I can't recall the last AAA game with a good native 1080p TAA look.
I have to wonder though: if you have to use any upscaling tech with 1080p final resolution, aren't you likely to have other issues by being on very underpowered hardware for the game? How relevant is this comparison, really?
Exactly my thoughts.
Isn't the purpose of upscaling to elongate the life of underpowered hardware?
Did you see the performance numbers at the end? 20-40% boost in performance for not much worse image quality with DLSS. That's like buying a new GPU, which Isn't a good option in 2024.
@@DeepteshLovesTECH I see it more like a replacement tire until you get a new one in that case 😁
@@DeepteshLovesTECH The question is, if they have a GPU weak enough they need upscaling at 1080p, what are the chances they have a strong enough CPU that it will actually give them a notable boost in FPS when they're dropping the resolution below 1080p with upscaling?
@@VelcroSnake93 There is a high chance the CPU is good enough. Something like a 3050 will also be pretty GPU limited in new titles. Also take for example: Laptops which have powerful CPUs but weak GPUs.
I'd like to see a similar test but at 1440p
DLSS definitely looks blurrier to me. That probably helps hide some of the "shimmering" of distant objects.
Ima just say this, FSR open source, DLSS limited to one brand. For something thats a free performance boost, i dont think nitpicking on whats better should sway buyers, when you are pretty much selling lesser VRAM and charging more for lesser raw performance, it should be shown more whats actually better. The video is nice to point out and make AMD improve, however many will choose a brand based off this video. A better test end result is to mention future titles more than current, as current is a benchmark for today and not tomorrow. In that logic, you cant just magically add VRAM, and anything under 20 gigs of VRAM for future titles will show much more worst performance and graphic quality. DLSS and FSR will improve over time, but it wont fix less VRAM, so i see absolutely no reason in Nvidia at all. Until the next gen GPUs release for Nvidia, which they can still scam you out somewhere as usual, anything under a 4070 ti super is going to be a waste later on. It was Nvidias plan to cheap you out so you have to upgrade. Meanwhile, AMD is future proofed fully, unless you invest in an overpriced 4080 or an overpriced 4090. I'll take current lesser quality FSR over paying an arm and a leg for something just to compete with the longevity of even a 7900gre or a 7800xt. AMD wins.
I'm so glad I got over pixel ogling 20 years ago. Now I can't even tell if FSR is enabled or not, and I simply do not care. It's incredibly freeing :>
copium
I know right? 😂
Jokes aside, for some people it’s really noticeable. I think that if you are an old gamer it’s easier to ignore because graphics used to be so much worse. 😂
But maybe it’s just a per person thing.
@@DavideDavini i remember when i got 3070 after many praises how good DLSS 2.0 looks, installed cyberpunk to check this out and my first reaction was how shit this looks.
@@DavideDavini it all depends on sensitivity to things each persons going to be different fsr is 90% of dlss most of the time. personally for me i dont notice anything unless i specifically looking. the main thing i notice on both dlss and fsr when using them is the softness of the image not the shimmer on a corner at a distance or on small things its the overall softness that gets me. for me i prefer native resolution all day and dont bother turning these things on and i upgrade enough to make sure i dont have to every 4-5 years on average. im also more sensitive to latency than i am to slight shimmer at a distance while gaming. like 20ms to 40ms latency change is night and day and feels horrible.
the quality of FSR is one thing I definitely noticed moving from my 2060 super to the 7800 XT. I couldn't even tell the difference between DLSS balanced and native, but anything below FSR quality just makes the game look terrible.
I think its funny that you see a game "looks awful" when I'm an old guy who used computer games of the 80's and 90's where the pixels were visible on almost every frame!
a different types of awful, back in the day graphics were simple but since crt's were a thing things like motion clarity were arguably better
nowadays with lcd's and so many badly implemented temporal aa and upscalers your screen can end up looking like a smeary ghosty dipped in vaseline mess
Well this is not 80s or 90s. I played games from 90s and all the way to here. For a game to release in 2024. the pixelated look is not acceptable. Simple as that. Jedi looking like absolute shit on native is insane. Especially from AAA studio.
@@autumn42762 you could also see 50/60 Hz flickering on CRT screens all the time. And some distortions around the edges.
Our brains don't appreciate visual inconsistency. We are trained to seek for typos in texts and rare gems/food in the world. We don't care as much for the average quality as for the flow of things going wrong. Be it 320*240 green (orange/gray/beige) pixels or 4K in 10bit HDR.
I'm from the same era of video games and played my fair share of 240P video games and still do. In that sense I'm not bothered by what's now considered low resolution, or in other worlds 1080P.
Ugly upscaling, artifacts, shimmering, blurriness and latency don't fit well with my gaming experience though.
It's a different fidelity target
Yeah Atari looked like lines moving... So what ?
There seems to be a common sentiment in the responses to this video that upscaling 1080p is pointless, along with this HUB video.
As of right now, Steam Charts show that 1080p is BY FAR the most commonly used resolution, with nearly 59% share. 1440p is next highest at 19%. To say that upscaling is of no relevance to that large segment is ignoring the most common user. Given that the days of budget GPUs are gone, for those that don’t want to spend $1000 to be able to enjoy higher quality settings, upscaling is the option.
HUB, this is a highly relevant video. Well done, thank you. Please keep up the quality work.
"For example when swinging your light sabre around, the disocclusion artifacts with FSR are more noticeable than with DLSS, although they are present for both."
People base their GPU purchases off this stuff? Urgh.
He's just using an example of where FSR falls short. And yes, upscaling techniques are a huge part of PC gaming now. It is definitely a consideration when purchasing a GPU.
@@jayg339Sure he is, but if you spent your time nit-pickicking over every little piece of the tech like this, you'd never buy anything. Neither will ever be perfect. I've played games with FSR 2 and 3 with my 7900XTX just to see what it's like, but am too busy actually playing my games to notice any of these "defects" for them to ever be an actual concern. Anyone else who was actually buying their GPUs to PLAY GAMES would notice the same thing.
@@lvmarv1 I literally don't get this response. "I'm essentially too busy gaming to notice things like graphics" - then why buy a 7900xtx at all? Just buy a 7600 and call it a day because you'd be too busy "gaming" to actually notice the graphical defects.
@@jayg339 I notice graphics just fine. Put it this way, looking at this review, these are obscure "defects" that are often highlighted by pausing a game, slowing it right down, or zooming in unnaturally on a specific spot on-screen. When playing a game normally, a lot of these things discussed are not things that Joe Bloggs playing a game would notice or care about. Oh, and why do I have a 7900XTX? Because I can.
In some games like Ready or Not, the FSR2 implementation is so bad that even at 1440p the FSR quality option looks much worse than its DLSS counterpart. Although I’m glad they created frame gen so RTX 30 users like myself can utilise it alongside DLSS upscaling
And we rtx users are also able to easily update our DLSS dlls so if a game has really outdated DLSS then we can just download newest dll and swap
This video feels unnecessary, I dunno. The RX 6600 seems to be capable enough in 1080p without using FSR in most, if not all games. And successors to that card will presumably be better and not require it as well.
Talking about upscaling isn't unnecessary, because it's useful and here to stay. Cards like rx6600 will benifit the most from dlss.
upscaling tech is modern excuse for devs laziness
Get used to it, not everyone can afford a 4080/4090 if they want to game at 1440p/4K resolution.
@@Dexion845 devs should make game more "lighter" to spec, not use upscale for their laziness excuse or conspiracy with nvidia to sell their high end card
tbh in jedi survivor most noticable for me was when running DLSS has stability issues with robot on his shoulder, it feels like Tim skips way more issues with DLSS compared to FSR, for example CP street scene, for me it feels like DLSS has less but more noticable issues compared to FSR having more less visible ones (that excludes green neon sign as DLSS probably has AI to stabilize it while FSR takes artifacts from native)
is it bias or is it that he wants FSR to mature more, i have no idea
Let's be honest, Jedi Survivor is a poor game visual quality wise anyway. DigitalFoundry has found animation skipping in the game. The game runs poorly on PC. It's blurry and grainy even with native 4K TAA and it's not a visually impressive game either. Both FSR and DLSS just accentuate the artifacting the game natively has, but in different ways.
Or you could just...you know....play the game normally without all the upscaling crap. FSR and DLSS are the crutches of modern performance and its troubling to me.
I hear you say "bad result" for such and such technology, but yet without slow motion and zoom I couldn't notice a difference most of the time.
For me fsr is so ugly id rather use Xess. Too much shimmering , the only reason why you can barely notice it is because of youtube.
RUclips compression dude
You're blind.
All three look identical in gameplay.
bruh...
To this day I still don't use any of these technologies. Native resolution, no ray tracing. I use AA and texture filtering and that's about it. Not sure if I'm behind or these technologies just aren't used all that often.
This here. I run 4k/120 48" OLED w/ 7900xtx, I absolutely avoid this crap as it looks bad. BLURLSS/FSR etc all look bad to me. RT, meh and not worth the downgraded quality with fake frames/upscaling just to get playable FPS. So you add lighting to make it look better, then mush it out with upscaling, what a waste of time. I don't see any CC's bringing up this point because it would make Nvidia mad and they won't get any samples.
You want a real eye opener? Open up an old game that your hardware can run easily and apply msaa and or supersampling in driver to upscale it. You'll be floored by the image quality even with inferior effects and textures. These newer engines are garbage, and having to use upscaling to even make them relevant makes it worse.
If you need scaling at 1080p then your graphics card is due for replacement.
Why avatar frontiers of Pandora is not in this video which has the best implementation of fsr yet 🙄
Who cares its just 1 game for AMD vs over a 100 games for NVIDIA
because Hardware unboxed only makes clickbait.
I find it absurd that XeSS isn't included in this comparison. I think the majority of pc gamers have understood pretty well that DLSS is better than FSR and that statement only gets more true as resolution decreases. But the relevancy of FSR comes from the fact that it is usable in more GPUs than just RTX cards. XeSS also does that. And in most of the games tested in this video XeSS is available. I would really like to see how XeSS on Quality matches up to these 2 techinques at 1080p
I'm actually impressed at how good DLSS is at 1080p, in a lot of scenes its actually a visual improvement and not just a framerate improvement over native
There is noticably more showing with dlss on quality in a lot of games like SOTF, where shimmering is especially noticeable
Dlss is simply better in every possible way and this is said by someone who has always been pro AMD
Amh, no. Even DLSS sux @1080p.
Amh, no. DLSS is on par with Native. FSR is complete garbage and should dissuade all buyers from purchasing AMD GPUs.
6:31 the stairs behind Miles actually look better with DLSS than native, much better antialiasing
At these price points, they don’t have much incentive to improve it. What’s the alternative? Spend a ton more money. But if you are doing that, you aren’t in that price group any more.
I play at 4k so usually fsr quality is an easy option for some extra performance, but even then, there are some games with awful implementations (Darktide is the worst fsr 2 I've seen). Hopefully they get their "ai upscaler" working well soon. I just hope that isnt also increadibly inconsistent.
Am I weird for thinking that the answer to all upscaling is "Don't use upscaling"? Why does it seem like a minority opinion to say that video cards should just be capable of rendering our games at native resolutions like they have for the last 30 years.
How the tables have turned lmao. Seems like FSR isn't the DLSS that the media made it out to be ;)
That Niktek Made it out to be*
XESS is the last hope to challenge DLSS.
@@MadridistaFrierenNot really, like holy moly. I think FSR2 can be good at higher resolutions, but there was a general perception where it was seen equivalent. I've seen this niktek fella, and it's one of the lamest fanboy things ever xd. XeSS is good on ARC as the DPA4 path isn't a proper solution, but from a technical standpoint it's a lot better than FSR. ML based upscaling can perform a lot better yep, but I think even UE5's TSR is doing better in some games compared to FSR.
Also Frieren is goated.
@@R3endevous too bad the anime is ending soon 😭😓🥲. Every good thing must come to an end
@@MadridistaFrierenI love how massively pro AMD biased stuff (like Nik)gets pushed by RUclips's algorithm super heavily yet you won't see that happening on the other side (pro Nvidia or intel)
With upscaling becoming more and more relevant - it's easily the biggest difference between AMD and NVIDIA GPUs holistically.
Unlike things like raytracing, upscaling has its place in all price ranges - it isn't just a "luxury" feature. A good upscaling implementation allows the user to get more graphical fidelity or performance than they otherwise would, regardless of price range. The main reason you don't see people using FSR/DLSS at 1080p is just that the results are quite poor due to having less data to predict from. That will eventually change as each implementation improves.
Yeah difference being that FSR is universally applicable
@@matthias6933 The difference is FSR is not and has never been a competitor to DLSS. It cannot hold a candle to DLSS without AI.
FSR competes in the special Olympics and comparing it to an actual Olympian (DLSS) is distasteful and disgusting.
@@calvin659You seem so excited to let Jensen squeeze you out. Be a good consoomer to Senpai Huang and make sure to save for the next card soon, so you dont feel the emptiness of being arbitrarily left out out of feature sets once again.
Also funny that FSR does very well in Avatar
Who upscales to 1080p though? You're doing it wrong
I usually agree with a lot of the stuff the boys say. But seriously "awful"? "significantly worse"? "poor"? You've shown one game where you can tell that there is a quality issue - everything else requires side by side comparisons with wild zoom-in levels and pixel peeping on a frame by frame basis.
Yeah FSR is probably worse. I'm also fairly sure that the damn thing is supported to run on 1440p and 4k and not below that (not 100% sure but I swear I remember reading that being the spec). But its genuinely so unnoticeable in actual gameplay (outside of some outliers) that a normal person wouldn't give two monkeys about it.
I'm sorry but you've lost me on this particular topic.
TAA ruined graphics for me. I know why it's used, but back then 1080p was all i needed to play, and games looked great. Now even at 4k games are a blurry mess, and i just want sharpness. Rdr2 shouldn;t even be allowed to be released with amount of vaseline on screen it generates
I'm a budget (1080p) buyer, and this feature was extremely useful for me. *Also, I'd request your thoughts on 1080p to 4k 'integer' scaling vs 'FSR/DLSS' upscaling please.*
Usecase is this: I'm playing 1080p games on a 4k screen. For a better image quality, should I 'integer' scale the image (using AMD/Nvidia control panel), or should I use FSR/DLSS?
I tried searching online, but surprisingly there is no coverage on this. Even though this looks like a very frequent usecase to me. Maybe most please just let their monitors handle the scaling. So your views will be much appreciated, though a video will be lovely 😊
That's a good question. I'm thinking about buying a 4k monitor because I work with texts, but I also play games on the PC
with The Talos Principle 2 is because it was certainly removed motion blur, that in that game is what serves as denoiser for FSR, I recommend to use half motion blur and reduce sharpening to even zero, that gives the best image. That way FSR ends up being the best of the non-dlss upscalers and with Frame Generation
So do you prefer FSR artifacts, TAA artifacts or DLSS grease?
DLSS does not have "grease"
XeSS weirdness! jk DLDSR for me.
@@TheGoncas2 yes it does lmao. All of these techs have visual issues and it's why I'd rather just shell out for a card where I don't have to 'turn it down'.
For every pro/con you stated for *whatever* tech, I'd see a clear failure in the alternate tech. In the end, if your PC can't do 1080p without scaling then you'll be happy with whatever tech you have available
I get what this video is pointing out but to be honest, and maybe it's just me, but I didn't notice any issues with either technology until they were pointed out. I think that when people are just playing the game, and again maybe it's just me, I don't think these issues are really going to get in the way of the overall experience.
I often felt that way too about this kind of video, but from experience I can tell that the shimmering/flickering can get really annoying really fast if you are actually playing a game. With the other stuff it is more like Tim says: you probably won't notice the specific things that are wrong, but just that something about the game doesn't look right.
yeah the problems feel extremely exaggerated and feel like momentary inferiority complex for shiny features that will be superseded by something else in the future anyway. Meanwhile Nvidia has been tightening the memory width and size in order to make these stupid features possible but somehow nobody criticizes that. The last thing I want from AMD is to build custom processors into the GPU so they can cheap out on other parts that truly count for a GPUs longevity.
1080p gang represent!
Is it me or from just observing the thumbnail image alone, which can you can do by configuring your web browser to 'not auto-play video' , the FSR image of Ratchet actually looks better than the DLSS image. Despite the title claiming 'AMD Must Fix FSR upscaling'?
Problems with these techniques crop up when in motion, so this can't be judged from just a thumbnail.
I think that that's just because of the extra sharpening with FSR. At 5:10 of the video is the same image comparison, but with native 1080p, as well, and you can see that the DLSS version looks closer to the native. That's supposed to be the goal of upscaling and FSR's sharp pixels there, while looking good in a freeze frame, may sizzle a lot in motion.
At 5:01. I think the rain in FSR is closer to the native representation. I don't know this scene, but is the blue sign in the top right supposed to be a garbled mess, as reconstructed by DLSS? And at 5:15 it just looks like DLSS uses filters the image much more to soften the edges.
TAA isn't native representation. If you want proof, if he turns up the resolution to 4k with TAA you'll see that DLSS is the one that is correct. As far as 5:15, well, this is literally just a case of FSR looking similar to it's internal resolution and not the upscaled resolution it's supposed to mimic.
Always the same story every few weeks. DLSS is visually better than FSR. That is now known. Nevertheless, FSR 2 and 3 are not unusable! But what the test also shows. DLSS is not fundamentally better than the native image. Because when playing normally, no one makes direct side-by-side comparisons. Or detailed comparisons in zoomed-in slow motion or on high-resolution still images. The comparison is also misleading because some titles compare older FSR versions against the latest DLSS implementations. The comparisons only fuel fanboy discussions, they're not good for anything more. In addition, there are already rumors online that FSR+AI will come in 2024. Hopefully the unspeakable, impractical discussions will finally come to an end.
DLSS is not always better than native, but it's actually way closer to native than FSR. The old versions of DLSS and fsr because that's what officially supported in the game.
@@NamTran-xc2ip dlss is only "better" than native if taa implementation was shit.
@@notenjoying666 Most games have bad taa, and even if taa is decent it will always be an inferior sampling method to even fsr, especially at 1080p. You can blame the game to make excuses for amd all you want, but thats the reality.
@@NamTran-xc2ip At 1080p fsr is a ghosting mess. Taa is way better.
I don't think using any form of upscaling is worth using at 1080p. Even dlss looks blatantly worse than native. I'm sorry but why do you want to run games at lower image quality than most PS4 games...
for the love of god please Sony gift your PSSR AI upscaler to AMD
While probably not the intended use of the video, it does also give us an interesting look at what we can expect out of the Switch 2 now that it will have access to DLSS 2 (I seriously doubt it will have DLSS3).
Its also unclear how many Tensor Cores it will have as well, but generally speaking its nice to see how well DLSS compares to native while still delivering a significant boost to performance from such low starting base resolutions.
Hopefully most devs take advantage of it, since poor performance, really low resolutions and no AA were way too common on the Switch 1
Switch 2 will have access to Ray Reconstruction (since even the 20-series cards have it) but not Frame Generation since it lacks the Ada Lovelace OFA, but that won't be too much of a drag since Nvidia doesn't recommend using Frame Generation below 60 fps anyways.
Papermaster confirmed that FSR AI is coming.
This is both good and bad news.
The good news is that it will probably catch up with DLSS 3.5 upscaling.
The bad news is that I suspect that it won't be on par with DLSS 4 upscaling which is incoming with the Blackwell release on the market... basically they will always be one or even two steps behind.
AMD needs to start funding (and supplying human resources) to their software team more and they need to focus theose resources on specific useful techs rather than more side projects (AFMF) which can really be done if you are dominating and you have spare time and money to throw somewhere. And this latter part is definitely Nvidia's situation not AMD's.
worst thing about fsr for me is poor resolution in distances cuz im born to look ahead :D it destroys all vibe of playing CP with poor details in distance
I love how bad the rachet and clank run on my 6900xt while other games I out perform the rtx 3090. Please make it notice I want insomniac fix the rachet and clank..
That game is Nvidia shilling machine and many others
Thats how favoring works 3090 and 6900xt is just the same in 1440p and below
on my rx6600 the game is stable only with dynamic resolution
@@dssd7685 Yes, yes my friend. Every example of AMD being worse is just "Nvidia shilling machine", "paid bastards" and so on. Every single one of them. Huh, it seems NVidia is sponsoring like 95% of games today...
@@gargean1671 from crysis to Alan wake its ngreedia tech only and yes its ngreedia shilling machine, holy sh!t this comment sections filled with nvidia bots, not to mention fsr 3 is the latest and implemented on only 3games and perfectly implemented on only one game and this video is comparing the ancient fsr 2. smh
1080p dlss quality on 24' vs 1440p dlss performance on 27'. Which one has higher image quality?
Screen size has no impact on image quality whatsoever, fyi.
@@Dajova it's not only screen size, I m saying 1080p dlssquality vs 1440p dlss performance
@@Dajova It influences PPI, which affects image quality.
@@H.S03Its still the exact same amount of pixels, just on a different space. What youre talking about is how well a monitor is handling the amount of pixels rendered. He's talking about IMAGE quality, not MONITOR quality.
@@Dajova The number of pixels are the same, not the pixels themselves as they are physically smaller in order to cover a smaller area. I didn’t reference any aspect of monitor rendering or quality. So long as the resolution stays the same, a smaller screen will have a lower pixel pitch (can be thought of as PPI), which will always lead to a more detailed image when all else is equal.
The entire reason resolution increases image quality is because of PPI. The same screen space + more pixels = higher density of information which naturally allows for a greater level of detail since more visual information can be represented for any given area of the screen.
Fuzziness Smeared Resolution is what FSR truly stands for.
I don't like deep learning shit. But there was a very good deal on "4060". After playing reddeadredemption2 with dlss on quality, I could not tell the difference to native. But trees looked smeared with FSR quality.
DLSS more like: Don't Look Still Smudged
@@Moon-v5xI don't know. I jumped from 1060ti to 4060. And most games I tried looked good with dlss "quality".
With FSR there always something wrong.
@@fanaticaudienc8089There's no card named 1060ti
@@NamTran-xc2ip huh?
As someone who still plays on a 1650, an unstable image is infinitely better than 20fps, but no change to the upscaling component from FSR 2 to 3 is ridiculous.
I believe AMD stopped working on FSR2 altogether and is currently focusing on the next iteration that uses machine learning cores as mentioned last week, probably named FSR4. They gave up on what was obviously not working and moved on. Tensor cores won.
Slamming calculations using specially made hardware acceleration cores is obviously superior.
Also means you need that hardware.
And Nvidia wont allow anyone else, because they dont care about spread of technology, they only want to sell.
Yeah, playing on a AMD GPU at 1440p myself, I try to use native resolution whenever possible. The hit to image stability is just too noticeable, I'd rather turn down stuff like shadows and reflections than use FSR.
More importantly, XeSS and TSR both do a much better job imo, so I occasionally use those where possible.
I don't agree with upscaling at 1080p... mid range GPU prices have dropped to around 200-300€ and they can easily handle any game at 1080p...
What mid range GPUs cost less than 300$?
@@roki977 6600 around 200€, 6600xt around 300€, 7600 around 300€, 3060 around 200€, 4060 around 300€...
It's funny because I experienced FSR well before DLSS and just accepted that upscaling comes with a certain amount of ghosting and artifacts. I didn't get why so many claimed DLSS was better until I got an RTX card and realized what _good_ upscaling was. Don't get me wrong, I'm still grateful that FSR exists because it can be used on my wife and kids' GPUs giving them a longer life with newer games, but it really doesn't compare with DLSS, no matter what the AMD fanboys pretend.
The fact that _no one_ uses FSR if they have access to DLSS is all the proof you need that AMD still needs to do a lot of work with FSR.
In other words: Don´t use FSR at 1080p. Drop settings to high or even medium and enjoy.
FSR will get better at time
so is DLSS
DLSS seems to prioritize image stability and that was always one of my biggest personal annoyances before DLSS (damn you thatched roofs in Witcher 3). Which is a big reason I've been a big fan girl. I do like a clear image but I'm far more bothered by frame to frame flicker. Esp when standing still which FSR STILL has issues with.