Is DLSS Really "Better Than Native"? - 24 Game Comparison, DLSS 2 vs FSR 2 vs Native

  • Published: 5 Oct 2024

Comments • 1.4K

  • @vernacular3289
    @vernacular3289 1 year ago +992

    This is really a test of how bad anti-aliasing has become in games. It has gotten so bad that resolution scaling becomes the only means of avoiding shimmering jaggies.

    • @quantumsage4008
      @quantumsage4008 1 year ago +47

      Haven't played that many modern games recently, but good lord, War Thunder has a really bad anti-aliasing problem. It's so bad that SSAA is mandatory, and even with that there's so much noise you'll have a real hard time spotting enemies. TAA helps too, but certain textures still blink, though it's not as bad as it was previously. You have to sacrifice so much performance.

    • @CC-vv2ne
      @CC-vv2ne 1 year ago +178

      @@quantumsage4008 TAA is the worst thing to happen to games; every single TAA implementation makes the image noticeably blurry

    • @Dyils
      @Dyils 1 year ago +23

      @@CC-vv2ne So add sharpness, what's the issue? Have you seen TAA in R6 with a bit of sharpening (the slider right under the TAA setting)? It looks CRYSTAL CRISP. It's AMAZING. I wish all games had such good anti-aliasing and sharpening.

    • @jeanmartin7166
      @jeanmartin7166 1 year ago +26

      @@Dyils Ubisoft is usually among the best technically.
      It would be like saying DX12 has not been a severe issue because WD Legion managed to make DX12 run better than DX11.
      TAA is usually badly implemented; R6 isn't proof that most games could achieve the same thing.

    • @DanielOfRussia
      @DanielOfRussia 1 year ago +12

      @@CC-vv2ne FH5 is the one example I can think of where TAA looks way better than even MSAA, but it's more of an outlier.

  • @zenpool
    @zenpool 1 year ago +531

    This gets interesting when some game devs enable DLSS as an antialiasing solution (DLAA) while retaining the native render resolution.

    • @LuchsYT
      @LuchsYT 1 year ago +32

      The Finals enabled DLAA for its beta and it looked incredible

    • @lexsanderz
      @lexsanderz 1 year ago +43

      Don't wait for developers to learn the Nvidia SDK. Instead, you can use the DLSSTweaks project to enable DLAA in any game that uses the Nvidia DLL.
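The DLSSTweaks approach mentioned above is mostly a config edit: a DLL override plus an ini file. A minimal sketch of what the relevant entry might look like; the section and key names here are assumptions, so check the tool's own documentation for your version:

```ini
; Hypothetical DLSSTweaks.ini fragment (key names may differ per version).
[DLSS]
; Ignore the game's requested quality mode and render at full output
; resolution, which is effectively DLAA:
ForceDLAA = true
```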

    • @winebartender6653
      @winebartender6653 1 year ago +4

      Yup, DLAA looks really, really good at native (what I use with Hogwarts Legacy) on a 27" 1440p monitor. I found DLSS Quality looks quite bad in Hogwarts, noticeably softer and blurrier, especially indoors.
      I didn't realize it was possible to enable it on your own for any game with DLSS; I'll have to try that in other games.

    • @SweatyFeetGirl
      @SweatyFeetGirl 1 year ago +9

      Dlaa looks absolutely horrible to me...

    • @znoozaros
      @znoozaros 1 year ago +13

      @@SweatyFeetGirl Too soft at lower resolutions, unlike MSAA, which works great

  • @noliferxd
    @noliferxd 1 year ago +323

    Guys, I'm shocked by the scale of the work done. Thanks a lot. I would also like to see a comparison of DLAA with other AA methods at native resolution.

    • @jeanmartin7166
      @jeanmartin7166 1 year ago +6

      DLAA would also remove the "bad AA implementation" bias when comparing native to upscaling methods

    • @bluesharpie9744
      @bluesharpie9744 1 year ago +6

      DLDSR 1.78x + DLSS Quality (1.78 x 0.444 = about 0.79, so roughly 80% of the native pixel count) is a close contender to DLAA with similar performance, in case DLAA is not available (although there now seem to be mods to hack it in).
      In my experience DLAA tends to ghost quite a bit around edges.
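The pixel-budget arithmetic above is easy to verify. A quick sketch, assuming the DLDSR factor multiplies total pixel count while DLSS Quality scales each axis by 2/3:

```python
# DLDSR 1.78x + DLSS Quality pixel budget, relative to native 1440p.
native = 2560 * 1440                    # native pixel count
dldsr_target = native * 1.78            # DLDSR factors apply to total pixels
internal = dldsr_target * (2 / 3) ** 2  # DLSS Quality: 2/3 per axis = 4/9 of pixels
ratio = internal / native
print(round(ratio, 2))                  # 0.79, i.e. about 80% of native
```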

    • @jeanmartin7166
      @jeanmartin7166 1 year ago +1

      @@bluesharpie9744 I tried DLDSR 2.25x in Death Stranding Director's Cut and it wasn't good. Not as bad as TAA alone, but still well behind DLSS alone.
      PS: that's only one game at 1080p; I haven't tested every game.

    • @teddyholiday8038
      @teddyholiday8038 1 year ago +3

      DLAA will always be better

  • @jayjay00agnt
    @jayjay00agnt 1 year ago +137

    Awesome information. This channel has been hitting it out of the park with valuable and hard-to-find information. I know these videos are a ton of work, and I'm hoping others can appreciate and find this content as helpful and useful as I do.

    • @drek9k2
      @drek9k2 1 year ago

      Yeah, for real, and these guys are truly independent and not willing to shine a turd. It'd be funny watching them see a GTX 1630 in action at that price. I think they're pretty fair: they rightfully slammed the 6500XT as a garbage product, yet lots of butthurt fanboys had to cope about what seemed like blatantly obvious common sense to me, that 8GB was not enough. I'm confused by TechPowerUp though; it says a 6500XT performs the same as a 5500XT? Why? Funny, because AMD was mostly the best brand last gen, except for that one stinking turd. I automatically expect Gamers Nexus and Hardware Unboxed to be utterly unswayed by corporate bullshit artists.

    • @vmafarah9473
      @vmafarah9473 1 year ago +1

      Meanwhile Linus Tech Tips uses 1/20th of the effort to make a video, but includes some handheld camera movements, some GPU dropping, some sponsor shots, some jokes.

  • @Maddin-6616
    @Maddin-6616 1 year ago +193

    I will never understand why the DLSS or FSR files are not kept updated... there are really big differences in quality between versions, and it's not a big job for the devs, but the payoff is all the greater.

    • @NostalgicMem0ries
      @NostalgicMem0ries 1 year ago

      What do you mean by updated? To the latest versions, or implemented in other games?

    • @DavidFregoli
      @DavidFregoli 1 year ago +61

      You can't just boot the game and test a couple of scenes to verify it works correctly; there might be a specific area or shader that doesn't play well with it. It's not that straightforward.

    • @muchvibrant
      @muchvibrant 1 year ago +16

      @@DavidFregoli I agree, but I personally use the latest DLSS version in games that support DLSS but are stuck on an older version (e.g. RDR2), and the game looks native with more fps. DLSS 2.5.1 or any 3.x.x looks so good.

    • @lexsanderz
      @lexsanderz 1 year ago +30

      As a developer, it comes down to: there is no ticket for that. Anyway, the Nvidia SDK states that the DLL is drop-in replaceable, and that is a major feature of their SDK. There's even an update function you can call, but nobody ever does.

    • @muchvibrant
      @muchvibrant 1 year ago +25

      @@TechTusiast Also, don't get confused: Nvidia DLSS DLL 3.1.11 is not the DLSS 3 we know. It's still DLSS 2, just with version number 3.1.11.

  • @ryomario90
    @ryomario90 1 year ago +67

    You have to disable and then re-enable AA in Death Stranding Director's Cut each time you launch the game; otherwise it runs without AA, even though the graphics menu says it's on. The standard Death Stranding didn't have this issue.

    • @Keivz
      @Keivz 1 year ago +7

      Wow, great observation. Doesn’t change the overall result but the devil is in the details.

    •  1 year ago +2

      Who fucking plays walking simulator!!

    • @B.E.3.R
      @B.E.3.R 1 year ago +3

      I knew something was wrong with the results. I played the standard edition of DS a few months ago and thought the TAA implementation was actually good. And I played at 1080p!

    • @starcultiniser
      @starcultiniser 1 year ago +1

      @ people without legs?

    • @ShadeKill
      @ShadeKill 1 year ago

      @@starcultiniser Imagining it is its own journey

  • @c.chepre8452
    @c.chepre8452 1 year ago +87

    Would love to see DLDSR together with DLSS added at some point in the future; I often use both together at 1440p and it further reduces flickering/aliasing.

    • @8Paul7
      @8Paul7 1 year ago +17

      Yeah I play on 1080p plasma and the picture is just perfect when using DLDSR and DLSS quality combo.

    • @Dionyzos
      @Dionyzos 1 year ago +12

      I love DLDSR even at 4K. It performs even better than native and looks significantly better in most titles.

    • @Lollakuk
      @Lollakuk 1 year ago +3

      @@8Paul7 So you DLDSR to 4K and then run DLSS Performance on top, or?

    • @impc8265
      @impc8265 1 year ago +3

      @@Lollakuk You can't DLDSR to 4K on a 1080p screen, but yes, the idea is correct.

    • @Lollakuk
      @Lollakuk 1 year ago +5

      @@impc8265 You definitely can. I have DSR'd to 2160p on my 1080p monitor.

  • @kalark
    @kalark 1 year ago +70

    I honestly assumed going into this video that native > DLSS and FSR in terms of quality, so it was pretty interesting to see how upscaling resolved issues in some games like DS. Great video!

    • @dante19890
      @dante19890 1 year ago

      That's not always the case; sometimes DLSS 2 will have better image integrity than native.

    • @razoo911
      @razoo911 1 year ago

      In fact it's true; from this video you can see that the only thing FSR and DLSS do better is anti-aliasing.

    • @uhurunuru6609
      @uhurunuru6609 1 year ago

      Well, you assumed correctly, because Tim makes a common mistake: native > upscaling (ANY), but DLAA > AA (ANY), because all AA is trying to imitate DOWNscaling.
      Rendering at 8K and displaying at 4K gives you the best AA you can get, at the cost of a huge performance hit, and every AA method ever invented tries to mimic this taxing approach without the FPS hit.
      DLAA trains the AI on downscaling to do better AA without upscaling the render. So all this video shows is what we already know: DLAA is better than any other AA.
      This is the worst video Tim has ever made, especially as we can now force DLAA in any game that has DLSS support (how? see my other posts in these comments).

  • @JohnThunder
    @JohnThunder 1 year ago +123

    I noticed that some DLSS versions give you heavy ghosting. If I notice this in a newly released game, I replace the DLSS file with the 2.5.0 or 2.5.1 version.

    • @Napoleonic_S
      @Napoleonic_S 1 year ago +1

      Where do you get those DLSS files?

    • @conyo985
      @conyo985 1 year ago +1

      I like those versions too! They give the sharpest image and the least ghosting.

    • @jemborg
      @jemborg 1 year ago +5

      Latest is 3.1.11; why not use that?

    • @mylliah7648
      @mylliah7648 1 year ago +10

      I use the 3.1.11 DLL with DLSSTweaks to change the preset to C (the preset used in 2.5.1). So it's even better than 2.5.1!

    • @Shendue
      @Shendue 1 year ago +27

      @@jemborg Precisely for the reason stated, duh: "some DLSS versions give you heavy ghosting".

  • @glittlehoss
    @glittlehoss 1 year ago +24

    Unbelievable content. You need to keep showcasing games' relative performance between DLSS, FSR and native twice a year to keep up with the updates and the new games coming out.

  • @lhv2k
    @lhv2k 1 year ago +18

    I think you should consider testing more games with a manually updated DLL for the latest version of DLSS. As you said, just updating the version can change the result from worse than native to matching or even beating it, so I wonder whether that's a trend across the board. Since this is a comparison of DLSS vs native, I think it's fair to use the best and latest version, since users can swap it in for free without depending on the game's developers. If the results are indeed positive across the board, it's going to be a truly game-changing technology: a free, stable, noticeable performance boost that might even improve visuals. Simply a much better option than overclocking, since it doesn't depend on the silicon lottery.

  • @cromefire_
    @cromefire_ 1 year ago +95

    Maybe instead of, or in addition to, further optimizing DLSS and FSR image quality, developers should spend more time properly fixing native rendering. Basically every single time DLSS was better, it wasn't because DLSS was so good, but because the native TAA was so bad. We've seen this before with things like AMD's CAS, which seemed to work a lot better than whatever developers built themselves...

    • @cromefire_
      @cromefire_ 1 year ago +7

      @@jkahgdkjhafgsd It absolutely does not, not even on a 4K monitor.

    • @cromefire_
      @cromefire_ 1 year ago +4

      @@jkahgdkjhafgsd Even on a 140ppi monitor, jagged edges look worse than anything else. I'd rather take bad TAA that flickers a lot than have literally everything look absolutely awful... If you want something good, look at SOTTR's SMAA + TAA implementation. It's expensive, but it works wonders: not a lot of flickering or other artifacts, just plain solid.

    • @normaalewoon6740
      @normaalewoon6740 1 year ago +1

      No anti-aliasing is almost perfect during movement; it absolutely beats the clarity of even DLAA. Aliasing patterns are a problem because pixels are only rasterized at their centers, which is pretty wasteful. It's better to rasterize at random locations inside each pixel, which also randomizes the aliasing patterns and makes them a lot smoother in real time. To improve clarity even further, you'd need foveated supersampling/downsampling, on top of backlight strobing/black frame insertion and eye-movement-compensated motion blur, which make 100 Hz look and feel like 1000 Hz in a non-destructive way.

    • @coaltrain2299
      @coaltrain2299 1 year ago

      @@jkahgdkjhafgsd DLAA

  • @chrisvicera6696
    @chrisvicera6696 1 year ago +13

    In some games I honestly prefer 4K native without any temporal AA or similar techniques. FSR, DLSS and TAA, when poorly implemented, show lots of ghosting and shimmering, which is super annoying.

    • @prman9984
      @prman9984 1 year ago +2

      In most games.

    • @DavidFregoli
      @DavidFregoli 1 year ago

      Yes, that's the point of the video.

    • @chrisvicera6696
      @chrisvicera6696 1 year ago +2

      @@DavidFregoli No, he compares native using the best available AA, which is typically TAA, if you actually listened to the video lmao

    • @DadGamingLtd
      @DadGamingLtd 1 year ago +9

      It bothers me that TAA has become the go-to AA technique. I realize it takes a lot less power than SMAA, but it consistently looks like ass. Luckily, you can turn it off in most games.

    • @Edinburghdreams
      @Edinburghdreams 1 year ago +7

      Glad to see people are waking up to TAA looking like garbage.

  • @awesomecarl
    @awesomecarl 1 year ago +55

    DLSS is just really good anti-aliasing; that's why they added DLAA, which is just DLSS without the upscaling.

    • @HanSolo__
      @HanSolo__ 1 year ago

      This

    • @Superdazzu2
      @Superdazzu2 1 year ago +5

      No it's not, lol. It also gives back huge chunks of fps, ranging from 20+ (Quality) to almost double with Performance.

    • @johnnypopstar
      @johnnypopstar 1 year ago +11

      @@Superdazzu2 Yes, it is. He is absolutely correct, from a technical perspective, about what the image processing techniques involved in the upscaling are doing. There is also a *side effect*: because it starts from a lower-resolution image, you get an FPS bump. But the fundamentals of the image processing going on are, *unavoidably*, that it's doing extremely computationally inefficient anti-aliasing.

    • @Drip7914
      @Drip7914 1 year ago +1

      @@Superdazzu2 No it doesn't, lmao. DLSS 3, the latest iteration of DLSS, actually REDUCES performance due to 10-15x the latency; FSR actually increases performance/quality. (Source: HUB, AdoredTV, NotAnAppleFan, PCWorld, etc.)

    • @awesomecarl
      @awesomecarl 1 year ago

      @@Superdazzu2 Yeah, because DLSS is applying that anti-aliasing to a lower resolution instead of native.

  • @julfy_god
    @julfy_god 1 year ago +18

    24 games, that's a lot of work :) Great job, Tim!

  • @valrond
    @valrond 1 year ago +50

    You know, I had been playing newer games with DLSS Quality enabled at 4K and 1440p on my 4090 and 3080 respectively, since it came enabled by default. They looked fine. However, a few days ago I went back to native and was surprised by how much crisper everything looked.

    • @beachslap7359
      @beachslap7359 1 year ago +3

      Well, it's a trade-off. You easily get used to a less crispy image, and the smoothness makes up for it, but once that extra smoothness is gone you IMMEDIATELY notice it.

    • @sebastien3648
      @sebastien3648 1 year ago +3

      Because AA sucks; with no AA (TAA, DLAA, DLSS) games look way sharper. But sadly these days you're forced to have at least TAA in like 90% of games, even though at 4K I would personally rather have the slight shimmering than the smoothing. Basically this comparison is between TAA and DLSS/FSR; native doesn't mean s*** anymore.

    • @valrond
      @valrond 1 year ago

      @@beachslap7359 I'm already getting well over 60 fps. I don't care if I get 80 or 120.

    • @TheReferrer72
      @TheReferrer72 1 year ago

      @@sebastien3648 Suck? You mean look sharper when you take a screenshot and stare at the resulting image for 5 minutes?
      Graphics in games are brilliant.

    • @Terepin64
      @Terepin64 1 year ago

      @@cptnsx Calm your tits. I wrote "to mitigate", not "to fix".

  • @krystiano.610
    @krystiano.610 1 year ago +17

    22:10 While in theory automatic updates sound good, as a developer, in practice we never want to release an untested update.
    In a normal development cycle (and in most cases), after releasing version 1.0, active development stops and we move to a support phase, where we don't add any new features and focus only on fixing bugs and glitches. Most of the developers move on to other projects (which may even be a DLC for that game, but that also counts as a different project), and with fewer people available you can't just constantly update such an important library as DLSS (important because it directly affects the game's visuals).
    If something worked before, there's no guarantee it will still work after an update. For example, a particle effect at a waterfall may start producing ugly artifacts after the update; QA may not catch this (mostly due to reduced resources), but gamers surely will. That's a big risk, and it can be avoided by not updating the library.

    • @jemborg
      @jemborg 1 year ago

      It helps if you make a backup of the original dll before you update... in case it looks worse.

    • @Vantrakter
      @Vantrakter 1 year ago +3

      That sounds pretty logical. And yet, if upscaling techniques are being used by more and more games and gamers, and the version of the upscaling library directly affects enjoyment of the game, it seems logical for project management to allocate resources to keeping it updated after release, or at least to evaluate an update some time after release. I'm not sure one visual glitch in a waterfall matters if the rest of the game looks better and runs slightly faster than before.

    • @deviouslaw
      @deviouslaw 1 year ago +2

      Outdated thinking

    • @jeanmartin7166
      @jeanmartin7166 1 year ago

      If the result is better in 99% of cases, why not always update and leave the tested version as a default option? In many games you can choose which version of DLSS/FSR you want in the game settings.
      Players would probably love to see some minor updates after release, especially when they don't require a heavy download.

  • @daelionmr4782
    @daelionmr4782 1 year ago +4

    That native image instability and flickering bothers me a lot once I notice it, and DLSS Quality fixes it in most cases, so I'm using it almost all the time.
    To me that looks better than native TAA, and I can't notice the ghosting tbh. Plus less power draw and lower temps with better performance, so it's a win-win for me.
    Even on my lowly 29" 2560x1080 ultrawide, Quality mode looks good enough to me. I usually update the DLSS DLL manually to version 2.5.1, though I've yet to check the newer versions.

  • @gunnarthegumbootguy7909
    @gunnarthegumbootguy7909 1 year ago +4

    DLSS has a wonderful effect on surfaces with shimmering lighting/shading or shimmering/bubbling screen-space reflections, which are pretty common in games from the last 3 years; my guess is that its temporal accumulation smooths them out. In some titles it's very obvious, like Cyberpunk 2077, where this kind of shimmering appears on a lot of surfaces. In theory a good TAA implementation could do the same, but it would have to understand what kinds of objects to apply it to, or it would cause ghosting... I guess this is where DLSS comes in, since the machine learning can work out which parts should reasonably not be shimmering, and modern 3.1 DLSS has very little noticeable ghosting anyway.
    There's also DLSSTweaks, which is a fabulous tool, and it now even comes with a UI program for people who might feel intimidated by editing the text files directly.
    With it, you can set your own internal resolutions; I've been playing around with it a lot. Most games can run at really weird internal resolutions, like rendering 4K content at 3:2 3240x2160, which I've tried: that is DLAA vertically and DLSS horizontally. It's useful in games where you need just a bit more performance than DLAA allows but aren't satisfied with the Quality option, or where you want something in between the available settings. I hope this eventually gets baked into games as a slider, so that instead of the fixed "Quality, Balanced, Performance, Ultra Performance" options you could slide from full 1x internal resolution (DLAA) down to below Ultra Performance (0.333x). I've used this tool to max out The Last of Us Part I: I set a resolution just high enough to look very close to native (above the DLSS Quality ratio of 0.6666667) while still fitting within my VRAM limits for a given set of settings, which I couldn't have done at native. DLSS really saves the day. I've tried FSR and XeSS to see if they're better in any game, but I haven't yet found a case where they look better. Having recently moved over to a 4K TV for gaming and content viewing, I wouldn't even be able to run a lot of the games I like at 4K without DLSS; it's really the main thing that would keep me from considering AMD cards if I were looking for a new graphics card now.
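The slider idea above maps to a simple per-axis scale factor. A sketch of the arithmetic (the mode table reflects the commonly cited DLSS ratios; the function is purely illustrative, not a real API):

```python
# Per-axis render-scale ratios for the fixed DLSS modes, for comparison.
DLSS_MODES = {
    "DLAA": 1.0,
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, scale):
    """Internal render resolution for a given per-axis scale factor."""
    return round(out_w * scale), round(out_h * scale)

# A "slider" value between Quality (0.667) and DLAA (1.0) at 4K output:
print(internal_resolution(3840, 2160, 0.75))   # (2880, 1620)
```

The 3240x2160 example from the comment is the non-uniform case: a 1.0 scale vertically and 3240/3840 = 0.84 horizontally.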

  • @Techfanatic73
    @Techfanatic73 1 year ago +11

    Some other channels could learn from your image quality: always sharp and clear 4K. Probably the best-looking channel I watch. Why don't other channels look this good? Lol. Keep it up!

    • @Sal3600
      @Sal3600 1 year ago +3

      Biased. They're all the same.

  • @brucethen
    @brucethen 1 year ago +14

    Thank you for all your effort in producing these comparison videos; it is much appreciated.

  • @benfowler2127
    @benfowler2127 1 year ago +13

    I've felt that native is the only true measure of what a GPU can do, but I can't tell the difference in SOME scenarios. If I'm in the middle of a game, I'd say there are only certain things, like fences and similar fine detail flickering, that would get on my nerves. So it seems it'll end up being a game-by-game decision for me whether I use the tech or just play at native rendering.

    • @gozutheDJ
      @gozutheDJ 1 year ago +6

      In terms of judging how performant a card is: 100% native. In terms of playing and enjoying a game: whatever looks best to you.

    • @dante19890
      @dante19890 1 year ago +1

      Why would you sacrifice performance if you can't tell the difference anyway?

    • @benfowler2127
      @benfowler2127 1 year ago

      @@dante19890 I'm basing my statement on what I see in the video. I haven't yet compared them for myself in the games I play. I tend to play at native res without frame generation anyway; the one time I tried it, it wasn't impressive, so I haven't bothered since. Also, with whatever extra processing YouTube does to these videos, and depending on the screen someone is watching on, it can be difficult to tell if I'm seeing things the same way, especially since "seeing 4K" on a tablet vs a monitor or TV depends on those devices. What I should do is watch this video again on the PC and monitor I'd be gaming on. And I did say "SOME" scenarios, not all; there's a clear difference in many of the examples he showed.

    • @dante19890
      @dante19890 1 year ago

      @@benfowler2127 Ye, but either way it's gonna be a little better, the same, or a little worse depending on the DLSS version and the game, and you're getting a huge performance boost, so it's an instant net win no matter how you look at it.

    • @sito3539
      @sito3539 1 year ago +1

      Problem is it's not just fences; it's bloom, rain, depth of field, air particles, god rays, wide color gamuts, OLED contrast, etc., on top of increased latency.
      There are lots of things the AI can't do, and it certainly can't do native graphics better, just TAA better, when TAA is bad.
      Hitman in the rain is the worst offender here: even when DLSS is technically better as an AA solution, it makes the lighting look awful.

  • @87crimson
    @87crimson 1 year ago +9

    Another use case for these upscaling techniques is to reduce the workload and power consumption of your graphics card. If it looks the same to you, just enable it.

    • @eternalbeing3339
      @eternalbeing3339 1 year ago

      It increases CPU usage, though. Some people shouldn't enable it if they're on ancient hardware.

    • @Jonatan606
      @Jonatan606 1 year ago

      @Eternal Being33 A GPU consumes more power than a CPU. You can also combine it with frame generation, which will give it breathing room.

  • @PoRRasturvaT
    @PoRRasturvaT 1 year ago +6

    The dragon neon sign in Hitman loses a lot of brightness and bloom compared to native. The ~1-2 pixel wide neon detail is lost and thus doesn't produce the light it was supposed to. That's pretty impactful in my opinion; it's completely extinguished/muted with FSR.
    This is something that will be noticed a lot in Cyberpunk, where 1-2 pixel elements (neon signs again) have to convey bright, contrasted information, further enhanced by bloom post-processing.

    • @MichaOstrowski
      @MichaOstrowski 1 year ago +1

      The intensity of bloom is, sadly, often related to the native resolution, so when that drops, you get less bloom...

    • @monotoneone
      @monotoneone 1 year ago +2

      Yes, lighting effects take a hit when upscaling. This is why I ran Death Stranding at native despite the shimmering; the lights look "wrong" with DLSS. The more bloom a light source has, the more noticeable it is. Neon lights often look like LED lights with DLSS.

    • @juanblanco7898
      @juanblanco7898 1 year ago

      There's a gamma shift and an overall loss of color accuracy when scaling is performed, which is to be expected for obvious reasons. Sadly, this is not a problem most people, be it developers or consumers, are concerned about. Or rather, what's actually sad is that I'm definitely not one of them, as I'm very sensitive to such things.

    • @gozutheDJ
      @gozutheDJ 1 year ago

      Good call, I didn't even notice that.

    • @gozutheDJ
      @gozutheDJ 1 year ago

      @@juanblanco7898 you can't adjust gamma settings in game?

  • @lupusprobitas
    @lupusprobitas 1 year ago +8

    I would love it if you also tested DLSS Balanced. It would be nice to know whether it's a good tradeoff or something to avoid in general.

    • @paulcox2447
      @paulcox2447 10 months ago +1

      Depends on your starting resolution. At 4K, Balanced works well; at 2K it's OK; at 1080p I'd stick to Quality.

  • @ginzero
    @ginzero 1 year ago +7

    Very well done, balanced comparison! Thanks for the timely update on where we're at with these technologies.

  • @elaiswaifu5459
    @elaiswaifu5459 1 year ago +7

    When I have performance to spare, I tend to use this trick: DLDSR 2.25x + DLSS Quality.
    I play on a 4K monitor, so the game gets a 6K target and DLSS Quality renders internally at 4K, then upscales to 6K.
    This makes the image look a lot sharper, actually better than regular 4K!
    Even 6K Performance mode (which renders internally at 1620p) usually looks almost like native 4K, and often even slightly better.
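The resolutions quoted above are easy to check. A quick sketch, assuming DLDSR 2.25x means 1.5x per axis while DLSS Quality renders 2/3 per axis and Performance 1/2:

```python
w, h = 3840, 2160                              # native 4K output
tw, th = int(w * 1.5), int(h * 1.5)            # DLDSR 2.25x target: 5760 x 3240 ("6K")
quality = (int(tw * 2 / 3), int(th * 2 / 3))   # DLSS Quality: back at 3840 x 2160
performance = (tw // 2, th // 2)               # DLSS Performance: 2880 x 1620 (1620p)
print((tw, th), quality, performance)
```

So the combination really does render internally at native 4K (Quality) or 1620p (Performance) while resolving to a 6K target.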

    • @shockwav3xx
      @shockwav3xx 1 year ago +1

      What is DLDSR?

    • @lonerprofile
      @lonerprofile 1 year ago +2

      It's called supersampling; it's also used in the video/anime industry. Render video at 4K and downscale to 1080p, or 8K downscaled to 4K; that's why you see those studios use 8K/12K high-end cameras.

    • @elaiswaifu5459
      @elaiswaifu5459 1 year ago +2

      @@shockwav3xx The opposite of upscaling: it downscales the game instead, which gives a sharper, crisper image than native.

    • @Dionyzos
      @Dionyzos 1 year ago +1

      I often use 1.78x, which already looks great.

    • @shanroxalot5354
      @shanroxalot5354 1 year ago

      @@Dionyzos That's what I'm wondering: is it better to do 1.78x with Quality or 2.25x with Balanced/Performance, etc.?

  • @Raums
    @Raums 1 year ago +74

    My takeaway from this is I can’t really tell the difference so I can just go for fps ❤

    • @kilmor3151
      @kilmor3151 1 year ago +3

      Yeah, with DLSS. Good luck not being able to tell the difference with FSR 2.

    • @Verpal
      @Verpal 1 year ago +9

      @@kilmor3151 FSR 2 is pretty good at 4K; I think it's okay to use its Quality mode there. Below that, I'm not sure.

    • @LeegallyBliindLOL
      @LeegallyBliindLOL 1 year ago +3

      @@Verpal heavily depends on the game. Unusable in RE4 for me.

    • @anuzahyder7185
      @anuzahyder7185 1 year ago

      Bro, FSR sucks. I use Nvidia, but most of the time I prefer native over DLSS Quality, especially in Cyberpunk: 80% of the time I'd go with native and 20% of the time with DLSS Quality. Never FSR; it's horrible.

  • @awphobic
    @awphobic 1 year ago +36

    Great video!
    People need to remember that DLSS/FSR/XeSS *add* performance with much better FPS, which means better frametimes and input lag, leading to smoother gameplay. When you get 30-40 fps at native 4K/1440p/1080p in the games you play, and turning on DLSS gets you much better fps or even a stable 60 fps, that should be a no-brainer based on this 24-game comparison. With DLSS edging closer to native in Quality mode, and FSR and XeSS catching up, it just means better things for consumers. The real problem, as the video points out, is that newly released games (Hogwarts, Dead Space, Death Stranding, The Last of Us) ship with much worse anti-aliasing at NATIVE. It feels like the norm now that developers release unoptimized games, like The Last of Us using 14 GB of VRAM just LOOKING AT A WALL.
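The frametime side of that argument is plain arithmetic. A minimal sketch of the fps-to-frametime relationship being described:

```python
def frametime_ms(fps):
    """Milliseconds spent on each frame at a given framerate."""
    return 1000.0 / fps

# Going from 40 fps native to 60 fps with upscaling shaves ~8.3 ms per frame:
print(round(frametime_ms(40), 1))  # 25.0
print(round(frametime_ms(60), 1))  # 16.7
```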

    • @DavidFregoli
      @DavidFregoli 1 year ago +2

      I'd say it's a no-brainer in the tie cases, where DLSS wins. In the Native+ cases it's the best tradeoff to gain some performance, while in Native++ and Native+++ games you activate it only if you absolutely need to, and potentially try lowering quality settings first.

    • @flintfrommother3gaming
      @flintfrommother3gaming 1 year ago

      VRAM is reserved; it doesn't matter whether you're looking at a wall or at a very busy scene.

    • @Jakiyyyyy
      @Jakiyyyyy 1 year ago +2

      VRAM is already loaded as soon as you start the game; it doesn't matter whether you're just staring at a wall or having intense gunfights.

    • @naturesown4489
      @naturesown4489 1 year ago +3

      You should see native without TAA: you'll see jaggies everywhere and shimmering on all the edges. There's a reason games use TAA...

  • @bdifferentb
    @bdifferentb Год назад +2

    I honestly don't see any differences between any of these. Maybe I'm blind.

  • @ian260672
    @ian260672 Год назад +5

    The reason I stuck with Nvidia last year was DLSS; I'm not too bothered about ray tracing, just DLSS. I use a 50-inch 4K TV, and my old 2060 Super got a stable 60 fps at 1440p using DLSS. My 3070 Ti gets a stable 60 fps at 4K in Dying Light 2 using DLSS. The only thing I don't like is the VRAM, only 8 GB on the 3070 Ti; Nvidia messed up with VRAM.

    • @eternalbeing3339
      @eternalbeing3339 Год назад

      8 GB is fine if you don't max out the textures.

    • @vladvah77
      @vladvah77 Год назад

      Well, the RTX 3070/Ti is a 1440p video card, so 8 GB VRAM is exactly what you need.

    • @wertyuiopasd6281
      @wertyuiopasd6281 5 месяцев назад

      ​@@eternalbeing3339Cope

    • @wertyuiopasd6281
      @wertyuiopasd6281 5 месяцев назад

      ​@@vladvah77Fanboys should stay silent.

  • @aisliin
    @aisliin 5 дней назад +1

    Very useful info, would love a revisit with more recent games someday!

  • @Argoon1981
    @Argoon1981 Год назад +5

    IMO, testing native with post-process TAA (temporal anti-aliasing) is not showing native at its best. Compared to DLSS or FSR, TAA lacks a sharpening filter; it slightly blurs the image instead and, worse, creates artifacts that wouldn't exist with no AA.
    MSAA (multi-sample anti-aliasing) would be the best AA option for comparing native image quality, but unfortunately modern games have no MSAA support at all, and forcing it in the drivers only works for older D3D9 (DirectX 9) games, which is a real tragedy to me.
    You can of course still use SSAA (super-sampling anti-aliasing) in any game, by just rendering it at a higher resolution than the one you display at; that has the best image quality of all but also destroys performance.
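The render-at-higher-resolution idea the comment describes (SSAA) is, at its simplest, a box-filter downsample: average each block of high-res samples into one output pixel. Here is a toy sketch on a grayscale image; real drivers use better resampling filters than this, so treat it as illustration only.

```python
# Toy 2x supersampling resolve: average each 2x2 block of a
# high-res grayscale render into one output pixel (a box filter).
def ssaa_downsample(img, factor=2):
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [img[y + dy][x + dx] for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A hard diagonal edge rendered at 4x4 resolves to a smoothed 2x2 image:
hi_res = [
    [0, 0, 0, 1],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
]
print(ssaa_downsample(hi_res))  # [[0.0, 0.75], [0.75, 1.0]]
```

The averaged edge values are what kill the jaggies, and also why the cost grows with the square of the factor: a 2x SSAA frame shades four times the pixels.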

  • @druout1944
    @druout1944 Год назад +2

    DLSS looks awful compared to native; you gotta be blind to think otherwise. This is a very clear (or blurry) example of the old fable "The Emperor's New Clothes".

  • @rakon8496
    @rakon8496 Год назад +12

    This edition was "helpful just in time". Having gotten frustrated these days with DLSS stuttering and lacking quality in Flight Simulator, your mention of the CPU limitation as a deal breaker was very helpful. Enjoying the scenery low and slow with a real-life viewing distance means running the simulation at a CPU limit more often than not. Your confirmation that the title looks much better at native resolution will have serious implications: the 7800X3D has to come to the rescue. 😄💙 PS: Regarding VRAM, the title uses more than 12 GB dedicated in dense areas (even on Mid textures)... the 3070 wasn't long in my system! Thanks

  • @NostalgicMem0ries
    @NostalgicMem0ries Год назад +5

    Remember when they introduced DLSS and everyone said it was a gimmick and would never be usable, etc.? Now, a few generations later, we are comparing whether native or the upscaling tech is better... The same will happen with frame generation: right now it's somewhat noticeable, with latency and some ghosting, but give it some time and it will be a no-brainer tech to double FPS and smooth out all games.

    • @mryellow6918
      @mryellow6918 Год назад +2

      Nobody said it was a gimmick, everyone said it was crap. And it was

    • @NostalgicMem0ries
      @NostalgicMem0ries Год назад +1

      @@mryellow6918 Trust me, I was there and remember the reviews; many said it would not work, and here we are. I'm just saying the same about frame gen. See you in 2-4 years.

    • @mryellow6918
      @mryellow6918 Год назад

      @@NostalgicMem0ries Not working and being a gimmick aren't the same thing.

  • @jinx20001
    @jinx20001 Год назад +23

    Incredible result for DLSS, let's be honest. Just tying native would have been incredibly impressive, but to pull out wins on multiple occasions? Wow.

  • @adriankoch964
    @adriankoch964 Год назад +10

    I used DLSS when first playing Cyberpunk on a 1080p ultrawide.
    I set a higher resolution than my panel and balanced it so the game would render internally at my panel's native resolution; the panel would then downscale the higher-resolution image back down. This gave me more distance clarity and a slightly more stable image in Cyberpunk (that was a month after CP2077's release, before DLAA was an official thing in the driver or the game).

    • @kiburikikiam
      @kiburikikiam Год назад

      So you're basically using DLDSR.

    • @adriankoch964
      @adriankoch964 Год назад

      @@kiburikikiam before that was part of the driver, yes.

  • @EvLTimE
    @EvLTimE Год назад +3

    I just pick 1440p DLSS Quality + sharpening all day over forced TAA in the latest titles, where you can't even pick another AA method (CP2077, for example). DLAA is a good alternative.

    • @mryellow6918
      @mryellow6918 Год назад +1

      Forced TAA is so fkn annoying; it's the reason I could never play the newer Battlefields past Hardline. I can't stand any type of AA, it's all just blurry.

    • @EvLTimE
      @EvLTimE Год назад

      @@mryellow6918 That's the main reason I upgraded from 1080p... simply because no matter which settings you use, you can't see anything: a blurry mess versus a pixelated mess. Recently a friend and I tried The Division 1 and were astonished by its AA implementation, at both 1080p and 1440p. I think we should blame the devs for AA at this point.

  • @Akkbar21
    @Akkbar21 Год назад +4

    6:40 ummm what is happening when it pans down on the spiderman brick wall with DLSS? It is hitching badly and looks so glitchy. If I saw that in game, I would immediately turn off whatever is causing it (so DLSS). Why wasn’t this incredibly obvious artifact mentioned? Am I confused here and missed you addressing it? Thx

  • @existentialselkath1264
    @existentialselkath1264 Год назад +7

    Standard TAA is often absolutely awful. I'll never understand how it was ever acceptable. DLSS (and FSR) often do a better job dealing with artifacts like ghosting, but obviously they can't compete with the sheer pixel count of native.
    A prime example is often flowing water. Transparencies often don't generate motion vectors, so any moving texture gets smeared like crazy with TAA, but DLSS is capable of keeping the texture crystal clear in those cases.
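The motion-vector point can be shown with a toy temporal-accumulation step. This is a hypothetical sketch (real TAA also jitters samples and clamps history against the current frame's neighborhood), but it captures the failure mode: each output pixel blends the current sample with history fetched via a motion vector, so a missing (zero) vector, as with many transparencies, leaves stale history behind as a ghost trail.

```python
# Toy 1D temporal accumulation illustrating ghosting without motion vectors.
def taa_step(history, current, motion_px, alpha=0.1):
    """Blend the current frame into reprojected history; motion_px shifts the fetch."""
    w = len(current)
    out = []
    for x in range(w):
        src = x - motion_px            # reproject: where this pixel was last frame
        h = history[src] if 0 <= src < w else current[x]
        out.append(alpha * current[x] + (1 - alpha) * h)
    return out

# A bright dot moving right by 1 px per frame:
frames = [[1.0 if x == t else 0.0 for x in range(8)] for t in range(4)]

with_mv, without_mv = frames[0][:], frames[0][:]
for f in frames[1:]:
    with_mv = taa_step(with_mv, f, motion_px=1)        # correct motion vector
    without_mv = taa_step(without_mv, f, motion_px=0)  # e.g. a transparency: no MV

# With a correct motion vector the dot keeps full brightness as it moves;
# with a zero motion vector the old positions linger as a dominant ghost trail.
print(max(with_mv), max(without_mv))
```

With correct reprojection the moving dot stays at 1.0; with no motion vector, the brightest value in the frame is the decaying ghost at the dot's old position, not the dot itself.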

    • @zxbc1
      @zxbc1 Год назад +1

      It kind of depends. For a long time in M&B Bannerlord for example, even DLSS quality gives off major shimmering with water and ripples, and default AA at native looks significantly better. But later DLSS versions fixed some of that. What they should have done is to swap the DLL to latest version for all the comparisons.

    • @existentialselkath1264
      @existentialselkath1264 Год назад +2

      @@zxbc1 for sure, it's not always better. Spiderman for example has very good temporal anti aliasing tech so these issues aren't as apparent.
      As for older dlss versions, it's extremely easy to replace the dll with a newer version so I don't really consider the older versions anymore

    • @prman9984
      @prman9984 Год назад +1

      Why even use AA at 4k? It hurts performance and looks worse.

    • @zxbc1
      @zxbc1 Год назад +3

      @@prman9984 To fix aliasing, like you would at any resolution. 4K doesn't somehow fix aliasing, it just makes the aliasing size smaller. For me it's absolutely required for a lot of games because I game on a 48" LG C2 at desktop distance, and I can see aliasing very clearly at this distance. If you game on a big screen in living room couch distance you probably won't need it, although for some games that have very bad aliasing, you will still notice severe shimmering even if you can't see the exact aliasing artifact.

    • @cptnsx
      @cptnsx Год назад +1

      @@zxbc1 I've gamed on a big screen in 4K since 2016 and hate AA, especially TAA, with a passion. It destroys image quality. That's why I HATE these people who say DLSS is BETTER than native, because it's NOT. It may be better (in image quality) than native with AA, but NOT without.

  • @misterinfinity4076
    @misterinfinity4076 Год назад +2

    The questions that arise from this are:
    1) Does the gain in visual quality when using DLSS Quality lie solely in the anti-aliasing technique used in native rendering? i.e., would comparing DLSS Quality to native 4K with DLAA give similar results to this video? I think not, but only a handful of games support DLAA.
    2) Does it matter? We have already concluded that "ultra" settings are not worth it; isn't DLSS Quality (or even lower) essentially similar? You get marginally worse image fidelity (50% of the time) and a massive 30-40%+ FPS boost.
    3) Is there any instance where, if you are forced (by your GPU's performance constraints) to render below your monitor's native resolution, you shouldn't use DLSS and should instead rely on "dumb" GPU scaling, or even let the monitor do the upscaling? For example, if I have a 4K monitor but can only run the game at 1080p, is there a case where it's better to render at 1080p directly instead of using DLSS Performance (internal 1080p rendering)? Building on this: if I can buy either a 4K monitor or a 1440p monitor and money is not a concern, then even if I don't have the hardware to run games at native 4K but can run native 1440p, why shouldn't I just get the 4K one and use DLSS to render at approximately 1440p internally? That would make the big old question of "who buys 4K when it's so hard to run?" irrelevant (note that the price difference between a 4K and a 1440p screen is much lower than between a 4K-capable and a 1440p-capable GPU, and a screen "ages" much more slowly than a GPU).
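On the internal-resolution arithmetic in point 3: the per-axis scale factors below are the commonly documented ones for the standard upscaler quality modes (an assumption; individual games can use custom ratios), and they show that 4K Performance mode does indeed render internally at 1080p.

```python
# Internal render resolutions for the standard DLSS/FSR2-style quality modes.
# Per-axis scale factors are the commonly documented defaults (assumed here):
# Quality 1/1.5, Balanced ~1/1.72, Performance 1/2, Ultra Performance 1/3.
MODES = {
    "Quality": 1 / 1.5,
    "Balanced": 1 / 1.724,   # approx. 58% per axis
    "Performance": 1 / 2.0,
    "Ultra Performance": 1 / 3.0,
}

def internal_resolution(out_w, out_h, mode):
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

# 4K output in Performance mode renders internally at 1080p,
# the exact scenario the comment above asks about:
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
```

So a 4K panel driven by DLSS Quality upscales from a 1440p-sized internal render, which is why "4K monitor + upscaling" can substitute for "native 1440p monitor" in the comment's buying question.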

  • @DadGamingLtd
    @DadGamingLtd Год назад +4

    Can we take a second to talk about the jumpy vertical scroll at the end of the Spider-Man DLSS footage? Is that an actual thing, or trouble during editing/encoding?

  • @Zaney_
    @Zaney_ Год назад +6

    I would also like to see DLAA results, and results for rendering the game at a higher resolution and then downscaling it (i.e. 4K to 1440p), to see how those two compare to traditional TAA.

    • @gozutheDJ
      @gozutheDJ Год назад +2

      you people demand a lot without compensating them for it.
      what's stopping you from testing yourself?
      I'll give you a hint, downscaling pretty much always gives you the best image quality.

    • @tvcultural68
      @tvcultural68 Год назад

      ​@@gozutheDJ if we were to follow your brilliant logic, this video would not exist. Hw does technical analysis that most people are not capable of, so it's interesting to see H.W opinion. about the lol reward the channel earns money with the videos by adsense

    • @gozutheDJ
      @gozutheDJ Год назад +1

      @@tvcultural68people are not capable of turning a setting on, checking how it looks in game, turning it off, and comparing? wtf?

    • @tvcultural68
      @tvcultural68 Год назад

      ​@@gozutheDJ It seems like you didn't understand anything I said. Hardware Unboxed goes much deeper into the differences in graphics than most people, in summary, the channel offers a much better analysis. And if someone were to do their own analysis on dozens of games, it would take dozens of hours, I know this because I analyzed 5 games. It's much easier to watch a video that provides a great summary of this issue than for a person to spend dozens of hours analyzing games. Is it difficult for you to understand this?

    • @gozutheDJ
      @gozutheDJ Год назад +1

      @@tvcultural68 because this is literally just content dude. its fun to watch but no one is using this to tune their games. because to tune their game they would just .... do it themselves. i have eyes. i dont need a channel to tell me what looks better to me. its all coming down to personal preference as i said, unless you just go along with what other people tell you is better.
      and also, your monitor factors heavily into this as well. and that's something they can't account for, they can only tell you how it looked to them on their monitors, but you might have a completely different result, so at the end of the day you STILL need to test it for yourself.

  • @jlgroovetek
    @jlgroovetek Год назад +6

    Reading the comments, it would seem many haven't actually watched the video???

    • @solarstrike33
      @solarstrike33 Год назад

      Mudslinging…mudslinging everywhere…

  • @RedundancyDept
    @RedundancyDept Год назад +9

    It boils down to anti-aliasing, which can be pretty bad at native. Interesting that the FSR flaws to which Tim most often points are usually also present in the native/TAA presentation. This suggests to me that complaints about FSR are overblown. Yeah, FSR is worse than DLSS, but it's still pretty close to, and in rare cases better than, native quality. And given that modern games tend to have glaring issues otherwise--perhaps micro-transactions, brain-dead gameplay, bad AI/scripting, and/or things like persistent shadow/texture/mesh pop-in--given all of that, I have a hard time stressing out about minute differences in image quality.
    Still, I prefer to run native when I can.

    • @mryellow6918
      @mryellow6918 Год назад

      Native + blur

    • @naturesown4489
      @naturesown4489 Год назад

      It's not just that, DLSS is trained on a 16K resolution dataset which is why it can generate fine detail better than native 4k sometimes.

    • @DimkaTsv
      @DimkaTsv Год назад +3

      Which adds a further point.
      Unlike DLSS, FSR works with just what it gets; there is no magic box of trained data to mask the lower-resolution source.
      So if native has shimmering and stability issues, FSR will have them as well, sometimes exaggerated, sometimes suppressed.
      It's not that FSR is bad; it's just that DLSS creates detail that wasn't there originally. That isn't a bad thing either, but by itself it shouldn't make FSR the worse technology.
      And the cost of that creation (and the reliance on TAA and AI) is ghosting (more so for DLSS) or unfixed original shimmering (more so for FSR).
      The DLSS/FSR comparison probably should have been run against native as well (just not included in the results); maybe the overall conclusion wouldn't have been so negative towards FSR. It is still inferior for multiple reasons, since for example there is no magic box to draw detail out of thin air, but it doesn't "destroy" the playing experience.

    • @Andytlp
      @Andytlp Год назад +2

      Agree. People forget how bad games looked 15 years ago. It's no longer about graphics or resolution; gameplay is stagnating more. Physics implementations don't get enough attention, nor does audio. The worst offender is AI: most games have brain-dead AI, especially when single-player is tacked onto a game whose main focus is multiplayer, which then makes the multiplayer bots suck too.

    • @naturesown4489
      @naturesown4489 Год назад +1

      @@DimkaTsv I think they do have to train FSR 2 for games somehow, as it cannot be injected into any game like FSR 1 can be with the Magpie mod. I don't know how they do the training though. You make some very good points.

  • @MrSmitheroons
    @MrSmitheroons Год назад +3

    I'm super surprised how close it is. My mind doesn't want to believe it can be this close while giving so much better performance... To the point I keep forgetting it. Brain just doesn't want to get the picture, it doesn't fit with my expectations.
    Thanks for such patience diving deep into these topics! Making these questions as close to objective, and best explained as possible. Super amazing work taking something like subjective visual quality in upscaling, and digesting it into something nigh objective! Your commentary about what game devs should do is also super concise and actionable.
    If I could give a gold star award to your channel, I would. I'm having to seriously consider donating or subscribing on Patreon, even though that's something I rarely (just about never) do. Thank you again for your tremendous efforts over the years. Gamers can be so much more informed now than any time I can recall in history. And it's due to dedicated reviewers such as yourselves.

    • @Sal3600
      @Sal3600 Год назад +1

      Disable TAA, and DLSS and FSR don't stand a chance compared to native.

    • @dante19890
      @dante19890 Год назад +3

      ​​@@Sal3600 incorrect. If u play native without TAA, DLSS will actually have a better image stability

    • @AnuragCrafts
      @AnuragCrafts 3 месяца назад +1

      ​@@dante19890Why would you even do that lol. TAA has a very slight performance hit which is not even noticeable.

    • @dante19890
      @dante19890 3 месяца назад +1

      @@AnuragCrafts A lot of old school pc gamers hate TAA and just run native resolution with no image treatment or AA

    • @AnuragCrafts
      @AnuragCrafts 3 месяца назад +1

      @@dante19890 Better add sharpening from control panel or sharpening reshade or any mod for that particular game. It should be fine.

  • @EpicPianoArrangements
    @EpicPianoArrangements Год назад +1

    A couple of things: is there a reason the DLSS version of the Miles Morales footage started stuttering at 6:35-6:47, and God of War did the same at 12:45-13:00? You'd think DLSS would perform much smoother than native resolution.
    Also, thank you for making a point about games updating their DLSS versions. Trying to update DLSS in RDR2 is almost impossible, since the Rockstar Launcher ALWAYS checks whether files have been modified every time you play, and even if you update the DLSS version, the launcher will ALWAYS replace it with its older v2.2.10 file. If you try to deny write/edit permissions on the file, the launcher won't even launch the game; it will stay stuck trying to 'update' the file.

    • @kevinerbs2778
      @kevinerbs2778 Год назад

      Trash D.R.M.

    • @vladvah77
      @vladvah77 Год назад

      Well mate, you can always use EMPRESS edition for single player campaign of RDR 2, she has cracked a new version with DLSS so I would say F u c k off Rockstar launcher!!!! and enjoy my game anyway :-)

  • @pasi123567
    @pasi123567 Год назад +3

    In my personal experience, I noticed that the newer DLAA anti-aliasing algorithm works incredibly well at solving flickering issues and making the image look high-res. I think this test shows that Nvidia's AA algorithm in DLSS handles scenes much better than TAA, for example, hence why DLSS looks better than native in many cases. I wonder if we will ever get a test comparing DLSS with DLAA; DLAA native scenes should theoretically be better than DLSS.

    • @1GTX1
      @1GTX1 Год назад

      Last of Us and Marvel's Spiderman look amazing even on a 24 inch 1080p screen with DLAA. If every game looked like that, 1080p resolution wouldn't be a problem.

  • @ziben2748
    @ziben2748 Год назад +2

    I love videos like this, GREAT JOB! I strongly hope developers will keep updating their upscalers.

  • @Cptraktorn
    @Cptraktorn Год назад +8

    In my opinion, the boost in FPS is ALWAYS worth using DLSS at Quality minimum. Sure, in some games the ghosting will be noticeable, but with higher FPS you see more detail in motion than at a lower FPS and higher (native) resolution.

    • @2xKTfc
      @2xKTfc Год назад

      It's funny how everyone complains to no end about tearing/vsync but now crappy ghosting is somehow acceptable. Gamers really buy into everything..

    • @Verpal
      @Verpal Год назад +5

      @@2xKTfc Ghosting can be fixed with a simple DLSS dll replacement, unless you are on multiplayer competitive title.... which you would probably have plenty of FPS to begin with.

    • @Cptraktorn
      @Cptraktorn Год назад

      @@2xKTfc I've got a g-sync monitor so tearing isnt exactly an issue for me.
      It's up to you if you think playing at a lower framerate, with blurry motion and higher input lag is worse than a small amount of ghosting.

  • @marsovac
    @marsovac Год назад +1

    "Better" is subjective. What bothers you more: blur? Jagged edges? Flickering fences? Ghosting? Over-sharpening? Shimmering?
    Each of these may carry a different weight for each of us. I personally find DLSS fine on Quality most of the time, and if I need it more aggressive than that, I drop resolution instead. When flickering or ghosting is introduced, I also simply drop resolution; I can bear sharpening problems and blurriness, which is why I prefer a resolution change over the other artifacts. However, I don't play many of these games, and a lot of them have bad TAA by default.

  • @leostival
    @leostival Год назад +8

    I'd love to see something like this for 1080p. Is it crap or is it any good?

  • @Felttipfuzzywuzzyflyguy
    @Felttipfuzzywuzzyflyguy Год назад +11

    The motion performance of both bothers me; I notice it immediately. Obviously they're always improving it, but I personally choose not to enable them.

    • @taufikaji3150
      @taufikaji3150 Год назад

      agree

    • @prman9984
      @prman9984 Год назад +2

      There's one clip in the video where he's saying DLSS is superior, while it's stuttering on DLSS but not in native.

    • @DavidFregoli
      @DavidFregoli Год назад +3

      TAA has the same problem so I guess no AA for you

    • @gavinderulo12
      @gavinderulo12 Год назад +5

      ​@@prman9984 could just be the capture.

    • @nahush6299
      @nahush6299 Год назад

      @@prman9984 DLSS improves performance, so I don't see how it could be stuttering when native doesn't, especially since Tim didn't mention anything; as he tested it, he would have felt it. I guess it could just be a bug while capturing gameplay.

  • @What_In_This_World
    @What_In_This_World Год назад +3

    I have a theory about some of your test data, for Spider-Man for example. I wonder if the game engine and texture packs are only optimized for 1080p and 2160p. If so, when the game renders natively at 1440p it might be using something like a checkerboard technique to reach 1440p, whereas DLSS upscales to 1440p from 1080p (Quality mode) using its mathematical algorithms. That would also explain why the native 4K presentations were better than DLSS upscaling from 1440p: the engine would have already adjusted the image from 1080p to 1440p before DLSS upscales from 1440p to 2160p. It could explain why, in Spider-Man and The Last of Us, native 4K was so much better than DLSS/FSR in fine detail while native 1440p was much closer to DLSS/FSR. 🤷‍♂️ Maybe this is something to look at in certain games: do certain resolutions have particular pros and cons, and do some of those go away with certain DLSS/FSR settings? Is this just a DLSS/FSR issue, or possibly a game-engine issue? Do some developers simply not take the time to properly develop the 1440p presentation, which would mean DLSS Quality mode starts from an already less-optimized image?

  • @blidibliblubla
    @blidibliblubla Год назад +2

    I'm also interested in the wattage used in each of these modes, in times like we have today. I hope we can see a bit more of that information for the modes as well. Thanks anyway for the good work :)

  • @jaroslavmrazek5752
    @jaroslavmrazek5752 Год назад +6

    It would be good to point out that the image differs a lot based on the DLSS version, which can usually be replaced in a matter of minutes. For example, the DLSS in RDR2 is trash, but when replaced with version 2.5.1 it looks amazing.

    • @DavidFregoli
      @DavidFregoli Год назад

      Unless there's a convenient method that scales for every owner (possibly GeForce Experience integration), or an always up-to-date table somewhere that matches each game to a DLSS version, this is an impractical solution that only a few enthusiasts will research. Nvidia hasn't even been able to stay on top of per-game ReBAR on/off settings, as shown by HUB, so there isn't much reason to place trust here.

    • @breeminator
      @breeminator Год назад

      He covered that with that exact game in the final thoughts. It maybe deserved a section of its own, as people probably don't expect new information to be provided in the final thoughts section.

    • @jaroslavmrazek5752
      @jaroslavmrazek5752 Год назад

      @@DavidFregoli There is an app called DLSS Swapper that makes it dead simple.

  • @PixelShade
    @PixelShade Год назад +8

    To be fair, I think the "better than native" claim really comes into play in scenes with limited motion. It could be an RTS like The Riftbreakers, a slow dialogue-focused cutscene in The Witcher 3, a puzzle game like Lego Builders, or just standing still, studying environment or character detail in Cyberpunk. In those cases the temporal resolve has a super-sampled, "better than native" quality to it, while FSR shows flickering instability with sub-pixel geometry (because it lacks the temporal smoothing pass DLSS has; that extra pass, however, often causes extra ghosting in DLSS).
    There are so many components to upscaling, and a lot of subjective opinion about image quality that you can't, or don't really need to, quantify. Personally I'd rather play Cyberpunk with FSR Quality than the native image at 1440p, because of the super-sampled temporal resolve in cutscenes, dialogue, or just when studying environment detail. During motion it's close to native, but with added flickering and slight fuzz, which I don't mind once I get into the game. So even if flickering CAN be distracting, to me it isn't all that important, and I value FSR's benefits over its drawbacks. And that's something you can't really quantify even in an image comparison, because it depends on the scene, what you value, how much you're into the game, what best depicts the artist's intended look, and so on.

  • @marcelo19071998
    @marcelo19071998 Год назад +5

    Always update DLSS to the newest version and test whether it's better; most of the time it will bring a great improvement in image quality (there are exceptions, though).

  • @B.E.3.R
    @B.E.3.R Год назад +1

    Good video, but: 1. There has to be something wrong with the Death Stranding results; I played it recently and its TAA implementation was the best I've seen in years. Maybe it's bugged in the Director's Cut? And 2. We seem to have gone backwards in AA quality. I remember playing Dead Space 3 and Battlefield 3 on a 720p TV, and the image quality was spotless in both with just post-process AA. DS3 used SMAA; not sure about BF, though. At least we've moved on from the hot garbage that was FXAA, where 90% of the time the raw image was better. (TAA isn't radically better, but to its credit it is "low-cost" and usually better than no AA at all.)

    • @1GTX1
      @1GTX1 Год назад

      AA is not the only problem; a lot of modern games run screen-space reflections, ambient occlusion, and other effects at half or quarter of native resolution. That can look OK at 1440p, but not below it; in some modern games like Ghostwire: Tokyo, reflections break at 1080p. In Cyberpunk 2077, Digital Foundry noticed that at 1440p you could lower reflections for better performance, but at 1080p you have to set reflections to at least High for them to look good. Metro Enhanced looks extremely blurry on my 1080p monitor, but with DLSS/DLDSR 2880x1620 Balanced it looks amazing for the same performance as native 1080p without DLSS.

  • @monsG165
    @monsG165 Год назад +18

    FSR always seems to have a shimmer going on. I have both setups (5900X + 3060 and 5800X + 6700 XT), and the FSR shimmer is really obvious in some games like RE4.

    • @leroyjenkins0736
      @leroyjenkins0736 Год назад +4

      I agree; in The Last of Us in the dark it's horrible.

    • @Eleganttf2
      @Eleganttf2 Год назад +4

      @satoyamazaki3659 bro what lmao omg your poor 1060 is literally cryin

    • @tuanld91
      @tuanld91 Год назад +6

      @@Eleganttf2 bro he just not a sheep like you are

    • @biansanity
      @biansanity Год назад

      @@Eleganttf2 bro stfu not everyone have daddy's money like you

    • @Eleganttf2
      @Eleganttf2 Год назад +1

      @@tuanld91 what ? i literally on sayin that 1060 is legit cryin especially hes usin it for 1440p, your assumption about me is on a whole another level tho and proves you are one of those people that just calls other "sheep" when in a disagreement

  • @sireveman
    @sireveman Год назад +1

    I would really appreciate this test with NO AA!
    The biggest takeaway is that if you are going to downgrade quality, FSR vs. DLSS makes no real difference.

  • @wynard
    @wynard Год назад +4

    This tech has gotten so good lately, the very small issues that can be seen if you really look for it do not bother me.

    • @fernandochapa1433
      @fernandochapa1433 Год назад +1

      Fr I can play with Dlss performance and I dont mind the “issues” because of the FPS gains

  • @vmafarah9473
    @vmafarah9473 Год назад +1

    For some people, an over-sharpened image with denoised textures and low flickering is better than an unsharpened image with normal textures and flickering that's only visible when zoomed in far. For me the latter is better; with ghosting reduced, the image looks a little bad at the edges.

  • @MichaelChan0308
    @MichaelChan0308 Год назад +10

    This is unreal... you guys are putting out informative, in-depth videos so frequently across Hardware Unboxed, HUB Clips and Monitors Unboxed that I have difficulty catching up and watching all of them!
    I highly suspect DLSS 4.0 video-generation trickery has been used =P

  • @ricky_pigeon
    @ricky_pigeon Год назад +1

    When he says "stability", are we talking about anti-aliasing flickering? It seems rather confusing to talk about stability, because the term is often associated with camera/display shaking.

  • @abhishek_k7
    @abhishek_k7 Год назад +15

    I could barely tell any difference in most cases lol. But that took a lot of effort, so massive respect! 🙌🏻

    • @mrbobgamingmemes9558
      @mrbobgamingmemes9558 Год назад

      True, unless I use something like the Performance setting on both FSR and DLSS.

    • @kilmor3151
      @kilmor3151 Год назад +3

      That's why videos on this are a plain moronic concept. You won't ever be able to tell in a video.

    • @DavidFregoli
      @DavidFregoli Год назад +1

      Hard to cross-reference the same area unless you pause; I'd rather see more stills than playback for fine detail, but playback also has its place to highlight image stability and shimmering. YouTube compression doesn't help either.

  • @Hybred
    @Hybred Год назад +2

    This test is weird, I don't understand it. Instead of creating two graphs, one for FSR vs. native and one for DLSS vs. native, you combine all three into one so that only DLSS is shown... why? What advantage does that serve us, the consumers of this video?
    For non-RTX users (GTX, AMD, Intel), how do they know in which games FSR, the only technology they can use, is better than native? That would be useful information. This test only benefits Nvidia RTX users, because you've left out useful results for everyone else. If FSR is ALSO better than native in a game, that's good information to know, rather than simply knowing which option is best.

    • @Hybred
      @Hybred Год назад +1

      Love the video just some feedback

  • @rickinielsen1
    @rickinielsen1 Год назад +13

    Interesting video. If anyone had asked me, I would never have thought any of the upscaling solutions would ever be better than native. But it does make sense for titles that haven't implemented proper high-end graphics features. It might even be a great way to visually overhaul older games, if you could simply tack DLSS/FSR onto them rather than having to update the core mechanics of the engine...

    • @konga382
      @konga382 Год назад +2

      "Better than native" presentations are made possible due to shortcomings in the anti-aliasing solutions most games use, and it doesn't have anything to do with the games' age or graphical feature support. And DLSS/FSR2 require quite a bit of work on the engine side to get a good implementation of since these are temporal upscalers. You need extensive motion vector support at the very least. You do not simply tack these features on. And with motion vectors, you can also do good-quality TAA, which would reduce the need for DLSS/FSR to begin with.

    • @FranzKafkaRockOpera
      @FranzKafkaRockOpera 1 year ago +1

      It makes sense if you think about modern anti-aliasing techniques like TAA, which are essentially DLSS without (usually) the upscaling element, i.e., they look at past frames to reconstruct a smoother image than the native raster. On the other hand, I think those would be tough to implement in older games without some quite noticeable artefacts: what makes DLSS good is that it can look at depth information, motion vectors, and so on in order to create a coherent image, and old engines just don't have the capacity to communicate that info.

  • @blazbohinc4964
    @blazbohinc4964 1 year ago +3

    What you ought to be testing instead of this (which has been done a billion times by everyone) is 4K DLSS performance vs native 1080p. This actually makes sense and shows how much more detail you can pull from a 1080p image. It would result in a much more interesting video. It would also be interesting to see how much performance it takes to upscale from 1080p to 4k compared to just rendering in native 1080p, and how much performance toll a certain GPU / GPU brand takes. Just a thought.

    • @johnhughes9766
      @johnhughes9766 1 year ago

      This

    • @lonerprofile
      @lonerprofile 1 year ago

      My test with a 6600 XT at 4K in Cyberpunk (fps):
      4K Ultra: 18.96
      4K FSR Quality: 24.14
      4K FSR Balanced: 30.04
      4K FSR Performance: 38.84
      1440p Ultra: 54.35
      1080p Ultra: 89.36
      The idea that 4K DLSS/FSR Quality = 1440p and Performance = 1080p is a marketing scam. 4K Performance lands between 1440p and 1080p, and 4K Quality is around 1850p. I'm talking about effective upscaled resolution here, not image quality.
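Taking the fps figures in the comment above at face value, one can compare an FSR mode against native rendering at its internal resolution; a quick sketch (the per-game cost of the upscale pass and of effects still run at output resolution explains the gap):

```python
# fps figures quoted in the comment above (6600 XT, Cyberpunk, 4K output).
fps = {
    "4K native": 18.96,
    "4K FSR Quality (1440p internal)": 24.14,
    "4K FSR Performance (1080p internal)": 38.84,
    "1440p native": 54.35,
    "1080p native": 89.36,
}

# 4K FSR Performance renders the same pixel count internally as native 1080p,
# yet reaches well under half of native 1080p's frame rate on this card.
ratio = fps["4K FSR Performance (1080p internal)"] / fps["1080p native"]
print(f"4K FSR Performance runs at {ratio:.0%} of native 1080p")
```

This illustrates the commenter's point: the mode labels describe internal render resolution, not a free performance equivalence with natively rendering at that resolution.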

  • @viniqf
    @viniqf 1 year ago +27

    I think DLSS does not look better than native because, objectively speaking, there is no way for an upscaled image to look better than the original image.
    What DLSS does, however, is replace bad TAA implementations with its own anti-aliasing method, which is vastly superior. So even though it may look marginally worse, it will be easier on the eyes due to the better anti-aliasing.
    If your GPU can run at native with no issues, I recommend either DLAA or DLDSR, if you have enough performance to spare.

  • @paulsef92
    @paulsef92 1 year ago

    Random question: why does the video quality on this channel always look so good? Uploaded in a different format?

  • @Bry.89
    @Bry.89 1 year ago +3

    This just makes me hope that AMD eventually releases a version of FSR that utilizes AI to enhance the quality of the image, similar to DLSS. Perhaps it might only work on newer AMD GPUs, and would otherwise revert back to normal FSR. But knowing AMD, it will also work on any GPU capable of utilizing tensor/ML cores. That would be the true "DLSS killer".

  • @slapnut892
    @slapnut892 2 months ago

    Since I have an RTX 4060 in my G14, I don't have the luxury of running modern titles maxed out at 1080p and up, but I have been seeing a noticeable improvement modding DLSS 2 titles with DLSS 3 at 1080p, which is where the differences in DLSS will be more prominent. So far I've tried it with Metro: Exodus and Hellblade: Senua's Sacrifice.
    I'm hoping it's not a placebo effect but would like to see a review on the matter to rule it out.

  • @HandsomeAlex25
    @HandsomeAlex25 1 year ago +3

    That's very interesting. I didn't expect DLSS or FSR to match and even beat the native rendering. I guess it's because TAA is not a great anti-aliasing technology... and hey, I'll take free performance with a better image any time! Thanks for the detailed look!

    • @robertcarhiboux3164
      @robertcarhiboux3164 1 year ago +5

      Native is better than TAA most of the time; TAA creates blurriness. I even prefer the jaggies to this technology.

    • @damara2268
      @damara2268 1 year ago +4

      @@robertcarhiboux3164 most people hate the jaggies, including me.
      It just looks so bad that I can't stop looking at the jaggies moving around over edges instead of enjoying the game

    • @dante19890
      @dante19890 1 year ago

      @@robertcarhiboux3164 You have to use native with TAA, or otherwise you get a very unstable image.

    • @robertcarhiboux3164
      @robertcarhiboux3164 1 year ago

      @@damara2268 I hate jaggies, but TAA even more. I prefer MSAA x4 but it's less and less available.

    • @robertcarhiboux3164
      @robertcarhiboux3164 1 year ago

      @@damara2268 TAA is blurry and even worse than jaggies. I prefer MSAA, but when the game does not have it I use DSR 4.0x when my GPU can handle it (with a minimum smoothing factor like 6%). I also very much like the Hitman resolution upscale. It's the only game in which it really improved the sharpness of the image with no particular downside. Even performance-wise it was better than DSR.

  • @jeanmartin7166
    @jeanmartin7166 1 year ago +2

    Thanks for that.
    Do you guys plan to do a recent AA comparison? (Including DLDSR)

  • @MorbidHunter
    @MorbidHunter 1 year ago +2

    I always noticed that DLDSR + DLSS quality looked way better than native 1440p while performing better or the same

  • @lordcola-3324
    @lordcola-3324 1 year ago +2

    I feel like graphics technologies are going in completely the wrong direction. I don't want upscaling. I want to do downsampling.

    • @lonerprofile
      @lonerprofile 1 year ago

      The in-game 8K-to-4K downscale in YS9 is so awesome.

  • @arshiatn2895
    @arshiatn2895 1 year ago +15

    I use 4k + DLDSR 1.78x with DLSS P. It runs better than native and worse than 4k DLSS Q but looks miles better than DLSS Q and better than native.
    Using a 4090 so VRAM is no issue
    Edit:
    So it renders in 1440p (dlss p) -> 5k (dldsr) -> 4k
    Instead of 1440p (dlss q) -> 4k
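The DLDSR + DLSS pipeline described above can be sketched numerically. The assumptions: the DLDSR factor multiplies total pixel count (so per-axis it is the square root, and 1.78x is (4/3)² per this comment's numbers), and DLSS Performance halves each axis:

```python
import math

def dldsr_dlss_pipeline(mon_w, mon_h, dldsr_factor, dlss_scale):
    """Return (internal render res, DLDSR supersample target) for a monitor.

    dldsr_factor multiplies the total pixel count; dlss_scale is the
    per-axis DLSS render factor (0.5 for Performance mode).
    """
    axis = math.sqrt(dldsr_factor)                        # per-axis DLDSR factor
    target = (round(mon_w * axis), round(mon_h * axis))   # DLDSR target res
    render = (round(target[0] * dlss_scale), round(target[1] * dlss_scale))
    return render, target

# 4K monitor, DLDSR 1.78x (= (4/3)^2), DLSS Performance:
render, target = dldsr_dlss_pipeline(3840, 2160, (4 / 3) ** 2, 0.5)
print(render, "->", target, "-> (3840, 2160)")
```

This reproduces the chain in the comment: a 1440p internal render, upscaled by DLSS to a 5K (5120x2880) DLDSR target, then downsampled to the 4K display.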

    • @atnthekoala
      @atnthekoala 1 year ago

      Yeah. I test games with DLDSR + DLSS myself and they look much better than 4K DLSS Quality.

    • @GewelReal
      @GewelReal 1 year ago

      Wait, there is a DSR version that uses DLSS? (I'm still on that GTX grind)

    • @arshiatn2895
      @arshiatn2895 1 year ago

      @@GewelReal Yeah, there is, and DLDSR 1.78x looks almost like DSR 4x in 4K, plus it runs so much better.
      (DLDSR = Deep Learning DSR) Really cool tech 😃

    • @hideff4982
      @hideff4982 1 year ago +1

      DLDSR is the truth. In combination with DLSS it ALWAYS looks better than native 4K.

    • @clem9808
      @clem9808 1 year ago +2

      I can't really tell the difference, but on my 1080p monitor I prefer using 1620p + DLSS Quality (DLDSR) over 4K + DLSS Performance (DSR), both ending up at 1080p.

  • @rtyzxc
    @rtyzxc 1 year ago +1

    There is a reason why VR games use MSAA: it simply provides far superior fine detail compared to any of the temporal AAs. Also, while MSAA is expensive, it's NOT the same thing as super-resolution and is not as expensive. Any time I've accidentally turned on TAA in a VR game, it made the game terribly blurry. In motion, TAA and DLSS take a noticeable blur penalty, whereas native doesn't change. MSAA further adds detail and reduces aliasing. I wonder if we can find a game with both MSAA and DLSS, because MSAA is the actual final boss of clarity.

  • @bodieh1261
    @bodieh1261 1 year ago +3

    First off I just wanna say I really appreciate the effort you guys put into these videos to provide us with such in depth analysis and information on topics.
    That said, in this instance I think a more useful and widely applicable analysis would be to compare the image quality of native 1080p or 1440p to that of higher but upscaled resolutions, at quality settings where the upscalers render at or close to 1080p or 1440p respectively.
    For example:
    - Native 1080p vs 4K Upscaled on Performance (1080p render res)
    - Native 1440p vs 4K Upscaled on Quality (1440p render res)
    - 1440p Upscaled on Performance vs 1080p Upscaled on Quality (720p render res)
    I think these comparisons could be more useful because most gamers have hardware that actually allows them to use those setting combinations at playable framerates, whereas very few actually have the hardware that affords them the luxury of "choosing" between native 4K and upscaled 4K. DLSS/FSR were always meant to be tools to boost performance at a given res (or enable playable fps at higher res) while minimizing losses in image quality, more so than tools to outright boost image quality itself.
    Personally, from my own sample size of 1, I have found that running games like MW2, Cyberpunk and Hogwarts at 2160p Performance or even 1620p Quality (also ~1080p render res) actually produces a more stable and sometimes better looking image than native 1080p, particularly in Cyberpunk. This makes the ~5-8% fps hit of running 2160p Performance mode worth it over native 1080p. It would be nice to see what you guys think about this and whether others experience the same thing.
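The render-resolution equivalences listed in this comment check out against the standard per-axis scale factors (Quality ≈ 2/3, Performance = 1/2; DLSS's Balanced mode uses a slightly different factor of about 0.58). A quick verification sketch:

```python
# Per-axis scale factors for the standard DLSS/FSR 2 quality modes.
QUALITY, PERFORMANCE = 2 / 3, 1 / 2

def render_res(out_w, out_h, scale):
    # Each axis of the internal resolution is the output axis times the factor.
    return round(out_w * scale), round(out_h * scale)

# 4K Performance renders at 1080p:
assert render_res(3840, 2160, PERFORMANCE) == (1920, 1080)
# 4K Quality renders at 1440p:
assert render_res(3840, 2160, QUALITY) == (2560, 1440)
# Both 1440p Performance and 1080p Quality render at 720p:
assert render_res(2560, 1440, PERFORMANCE) == render_res(1920, 1080, QUALITY) == (1280, 720)
```

So each proposed comparison really does pit a native resolution against an upscaler fed the same internal pixel count.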

    • @KarlDag
      @KarlDag 1 year ago

      Absolutely! Originally DLSS was there to take you to 4K with a GPU that wasn't powerful enough to render native 4K. Now if you're running a 4090, sure you can choose native vs DLSS, but that's a far cry from most people's need. Native 4K doesn't perform nearly the same as DLSS 4k.

  • @Hyperdriveuk
    @Hyperdriveuk 1 year ago +1

    Great vid as always. There's also this strange micro stutter and ghosting with DLSS; my eyes can't unsee it, especially when it's scrolling past something, it's not smooth. 11:39 In Hogwarts Legacy, the freaking coat judders, and all the small things that move around, like birds, have massive ghosting. 12:44 judder judder judder. 13:35 rain ghosting, the rain is literally ghosting across the screen.

  • @robblack8754
    @robblack8754 1 year ago +3

    I play at native 4K and I have yet to see a game where DLSS was better than native, outside of some games where the anti-aliasing was better but the clarity and detail were still inferior to varying degrees. That makes this a DLAA vs native (usually TAA) anti-aliasing thing, not DLSS vs native.
    I'll add that I game on a 65" OLED TV, so the "shortcomings" of DLSS in this area are more apparent than for those who only game on monitors in the low-30" or under range. I'll also add that I think DLSS is impressive for what it is, but native is going to beat DLSS Quality in detail and clarity.

    • @NamNguyen-up5xi
      @NamNguyen-up5xi 1 year ago

      Yeah, this comparison is more about AA than actual image quality.
      Hogwarts Legacy turned on DLSS by default and I thought someone was f'ing around with me. Color gradients and textures were really bad.

  • @SirJohnsonP
    @SirJohnsonP 1 year ago +1

    This is a software uplift and it should be free when you buy their GPU, but they upsell it with this ludicrous GPU pricing… if people continue buying their story, I expect them to sell it as a monthly service.

  • @TheActualChosenOne
    @TheActualChosenOne 1 year ago +5

    Could you test ultrawide FSR/DLSS in the future? For some reason, even the quality mode produces a way worse image than native on 3440x1440.

  • @ivaniveciqtests6085
    @ivaniveciqtests6085 1 year ago +1

    You really are a great channel. But who told you that people do not play at 1080p any more?

  • @andreigreu
    @andreigreu 1 year ago +6

    11:15 The falling leaves with DLSS leave a long trail behind them, and FSR creates a distortion effect but no trail.

    • @jinx20001
      @jinx20001 1 year ago +4

      What's your point? Pause at 14:36 and you will see DLSS is much, much better than native or FSR. Like the whole video tries to explain to you, every game is different and all of them have their ups and downs in image quality, but FSR is not in the discussion, it's trash; finding one example is desperation lol

    • @andreigreu
      @andreigreu 1 year ago +1

      @@jinx20001 No point, just wanted to show some difference, no need to be aggressive about it :)

    • @A.Froster
      @A.Froster 1 year ago +1

      @@jinx20001 Oh boy, somebody is angry they touched their favorite upscaling solution 😂

    • @jinx20001
      @jinx20001 1 year ago +3

      @@A.Froster Bro, I don't need any upscaling; bet you can guess why.

  • @Akkbar21
    @Akkbar21 1 year ago +1

    It's pretty obvious that we see way more ghosting with DLSS than FSR in these examples. I can't say I've noticed any "jump off the screen" problems with FSR yet (commented at the 11:26 mark) as I watch.

  • @crashtestdummy87
    @crashtestdummy87 1 year ago +3

    Watching this 4K comparison video at 720p YouTube quality on a 1080p screen; GG me and probably 90% of viewers.

  • @rippenzong2455
    @rippenzong2455 1 year ago +2

    This was very interesting. See, I play mostly at 1440 because I like high fps, and I've found I really like the sharpness that DLSS or even FSR can give. Games like Hogwarts and RDR2 and a couple of others I'm not really thinking of, IMO, just look better at 1440 than with TAA.

  • @xxovereyexx5019
    @xxovereyexx5019 1 year ago +3

    Native is GIGACHAD, no downscaling/upscaling, pure raw performance ftw!

  • @xblur17
    @xblur17 1 year ago +1

    DLSS 2.5.1 has been available for many months now, why not just use that for testing? DLSS Swapper tool allows you to do just that, it would streamline your testing. You're using like 1-2 year old versions of DLSS in half of these tests, and they're all different...

    • @jeromenancyfr
      @jeromenancyfr 1 year ago

      I don't understand how having to change additional files streamlines testing?

  • @David_Raab
    @David_Raab 1 year ago +2

    I would assume that a native 4K image is better than an upscaled 4K image. The question is how much better (or not) 4K Quality looks compared to native 1440p, because that is what 4K Quality will render at internally. Or 4K Performance vs 1080p native. So do you gain a better image at 4K even in Performance mode, compared to staying at native 1080p or native 1440p?

    • @cheese186
      @cheese186 1 year ago +4

      4K Quality looks a lot better than 1440p on a 4K screen, it's not even close.

    • @WhiteCrowInteractive
      @WhiteCrowInteractive 1 year ago +2

      In my experience, 4k quality always gives way better resolution of details compared to native 1440p. That is the best use case of dlss, getting pretty close to 4k visuals with 1440p performance cost. I wouldn't expect 4k dlss quality to be better than native 4k unless native has a bad aa technique in use.

    • @prman9984
      @prman9984 1 year ago

      @@WhiteCrowInteractive Which is why I always turn off AA tech at 4K. At least for me it isn't necessary and is like putting Vaseline on your camera's lens.

    • @lonerprofile
      @lonerprofile 1 year ago +1

      You can't compare 4K DLSS Quality to native 1440p, as you will get significantly fewer fps. To my eyes, 4K DLSS Balanced's effective resolution is about native 1440p. Native 1440p will give you more fps than 4K Performance WITH a better image. Likewise, 4K Performance is not equivalent to upscaled 1080p; those labels are a marketing scam. At least from what I've tested in Cyberpunk 2077.

    • @lonerprofile
      @lonerprofile 1 year ago

      @@WhiteCrowInteractive Because 4K Quality, in practice, doesn't look like an upscale from 1440p; its effective resolution is maybe around 1850p, and Ultra Quality about 2000p. The labels are a marketing scam. Looking at the actual fps you get from 4K DLSS Quality versus native 1440p will give you the answer.

  • @clifflenoir4323
    @clifflenoir4323 1 year ago +2

    I have been running older games like Death Stranding at 8K resolution with DLSS Performance mode (50% source resolution, i.e., 4K) in order to get DLAA where it is not yet supported. It looks a lot smoother and more like a pre-rendered film than native 4K.
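The trick above works because 8K output in Performance mode (0.5 per axis) renders internally at exactly 4K, so on a 4K display there is no upscaling at all, just DLSS's temporal accumulation, which is effectively DLAA. A minimal check:

```python
# 8K output with DLSS Performance: each axis is halved.
out_8k = (7680, 4320)
internal = (out_8k[0] // 2, out_8k[1] // 2)

# Internal render resolution equals the native 4K display resolution,
# so DLSS only anti-aliases; it never has to invent extra pixels.
assert internal == (3840, 2160)
```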

    • @4riel
      @4riel 1 year ago +2

      Does it introduce ghosting?

    • @clifflenoir4323
      @clifflenoir4323 1 year ago

      @Ariel I haven't noticed any while driving down the motorways I built.

  • @kameron2768
    @kameron2768 1 year ago +4

    Personally, I quite like using 4k performance mode given that I play on a 4k 120hz display with a 3080. The image is definitely not as good as native rendering, but the performance gains are huge and very often worth it for games that are difficult to run imo. The times I avoid using dlss are usually when it produces heavy ghosting, but that's thankfully not so common these days.

    • @eternalbeing3339
      @eternalbeing3339 1 year ago +1

      Performance mode looks really bad on my 4K 120Hz LG B9 OLED. You notice it when looking at distant objects; things become blurry far away.

  • @fizzleman
    @fizzleman 1 year ago +1

    I wonder where 4K Balanced might fit in, and whether it's even worthwhile vs Performance, based on this video, for cards that might struggle with VRAM at 4K like the 3080.

  • @yarg2015
    @yarg2015 1 year ago +15

    Have you ever thought to test whether FSR looks better on an AMD GPU vs an Nvidia card?
    AMD could be building things into their cards to make FSR better, just like I am sure DLSS is optimized for Nvidia.

  • @blackfieId
    @blackfieId 1 year ago +2

    Cool video! I've been wondering about this for some time, so it was useful to me. I also noticed that DS was far better looking with DLSS enabled. The aliasing at native was atrocious, but with DLSS it looks fantastic. HU, please do this grading for all games in existence so we can just look up the game we're gonna play in a database and set our game accordingly.

    • @gozutheDJ
      @gozutheDJ 1 year ago

      Or you could take the two seconds to check the game yourself. Why are you so lazy?

  • @gustavoalmeida1579
    @gustavoalmeida1579 1 year ago +5

    I think developers are not updating the DLSS or FSR versions in their games because they don't want to spend time testing the final result; it's not a given that updating the upscaler will always make the visuals better or keep the artistic vision consistent.

    • @corylyon545
      @corylyon545 1 year ago +2

      Most publishers are developing a track record of abandoning titles unless they're a recurring revenue stream.

  • @zsreich
    @zsreich 1 year ago +2

    It's true in some games that do not have an anti-aliasing solution. For example, Nioh 2 on PC looks better because the game has no anti-aliasing and needs DLSS to clean up the image.