Is DLSS Really "Better Than Native"? - 24 Game Comparison, DLSS 2 vs FSR 2 vs Native

  • Published: 25 Jan 2025

Comments • 1.4K

  • @vernacular3289
    @vernacular3289 1 year ago +1043

    This is really a test of how bad anti-aliasing has become in games. So bad that resolution scaling becomes the only means of avoiding shimmering jaggies.

    • @quantumsage4008
      @quantumsage4008 1 year ago +51

      I haven't played that many modern games recently, but good lord, War Thunder has a really bad problem with anti-aliasing. It's so bad that SSAA is mandatory. Even with that, there's so much noise you'll have a real hard time spotting enemies. TAA helps too, but certain textures still blink, though it's not as bad as it was previously. You have to sacrifice so much performance.

    • @CC-vv2ne
      @CC-vv2ne 1 year ago +195

      @@quantumsage4008 TAA is the worst thing to happen to games; every single TAA implementation makes the image noticeably blurry.

    • @DanielOfRussia
      @DanielOfRussia 1 year ago +13

      @@CC-vv2ne FH5 is the one example I can think of where TAA looks way better than even MSAA. But it is more of an outlier

    • @aerosw1ft
      @aerosw1ft 1 year ago +35

      RDR2 is a perfect example of that. You're stuck with zero good options if you wanna play at 1080p. FXAA is shimmering shit, TAA made me wanna puke from the blurriness, and MSAA cuts your fps by 20-30. DLSS looks bad at 1080p, but it's sadly the least blurry/shimmery of the bunch.

    • @winebartender6653
      @winebartender6653 1 year ago +8

      @@jeanmartin7166 At 1440p (on a G9, so 27"), I found TAA Low with sharpening around 40% or so was the best visually without trading away much performance, about on par with MSAA 2x (if ever so slightly blurrier, though not noticeable in motion).
      I feel like screen size (rather, PPI) plays a role in all of this as well. Lower PPI will exaggerate blurriness to a larger degree than, say, flicker or shimmering.
      But yeah, really, it's just shit AA in games that is frustrating at the end of the day. There is always traditional supersampling, and if there is a particular game you spend a lot of time with, it is worth messing with NVCP's available AA options if DLSS isn't all that great. For instance, I find Hogwarts Legacy on DLSS quality mode looks quite bad, especially indoors (the image becomes very soft and blurs near TAA levels), which is counter to what HU noted.

  • @zenpool
    @zenpool 1 year ago +570

    This gets interesting when some game devs enable DLSS as an antialiasing solution (DLAA) while retaining the native render resolution.

    • @LuchsYT
      @LuchsYT 1 year ago +37

      The Finals enabled DLAA for its beta and it looked incredible

    • @lexsanderz
      @lexsanderz 1 year ago +47

      Don't wait for developers to learn the NVIDIA SDK. Instead you can use the DLSSTweaks project to enable DLAA in any game that uses the NVIDIA DLL.

    • @winebartender6653
      @winebartender6653 1 year ago +4

      Yup, DLAA looks really, really good at native (what I use with Hogwarts Legacy) with a 27" 1440p monitor. I found DLSS Quality looks quite bad in Hogwarts, noticeably softer and blurrier, especially indoors.
      I did not realize it was possible to enable it on your own for any game with DLSS, will have to try that in other games.

    • @SweatyFeetGirl
      @SweatyFeetGirl 1 year ago +9

      Dlaa looks absolutely horrible to me...

    • @znoozaros
      @znoozaros 1 year ago +13

      @@SweatyFeetGirl Too soft on lower resolutions, unlike msaa which works great

  • @noliferxd
    @noliferxd 1 year ago +339

    Guys, I'm shocked by the scale of the work done. Thanks a lot. I would also like to see a comparison of DLAA with other AA methods at native resolution.

    • @bluesharpie9744
      @bluesharpie9744 1 year ago +6

      DLDSR 1.78x + DLSS Quality (1.78 × 0.444 ≈ 0.79, about 80% of the native pixel count) is a close contender to DLAA with similar performance, in case DLAA is not available (although there now seem to be mods to hack it in).
      In my experience DLAA tends to ghost quite a bit around edges.
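The pixel arithmetic in the comment above can be checked with a minimal sketch. It assumes the commonly cited per-axis DLSS render scales (Quality is 2/3 per axis, so ~0.444 of the pixels), and treats DLDSR factors as total-pixel multipliers, which is how NVIDIA labels them:

```python
# Sketch: effective internal pixel budget when stacking DLDSR and DLSS.
# DLDSR factors (1.78x, 2.25x) multiply the total pixel count of the output
# target; DLSS modes scale each axis, so the pixel fraction is the axis
# scale squared. The per-axis scales below are assumptions based on
# commonly cited values, not pulled from the video.

DLSS_AXIS_SCALE = {
    "Quality": 2 / 3,          # ~0.444 of the pixels
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def effective_pixel_fraction(dldsr_factor: float, dlss_mode: str) -> float:
    """Internal render pixels relative to the monitor's native pixel count."""
    return dldsr_factor * DLSS_AXIS_SCALE[dlss_mode] ** 2

# DLDSR 1.78x + DLSS Quality: 1.78 * (2/3)^2 ~= 0.79 of native pixels
frac = effective_pixel_fraction(1.78, "Quality")
print(f"{frac:.2f} of native pixels")  # prints "0.79 of native pixels"
```

This matches the "about 80% of native pixels" figure the commenter gives, which is why the combination lands so close to DLAA in both cost and quality.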

    • @teddyholiday8038
      @teddyholiday8038 1 year ago +4

      DLAA will always be better

    • @steel5897
      @steel5897 2 months ago

      Old comment I know, but DLAA is literally just DLSS Quality, except a little bit better under a microscope. It's technically the best image quality possible, but I think most people would argue it's not worth the FPS loss compared to Quality (unless you're at 1080p, as at that output even Quality dips too low in terms of internal pixels).
      Oh, and the ghosting issue was fixed with Preset F in recent versions. There was a period between 2.5.1 and 3.6 (iirc) where the default preset caused ghosting on small objects, but they have since fixed it. You can use DLSSTweaks to force a preset in any game.

  • @Maddin-6616
    @Maddin-6616 1 year ago +208

    I will never understand why the DLSS or FSR files are not updated... there are really big differences in quality, and it's not a big job for the devs, but the payoff is all the greater.

    • @NostalgicMem0ries
      @NostalgicMem0ries 1 year ago

      What do you mean by updated? To the latest versions, or implemented in other games?

    • @DavidFregoli
      @DavidFregoli 1 year ago +70

      You can't just boot the game and test a couple of scenes to verify it works correctly; there might be a specific area or shader that doesn't play well with it. It's not that straightforward.

    • @muchvibrant
      @muchvibrant 1 year ago +18

      @@DavidFregoli I agree, but I personally use the latest DLSS version with games that support DLSS but are stuck on an older version (e.g. RDR2), and the game looks like native with more fps. DLSS 2.5.1 or any 3.x.x looks so good.

    • @lexsanderz
      @lexsanderz 1 year ago +31

      As a developer, it comes down to: there is no ticket for that. Anyway, the NVIDIA SDK states that the DLL is drop-in replaceable, and that is a major feature of their SDK. There's even an update function you can call, but nobody ever does.

    • @muchvibrant
      @muchvibrant 1 year ago +27

      @@TechTusiast Also, don't get confused: the NVIDIA DLSS DLL 3.1.11 is not the DLSS 3 we know. It's still DLSS 2, just with version number 3.1.11.

  • @ryomario90
    @ryomario90 1 year ago +79

    You have to disable and then re-enable AA in Death Stranding Director's Cut each time you launch the game, otherwise it runs without AA, even though the graphics menu says it's on. The standard Death Stranding didn't have this issue.

    • @Keivz
      @Keivz 1 year ago +7

      Wow, great observation. Doesn’t change the overall result but the devil is in the details.

    •  1 year ago +2

      Who fucking plays walking simulator!!

    • @B.E.3.R
      @B.E.3.R 1 year ago +3

      I knew something was wrong with the results. I played the standard edition of DS a few months ago and thought the TAA implementation was actually good. And I played at 1080P!

    • @starcultiniser
      @starcultiniser 1 year ago +2

      @ people without legs?

    • @ShadeKill
      @ShadeKill 1 year ago

      @@starcultiniser Imagining it is its own journey.

  • @kalark
    @kalark 1 year ago +74

    I honestly just assumed going into this video that Native > DLSS and FSR in terms of quality, so it was pretty interesting to see how it resolved some issues with some games like DS. Great video!

    • @dante19890
      @dante19890 1 year ago

      That's not always the case; sometimes DLSS 2 will have better image integrity than native.

    • @dylan.dog1
      @dylan.dog1 1 year ago +1

      In fact it's true; from this video you can see the only thing that FSR and DLSS do better is anti-aliasing.

    • @uhurunuru6609
      @uhurunuru6609 1 year ago +1

      Well, you assumed correctly, because Tim makes a common mistake: Native > Upscaling (ANY), but DLAA > AA (ANY), because all AA is trying to imitate DOWNscaling.
      Rendering at 8K and displaying at 4K gives you the best AA you can get, at the cost of a huge performance hit, and every AA method ever invented tries to mimic this taxing approach without the big FPS hit.
      DLAA trains the AI on downscaling to do better AA without upscaling the render. So all this video shows is what we already know: DLAA is better than any other AA.
      This is the worst video Tim has ever made, especially as we can now force DLAA in any game that has DLSS support (how? See my other posts in these comments).

  • @cromefire_
    @cromefire_ 1 year ago +99

    Instead of (or in addition to) further optimizing DLSS and FSR image quality, maybe developers should just spend more time properly fixing the native rendering in many cases... Basically every single time DLSS was better, it wasn't because DLSS was so good, but because native TAA was so bad. We've seen that before with things like AMD's CAS, which seemed to just work a lot better than whatever developers built themselves...

    • @cromefire_
      @cromefire_ 1 year ago +7

      @@jkahgdkjhafgsd it does absolutely not, not even on a 4K monitor

    • @cromefire_
      @cromefire_ 1 year ago +4

      @@jkahgdkjhafgsd Even on a 140 PPI monitor, jagged edges look worse than anything. I'd rather take bad TAA that flickers a lot than have literally everything look absolutely awful... If you want something good, look at something like SOTTR's SMAA + TAA implementations. They are expensive, but they work wonders: not a lot of flickering or other artifacts, just plain solid.

    • @normaalewoon6740
      @normaalewoon6740 1 year ago +1

      No anti-aliasing is almost perfect during movement; it absolutely slams the clarity of even DLAA. Aliasing patterns are a problem because the pixels are only rasterized at their centers, which is pretty wasteful. It's better to rasterize at random locations inside the pixels, which also randomizes the aliasing patterns and makes them a lot smoother in real time. To improve clarity even further, foveated supersampling/downsampling is needed, on top of backlight strobing/black frame insertion and eye-movement-compensated motion blur, which make 100 Hz look and feel like 1000 Hz in a non-destructive way.

    • @coaltrain2299
      @coaltrain2299 1 year ago

      @@jkahgdkjhafgsd dlaa

  • @JohnThunder
    @JohnThunder 1 year ago +127

    I noticed that some DLSS versions give you heavy ghosting. If I notice this in a newly released game, I replace the DLSS file with the 2.5.0 or 2.5.1 version.

    • @Napoleonic_S
      @Napoleonic_S 1 year ago +3

      Where do you get those DLSS files?

    • @conyo985
      @conyo985 1 year ago +2

      I like those versions too! They give the sharpest image and the least ghosting.

    • @jemborg
      @jemborg 1 year ago +5

      Latest is 3.1.11, why not use that?

    • @mylliah7648
      @mylliah7648 1 year ago +11

      I use the 3.1.11 DLL with DLSSTweaks to change the preset to C (the preset used in 2.5.1). So it's even better than 2.5.1!

    • @Shendue
      @Shendue 1 year ago +29

      @@jemborg Precisely for the reason stated, duh. "Some DLSS versions give you heavy ghosting"

  • @jayjay00agnt
    @jayjay00agnt 1 year ago +139

    Awesome information. This channel has been especially hitting it out of the park with valuable and hard to find information. I know these videos are a ton of work, and I'm hoping others can appreciate and find this content as helpful and useful as I do.

    • @drek9k2
      @drek9k2 1 year ago

      Yeah, for real, and these guys are truly independent and not willing to shine a turd. It'd be funny watching them see a GTX 1630 in action for the cost. It confuses me because I think they're pretty much fair, and they rightfully slammed the 6500 XT as a garbage product, yet lots of butthurt fanboys had to cope about what seemed like blatantly obvious common sense to me: 8GB was not enough. I'm confused by TechPowerUp though; it says a 6500 XT performs the same as a 5500 XT? Why, really? Funny, because AMD was mostly the best brand last gen, except that one stinking turd. I automatically expect Gamers Nexus and Hardware Unboxed to be utterly unswayed by corporate bullshit artists.

    • @vmafarah9473
      @vmafarah9473 1 year ago +1

      Meanwhile Linus Tech Tips uses 1/20th of the effort to make a video, but includes some handheld camera movements, some GPU dropping, some sponsor spots, some jokes.

  • @c.chepre8452
    @c.chepre8452 1 year ago +90

    Would love to see DLDSR together with DLSS added at some point in the future. I often use both together at 1440p and it further reduces flickering/aliasing.

    • @8Paul7
      @8Paul7 1 year ago +19

      Yeah I play on 1080p plasma and the picture is just perfect when using DLDSR and DLSS quality combo.

    • @Dionyzos
      @Dionyzos 1 year ago +13

      I love DLDSR even at 4K. It performs even better than native and looks significantly better in most titles.

    • @Lollakuk
      @Lollakuk 1 year ago +3

      @@8Paul7 so you DLDSR it to 4K then do DLSS performance on it or?

    • @impc8265
      @impc8265 1 year ago +4

      @@Lollakuk You can't DLDSR to 4K on a 1080p screen, but yes, the idea is correct.

    • @Lollakuk
      @Lollakuk 1 year ago +5

      @@impc8265 You definitely can. I have DSR'd to 2160p on my 1080p monitor.

  • @lhv2k
    @lhv2k 1 year ago +22

    I think you should consider testing more games with a manually updated DLL for the latest version of DLSS. As you said, just updating its version can change the result from worse than native to matching or even beating it, so I wonder if this is a trend across the board. And since this is a comparison of DLSS vs native, I think it's fair to use the best and latest version, since updating can be done manually by users for free, without depending on game developers. If the results are indeed positive across the board, then it's truly game-changing technology: a free, stable, and noticeable performance boost that might even improve visuals. Simply a much better solution than overclocking, since it doesn't depend on the silicon lottery.

  • @Akkbar21
    @Akkbar21 1 year ago +6

    6:40 Ummm, what is happening when it pans down the Spider-Man brick wall with DLSS? It is hitching badly and looks so glitchy. If I saw that in game, I would immediately turn off whatever was causing it (so, DLSS). Why wasn't this incredibly obvious artifact mentioned? Am I confused here and missed you addressing it? Thx

  • @glittlehoss
    @glittlehoss 1 year ago +27

    Unbelievable content. You need to continue showcasing relative performance between DLSS, FSR, and native twice per year to keep up with the updates and new games coming out.

  • @awphobic
    @awphobic 1 year ago +37

    Great video!
    People need to remember that DLSS / FSR / XeSS ADD performance with much better FPS, which means better frametimes and input lag, leading to smoother gameplay. When you can only pull 30-40 fps at native 4K/1440p/1080p in the games you play, and turning on DLSS gets you much better fps or even a stable 60, that should be a no-brainer based on this 24-game comparison. With DLSS edging closer to native in Quality mode, and FSR and XeSS catching up, it just means better for consumers. The real problem, just like the video points out, is that new releases (Hogwarts, Dead Space, Death Stranding, The Last of Us) have much worse anti-aliasing implemented at NATIVE. This feels like the norm now and needs more attention: developers making unoptimized games, like The Last of Us pulling 14GB of VRAM just LOOKING AT A WALL.

    • @DavidFregoli
      @DavidFregoli 1 year ago +2

      I'd say it's a no-brainer on "Tie" and "DLSS wins" titles. On "Native+" it's the best tradeoff to get some performance, while on "Native++" and "Native+++" games you activate it only if you absolutely need it, and potentially test lowering the quality settings first.

    • @flintfrommother3gaming
      @flintfrommother3gaming 1 year ago +1

      VRAM is reserved, it doesn't matter if you look at a wall or look at a very filled place

    • @Jakiyyyyy
      @Jakiyyyyy 1 year ago +3

      VRAM is already loaded as soon as you start the game; it doesn't matter if you're just staring at a wall or having intense gunfights.

    • @naturesown4489
      @naturesown4489 1 year ago +4

      You should see native without TAA: you'll see jaggies everywhere and there will be shimmering on all the edges. There's a reason games use TAA...

  • @ElmorQuistin
    @ElmorQuistin 18 days ago +4

    Will you be doing the same concept with the new enhanced DLSS Super Resolution that uses the Transformer model? It would be such a good way to see the evolution of DLSS and whether it has been worth using after all these years of development.

  • @julfy_god
    @julfy_god 1 year ago +19

    24 games, that's a lot of work) great job, Tim!

  • @chrisvicera6696
    @chrisvicera6696 1 year ago +16

    In some games I honestly prefer 4K native without any temporal AA or similar techniques. FSR, DLSS, and TAA, when poorly implemented, show lots of ghosting and shimmering, which is super annoying.

    • @prman9984
      @prman9984 1 year ago +3

      In most games.

    • @DavidFregoli
      @DavidFregoli 1 year ago

      yes that's the point of the video

    • @chrisvicera6696
      @chrisvicera6696 1 year ago +4

      @@DavidFregoli No, he compares native with the best available AA, which is typically TAA, if you actually listened to the video lmao

    • @DadGamingLtd
      @DadGamingLtd 1 year ago +11

      It bothers me that TAA has become the go-to AA technique. I realize it takes a lot less power than SMAA, but it consistently looks like ass. Luckily, you can turn it off in most games.

    • @Edinburghdreams
      @Edinburghdreams 1 year ago +8

      Glad to see people are waking up to TAA looking like garbage.

  • @gunnarthegumbootguy7909
    @gunnarthegumbootguy7909 1 year ago +4

    DLSS has a wonderful effect on surfaces with shimmering lighting/shading or shimmering/bubbling screen-space reflections, which are pretty common in games of the last 3 years. My guess is that its temporal component smooths it out. In some titles it's very obvious, like Cyberpunk 2077, where this kind of shimmering appears on a lot of surfaces. I guess theoretically a good TAA implementation would be able to do the same, but it would have to understand what kinds of objects to apply it to, or it would cause ghosting... I guess this is where DLSS comes in, since the machine learning can work out which parts should reasonably not be shimmering... and modern 3.1 DLSS has very little noticeable ghosting anyway.
    Also, DLSSTweaks is a fabulous tool, and it now even comes with a UI program for people who might feel intimidated by editing the text files directly.
    With it, you can set your own internal resolutions... I've been playing around with it a lot. Most games can run at really weird internal resolutions, like rendering at 3:2 3240x2160, which I've tried for 4K, so it is DLAA vertically and DLSS horizontally. Useful in games where you need just a bit more performance but aren't satisfied with the Quality option, or where you want something in between the available settings. I hope in the future this can be baked in as a slider in games, so that instead of a handful of options (Quality, Balanced, Performance, Ultra Performance) you get a slider from full 1x internal resolution (DLAA) down to even below the Ultra Performance (0.333x) level. I've had great use of this tool to max out, for example, The Last of Us Part 1: I can set a resolution just high enough to look very close to native (so above the DLSS Quality setting, which is 0.667x) while still fitting within my VRAM limits for a given set of settings, which I couldn't do at native. DLSS really saves the day. I've tried FSR and XeSS to see if they are any better in any game, but I haven't yet found a case where they look better. Having moved to a 4K TV for gaming and content viewing recently, I wouldn't even be able to run a lot of the games I like at 4K without DLSS; it's really the main thing keeping me from considering AMD cards if I were shopping for a new GPU now.
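The custom-ratio idea in the comment above is just per-axis arithmetic, and can be sketched as follows. The function name is mine, and the standard per-axis ratios (Quality 2/3, Ultra Performance 1/3) are commonly cited values, not taken from the video:

```python
# Sketch: internal render resolution for a given per-axis DLSS scale factor.
# Standard modes use fixed per-axis ratios; a custom ratio (e.g. set via a
# tool like DLSSTweaks) can land anywhere between Ultra Performance (1/3)
# and DLAA (1.0).

def internal_resolution(out_w: int, out_h: int, axis_scale: float) -> tuple[int, int]:
    """Internal (pre-upscale) resolution for a given output and axis scale."""
    return round(out_w * axis_scale), round(out_h * axis_scale)

# 4K output with DLSS Quality (2/3 per axis):
print(internal_resolution(3840, 2160, 2 / 3))  # prints (2560, 1440)

# The commenter's mixed setup renders 3240x2160 at a 4K output: full height
# (DLAA vertically), reduced width, i.e. an implied 0.84375 horizontal scale.
print(3240 / 3840)  # prints 0.84375
```

A continuous scale like this is exactly the "slider" the commenter wishes games exposed, instead of four fixed presets.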

  • @krystiano.610
    @krystiano.610 1 year ago +19

    22:10 While in theory automatic updates sound good, as a developer, in practice we never want to release an untested update.
    In a normal development cycle (and in most cases), after releasing version 1.0, active development stops and we move to a support phase where we do not add any new features and focus only on fixing bugs and glitches. Most of the developers move on to other projects (which may even be DLC for that game, but that also counts as a different project), and since there are fewer people, you can't just constantly update such an important library as DLSS (important since it directly affects the visuals of the game).
    If something worked before, it's not guaranteed to still work after an update. For example, there may be a particle effect at a waterfall which starts producing ugly artifacts after the update. QA may not catch this (mostly due to reduced resources), but gamers surely will. That's a big risk, and a risk which can be avoided by not updating the library.

    • @jemborg
      @jemborg 1 year ago

      It helps if you make a backup of the original DLL before you update... in case it looks worse.

    • @Vantrakter
      @Vantrakter 1 year ago +5

      That sounds pretty logical. And yet if upscaling techniques are being employed by more and more games and gamers and the version of the upscaling library directly affects the enjoyment of the game, it seems logical for a project management to allocate resources to keeping it updated after release, or at least evaluate an update some time after release. I'm not sure one visual glitch in a waterfall matters if the rest of the game looks better and is slightly faster than before.

    • @deviouslaw
      @deviouslaw 1 year ago +2

      Outdated thinking

  • @lupusprobitas
    @lupusprobitas 1 year ago +9

    I would love it if you tested DLSS Balanced as well. It would be nice to know whether it's a good tradeoff or something to be avoided in general.

    • @paulcox2447
      @paulcox2447 1 year ago +2

      Depends on your starting resolution. At 4K, Balanced works well. At 2K it's OK. At 1080p I'd stick to Quality.

  • @PoRRasturvaT
    @PoRRasturvaT 1 year ago +6

    The dragon neon sign in Hitman loses a lot of brightness and bloom compared to native. The ~1-2 pixel wide neon detail is lost and thus doesn't produce the light it was supposed to. That's pretty impactful, in my opinion. It's completely extinguished/muted with FSR.
    This is something that will be noticed a lot in Cyberpunk, where 1-2 pixel elements (neon signs again) have to convey bright, contrasted information, further enhanced by bloom post-processing.

    • @MichaOstrowski
      @MichaOstrowski 1 year ago +1

      The intensity of bloom is, sadly, often related to the native resolution, so when that drops, you get less bloom...

    • @monotoneone
      @monotoneone 1 year ago +2

      Yes, lighting effects take a hit when using upscaling. This is why I ran Death Stranding at native despite the shimmering, the lights look "wrong" when using DLSS. The more bloom light sources have the more it is noticeable. Neon lights often look like LED lights with DLSS.

    • @juanblanco7898
      @juanblanco7898 1 year ago

      There's a gamma shift and an overall loss in color accuracy when scaling is performed, which is to be expected for obvious reasons. And sadly this is not a problem most people, be they developers or consumers, are concerned about. What's actually sad is that I'm definitely not one of them, as I'm very sensitive to such things.

    • @gozutheDJ
      @gozutheDJ 1 year ago

      Good call, I didn't even notice that.

    • @gozutheDJ
      @gozutheDJ 1 year ago

      @@juanblanco7898 you can't adjust gamma settings in game?

  • @awesomecarl
    @awesomecarl 1 year ago +60

    DLSS is just really good anti-aliasing; that's why they added DLAA, which is just DLSS without the upscaling.

    • @HanSolo__
      @HanSolo__ 1 year ago

      This

    • @Superdazzu2
      @Superdazzu2 1 year ago +6

      No it's not, lol. It also gives back huge chunks of fps, ranging from 20+ (Quality) to almost double with Performance.

    • @johnnypopstar
      @johnnypopstar 1 year ago +12

      @@Superdazzu2 Yes, it is. He is absolutely correct, from a technical perspective, on what the actual image processing techniques involved in the upscaling are doing. There is also a *side effect* that because it's starting with a lower resolution image, you get an FPS bump - but the fundamentals of the image processing that's going on, are, *unavoidably* that it's doing extremely computationally inefficient anti-aliasing.

    • @Drip7914
      @Drip7914 1 year ago +2

      @@Superdazzu2 no it doesn’t lmao, dlss3 the latest iteration of dlss actually REDUCES performance due to 10-15x the latency, fsr actually increases performance/quality. (Source - HUB, AdoredTV, NotAnAppleFan, PCworld etc)

    • @awesomecarl
      @awesomecarl 1 year ago

      @@Superdazzu2 Yeah, because DLSS is applying that anti-aliasing to a lower resolution instead of native.

  • @benfowler2127
    @benfowler2127 1 year ago +13

    I've felt that native is the only true measure of what a GPU can do, but I can't tell the difference in SOME scenarios. If I'm in the middle of a game, I'd say there are only certain things, like fences flickering, that would get on my nerves. So it seems like it'll end up being a game-by-game decision for me whether I use the tech or just play at native rendering.

    • @gozutheDJ
      @gozutheDJ 1 year ago +8

      in terms of judging how performant a card is, 100% native. in terms of playing and enjoying a game, whatever looks best to you.

    • @dante19890
      @dante19890 1 year ago +1

      Why would you sacrifice performance if you can't tell the difference anyway?

    • @benfowler2127
      @benfowler2127 1 year ago

      @@dante19890 I am basing my statement on what I see in the video. I haven’t yet compared them for myself in the games I play. I tend to play in native res without the frame generation on anyways. The one time I tried it, it wasn’t impressive to me, so I haven’t bothered since. Also, with whatever extra processing youtube does to these videos, and depending on the screen someone is watching on, it can be difficult to tell if I’m seeing things the same way. Especially since “seeing 4k” on a tablet vs a monitor or tv is dependent on those devices. What I should do is watch this video again on the pc and monitor I’d be gaming on. And I did say “SOME” scenarios. Not all. There’s a clear difference in many of the examples he showed.

    • @dante19890
      @dante19890 1 year ago

      @@benfowler2127 Yeah, but either way it's gonna be a little better, the same, or a little worse depending on the DLSS version and the game, and you're getting a huge performance boost, so it's an instant net win no matter how you look at it.

    • @sito3539
      @sito3539 1 year ago +2

      The problem is it's not just fences; it's bloom, rain, depth of field, air particles, god rays, wide color gamuts, OLED contrast, etc., on top of increased latency.
      There are lots of things AI can't do, and it certainly can't do native graphics better, just TAA better, when TAA is bad.
      Hitman in the rain is the worst offender here: even when DLSS is technically better as an AA solution, it makes the lighting look awful.

  • @DadGamingLtd
    @DadGamingLtd 1 year ago +4

    Can we take a second to talk about the jumpy vertical scroll at the end of spiderman using dlss. Is that an actual thing? Or trouble during editing/encoding?

  • @valrond
    @valrond 1 year ago +52

    You know, I had been playing newer games with DLSS quality activated at 4K and 1440p with my 4090 and 3080 respectively, as they came as default. They looked fine. However, a few days ago I went back to native and I was surprised by how much crisper everything looked.

    • @beachslap7359
      @beachslap7359 1 year ago +5

      Well, it's a trade-off. You easily get used to less crispy image, but the smoothness makes up for it and once that extra smoothness is gone you IMMEDIATELY notice it.

    • @sebastien3648
      @sebastien3648 1 year ago +4

      Because AA sucks. Without AA (TAA, DLAA, DLSS) games look way sharper... but sadly these days you're forced to have at least TAA in like 90% of games. Even though at 4K I would personally rather have the slight shimmering than the smoothing. Basically this comparison is between TAA and DLSS/FSR; "native" doesn't mean s*** anymore.

    • @valrond
      @valrond 1 year ago

      @@beachslap7359 I'm already getting well over 60 fps. I don't care if I get 80 or 120.

    • @TheReferrer72
      @TheReferrer72 1 year ago

      @@sebastien3648 Suck? You mean look sharper when you take a screenshot and stare at the resulting image for 5 minutes?
      Graphics in games are brilliant.

    • @Terepin64
      @Terepin64 1 year ago

      @@cptnsx Calm your tits. I wrote "to mitigate", not "to fix".

  • @EpicPianoArrangements
    @EpicPianoArrangements 1 year ago +1

    A couple of things... Is there a reason that the DLSS version of the Miles Morales footage starts stuttering at 6:35-6:47, and God of War does the same at 12:45-13:00? You'd think DLSS would perform much smoother than native resolution.
    Also, thank you for making a point about games updating their DLSS versions. Trying to update DLSS in RDR2 is almost impossible, since the Rockstar Launcher ALWAYS checks whether files have been modified every time you play, and even if you update the DLSS version, the launcher will ALWAYS replace it with its older v2.2.10 file. If you deny write/edit permissions on the file, the launcher won't even start the game; it will stay stuck trying to 'update' the file.

    • @kevinerbs2778
      @kevinerbs2778 1 year ago

      Trash D.R.M.

    • @vladvah77
      @vladvah77 1 year ago

      Well mate, you can always use EMPRESS edition for single player campaign of RDR 2, she has cracked a new version with DLSS so I would say F u c k off Rockstar launcher!!!! and enjoy my game anyway :-)

  • @Ribcut
    @Ribcut 1 year ago +4

    Considering DLSS objectively renders at a lower resolution and then attempts to upscale to native, it doesn't even make sense for it to ever be better than native.

  • @Techfanatic73
    @Techfanatic73 1 year ago +13

    Some other channels could learn from your image quality. Always sharp and clear 4k. Probably the best looking channel I watch. Why do other channels not look this good? Lol. Keep it up!

    • @Sal3600
      @Sal3600 1 year ago +4

      Biased. They're all the same.

  • @daelionmr4782
    @daelionmr4782 1 year ago +4

    That native image instability and flickering bothers me a lot once I notice it, and DLSS Quality fixes it in most cases, so I'm using it almost all the time.
    To me it looks better than native TAA, and I can't notice the ghosting tbh. Plus less power draw and lower temps with better performance, so it's a win-win for me.
    Even on my lowly 29" 2560x1080 ultrawide, Quality mode looks good enough to me. I usually manually update the DLSS DLL to the 2.5.1 version, though I'm yet to check the newer versions.

  • @rakon8496
    @rakon8496 1 year ago +12

    This installment was helpful just in time. Having been frustrated lately with DLSS stuttering and lacking quality in Flight Simulator, your mention of the CPU limitation as a deal breaker was very helpful. Enjoying the scenery low and slow with a real-life viewing distance means running the simulation CPU-limited more often than not. Your confirmation that the title looks much better at native res will have serious implications: the 7800X3D has to come to the rescue. 😄💙 PS: Regarding VRAM, the title (even with Mid textures) uses more than 12GB dedicated in dense areas... the 3070 wasn't long in my system! Thanks

  • @brucethen
    @brucethen Год назад +14

    Thank you for all your effort in producing these comparison videos; it is much appreciated

  • @ginzero
    @ginzero Год назад +7

    Very well done, balanced comparison! Thanks for the timely update on where we're at with these technologies.

  • @87crimson
    @87crimson Год назад +9

    Another use case for these upscaling techniques is reducing the workload and power consumption of your graphics card. If it looks the same to you, just enable it.

    • @eternalbeing3339
      @eternalbeing3339 Год назад

      It increases CPU usage though. Some people shouldn't use it if they're on ancient hardware.

    • @Jonatan606
      @Jonatan606 Год назад

      ​@Eternal Being33 A GPU consumes more power than CPU. You can also combine it with frame generation which will give it breathing room.

  • @elaiswaifu5459
    @elaiswaifu5459 Год назад +7

    When I have performance left…
    I tend to do this trick…
    DLDSR 2.25x + DLSS Quality…
    I play on a 4K monitor, so the game would be rendered at 6K and DLSS Quality would make the game render internally at 4K and upscale to 6K…
    This makes the image look a lot sharper and better than regular 4K actually!
    Even on 6K Performance mode (which renders internally at 1620p), it usually looks almost like native 4K and lots of times even better than it by a bit.

    • @shockwav3xx
      @shockwav3xx Год назад +1

      What is DLDSR?

    • @lonerprofile
      @lonerprofile Год назад +2

      It's basically supersampling, which is also used in the video/anime industry: render at 4K and downscale to 1080p, or 8K downscaled to 4K. That's why you see those studios use 8K/12K high-end cameras.

    • @elaiswaifu5459
      @elaiswaifu5459 Год назад +2

      @@shockwav3xx opposite of upscaling…
      It downscales the game instead, this gives a sharper and crisper image than native.

    • @Dionyzos
      @Dionyzos Год назад +1

      I often use 1.78x which already looks great

    • @shanroxalot5354
      @shanroxalot5354 Год назад

      @@Dionyzos That's what I'm wondering: is it better to do 1.78x with Quality, or 2.25x with Balanced/Performance, etc.?
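The DLDSR + DLSS arithmetic discussed in this thread can be sketched with a few lines of Python. This is a toy calculator; the per-mode DLSS scale factors below are the commonly cited defaults, assumed here rather than taken from the video:

```python
# Toy calculator for DLDSR + DLSS internal render resolutions.
# DLDSR factors (1.78x, 2.25x) are total-pixel multipliers, so the per-axis
# factor is their square root. DLSS modes scale each axis down by a ratio.
# These ratios are the commonly documented defaults (an assumption here).

DLDSR = {"1.78x": 1.78 ** 0.5, "2.25x": 2.25 ** 0.5}   # per-axis upscale factor
DLSS = {"Quality": 1 / 1.5, "Balanced": 0.58, "Performance": 0.5}

def internal_res(width, height, dldsr, dlss_mode):
    """Return the resolution the GPU actually renders at internally."""
    fx = DLDSR[dldsr]      # DLDSR raises the output target
    s = DLSS[dlss_mode]    # DLSS lowers the internal render resolution
    return round(width * fx * s), round(height * fx * s)

for combo in (("1.78x", "Quality"), ("2.25x", "Balanced"), ("2.25x", "Performance")):
    w, h = internal_res(3840, 2160, *combo)
    print(combo, f"-> renders internally at {w}x{h}")
```

For example, at a 3840x2160 output, 2.25x DLDSR with DLSS Quality renders internally at exactly native 4K, which matches the earlier comment about why that combo can look sharper than plain native.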

  • @Argoon1981
    @Argoon1981 Год назад +6

    IMO, testing native with post-process TAA (temporal anti-aliasing) doesn't show native at its best. Compared to DLSS or FSR, TAA lacks a sharpening filter; it slightly blurs the image instead and, worse, creates artifacts that wouldn't exist with no AA.
    MSAA (multi-sample anti-aliasing) would be the best AA option for native when comparing image quality, but unfortunately modern games have no MSAA support at all, and forcing it in the drivers only works for older D3D9 (DirectX 9) games, which is a real tragedy to me.
    You can, of course, still use SSAA (super-sampling anti-aliasing) in any game by just rendering at a higher resolution than the one you display at. That has the best image quality of all, but it also destroys performance.

  • @MrSmitheroons
    @MrSmitheroons Год назад +3

    I'm super surprised how close it is. My mind doesn't want to believe it can be this close while giving so much better performance, to the point that I keep forgetting it. My brain just doesn't want to accept the picture; it doesn't fit my expectations.
    Thanks for such patience diving deep into these topics, making these questions as close to objective, and as well explained, as possible. It's amazing work taking something as subjective as visual quality in upscaling and digesting it into something nigh objective. Your commentary about what game devs should do is also concise and actionable.
    If I could give a gold star award to your channel, I would. I'm having to seriously consider donating or subscribing on Patreon, even though that's something I rarely (just about never) do. Thank you again for your tremendous efforts over the years. Gamers can be so much more informed now than any time I can recall in history. And it's due to dedicated reviewers such as yourselves.

    • @Sal3600
      @Sal3600 Год назад +2

      Disable TAA, and DLSS and FSR don't stand a chance compared to native.

    • @dante19890
      @dante19890 Год назад +3

      ​​@@Sal3600 incorrect. If u play native without TAA, DLSS will actually have a better image stability

    • @AnuragCrafts
      @AnuragCrafts 6 месяцев назад +1

      ​@@dante19890Why would you even do that lol. TAA has a very slight performance hit which is not even noticeable.

    • @dante19890
      @dante19890 6 месяцев назад +1

      @@AnuragCrafts A lot of old school pc gamers hate TAA and just run native resolution with no image treatment or AA

    • @AnuragCrafts
      @AnuragCrafts 6 месяцев назад +1

      @@dante19890 Better to add sharpening from the control panel, or a sharpening ReShade preset or mod for that particular game. It should be fine.

  • @pasi123567
    @pasi123567 Год назад +4

    In my personal experience, I've noticed that the newer DLAA anti-aliasing algorithm works incredibly well at solving flickering issues and making the image look high-res. I think this test shows that Nvidia's AA algorithm in DLSS handles scenes much better than TAA, for example, hence why DLSS looks better than native in many cases. I wonder if we will ever get a test comparing DLSS with DLAA. DLAA native scenes should theoretically be better than DLSS.

    • @1GTX1
      @1GTX1 Год назад +1

      Last of Us and Marvel's Spiderman look amazing even on a 24 inch 1080p screen with DLAA. If every game looked like that, 1080p resolution wouldn't be a problem.

  • @aisliin
    @aisliin 3 месяца назад +1

    very useful info, would love a revisit with more recent games someday !

  • @Raums
    @Raums Год назад +76

    My takeaway from this is I can’t really tell the difference so I can just go for fps ❤

    • @kilmor3151
      @kilmor3151 Год назад +3

      Yeah, with DLSS. Good luck not being able to tell the difference with FSR 2.

    • @Verpal
      @Verpal Год назад +10

      @@kilmor3151 FSR 2 is pretty good at 4K; I think it's okay to use its Quality mode at 4K. Below that, I'm not sure.

    • @LeegallyBliindLOL
      @LeegallyBliindLOL Год назад +3

      @@Verpal heavily depends on the game. Unusable in RE4 for me.

    • @anuzahyder7185
      @anuzahyder7185 Год назад

      Bro, FSR sucks. I use Nvidia, but most of the time I prefer native over DLSS Quality, especially in Cyberpunk. 80% of the time I'd go with native and 20% of the time with DLSS Quality. Never FSR; it's horrible.

  • @ngkindig
    @ngkindig Год назад +2

    It was frowned upon when consoles said they could play games at 4K that wasn't native 4K. Now PC gamers are fine using these new features, not batting an eye at not playing at full native res. Silly

  • @jinx20001
    @jinx20001 Год назад +25

    An incredible result for DLSS, let's be honest. Just tying native would have been incredibly impressive, but to pull wins out of the bag on multiple occasions? Wow.

  • @ricky_pigeon
    @ricky_pigeon Год назад +1

    When he says "stability", is he talking about anti-aliasing flickering? It seems confusing to talk about stability, because it's often associated with camera/display shaking.

  • @xxovereyexx5019
    @xxovereyexx5019 Год назад +5

    Native is GIGACHAD, no downscaling/upscaling, pure raw performance ftw!

  • @Hybred
    @Hybred Год назад +2

    This test is weird; I don't understand it. Instead of creating two graphs, one for FSR vs native and one for DLSS vs native, you combine all three into one so that only DLSS is shown... why? What advantage does that serve us, the consumers of this video?
    For non-RTX users (GTX, AMD, Intel), how do they know in which games FSR, the only technology they can use, is better than native? That would be useful information. This test only benefits Nvidia RTX users, because you've left out useful results for everyone else. If FSR is ALSO better than native in a game, that's good information to know, rather than simply knowing what's best.

    • @Hybred
      @Hybred Год назад +1

      Love the video just some feedback

  • @leostival
    @leostival Год назад +8

    I'd love to see something like that for 1080p. Is it crap or is it any good?

    • @kanakabp
      @kanakabp Год назад

      it is good.

    • @vladvah77
      @vladvah77 Год назад

      @@kanakabp Old DLSS was crap, but 2.5.0+ is very good.

  • @robertcarhiboux3164
    @robertcarhiboux3164 Год назад +1

    On YouTube it all looks like 720p. Except DLSS has a texture pop-in problem even in your footage (on top of ghosting, input lag, flickering, blurriness, random texture collisions, loss of foreground detail, among other issues...)

  • @ian260672
    @ian260672 Год назад +6

    The reason I stuck with Nvidia last year was DLSS; I'm not too bothered about ray tracing, just DLSS. I use a 50-inch 4K TV, and my old 2060 Super got a stable 60fps at 1440p using DLSS. My 3070 Ti gets a stable 60fps at 4K in Dying Light 2 using DLSS. The only thing I don't like is the VRAM, only 8GB on the 3070 Ti; Nvidia messed up with VRAM.

    • @eternalbeing3339
      @eternalbeing3339 Год назад +1

      8GB is fine if you don't max out the textures.

    • @vladvah77
      @vladvah77 Год назад +1

      Well, the RTX 3070/Ti is a 1440p video card, so 8 GB VRAM is exactly what you need.

    • @wertyuiopasd6281
      @wertyuiopasd6281 8 месяцев назад

      ​@@eternalbeing3339Cope

    • @wertyuiopasd6281
      @wertyuiopasd6281 8 месяцев назад +1

      ​@@vladvah77Fanboys should stay silent.

  • @adriankoch964
    @adriankoch964 Год назад +10

    I used DLSS when first playing Cyberpunk on a 1080p ultrawide.
    I set a higher resolution than my panel and used Balanced so it would render natively at my panel's resolution; my panel would then downscale the higher-resolution image back to its native resolution. This gave me more distance clarity and a slightly more stable image in Cyberpunk (this was a month after the game's release, before DLAA was an official thing in the driver or the game).

    • @kiburikikiam
      @kiburikikiam Год назад

      so you're basically using dldsr.

    • @adriankoch964
      @adriankoch964 Год назад

      @@kiburikikiam before that was part of the driver, yes.

  • @Zaney_
    @Zaney_ Год назад +7

    I would also like to see DLAA results, plus rendering the game at a higher resolution and then downscaling it, i.e. 4K to 1440p, and seeing how those two compare to traditional TAA.

    • @gozutheDJ
      @gozutheDJ Год назад +2

      you people demand a lot without compensating them for it.
      what's stopping you from testing yourself?
      I'll give you a hint, downscaling pretty much always gives you the best image quality.

    • @tvcultural68
      @tvcultural68 Год назад

      ​@@gozutheDJ if we were to follow your brilliant logic, this video would not exist. Hw does technical analysis that most people are not capable of, so it's interesting to see H.W opinion. about the lol reward the channel earns money with the videos by adsense

    • @gozutheDJ
      @gozutheDJ Год назад +1

      @@tvcultural68people are not capable of turning a setting on, checking how it looks in game, turning it off, and comparing? wtf?

    • @tvcultural68
      @tvcultural68 Год назад

      ​@@gozutheDJ It seems like you didn't understand anything I said. Hardware Unboxed goes much deeper into the differences in graphics than most people, in summary, the channel offers a much better analysis. And if someone were to do their own analysis on dozens of games, it would take dozens of hours, I know this because I analyzed 5 games. It's much easier to watch a video that provides a great summary of this issue than for a person to spend dozens of hours analyzing games. Is it difficult for you to understand this?

    • @gozutheDJ
      @gozutheDJ Год назад +1

      @@tvcultural68 because this is literally just content dude. its fun to watch but no one is using this to tune their games. because to tune their game they would just .... do it themselves. i have eyes. i dont need a channel to tell me what looks better to me. its all coming down to personal preference as i said, unless you just go along with what other people tell you is better.
      and also, your monitor factors heavily into this as well. and that's something they can't account for, they can only tell you how it looked to them on their monitors, but you might have a completely different result, so at the end of the day you STILL need to test it for yourself.

  • @Akkbar21
    @Akkbar21 Год назад +2

    Pretty obvious that we see way more ghosting with DLSS than FSR in these examples. I can't say I've noticed any "jump off the screen" problems with FSR yet (commented at the 11:26 mark) as I watch.

  • @existentialselkath1264
    @existentialselkath1264 Год назад +9

    Standard TAA is often absolutely awful. I'll never understand how it was ever acceptable. DLSS (and FSR) often does a better job dealing with artifacts like ghosting, but obviously it can't compete with the sheer pixel count of native.
    A prime example is often flowing water. Transparencies often don't generate motion vectors, so any moving texture gets smeared like crazy with TAA, but DLSS is capable of keeping the texture crystal clear in some cases.

    • @zxbc1
      @zxbc1 Год назад +1

      It kind of depends. For a long time in M&B Bannerlord for example, even DLSS quality gives off major shimmering with water and ripples, and default AA at native looks significantly better. But later DLSS versions fixed some of that. What they should have done is to swap the DLL to latest version for all the comparisons.

    • @existentialselkath1264
      @existentialselkath1264 Год назад +2

      @@zxbc1 for sure, it's not always better. Spiderman for example has very good temporal anti aliasing tech so these issues aren't as apparent.
      As for older dlss versions, it's extremely easy to replace the dll with a newer version so I don't really consider the older versions anymore

    • @prman9984
      @prman9984 Год назад +1

      Why even use AA at 4k? It hurts performance and looks worse.

    • @zxbc1
      @zxbc1 Год назад +3

      @@prman9984 To fix aliasing, like you would at any resolution. 4K doesn't somehow fix aliasing, it just makes the aliasing size smaller. For me it's absolutely required for a lot of games because I game on a 48" LG C2 at desktop distance, and I can see aliasing very clearly at this distance. If you game on a big screen in living room couch distance you probably won't need it, although for some games that have very bad aliasing, you will still notice severe shimmering even if you can't see the exact aliasing artifact.

    • @cptnsx
      @cptnsx Год назад +2

      @@zxbc1 I've gamed on a big screen at 4K since 2016 and hate AA, especially TAA, with a passion. It destroys image quality; that's why I hate it when people say DLSS is BETTER than native, because it's NOT. It may be better (in image quality) than native with AA, but NOT without.

  • @Hyperdriveuk
    @Hyperdriveuk Год назад +1

    Great vid as always. There's also this strange micro stutter and ghosting in the DLSS footage that my eyes can't unsee, especially when scrolling past something; it's not smooth. 11:39 in Hogwarts Legacy, the coat judders, and all the small things that move around, like the birds, have massive ghosting. 12:44 judder judder judder. 13:35 rain ghosting; the rain is literally ghosting across the screen.

  • @HeLithium
    @HeLithium Год назад +3

    The artifacts produced by DLSS are something I would never accept.
    Especially the ghosting

  • @detail___5387
    @detail___5387 Год назад +2

    Aren't we really talking about anti-aliasing quality, though? In Hogwarts, for example, I can tell the difference between DLSS Performance and native in terms of overall resolution; DLSS Performance looks blurred in comparison. The aliasing is cleaner for sure, but the image is noticeably lower-res imo.

  • @rickinielsen1
    @rickinielsen1 Год назад +13

    Interesting video. If anyone had asked me, I would never have thought any of the upscaling solutions would ever be better than native. But it does make sense for titles that haven't implemented proper high-end graphics features. It might even be a great way to visually overhaul older games if you can simply tack DLSS/FSR onto them, rather than having to update the core mechanics of the engine...

    • @konga382
      @konga382 Год назад +2

      "Better than native" presentations are made possible due to shortcomings in the anti-aliasing solutions most games use, and it doesn't have anything to do with the games' age or graphical feature support. And DLSS/FSR2 require quite a bit of work on the engine side to get a good implementation of since these are temporal upscalers. You need extensive motion vector support at the very least. You do not simply tack these features on. And with motion vectors, you can also do good-quality TAA, which would reduce the need for DLSS/FSR to begin with.

    • @FranzKafkaRockOpera
      @FranzKafkaRockOpera Год назад +1

      It makes sense if you think about modern anti-aliasing techniques like TAA, which are essentially DLSS without (usually) the upscaling element, i.e., they look at past frames to reconstruct a smoother image than the native raster. Otoh I think those would be tough to implement in older games without some quite noticeable artefacts: what makes DLSS good is that it can look at depth information, motion vectors, and so on in order to create a coherent image, and old engines just don't have the capacity to communicate that info.
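The temporal reconstruction both replies describe can be reduced to a tiny 1D toy: reproject the previous frame's result using motion vectors, then blend in the new frame. This is a minimal illustrative sketch, not how any real TAA/DLSS is implemented (real versions add history clamping and rejection; the buffers, motion values, and `alpha` here are invented for the example):

```python
# Minimal 1D sketch of temporal accumulation, the core idea behind
# TAA/DLSS-style reconstruction, assuming perfect integer motion vectors.

def reproject(history, motion):
    """Shift the history buffer by the motion offset; out-of-range samples become 0."""
    return [history[i - motion] if 0 <= i - motion < len(history) else 0.0
            for i in range(len(history))]

def accumulate(history, current, motion, alpha=0.1):
    """Blend the new frame into the reprojected history (exponential moving average)."""
    prev = reproject(history, motion)
    return [(1 - alpha) * p + alpha * c for p, c in zip(prev, current)]

frame = [0.0, 0.0, 1.0, 0.0]                      # one bright "pixel"
history = accumulate([0.0] * 4, frame, motion=0)  # first frame: faint signal
history = accumulate(history, frame, motion=0)    # signal converges over frames
```

This also shows why missing or wrong motion vectors (e.g. on transparencies like flowing water, mentioned above) cause smearing: the blend then averages samples that don't actually belong to the same surface.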

  • @RedundancyDept
    @RedundancyDept Год назад +10

    It boils down to anti-aliasing, which can be pretty bad at native. Interesting that the FSR flaws to which Tim most often points are usually also present in the native/TAA presentation. This suggests to me that complaints about FSR are overblown. Yeah, FSR is worse than DLSS, but it's still pretty close to, and in rare cases better than, native quality. And given that modern games tend to have glaring issues otherwise--perhaps micro-transactions, brain-dead gameplay, bad AI/scripting, and/or things like persistent shadow/texture/mesh pop-in--given all of that, I have a hard time stressing out about minute differences in image quality.
    Still, I prefer to run native when I can.

    • @mryellow6918
      @mryellow6918 Год назад

      Native + blur

    • @naturesown4489
      @naturesown4489 Год назад

      It's not just that, DLSS is trained on a 16K resolution dataset which is why it can generate fine detail better than native 4k sometimes.

    • @DimkaTsv
      @DimkaTsv Год назад +4

      Which adds another point.
      Unlike DLSS, FSR works with just what it gets, with no magic box of trained data to mask the lower-resolution source.
      So if native has shimmering and stability issues, FSR gets them as well, sometimes exaggerated, sometimes suppressed.
      It's not that FSR is bad; it's that DLSS creates detail that wasn't there originally. That's not a bad thing either, but by itself it shouldn't make FSR the worse technology.
      And the cost of such creation (and the reliance on TAA and AI) is ghosting (more for DLSS) or unfixed original shimmering (more for FSR).
      The DLSS/FSR comparison probably should have included native as well (just not in the results). Maybe then the overall conclusion wouldn't have been so negative towards FSR. It is still inferior for multiple reasons, as, for example, there is no magic box to draw detail out of thin air. But it doesn't "destroy" the playing experience.

    • @Andytlp
      @Andytlp Год назад +3

      Agree. People forget how bad games looked 15 years ago. It's no longer about graphics or resolution. Gameplay is stagnating more; physics implementations don't get enough attention, nor does audio. The worst offender is AI. Most games have brain-dead AI, especially if it's a tacked-on single-player mode with the main focus on multiplayer, which then makes the multiplayer bots suck too.

    • @naturesown4489
      @naturesown4489 Год назад +2

      @@DimkaTsv I think they do have to train FSR 2 for games somehow, as it cannot be injected into any game like FSR 1 can be with the Magpie mod. I don't know how they do the training though. You make some very good points.

  • @MichaelChan0308
    @MichaelChan0308 Год назад +10

    This is unreal... you guys are putting out informative in-depth videos so frequently from hardware unboxed, hub clips and monitors unboxed, I have difficulty catching up and watching all of them!
    I highly suspect DLSS 4.0 video generation trickery has been used =P

  • @B.E.3.R
    @B.E.3.R Год назад +1

    Good video, but: 1. There has to be something wrong with the Death Stranding results; I played it recently and the TAA implementation was the best I've seen in years. Maybe it's bugged in the Director's Cut? And 2. We seem to have gone backwards in AA quality. I remember playing Dead Space 3 and Battlefield 3 on a 720p TV, and the image quality was spotless for both with just post-process AA. DS3 used SMAA; not sure about BF3 though. At least we've moved on from the hot garbage that was FXAA, where 90% of the time the raw image was better (TAA isn't radically better, but to its credit it is "low-cost" and usually better than no AA at all)

    • @1GTX1
      @1GTX1 Год назад

      AA isn't the only problem; a lot of modern games run screen-space reflections, ambient occlusion, and other effects at half or quarter resolution compared to native. That can look OK at 1440p, but not below it. In some modern games like Ghostwire: Tokyo, reflections break at 1080p. In Cyberpunk 2077, Digital Foundry noticed that at 1440p you could lower reflections for better performance, but at 1080p you have to set reflections to at least high for them to look good. Metro Exodus Enhanced looks extremely blurry on my 1080p monitor, but with DLSS/DLDSR 2880x1620 Balanced it looks amazing for the same performance as native 1080p without DLSS.

  • @PixelShade
    @PixelShade Год назад +8

    To be fair, I think the "better than native" result really comes into play in scenes with limited motion. It could be an RTS game like The Riftbreakers, a slow dialogue-focused cutscene in The Witcher 3, a puzzle game like Lego Builders, or just standing still in Cyberpunk studying the detail of environments or characters. In those cases the temporal resolve has a super-sampled, "better than native" quality to it, with FSR showing flickering instability on sub-pixel geometry (because it lacks the extra temporal smoothing pass DLSS has; that extra pass, in turn, often causes extra ghosting in DLSS).
    There are so many components to upscaling, and there's just a lot of subjective opinion of the image quality that you can't or don't really need to quantify. For me personally I rather play Cyberpunk with FSR Quality over the native image at 1440p due to the super sampled temporal resolve in cutscenes, dialogues, or just when detail studying the environment. During motion it's close to native, but with added flickering and slight fuzz. which I don't mind when I get into the game. So even if flickering CAN be distracting, to me it isn't all that important, and I value the benefits of FSR, over its drawbacks. And that's something that you can't really quantify even in an image comparison, because it really depends on the scene, what you value, how much you are into the game and what best depicts the artists intended look for the game. and so on.

  • @Wildmutationblu
    @Wildmutationblu Год назад

    I think you might have gotten your pics mixed up for the Spider-Man comparison, as the hair in the DLSS Quality shot looks better than the native 4K pic @5:09

  • @andreigreu
    @andreigreu Год назад +7

    11:15 the falling leaves on DLSS leave a long trail behind them and FSR creates a distortion effect but no trail

    • @jinx20001
      @jinx20001 Год назад +4

      What's your point? Pause at 14:36 and you will see DLSS is much, much better than native or FSR. As the whole video tries to explain, every game is different and they all have their ups and downs in image quality, but FSR is not in the discussion; it's trash. Finding one example is desperation lol

    • @andreigreu
      @andreigreu Год назад +2

      @@jinx20001 No point, I just wanted to show some differences. No need to be aggressive about it :)

    • @A.Froster
      @A.Froster Год назад +2

      @@jinx20001 Oh boy, somebody is angry they touched their favorite upscaling solution 😂

    • @jinx20001
      @jinx20001 Год назад +3

      ​@@A.Froster bro i dont need any upscaling, bet you can guess why.

  • @blidibliblubla
    @blidibliblubla Год назад +2

    I'm also interested in the wattage used in each mode, in times like the ones we have today. Hope we can see a bit more of that information across the modes. Thanks anyway for the good work :)

  • @Felttipfuzzywuzzyflyguy
    @Felttipfuzzywuzzyflyguy Год назад +13

    The motion performance of both bothers me; I notice it immediately. Obviously they're always improving it, but I personally choose not to enable them.

    • @taufikaji3150
      @taufikaji3150 Год назад +1

      agree

    • @prman9984
      @prman9984 Год назад +4

      There's one clip in the video where he's saying DLSS is superior, while it's stuttering on DLSS but not in native.

    • @DavidFregoli
      @DavidFregoli Год назад +4

      TAA has the same problem so I guess no AA for you

    • @gavinderulo12
      @gavinderulo12 Год назад +5

      ​@@prman9984 could just be the capture.

    • @nahush6299
      @nahush6299 Год назад

      @@prman9984 DLSS improves performance, so I don't see how it could stutter when native doesn't, especially since Tim didn't mention anything; he tested it and would have felt it. I guess it could just be a bug while capturing gameplay.

  • @tommyb6611
    @tommyb6611 Год назад +1

    I read a funny thing somewhere that I also think is true.
    When you want high visual quality, you enable the highest settings possible, right? If your machine can't handle it, you reduce the video settings, all at native.
    If you activate the upscalers while selecting high visual settings, you are literally reducing the video quality.
    So why enable DLSS/FSR instead of just keeping native with slightly lower visual settings?
    By keeping native you aren't compromising anything in terms of video quality, and you aren't demanding anything extra from the GPU either (by forcing a silly upscale calculation)

  • @abhishek_k7
    @abhishek_k7 Год назад +15

    I could barely tell any difference in most cases lol, but that must have taken a lot of effort, so massive respect! 🙌🏻

    • @mrbobgamingmemes9558
      @mrbobgamingmemes9558 Год назад

      True, unless I use something like Performance mode on both FSR and DLSS.

    • @kilmor3151
      @kilmor3151 Год назад +4

      That's why videos on this are a plain moronic concept. You won't ever be able to tell in a video.

    • @DavidFregoli
      @DavidFregoli Год назад +1

      It's hard to cross-reference the same area unless you pause; I'd rather see more stills than playback for fine detail, but playback also has its place to highlight image stability and shimmering. YT compression doesn't help either.

  • @6TheBACH
    @6TheBACH Год назад +1

    Only thing I see here is how fucking great FSR is, and that it does not need meme hardware to actually just work.

  • @bodieh1261
    @bodieh1261 Год назад +3

    First off I just wanna say I really appreciate the effort you guys put into these videos to provide us with such in depth analysis and information on topics.
    That said, in this instance I think a more useful and widely applicable analysis would be to compare the image quality of native 1080p or 1440p to that of higher but upscaled resolutions, at quality settings where the upscalers render at or close to 1080p or 1440p respectively.
    For example:
    - Native 1080p vs 4K Upscaled on Performance (1080p render res)
    - Native 1440p vs 4K Upscaled on Quality (1440p render res)
    - 1440p Upscaled on Performance vs 1080p Upscaled on Quality (720p render res)
    I think these comparisons could be more useful because most gamers have hardware that actually allows them to use those setting combinations at playable framerates, whereas very few actually have the hardware that affords them the luxury of "choosing" between native 4K and upscaled 4K. DLSS/FSR were always meant to be tools to boost performance at a given res (or enable playable fps at a higher res) while minimizing losses in image quality, more so than tools to outright boost image quality itself.
    Personally from my own sample size of 1, I have found that running games like MW2, Cyberpunk and Hogwarts at 2160p Performance or even 1620p Quality(also ~1080p render res) actually produces a more stable and sometimes better looking image than native 1080p, particularly in Cyberpunk. This makes the ~5-8% fps hit of running 2160p Performance Mode worth it over native 1080p. It would be nice to see what you guys think about this and if others experience the same thing.

    • @KarlDag
      @KarlDag Год назад

      Absolutely! Originally DLSS was there to take you to 4K with a GPU that wasn't powerful enough to render native 4K. Now if you're running a 4090, sure you can choose native vs DLSS, but that's a far cry from most people's need. Native 4K doesn't perform nearly the same as DLSS 4k.
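The render-resolution pairings listed in the comment above can be sanity-checked with the usual per-mode scale factors. A small sketch; the per-axis ratios are the commonly documented defaults and are an assumption here, not figures from the video:

```python
# Internal render resolution for each upscaler quality mode.
# Per-axis scale factors below are the commonly documented defaults
# for DLSS/FSR 2 (an assumption, not confirmed by the video).
MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def render_res(out_w, out_h, mode):
    """Return the internal render resolution for a given output resolution and mode."""
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

print(render_res(3840, 2160, "Performance"))  # 4K Performance -> (1920, 1080)
print(render_res(3840, 2160, "Quality"))      # 4K Quality -> (2560, 1440)
print(render_res(2560, 1440, "Performance"))  # 1440p Performance -> (1280, 720)
```

This confirms the pairings in the comment: 4K Performance and native 1080p share a render resolution, as do 4K Quality and native 1440p, so those comparisons isolate the upscaler's contribution.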

  • @Peanut-Butter-Banana-Rama
    @Peanut-Butter-Banana-Rama 6 месяцев назад

    Since I have a RTX 4060 in my G14 I don't have the luxury of running modern titles maxed out at 1080p and up, but I have been seeing a noticeable improvement modding DLSS 2 titles with DLSS 3 at 1080p, which is where the differences of DLSS will be more prominent. So far I've tried with Metro: Exodus and Hellblade: Senua's Sacrifice.
    I'm hoping it's not a placebo effect but would like to see a review on the matter to rule it out.

  • @misterinfinity4076
    @misterinfinity4076 Год назад +3

    The questions that arise from this are:
    1) Does the gain in visual quality when using DLSS Quality lie solely in the anti-aliasing technique used in native rendering? i.e., would comparing DLSS Quality to 4K native with DLAA give similar results to those in this video? I think not, but only a handful of games support DLAA.
    2) Does it matter? We've already concluded that "ultra" settings aren't worth it; isn't DLSS Quality (or even lower) essentially similar to that? You get marginally worse image fidelity (50% of the time) and a massive 30-40%+ fps boost.
    3) Is there any instance where, if you're forced (by the performance constraints of your GPU) to render below your monitor's native resolution, you shouldn't use DLSS and should instead rely on "dumb" GPU scaling, or even let your monitor do the upscaling? For example, if I have a 4K monitor but can only run the game at 1080p, is there a case where it's better to render at 1080p than to use DLSS Performance (internal 1080p rendering)? Building on this: if I can buy either a 4K monitor or a 1440p monitor, and money is not a concern, even though I don't have the hardware to run games at native 4K but can run native 1440p, why shouldn't I just get the 4K one and use DLSS to render at approximately 1440p? That would make the big old question of "who buys 4K when it's so hard to run" irrelevant (note that the price difference between a 4K screen and a 1440p one is much smaller than the difference between a 4K-capable and a 1440p-capable GPU, and a screen "ages" much more slowly than a GPU).

  • @ryancbarrett96
    @ryancbarrett96 Год назад +1

    I don't understand how upscaling from a lower resolution to a higher one can look better than the higher resolution running natively. That's like saying a Blu-ray player upscaling to 4K looks better than an actual 4K player.

  • @HandsomeAlex25
@HandsomeAlex25 1 year ago +4

    That's very interesting. I didn't expect DLSS or FSR to match and even beat the native rendering. I guess it's because the TAA is not a great anti-aliasing technology... and hey, I'll take free performance with a better image any time! Thanks for the detailed look!

    • @robertcarhiboux3164
@robertcarhiboux3164 1 year ago +5

      Native is better than TAA most of the time; TAA creates blurriness. I even prefer the jaggies to this technology.

    • @damara2268
@damara2268 1 year ago +5

      @@robertcarhiboux3164 most people hate the jaggies, including me.
      It just looks so bad that I can't stop looking at the jaggies moving around over edges instead of enjoying the game

    • @dante19890
@dante19890 1 year ago

      @@robertcarhiboux3164 You have to use native with TAA, otherwise you get a very unstable image

    • @robertcarhiboux3164
@robertcarhiboux3164 1 year ago

      @@damara2268 I hate jaggies, but TAA even more. I prefer MSAA x4 but it's less and less available.

    • @robertcarhiboux3164
@robertcarhiboux3164 1 year ago

      @@damara2268 TAA is blurry and even worse than jaggies. I prefer MSAA, but when a game doesn't have it I use DSR 4.0x when my GPU can handle it (with a minimum smoothing factor like 6%). I also really like the Hitman resolution upscale; it's the only game where it really improved the sharpness of the image with no particular downside. Even performance-wise it was better than DSR.
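For reference, the DSR factor mentioned above is an area multiplier, so each axis is scaled by its square root. A quick sketch of the arithmetic (the function name is just illustrative):

```python
import math

# DSR factors (e.g. 4.0x, 2.25x) multiply the pixel COUNT, so each
# axis is scaled by the square root of the factor before downscaling
# back to the display resolution.
def dsr_resolution(width, height, factor):
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)
```

So DSR 4.0x on a 1080p display renders internally at 3840x2160, which is why it is so heavy on the GPU.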

  • @L0rd_0f_War
@L0rd_0f_War 1 year ago +1

    Tim/HUB, thank you for this analysis. Question: should we simply update all DLSS 2 DLLs (in games with DLSS 2) to the latest available DLSS 2 DLL? Is there a downside? And what about issues like bird trails in RDR2 even on the latest DLSS DLL version? I couldn't get a satisfactory result in RDR2 at 4K max (using a 4090) with the latest DLSS 3.x DLL. There were still bird trails, and the texture quality just wasn't on par with native in some scenes (side-by-side image comparisons).

    • @lexsanderz
@lexsanderz 1 year ago +1

      I use DLAA in that title and thought it looked great. Maybe I just can't see so many artifacts.
      Preset F.

    • @L0rd_0f_War
@L0rd_0f_War 1 year ago

      @@lexsanderz I haven't tried DLAA in any game yet; I believe it requires a third-party tool to enable in most games. I'll check it out at some point.

  • @monsG165
@monsG165 1 year ago +18

    FSR seems to always have a shimmer going on. I have both setups (5900x 3060) and (5800x 6700xt), and the FSR shimmer is really obvious in some games like RE4.

    • @leroyjenkins0736
@leroyjenkins0736 1 year ago +4

      I agree; in The Last of Us, in the dark, it's horrible

    • @Youtubecensorsmesince2015
@Youtubecensorsmesince2015 1 year ago +12

      FSR still needs some work, but thanks to FSR I can game at 1440p on a 1060 6GB. Thanks AMD

    • @Eleganttf2
@Eleganttf2 1 year ago +4

      @@Youtubecensorsmesince2015 bro what lmao omg your poor 1060 is literally cryin

    • @tuanld91
@tuanld91 1 year ago +6

      @@Eleganttf2 bro he's just not a sheep like you

    • @biansanity
@biansanity 1 year ago

      @@Eleganttf2 bro stfu not everyone has daddy's money like you

  • @Arkstellar
@Arkstellar 1 year ago

    What's up with the weird stuttering with DLSS as the camera pans down at 6:43?

  • @EvLTimE
@EvLTimE 1 year ago +3

    I just pick 1440p DLSS Quality + sharpening all day over forced TAA in the latest titles where you can't even pick another AA method (CP2077, for example). DLAA is a good alternative

    • @mryellow6918
@mryellow6918 1 year ago +1

      Forced TAA is so fkn annoying; it's the reason I could never play the newer Battlefields past Hardline. Can't stand any type of AA, it's all just blurry

    • @EvLTimE
@EvLTimE 1 year ago

      @@mryellow6918 Main reason I upgraded from 1080p... simply because no matter which settings you use, you can't see anything. A blur mess vs a pixelated mess. Lately my friend and I tried The Division 1 and we were astonished by the AA implementation, at both 1080p and 1440p. I think we should blame devs for AA at this point.

  • @EdKenny
@EdKenny 1 year ago +1

    I'm interested in how the particles are rendered at different resolutions. Take the particles in the air in The Last of Us, for example: is the issue that the lower-resolution render before upscaling simply renders fewer particles, making this an expected outcome, or is the DLSS algorithm smoothing out some of the particles? It would be very cool if you could ask the game devs some questions about how they find working with DLSS versus native resolution.

    • @VectorGaming4080
@VectorGaming4080 1 year ago +1

      The game engine is likely only creating so many particles for a given internal rendering resolution. This is why setting the correct LOD bias is so important for DLSS and developers need to make sure that they allow DLSS to sample higher quality assets than what their internal resolution would have if it were native.
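For context on the LOD bias point: NVIDIA's integration guidance ties the texture mip bias to the ratio of internal render resolution to output resolution, so textures are sampled as sharply as the output warrants. A rough sketch of that relationship (function name is illustrative; the exact recommended formula in the SDK may include a small extra offset):

```python
import math

# A negative mip bias makes the engine sample sharper (higher-detail)
# mip levels than the lower internal resolution would otherwise pick,
# giving the upscaler more texture detail to work with.
def dlss_mip_bias(render_width, display_width):
    return math.log2(render_width / display_width)
```

For example, 4K Quality mode (1440p internal) gives a bias of about -0.58, so textures are sampled closer to how they would be at native 4K.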

  • @druout1944
@druout1944 1 year ago +3

    DLSS looks awful compared to native; you gotta be blind to think otherwise. This is a very clear (or blurry) example of the old fable "The Emperor's New Clothes".

  • @vgmms
@vgmms 8 months ago +2

    In Cyberpunk, DLSS or FSR looks very blurry when you walk or sprint, and when looking into the distance it looks odd, like the loss of image or texture detail makes the buildings look flat. Anti-aliasing is also very bad on fences or any pole in every game, while native has a very clear, detailed image but the performance drops 20%

  • @Bry.89
@Bry.89 1 year ago +3

    This just makes me hope that AMD eventually releases a version of FSR that uses AI to enhance image quality, similar to DLSS. Perhaps it might only work on newer AMD GPUs, and would otherwise fall back to normal FSR. But knowing AMD, it would also work on any GPU with ML-capable hardware. That would be the true "DLSS killer"

  • @ivaniveciqtests6085
@ivaniveciqtests6085 1 year ago +1

    You are really a great channel. But who told you that people don't play at 1080p any more?

  • @mirage8753
@mirage8753 1 year ago +4

    Performance upscaling is only worth using when you are badly lacking performance; there's no point in using it on high-end GPUs

    • @mirage8753
@mirage8753 1 year ago +2

      @@McLeonVP Do online games lack performance on high-end GPUs? People usually prefer to set lower settings there, so performance isn't an issue

  • @paulsef92
@paulsef92 1 year ago

    Random question: why does the video quality on this channel always look so good? Uploaded in a different format?

  • @jaroslavmrazek5752
@jaroslavmrazek5752 1 year ago +6

    It would be good to point out that the image differs a lot depending on the DLSS version, which can usually be replaced in a matter of minutes. For example, the DLSS in RDR2 is trash, but when replaced with version 2.5.1 it looks amazing.

    • @DavidFregoli
@DavidFregoli 1 year ago

      Unless there's a convenient method that scales for every owner (possibly GeForce Experience integration), or an always up-to-date table somewhere that matches game to DLSS version, this is an impractical solution that only a few enthusiasts will research. Nvidia hasn't even been able to stay on top of per-game ReBAR on/off settings, as shown by HUB, so there's not much to place trust in here.

    • @breeminator
@breeminator 1 year ago

      He covered that with that exact game in the final thoughts. It probably deserved a section of its own, as people don't expect new information in the final thoughts section.

    • @jaroslavmrazek5752
@jaroslavmrazek5752 1 year ago

      @@DavidFregoli There is an app called DLSS Swapper that makes it dead simple
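The manual version of what DLSS Swapper automates is just replacing `nvngx_dlss.dll` in the game's install folder. A minimal sketch, with a hypothetical helper that backs up the bundled DLL first so you can revert:

```python
import shutil
from pathlib import Path

def swap_dlss_dll(game_dir, new_dll):
    """Back up the game's bundled nvngx_dlss.dll, then copy in a newer one."""
    target = Path(game_dir) / "nvngx_dlss.dll"
    backup = target.with_name(target.name + ".bak")
    shutil.copy2(target, backup)   # keep the original so you can revert
    shutil.copy2(new_dll, target)  # drop in the replacement DLL
```

To revert, copy the `.bak` file back over the DLL. Note that some titles (particularly ones with anti-cheat) may reject a swapped DLL, and a game update can silently restore the old version.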

  • @Ti-JAC
@Ti-JAC 1 year ago +2

    A fake image is still a fake image. The GPU I have can either render the graphics the way they were meant to be experienced or it can't, without a fake stand-in. Sorry, but I'm sticking with native.

  • @crashtestdummy87
@crashtestdummy87 1 year ago +3

    watching this 4k comparison video on 720p youtube quality on a 1080p screen, gg me and probably 90% of viewers

  • @Hirens.
@Hirens. 1 year ago +1

    Love the content!
    Keep up the good work! 💪🏻

  • @Dionyzos
@Dionyzos 1 year ago +13

    Can we just take a moment and appreciate that this kind of upscaling is even possible? I still remember being blown away when DLSS 2 came out. It looked like magic to me honestly.

    • @GTFour
@GTFour 1 year ago +1

      It’s still magic to me tbh

    • @wertyuiopasd6281
@wertyuiopasd6281 8 months ago

      It's not magic ^^

  • @alumlovescake
@alumlovescake 1 month ago +1

    Do they affect VRAM at all? Even if it's slightly more or slightly less

    • @da3sii
@da3sii 1 month ago

      LESS VRAM

  • @marcelo19071998
@marcelo19071998 1 year ago +5

    Always update DLSS to the newest version and test whether it's better. Most of the time it will greatly improve image quality (there are exceptions though)

  • @me1259ify
@me1259ify 1 year ago

    Love the content and all you do. Low key it would be cool to see a Matt cameo soon 👀

  • @wynard
@wynard 1 year ago +4

    This tech has gotten so good lately that the very small issues you can see if you really look for them don't bother me.

    • @fernandochapa1433
@fernandochapa1433 1 year ago +1

      Fr I can play with Dlss performance and I dont mind the “issues” because of the FPS gains

  • @Morgan-iv4ye
@Morgan-iv4ye 1 year ago

    What is the outro song's name? The artist's name doesn't pop anything up on Spotify or YouTube, and Shazam isn't working

  • @David_Raab
@David_Raab 1 year ago +2

    I would assume that a native 4K image is better than an upscaled 4K image. The question is how much better (or worse) 4K Quality looks compared to native 1440p, because that's what 4K Quality renders at internally. Or 4K Performance vs native 1080p. So do you get a better image at 4K, even using Performance mode, compared to staying at native 1080p or native 1440p?

    • @cheese186
@cheese186 1 year ago +4

      4k quality looks a lot better than 1440p on a 4k screen, it's not even close

    • @WhiteCrowInteractive
@WhiteCrowInteractive 1 year ago +2

      In my experience, 4k quality always gives way better resolution of details compared to native 1440p. That is the best use case of dlss, getting pretty close to 4k visuals with 1440p performance cost. I wouldn't expect 4k dlss quality to be better than native 4k unless native has a bad aa technique in use.

    • @prman9984
@prman9984 1 year ago

      @@WhiteCrowInteractive Which is why I always turn AA off at 4K. At least for me it isn't necessary and it's like putting Vaseline on your camera's lens.

    • @lonerprofile
@lonerprofile 1 year ago +1

      You can't compare 4K DLSS Quality to 1440p native, as you get significantly less fps. To my eyes, 4K DLSS Balanced's effective resolution is about 1440p native. 1440p native gives more fps than 4K Performance WITH a better image. Likewise, 4K Performance is not equivalent to upscaled 1080p; that's a marketing scam. At least that's what I've tested in Cyberpunk 2077.

    • @lonerprofile
@lonerprofile 1 year ago

      @@WhiteCrowInteractive Because in practice 4K Quality doesn't behave like an upscale from 1440p; it lands maybe around 1850p, and Ultra Quality about 2000p. These labels are a marketing scam. Compare the actual fps you get at 4K DLSS Quality versus native 1440p and you'll see the answer.
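For reference in this thread, the internal render resolutions behind the standard DLSS 2 mode names work out as below. The per-axis scale factors (Quality 66.7%, Balanced 58%, Performance 50%, Ultra Performance 33.3%) are the commonly documented ones:

```python
# Per-axis scale factors for the standard DLSS 2 modes.
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(display_w, display_h, mode):
    """Internal render resolution for a given output resolution and mode."""
    s = DLSS_SCALES[mode]
    return round(display_w * s), round(display_h * s)
```

So 4K Quality renders internally at 2560x1440 and 4K Performance at exactly 1920x1080, which is why comparisons pair those modes with native 1440p and 1080p; how much *effective* detail the upscaler recovers beyond that internal resolution is what the debate above is about.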

  • @robertstan298
@robertstan298 1 year ago

    @13:43 Pretty sure the rain shouldn't be THAT obvious; I don't remember white streaks falling from the sky when it rains, it's much more subtle than that. I think you're off there; there's this thing called "too much of a good thing", and both upscalers seem to exaggerate the rain visuals.

  • @blazbohinc4964
@blazbohinc4964 1 year ago +3

    What you ought to test instead of this (which has been done a billion times by everyone) is 4K DLSS Performance vs native 1080p. That actually makes sense and shows how much more detail you can pull from a 1080p render. It would make for a much more interesting video. It would also be interesting to see how much performance it costs to upscale from 1080p to 4K compared to just rendering at native 1080p, and how large that toll is for a given GPU / GPU brand. Just a thought.

    • @johnhughes9766
@johnhughes9766 1 year ago

      This

    • @lonerprofile
@lonerprofile 1 year ago

      My test with a 6600 XT at 4K in Cyberpunk:
      4K Ultra: 18.96 fps
      4K FSR Quality: 24.14 fps
      4K FSR Balanced: 30.04 fps
      4K FSR Performance: 38.84 fps
      1440p Ultra: 54.35 fps
      1080p Ultra: 89.36 fps
      The claim that 4K DLSS/FSR Quality equals 1440p and Performance equals 1080p is a marketing scam: 4K Performance lands between 1440p and 1080p, and 4K Quality around 1850p. I'm talking about effective upscaled resolution here, not image quality.
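Taking the figures quoted above at face value (they are the commenter's own numbers, not independently verified), the relative gains work out as follows:

```python
# FPS figures from the comment above (Cyberpunk 2077, 6600 XT, 4K output).
fps = {
    "4K Ultra": 18.96,
    "4K FSR Quality": 24.14,
    "4K FSR Balanced": 30.04,
    "4K FSR Performance": 38.84,
    "1440p Ultra": 54.35,
    "1080p Ultra": 89.36,
}

def gain_over_4k_native(mode):
    """Percent fps gain of a mode relative to native 4K Ultra."""
    return round((fps[mode] / fps["4K Ultra"] - 1) * 100, 1)
```

By these numbers FSR Quality gains roughly 27% over 4K native, while native 1440p is nearly 2.9x faster than 4K native, which supports the commenter's point that 4K Quality costs far more than native 1440p.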

  • @vincentphilippot9562
@vincentphilippot9562 1 year ago +1

    This video makes me hesitate between a 7900 XT and a 4070 Ti for 3840x1620 gaming.
    Is the 7900 XT sufficient to play all games without FSR?
    I'm afraid that, over time, DLSS will become truly superior and that I will regret a 7900 XT.
    I need advice please!

  • @zsreich
@zsreich 1 year ago +3

    It's true in some games that don't have an anti-aliasing solution. For example, Nioh 2 on PC looks better because the game has no anti-aliasing and needs DLSS to clean up the image.

  • @DmitryCee
@DmitryCee 1 year ago +1

    Tim and Steve giving people the goodies! Good job!

  • @What_In_This_World
@What_In_This_World 1 year ago +3

    I have a theory about some of your test data, for Spider-Man for example. I wonder if the game engine and texture packs are only optimized for 1080p and 2160p. So when the game renders natively at 1440p, it may be using something like a checkerboard technique to bring the image up to 1440p, whereas DLSS upscales to 1440p from 1080p (Quality mode) using its mathematical algorithms. That would also explain why the native 4K presentation was better than DLSS upscaling from 1440p: the engine already adjusted the image from 1080p to 1440p before DLSS then upscaled from 1440p to 2160p. It could explain why, in Spider-Man and The Last of Us, 4K native was so much better in its fine details than DLSS/FSR while native 1440p was much closer to DLSS/FSR. 🤷‍♂️🤷‍♂️ Maybe this is something to look at in certain games: do certain resolutions have particular pros/cons, and do some of those go away with certain DLSS/FSR settings? Is this just a DLSS/FSR issue, or possibly a game engine issue? Do some developers just not take the time to properly tune 1440p textures and details, which would mean DLSS Quality mode starts from an already less optimized image?

  • @viniqf
@viniqf 1 year ago +29

    I think DLSS does not look better than native because, objectively speaking, there is no way for an upscaled image to look better than the original image.
    What DLSS does, however, is replace bad TAA implementations with its own anti-aliasing method, which is vastly superior. So even though it may look marginally worse, it will be easier on the eyes due to the better anti-aliasing.
    If your GPU can run native with no issues, I recommend either DLAA or DLDSR, if you have enough performance to spare.