FSR 3.1 vs DLSS 3.7 vs XeSS 1.3 Upscaling Battle, 5 Games Tested

  • Published: 6 Jul 2024
  • Support us on Patreon: / hardwareunboxed
    Join us on Floatplane: www.floatplane.com/channel/Ha...
    Buy relevant products from Amazon, Newegg and others below:
    GeForce RTX 4070 Super - geni.us/wSqSO07
    GeForce RTX 4070 Ti Super - geni.us/GxWGmYQ
    GeForce RTX 4080 Super - geni.us/80D6BBA
    GeForce RTX 4090 - geni.us/puJry
    GeForce RTX 4080 - geni.us/wpg4zl
    GeForce RTX 4070 Ti - geni.us/AVijBg
    GeForce RTX 4070 - geni.us/8dn6Bt
    GeForce RTX 4060 Ti 16GB - geni.us/o5Q0O
    GeForce RTX 4060 Ti 8GB - geni.us/YxYYX
    GeForce RTX 4060 - geni.us/7QKyyLM
    Radeon RX 7900 XTX - geni.us/OKTo
    Radeon RX 7900 XT - geni.us/iMi32
    Radeon RX 7800 XT - geni.us/Jagv
    Radeon RX 7700 XT - geni.us/vzzndOB
    Radeon RX 7600 XT - geni.us/eW2iWo
    Radeon RX 7600 - geni.us/j2BgwXv
    Video Index
    00:00 - Welcome to Hardware Unboxed
    01:39 - Performance Testing
    09:33 - FSR 3.1 vs FSR 2.2
    11:57 - Image Quality: Ratchet and Clank Rift Apart
    14:12 - Image Quality: Horizon Forbidden West
    17:05 - Image Quality: Ghost of Tsushima
    18:37 - Image Quality: Spider-Man Miles Morales
    19:55 - Image Quality: Spider-Man
    20:50 - 1440p Image Quality Comparisons
    22:35 - Quality vs Quality vs Quality
    25:23 - Final Thoughts
    FSR 3.1 vs DLSS 3.7 vs XeSS 1.3 Upscaling Battle, 5 Games Tested
    Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
    Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
    FOLLOW US IN THESE PLACES FOR UPDATES
    Twitter - / hardwareunboxed
    Facebook - / hardwareunboxed
    Instagram - / hardwareunboxed
    Outro music by David Vonk/DaJaVo
  • Science

Comments • 1.5K

  • @tenow
    @tenow 3 дня назад +706

    FSR3.1: Ghosting in Tsushima

    • @Kony_BE
      @Kony_BE 3 дня назад +10

      I laughed so hard. Nice one xD

    • @redbaronmars4176
      @redbaronmars4176 3 дня назад +50

      even though ghosting is bad, ngl it makes the sword look cooler, stronger, faster

    • @lucaschenJC
      @lucaschenJC 3 дня назад +12

      @@redbaronmars4176 hell nah

    • @lucaschenJC
      @lucaschenJC 3 дня назад +6

      @@redbaronmars4176 and stop liking your own comment

    • @user-hq9fp8sm8f
      @user-hq9fp8sm8f 3 дня назад +38

      @@lucaschenJC I liked his comment

  • @ginibehpunkt4702
    @ginibehpunkt4702 3 дня назад +605

    I swear the video titles 5 years from now are probably gonna look like this "DLSS 4.1.69 vs FSRSS vs. XeSS vs. PiSS - Can Dr.Peppers' upscaling technique save us from the GeForce 7660's $2000 entry level price tag?"

    • @waltuhputurdaway
      @waltuhputurdaway 3 дня назад +52

      PiSS lmao

    • @SM-mt8pz
      @SM-mt8pz 3 дня назад +16

      PiSS >>>>

    • @JVCFever0
      @JVCFever0 3 дня назад +67

      Picture Intact Super Sampling? Give me 14 to-go!!!

    • @maicee7603
      @maicee7603 3 дня назад +7

      Raspberry Pi Super Sampling?

    • @xpeterson
      @xpeterson 3 дня назад +28

      That PiSS really streamlines the flow through the pipeline.

  • @MattJDylan
    @MattJDylan 3 дня назад +410

    "FSR looking like my DMs: quality is decent, but in the end the ghosting ruins everything" - someone somewhere, probably

    • @ShizzleGaming14
      @ShizzleGaming14 3 дня назад +4

      Good one bro🤣

    • @RedSky8
      @RedSky8 3 дня назад +2

      😂 I'm taking that one.

    • @tranthien3932
      @tranthien3932 2 дня назад +2

      Underrated comment

    • @adi6293
      @adi6293 2 дня назад +1

      @@MattJDylan Did you use DLSS 2.0?

  • @maicee7603
    @maicee7603 3 дня назад +262

    man, it's hard for me to tell the difference in the quality comparisons without being told. I wonder if I'm not alone

    • @fatrat600284
      @fatrat600284 3 дня назад +57

      You're not, I can't see the difference and the differences are overblown.

    • @Kylethejobber
      @Kylethejobber 3 дня назад +9

      Same, I have a hard time telling the difference too

    • @sadman.saqib.zahin01
      @sadman.saqib.zahin01 3 дня назад +113

      Because you're seeing a video that has been butchered by youtube compression

    • @TeamTheAces
      @TeamTheAces 3 дня назад +26

      I can see the differences and they definitely help to understand how the tech stacks up against each other. But I will agree that if I'm actually playing the game and not measuring my peepee with graphs and slow-mos, I would find it borderline impossible to tell the difference between FSR 3.1 and DLSS, both at Quality. XeSS at Quality I could probably tell apart in certain games.
      This being at 4K of course; at lower resolutions I still think that all of the technologies look a bit wack, I'd rather upgrade than use upscalers at lower resolutions.

    • @kjellvb1979
      @kjellvb1979 3 дня назад +12

      As an IT professional for 25 or more years now, I agree. Well, if you go looking for the artifacts you can find them, but the quality difference across most games and tech these days is not like it was in the past. When you played Half-Life, or even the sequel, going from a low-end GPU to a mid-tier one, the change was pretty big, and if you went from low/mid to top tier, again the performance and graphical changes were significant enough that most untrained or uninformed people would pick up on them.
      I think on average the difference between low, mid, and high tier settings is much less dramatic in this current gaming era. When it comes to upscaling tech, most won't see the difference between 1080p and 1440p (okay, maybe some would, but if we talk 1440p to 4K, not many will see the change without slowdown or close inspection).
      Performance, now that is probably more noticeable to everyone, as playing at sub-60 fps and jumping to a consistent 60 or above can be a game changer, and most will notice the improvement in fluidity, unlike extra grass or slightly clearer textures. So as long as these FG technologies can provide a smoother feel without degrading the image so much that the average layman can tell the difference, I think it is a big win, especially for lower-spec gamers!

  • @hjge1012
    @hjge1012 3 дня назад +213

    The ghosting didn't really bother me, until you showed Ghost of Tsushima. My god, that's bad.

    • @kagander8619
      @kagander8619 3 дня назад +50

      Ghosting in that game is a deliberate design choice

    • @markhackett2302
      @markhackett2302 3 дня назад +25

      Did the frame-jumping DLSS walking animation also lead you to say "My god, that's bad"? It isn't common, DLSS *GENERALLY* is better, but it is forced to be a closed box so it can make you buy their graphics cards, so that should be a hard nope.

    • @adlibconstitution1609
      @adlibconstitution1609 3 дня назад +10

      ​@@markhackett2302 Look I have a Rx 7800xt PC and I don't use fsr quality. I only use raster because I can't stand that upscaler.

    • @TheSometimeAfter
      @TheSometimeAfter 3 дня назад +3

      @@kagander8619 no, it isn't 😂
      Source

    • @atirta777
      @atirta777 3 дня назад +17

      @@TheSometimeAfter A joke, that is. "Ghost" of Tsushima

  • @elizabethagudelo7179
    @elizabethagudelo7179 3 дня назад +87

    it's a shame they waited until now to switch to DLL packaging, because now there are a few years' worth of games stuck with an inferior upscaler unless the devs go back and patch them, but hey, at least that number won't grow anymore now

    • @TFSned
      @TFSned 2 дня назад +17

      Thankfully, if the game does support DLSS 2.0+ then you can use OptiScaler to translate it to FSR or XeSS.
      It only supports FSR 2.1.2 and 2.2.1 right now but there's a branch with FSR3 support for DirectX11 games.

    • @ironeleven
      @ironeleven 2 дня назад +1

      You'd hope. FF14 just updated with FSR *1.0* last week. I was baffled when I found out since the option isn't labeled with a version number.

    • @Morpheus-pt3wq
      @Morpheus-pt3wq 20 часов назад

      @@ironeleven considering the HW requirements for FF14, I wonder what's even the point of implementing FSR...

  • @HeretixAevum
    @HeretixAevum 3 дня назад +235

    18:23 Ghosting of Tsushima

    • @brunogm
      @brunogm 3 дня назад +5

      the best temporal AA method in Ghost is SMAA T2x

    • @trulsdirio
      @trulsdirio 3 дня назад +1

      That surely is a deliberate design choice?

    • @sudd3660
      @sudd3660 3 дня назад +1

      @@trulsdirio Who cares if it is a design choice if I do not like it?
      By the way, the sword swing effect is a feature, not a bug. Hope you like that feature then...

    • @Martin-wj6um
      @Martin-wj6um 3 дня назад +1

      Ghosts of Fukushima

    • @AbsoleteAim
      @AbsoleteAim 3 дня назад +1

      @@sudd3660 What are you going on about

  • @somnambulist6636
    @somnambulist6636 2 дня назад +12

    If you need to zoom in 3x and slow down to 1/4 speed to elucidate the difference, is there really any tangible difference while gaming in real time, I wonder

    • @thetruth5232
      @thetruth5232 22 часа назад +1

      The flickering, Ghosting and smearing of FSR is extremely noticeable in many games while playing. I switched to XeSS 1.3 whenever possible because it doesn't have these issues anymore, and the shimmering of XeSS 1.3 is much less noticeable.

    • @tonan8888
      @tonan8888 6 часов назад

      For me, the shimmering and flickering are the things I can always spot and it's really distracting. Also if there's overall blurriness. Ghosting is something I don't spot that much during gameplay, although the GoT example here was really bad for FSR.

  • @Szwagi
    @Szwagi 3 дня назад +32

    It'd be nice to see closeup comparisons to the native and render resolutions as well.

    • @brunogm
      @brunogm 3 дня назад +5

      DLAA, FSRAA, XeSSAA would be nice indeed!

  • @xxstandstillxx
    @xxstandstillxx 3 дня назад +198

    Is it just me, or are all of these games from Sony? Take that in for a moment: Sony... is giving gamers the most flexibility for upscaling

    • @JayMaverick
      @JayMaverick 3 дня назад +74

      Doesn't the PlayStation use an AMD chip for graphics? Surely that must be related.

    • @DarkPST
      @DarkPST 3 дня назад +54

      All thanks to Nixxes!

    • @xxstandstillxx
      @xxstandstillxx 3 дня назад +1

      @@JayMaverick I vaguely remember seeing reports that PlayStation has its own software in the PlayStation. I mean, we've been upscaling on game systems since at least the PS4 era, as I remember. At least in terms of not just scaling with pure math.

    • @AUZYE
      @AUZYE 3 дня назад +9

      @@xxstandstillxx yeah, the PS4 Pro has been using upscaling since 2015!

    • @adityadas5820
      @adityadas5820 3 дня назад +13

      @@xxstandstillxx that is checkerboard rendering. But FSR is being used in many titles this generation since all current gen consoles have AMD chips.

  • @AncientGameplays
    @AncientGameplays 3 дня назад +97

    I do understand the "performance" settings compared, but it makes no sense (at least to me) to compare 1440p-to-4K upscaling with DLSS and FSR and then use XeSS 1.3 in Balanced mode, which is upscaling from 1080p... that's a huge difference pixel-wise, hence why XeSS is not delivering better results
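
    For context, a minimal sketch of the pixel arithmetic behind this point, assuming the commonly quoted per-axis scale factors (DLSS/FSR Quality at 1.5x, XeSS 1.3 Balanced at 2.0x); the function and figures below are illustrative, not taken from the video:

```python
# Minimal sketch: internal render resolutions implied by per-axis upscaling
# factors (assumed values: DLSS/FSR Quality = 1.5x, XeSS 1.3 Balanced = 2.0x).
def render_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a given output size and per-axis factor."""
    return round(out_w / scale), round(out_h / scale)

OUT_W, OUT_H = 3840, 2160  # 4K output

for name, scale in [("DLSS/FSR Quality (1.5x)", 1.5),
                    ("XeSS 1.3 Balanced (2.0x)", 2.0)]:
    w, h = render_res(OUT_W, OUT_H, scale)
    print(f"{name}: renders at {w}x{h} ({w * h / 1e6:.2f} MP)")

# DLSS/FSR Quality (1.5x): renders at 2560x1440 (3.69 MP)
# XeSS 1.3 Balanced (2.0x): renders at 1920x1080 (2.07 MP)
# i.e. the Quality modes start from roughly 78% more input pixels.
```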

    • @Booth73
      @Booth73 3 дня назад +1

      I truly believe you're right. But with this new change Intel made to the scaling factors, I'm not sure that their Balanced setting isn't upscaling to 4K from 1440p

    • @Vikushu
      @Vikushu 3 дня назад +42

      Well, he normalized for performance in the initial comparison. It doesn't really matter how good it looks if it's not at the FPS you want, so he made sure that all the upscaling techniques were performing similarly first.

    • @GewelReal
      @GewelReal 3 дня назад +16

      isn't this equal performance testing?

    • @QuentinStephens
      @QuentinStephens 3 дня назад +2

      HUB have hated Intel Arc since day 1.

    • @Navi_xoo
      @Navi_xoo 3 дня назад +8

      @@QuentinStephens Any citations for that? Weird opinion.

  • @marktackman2886
    @marktackman2886 3 дня назад +395

    The vibe in the industry is that it's AMD vs Intel for the upcoming generation. Nvidia is pretending they don't have competition.

    • @elirantuil5003
      @elirantuil5003 3 дня назад +126

      Well..... do they?
      RDNA2 was a magical generation, RDNA3... not so much, RDNA4 is an RDNA3 bugfix.

    • @WayStedYou
      @WayStedYou 3 дня назад +24

      Well, if AMD is only making a cheaper-to-build part that lands around the 7900 XT/XTX, they won't have any competition in the high end, and Intel will need a massive power draw reduction to be anywhere near AMD's and Nvidia's previous-gen cards

    • @KrazzeeKane
      @KrazzeeKane 3 дня назад +37

      I mean, Nvidia isn't going to fight until they have a reason to. Intel and AMD pose zero challenge at the top end of the GPU market, and the upcoming GPUs seem to be the same in that AMD isn't going to try to fight the 5090 for supremacy, and AMD has already said as much. So nothing is changing anytime soon, bud

    • @RadialSeeker113
      @RadialSeeker113 3 дня назад +44

      What makes you think that's the vibe, lol? Maybe you're into underdog competition. But Nvidia is doing better than ever and looks to take the next-gen performance and software crown with their AI advancements.

    • @Ted_Kenzoku
      @Ted_Kenzoku 3 дня назад +36

      @@KrazzeeKane who cares about the very high end, most people buy way cheaper GPUs

  • @broose5240
    @broose5240 3 дня назад +176

    I'm glad AMD supports older Nvidia cards that Nvidia does not support

    • @yarost12
      @yarost12 3 дня назад +18

      The cutoff isn't that different though, you aren't going to be playing any recent FSR3.1 games on your RX570 (that's no longer supported by AMD).

    • @user-wq9mw2xz3j
      @user-wq9mw2xz3j 3 дня назад +20

      AMD abandoned the R9 390, meanwhile Nvidia only lately abandoned the GTX 700 series and still supports the 900 series

    • @broose5240
      @broose5240 3 дня назад +22

      @@yarost12 well, a $1200 GPU like the 3090 not being supported feels like being ripped off, compared to a 10-year-old $160 GPU

    • @noobgamer4709
      @noobgamer4709 3 дня назад +1

      @@user-wq9mw2xz3j We will see when Nvidia develops a new arch instead of using the decade-old CUDA core.

    • @timshel1429
      @timshel1429 3 дня назад +1

      Especially since the 4070 used here might become "older" at the end of the year...

  • @matthews2243
    @matthews2243 3 дня назад +25

    I'll be honest, while you can find these artefacts when looking for them, all the technologies are good enough that I'd be able to just play the game without it distracting me

    • @nathanpose8607
      @nathanpose8607 3 дня назад +13

      Yeah I agree. To me these upscaler comparisons read a lot like audiophile speaker reviews.

    • @WrexBF
      @WrexBF 3 дня назад +1

      copium.

    • @matthews2243
      @matthews2243 3 дня назад +3

      @@WrexBF You keep using that word, but I do not think you know what it means. (On a less memey note, I don't have a horse in this race, so why would it be copium?)

    • @Navi_xoo
      @Navi_xoo 2 дня назад

      Why would you take something inferior when something superior exists? This is nothing like an audiophile snob problem; there's a reason each company, and now Sony, are all spending tens of millions to develop and market their own upscalers.
      In the future you will be right, but for now there are simply inferior and superior ones. They all have their own issues though.

    • @magnomliman8114
      @magnomliman8114 2 дня назад +1

      copium.

  • @TroubleChute
    @TroubleChute 3 дня назад +11

    Great comparison

    • @alargecorgi2199
      @alargecorgi2199 2 дня назад

      I am confused when they explain XeSS quality and DLSS quality running at a higher resolution. If that's the case why not compare DLSS/FSR to the same resolution that XeSS is using, which is ultra quality at 1080p. It doesn't make sense to use "XeSS Quality" when they are fundamentally using different data sets.

  • @Morpheus-pt3wq
    @Morpheus-pt3wq 3 дня назад +59

    I miss the times, when GPUs were able to run games well in native resolution...

    • @mryellow6918
      @mryellow6918 3 дня назад +7

      they still do.

    • @Morpheus-pt3wq
      @Morpheus-pt3wq 3 дня назад +4

      @@mryellow6918 at what cost, tho?

    • @VeiLofCognition
      @VeiLofCognition 3 дня назад +17

      This is a problem, you're correct. Upscaling is a garbage bandaid tech.

    • @mojojojo6292
      @mojojojo6292 3 дня назад +6

      They still do. Except now you can use mid range cards like a 4070 or 7800xt to play at 4k with decent frames. I'd sooner do that using up-scaling rather than play 1440p native.

    • @paulc5389
      @paulc5389 3 дня назад +1

      At least it's over. Next generation they have to give actual performance increases and can't hide behind software whilst charging the same price to performance at native. Hopefully.

  • @pinoib0iify
    @pinoib0iify 3 дня назад +5

    Thank you for testing with the different cards. A lot of other folks doing these reviews are only using an Nvidia card.

  • @puertadlm163
    @puertadlm163 3 дня назад +4

    Been waiting for this video.

  • @donh8833
    @donh8833 3 дня назад +2

    As the eye catches much less detail on fast-moving objects, increasing temporal sampling makes sense if it reduces shimmer, which is more noticeable.

  • @criminalle88
    @criminalle88 3 дня назад +224

    While i understand the intent, i believe "performance normalized settings" was a mistake. The intent of the video was to compare the improvement of fsr 3.1. How it compares to older versions of itself and how it stacks up against its current competition. Showing us comparisons when we know that each upscaler is not using the same internal resolution almost defeats the purpose of the breakdown. Comparing one upscaler pulling from 1080p to one pulling from 1440p feels pointless. I understand your reasoning but I think it would have been better to just show the improvements and compare them at each similar internal resolution, then after that highlight the different fps performance at each setting. Anyhow love the work, long time viewer.
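
    As an illustration of the "performance normalized" selection being debated here, a small Python sketch: for each upscaler, pick the mode whose fps uplift over native is closest to a reference uplift, rather than the mode with the same name. All fps numbers are made-up placeholders, not measurements from the video:

```python
# Sketch of performance-normalized mode selection. The fps values below are
# placeholders for illustration only.
NATIVE_FPS = 50
MEASURED = {
    "DLSS": {"Quality": 70, "Balanced": 78, "Performance": 88},
    "FSR":  {"Quality": 64, "Balanced": 72, "Performance": 82},
    "XeSS": {"Quality": 62, "Balanced": 70, "Performance": 80},
}

# Use DLSS Quality's uplift as the reference point (1.4x with these numbers).
reference_uplift = MEASURED["DLSS"]["Quality"] / NATIVE_FPS

def closest_mode(modes: dict[str, int], target: float) -> tuple[str, int]:
    """Return the (mode, fps) whose uplift over native is nearest to target."""
    return min(modes.items(), key=lambda kv: abs(kv[1] / NATIVE_FPS - target))

for upscaler, modes in MEASURED.items():
    mode, fps = closest_mode(modes, reference_uplift)
    print(f"{upscaler}: compare using {mode} ({fps / NATIVE_FPS:.2f}x uplift)")

# With these placeholder numbers the comparison set becomes DLSS Quality vs
# FSR Balanced vs XeSS Balanced, i.e. different internal resolutions, which
# is exactly the mismatch the comment above objects to.
```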

    • @SiberdineCorp
      @SiberdineCorp 3 дня назад +24

      Upscaler performance also depends on your GPU generation and GPU class.
      You might get completely different performance results on a 4090, 2060 Super, 1080ti, 5700xt, 2080 ti etc.

    • @sommyaruproy8405
      @sommyaruproy8405 3 дня назад +22

      Yeah man completely agree

    • @adeptuspotatocus6451
      @adeptuspotatocus6451 3 дня назад +39

      Yeah. Comparing DLSS Quality with FSR Balanced feels wrong. Of course DLSS is going to look better.

    • @Echo-Head
      @Echo-Head 3 дня назад +42

      He basically did that near the end of the video, comparing the "Quality" modes across upscalers, did you actually watch the video? Even in that case, each upscaler uses slightly different internal resolution targets, so you're never going to get a true apples to apples comparison. Comparing them based on actual performance makes sense, as the whole reason these upscaling technologies exist is to provide improved performance while minimizing image quality loss.

    • @musguelha14
      @musguelha14 3 дня назад +21

      Of course it makes sense, the only reason to use upscalers is for performance so normalizing for that is what makes the most sense.

  • @paulboyce8537
    @paulboyce8537 3 дня назад +27

    XeSS XMX on Arc vs FSR 3.1 vs DLSS 3.7 would be interesting. With Battlemage just around the corner, this test would make sense to give some idea of what to expect. High-end Intel CPU and an A770.

    • @superamigo987
      @superamigo987 3 дня назад +8

      Why would you need to use a high end Intel CPU for this test? This is a GPU test, so a 7800x3d would technically barely bottleneck less. I do like the idea of more XMX XeSS tests though, everyone seems to just test the DP4A path

    • @paulboyce8537
      @paulboyce8537 3 дня назад

      @@superamigo987 Because there is performance gap using AMD CPU. It is noticeable gain with INTEL CPU. Also XMX is INTEL specific for ARC.
      Two different architectures. AMD/Nvidia you can say don't use CPU much at all. INTEL splits the tasks with CPU and it matters. Hence REBAR for the data transfer and use of E cores. Lot less waiting and queuing for lot better performance. CPU is vital. Two silicone's working together. That's why the ARC is very affordable because it is the selling point for the INTEL CPU. Also a reason that I see for Nvidia looking into CPU's also. With INTEL every FPS counts. AMD/Nvidia need almost twice the FPS for same quality of performance. For example 40FPS on 4k with INTEL very playable. AMD/Nvidia need at least twice the FPS because the waiting and queuing with single silicone creates gaps and stutter to be filled. FPS from INTEL is not the same as it is from AMD/Nvidia that share similar architectures. Then there is Intel Application Optimization (APO) paired with INTEL high end CPU that is upcoming and promise 10-50% gain. So if you are looking something that will age like fine wine it is INTEL.

    • @radosuaf
      @radosuaf 3 дня назад +5

      @@superamigo987 It's because almost no one uses Arc GPUs. But it should be added as a 4th option here.

    • @PeterPauls
      @PeterPauls 3 дня назад +2

      @@radosuaf I hope Battlemage will bring us good performance so it's not left out of these comparisons.

    • @radosuaf
      @radosuaf 3 дня назад

      @@PeterPauls First people will have to buy these. Arc ones are not very well priced.

  • @tstager1978
    @tstager1978 2 дня назад +3

    Pixel peeping is fairly pointless. What really matters is what actual gameplay looks like!

    • @kevinerbs2778
      @kevinerbs2778 2 дня назад

      No one cares; they'll never do a proper scientific method to prove which is better. They're just going to keep doing this because it's all everyone cares about.

  • @-T--T-
    @-T--T- 3 дня назад +128

    That thumbnail is hilarious.
    I just hope Tim isn't peeping through _my_ windows with his 'binoculars'.

    • @MuhlisErtugrul
      @MuhlisErtugrul 3 дня назад +20

      His 'binoculars' have interchangeable DLSS/FSR/XeSS upscaling, so you better have curtains on your windows lol

    • @-T--T-
      @-T--T- 3 дня назад +3

      @@MuhlisErtugrul 🤣*
      *That's actually a brilliant idea though - I'm sure someone could make digital binoculars where the built-in Ai can magnify and upscale the image beyond the raw capabilities of the max zoom level of physical hardware (lenses, mirrors, etc).

    • @MuhlisErtugrul
      @MuhlisErtugrul 3 дня назад +2

      @@-T--T- Yeah you're right :D That's a cool idea. We have AI upscalers like Topaz AI but a real-time upscaling (with machine learning tech like DLSS) through lenses like binoculars would be very interesting.

    • @juGGaKNotEmpire
      @juGGaKNotEmpire 3 дня назад

      Leave him alone. He's mastered. Aren't you TIroMe ?

    • @Mans1884
      @Mans1884 2 дня назад

      Trossard

  • @RFC3514
    @RFC3514 2 дня назад +11

    FSR seems to try to preserve a bit more detail, but the result is noisier. DLSS is better at hiding that noise, but at the cost of making some areas blurrier. If you pause, you can see that difference, but in motion and without zooming in, it's virtually unnoticeable.
    I don't think reviewers should be focusing so much on 2D fakery (spatial and temporal interpolation) anyway. Manufacturers seem to have successfully diverted attention to that, and away from the fact that the current generation is overpriced and barely any faster than the previous one at *actually rendering 3D scenes.*
    Likewise, in a lot of games the "raytracing" option seems to be just a switch that makes Nvidia cards slightly slower and other manufacturers' cards a lot slower, to change the "winner" of a benchmark with barely any noticeable change in image quality (in some games, RT actually makes shadows look a lot worse). Who wants to use 2x the amount of power, produce more heat, more fan noise, and _lose_ some FPS in exchange for (supposedly) more accurate reflections on irregular surfaces, that don't even look _nicer?_
    And reviewers / journalists keep falling for it, and publishing two, three, or sometimes even more versions of the _same_ benchmark, which doesn't really help anyone except the manufacturers, by diverting attention from the lack of real 3D rendering performance improvements, considering the increase in price.

    • @brunogm
      @brunogm 2 дня назад

      One question on RT is why the path tracing result is not used to optimize the raster light probe positions in cases of light bleeding and artifacts.

  • @slapnut892
    @slapnut892 3 дня назад +6

    It doesn't matter which upscaling method you use; none of them is an excuse for poor optimisation.

  • @BrightPage174
    @BrightPage174 3 дня назад +71

    12:30 Surprised you didn't mention that weird warping DLSS is doing instead of ghosting. Like it's splotching the ghosting out. Oddly enough, it seems like FSR has a clearer picture here despite the ghosting, due to it not being blurred

    • @EveryGameGuru
      @EveryGameGuru 3 дня назад +14

      I don't see any warping from DLSS 🤔

    • @GewelReal
      @GewelReal 3 дня назад +27

      @@EveryGameGuru AMD fanboys are coping

    • @seaneriksen2695
      @seaneriksen2695 3 дня назад +22

      @@GewelReal Nvidia fanboi is salty :P

    • @joaquinbigtas1396
      @joaquinbigtas1396 3 дня назад +12

      @@seaneriksen2695 Salty for what? Being the best lol

    • @gd3vp
      @gd3vp 3 дня назад +18

      @@joaquinbigtas1396 being the best at spending $2000 on GPUs that have the same performance as $800 ones

  • @mleise8292
    @mleise8292 2 дня назад +1

    Nice zoomed in shots. This is the best video I've seen for a comparison so far.

  • @samov497
    @samov497 3 дня назад +2

    Great deep dive on the comparisons, and I'm glad that the big three are pushing the longevity of cards further. I have noticed the ghosting with FSR 3.1 in HFW but was able to reduce its perceivability by reducing motion blur. Overall, I still feel hesitant to recommend graphics cards to friends based on image upscaling features, because not everyone is willing to test to find the most optimized settings possible. With the state and quality of current implementations, it's still a nice-to-have but not a must-have. This may change as it gets implemented across more titles.

  • @brokemono
    @brokemono 3 дня назад +18

    Good progress AMD but gotta fix the ghosting.

  • @hadesflames
    @hadesflames 2 дня назад +20

    Me not being able to tell the difference between any of them even on still frames:
    Whichever is cheapest.

  • @thehorsefromGOTs8
    @thehorsefromGOTs8 2 дня назад +1

    it kinda just seems like they slapped a layer of TAA over the top of FSR and called it a day.

  • @thestrykernet
    @thestrykernet 2 дня назад

    Thanks for keeping up with the comparisons between the upscaling technologies; it's extremely helpful when determining what to use and when. I would love to see an image quality comparison with XeSS in XMX mode to see what the differences are, though I imagine it's best to wait for Battlemage to do that.

  • @pegadpal
    @pegadpal 3 дня назад +3

    Thanks, great video. I hope to see a UE5 games test video soon.

  • @rellikai945
    @rellikai945 3 дня назад +5

    I'm just wondering how long it's going to take nvidia to start charging a monthly subscription for dlss

  • @MariosAdamidis
    @MariosAdamidis 3 дня назад +1

    It's very pleasant to see that the universally compatible offerings from red and blue are greatly improved compared to their initial versions. I believe that everyone's problem would be resolved if AMD released an XMX-equivalent version of FSR like Intel did, to please their owners. Although my gut feeling is that they may first release something that will run on their shiny new NPUs. I really hope that's not the case and that they promptly provide updates to FSR 3 that will make the 7000 series more compelling, at least in terms of upscaled visual quality.

  • @waveLINEx
    @waveLINEx 2 дня назад +1

    DLSS also has different preset options you can swap between using DLSSTweaks. It's a great way to tune out ghosting or to increase the softness. In Death Stranding DC, DLSS 3.7.10 favors preset "C" due to the decreased ghosting, even though for most games I've tested or seen, preset "E" is considered the most performant and is the default.
    I'm not sure if either FSR or XeSS have preset options, but it's another layer to team green's cake that I enjoy.

  • @xKyleNxCoD
    @xKyleNxCoD 3 дня назад +4

    This video would have been perfect if you included the native AA.

    • @JackWse
      @JackWse 22 часа назад +1

      The closest you're going to get to native AA in most of these games is DLAA; native plus temporal anti-aliasing is essentially native, as the temporal AA doesn't really incur a significant cost. What frustrates me is looking at any of these solutions as performance enhancements.. if anything that's a bonus, as the real benefit of the machine learning super sampling is that it does anti-aliasing in motion without many of the significant drawbacks of traditional TXAA or just temporal AA in general.. This is why FSR is such a joke as far as I'm concerned..
      Depending on the resolution you're up sampling from, you end up with worse performance in motion than traditional TXAA depending on how it has been implemented.. and on top of that, the absolutely brilliant contrast adaptive sharpening that they have which is usually the perfect solution for TAA blurriness, it's already integrated into the package, and usually in a terrible way.
      I used to have issues with DLSS, but I didn't realize is that in-game LOD settings were being hyper sharpened so that you could see them in all their blurry glory, not dissimilar to how TXAA is implemented in a lot of best practices, which is silly.. nothing feels like gas lighting like the entire world being blurry until you stop and look at it..
      Once I fix the LOD settings, the performance in motion with transparencies and particle effects, it's like going back to the old days in terms of clarity in motion.. throw a light reshade of CAS on top of DLSS or DLAA and you got about as close as a modern game can get to the clarity that we used to have as normal.. depending on if you can fix the LOD settings lol as they are usually trash.. I swear developers... I don't think they have very good eyes lol.

    • @xKyleNxCoD
      @xKyleNxCoD 16 часов назад

      @@JackWse nice post my dude.

  • @sensei_...
    @sensei_... 3 дня назад +12

    At least at 4K high settings, across all the games I have played so far, I could never tell the difference between 4K native and 4K with FSR Quality

    • @Dempig
      @Dempig 3 дня назад +2

      I can easily tell the difference. Even dlss at quality looks bad to me at 4k. I play about 5 ft away from a 65" screen though. DLSS makes everything soft and blurry, fsr has ghosting and a LOT of visual noise and shimmering

    • @bartbroekhuizen5617
      @bartbroekhuizen5617 3 дня назад +2

      @@Dempig I experience the same thing you mentioned. Once you see it, it can't be unseen. I would like to have a High Quality Setting between Native and Quality. Its just not good enough to switch to DLSS / FSR / XeSS. Graphics is king, especially with solo games like Horizon or Rachet & Clank where FPS and reaction is less important compared to games like Call of Duty or other E-Sports titles.

    • @sensei_...
      @sensei_... 3 дня назад +1

      @@Dempig 4K at 65" is a huge difference compared to 4K at 27"

    • @itisabird
      @itisabird 3 дня назад

      65 inches at 5 feet! That has to be really immersive. Are you able to see the edges of the screen at that distance without turning your head?

    • @Dempig
      @Dempig 3 дня назад +1

      @@itisabird Yep it about perfectly fits my viewing range, I will eventually go bigger I love large screens.

  • @jensonbrudy9826
    @jensonbrudy9826 3 дня назад +1

    Tried FSR 3.1 with Spider-Man on Steam Deck, it actually worked pretty well, just have to manually lock GPU at 1600MHz, and have a 60+fps frame rate cap

  • @bartbroekhuizen5617
    @bartbroekhuizen5617 3 дня назад +1

    Nice video, still I would like to see a native vs upscaled battle at this level of detail.

  • @Multimeter1
    @Multimeter1 3 дня назад +3

    I am more interested in native + frame generation, why is there barely any coverage...

  • @vuurdraak-
    @vuurdraak- 2 дня назад +3

    I have no clue at this point how this stuff is being tested, or any feeling that it is fair, because... DLSS only runs on an Nvidia card (so any testing of it is exclusively on Nvidia). Is FSR in the same game then also exclusively being run on a Radeon 7900 XTX, since as we know by now it will show the best it can do on a Radeon card? While XeSS is run at the same time on the best Xe Intel GPU, as it runs best there? Or is Nvidia being given the best of both worlds and everything is run side by side on an RTX 4090???

    • @waltuhputurdaway
      @waltuhputurdaway 2 дня назад

      FSR has no visual difference on an Nvidia card or a Radeon card.

  • @okp247
    @okp247 2 дня назад

    Great review! Can't say I can see much difference between the three technologies now, which is nice. I only use FSR on the laptop for select games, so hoping this update can improve that experience a bit 😀

  • @Mirage_Unknown
    @Mirage_Unknown День назад

    FSR, DLSS, and XESS are so different in how they tackle things that it would be interesting in the future to use them as image and video filters for a unique look or even combining them to make a custom filter.

  • @Konrad-z9w
    @Konrad-z9w 3 дня назад +10

    What about "Lossless Scaling" available on steam? It claims to work on any game with any gpu, even older GTX cards.

    • @Cooler-wh4rg
      @Cooler-wh4rg 2 дня назад +2

      It's good but it isn't perfect

  • @Madmeerkat55
    @Madmeerkat55 3 дня назад +79

    Have been wondering about this a ton lately as I consider AMD vs NVIDIA to replace my dying 1080! Thanks guys!

    • @Jaster0303
      @Jaster0303 3 дня назад +14

      W GPU! Keep it, it'll be a nice memory

    • @douglasmurphy3266
      @douglasmurphy3266 3 дня назад +12

      Dying how? Does it just need a thermal paste refresher?

    • @adi6293
      @adi6293 3 дня назад +12

      What resolution are you playing at? I'm at 1440p and went from a 3080 to a 7900 XTX and it's awesome

    • @RadialSeeker113
      @RadialSeeker113 3 дня назад +12

      If you can afford the 5080, get it at the end of this year.

    • @ajjohnston78
      @ajjohnston78 3 дня назад +3

      @@adi6293 I did the same and it was a great jump, but I still like the polish of Nvidia products when it comes to drivers and programs. I'll probably go back to Nvidia when the 6000 series comes out, or whatever is after the 5000 series this year.

  • @RmX.
    @RmX. 2 дня назад +1

    23:16 I kinda like the FSR "light through the leaves" here, it's not a bug, it's a feature ;)

    • @waltuhputurdaway
      @waltuhputurdaway 2 дня назад

      I promise you wouldn't like the look of it in game

  • @SkandsKvist
    @SkandsKvist 13 часов назад

    All three look pretty good now. You might not notice without zooming in and slow motion.

  • @danielangelov91
    @danielangelov91 3 дня назад +4

    I would love to see 4K native next to the three. Also, I wonder how much of a problem some of the FSR 3.1 issues really are, since I don't look at the monitor at 300% zoom and I don't play at 25% speed. I know you've added all that for the sake of the video, I'm not being harsh on you.

    • @MLWJ1993
      @MLWJ1993 3 дня назад +2

      You're watching a compressed YouTube video here, which subdues most upscaling artefacts by turning them into compression artefacts. Zooming in is basically necessary to SHOW you how that'd look on uncompressed footage (i.e. your display's output).

    • @itisabird
      @itisabird 3 дня назад +1

      I just finished playing Ghost of Tsushima at 4K with FSR enabled. During the whole 60 hours of gameplay, I experienced ghosting more than once per hour. Usually it was a relatively subtle artifact that disappeared in around 1 second. Sometimes, around 10 times in total (over 60 hours), the artifact was severe enough that I had to stop playing for a couple of seconds because it interfered with my vision. I knew it was caused by FSR, but it didn't bother me enough to turn it off. The game looks spectacular anyway.

  • @miknew20
    @miknew20 3 дня назад +29

    I don't understand the performance-normalized idea. Generally, while using FSR, most will just use Quality for the little extra boost to FPS while maintaining image quality.

    • @LukewarmEnthusiast
      @LukewarmEnthusiast 3 дня назад +5

      Yeah, that entire section seemed pointless. I was actually going "WTF" until I saw that comparing quality across the board was in a section coming up.

    • @Shieftain
      @Shieftain 3 дня назад

      It seems especially pointless since most will not notice the slight performance losses between the different upscalers thanks to displays with adaptive sync, but will most likely notice the big improvements in overall image quality.

    • @Maxoverpower
      @Maxoverpower 3 дня назад +3

      It's not pointless because it's inherently an fps-increasing technology, and one that games are increasingly relying on. A 40% performance boost is preferable to a 20% performance boost 5:39. Just going blindly with the same name across different upscalers and only looking at the quality is going to give you a skewed image of the value they provide. With 4K, which is largely what the video is aimed for and where upscalers are the most relevant, that additional performance can make a big difference.

    • @WrexBF
      @WrexBF 3 дня назад +1

      The section comparing the highest quality of those upscaling technologies is in the video. So, what are you mad about?

    • @Mikri90
      @Mikri90 2 дня назад

      @@Maxoverpower I don't have an issue with normalizing, it makes sense here, I am just confused as to why they decided to use FSR Balanced in Horizon Forbidden West since, by their own charts, FSR Quality performs as good as DLSS Quality.
      It's at 4:35. The difference is one frame in the AVG framerates, and the 1% lows are identical. This is as within the margin of error as possible.
      Shouldn't that particular example constitute the exact same performance uplift?

  • @jamieferguson935
    @jamieferguson935 2 дня назад

    I was seeing some stationary luminosity noise in the FSR I hadn't noticed before.

  • @thkhoa8805
    @thkhoa8805 3 дня назад

    It may sound weird, but using a controller on PC helps me in some ways to notice fewer artifacts, ghosting and shimmering when using FSR 3.1 and XeSS 1.3

    • @sodapopinksi667
      @sodapopinksi667 3 дня назад +1

      It's not weird, especially with AFMF. AFMF is disabled with fast motion. Using a controller to move the camera alleviates that.

  • @Polotoed
    @Polotoed 3 дня назад +6

    So what is XeSS Ultra Quality?

    • @Chasm9
      @Chasm9 3 дня назад +8

      XeSS scaling factors in 1.3 versus earlier versions:

      Preset             XeSS 1.3 scaling   XeSS 1.0-1.2 scaling
      Native AA          1.0x (Native)      N/A
      Ultra Quality      1.3x               N/A
      Ultra Quality      1.5x               1.3x
      Quality            1.7x               1.5x
      Balanced           2.0x               1.7x
      Performance        2.3x               2.0x
      Ultra Performance  3.0x               N/A
      🤔
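
      For anyone who prefers resolutions to ratios, a small sketch converting those per-axis factors into 4K render resolutions; the factors are taken from the table as posted above (treat them as the commenter's figures, not official documentation):

```python
# Convert the per-axis XeSS 1.3 factors listed above into render resolutions
# at a 3840x2160 output. Factors are copied from the comment's table.
XESS_13 = [
    ("Native AA",          1.0),
    ("Ultra Quality",      1.3),
    ("Ultra Quality",      1.5),
    ("Quality",            1.7),
    ("Balanced",           2.0),
    ("Performance",        2.3),
    ("Ultra Performance",  3.0),
]

OUT_W, OUT_H = 3840, 2160

for preset, scale in XESS_13:
    w, h = round(OUT_W / scale), round(OUT_H / scale)
    print(f"{preset:<17} {scale:.1f}x -> {w}x{h}")

# e.g. 1.7x (Quality) -> 2259x1271, 2.0x (Balanced) -> 1920x1080,
# 3.0x (Ultra Performance) -> 1280x720.
```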

    • @Polotoed
      @Polotoed 3 дня назад

      @@Chasm9 Thanks! 2 questions: you wrote Ultra Quality twice (1.3x and 1.5x) - what's the difference between them? And secondly: what are the scaling factors of DLSS Quality and FSR Quality? Ty, my wholesome potato in shining armour :D

    • @krspy1337
      @krspy1337 3 дня назад +2

      67% res scale, just like FSR Quality and DLSS Quality modes

    • @exscape
      @exscape 3 дня назад +1

      Same render resolution as DLSS Quality and FSR Quality. They changed the render resolutions recently and so the old "Quality" became "Ultra Quality".

    • @Polotoed
      @Polotoed 3 дня назад

      @@exscape Thanks, guys!

  • @Timmysan
    @Timmysan 3 дня назад +27

    Glad FSR is making good progress. This tech works on virtually every (gaming) GPU, so it's already leagues better by giving almost everyone this tech, mate. Even if it looks less good, in my opinion it's already a more valuable tech to further develop. Not some gimmick only working on the latest hardware, and DLSS 4 will probably work only on RTX 50 because Nvidia designs tech this way just to create an incentive to buy their newest GPUs; same story with the ridiculous VRAM amounts on GPUs costing north of 800 bucks. 16 GB on a 1100 euro GPU is a disgrace... CP2077 already (almost) uses that amount at 4K with all max settings. No matter how fast the VRAM is, if it's not enough you get stutters and low FPS. So let's focus on things that help gamers as a whole instead of bashing tech that gets better and better and is helping the entire (PC) gaming scene! It's just getting a tad boring, most channels bashing FSR/XeSS. You're paying 100-300 bucks more for DLSS, so you should EXPECT it to be better. At the same price point, with AMD you get 2 to 8 GB more VRAM, and drivers and performance are actually really competitive.

  • @Paulie8K
    @Paulie8K 3 дня назад +1

    Great breakdown. I thought it looked decent in my testing but then again I'm not pausing and zooming in 3X to look for issues. Though, I have a 3080 so I'll continue to use DLSS unless it's not an option in a game.

  • @AvroBellow
    @AvroBellow 2 дня назад

    I look at this and I still say that upscalers are like televisions. The only way to make one look even remotely bad is to have it next to a superior model. If you only have one to use, you'll be perfectly happy with it because, at the end of the day, the impact that they'll have on your experience is very slight if they have any impact at all.

  • @kellyjr0918
    @kellyjr0918 3 дня назад +5

    I remember I used to use XeSS on CoD because FSR and DLSS used to have this annoying artifacting in the waiting lobby; it wasn't noticeable in game but it used to drive me crazy

  • @KoudZ
    @KoudZ 3 дня назад +7

    Am I the only one who thinks that upscaling technology should be standardized for all?

    • @WrexBF
      @WrexBF 3 дня назад +1

      yes by not existing.

    • @UrbenLegend
      @UrbenLegend 2 дня назад

      No, it's just that good upscaling is highly dependent on vendor specific support for AI at the moment, so until the industry agrees on some common ground, we're going to be in this situation of vendor-specific upscaling solutions.

  • @Kryptonic83
    @Kryptonic83 2 дня назад +1

    nice comparisons, would have been nice to see a bit more side by side of FSR 2.2 to 3.1. Shame to still see so much ghosting and shimmering in 3.1. Hopefully they keep improving and get it implemented more in consoles. Will be curious to see if PSSR catches on in the PS5 pro.

  • @KeepIt1HuniTT
    @KeepIt1HuniTT 2 дня назад +1

    This is an upscaler that's constantly making progress and it's free. Not too long ago we were under the impression that Ampere and previous gen were not able to execute framegen. AMD exposed that. Now we have blockbuster titles that can utilize a mix of DLSS with FSR 3.1. I have friends with Turing and Pascal technology that are seeing their cards get a second wind which is awesome. On an OLED screen it still looks damn good with old tech.

  • @markhackett2302
    @markhackett2302 3 дня назад +37

    At around 17:45, in the Ghost of Tsushima walking animation, see the jumping around in the DLSS version of his head, especially noticeable in the feather, compared to the far more flowing look from both FSR and XeSS. This is the problem of making the "baseline" 'wot DLSS duz' instead of "no upscaling". He's peering intently at both XeSS and FSR to determine WHAT FLAWS CAN I SEE, and thereby ignoring DLSS here.
    It's not a widespread problem for DLSS use, but that is Tim's problem with his focus on how DLSS HAS TO BE best.

    • @brunogm
      @brunogm 3 дня назад +1

      16:18 the texture on the blade

    • @markhackett2302
      @markhackett2302 3 дня назад +8

      @@brunogm Which Tim already pointed out as problematic, so at no point did I have to.
      Your point seemed to have been inside your head only.

    • @a1337cookie
      @a1337cookie 3 дня назад +11

      I bet that's just an animation glitch in the game, nothing to do with DLSS being the upscaler. You can see he briefly stops walking to reset the animation, and the movement becomes smooth again.

    • @Forthecasuals
      @Forthecasuals 3 дня назад

      Because it is the best, so it's the benchmark the other two are aiming for.

    • @damianabregba7476
      @damianabregba7476 3 дня назад +14

      It should be compared against native. Otherwise you compare errors to other different errors

  • @SpectreICollateral
    @SpectreICollateral 3 дня назад +7

    What exactly would you say is a "night and day" difference here? 11:10 Good video, but you missed some issues, like the newfound moiré pattern issue in Ratchet & Clank with FSR 3.1, the aliasing on Clank when he's a backpack, or the disappearing confetti. I also think all the testing should've been done at lower resolutions than 4K to highlight the differences.

    • @TerraWare
      @TerraWare 3 дня назад

      You can't catch everything when looking into this stuff, but what you mention about R&C were some issues I came across when looking into it.

  • @alexkotlar2327
    @alexkotlar2327 2 дня назад +1

    Impressive given that XeSS is not running in its highest quality and performance XMX mode

  • @jeroenvdw
    @jeroenvdw 3 дня назад

    Pretty happy with FSR 3.1 on Horizon Forbidden West. Got a new 7900 GRE and on 1440p ultrawide I get about 100-120 FPS with it. Used settings recommended in some Reddit post, looks fantastic and it's smooth. Balanced setting.

    • @WrexBF
      @WrexBF 3 дня назад

      Ghosting go brrrrr

  • @tndgu
    @tndgu 3 дня назад +5

    Hi, thanks for the effort. I have a question: I still have a GTX 1080 Ti, and in this case, is it better to use FSR Balanced rather than Quality if I want to get the best out of upscaling?

    • @krayon2982
      @krayon2982 3 дня назад

      yeah, it doesn't make sense to me

    • @WrexBF
      @WrexBF 3 дня назад

      Use the highest FSR setting, which is FSR quality. Or use XeSS. You should only choose lower FSR setting if you want more fps with a lower image quality.

    • @tndgu
      @tndgu 2 дня назад

      @@WrexBF Yeah but the video says as "on nvidia gpu's FSR Balanced seems the way to go as quality doesn't give enough frames to justify upscaling".

  • @NoMasters.
    @NoMasters. 2 дня назад +6

    If you watched this without audio, and just watch, you come away thinking that FSR has pretty much achieved parity with DLSS. But if you only listened to the audio, you'd think FSR isn't even close. Either my vision is going early, or what's his face has some personal preference influencing the interpretation. I say this having bought 6 Nvidia and 1 AMD gpu in my life. But I also have never bothered using up-scaling. But if I did, from just watching this video I would think FSR was close enough and go with the cheaper option. But again, if I only listened to it, I'd think, well better not take a chance on FSR.

  • @waynetuttle6872
    @waynetuttle6872 День назад +1

    One of two things, AMD and Intel either need an equal upscaler or they need to make a stand on RT being a useless tech and run the gamut on rasterized performance.

  • @rhoderzau
    @rhoderzau 2 дня назад

    Great video as always. Would have appreciated some handheld resolutions (800p, 1080p) given the increased issues seen at 1440p. My main issue with FSR 2 was the pixelation on foliage, so it's great to see the improvements.

  • @josluivivgar
    @josluivivgar 3 дня назад +7

    I understand choosing the setting that has the same type of uplift, but at the same time, realistically speaking, I think it would have been worth it to do a comparison at similar fps instead, because someone with a 4070 and a 7800 XT playing the same game would probably go with what gives them acceptable fps.
    So if on both cards you get 100 fps in a game with the Quality setting, both would go for that setting even if the uplift of DLSS is a higher % (so it's a more realistic comparison). There's value in comparing at the same uplift, but the real-life comparison should also be based on similar fps (I know you mentioned it would be too many combinations, but I think that's a more realistic scenario of what people might use/choose: based on fps, not on % of performance uplift)

    • @WrexBF
      @WrexBF 3 дня назад +2

      You need to rewatch the video.

  • @xxstandstillxx
    @xxstandstillxx 3 дня назад +3

    I would have put the scaling ratio next to each bar because they have different names and I always forget which is what

  • @TuranArekGames
    @TuranArekGames 2 дня назад

    I'm also noticing ghosting with the FSR 3 TAA implementation. Wondering what would happen if it worked with other TAA tech.

  • @SimplexPL
    @SimplexPL День назад

    Thanks, Steve!

  • @amplemind9739
    @amplemind9739 2 дня назад +3

    5:15 Why do we want to match the uplift? We want to match the frames, kinda a bad take here but okay.

    • @laszlodajka5946
      @laszlodajka5946 2 дня назад

      Agree

    • @Mikri90
      @Mikri90 2 дня назад

      I second this. Like, I understand it from a "scientific" perspective, they want to know just how much better the methods are for improving performance, but for gamers, this is an irrelevant stat. All it matters is that the resulting frame-rate is satisfactory for the consumer.

  • @lawrencekallal6640
    @lawrencekallal6640 2 дня назад +7

    Why are you comparing DLSS Quality against FSR Balanced at 15:10 in Horizon FW when FSR Quality is just as fast as DLSS Quality?
    And then DLSS Quality against FSR Balanced at 19:01 in Spider-Man? Your Spider-Man benches seem wonky, as the 1% lows are worse for DLSS Balanced than Quality.
    Should be comparing quality across the board, what a bunch of biased shit.

    • @DrLogic_
      @DrLogic_ 2 дня назад

      You totally missed the performance upscaling part at 8:55, didn't you 💀

    • @lawrencekallal6640
      @lawrencekallal6640 2 дня назад +1

      @@DrLogic_ The quality mode in Horizon-FW for FSR is just as fast as DLSS quality - so no reason to be running an inferior balanced mode. In Spiderman-MM the FSR balanced is faster than DLSS quality.
      Still should be comparing quality vs quality, balanced vs balanced and let the user decide the level of performance. Didn’t see any concern about performance matching in previous reviews unless I missed something.

    • @Mikri90
      @Mikri90 2 дня назад

      @@DrLogic_ He's right though, for Horizon Forbidden west, the numbers for DLSS Quality are 63 AVG, and 48 for 1% lows, and for FSR Quality it's 62 AVG and 48 for 1% lows. That's as neck and neck as you can get.
      You can find that at 4:35 in the video.

  • @Mike79745
    @Mike79745 3 дня назад +1

    IMO you should ignore the upscalers' naming and just compare them across base resolutions or % of screen resolution. Then you compare image quality - the ability of the upscaler to upscale or even enhance the image. You also get the performance of each upscaler as bonus info.

  • @theofficialpollo
    @theofficialpollo 3 дня назад +1

    My favorite upscaler is to just play native tbh lol. Forget about Raytracing and stuff.

  • @stangamer1151
    @stangamer1151 3 дня назад +7

    I was surprised to see how well 4070 fares against 7800 XT in those cases when upscaling (DLSS and FSR correspondingly) is in use.
    The 4070 is a pretty competent GPU after all. If only its initial MSRP had been $500, it would have been a decent deal.

    • @thrafkroos
      @thrafkroos 3 дня назад +2

      enjoy using medium quality textures in 3 years if you buy a 4070/4070S

    • @TheDarksideFNothing
      @TheDarksideFNothing 3 дня назад +3

      Oh yeah, most of Nvidia's cards are really great if we could just cut the pricing by like 30% lol
      I think most people would agree that pricing is the worst thing about GeForce at this point. Unfortunate

    • @TheDarksideFNothing
      @TheDarksideFNothing 3 дня назад

      @@thrafkroos would you mind having a conversation with Nvidia for me and asking them to put more than 12GB in a laptop under $2000?
      Thanks

    • @Exile1a
      @Exile1a 3 дня назад +1

      @@TheDarksideFNothing The pricing is the reason why I changed my 1080ti for a 6900xt when that dropped in price. I've used Nvidia since the 8800GTX but that current pricing is just... Fucking nuts.

    • @stangamer1151
      @stangamer1151 3 дня назад +1

      @@thrafkroos I doubt any major changes will happen in 3 years. Next gen consoles will surely increase VRAM usage in games significantly. But they will not launch in 3 years, more like in 4-5 years.
      But anyway in 3 years games will become much more demanding in terms of raw GPU power. By that time the owners of 4070 S will be lucky to run modern titles at 1080p/High settings. Thus 12GB will still be fine. Look at Hellblade 2, Banishers and Robocop - they use just about 8GB at 1440p. So 4070 Super runs out of power way before VRAM capacity becomes an issue. Since UE5 is the most popular game engine, most games will behave the same way. Only a small bunch of future console ports may become a problem for 12GB cards.

  • @ahmadmahdee007
    @ahmadmahdee007 3 дня назад +4

    Here we go again.....

  • @soulhazetv
    @soulhazetv 3 дня назад

    How does it make sense to test Quality vs Balanced vs Performance mode and look at image quality?

    • @WrexBF
      @WrexBF 3 дня назад

      The section comparing the highest quality of those upscaling technologies is also in the video. So, what are you mad about?

  • @azjeep
    @azjeep 2 дня назад

    Nice video, more in line with a real user's PC... it's amazing to see scene by scene how much of a difference the tone makes

  • @nbates66
    @nbates66 3 дня назад +18

    Think I'll still hold my view of avoiding all 3 of these techs unless absolutely no other option is available.

    • @originalscreenname44
      @originalscreenname44 3 дня назад +1

      And I think that's the best option. If you can buy a card that plays games at the framerate you want at your desired resolution, none of this should be necessary. By the time it might be, it's likely all of these features will have reached parity.

    • @ShinyHelmet
      @ShinyHelmet 3 дня назад +10

      Native rules.

    • @Galf506
      @Galf506 3 дня назад +5

      DLSS at its quality settings looks BETTER THAN NATIVE.

    • @termitreter6545
      @termitreter6545 3 дня назад

      Yup. I might think different if I had a 4K display, but 4k+high hz is still too expensive.

    • @Norim_
      @Norim_ 3 дня назад +5

      @@Galf506 in 5% of the games?

  • @25MHzisbest
    @25MHzisbest 3 дня назад +6

    Ancient Gameplays beat you to the video.
    "FSR 2.2 vs FSR 3.1 vs XeSS 1.3 vs DLSS 3.7 - Which one is BETTER and WHY?"

    • @sodapopinksi667
      @sodapopinksi667 3 дня назад +4

      Tbf, I've never once considered them competitors. AG even said he watches this channel sometimes.

    • @noobgamer4709
      @noobgamer4709 3 дня назад +6

      Yeah, and somehow his comparison has DLSS on an Nvidia card and FSR/XeSS on AMD cards. HUB is a big channel but only ran an RTX card for all three, and as we can see, FSR is worse on an Nvidia card since it has lower FPS, so they need to drop the quality setting to get better fps, which leads to a worse FSR image. This is no longer comparing the quality-setting image. HUB seems to have managed to pull off a smokescreen of Nvidia marketing here. I rarely watch HUB anymore aside from CPU comparisons and news.

    • @krazyfrog
      @krazyfrog 3 дня назад

      All of this is filler until the definitive DF video drops.

  • @EinSwitzer
    @EinSwitzer 3 дня назад

Geekslab / RTX ray tracing training programs / memory tile training, even over tile, to get the chip to stimulate cells not normally used / then power and AI through a power filter. It's kinda why I like playing online with a battery now, aka off grid.

  • @gytispranskunas4984
    @gytispranskunas4984 2 дня назад

It depends on the implementation. In Forza Motorsport, for example, using DLSS not only reduces visual quality but also manages to decrease performance. Just wonderful.

  • @markcollins6578
    @markcollins6578 3 дня назад +3

Will you please always consider the specs of the machines you use - while a 4070 is nearer the type of GPU most people own, only about 5% (my guess) use a 4K monitor for gaming. This video gives little info on whether the 4K monitor has an 'it matters' impact on the image compared to a 1440p or 1080p monitor.

    • @matthews2243
      @matthews2243 3 дня назад +1

The 4070 actually isn't close to what "most" people use. Most people are a couple of generations behind, and the 60-class cards have always been more popular.

    • @skinscalp222
      @skinscalp222 3 дня назад +3

      @@matthews2243 Higher end GPU users are always delusional about the cards they own. lol.

    • @kendokaaa
      @kendokaaa 2 дня назад +1

      According to the Steam hardware survey, around 7% of Steam users who use Nvidia GPUs have an RTX 4070 or better

  • @Littleandr0idman
    @Littleandr0idman 3 дня назад +33

    Am I the only one that’s never touched any of these upscalers?

    • @cfzerooo
      @cfzerooo 3 дня назад +6

Got a mid-range card. When I played CP it was enabled by default; after some hours of gameplay I decided to turn it off, and man, even with half the fps the game looked better than with upscaling...

    • @Farsaar42
      @Farsaar42 3 дня назад +1

@@cfzerooo Uh-oh...

    • @Rosaslav
      @Rosaslav 3 дня назад +2

I haven't used them yet, because I'm still on an older GPU, a Vega 64, that's still roughly on par with an RX 7600, and that's at 1080p, where upscalers are of little use.
1080p is still the most used resolution, and upscalers have little use there, so this whole technology is still pretty niche.
What I would like to use, though, is DLAA or FSR Native AA, but I can't, because I don't have a newer GPU and very few games support it.

    • @peterscott2662
      @peterscott2662 3 дня назад +10

DLSS is often better than native TAA, so in those cases it would be foolish not to use it. I'm using a mod in FO4 to enable DLSS in DLAA mode to get rid of TAA blur.

    • @elirantuil5003
      @elirantuil5003 3 дня назад +1

@@Littleandr0idman I can definitely see myself never touching it if I didn't have an Nvidia GPU, although Intel's solution is getting pretty good, even if it doesn't provide the same performance boost.

  • @XxNewmilleniumxX
    @XxNewmilleniumxX 2 дня назад

Good evaluation. It's funny because with a few tweaks I was able to achieve 70+ fps in Dying Light with DX12 ray tracing and all settings maxed on the 6800 with FSR Quality at 1440p-4K. But I just couldn't get past the lows - sometimes it would drop under 55 fps and it was noticeable. What saved me was FSR 2.0 Balanced: I never dipped below 60 again, but the upscaling is much more noticeable, especially with fine detail like trees and gates. Ray tracing is something else, and after seeing Dying Light with it, it's something I definitely want. I'm thinking of upgrading to a 4080 Ti or Super.

  • @_Kaurus
    @_Kaurus 2 дня назад +1

    lol, Motion blur permanently enabled.

  • @Club_of_Gamers
    @Club_of_Gamers 3 дня назад +23

Had the same conclusions in my FSR 3.1 videos 😅
AMD did a great job. So far I've only tested FSR in Horizon and Ghost of Tsushima, but in both there's a big image quality step up; the only problems are that weather effects are much less visible with AMD's solution and there's a lot of ghosting in Horizon 😉
Thanks for the video!

    • @MattJDylan
      @MattJDylan 3 дня назад +3

Don't take it personally: I've seen how the protagonist in Horizon looks, she would be ghosting me too, no doubt 😔

    • @Club_of_Gamers
      @Club_of_Gamers 3 дня назад +2

      @@MattJDylan 🤣

  • @beea314
    @beea314 3 дня назад +9

You should've reviewed every technique at the same, or the closest possible, internal resolution. As it stands this isn't really a straight quality comparison; it's a comparison of performance-normalized quality.

  • @mrpotch
    @mrpotch День назад

Could you give us an update on XeSS at 1080p? Your previous video only had DLSS and FSR tested at 1080p.

  • @enire8477
    @enire8477 2 дня назад

Seems like the performance testing could have had an AA-off bar in the charts as well.

  • @billb0313
    @billb0313 3 дня назад +31

I have the 7800 XT and run most games at native 1440p. If I have to zoom in 200% and slow the footage down to notice the difference in render resolution or individual pixels, then I have a bigger problem than deciding which upscaler is better.

    • @2284090
      @2284090 3 дня назад +8

Man, I can confirm that as an AMD user I don't even touch upscaling, because cards like the RX 7800 XT and RX 7900 series are monsters in games, so there's no need to even think about it. 😂 Nvidia, on the other hand 😂 without upscaling their cards would be about equal to their last generation 😂 plus they're too expensive 😂

    • @abdultariq3457
      @abdultariq3457 3 дня назад

@@2284090 My go-to now with a 7900 XT is native res with the Lossless Scaling app at 3x frame gen, and everything runs and feels extremely smooth.

    • @J0ttaD
      @J0ttaD 3 дня назад +7

Bro thinks he's smart, oh no no no, don't tell him.

    • @Eleganttf2
      @Eleganttf2 3 дня назад

What does that have to do with the actual comparison of the feature? Typical AMD fanboy comment.

  • @felix-td1bo
    @felix-td1bo 3 дня назад +18

Incredible how FSR performs, especially because it doesn't use any kind of AI.

    • @arenzricodexd4409
      @arenzricodexd4409 3 дня назад +7

The issue is not performance but image quality and stability. Why does FSR fall behind XeSS and DLSS at lower resolutions like 1080p? Because it doesn't have the ML component to reconstruct the missing detail from the lower-res input.

    • @RFC3514
      @RFC3514 2 дня назад +4

      "AI" is kind of meaningless, it's just a marketing buzzword. All these algorithms are trained using machine learning (basically, they render the game at two different resolutions and then train an algorithm to create the higher-res version based on the lower-res version), and then manually tweaked based on human feedback (because training for some types of image contradicts training for other types, so some choices have to be made - an issue known as "overfitting").
      None of the actual "AI" is running on _your_ system while gaming, because to do the training your system would have to be rendering both resolutions, comparing them, and readjusting all the weights (which would completely defeat the point of interpolation, which is to improve performance by rendering only at the lower resolution). Your system is just applying a fixed algorithm, with pre-calculated (and pre-adjusted) weights.
      The real issue is that both manufacturers now seem to be focusing on ways of faking resolutions and frame rates, instead of developing GPUs that can actually render 3D scenes noticeably faster. And the way media has been focusing on the interpolation "technology" (which isn't even new; my 14-year-old TV set has a Faroudja chip able to do most of this stuff) has diverted attention from real 3D rendering performance.
      It wouldn't surprise me if Nvidia (or AMD) start adding two (or more) interpolated frames between real ones to claim they've "tripled" the frame rate.
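To make the training-versus-inference split concrete: what ships to the player is effectively a frozen function with pre-computed weights, and nothing is learned on the user's machine. The toy NumPy sketch below illustrates only that split; the tiny fixed kernel and the upscale_2x helper are made-up stand-ins and bear no resemblance to how DLSS, FSR or XeSS actually work internally:

```python
import numpy as np

# Toy "inference-only" upscaler: the weights would have been fitted offline
# on (low-res, high-res) image pairs; at runtime they are just constants.
# This is an illustrative stand-in, not how any real upscaler works.

rng = np.random.default_rng(0)
PRETRAINED_KERNEL = rng.normal(0, 0.1, size=(3, 3))  # stands in for shipped weights
PRETRAINED_KERNEL[1, 1] += 1.0                        # roughly identity-preserving

def upscale_2x(low_res: np.ndarray) -> np.ndarray:
    """Nearest-neighbour 2x upscale followed by a fixed 3x3 filter with frozen weights."""
    up = low_res.repeat(2, axis=0).repeat(2, axis=1)
    padded = np.pad(up, 1, mode="edge")
    out = np.zeros_like(up)
    for dy in range(3):
        for dx in range(3):
            out += PRETRAINED_KERNEL[dy, dx] * padded[dy:dy + up.shape[0], dx:dx + up.shape[1]]
    return out

frame_540p = rng.random((540, 960))    # fake luminance-only frame
frame_1080p = upscale_2x(frame_540p)   # "upscaled" output, no training involved
print(frame_1080p.shape)               # (1080, 1920)
```

The expensive part (rendering low-res/high-res pairs and fitting the weights) happens offline on the vendor's side; at runtime the GPU only evaluates the already-fixed function, which is why it can be cheaper than rendering the full resolution.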

    • @thepirate4095
      @thepirate4095 2 дня назад

      @@RFC3514 you kinda need new ways to fake resolutions when you use ray tracing, we are still not there yet to fully ray trace a scene in 4k, as long as the fake resolution looks almost or even the same as native, where's the problem? and nvidia is pretty close to that, amd on the other hand is not

    • @RFC3514
      @RFC3514 2 дня назад

      @@thepirate4095 - In motion, and after they've been adjusted for each game, they're practically indistinguishable.
      And there's nothing wrong with adding the _option_ to use interpolated frames. The issue is pretending you need to buy new (overpriced) hardware to do it, or hoping that people will evaluate the performance of the new hardware based on the fakery, while comparing it with non-interpolated benchmarks from previous generations.
      Also, "ray tracing" currently is a mess with a huge cost in terms of power consumption and heat production for very little benefit (in some games it actually makes shadows look noticeably worse). It's going to take at least two generations (of GPUs _and_ game engines) to make it worthwhile.

    • @musguelha14
      @musguelha14 2 дня назад

      @@RFC3514 DLSS isn't trained for each game. It was in the beginning but it quickly changed to a general model.

  • @Overonator
    @Overonator 2 дня назад

What if less shimmering and sizzling means more ghosting, and less ghosting means more shimmering and sizzling? In other words, what if minimizing one maximizes the other?
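For temporal upscalers specifically there is a real version of this tension: the more heavily each new frame is blended with reprojected history, the more stable (less shimmery) the image, but the longer stale data lingers (ghosting). A toy accumulation sketch; the blend factors and the single-pixel "frame" values are arbitrary illustrations, not anything a real upscaler uses:

```python
# Toy temporal accumulation: output = alpha * current + (1 - alpha) * history.
# Small alpha -> stable image (less shimmer) but old data lingers (more ghosting);
# large alpha -> responsive (less ghosting) but noisy frame-to-frame (more shimmer).
# Purely illustrative; real upscalers vary alpha per pixel and clamp history.

def accumulate(samples, alpha):
    history = samples[0]
    trace = [history]
    for s in samples[1:]:
        history = alpha * s + (1 - alpha) * history
        trace.append(round(history, 3))
    return trace

# A pixel whose true value flips when an object moves away at frame 5.
pixel = [1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0]

print("alpha=0.9:", accumulate(pixel, 0.9))  # converges fast, little ghosting
print("alpha=0.1:", accumulate(pixel, 0.1))  # old value lingers: classic ghost trail
```

As the reply below notes, it isn't a hard law: real upscalers vary the blend per pixel and reject mismatched history using motion vectors and colour clamping, which is how they try to push both artifacts down at once.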

    • @waltuhputurdaway
      @waltuhputurdaway 2 дня назад

That's not how it works in general; it's just how it happened to work out in FSR's case.

  • @rockybalboa5611
    @rockybalboa5611 2 дня назад

    Everything looks good and everybody should be happy to enjoy upscaling.

  • @tushkan4ik111
    @tushkan4ik111 3 дня назад +45

Comparing DLSS Quality vs FSR Balanced is meh... a 5% difference in fps is not important. AMD owners use FSR Quality in 90% of cases and don't care about FSR Balanced.
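Some context on what the preset names actually mean: each one maps to a per-axis render-scale factor, and XeSS 1.3 shifted its factors so its presets render fewer pixels than the same-named DLSS/FSR ones. The DLSS/FSR factors below are the long-standing standard values; the XeSS 1.3 values are quoted from memory of Intel's release notes, so treat them as approximate, and the SCALE_FACTORS and internal_resolution names are just illustrative:

```python
# Per-axis render-scale factors behind the preset names (output res / internal res).
# DLSS/FSR Quality = 1.5x and Performance = 2.0x are standard; Balanced is ~1.7x
# (1.724x for DLSS, 1.7x for FSR). The XeSS 1.3 values reflect its rescaled presets
# and are quoted from memory -- worth double-checking against Intel's release notes.

SCALE_FACTORS = {
    "DLSS/FSR Quality":     1.5,
    "DLSS Balanced":        1.724,
    "FSR Balanced":         1.7,
    "DLSS/FSR Performance": 2.0,
    "XeSS 1.3 Quality":     1.7,
    "XeSS 1.3 Balanced":    2.0,
}

def internal_resolution(out_w, out_h, factor):
    """Internal render resolution for a given output resolution and scale factor."""
    return round(out_w / factor), round(out_h / factor)

for name, factor in SCALE_FACTORS.items():
    w, h = internal_resolution(3840, 2160, factor)
    print(f"{name:>22}: {w}x{h} internal for a 4K output")
```

Matching by preset label, by internal pixel count, or by the resulting frame rate are three different ways to normalise the comparison, and which one is "fair" is exactly what this thread is arguing about.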

    • @imo098765
      @imo098765 3 дня назад +5

You're using a performance-increasing setting, so performance should be the most important variable to control.

    • @KrisDee1981
      @KrisDee1981 3 дня назад +8

The thing is that even DLSS Performance looks better than FSR Quality in most games, so you get additional performance on an Nvidia GPU.

    • @toad7395
      @toad7395 3 дня назад

      @@imo098765 Yeah, but the reason why people use FSR/DLSS instead of just manually setting resolution to 720p is to get an fps improvement without losing too much quality.

    • @eniff2925
      @eniff2925 3 дня назад

@@imo098765 It's an anti-aliasing setting, not a performance-increasing setting. People are just using it wrong.

    • @AzSureno
      @AzSureno 3 дня назад

@@KrisDee1981 Idk, maybe I'm blind, but it looks really good to me; I'd give it an 8/10. I don't use upscalers very often, only in games that are kind of hard to run. I have a 6800 XT.

  • @jannegrey593
    @jannegrey593 3 дня назад +9

    Hello Everyone!

    • @oliverstich2193
      @oliverstich2193 3 дня назад +4

      Hello 😀😄

    • @sodapopinksi667
      @sodapopinksi667 3 дня назад +4

Hey. How are your temps?

    • @IlMemetor72
      @IlMemetor72 3 дня назад +1

      Hi

    • @yourlocalhuman3526
      @yourlocalhuman3526 3 дня назад +2

My wife let me take a break from work so that I could reply to this comment. Hello to you too! (Please save me)

    • @jannegrey593
      @jannegrey593 3 дня назад +1

      @@yourlocalhuman3526 So nice of her. And thank you as well.
(Type the address of the pickup point and the time, and I'll see what I can do.)

  • @Blafard666
    @Blafard666 2 дня назад

Thank you for this work!

  • @Ray-dl5mp
    @Ray-dl5mp 3 дня назад

Seems like FSR still needs one more major update to get close to DLSS. But it's great to see that 4K Performance mode, which is perhaps the most important upscaling mode, is looking way better! Ghosting is an issue, but I think shimmering is probably the most noticeable thing while gaming, so they definitely need to fix that.

    • @PeterPauls
      @PeterPauls 3 дня назад +1

The problem for AMD is that Nvidia releases new DLSS versions very often, sometimes monthly - we're now at DLSS Super Resolution 3.7.10 - and their supercomputers keep training the model all the while.
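One practical consequence of that cadence: DLSS Super Resolution ships inside each game as a redistributable DLL (nvngx_dlss.dll), so newer versions can often be dropped in without waiting for a game patch. A hedged sketch of that swap; the paths are hypothetical placeholders, game layouts differ, and some titles or anti-cheat systems won't accept a replaced DLL:

```python
import shutil
from pathlib import Path

# Illustrative sketch of the community "DLL swap" DLSS update; the paths here are
# hypothetical placeholders, and some games or anti-cheat setups reject a replaced
# DLL, so treat this as an assumption-laden example rather than a recommendation.
game_dir = Path(r"C:\Games\SomeGame")            # hypothetical install location
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # newer DLSS SR DLL obtained separately

target = game_dir / "nvngx_dlss.dll"
if target.exists() and new_dll.exists():
    shutil.copy2(target, target.with_name(target.name + ".bak"))  # keep a backup
    shutil.copy2(new_dll, target)
    print(f"Replaced {target} (backup saved next to it)")
else:
    print("DLL not found at the assumed locations; nothing changed.")
```

FSR 3.1 reportedly moves to a similar upgradeable-DLL model as well, which should make version-to-version comparisons like this one easier to act on in the future.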

    • @Ray-dl5mp
      @Ray-dl5mp 3 дня назад

@@PeterPauls True, true. I love what Nvidia is doing to push the technology. I do think a lot of gamers just want AMD to reach a passable level where shimmering and ghosting are very hard to see, and they'd be happy. If the pixel counters find issues with something, so be it. But the big, noticeable artifacts in gameplay need to become almost non-existent, and then gamers will probably be much more willing to go cheaper. Which comes down to: is this Nvidia's dream moment, where AMD just happens to be far behind on the really important 4K performance... but so close to closing the gap? Or can Nvidia come up with another must-have feature, like perfect motion clarity, to widen the gap again? Of course, AMD still needs to fix its two major issues for any of this to matter.