DLSS/FSR bad for gamers?

  • Published: 2 Jan 2024
  • Tim and Steve discuss whether upscaling technologies like DLSS and FSR and their improvements have been a net positive overall for gaming.
    Join us on Patreon: / hardwareunboxed
    Join us on Floatplane: www.floatplane.com/channel/Ha...
    Buy relevant products from Amazon, Newegg and others below:
    Radeon RX 7900 XTX - geni.us/OKTo
    Radeon RX 7900 XT - geni.us/iMi32
    GeForce RTX 4090 - geni.us/puJry
    GeForce RTX 4080 - geni.us/wpg4zl
    GeForce RTX 4070 Ti - geni.us/AVijBg
    GeForce RTX 3050 - geni.us/fF9YeC
    GeForce RTX 3060 - geni.us/MQT2VG
    GeForce RTX 3060 Ti - geni.us/yqtTGn3
    GeForce RTX 3070 - geni.us/Kfso1
    GeForce RTX 3080 - geni.us/7xgj
    GeForce RTX 3090 - geni.us/R8gg
    Radeon RX 6500 XT - geni.us/dym2r
    Radeon RX 6600 - geni.us/cCrY
    Radeon RX 6600 XT - geni.us/aPMwG
    Radeon RX 6700 XT - geni.us/3b7PJub
    Radeon RX 6800 - geni.us/Ps1fpex
    Radeon RX 6800 XT - geni.us/yxrJUJm
    Radeon RX 6900 XT - geni.us/5baeGU
    Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
    Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
    FOLLOW US IN THESE PLACES FOR UPDATES
    Twitter - / hardwareunboxed
    Facebook - / hardwareunboxed
    Instagram - / hardwareunb. .
    Music By: / lakeyinspiredg. .

Comments • 290

  • @icydec3346 · 6 months ago +11

    It's pretty crazy that a large portion of the industry can't comprehend that there are people who prefer sharp with jaggies over ray-traced but blurred.

  • @TheGameBench · 6 months ago +10

    I feel like the biggest issue with upscalers is that some devs use them as a crutch for lazy optimization, even if only initially.

  • @Teapose · 6 months ago +65

    The fact that it's marketed purely as a positive and featured prominently as THE reason to upgrade, while performance stagnates and optimization runs backwards, tells me all I need to know about whether it's a net positive.

    • @ChrisM541 · 6 months ago

      Exactly!

    • @ralphwarom2514 · 6 months ago +5

      Optimisation is really important. We have to push devs to release games that are optimised.

    • @SvalbardSleeperDistrict · 6 months ago +5

      @@ralphwarom2514 Gamers "pushing devs" to do things has no power against what the economics of today's gaming industry compels them to do.

    • @jamesbyrd3740 · 6 months ago

      Nvidia also locks newer versions out of older hardware...

    • @TheBlackIdentety · 6 months ago

      You'd have to be an epic clown to claim performance is stagnating. 🤡😂

  • @Cowlord2000 · 6 months ago +7

    I think they need to realize that most people don't play at 4K or even 1440p with DLSS or FSR; they use 1080p, and it looks horrible at anything below Quality.

  • @katocmd · 6 months ago +41

    The big issue I have with DLSS and FSR is how they're marketed as some kind of huge performance enhancement. They aren't; essentially they're selling me a card that has barely any real performance uplift over older cards while touting inflated FPS numbers and jacking up the price. Rendering at 1080p and upscaling to 4K isn't the same as rendering 4K natively, and selling it like it is feels like creative accounting at best and a con at worst. I'm glad we haven't seen a race to the bottom on this, because I'm sure AMD could upscale 640x480, plaster some insane FPS number at "4K", and claim to have the fastest thing ever. The whole thing stinks like those old MemMaker and drive-compression programs from the 90s.
    We just want some honesty and a break from the corporate douchebaggery, and that's really been lacking from the PC part scene in recent years.

    • @lachlanB323 · 6 months ago

      They're a quality uplift, not a performance uplift. But they're tricking people into thinking it's a performance improvement.

    • @maverichz · 6 months ago +3

      As consumers, we don't give a shit how they do things, as long as the graphics card we buy improves both FPS and image quality. For me, DLSS massively improves my gaming experience, boosting FPS while retaining image quality. Let me give you an example from my personal experience: I upgraded from a 3070 to a 4070 Ti and an OLED 4K monitor last year. People online were screaming that the 4070 Ti is a bad card, not suitable for 4K, etc., and guess what: I'm very happy with my 4070 Ti purchase. Frame Generation combined with DLSS has been a blessing at 4K; I'm even playing Phantom Liberty with Path Tracing at decent FPS, thanks to all the tech you keep calling FAKE. For me AS A CONSUMER, a performance upgrade is a performance upgrade, and I don't give a shit how they do it, hardware or software, as long as I get the benefit.

    • @lachlanB323 · 6 months ago +1

      @@maverichz It depends what you want. If you want performance, then DLSS and RTX are useless features. If you care about graphics, then it's great, as it allows you to play at higher resolutions.
      Stop calling it a performance improvement, because it ain't. If it were a performance improvement, then why don't pros use DLSS? They will do whatever it takes to get better FPS... but they don't use DLSS. That says everything.

    • @evaone4286 · 6 months ago +1

      @@lachlanB323 DLSS isn't even considered a graphical improvement. It looks straight-up dogshit compared to native, and I have absolutely no intent of using it.

    • @MrBALAKUMARA · 6 months ago

      @@maverichz What you're playing is actually 720p upscaled to 4K; so much for the evolution of games...

  • @pwnomega4562 · 6 months ago +22

    It's crazy how after two generations of 60-tier cards, from the 2060 to the 4060, raw performance hasn't even increased 50%; meanwhile, compare a 2060 to a 960 and there's no competition at all... All the while Nvidia is shoving upscaling and RT down your throat as the reason to buy the 4000 series... Why else do you think they exaggerated the DLSS 3.0 benchmarks?
    Don't even get me started on how the amount of VRAM has hardly increased for these cards either. Nvidia seems content to die on the 8-gig hill.

    • @ChrisM541 · 6 months ago

      Nvidia and an increasing number of devs treat gamers as braindead sheep. It's a shame that, time and time again, 'The Majority' never learns from past mistakes to use the biggest tool at its disposal... if we all significantly reduced buying, prices would immediately drop.

    • @pdmerritt · 6 months ago +6

      Nvidia doesn't care about budget gamers! I keep saying it... AMD is 100 percent better for budget gamers. You mentioned how the 2060 to 4060 jump was pitiful, but look at the 2080 vs the 4080, or the 3090 vs the 4090, and you can see what Nvidia cares about... It isn't you, that's for sure... You may want to think about that before you spend your money.

    • @0M0rty · 6 months ago +1

      960 to 2060 was a $150 increase in price ($100 for the 4GB model of the 960, IIRC); 2060 to 4060 is a $50 reduction... People always scream about model numbers and overlook pricing differences.

    • @pwnomega4562 · 6 months ago +1

      @@0M0rty OK, you need to calm down. I'm simply comparing all of those cards in terms of performance over a certain time scale (4 years, give or take).
      The 2060 is 170% faster than a 960, meaning you'd need roughly three 960s to get similar performance to a 2060; meanwhile, the 4060 is almost 40% faster than the 2060. This shows me that gen-to-gen performance gains have stalled greatly. This could be for a multitude of reasons... I don't know for sure if this is something Nvidia can help, or if Nvidia is doing it on purpose to get you to buy the more powerful cards whose prices they keep increasing. (I honestly wouldn't be surprised if that were the case.)
      I've also noticed over the past few gens that RX 600-tier cards haven't improved a whole lot from gen to gen either.
      As for pricing, I do indeed care about that. But since the 6-gig 1060, '60'-tier cards haven't increased in price very much, only deviating about 50 dollars from the 300-dollar price point; over several years the prices of 60-tier cards have stayed essentially the same. And for 350 dollars we got a lot with the 2060; that card was able to keep up with the last-gen GTX 1080, a 600-dollar, almost enthusiast-class GPU, which is almost unheard of.
      What people get upset about when it comes to pricing is Nvidia releasing a 50-tier card at a terrible price-to-performance ratio, asking like 250 bucks, and releasing newer versions of 60-tier cards at terrible prices, all the while increasing the price of the 70-, 80-, and 90-tier cards.
      These are simply my observations.
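The "170% faster" arithmetic in the comment above converts to a relative multiple like this; a quick sketch in Python (the function name is just for illustration):

```python
def relative_multiple(percent_faster):
    """'X% faster' means the newer card does (1 + X/100) times the work."""
    return 1 + percent_faster / 100

# A 2060 quoted as 170% faster than a 960 does ~2.7x the work,
# so you'd need roughly three 960s to match it.
print(relative_multiple(170))
# A 4060 quoted as ~40% faster than a 2060 is only ~1.4x.
print(relative_multiple(40))
```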

    • @baoquoc3710 · 6 months ago

      @@pdmerritt As long as AMD keeps playing a long catch-up game of copying Nvidia's technologies, I doubt AMD will be considered seriously as a high-end alternative. Even their "flagship" cards sit in the lower-midrange price region. Good, but it's a reaction to whatever happens on the greener side of things.

  • @mithrillis · 6 months ago +14

    Upscaling and frame gen are not fundamentally different from any other shortcut real-time rendering takes to make games possible. Anti-aliasing is "guessing" edges without actually having the resolution to make out the edges. Game physics is "guessing" object movement without a full-fledged physics engine. LOD approximates how details would look at a distance as cheaply as possible without actually showing those details.
    If anything, I think we will eventually have to go further than that. Maybe soon we will more closely integrate the traditional graphics and machine-learning pipelines, so that the traditional renderer no longer needs to output full frames, only simpler representations of them, which the AI model picks up and converts to the final output. Maybe we can dynamically balance the workload between the two based on rasterisation difficulty vs upscaling difficulty to achieve the best output. Maybe some scenes can be rendered entirely with AI models. There are a lot of possible directions to take. It will not merely be a postprocessing step to "fix" traditional rendering, but a whole area of study on its own.

  • @vespa731 · 6 months ago +14

    This is probably one of the few vids by Hardware Unboxed where I disagreed with their angle. I remember when the consoles were pushing checkerboard rendering to deliver "what looks like 4K" and the masses started to accept it. DLSS and Frame Generation seem to cover up the real issue here: poor game optimization. I've always aimed for native resolution and performance over scaling workarounds. Just my two cents.
    Overall, I enjoy the vids. Keep up the great work, Hardware Unboxed!

    • @ChrisM541 · 6 months ago +1

      I only ever play at native, buying a good CPU + GPU to last around 5 years. I've never had an issue with this; every new AAA game still runs acceptably even 5 years down the line. My last GPU was a GTX 1080 that I changed to a 7900 XT a few months ago. The 1080 was still pushing playable frames.
      I've called out HUB in the past 12 months for clearly pushing an Nvidia-sponsored message. Time and time again they recommend the 4090 as the best GPU, totally and utterly oblivious to the fact that Nvidia & Co are clearly profiteering to sickening levels. When a company does this, you DO NOT reward them with a 'best buy'. HUB (and others) clearly forget that it's the 'common Joe' that got them to where they are.

    • @SquarelyGames · 6 months ago +3

      Why not get a huge performance boost for free? I always use DLSS. I really see no difference in looks versus native, and when I do, it often looks better. And the performance is amazing.
      This is very different from previous upscaling techniques.

    • @ChrisM541 · 6 months ago +3

      @@SquarelyGames You sound exactly like someone too 'familiar' with Jensen ;)
      Let's educate you: DLSS/FSR/XeSS etc. are ALL based on the same standard upscaling algorithm. Jensen simply polishes his turd some more.
      Upscaling should(!!) be a way to make lower-power cards achieve higher frame rates (at lower quality, obviously). Unfortunately, today it is 100% marketed as 'allowing' $2000 GPUs to hit an acceptable FPS number ;)

    • @SquarelyGames · 6 months ago +1

      @@ChrisM541 You reaaaaally need to work on not sounding condescending. And no, the technology is different, even if it gets similar performance. I never even mentioned FSR, so no idea what you are going on about.

    • @ChrisM541 · 6 months ago +2

      @@SquarelyGames The 'technology' is 99% a very standardised algorithm that's been in use for years. All three GPU players base everything off this same base algorithm. The extra 1% is polish, though mainly marketing bullcrap; despite what many say, there is NO 'AI' unit in these cards... obviously.
      And that's it, in a nutshell. Being a dev myself, I have 'some' inside knowledge here.
      Be wary of using the phrase you used here, often used by paid Nvidia shills: "I always use DLSS. I really see no difference in looks to native, and when I do it often looks better." For obvious reasons, that is wrong, especially the "it often looks better" part; that is VERY wrong!

  • @Novskyy621 · 6 months ago +7

    It's more about bad implementation, insane motion blur and artifacting (Jedi Survivor), and overall image noise in many titles. While image quality itself has of course gotten better, the noise has gotten so much worse, especially with something like upscaling from 720p on current-gen consoles. Unacceptable.

  • @evalangley3985 · 6 months ago +3

    Higher resolutions are not more accessible.
    2160p actually becomes 1440p with the Quality preset.
    1440p actually becomes 1080p with the Quality preset.
    1080p actually becomes 720p with the Quality preset.
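The per-axis scale factors behind those presets can be checked directly; a minimal sketch in Python, assuming the commonly published FSR 2 / DLSS 2 ratios (Quality 1/1.5, Balanced 1/1.7, Performance 1/2, Ultra Performance 1/3; exact values vary by vendor and version):

```python
# Internal render resolution implied by an upscaler's quality mode.
# The per-axis scale factors are the commonly published FSR 2 / DLSS 2
# ratios; exact values can differ per vendor and version (an assumption).
MODES = {
    "Quality": 1 / 1.5,
    "Balanced": 1 / 1.7,
    "Performance": 1 / 2.0,
    "Ultra Performance": 1 / 3.0,
}

def internal_resolution(width, height, mode):
    """Internal render resolution for a given output size and mode."""
    s = MODES[mode]
    return round(width * s), round(height * s)

for out_w, out_h in [(3840, 2160), (2560, 1440), (1920, 1080)]:
    w, h = internal_resolution(out_w, out_h, "Quality")
    print(f"{out_h}p output, Quality mode -> renders at {w}x{h}")
```

Running it confirms 2160p Quality rendering internally at 2560x1440 and 1080p Quality at 1280x720; note that 1440p Quality actually lands at 960 internal lines, not 1080.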

  • @rcole134 · 6 months ago +5

    The biggest issue I find with current upscaling tech isn't the tech itself, as that will improve over time; it's that some games seem to be built around the upscaling instead of it being an added feature.

    • @gruntaxeman3740 · 6 months ago

      I don't see this as an issue. Game content is always designed for some resolution, and rendering above the resolution the content was created at reveals too much, like artefacts, or makes cutscenes look weird. The resolution the content is designed for may also be an artistic choice.
      You also need to think about what game visual design costs. Adding more fidelity makes content more expensive to create, so the obvious way to produce a game more cheaply is to design everything at a lower fidelity. Upscaling can then scale it to native resolution uniformly.

    • @CallMeRabbitzUSVI · 6 months ago +3

      @@gruntaxeman3740 The issue is that when you build around it, the game is less optimized, making upscaling a necessity to get playable FPS.

    • @MrBALAKUMARA · 6 months ago +2

      @@gruntaxeman3740 The problem starts when we strangely drag in the "situation of devs and costs" scenario. I mean, what? Does the consumers' money fall from trees? I think it's time devs were held accountable for poor optimization, and the lazy-dev defenders in comment sections put in their place right now!
      The new trend is starting: when poorly optimized games are released, the defenders go "Hey, do you know how difficult optimization is?", "Do you know the economics of a gaming company?"; suddenly everybody turns into a saint.

    • @gruntaxeman3740 · 5 months ago

      @@MrBALAKUMARA
      In my opinion, bad performance is mostly caused by the graphics design workflow producing unoptimized game assets.
      The economics of creating a game are that aiming for higher fidelity costs more, because the assets have to match that fidelity. So it can make sense to target a lower fidelity and upscale the image; that makes asset creation cheaper and also increases performance.
      But those performance issues are mostly caused by not having any kind of workflow that specifies how assets are created in terms of texel/vertex density and LOD levels. The worst case is using assets intended for pre-rendering without adapting them to the required fidelity; assets intended for rendering movie scenes are different from real-time rendering assets.
      What I'm saying is that scaling algorithms are not only for increasing performance. They can also be used to hide missing details when a game is created at lower fidelity.

    • @MrBALAKUMARA · 5 months ago +2

      @@gruntaxeman3740 Again you come back to the same "situation of devs and costs" scenario. I don't care; I pay money, and I want the performance to be good and the gameplay to be good. If your house fell down because of a construction failure, would you ask "hey, do you know how difficult site supervision is?" No, you wouldn't. You'd straight away sue the builder, and the same applies to devs.

  • @nbates66 · 6 months ago +1

    So I keep hearing it said that upscaling with DLSS and FSR looks bad at 1080p output, but this makes me wonder: how many of those on budget/mid-level graphics cards (and thus perhaps in a position to benefit from these features) have a monitor with a native resolution any higher than 1080p? I currently have no monitors above 1080p myself.
    Also, any thoughts on the effectiveness of other technologies such as TAA? I don't purchase many titles, but I had a forced encounter with it in EA Sports WRC, and even maxed out it creates terrible smeary blurs and trails (on the wiper blades is the most annoying).

  • @Gamer-q7v · 6 months ago +6

    There are two big problems I have with upscaling and frame gen. One is that they use these software technologies as the main selling point of their GPUs to try and justify cutting corners on the hardware. The second is that developers sometimes use them as a crutch to avoid optimising their games properly. In my opinion, they should dedicate more die space to CUDA Cores and RT Cores to deliver bigger performance increases in rasterization and ray tracing. Gamers would certainly appreciate and prefer that over lackluster, downgraded hardware with frame gen and upscaling. Bigger performance increases in rasterization and ray tracing would let developers push graphics to the next level, resulting in even better-looking games.

  • @0M0rty · 6 months ago +2

    Where do the figures of 30-40% more performance, had the RT/Tensor hardware been omitted, come from? I remember an analysis from when the RTX 2000 series came out, and only somewhere between 10-15% of the die was dedicated to this specialized hardware. That's pretty insignificant; upscaling/RT absolutely provides more graphical fidelity than 10% more performance could.

  • @balazsvasvari9688 · 6 months ago +15

    MSAA handled alpha testing quite well, using "alpha to coverage" and properly authored mips; it's just that most developers never bothered to do it right.

  • @tomthomas3499 · 6 months ago +3

    I'm starting to lean toward net positive. When implemented correctly, it can help prolong your GPU's useful life. The problem, like you said, is when game devs start to push the boundary of graphical fidelity by implementing some sort of forced RT that most available GPUs still struggle to run; gamers end up needing DLSS/FSR even at 1080p. And the situation was made worse when frame gen was first introduced by Nvidia, not as a feature like DLSS/FSR but sold as native FPS performance.

  • @PixelShade · 6 months ago +9

    I completely agree with you guys on DLSS, FSR, frame generation, etc. I have an RX 6800, and although it is far from a high-end card like the 4090, it still delivers framerates that let me target 90fps+ without upscaling in the games I play. So when I add FSR Quality at 1440p, I often get a substantial boost to performance without the image quality taking much of a hit. The base resolution is high enough, and the framerate is also high enough to improve FSR's temporal resolve, perceptually reducing its flaws. However, if I were stuck with an older card like the GTX 1060, it would definitely be more of a literal "sacrifice" of image quality (with added artifacts) in order to gain performance; something that would be much harder for me to weigh up than it is today.
    So yeah, I definitely agree that these technologies are best paired with more powerful, capable hardware.

    • @JustMe192-xo1qw · 6 months ago +3

      RX 6800 gang

    • @TheKproductionsful · 6 months ago +1

      A fellow RX 6800 owner here too. I guess I don't feel the effects of DLSS/FSR being a crutch because I'm a lot more selective in the games I play. The newest game in my library is the Dead Space remake.

    • @pdmerritt · 6 months ago

      Well said

    • @pdmerritt · 6 months ago

      @@TheKproductionsful Well, let's keep it real: in most games, if you get lower frames than you'd like, toggling a few options really can get you where you want to be without using FSR, if you choose to do so. Or you can leave everything on higher settings and use FSR. FSR gets a much worse rap than it deserves, though. Most of the time FSR is completely fine; yeah, it isn't as good as DLSS, blah blah blah, but a lot of the differences YouTubers point out you don't even see when you're playing the game, because you're not focused on them. Or if you do see one, it's such a minor thing compared to the whole scene being presented that it hardly stands out. Last time I checked, we're here to play games, not nitpick every pixel rendered on the screen.

  • @svenweyers3916 · 6 months ago +4

    Yes, DLSS and FSR are ruining gaming the way they are being handled and marketed now.
    FSR and DLSS should be a means of stretching the usable life of your (discrete) GPU. They should not be needed for 1080p on any new GPU above $200.
    They are also a good way to improve the performance of iGPUs, like on the Steam Deck.
    The way things are going now, we'll be using upscaling to get to 1080p on a $700 GPU. Look at Alan Wake 2... or even Cyberpunk with RT...
    It is also making developers lazy and decreasing their general skill, since they no longer think about how to optimize performance. We used to be able to cram an entire game into something like 64KB of storage and 8KB of RAM; to do that you actually needed to be able to program and know what you were doing. Now developers can't even get a game running properly with basically infinite storage and 8GB of VRAM.
    So upscaling is a nice technology, but the way it is used and marketed now, it is ruining games.

  • @tekelupharsin4426 · 6 months ago +19

    I find the "Witcher 3 visuals not holding up" comment a bit odd. The game still looks quite good in my opinion, and will age much like Skyrim SE has. The reason the devs added more vibrant colors in Skyrim SE is that they tend to age better than "realistic" colors (like the grays of the original release). That's also why the Witcher 3 graphics downgrade between announcement and release came with more vibrant colors: they age better when they're meant to look artistic instead of realistic.

    • @CallMeRabbitzUSVI · 6 months ago +1

      Yep, and The Witcher 3 still looks amazing, especially the Blood and Wine expansion.

    • @DaviAtrock · 6 months ago +1

      Very strange comment indeed. Horizon Zero Dawn still looks amazing too. And you could certainly keep adding to that list...

    • @gustavobraga8199 · 6 months ago +1

      I think the problem is that a lot of people claim games like The Witcher 3 were better looking than recent games, and that's just not true. It still looks amazing, but if a AAA game came out looking like that today, it would be criticized. Games of that era, like The Witcher 3, GTA V, and Arkham Knight, are all still gorgeous, but they clearly don't stand a chance visually against an Alan Wake 2, Spider-Man 2, or RE4 Remake. People's memory tricks them into thinking that was the pinnacle of graphics.

    • @DaviAtrock · 6 months ago +2

      @@gustavobraga8199 They are indeed better looking than some of today's games, but yes, if you compare them against top-tier games like Dead Space and Alan Wake, they are not. BUT (and yes... that's a big but...) what GPU today can run Alan Wake at a good level of graphics plus good FPS, dude? Back in The Witcher's day, you could run the game with beautiful graphics and reasonable FPS on far more GPUs than you have today for Alan Wake.

    • @evalangley3985 · 6 months ago

      I am playing it right now. Anyone saying the game looks bad is obviously trolling.

  • @gedeuchnixan3830 · 6 months ago +2

    There's one good way of upscaling on a 1080p monitor: upscale to 1440p, because that looks great. But paying 200-300€ to have a game render at 20-year-old 720p just to be able to play at 1080p is nothing but an insult. A brand-new GPU that can't run a game natively at 1080p, a resolution I've been gaming at since 2008. I'm really looking forward to Battlemage; with the latest driver update my hopes went up a lot.

  • @cookiehead2 · 6 months ago +3

    I like having upscaling as an option, but I think it's being used as a crutch by a lot of the latest games to excuse poor performance. However, it is nice to be able to buy a higher-end 4K monitor and then run it at a reduced render resolution as needed.

    • @faultier1158 · 6 months ago

      Yeah, it's not just "reducing render resolution for better performance". It's also "being able to use a higher display resolution at the same render resolution", which gets you a huge visual upgrade.

    • @evaone4286 · 6 months ago

      @@faultier1158 But not at 4K. Either way you use it at that resolution, it produces a dogshit image. Maybe it's different at lower res.

    • @cheeeeezewizzz · 6 months ago

      @@evaone4286 Just render at 1080p for your 4K monitor. Even integer scaling will be perfect, and DLSS and FSR both handle the 1080p-to-4K conversion really, really well. Same for 960x540 to 1080p; that works really well on handhelds too.

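The "integer scaling will be perfect" point above holds because 1080p divides into 4K at exactly 2x per axis; a toy nearest-neighbour sketch in Python (the tiny 2x2 "frame" is just an illustration):

```python
def integer_scale(frame, factor):
    """Nearest-neighbour integer upscale: every pixel becomes a factor x factor block."""
    return [
        [px for px in row for _ in range(factor)]  # widen each row
        for row in frame
        for _ in range(factor)                     # repeat each widened row
    ]

# 1080p -> 4K is an exact 2x per axis, so no resampling blur is introduced.
assert (3840 // 1920, 2160 // 1080) == (2, 2)

tiny = [[1, 2], [3, 4]]        # stand-in for a 2x2 image
print(integer_scale(tiny, 2))  # [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```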
  • @HOTPLATEGAMING · 6 months ago +15

    I’m old school.
    I stick with native resolution and don’t rely on upscaling.

    • @RavTokomi · 6 months ago +1

      Do you also disable anisotropic filtering? Disable anti-aliasing? Heck, might as well disable T&L and go back to software rendering. No point discounting new technologies.

    • @evalangley3985 · 6 months ago +3

      Amen...
      2160p is the goal...

  • @ElysaraCh · 6 months ago +1

    I think in a world where every GPU is just expected to do 1080p as a baseline, regardless of the image quality or technology requirements of the game, we've hit a point of homogenization of the product stack, and a weird situation where hyper-premium GPUs' added value for the average buyer comes more from their additional features. 1080p is still by far the most common monitor resolution, and not very many people own 4K panels.
    Back in the day the tradeoff for buying a $100 or lower GPU was that you had to drop your resolution. I still have the Radeon 9500 Pro I bought over 20 years ago because I felt at that time that spending twice as much on a 9700 Pro to get playable 1280x1024 or 1600x1200 wasn't worth it when I felt that 1024x768 was "good enough" and the highest resolution my monitor could run at 75Hz anyway.
    However, like, they tried cutting features with Turing for the low end and we got the 16x0 cards as a result, and they suck. So... I don't think disabling features for "efficiency" or "raw performance" is what people want either. Of course, what people WANT is sub-$200 4090s, but well... lmao. :)

  • @WrexBF · 6 months ago +5

    The quality of textures has gotten better, but TAA and those upscaling technologies make everything look blurry and blobby. You can see a clear lack of scene depth in recent games, especially in motion.

  • @Mr.Sjottdeig · 6 months ago +3

    I always assumed DLSS was only meant to help make the jump to 4K gaming.

    • @ChrisM541 · 6 months ago +1

      Remember, low-res textures upscaled to 4K do NOT equal a texture set a human graphics artist generated natively at 4K!!!

    • @faultier1158 · 6 months ago

      @@ChrisM541 You wouldn't notice the difference.

    • @ChrisM541 · 6 months ago +1

      @@faultier1158 ...then you clearly support a step backwards in graphics quality. Next you'll be saying that today's 100% autotuned excrement in pop music is acceptable because the buyer is unaware that the singer can't sing and is miming at every 'live' concert.

  • @CheapBastard1988 · 6 months ago +6

    I disagree with the old-cars statement. Cars from before 2010 are typically more reliable than new cars, because they didn't often use direct injection, and didn't use turbochargers unless they ran on diesel or were very fast. The wiring insulation wasn't made of biodegradable materials, so rodents didn't like eating it, and there were also far fewer unnecessary microprocessors integrated into parts (like headlamp units) to save on copper in wires. All that tech makes newer cars less reliable and more expensive to repair.

    • @aidenquinn19975 · 6 months ago +1

      100%
      I've seen people with Corollas from the 90s still running strong. Tech these days is just designed to fail, or at least that's how I see it. The durability and longevity of older tech will always amaze me. I still see people running 1080 Tis in their builds and refusing to upgrade; they haven't even touched their cards for a thermal-paste replacement or internal cleaning.

  • @evalangley3985 · 6 months ago +1

    Ahh... Tim hasn't played The Witcher 3 or Nioh 2 in HDR then... they still look frigging impressive. Same for Resident Evil 2, Devil May Cry 5, God of War... or TLOU Part 1 (it's a PS4 port, FGS)... or the FFVII remake...
    Cyberpunk was released in 2020, by the way...

  • @liiich6175 · 6 months ago +1

    2080 Ti, 2560x1080, 144Hz user here... I use DLSS/FSR because the hit in quality is worth the lower noise/power and more FPS. I also lock my FPS at 90 for demanding games, as I feel 90Hz is more than enough and FPS dips won't really be noticed (compared to 144 FPS dipping to 75-80 FPS).
    To get back some of the sharpness, I found the Nvidia overlay works, but it uses at least 10W more compared to off, while the next quality step up in DLSS/FSR can use 25W+ more.
    In a perfect world, solid stable frames with all the quality settings maxed would be nice... but I am not keen on paying $3k+ for a GPU to do it. Even low res with DLSS/FSR looks fine if that's all you've ever seen, and if it gets you the frames to play and enjoy your games, then it's a win.
    I do agree, though: more power for the card and less spent on the RT aspect would have made for better game visuals and textures. Look at old games that didn't use that tech; they still look amazing.

  • @ycageLehT
    @ycageLehT 6 months ago +1

    11:45
    This is the problem.
    If you still had the 30-40% increase in rasterization in the different tiers, with the features they have added, at reasonable prices, you wouldn't see people complaining as much.
    As it is, they give you a slower product, you need to use the features to make up for the performance loss and they charge the living crap out of you for a GPU. NO, NONO NO NOOOOOOOOOOOOOOOOOOOOOOOOOOOO!

  • @iikon01
    @iikon01 6 months ago +3

    They should respect people like me who want native resolution.
    I don't want anything to ruin my picture or decrease my resolution.
    Why should I have to buy an RTX 4090 to play at 4K native?
    Before, even a GTX 1070 could do 4K native!

  • @Invesre
    @Invesre 6 months ago +2

    I bet in like 5 years most AAA games will require upscaling and even frame gen to run well. Dev costs will go down when you don't need to optimise the game, since they will just say you need to use those techs.

    • @hyperturbotechnomike
      @hyperturbotechnomike 6 months ago +1

      They already began doing this. Nothing new.
      I bet in 5 years the new RTX or Radeons are only faster because of newer upscaling technologies, and not because the GPU or memory itself got better. Nvidia will still serve middle-class GPUs with 8GB and a 64-bit bus width, but with DLSS 4.0 baked right into the firmware without being able to turn it off, and AMD will be on its fourth RDNA refresh with FSR7 permanently activated.

    • @ChrisM541
      @ChrisM541 6 months ago +1

      Unfortunately, that is, indeed, the way we're headed. The same parallel can be seen in pop 'music' spanning the last 20 years. Antares Autotune and Celemony's Melodyne (the real culprit) have now been abused to such an extreme extent that almost everyone mimes when playing 'live'.
      --> The ability of singers to sing is NOT a requirement.
      --> The ability of game devs to optimise is NOT a requirement.
      It's unfortunate that humans seem utterly incapable of taking more apples from that tree. The #1 problem is, there's zero control.

  • @DaviAtrock
    @DaviAtrock 6 months ago +1

    I'm still on a GTX 1080, and I played Dead Space in 2023 at 1080p using the in-game FSR option in balanced mode, with graphics set between medium and high, and I was able to lock 70 FPS in RivaTuner. While the image lost a little sharpness, I just added Nvidia sharpening via game filters (Alt+Z, Nvidia Experience filters), got back the sharpness that FSR lost, and the game looked amazing and played amazing. We could say it was a good FSR implementation, which just proves that the real problem with DLSS/FSR is poor implementation in some titles.
    That said, I tend to agree that raw performance was the better thing to pursue in graphics card evolution. There are currently some really good cards for 1440p, which to me is the resolution developers should be aiming to make overkill for mid-tier cards, because the real problem is 4K performance, which still isn't good without upscaling; devs and the industry are trying to skip development steps to reach a resolution that shouldn't yet be the focus, given the low number of users. I think we're reaching a point where there are only two options left for developers and for Nvidia/AMD's next generations: 1. DLSS/FSR should reduce the image quality hit of these techs, and/or 2. the cards should have enough power to reach 100+ FPS at 1440p without upscaling, like Pascal GTX cards were for 1080p at a certain point in the past.

  • @liaminwales
    @liaminwales 6 months ago +1

    I'll never forget being a kid waiting for the car to start to go to school; the street was full of the sound of cars not starting every morning.
    I think people mix up old games with good art direction with good graphics. Zelda Wind Waker still looks good, but it's thanks to art direction, not 'amazing graphics'.

    • @ChrisM541
      @ChrisM541 6 months ago +1

      Sentimental memories are generally viewed through rose-tinted glasses. The point here, though, is that upscaling is a crutch that will be used by more and more devs to release poorly optimised games. The bigger danger is that devs will start hard-coding upscaling to be turned on, covertly (pacifying everyone by implementing a faux upscaling switch in the menu that does little).

    • @liiich6175
      @liiich6175 6 months ago

      Though if we are comparing GPUs to cars... nowadays it's like the manufacturer is only trying to sell us Lambos and nothing else.

    • @ChrisM541
      @ChrisM541 6 months ago +1

      @@liiich6175 If there's enough buyers they'll keep doing this. It's truly eye-opening to see just how dumb the buyer is, dumb enough to forget, en masse, that it is they who control pricing.

  • @captainthunderbolt7541
    @captainthunderbolt7541 6 months ago

    I used to think that DLSS was a switch to enable extra performance. I now think of DLSS as a switch to enable dev laziness.
    Undoubtedly many of us are playing with degraded image quality today when compared to even two years ago, because we used to be able to play at native resolution, and now it is not uncommon for me to have to go as low as 1080p+DLSS just to get 60fps. This is especially the case on UE5 titles.

  • @faultier1158
    @faultier1158 6 months ago

    The thing with upscaling having better results at higher resolutions leads to an interesting effect of making higher resolutions more accessible. You don't want to enable upscaling when playing at 1080p, but if you can comfortably run a game at native 1080p, you could also run it at 2160p output with DLSS performance mode, which doesn't need that much more processing power while looking much better than native 1080p. You can now make proper use of a 4K monitor without owning a current-gen flagship GPU.

    • @evalangley3985
      @evalangley3985 6 months ago

      Higher resolutions are not more accessible.
      2160p actually become 1440p with Quality preset.
      1440p actually become 1080p with Quality preset.
      1080p actually become 720p with Quality preset.
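The preset arithmetic in the comment above is easy to check. A minimal sketch, assuming the commonly documented DLSS/FSR 2 per-axis scale factors (individual games can and do override these, so treat the table as an assumption rather than a spec):

```python
# Rough per-preset internal render resolutions for a temporal upscaler.
# The scale factors are the commonly documented DLSS/FSR 2 ratios; they
# are an assumption here, not a guarantee for any particular title.
PRESET_SCALE = {
    "Quality": 1 / 1.5,            # ~0.667 per axis
    "Balanced": 1 / 1.724,         # ~0.58 per axis
    "Performance": 1 / 2.0,        # 0.5 per axis
    "Ultra Performance": 1 / 3.0,  # ~0.333 per axis
}

def internal_resolution(display_w: int, display_h: int, preset: str) -> tuple[int, int]:
    """Internal render resolution for a given output size and preset."""
    s = PRESET_SCALE[preset]
    return round(display_w * s), round(display_h * s)

if __name__ == "__main__":
    for preset in PRESET_SCALE:
        print(preset, internal_resolution(3840, 2160, preset))
```

So "Quality" at a 2160p output renders internally at 2560x1440, while "Quality" at a 1080p output renders at only 1280x720 — which is why upscaling at 1080p is usually the least convincing case.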

  • @AtaGunZ
    @AtaGunZ 6 months ago

    I hate the ghosting DLSS does, but I really, really like that it removes aliasing. It's almost a must in some games. Playing Talos Principle 2 recently, the aliasing on water at a distance is really distracting, and only DLSS fixed it.
    Also, I really, really hate ray tracing's ray noise. The light changing over time is really distracting...

  • @_FLAK_
    @_FLAK_ 6 months ago

    4K upscaled to 5K VSR while using FSR Quality in some titles is chef's kiss

  • @kingcrackedhen3572
    @kingcrackedhen3572 6 months ago +2

    I want Raster over dlss fsr
    Native performance ftw

  • @gruntaxeman3740
    @gruntaxeman3740 6 months ago

    In my opinion, moving from direct rendering to deferred rendering degraded antialiasing image quality, and it also started wasting video memory bandwidth.
    FSR, however, is a great thing. With optimized assets and a game engine using direct rendering, hardware doesn't matter anymore. You can make the very same game scale from a mobile phone GPU to an RTX 4090.
    So I kind of like the idea of an optimized asset-creation workflow and engine, because gaming machines don't need an 800-watt PSU; they can be made to run on a couple of watts. I've run successful tests using optimized assets in Godot 3 + FSR, and when I calculated it, it was obvious that I can run photorealistic-looking 3D graphics on only 1 teraflop of GPU power using less than 50 GB/s of memory bandwidth. Sure, adding more bandwidth and flops allows cranking some details up, but it is possible to make huge improvements in efficiency.

  • @JeepBully
    @JeepBully 6 months ago +2

    Upscaling is one of the worst features on PC. I cannot unsee the judder, artifacting, shimmer, etc., even with DLSS 3.5. I literally cannot play with DLSS, FSR, frame gen, or whatever else they're using.
    But on console it works really well and isn't headache inducing.
    It's also false advertising from manufacturers: "this new card gets 100 frames more than the previous gen", when in real rasterized performance it's more like 20-30 frames, and at times less, depending on the card. And then they lock people out from using it unless they pay out the butt for a newer card, when it's just a software add-on and could run on anything below the 40 series, but Nvidia locks you out unless you pay 2-3 times what you paid for your current card. It's a scam. They can improve raster; look at Intel's GPUs.
    It's like buying a car they say gets 100 mpg, but only coasting downhill; in the actual real world you get 40 mpg.

    • @Aquve
      @Aquve 4 months ago

      Agreed. I recently started playing Cyberpunk and I had to research and test so much shit about all those "fancy technologies" before starting the game. And for the most part they're all flawed, with noticeable downsides and other issues. In the end I just disabled RT and scaling and went all ultra native, and that's it. Until there are affordable cards that can handle all the RT gimmicks at native 1440p, let alone 4K, it doesn't make much sense to use it.

  • @PoRRasturvaT
    @PoRRasturvaT 6 months ago +1

    The jump to 4k is pretty steep, and I wouldn't have got my monitor if upscalers weren't a thing. It would have required waiting for one or two more GPU generations to get there at the framerate I wish to have.

    • @ChrisM541
      @ChrisM541 6 months ago

      I hope you do realise that, at an upscaled 4K, you are looking at low resolution textures being upscaled. That, unfortunately for yourself, is night and day worse than a real, human-produced 4K texture. You should always be using textures generated for the 'native resolution of your monitor', and if your GPU can't handle that, you should be using a lower res monitor.

    • @faultier1158
      @faultier1158 6 months ago

      @@ChrisM541 Games usually enable a negative mip bias when upscaling is active, so that higher-res textures are loaded than would normally be at that resolution. That results in textures looking pretty much the same at native 4K vs upscaled 4K.

    • @ChrisM541
      @ChrisM541 6 months ago

      @@faultier1158 A negative mipmap bias simply follows AMD's own recommendations for upscaling as listed on the FSR 2 GitHub page, applying particularly to Starfield. This is still upscaling a low res texture, with the exact same issue...that it will always be significantly inferior to a real, human-produced 4K texture. That, I hope, seems reasonably obvious.
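For reference, the negative mip bias the two commenters are arguing about is normally derived from the ratio of render to display resolution. A minimal sketch (the optional `extra` offset is an assumption, modelling the additional constant some upscaler docs, such as the FSR 2 guidance mentioned above, recommend — check the actual documentation for the exact value):

```python
from math import log2

def texture_lod_bias(render_w: int, display_w: int, extra: float = 0.0) -> float:
    """Negative texture LOD bias so an upscaled frame samples roughly the
    same mip levels a native render at display resolution would.
    `extra` models any additional offset an upscaler's docs recommend."""
    return log2(render_w / display_w) + extra

# 4K output rendered internally at 1440p (a typical Quality preset):
bias = texture_lod_bias(2560, 3840)
print(f"{bias:.3f}")
```

The result is about -0.585, i.e. the renderer selects sharper mip levels than a native 1440p render would, which is why upscaled output can sample the same high-resolution texture data as native 4K.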

    • @evaone4286
      @evaone4286 6 months ago

      Depends on the game. The GTX 1080ti was already pretty much a 4K ready card. So you do have lots of games that are capable of running well. Over time as with any card, performance wanes especially when they release unoptimized games like we saw in 2023

  • @JoneNascimento
    @JoneNascimento 6 months ago

    Yes. It can be useful at 1440p and 4K, since cards still struggle there, but image quality is terrible below that. Games that rely on smart upscaling to get a decent framerate can be a terrible trend for the industry.
    Yet upscaling is something consoles have done for a long, long time, so if the upscaling algorithm can achieve better results than the ones consoles relied on back then, I think we have a win.

  • @atnfn
    @atnfn 6 months ago

    You certainly do remember old games looking better than they did. Like when some remastered game comes out, I often think: so what did they actually change? Didn't it always look like this? This is how I remember it looking (StarCraft 2). Other times, like Stronghold 2 Definitive Edition or whatever it's called, it actually looks nearly identical to the original release. They barely did anything to improve the graphics, other than making the animations more fluid.

    • @cheeeeezewizzz
      @cheeeeezewizzz 6 months ago

      I think the older Witcher 3 looks better than the next-gen updated one...

  • @nickglover
    @nickglover 6 months ago

    For me, I see upscaling (not frame generation) as a means of achieving a) a higher resolution at a given frame rate, b) a higher frame rate at a given resolution, or c) better graphics at a given frame rate (for example, being able to use RT without losing FPS). For all of those things, I think it's a great thing, though I would love for DLSS to go open source so everyone can benefit. Frame generation, on the other hand, is absolutely a net negative that just needs to go away.

    • @evalangley3985
      @evalangley3985 6 months ago

      You are NOT achieving a higher resolution. You are upscaling a lower resolution on your NATIVE resolution. You are simply playing at 1440p on a 2160p resolution. You are NOT playing at 2160p. Stop believing that garbage from Nvidia.

  • @robipindric7654
    @robipindric7654 6 months ago +3

    RT, DLSS etc. are the future of gaming, like it or not. People complained about every new thing that came along, like vertex shaders, pixel shaders, and AA, to name a few. People need to realize that AI-enhanced graphics are here to stay.

    • @SquarelyGames
      @SquarelyGames 6 months ago

      And I love it. There is nothing fake about it, just another part of the pipeline.
      "I don't want AA! It's fake smoothing of the pixels! Give me 8K instead. "

    • @evalangley3985
      @evalangley3985 6 months ago +2

      1080p forever...

    • @maverichz
      @maverichz 6 months ago +1

      COMPLETELY AGREE. People who say DLSS/FSR is just a fake thing are DELUSIONAL. I have been gaming for years with DLSS and it massively improves my gaming experience, boosting FPS while retaining image quality. Same thing with FSR 3.
      I don't understand all those resolution purists who are willing to sacrifice tons of FPS just for the sake of a slightly better image (and in my experience, in most cases I can't tell the difference between DLSS Quality and native), which is silly.
      The only bad thing about DLSS/FSR comes from the developer side; they are becoming more and more lazy about optimizing their games.

    • @MrBALAKUMARA
      @MrBALAKUMARA 6 months ago +2

      Yeah, "future", sure! DLSS is the kind of "technology" that has been used since the start of the console era; it's the same upscaling idea used for years, and there is NO way DLSS/FSR will give 100% of native output (maybe 80-90%). If you argue that DLSS is the same as native, or has better image quality, then you are a flat-out liar.
      Also, about RT: any game with forced RT will considerably fail in the market. I agree that Avatar and Alan Wake have great graphics, but these games will vanish into thin air, like Lords of the Fallen, Forspoken, Fort Solis... they will fail like a landslide.
      This is because with rasterization itself we can get RT-like lighting; the best example is Horizon Forbidden West, which is pure rasterization, no RT used. RT is overrated IMO; the lighting just makes things shiny. The only reason devs use RT is that it's easy to implement and no hard work is needed... lazy devs! They can just brute-force it. Naturally, gamers won't even care about the unoptimized games; THIS is going to be the future, like it or not.

    • @robipindric7654
      @robipindric7654 6 months ago +1

      @@MrBALAKUMARA You do realize we have almost reached 3nm tech and that we cannot make it smaller than that? We literally can't. That means we need to make GPUs even bigger than they already are OR generate FPS in some other way.
      They will have to utilize the allocated space better than they have so far. And I can guarantee you that the necessary performance will be achieved through one form of DLSS or another. We have no choice.
      Is it without flaws at this stage? No, of course not! But they will find a way, like they always do. They didn't need to improve the technology until now because they could just brute-force the performance. That isn't possible anymore. As I said, we heard all these excuses with earlier generations of tech. You were all saying the same things.
      I will repeat it once again because this isn't hard to understand: they do not have a choice.

  • @EXOWill
    @EXOWill 6 months ago

    I look at it as if it's a technology that's there if you feel the need to use it. I did however buy a 40 series card to see how it works on my system.

  • @johntet
    @johntet 6 months ago

    Someone remind them that 5 years ago was 2018, not 2015 (The Witcher 3). Look at AC Odyssey and how stagnant AC games have been, for example.

  • @evilhorde8151
    @evilhorde8151 6 months ago

    I recently tried frame generation for the first time in Cyberpunk 2077, and I have to say the extra frames aren't worth the increased latency.
    Nvidia Reflex helps a bit, but you can still clearly feel the lag.
    While DLSS upscaling is a cool feature, DLSS frame generation is a 100% gimmick and a net negative.
    It may look cool to have more frames in a YouTube benchmark, but the increased latency is triggering and feels like your entire system spec has regressed.
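The latency complaint above has a simple structural cause. A rough back-of-envelope model (an assumption-laden sketch, not measured data): an interpolated frame can only be shown after the *next* real frame exists, so the pipeline must hold each real frame for at least one extra real-frame interval.

```python
# Toy latency model for interpolation-based frame generation.
# This is a back-of-envelope sketch, not measured data: real pipelines
# also add render queueing, Reflex, and display latency on top.
def frametime_ms(fps: float) -> float:
    """Duration of one frame interval in milliseconds."""
    return 1000.0 / fps

real_fps = 60                     # frames the GPU actually renders
presented_fps = real_fps * 2      # what the FPS counter shows with frame gen
held_ms = frametime_ms(real_fps)  # each real frame is held until the next exists

print(f"presented: ~{presented_fps} fps, extra hold: >= {held_ms:.1f} ms")
```

So at 60 real fps the counter may read ~120, yet input-to-photon latency is at least a real 60 fps pipeline's latency plus roughly 16.7 ms of buffering — which matches the "more frames but laggier" experience described above.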

  • @paulmememan508
    @paulmememan508 6 months ago

    I think the first guy meant that upscaling is inherently an inferior visual experience, comparing apples to apples.

  • @xSkittlesxNewbx
    @xSkittlesxNewbx 6 months ago +1

    With DLSS, how it affects visuals heavily depends on the game: you start noticing lines, like fencing where there shouldn't be any, and sometimes in areas with detailed textures random black spots form, as if it assumed that's what the shadows were meant to look like. So when it comes to visual fidelity, it all depends on how sensitive you are to picking up minor changes like that, and on the game. For me personally, it sucks, and once I spot it, I cannot unsee it; so personally, I dislike DLSS and just wish they made the GPUs better. Originally I was happy for it, thinking my friends and other people would get a better experience on lower-end GPUs, but at times the jump isn't much, since it benefits from the game already running at higher FPS, or you hit visual issues like I did. Additionally, while it could benefit 1440p, a lot of people still play at 1080p, so DLSS feels like a tool for people who already spent a lot of money on a GPU, so they don't notice that the actual difference in performance without DLSS is marginal.

  • @evalangley3985
    @evalangley3985 6 months ago

    Upscaling is only good when you need to play at a different resolution on a native higher resolution monitor that is incompatible, AKA 1440p on a 2160p monitor.

  • @Nateyo
    @Nateyo 6 months ago +1

    I was able to enjoy the PC port of god of war with my RX590 at high settings thanks to them adding FSR 2.0 to it

  • @Bekcak
    @Bekcak 6 months ago

    DLAA might be THE best thing to come along in years for AA. TAA, MSAA, and FXAA all had too many disadvantages, but now I can mod DLAA into many games I want and get a crisper look.

  • @-Rizecek-
    @-Rizecek- 6 months ago

    A topic that is dealt with more and more often. Yes, it ruins optimization: games from 2024 run with DLSS at 1440p the way I would expect them to run at native.

  • @supremeboy
    @supremeboy 6 months ago

    The question should have been whether, in the near future, game devs will design games where you have to enable DLSS/FSR to play at the high-quality preset even on high-end cards. The trend is already going in that direction. If AMD can't catch up in the RT department and FSR picture quality, then it's really bad for gamers: Nvidia can push their prices even higher if future games require frame generation. So I kind of agree with the question there. Five years ago we all looked at raster performance to judge how good a card was and how games ran on it. Now we have to consider FSR/DLSS to be somewhat future-proof, because it's lazy coding to rely on AI and frame generation to make up the game's FPS.

  • @andyflange
    @andyflange 6 months ago

    I'm against upscaling as a selling point. If games have pushed too far ahead of what the hardware is capable of at native resolutions then it's either poor optimisation or evidence of a lack of progress on the hardware technology from generation to generation.
    I'm much more likely to drop to a lower resolution to stabilise a frame rate, just as I always have in the past.
    I also couldn't care less about ray tracing at the moment since most affordable hardware can barely run it and upscaling is often needed for a passable frame rate.
    Finally, frame generation is awful. My eyes can't tolerate it just like those terrible "smooth motion" settings that LCD/LED TV's have... Just looks really fake and causes artifacts which (no matter how small) I wouldn't put up with.
    If a game won't run well on my current hardware then I simply don't buy it until new hardware has caught up enough to run it natively or it has been optimised properly. Bonus is I generally get the game cheaper that way too once they're not too new to be included in the sales.

  • @chasetheswift
    @chasetheswift 6 months ago +1

    Yes.

  • @krazyolie
    @krazyolie 6 months ago +1

    I rather resent that frame generation is under the DLSS/FSR banner, as it's a totally different technology and should be treated as such. Upscaling resolution is inherently good; the GPU can do it better than a TV/monitor. Frame generation is something some people will like and some won't.

    • @krazyolie
      @krazyolie 6 months ago

      It does somewhat bring things back to CRT displays, where you could effectively set whatever resolution you wanted; you didn't have to match the panel.

  • @evalangley3985
    @evalangley3985 6 months ago

    Image Fidelity priority in order of importance...
    Resolution > HDR > Framerate > Ray Tracing (LAST)

  • @gilot19
    @gilot19 6 months ago

    Upscaling felt like magic the first time I tried it. Also, I don't think the hardware can advance much more and make massive jumps in performance.

    • @evaone4286
      @evaone4286 6 months ago

      "The hardware isn't capable" they say that every gen. They are more than capable of producing quality products it's just they like scamming people out of lots of money for weaker tech

  • @feliperomo3015
    @feliperomo3015 6 months ago

    They're definitely a W for steam deck/Ally/Lenovo

  • @thenext9537
    @thenext9537 6 months ago

    Not a fan of brute force for anything, i.e. a console port slapped onto PC for the giggles, relying on brute speed to run it. One game that was highly optimized when I had my 2080 Super was Halo Infinite. I played 3/4 of that game on an Intel 6700K OC'd to 5GHz, 32GB RAM, and the 2080 Super, hitting 100+ FPS on ultra. Went to a 7900X3D and 3080, and it hit a locked 144 with vsync. Wonderful experience.
    Then Starfield came, and while it's 60+ FPS at 1080p, it feels like it should run better considering the fidelity. For DLSS, I run Quality. It depends on the visuals; I say play with it and get what you want.

  • @Toutvids
    @Toutvids 5 months ago

    We still play those games from 5 to 10 years ago. I played Oblivion a couple weeks ago and finished a run through the main story. Most of my racing games are 5+ years old at this point and they are just as good graphically, in many cases. Kinda dumb to claim we don't play those games all the time, LOL. Those games were better than today's garbage.
    Upscaling is a massive con and anyone thinking they are getting a better product by lowering resolution natively and having ai 'generate' the difference is delusional at best. I ONLY buy products based on rasterized performance. I skip over any dlss or fsr benchmarks because I never use them and never will.

  • @walker2006au
    @walker2006au 6 months ago

    There's a group on Reddit with 6K members hating on TAA, craving the old jaggies

  • @G-Clips54
    @G-Clips54 6 months ago

    This video was re-uploaded a 2nd time, if I'm not mistaken 😂

  • @manigmaX
    @manigmaX 6 months ago

    DLSS is amazing. I never play games without it. I wish AC Valhalla had gotten a DLSS implementation too. I am playing it now on my LG C2 55" at 5248x2952. Without DLSS, and with a reduced power limit, my 4080 sits at a constant 275W load at 55~100 fps. With DLSS, things would have improved a lot.

  • @zeljkoklepac3180
    @zeljkoklepac3180 6 months ago

    I can't help but comment (as a gamer and a technician): at launch, a game runs well on current gaming hardware, but after a ton of updates etc., current GPU hardware becomes unusable. Example: I bought the game Icarus; it ran super on an RX 550 2GB through an RX 570 8GB, but as time passed, even my new RX 6700 10GB is only sort of enough to play on medium settings. Updates and bug fixes are good to have, don't get me wrong, but sometimes they push PC usage to and over its limits. Can't be helped.

  • @RAM_845
    @RAM_845 6 months ago +1

    I've been following you guys for years, but excuse my ignorance: I know Tim's the monitor man, but what's the other dude's name? It escapes me... the older I get, the worse my memory gets. Bloody ADHD; I'll be hitting 40 this year.

    • @lepari9986
      @lepari9986 6 months ago

      It was either Steve or Eustace, I'm sure of it.

    • @Enzed_
      @Enzed_ 6 months ago +1

      His name's Steve

    • @RAM_845
      @RAM_845 6 months ago +1

      @@Enzed_ Appreciate it mate :)

    • @lolish1234
      @lolish1234 6 months ago

      40 isn't even close to older :D

  • @gusmlie
    @gusmlie 6 months ago +1

    Star wars jedi survivor is still busted.

  • @avibrenner1580
    @avibrenner1580 6 months ago

    I think realistically some technologies are ahead of their time, like path tracing. GPUs can't handle it well enough today, and the technology basically screams upscaling. There are also games that see upscaling as an opportunity not to put in the work to optimize, relying on the upscaling technology to fix performance deficiencies.

  • @jamesgodfrey1322
    @jamesgodfrey1322 6 months ago

    DLSS/FSR - upscaling and frame gen - should be a bonus, not a requirement for gaming.
    First you need a base to upscale from, i.e. a good GPU, and if you have a good GPU, chances are you're not going to upscale; different games, different needs.
    My fear is games needing fake frames and upscaling just to run in the first place; that's going backwards.
    For my eyes, I don't use upscaling. It's subjective, but I do find my eyes get tired quicker if I'm gaming with upscaling, even taking breaks; but it's a choice, as I have a really good GPU.

  • @snozzmcberry2366
    @snozzmcberry2366 6 months ago

    It's simply too early to say. The market has been so incredibly volatile in recent years for multiple enormous reasons (when else has the GPU market seen such a one-two punch as the crypto craze & pandemic back to back?), coupled with capitalistic opportunism, that things are still up in the air.
    How will low- and midrange GPUs pan out? How will game optimization pan out? How will the maturation of these software features like upscaling and ray tracing pan out? To what extent will Radeon & Intel succeed or fail in disrupting the present effective anti-competitive, anti-consumer Nvidia GPU monopoly? What the actual fuck is going on with AI and how will that end up affecting everything (or not) in 5, 10, 20 years? Where the hell are we on the intersectional bell curve of all these things coming together as one overarching "games market" development?
    There's so much going on and so many things up in the air, and it hardly looks like the global landscape is headed for peace and quiet, that it's just... It's impossible to form any kind of educated guess. We'll just have to wait and see and hope.

  • @romangregor4552
    @romangregor4552 6 months ago

    I play native on Linux; I don't need upscaling

  • @squidtest
    @squidtest 6 months ago

    If you have a high-end GPU, why bother caring about upscaling when you can play at native quality at a high frame rate?
    Those of us who prefer native quality over upscaling can just turn the feature off, because, you know, it's optional.
    But for people who only have a lower-end GPU (and whose GPU and game meet the upscaling requirements) and want a higher frame rate than native resolution can deliver: turn upscaling on, and you might get a better experience, BUT with some compromises.
    It's just an optional feature the company gives you access to. You can turn it off if you hate it, and turn it on if you need it.
    Now you have an additional optional way to increase your frame rate without lowering your graphics quality settings.

  • @Abydos01
    @Abydos01 6 months ago

    1:38 if you go back far enough, you'll see that every AA solution is just a band-aid for LCD monitors being crap compared to CRTs. CRT subpixel smoothing, or whatever it's called, was better than any AA, and when the industry switched to crisper LCD displays, text became sharper, but game pixels were no longer smoothed by the monitor. So AA was invented, and it's still crap.

  • @madffx1986
    @madffx1986 6 months ago

    Before, what the card could do was what you paid for. Now imagine if integrated graphics could utilize DLSS or FSR and reach the performance of a 1660 or 3050.

  • @starc.
    @starc. 6 months ago

    Latency is KING in gaming. I didn't know until getting a 4090 rig; I thought 'yay, high FPS, nice graphics', but no, it's latency.

    • @starc.
      @starc. 6 months ago

      High-latency gaming destroys developing minds by slowing the process down. Imagine if you have input faster than you can process; that drives evolution forward, but having inputs slower than your potential isn't going to push biology to get better, only worse over prolonged time.

  • @ivanjovanovic7118
    @ivanjovanovic7118 6 months ago

    Negative. As a graphic designer, I can notice the degradation in sharpness versus native 4K, and for now nothing is close to native. It's good to have the choice, because it can pass in some super fast titles, but in third-person single-player adventures, RPGs, or isometric games it is not welcome.

  • @wawaweewa9159
    @wawaweewa9159 6 months ago

    Short-term benefit, long-term pain as devs get lazy. Devs nowadays are shit, both at optimisation and at the games they make.

  • @jankypox
    @jankypox 6 months ago

    Nostalgia Goggles! Thank goodness my eyesight is poorer now that I’m older 😂

  • @TewaAya
    @TewaAya 6 months ago

    Games look like generic fantasy anime from the last decade: beautiful fidelity but horrendous frame rates. I wish old/classic shows could be reworked, if Japan had public domain, like Dororo.

  • @sapphyrus
    @sapphyrus 6 months ago +1

    People really forget the days of FXAA, which aren't that far behind us. Even TAA, which was a massive improvement after that, looks bad compared to DLSS. The only thing that looks outright bad is FSR1, and FSR2 still isn't good enough. DLSS simply looks the best since flat-out supersampling, and I've seen most everything since the Magnavox Odyssey 2.

    • @ChrisM541
      @ChrisM541 6 months ago

      Lol, ok Jensen ;)
      Let me educate you... DLSS/FSR/XeSS/etc. are ALL based on the exact same upscaling algorithm ;)

    • @sapphyrus
      @sapphyrus 6 months ago

      @@ChrisM541 Ok Lisa, one is way worse than the others, so.

    • @ChrisM541
      @ChrisM541 6 months ago

      @@sapphyrus It's all about polishing a turd. Nothing more, nothing less. Your Jensen master clearly is up to his neck in turd polishing ;)

    • @sapphyrus
      @sapphyrus 6 months ago

      @@ChrisM541 Imagine being a company simp.

    • @ChrisM541
      @ChrisM541 6 months ago

      @@sapphyrus You should know ;)

  • @Void-uj7jd
    @Void-uj7jd 6 months ago +1

    It's a curse, all new games using it are an unoptimized pile of Gaza rubble.
    5800X3D 7800XT user

  • @Beeterfish
    @Beeterfish 6 months ago +6

    We need to get rid of the ray-grafix nonsense and instead concentrate on making textures and global illumination better without the hassle of GPU-killing path tracing.

    • @ChrisM541
      @ChrisM541 6 months ago +2

      Exactly. Always use human gfx-artist-generated textures native to the resolution you're playing at, not upscaled low-res textures; using textures that way is insane.

    • @SparkerR1987
      @SparkerR1987 6 months ago +1

      I wouldn't call it nonsense. RT/PT has the potential to completely replace how game engines render graphics. It is the future. We are just in the very early stages of RT/PT.
      I remember in the 90s when GLquake released and everyone was running out to buy a voodoo or nvidia graphics card to play it in hardware acceleration, even though texture compression made the game look far worse than the software renderer. Now, we don't even consider software rendering a thing and in time, rasterized rendering will be a thing of the past as well.

    • @ChrisM541
      @ChrisM541 6 months ago +1

      @@SparkerR1987 I remember saving up enough money to do the voodoo sli thing. That was a game changer. I totally agree that full frame RT/PT is the future, though it's a damn shame companies (Nvidia in particular) have to be so messy/underhanded to get there.

    • @jamesbyrd3740
      @jamesbyrd3740 6 months ago +1

      @@SparkerR1987 called it a gimmick with the 2080 & it's still just a gimmick. amd decides the fate of the tech with the next gen of consoles. until then it's just nvidia gimpworks bs

    • @SparkerR1987
      @SparkerR1987 6 months ago +1

      @@ChrisM541 I agree. Nvidia is definitely hindering progress by gatekeeping AI and overpricing their hardware. AMD is not really innocent either since they are competing with Nvidia's pricing.
      @jamesbyrd3740 Funny you called RT/PT a gimmick 5 years ago and still call it a gimmick, yet all major GPU and CPU manufacturers are adding hardware RT/PT and AI support. It's a slow rollout, but RT/PT is the future along with AI rendering assistance.

  • @rambo9199
    @rambo9199 6 months ago

    Am I the only one shocked by 1080p being referred to as low resolution?

    • @iikon01
      @iikon01 6 months ago +1

      Nope.
      In 2024, 1080p is like 720p was 5 years ago.
      Most gamers are upgrading to 1440p; you should too.
      A year from now, 8K will officially arrive in the gaming industry as well.

    • @rambo9199
      @rambo9199 6 months ago

      @@iikon01 The acceleration is too much... but then again, hardware seems much, much more expensive here than in NA. It's just too expensive for most people to move beyond 1080p, since it would take a new GPU, a new PSU, and new monitors, which is a small fortune... I guess the third world should just stop AAA gaming soon at this pace. First-worlders think GPUs are expensive now... they have no idea. It will take at least 5 more years for most people to actually accept 1080p as being low res.

    • @iikon01
      @iikon01 6 months ago

      @rambo9199 I'm from the 3rd world too.
      My monitor right now is a 1440p from MSI, and my GPU is an RTX 3070 Ti.
      True, being from the 3rd world made it hard to build a PC like that, but nothing is impossible, bro. If you put in the hard time, say a year of working extra and saving money, even in the 3rd world you can build a high-end PC setup.
      True, it's cruel that 1st-world people can build the strongest PC in a month or less, but that's the life; we didn't choose to be born in the 3rd world, you know. We should accept it, make no excuses, and try harder to get what we really want instead of just giving up.

    • @rambo9199
      @rambo9199 6 months ago

      @@iikon01 Haha, I am so careful with my money; a few years back I bought the most expensive monitor I have ever had... a 75 Hz 24". Very few people stay up to date with what devs THINK should be the standard; bubbles within bubbles, and it's just too expensive when you have more pressing expenses... like food and rent, never mind the electricity when it bothers to be on. You often hear the loudmouths bragging, but they cannot numerically be that many, or they got cheap models pretending to be something else. Speaking for myself, I am going to have to stick with my 1050 Ti until it dies... I just cannot justify paying 5 times what I did for it for a replacement combo, just to pretend 1080p is now low resolution when it's still got plenty of life left in it... especially with the current gen being obvious extortion.

  • @yobson
    @yobson 6 months ago

    RDR2 came out in 2018 and is still one of the best looking games out

  • @josiahsuarez
    @josiahsuarez 6 months ago

    Fake frames isn't the way things used to be 😢

  • @ghstbstr
    @ghstbstr 6 months ago

    Talking about upscaling: when is Cyberpunk 2077 going to get official FSR3 and AMD frame gen? I want to be able to play Cyberpunk 2077 with my 7900 XTX at 4K at the highest graphical settings with full ray tracing/path tracing/ray reconstruction and get at least 90fps, preferably 120fps. I hope the devs add it soon.

    • @maxiejohnson8356
      @maxiejohnson8356 6 months ago

      U can mod it in right now for a glimpse of the future

    • @pR0ManiacS
      @pR0ManiacS 6 months ago

      Tbh you bought the wrong GPU for RT; you should know that by now. I'm an AMD user too, but overspending for RT isn't worth it for me, as there are very few games that actually require me to use RT and don't work natively on AMD. When you buy a card, see what games you're going to play and buy accordingly. The FSR3 mod by LukeFZ is out there already. You could also try the ultra PT/RT mod; it's basically PT with fewer rays and works very well for your fps, only needing RT lighting on medium or ultra.

    • @ghstbstr
      @ghstbstr 6 months ago

      I don’t do mods.

    • @maxiejohnson8356
      @maxiejohnson8356 6 months ago

      Strange response from a PC gamer lol, but I get you, mods can be tedious and troublesome to deal with @@ghstbstr

  • @mariop8101
    @mariop8101 6 months ago

    I always use DLSS with my 3060 Ti; without it, it's impossible to play even at 1080p.

    • @evalangley3985
      @evalangley3985 6 months ago

      WTH are you playing then? Deactivate ray tracing, problem solved. Anyway, you are not investing enough in your GPU to turn that feature on.

  • @aidenquinn19975
    @aidenquinn19975 6 months ago

    In my opinion, the only game that uses FSR well is Cyberpunk 2077. Anything else using FSR has that annoying shimmering/halo effect around characters.
    DLSS will always bother me because of the smearing of the generated frames.
    I'm disappointed with rasterized performance in the current gen of PC gaming, but there's not much I can do about it besides complain.

  • @hmst5420
    @hmst5420 6 months ago

    What I hate most about modern graphics is grainy shadows. They're a plague on PC games.

  • @Thor_Asgard_
    @Thor_Asgard_ 6 months ago

    What? Witcher 3 Next Gen still looks amazing! The only thing that never looked good is the cutscenes. The only truly next-gen-looking games to me are the Burning Shores DLC for Forbidden West and Cyberpunk with PT, but not the unrendered cutscenes. Everything else looks cheap to me. Especially that overhyped Alan Wake 2; it doesn't look good at all to me.

  • @evalangley3985
    @evalangley3985 6 months ago

    Yes it is. The same as ray tracing... it gives developers a pretext to be lazy by skipping optimization and just implementing a bunch of features that downgrade the experience and the image fidelity.

  • @Arigal3
    @Arigal3 6 months ago +2

    While in general I can agree that these technologies are a net positive, I'm disappointed with the results: they're only useful when I already have enough performance in the game, and part of my brain thinks the resources used to develop upscaling tech could be better spent on other performance settings.

  • @walker2006au
    @walker2006au 6 months ago

    People don't seem to understand just how demanding real-time ray tracing is. It's essentially early access, it's that demanding. DLSS just allows us to experience it earlier at playable frame rates.
    Stuff like Starfield is the issue. The game doesn't even look that good or have any groundbreaking tech, and it still ran like absolute crap at launch. I guess they relied on FSR too much.

  • @Saintedlight
    @Saintedlight 6 months ago

    The Lord of the Rings: Gollum is worse in every way than even the old version of The Witcher 3. That's a triple-A game that came out May 25, 2023; The Witcher 3 came out May 18, 2015. That's almost exactly 8 years apart, but Gollum looks like a budget game from 2015.
    This is just one example, and yes, it's not DLSS's fault that the game is bad. But it proves that not everything made now is better than it was 5-10 years ago.

  • @mrnicktoyou
    @mrnicktoyou 6 months ago +6

    Older games definitely looked cleaner. I just played Dead Space 2 and it looks so crisp! Even DLSS adds a layer of blurriness to distant objects. Try Cyberpunk with just straight raster and it looks much cleaner. But, DLSS makes close objects look so clean and spectacular so it's a win some, lose some.

    • @user-dp8ez9ik9e
      @user-dp8ez9ik9e 6 months ago

      Without granular detail, the textures, small particles, and objects lack the intricacy and depth seen in today's games, which results in a cleaner visual perception.

    • @sapphyrus
      @sapphyrus 6 months ago

      It was either FXAA with garbled mess or jagged ladders everywhere. I'll take DLSS with DLDSR that can draw and show distant objects that weren't even rendered in native before.

    • @faultier1158
      @faultier1158 6 months ago +1

      You can also add contrast-adaptive sharpening via ReShade if basic DLSS output is too blurry for you. It works quite well, and you can seamlessly adjust the strength of the sharpening effect.
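For illustration only, the idea of an adjustable sharpening pass can be sketched in a few lines. This is a toy unsharp-mask-style filter, not ReShade's actual CAS shader; the `strength` parameter here plays the role of the slider you would adjust in ReShade.

```python
def sharpen(img, strength=0.5):
    """Sharpen a 2D grayscale image (list of lists of floats in 0..1).

    Each interior pixel is pushed away from the average of its 4
    neighbours by `strength`; border pixels are left untouched for
    simplicity. Toy illustration only, not a real shader.
    """
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbours = (img[y-1][x] + img[y+1][x] +
                          img[y][x-1] + img[y][x+1]) / 4.0
            detail = img[y][x] - neighbours       # high-frequency component
            val = img[y][x] + strength * detail   # boost it
            out[y][x] = min(1.0, max(0.0, val))   # clamp to valid range
    return out
```

Real CAS additionally adapts the boost per pixel based on local contrast so already-sharp areas are not over-sharpened; this sketch only shows why a strength knob gives a seamless range of results.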

  • @evalangley3985
    @evalangley3985 6 months ago +1

    Upscaling is NOT native, and yes, it looks WORSE than native.
    DLSS and FSR Quality at 2160p is in reality 1440p.
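The arithmetic behind that claim is simple: each quality mode renders internally at a fixed fraction of the output resolution per axis. The scale factors below are the commonly documented DLSS 2 / FSR 2 values; exact factors can vary by title and version.

```python
# Per-axis internal render scale for each upscaler quality mode
# (commonly documented DLSS 2 / FSR 2 values; titles may differ).
SCALE = {
    "Quality": 1 / 1.5,            # ~0.667 per axis
    "Balanced": 1 / 1.7,           # ~0.588
    "Performance": 1 / 2.0,        # 0.5
    "Ultra Performance": 1 / 3.0,  # ~0.333
}

def internal_resolution(out_w, out_h, mode):
    """Return the (width, height) actually rendered before upscaling."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)
```

So 4K (3840x2160) with Quality mode renders 2560x1440 internally, which is exactly the "Quality at 2160p is really 1440p" point.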

  • @blaster-vv8so
    @blaster-vv8so 6 months ago +2

    5 years ago was 2018. We had games like God of War, Spider-Man, Red Dead Redemption 2, Forza Horizon 4, Far Cry 5, and Shadow of the Tomb Raider that many people still think look stunning today; many are in your suite of testing today. You cannot tell me new games look that much better, or that the massive GPU requirements are worth it.

  • @histerical90
    @histerical90 6 months ago +2

    I think games should be optimized or created so that on the latest hardware (a 4090 in this case) they run native 4K 60 with the settings maxed out, and if you want higher fps, then upscaling should be a thing. Nvidia is just selling DLSS3 with this generation of video cards, which is implemented in a handful of games and totally useless in any older titles. I have a library of ~1k games on Steam and I can count on my fingers how many of them have a DLSS3 implementation. At least AMD is trying to improve the whole experience with driver-level frame gen, which in theory should work in all DX11/12 games; for now it's not great, but if they make it work properly, that would be huge for them and a worthy feature to get.

    • @tomorpedreiro3032
      @tomorpedreiro3032 6 months ago

      "If they make it work properly" is the one thing that will never be realised. Even though FSR has been open tech for some time, DLSS is still in more games, since it's just way better to update the DLSS library yourself versus waiting on devs. Also, since 80%+ of all AIB cards are still sold by Nvidia, it makes no sense to focus on little AMD.

    • @nossy232323
      @nossy232323 6 months ago

      But why would you need frame gen in older games? You can probably run those on a potato!

    • @leucome
      @leucome 6 months ago

      @@nossy232323 Reducing power consumption on a laptop or portable console, for example, is an actual good use even for an old game. But yeah, on a desktop it is not that useful for old games.

    • @nossy232323
      @nossy232323 6 months ago

      @@leucome But you DO NOT want frame gen when your normal frame rate is low. You will get too much lag + visual artifacts.
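That warning can be sketched with a deliberately simplified model (illustrative numbers, not measurements): interpolation-based frame generation holds back one real frame before it can show the generated one, so displayed fps roughly doubles while latency stays tied to the base frame time. The `overhead_ms` constant below is an assumed fixed cost, not a figure from any vendor.

```python
def framegen_effect(base_fps, overhead_ms=3.0):
    """Toy model of interpolation-based frame generation.

    Displayed fps roughly doubles, but the interpolator must wait for
    the *next* real frame before presenting the generated one, so
    latency grows by about one base frame time plus a fixed overhead.
    Purely illustrative; real pipelines add buffering and latency
    mitigation on top of this.
    """
    base_frame_ms = 1000.0 / base_fps
    displayed_fps = base_fps * 2
    latency_ms = base_frame_ms * 2 + overhead_ms
    return displayed_fps, latency_ms
```

At a 30 fps base you get a smooth-looking 60 fps but roughly 70 ms of latency, whereas at a 120 fps base the penalty is only about 20 ms; that is why frame gen feels fine on top of an already-high frame rate and bad on a low one.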

    • @c523jw7
      @c523jw7 6 months ago +2

      AMD took away Nvidia's frame gen advantage, which is great.
      Now Nvidia doesn't have that BS to try and convince you their GPUs are worth those ridiculous prices.
      I say this and I've got a 4080. Hopefully AMD gets better at RT and undercuts Nvidia.