Is native resolution for PC gaming a thing of the past?

  • Published: 1 Oct 2024
  • September Q and A:
    Steve and Tim discuss whether FSR and DLSS improvements have made native resolution redundant.
    Join us on Patreon: / hardwareunboxed
    Join us on Floatplane: www.floatplane....
    Buy relevant products from Amazon, Newegg and others below:
    Radeon RX 7900 XTX - geni.us/OKTo
    Radeon RX 7900 XT - geni.us/iMi32
    GeForce RTX 4090 - geni.us/puJry
    GeForce RTX 4080 - geni.us/wpg4zl
    GeForce RTX 4070 Ti - geni.us/AVijBg
    GeForce RTX 3050 - geni.us/fF9YeC
    GeForce RTX 3060 - geni.us/MQT2VG
    GeForce RTX 3060 Ti - geni.us/yqtTGn3
    GeForce RTX 3070 - geni.us/Kfso1
    GeForce RTX 3080 - geni.us/7xgj
    GeForce RTX 3090 - geni.us/R8gg
    Radeon RX 6500 XT - geni.us/dym2r
    Radeon RX 6600 - geni.us/cCrY
    Radeon RX 6600 XT - geni.us/aPMwG
    Radeon RX 6700 XT - geni.us/3b7PJub
    Radeon RX 6800 - geni.us/Ps1fpex
    Radeon RX 6800 XT - geni.us/yxrJUJm
    Radeon RX 6900 XT - geni.us/5baeGU
    Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
    Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
    FOLLOW US IN THESE PLACES FOR UPDATES
    Twitter - / hardwareunboxed
    Facebook - / hardwareunboxed
    Instagram - / hardwareunb. .
    Music By: / lakeyinspiredg. .

Comments • 444

  • @DKTD23
    @DKTD23 1 year ago +99

    Native res is the foundation regardless of how much upscaling technology evolves.

    • @AdrianValenciaYT
      @AdrianValenciaYT 1 year ago +1

      Yep, but it's also nonsense to test a 4060's performance in Cyberpunk at 4K with Path Tracing and RR on lol. Nvidia's bet is on making rasterization more efficient with AI technologies, nothing more. If you can make this 4K game run smoother while it still looks 4K, you've got a win. It's a necessity for making RT approachable without sacrificing too much fps; since RT is actually the future of gaming, both Nvidia and AMD need to find a way to make it sustainable. If we could run all this new tech natively, there'd be nothing to discuss.

    • @blkspade23
      @blkspade23 1 year ago +1

      @@AdrianValenciaYT What's funny is that upscaling should be more relevant the further down the product line you go. I don't care about it for a GPU I'm paying more than $1000 for, because I'm paying for more native performance. I have a laptop with an RTX 3060 and a 1080p 165Hz panel. Cyberpunk is still shit on it with RT and DLSS. Not that I bought it for gaming (or really care that much about RT), since I prefer the 3440x1440 screen on my desktop, but very few games I play get anywhere near 165fps on the laptop; anything north of 60 is generally fine native. I'd only use DLSS to make an otherwise unplayable game playable, but it would still barely be 60fps.

    • @AdrianValenciaYT
      @AdrianValenciaYT 1 year ago +1

      @@blkspade23 Of course, but you said it yourself: resolution matters! Also, a laptop 3060 is not comparable with a desktop 3060. I'm running just that at 1440p and getting away with it. I (of course) won't turn Overdrive on; let's also understand limits. There are demanding games and poorly optimized games; you can't blame a 3060 for that. That card came out Feb 2021, it's been a while, you know? 😂
      The other thing: ask a 4090 (a $2000 GPU, not even $1000) if it can handle Path Tracing in Cyberpunk 2077 natively. I think you already know it can't. That's the whole point, we're trying to run Path Tracing in real time for video games 🤯 That should be considered a freaking miracle, but people just can't be surprised by anything nowadays. DLSS, FSR and XeSS matter; people just don't pay much attention to them because they're so used to running everything on native raster. Next gen won't be so gentle.

    • @blkspade23
      @blkspade23 1 year ago

      @@AdrianValenciaYT Yeah, my point is, it's a GPU I can't change, on a panel I wouldn't change. I'm going to prefer native until I'm forced otherwise. For far less intensive stuff I could even hook it up to my 34" ultrawide. It's pretty much where DLSS should be a selling point. DLSS at the high end is just a solution looking for a problem. PT is just them creating the problem. The hardware isn't really there for it yet, and no one was really asking for it. Standard RT certainly gets the point across, and no one really focused on playing games was nitpicking the potential inaccuracies of shadows.

    • @T1tusCr0w
      @T1tusCr0w 1 year ago +1

      @@AdrianValenciaYT The fools don't understand. It's just another button to click & a block of text next to it they don't read.

  • @Saieden
    @Saieden 1 year ago +42

    Whether it's AI generative or not, native resolution, AKA raw graphics performance, will always be relevant to the features slapped on top of it. It's all relative gains, so getting 10x means nothing if you're starting at 1fps.
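The relative-gains point above can be sketched in a few lines of Python (the frame rates and speedup factors are made-up illustrative numbers, not benchmarks):

```python
def upscaled_fps(native_fps: float, speedup: float) -> float:
    """Apply a multiplicative upscaling speedup to a native framerate."""
    return native_fps * speedup

# A 10x speedup on an unplayable base is still unplayable:
print(upscaled_fps(1.0, 10.0))  # 10.0
# A modest relative gain on a healthy base is a real win:
print(upscaled_fps(60.0, 1.5))  # 90.0
```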

    • @T1tusCr0w
      @T1tusCr0w 1 year ago +1

      I felt this on my 4090 with path tracing and everything maxed out. I was getting 70-80 at 4K, and that was with frame gen etc. on. I suddenly wished I had a 5090. Then I gave myself a shake! I did 2 hours of sightseeing around the city, then played the actual 2nd run and expansion with ONLY max ray tracing on DLSS quality & was over 110+. The heavy feeling in the controls was too much with path tracing right now.
      I love the tech. The reason I got a 4090 and i9 was to play 4K with everything beautiful. But it seems I will not sacrifice a certain amount of response to get it. Especially with normal max ray tracing also being really good looking.

    • @Saieden
      @Saieden 1 year ago

      @@T1tusCr0w It's a similar thing for me with upscaling, though from the opposite side, being on a 3060 Ti (since release) and a 1440p/165Hz display. My upgrades are value-oriented, both in terms of resale and the replacement, so it lets me keep my GPU a bit longer while better parts are released, and my current hardware holds more relevance and value over time as well. Though I must say the 7800 XT is looking pretty damn tempting right now...

    • @T1tusCr0w
      @T1tusCr0w 1 year ago

      @@Saieden Even though I got a great rig, I definitely thought about value. I was upgrading from 2016. The pandemic & mining stuff meant it was a long time in between. I was originally planning to get a 3080. Left it too late, as in past the first 2 days, & you didn't see a 3080 for a normal price for years. When the current gen came out, every single card basically pointed me to the 4090 as the only one that was "worth" it.
      I was originally budgeting 1000 for a card & thought I was pushing the boat out. 🤦🏻‍♂️ So I'm totally with you on value. The 7800 XT is looking pretty good value right now. Haha. Tbh I'm glad to be "out of it", the upgrade roundabout. I hope 👀🥺

  • @Amp_Edition
    @Amp_Edition 1 year ago +27

    I play native 4K 120Hz on a 7900 XTX, and if I can't max a game out I turn things down just a bit and it still looks stunning. Most games still look amazing at high settings instead of ultra. I always leave textures at the highest setting and turn down things like shadows and sun shafts.

    • @MultiPochop
      @MultiPochop 1 year ago +5

      Textures are the main thing for visual quality in games. It's a shame that most PC gamers have far too little VRAM, even though VRAM is so cheap, which forces developers to severely limit the visuals of their games. We could have had photorealistic games a long time ago. The next consoles will certainly achieve this, while most PC gamers are still stuck with 8 GB.

    • @WarumWiesoWeshalb
      @WarumWiesoWeshalb 1 year ago

      Totally agree!

    • @pasha715
      @pasha715 11 months ago

      I forgot who said it, maybe Moore's Law Is Dead: if they increased VRAM on every GPU (4070 with 16 GB or more, 4060/Ti with 12 GB), memory producers would raise prices a lot.

  • @V3ntilator
    @V3ntilator 1 year ago +63

    They should optimize games for native resolutions and not for "fake" resolutions.
    DLSS and FSR should be optional, not a requirement.

    • @Flyon86
      @Flyon86 1 year ago +9

      I think the original intention was to let lower-end graphics cards play new games for longer. However, it's been used by developers to hide poor optimization. Look at Starfield, where FSR is on by default if you use higher graphical presets.

    • @MechAdv
      @MechAdv 1 year ago +5

      @@Flyon86 Then you have Lies of P, which looks great and runs natively at 90-100fps at 4K highest settings on my 3080, AND it's somehow made with Unreal Engine without even a hint of stuttering. Little Korean dev showing who actually did their homework and who's copying notes. 👀
      Additionally, GOW and Atomic Heart look WAY better than Starfield, and somehow run 4K ultra at 60fps without upscaling, or 80-90fps with DLSS quality (regularly hitting triple digits in GOW).

    • @Flyon86
      @Flyon86 1 year ago +1

      @@MechAdv Yeah, Remnant 2 is another recent one that had DLSS on by default at higher settings, I believe. RDR2 is another one that lower-end systems can still play at decent settings, and it still looks better than basically anything that's out.

    • @MaksKuaka
      @MaksKuaka 1 year ago

      To an extent. If DLSS advances to the point where it is virtually indistinguishable from native in motion, I have no complaints. Moore's law is dead; there's only so far we can go by just throwing more power at a problem.

    • @kishaloyb.7937
      @kishaloyb.7937 1 year ago +1

      @@MechAdv Except the scope and scale of Lies of P, GOW and Atomic Heart come nowhere close to Starfield. None of them has a persistent open world with physics-interactable clutter scattered around. None of them has a freeform physics engine operating 100% of the time. They are good-looking games, but a video game is more than just "graphics". A lot of other stuff besides graphics makes video games demanding.

  • @TheRogueWolf
    @TheRogueWolf 1 year ago +10

    "Why pay a premium for _real_ performance when you can pay a premium for _fake_ performance?" - Nvidia

  • @arghpee
    @arghpee 1 year ago +8

    Native res is a benchmark of the true raster performance of a card.

  • @2284090
    @2284090 1 year ago +4

    Nvidia fanboys, explain why we need to love upscaling more than native.

  • @fatidicusaeternus6498
    @fatidicusaeternus6498 1 year ago +5

    The case may be different at 4K, but at 1440p I can certainly tell there is artefacting and there are inaccuracies when upscaling. Maybe it's a luxury, but I prefer native rendering.

  • @mix3k818
    @mix3k818 1 year ago +16

    Many will still prefer native resolutions, I believe. AI should not be the crutch for releasing an otherwise unoptimised game.

    • @kaimojepaslt
      @kaimojepaslt 1 year ago +3

      Because native is native: you get more original pixels, not fake stuff. The real problems are lazy optimization by game devs and greedy card prices. It's all downhill for regular players.

    • @McLeonVP
      @McLeonVP 1 year ago +2

      Depends on the game.
      DLSS quality looks better than 1080p native.

    • @yoked391
      @yoked391 1 year ago +6

      @@McLeonVP No, it doesn't.

    • @McLeonVP
      @McLeonVP 1 year ago

      @@yoked391 In Cyberpunk, yes it does.

    • @_godsl4yer_
      @_godsl4yer_ 1 year ago +2

      @@yoked391 Except in some games it does; Hardware Unboxed has already made a video comparing upscaling technology and native. When will people stop being in denial about the fact that software techniques are going to become a bigger factor than hardware alone?

  • @joel3683
    @joel3683 1 year ago +22

    You guys have made the right choice, and I appreciate that you always try not to let any biases or other factors get in the way of keeping these reviews objective. It's all extremely helpful and you made some great points!

  • @reinhardtwilhelm5415
    @reinhardtwilhelm5415 1 year ago +7

    I should also note that upscalers continue to suck at 1080p, by far the most common gaming resolution, and AAA game devs are now actively designing games to run heavily upscaled by default even at 1080p. What that means, then, is that for most people DLSS and FSR are functionally another way to avoid having to give them an acceptable native 1080p experience, and I really don’t like that.

    • @Flyon86
      @Flyon86 1 year ago +2

      Yeah, I think according to the latest Steam survey over 60% of people are still using 1080p. Also, the 3060 and 2060 are the most common GPUs. But they release games whose requirements assume most people have a 3080 or above.

  • @LJDMabc
    @LJDMabc 1 year ago +8

    Half the reason I upgraded from my 5700 XT was so I could play at 1440p WITHOUT needing FSR.

    • @Yerinjibbang
      @Yerinjibbang 1 year ago

      I upgraded my 5700 XT to a 7800 XT, huge improvement at 1440p.

    • @LJDMabc
      @LJDMabc 1 year ago

      @@Yerinjibbang I upgraded to a 6800 a few months ago but had to RMA it because of a bad fan, and got a 7800 XT instead.

  • @carljones9640
    @carljones9640 1 year ago +9

    Rendering at native, especially above 1080p, is already not a major concern of any of the major producers now. It costs less money to add more memory capacity or compute performance than it does to pay an engineer to optimize algorithms or hardware, and it's even cheaper to train an AI/machine learning algorithm to "optimize" for you (read: render it at half your native resolution then upscale it so it looks the """"same""""). This is exactly why I haven't bothered buying a 4K monitor. Why would I put out the money for a good 4K monitor when I won't even get to see 4K unless I pay over $1000 for one of the two cards that still can't run every game at 4K/60FPS? Why are developers constantly pushing newer visual graphics technology with more impressive graphics - why did this become the definitive selling point generation over generation - if now the argument is "Don't actually get that graphical fidelity, get Close Enough™"? No, thank you. We already traded emergent and innovative gameplay and story-telling for graphical fidelity and higher FPS. Now you're telling me to trade the graphical fidelity for the FPS? The entire reason I got into PC gaming was because it was supposed to be better graphics at higher framerates, not one or the other. If I want worse graphics or framerate, I'll buy a console again and save myself the money and headache. Just give me the stuff that you said you'd sell me, the stuff that I already paid for, instead of bait-and-switch'ing me. I paid for full-bodied, imported coffee grounds - stop trying to tell me instant is "close enough" and just give me the damn coffee I paid for!

    • @pasha715
      @pasha715 11 months ago

      The $1200 card is the 4080, which in Remnant 2 (at least 2 months ago, idk about now) had barely 30 fps at 4K native. Of course it's not the GPU's fault, but for that money I expect at least 70-80 fps at 4K native raster.

    • @paul1979uk2000
      @paul1979uk2000 10 months ago

      It wouldn't be so bad if the resolution scale were limited to 1080p, but I'm seeing games that drop the resolution to 720p, sometimes 480p, which is way too low and insane in this day and age.
      Personally, I think developers need to take a step back on visuals. I'm all for better visuals, but not at the cuts they are having to make in other areas. For now, we don't have a solid open solution that works well at lower resolutions; we've got FSR and XeSS, which both work well at high resolutions but are rather crap at lower ones.
      Until that is resolved, developers shouldn't be pushing gamers into needing resolution scaling to get acceptable performance.
      The irony is, the games where I'm seeing the resolution scale being overused don't actually look that great. They don't look bad, but they sure as hell aren't blowing my socks off, and a lot of them are UE5 games. So either the developers are pushing the engine too hard, or the optimization is bad, or the engine is too demanding or maybe too hard to work with. Looking at some other games on other engines from a performance-to-visuals level, I'm starting to have doubts about UE5. It's all well and good being impressive on the visual front while needing a supercomputer to do it; it's not that impressive when developers are using the engine and dropping the resolution so low that even the PS1 would be proud. For me, it feels like a backwards step, and I really hope this doesn't stick.

  • @Baarkszz
    @Baarkszz 1 year ago +3

    No. No it's not. It's a crutch and COULD be helpful, but I swear to fucking god it's one of the worst things to have happened to PC gaming.

  • @kingcrackedhen3572
    @kingcrackedhen3572 1 year ago +3

    I'm using a 1440p display and want my native-res fps to be a minimum of 60fps and average 144fps before I ever consider adding DLSS, FSR, etc.
    I don't want blurriness because of upscaling.

  • @dademr
    @dademr 1 year ago +3

    Forget 4K and 8K; next year 1080p will be the goal LOL

  • @Us3rN4m32
    @Us3rN4m32 1 year ago +19

    We went from 1080p to 1440p then 4K, and now we are back to 720p and 1080p. The industry is going 3 steps forward and 2 steps back.
    And devs are starting to use this tech as a catch-all for bad optimization, which is only going to get worse.

    • @guille92h
      @guille92h 1 year ago +2

      They want to sell 4K monitors even if GPUs can't render new games at 4K.

  • @26Guenter
    @26Guenter 1 year ago +60

    Native resolution isn't dead despite the industry wanting to move towards AI generated gaming.

    • @Bassjunkie_1
      @Bassjunkie_1 1 year ago +3

      100% agreed m8

    • @christophermullins7163
      @christophermullins7163 1 year ago +7

      This idea that upscaling will eventually get good enough that it doesn't matter what input resolution you use is absolutely absurd. I would bet that AI upscaling can only potentially get 20% better than DLSS is now.

    • @davidjohansson1416
      @davidjohansson1416 1 year ago +10

      I'd rather play at 1080p native. "4K" is just a marketing gimmick if it's upscaled content.

    • @christophermullins7163
      @christophermullins7163 1 year ago +6

      @@davidjohansson1416 Well sir, that is simply not true. Smokin' copium 💯

    • @stevfell
      @stevfell 1 year ago +10

      @@davidjohansson1416 Can confirm that you are coping af

  • @Saith516
    @Saith516 1 year ago +30

    Imagine how good games would look if we focused on filling up VRAM with crisp and detailed textures instead of RT lighting that looks a bit different and RT surfaces that are unrealistically reflective.

    • @Ghastly10
      @Ghastly10 1 year ago +3

      Sure, you could do that, but if the lighting of a game sucks, no amount of fancy, highly detailed textures will make it look any better.

    • @KPAki1Ler
      @KPAki1Ler 1 year ago

      So a more balanced approach.

    • @sandrahiltz
      @sandrahiltz 1 year ago +1

      Exactly. I hate RT; it's really not better looking, just different looking, with surfaces that are way too reflective like you said, all for a stupidly massive performance hit. Give me high-res textures and HDR over RT any day of the week.

    • @truemonet
      @truemonet 1 year ago +1

      @@Ghastly10 Game lighting can still be good without RT; RDR2 is a good example.

    • @Saith516
      @Saith516 1 year ago +2

      @@Ghastly10 Where did I say we should abandon lighting effects? You know that pre-rendered ray tracing exists, right? Skilled developers can apply pre-rendered RT for impressive visuals so that we don't uselessly cut our FPS in half. You don't NEED real-time RT for good lighting. Better textures, on the other hand, are straight undeniable upgrades: the more detailed they are, the better. We've got the VRAM to support it. It's quite literally a free and IMMEDIATELY APPARENT visual upgrade, versus real-time RT, which honestly just looks like a reshade mod that makes things look DIFFERENT but not necessarily BETTER in any obvious way.

  • @squirrelsinjacket1804
    @squirrelsinjacket1804 1 year ago +3

    The overwhelming majority of PC gamers are playing at 1080p. I think within 5-10 years that's going to shift quite a lot toward 1440p and 4K.

  • @XBnPC
    @XBnPC 1 year ago +2

    I literally never use upscaling. I'm at native 4k or 1800p on my 7900 XTX and native 1080p or 900p on my RTX 3060. I prefer the display's upscaling to TAA personally

  • @Gamer-q7v
    @Gamer-q7v 1 year ago +10

    Native resolution will always be better than these overrated software technologies. I have always preferred native rendering and been more into the hardware specs of a GPU rather than features such as DLSS, FSR, and FG.

    • @maverichz
      @maverichz 1 year ago

      Overrated? Don't make me laugh. If you're telling me there's a big difference between DLSS quality and native at 1440p and above, I call bullshit.

    • @Gamer-q7v
      @Gamer-q7v 1 year ago +2

      ​@@maverichz DLSS quality looks good and retains most of the image quality for a decent performance boost. DLSS performance looks crap, and ultra performance just looks atrocious. Native 4K definitely looks better than DLSS 4K in most cases. With DLSS, you are rendering the game at a lower resolution and then upscaling it to a higher resolution, so that will always have a reduction in image quality. You lose pixels when rendering at a lower resolution, and fewer pixels mean lower image quality.

    • @mojojojo6292
      @mojojojo6292 1 year ago

      @@Gamer-q7v You aren't getting high frame rates at native 4K in more demanding games, though; that is the reality. A better comparison would be 4K DLSS quality vs 1440p native. Both are rendering at 1440p, so the performance is roughly the same, but the upscaled 1440p image on a 4K monitor will look far superior to 1440p native on the 1440p monitor. So if you can do 1440p native well, you might as well go 4K with DLSS. There's also DLDSR/DLAA, which gives you super-sampling AA at roughly native performance and will also look better than native. Either way, upscaling wins over native.

    • @Gamer-q7v
      @Gamer-q7v 1 year ago

      @mojojojo6292 DLSS quality at 4K is rendering at 1440p, so I know that will be similar to native 1440p in terms of performance. 4K DLSS quality probably looks better. You can get 60+ FPS at native 4K max settings in demanding games with a powerful GPU. DLAA is native as well since its DLSS without the upscaling. It looks superior to DLSS in terms of image quality. If I have the performance headroom, I always enable DLAA if it's in the game.
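The render-resolution arithmetic this thread is built on (4K DLSS Quality rendering internally at 1440p) can be sketched in a few lines; the per-axis scale factors below are the commonly cited values for each DLSS quality mode, so treat them as assumptions rather than vendor-guaranteed numbers:

```python
# Commonly cited per-axis DLSS scale factors (assumed, not vendor-guaranteed)
SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Internal render resolution before upscaling to (out_w, out_h)."""
    s = SCALE[mode]
    return (round(out_w * s), round(out_h * s))

print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

This is why "4K DLSS Quality" and "1440p native" cost roughly the same GPU time: both shade about the same number of pixels before the upscale step.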

  • @gail_blue
    @gail_blue 1 year ago +2

    Please stop saying "You can turn all those things on at native." I'm worried Nvidia will "fix" that.

  • @beatwolf44
    @beatwolf44 1 year ago +2

    DLSS and FSR are nothing but cop-outs from Nvidia and AMD.

  • @johnl.7582
    @johnl.7582 1 year ago +2

    4K is just a bad resolution, and I say this as someone who uses 4K displays for both desktop and couch gaming. 4K is not high enough for "retina" scaling in desktop productivity apps, but it's too high to make sense for gaming. 1440p and 2880p are where it's at. Unfortunately we're stuck with 4K because... TVs.

  • @wishwewere1256
    @wishwewere1256 1 year ago +2

    Native or nothing. DLSS, FSR, no matter what, when I move I can see the ugly software at work trying to upscale pixels. Eww.

  • @sandrahiltz
    @sandrahiltz 1 year ago +2

    IMO "native resolution is irrelevant" is just a developer's excuse/rationalization for releasing a poorly optimized mess a la Starfield.

  • @matthuang4632
    @matthuang4632 1 year ago +2

    Upscaling is what low-end GPUs should benefit from. If a 4090 can't get 60fps at 4K native, that shows how bad the game's optimization is.

  • @ChrisM541
    @ChrisM541 1 year ago +2

    Lol, you'd only need to engage a single neuron to instantly realise Ngreedia would LOVE everyone to buy a sub-optimal (for their monitor resolution) version of their cards and use upscaling to get the fps required to play. In the end, upscaling is upscaling, and no amount of marketing bullsh#t can cover the game engine/graphics quality compromises therein... and as a games programmer myself, there are many. If you have an underpowered GPU then, of course, turn on upscaling; THAT'S the only situation where upscaling makes sense. If your GPU can handle native then use it. Only a braindead idiot would purchase an Ngreedia 4090... and turn on DLSS/DLSS 3.

  • @9mmfederalrimmed235
    @9mmfederalrimmed235 1 year ago +2

    I upscaled Enlisted with FSR on a Radeon RX 6600 but did not see any fps improvement, and the picture was just worse. Upscaling does not give the same quality of picture when your native monitor is 1080p and you've lowered the resolution to somewhere around 900p or 720p. I somehow never could get it right or better than native 1080p.
    Then, switching back to Windows, the proportions of the screen are iffy and it doesn't act right either.
    Just stay on native and rely on the raw processing power of the GPU. That's way less hassle and straightforward.

  • @Bigbacon
    @Bigbacon 1 year ago +2

    I hate all this DLSS/FSR garbage. It always makes things look worse than native, yet now we have features you can't even use without it, like the extra RT stuff in Cyberpunk.

  • @Uachtar
    @Uachtar 1 year ago +2

    Gaming was and still is all about using shortcuts or cheats to get the best visuals possible with the best performance. Upscalers are just another tool among all the things developers can use to improve FPS. Some will use these newer techniques to prop up an unoptimized game, like they always did with other techniques, but the best studios will improve visuals greatly (or make games run smoothly on much lower-end hardware) using upscalers.
    Upscalers are here to stay. The main issue with them is not upscaled vs native, it's their segmentation. FSR is not good enough to be the default feature and is still too tied to AMD marketing, DLSS is Nvidia-only, and XeSS is crap on non-Intel hardware.
    What we need is a new standard for running AI upscalers, where we could have a DirectX upscaler that runs the same on all hardware. At first maybe not all hardware could run it if it doesn't support the feature level (like in the past with pixel shader versions), but in the long term every hardware company would support the feature and move on. Devs would prefer to work with a technology that works the same all the time for all vendors.
    But the main issue there is that we all (including game studios and Microsoft) let Nvidia take the floor with DLSS. They now know their tech is better, and gamers know it too. They now overcharge for their cards because of it, and every gamer is losing. It feels like we are back in the Glide era with 3dfx, where it was the best API at the time and 3dfx was the card to have (but much more expensive).
    Is it too late to change the situation? I don't know. It's not Unreal 5's Temporal Super Resolution that will change things; from what I saw, it seems inferior to FSR2. We need something ML-based that can run on all hardware.

  • @satanspy
    @satanspy 9 months ago +2

    The real question is why these upscaling technologies were introduced in the first place instead of making bigger, more powerful GPUs that can render at native. It's like the future will be fake frames and upscaling instead of real native res.

  • @SkrifftheDragon
    @SkrifftheDragon 1 year ago +2

    Upscaling is just a big cope for game devs that don't know fuck all about optimizing. You all know damn well that they're going to start relying on it more and more, to the point that anyone with a midrange card or below will be forced to enable DLSS or FSR. Upscaling will never be as good as native in terms of artifacting and latency.

  • @Whoisthis911
    @Whoisthis911 1 year ago +2

    Upscaling tech is a software gimmick for some single-player games, if you don't mind all the drawbacks of using upscaling.
    It has no purpose in multiplayer games.

  • @t1se
    @t1se 1 year ago +3

    I think a 2K 27-inch monitor is enough for the full PC game experience; 4K is more for TVs, I think.

  • @kostihouse
    @kostihouse 1 year ago +2

    Native res is not dead; try playing a VR game with DLSS. Even in the ones where you actually can, it looks totally horrible.

  • @Jacob-hl6sn
    @Jacob-hl6sn 1 year ago +2

    Upscaling crap looks really bad. 4K gaming is still a joke.

  • @Sid-Cannon
    @Sid-Cannon 1 year ago +9

    Recent games coming out with upscaling included in the presets, or made to rely heavily on upscaling, have made me regret getting my 4070. I game at 1080p, which I'm fine with; it's what I did with my 1080 for 6 years, but it looks pants when upscaling is used. If this is the way things are going, it won't be long before my 4070 requires upscaling just to do 1080p60 High. If my 4070 doesn't last as long as my 1080 did, then PC gaming is in the bin for me. I refuse to throw more good money after bad.

    • @abdullahnaeem7275
      @abdullahnaeem7275 1 year ago

      Really, a 4070 can't do 1080p 60 fps?
      Can you tell me what type of games you play?
      Cause I'm going to upgrade from a 1660S and want to know a card that can last long at 1440p.
      My options were looking like a 4070 or a 7800 XT. Or I could increase my budget to get a 7900 XT, but performance is concerning to me.

    • @notrixamoris3318
      @notrixamoris3318 1 year ago

      Bro, once you play at 1440p, within three days you won't want 1080p anymore...

    • @Sid-Cannon
      @Sid-Cannon 1 year ago

      @@abdullahnaeem7275 I didn't say it can't do it, but the way things are going it won't be long before it will need upscaling to do it. I got 6 years of 1080p60 High with my 1080, and I'm hoping my 4070 will do the same, but I doubt it.

    • @NemoNoomin
      @NemoNoomin 1 year ago

      @@abdullahnaeem7275 A 4070 will do great at 1440p. However, depending on the games you play, I would say go for a used 3080: similar performance, cheaper. Watch a few comparisons. A used 6950 will cost slightly more but give you far more performance.

  • @Ebrahim766
    @Ebrahim766 1 year ago +2

    Nah, I always play native res, no AA, no upscaling.

  • @portman8909
    @portman8909 1 year ago +15

    I don't think so. These technologies are only added to triple-A titles. I've yet to see many AA or indie titles implement this technology; sure, there's one here and there, but most don't add it yet. For that reason I think native still matters, and GPU vendors should still improve raw performance. AI isn't a quick fix for everything, far from it in fact.

    • @SexManNordin
      @SexManNordin 1 year ago +2

      They were sold to us as a piece of software to give older computers a bit more life, but now they are using it more and more as a default, which annoys me. I much prefer playing at native resolution.

    • @PineyJustice
      @PineyJustice 1 year ago +1

      @@SexManNordin We should be all for it: have ultra settings be unusable at native on the top-end cards, what ultra used to be, a setting for next-gen hardware. Then when newer stuff comes out you can go back and replay on ultra for that instant remaster. FSR3 and DLSS are good enough now that no average person can even tell if they're on, only the discerning enthusiasts who are actively looking for it, so the wider market just wants more fps and better graphics.

    • @Hyde1415
      @Hyde1415 1 year ago

      @@SexManNordin DLSS always seemed like a solution to increase performance in ray tracing games, not to boost older cards.

    • @Pantaro2007
      @Pantaro2007 Год назад

      Agree! 90% of what I play is indies; occasionally I play a triple-A game here or there. The framerate I would want in order to justify spending over $1k on a new GPU isn't going to happen with all these unoptimized garbage triple-A games we get these days.

    • @portman8909
      @portman8909 Год назад

      I play at 1440p, and I don't like the fact that an RTX 4070 and a 7800 XT already struggle with some poorly optimised triple-A titles. They force you to enable DLSS and so on. @@Pantaro2007

  • @TerraWare
    @TerraWare Год назад +4

    It's obviously not a thing of the past. These technologies' effectiveness revolves around native resolution. They are just very good technologies that do what they do well, so you can choose not to run a game at its native resolution if you want more frames, lower power consumption, lower temps, etc.
    The big question for me is: is PC gaming better with or without these technologies? The answer is obvious.

  • @hartsickdisciple
    @hartsickdisciple Год назад +6

    No, native resolution rendering is still the best. All the other AI-powered solutions are somewhat compromised.

  • @hinrei1986
    @hinrei1986 Год назад +2

    I only believe in native resolution.

  • @seeibe
    @seeibe Год назад +13

    What I think could be a really interesting future for video, whether games or movies, is finding ways to encode video that aren't based on "pixels", but on a more abstract representation that AI-based techniques can turn into a high-quality image very efficiently. This could both reduce video size drastically and make rendering much less resource-intensive for games.

    • @DawnSentinel
      @DawnSentinel Год назад +3

      So vector based rather than pixel based. That would be interesting. Vector-based pictures are the way to go for really good scaling between sizes without distortion.

    • @TheShitpostExperience
      @TheShitpostExperience Год назад +1

      Flash animations/games kind of worked like that back in the day: all the frames were vector based and used shape-aware color fills which scaled to any resolution without losing image quality, leaving the browser to do the antialiasing, I assume.
      The main issue is that vector-based image formats were never meant for images with over thousands of colors, hence why Flash died back in the day.
      I wonder if with current technology we could bring something like that back to relevancy, but I haven't heard of anyone working on this kind of shading tech yet.

    • @Deliveredmean42
      @Deliveredmean42 Год назад

      ​@@TheShitpostExperience Technically Flash didn't die; instead of the Flash player they just use HTML like any other website, and Adobe Flash was renamed to something else, because, you know, Adobe being lazy about fixing legacy problems. Since it's just vectors, at least alternative software came along for some folks to keep using it.

    • @yarmgl1613
      @yarmgl1613 Год назад

      Video is already a bunch of vectors, curves and cubes; look at how H.265/AV1 is encoded. Also, vectors suck at showing detail at acceptable file sizes.

    • @seeibe
      @seeibe Год назад +1

      @yarmgl1613 Yeah, it'd have to be way more advanced than vectors. That's just traditional compression. More along the lines of stable diffusion where the model has seen so many hills in the background that it can generate a high res landscape from a 20 byte prompt. Of course it'd have to be a different kind of ML algorithm but that's the kind of level of compression I'm expecting to see in the future.

  • @HEAD123456
    @HEAD123456 Год назад +1

    All games today use crap TAA with crazy shimmering/blurring/ghosting. Using DLSS/FSR2 is pretty much the number one priority if someone wants max quality. This is where Nvidia has a HUGE advantage, because FSR2 is trash compared to the latest DLSS versions... (I am a 4K user)

  • @AMO_BO6
    @AMO_BO6 Год назад +3

    Native is native, nothing can replace that

  • @arbernuka3679
    @arbernuka3679 Год назад +2

    Not for me, i never use dlss/fsr

  • @xpodx
    @xpodx Год назад +1

    I have a 4K 144Hz display and will get 8K 144Hz when it comes out. I play a lot of older games, 15+ years old up to 2019, pretty frequently, so most of the games I still play don't have DLSS.

  • @Icureditwithmybrain
    @Icureditwithmybrain Год назад +1

    I use FSR during the day so my room doesn't get too hot, as my GPU produces less heat with FSR on; at night I turn FSR off.

  • @mattzun6779
    @mattzun6779 Год назад +1

    To some extent, I see the point.
    For most people gaming at 4K, being able to render 1440p at high/ray-traced settings with a 60fps minimum is sufficient, as long as the card has enough memory.
    Upscaling from 1440p to 4K looks good, and the 60fps minimum makes frame generation a reasonable option.
    That really just means there isn't a reason for anything above a 16GB 70/80-class card in the next generation.
    I'm sure Nvidia will still sell a 5090 for rich folk and AI workstations, but there would be very little practical difference between a 5070 Ti and a 5090 for most gamers.
    The same is NOT true for 1440p or 1080p gaming at the moment, but I assume he is talking about 4K or 8K.
    Rendering at 720p or 1080p and upscaling does noticeably affect image quality.
    I'd love to see improvements in FSR at the lower resolutions; it would make the next-gen APUs and iGPUs practical for mid-range gaming.
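    The arithmetic behind that 1440p-to-4K point can be sketched quickly. This is just an illustration; the per-axis scale factors below are the commonly published DLSS/FSR preset values, assumed here rather than taken from the video:

```python
# Internal render resolution at a 3840x2160 output for common upscaler
# presets. The per-axis scale factors are the commonly published preset
# values (an assumption for illustration, not vendor-exact for every mode).
OUTPUT_W, OUTPUT_H = 3840, 2160
PRESETS = {
    "Quality": 2 / 3,            # renders ~2560x1440, then upscales to 4K
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

for name, s in PRESETS.items():
    w, h = round(OUTPUT_W * s), round(OUTPUT_H * s)
    # s*s is the fraction of output pixels actually rendered each frame
    print(f"{name:>17}: {w} x {h} internal ({s * s * 100:.0f}% of the pixels)")
```

    Quality mode at 4K works out to an internal 2560x1440, which is why "render 1440p, upscale to 4K" describes it well; only about 44% of the output pixels are rendered each frame, which is where the fps headroom comes from.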

  • @davidjohansson1416
    @davidjohansson1416 Год назад +1

    4K DLSS == 1080p with graphical artifacts. Who the fuck cares about higher resolution if the original content is lower? It's like watching 1080p on a 4K monitor. There is no point.

  • @madant7777
    @madant7777 Год назад +4

    All the "filters" and "cleaners" can also be made for/applied to the native picture. So this discussion is at least disingenuous, if not stupid.

  • @nathanddrews
    @nathanddrews Год назад +1

    GPU makers can add any gimmicks and features they want; I welcome it. But if you're going to try to sell me a GPU at the same price as or higher than the previous generation that is only marginally better in rasterization performance, that's a major value problem.

  • @humanityerror404
    @humanityerror404 Год назад +6

    I have been playing at 2160p 60fps for the last 5 years, and I can say that I don't like either DLSS or FSR, even if they give me more fps. Personally I like native resolution the most, locked at 60fps. As for RTX at 2160p, high settings are unplayable, and I think we need at least 2 more generations before there is a GPU that can run native 2160p at max settings with RTX and keep the 0.1% lows at 60 fps.

  • @sgtjarhead99
    @sgtjarhead99 Год назад +2

    The only thing I miss from the old CRT days is that native resolution was not a big deal. If performance sux'd, it was OK to go down one res without turning the screen into a blurry mess. My first "gaming" computer used a 1024x768 CRT with a 3dfx Voodoo 1 which only supported a MAX 640x480x16 resolution and games still looked GREAT.

    • @GraveUypo
      @GraveUypo Год назад

      The secret, then, is to get a 16K screen. That way you can use:
      15360 x 8640
      7680 x 4320
      5120 x 2880
      3840 x 2160
      3072 x 1728
      2560 x 1440
      2194 x 1234 (this one isn't perfect, but close enough)
      1920 x 1080
      1706 x 960
      1536 x 864
      1396 x 664 (also wastes 1 pixel of the screen on each axis)
      1280 x 720
      All with integer scaling. Not to mention there are so many pixels in the canvas that you can use any resolution you want with a nearest-neighbor approach (basically, don't use fugly bilinear smearing) and it'll look close enough to integer scaled anyway.
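      The list above is easy to verify: a resolution is only a true integer scale when the divisor splits both axes of the 16K panel (15360 x 8640) exactly. A quick sketch:

```python
# Check which whole-number divisors of a 16K (15360x8640) panel give
# exact integer scaling, i.e. divide both axes with no remainder.
W, H = 15360, 8640

for k in range(1, 13):
    if W % k == 0 and H % k == 0:
        print(f"{k}x: {W // k} x {H // k}")
```

      Running this shows exact fits at 1x through 6x, 8x, 10x and 12x (down to 1280 x 720); divisors like 7, 9 and 11 leave a remainder, which is why entries such as 2194 x 1234 in the list are only "close enough" rather than perfect.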

    • @ChrisM541
      @ChrisM541 Год назад +1

      I remember those days well. Good gaming times...honest gaming times, not the fake gaming experiences we have today.

    • @cattysplat
      @cattysplat 9 месяцев назад

      2D sprites looked great at low resolution. In 3D rendering native resolution is huge, especially matching the native texture quality.

  • @Ryet9
    @Ryet9 Год назад +1

    inb4 nVidia sneaks in driver level DLSS to boost 50 series performance 🤣

  • @fredeso7844
    @fredeso7844 Год назад +31

    For me personally, yes. I have a 4K monitor. I didn't choose a 4K monitor, but it's my TV so it's 4K (144Hz). I'm perfectly happy with DLSS Quality. I don't think I can tell the difference, but I appreciate the extra frames.

    • @bearwynn
      @bearwynn Год назад +9

      for super high resolutions like 4k and 8k DLSS is fantastic.
      1080p is absolutely dog doo doo

    • @vicv3940
      @vicv3940 Год назад +2

      In motion, there really isn't any discernable difference between DLSS Quality and native at 2k or higher resolutions. You can nitpick differences in fidelity with side by side screenshots, but even then the difference is negligible. I find that using DLSS also reduces or eliminates artifacts, ghosting, etc that you'll get when playing natively - even more reason to turn on DLSS.

    • @SexManNordin
      @SexManNordin Год назад +9

      I can very easily tell the difference with DLSS 2 on my 1440p 165Hz monitor. It doesn't matter what fps or graphics settings I play at; if DLSS is enabled (any mode) I can tell clearly, especially when I'm moving. The screen looks weird and a bit washed out, and the edges of almost everything have a weird, unsharp look to them, which makes sense knowing how upscaling works.

    • @Extreme96PL
      @Extreme96PL Год назад +1

      @@bearwynn Sometimes DLSS Quality can look better than native 1080p because of shitty TAA.

    • @sniper8567
      @sniper8567 Год назад +1

      I think even Balanced is often very good. But it depends.

  • @POVwithRC
    @POVwithRC Год назад +1

    My general view is that incumbent hardware vendors are just trying to con users into accepting lower and lower gen over gen hardware improvements by slapping an AI lipstick on that constantly decaying pig. Raw raster or die.

  • @hinchlikescake7592
    @hinchlikescake7592 Год назад +1

    Yeah, for sub-4K, even 1440p, we still have a way to go to match native res as far as image quality is concerned, even in single-player games that are far less dependent on fast input and quick response. In the age of ray tracing and, let's face it, poor-quality ports, good upscaling is still very important to have for capable hardware with the performance hit in mind, but in its current form it in no way replaces native, not until it's imperceptible from native, with good AA and the response rate to match.
    Competitive players want the cleanest image and lowest input lag possible; only native gets there, and they would likely be on 1440p or lower monitors.

  • @sniper8567
    @sniper8567 Год назад +1

    In my opinion, having used DLSS in a number of games on Quality, but mostly Balanced: I would say that native gaming is dead looking forward. It makes absolutely no sense.

  • @markp2085
    @markp2085 Год назад +2

    I think FSR and DLSS are just a blip on the radar: technologies that won't be needed as GPUs get more powerful, unless you are one of those people who want to game at 8K, 5000 FPS, on your little 21-inch monitor. With the current batch of games I am playing, I am not using these technologies. I am currently using a 6900 XT at 1440p on a 32" monitor. I might purchase a future 5090 depending on where games, GPU tech, and pricing are next year. I suspect upscaling is allowing developers to not optimize their games as well as they should. It will be an interesting next few years.

    • @mojojojo6292
      @mojojojo6292 Год назад

      They aren't going anywhere. As GPUs get more powerful, developers use that power to push more demanding graphics. There is also not much room to improve on the GPU front; we are literally reaching the limits of silicon. Without these technologies, graphics would stagnate.

    • @BlindTrustProject
      @BlindTrustProject Год назад

      I guarantee you your comment is going to age like milk.

  • @ssjtom118
    @ssjtom118 Год назад +6

    I will ALWAYS prefer native resolution over anything else. In my view, DLSS and FSR are strictly a crutch to extend the lifespan of GPUs. So with that said I’m glad they exist, but they are not a factor in my purchasing decision.

    • @evaone4286
      @evaone4286 Год назад +4

      This 👍

    • @notafanboy250
      @notafanboy250 Год назад +4

      This 👍

    • @TahaJelani
      @TahaJelani Год назад +1

      DLSS quality literally looks better than native

    • @ssjtom118
      @ssjtom118 Год назад +3

      @@TahaJelani not to my eyes. But more power to you.

    • @TahaJelani
      @TahaJelani Год назад

      @@ssjtom118 No it objectively looks better, not considering placebo

  • @steel5897
    @steel5897 Год назад +1

    Yes, yes it is. At least as a valuable metric for end-users. UE5 being built around reconstruction is all the writing you need on all of the walls, it's not the future, it's now.

  • @sohodon
    @sohodon Год назад +1

    Sorry to tell you guys this, but no, we still want native resolutions. How else are we to measure how far we've come technologically? Upscaling is cheating; raw performance is necessary.

  • @SeanFarley25
    @SeanFarley25 Год назад +3

    I have been asking myself the same question for a while now. I'm tired of hearing about upscaling, to be quite honest with you. I hope we get to a point where we can natively render 4K without upscaling.

    • @diaman_d
      @diaman_d Год назад

      We can already render 4K gaming without upscaling; native 4K at a 60FPS minimum is no longer a dream.

    • @ChrisM541
      @ChrisM541 Год назад

      @@diaman_d For many games, it's still a very expensive reality. In time, that will change, but understand that Nvidia will want that change to take as long as it can. AMD, in pulling out of the future high end market, clearly agrees.

  • @mjn5016
    @mjn5016 Год назад +1

    Many people here are acting like DLSS doesn't have a huge smearing/ghosting issue. And don't even get me started on FSR.

  • @Zettoman
    @Zettoman Год назад +1

    DLSS and FSR should be used for high-refresh-rate rendering. 60fps should be the minimum.

  • @Rayu25Demon
    @Rayu25Demon Год назад +1

    This marketing idea is going to leave us with more expensive GPUs that are weaker at native, on top of more and more badly optimized games.

  • @keithspiteri5862
    @keithspiteri5862 Год назад +3

    Let's keep it simple and straightforward.
    At the three main price segments that matter in terms of critical mass (4060 Ti vs 7700 XT, 4070 vs 7800 XT and 4070 Ti vs 7900 XT), AMD is consistently better at rasterization performance and in VRAM size.
    Nvidia is charging similar or higher prices while offering worse 'raw' performance - and selling like hotcakes anyway - purely due to:
    - Better RT performance.
    - DLSS2 offering better upscaling than FSR2 due to the latter's motion artifacts and shimmering.
    - DLSS3.5 offering Ray Reconstruction, a new tech that AMD cannot yet match.
    These Nvidia advantages are due to its undeniable tech advantage in tensor AI cores.
    DLSS3 Frame Generation is no longer a factor as early reviews have indicated FSR3 matching it, helped by being universally supported on top of that. Few gamers care about the power consumption differences.
    So, logically, it is no wonder that Nvidia claims "raster is dead" and "RT and PT are the future" and that "Native is a thing of the past". In this vision of the future, Nvidia will win by default unless AMD catches up quickly - and Nvidia wants to win so it can continue to dominate market share and charge higher prices.
    These people are professionals and they know people's buying decisions are influenced more by selling the future and appealing to emotions.
    This is smart, but also highly misleading and unfair, in my opinion. We live in the present, and right now rasterization performance at native matters more. By the time RT/PT flips the balance, any card you buy today will be obsolete and out of use.

  • @ApocalypseGuy1
    @ApocalypseGuy1 Год назад +15

    The upscaling technologies are insanely good. I'm now playing the new Mortal Kombat upscaled on 1440p and it was literally a 30 fps gain compared to native res, with no visual differences that I can detect. At this point, I feel like hanging onto my GTX 1070 just to see how much I can prolong its effective life with upscaling.

    • @wizardd
      @wizardd Год назад +1

      Utilising DSR/DLDSR alongside upscaling is also insanely handy in plenty of games, particularly older titles

    • @orangecat1596
      @orangecat1596 Год назад

      That depends on the games too

    • @Eleganttf2
      @Eleganttf2 Год назад +3

      Tell that to the people who bitch about how using upscaling is a big heresy and a "sin", and who, no matter how good any form of upscaling is, cry "but it's not native!" I'm so sick of those types of people

    • @bverto979
      @bverto979 Год назад

      Easy life hack: just don't listen to or engage with them. Don't try to debate your point with somebody who is not keeping an open mind and/or is not willing to listen to different opinions. Enjoy what you enjoy, problem solved. @@Eleganttf2

    • @puffyips
      @puffyips Год назад +2

      @@Eleganttf2 I'm pretty sure those people would be glad to hear of upscaling being used on an older card; it being used as a crutch for new GPUs/games, that's the sin, yeah

  • @dazextralarge
    @dazextralarge Год назад +1

    This is outrageous. We are paying more for less and getting even less?

  • @LeJohnnyBoy
    @LeJohnnyBoy Год назад +1

    4:56 People focus so much on the reflections that they don't see the low-texture NPCs.
    Feels like we're watching Crazy Taxi from 10 years ago

    • @myne00
      @myne00 Год назад

      Unreal 1.0 had reflections not unlike those.
      Duke Nukem 3d had mirrors.

  • @TennessseTimmy
    @TennessseTimmy Год назад +1

    No matter what, in FPS games like Tarkov, running DLSS and FSR adds a bit of lag; even if my FPS is higher, I can see it.
    It's painfully obvious when turning it off and playing at native 1080p instead of 1440p with DLSS.
    These are temporal effects; they will never beat native. No matter how sharp the image is, it's delayed, and it's visible.

    • @mojojojo6292
      @mojojojo6292 Год назад

      Tarkov is not remotely GPU demanding unless you are cranking the settings to max at higher resolutions; the CPU/cache and memory are always the biggest bottleneck. I run Tarkov at 1440p with DLDSR/DLAA: it renders native 1440p, upscales it to 4K, then downsamples it back to 1440p for far superior AA that looks much better than native. On my 4070 this still doesn't come close to pushing the card to 100% usage. If DLSS gives you higher frame rates, the input lag will always be lower; higher frames = lower latency. I'm guessing in your case DLSS is not giving any extra frames over native, so you are perceiving extra input lag.

    • @TennessseTimmy
      @TennessseTimmy Год назад

      @@mojojojo6292 No, you are missing the point. The point is that DLSS is not instant.

  • @Logical
    @Logical Год назад +1

    Absolutely not lol

  • @cursedowlsgaming5355
    @cursedowlsgaming5355 Год назад +1

    To me, I game solely at full native resolution. I don’t like any of the upscale features or frame generation. I like high pixel count graphics and I EXPECT to use it natively.

    • @mojojojo6292
      @mojojojo6292 Год назад

      Well get used to turning settings down then because you won't be running high frame native in many modern titles without it.

    • @cursedowlsgaming5355
      @cursedowlsgaming5355 Год назад

      @@mojojojo6292I buy gpus accordingly. Single player story driven games with adaptive sync are just fine in that 70-100fps range. Competitive games 120+. I don’t need 300 fps to be competitive in games I care about.

  • @erf8406
    @erf8406 Год назад +1

    Upscalers suck

  • @OrlandoCaba
    @OrlandoCaba Год назад

    I've been gaming in 4K since 2013, and there are games that support 10-bit color (about 1.07 billion colors) plus 4K textures; yes, that's available at other resolutions, but in 4K it looks amazing. Native resolution looks better than any upscaler, and gaming in 4K isn't unfeasible, especially in 2023. I'm sick of the marketing shifting its focus onto this just so game developers and publishers can save man-hours on optimization. This year we have seen some pretty horribly optimized games come out, and unfortunately I don't see this trend getting any better. Xbox has been upscaling since the beginning because its hardware was insufficient to run native 1080p, and now all these companies are selling this garbage to us. I understand it for people who buy mid-range to low-end hardware and want the boost, but even mid-range can play at native 4K. It's just that people have been conditioned to only want 100fps 😅 40 to 60fps is too slow.

  • @jeremyhouse129
    @jeremyhouse129 Год назад +1

    Native resolution gaming is a thing of Nvidia's past.

    • @ChrisM541
      @ChrisM541 Год назад

      Nvidia: Native Resolution Gaming Is A Thing Of Our Past.
      Nvidia: Upscaling...Enjoy The Future, Artifacts And ALL ;)

  • @McLeonVP
    @McLeonVP Год назад

    9:50
    I use a 1650 GDDR5 EVGA SC in Cyberpunk 2077.
    XeSS Performance looks better than FSR 2.1 Quality XD

  • @rodrigomendes3327
    @rodrigomendes3327 Год назад

    Only now are you waking up to this? Seriously? After misleading thousands of people by encouraging them to buy cards without DLSS or DLSS Frame Generation, a very inferior product? Interestingly, right at the same time AMD launches something to compete.

  • @mitsuhh
    @mitsuhh Год назад +1

    I still use 1080 Ti at native 1440p

  • @vaghatz
    @vaghatz Год назад +1

    The issue I have with 4K native-res benchmarks is that I cannot tell whether I can enjoy a game at 4K, or which GPU I should buy to enjoy it at 4K.
    For example, in a 4K native benchmark a GPU might run the game at 35fps; should I buy it? Who knows? Maybe with DLSS enabled that GPU will run the game above 60. I need this information to make my purchase.

    • @ChrisM541
      @ChrisM541 Год назад

      The wise buyer targets 1440p ;)

  • @Pitbull0669
    @Pitbull0669 Год назад

    This is why I only watch you guys and Gamers Nexus; Linus and Jay both definitely drink that Kool-Aid, regurgitating whatever makes money and adds more sponsorships for them.

  • @doc7000
    @doc7000 Год назад +2

    I feel like Nvidia wants to push this idea because right now they are ahead in these features. So they may feel it benefits them to push it; however, this won't last forever.

    • @ChrisM541
      @ChrisM541 Год назад +2

      Correct...and 'somewhat' understated.

  • @FramkM-zq3jn
    @FramkM-zq3jn Год назад

    DLSS 2.0 Super Sampling gives you better resolution. DLSS 3.0 Frame Generation gives you higher framerate. DLSS 3.5 Ray Reconstruction gives you better raytracing. DLSS 4.0 Texture Recompression gives you more VRAM. DLSS 5.0 Voltage Recirculation gives you lower TDP. DLSS 6.0 Game Generation creates new games you can play.

  • @GraveUypo
    @GraveUypo Год назад

    I can't use upscalers in Forza Horizon; the power lines look like shit. I'd rather lower graphics and keep native 4K with TAA. Not that I need to; my card runs it at maxed settings like that just fine.

  • @McLeonVP
    @McLeonVP Год назад

    Native 1080P is still alive.
    For a RTX 4090
    (don't worry rich people)

  • @KPAki1Ler
    @KPAki1Ler Год назад

    At 1080p not worth it. At 4K worth it. At 1440, generally unnecessary. Especially based on performance charts. Just give us faster hardware!

  • @Thor_Asgard_
    @Thor_Asgard_ Год назад +4

    Rendering at a super low resolution will never look as good as native. Even Nvidia can't ignore how math and physics work: if your low-resolution image can't resolve a detail, you can't create it from thin air.

    • @leovanlierop4580
      @leovanlierop4580 Год назад +2

      There's more to image quality than resolution alone. If you have to scale down a bunch of things, it might even look worse than keeping native resolution.

    • @BlindTrustProject
      @BlindTrustProject Год назад

      That's what deep learning is for, and what DLSS is about. So yes, DLSS actually does create detail from thin air, based on its training on billions of higher-res pictures. The CSI "enhance" joke is over.

  • @jeanmartin7166
    @jeanmartin7166 Год назад +1

    If you played RDR2 at 1080p, you know devs don't give a shit about FHD, even in AAA. Anti-aliasing always gets worse at 1080p.

  • @bakernation
    @bakernation Год назад

    Here's how you shop for a graphics card in 2023:
    Used GPUs across the board are dirt cheap; a 5600 XT is like $110 US on eBay. Buy used if you can find a good deal. You can get a whole gaming PC for under $300 US that can play anything at 1080p medium: a Lenovo P520 workstation with 16GB of RAM is $160 US; slap in a cheap SSD and a 5600 XT/5700/5700 XT/1070 Ti/1660 Super, whatever the best deal is at the time, and you're good to go, with room to upgrade.
    The 6600 is a good buy.
    The 6700 XT is a good buy.
    The 7800 XT is decent.
    Everything else is overpriced, unless you're getting a 4090 or 7900 XTX for 4K and have money to burn.
    The consoles are good deals as well; if you only play a few games they're not a bad option either.

  • @wh3resmycar
    @wh3resmycar Год назад

    1080p needs to die. It is a decade-old resolution that's been holding back GPUs. 1080p in 2023 is equivalent to 540p in 2010; GPU processing power increased exponentially when Full HD became the norm a decade ago.

  • @Herkan97
    @Herkan97 Год назад

    If I can get 1080p60fps, I'm golden. 144fps would be nice, but 60 is the base.
    That newer games aren't going to target even that because they shove all the optimisation to DLSS and FSR is probably going to ruin that scene for me.
    Indie here I come, I guess. I'm already primarily playing BattleBit Remastered, so maybe it's on the way and I will just never play newer higher-end games.
    Or maybe it'll depend on developer. Maybe GTA VI will be alright, but then 5 other devs went "Wow, upscaling tech? Optimisation, begone!".
    But it's going to be further into the future than GTA VI where we should see the effects of this tech on these devs and their choices. Or rather their publisher's choices.
    That the tech went from "It might help your older card" to "No card is safe without it" is a shame and that's already arrived. Just not on many games yet.
    Whatever "better" it can get, it can never be best as that can't be.
    But maybe a game not running well on an older card is because they didn't optimise it well enough..Maybe thanks to leaning on DLSS or FSR. Maybe they did make it run well on certain generations and gave up on anything else so it'll look like it helps older cards.
    Maybe not for current games, but future ones? "We optimised this game for RTX 3000-5000, anything older was skipped and you'll have to use DLSS for a smooth experience".
    That's probably common when you try to push the graphics like bigger budget games clearly have done for at least a decade, you have a point where you just don't bother making the game run well because the hardware is just too ancient.
    A game can't look like it's from 2012 and runs like a 2012 game with a 2023 GPU. No, a game has to look like it's from 2023 running on a same-year GPU. Why it has to always be pushed, I don't know. That's where the indie scene can come in and has many times. Maybe it's stupid to want the bigger budget devs to make games that way, as it somewhat nullifies them being bigger budget.
    But is it bigger budget if it takes shortcuts like DLSS? Maybe the bigger budget that should allow for more optimisation is instead cut and given to those at the top
    I'll just stick to someone not using DLSS and having an RTX 4090 do a game movie. For most of these games I don't want to play them anyway, however I did play Remnant 1 because of the Remnant 2 trailer and would've also played Remnant 2, but that interest died when I heard about the poor performance and if there somehow was interest left, it died for good with the comment about expecting you to use DLSS or similar tech.
    Maybe it doesn't matter if someone uses DLSS for game movies, as the compression ruins the image enough as it is. I saw this with my own video that I compared with one downloaded with Takeout. Not only did it look several times worse, it was also maybe 1.5x bigger in file size.
    But for playing the game myself? Never DLSS or similar tech, if the game runs poorly enough I'll just uninstall it. Hey, more reason to pirate a game first. If it runs like poo, you get to find that out without giving them money, as a demo nor someone else's experience even with the same hardware isn't going to be indicator it runs smoothly for you. I've heard hype about demos recently, but they're not useful for knowing how a game will perform with the full build. For testing the gameplay, they're alright..But maybe they only have the good parts in the demo and the full game has the good parts you saw in the demo and the rest is bad. Seems like a waste of time not just waiting for reviews and playing a game you already have installed that isn't a demo while you wait.

  • @diebycore7562
    @diebycore7562 Год назад

    Why is everyone talking about these upscalers with no mention of the scaler we already have, supported from the GTX 1000 series up? It does the same thing DLSS does, but instead of working from a lower resolution up, you go down from your native one. The performance increase is very good with minimal visual loss (you have to do some sharpening tweaks per application), while it has much broader support among games (you don't need a special function integrated into the game to make it work) and doesn't introduce latency. The function I'm talking about is found in Nvidia's control panel at the top of the list: Image Scaling. I get a very good performance boost on my GTX 1080 Ti with this function (e.g. a 40% FPS boost in The Division 2 at 1440p). I will try to test this in Red Dead Redemption 2 soon.

  • @ravipeiris4388
    @ravipeiris4388 Год назад

    Native gaming vs AI discussion is pointless. AI is the future. You can drive or walk to work - Which is more productive? Exactly.

  • @ralkros681
    @ralkros681 Год назад

    I feel like all the performance and optimization features of DX12 and Vulkan are just getting the side-eye. All that beautiful development, just for devs to still use DX11 only and give us these horribly running games while calling our PCs trash... looking at you, Starfield. I don't want to use DLSS or FSR unless it's at 100% resolution to replace the horrid abomination known as TAA. Remember when the 1080 Ti was supposed to run games at 4K60? 😢

  • @JacoB-wp4ws
    @JacoB-wp4ws Год назад

    Irrelevant because of bad optimization. I will still be using native resolution outside of RT, but raster performance should be very good, and 1080p and 1440p at max settings with no RT should always use below 6GB of VRAM; anything above 6GB with no RT is an engine/optimization issue.

  • @FATBOY_.
    @FATBOY_. Год назад

    I thought upscaling was meant to be used with demanding ray tracing effects.
    Why do games like Starfield require you to enable DLSS/FSR on a mid-range GPU at 1440p?
    Lazy developers are abusing this upscaling technology.

  • @RmRoyalflush
    @RmRoyalflush Год назад

    FSR and DLSS are nice when playing on an 80-inch 4K TV that's 4-5 meters away. They're not so nice on even my 1440p 27-inch monitor that I sit 60cm away from.
    For consoles and TV gaming, AI imaging all the way.