Gaming On The RTX 5090 With DLSS 4!

  • Published: 27 Jan 2025

Comments • 5K

  • @PcCentric
    @PcCentric  13 дней назад +726

    EDIT - After publish, Nvidia asked me to remove the "Exclusive" and "Benchmark" aspects of the title, which I have done.
    FAQ - This was filmed at CES in an Nvidia briefing, embargoed until today. The full review embargo is separate.
    Does it use Reflex 2? No, that's going to be a future release for esports titles only. FG does require Reflex 1 though.
    Does Frame Gen look better than last gen? Hard to say without a side by side; for normal gameplay it works great, but it will struggle with fast non-linear movement and hair.

    • @jamieferguson935
      @jamieferguson935 13 дней назад +36

      Linus did this a few days ago. Same game too.

    • @belec5964
      @belec5964 13 дней назад +26

      Wait, is it really just for esports titles only? I haven't heard about that.. If that's true then it's a shame :/

    • @joen4287
      @joen4287 13 дней назад +54

      Cyberpunk with 4K RT Overdrive + DLSS Performance + RR + FG = 100+ FPS, 8-20ms (in-between) Latency [RTX 4090]
      Cyberpunk with 4K RT Overdrive + DLSS Performance + RR + FG = 200+ FPS, 35-40ms (in-between) Latency [RTX 5090]
      Just posting this for myself, Pls correct me if I'm wrong

    • @MaximumCarnage-
      @MaximumCarnage- 13 дней назад +6

      @@jamieferguson935 Actually no. Rewatch his vid and then rewatch this vid.

    • @DeathComeInPeace
      @DeathComeInPeace 13 дней назад +5

      what CPU was in the system?

  • @kyyyy21
    @kyyyy21 13 дней назад +3093

    "The video isn't seen by Nvidia" if any Nvidia employee is watching this, please close your browser immediately.

    • @larkinc
      @larkinc 13 дней назад +67

      LOL! I'm guessing it was OK'ed by NVIDIA, but they didn't restrict going into the settings and playing around with them, so I can't imagine they'll be taking him to court or anything. I'm guessing NVIDIA OK'ed this to counteract all the "fake frames" rhetoric (some of which I find pretty toxic and/or opportunistic) that has flooded YouTube, and this video gives more context.

    • @PcCentric
      @PcCentric  13 дней назад +218

      @@larkinc Yeah, they OK'ed their demo and changing the settings, but not showing temps or power usage. Basically they want the demo to be as public as possible without inviting millions of people to their HQ

    • @skyserf
      @skyserf 13 дней назад +7

      @@PcCentric What model monitor was that?

    • @mikebellamy9052
      @mikebellamy9052 13 дней назад +2

      do you think Nvidia doesn't look at their own presentation, where they show everything themselves 😅😂

    • @hadoru314
      @hadoru314 13 дней назад

      @@skyserf Lg 32GS95UV-W

  • @diligencehumility6971
    @diligencehumility6971 13 дней назад +6900

    A $2000 GPU only running CP2077 at 34 fps without fake frames.... impressive

    • @SquishyBoi5463
      @SquishyBoi5463 13 дней назад +849

      Genuinely unless they start making GPUs with larger sizes it will not get higher than 60 at 4k Path traced rendering. Unless there is an incredible leap in architecture of course.

    • @extremedaemon
      @extremedaemon 13 дней назад +1370

      I prefer 300 Fake frames compared to native 60

    • @iko20101
      @iko20101 13 дней назад +188

      A test build of the 5090, with Cyberpunk as the cutting edge game to look at, makes sense. Cyberpunk isn't really made for optimization, and if you are going to play on the highest settings you can't without DLSS or FSR

    • @drunkhusband6257
      @drunkhusband6257 13 дней назад +330

      Path traced games basically require dlss

    • @MusicalWhiskey
      @MusicalWhiskey 13 дней назад +267

      I think it actually averaged 27-28. Still quite a jump from the 4090's 21fps though. But yeah.....2000 bucks....sheesh.

  • @JAZEONTV
    @JAZEONTV 13 дней назад +1073

    I love how Crysis used to be the benchmark for testing graphics cards before. Now it is Cyberpunk.

    • @DBTHEPLUG
      @DBTHEPLUG 12 дней назад +15

      Good

    • @EarthRising97
      @EarthRising97 12 дней назад +125

      Only because people gave up trying the impossible task of running Crysis.

    • @cdshivers
      @cdshivers 12 дней назад +5

      @@EarthRising97 😅😅

    • @PeterPauls
      @PeterPauls 12 дней назад +8

      @@EarthRising97 It is not impossible. I have an RTX 4080 and the game at 1440p (my monitor's resolution) can produce 90-130 fps maxed out + PT. Of course I'm using DLSS, Ray Reconstruction and Frame Generation. I don't know, everybody wants to play in 4K but for me 1440p is the sweet spot, and most people forget that Path Tracing is only an option; there is the "old RT" in the game which looks fine, and you can even turn that off and simply get 144 fps locked without RT.

    • @jongeorge3358
      @jongeorge3358 12 дней назад +14

      The witcher 3 was too for quite awhile!

  • @brianmount366
    @brianmount366 12 дней назад +255

    The developers of cyberpunk must be so happy they made an extremely hard game to run, now they get so much advertising! Lol

    • @Grass_rock-kb7gi
      @Grass_rock-kb7gi 12 дней назад +22

      Cp2077 doesn't need advertisement anymore
      Literally everyone knows about this game

    • @jame2433
      @jame2433 12 дней назад +11

      I mean, it runs on my Steam Deck at 45fps. Obviously lower resolution and not max settings, but that shows how well optimised this game is

    • @stansmith7445
      @stansmith7445 12 дней назад +9

      It’s actually pretty well optimized nowadays.

    • @detrizor7030
      @detrizor7030 12 дней назад +4

      There's no pride in making unoptimized game which can't be run with decent performance in max settings even on top-of-the-line hardware. With such logic stalker 2 devs should be even more happy, because this crap can give decent latency only on minimum settings. On 4090 (!!!).

    • @ElderSnake90
      @ElderSnake90 12 дней назад +2

      I'm surprised how many people say how unoptimised it is. I have a modest system (Ryzen 5 5600 and 6600-XT) and it runs fantastic. Only basic RT of course and I'm on 1080p but with my TV's 4k upscaling, it looks nice and i'm happy.

  • @Tushar_KN
    @Tushar_KN 13 дней назад +1706

    They didn't even allow Linus to tweak the settings. Bro, you have given us a lot.

    • @Ahamshep
      @Ahamshep 13 дней назад +195

      Would you let Linus mess with settings? Lets be honest here...

    • @icygangonice
      @icygangonice 13 дней назад +117

      @@Ahamshep Linus would try to frankenstein the 5090 into running SLI with a 1050 ☠️

    • @Gradymeister
      @Gradymeister 12 дней назад +58

      @@Ahamshep
      And just like that. the first 5090 in Linus's hands hits the floor with a thud.

    • @SamKryptonian
      @SamKryptonian 12 дней назад +35

      Learn to read. The embargo got lifted today, that's why he was able to mess with the settings. The full review embargo is on the 24th.

    • @DBTHEPLUG
      @DBTHEPLUG 12 дней назад

      That's how time works.

  • @Mayeloski
    @Mayeloski 13 дней назад +953

    Not bad, but I think most of us are interested in the 70 or 80 class GPUs, and how well they will do at 1440p and 4K

    • @notwhatitwasbefore
      @notwhatitwasbefore 13 дней назад +22

      same as 40 series +5/10% based on SMs, with a slightly better result with RT on than in standard raster, would be my assumption right now

    • @B_Machine
      @B_Machine 13 дней назад +31

      Indeed. Many of us have a QHD 180-240 Hz monitor.

    • @XMatt5X
      @XMatt5X 13 дней назад +21

      @@B_Machine 180-240 wtf i'm still in 60

    • @lsnexus2080
      @lsnexus2080 13 дней назад +24

      @@Mayeloski actually interested with the difference of 5070 and 5070ti as compared to my current card 4070 Ti Super.

    • @TechSid-hn6eg
      @TechSid-hn6eg 13 дней назад +5

      i wanna see if 5070 TI would be worth the money. I am looking to make a pc, switching over from laptops. Really keen to see if 5070 TI is worth it or should just go with 4070 ti super and upgrade wayy into the future lols

  • @mikeoes1
    @mikeoes1 12 дней назад +328

    Just wanted to say, out of all the big youtubers covering the RTX 5000 CES launch, this video gives the best and most detailed information on what we want to know. You basically allowed people watching this to calculate the performance uplifts between the 4090 and 5090 in all the different scenarios. Exactly what I was looking for. Thanks so much!

    • @bigturkey1
      @bigturkey1 12 дней назад +4

      the big tech tubers know if you make pro amd content it gets shares and views on reddit and more money for them

    • @userunknown1030
      @userunknown1030 12 дней назад +2

      and to me there is almost no point in getting the 4090 or 5090

    • @franklinokeke7848
      @franklinokeke7848 12 дней назад +1

      The big tech channels were not allowed to show fps because Nvidia is scared of their massive influence 😂. Let’s be real, they didn’t let Linus show fps in his video because that guy can instantly tank nvidia’s gpu sales by a single comment

    • @jamesm568
      @jamesm568 12 дней назад +2

      If you can sell your 4090 for $1,500 then it's worth getting the 5090 for an extra $500.

    • @conorstewart2214
      @conorstewart2214 12 дней назад +1

      @@franklinokeke7848 Linus did show fps though didn't he? He wasn't allowed to change settings but he was allowed to post his video days earlier.

  • @Posh_Quack
    @Posh_Quack 12 дней назад +224

    A GPU for the price of a used car running cyberpunk 2077 at 28 FPS. Wonderful.

    • @getrekkedon
      @getrekkedon 12 дней назад +26

      It's not a GPUs fault a game is made so you can adjust settings to make it unplayable. It's like saying a car sucks cause it barely makes it up this jagged icy mountain.

    • @ZyphexPtTTV
      @ZyphexPtTTV 12 дней назад +32

      @@getrekkedon To be fair, it's like your speedometer says you're going 160 mph, but it's fake; you're only going 100, and AI makes the engine sound louder to trick you.

    • @slickysan
      @slickysan 12 дней назад +6

      @@getrekkedon You are so cringe lol.

    • @getrekkedon
      @getrekkedon 12 дней назад +10

      @ Is there an AMD card that can push path tracing with no upscaling? What are you saying? You know you can turn those settings off right? Just like you can choose not to drive your car up icy cliffs. I think most people who have the option to play graphical single player games with path tracing on using AI will because it actually looks that good.

    • @getrekkedon
      @getrekkedon 12 дней назад +5

      ​@ Unlike car race times, it's a subjective experience one in which the majority will appreciate actually being able to play full path traced aaa titles because of the AI.

  • @EFUwazurike
    @EFUwazurike 13 дней назад +589

    Does anyone actually play the game? Get into a gunfight, with explosions and particles flying. Why are we just standing at an intersection? There is no purpose praising the latency if you’re not showing the conditions in which the latency needs to be good.

    • @zinamogg
      @zinamogg 12 дней назад +84

      I was thinking the same. To truly test a card you need to play the game. Period.

    • @welshe222
      @welshe222 12 дней назад +94

      Yeah, not only this, but the latency standing still in a low density area is 40ms; as soon as he turned to quality mode it was 50ms. For a guy that claims he's optimistic about latency, why is he not complaining? 40ms/50ms is horrendous, it would be like playing on GeForce Now in the cloud lmao.

    • @t2nexx561
      @t2nexx561 12 дней назад +36

      @@welshe222 Thats what im saying like are we really gassing 40ms

    • @Dude3ro
      @Dude3ro 12 дней назад +20

      THANK YOU.... this is what every single comment should be talking about. You are standing still doing nothing

    • @anonony9081
      @anonony9081 12 дней назад +15

      That main area of the city is usually one of the worst for performance so I think it works.

  • @nellynelson9387
    @nellynelson9387 13 дней назад +1291

    Will wait for a 4090 vs 5090 comparison with raw frames rather than dlss/Ai

    • @PAIN-ot4cj
      @PAIN-ot4cj 13 дней назад +137

      the 5090 is 33% faster

    • @aHungiePanda
      @aHungiePanda 13 дней назад +154

      Just take the numbers here and compare it to a 4090 Cyberpunk 2077 Overdrive benchmark.
      RAW: 5090: 28-35 fps | 4090: 15-20 fps
      DLSS Q FG 2x: 5090: 110 fps | 4090 80fps
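
      A quick back-of-the-envelope check of those figures (a sketch only; the fps ranges above are the commenter's, compared at their midpoints):

          # Figures quoted above: 5090 28-35 fps vs 4090 15-20 fps at native path tracing,
          # 110 vs 80 fps with DLSS Quality + 2x frame generation.
          raw_5090, raw_4090 = (28 + 35) / 2, (15 + 20) / 2   # midpoints: 31.5 vs 17.5
          fg_5090, fg_4090 = 110, 80

          print(f"native PT uplift:   {raw_5090 / raw_4090 - 1:.0%}")   # ~80%
          print(f"DLSS Q + FG uplift: {fg_5090 / fg_4090 - 1:.0%}")     # ~38%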

    • @4evahodlingdoge226
      @4evahodlingdoge226 13 дней назад +10

      @@aHungiePanda So 80% more performance in ray tracing.

    • @axeandace7728
      @axeandace7728 13 дней назад +80

      cope AI is the future

    • @bananachips8402
      @bananachips8402 13 дней назад +30

      Its only an 8 frames gain with the 5090 vs the 4090

  • @tomi2257
    @tomi2257 7 дней назад +2

    Standing still and playing with motion blur enabled surely is the best way to demonstrate how this over-$2000 GPU fares. Bravo!

  • @957ad
    @957ad 13 дней назад +56

    Appreciate you showing the latency and giving more awareness about it- definitely makes a huge difference in how a game 'feels' when you're actually playing it

    • @detrizor7030
      @detrizor7030 12 дней назад +7

      Yeah, 40-50 ms latency is unplayable. For first person shooter it's nice to be below 20 ms, MAX 30 ms, anything higher is garbage. Fake frames nonsense is just a scam, it's such a shame that most people don't even understand it.

    • @getrekkedon
      @getrekkedon 12 дней назад +4

      @@detrizor7030 everyone understands it. It's a trade off. If you want path traced super graphical games, AI is a must right now. Esports where that low of a latency is a must are typically not graphically intensive and on top of that highly optimized and don't need AI. If you're really competiive you really don't care much at all how beautiful the game is and will turn settings down to get rock bottom latency and visual clarity anyway. The 5090 will likely be the most powerful GPU with or without the AI, so having the AI on top to play single player games where lightning reflexes don't matter and you like incredible graphics is nice.

    • @Sarvesh_Yenarkar
      @Sarvesh_Yenarkar 12 дней назад +2

      @@detrizor7030 How is 50 ms that bad? I usually play online games at that ping and honestly I don't feel a difference. Especially in online games it should matter, so how come it's a problem in a game like Cyberpunk 2077?? Since when did 50 ms become that bad?

    • @TheWeezyfly
      @TheWeezyfly 12 дней назад +2

      @@getrekkedon Finally someone gets it. Leave the AI features off for FPS games, turn them on for single player games

    • @detrizor7030
      @detrizor7030 12 дней назад +7

      @@getrekkedon "It's a trade off." - no, it's just nonsense. There's absolutely no sense in playing first person shooter with 40-50+ ms of input lag, this path tracing stuff just isn't worth it. Better not play a game at all, than with input lag higher than 30 ms.
      "Esports where that low of a latency is a must" - that's what i'm talking about - almost everybody keep talking some gibberish about multiplayer/singleplayer games and so on. That's NOT about esports, we need decent input lag (below 20 ms, 30 ms tops) NOT ONLY in multiplayer games, but in ANY game with mouse input, ANY. Come on, we need it even without any game, in plain operating system. In esports we need not just decent input lag, but lowest input lag possible, there's a VERY big difference between these 2 things.
      "where lightning reflexes don't matter" - what are you even talking about? For you there's only 2 options available - "lightning reflexes" and 50 ms ugly mess? Nothing in between? Seriously?

  • @qb3540
    @qb3540 13 дней назад +62

    Thank you Centric for the benchmark with no bells and whistles

    • @greenbow7888
      @greenbow7888 12 дней назад

      Would have loved to have known the temps. That Nvidia card is quiet and if temps are good, the AIBs are going to have to compete.

  • @mart8749
    @mart8749 13 дней назад +316

    10:55 nice npc walking path there

    • @aryanwizz554
      @aryanwizz554 13 дней назад +1

      wht

    • @ClintochX
      @ClintochX 13 дней назад +1

      😂 💔

    • @MilosCsrb
      @MilosCsrb 12 дней назад +24

      failed game, used as a benchmark

    • @DBTHEPLUG
      @DBTHEPLUG 12 дней назад +2

      ​@@MilosCsrb2nd best singleplayer game I've ever played besides Elden Ring.

    • @urulooke
      @urulooke 12 дней назад +7

      This is amazing! I have walked like that sometimes, but it's the first time seeing NPC doing that.

  • @dannyx1991
    @dannyx1991 12 дней назад +18

    I honestly don't think I'm going to notice the occasional smudged graphic generated by the AI unless I plunge my face on the monitor, actively looking for it

    • @albertcamus6611
      @albertcamus6611 11 дней назад +3

      tool

    • @Billy_bSLAYER
      @Billy_bSLAYER 10 дней назад

      In a multiplayer game online with many movements and actions taking place around the screen you can notice the MFG..

    • @albertcamus6611
      @albertcamus6611 9 дней назад +1

      @@Billy_bSLAYER don't waste your time explaining. this person has accepted the lie and justified believing it....

    • @X-S-V-b1e
      @X-S-V-b1e 9 дней назад +1

      ​@@Billy_bSLAYERhopefully it gets better. Cause it seems nvidia isnt gonna change. Gonna go all in on this and it sucks. We are gonna get to a point where our gpus can only handle fake frames 😂

    • @superior_nobody07
      @superior_nobody07 9 дней назад

      What lies? You are getting more frames. Ai or not, 200 frames is still better than 45​@@albertcamus6611

  • @MrAnimescrazy
    @MrAnimescrazy 13 дней назад +142

    Cant wait for the 5090 vs the 4090 raster, dlss, raytracing benchmarks.

    • @HauntedBastard
      @HauntedBastard 12 дней назад

      didnt dlss4 have problems and also nvidia lied about their ai framerates.
      it couldn't even get 30 fps on cyberpunk.
      xD

    • @Celatra
      @Celatra 12 дней назад +9

      Raster seems to be like 33% difference based on this game

    • @jtrain9926
      @jtrain9926 12 дней назад +1

      5070= 2x4090

    • @NEWLifeXs
      @NEWLifeXs 12 дней назад +1

      ​@@Celatraless

    • @YamGaming
      @YamGaming 12 дней назад +4

      the 5080 vs 4080 will be hilarious since its a mere 8-14% raw performance increase lmao

  • @shaikdilkhushsyedbabasaifu3361
    @shaikdilkhushsyedbabasaifu3361 13 дней назад +193

    If 5090 gets released please do one video on RTX 4090 vs RTX 5090 Raw Performance and Native benchmarks 😅

    • @ryanhamrick8426
      @ryanhamrick8426 12 дней назад +15

      Why does that even matter? 5000 series is about AI and new tech that 4090 doesn't have and can't compete with.

    • @RedSky8
      @RedSky8 12 дней назад +52

      ​@@ryanhamrick8426 Clearly you aren't listening, all of that AI performance means nothing if the base performance is shit. If all you care about are generated frames then great buy a 5000 series GPU.
      For those of us that care about 0 AI generated frames, or just base DLSS with no frame gen, we want to know how much they actually improved the raw performance compared to the 4000 series. Is the dollar per base fps worth it or not?

    • @manar_belaid_007
      @manar_belaid_007 12 дней назад +12

      @@RedSky8 what's truly concerning is the latency which will worsen

    • @504Trey
      @504Trey 12 дней назад

      Jensen: STFU 🤡 & GO EARN ME A NEW JACKET!

    • @mattkavanagh4000
      @mattkavanagh4000 12 дней назад +13

      @@ryanhamrick8426 Why would you want AI frames that have artifacts and will increase latency, when you can have raw power with lower latency and fewer artifacts? Nothing cool about new technology that is lazier and less useful than how they were doing things before

  • @trevorduncan9580
    @trevorduncan9580 12 дней назад +64

    yes please everyone hate on the 50 series. DO NOT buy a 50 series. wait AT LEAST 6 - 7 months. do not pre-order. just keep posting on the internet about how bad it is, This might help me get one at launch. thanks.

    • @TheFlyinFisch
      @TheFlyinFisch 12 дней назад +13

      amen to that. sick of these crybabies

    • @laszlodajka5946
      @laszlodajka5946 12 дней назад +1

      If they have a 4090 yeah it is senseless to jump on the new line. If you come from a gtx card that is another story

    • @trevorduncan9580
      @trevorduncan9580 12 дней назад

      @@laszlodajka5946 no shut your mouth. It's trash. No one should buy one. Terrible. Fake frames ya know

    • @trevorduncan9580
      @trevorduncan9580 12 дней назад +6

      @@TheFlyinFisch FACTS. You'd think Nvidia is out here with guns to everyone's head DEMANDING they buy a 5090 the day it drops lol weirdos 🤣 wonder if they get all mad when Rolex drops a new expensive watch they can't afford too?

    • @PhilosophicalSock
      @PhilosophicalSock 12 дней назад +2

      the funniest thing imo is: absolute most of those who cry about the "disgusting fakery huang scam" are the same ppl who either sit on Amd or on the gtx1060. They are not going to buy the new gpus anyway. And they have no clue about how the new tech FEELS with your own eyes and hands.
      Also, they are sure that 88% of PC gamers (who buy nvidia) are plain stupid hamsters and scammed by huang.
      Yes, i dont think i would upgrade from 4090 atm, but still the new features look decent if you have a much older gpu

  • @outsidethepyramid
    @outsidethepyramid 13 дней назад +78

    I'm a 1998 Tomb Raider II guy, so I know that gameplay is the main consideration, like about 70% game design and gameplay. Graphics are great but there's a point of diminishing returns; a glitch here and there makes next to no difference to the gaming experience. These GPUs are put on a pedestal and you think they will make you happy, but in actual fact they're just way too expensive for 99% of the population.

    • @jaaqob2
      @jaaqob2 13 дней назад +9

      Well yeah, these gpus are obviously not targeted towards 99% of population

    • @senti2175
      @senti2175 13 дней назад +4

      Totally agree with you except when it comes to VR.

    • @outsidethepyramid
      @outsidethepyramid 13 дней назад +4

      @ ok then 99.999% of gamers. how can u spend £2000+ on a gpu, like it's a Rolex or a Gibson? It's a complete rip off like more or less everything today; KFC, cars, whatever. You pay LOTS more and get a lot less value. Get real, £600 is too much for a GPU

    • @Emanuele-j2e
      @Emanuele-j2e 13 дней назад +4

      ​@@outsidethepyramidif u have the money who cares, its your choice and its your money

    • @Emanuele-j2e
      @Emanuele-j2e 13 дней назад +1

      ​@@jaaqob2 gamers will probably get the 5080/5070

  • @cobra2994
    @cobra2994 13 дней назад +17

    Wow, this may be one of the best videos I have seen from you!! I love how you went through many settings and even did a noise test on the card in a quiet environment; this is way better than Linus's video on this!

  • @JustoneAssassin
    @JustoneAssassin 13 дней назад +55

    Great video, exactly what I wanted to see. Love how you were tweaking the settings, very informative! Any chance of a similar video for the RTX 5080?

    • @PcCentric
      @PcCentric  13 дней назад +30

      I spent too much time making this to try the other demos sadly

    • @CrashBashL
      @CrashBashL 13 дней назад +1

      I told you why he didn't test the 5080 but he deleted my comment.
      Damage control.

    • @VoidBass6412
      @VoidBass6412 12 дней назад

      Why? What happened? ​@@CrashBashL

    • @CrashBashL
      @CrashBashL 12 дней назад

      @@VoidBass6412 Each time that I answer, he is deleting my comment.
      No bad words are being used.
      He's just doing damage control.

    • @Rivstar
      @Rivstar 12 дней назад +1

      @@CrashBashL I can assure you PC centric doesnt manually delete comments like this lol.

  • @stuartchapman7934
    @stuartchapman7934 12 дней назад

    Great Video tbf, ultra comprehensive!!!

  • @BockworschtSoldier
    @BockworschtSoldier 13 дней назад +36

    I will use my 3090 with upscalers. It can still spill out decent framerates.

    • @VFXShawn
      @VFXShawn 13 дней назад +9

      Im rocking my 3090, still not convinced to upgrade.

    • @veniulem5676
      @veniulem5676 13 дней назад +6

      it honestly amazes me that people complain about the 2k price tag when the 3090 for a long time was sold above 2k and plenty of people bought it

    • @damara2268
      @damara2268 13 дней назад +1

      You will get the DLSS 4 upgrade too: the transformer model with much better image quality, as mentioned in the video.

    • @markymoo22
      @markymoo22 12 дней назад

      ​@veniulem5676 Yep, I went daft in COVID £2,150 for a 3090 Suprim X.
      Not sure i can stomach the big prices again. Need to wait for 60 series I think. Unless the 5080 comes out really good.

    • @maximum_effort1
      @maximum_effort1 12 дней назад +5

      Yup! Pair it with Lossless Scaling and you basically got yourself a 5090 lol

  • @moise1545
    @moise1545 13 дней назад +408

    send me the card i can benchmark it too

    • @madeinnola
      @madeinnola 13 дней назад +3

      Sell it for a 2011

    • @KLOWED
      @KLOWED 13 дней назад +12

      sure thing lil bro

    • @hmklg
      @hmklg 13 дней назад +17

      Yeah just send me your adress and your mums credit card information so I can ship it lmao.

    • @Timppasaurus
      @Timppasaurus 13 дней назад

      Do you mean keep it?

    • @opticalsalt2306
      @opticalsalt2306 13 дней назад +1

      I can test it for free and never return it pls

  • @nolan6137
    @nolan6137 13 дней назад +27

    Thanks for going into the latency and not just being an fps fanboy. I really appreciate deeper dives like this; it has answered many of my questions.

  • @Ang3r87
    @Ang3r87 8 дней назад +1

    Optimize games. We don't want fake frames. We want native resolution.

  • @YoDefend
    @YoDefend 13 дней назад +295

    Wooo UK person does the 5090 review!

    • @AmanSingh-xk1me
      @AmanSingh-xk1me 13 дней назад +20

      He passed Jensen a botta-of-woaa

    • @jamescpalmer
      @jamescpalmer 13 дней назад +5

      @@AmanSingh-xk1me tasted slightly better than wadder

    • @mohamedatta3586
      @mohamedatta3586 12 дней назад

      Better than review on red note ? 😂

    • @hanzcholo
      @hanzcholo 12 дней назад

      Buncha wanks

    • @EarthRising97
      @EarthRising97 12 дней назад +1

      @@AmanSingh-xk1me Woaa > Wahda

  • @Master_Psyper
    @Master_Psyper 12 дней назад +11

    Definitely feeling better about having recently gotten the 4080s. The frame generation is a nice buffer but without it, it's a marginal improvement. Something I expect to develop more for cards that are not the 50 series. Still cool, and way more appealing to those building a newer pc with a smaller budget

    • @nlluke5207
      @nlluke5207 12 дней назад +1

      Only 1000 for the 5080 idk mate….

    • @Master_Psyper
      @Master_Psyper 12 дней назад

      @ I think they are no brainer for the story games, but for the fps they don’t seem like must upgrades. I’d like to see a VR run with the frame gen, that might be epic

    • @ThisLuvee
      @ThisLuvee 12 дней назад

      ​@@nlluke5207
      If you could even get it in the first place lmao.
      People who got a 4080s at launch made the right choice.

    • @nlluke5207
      @nlluke5207 12 дней назад

      @@ThisLuvee ppl that did buy the 1080 ti back in 2016 made the right choice hihi…

    • @Whreck11
      @Whreck11 12 дней назад +2

      ​@@nlluke5207it will be just 15% uplift unless you prioritize fake frames lol,4080s dropped in prices.

  • @housebywill1715
    @housebywill1715 13 дней назад +12

    Thought this was Scott the Woz, I should wear my glasses more

    • @arride4590
      @arride4590 12 дней назад

      I believe he is a character from The Lord of the Rings

  • @boxposed3855
    @boxposed3855 12 дней назад +8

    4 years later and Cyberpunk still can't run at 60fps 4k Max settings natively on the most "powerful" gaming GPU without cutting corners.

    • @erenyaeger7662
      @erenyaeger7662 10 дней назад

      Yeah and the 5090 is nearly twice the speed of a 3090.

    • @aether8412
      @aether8412 7 дней назад +1

      Well tbf they added path tracing way later after the game released

    • @aquacube
      @aquacube 6 дней назад

      Now that Moore's Law is dead, short of tech paradigm shift the way we are going to get more performance is by 'cutting corners'. Thankfully it looks like we won't really be able to tell corners have been cut.

    • @boxposed3855
      @boxposed3855 5 дней назад

      @aquacube The current state of DLSS is really noticeable in a lot of games that feature any kind of hair or foliage. Maybe one day it'll be better

    • @aquacube
      @aquacube 5 дней назад

      @ fingers crossed for future generations, I'm sure it will as AI/software tech is always evolving.

  • @leoutriainen
    @leoutriainen 13 дней назад +39

    I feel like you should've explained that PC latency isn't the same thing as frametimes, since that seems to be a common misconception and something there was tons of outrage about on Reddit.

    • @daunted232
      @daunted232 13 дней назад +1

      That could be it's own video.

    • @Sp3cialk304
      @Sp3cialk304 13 дней назад +11

      100 percent I can't stand when I see people say they are getting 16-17ms of latency when at 60fps. They literally don't understand that they are looking at only the frame time and not the total system latency.
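
      For reference, a tiny illustration of that distinction (the stage numbers below are made up for illustration; real pipelines vary):

          # Frame time is just 1000 / fps; total system latency stacks several stages on top of it.
          fps = 60
          frame_time_ms = 1000 / fps            # ~16.7 ms - the number people mistake for latency

          # Hypothetical stage costs (illustrative, not measurements):
          input_sampling_ms, render_queue_ms, display_scanout_ms = 2, 8, 8
          total_latency_ms = input_sampling_ms + render_queue_ms + frame_time_ms + display_scanout_ms
          print(round(total_latency_ms, 1))     # ~34.7 ms even though the frame time is only 16.7 ms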

    • @Celatra
      @Celatra 12 дней назад +1

      ​@@Sp3cialk304system latency is dependent on your keyboard and monitor and even quality of HDMI/ display port cables

  • @sethvam
    @sethvam 13 дней назад +11

    At 05:40 the latency jumps to 72 for a second.

  • @Eldergloom
    @Eldergloom 13 дней назад +170

    I'ma stick with my 4080 Super for the next 7+ years.

    • @MOAB-UT
      @MOAB-UT 13 дней назад +7

      Amazing GPU! 4080S. All you need. Though 16GB VRAM will become needed for some games.

    • @smaze1782
      @smaze1782 13 дней назад +5

      Not a bad idea.

    • @chrisking6695
      @chrisking6695 13 дней назад +2

      I have that one and the card is strong enough for 24GB of VRAM. But with 16GB I'm limited in terms of modding and certain games with path tracing. So I'm considering the 5090.

    • @prenti1205
      @prenti1205 13 дней назад +5

      16GB VRAM won't even carry you 3 years if you play at 4K

    • @Tech-is1xy
      @Tech-is1xy 13 дней назад +7

      @@prenti1205 The 4080 super won't remain a 4k card forever. Games are inevitably going to get more demanding and it will drop down to a 1440p card and then to 1080p card. So 16gb of Vram isn't going to be much of an issue for the future.

  • @BUDA20
    @BUDA20 13 дней назад +38

    Reflex low latency is activated automatically with Frame Gen, but NOT if you disable frame gen; with frame gen off you will get less input lag by turning Reflex ON

    • @crispycrusader1
      @crispycrusader1 13 дней назад +7

      Yeah with reflex 2 the latency should be 1/4th of what he was seeing with frame gen off
      So 9-12ms

    • @xSkully_H
      @xSkully_H 13 дней назад

      Reflex 2 won't actually work with FG due to how R2 changes frames to get the lower latency @crispycrusader1

    • @notbrokebrolt6281
      @notbrokebrolt6281 13 дней назад +1

      It stays on if you turn frame gen off

    • @notbrokebrolt6281
      @notbrokebrolt6281 13 дней назад +1

      @@crispycrusader1 This game doesn't have Reflex 2, and Reflex is still on

    • @crispycrusader1
      @crispycrusader1 13 дней назад +1

      @@notbrokebrolt6281 Ah i thought that it was a combined system, like if they updated to dlss4 they would update reflex as well to newest mode

  • @TheSoxor123
    @TheSoxor123 11 дней назад +2

    I don't think gamers truly understand how much is required to run a game at native 4K, let alone at 60 frames per second.

  • @yiravarga
    @yiravarga 12 дней назад +5

    It’s like going from traditional CDs to MP3 CDs. 12 songs on a disc can now be 100+ songs on the same disc, on the same CD player. The difference? Compression. Now we’re back to lossless audio options, and I assume gaming will go from lossless like now, to DLSS which is like MP3, then back to a mix depending on user preference. Eventually, lossless and DLSS rendering will be so close, it’ll be like arguing about hearing the difference of high bitrate MP3s, and lossless (or Bluetooth versus cable)!

    • @СергейЮрьевич-ж6л
      @СергейЮрьевич-ж6л 12 дней назад +1

      MP3 will never compare to the quality of vinyl

    • @ZEthanEZ
      @ZEthanEZ 11 дней назад

      I prefer the quality of record players thank you very much
      Jk this is a pretty good example tho

  • @akira1072
    @akira1072 13 дней назад +74

    10:55 Good to see npcs in cyberpunk are still half baked after 4 years.

    • @theninjarush1163
      @theninjarush1163 13 дней назад +6

      😂. Unfortunately no amount of ai is gonna fix that lol

    • @Dbxmdkdjdjdi
      @Dbxmdkdjdjdi 12 дней назад +6

      No game is perfect even Elden ring has bugs here and there

    • @blaze26700
      @blaze26700 12 дней назад +2

      I have been seeing this kind of stuff in GTA 5 for 10 years now.

  • @marxerr
    @marxerr 7 дней назад +1

    i’m disappointed, i wanted to buy the 5090 but it doesn’t seem worth it, and now the worst part is we have to wait for the 60 series for a major raw performance increase.

  • @RurouTube
    @RurouTube 13 дней назад +35

    The more generated frames you add, the higher the latency, but at infinite MFG the additional latency, assuming perfect scaling (i.e. generating a frame has 0 cost), approaches the cost of rendering one real frame. So a 60fps game will have a max latency penalty of 16.7ms (1s divided by 60) with infinite MFG. The main contributor to the latency is not the processing cost but the fact that the real frame has to be delayed because of the need for frame pacing and interpolation. If you look up FSR3 on GPUOpen, you can see why it introduces additional latency. Basically the game engine renders a frame but, instead of being shown, it is held for half the frame time (assuming normal 2x FG); when a new frame is ready, the game doesn't display that new frame but uses it plus the previous frame for interpolation and presents the generated frame. The actual new frame is then held for around half the frame time, and the process repeats. Because the game holds the real frame for half the frame time, normal FG adds around 8.3ms for a game with a 60fps base frame rate. For 4X MFG, instead of half, it holds the real frame for around 3/4 of the frame time, meaning 16.7ms * 0.75 = 12.5ms additional latency, or around 4ms more than 2X FG. Again, none of these numbers account for the time needed to process the generated frames, but as you can see, while more generated frames do add latency, each subsequent increase is smaller than the initial jump from 2X frame gen.
    Basically, if you're not bothered by the increased latency from 2x frame gen, 4x frame gen shouldn't bother you in terms of added latency unless we are talking about a really low base frame rate.
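
    A minimal sketch of the hold-time arithmetic described above (assuming zero generation cost and ideal frame pacing; the function name is just illustrative):

        def added_fg_latency_ms(base_fps: float, gen_factor: int) -> float:
            """Extra latency from holding the real frame for interpolation.

            With N-x frame generation the real frame is delayed by roughly
            (N-1)/N of one base frame time before it is displayed.
            """
            frame_time_ms = 1000.0 / base_fps
            return frame_time_ms * (gen_factor - 1) / gen_factor

        # At a 60 fps base: 2x adds ~8.3 ms, 4x adds ~12.5 ms, and the penalty
        # approaches the full 16.7 ms frame time as the factor grows.
        for factor in (2, 4, 8):
            print(factor, round(added_fg_latency_ms(60, factor), 1))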

    • @sadiqfaysal921
      @sadiqfaysal921 13 дней назад +5

      So are you saying the higher your base frame rate, the less latency. For example I play marvel rivals with frame gen on. I get about 100fps +- without it on, but when I turn on frame gen. I don’t feel the difference in latency, makes me wonder why it’s so shit on

    • @thetranya3589
      @thetranya3589 13 дней назад +16

      Because most of the people commenting on it have no idea what they are talking about.

    • @SLVYER1
      @SLVYER1 12 дней назад

      Never thought we'd get 4ki in 2025 lmfao

    • @user-sv2wy6gx7u
      @user-sv2wy6gx7u 12 дней назад +2

      Honestly 40-50ms latency isn't even noticeable unless you compare it side by side with 15ms. It's incredible Nvidia has figured out how to generate so many fake frames with sub 50ms latency.

    • @gschaaf713
      @gschaaf713 12 дней назад

      so basically frame gen sucks ass

  • @DJJoMix
    @DJJoMix 12 дней назад +6

    If you don't understand what it takes to run Full Ray Tracing, let alone RT with full Path Tracing in 4K, then it's time to just walk away. The real question is, how is the image quality on the 4x DLSS vs Raster? Cause if our human eyes cannot tell, then who cares if it is AI.

    • @PhilosophicalSock
      @PhilosophicalSock 12 дней назад

      "if our human eyes cannot tell, then who cares if it is AI"
      brain-healthy people do not care, while the crybabies in comments somehow do.
      probably a religious cult of "rAw FpS", no matter what

  • @WhyNotGaming1
    @WhyNotGaming1 13 дней назад +28

    30 fps with path tracing is legit good tbh. I would still wait for raw offline rendering benchmarks though.

    • @NEWLifeXs
      @NEWLifeXs 12 дней назад

      Legit good ? No, it's a facking disappointment of a product... should have been getting at least 50!!!!! Ffs !!!

  • @louiedfn6552
    @louiedfn6552 6 дней назад +1

    im coming from a 3070 and planning on 3080 barring a terrible release. i just wanna play my games at 1440p and maybe 4k. i have an overkill monitor and i wanna finally take full advantage of it

    • @Doomguy-777
      @Doomguy-777 6 дней назад

      My 3070 Ti plays fine on 1440p 21:9. Just the VRAM is low. Why a 3080 lol just get a RTX 5070 Ti instead

  • @Yanto-rv1qu
    @Yanto-rv1qu 13 дней назад +43

    What did the vents smell like? Or is that under embargo?

    • @kevo300
      @kevo300 13 дней назад +29

      Smells like Jensens 9k leather jacket he wore

    • @PcCentric
      @PcCentric  13 дней назад +117

      Money leaving your bank account

    • @DBTHEPLUG
      @DBTHEPLUG 12 дней назад

      Who cares?

    • @qwertyrewtywyterty
      @qwertyrewtywyterty 12 дней назад

      lmao

  • @DavidFregoli
    @DavidFregoli 13 дней назад +15

    65 fps in PT DLSS Q 4K is pretty good actually, 4090 does around 40. Let's say it's 60 to 40 because the scene was pretty basic, it's still a massive 50% jump. I think 30% average is more likely but if it gets close to 50% in the workloads where 4090 doesn't cut it (4k dlss Q PT) it's a worthwhile upgrade for enthusiasts.

    • @guerrierim15
      @guerrierim15 13 дней назад +2

      Um. My 4090 gets 60 fps. Not 40. Ngl this doesn’t seem like a big jump at all without the MFG

    • @FloIstMoep
      @FloIstMoep 13 дней назад

      For enthusiasts 10% will do, but that's not what nvidia could aim for.

    • @Cptraktorn
      @Cptraktorn 12 дней назад

      ​@@guerrierim15no it doesn't, I have a 4090 and it does not get to 4k 60 on quality dlss

    • @marekavro
      @marekavro 12 дней назад +1

      @@guerrierim15 ummmmm yea, my 4090 gets 500 fps at native 4K so basically 5090 sucks ass

    • @hydzifer
      @hydzifer 10 дней назад

      @@marekavro pls stop for god sake

  • @Collioure232
    @Collioure232 13 дней назад +6

    I’m a 4090 owner and am severely triggered and threatened by a new product coming on the market *dribble *

    • @Mike-Dynamike
      @Mike-Dynamike 13 дней назад +5

      You will continue to give them money no matter what they do 😂

    • @TampaGigWorker
      @TampaGigWorker 13 дней назад +1

      ​@@Mike-Dynamiketrue. There's no real competition to choose from. Not like in the auto industry where there's so many to choose from.

    • @Mike-Dynamike
      @Mike-Dynamike 13 дней назад +2

      @@TampaGigWorker true, no competition in high end gpu market but 90% of user don’t really need a 4090……they make a conscious decision to give Nvidia money and they already know how they are. Just don’t cry after it.

    • @TampaGigWorker
      @TampaGigWorker 13 дней назад +1

      @@Mike-Dynamike i made a conscious decision to spend $1069 on the Gigabyte 4080super OC. A tough pill to swallow but no complaints after making the final payment through Affirm financing.
      Im happy with it for now. I'll upgrade in about 2 years and stay in the 80 ti/super series braket.

    • @AD-sg9tr
      @AD-sg9tr 13 дней назад +1

      @@Mike-Dynamike lmao fax

  • @amsbaia
    @amsbaia 7 дней назад +1

    With the input lag increase, it's basically the same thing as Lossless Scaling

  • @maccollo
    @maccollo 12 дней назад +10

    That artifacting is definitely in the "Now I can't unsee it" category

  • @TheArchitect4K
    @TheArchitect4K 13 дней назад +18

    Nice video mate. Should have tried without RT and without DLSS/ frame gen !

    • @theotherLewbowski
      @theotherLewbowski 13 дней назад +5

      Thanks for this comment. Now I know not to watch this shit. Did he really not test native res😂😂😂

    • @TheArchitect4K
      @TheArchitect4K 13 дней назад +5

      @theotherLewbowski no no, he did. Native, but full RT. I was interested in native, no RT 😅 but the other tests he made were good, watch it !

    • @XMatt5X
      @XMatt5X 13 дней назад

      true

  • @PAIN-ot4cj
    @PAIN-ot4cj 13 дней назад +30

    If nvidia didn't soft lock the 40 series with the new MFG x4 there would be really no difference

    • @brotherali-e1d
      @brotherali-e1d 13 дней назад +7

      there is an app with a duck on steam

    • @brotherali-e1d
      @brotherali-e1d 13 дней назад +11

      lossless scaling

    • @tucci06
      @tucci06 12 дней назад +2

      Why do people keep saying this? It is blowing my mind. Do you actually believe there will be a ZERO PERCENT difference in raw performance? I think the consensus is that there will be around 30-35% uplift. which is pretty standard.

    • @DavidDuchodca
      @DavidDuchodca 12 дней назад +7

      ​@@tucci06by pretty much having 30% more power consumption. :) it's basically like a overclocked version. Dont get me wrong i get that nowadays you dont pay for HW, but for frame gen... Thats where the future is heading,but dont lie to yourself regarding HW improvement...

    • @PAIN-ot4cj
      @PAIN-ot4cj 12 дней назад +1

      @@tucci06 A 5090 with MFG enabled: 400 fps. A 4090 with MFG enabled: 350 fps. A new 5090 is 2k plus dollars; do you think that's worth an upgrade 🤣 ok

  • @ExhaleDJ
    @ExhaleDJ 12 дней назад

    Awesome video, thanks!

  • @Jr_TopG
    @Jr_TopG 12 дней назад +13

    I have a 4090 and trust me, we don't need more than 120 fps in games; developers just need to optimize their games

    • @scottwatrous
      @scottwatrous 12 дней назад

      I remember when console people were saying we don't need more than 30fps to have an enjoyable game, and the PC people were bragging about glorious 60fps.
      But yeah 120 is getting to be plenty.

    • @N4CR
      @N4CR 12 дней назад

      @@scottwatrous 120 was about where I couldn't see much difference past back in the CRT days. Once you hit around there it was good for almost any type of game. Yes twitch FPS guys can benefit from more but most games... nope lol. These days I care more about pixel response and individual lighting.. so OLED and similar is king. 48" C2 is my daily driver and absolutely love it with an XTX Nitro and 7800x3d.

    • @tarragoni4161
      @tarragoni4161 12 дней назад

      its plenty for single player story games but nowadays the majority of gamers play competitive shooters and actually need the fps to reach their refresh rate. i do agree games like warzone and Fortnite need to optimize their games better because out of all the years of occasionally loading into Fortnite i still feel like i have to put my settings at low to get my fps to match my monitors refresh rate despite having a 4070 super

    • @PhilosophicalSock
      @PhilosophicalSock 12 дней назад

      In singleplayer titles you dont need more than 120 fps, but in multiplayer you only want maximum fps without any FG.
      Interesting marketing Nvidia has right there.. huh

    • @jamesm568
      @jamesm568 12 дней назад

      I want a stable 120 FPS.

  • @thatwasgaming6225
    @thatwasgaming6225 12 дней назад +11

    10:50 spend $2,000 to play at 28 fps... sounds good to NVIDIA

    • @hagaiak
      @hagaiak 12 дней назад +7

      You think people with 4090 are playing Cyberpunk at 20 fps? Use your brain

    • @Dexion845
      @Dexion845 10 дней назад +2

      There's no saving y'all I swear 😂

    • @fana9863
      @fana9863 7 дней назад +1

      People who utter this sentence are amusing to me. Who forces you to use very intensive settings in a game? If they would add a feature to increase the path tracing accuracy even more when the user want to activate it, then you would be upset??

  • @jakek3653
    @jakek3653 13 дней назад +9

    This is the reason I grabbed a 4090, I didn't think they could get much better performance than it over the next couple of years… and I was right. Prob won't be upgrading again until the 6090 or maybe even the 7090.

    • @MrMeanh
      @MrMeanh 13 дней назад

      Same here, don't see myself upgrading until the gen after the PS6 is released, and that will probably be the 70-series. I also hope that I can get at least 4090 +50% performance at 250-300W then, since the 4090 is already making my room into a sauna in the summer months (the 5090 would probably do this 6-9 months a year).

    • @jakek3653
      @jakek3653 13 дней назад

      @ I mean I’m waiting till they come out with a card that can run 4k 240hz RAW performance not fake AI frames.

    • @filthyrando3632
      @filthyrando3632 13 дней назад

      Exactly.

    • @BladeRunner031
      @BladeRunner031 13 дней назад +1

      @@jakek3653 Waiting what now?? 😂😂

    • @jakek3653
      @jakek3653 13 дней назад +1

      @@BladeRunner031 a card that can run 240hz at 4k.

  • @dan.downing
    @dan.downing 11 дней назад

    Really nice bit of analysis, especially for doing this on the fly, at CES in a hotel room. Kudos.

  • @DeathDeclined
    @DeathDeclined 13 дней назад +19

    wouldn't that latency be horrible for multiplayer games tho?

    • @dechrysoph5298
      @dechrysoph5298 13 дней назад +10

      Yes exactly this feature is only good if you want to play story games

    • @johnrehak
      @johnrehak 13 дней назад +3

      If you plan to play competitive games at 4K with path tracing and frame gen to hit 240Hz, then yes.

    • @jaaqob2
      @jaaqob2 13 дней назад +11

      who plays competitive games in 4k with dlss and frame gen? pls explain

    • @Joeythejew
      @Joeythejew 13 дней назад

      As of recent data, a small percentage of PC gamers use 4K monitors. According to the Steam hardware survey, only 1.74% of PC gamers use single monitor setups at 4K resolution. (What is the point of building a GPU that's catered to less than 2% of potential buyers?)

    • @BladeRunner031
      @BladeRunner031 13 дней назад +4

      @@jaaqob2 Hey,people think they play competitive gaming if they play COD online lol

  • @Cooper3312000
    @Cooper3312000 11 дней назад +3

    No one wants fake blurry frames just pure raw speed. Reviewers just kissing ass.

    • @superior_nobody07
      @superior_nobody07 9 дней назад

      More frames equals more frames. I'm not going to notice minor minuscule blurs every so often, but I sure as hell will notice 120fps compared to 40fps

    • @Cooper3312000
      @Cooper3312000 8 дней назад

      @@superior_nobody07 I guess you don't see the massive trails behind the cars in Cyberpunk 2077 with DLSS on to get a playable frame rate. Or the massive loss in detail on everything. Probably play on a 24" LCD monitor at 1080p. I use a 45" Corsair OLED 3440x1440 240Hz monitor. I'm not upgrading for $2,000 from the 4090 to the 5090 for a 10% at MAX increase of raw performance that's insane. I was personally expecting way more.

  • @REDZ28won
    @REDZ28won 12 дней назад +10

    I just want to know what the 5090 does at Native 4k without RT, DlSS and Frame generation vs a 4090!

    • @bra1nc417d
      @bra1nc417d 12 дней назад +1

      We will know in 9 days. IDC about RT, DLSS, Frame Gen or any of that shit. I want the raw MAX performance at 4k.

    • @dimitrirouge5568
      @dimitrirouge5568 12 дней назад

      true, never liked DLSS... it makes blurry lines... I'm not a console gamer.

    • @toxotis70
      @toxotis70 12 дней назад

      i think about 20-30% better

  • @richardbixler
    @richardbixler 12 дней назад

    Appreciate the in depth test. It’s all the sort of experimenting I would have done. Great to see an influencer/reviewer really taking advantage of their early access to benefit of their audience. I feel I have a pretty good understanding of the performance gains of the 50 series. Thank you!

  • @ubivatel4207
    @ubivatel4207 12 дней назад +12

    Something to notice here is that the 5090 gets ~68fps at DLSS Quality, while the 4090 was only able to get the same framerates with DLSS Performance (as far as I remember). With the new transformer model the improvement to image quality should be substantial
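
    For context, a small sketch of the internal render resolutions behind those presets (assuming the commonly cited DLSS scale factors of 0.667 for Quality and 0.5 for Performance; the exact factors are an assumption here):

        def internal_resolution(width: int, height: int, scale: float) -> tuple[int, int]:
            """Resolution the GPU actually renders before DLSS upscales to the output size."""
            return round(width * scale), round(height * scale)

        # At a 4K output, Quality renders far more pixels per frame than Performance:
        print(internal_resolution(3840, 2160, 0.667))   # ~(2561, 1441) - Quality
        print(internal_resolution(3840, 2160, 0.5))     # (1920, 1080)  - Performance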

    • @N4CR
      @N4CR 12 дней назад +2

      both still look crap. the transformer demo frames didn't even have the same damn textures or texture maps, it's like throwing away the original game and making a fake one out of cardboard. trash

    • @kam9223
      @kam9223 12 дней назад

      It was most likely at DLSS Performance, the video didn't show what supersampling preset they chose after disabling FG, notice the cuts at 8:14 and 10:31. I'm happy to be wrong though

    • @ubivatel4207
      @ubivatel4207 12 дней назад

      ​@@kam9223 At 8:57 he says that it's DLSS Quality

    • @alessiomasciandaro1022
      @alessiomasciandaro1022 12 дней назад +1

      ​@@N4CRwhat are you on about😂😂😂

    • @hagaiak
      @hagaiak 12 дней назад

      @@N4CR Imagine calling the greatest graphics available to humans crap, lmao

  • @cidronsenburg7242
    @cidronsenburg7242 13 дней назад +10

    You will notice the MFG frame gen issue even more when it comes to a scenario with a lot of vegetation.

    • @damara2268
      @damara2268 13 дней назад +5

      question is, do you play games to have fun or to zoom in on grass and shake the camera?

    • @laszlodajka5946
      @laszlodajka5946 12 дней назад

      ​@@damara2268well at that point we could argue the same about rayter vs ray tracing

    •  12 дней назад

      I play games to win and having artefacting, ghosting and motion blur at high refresh rates kind of makes high refresh rate monitors useless no? Now I know this is going to be hard to grasp but - other people are different than you, they aren't the same as you. Try to understand this simple concept in the most simple manner and the rest shall follow. Peace

  • @blaze26700
    @blaze26700 12 дней назад +4

    2:08 If you have played multiplayer games in 50+ Ping, I don't think you will have any problem playing story mode games in 30+ Ping

    • @ruven782
      @ruven782 12 дней назад +2

      I play games with 5 to 9 ms.. this will kill me when playing competitive

    • @SamuelPGM
      @SamuelPGM 11 дней назад

      ​@@ruven782look... Until 70ms is fine

    • @aether8412
      @aether8412 7 дней назад

      Those have rollback tho, this would be straight up delay so it’d feel a lot worse

  • @pags1981
    @pags1981 5 дней назад +1

    Meh...getting 70 fps, 4k, DLAA (no frame gen) with path tracing using 3080 Ti. I want raw power like the 1080Ti 😅

  • @AshtonCoolman
    @AshtonCoolman 13 дней назад +11

    It's faster than a 4090 while native rendering with path tracing enabled, but not by much. DLSS means we're NOT rendering at native resolution, and it can't get decent frames without it. This isn't a huge jump in REAL performance over a 4090. It's just a huge jump in software crutches to make things appear faster. I have a 4090, but my personal rule is to not upgrade until a GPU is 50% faster in native rendering performance. Good native performance plus the software crutches will enhance the experience far more. I'll save up $3000 for that 6090 I guess😂

    • @ronyomar100
      @ronyomar100 13 дней назад +2

      Yes I have 4090 and I will not upgrade until the next generation 6000

    • @greenbow7888
      @greenbow7888 12 дней назад +1

      Both the 4090 and 5090 were built on TSMC 4nm, meaning the only increase in raster comes from more power and more cores; there's no performance or efficiency gain from a reduction in fabrication node size. .......... Meaning the next gen of Nvidia will likely drop the fabrication node size and be a monster. TSMC now has 2nm fabrication, so hell yeah.

  • @Sader325
    @Sader325 12 дней назад +4

    I don't understand why anyone would think a 50 ms delay is acceptable.

  • @oligerolsen2533
    @oligerolsen2533 13 дней назад +14

    Why didn't you test 4K native res without frame gen, with DLAA or nothing at all, no DLSS, no PT or RT, and max settings?

    • @bra1nc417d
      @bra1nc417d 12 дней назад

      Thats what I wanna know. RAW power. Thats all that matters to me, as I dont like frame gen or DLSS or RT or any of that nonsense (MOST of the time). I just love a steady smooth high framerate with "normal" max settings.

    • @superior_nobody07
      @superior_nobody07 9 дней назад

      ​@@bra1nc417dthen you'll still be fine with this card, or really any of the 40 series and most of the 30 series cards. Most games aren't anywhere near as taxing as cyberpunk

    • @sroymoeun
      @sroymoeun 9 дней назад

      @@bra1nc417d What you want is airplane performance in a car engine.

  • @rexwaly4411
    @rexwaly4411 7 дней назад +1

    For me, raw performance means no ray tracing and no AI frame generation or DLSS stuff like that. That is raw performance and what I wanted to see. I don't think he showed it in the video.

  • @HirXeBomb
    @HirXeBomb 13 дней назад +17

    Thanks for the video. I for one will stick with my 40 series card, I can't be bothered to pay more for better fake fps. I hope one day the native side will be upgraded!!!!!

    • @Hottiechika010
      @Hottiechika010 13 дней назад

      fr

    • @cxngo8124
      @cxngo8124 13 дней назад +1

      And lossless scaling does the same job for you without making you bankrupt

    • @ronyomar100
      @ronyomar100 13 дней назад

      I have a 4090 card and I was going to sell it and buy a 5090, but after watching this video I will keep my card and will not buy the bullshit Nvidia is selling

    • @tucci06
      @tucci06 12 дней назад

      @@cxngo8124 It's not the same thing but it's a useful program.

    • @Wolfyyy
      @Wolfyyy 12 дней назад

      just get Lossless Scaling ...
      it looks better and has more options, and will always be updated

  • @James-qi2hy
    @James-qi2hy 12 дней назад +11

    I honestly don't understand all this drama over "fake frames". The entire point of a gaming GPU is to provide optimal gaming performance. That means graphical fidelity and smooth, high FPS. Why does it matter how it achieves that? Is my game experience going to be ruined if there's an artifact in the background every once and a while? Am I going to be panning around as I'm playing on the hunt for imperfections resulting from AI frame generation? No. When I was in the hospital, I didn't reject morphine because it would provide "fake comfort" to combat my pain, so why would I reject AI frames that resolve low FPS and stutter? What am I missing here? I know it has to be me because I trust that a million people aren't wrong and I'm the one who's right.

    • @behcetozer6356
      @behcetozer6356 12 дней назад +6

      most of these people are in an echo chamber. 99% of people think the way you do, including me. People are always pessimistic when it comes to new things, especially new tech/software. Just wait for the reviews to come out and youll see most of these people will backtrack on their statements about fake frames. I have an RTX 3080 and I'm eyeing the 5080. With frame gen it would be 3-4x better then my current card. For $1k that is not a bad deal at all.

    • @FLarezzedits
      @FLarezzedits 10 дней назад

      ​@@behcetozer6356what do you think about the vram on the 5080? My 3080 10gb is having issues at 4k and im worried the 16gb of vram on the 5080 is going to cause issues in 2-3 years again

    • @zaboono
      @zaboono 10 дней назад +5

      They're just parroting statements they've heard, and most of their examples make no sense or don't actually matter at all. Stuff like "why can it only run Cyberpunk at 30 fps at 4K native" just undermines the technological achievement of playing a real-time path-traced game at over 100 fps. People complaining about latency in competitive multiplayer games ignore that no FPS game is demanding enough to need MFG turned on in the first place, and everyone's suddenly silent about pricing on something like the 5070, because it's actually lower than what the 4070 launched at.
      VRAM and the 192-bit bus are the only real things to complain about on the 5070, but the importance of VRAM has also been grossly exaggerated. I still occasionally use a laptop with 6GB of VRAM and it's perfectly fine as long as you turn down a couple of settings that don't really affect how the game looks. I run a 7800 XT now and it definitely feels more comfortable, but I still don't think VRAM is nearly as big an issue as people make it out to be.

    • @Balila_balbal_loki
      @Balila_balbal_loki 9 дней назад +1

      We're reaching the limits of the technology. We could brute-force 300 fps without AI, but the GPU would have to be huge; the 40 series pretty much hit that limit, so instead of forcing more rendered frames we're using AI to make them. The main difference is visual bugs: blurring during fast rotation, quick combat, and with ray tracing. For example, in Hogwarts Legacy the AI frames would delay certain combat visuals while their sound played on time, so instead of sound and visuals being in sync you got the thunder before the lightning. With rendered frames you get both at the same time. 99% of games don't have issues with AI-generated frames, but in the ones that do it affects gameplay and forces the devs to address it; in the case of Hogwarts Legacy they patched the game so those visuals and the sound are synced. That's forcing devs to do Nvidia's job.

    • @moxopixel
      @moxopixel 8 дней назад

      Optimal? When the picture has artifacts? That is NOT optimal.

  • @jiteshshankhpal3034
    @jiteshshankhpal3034 12 дней назад +3

    Raw performance from 10:24 - 30 fps

  • @sunny113866
    @sunny113866 12 дней назад

    One of the best vids out there!

  • @berrybushbandit5527
    @berrybushbandit5527 12 дней назад +7

    If the new Reflex works well, I don't even care about the fake stuff anymore; just enjoy the features.

    • @PhilosophicalSock
      @PhilosophicalSock 12 дней назад

      They wouldn't release a feature that doesn't work well.
      Reflex 2 has been tested and approved by esports gamers in QA, btw.

    • @matx3563
      @matx3563 12 дней назад

      @@PhilosophicalSock Paid experts approve everything, even untested vaccines 😂

    • @DarkoP9.13
      @DarkoP9.13 10 дней назад

      Reflex 2 has an AI component to it... to "predict" the next frame.

  • @SquishyBoi5463
    @SquishyBoi5463 13 дней назад +11

    I can't wait for the side-by-side of the 9070 XT and 5070. If they're in the same price bracket I'll probably pick the one with the best raster performance within reason; if it's close, Nvidia's features will probably make the difference. This is going to be an incredible generation for mid-range competition and a horrible one for mid-range value.

    • @VeeHausen
      @VeeHausen 13 дней назад +1

      Me too.

    • @LeafyTheLeafBoy
      @LeafyTheLeafBoy 13 дней назад +4

      Nah, no matter what I'll still pick Nvidia, for the sole reasons of...
      Ray tracing is the future and it's here to stay no matter what.
      DLSS is superior, and you'd still end up using FSR even on the 9070 XT.
      The DLSS transformer model.
      Heck, I'd rather choose Intel over AMD because of AI too.

    • @MrEditorsSideKick
      @MrEditorsSideKick 13 дней назад +8

      @@LeafyTheLeafBoy FSR 4 looks very similar to DLSS tho, I hope AMD finally fixes their upscaler (only took them 4+ years😅).

    • @supta2110
      @supta2110 13 дней назад +2

      Amd 😅😂, better pick nvidia 😅

    • @deltagaming5005
      @deltagaming5005 13 дней назад

      This is exactly where my mind is right now. Totally agree with you. I do want a little bit of future proofing though so I may just shut my eyes and click "buy" on the 5080 because of the increased vram.

  • @petza94
    @petza94 12 дней назад +11

    turn off DLSS lol

    • @lneeshan
      @lneeshan 11 дней назад

      Fr the one thing we want to see

  • @Hy6KO
    @Hy6KO 7 дней назад +1

    The moment the fps drops to 30 is like hell.

  • @AB-td5oo
    @AB-td5oo 13 дней назад +8

    So the 5090 is just a 4090 with a 600W TDP and AI

    • @bwv582
      @bwv582 12 дней назад +1

      No

    • @AB-td5oo
      @AB-td5oo 12 дней назад +1

      @bwv582 yes

    • @bwv582
      @bwv582 12 дней назад +1

      @@AB-td5oo No.

    • @hydzifer
      @hydzifer 10 дней назад +1

      No

  • @Eusebiugh
    @Eusebiugh 13 дней назад +5

    DLSS Performance - does that render at 1080p, or less?

    • @3dninja54
      @3dninja54 12 дней назад

      For 4K, it's 1080p upscaled on DLSS Performance.
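
      For reference, that 1080p figure comes from the per-axis scale factor each DLSS mode applies to the output resolution. A minimal sketch, assuming the commonly cited factors (Quality ~0.667, Balanced ~0.58, Performance 0.5, Ultra Performance ~0.333) rather than anything read from an official Nvidia API:

          # Rough sketch of DLSS internal render resolution.
          # The per-axis scale factors are assumptions based on commonly
          # cited values, not taken from any official Nvidia API.
          DLSS_SCALE = {
              "Quality": 0.667,
              "Balanced": 0.58,
              "Performance": 0.5,
              "Ultra Performance": 0.333,
          }

          def render_resolution(out_w, out_h, mode):
              s = DLSS_SCALE[mode]
              return round(out_w * s), round(out_h * s)

          print(render_resolution(3840, 2160, "Performance"))  # (1920, 1080) for a 4K output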

  • @d4mterro320
    @d4mterro320 13 дней назад +7

    9:14 16 ms is the standard frame time for 60 fps. That's the latency console players have. So what kind of "good latency" did you mean???

  • @m.j.5681
    @m.j.5681 12 дней назад +3

    2:30 You can see all the artifacts on screen, and there are a lot of them! Looks like garbage! Fake frames are garbage!

  • @rtx2080ubermacht
    @rtx2080ubermacht 11 дней назад +1

    If we get double or triple the fps for a small increase in latency and some graphical distortions, that's a massive win.

  • @n1lknarf
    @n1lknarf 12 дней назад +10

    They have to update the fps counters to show CPU fps and GPU fps.
    The CPU still only processes the real frames; adding fake frames won't make it run faster, but it will look slower relative to the counter because each real frame is held on screen longer while the fake frames are displayed.
    This is obvious to anyone with a brain.
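
    A minimal sketch of the split that comment is asking for, assuming frame generation multiplies output frames by 2x (FG) up to 4x (MFG); the function names are illustrative, not any real counter API:

        # Minimal sketch: displayed fps vs. the rate the game/CPU actually
        # simulates when frame generation inserts AI frames.
        # The multiplier values (2x FG, up to 4x MFG) are assumptions here.

        def displayed_fps(rendered_fps, fg_multiplier):
            # (fg_multiplier - 1) generated frames are shown per rendered frame,
            # so the on-screen counter reports rendered_fps * fg_multiplier.
            return rendered_fps * fg_multiplier

        def simulated_fps(counter_fps, fg_multiplier):
            # Game logic and input sampling still advance at the rendered rate.
            return counter_fps / fg_multiplier

        print(displayed_fps(60, 4))   # counter shows 240 fps
        print(simulated_fps(240, 4))  # but only 60 of those per second are real, CPU-driven frames

    An fps counter that reported both numbers would make exactly the distinction the comment describes.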

    • @cqllel5186
      @cqllel5186 12 дней назад

      They should. I've been wanting to know what my real frames were

  • @techno-kitchen
    @techno-kitchen 12 дней назад +5

    What the RTX 5090 can do:
    28 FPS - path tracing, RT, ultra, 4K, no DLSS and no multi frame gen
    60 FPS - path tracing, RT, ultra, 4K, with DLSS 4.0, no multi frame gen
    115 FPS - path tracing, RT, ultra, 4K, with DLSS 4.0, multi frame gen x2
    170 FPS - path tracing, RT, ultra, 4K, with DLSS 4.0, multi frame gen x3
    Image quality doesn't drop! There's no input lag! This is a revolution!!
    Sell your 4000 and 3000 / 2000 series on Avito while they're still worth more than pennies!

    • @МихаилЛипатников-у3ю
      @МихаилЛипатников-у3ю 12 дней назад

      For now these fools are busy with their greed, but they'll drop their prices right after the first real tests.

    • @techno-kitchen
      @techno-kitchen 12 дней назад

      @@МихаилЛипатников-у3ю Yeah, sure :) The 5000 series is going to be expensive, and the 5090 will be in severe shortage. AMD already gave up, right at CES. They were so stunned they just went quiet, because it's better to stay silent than talk nonsense about what the new series can supposedly do. FSR 4 showed the blur is much reduced, but that's not enough :)

    • @dibaklp
      @dibaklp 9 дней назад

      No input lag?? Hahahahaha

    • @techno-kitchen
      @techno-kitchen 9 дней назад

      @@dibaklp Haha. Watch the video with Yandex translation. There's no lag at all in Cyberpunk.

  • @andrewdavies3584
    @andrewdavies3584 12 дней назад +12

    Fake frames

  • @Ladioz
    @Ladioz 12 дней назад

    Thanks for the sound test. This is the only part that matters for me

  • @allasnaveras
    @allasnaveras 11 дней назад +1

    Now we are waiting for non-corrupt bloggers

  • @TheFoolTohru
    @TheFoolTohru 11 дней назад

    Great work! This is how you showcase a new GPU in detail 🖥

  • @Dueno430
    @Dueno430 12 дней назад

    Thank you for this very informative video. It helped me to visually understand some things about AI, DLSS and ray tracing that I was struggling to figure out. I also just noticed something interesting. The game is being played on a Falcon NW PC. That might have something to do with how quiet the machine is. Back in the day, my first serious computer was a Falcon NW. Everything they say about Falcon NW is true. Superb build quality and incredible direct support. Unfortunately, it is also true that they are extremely expensive. I couldn't afford to keep up with the pace of the rapid changes in computers at their price point. So I learned to build my own, at least partially through my interactions with their customer service (I5 8600K, GTX 1080, 1TB M.2 Sata SSD). I have to admit it has held up amazingly well, mainly due to the incredible longevity of the 1080 card.
    However, it is getting to be time to get a new rig. This video was very helpful to me in my research. I have to say the first thing I do when I am researching a new setup is go to the Falcon NW website and see what they are putting in their machines.

  • @jackkrauser2361
    @jackkrauser2361 10 дней назад

    10:38 Gonna destroy the pc doing that 😅 Good vid 👍

  • @I_hunt_lolis
    @I_hunt_lolis 8 дней назад

    It's wild that the latest hardware is still being benchmarked on a game that is over four years old now.

  • @maherylala2153
    @maherylala2153 12 дней назад

    10:55 That NPC casually walking over that road pillar

  • @kaiakoken
    @kaiakoken 12 дней назад +2

    Remember that these are the first drivers. It will only get better :)

  • @maadv7237
    @maadv7237 12 дней назад +2

    Just for reference, the frame time for 60 fps is 16.67 ms, for 30 fps it's 33.33 ms, and for 24 fps it's 41.67 ms.
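
    Those values are just the frame-time formula 1000 / fps. A quick sketch (note these are raw frame times; the end-to-end input latency figures shown in the video are measured differently and normally come out higher):

        # Frame time in milliseconds for a given frame rate.
        def frame_time_ms(fps):
            return 1000.0 / fps

        for fps in (24, 30, 60, 120, 240):
            print(f"{fps:>3} fps -> {frame_time_ms(fps):.2f} ms")
        # 24 fps -> 41.67 ms, 30 fps -> 33.33 ms, 60 fps -> 16.67 ms, ...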

  • @setitfree78
    @setitfree78 11 дней назад +1

    Turn to the monitor
    Turn to the camera
    Turn to the monitor
    Turn to the camera

  • @pgr3290
    @pgr3290 9 дней назад +2

    37 milliseconds of latency isn't amazing. Not great, not terrible.

    • @bluelinewhitetails6205
      @bluelinewhitetails6205 9 дней назад

      It will be interesting to see how much Reflex 2 brings things down. I'm building a new PC and torn on a GPU. I feel the future is geared towards AI, and anyone in my shoes would be stupid not to buy a 5000 series; my only hang-up is that latency.

    • @sterlingarcher2366
      @sterlingarcher2366 8 дней назад

      @@bluelinewhitetails6205 It's only an issue if you need ray tracing.
      With it off you can game decently at 4K with a 4070 Ti Super, so a 5070 Ti or up would be good depending on your budget.
      Get a good surge-protection plug adapter. Never forget it.

  • @amaybhandari
    @amaybhandari 12 дней назад

    10:55 lol that NPC 😂

  • @BallisticSollution
    @BallisticSollution 12 дней назад

    Great vid, heaps of info and lots to think about. I've got a 4090 and have found that it does struggle sometimes at 4K. There's no doubt that with DLSS 4 and multi frame gen it will run very well in supported titles, but it's those unsupported titles I'm worried about. Obviously it's the path tracing that is the insane performance killer in CP2077, and if there's only ~30% performance uplift over the 4090 in rasterisation, I'm not sure I can justify the expense of the 5090. I'll just wait until the 24th for more info, I think, but hopefully with DLSS 4 the 4090 will continue to perform well enough to hold out another generation.

  • @PenkoAngelov
    @PenkoAngelov 12 дней назад +2

    Can't get over the massive, nauseating motion blur... there to hide the AI artifacts.
    Doubling the framerate while keeping the same latency is essentially double the processing time per input. Also, for games that do not support DLSS 4, this is completely useless.

  • @mopnem
    @mopnem 9 дней назад

    THIS is literally the best video on the 5090 so far. Thank goodness for a vid that shows what we ACTUALLY want to see. Def a like & share

  • @Kasun6534
    @Kasun6534 12 дней назад +2

    I think Cyberpunk 2077 is designed to run at low fps and high latency even on the newest GPU. People with money then build the best PC with the latest, best GPU to get the best performance, and those companies make a lot of money. I think the companies that make games and the companies that make PC parts (GPU, CPU...) have some kind of deal going. For example, some people with money give up their old iPhone and buy the latest one, even though there's no big difference between last year's iPhone and the new one. PC Centric, if you can, please test other games in 4K besides Cyberpunk and show us; that would be a big help. We also really need a real-world test to see whether Cyberpunk really is the most graphically impressive game or whether another game is. Even though Cyberpunk shows low fps and high latency in this test (DLSS off, frame gen off), we can't conclude from that alone that it has the best graphics.

  • @Tkohr78
    @Tkohr78 11 дней назад

    The cooler on the 5090 is the most impressive thing: a two-slot cooler, and under heavy load at over 500 watts it's quiet enough that you could whisper in the room.