Will RTX 5090 Be Too Fast For Any Current CPU?

  • Published: 1 Oct 2024
  • ► Watch the FULL Video: • DF Direct Weekly #159:...
    ► Support us on Patreon! bit.ly/3jEGjvx
    ► Digital Foundry YouTube: / digitalfoundry
    ► Digital Foundry Merch: store.digitalf...
    ► Digital Foundry at Eurogamer: eurogamer.net/...
    ► Follow on X/Twitter: / digitalfoundry

Comments • 739

  • @UnimportantAcc
    @UnimportantAcc 5 месяцев назад +346

    Just replace the CPU with a GPU silly 🥰🥰

    • @tsorakin
      @tsorakin 5 месяцев назад +15

      Like duh 😂

    • @vitordelima
      @vitordelima 5 месяцев назад +25

      This is kind of possible but it takes too long to explain. IBM Cell was one failed attempt at it.

    • @jmssun
      @jmssun 5 месяцев назад +23

      The more you buy, the more you save ~

    • @daniil3815
      @daniil3815 5 месяцев назад +11

      exactly. you have money for 5090, but not for CPU upgrade lol

    • @photonboy999
      @photonboy999 5 месяцев назад +3

      @@daniil3815
      I think you missed the joke.
      Anyway, I remember Intel trying to go the other way with Larrabee, using stripped-down x86 cores to create a GPU. It was interesting, but predictably very inefficient for GPU tasks. So I'm curious WHY they put the money into it. I'm not going to just assume there was no purpose whatsoever based on my limited computer experience.

  • @markusmitchell8585
    @markusmitchell8585 5 месяцев назад +77

    Diminishing returns at this point. Games are unoptimized; that's the only reason these ridiculously stupid specs are needed.

    • @saliva1305
      @saliva1305 5 месяцев назад +5

      so true

    • @justhomas83
      @justhomas83 5 месяцев назад +4

      Correct. I have an RTX 4080 and I am not upgrading for another 3 years.
      It's getting pointless

    • @saliva1305
      @saliva1305 5 месяцев назад +1

      @@justhomas83 I plan to get a 50 series since I'm on a 3070 Ti, but maybe AMD is an option too... sad we have to buy top-tier GPUs to play games that should run on a 3080 with no frame gen

    • @justhomas83
      @justhomas83 5 месяцев назад +3

      @@saliva1305 I feel you I just can't do it anymore. I have more bills coming in now and my daughter is graduating college.
      I understand though, you're right, the top tier is the only path now. We are chasing rabbits in Alice in Wonderland at this point 😔😔😔 damnit

    • @Koozwad
      @Koozwad 5 месяцев назад +3

      yes exactly and the fact that RT/PT exists - it's there to make people $pend, $pend and $pend some more
      amazing graphics are possible using non-RT/PT - just look at RDR2 from what 2018(?)
      I've been saying for years now that devs should be using RT/PT as a TOOL to see how scenes should be lit and then recreate them with NON-RT means which will give the players a ton of performance and be much friendlier to their wallets
      plus hand-crafting it can look nicer

  • @wickfut8917
    @wickfut8917 5 месяцев назад +179

    VR needs more power. Always. My 4090 isn't good enough for high-resolution headsets in graphically intense games. The new headsets on the horizon with 3800x3800 resolution per eye will easily chew through the power of the next few generations of GPUs.

    • @clockworklegionaire2135
      @clockworklegionaire2135 5 месяцев назад +14

      Real

    • @mattzun6779
      @mattzun6779 5 месяцев назад +11

      Who in their right mind would develop a game that NEEDS something faster than a 4090 to be good?
      If VR needs that much power, VR games need to go for several thousand dollars each to make a profit with current tech.
      One can hope that next-gen consoles and Nvidia 6000 series mid-range cards get to that level.
      Hopefully, there are tricks like frame generation and rendering higher resolution only where you are looking (foveated rendering) that help.

    • @rahulahl
      @rahulahl 5 месяцев назад +31

      @@mattzun6779 Not official games. But the UEVR mod allows you to play non-VR UE games in VR mode. Imagine running the latest UE5 games at about 6K resolution, aiming for a stable 90 FPS. My 3080 couldn't even run simple games like Talos Principle 2 at a playable quality or frame rate. Best I got was a blurry mess equivalent to 720p at low settings at about 80ish FPS. This is why I am waiting for the 5080/90 so I can finally play those UE games in VR.

    • @xpodx
      @xpodx 5 месяцев назад +8

      Yeah, and tons of games can do higher render scaling, and the 4090 is not strong enough for 4K max with an 8K render scale at 144Hz+.

    • @nossy232323
      @nossy232323 5 месяцев назад

      @@mattzun6779 I personally would hope games will be scalable enough to use all the power from the lower end up to the ultra high end.

  • @leo_stanek
    @leo_stanek 5 месяцев назад +85

    What a world we live in where we are concerned that our GPUs have gotten so good we bottleneck the CPU at 4K high refresh rate. I remember when getting 1080p60 was the dream for high end hardware.

    • @stephenmeinhold5452
      @stephenmeinhold5452 5 месяцев назад +6

      It still is for me on a 2080 Ti, although with DLSS I can go up to 1440p.

    • @Veganarchy-Zetetic
      @Veganarchy-Zetetic 5 месяцев назад +8

      @@stephenmeinhold5452 Yeah, Alan Wake 2 can barely run at 1080p on my 4090 with ray tracing on lol.

    • @UTFapollomarine7409
      @UTFapollomarine7409 4 месяца назад

      my 3900x lacks in 4k in some areas believe it or not

    • @LordKosmux
      @LordKosmux 4 месяца назад +3

      ​@@Veganarchy-Zetetic You know that this is due to developer's laziness to optimize the game, right?

    • @Veganarchy-Zetetic
      @Veganarchy-Zetetic 4 месяца назад +2

      @@LordKosmux I would say it has a lot to do with Ray Tracing.

  • @professorJorge11
    @professorJorge11 5 месяцев назад +182

    I need a 5090 for My 1080p monitor, like a fish needs a bicycle

    • @darkdraconis
      @darkdraconis 5 месяцев назад +11

      Hey hey hey hey now!
      What kind of "fish-racist" are you?
      Does a fish not have the right to evolve into a bike riding creature?
      Incredible you anti fishists!

    • @professorJorge11
      @professorJorge11 5 месяцев назад +1

      @@darkdraconis it's a song bruh

    • @darkdraconis
      @darkdraconis 5 месяцев назад +4

      @@professorJorge11 it's a joke brah

    • @CeceliPS3
      @CeceliPS3 5 месяцев назад +4

      Let me teach you, professor. There's this thing called DLDSR. You can render games at 1440p or 1620p, maintain high-af FPS and still get much better, crisper graphics on your 1080p screen. Sure, a 4090 user (or even a 5090 user) could do with a 1440p monitor, but your analogy is entirely wrong in this case, as there is a use for those GPUs with a 1080p monitor.
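
      For anyone curious about the numbers behind that: DLDSR's 1.78x and 2.25x factors multiply the pixel count of the native display, which is how a 1080p panel ends up rendering at roughly 1440p or 1620p before being downsampled. A minimal sketch of that arithmetic in Python (the factor values are Nvidia's published DLDSR options; the rest is just math):

      # DLDSR scale factors multiply the *pixel count* of the native display,
      # so each axis grows by sqrt(factor). For 1920x1080 that works out to
      # roughly 2560x1440 (1.78x) and exactly 2880x1620 (2.25x).
      import math

      NATIVE = (1920, 1080)
      DLDSR_FACTORS = [1.78, 2.25]

      def dldsr_resolution(native, factor):
          w, h = native
          s = math.sqrt(factor)
          return round(w * s), round(h * s)

      for f in DLDSR_FACTORS:
          w, h = dldsr_resolution(NATIVE, f)
          print(f"{f}x DLDSR on {NATIVE[0]}x{NATIVE[1]} -> ~{w}x{h}")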

    • @professorJorge11
      @professorJorge11 5 месяцев назад

      @@CeceliPS3 I have a Radeon 7600. There's no DLSS, it's FSR2

  • @byronfranek2706
    @byronfranek2706 5 месяцев назад +156

    32" 4K/240hz OLED displays would be an obvious target for the 5090.

    • @xpodx
      @xpodx 5 месяцев назад +18

      8k 120hz/144hz

    • @GatsuRage
      @GatsuRage 5 месяцев назад +21

      even a 5090 wouldn't be able to push 4k 240fps... unless u only playing cs and LoL lmao so I seriously see no point in looking at those displays yet. 1440p still makes way more sense for high refresh rates.

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 5 месяцев назад +1

      That's all well and good, but I've just gotten a 4k 240hz monitor, I have a 4090 and 5900X, and even games from 2015 are CPU bottlenecked, 240hz should never be a target. Making 8k gaming doable seems to be what it'll be targeting, we don't need better GPUs for 4k right now.

    •  5 месяцев назад +4

      @@GatsuRage Well the more FPS the better. 4K looks way better than 1440P, and OLED is better than any display in the market.

    • @xpodx
      @xpodx 5 месяцев назад +1

      @GatsuRage I get 180-220 in CoD Vanguard with my 4090 at 4K max, no DLSS. But yeah, the 5090 won't be able to do Cyberpunk 2077 maxed at 4K 240, or other similar games. But easier games for sure.

  • @heyguyslolGAMING
    @heyguyslolGAMING 5 месяцев назад +133

    I'll only consider the 5090 if it puts my house at risk of burning down. If it can't do that then its not powerful enough.

    • @ThePlainswalker13
      @ThePlainswalker13 5 месяцев назад +14

      Nvidia Power Plug Engineer: "Hold my half-caf soy milk grande caramel macchiato."

    • @EdNarculus
      @EdNarculus 5 месяцев назад +12

      I'm an overheating enthusiast myself and would like to see products that carry risk of spontaneous human combustion.

    • @murray821
      @murray821 5 месяцев назад

      Easy, just put steel wool on it while playing

    • @chillnspace777
      @chillnspace777 5 месяцев назад

      Just get a 14900KS, then you're good to go

    • @dieglhix
      @dieglhix 5 месяцев назад

      It will be more efficient than a 4090, which can be power-capped to 70% while still running at 98% performance... meaning a 5090 will be able to run at its full potential at under 300W. Stock power is too much power already.
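
      If anyone wants to try that kind of cap themselves, it can be done from the command line with nvidia-smi. Below is a rough, hedged sketch in Python (it assumes nvidia-smi is on the PATH and that setting the limit is run with admin/root rights; the 70%-of-stock figure is the commenter's claim, not a guaranteed sweet spot):

      # Query the card's default power limit and apply a reduced cap via nvidia-smi.
      # Setting a power limit needs elevated privileges; nvidia-smi clamps the value
      # to the range the card actually allows.
      import subprocess

      FRACTION = 0.70  # the cap suggested in the comment above

      def default_power_limit_watts(gpu_index=0):
          out = subprocess.run(
              ["nvidia-smi", "-i", str(gpu_index),
               "--query-gpu=power.default_limit", "--format=csv,noheader,nounits"],
              capture_output=True, text=True, check=True)
          return float(out.stdout.strip())

      def set_power_limit(watts, gpu_index=0):
          subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-pl", str(int(watts))],
                         check=True)

      if __name__ == "__main__":
          stock = default_power_limit_watts()
          set_power_limit(stock * FRACTION)
          print(f"Capped GPU 0 at {stock * FRACTION:.0f} W (stock {stock:.0f} W)")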

  • @mchits9297
    @mchits9297 5 месяцев назад +31

    [It's 2030. I go to buy an RTX 9090 with all of my savings.]
    Me: Hey, do you have the latest GPU, maybe the RTX 9090?
    Dealer: (goes inside and comes back with an 8-foot server rack) Here's your GPU, sir. Just $100k.

    • @akam9919
      @akam9919 5 месяцев назад

      rah. it'll be a tiny quantum board... but you need to buy a giant freezer-sized cooler... and not like the fridge you have at your house... like a giant walk-in freezer. You will also have to pay $500K to turn it on, wait for 2 days to get the thing cold enough, and then spend $2.3M to run Crysis, $2.345 for Doom, and $76B for Fortnite... no, the game will not look more realistic.

    • @mchits9297
      @mchits9297 5 месяцев назад

      @@akam9919 I might create physical sets for all those games with that kinda money 🤑💰

    • @mchits9297
      @mchits9297 5 месяцев назад

      @@akam9919
      Virtual reality ❌
      Reality ✔️

    • @GamingXPOfficial
      @GamingXPOfficial 4 месяца назад +1

      This was somehow very hard to read/understand, but I got it at the end of the day.

    • @LordKosmux
      @LordKosmux 4 месяца назад +2

      What if they get smaller instead? A GPU the size of your smartphone. And the price of a house.

  • @EmblemParade
    @EmblemParade 5 месяцев назад +110

    As a 4K/120 gamer I can promise you that we're still GPU limited with the 4090. I often have to compromise on AAA games by enabling DLSS 3 or lower settings, and sometimes just hit 60 FPS. At the same time, I do think upscaling is changing our requirements and expectations, so I hope the silicon can be optimized around that. We don't necessarily need more pixel shader performance if we assume upscaling. The die space is better spent on other features.

    • @StarkR3ality
      @StarkR3ality 5 месяцев назад +4

      I can think of one title where you would be fully GPU limited on that card: Cyberpunk 2077's path tracing mode... and that game is an outlier and also extremely CPU heavy. I'm CPU bottlenecked on that title in certain areas on a 4070S, so I'd get your card looked at because something is wrong there.

    • @lorsch.
      @lorsch. 5 месяцев назад +10

      And to max out high end VR headsets these days a 6090 is probably not enough...

    • @ghostofreality1222
      @ghostofreality1222 5 месяцев назад +4

      @EmblemParade - What CPU and RAM are you running? CPU and RAM have a lot to do with it as well. - but I also agree with @starkr3ality - a 4090 at 4K 120 should be running fine in all AAA games, with a couple of exceptions like Cyberpunk or Microsoft Flight Sim - a 4090 should be maxing out all AAA games at 4K 120Hz. You have got to be hitting a CPU bottleneck, and that is why you're having to lower graphics settings to get your desired results. Again, this is mostly assumption at this point as I have no idea what CPU or RAM you're running, but this is what makes sense in my mind with what you stated in your post.

    • @EmblemParade
      @EmblemParade 5 месяцев назад

      @@ghostofreality1222 Look at benchmarks from HardwareUnboxed, GamersNexus, and others, and see that you are very far from the mark. I have a 5800X3D and high-end DDR4 RAM. I'm not saying that I'm not having a great time with this setup, but forget about ultra settings AAA at 4K without the help of DLSS. The 4090 is great, but 4K is a lot of pixels.

    • @blackcaesar8387
      @blackcaesar8387 5 месяцев назад +6

      @@StarkR3ality Cyberpunk is no longer an outlier... Alan Wake 2 made sure of that. I am guessing Hellblade 2 further confirms that.

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk 5 месяцев назад +19

    There is always a bottleneck in a pc depending on the resolution and game settings. Doesn't matter if it's a $400 pc or a $4000 pc.

    • @ZackSNetwork
      @ZackSNetwork 5 месяцев назад +1

      Not exactly; if your hardware is paired together well, the bottleneck can be the software rather than the hardware.

    • @EspHack
      @EspHack 5 месяцев назад +1

      thats why I aim for a monitor bottleneck

    • @ProjectMore69
      @ProjectMore69 5 месяцев назад +3

      Such a dumb argument, there is a difference between something BARELY hindering the performance of another part vs a HUGE hindrance.

    • @aberkae
      @aberkae Месяц назад

      Even Microsoft was a bottleneck before the 8/2024 update.

  • @tomthomas3499
    @tomthomas3499 5 месяцев назад +50

    50%, 70%, or even double the power of the 4090 - as long as it's not melting its connector, it's fine by me

    • @ZackSNetwork
      @ZackSNetwork 5 месяцев назад +6

      I expect 60% faster rasterization performance and 2.5x better ray tracing, while pushing the same power output.

    • @ianism3
      @ianism3 5 месяцев назад +6

      @@ZackSNetwork exactly. efficiency is really underrated

    • @nicane-9966
      @nicane-9966 5 месяцев назад +1

      Gonna pack 2 of those filthy-ass connectors man lol

    • @nicane-9966
      @nicane-9966 5 месяцев назад +1

      @@ianism3 It's not; it's just that those very high-tier cards are made to draw the maximum amount of power possible. For efficiency you have the 80-class and below.

    • @aberkae
      @aberkae 5 месяцев назад

      ​@@ianism3 4N node how much efficency can they squeeze out of a similar node🤔.

  • @gnoclaude7945
    @gnoclaude7945 5 месяцев назад +7

    I'll ride my 4070 Super until next-gen consoles drop. That's when the jump to AI NPUs for gaming and other new features will push me to upgrade. Honestly it's overkill for 1440p in the games I play at the moment.

    • @TheMichaelhi
      @TheMichaelhi 3 месяца назад

      I got a 4070 Super also and it's great for 1440p OLED gaming. But I'm going to upgrade to a 5090, pass the 4070 Super to the wife and get rid of her 4060 Ti rig. The 4070 Super should last a while with no problem, at least 3 more years before you start seeing it struggle a bit.

  • @hoverbike
    @hoverbike 5 месяцев назад +27

    The 4090 is still hugely GPU limited in games like MSFS 2020 in VR, and I expect 2024 to be utterly devastating in VR - and I'm all for it. We get closer and closer to Star Trek computer simulations every year.

    • @2drealms196
      @2drealms196 5 месяцев назад +2

      Visually, yeah, a single flagship video card is getting closer and closer when it comes to rasterization. But the holodeck's virtual NPCs have GPT-5 or higher level AI, the physics simulations are orders of magnitude more realistic and computationally demanding, true path tracing without the need for any denoising, ultra realistic animations. Latency is also an issue with these LLM responses. So you'd need an entire futuristic datacenter's worth of power with futuristic networking that provides exponentially quicker responses, so even a single flagship video card from 2045 wouldn't be enough.

    • @numlock9653
      @numlock9653 5 месяцев назад +3

      Actually the CPU is definitely the bottleneck when using a 4090 in MSFS VR, in my experience. I have a 7800X3D and a 4090 with a Vive Pro 2 on the highest settings, and no matter what graphical setting I change it makes little difference to the frame rate, but if I lower traffic I get a huge boost, which is all CPU-based calculation. Very poor CPU optimization unfortunately.

    • @hoverbike
      @hoverbike 5 месяцев назад

      @@numlock9653 The Vive Pro 2's res is very low

    • @hoverbike
      @hoverbike 5 месяцев назад +1

      ​@@numlock9653huh, well you must've either clogged up that CPU with poor bios settings, or your Vive 2 Pro just has subpar resolution.

    • @Myosos
      @Myosos 5 месяцев назад

      ​@@numlock9653 get a better VR headset

  • @yves1926
    @yves1926 5 месяцев назад +9

    No, games are just badly optimized

    • @glenmcl
      @glenmcl 5 месяцев назад

      Exactly

  • @Torso6131
    @Torso6131 5 месяцев назад +57

    I mean, 4k120 has to be on the table for most games. Even 4k90, throw on some DLAA, call it good. Aside from totally broken PC ports I feel like we're still GPU limited 99% of the time, especially if you have something like a 7800x3D.

    • @StarkR3ality
      @StarkR3ality 5 месяцев назад +4

      Depends on the res, like you've said, but in a lot of the games Alex mentioned and others, I'm bottlenecked on a 5800X3D with a 4070 Super at 1440p, which is crazy, right?
      For me, the only use case for 90-class and even 80-class GPUs is if you're going to be using 4K and ray tracing in every title available. You're getting GPUs like the 4090 that are doubling performance gen on gen, and then you get, what, a 20% increase in perf going from a 5800X3D to the 7800X3D?
      CPUs cannot keep up and it's really starting to show.

    • @WhoIsLost
      @WhoIsLost 5 месяцев назад

      @@StarkR3ality Something must be wrong with your PC if you're getting bottlenecked with that hardware. The 5800X3D is only 12% slower than the 7800X3D at 1440p

    • @StarkR3ality
      @StarkR3ality 5 месяцев назад

      @@WhoIsLost Defo not, pal, I can assure you. I still get good performance, and I'm not talking about every title, only the recent big ones: Baldur's Gate 3, Cyberpunk, Witcher 3 next-gen.
      My point is, if I'm CPU bottlenecked at times in some titles, what's a 4090 gonna be, which I think offers 2.5x the performance? Not to even mention the 5090.

    • @JBrinx18
      @JBrinx18 5 месяцев назад

      ​@@StarkR3ality4090 is only ~60% stronger than a 4070 super. But yes, CPU performance is an issue... I think 4K will be a problem, and there's just not a market for 8K... The only avenue that's available might be VR

    • @Plasmacat91
      @Plasmacat91 5 месяцев назад +2

      @@WhoIsLost Negative. I have a 5800x3D and 6900XT and am CPU limited most of the time at 1440p.

  • @LeoDavidson
    @LeoDavidson 5 месяцев назад +19

    Until it does triple 4K at 240Hz without DLSS or frame generation, there's always room for more. :) Whether there's enough of a market for that outside of the sim-racing and flight-sim niches, I don't know, but I'd buy one.

    • @clockworklegionaire2135
      @clockworklegionaire2135 5 месяцев назад +3

      That's not happening ever, with new features like RT and PT constantly coming out

    • @fcukugimmeausername
      @fcukugimmeausername 5 месяцев назад

      Battlemage will do this.

    • @clockworklegionaire2135
      @clockworklegionaire2135 5 месяцев назад +6

      @@fcukugimmeausername You are clearly out of your mind

    • @xpodx
      @xpodx 5 месяцев назад

      8k 144hz will come out soon definitely need more power. Never enough.

    • @Sota_eth
      @Sota_eth 5 месяцев назад +1

      64k even

  • @Fiwek23452
    @Fiwek23452 5 месяцев назад +2

    The current 4080 and 4090 are already bottlenecked by current CPUs; hyper-threading and E-cores are total BS

  • @eugkra33
    @eugkra33 5 месяцев назад +20

    Alex said especially with RT it'll be CPU bound, because of the BVH workload. But what the next generation could offer is moving the BVH maintenance to the GPU alleviating a whole bunch of CPU work.

    • @Hi-levels
      @Hi-levels 5 месяцев назад +1

      NPUs will also come to the rescue, either on GPUs or CPUs

    • @yesyes-om1po
      @yesyes-om1po 5 месяцев назад

      @@Hi-levels I don't think that has anything to do with RT's BVH workload, the only thing an NPU could do better is AI denoising, but nvidia already has dedicated hardware to do that on RT cards, and I'm pretty sure an NPU would introduce too much latency for realtime denoising as a separate piece of hardware.

    • @BaieDesBaies
      @BaieDesBaies 5 месяцев назад

      RT is so GPU intensive that I don't see how it could CPU limit games.
      If I activate RT in games, CPU load tends to lower because GPU is struggling.
      I have i5 and 3080

  • @garethperks7032
    @garethperks7032 5 месяцев назад +3

    Thankfully AMD have opened a door for us with X3D. CPUs finally start becoming useful for gaming with ultra high speed memory (e.g on-die cache).

  • @powerpower-rg7bk
    @powerpower-rg7bk 5 месяцев назад +3

    The thing I'd be hoping for on a RTX 5090 would be two 8 pin power connectors and a return to more sane power consumption. More/faster VRAM would be nice too as I feel that that has been the RTX 4090's bottleneck, especially at 4K.
    Other grab bag of features would be the return of nvLink/SLI support to scale up via multi-GPU and integrating some Thunderbolt 5 controllers. It'd be nice to be able to plug a USB-C monitor directly into the GPU without external cabling and get full USB support on the display and other peripherals connected to it. Similarly with Thunderbolt 5, it'd be clever to include a mode where you could use the GPU externally for a laptop without the need for a Thunderbolt bridge board in an external chassis. Literally just the card, power supply and a power switch to turn it on. The PCIe slot connector would go unused.
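
    For context on what the two-8-pin wish above implies: the PCIe spec rates each 8-pin connector at 150 W and the x16 slot at 75 W, so such a card would top out around 375 W of spec-compliant board power. A trivial sketch of that budget (spec ratings only; real cards can draw more via the 12VHPWR/12V-2x6 connector):

    # Back-of-the-envelope board power budget for a card fed by classic PCIe connectors:
    # each 8-pin connector is rated for 150 W and the x16 slot supplies up to 75 W.
    EIGHT_PIN_W = 150
    SLOT_W = 75

    def board_power_budget(num_8pin: int) -> int:
        return num_8pin * EIGHT_PIN_W + SLOT_W

    print(board_power_budget(2))  # 375 W -- roughly where a two-8-pin flagship would have to land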

  • @lil----lil
    @lil----lil 5 месяцев назад +3

    You know what's gonna happen? Nvidia will be like, so you guys can't keep up with us, huh? We're gonna design our own CPU! That's _exactly_ what happened to Intel. Apple was like, get your $hit together or we're going our own way. Now we have M1/2/3 and soon 4. Watch Intel/AMD stocks TUMBLE when Nvidia announces their own CPU!!! AMD/Intel better get their $hit together.

  • @PokeTheMostest
    @PokeTheMostest 3 месяца назад +2

    If the 5090 is bottlenecked by current-market CPUs like the Ryzen 9 7950X3D or the i7 14700K / i9 14900K, which CPU would bottleneck the 5090 the least?

    • @andyakarudolfhessiansack7936
      @andyakarudolfhessiansack7936 2 месяца назад +2

      9800x3d

    • @aberkae
      @aberkae Месяц назад

      ​@andyakarudolfhessiansack7936
      True. GPUs get 50 to 70% delta gains, CPUs get 5 to 10% with each succession 🤪.

  • @saesang352
    @saesang352 5 месяцев назад +2

    Honestly just stop it with this CPU nonsense. Unless you are competitive gaming at 120-240hz, a 3700X will be fine for 60fps for another 4-5 years.

    • @chy.0190
      @chy.0190 5 месяцев назад +1

      It's not nonsense; saying a 3700X would be fine for another 5 years is crazy when it already bottlenecks the latest high-end GPUs, even at 4K.

    • @XAMPOL
      @XAMPOL 4 месяца назад

      Not everyone plays on 1080p/1440p.... and only 60 FPS??? you're kidding me!

  • @dystopia-usa
    @dystopia-usa 5 месяцев назад +3

    Once you hit certain quality-of-experience performance thresholds in gaming, it doesn't matter and becomes overkill for the sake of giggles. It only matters to professional benchmarkers & internet braggarts.

  • @ijustsawthat
    @ijustsawthat 5 месяцев назад +9

    Not if it burns your whole house down.
    Can't they design a better connector instead?

    • @ProjectMore69
      @ProjectMore69 5 месяцев назад +1

      Can you have more strength than a 12yr old virgin and properly seat the connector?

    • @ijustsawthat
      @ijustsawthat 5 месяцев назад

      @@ProjectMore69 if you think it's force related, you need to watch/read more on the topic. GamerNexus did a full investigation on these connectors, and clearly demonstrated how they are flawed by design.

    • @rambo9199
      @rambo9199 5 месяцев назад +5

      @@ProjectMore69 Has this ever been a problem in the past for you? Are you speaking from experience?

    • @ProjectMore69
      @ProjectMore69 5 месяцев назад

      @@rambo9199 deflect more virgin boy.. wanna see a video of my 4090 ive had since launch working perfectly fine? I bet you dont 😂

    • @Oliver-sn4be
      @Oliver-sn4be 4 месяца назад

      ​​@@ProjectMore69what if you move your pc and it comes Auth a bit hmm ? It is not like you open the case always to see ther is always a cance for it and most of all even if u put it all the way in It can still burn 🔥 ther is never a 100 that it won't 😂

  • @Boss_Fight_Index_muki
    @Boss_Fight_Index_muki 5 месяцев назад +49

    Considering the 5090 will be $1700 at least, it needs to be twice as fast as the 4090 to be worth it.

    • @SuperSavageSpirit
      @SuperSavageSpirit 5 месяцев назад +7

      Rumor is it's 70%.

    • @Chuck15
      @Chuck15 5 месяцев назад +15

      twice? 🤣🤣🤣🤣

    • @squirrelsinjacket1804
      @squirrelsinjacket1804 5 месяцев назад +9

      70% raw performance improvement, along with a better version of frame gen to boost frame rates even more than the 40 series version would be worth it.

    • @daniil3815
      @daniil3815 5 месяцев назад +4

      that's a weird argument.

    • @dpptd30
      @dpptd30 5 месяцев назад +9

      I don't think so, it likely will be above $2000 due to them using the same node as Ada and Hopper, their data center B100 is already twice as large as the H100 in order to have a decent performance uplift on the same node, so twice as large on the same node should mean twice as expensive, especially when they still aren't using chiplets yet.

  • @Goblue734
    @Goblue734 5 месяцев назад +28

    I have an RTX 4090; the only way you would see me with a 5090 is if we can get 4K visuals with RT and at least 60 FPS of rasterized performance, no frame gen or DLSS.

    • @xSabinx1
      @xSabinx1 5 месяцев назад +11

      You're the 0.1% who upgrade top end GPUs each generation

    • @ZackSNetwork
      @ZackSNetwork 5 месяцев назад +5

      What games do you play? I can do that already with my 4090.

    • @JeremyFriebel
      @JeremyFriebel 5 месяцев назад

      ​@@ZackSNetworksame but 4080

    • @kevinerbs2778
      @kevinerbs2778 5 месяцев назад +3

      Never going to happen in the next 3 years. RT takes about 100x more computational power than rasterization does, and most RT relies on rasterization as a base. Expect Blackwell to only be 20%-30% faster than an RTX 4090 at most. Unless Blackwell comes out with a massive 224 ROPs or more, it isn't going to be that fast.

    • @aberkae
      @aberkae 5 месяцев назад

      DLSS set to DLAA is where it's at though, imo. Better than TAA at native resolution.

  • @chrisguillenart
    @chrisguillenart 5 месяцев назад +5

    Ampere didn't have price cuts of any kind, it maintained the price hikes of Turing.

  • @jaffaman99
    @jaffaman99 5 месяцев назад +2

    Enough now of this, it’s becoming click bait

  • @Hi-levels
    @Hi-levels 5 месяцев назад +1

    No but the devs are too dumb to utilise the power of 5090. Even with core i9 19900Ks turbo 7Ghz cpu dragon shitma 2 like games will run very poorly. And even ps7 Pro with 6090ti performance will run games 30fps per your yapping

  • @emal2170
    @emal2170 4 месяца назад +1

    Computer tech has always moved like a teeter-totter: video cards pushing CPU advancement, which in turn pushes video cards. The same with consoles pushing video cards. DirectX can be a huge factor driving the market, but Microsoft just seems more interested in controlling your life and switching to subscriptions.

  • @Monsux
    @Monsux 5 месяцев назад +18

    I will just use DLDSR + DLAA on a 4k 120 Hz TV/monitor. CPU won't be the limiting factor, and I'm always getting graphical upgrade. Add path tracing with maxed out settings and the GPU (even RTX 5090 TI Super) would scream for help. I just love DLDSR and how versatile it is for all type of games… Doesn't matter if I'm playing new or older titles.

  • @ImakeTanks
    @ImakeTanks 19 дней назад +1

    all new games suck so who cares about running a fast computer, 1080p is fine when all new games suck anyway

  • @AdiiS
    @AdiiS 5 месяцев назад +6

    2000 series NEW tech and NEW features
    3000 series nothing changed besides getting a little bit faster, no new features
    4000 series NEW tech and NEW features
    5000 series will be 3000 series all over again
    6000 series NEW tech and NEW features

  • @blast_processing6577
    @blast_processing6577 4 месяца назад +1

    At this point, graphics cards are a side-hustle for Nvidia's primary business: AI infrastructure.

  • @jaredangell8472
    @jaredangell8472 5 месяцев назад +2

    The 9800x3d will be able to handle it. Nothing else will though.

    • @andyakarudolfhessiansack7936
      @andyakarudolfhessiansack7936 2 месяца назад

      I'm not too sure. It's rumoured to be only a 15% IPC uplift from the 7800X3D. For flight sims in particular, it's still gonna be CPU bound.

    • @jaredangell8472
      @jaredangell8472 2 месяца назад

      @@andyakarudolfhessiansack7936 It's overclockable and it has 16 threads that will be 15% faster

  • @Bluth53
    @Bluth53 5 месяцев назад +2

    Stop ignoring people without monitors that use HMDs instead. These elitist 2D pancake POVs annoy me

  • @chrissoucy1997
    @chrissoucy1997 5 месяцев назад +22

    GPUs are getting faster at a pace that CPUs can't quite keep up with. I have an RTX 3090 paired with a Ryzen 7 5800X3D, and in games with heavy ray tracing like Spider-Man Remastered, my 3090 is CPU limited even at 1440p and to some degree at 4K. I am itching for a CPU upgrade right now way more than a GPU upgrade; my 3090 is still fine. I am looking to upgrade to a 9800X3D when it comes out.

    • @vitordelima
      @vitordelima 5 месяцев назад +9

      Stupid methods of rendering that move too much data around, need to rebuild complex data structures all the time, ...

    • @StreetPreacherr
      @StreetPreacherr 5 месяцев назад +6

      It sounds like the game engines aren't designed 'properly'... Can't the (RTX) GPU handle most of the processing necessary for high quality ray tracing? I didn't realize that even with RTX that Ray Tracing was most often restricted by CPU performance! Isn't the GPU supposed to be doing all the additional Ray Tracing processing?

    • @vitordelima
      @vitordelima 5 месяцев назад +4

      @@StreetPreacherr There are a lot of intermediate steps that need a lot of CPU; in the case of ray tracing it seems the spatial subdivision is one of them, for example. Uncompressed assets, assets with excessive detail, poor code parallelism... are other examples of causes of bottlenecks in general.

    • @ZackSNetwork
      @ZackSNetwork 5 месяцев назад

      That’s because the Ampere GPU’s handled data weird. The 4070 Super is way faster than a 3090 in 1080p and faster in 1440p as well.

    • @mackobogdaniec2699
      @mackobogdaniec2699 5 месяцев назад +3

      Spider-Man is a very specific game, it is heavily CPU-limited (but with high fps, not like DD2), but it is an exception. It is hard to find visible CPU bottlenecks in most games at 4K with a 4090 and a top CPU (or even a new mid-range one, or something like the 5800X3D).
      If we're talking about RT, I think it differs heavily from game to game. It's very CPU demanding in S-M:R or especially Hogwarts Legacy, but not at all in CP2077.

  • @SuprUsrStan
    @SuprUsrStan 5 месяцев назад +1

    Just get a G9 57" monitor or any other 4K+ monitor. You'll instantly be GPU bound again.

  • @ScientificZoom
    @ScientificZoom 4 месяца назад +1

    Instead of silicon, why not opt for graphite?

  • @iLegionaire3755
    @iLegionaire3755 5 месяцев назад +4

    I hope the 14900K and 14900KS can drive an RTX 5090!

    • @CrashBashL
      @CrashBashL 4 месяца назад

      They self degrade after 3 months of using, I don't expect them to drive anything in the future.

    • @aapzehrsteurer9000
      @aapzehrsteurer9000 4 месяца назад

      Get a 7800X3D. Similar performance, much cheaper and more energy efficient.

  • @favoritodiavolo
    @favoritodiavolo 5 месяцев назад +1

    I just want to upgrade for GTA 6 😂

  • @kathleendelcourt8136
    @kathleendelcourt8136 5 месяцев назад +10

    People getting GPU limited in 95% of their games: ...
    The same people getting CPU limited in the remaining 5%, of which only 1% actually results in a sub 100fps framerate: OH NO I'M CPU BOTTLENECKED!!

    • @GabrielPassarelliG
      @GabrielPassarelliG 5 месяцев назад +1

      There are ways to spend the GPU's extra power, like on monitors with higher resolution and refresh rate. And if you care a lot about a specific game where a CPU bottleneck is a thing, then invest more in the CPU and less in the GPU. Not hard, given high-tier GPUs cost multiples of high-tier CPUs.

    • @fleecejohnsonn
      @fleecejohnsonn 5 месяцев назад

      Yeah pretty much. CPU rarely bottlenecks unless you're playing an RTS.

    • @FantasticKruH
      @FantasticKruH 5 месяцев назад

      Not to mention that 4090 and 5090 are 4k cards, even older cpus rarely bottleneck the 4090 on 4k.

  • @mahouaniki4043
    @mahouaniki4043 5 месяцев назад +2

    $2500 for +15% max over 4090.

  • @0x8badbeef
    @0x8badbeef 5 месяцев назад +8

    Frame-gen is not a solution. It is a work-around. Those with a 5090 won't have to use it.

    • @XZ-III
      @XZ-III 5 месяцев назад +1

      I think you mean shouldn't have to use it

    • @OldMobility
      @OldMobility 5 месяцев назад +3

      DLSS is magic in a lot of cases. I’ve played many games now that look better under DLSS quality with sharpening at 100%.

    • @numlock9653
      @numlock9653 5 месяцев назад +2

      Frame gen is specifically designed to boost performance in cpu limited games. Has little to do with lack of graphics performance.

    • @0x8badbeef
      @0x8badbeef 5 месяцев назад

      @@numlock9653 It is not little. Asset quality is on the GPU. With frame gen you can have those higher-quality assets render at a lower frame rate and have frame gen fill in the gap.

    • @OldMobility
      @OldMobility 5 месяцев назад

      @@numlock9653 you don’t know what you’re talking about clearly. DLSS is much more than that

  • @100500daniel
    @100500daniel 5 месяцев назад +2

    DirectX 13 will probably relieve CPU bottlenecks

  • @johndavis29209
    @johndavis29209 5 месяцев назад +3

    Rich is a gem.

  • @OldMobility
    @OldMobility 5 месяцев назад +1

    No it’s not, gaming in 4K with Ray Tracing even with DLAA or DLSS is the most beautiful graphics I’ve ever seen and that’s just with my 3090. With a 5090 I can turn off DLSS and go Native and still get spectacular FPS.

  • @francoisleveille409
    @francoisleveille409 5 месяцев назад +1

    A GeForce RTX 3060 is right at home with an old i7-4790, especially when playing games in 4K, so a 5090 would be right at home with either a 13th/14th-generation i9 or a Ryzen 9 7950X/7950X3D.

  • @neti_neti_
    @neti_neti_ 4 месяца назад

    It is unlikely that even the upcoming Intel Arrow Lake 15900K or AMD Ryzen 9 will be able to keep up with the RTX 5090. Intel, in particular, will need to introduce its Nova Lake (16 P-core, xx E-core) with DDR6 RAM to support the RTX 5090 ASAP.

  • @mrwang420
    @mrwang420 10 дней назад

    How much is a game gonna bottleneck on a 3950X? Can a game really bottleneck with 32 threads?

  • @Zapharus
    @Zapharus 5 месяцев назад +1

    "Hi guys exclamation point"
    LOL Dafuq! That was hilarious.

  • @SgtWooo
    @SgtWooo 5 месяцев назад +1

    if they delivered a card which can drive 240hz on my superwide i'd get one, otherwise no thx

  • @tofu_golem
    @tofu_golem 5 месяцев назад +1

    If no one can afford it, does it really matter?

  • @Koozwad
    @Koozwad 5 месяцев назад

    Literally ANY card is too fast for current CPUs if you lower the resolution far enough. Even the 3090 struggles at 4k maxed especially with RT/PT and without using fake resolutions/fake frames. Native maxed out PT rendering at 8k or 16k DSR on an 8k monitor/TV? Forget about it. 1 FPS maybe, if the PC doesn't explode and take the neighbourhood with it.

  • @mrwang420
    @mrwang420 10 дней назад

    In the future we will just have entire frame gen games that have no real frames. Lol. Everything is just made up. Lol.

  • @ThunderTheBlackShadowKitty
    @ThunderTheBlackShadowKitty 5 месяцев назад

    Unoptimized games. Our CPUs are insanely powerful nowadays. Force game companies to DO BETTER. That's the plain and simple truth, no exceptions, no twist, no nothing. Unoptimized games are 100% the sole issue here.

  • @betonman9
    @betonman9 4 месяца назад

    Lol... not in 4K... the 4090 barely holds 60fps in Alan Wake 2 with PT...

  • @frankfortfcgoc
    @frankfortfcgoc 5 месяцев назад +1

    No way, I have a 4090 paired with an Athlon 3000g and the GPU is killing my performance

  • @videocruzer
    @videocruzer 18 дней назад

    What I read is that the 5090 and the other 50 series cards offload more of the ray tracing geometry work onto the GPU instead of the CPU. Something tells me this channel is more about clickbaiting for views, and who cares about anything under 4096x2160 anyway

  • @TimberWulfIsHere
    @TimberWulfIsHere 5 месяцев назад

    The 5090 would only be worth it for native 4k in most games, probably worth it for path tracing in cp2077 in 4k 60-120 fps

  • @mytech6779
    @mytech6779 Месяц назад

    Gaming cards are a byproduct of the workstation and server designs. A GPU has many uses beyond splattering game frames on the screen as fast as possible. These non-gaming tasks are not CPU bound even with 4090 GPUs.
    Ever look at the specs for an AMD MI300x? They are usually running 8 per chassis. (Or look at the MI300A [24core Epyc based APU with 128GB HBM3 for the GPU])

  • @hartyeah
    @hartyeah 4 месяца назад

    I’m gaming on 7680x2160 samsung g9 57” with 4090 and 7800x3d. I sure hope I can get more fps with the 5090.

  • @mrwang420
    @mrwang420 10 дней назад

    Doesn't telling Windows to make the GPU itself the hardware scheduler get rid of that?

  • @turboimport95
    @turboimport95 23 дня назад

    Hate to tell y'all, but at 4K the 5090 is still gonna be the bottleneck... the 5090 is not gonna have 2x the performance of the 4090 at raster. Look at the 1080p benchmarks that get 150-200fps, while the 4090 gets 60 at 4K native. With a fast enough GPU it would hit the same 150-200 at 4K... so we are gonna be GPU bottlenecked for a long time...
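
    Put as arithmetic, the point above is that the CPU ceiling only matters once the GPU closes a roughly 2.5-3.3x gap. A throwaway sketch using the example numbers quoted in the comment (nothing here is measured, just the quoted figures):

    # If the CPU can feed ~150-200 fps (the 1080p result) and the GPU only delivers
    # ~60 fps at native 4K, the GPU needs to get ~2.5-3.3x faster before the CPU
    # ceiling is even reachable at 4K.
    CPU_CEILING_FPS = (150, 200)  # CPU-limited frame rates quoted for 1080p
    GPU_4K_FPS = 60               # GPU-limited frame rate quoted for native 4K

    for ceiling in CPU_CEILING_FPS:
        speedup = ceiling / GPU_4K_FPS
        print(f"GPU speedup needed to reach the {ceiling} fps CPU ceiling at 4K: {speedup:.1f}x")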

  • @stefensmith9522
    @stefensmith9522 4 месяца назад

    That's why AMD's Strix Point, an APU that'll compete with a 4070, will be the perfect combo to work together without having to spend thousands more dollars on a graphics card. Why spend $2000 on a graphics card... seriously... for gaming, what point does it make if you are CPU limited and you're getting 150fps with a 4070 Ti for $800, 150fps with a 7900 XTX for $800, or 150fps with a 5090 for $2000... 😂

  • @spinb
    @spinb 3 месяца назад

    I predict shock and disappointment when the 5090 is only 10% more performance than 4090. Nvidia will not blow their 4090 out of the water with super-duper powerful 5090. That would be bad business. Gotta save some for the 6090. Besides, Nvidia is going to sell people the 50 series on razzle-dazzle fluff such as DLSS 4.0 and Ray Tracing Super TI.

  • @FMBriggs
    @FMBriggs 5 месяцев назад +1

    I love questions like this because they assume on some level that large companies (Nvidia, AMD, Microsoft, Intel etc) wouldn't be thinking about bottlenecks or be actively working on developing new ways to utilize cutting edge hardware.

  • @inflatable2
    @inflatable2 5 месяцев назад

    The RTX 5060, 5070 and 5080 need to be cheaper... The 5090 as the halo product not so much, as that will sell anyway simply for being (by far) the fastest GPU available... The rest of the lineup is bought by much more price-conscious PC gamers... I hope the RTX 5080 will not be more than 1000 dollars/euros at launch...
    And obviously you do not buy a top-of-the-line RTX 5090 to play at 1440p or lower and 144Hz or lower and run into CPU bottlenecks... It's a GPU to be used on monitors with high resolutions and refresh rates... New games will also continue to require more and more GPU computing power (more development with ray tracing and path tracing etc), so I'm not worried about that at all... We'll probably also see newer, faster CPUs this year from both AMD and Intel, I think...

  • @SimBunker
    @SimBunker 2 месяца назад

    What would you guys do in this scenario?
    I have an old build, an 8700K with a 1080 Ti, which has served me very well. However, I've gotten into sim racing and I'm running triple 1440p. The build holds up fine in iRacing with the obvious graphics settings turned down, but other titles are obviously a big no.
    I have the funds to build current-gen with a 5090 & 14700K. But with Arrow Lake and the 50 series potentially close, am I better off waiting a bit longer?
    I get that if I do it now, it's a massive upgrade. But what do the pros think?
    Thanks

  • @Luizanimado
    @Luizanimado 5 месяцев назад

    When they release a 5090 I can finally buy a 4090 hehehe

  • @D.Enniss
    @D.Enniss 5 месяцев назад

    Complaining about CPU bottleneck doesn't make sense at 4K does it? With Path-Tracing and other heavy games around, 5090 needs to offer Native 4K full max settings and you'll probably still be GPU bottlenecked to hit 60fps without DLSS

  • @dpptd30
    @dpptd30 5 месяцев назад +1

    I know how to fix this: QUADRUPLE FRAME GEN

  • @ChadT-n9q
    @ChadT-n9q 4 месяца назад

    Nice conversations, but did they really answer the question? I think the user was asking how the most powerful consumer graphics card money can buy can really enhance gaming experiences when we also need equally powerful CPU performance to make that happen, which often chokes the GPU.
    I can name a plethora of things video games are lacking today that REALLY need improvements compared to what we have had for the past decade or two: real-time physics (this will always need improvement), real-time water physics (this is EXTREMELY held back by limitations in CPU calculations), 3D volumetric special effects (I believe this is both GPU and CPU focused), as well as robotic NPCs that suck you out of immersion (this is heavily CPU dependent as well). So the point is, we can have these powerful GPUs, but if video game experiences don't improve we are stuck with the same mediocre gameplay with shiny polished graphics at 556 FPS.
    Also VR, to me at least, IS the absolute FUTURE OF GAMING, being able to actually transport yourself into these incredible worlds, and that requires much more powerful hardware, both GPU and CPU. My hope, though, is that solutions are found to fix these concerns and allow developers to have more fun creating games and not be limited constantly by hardware. The PS5 was really exciting when it released because they saw SSDs bottlenecking a lot of freedom for developers, and they found a solution that even pushed into the PC SSD market as well, since most PC SSDs at that time weren't hitting the speeds the PS5 was because of I/O throughput issues. My hope is that Cerny and his amazing team come up with bottleneck solutions for improving CPU performance, allowing more efficient multithreaded performance in games, that will stretch out into the industry and become standard, you know? And I imagine AI will also assist in numerous ways with improving performance, user experience and helping developers make games more polished and advanced. SO exciting to think about...

  • @klaymoon1
    @klaymoon1 3 месяца назад

    For games, I will stick with the consoles. I'm done with spending 10 times the money for a few more frames. For AI, I will be sticking with Macs. At the end of the day, Apple is the only consumer-level product that can run a large model. At most the 5090 will offer 32GB, which isn't enough.

  • @CitAllHearItAll
    @CitAllHearItAll 5 месяцев назад

    Ray tracing and path tracing put more strain on the GPU, not the CPU.
    Path tracing with good FPS is where the 50 series will shine. Native 4K with PT.
    And current CPU will not bottleneck.

  • @SpOoNzL
    @SpOoNzL 5 месяцев назад +5

    The 5090 needs a built-in smoke detector above the 12VHPWR cable.

    • @tamish3551
      @tamish3551 5 месяцев назад +2

      Nvidia is bundling it with a popcorn attachment so if your gpu burns at least you got a snack

    • @eliadbu
      @eliadbu 5 месяцев назад

      I thought about some elegant solution like 12vhpwr port with integrated temperature sensor or some way to sense resistance in the connector. Or make sure the new 12v-2x6 are working as they're supposed to.

  • @mttrashcan-bg1ro
    @mttrashcan-bg1ro 5 месяцев назад

    I don't even think anybody even has to ask this question, the answer is 100% yes. The 4090 won't get utilized with a 14900KS or 7800X3D in some games at 4k, even the games where it is at 99% the CPU doesn't have much headroom left, it'll maybe push a 5080, but a 5080Ti/5090/5090Ti or anything faster has zero chance of being useful for 4k until devs start optimizing their games.

  • @federicomasetti8809
    @federicomasetti8809 5 месяцев назад

    Whatever they do with the RTX 5000 series, I'll be happy if a 5070 performs like a 4080/4080S, but consuming a ridiculously low amount of power (and at a reasonable price, which is something Nvidia doesn't really like, d'oh!). I want to spend money on hardware upgrades, not on electricity bills 😂

  • @TheMightySilverback_
    @TheMightySilverback_ 5 месяцев назад

    At the current rate that games' visuals are getting """"""better""""""" I'll be on my 3080 until 2030.

  • @CrystronHalq
    @CrystronHalq 4 месяца назад

    The 7800X3D is rarely being used above 50-60% in games with the 4090 (at 4K).
    I don't think it will be bottlenecked by the 7800X3D when it comes to gaming at all. It will still be an amazing CPU for it.
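
    One caveat worth adding to that utilisation argument: an aggregate 50-60% figure can hide a single saturated core, which is usually where a game-thread bottleneck shows up. A quick sketch using psutil (assumes psutil is installed; purely illustrative) to look at per-core load instead of the overall average:

    # Aggregate CPU usage can look low while one core (often the game's main thread)
    # is pegged near 100%. Sampling per-core makes that visible.
    import psutil

    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # one sample over 1 second
    print(f"average load: {sum(per_core) / len(per_core):.0f}%")
    print(f"busiest core: {max(per_core):.0f}%")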

  • @whitecometify
    @whitecometify 5 месяцев назад

    Honestly, since the Vega 56 or 580 era the leap has been small. A Vega 56 vs, say, a 4090 is only about a 3x jump; a 5090 might be 4-4.5x, to add some perspective. Go from 40 to 80 fps, double the graphics, and you max out a 5090 vs a Vega 56. They need to at least 3x from here. We are 10 years behind the curve and getting ripped off.

  • @Mcnooblet
    @Mcnooblet 5 месяцев назад

    They don't need to do much for a big portion. Just rAsTeR with no features would make many happy. Throw in some FiDeLiTy FrAmEgEn Garbage edition that works on everyones GPU, but sucks the kiznock and that will make many happy as well. If Nvidia does this, they will make a ton of money while saving any cool features for a generation after to profit.

  • @CoolaDiamond
    @CoolaDiamond 5 месяцев назад

    The 5090 is too weak to output beautiful, realistic graphics even at 4K 60fps.
    Graphics of the future are going to be very deep and complex; this T-Rex here is just like a PlayStation 2 attempting to do PlayStation 5 work.

  • @PaulRoneClarke
    @PaulRoneClarke 4 месяца назад

    As someone with no interest in resolutions above 1440p, and who can personally perceive no benefit from frame rates above about 80… none of this bothers me… at all.

  • @dextrophantom
    @dextrophantom 3 месяца назад

    I'm probably going to get it by selling the 4090 and a bunch of 3070s I have just collecting dust, but the thing I hate about powerful cards and great features like DLSS is that game devs are super lazy. Look at Alan Wake 2; it's stupid that my card struggles to get above 60 fps at 1440p without frame gen. Devs are taking these features into account when making their games when they should be a cherry on top, a plus, A FEATURE. I should not have to use it to get a decent framerate. It's horrible.

  • @RandoReign
    @RandoReign 3 месяца назад

    Couldn't have said it better myself. It needs to be the best, it needs to be cheap and it needs to be in stock, all 3 of which Nvidia are capable of doing, but somehow they manage to mess up 2 out of 3 of those criteria.

  • @asmod4n
    @asmod4n 4 месяца назад

    Next-gen low-power AMD APUs will have 7800X3D and RTX 4070 performance all in one; one could guess their desktop counterparts will be much faster too.

  • @krz9000
    @krz9000 4 месяца назад

    There is no problem keeping a GPU busy with path tracing. More samples... more bounces... until we reach unbiased territory

  • @vonbleak101
    @vonbleak101 4 месяца назад

    I play @1080p and have an i9 13900k and a 3080... I have no issues with any modern game and prob wont for another couple of years at least lol... The 5090 would be insane for me haha...

  • @SpielMitMirSession
    @SpielMitMirSession 5 месяцев назад

    With the highest-end GPU/CPU combination out now, I feel like the only games that bottleneck are poorly optimized games.
    It seems like optimization isn't even a top-10 priority when making a game anymore. Is this some sort of conspiracy between game developers and hardware manufacturers?

  • @mcalhoun73
    @mcalhoun73 4 месяца назад

    If Nvidia can outpace the CPU manufacturers and bottleneck them overwhelmingly, they may be able to shift the dynamic off of x86 platforms and fill the gap with their ARM CPUs. Wouldn't it be awesome for Nvidia if they could make expensive CPUs obsolete and replace them with their own, more exorbitantly priced chips? If they make it mainstream enough, developers will be forced to move to the ARM architecture to get the playability they need for their games. They could honestly even go for a console style of marketing where if you want a certain game you have to have one architecture, and another game needs a different architecture.

  • @Ace-Brigade
    @Ace-Brigade 4 месяца назад

    Who cares? With that much power consumption, who will actually want one in their house? I'm so fed up with my 3080s that I'm "downgrading" to 4070 Ti Supers just because they pull less power and thus also produce less heat.

  • @sir1junior
    @sir1junior 5 месяцев назад

    You shouldn't be buying a 5090 if you're just gaming at this point, and honestly it'll probably be priced appropriately based on the specs, $1800 - $2500

  • @0perativeX
    @0perativeX 5 месяцев назад

    I don't think bottlenecking will be a problem because by the time the 5090 is out, Intel's Arrow Lake CPU's will hit the shelves more or less at the same time. And of course AMD will release their next gen CPU's in response.

  • @chriss2295
    @chriss2295 5 месяцев назад

    Nvidia will do just enough as always. 5080 will beat 4090. Keeping prices very high.

  • @wileymanful
    @wileymanful 5 месяцев назад

    the 5090 better bring back Jesus Christ himself otherwise I see no point in the slightest

  • @superthrustjon
    @superthrustjon 5 месяцев назад

    No joke, just cashed in about 1,000,000 Marriott points for over $3,000 in Best Buy gift cards 😂 getting ready for the 5090

  • @sabakunovedo
    @sabakunovedo 5 месяцев назад

    Nah, it's only PS5 Pro that should have bottlenecks and is not necessary somehow. Every other product is great for the industry...

  • @ericvalentine1497
    @ericvalentine1497 4 месяца назад

    The 4090 is only limited by about 2% at 4K with a 7800X3D, so yes, it will probably be faster, but you will see it hovering around 90% usage