Nvidia isn't a graphics company anymore...

  • Published: 9 Feb 2025

Comments • 87

  • @КогорашИнф 18 days ago +52

    ~600W for a GPU is insane. My kettle is 1800W and it boils a liter of water in a couple of minutes.

    • @stew675 18 days ago +4

      Physics-wise, assuming perfect insulation and rounding to the nearest second, it would take 3m4s to bring one liter of water from 21C to 100C with 1800W, but your point still stands: 600W is a lot of power for a graphics card to draw.
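The commenter's figure checks out; here is a quick sketch of the arithmetic in Python, assuming perfect insulation (no heat loss) and the standard specific heat of water:

```python
# Sanity-check the boil-time estimate: energy = m * c * dT, time = energy / power.
SPECIFIC_HEAT_WATER = 4186  # J/(kg*K)

def boil_time_seconds(liters: float, start_c: float, power_w: float) -> float:
    """Seconds to heat `liters` of water from start_c to 100 C at power_w."""
    energy_j = liters * SPECIFIC_HEAT_WATER * (100 - start_c)  # 1 L of water ~ 1 kg
    return energy_j / power_w

t = boil_time_seconds(1.0, 21, 1800)
print(f"{int(t // 60)}m{round(t % 60)}s")  # 3m4s, matching the comment
```

At 600W the same liter would take three times as long, which is the commenter's point about how much power that is.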

    • @dem4a874 18 days ago +3

      And can your kettle draw 90 fps in Cyberpunk with RT Overdrive?

    • @TechDregs 18 days ago +4

      Yeah, I'm sure it won't be a constant 575W load, but the transient spikes will be spicy. The other thing is, we're getting to a point where American household circuits are going to be an issue.
      Most circuits in the US are on 15A breakers, so about 1800W max. If you're running an overclocked 14900K and a 5090 and someone plugs an iron into the same circuit... LOL.
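The breaker math above works out like this; the breaker and voltage figures are standard for US circuits, while the individual load wattages below are illustrative guesses, not measurements:

```python
# US 15 A branch circuit at a nominal 120 V; the NEC "80% rule" caps
# continuous loads at 80% of the breaker rating.
VOLTS = 120
BREAKER_AMPS = 15

peak_w = VOLTS * BREAKER_AMPS        # 1800 W absolute limit before the breaker trips
continuous_w = int(peak_w * 0.8)     # 1440 W recommended continuous limit

# Hypothetical worst case on one circuit (all figures are rough assumptions):
loads_w = {
    "RTX 5090": 575,                 # rated board power
    "OC 14900K": 300,                # heavy all-core load
    "rest of PC + monitor": 150,
    "clothes iron": 1000,
}
total_w = sum(loads_w.values())
print(total_w, "W vs", peak_w, "W limit")  # 2025 W vs 1800 W: the breaker trips
```

Even without the iron, the PC alone sits above the 80% continuous-load guideline, which is the commenter's underlying concern.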

    • @amazingjackJF 18 days ago

      Your kettle is designed to create heat, which makes this comparison asinine.

    • @honkhonk6219 18 days ago

      @@TechDregs People who spend $2000 on a GPU can afford to upgrade their house circuits.

  • @davidblankenship7985 21 days ago +44

    Your 3090 outperforms my $230 RX 6600 by about 75%, and my 6600 does everything I want it to, at about 40% of the power usage. If I am skipping this generation, you can too :)

    • @dingledorf5615 18 days ago +8

      Honestly, we've hit a plateau in the power games actually need. The only reason you see new games needing new cards is poor optimization and overly high-polygon models. Yandere Simulator is probably the most extreme example I can think of.

    • @lukeludwig1055 18 days ago

      Do you game in 4K?

  • @imnotusingmyrealname4566 21 days ago +25

    Surely this growth can be sustained.

    • @steinkoloss7320 18 days ago +5

      Lmao

    • @Nersius 18 days ago +1

      DeviantArt-trained financial analyst 😤

    • @TAD-9 17 days ago

      This time it's different lol

  • @TheLingo56 18 days ago +1

    I have a 1660 Super, so this gen is kind of where I need to get a real card.
    It's been 4 generations of me waiting for another 900 or 10 series, and it's just not going to happen. At least a 5070 or 5070 Ti would be a ginormous uplift.

  • @Generallygeneral 19 days ago +10

    I just really wish the 5080 had more than 16 GB. I just know they'll release a 20 or 24 GB version next year.

    • @anewbeginning4617 18 days ago

      They'll probably do the reverse of what they did with the 4080: 4080 = $1199, 4080 SUPER = $999; so 5080 = $999, 5080 SUPER = $1199 for a 5% increase in performance.

  • @noproblem4158 18 days ago +1

    The RTX 30 series is still good enough to play any of the latest games; graphical improvement has stagnated. The only thing that needs to happen is for graphics cards to get cheaper.

  • @TAD-9 17 days ago +2

    You hit the nail on the head. Follow the money, and you'll get a clear picture of what's really going on. The fascinating aspect of this AI rush is that, over time, the gaming industry will reap the rewards of the R&D dollars invested in improving AI performance. Bottlenecks in AI hardware & technology will eventually be solved and then trickle down to consumer products. While prices will remain high, we are witnessing a paradigm shift in how gaming hardware will evolve over the next decade. I believe we will see many AI-driven breakthroughs that will push gaming graphics beyond the stagnation we've experienced in the current generation.

  • @imnotusingmyrealname4566 21 days ago +4

    Please don't just say DLSS. Multiple technologies all fall under the DLSS name, and it's misleading.

    • @cheyannei5983 20 days ago +6

      Unfortunately, that's how Nvidia markets it. They set the terms and phrasing, and it's all DLSS.

  • @timderks5960 21 days ago +7

    On the 50 vs 5000 series comment: it actually is the 5000 series. Sure, they get called the 40-X and 50-X, but that's the same as calling 2025 "20-25". It doesn't mean we live in the year 20 and then 25 gets added; it's just a different way of saying 2025. With numbers it's pretty simple: 5XXX will always be a 5000 series, no matter what marketing departments try to tell you.

    • @TechDregs 21 days ago

      This was literally what my brain did when recording this video. :-D

  • @ChristianRusso-l3x 18 days ago +1

    I personally wouldn't get the 5090; it's great and all, but the upgrade over the 4090 isn't that huge.
    I'd rather go for the 5080, which is the same price as the 4080, and call it a day.

  • @joshhardin666 21 days ago +6

    This all seems very reasonable, except that it sounds like you're disappointed with the uplift over the previous generation when you said yourself it's around 30%, which really isn't bad compared to the 30-40% uplift going from the 30 to the 40 series. You do you, of course; I'm not advising anyone to buy or not buy anything. I'm just saying that yes, there's incredible AI uplift, but there's also the reasonably expected generation-over-generation uplift of about 30% as well. If that isn't enough for someone, that's totally fine, but it IS there. Further, it's seemingly still more than the uplift we're getting from the closest competitors, AMD and, to a lesser extent, Intel.

    What we NEED to push the graphics hardware industry forward is competition at the top end, and I'm sorry to say that right now, today, there's nothing on the market that competes with a 4090 that I'm aware of. I also doubt that whatever AMD releases this year will directly compete with the 4090, and the 5090 is 30% over that.

    tl;dr: we're basically where Intel was with CPUs before Ryzen. Every year they'd release a new CPU; it would have 4 cores, the top-end ones would have hyperthreading, it would come in at 3.5-4.2 GHz or so, and the new one would be maybe 5-10% better while pushing the power envelope up 5-10%. Intel stayed complacent with that approach until AMD took some time off to get their act together and started releasing 8-core products that blew the Intel parts out of the water, plus workstation and server parts on the same architecture that were at least 60% faster and usually 20-30% cheaper. That's what we need in the GPU space right now: a kick in Nvidia's complacency.

    • @TechDregs 21 days ago +3

      It is similar to Intel; I've thought the same thing. And just like Intel, there's a lot of power bloat. That's my main disappointment... yeah, a 30% perf uplift, but a 28% power uplift. Almost zero efficiency gain. And the 4090 was a 29% power increase over the 3090; the 3090 was a 25% increase in power draw over the Titan RTX. The performance increases are only marginally outpacing the power requirements. If it keeps going in this direction, the 6090 will draw 700-750 watts. LOL.
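The generational trend in that reply can be sketched in a few lines; the wattages are the cards' published board-power ratings as cited in the thread, and the "6090" figure is only a naive extrapolation, not a prediction from any roadmap:

```python
# Rated board power (W) per generation, as cited in the thread.
tdps = {"Titan RTX": 280, "RTX 3090": 350, "RTX 4090": 450, "RTX 5090": 575}

names = list(tdps)
growth = []
for prev, cur in zip(names, names[1:]):
    pct = (tdps[cur] - tdps[prev]) / tdps[prev]
    growth.append(pct)
    print(f"{prev} -> {cur}: +{pct:.0%}")   # +25%, +29%, +28%

# Naively project a hypothetical "6090" at the average growth rate:
avg = sum(growth) / len(growth)
print(f"projected 6090: ~{tdps['RTX 5090'] * (1 + avg):.0f} W")  # ~731 W
```

The extrapolated ~731W lands inside the 700-750W range the reply guesses at.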

    • @joshhardin666 21 days ago

      @@TechDregs Yes and no. The power profile of a 4090 isn't that cut and dried. When playing modern AAA titles, my 4090's TBP rarely hits its max. The 4090 isn't power limited, it's voltage limited: even if you allow it more power (which you can do in something like Afterburner), it doesn't use it, instead bouncing off the voltage limiter. Just because the limit is set at 450W in the case of the 4090 doesn't mean it actually uses that 450W. My 3090 would use the entire power envelope and be hungry for more; I actually soft-modded mine so it could use up to 450W, and use it it did when I overclocked (I used a standalone waterblock with custom liquid cooling). You just can't get that out of a 4090, though out of the box it performs significantly better than my overclocked 3090 did, even at the usual 350-400W. I have no idea what to expect from the 5090's performance and power behavior once you start changing its power, frequency, and voltage parameters; looking forward to its release.

    • @TechDregs 21 days ago +1

      Could just be different cards, but my experience with my 3090 was similar to what you see with the 4090. My 3090 runs best at 300-320W rather than 350W; during gaming it often doesn't get close to that.
      My card paid for itself with ETH mining, so I spent a lot of time looking at which power levels it worked best at. Sure, it's a card capable of pushing 500W if I force it to, but there was very, very little gain from that, and it caused a lot of instability. A 90% power level basically gave up nothing in benchmarks vs stock.

    • @cheyannei5983 20 days ago

      AMD is also getting a 25-30% uplift. If it's a good generational gain for Nvidia, then why is it lackluster for AMD? I think the disappointment is with presenting DLSS improvements as whole-card, raw performance improvements. Notice Nvidia has NOT broken out raw vs DLSS frames in their 4090 vs 5090 comparison, even though DLSS is *not* free: DLSS causes a drop in raw frames vs no upscaling in GPU-limited scenarios.
      Such a change would be a huge improvement in DLSS performance... but they're not advertising it. They're keeping raw performance numbers close to their chest, marketing DLSS improvements as generic gaming performance and AI TOPS as generic compute.
      If Nvidia had made it possible for developers to more broadly leverage AI models in their games as generic compute, they'd make sure everyone was talking about it... nobody is. It isn't in the press kit.
      Speaking of which, a while back there was a whitepaper at SIGGRAPH that replaced computationally expensive GPU physics taking minutes per frame with an AI model that was 99% accurate and ran in real time. The idea is out there and the tech is there, but it's not really integrated into any engine, which is typically Nvidia's job and their much-defended software advantage.
      So... what is the gain? "It ships with a 30% OC over the previous card." Is that really worth the, what, $2000 base asking price?

    • @cheyannei5983 20 days ago

      Also, re: 4090 perf, at least in ARK Ascended the 7900 XTX is a lot closer to the 4090 when both run TSR (a temporal accumulation step is required for their light rendering to work).
      TSR can be set to render natively at the desired output resolution and then do its normal super-resolution upscaling to 4K or whatever, with a short pixel history for less blur and fewer TAA frames on history rejection. Set up this way, TSR gives no upscaling boost and actually costs a bit more perf than having no temporal step at all, but the lighting works as intended.

  • @necuz 21 days ago +6

    That AI TOPS number is another apples-to-oranges situation: they're comparing FP8 on the 4090 to FP4 on the 5090. The footnotes of the graph you showed at the start used to specify that, but apparently they've now edited it out.

    • @TechDregs 21 days ago +2

      I believe the number I used had the same base for both. It's hard to pin down; however, digging around, it seems to be the INT8 (sparse) peak TOPS number. The 4090 has 1321 peak INT8 TOPS using the Sparsity feature, noted on page 30, Appendix A, Table 2. There isn't a Blackwell whitepaper out yet to verify. Some discussion here, though: www.reddit.com/r/LocalLLaMA/comments/1hviw58/simple_table_to_compare_3090_4090_and_5090/
      It seems like they were mixing FP8 and FP4 models in the application benchmarks in their presentation, though. Fricken marketing.

  • @BboyCorrosive 18 days ago +3

    Still on a 2080, even though I could easily afford any of the 40 or 50 series. I think I'll just grab a 5070 Ti when it's out.

    • @TAD-9 17 days ago

      Same! The 5070 Ti is the most sensible buy this gen.

  • @travisretriever7473 18 days ago

    "Nvidia isn't a graphics company anymore..."
    That ship sailed down the River Styx back in 2018, OP.

  • @kyohiromitsu4010 18 days ago +1

    It's simply because their processes are limited by what TSMC, ASML, etc. can offer them.

    • @StoicismisKing4444 18 days ago

      Shhh, sense will make the "rasterization crew" even more salty.

  • @MGarrido_01 19 days ago +2

    Hey, really liked your comment on this topic. Underrated video; you deserve more views!

  • @imnotusingmyrealname4566 21 days ago +6

    I bought a 6800 for 367 bucks, then returned it for a slightly used 6900 XT for 400. I hope this card lasts me as long as my 1070 did, because the GPU market is completely fucked right now.

    • @cheyannei5983 20 days ago +2

      The 9070 XT will be a big uplift over the 7800 XT, but I can't deny there will be no 7900 XTX replacement to jump to in 2-3 years.

    • @carlosmagnomacieira8210 19 days ago +1

      And me, I finally got a 1660 Super (1070 performance) for 650 reais (100 dollars)... given that the minimum wage here (which is what I get) is around 250 dollars...

    • @imnotusingmyrealname4566 18 days ago +1

      @ Wow, that's a very significant amount then. The 1070 is still a good card; at 1080p it's surprising what it can play, and if you play only slightly older games, even 1440p is good.

    • @carlosmagnomacieira8210 18 days ago +1

      @@imnotusingmyrealname4566 That is what people in Europe and America need to understand when they see some BS like the Indiana Jones game pulled... I was so excited to play that game, but it cannot run on the 1660 Super because it requires that **** ray tracing hahahah

    • @imnotusingmyrealname4566 18 days ago +1

      @@carlosmagnomacieira8210 Yes, mandatory ray tracing is just stupid, and even GPUs that support it suffer performance-wise. It's just lazy and ignorant of the developers.

  • @Accuaro 19 days ago +1

    I sell my GPU each gen, so buying the next gen doesn't cost all that much more. If you were buying new outright each time, you'd need deep pockets. It's why I can afford the 5090, at least.

    • @Ahmed-cu5lw 18 days ago

      How do you sell them?

    • @CURTSNIPER 18 days ago

      @Ahmed-cu5lw At full price, apparently

    • @Accuaro 18 days ago +1

      @ Actually yes, lmfao. I sold my 7900 XTX for around $1000 USD. It was the Sapphire Nitro Vapor-X card, listed as an auction with a low starting price, so no, I didn't scam people lmfao.

    • @Accuaro 18 days ago +2

      @ eBay. Sold my RTX 2060 (during the mining craze), 6900 XT, and 7900 XTX at close to, or not far behind, MSRP.

  • @zahnatom 19 days ago +1

    I agree with your position on AI upscaling vs AI frame gen. Upscaling is fine and dandy, especially for running higher resolutions on lower-end cards; it's a good thing for consumers imo.
    Frame gen just isn't. It's there purely to inflate the stats, because no one really knows or understands how latency works.
    It'll even look nice and smooth, but x4 frame gen from 30fps to 120fps will still feel like a 30fps game due to input lag, and there's no way around that unless they "AI generate" the entire game, inputs included.

    • @dem4a874 18 days ago

      DLSS is not only an upscaler, it's also AA; stop passing off your opinion on these technologies as competent.

    • @Steeveerson 18 days ago

      @@dem4a874 DLSS isn't AA; DLAA is AA. DLSS is an upscaler that also happens to reduce aliasing.

    • @dem4a874 18 days ago

      @ DLSS was designed from the ground up to integrate anti-aliasing (AA) as a core part of its process, not as an incidental effect. NVIDIA's approach uses machine learning to eliminate aliasing and reconstruct fine details while simultaneously upscaling from a lower resolution. This makes anti-aliasing an integral part of the DLSS pipeline.
      DLSS 3, for example, excels at reducing jagged edges by leveraging temporal data and AI-driven reconstruction, far surpassing traditional methods like TAA or FXAA in both quality and performance.
      DLAA is essentially DLSS without the upscaling component, focusing solely on AA, but it's built on the same neural network technology. Therefore, DLSS isn't just an upscaler that happens to reduce aliasing; it's a hybrid technology where anti-aliasing is a fundamental feature. That's why it's currently the best AA solution on the market.

  • @procrastinator24 18 days ago

    Sitting on a 4090 right now; I think I'll upgrade to the 5090 at the end of 2025. Selling the 4090 and writing the 5090 off on taxes makes it somewhat affordable for me. I think the extra 8 gigs of VRAM will come in handy for rendering with Blender. Other than that, I wouldn't recommend upgrading at all, honestly.

  • @Greatbrittan 18 days ago

    That explains the ridiculous pumping of their stock; they're all in on AI and compute maxxing.

  • @mmaharasfuri1718 18 days ago

    I can't really complain. They're a company; they have to find ways to make more profit, and right now AI is the biggest seller.

  • @mrCrashCube 18 days ago

    AI will become much more important for gaming than raw graphics performance.

  • @AnimikhAich 20 days ago

    I have a 4060 Ti 16G and I use it for AI. Cheap, draws less power, and also handles the occasional 1080p gaming for AAA titles on max/ultra settings at 75+ FPS (my monitors are 75 Hz). I'm good for now as well. Though the 5090 seems like a sweet, sweet deal for AI, I'm broke and can't afford it. :'(

  • @amazingjackJF 18 days ago

    Yes they are, they've just diversified.

  • @AtjooFPV 18 days ago

    I've got a 2070 Super right now and have saved up enough for a 5070 Ti. Should I buy it? I mainly play Rust and CS:GO, and I want to play Black Myth: Wukong (played it before but stopped because I want better graphics and fps).
    Monitor is 1440p 165Hz.

    • @Menace0 18 days ago +1

      If a few hundred dollars isn't going to break the bank, then yeah, buy a new card. I have a 2080 and I'm looking to move to a 5080 this summer.

  • @itsmeurboi 18 days ago

    I'm so glad I don't like AAA games! Any old trash GPU will meet my needs 😍

  • @remowind1578 18 days ago

    So fundamentally you don't get why Nvidia pushes AI. They make GPUs on 3nm, and the 5090 already requires 600W; pushing to 2nm would send prices into space. AI is the only way forward at the moment. Funny how you don't understand such simple things.

    • @StoicismisKing4444 18 days ago +1

      People seem to completely discount the vast R&D costs of fabrication plants. It's mind-blowing how little the average consumer understands about how business actually works.

  • @UtopiaForMore 20 days ago

    More profit, meaning money, money, and more money, is a weak foundation for R&D.

  • @RN1441 18 days ago

    Gaming GPUs have essentially become a marketing exercise. As long as the market will absorb more of their data-center product, wafers allocated to us earn them a much lower price than wafers allocated to compute accelerators. These boom days won't last forever, but while they do, gaming is just a thing they do on the side because of their legacy.

  • @Struct.3 19 days ago +3

    "Not a graphics company," yet they deliver the best graphics hardware and software by a landslide...

    • @KrisH-bt5cm 18 days ago +5

      Wow, you didn't watch the video, did you? lol

    • @maxiic4144 18 days ago +5

      This guy is the model consumer: don't ask questions, just consume and keep the company happy.

  • @Fisherman4200 18 days ago

    AI is falling, and so is Nvidia.

    • @TechDregs 18 days ago +1

      $500B going to new AI development means it'll probably be a while before the collapse. :-D

    • @visionshift1 18 days ago +1

      Falling? Can you cite something? Just curious.