GeForce RTX 4090 Review, Nvidia's Stupid Fast GPU!

  • Published: 10 Sep 2024

Comments • 3.2K

  • @Hardwareunboxed
    @Hardwareunboxed  Год назад +199

    Imagine if we knew 9 months ago just how trash the pricing would be for the rest of the GeForce 40 series.

    • @user-eq3rs2tt5v
      @user-eq3rs2tt5v Год назад +47

      the irony that the 4090 is pretty much the only 40xx series card that's worth buying.

    • @imafirenmehlazer1
      @imafirenmehlazer1 Год назад

      Lol, but what you said about the "segment of pricing" is and always will be true, and like some here have said, it's still going to sell. I was very blessed to get a 4090 for $1,400; I cracked and got it, figuring that 9 months out, at $200 below MSRP, the same as a premium 4080, my mind told me it was worth it. But people will always buy the new and shiny and most powerful.

    • @ZloHunter
      @ZloHunter Год назад +2

      Well, I have gotten myself a 4090 after all. The only GPU that's actually worth the value in this godforsaken 40 series GeForce. Expecting it to be a timebomb like my old 3080

    • @new_og_
      @new_og_ 11 месяцев назад +1

      I paid $1800 for the FE , prices got out of control.

    • @mrbuIIets
      @mrbuIIets 11 месяцев назад

      @@new_og_ Dang. I just got an FE from BB for $1,439.99 (product price), $1,547.99 total after tax. This is on 12 months, same as cash. Upgrading from a 12GB 3080; after I sell the 3080 for $600-700, the upgrade will only cost me about $700 for double the fps (gaming at 4K).
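      A quick sketch of that upgrade math; the resale value is the commenter's own $600-700 guess and every figure here just restates the comment, not verified prices.
      ```python
      # Net upgrade cost described above; resale value is the commenter's own
      # $600-700 estimate, not a known price.
      product_price = 1439.99            # RTX 4090 FE, before tax
      total_after_tax = 1547.99
      tax = total_after_tax - product_price
      resale_3080_12gb = 700.00          # assumed resale value of the old card
      net_before_tax = product_price - resale_3080_12gb
      net_after_tax = net_before_tax + tax
      print(f"net cost: ~${net_before_tax:.0f} before tax, ~${net_after_tax:.0f} after tax")
      ```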

  • @VzlaLeo
    @VzlaLeo Год назад +2075

    Nice to see such a nicely priced budget product for us poor gamers
    EDIT: Obviously this comment was a joke
    EDIT2: Amazon and Best Buy are selling the XFX RX 6600 for $230 today, October 12. Newegg sells the Gigabyte RX 6650 XT for $265

    • @dalefrancis9414
      @dalefrancis9414 Год назад +20

      lol most of us won't be looking at this....

    • @MinhTran-fm5oy
      @MinhTran-fm5oy Год назад +32

      Hey, the RX 6600 is about $250. May I ask, is it worth it?

    • @VzlaLeo
      @VzlaLeo Год назад +72

      @@MinhTran-fm5oy Yeah great 1080p card, doubt AMD or NVIDIA will launch anything for the low segment this year

    • @KnightDelta69
      @KnightDelta69 Год назад +29

      @@MinhTran-fm5oy abso-fricking-lutely worth it! If you can push for a 6650 XT (I believe it's below $280 with a rebate on Newegg, and $299 on Amazon for the Gigabyte version) it would be even better, as the performance gain can range between 10-30% depending on the game.

    • @TekGriffon
      @TekGriffon Год назад +14

      The xx80 products have never been a "budget product". The 4060 is the product for you guys. Just gotta be patient.

  • @romankozlovskiy7899
    @romankozlovskiy7899 Год назад +23

    Imagine getting a 3090 Ti and then having it destroyed by a 4090 that's cheaper 9 months later

  • @paulshardware
    @paulshardware Год назад +1252

    Beautiful and well executed review Steve! Hope you're keeping up with this launch cycle ok 😅 it has to end eventually right?

    • @kiloneie
      @kiloneie Год назад +13

      Pretty sure he ain't gonna stop working on the launches till December.

    • @emmata98
      @emmata98 Год назад +9

      After RDNA3 etc soon^tm

    • @lloydaran
      @lloydaran Год назад +16

      Radeon 7000: *evil laughter*

    • @drakorez
      @drakorez Год назад +14

      I am exhausted just following the reviews. Making them all must be insane lol.

    • @cromefire_
      @cromefire_ Год назад +1

      @@emmata98 soon-ish

  • @DrearierSpider1
    @DrearierSpider1 Год назад +490

    No amount of performance can change the fact that Nvidia is pricing the average person out of the GPU market. It's fine that the top end product is stupid expensive (though I wish we got a cut down AD102 based product for a reasonable price similar to the RTX 3080 being a cut down GA102 die for $700). But when a cut down AD104 die is a $900 product, they've lost the script. And I'm not sure how well a $1600, 450W GPU is going to sell when the entire world is in a stagflationary recession with a global energy crisis.

    • @ypsilondaone
      @ypsilondaone Год назад +57

      In Europe it's 2000€..

    • @julessaviour5931
      @julessaviour5931 Год назад +66

      They're turning PC gaming into a premium only market. They know gamers will dish out lots of money for frames. And we keep doing it, so they keep making things more expensive

    • @calistin1
      @calistin1 Год назад +7

      @@ypsilondaone In Europe taxes are also included in the price, not so in NA.

    • @vsammy_poet
      @vsammy_poet Год назад +9

      @@ypsilondaone Yeah, tell me about it. There are some models way over 2000 in the shops 🤦‍♂️🤦‍♂️

    • @velin02
      @velin02 Год назад

      Poor people like you need to realize that people with money do not care at all what your opinion is. I’ll be buying a 4090 tomorrow

  • @Mi2Lethal
    @Mi2Lethal Год назад +178

    Insane performance for the few people that can afford it; 4K ray tracing finally means something.

    • @dzenacs2011
      @dzenacs2011 Год назад +4

      Still 30 fps with horrible screen tearing

    • @Deadsmegma
      @Deadsmegma Год назад +3

      @@dzenacs2011 yeah with cyberpunk... Lol

    • @trffinal1541
      @trffinal1541 Год назад

      LoL The 4090 will be bought by millions of people

    • @dhaumya23gango75
      @dhaumya23gango75 Год назад +14

      @@Deadsmegma 40+ fps most of the time at native 4k maxed with psycho raytracing

    • @dhaumya23gango75
      @dhaumya23gango75 Год назад +31

      @@dzenacs2011 If you are referring to cp2077, then the 4090 maintains 40+ fps most of the time at native 4K maxed with psycho ray tracing. And cp2077 is an exception, the other games run much better. It seems people would say anything to hate.

  • @battmarn
    @battmarn Год назад +317

    Holy shit that performance is crazy
    Can't wait to buy one second hand in 6 years for a fifth of the price

    • @Ko6pa
      @Ko6pa Год назад +12

      So true mate 😭😭

    • @jeremyphillips3087
      @jeremyphillips3087 Год назад +3

      I cant wait either, im getting one in a few months.

    • @BlackJesus8463
      @BlackJesus8463 Год назад +1

      Im gonna wait until they can do it with 250 watts. ¯\_(ツ)_/¯

    • @huleyn135
      @huleyn135 Год назад +42

      At that point you might as well get the 7060 or something and not have to burn 450 watts for that performance.

    • @mirroredvoid8394
      @mirroredvoid8394 Год назад +3

      @@huleyn135 AMD is going to come out with a furnace too just look at their new CPUs

  • @anotherhuman9974
    @anotherhuman9974 Год назад +556

    My first car was stupidly fast, cost less than a 4090 with the same running costs 😂 😅

    • @b3at2
      @b3at2 Год назад +12

      😂😂😂😂🤣🤣😂😂

    • @ylstorage7085
      @ylstorage7085 Год назад +33

      "stupidly fast" as in a "downhill situation with brake being frantically pressed to the floor" fast?
      legend has it that this man's first car ran over a dude who was having a crysis in life.

    • @LoneWolf-tk9em
      @LoneWolf-tk9em Год назад +1

      Man, you should get your gears checked at a mechanic

    • @greyman8174
      @greyman8174 Год назад +4

      Riiiiiiiiight.

    • @aphrenedge2262
      @aphrenedge2262 Год назад +4

      Was it remote control?

  • @hineighbor
    @hineighbor Год назад +9

    Came from a 1080Ti, see you in another 6 years!

    • @DETERMINOLOGY
      @DETERMINOLOGY Год назад +1

      2nd comment I've seen with someone upgrading from a 1080 Ti to a 4090 and waiting another 6 years. Seems like the most logical upgrade as well.

  • @yobb1n544
    @yobb1n544 Год назад +815

    Even this great performance won't convince me to spend $1600USD.

    • @JP-xd6fm
      @JP-xd6fm Год назад +106

      2200€ where I live

    • @Cenkolino
      @Cenkolino Год назад

      @@JP-xd6fm 1950€ Founders Edition. Of course none of us can get one because the goddamn scalpers will be onto this like flies on shit... It will easily be 2400+ on eBay...

    • @kostasft2938
      @kostasft2938 Год назад +78

      2500/2700 Euros where I live.

    • @MrStormShield
      @MrStormShield Год назад +32

      My whole gas installation cost 2500 euro and that would be the average price in Europe.....

    • @lowzyyy
      @lowzyyy Год назад

      Sadly a lot of morons will buy it, saw it on reddit

  • @brh0003
    @brh0003 Год назад +196

    "I can enjoy gaming almost as much with a GPU that costs four times less"
    I feel this. I love my 6600XT, it's been great the past year I've used it.

    • @eb60lp
      @eb60lp Год назад +24

      Keep enjoying it then! Don’t let the announcement of new hardware taint your joy!

    • @shiraz1736
      @shiraz1736 Год назад +27

      Yea don’t ever fall for the hype, the 4k’er love commenting about what they have all the time, so It comes across that they are the norm out there when in reality they are what I call the 1%ers. The 4090 is a product and that’s about it, hell I’m about to upgrade to AM4 5600/5700 and look out for a 6800 or 3070. It’s enough if I finally bother to upgrade my monitor to 1440. Then in 2 yrs I’ll look out for a second hand 58003d, just hang out the back and pick up the scraps the “average consumer” leaves behind. 👌

    • @Splarkszter
      @Splarkszter Год назад +16

      @@shiraz1736 Especially with modern game launches (sadly) being crap. There isn't even a reason to play the latest titles.
      And I even consider the Steam Deck to be a salvation for the low end of PC gaming.

    • @onee1594
      @onee1594 Год назад +6

      Probably gonna buy one of those 6-series Radeon GPUs just to fill a spot for a year or two

    • @settispaghetti2273
      @settispaghetti2273 Год назад

      I have the same plan. When I see a good 1440p monitor on sale, I'll pick up a used AM4 CPU and flip mine for a little bit more performance out of this setup.

  • @darrellid
    @darrellid Год назад +351

    Those 4K benchmarks are absurd. Thanks for all that you do.

    • @kwinzman
      @kwinzman Год назад +26

      Nobody talking about the lack of DisplayPort 2.0?
      This card in particular could benefit from high-refresh 4K displays.

    • @ceroandone
      @ceroandone Год назад +2

      I don't have 4k display, why should I get that expensive gpu???

    • @WamblyHades
      @WamblyHades Год назад +19

      @@ceroandone You shouldn't, but it showcases how good the new architecture is.

    • @jamieammar6131
      @jamieammar6131 Год назад +9

      @@kwinzman That's a big L from this impressive product.

    • @ninja.saywhat
      @ninja.saywhat Год назад +8

      @@ceroandone someone forced you to buy it???

  • @AnttiPW
    @AnttiPW Год назад +299

    That's brutal. Didn't expect this. It's still way out of my price range, but still it's freaking impressive.

    • @ThunderingRoar
      @ThunderingRoar Год назад +56

      @@fineartpottamus9020 no it doesn't?

    • @RogueSchoIar
      @RogueSchoIar Год назад +52

      @@fineartpottamus9020 Actually it only draws 105 watts. See... I can make numbers up too.

    • @aidenlee5207
      @aidenlee5207 Год назад +24

      @@fineartpottamus9020 Did you watch the power consumption portion of the video?

    • @paulustrucenus
      @paulustrucenus Год назад

      @@RogueSchoIar What a fucking liar. We all know it's 415+34i Watts (mathematician joke)

    • @Oz-gv5fz
      @Oz-gv5fz Год назад +1

      2-4x raster perf up 👍

  • @Rudolfik
    @Rudolfik Год назад +512

    It looked so good in the benchmarks that for a minute I forgot I can't afford it 🤣. Can't wait to see 4070 being 10% faster than 3070

    • @BlackJesus8463
      @BlackJesus8463 Год назад +9

      You only really need 4K/120Hz/Ultra!

    • @mykhaylovarvarin9078
      @mykhaylovarvarin9078 Год назад +41

      And 9% more expensive

    • @njkf
      @njkf Год назад +124

      You mean the 4080 8gb?

    • @karlhungus545
      @karlhungus545 Год назад +11

      Why even consider this unless you have a 4K monitor?

    • @anthonytech
      @anthonytech Год назад +6

      @@karlhungus545 Because some people like to feel like they're "future proof" by having an insanely powerful GPU at a resolution it wasn't designed to game at

  • @wollsmoth69
    @wollsmoth69 Год назад +4

    4k Ultra RT locked at 120fps on an LG C2 OLED is truly a whole different universe. I couldn't be happier.

    • @DETERMINOLOGY
      @DETERMINOLOGY Год назад

      @@Anonymous-bk3nj Thats good not a bad thing

    • @grinceasarofficial
      @grinceasarofficial 24 дня назад

      No screen tearing? I'm interested in this card; I have an LG CX but am worried about screen tearing from the high frame rates

  • @marloiv
    @marloiv Год назад +144

    Insane to see so much Content being pumped out. Especially with so much going on this month.
    Great Review and great work :)!

  • @iseeyou1312
    @iseeyou1312 Год назад +287

    It's pretty crazy that leaks from over a year ago suggested it'd be ~80% faster than the 3090, and it turns out that it almost is (possibly could be if all CPU bottlenecking was removed). The same leaks also said RDNA 3 would be faster, thus I'm more excited for that launch than paying the obscene Ngreedia tax.

    • @adnan4688
      @adnan4688 Год назад +18

      Who wants to bet AMD flagship priced at 1600$

    • @doufmech4323
      @doufmech4323 Год назад +50

      @@adnan4688 my guess is $1400

    • @rdmz135
      @rdmz135 Год назад +63

      @@adnan4688 Theres 0 chance AMD will match Nvidia's price unless they actually have a big performance lead. And I doubt that.

    • @damara2268
      @damara2268 Год назад +11

      What?? The leaks from over a year ago were that 4090 will be 2.5x 3090 and 7900xt will be 3x 6900xt lol, completely crazy

    • @adnan4688
      @adnan4688 Год назад +28

      @@rdmz135 Dude, get over yourself. AMD had a 10% slower card and they priced it for 10% less. I don't trust them like I used to. Let's come back to this comment in a month, and let's see how it went.

  • @KennyChong
    @KennyChong Год назад +25

    Definitely next level unboxing of hardware right here! I think it will be interesting to see a watt per frame comparison in either another video or in future GPU review videos. With all the talk of power usage and electricity prices being expensive, that might also be another factor potential buyers may want to consider before purchasing a new GPU.
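    A minimal sketch of the watt-per-frame metric asked about here; the fps and board-power figures below are made-up placeholders, not numbers from the review.
    ```python
    # Hypothetical watt-per-frame comparison; the fps and power values are
    # placeholders for illustration only, not measured results.
    cards = {
        "RTX 4090 (placeholder)": {"avg_fps": 140, "board_power_w": 430},
        "RTX 3090 Ti (placeholder)": {"avg_fps": 90, "board_power_w": 450},
    }
    for name, d in cards.items():
        print(f"{name}: {d['board_power_w'] / d['avg_fps']:.2f} W per fps, "
              f"{d['avg_fps'] / d['board_power_w']:.3f} fps per W")
    ```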

    • @Splarkszter
      @Splarkszter Год назад

      The lowest end usually has the best performance per watt anyway. Also, people who can afford a 4090 probably don't care much about electricity prices.

    • @KennyChong
      @KennyChong Год назад

      @@Splarkszter That's true regarding being able to afford the 4090 and not caring about electricity prices. What I'm really interested to see is the efficiency improvements between generations. (eg RX5000 series to RX6000 series)

    • @clansome
      @clansome Год назад +1

      @@Splarkszter That's the point really. If you can afford a 4090 you are not going to be bothered by price of running it. A (poor) car analogy would be those drooling over the latest Lamborghini talking about its looks and how fast it can go but never being able to afford it, instead putting spoilers on their Ford Focus. The only difference here is that for some (not me) the 4090 IS worth the price. Don't forget HUB doesn't test production benchmarks, but look at a video from Gamers Nexus, Epos Vox or even LTT and they show that this is a creative monster of a card.

    • @MrVuckFiacom
      @MrVuckFiacom Год назад

      I've been getting record-breaking electricity bills during this summer. This is absolutely a consideration when buying a GPU for me.

  • @mikeelek9713
    @mikeelek9713 Год назад +165

    The number of transistors is almost hard to fathom - 76.3 billion. Just think about that number. In 2023, that's an engineering marvel to have that many electronics packed into a consumer device and to work as intended. This is far above my budget, but I can still appreciate the immense amount of development work that went into it.

    • @propersod2390
      @propersod2390 Год назад +12

      And yet people will still cry that it's "too expensive". For being the absolute fastest gpu on the planet it's priced right

    • @falcon6329
      @falcon6329 Год назад +23

      What is more insane is that smartphones have 15+ billion transistors running at 5 watts

    • @Kevin-fl7mj
      @Kevin-fl7mj Год назад

      @@propersod2390 Stupid argument, nearly every new GPU released was "the fastest GPU on the planet" while costing $649-699. This chip is very profitable for Nvidia at $1k, but with sheep like you it might as well cost $5k

    • @thelmaviaduct
      @thelmaviaduct Год назад +3

      Who counted them??? #IGotsTaKnow 👍🏿

    • @aravindpallippara1577
      @aravindpallippara1577 Год назад +2

      There is this $2.5 million AI chip which uses the whole wafer, not the reticle limit of 800mm², but the whole 12-inch wafer.
      It was released in 2019, I believe

  • @Yurgen_S
    @Yurgen_S Год назад +147

    Wow, that's impressive. A shame these kind of gains won't be seen in lower end products, not even the 4080 and (the real) 4070.
    Pricing is still a mess.

    • @libertyprime9307
      @libertyprime9307 Год назад +3

      The (real) 4080 won't be much worse performing than this.
      It's not like Lovelace will be the first architecture where the gains don't have diminishing returns.

    • @darcrequiem
      @darcrequiem Год назад +31

      @@libertyprime9307 The 4080 (16GB) has 56% of the CUDA cores of the 4090.

    • @f-22raptor25
      @f-22raptor25 Год назад +4

      @@libertyprime9307 it will be around 40% slower

    • @MauroTamm
      @MauroTamm Год назад

      This feels like a 4090 Ti in disguise, looking at the giant gap in specs.
      The 4080 16GB is going to be a 3090 Ti, at lower power.

    • @aerostorm_
      @aerostorm_ Год назад +8

      One of the main reasons this generation's 80 card won't be a match for the 90 is that they won't be using the same die. AD102, which is used for the 90, is massively more capable than AD103. Also, the 12GB 80 model is even using AD104, which is part of why people call it a 4070

  • @geraki3117
    @geraki3117 Год назад +4

    The only card that makes sense this generation

  • @superneenjaa718
    @superneenjaa718 Год назад +14

    From your conclusion, it seems Nvidia's tactic of using the 3090 Ti to trivialise 400W+ power consumption has worked.

  • @OfSheikah
    @OfSheikah Год назад +183

    Thank you Hardware Unboxed for the thorough testing as usual.
    You guys definitely squeezed a lot of performance data out of this GPU.

    • @kiloneie
      @kiloneie Год назад +3

      About 50x better testing than what LTT showed; they have a lab and all, and can't even benchmark 10 games. It was 6, with Tomb Raider and Cyberpunk being the most tested, and still not even close to HUB. Yeah, they had 4 productivity tests, but in total it's still nothing.

    • @BudgetGamerz
      @BudgetGamerz Год назад +1

      @@kiloneie But but LTT has a new lab...

    • @kiloneie
      @kiloneie Год назад +1

      @@BudgetGamerz So far Anthony said these results were from the lab, and they are as bad and limited as before... some other things from the lab look promising, but not GPU benchmarks...

    • @BudgetGamerz
      @BudgetGamerz Год назад

      @@kiloneie lol agreed.

  • @Deadsmegma
    @Deadsmegma Год назад +3

    Honestly this is the kind of performance we all wanted, now just wait for Nvidia to reduce its size in 2 years... That or they keep getting bigger

  • @mirceastan4618
    @mirceastan4618 Год назад +128

    The graphics card itself is remarkable and although it consumes a lot of power, it's nice to have the 4K/144Hz and RTX experience finally fully viable.
    The problem with the "4090" is that, beginning with this generation, it seems that Jensen simply doesn't care about lower-end gamers anymore, and intentionally cuts everything down to the point where the X90 class is the best value (or close to it) there is, not to mention the perfect storm with the recession upon us and non-US currency devaluation - yes, especially the Euro => which means something like a 4070 will cost 1100€. Sure, it's not nVIDIA's fault for this, but the "American" MSRP is crap to begin with.
    The 3060 Ti/3070/3080 vs 3090 seemed to be an exception/mistake he will never make again, unless people vote with their wallets at least right now.

    • @veduci22
      @veduci22 Год назад +4

      There is a huge market for $500 4050 cards because Nvidia is a brand... Even if AMD has products with a far better price/performance ratio, they will still have less than 20% of the market.

    • @davidszep3488
      @davidszep3488 Год назад

      Don't worry, buy an AMD or Intel GPU... or nothing, you have a card now, right?

    • @rayenwoomed5323
      @rayenwoomed5323 Год назад +17

      @@veduci22 Crazy to think that the GTX 980 launched around the $500 price point. Even with inflation, the 4050 should not be that expensive. 😔

    • @billpii6314
      @billpii6314 Год назад +1

      @@veduci22 Wrong

    • @LeegallyBliindLOL
      @LeegallyBliindLOL Год назад +1

      Slight correction: 4K 144Hz is not fully viable, because it can't run 4K 144Hz over DisplayPort 1.4. DP 1.4 only supports 4K 120Hz at 4:4:4.

  • @ComputerProfessor
    @ComputerProfessor Год назад +23

    Hold out guys! Let's not forget $599.99 would have got you a top-of-the-line GPU in 2008 called the GeForce 9800 GX2. Adjusting for inflation, today that would be $927.85,
    far below the 4090's price. Don't buy it and have them drop their prices, or buy their competitor AMD if their prices make sense
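    For reference, a minimal sketch of the inflation math quoted above; the CPI factor is back-derived from the commenter's own figures, not taken from an official table.
    ```python
    # Inflation-adjustment sketch; the CPI ratio is implied by the comment's
    # own numbers ($599.99 in 2008 -> $927.85 today), not an official figure.
    price_2008 = 599.99                 # GeForce 9800 GX2 launch price
    cpi_ratio = 927.85 / 599.99         # cumulative inflation factor, ~1.55x
    adjusted_price = price_2008 * cpi_ratio
    rtx_4090_msrp = 1599.00
    print(f"adjusted: ${adjusted_price:.2f}, 4090 MSRP: ${rtx_4090_msrp:.0f}, "
          f"ratio: {rtx_4090_msrp / adjusted_price:.2f}x")
    ```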

    • @saxoman1
      @saxoman1 Год назад +3

      I'ma definitely wait to see what AMD has to offer (as well as how it compares with the 4080 and the other "4080", really a 4070). Also, 24GB of VRAM has been the top SKU for 4 damn years now 😂, should be 48GB minimum by now 😡.
      But I won't lie, the performance looks insane; I wonder how emulation will fare 🤔.

    • @ChrisM541
      @ChrisM541 Год назад +2

      THIS !!!

    • @kiloneie
      @kiloneie Год назад +2

      That's how I am doing it. We bought two GPUs for 300€ (my dad bought them at the time, I was too young back then), one for each Crysis game, 1 and 2: a 3870 and a 6950 HD. Then it cost me 550€ for a 1070 Ti just a month or two before the RTX 2000 series, bloody crypto prices at the time. So now I am looking to max out at an absolute max of 650€, hopefully 600€, but in like a year or two from now; no money atm + whatever this recession is, if it's not artificially made yet again.

    • @Safetytrousers
      @Safetytrousers Год назад

      The 9800 GX2 did so much less than what the 4090 does. It's not apples to apples.

    • @ComputerProfessor
      @ComputerProfessor Год назад +2

      @@Safetytrousers As someone who was in computers during that time, it was the 4090 of its day.
      The next generation will be faster. Should you pay $2k for the 5090?
      $2.2k for the 6090?
      $2.5k for the 7090?
      There's a decent amount of people who understand silicon and yield rates and know they're trying to please their shareholders

  • @johndc7446
    @johndc7446 Год назад +11

    It would be nice to have H.264/H.265 video editing performance included in the standard benchmarks for GPUs. 3D compute/rendering benchmarks are also good, but having a video editing (playback/export) benchmark would be a good addition to standard GPU reviews.

  • @Isaax
    @Isaax Год назад +111

    Very short 18 second video but I enjoyed it nonetheless! More of this, thank you!

  • @HeirofCarthage
    @HeirofCarthage Год назад +26

    The performance is impressive but the 1080ti had a big jump like this over its predecessor and wasn't near this expensive. The price on this card is awful. $1600 best case is insane even for flagship. I typically buy flagship GPUs but not this gen. The price in this economy sucks all the excitement out of the performance for me at least.

    • @HeirofCarthage
      @HeirofCarthage Год назад +1

      Very well done video and benchmarks as usual! Top notch content.

    • @cptnsx
      @cptnsx Год назад +2

      The 1080 Ti was ~70% faster than the 980 Ti and was only $699.

    • @matteo964
      @matteo964 Год назад +2

      @@cptnsx it could many times double the performance really. Especially at qhd

    • @notapplicable7292
      @notapplicable7292 Год назад +1

      As GN has said many times, Nvidia will never make the mistake of the 1080ti again.

  • @excalibur3311
    @excalibur3311 Год назад +3

    The performance gain simply isn't that impressive. It's a 37.25% difference compared to the 3090 Ti. When that 4080-16GB comes out, you people will experience sheer disappointment.

  • @VelcroSnake93
    @VelcroSnake93 Год назад +36

    pretty dang impressive. Looking forward to seeing what AMD has coming after this.

    • @nishanthadda8824
      @nishanthadda8824 Год назад +2

      Nothing. They lost

    • @tyranorex940
      @tyranorex940 Год назад +2

      @@nishanthadda8824 AMD will always be behind in GPU game NOT even closeee

    • @vaudou_
      @vaudou_ Год назад +29

      @@tyranorex940 Gotta love fanboys

    • @damiendegrasse
      @damiendegrasse Год назад +13

      @@tyranorex940 literally hundreds of slides in this video show AMD being close in the last gen. Your comment is disconnected from reality.

    • @tyranorex940
      @tyranorex940 Год назад

      @@damiendegrasse AMD fanboys lol. Too bad you couldn't get your hands on a 3080 I'm sorry man

  • @vensroofcat6415
    @vensroofcat6415 Год назад +65

    That unboxing video was something of a next gen. On par with the product presented. Saving my time for free and in style, love this channel! 👍

  • @smiIingman
    @smiIingman Год назад +3

    I told myself I wouldn't get a 4090 (I've a 10700K + 3080 rig for 1440p gaming).
    But the hype got the better of me; I ordered a PNY XLR8 4090, which is supposedly just an FE but with beefier cooling. We'll see when it arrives.
    I'm very excited; I imagine my Fractal Design ION 860W Platinum PSU should be fine at stock speeds.

    • @saxoman1
      @saxoman1 Год назад

      Similar for me, except my cpu is 5900x, I have a 144hz 4K monitor that my 3080 can't fully drive on high settings, but THIS thing is literally twice as fast, I never thought I'd spend $1,800 (after taxes, Gigabyte gaming OC) on a GPU, but here we are 😂😂 (luckily, now I can afford it)

  • @PixelBlitzXP
    @PixelBlitzXP Год назад +11

    I just bought an EVGA 2080ti ftw3 off eBay for $420. I'll be sticking with that card for the foreseeable future as it plays everything I want on ultra settings and I'm only playing on a 1080p monitor.

    • @badz1497
      @badz1497 Год назад +5

      2080ti for 1080p is overkill anyway

    • @PixelBlitzXP
      @PixelBlitzXP Год назад +1

      @@badz1497 I know.

    • @AdiiS
      @AdiiS Год назад

      @@badz1497 no it's not and it will be much worse in a few months with new games, it will be just enough for 1080p60.

    • @tranquil14738
      @tranquil14738 Год назад

      420? I would’ve thought it would be cheaper. 4 year old gpu man

    • @PixelBlitzXP
      @PixelBlitzXP Год назад

      @@tranquil14738 for a 2080 ti? right now that's a steal dude. Go look at the prices. And it's more powerful than a 3070.

  • @AvalanchCXVII
    @AvalanchCXVII Год назад +65

    Woah, the jump in perf/power is actually spectacular. A pleasant surprise, this card knocks the performance part out of the park. Price... eh. Will wait and see the lower end cards for that.

    • @colbyboucher6391
      @colbyboucher6391 Год назад +3

      Of course it needs to chug 400 watts to do it

    • @Nerdywr
      @Nerdywr Год назад

      @@colbyboucher6391 Man, you're stupid af lol. Steve literally already showed the frame/watt is almost 50+% better than the 3090 Ti. It only chugs 400W because the games are run with frames uncapped at 4K. Stupid as fuck.

    • @RobBCactive
      @RobBCactive Год назад +1

      What jump???? There's only a jump if you select Nvidia's most offensively abusive crypto bubble height Ti cards.
      Given the 6900xt is $699 now, the price/perf has actually tanked

    • @markhackett2302
      @markhackett2302 Год назад

      That isn't perf/power: the 4090 could be 75% idle while the 3090 is 25% idle, so if you did perf/watt at "fully used", the 4090 would lose again (at those figures) because it is sitting idle a LOT more than the previous gens. It becomes comparable with Navi 31; theoretically both AD102 and Navi 31 should be equally idle at that task, and perf/watt would compare. As it is, it compared the 3090 with the 6950, and we already knew the 3090 was cranked to the limit, so it isn't much of a surprise there. If the same test were done between the 3080 (top end, not halo) and 6800/XT (again, top end, not halo), then we would see the actual architecture being tested, whereas the halo can "excuse" itself as unrepresentative of that architecture, so losing isn't indicating anything other than that the 3090 was heavily overclocked, not that Nvidia were poor at optimisations.

    • @Blad071
      @Blad071 Год назад

      @@RobBCactive you AMD fanbois are actually insufferable. Nvidia could make something that is 4x on performance for next gen for the same price and you would still find something bad about it. Not an Nvidia fan myself but they actually made something amazing but you Muppets can't let go of your stupid hateboner.

  • @felix123418
    @felix123418 Год назад +39

    Great video!
    I think a comparison between PCI-E 3.0 and PCI-E 4.0 performance would be interesting. I mean nvidia didn't jump to 5.0 so I wonder if there even is a difference between 4.0 and 3.0.

    • @berengerchristy6256
      @berengerchristy6256 Год назад +8

      Not for gaming

    • @henryvaneyk3769
      @henryvaneyk3769 Год назад +6

      No point at all. GPUs, even one such as this, do not come near needing the bandwidth of PCI-E 4.0.

    • @mexmer3223
      @mexmer3223 Год назад +1

      Faster PCIe has meaning only when you need to move large amounts of data, which technically makes no sense for games (not to mention ReBAR rectifies a lot of data latency issues for asset loading). When we're talking ML/AI, in that case you need high bandwidth... but that's more suitable for the former Quadro cards or the new RTX Axxxx cards. Other peripherals benefitting from higher bandwidth are NICs and RAID controllers, but even high-end gaming GPUs have a hard time bottlenecking PCIe 3.0 @ 16x

  • @radofjc8242
    @radofjc8242 Год назад +63

    I am really impressed by this performance no matter the power usage, though the MSRP brings back memories of the scalpocalypse...

    • @surft
      @surft Год назад +1

      Those who bought 3090TI's at $2k early in the year though, are probably kicking themselves.

    • @Jaynan2
      @Jaynan2 Год назад

      @Eric Nope since 4090Ti will be overkill.

    • @alexxxvill69
      @alexxxvill69 Год назад +3

      @@Jaynan2 The 4090 Ti will most likely start at $2,000. That's a $400 difference. 4090 buyers won't kick themselves. At most, it'll be a 10-15 fps difference. Not worth it.

    • @alexxxvill69
      @alexxxvill69 Год назад

      @Eric Its not overkill for 4k.

  • @60DollarCodger
    @60DollarCodger Год назад +35

    What a packed review! Thanks, Steve 👍
    30:10 "Ray Tracing is finally a carefree option"
    Knowing the great conversations HUB has had about this 'moment', that comment stands out.

    • @Hardwareunboxed
      @Hardwareunboxed  Год назад +1

      Thank you Matt, as always your support is much appreciated.

  • @josh0156
    @josh0156 Год назад +10

    Thanks for reviewing the 4090 on the 5800x3D. I'm sticking with mine at least until I can see how the 7800x3D performs.

    • @jonaslarsson1761
      @jonaslarsson1761 Год назад

      Same here. Receiving my Suprim X 4090 tomorrow, and I hope that my X3D will keep me GPU bottlenecked in most games

    • @oz_steve9759
      @oz_steve9759 Год назад

      @@jonaslarsson1761 Going to be doing exactly the same as you: keep the 5800X3D and see what the 7800X3D can bring to the table.
      I am only interested in VR gaming at approx 4K and may discover that my current CPU and the 4090 are all I need to max out the VR headset.

  • @TheIronArmenianakaGIHaigs
    @TheIronArmenianakaGIHaigs Год назад +11

    It seems like even if you have a top-of-the-line PC, if you want to play games you're maybe better off going for 4K 144Hz than 1440p 240/360Hz, thanks to CPU bottlenecks.
    (Then again, with the lack of DisplayPort 2.0 on the cards you can't do 4K 144Hz, only 4K 120Hz. Really dumb move from Nvidia.)

    • @shotgunjohn3
      @shotgunjohn3 Год назад +1

      8k 500 FPS War Thunder content bby

  • @The_Fat_Turtle
    @The_Fat_Turtle Год назад +47

    Nice to see huge generational gains, hope the 4060 isn't cut down too much and ends up performing well at a decent price but that's asking too much.

    • @Thor_Asgard_
      @Thor_Asgard_ Год назад +33

      You know that the 80 is already more than 50% cut down ^^ There is no price/performance increase this gen for midrange. It's a joke.

    • @nsuinteger-au
      @nsuinteger-au Год назад +25

      "at a decent price". You lost the plot there

    • @nerijus7
      @nerijus7 Год назад

      4060 should cost about $700 but its only prediction.

    • @lagarttemido
      @lagarttemido Год назад +25

      The "4080 12GB" is your 4060 buddy.

    • @Dark.Syndicate
      @Dark.Syndicate Год назад +4

      @@lagarttemido yeah. still weird seeing these people expecting nvidia to release decently priced good gpus. it seems like such people live under a rock!

  • @Rossco1337
    @Rossco1337 Год назад +6

    My biggest takeaway from this is how well the 6950 XT stacks up at 1440p. In the UK, it only costs $680 USD before tax (Scan). It gets most of the way there at well under half the price.

    • @TheGuy..........
      @TheGuy.......... Год назад +1

      4090 is cpu limited at 1440p and 1080p

    • @babochee
      @babochee Год назад

      @@TheGuy.......... which is nvidia's problem for not making cpus

  • @oldairpatino757
    @oldairpatino757 Год назад +13

    Ahhh so it begins

  • @youtubevanced4900
    @youtubevanced4900 Год назад +7

    A $900 4070 makes the entire mid-high end of the 4000 series an utter failure.

    • @raresmacovei8382
      @raresmacovei8382 Год назад

      4060*

    • @ypsilondaone
      @ypsilondaone Год назад

      More like 4060ti..

    • @nipa5961
      @nipa5961 Год назад +2

      Just wait for the $800 4080 10GB and $700 4080 8GB.

    • @youtubevanced4900
      @youtubevanced4900 Год назад

      The $900 4070 in Australia is over $1600.
      The 3080 launch price was under $1200.
      Not only is the US price ludicrously expensive, the Australian price gouging is even more extreme than last generation.
      The currency conversion rate is not 2:1.

    • @nipa5961
      @nipa5961 Год назад +2

      @@youtubevanced4900 It's similar in Europe.
      The 3080 MSRP was 729€ at launch.
      The 4080 MSRP is 1469€ here.
      More than double the price for an insanely cut down "80" card...

  • @Krenisphia
    @Krenisphia Год назад +23

    The best thing to me is not the performance, but rather the improved efficiency. Much better than I expected.

  • @Azhureus
    @Azhureus Год назад +5

    2.5x-4x better performance, yeah right Nvidia, calm down! I see mostly around +50-60% in performance and sometimes not even that... for €1600? That's thievery! Btw Hardware Unboxed, props to you for showing Hunt: Showdown, that's a big big plus, you won me over :)

  • @Lue1337
    @Lue1337 Год назад +14

    These are some awesome results, now it's time to wait for the AMD counterpart, it's gonna be interesting seeing the first chiplet design GPU results.
    I hope we can get 1080p 240hz OLED and 1440p 240hz OLED monitors very soon with the new standards of performance.

    • @nishanthadda8824
      @nishanthadda8824 Год назад

      AMD is dead.

    • @SpeedKillersGaming
      @SpeedKillersGaming Год назад

      @@nishanthadda8824 If AMD cannot deliver ray tracing performance as good as or better than Nvidia's, then it is dead for me

    • @nishanthadda8824
      @nishanthadda8824 Год назад

      @@SpeedKillersGaming yeah we buy 4090

  • @tharun7290
    @tharun7290 Год назад +3

    Okay hear me out for a second guys, Moving from Samsung 8nm to TSMC's flagship 4nm process (with the current state of the market) means that a significant cost increase is kinda expected tbh. I'd guess that doubling the transistor count also adds significantly to cost. I'm sure they still make nice margins, but the price may not be as outrageous as we think. What really matters is the pricing of the 60 and 70 SKUs, this is just a halo product for the rich.

  • @Frequincy100
    @Frequincy100 Год назад +14

    Very impressive, but I will be waiting for RDNA 3. I would be interested in a cost per frame for RT results.

  • @izidor
    @izidor Год назад +5

    We've arrived at 4K without any downsides, except price and power usage.

  • @artofwar420
    @artofwar420 Год назад +12

    I just got a 4090. Luckily at retail price. It is as fast as he says. Makes ray tracing at 4K doable at 120fps. Frame generation is a powerful technology. It’s ridiculous how it works, so yes there can be tearing because at the moment there is no vsync with FG. Which means the more you push the card the smoother it feels. Basically you’ll find yourself turning on everything including RT to the max.

  • @mordax7443
    @mordax7443 Год назад +9

    Would be really interesting to see Alder Lake vs zen3d vs zen4 comparison using the RTX4090!

  • @Fistafandula
    @Fistafandula Год назад +14

    2 generations from now sounds like a nice future. Hanging on to 5700xt a while longer.

    • @Aleph-Noll
      @Aleph-Noll Год назад

      me side eyeing my gtx970 >.> hold on baby for a little while longer

  • @zvonimirkomar2309
    @zvonimirkomar2309 Год назад +27

    Thanks for the great video Steve. Apart from re-benchmarking the CPUs with the 4090, I'd also like to see if PCIe 3.0 X16 is now a bottleneck for this video card. Cheers once again.

    • @colemin2
      @colemin2 Год назад

      Good question.

    • @lagarttemido
      @lagarttemido Год назад

      It should be on 1080p and lower I guess.

    • @zvonimirkomar2309
      @zvonimirkomar2309 Год назад

      @@lagarttemido I don't know, it needs to be tested. 3090ti was very consistently about 3fps slower on all resolutions on PCIe 3.0 vs 4.0.

    • @jyrolys6
      @jyrolys6 Год назад +1

      @@zvonimirkomar2309 that could be more to do with 4.0 having a higher frequency for tighter update cycles. Unlikely to be a result of bandwidth alone.

  • @Theafter.
    @Theafter. Год назад +16

    In my opinion, if the 7900 XT offers similar performance I see no reason to get a 4090. The 7900 XT comes with DisplayPort 2.0, which with this much power is definitely a must: you're averaging over 120 fps at 4K, but DisplayPort 1.4 maxes out at 120Hz while the 4090 is at 140 fps on the ultra preset, and most people will be playing high rather than ultra, meaning you're going to be leaving an extra 40 to 50 fps on the table because the 4090 only has DisplayPort 1.4. A rough bandwidth check follows below.
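    A back-of-the-envelope check of the DisplayPort 1.4 limit mentioned above; blanking overhead and DSC are ignored, so treat this as a sketch rather than the exact spec numbers.
    ```python
    # Rough DisplayPort 1.4 bandwidth check; ignores blanking overhead (which
    # adds a bit more) and DSC compression (which is how 4K 144Hz is actually driven).
    DP14_EFFECTIVE_GBPS = 32.4 * 8 / 10   # 4 lanes x 8.1 Gbps HBR3, minus 8b/10b coding

    def uncompressed_gbps(width, height, refresh_hz, bits_per_pixel):
        """Raw data rate for active pixels only, in Gbit/s."""
        return width * height * refresh_hz * bits_per_pixel / 1e9

    for hz in (120, 144):
        needed = uncompressed_gbps(3840, 2160, hz, 24)   # 8-bit RGB 4:4:4
        verdict = "fits" if needed <= DP14_EFFECTIVE_GBPS else "does not fit"
        print(f"4K {hz}Hz needs ~{needed:.1f} Gbps vs {DP14_EFFECTIVE_GBPS:.1f} Gbps -> {verdict}")
    ```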

    • @gblessbacon
      @gblessbacon Год назад +1

      Geez, you'd think with their announcement of Remix that they would have included 2.0 to future-proof this card. Obviously it can't output 4K 240Hz in expensive modern games, but older games with an RT remix might be able to. Not that I would ever buy this card, but that is actually a big deterrent.

    • @Gaboou
      @Gaboou Год назад +1

      There is always Display Stream Compression which I use for 4K @144 Hz and it works beautifully. 8-)
      It's visually lossless and so far I haven't seen any artifacts or other problems with it.
      But yeah - not including DP 2.0 is still a bummer... :(

    • @Theafter.
      @Theafter. Год назад +1

      @@Gaboou Yea, Fair point but still definitly sucks that a card with this power does not have it

  • @frankieinjapan
    @frankieinjapan Год назад +21

    Wow this went way different than I was expecting. This is an insane jump in performance I'm genuinely impressed. I haven't been this impressed since the 1080ti release

    • @Old-Boy_BEbop
      @Old-Boy_BEbop Год назад +1

      My bank balance is saying the same thing about the jump in price.

    • @raresmacovei8382
      @raresmacovei8382 Год назад +9

      If it was 500-700 USD, it would've been cool.
      At 1600$ / 2000€ before tax, AHAHAHA

    • @adarion2994
      @adarion2994 Год назад +2

      Many expected the 4090 to be good; it's for enthusiasts and it's really good. The real issue is probably going to be the 4080s; they just sound like huge cuts of the 4090, except for the price.

    • @frankieinjapan
      @frankieinjapan Год назад +3

      @@raresmacovei8382 Oh absolutely. They just artificially added the xx90 class cards so they can sell the xx80 silicon for Titan prices. The entire premise is pathetic and degrading. I miss getting the top-tier card for $599.

    • @VoldoronGaming
      @VoldoronGaming Год назад +1

      At 200 watts more power and 3 times the price.

  • @jdnaveen321
    @jdnaveen321 Год назад +1

    I thought the 20-to-30 series was the big leap in graphics, but this 30-to-40 jump is even crazier, damn

  • @madd5
    @madd5 Год назад +37

    This is INSANE!
    I was expecting like 30-35 percent in pure rasterization performance. Nvidia must know what AMD are coming up with. Radeon 7000 will be insane too.

    • @MrSanbonsakura
      @MrSanbonsakura Год назад

      It has been known that the performance gain would be 60-80% for about 6 months; look at the channel Moore's Law Is Dead

    • @kiloneie
      @kiloneie Год назад +4

      Last year AMD already won rasterization at $500 less, so if the chiplet design beats the 4090 even harder, we will hopefully see, with FSR 2.0, a much greater shift towards AMD (not for RT though, but what percentage of all people gaming actually care about RT?). Quite confident AMD can beat them in everything but RT, and the pricing will either be $1000 again or $1100 based on the RX 6950 XT, hopefully $1000... They can't afford to start following Nvidia's pricing like that, not when the GPU market share and year-to-year sales do not favor them at all, unlike how Ryzen crushed Intel, which wouldn't have happened if Intel didn't find themselves in the 10nm hell like they did, along with many other problems.

  • @mwales2112
    @mwales2112 Год назад +4

    Some are now on newegg with pricing at $2649.00 to $3199.00... Even at $1600 they can keep it... Very happy with my 6700XT....

  • @NostradAlex
    @NostradAlex Год назад +20

    To sum it up the 4090 is the first card that can actually play games at 4K.

    • @xpodx
      @xpodx 8 месяцев назад

      Older games are still fun and amazing. My 3090 plays 4K 144Hz nicely in many, if not most, of my titles. Though I wanna save for the 5090

    • @deadscene1
      @deadscene1 8 месяцев назад

      A 6900 XT can easily play at 4K

    • @xpodx
      @xpodx 8 месяцев назад

      @deadscene1 Depends on the game, settings, and target fps, yes. But the 4090 is the best option.

  • @rmgaminguk7079
    @rmgaminguk7079 Год назад +32

    These uplifts are insane. I only play at 1080p with my gtx 1080 and was thinking of an upgrade, but I think I'll wait a year till the 4050 and 4060 are out and jump to 1440p at the same time.

    • @MLWJ1993
      @MLWJ1993 Год назад +7

      1440p is definitely about to become the new 1080p 😛
      I wonder how this handles Doom Eternal @ 8k 🤔

    • @deathtoinfidelsdeusvult2184
      @deathtoinfidelsdeusvult2184 Год назад +1

      @@MLWJ1993 I can't even see 1440p able to go top 5 in the most used resolutions. I guess it's still 5 years to go to become one.

    • @wobblysauce
      @wobblysauce Год назад

      1080/4K is good scaling 1to1.
      But some like the 1440p look

    • @Splarkszter
      @Splarkszter Год назад +1

      *Cries in 720p*

    • @aaz1992
      @aaz1992 Год назад +1

      4060 should be great. Or maybe consider a 3060Ti or 3070 at a discount

  • @CataclysmZA
    @CataclysmZA Год назад +7

    I really appreciate the work that's gone into those cost per frame graphs. Excellent review, and I have no problems linking this video to anyone who wants to know if they should get one.

  • @Matticitt
    @Matticitt Год назад +66

    I'm now very interested to see how the 4080 performs. Awesome review, Steve, and awesome card.

    • @freelancerxxx
      @freelancerxxx Год назад +16

      You mean 4070 pretending to be 4080 😁

    • @robosergTV
      @robosergTV Год назад

      @@freelancerxxx no, there are two versions of 4080s

    • @KingZeusCLE
      @KingZeusCLE Год назад +19

      ​@@robosergTV No, the lower tier "4080" is literally a 70 series die. It's not cut down, it's completely different from the higher tier 4080. Nvidia misleading customers....

    • @zvonimirkomar2309
      @zvonimirkomar2309 Год назад +3

      I'm concerned there'll be a huge gap in performance between this and the 4080 12GB and lower. Which means mid-range price tier gamers will get screwed again.

    • @Matticitt
      @Matticitt Год назад +2

      @@freelancerxxx i mean the actual 4080, not the fake 12gb one

  • @syntrx8185
    @syntrx8185 Год назад +6

    I'm probably gonna buy this next year. This could last me ~5 years in 1440p high refreshrate gaming. The reason why I'm not going 4k yet is because I'm waiting for the monitor market to mature, what with QD-OLED.

    • @MrMastadox
      @MrMastadox Год назад +4

      Why buy it at all? Seems like 4K is really what this is for. And by the time QD-OLED screens become mainstream, you will be buying a new system, monitor and GPU anyway. And by that time a midrange GPU will perform like this 4090. Unless you just have money to waste. Seems this card gets bottlenecked at that resolution quite a lot. Seems a waste of money.

    • @syntrx8185
      @syntrx8185 Год назад +1

      @@MrMastadox I won't be buying the next 50 series and RDNA 4 GPUs, I'll just skip straight to the 60 series/RDNA 5. By that point I'll probably switch to 4k or higher resolution. The rig I'm currently rocking is an R7 1700 and a GTX 1080. Seeing as AMD will be supporting AM5 at least to 2025, I'm pretty comfortable buying. There's a QD-OLED monitor currently in the market, the Alienware AW3423DW. Also I'll have less of a chance of just upgrading to the next gen if I have the 4090.

    • @MrMastadox
      @MrMastadox Год назад

      @@syntrx8185 Up to you. But it seems you stepped into the early ryzen am4 platform. And are now looking to go to AM5. Seems like the upgradability of AM4 was never used by you. Do you think you will with AM5? It does seem like everything is extra expensive right now. Might be smart to wait a bit. And see what amd's x3d processors will be like if you plan on going 4090

    • @syntrx8185
      @syntrx8185 Год назад

      @@MrMastadox I'm definitely waiting for the x3D versions, considering how good the 5800x3d was. And do you think it's better to upgrade all at once or upgrade piece by piece and sell off the old components?

    • @MrMastadox
      @MrMastadox Год назад

      @@syntrx8185 piece by piece? A 4090 on a 1700 ryzen is stupid. Especially because you can't make use of its power. And as time goes by the 4090 will only get cheaper. Bit by bit upgrade is useless. Your system is so old that you will not get much money for it anyway. So sell it or not. It will not make a huge difference.

  • @Many_Mirrors
    @Many_Mirrors Год назад +11

    The RTX 4090 is 2500€ in Germany. A couple of years ago I could've built 2 very decent gaming PCs for that money. Now I get a chonky compact space heater.

    • @JP-xd6fm
      @JP-xd6fm Год назад

      From 2200€ in Spain

  • @mrbean30392
    @mrbean30392 Год назад +7

    Wonderful job Steve.
    For the Watch Dogs RT performance, you mention that the DLSS bottleneck is 109 FPS.
    Wouldn't it be more correct to say the CPU bottleneck with RT enabled only allows for 109 FPS max? As we have seen from many RT titles, RT increases the demand on the CPU as well as the GPU.
    I would love to see how the 7700X + 4090 perform in these RT CPU-bottlenecked scenarios.

  • @KMPMOCS
    @KMPMOCS Год назад +32

    That thing is beautifully sculpted. But still I'm eager to see what AMD will offer this year.

    • @hiphophead8053
      @hiphophead8053 Год назад +3

      No chance of matching the 4090

    • @yamsbeans
      @yamsbeans Год назад +1

      @@hiphophead8053 might match it when ray tracing isn’t on

    • @hiphophead8053
      @hiphophead8053 Год назад

      @@yamsbeans Impossible. Nvidia went from Samsung's 8nm (which is like TSMC 12nm in density) to TSMC 4nm! AMD is going from 6nm to 5nm lol, it's just physics

    • @inspirer4763
      @inspirer4763 Год назад

      Disappointment, as always.

    • @Danny_On_Wheels44
      @Danny_On_Wheels44 Год назад +1

      @@hiphophead8053 So I will still get AMD, it doesn't need to top the 4090.

  • @HardwareForGamers
    @HardwareForGamers Год назад +6

    Now, if it wasn't the price of a high end gaming system I would be more impressed.🤔

  • @ericlawrenceq
    @ericlawrenceq Год назад +1

    2 questions: (1) when RDNA 3 arrives, will @Hardware Unboxed retest these with a new system (PCIe 5, DDR5, Zen 4)? ... (2) also when RDNA 3 arrives, is DisplayPort 2.0 vs 1.4a really a thing?

  • @adi6293
    @adi6293 Год назад +17

    This looks like the very last card you would need for 1440p for a very long time 😜

    • @darkmanure
      @darkmanure Год назад

      You shall not get another card for 10 years at least!

    • @HeretixAevum
      @HeretixAevum Год назад

      @@adrenalinejunkie3828 I think this card costing 1K more than the 1080ti is a bit more relevant to it not being too good than lacking a new DP.

  • @gyunhigyu
    @gyunhigyu Год назад +14

    I just bought a 6800 XT used, although in perfect condition, and it works great. Looking at the 4K graphs I'm happy that I bought this for 465€ vs the 4090 that goes for 2000€ -> double the performance but for 4x the price...

    • @Quade12X
      @Quade12X Год назад +3

      Well still be happy with it regardless, but a 7800 XT will probably be largely better price/performance (time will tell but a 90 series card versus your card isn't a good comparison). This card has 24GB of VRAM, not really a card anyone needs in 2022-2023.

    • @skywalker1991
      @skywalker1991 Год назад

      AMD won't be able to beat this monster.

  • @portatil8676
    @portatil8676 Год назад +1

    17:15 Nice, I'm getting basically the same performance with my i7-13700K! Loving this GPU.

  • @madhouse8337
    @madhouse8337 Год назад +4

    that thumbnail and intro is top notch xdxdxdxdxxd

  • @RecBr0wn
    @RecBr0wn Год назад +9

    Amazing advancements in technology. I might just pick one up in a couple of years if the price comes down a bit

    • @undeny
      @undeny Год назад

      Might not be worth it then... power consumption will surely improve, and by then we might have newer technologies than DLSS 3 as well

  • @Alirezarz62
    @Alirezarz62 Год назад +1

    For some reason I didn't get any recommendations or notification for this video from YouTube; I had to search for you guys to find the video

  • @raybrown6165
    @raybrown6165 Год назад +7

    Great review, as usual. The question is, have we reached a point where the card's capability at 4K has exceeded the capabilities of any monitor? Thinking it is like having a car capable of 180 MPH and living in New York City with a 25 MPH speed limit.

    • @DravenCanter
      @DravenCanter Год назад +2

      The highest refresh rate 4k monitor is 240 hz.

    • @damiendegrasse
      @damiendegrasse Год назад +1

      No

    • @MLWJ1993
      @MLWJ1993 Год назад

      Depends on if you factor in features like DLSS 3.0
      If you don't the answer is: it depends. Some games actually might go beyond 4k high refreshrate monitors.

  • @MrRedRye
    @MrRedRye Год назад +13

    While the impact of RT is still brutal, the 4090 without DLSS matches or exceeds the 3090ti with DLSS. At least that shows some improvement to the RT pipeline; something that wasn't present from Turing to Ampere. I was never going to buy the 4090 but I will say that purely on the performance basis, this card is very impressive.

    • @jayjaytp10
      @jayjaytp10 Год назад

      Yeah matches it in 2 games out of 27 games

    • @MrRedRye
      @MrRedRye Год назад

      @@jayjaytp10 what are you on about? The 4090 with "RT Ultra" matched or beat the 3090ti with "RT Ultra + DLSS" in all the games they did that comparison for in the video at 1440p and 4k

  • @Aszourus
    @Aszourus Год назад

    I want everyone to realise that here in the Netherlands this card costs more than 2x 42-inch LG C2 OLEDs. And I'd even have cash left to buy a mount for both. This is insane.

  • @jensenhuangnvidiaCEO
    @jensenhuangnvidiaCEO Год назад +12

    If I had a dollar for every GPU I sold last year, I would have $561 million dollars.

  • @Sakosaga
    @Sakosaga Год назад +10

    This card looks scary, I know we're thinking CPU bound, but like I would like to see this card against Intel and AMD CPUs a few years from now because last gen Nvidia cards looked like this during release but we didn't really see such change until 5800X3D came out. So now I think we're gonna have to wait until we see a few more generations of CPUs to see exactly the limits of this card.

    • @berengerchristy6256
      @berengerchristy6256 Год назад

      This card is a waste of money for gaming if you’re below 4k

    • @heruvim8887
      @heruvim8887 Год назад

      @@berengerchristy6256 Yeah, people were saying the same for the 3080/3090; look at them 2 years later. Stop thinking you buy a 4090 to play 2 months with it lmao... no card is too much for 1440p.

    • @berengerchristy6256
      @berengerchristy6256 Год назад

      @@heruvim8887 no one was saying that

  • @krandeloy
    @krandeloy Год назад

    I had the video in an open tab for several hours before I had time to actually watch it.
    That timing on the shift to black screen when the video re-loads was absolutely perfect for that intro.

  • @GAKtion64
    @GAKtion64 Год назад +26

    Steve, this is truly impressive what you found, and I also want to thank you for all your hard work in checking out what Nvidia's engineers were able to do here with this GPU. I'm surprised how fast this GPU is, except for its price. Shame they are charging 1600 dollars for this card.
    Anyway, looking forward to your AMD Radeon 7000 series GPU reviews in the future!

  • @tanmay5570
    @tanmay5570 Год назад +9

    A review containing all types of advice for all classes of people... Great job HUB & not to mention Nvidia (without the price point)!

  • @Vexiong
    @Vexiong Год назад +4

    What really impressed me about these graphs was the performance of the 6900 xt and 6950 xt

    • @GregPolkinghorne
      @GregPolkinghorne Год назад +2

      Same. For some reason I had it in my head that they were a competitor to the 3080. Not trading blows with a 3090ti.

  • @Link0402
    @Link0402 Год назад +6

    That's pretty nuts. Hope AMD can compete in this generation, else pricing will remain truly fucked for years to come.

    • @TrueThanny
      @TrueThanny Год назад +5

      Based on leaked specs, it's more likely to be a massacre than a competition. In favor of AMD.

    • @MLWJ1993
      @MLWJ1993 Год назад

      @@TrueThanny Specs hardly tell the story though. Zen 4 already shows an issue using multiple CCXs in a game. It might hold up well for productivity; anything for gaming definitely remains to be seen & challenges with connecting tiles are very much a real thing, with sometimes disastrous consequences depending on the content.

    • @TrueThanny
      @TrueThanny Год назад

      @@MLWJ1993 There's like one game which has a problem with dual-CCD Zen 4 chips. It's an issue with the game, not the processors.
      And that has nothing whatsoever to do with the specifications of AMD GPU's.

    • @MLWJ1993
      @MLWJ1993 Год назад

      @@TrueThanny Connecting different parts is a real challenge. Just as much as executing games is on multiple cores.
      Saying it's a software problem doesn't help making the experience less troublesome...

  • @florianwiegand1313
    @florianwiegand1313 Год назад +5

    I am a casual gamer and bought it simply because i just want to enter the game menu, set everything to ultra and not care. Great review!

  • @Dellerss
    @Dellerss Год назад +2

    5-6 times more expensive than the top card I bought in 2011. Inflation in the same period has been a bit over 30%. I really don't know who these are for.
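    A rough real-terms version of that comparison; the 5.5x figure is just the midpoint of the commenter's "5-6 times", and the ~30% inflation is also their number, not a looked-up statistic.
    ```python
    # Converts the nominal price multiple above into inflation-adjusted terms;
    # both inputs come straight from the comment, not from price records.
    nominal_multiple = 5.5      # midpoint of "5-6 times more expensive"
    inflation_factor = 1.30     # "a bit over 30%" cumulative inflation since 2011
    print(f"~{nominal_multiple / inflation_factor:.1f}x more expensive in real terms")
    ```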

  • @LAG09
    @LAG09 Год назад +5

    I don't think you should just look at the benchmark performance difference to lower end cards, the RAM is absolutely a factor. Specially for people who want to do stuff like neural net and other AI jobs on it. The ability to run much higher res/detail models with the 24GB of memory was what got me into looking for a 3090 and to lowball a scalper in the summer of 2021 hoping he'd accept for some reason. He probably didn't think I had the money lowballing him to near-MSRP and was probably just as surprised when I paid him without delay as I was when he accepted my offer at barely above MSRP.

    • @drago939393
      @drago939393 Год назад

      What exactly is this about RAM?

    • @LAG09
      @LAG09 Год назад

      @@drago939393 I refer to it as "RAM" rather than "VRAM" because in compute you are using it as such.

    • @drago939393
      @drago939393 Год назад

      @@LAG09 I get that, I was asking how the (V)RAM works for neural/AI things.

    • @LAG09
      @LAG09 Год назад

      @@drago939393 It allows you to work on bigger or more dense datasets resulting in higher quality output.

    • @drago939393
      @drago939393 Год назад

      @@LAG09 Hm, how does regular RAM factor into it?

  • @madgodzilla12465
    @madgodzilla12465 Год назад +24

    The 4090's performance is incredible. Although I don't see my 6900xt holding me back on anything anytime soon. This kinda reminds me of when the 1080ti came out and how much faster it was than everything haha

    • @8Paul7
      @8Paul7 Год назад +11

      If only 4090 was 699 bucks like 1080Ti was.

    • @Birdman._.
      @Birdman._. Год назад +1

      @@8Paul7 It's a flagship GPU, it doesn't make sense.
      But then the 4080 Ti isn't $699 either (btw the 1080 was $699, the Ti one was $799)

    • @8Paul7
      @8Paul7 Год назад +1

      @@Birdman._. I googled it and found that 1080 Ti was 699.

    • @bb5307
      @bb5307 Год назад

      @@Birdman._. 980Ti flagship was 649, 1080ti flagship was 699.

    • @justinvanhorne8859
      @justinvanhorne8859 Год назад +1

      Exactly! By the time I "wanted" a 1080 Ti, it was already years old. It's neat being at a point in my life where I can justify these purchases, but also knowing to limit myself.
      Oh, and a review note:
      yes, there is in fact a Founders Edition 3090 Ti... I've held one. Probably just not available in AUS like the 4090 FE... if that's the justification for not using it in your review for apples-to-apples comparisons, why are you reviewing the 4090 FE?

  • @shaunmadden545
    @shaunmadden545 Год назад +5

    I always come to hardware unboxed for the first good review. Incredible job Steve.

  • @WamblyHades
    @WamblyHades Год назад +5

    The performance upgrade with this new 40 series architecture is impressive. Shame that the price-performance ratio* will be basically the same as before...
    *edit: with the 4080 and lower range cards.

    • @hiphophead8053
      @hiphophead8053 Год назад +1

      In which universe is price/performance the same ? The rtx 3090 had an msrp of 1500$

    • @gurjindersingh3843
      @gurjindersingh3843 Год назад

      @@hiphophead8053 At current prices

    • @cptnsx
      @cptnsx Год назад +1

      @@hiphophead8053 it should have NEVER been 1499.

    • @hiphophead8053
      @hiphophead8053 Год назад

      @@gurjindersingh3843 haha yeah on that logic in 2 years time the rtx 4090 will also be around 1000$. you compare at msrp

    • @hiphophead8053
      @hiphophead8053 Год назад

      @@cptnsx sure but that was the msrp. Not only was that the msrp but people were paying 2k for it due to mining

  • @rare6499
    @rare6499 Год назад +34

    I never thought we would see these kind of performance increases in a single generation again. This is like the good old days. Amazing.

    • @HAHA.GoodMeme
      @HAHA.GoodMeme Год назад +3

      the power of going from Samsung to TSMC

    • @omerahmer6867
      @omerahmer6867 Год назад +9

      this aged horrendously

    • @rare6499
      @rare6499 Год назад +14

      @@omerahmer6867 no it didn’t. The 4090 is the standout card of the whole generation.

    • @omerahmer6867
      @omerahmer6867 Год назад +6

      @@rare6499 i’m talking about the 4060 ti having no improvement over the 3060 ti

    • @rare6499
      @rare6499 Год назад +18

      @@omerahmer6867 but what’s that got to do with a review of the 4090?

  • @crylune
    @crylune Год назад

    0:00 holy crap, hardware actually being unboxed on Hardware Unboxed

  • @crewmaster6014
    @crewmaster6014 Год назад +7

    Coming from Gaming Unboxed stream, pretty hyped for something so unaffordable.

    • @MrDvneil
      @MrDvneil Год назад

      hold my friend hold!! 😣 prices will go down
      myself holding the gtx 970

    • @crewmaster6014
      @crewmaster6014 Год назад

      @@MrDvneil Holding on to a GTX 970 and still with hope for some lower prices, you are the king. Can I have some of your trust?

  • @TheSnake0444
    @TheSnake0444 Год назад +8

    This is by far the best channel to watch for detailed benchmarks and easy-to-follow technical info; the presentation is always top notch. GamersNexus can learn something from this.

    • @LiveBenchmarks
      @LiveBenchmarks Год назад

      They definitely like shilling for Nvidia often, and their benchmarks need a bit more skepticism based on my own data

  • @Mootlips
    @Mootlips Год назад +1

    Weird that we live in a world where 30-40% gains over last gen are referred to as disappointing lol.

  • @TecnoguiaCol
    @TecnoguiaCol Год назад +4

    The best and most complete benchmarks.. thanks guys!

  • @gurshair
    @gurshair Год назад +8

    Thank you for all your work.
    I always appreciate and come for your extensive benchmarks. It doesn't matter if your video is a week late, I'll always watch it as you provide information others don't.

  • @artyboy1377
    @artyboy1377 Год назад +1

    The 4090 is the first GPU that doubled the performance of my existing GPU (3080 Ti) even though it was only a one-generation step up; mightily impressed. It's also the first time I bought a Titan-class GPU. I paid the same amount for it as my 3080 Ti, which was purchased during the global shortage post-Covid. I was glad that I picked the 4090 over the 4080, as online reviews said the price-to-performance ratio was much better.

  • @YAAMW
    @YAAMW Год назад +3

    Great work as always, Steve. Would've liked to see FSR and DLSS 2 added to the 6950XT and 3090 Ti without RT to put the gains from DLSS 3 in perspective. But I can't fault you for settling with 1000+ benchmark runs for this video😅