We need to have a chat about these 4090 Benchmarks...

  • Published: 6 Aug 2024
  • The RTX 4090 is finally here... but there are some things you should know...
    Get an iFixit kit for yourself or someone you love at - amzn.to/3IDDkj9
    Get your JayzTwoCents Merch Here! - www.jayztwocents.com
    ○○○○○○ Items featured in this video available at Amazon ○○○○○○
    ► Amazon US - bit.ly/1meybOF
    ► Amazon UK - amzn.to/Zx813L
    ► Amazon Canada - amzn.to/1tl6vc6
    ••• Follow me on your favorite Social Media! •••
    Facebook: / jayztwocents
    Twitter: / jayztwocents
    Instagram: / jayztwocents
    SUBSCRIBE! bit.ly/sub2JayzTwoCents
  • Science

Comments • 5K

  • @Cryptic0013
    @Cryptic0013 A year ago +2588

    Remember folks: Time is on your side. Nvidia is, as Jay put it, sitting on a mountain of overbought silicon and developers aren't going to rush to use this new horsepower (they've only just begun optimising for the 2060 and next-gen gaming consoles.) If you don't hit that purchase button now, they *will* have to drop the price to meet you. The market pressure is on them to sell it, not you to buy it.

    • @earthtaurus5515
      @earthtaurus5515 A year ago +66

      Indeed, and that's generally the case: tech does go down in price, barring any unforeseen madness like a global pandemic or GPU-based crypto mining, or both...

    • @DarkSwordsman
      @DarkSwordsman A year ago +67

      I really do think games are at an all-time high of being CPU-bound due to how they work internally. For example: Unity is limited by draw calls. Every mesh with every material incurs a draw call, and it's why VRChat runs so badly while barely utilizing a single CPU core or the GPU. It's also part of the reason why Tarkov runs so poorly, though that's mostly down to the insane number of objects they have in the game and the, in my opinion, less than optimal LOD. (See the draw-call sketch at the end of this thread.)
      Engines like UE5 with Nanite and Lumen, and games like Doom are prime examples of the direction that we need to go for future games if we want them to actually take advantage of modern hardware. The hardware we have now is so powerful, I don't think people realize what absolutely crazy things we can do with some optimization.

    • @corey2232
      @corey2232 A year ago +27

      Exactly. And at those prices, there's no way in hell I'm touching these cards. I already thought jacking up the prices last gen was too much, so I'm going to happily wait this out.

    • @JSmith73
      @JSmith73 A year ago +12

      Especially after RDNA3 hopefully brings a good sanity check to the market.

    • @theneonbop
      @theneonbop A year ago +8

      @@DarkSwordsman Draw calls in Unity are often easily fixable with optimization from the developers (early on in the project); I guess the problem with VRChat is that the developers aren't really involved in making the maps.
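
A rough illustration of the draw-call math from @DarkSwordsman's comment above (editor's sketch in Python; the scene contents and numbers are invented, not measurements from Unity or VRChat):

```python
# Toy model of draw-call pressure: in a naive renderer, each
# (mesh, material) pair costs roughly one draw call per frame.
from collections import Counter

scene = [
    ("avatar_01", ["skin", "hair", "cloth", "metal"]),  # 4 submeshes -> 4 calls
    ("avatar_02", ["skin", "hair", "cloth"]),
    ("props", ["wood", "metal", "glass"] * 50),         # 150 submeshes
]

naive_calls = sum(len(materials) for _, materials in scene)

# Batching/atlasing collapses submeshes that share a material into one
# call; this is the optimization the replies say map makers often skip.
batched_calls = len(Counter(m for _, materials in scene for m in materials))

print(f"naive draw calls:   {naive_calls}")    # 157
print(f"batched draw calls: {batched_calls}")  # 6
```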

  • @SkorpyoTFC
    @SkorpyoTFC A year ago +623

    "Nvidia doesn't see this as a problem"
    Summed up the whole launch right there.

    • @TheCameltotem
      @TheCameltotem A year ago +9

      Supply and demand 101. People who can afford this will buy it and enjoy it.
      If you don't have the money, then buy an Intel Arc or something.

    • @lgeiger
      @lgeiger A year ago +30

      @@TheCameltotem Ha! Let's see how many 4090s will break due to the strong bend of the cable. Not sure if everyone is going to "enjoy it" when Nvidia starts blaming them for breaking their 4090 because they bent the cables too much. This adapter is a problem and I am so sure that it's gonna create problems in the future.

    • @raymondmckay6990
      @raymondmckay6990 A year ago +23

      @@lgeiger Nvidia could have solved the connector problem by having the connector be an L shape where it plugs into the card.

    • @lgeiger
      @lgeiger A year ago +11

      @@raymondmckay6990 Exactly my thought! But I guess that's not a valid solution for a billion dollar company.

    • @PresidentScrooge
      @PresidentScrooge A year ago +3

      @@raymondmckay6990
      What Nvidia should've done is plan ahead. If they planned to push a new standard, they should have worked with PSU developers 2-4 years ago so at least the high-end PSUs would have that standard available. This is just pure arrogance by Nvidia.

  • @bander3n
    @bander3n A year ago +317

    I love how I can watch multiple tech YouTubers and they give you a general idea about a product, while each one gives their input on a specific part that is important to them. It gives you an overall good idea about it. Very informative video Jay. Love it

    • @pat7808
      @pat7808 A year ago +15

      Jay, GN, and LTT. The holy trinity.

    • @punjabhero9706
      @punjabhero9706 A year ago +4

      @@pat7808 I watch all of them. And also Hardware Unboxed, Paul's Hardware and sometimes Hardware Canucks (I love their style of videos)

    • @pat7808
      @pat7808 A year ago

      @@punjabhero9706 Yes love the canucks!

    • @marvinlauerwald
      @marvinlauerwald A year ago

      @@pat7808 aaaand red gaming tech/hardware meld

    • @HillyPlays
      @HillyPlays A year ago

      @@punjabhero9706 Because this community subscribes to so many others, I was recommended those channels and it MASSIVELY improved my understanding

  • @luckyspec2274
    @luckyspec2274 A year ago +6

    17:00 Hi, I am from the future, JayzTwoCents was right about the cable bend issues

  • @niftychap
    @niftychap A year ago +1696

    Going to wait for AMD or buy last gen. In my eyes the 4090's performance looks amazing, but it's so easy to get carried away and end up blowing way more on a GPU than I'm comfortable with.

    • @originalscreenname44
      @originalscreenname44 A year ago +72

      I would say that for most people it's unnecessary. I'm still running a 2080 FE and it does enough for me to play anything on my PC. Unless you're working in CAD or streaming/creating video, you don't really need anything this powerful.

    • @CornFed_3
      @CornFed_3 A year ago +113

      @@originalscreenname44, those of us that only game in 4K would disagree.

    • @victorxify1
      @victorxify1 A year ago +124

      @@hoppingrabbit9849 yea, 1440p 160 fps > 4k 90 in my opinion

    • @idkwhattohaveasausername5828
      @idkwhattohaveasausername5828 A year ago +70

      @@CornFed_3 If you're only gaming in 4K then you need it, but most people are playing in 1080p.

    • @2buckgeo843
      @2buckgeo843 A year ago +18

      Ya snooze ya lose bro. Get the 4090 and call it a day.

  • @TheMatrixxProjekt
    @TheMatrixxProjekt A year ago +1365

    Completely agree with you on the bending-the-cable-to-fit-in-the-case issue. What they probably should have done is given the cable adapter a pre-made 90-degree bend, or made the plug an L-shape that can lead the cable directly downwards (or gone the extra mile with a rotational system that allows it to go in any direction). There are so many design choices that could have remedied this, I feel, but instead they're giving builders a whole Nine-Tailed Fox to deal with. Really odd oversight for such an expensive product.

    • @MichaelBrodie68
      @MichaelBrodie68 A year ago +62

      Exactly - like the very familiar SATA L connectors

    • @kingkush3911
      @kingkush3911 A year ago +24

      Definitely would be smart to have a 90-degree cable to avoid having to bend the cable and risk damaging the GPU.

    • @Pileot
      @Pileot A year ago +21

      What's wrong with having the power connector on the motherboard side of the card, facing down? This is probably going to be the longest expansion card in your case, and it's not likely you are going to be running two of them.

    • @Xan_Ning
      @Xan_Ning A year ago +2

      EZDIV-FAB makes an 8-pin 180-degree power connector that wraps over onto the back plate. I expect them or someone else to make the same thing for the 12-pin. (EDIT: just saw that they have a long 12-pin (not 12+4 pin) to 2x8-pin, so I think they will have one for 12+4)

    • @gordon861
      @gordon861 A year ago +7

      Was just going to say the same thing. I wouldn't be surprised if someone produces a 90 degree plug/extension to solve this problem.

  • @YAAMW
    @YAAMW A year ago +167

    The MOST valuable info in this review was the bit about bending the connector. MASSIVE thanks for pointing that out. Somebody IS going to have a REALLY bad day because of this. The second most valuable info was the bit about the over-engineered coolers. This is the first time I felt restricted by the GPU clearance in my Armor Revo Snow Edition because of these unnecessarily huge GPUs

    • @DGTubbs
      @DGTubbs A year ago +5

      I agree. Shame on NVIDIA for downplaying this. If you screwed up somewhere in design, own it. Don't hide it.

    • @Chris-ey8zf
      @Chris-ey8zf A year ago +1

      Honestly though, if someone is that careless as to break off their connector by bending/pulling on the cables, they probably aren't responsible enough to be building/upgrading PCs. PCs aren't adult lego sets. You have to actually be careful with what you're doing. People that break things due to being that careless deserve to lose the money they spend. Hopefully it teaches them a valuable lesson for the future. Better they break a 4090 and learn than to mishandle a car or misfire a gun that can actually harm/kill others.

    • @EkatariZeen
      @EkatariZeen A year ago +8

      @@Chris-ey8zf Nope, that's just a sh¡t design. Even if the user is careful, they would be stuck with an unusable brick or an unsightly opened case until they get one of the 4 cases on the entire market where that crap fits.
      Jay just showed that he broke the power cable by bending it, so it's not just about being careful not to break the cheap connector on the PCB.

    • @javascriptkiddie2718
      @javascriptkiddie2718 A year ago

      Vertical mount?

    • @MoneyMager
      @MoneyMager A year ago

      This video and comment aged "well" :)

  • @IxXDOMINATORXxI
    @IxXDOMINATORXxI A year ago +28

    I'm sticking with my EVGA 3080 Ti; that baby will get me through until the 50 series easy. Honestly, by that point I'm hoping Intel cards are good. I'd love to try a good top-notch Intel card.

    • @SuperSavageSpirit
      @SuperSavageSpirit A year ago +3

      Intel's cards won't be, if they're even still making them by that point and haven't given up.

    • @christopherlawrence4191
      @christopherlawrence4191 A year ago +1

      What if I don't have a card at all (my 1050 Ti died on me a year ago)? Should I wait or keep the 4090 I ordered?

    • @brandonstein87
      @brandonstein87 A year ago +1

      @christopherlawrence4191 How is the card? I just got one.

  • @Appl_Jax
    @Appl_Jax A year ago +459

    Was EVGA the only board partner that put the connectors on the end? At least they had the foresight to recognize that it was a problem. They will be missed in this space.

    • @onik7000
      @onik7000 A year ago +7

      The PCB is not long enough on most GPUs to put it there. Actually, the connector on the 4090 FE is at the END of the PCB; only the fan and radiator are behind that point.

    • @TheFoyer13
      @TheFoyer13 A year ago

      I have an MSI 3090ti with the 12 pin connector facing the side panel. These cards are so big, and so long, that if the power connector was at the rear, it would bend even worse. I guess the only benefit to that would be seeing the RGB "carbon" motherboard leds that are hidden by the PCIE wires but then it'd be uglier and the wires in the way of my fans. Now that they don't sell a lot of cases with optical drives, they aren't as long as they used to be. I guess it really comes down to what kind of case you buy. (My 3090ti is in a corsair 4000d, and it fits great, and the adapter doesn't touch the glass)

    • @EisenWald_CH
      @EisenWald_CH A year ago +1

      @@onik7000 That's why they put in a little daughter PCB (when it's more than just power) or just a cable that connects the PCB power-in to wherever you would like the connector to be (and then fix that connector to the metal or plastic structure of the heatsink). It's not like they can't do what EVGA does; they just don't care that much, or they feel it will look "out of place" or "bad". Cost is also a thing, but I feel it's very negligible in this case, as it's just a little extra cable and fixing points (a redesign would be a bummer though, with machining cost and all).

    • @BM-wf8jj
      @BM-wf8jj A year ago +1

      It wasn't even foresight. Last year I ended up having to send back an FTW3 3080 because I wasn't able to get the glass side panel back onto my Corsair 280X with the PCIe cables attached to it, smh. They could've at least sent out some type of communication to inform buyers that they fixed that issue.

    • @Old-Boy_BEbop
      @Old-Boy_BEbop A year ago +2

      @@onik7000 If only they could invent small PCBs and capacitors with wiring to add the power connectors at the end... if only, what a world we would be living in.

  • @gremfive4246
    @gremfive4246 A year ago +360

    Igor's Lab explained why the AIBs' coolers are so massive: the AIBs were told by Nvidia to build coolers for 600-watt cards (hence all those rumors back in July of 600-watt 4090s). In the end the 4090s would have been just fine with 3090 coolers, since Nvidia used TSMC over Samsung.
    That's going to add a lot of cost to AIB cards over the Founders Edition, and it's maybe one of the things that made EVGA say enough is enough.

    • @ChenChennesen
      @ChenChennesen A year ago

      Bit of a tangent, but what are the recommended PSU sizes now?

    • @kimiko41
      @kimiko41 A year ago +17

      Gamers Nexus did some power draw testing: stock was 500W, and overclocked with the power limit / power target bumped it was pulling over 650W. Seems to me the large coolers are necessary unless AIBs want to set a lower limit than the FE.

    • @fahimp3
      @fahimp3 A year ago

      @@ChenChennesen 10000W if you want to be safe with the transient spikes and future proof it.

    • @randomnobody660
      @randomnobody660 A year ago +9

      @@kimiko41 Wasn't it mentioned in this video that the card got to 3GHz while maintaining 60-ish degrees in a closed case with factory fan curves? That sounds like even the FE's cooler is arguably overbuilt already.

    • @TotallySlapdash
      @TotallySlapdash A year ago +12

      @@randomnobody660 or NV is binning chips for FE such that they get an advantage.

  • @Porkchop899aLb
    @Porkchop899aLb A year ago +30

    New builder here; I went for a 3080 Ti for 165Hz 1440p. The 4090 will be a lovely upgrade for me in 3 or so years lol

    • @liandriunreal
      @liandriunreal A year ago +6

      getting rid of my 3080ti for the 4090 lol

    • @kertosyt
      @kertosyt A year ago +1

      @@liandriunreal i just got a 1080 after my 970 died ..lol

    • @garretts.2003
      @garretts.2003 A year ago +5

      The 3080 Ti is a great card, and certainly an excellent first build. The cards are getting so good that last gen is worth the savings for me personally. I'm still on a 2080 Ti running 1440 ultrawide decently. Most single-player games I'm fine with at 60fps, and in online FPS you can always drop the resolution if required. I'll probably upgrade to the 3080 Ti while prices are good.

    • @mattgibbia2692
      @mattgibbia2692 A year ago +2

      @@liandriunreal You must hate money; a 3080 Ti is more than enough for anything out right now.

    • @Amizzly
      @Amizzly A year ago +1

      @@mattgibbia2692 Like what? Most games with high-detail textures and RT on my 35" UW 1440p monitor are dragging ass with my 3080. Cyberpunk is like 45FPS even with DLSS on performance mode. With no DLSS it's like 15FPS.

  • @rallias1
    @rallias1 A year ago +52

    So, someone else showed how they were able to cut power down to like 50% and still get like 90% of the performance. I kinda want to see your take on that, and maybe in comparison to a 30-series or team red card with the same power limits. (See the sketch at the end of this thread.)

    • @paritee6488
      @paritee6488 A year ago

      tech ciiy!

    • @emich34
      @emich34 A year ago +2

      @@paritee6488 der8auer too - running at 60% power target was like a 2% fps drop in most titles

    • @Trisiton
      @Trisiton A year ago +1

      Of course, you get diminishing returns after a certain point. This is why there are laptop 3080tis that run at 180W and still get like 70% of the performance of a desktop 3080ti.

    • @sonicfire9000
      @sonicfire9000 A year ago

      @@Trisiton Those sound very interesting, but I just have one question in my mind tbh: how are the batteries? Just asking so I don't accidentally run into a Razer or Alienware situation.

    • @prabhsaini1
      @prabhsaini1 A year ago

      @@Trisiton 180 watts? on a laptop? those things must only run for a solid 28 seconds
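
For anyone wanting to reproduce the power-limit experiment described at the top of this thread, a minimal sketch (assumes an NVIDIA GPU with nvidia-smi available and admin rights; the 225 W value is illustrative, roughly half of a 4090's 450 W stock limit):

```python
# Cap the GPU board power, then re-run your benchmark and compare fps.
import subprocess

def set_power_limit(watts: int, gpu_index: int = 0) -> None:
    """Set the GPU power limit via nvidia-smi (needs root/admin)."""
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

if __name__ == "__main__":
    set_power_limit(225)  # ~50% of stock; benchmark before and after
```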

  • @theldraspneumonoultramicro405
    @theldraspneumonoultramicro405 A year ago +345

    Man, I just can't get over how comically massive the 4090 is.

    • @woswasdenni1914
      @woswasdenni1914 A year ago +7

      There go my plans for making an SFF build. The cooler itself is bigger than the entire build.

    • @itllBuffGaming
      @itllBuffGaming A year ago +1

      @@woswasdenni1914 If the waterblocks for them are the same as the 3090's, it'll be a quarter of the size. If you want a small build, custom liquid is going to be the way now.

    • @MichaeltheORIGINAL1
      @MichaeltheORIGINAL1 A year ago +4

      I thought the 3090ti was huge but this thing is a whole nother story, haha.

    • @Cuplex1
      @Cuplex1 A year ago +2

      Agreed, I think my 3080 is massive. But it's nothing compared to that beast of a card. 🙂

    • @watchm4ker
      @watchm4ker A year ago

      @@outlet6989 It'll fit in a full tower case, as long as you don't have drive cages to worry about. And you're thinking EATX, which is for dual-socket MBs

  • @tt33333wp
    @tt33333wp A year ago +60

    They could introduce "L"-shaped adapters. That would be a good solution to this bending issue.

    • @Digitalplays
      @Digitalplays A year ago +32

      Then you can take two Ls when buying one of these

    • @johnderat2652
      @johnderat2652 A year ago +4

      @@Digitalplays The 4090 would be a great card for Blender artists, AI training, and simulations.
      Not so sure about gaming though.

    • @mukkah
      @mukkah A year ago

      Interesting thought, and I wonder if it has been explored by R&D at Nvidia (it seems like an obvious thought now that you mention it, but it escaped me entirely up until then, so who knows).

    • @Chipsaru
      @Chipsaru A year ago

      @Anonymous Did you see 43 FPS in Cyberpunk RT vs 20-ish with the 3090? Nice uplift for RT titles.

  • @TheZigK
    @TheZigK A year ago +94

    Couldn't wait any more. I had money to upgrade my 1050 Ti when the market was inflated. Finally snatched a 6800 XT for $550 and have 0 regrets. Will still be watching to see how things evolve.

    • @dennisnicholson2466
      @dennisnicholson2466 A year ago +4

      Last week I nabbed the 6800 XT after seeing a mod video that pushes this card to comfortably run like a 3090 Ti. I've been having some power draw issues though; I thought I would have been safe using the same modular PSU that ran my dual 1080s.

    • @TheZigK
      @TheZigK A year ago +3

      @@dennisnicholson2466 I saw an article about the same thing! It seems like it requires a custom cooling setup to see real improvements though, and it probably doesn't work with every game. If I find myself running up against the card's limits I might consider it.

    • @ElectricityTaster
      @ElectricityTaster A year ago +3

      good old 1050ti

    • @HansBelphegor
      @HansBelphegor A year ago +1

      Same, but an MSI 6950 XT.

    • @user-ck8ec7pj1l
      @user-ck8ec7pj1l A year ago +1

      I got that STRIX 3090 White edition he is showing for $999 2 weeks ago. Been eyeing it for months.

  • @bernds6587
    @bernds6587 A year ago +119

    Roman (der8auer) made an interesting point about the power target:
    setting it to 70% allows the card to run cooler, with its power reduced to about 300W, while still performing at about 95% of its graphical power. The power-to-fps curve definitely looks like it runs overclocked by default. (See the math at the end of this thread.)

    • @R1SKbreaker
      @R1SKbreaker A year ago +3

      Oh this is good to know! I have a 750 watt power supply, and I think I am going to splurge for a 4090 eventually, coming from a 2070 Super. I'd really rather not upgrade my power supply, and if I can hit 95% graphical power with my current power, then I am more than happy. 4090 is OP as is; I can deal with a 5% graphical power reduction.

    • @bobbythomas6520
      @bobbythomas6520 A year ago

      @@R1SKbreaker (coming from a person who owned a 750 watt power supply)

    • @ryze9153
      @ryze9153 A year ago

      ​@@R1SKbreaker I would get a 3070 now and then wait for 50 series. That's my suggestion.

    • @R1SKbreaker
      @R1SKbreaker A year ago

      @@ryze9153 I'm actually just going to stay with the 2070 Super until the 5000 series. After I upgraded my CPU, I'm a lot more content with my current setup.

    • @ryze9153
      @ryze9153 A year ago

      @@R1SKbreaker I'm hoping to have a 3060 ti or somethin like that pretty soon.
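
The arithmetic behind der8auer's point at the top of this thread, using the figures as quoted (the 450 W stock board power and the 70%/95% numbers are approximations, not measured data):

```python
stock_power = 450     # watts, 4090 stock board power (approx.)
power_target = 0.70   # 70% power target
perf_retained = 0.95  # ~95% of stock performance

limited_power = stock_power * power_target
efficiency_gain = perf_retained / power_target  # perf per watt vs stock

print(f"power at 70% target: {limited_power:.0f} W")  # 315 W
print(f"perf/watt vs stock:  {efficiency_gain:.2f}x") # 1.36x
```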

  • @shermanbuster1749
    @shermanbuster1749 A year ago +67

    I can see a lot of 90-degree adapters being sold for these cards. If I were in the market for this card, that is the way I would probably go: a 90-degree adapter, so you are putting less stress on the cables and saving some space.

    • @ArtisChronicles
      @ArtisChronicles A year ago +7

      That's the one thing that would make the most sense to me. Problem for me is the damn things are so big. I do not want to run a card that big in my case.
      Those 90 degree adapters should exist regardless though. It's a pretty important piece overall.

  • @connor040606
    @connor040606 A year ago +52

    Thanks for the tips with the adapter cable, Jay. Pretty sure you just saved a ton of people RMA headaches!

  • @markz4467
    @markz4467 A year ago +116

    You should start adding power consumption to the fps graphs. Something like 158/220, where 158 stands for the fps count and 220 represents power consumption in watts. (See the sketch at the end of this thread.)

    • @nervonabliss2071
      @nervonabliss2071 A year ago +3

      Or just next to the card's name.

    • @ramongossler1726
      @ramongossler1726 A year ago +1

      No, power consumption is just an AMD fanboy argument, just like "no one needs RT anyway". If you are concerned about power consumption, buy a laptop.

    • @GM-xk1nw
      @GM-xk1nw A year ago +30

      @@ramongossler1726 Power consumption is a thing people who pay bills care about; you know, people with responsibilities.

    • @GamerErman2001
      @GamerErman2001 A year ago +16

      @@ramongossler1726 Power consumption raises your power bill, affects what power supply you need, and heats up your PC as well as your room. Also, although this is a small matter for a single user, several people using large amounts of power to run their computers creates pollution and can also cause blackouts/brownouts.

    • @excellentswordfight8215
      @excellentswordfight8215 A year ago +8

      Bills aside, when using something like PCAT so that you actually get a good measure of GPU power draw, it would actually be a good way of seeing how system-bottlenecked the card is.
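
A quick sketch of how the suggested fps/watts label could be generated (card names and the second row are placeholders; only the 158/220 example comes from the comment):

```python
# Render "fps/watts" labels, plus derived efficiency for context.
results = {
    "Card A": (158, 220),  # the comment's example figures
    "Card B": (101, 350),  # hypothetical comparison point
}

for card, (fps, watts) in results.items():
    print(f"{card}: {fps}/{watts} ({fps / watts:.2f} fps per watt)")
```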

  • @commitselfdeletus9070
    @commitselfdeletus9070 A year ago +1

    Jay makes the only ads that I don’t skip

  • @Annodite
    @Annodite A year ago +5

    Love your videos Jay! Always excited to watch it as soon as you release videos :D

  • @vorpled
    @vorpled A year ago +118

    der8auer had an amazing point which shows that they could have reduced the size of the card and cooler by 1/3 for about a 5% performance hit.

    • @sircommissar
      @sircommissar A year ago +20

      I'd unironically rather not; I'd rather have a bigger card and better perf. Let some AIB have their trash perf for smaller size.

    • @Beamer_i4_M50
      @Beamer_i4_M50 A year ago +45

      @@leeroyjenkins0 1/3 off the cooler size means 1/3 off the power needed. That means 150 watts less you have to pay for, and cooler temps all around, for a 5% penalty in performance. (Rough cost math at the end of this thread.)

    • @jtnachos16
      @jtnachos16 A year ago

      @@leeroyjenkins0 You are both utterly missing the point, and a perfect example of the type of idiot who lets scalpers (be they the official producer or a third party) keep a stake in the market, which is what Nvidia is doing right now by marketing their 4070 as a 4080. Additionally, your comments paint you as someone who doesn't have the money for a 4090 in the first place.
      You've just demonstrated that you have no understanding whatsoever of how GPUs work. Guess what? With less power draw and 5% less performance, you are still in a range where overclocking can bring that performance back to a degree that is utterly unnoticeable in actual use (and this is ignoring that we are talking about a loss of 1-3 frames or so in most real-world loads). Furthermore, that reduced power consumption means less wear on parts, which means more reliability, not just for your GPU but also your PSU.
      This is also ignoring that the transient spikes caused by GPUs are still unsolved and are capable of killing other parts in the system (yes, they claim the new power supply standard solves it, but those PSUs aren't commercially available yet and will likely price out most consumers for the first year or two; it's also a really bad precedent to have a totally new power standard for PSUs popping up solely because one manufacturer refuses to work toward efficient use of power). Further, momentary power spikes were what was consistently killing 3080 Ti and 3090 cards, yet Nvidia's response was, to my knowledge, 'we added more capacitors', which isn't a solution, as those capacitors will still end up getting blown.
      Put bluntly, Nvidia Tim Taylor-ing it in search of tiny advancements in performance is absolutely a bad idea from engineering, consumer, and environmental positions; literally from every reasonable and informed position, it's a bad idea. Furthermore, that power draw is reaching the point where a high-end PC risks overtaxing standard American in-home circuits.
      The TL;DR here is that Nvidia absolutely did not have to draw that much power to get a substantial performance gap over last gen. They are being exceedingly lazy/arrogant and trying to brute-force the situation in a way that is almost certainly going to result in failing cards and potentially damage to other parts in the system.

    • @rodh1404
      @rodh1404 A year ago +11

      Personally, I think Nvidia took a look at what AMD has in the pipeline and decided to go all out, because they'd probably have been beaten in the benchmarks if they didn't. Although given how huge the coolers they've strapped onto their cards are, I don't understand why they haven't just abandoned air cooling for these cards and gone with water cooling only.

    • @HappyBeezerStudios
      @HappyBeezerStudios A year ago +1

      I heard that was more an issue with the manufacturing process. The one they originally planned warranted the 600W and massive coolers, but the one they ended up using sits at 450W and doesn't need those monsters.
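
What the 150 watts mentioned earlier in this thread work out to on a power bill (hours per day and price per kWh are assumptions; adjust for your region):

```python
watts_saved = 150
hours_per_day = 4      # assumed gaming time
price_per_kwh = 0.30   # assumed electricity price

kwh_per_year = watts_saved / 1000 * hours_per_day * 365
cost_saved = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year, about {cost_saved:.0f} per year saved")
```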

  • @Harrisboyuno
    @Harrisboyuno A year ago +3

    Love the iFixit promos as well. All around info and entertainment. Thanks so much Jay.

  • @dschlessman
    @dschlessman A year ago +3

    Damn, you called all these power connector issues. Good job!

  • @latioseon7794
    @latioseon7794 A year ago +328

    After the whole ordeal with the 30 series and the "4070" thing, I hope Nvidia gets a reality check.

    • @eclipsez0r
      @eclipsez0r A year ago +45

      That performance sells lol this card gonna be sold out

    • @TheSpoobz
      @TheSpoobz A year ago +22

      Honestly just gonna stay with my 3080 Ti cuz of that.

    • @kevinerbs2778
      @kevinerbs2778 A year ago +26

      @@eclipsez0r That's the most disappointing part about this.

    • @samson_the_great
      @samson_the_great A year ago

      @@eclipsez0r yup, I already hit the plug up.

    • @omniyambot9876
      @omniyambot9876 A year ago +1

      People who would spend money on a 3090 Ti before the crypto crash have every reason to buy 4090 cards, especially with those insane performance jumps. Yeah, we hate Nvidia; they are dicks and overpriced, but let's stop being stupid here: their product is still absolutely competitive. That's why people still buy them; they are not pointing a gun at you.

  • @Daniel218lb
    @Daniel218lb A year ago +5

    You know, I'm still very happy with my RTX 2080 Ti.

  • @wile-e-coyote7257
    @wile-e-coyote7257 A year ago

    Thanks for sharing your benchmark results, Jayz!

  • @vedomedo
    @vedomedo A year ago +2

    I'm not gonna get the 4090, and I would probably have gotten the 4080 16GB if the pricing here in Norway were more in line with the $ pricing. However, here you have to pay the 4090 price for the 4080 16GB, and the 4090 costs like $2000-$2500, which is simply silly. Even if I sold my 3080, the difference is still what I would expect a 4080 16GB to cost in total. Who knows, maybe the 50xx will be more in line with "normal" pricing, or maybe even the 40xx cards will go down in price after a while.

  • @Zeniph00
    @Zeniph00 A year ago +123

    Impressive uplift, but happy to wait and see what AMD has. MCM tech has me very interested in what will come.

    • @Malc180s
      @Malc180s A year ago +4

      AMD has fuck all. Buy what you want now, or spend your life waiting.

    • @georgejones5019
      @georgejones5019 A year ago +21

      @@Malc180s Lmao. Probably a UserBenchmark fanboy.
      AMD has the 3D V-Cache. The 5800X3D's tech will only improve with age, and they've stated it's applicable not just to CPUs but to GPUs as well.

    • @HeloisGevit
      @HeloisGevit A year ago +4

      @@georgejones5019 Is it going to improve their shocking ray tracing performance?

    • @AGuy-vq9qp
      @AGuy-vq9qp A year ago +3

      @@georgejones5019 That sounds false. GPUs are a lot less latency-sensitive than CPUs are.

    • @mutley69
      @mutley69 A year ago +7

      @@HeloisGevit Ray tracing is just another tactic to make you buy their next latest and greatest cards; the last 2 gens of cards have been bad for ray tracing. This 4090 is the first card that can actually manage it properly.

  • @Liberteen666
    @Liberteen666 A year ago +16

    Jay I really appreciate your content. Thank you for being quick when it comes to informing all of us. I'm looking forward to building my 4K system in the upcoming few weeks after I see the aftermarket versions of 4090 and their benchmarks. Keep us updated!

  • @dtrjones
    @dtrjones A year ago +1

    Thanks, Jayz, really liked this video! If you had any reservations about calling out Nvidia on the 12V power connector on the FE card, you shouldn't have; I'm glad you called it out. However, with all these new systems I'm going to go through a PC builder anyway, so let's hope they've also seen your advice!

  • @__last
    @__last A year ago +2

    Hopefully I can get a 3090 for much cheaper now, since no game is gonna need a 4090 for at least 3-4 years.

  • @PixelShade
    @PixelShade A year ago +10

    I'm at a point where I am totally happy with 6600 XT performance, at least for gaming at 1440p. I kind of feel like games need to become more demanding to justify an upgrade. The 4090 is impressive, and I would totally buy it if I worked with 3D modelling/rendering professionally... But let's not kid ourselves: this is not really a "consumer" product, but rather Nvidia's professional-grade hardware made available to normal consumers.

  • @dale117
    @dale117 A year ago +295

    What impressed me the most was the improvement in 4K resolution. Can't wait to see what AMD brings to the table.

    • @surft
      @surft A year ago +10

      Excited too, but I'm going to be shocked if their top of the line can get close (single digits) in fps to this in most games. The uplift in rasterization alone is insane.

    • @roccociccone597
      @roccociccone597 A year ago +8

      @@surft Well, leaks suggest they do match it, sometimes even beat it. I do expect RDNA3 to be very, very good.

    • @TwoSevenX
      @TwoSevenX A year ago +4

      @@surft AMD and Nvidia both expect AMD to *win* in pure raster performance by 5-20% with the 7950XT.

    • @jakestocker4854
      @jakestocker4854 A year ago +27

      @@roccociccone597 The leaks always say that though. For the last 3 generations there have always been leaks that AMD has something huge coming, and then they release some solid cards, but nothing like the leaks hype up.

    • @roccociccone597
      @roccociccone597 A year ago +7

      @@jakestocker4854 Well, RDNA2 was pretty accurate, and they mostly match Nvidia. So I'm optimistic AMD will manage to match or even beat Nvidia in raster and get very close in ray tracing. And I hope they won't be this expensive.

  • @dereksinkro1961
    @dereksinkro1961 A year ago +10

    A 5900X and 3080 Ti is a nice spot to be in for 1440p gaming; I'll enjoy my rig and watch this all play out.

    • @duohere3981
      @duohere3981 11 months ago

      @reality8793 He thinks he's smart 😂

  • @jonathanjanzen5231
    @jonathanjanzen5231 A year ago

    Love your combined 4K/1440/1080 graphs. Music is cool too!

  • @adamsmith4953
    @adamsmith4953 A year ago +77

    Looks pretty impressive, I can't wait to see what AMD comes out with

  • @dunastrig1889
    @dunastrig1889 A year ago +21

    Thanks! Raw performance numbers are always what I look for first. DLSS and FSR are nice options to have if you can't push the fps, but I want to see bare performance with all the bells and whistles. Now I'll look for a 12VHPWR 90-degree adapter...

    • @andytroschke2036
      @andytroschke2036 A year ago

      Better to wait for ATX 3.0 PSUs to release instead of an adapter.

    • @antonchigurh8343
      @antonchigurh8343 A year ago

      @@andytroschke2036 They are already available

    • @andytroschke2036
      @andytroschke2036 A year ago

      @@antonchigurh8343 Where? All I can find is ATX 2.4 with a 12VHPWR. ATX 3.0 has several other additions.

  • @michaelkuhlmeier8472
    @michaelkuhlmeier8472 A year ago

    Just noticed the smoke effect on the background of the graph slides. Always enjoy the little touches Phil does to make the videos look great.

  • @urazsoktay5275
    @urazsoktay5275 A year ago

    Very thorough and good video. Thank you.

  • @hdz77
    @hdz77 A year ago +9

    The only thing I got out of this benchmark is that the 4090's price is not justifiable. The 6950 XT and the 3090 are more than enough on price-to-performance; if anything, wait and see for the RX 7000 series GPUs.

    • @jondonnelly4831
      @jondonnelly4831 A year ago

      It is justifiable if you take the performance increase per dollar into account. The 4090 costs more, but it gets a lot more fps. If you have a 1440p 240Hz panel and a high-end CPU you will feel that increase. The 3090 will feel S L O W in comparison.

    • @LordLentils
      @LordLentils A year ago +2

      @@jondonnelly4831 Old flagship GPUs were the beasts of their time, yet the price increase over a single generational leap wasn't as tremendous.

  • @HRC4793
    @HRC4793 A year ago +45

    The iFixit ad is the only one you don't want to skip

  • @gettingwrekt
    @gettingwrekt A year ago +11

    Jay...I'd love to see you build the next personal rig in the EVGA E1. The analogue gauges are cool as hell.

    • @jsmooth76
      @jsmooth76 A year ago

      Yes, EVGA all the way.

  • @marco_scuba
    @marco_scuba A year ago

    Awesome video as always!

  • @sidra_games4551
    @sidra_games4551 A year ago +23

    There is so much happening in such a short timeframe that it just makes sense to wait a few months before deciding on a new build. How are the Intel chips gonna perform versus the new AMD ones? How will the new AMD cards perform? How will the lesser (70/80) Nvidia cards compare with this one? And keep in mind we are still waiting on PCIe 5.0 M.2 SSDs. It's new-build time for me, as my last one is 5 years old. But I am gonna let the dust settle and figure out what's best once Jan-Feb rolls around.

    • @Silver1080P
      @Silver1080P A year ago

      I've been waiting for what's next across the board for 3 years now. Whether it's due to cost or lack of power, most of the things I've been interested in have been pushed to the side. I have a 3080 12GB and an i7 8700K, so I'm happy enough for now. Will be looking at Intel's next CPU though.

    • @jamesc3953
      @jamesc3953 A year ago

      @@Silver1080P Do you find your 8700K bottlenecks your 3080? What kind of resolution do you play at?

  • @carlwillows
    @carlwillows A year ago +16

    It will be interesting to see the performance of the 4080s, with only 47% and 60% of the 4090's cores respectively.

    • @carlwillows
      @carlwillows A year ago +3

      @@taekwoncrawfish9418 I don't think it will be quite so linear, but we shall see.

    • @ChiquitaSpeaks
      @ChiquitaSpeaks A year ago

      @@carlwillows Benchmarks dropped; it's pretty linear.

    • @ChiquitaSpeaks
      @ChiquitaSpeaks A year ago

      @@taekwoncrawfish9418 same architecture lol right everything

  • @JohnAmanar
    @JohnAmanar A year ago

    I still really love the iFixit intro! xD Great video!

  • @RIGHTxTRIGGERx
    @RIGHTxTRIGGERx A year ago +8

    I'm always a few gens behind because I can't really afford the best of the best, but right now I'm pretty focused on upgrading my CPU. I've currently got a 1660 Super and I think I want to upgrade my graphics card sometime next year; I want to see how much the 30 series drops in price once all the 40s have been announced, released and sold. New tech and competition is a good thing for everyone.

    • @Bdot888
      @Bdot888 A year ago +2

      Good idea! I waited for a while and recently went the used GPU route and got a 3080 for $550. But I'm sure prices will drop a little more; just keep an eye out!

    • @Erikcleric
      @Erikcleric A year ago +2

      3090 prices dropped like crazy in Sweden the past weeks, from 2600-ish USD to 1300 USD now. So I'm upgrading my brand-new rig, which has a 3060 Ti, to a 3090.
      The 4000 series is overkill for any game right now and in the coming years, unless you NEED 4K ultra at max fps...
      My 3060 Ti will go into a future desktop I'll get for my old room at my mom's place. I'd hate for it to just be abandoned.

    • @RIGHTxTRIGGERx
      @RIGHTxTRIGGERx A year ago +1

      @@Bdot888 I might look into something used, actually. Not something I've ever thought about doing, but it could save a lot of cash!

    • @RIGHTxTRIGGERx
      @RIGHTxTRIGGERx A year ago +1

      @@Erikcleric Definitely not in the market for anything over 1k, but the way prices have been trending, I don't think that's something I'll have to worry about soon. I get not wanting to toss parts; it feels like such a waste. I've got a bunch of old parts taking up space in my closet that I'm never going to touch again, but I can't get myself to get rid of them lol.

  • @mcflu
    @mcflu A year ago +40

    On one hand, seeing the current performance of the RX 6950 XT compared to the 3000 series, I'm super impressed and looking forward to what RDNA3 brings next month. On the other hand, all of these cards are too much for me lol, and I'm happy with my 3060 Ti 😁

    • @chexlemeneux8790
      @chexlemeneux8790 A year ago +4

      Personally I'm totally fine playing in 1080p, and the 3060 Ti is more than capable of playing every game I own on ultra settings with at least 100fps. I got it in a $1800 CAD pre-built PC during the chip shortage, while people were paying that much for a 3080 by itself. I felt like I made out like a bandit and still feel super good about that purchase.

    • @craiglortie8483
      @craiglortie8483 A year ago +2

      That's why I went with a 6700 XT for my upgrade, running 60fps on a 4K monitor. I watch YouTube and streaming services more than I play now, so it was the best choice for me.

    • @cerebelul
      @cerebelul A year ago +1

      More impressive is the fact that it comes very close to the 4090 in some games at 2K/4K.

    • @FlotaMau
      @FlotaMau A year ago

      @@craiglortie8483 Same. I mean, 4K at 60 fps for single-player is really fine. I only envy more fps for competitive, but playing comp at 4K is NOT even a thing.

    • @craiglortie8483
      @craiglortie8483 A year ago

      @@FlotaMau I play War Thunder fine with my Philips monitor. The settings I turn down are the same ones I would turn down to improve gameplay. I stay locked between 55-60 fps all game. I lose a few ms from the monitor, but nothing I don't lose from age.

  • @vetsus3518
    @vetsus3518 A year ago +145

    I'm with you… a little confused why they didn't create a 90-degree adapter if that was the temporary solution until the new PSUs are released… that would have at least allowed you to fit it within a 'normal' case. Also, I love the GPU mounted on the wall in the back. At least that's how it appeared to be; it looks like one of your custom-printed GPU stands mounted to the wall with the card on it. It's a cool look. Try putting up some motherboards too…. I mean, since you're just a little tech channel. Lol

    • @Ben-Rogue
      @Ben-Rogue A year ago +9

      That cable solution is just lazy. A 90-degree adapter with about 20cm of cable before the split would be a lot cleaner and easier for customers to fit into cases.

    • @ChaseIrons
      @ChaseIrons A year ago +3

      Someone will make an adapter eventually. For my 3090s I got U-turn adapters that have been excellent since launch. No bends needed at all.

    • @joee7452
      @joee7452 A year ago +1

      I am not a betting man, but what is the chance that they didn't create one so that the aftermarket would create them and could then charge 79 or 99 dollars for them as an extra part? Remember they put caps on prices, so that would be a way for them to give, say, Asus an easy way to make 50 or 70 dollars extra on the 40 series that technically doesn't count in the price of the GPU itself.

    • @MrMoon-hy6pn
      @MrMoon-hy6pn A year ago

      @@joee7452 Since Nvidia seems to treat their AIBs with utter contempt, as shown by EVGA leaving the GPU market entirely because of Nvidia, I somehow doubt that's the reason.

    • @lUnderdogl
      @lUnderdogl A year ago

      I bet they planned to, but sometimes it is too late to implement.

  • @ChikoDc_Tv
    @ChikoDc_Tv A year ago

    Thanks for the wire bend tips!

  • @zachr369
    @zachr369 A year ago +1

    I got an EVGA 3090 FTW3 earlier this year once the price finally dropped back down, mostly to run a stable 120 frames in VR at all times for Assetto Corsa at ultra settings and other filters. I play on 1080p, so seeing what the new 40 series can do is good to see, with all the news about whether Nvidia may or may not continue with their products in the future.

  • @NiveusLuxLucis
    @NiveusLuxLucis A year ago +81

    Thanks for talking about the connector; I had the same concerns, and Nvidia's response is kind of unbelievable.

    • @earthtaurus5515
      @earthtaurus5515 A year ago +10

      Nvidia is too damn egotistic, and they are fully aware of the 12VHPWR connector frying as well as its very low durability. They just don't give a damn about anyone except their bottom line. So, if you fry your PC or break their connector off the PCB, they think everyone will go out and buy another 4090. The only way they will do anything about it is if there is massive public backlash, especially if people start frying their PCs en masse due to the low durability of the 12VHPWR 4-way adapter.

    • @Cruor34
      @Cruor34 A year ago +2

      This one isn't a big concern to me... I buy a top-end GPU every 4 years or so unless something breaks (for example, my 980 Ti broke, MSI didn't have any left, so they sent me a check and I got a 1080 Ti). I build the PC and I don't ever touch it again really; then I build a new PC years later. Who the hell is hooking and unhooking the cables constantly? On my list of concerns this is really low; unless I am misunderstanding, it's only an issue if you hook and unhook it 30 times.

    • @hashishiriya
      @hashishiriya A year ago +6

      @@Cruor34 cool

    • @flopsone
      @flopsone A year ago +3

      A 90-degree connector would make things a lot better/easier. Surely Nvidia or an AIB can find a little space and have the connector come directly out of the bottom of the PCB, which would direct the cable straight down to where most cases have the power supply.

    • @jordanmills7327
      @jordanmills7327 A year ago +3

      @@Cruor34 That 30 times was under "ideal conditions", which means the cable being inserted straight in, with no wiggle and without bending. It's almost certainly a lot fewer than 30 times if you bend or insert the cable in non-ideal conditions. Also, keeping it at a tight angle might wear the cable/port down over time, and considering this connector could be a serious fire hazard, this is completely unacceptable.

  • @ferdivedaasje
    @ferdivedaasje A year ago +3

    Thanks for the video and your thoughts. I use my PC to game and to do some 3D design work. Right now I'm super happy with my 3080ti, it does everything that I want and more. I game at 1440p, so I don't think the performance difference in current games would be very noticeable to me. I don't design massive projects, so also there I don't think I will notice much difference. My strategy will be to just wait for now. I do usually like to have new toys, but not to be the earliest adopter.

  • @JimmyFigueroa
    @JimmyFigueroa A year ago

    That iFixit ad was awesome lmao

  • @kinaceman
    @kinaceman A year ago

    Love the iFixit ad!! "..the new minnow" makes me laugh every time

  • @Angsaar011
    @Angsaar011 A year ago +66

    I got myself a 3070 Ti. Simply because that was what was available at the time of the shortage. I'm very curious to see what AMD brings to the party. Here's hoping it will be something exceptional.

    • @Its_Me_Wheelz
      @Its_Me_Wheelz A year ago +4

      I nailed a great deal on a 3080 last month, and I'm all sorts of happy. It will last me a long time. Most likely somewhere around the 5000 to 6000 cards.

    • @Angsaar011
      @Angsaar011 A year ago +4

      @@Its_Me_Wheelz For sure. I had my 980 ti up until recently. The 30-series are going to last a long time.

    • @Its_Me_Wheelz
      @Its_Me_Wheelz A year ago +3

      @@Angsaar011 In all honesty, I was running a 2060 Super, and it ran everything I play with no problems. Mainly ESO, HLL, COD, and a few others in that kind of game. I had no intention of upgrading. But like I said, I got a great deal on the 3080, and so here I am.

    • @yyorophff786
      @yyorophff786 A year ago

      You should have waited for this card.

    • @josephj6521
      @josephj6521 A year ago

      @@yyorophff786 Not at these prices.

  • @sleepii15
    @sleepii15 A year ago +2

    Great review and details as always. I'll stay with my 3090; I wanted to upgrade, but it's not worth it without a 280Hz 1440p monitor. Can't wait for monitor technology to catch up. That bending cable situation looks like it will show its true colors soon enough; I think they could've done a better job with it.

  • @pkkillernate
    @pkkillernate A year ago

    Where is the cheaper ad blocker? Because I've watched 6 ads just for this video, and 4 without skips! Love the channel.

  • @werderdelange2985
    @werderdelange2985 A year ago

    Great work and honest review as always, Jay! On a side note, what song were the background instrumentals from?

  • @shadowjulien5
    @shadowjulien5 A year ago +44

    Definitely waiting on RDNA3 to see what they've got to offer. Then I finally want to get around to an ITX build. I'm also waiting to see what Raptor Lake has to offer, because with the price of AM5 boards I might actually end up going Intel after 4 Zen systems lol

    • @ZackSNetwork
      @ZackSNetwork A year ago

      Why? GPUs that powerful and huge need space and proper cooling, as well as a high-wattage PSU. SFF builds are good for low to mid-range PCs.

    • @Adonis513
      @Adonis513 A year ago +3

      Never understood the point of putting high-wattage cards in ITX builds; very stupid.

    • @shadowjulien5
      @shadowjulien5 A year ago +4

      Tbh the engineering challenge of getting that much power in a small space seems fun, and I've had a mid tower for, like, idk, 15 years lol. I just wanna try something different.

    • @Adonis513
      @Adonis513 A year ago

      @@shadowjulien5 You're buying cutting-edge hardware just so it can be throttled in an ITX setup; unless you are doing a custom loop I see no point.

    • @ghomerhust
      @ghomerhust A year ago +1

      @@Adonis513 That's the point of the challenge they talked about. If they can get it to run properly on air, that's a win. If they can fit cooling in that tiny box, it's a win. For some people, just chucking big-ass parts with big-ass numbers in a big-ass case with big-ass airflow is boring as hell, regardless of performance.

  • @CR500R
    @CR500R A year ago +78

    Thank you for testing the PowerColor Red Devil 6950XT! It makes me feel better about my purchase. It actually held its own against the 3090 & 3090Ti on a lot of benchmarks. Not many people test the Red Devil 6950XT. It's not the most popular of the 6950XT cards.

    • @KingZeusCLE
      @KingZeusCLE A year ago +2

      Maybe not, but any of the other 6950 XT numbers still apply. They likely all perform within 1-2% of each other.
      Bitchin' card though, even with the 4090 released.

    • @vespermoirai975
      @vespermoirai975 A year ago +1

      Red Devil and XFX/Sapphire have always been my favorites when I've had AMD cards. Red Devil seems to be to AMD what EVGA was to Nvidia, with XFX/Sapphire coming close.

    • @ArtisChronicles
      @ArtisChronicles A year ago +1

      @@vespermoirai975 Idk if I'd call the Red Devils that, as the Red Devil RX 480 actually cut a lot of corners. It mostly applied to overclockers, but I'd still refrain from running FurMark on them unless you want to risk damaging them. It's an old card now, but still a relevant issue.

    • @vespermoirai975
      @vespermoirai975 A year ago +1

      @@ArtisChronicles If I remember right there was a thermal pad fix for that. Could be thinking of the R9 290.

    • @farmeunit
      @farmeunit A year ago +2

      @@ArtisChronicles I had a Red Devil 580 and I loved it. That was their top-tier card, with the Red Dragon below it and then the other models. I wanted a Red Devil 6800 XT but prices were ridiculous. They finally got cheaper, but I got a 6900 XT Gaming Z for less.

  • @knifeuu777
    @knifeuu777 A year ago +11

    As for the connector itself, it looks like someone will need to develop those L-shaped adapters to safely route the cable.

    • @aussiebattler96
      @aussiebattler96 A year ago +1

      Those will finally be viable and not useless lmao

  • @patratcan
    @patratcan A year ago

    You made me like watching an ad. Hats off to you!

  • @OzzyInSpace
    @OzzyInSpace A year ago +10

    With the placement of that weak-as-heck plug, I'll be holding off on the 40 series. I'll be perfectly happy with my 3080 Ti for a while.

  • @drizzlydr4ke
    @drizzlydr4ke A year ago +7

    EVGA really was smart having the cables on the side. On my Lian Li Dynamic the cables were touching the glass (with a front-facing cable plug), but having it on the side gives better room for the cables indeed, even if you have a fan. Nvidia really needs to put it on the side so it fits most cases with no issue.

  • @bfizzle81
    @bfizzle81 A year ago

    Glad you mentioned the cable. I'm running an EVGA 3080 Ti FTW3 in a Lian Li O11 Dynamic case and my cables are pressed up against the glass. There's no way this would fit in my current setup with the current connector config on the 4090... but honestly I'm good with my current setup @ 1440p and it should last me a while at least :) I'll wait.

  • @blze0018
    @blze0018 A year ago +1

    I have a 3090, no rush on my end. If a 4090 or RDNA3 comes down in price a bit I might get one sometime next year, since I do play at 4k resolutions, but no rush. Ideally I hold off until the 5000 series and get an even bigger boost, although I am concerned about how power hungry those cards will be.

  • @RenegadeLK
    @RenegadeLK A year ago +3

    Can't wait to get my hands on one next decade!

    • @ForAnAngel
      @ForAnAngel A year ago

      Can't wait to be able to play in 8K with all settings maxed on integrated graphics!

  • @eddiec1961
    @eddiec1961 A year ago +6

    I think it's worth waiting a little bit longer. I think you're right to be concerned about the socket; a 90° plug would be a better solution.

  • @nicoarcenas
    @nicoarcenas A year ago +4

    Couldn't help cheering for the 6950 XT while watching the benchmarks.

  • @milesprower06
    @milesprower06 A year ago +1

    My GPU buying plans happened a month and a half ago: a used 3080 Ti on Facebook for a very decent price.
    It's likely going to be my workhorse for the next generation or two of 4K gaming, which I have just gotten into this past month.

  • @greenawayr08
    @greenawayr08 A year ago +124

    Jay, have you considered including VR in your benchmarks? It's an ever-growing segment and really pushes performance, with variance in performance from card to card. Just a suggestion.
    Thanks for the great content.

    • @pumpkineater23
      @pumpkineater23 A year ago +14

      Agreed. Even mid-range GPUs are way more than good enough for gaming on a monitor. VR is what really pushes a card now; flat-screen gaming is yesterday's tech. How does the 4090 improve MSFS in VR? That's a more 'worthy opponent'.

    • @blinkingred
      @blinkingred A year ago +7

      VR is growing? Last I checked it was stagnant with declining interest and sales.

    • @3lbios
      @3lbios A year ago +3

      I'd love to see a 3090 vs 4090 comparison on a Reverb G2 headset in the most popular VR games.

    • @lePoMo
      @lePoMo A year ago

      Has the VR landscape changed?
      VR requires a fluid, high framerate so much that no (sane) game developer takes any risks. Or has this changed?
      When I bought into VR (CV1), every game targeted the RX 470-RX 480 (GTX 980/GTX 1060). I somewhat got stuck on Beat Saber so I didn't follow the evolution since, but to my memory, the only games back then to break with this were flatscreen ports to VR.

    • @privateportall
      @privateportall A year ago +3

      VR remains gimmicky.

  • @TMacGamer
    @TMacGamer A year ago +17

    I am probably going to sit this one out for now. I'm happy with the RTX 3070 that I have right now, and after seeing Nvidia label an RTX 4070 as a lower-end 4080, the huge price increase isn't worth it for me. Maybe down the line, when the rest of the 40 series comes out, if the prices are decent. But as of right now I think I will just watch and enjoy the competition playing out and see what happens next.

    • @adamahmadzai2357
      @adamahmadzai2357 A year ago +1

      I don't think it's worth upgrading a 3070 for at least another two gens.

    • @malazan6004
      @malazan6004 A year ago +2

      @@adamahmadzai2357 Depends what resolution you play at, honestly. For 1080p and 1440p, no, not really, but for 4K it's a big deal. Then again, DLSS makes 4K gaming on a 3070 much better.

    • @beemrmem3
      @beemrmem3 A year ago

      @@malazan6004 The 3070 has the power for 4K DLSS, but not the VRAM. I'm finding that out trying to run my HP Reverb G2.

    • @beemrmem3
      @beemrmem3 A year ago +1

      I'm sitting this out too. I was hoping for a 4070 12GB with 3090 perf for $550-600. Let's be real: the "4070" is $900 and is called the 4080 12GB. This means the 4070 they actually release won't be much faster than a 3070.

    • @LunchBXcrue
      @LunchBXcrue A year ago +1

      Yeah, I have a 3080; just gonna skip this gen. It's not worth the upgrade; I don't even use the 3080's full potential. Nvidia should have waited for games to catch up, because even brand-new releases aren't utilizing the cards fully unless you have a stupidly expensive setup. And with motherboards becoming as expensive as the CPUs, the money you would otherwise have saved with falling solid-state and DDR4 prices is wasted now that DDR5 is mainstream on most new platforms. I'm happy with my 3700X and a 3080; I only play at 1440p, so I'm good, thanks. That price drop pisses me off though; I could have gotten by on my 1080 Ti if I had known they would drop prices by that much.

  • @Filiral
    @Filiral 1 year ago +4

    The card seems great, but I managed to get a 3080 Ti on launch day and it's kept me plenty happy. Nothing I currently play or plan to play should strain that card too much, so I'll probably upgrade everything but the graphics card next year and then wait for the 5xxx series for a GPU upgrade.

    • @duohere3981
      @duohere3981 11 months ago

      That was cheap, I bet 😂

  • @joshuaableiter4545
    @joshuaableiter4545 1 year ago

    I agree with JayzTwoCents' observations on the power connector.

  • @lostgenius
    @lostgenius 1 year ago +4

    I was finally able to get a 3080 a few months ago, so I will be passing on the 40 series.

  • @stueyxd
    @stueyxd 1 year ago +12

    Just looking at the sheer size of the card, I think a lot of us would need a bigger case, and given the concern you raised about even larger cases forcing such a stressed bend in the cable, I think I'll give it a miss until aftermarket/third-party manufacturers correct this in our favour.

    • @Toutvids
      @Toutvids 1 year ago +1

      Exactly. My full-tower Thermaltake Core X71 couldn't even close its glass side panel with one of these mounted, and if I went vertical mount, the card would be starved for air, shoved flush against the glass. No thought was given to current cases on the market when designing these GPUs.

    • @yurimodin7333
      @yurimodin7333 1 year ago

      Just cut a hole in the side, like a supercharger sticking out of a muscle car hood.

    • @PDXCustomPCS
      @PDXCustomPCS 1 year ago

      Imagine this in an O11D Mini... 😅

    • @cole7914
      @cole7914 1 year ago

      The cost of the card, the cost of a new PSU, and the increased electricity cost to run this monster. Nah… I'm good.

  • @-zerocool-
    @-zerocool- 1 year ago +1

    I wish AMD would bring back an ATI special edition - partly for nostalgia, and partly so it could beat every Nvidia 4000-series card. That would be real nice.

  • @NightWolfx03
    @NightWolfx03 1 year ago

    Some of those heat-shrunk connector ends can be bent carefully by adding a bit of heat. Of course, you have to be careful not to melt the connector or sleeve, and not to pull the insulation back on the wires or pull pins out. But a heat gun on a low setting can sometimes help with some, though not all, harnesses - it depends on how thick the heatshrink is and whether there is anything under it (Corsair, for example, puts capacitors in some of their cables).

  • @jessmac1893
    @jessmac1893 1 year ago +6

    One number I'd love to see is the power (amps) coming out of the DisplayPort. Those of us with longer extension cables for VR often lose connection. I end up using my 1070 Ti for VR instead of my 3080 Ti because the 1070 Ti consistently connects to and powers the headset through an extension cord. Weird stat, but no one measures or reports it.

    • @Shadow0fd3ath24
      @Shadow0fd3ath24 1 year ago

      Part of that is because only 1-2% of people are VR players, and many have standalone VR headsets that don't interface with a PC.

    • @GravitySandwich1
      @GravitySandwich1 1 year ago

      I have the Reverb G2; they brought out a new powered cable due to connection issues (USB wasn't providing enough power). Side note: I have a 1080 Ti and I'm looking toward the AMD equivalent of the 4080 (I'm boycotting Nvidia).

  • @BrettWidner
    @BrettWidner 1 year ago +194

    This is actually very interesting. A lot of your numbers for the 4090 are INCREDIBLY different from LTT's. I'm actually quite perplexed: their 4090 was getting 2X the fps of your 4090 on what looks like the exact same settings in Cyberpunk - 4K, RT on, Ultra preset, DLSS off.
    EDIT: Just want to clarify, I'm not accusing either reviewer of anything, merely pointing out the vast differences. It could be related to their test bench, or it could not be. Something one might have to think about if they're looking to buy this card.
    UPDATE: LTT ran the card with FidelityFX on by mistake. JayzTwoCents' numbers are accurate.

    • @Pleasant_exe
      @Pleasant_exe 1 year ago +17

      And Gamers Nexus.

    • @AsaMitakasHusband
      @AsaMitakasHusband 1 year ago +7

      Yeah, I noticed that too lol

    • @marcelosoares7148
      @marcelosoares7148 1 year ago +30

      Hardware Unboxed got different numbers too, but the strangest one was the 6950 XT getting only 28 fps in CP2077 in the LTT benchmark.

    • @B8con8tor
      @B8con8tor 1 year ago +8

      Everyone's numbers won't match exactly. They depend on room temperature, an open vs. closed case, the CPU, memory, and so on.

    • @BrettWidner
      @BrettWidner 1 year ago +46

      @@B8con8tor I get that they're on different benches, but LTT's 4090 getting 2X the performance of Jay's 4090 on what look like the same settings in Cyberpunk?

  • @grand0plays
    @grand0plays 1 year ago

    Nice job on this, Jay! You always cover useful and helpful topics on your channel.

  • @Rasgarroth
    @Rasgarroth 1 year ago

    The best ad on a video I've seen in a WHILE.

  • @MrGryphonv
    @MrGryphonv 1 year ago +7

    I may have missed it in the video, but I'm really curious about undervolting performance. I was able to cut about 100W of draw on my 3090 with an undervolt, at about a 3% performance hit. If the 4090 can be undervolted to similar or better effect, that's another good selling point.

    • @Daswarich1
      @Daswarich1 1 year ago +2

      Der8auer tested tuning the power target: the 4090 kept about 90% of its performance at a 60% power target (see the sketch at the end of this thread).

    • @MrGryphonv
      @MrGryphonv 1 year ago +3

      @@Daswarich1 Those are amazing numbers. Well worth the compromise, IMO.
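
      For readers wondering how to experiment with a power target like that themselves, here is a minimal sketch using NVIDIA's NVML bindings for Python (the nvidia-ml-py package). The 60% figure mirrors Der8auer's test; the device index is an assumption, and the set call normally requires admin/root rights.

      ```python
      # Minimal sketch: cap GPU 0's power target at ~60% of its default limit
      # via NVML (pip install nvidia-ml-py). Run with admin/root privileges.
      import pynvml

      pynvml.nvmlInit()
      try:
          handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes GPU 0

          default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
          min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

          # 60% of the default board power, clamped to what the card allows.
          target_mw = max(min_mw, min(max_mw, int(default_mw * 0.60)))
          pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)

          print(f"Power limit set to {target_mw / 1000:.0f} W "
                f"(default {default_mw / 1000:.0f} W)")
      finally:
          pynvml.nvmlShutdown()
      ```

      The same thing can be done from the command line with `nvidia-smi -pl <watts>`; the NVML route just makes the clamping logic explicit.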

  • @09juilliardbayan
    @09juilliardbayan 1 year ago +39

    Considering my budget, as much as I dream of having a 40 series, I see this as the perfect opportunity to buy a 30 series, which I have been waiting on for a looong time. It's all so exciting.

    • @exq884
      @exq884 1 year ago +1

      Same - looking at a 3090.

    • @bloodstalkerkarth348
      @bloodstalkerkarth348 1 year ago

      @@exq884 Wait and see whether the 4080 or the new AMD cards turn out better.

    • @zerogiantsquid
      @zerogiantsquid 1 year ago

      I'm in the same boat. I saw a 3090 for $950 on Newegg two weeks ago and sniped it. I'm kinda sad that the 4090 is so much better, but at the same time I was super excited to finally get my hands on a 30 series. Still a massive leap from my previous card.

    • @chillchinna4164
      @chillchinna4164 1 year ago +1

      @@zerogiantsquid Life is about being happy with what you are able to get, rather than being upset about not obtaining perfection.

    • @beH3uH
      @beH3uH 1 year ago +1

      Just bought an RX 6900 XT for 800 euros lol, prices are good.

  • @TinariKao
    @TinariKao 1 year ago

    I love looking at high-end desktop hardware as a precursor of what can trickle down to the mobile, low-power space, which is where I start to care about things. :3

  • @zukodude487987
    @zukodude487987 1 year ago +2

    I am still happy playing Genshin Impact on the RTX 2070 Super in my laptop, and I will just wait for the 5090 release before snagging a 4000 series.

  • @kieranpalmer2085
    @kieranpalmer2085 1 year ago +3

    Here before the title change, love you Jay haha

  • @midnightlexicon
    @midnightlexicon 1 year ago +14

    Wanna see what the 4080 FE has in store for us. Good to see the FE construction allows for good boosting without fan speed adjustments. Might stick with FE cards from now on.

  • @roylee3558
    @roylee3558 1 year ago +15

    They need to put the cable plug on the motherboard side, then make the cable end a premade 90° (like the 90° ends you get on SATA cables). This would keep the cable out of sight and also relieve the bend pressure on the card's connection port.

    • @Trinity-Waters
      @Trinity-Waters 1 year ago

      Do that kind of thing with high-end military hardware and it works really well and is robust.

  • @Stevarneo
    @Stevarneo 1 year ago

    Nice to see the nods to EVGA in the background.

  • @davidjohnston6547
    @davidjohnston6547 1 year ago +5

    I'm happy with my 3080 Ti; I got it on 8/29 for $880. I'm seeing prices climb back up - there are still some under $900, but others are well above that. The 4090's performance is nice to see, but I think a lot of people will have a wait-and-see attitude until AMD's RDNA 3 launch in November. I also agree about the connector issue and can see it causing problems with a lot of cases; even in my rather large Corsair 7000X, the connector could touch the side panel depending on how wide the card is. With the heft of these cards, I'd rather mount them vertically to better distribute the weight - without a support bracket of some type, you're putting all that weight on the PCIe socket and the I/O slots - but that starves the card's fans for air given how massive the heatsinks are.

    • @alphadragongamingFTW
      @alphadragongamingFTW 1 year ago

      I got an EVGA 3080 12GB a couple of months ago for $700. I've been mulling over selling it and getting a Ti, or even a 3090 for a couple hundred more; I'm kind of at a loss for what to do. Maybe I'll just keep the 3080 12GB - I don't see it having any issues running games for quite a while, plus I don't play heavily demanding games. I may get into Warzone 2 or their new DMZ at some point, but I play a lot of survival-crafting games, and I also stream from my gaming rig. It's paired with a Ryzen 5900X. I'll probably just keep it; I think it's just buyer's remorse because I'm seeing Tis for very close to what I paid. Oh well lol

    • @smellslikebanana553
      @smellslikebanana553 1 year ago +1

      @@alphadragongamingFTW $700? I paid $800 for my 3080 (12GB) and I feel okay about my purchase. You're fine with what you have, but if you REALLY want a Ti or 3090, go for it. I think it's best just to wait for the 5000 series, or pull a 1080 Ti and keep the 3080 for a couple of years.

    • @alphadragongamingFTW
      @alphadragongamingFTW 1 year ago +1

      @@smellslikebanana553 I still have my EVGA 1080 Ti and I love that thing. Now that EVGA is out of the GPU business, I don't want to get rid of my EVGA cards lol. That 1080 Ti is still a beast; I run it in my secondary computer and it handles demanding games without an issue.

  • @adamw9764
    @adamw9764 1 year ago +21

    I feel like we are hitting a point with graphics cards where we will need an entire external case and power supply just for the card. Great review!

    • @bygoneera1080
      @bygoneera1080 1 year ago

      We'll only get there if people stupidly keep buying 'bad' (meaning unreasonable or absurd) products.

    • @KaiSoDaM
      @KaiSoDaM 1 year ago +1

      Oh yeah, not to mention that dumping 600W of heat inside the PC case is a bad idea too.
      I'm pretty sure we will start using external GPUs like some gaming laptops did.

    • @ultravisitors
      @ultravisitors 1 year ago

      That was true for the 3090 Ti; this card is much more efficient and doesn't have the crazy 2-2.5X transient spikes the 3090 Ti had, and you can target 60% power and only lose about 10% of the performance. If you want, you can even run it with only three of the power pins attached on a much lower-wattage PSU.

  • @TekniQx
    @TekniQx 1 year ago +1

    @23:20 - Frazier never knocked out Ali in their first fight. He *DID* knock him down in the 15th round (the only knockdown of the fight) and did end up winning.
    Love ya, Jay! 🤓

  • @JosueRodriguez1225
    @JosueRodriguez1225 1 year ago

    I never skip your ads; they're awesome.

  • @srodigital
    @srodigital 1 year ago +3

    It would be nice to see some performance comparisons based on real-world productivity workloads (video editing, etc.) instead of, or as well as, gaming. Stitching images in PTGui, for example.

    • @HappyBeezerStudios
      @HappyBeezerStudios 1 year ago

      What I've seen from the tests is that there's no need for me to get a 4090. I play none of the games tested, so it's of no interest.

  • @damienlahoz
    @damienlahoz 1 year ago +9

    Why does it feel like the review community is trying to make the 90-class a mainstream product? I don't recall the Titans - which this essentially is - getting this much attention. It's just weird. Every channel has dedicated significant time to covering this showpiece, and what, 1% of PC enthusiasts will even bother trying to buy one, regardless of how it performs. It's obvious that Nvidia is trying to normalize certain price points, but that doesn't mean people have to play along.

    • @DanielFrost79
      @DanielFrost79 1 year ago +1

      I personally hope and wish people do NOT play along. These prices are fuck**g insane and ridiculous.

    • @TheKain202
      @TheKain202 1 year ago +1

      Because it's the only card they're allowed to make content about until NV starts shipping the lower models? And besides, as ridiculous as it sounds, the 90s have the best value.
      The 4080 and the "4080" 12GB had such a ridiculous price hike from last gen - or any gen before, for that matter - that it's really hard to justify buying them.

    • @damienlahoz
      @damienlahoz 1 year ago +1

      @@TheKain202 Value? You could say a Ferrari is great value compared to a McLaren, and you'd be right, but it's also astronomically expensive and outside the price range of 99.9% of consumers.

    • @Ferretsnarf
      @Ferretsnarf 1 year ago +3

      The power draw is absolutely insane as well - an enormous increase over the previous gen. We're essentially seeing performance scale with power draw, which isn't really that impressive. 600 watts is off the charts. Honestly, we're not far from having to dedicate an entire household circuit to a PC running hardware like this.
      Between the price, the power, and the ludicrous size of these things, when is enough enough? Nvidia, come see me when you make a better card by actually making it better, not by buying the performance with a proportional increase in both size and power draw.

    • @damienlahoz
      @damienlahoz 1 year ago +1

      @@DanielFrost79 When the rubber meets the road, 99% of gamers aren't playing along. $2K is still $2K to damn near everyone. A lot of the people saying they're buying one have 3060s and ain't buying sht.

  • @dunningkrueger
    @dunningkrueger 1 year ago

    Love the soundtrack!

  • @manicfoot
    @manicfoot 1 year ago +65

    Great analysis of the card, Jay. Given the energy crisis in Europe, we're not sure we can afford to heat our home this winter, so buying a new GPU isn't really on my radar right now. The good thing is my PC is in a tempered glass case, so it heats up my office quite nicely. I think playing games to keep warm this winter will actually be more economical than using a radiator 😅

    • @Liperium
      @Liperium 1 year ago +15

      Technically, the only thing a GPU turns its electricity into is heat, so it's exactly as efficient as your electric radiator 😂 - a free side effect if you need the heating!
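
      That radiator comparison holds up to quick arithmetic. Here is a back-of-the-envelope sketch; the wattage, session length, and price per kWh are made-up illustrative numbers, not measurements:

      ```python
      # Every watt a GPU draws ends up as heat in the room, so a gaming
      # session doubles as a small space heater. All figures below are
      # hypothetical examples, not measured values.
      gpu_watts = 450        # assumed sustained board power
      hours_per_day = 4      # assumed evening gaming session
      eur_per_kwh = 0.40     # assumed energy-crisis electricity rate

      kwh_per_day = gpu_watts / 1000 * hours_per_day
      cost = kwh_per_day * eur_per_kwh
      print(f"{kwh_per_day:.1f} kWh/day -> {cost:.2f} EUR/day")
      # Prints: 1.8 kWh/day -> 0.72 EUR/day - the same heat output as a
      # 450 W electric radiator running for those four hours.
      ```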

    • @TheVoiceOFMayhem1414
      @TheVoiceOFMayhem1414 1 year ago +4

      I run a 3090 plus lots of high-end components in northern Norway, and I can say I don't need any heating in my gaming area 😁 They produce enough heat that for two-thirds of the year I even need to open my windows haha 😅
      So the high power draw kinda translates to heating the house 😅

    • @EJD339
      @EJD339 1 year ago +2

      Holy hell, how much is your energy bill? I thought mine was bad in AZ when I had a $250 bill for one summer month in a studio apartment.

    • @manicfoot
      @manicfoot 1 year ago +3

      @@TheVoiceOFMayhem1414 Nice! My idea is to adjust the fan curve on my GPU so the fans don't kick in until temps exceed 70 degrees (a rough sketch of that curve follows below). That way my PC could generate some heat even while idle, thanks to Wallpaper Engine always running and working the GPU a bit. Could save some electricity! 🙂
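
      For the curious, here is what that "passive until 70 °C" policy looks like as code. This is only a sketch of the curve logic plus a temperature poll via NVIDIA's NVML Python bindings (nvidia-ml-py); actually applying the fan duty is left to whatever tool you use (Afterburner, etc.), and the device index and the 85 °C full-speed point are assumptions:

      ```python
      # Sketch: poll GPU temperature via NVML and compute the fan duty a
      # "fans off below 70 C" curve would request. Assumes GPU index 0.
      import time
      import pynvml

      def fan_duty(temp_c: int) -> int:
          """Passive below 70 C, then ramp linearly to 100% at 85 C."""
          if temp_c < 70:
              return 0
          return min(100, int((temp_c - 70) * (100 / 15)))

      pynvml.nvmlInit()
      try:
          handle = pynvml.nvmlDeviceGetHandleByIndex(0)
          for _ in range(5):  # a few sample polls
              temp = pynvml.nvmlDeviceGetTemperature(
                  handle, pynvml.NVML_TEMPERATURE_GPU)
              print(f"{temp} C -> {fan_duty(temp)}% fan")
              time.sleep(2)
      finally:
          pynvml.nvmlShutdown()
      ```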

    • @HitmannDDD
      @HitmannDDD 1 year ago

      Get a 3090 (not the Ti) instead. Decent performance, less overall wattage, and it can double as a heater, with the hotspot potentially hitting 100°C.