DON'T Do this with your GPU!

  • Published: 12 Oct 2022
  • Get an iFixit kit for yourself or someone you love at - amzn.to/3IDDkj9
    Get your JayzTwoCents Merch Here! - www.jayztwocents.com
    ○○○○○○ Items featured in this video available at Amazon ○○○○○○
    ► Amazon US - bit.ly/1meybOF
    ► Amazon UK - amzn.to/Zx813L
    ► Amazon Canada - amzn.to/1tl6vc6
    ••• Follow me on your favorite Social Media! •••
    Facebook: / jayztwocents
    Twitter: / jayztwocents
    Instagram: / jayztwocents
    SUBSCRIBE! bit.ly/sub2JayzTwoCents
  • Science

Comments • 1.8K

  • @goldfries
    @goldfries Год назад +1556

    Hi Jay, Brian from ASRock (Malaysia) here.
    For your RTX issue, could you try changing the option under Advanced > AMD PBS > AMD Common Platform Module > PCIe x16 Link Speed?
    One of my local distributors had a problem with a Zotac card too on our X670E board; it refused to display, but in the end it did after some BIOS setting changes.

    • @Frostbite1090
      @Frostbite1090 Год назад +51

      Bump

    • @goldfries
      @goldfries Год назад +72

      @@Frostbite1090 Thanks. Jay didn't provide many details, but I believe it was the same issue. Our case was a ZOTAC GAMING GeForce RTX 4090 AMP Extreme AIRO on an ASRock X670E Steel Legend. For some reason it didn't work at first but later did; we got it settled within 24 hours and it worked perfectly for the RTX 4090 launch event.

    • @PlayMates_
      @PlayMates_ Год назад +11

      UP

    • @goldfries
      @goldfries Год назад +10

      @@PlayMates_ Thanks.

    • @LarryGanz
      @LarryGanz Год назад +8

      We hope he sees this!

  • @dano1307
    @dano1307 Год назад +833

    Didn't Corsair totally get it wrong as far as how that connector works? I just watched Gamers Nexus' video on that. Not sure I trust them now.

    • @skjetnis
      @skjetnis Год назад +137

      Corsair just posted a statement

    • @phillylove7290
      @phillylove7290 Год назад +189

      Corsair posted an apology for disrespectful behavior and language but they did not admit fault or give a technical answer.

    • @gucky4717
      @gucky4717 Год назад +189

      @@johnhowarth7216 Nope, GN is right; they also have the specs straight from Nvidia. In the 4x 8-pin adapter there is an extra IC that checks whether EACH 8-pin is plugged in and then sends the right signal to the sense pins.

    • @Apex180
      @Apex180 Год назад +92

      @@johnhowarth7216 Nope. If you had watched, GN provided the source for their claims (Nvidia), so it looks like Corsair doesn't know squat.

    • @generalgrevous1984
      @generalgrevous1984 Год назад +10

      @@johnhowarth7216 Confused too. But my understanding is that anything less than 4 cables can only draw 600W, whereas all 4 cables draw 630W, which is good for overclocking. Also, I did an analysis on the spikes GN measured earlier, and 40% transient spikes on a 450W power draw will go over 600W.

  • @Eremon1
    @Eremon1 Год назад +164

    Jay's impression of the Side-band sensor talking was hilarious.
    Side-band: "Uhm we're only getting 300 watts."
    Controller: "Well then you don't get video." 🤣

  • @seanprzybyla2157
    @seanprzybyla2157 Год назад +486

    Just as a note, Jay: solid-core cables can break fairly easily if you bend them a lot. Might be something to be cautious of for your testing rigs, in case they start to break and short/overheat.

    • @warpedphreak
      @warpedphreak Год назад +14

      if you're in there bending your GPU PSU cable a LOT... you got bigger problems you caused lol...

    • @stevenhenson4956
      @stevenhenson4956 Год назад +9

      @@wojtek-33 He's talking about over time; that's the reason most cables have a bend rating.

    • @N0N0111
      @N0N0111 Год назад +55

      As history has proven again, Jay has limited knowledge of electronics.
      100% that is not a solid-core cable; yes, he misspoke again...

    • @beaniiman
      @beaniiman Год назад +1

      We are going to start to need new cable tech soon, which is kinda gross. 60FPS at 4K is fine, stop it!!!
      but...

    • @whatshouldido22
      @whatshouldido22 Год назад +16

      @@warpedphreak don’t forget, as a reviewer, Jay is regularly swapping components. Regularly unplugging power cables. Yes, in a personal rig, you shouldn’t have that problem.

  • @natron8102
    @natron8102 Год назад +57

    “Well then you don’t get video!” Made me laugh for real. Jay can always make a boring subject entertaining. That’s why he’s the man.

  • @bobbybobman3073
    @bobbybobman3073 Год назад +53

    Also keep in mind that many PSUs offer two 6+2-pin connectors off of one cable back to the PSU, so two 8-pins back to the PSU should very reasonably be fine anyway. So basically this is not really any different from a potentially common way other people might populate the four available 8-pin connectors.

    • @MrMirukou
      @MrMirukou Год назад +5

      My PSU does have these kinds of cables (a Corsair RM850x), so could I power a 4090 with it? It has 3 double cables... I'm not sure how it really works.

    • @kaldo_kaldo
      @kaldo_kaldo Год назад +2

      Imagine. My PSU only has one 6+2pin and two 6 pin. To use squid adapter I would need four 8 pin connections. #1 is solved immediately. #2: A two-6 pin to 8 pin adapter. #3: Two molex to 6 pin x2 then a two-6 pin to 8 pin. #4: same as #3. That's a total of 8 adapters. 😂

    • @Koshmar-13
      @Koshmar-13 Год назад +2

      @@MrMirukou You can power it with the RM850x. Tech Yes City did all of his day one 4090 reviews on that powersupply. As for the wiring of the psu, I can't say how he had it wired.

    • @ShmeeGrim
      @ShmeeGrim Год назад +1

      @@MrMirukou Cables with two 8-pin connectors only deliver the same amount of power as a single 8-pin cable, which is 150W, IIRC. I would avoid using too many of those.

    • @matbailie3816
      @matbailie3816 Год назад +7

      @@ShmeeGrim That is simply not true.
      The PCIe 8 pin standard is essentially implicit communication. It says that if all 8 pins are connected correctly, the device (GPU) can draw 150W. Even if you make a cable and PSU capable of supplying 1.21 jigawatts, the device (GPU) has no way to know that, so will only ever draw up to 150W sustained per 8 pin connector.
      Corsair's cables can sustain 300W. But that will only be utilised if provided over two 8 pin PCIe connections, so their cables terminate with those two 8 pin PCIe connections.
      At the other end, there's an 8 pin connection to the PSU. That connector IS NOT an 8 pin PCIe connector. It's a connector bespoke to the PSU manufacturer. It can output up to 300W. It only works correctly on the manufacturer's PSU, as the pinout is often not the same. (NEVER use a modular cable from a different manufacturer, unless you fancy risking connecting a ground pin on your GPU to a 12V pin on your PSU.)
      So, each connector on the Corsair PSU absolutely CAN deliver up to 300W, and using the twin 8 pin PCIe connectors at the other end CAN allow 300W to be drawn.
      The nVidia adaptor needs four 8 pin PCIe connections as each such CONNECTOR only guarantees 150W. You could be attaching four separate 150W PSUs. You could be attaching a nuclear power station over solid copper bars. nVidia can't tell, so they will only ever draw 150W per 8 pin PCIe connector.
      Corsair have made a cable for THEIR PSUs. They wire the 12+4 pin connector to tell the GPU that 600W is available. They make the gauge of the cabling capable of delivering that. They make the PSU end of the cable connect to TWO of their 300W PSU outputs. Thus, their single cable CAN sustain 600W from their PSUs.
      Note that if you plugged in only One connection at the PSU end, only one of the two sense wires would be grounded (the other would be left "open"), and the GPU would be able to tell. Thus, the GPU would only be "allowed" to draw 300W from that connector.
      An 850W Corsair PSU absolutely CAN deliver 600W over either two of corsair's twin headed PCIe cables, or one of their 12+4 pin HPWR cables. As that leaves only 250W for everything else, including transients, I'm not sure I'd want to. I'd rather ensure the GPU be limited to 450W to leave enough capacity for your other components.
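
matbailie3816's point above is easiest to see as a tiny model: the GPU only knows what the sense wires on the 12+4-pin connector tell it, and a Corsair-style cable grounds one sense wire per PSU-side plug. A minimal sketch of that description follows; the function names and the 150/300/600W mapping are illustrative assumptions, not Corsair or PCI-SIG documentation.

```python
# Minimal sketch of the behaviour described above: the GPU only "knows" what
# the two sense wires tell it. Names and the exact mapping are illustrative
# assumptions, not Corsair or PCI-SIG documentation.

def corsair_cable_sense(psu_plugs_connected: int) -> tuple[bool, bool]:
    """Each PSU-side plug grounds one sense wire (per the comment above)."""
    return (psu_plugs_connected >= 1, psu_plugs_connected >= 2)

def allowed_sustained_watts(sense0_grounded: bool, sense1_grounded: bool) -> int:
    # The card maps the grounded/open pattern to a power budget; 600 W only
    # when both wires read grounded (hypothetical values for illustration).
    if sense0_grounded and sense1_grounded:
        return 600
    if sense0_grounded or sense1_grounded:
        return 300
    return 150

for plugs in (2, 1, 0):
    print(plugs, "PSU plug(s) ->", allowed_sustained_watts(*corsair_cable_sense(plugs)), "W")
```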

  • @tashmikagimanthamusic
    @tashmikagimanthamusic Год назад +4

    Have to admit, man, this is the best iFixit ad ever. You are the best, Jay. I'm a daily visitor now. Your videos are really informative. Massive thanks, man.

  • @sergionunez8878
    @sergionunez8878 Год назад +16

    I just have to say this: I've seen your iFixit sponsor short like a hundred times, and to this day it still brings a smile to my face xD
    It's just so funny and it never gets old xD

  • @Kremon86
    @Kremon86 Год назад +179

    Could anybody please try to run an overclocked 4090 with the NVIDIA adapter fed through a couple of Y-splitters off a single 8-pin PCIe cable? I want to see the cables melt :-D

    • @Eric_Fate
      @Eric_Fate Год назад +52

      I'm sure Dawid will have video of himself doing this unintentionally before the month is out.

    • @ArcturusCOG
      @ArcturusCOG Год назад

      @@Eric_Fate Oh god, here we go!

    • @ChrispyNut
      @ChrispyNut Год назад +17

      I'm looking forward to someone using molex to 8 pin to 12 pin.
      More adapters is more better, right? Right!!

    • @PREDATEURLT
      @PREDATEURLT Год назад +4

      Few weeks and ebay will have molex to 12VHPWR with these two ground wires installed.

    • @edwardallenthree
      @edwardallenthree Год назад +1

      I'm not sure you would. The issue is the connectors, not the wires. Three pairs of 18 AWG wire won't melt at 450 watts, but the connectors might.

  • @jakejones9502
    @jakejones9502 Год назад +58

    But Jay, Steve just did this test and found you need all four or you will only get 400 watts.

    • @generalgrevous1984
      @generalgrevous1984 Год назад +10

      450

    • @lyianx
      @lyianx Год назад +38

      and that Corsair said Steve was full of shit lol, which he promptly proved them wrong.

    • @sauyon
      @sauyon Год назад +1

      Can you link that video?

    • @marceloconceicao2587
      @marceloconceicao2587 Год назад +1

      yeah I wanted to see Jay try to pull 600 with that setup

    • @FDFANATIC541
      @FDFANATIC541 Год назад +3

      Ya but 300watts will give you 90-95 percent of your performance. Using the full power on 4090s isn't worth the marginal performance gain

  • @kirk_kirksman4503
    @kirk_kirksman4503 Год назад

    4:00 jay better do this video idea! cause i love when they do videos just testing hardware and stuff, like the video where they test how much pressure a pump and rad can actually handle

  • @chadmckean9026
    @chadmckean9026 Год назад +4

    Fun fact: if the wire gauge is the same, stranded can handle more amps. Solid core is cheaper to make and is used in building wiring because it does not get moved around; it is set and forget, so as long as it is cheaper than stranded it makes sense to use. It's also easy to screw solid-core wires to the back of outlets, but those were made for solid core, as it is the standard product there.

  • @thefactory7221
    @thefactory7221 Год назад +210

    I'm curious to see how the super compact power supply cables behave thermally. A thermal camera video or section of reviews would be really helpful, especially since these cards risk igniting their own cords at max load.

    • @MrHaggyy
      @MrHaggyy Год назад +13

      Long cables with a lot of connectors are usually way worse than shorter cables without them.

    • @edwardallenthree
      @edwardallenthree Год назад +5

      Fortunately these products have specs that are available online, and they have actually been tested for their power rating. Micro-Fit connectors from Molex are almost as good in terms of power delivery as Mini-Fit Jr. (with a rating of 8.6 amps versus 9 on the larger connector). When you consider that the standard would normally use pigtails on 8-pin connectors, four 8-pin connectors would use 6 positive wires and 6 grounds for 600 watts, and a single 16-pin connector would use 6 and 6 as well.

    • @hexdator2934
      @hexdator2934 Год назад +17

      AWG 18 (0.75mm²) cable is the most common across power supplies and can easily carry 12 amps (1440 watts @ 120V). In Jay's video it looks like AWG 16 (1mm²) may now be the standard being moved to, easily carrying 15 amps (1800 watts @ 120V). So these cards risking igniting their "own" cords is not a problem.

    • @edwardallenthree
      @edwardallenthree Год назад +8

      @@hexdator2934 good point. the reason I didn't bother to talk about wire gauge is that the limiting factor is the pins in the connector at the lengths used in a PC.

    • @hexdator2934
      @hexdator2934 Год назад +3

      @@edwardallenthree You went to an even greater level of detail......👍I was replying to OP..
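
A rough back-of-the-envelope check of the wire-gauge figures quoted in this thread (AWG 18 around 12 A, AWG 16 around 15 A). Inside a PC the GPU cable carries the 12 V rail, so per-conductor wattage is just current times 12 V; the ampacity numbers below are the thread's estimates, not datasheet values.

```python
# Rough illustration using the ampacity figures quoted above (AWG 18 ~ 12 A,
# AWG 16 ~ 15 A). GPU cables carry the 12 V rail, so per-conductor wattage is
# current x 12 V. Values are the thread's estimates, not datasheet numbers.

RAIL_VOLTAGE = 12.0
AMPACITY = {"AWG18": 12.0, "AWG16": 15.0}  # amps per conductor (thread figures)

for gauge, amps in AMPACITY.items():
    per_wire = amps * RAIL_VOLTAGE
    six_wires = 6 * per_wire  # a 12VHPWR cable has six 12 V conductors
    print(f"{gauge}: {per_wire:.0f} W per conductor, {six_wires:.0f} W across six 12 V wires")
```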

  • @chrisspellman5952
    @chrisspellman5952 Год назад +16

    Given the comment they made to GN on the power requirements and how wrong the rep was, I 100% would not use any cable from Corsair. Gonna burn your house down.

    • @metaleggman18
      @metaleggman18 Год назад +3

      No, it'd be fine. The 16 pin cable is essentially just two 8 pin cables combined.

    • @edwardallenthree
      @edwardallenthree Год назад +1

      The original Corsair post was incorrect, but it wasn't incorrect about their own product; it was incorrect about Nvidia's product. It was simply a misunderstanding and they have apologized. It had nothing to do with whether or not the Corsair product is technically good. There is no difference in power between using the Nvidia adapter and Corsair's, other than that Nvidia's adapter will produce a messy bunch of cables, which might cause other issues. The IC on Nvidia's adapter is designed to count how many cables are connected to it. Corsair's cable does not need to count how many cables are connected to it because there is only one configuration option, and that is to plug both cables into your power supply. No need for an IC when the pins will always both be connected to ground.

    • @edwardallenthree
      @edwardallenthree Год назад +1

      @@metaleggman18 An eight-pin cable has three lanes of 9 amps each, for a max power of around 324 watts; however, the spec limits it to 150 watts (this difference is why you see pigtails). The 16-pin cable has 6 lanes of 8.5 amps each, for 612 watts rated power, limited to 600W by the spec. So, yes, it is exactly like two 8-pin connectors, although it has more 12V wires than two 8-pins, and runs much closer to the spec of the underlying components, in theory.
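
The arithmetic in the reply above is easy to reproduce; the sketch below uses the per-lane current figures quoted in this thread (9 A for the 8-pin, 8.5 A for the 16-pin), which are the commenters' numbers rather than manufacturer datasheet values.

```python
# Reproducing the arithmetic in the reply above: connector pin ratings vs. the
# power the spec actually allows the GPU to draw. Per-lane current figures are
# the ones quoted in this thread, not datasheet values.

RAIL_VOLTAGE = 12.0

connectors = {
    # name: (number of 12 V lanes, amps per lane per the comment, spec limit in W)
    "8-pin PCIe":     (3, 9.0, 150),
    "16-pin 12VHPWR": (6, 8.5, 600),
}

for name, (lanes, amps, spec_limit) in connectors.items():
    electrical_max = lanes * amps * RAIL_VOLTAGE
    headroom = electrical_max / spec_limit
    print(f"{name}: ~{electrical_max:.0f} W electrically, {spec_limit} W by spec "
          f"(~{headroom:.1f}x headroom)")
```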

  • @mingteo
    @mingteo 6 месяцев назад

    Thumbs up for the ad alone. Also great video in general. Needed a genuine laugh today, thanks Jay!

  • @DardaniaLion
    @DardaniaLion Год назад

    Man, enjoy everything about his videos. The humour and the knowledge fit perfectly well. Respect.

  • @nordenmike6531
    @nordenmike6531 Год назад +3

    Super excited about the new GPUs, but mostly waiting for another round of the Jay vs. Steve overclocking competition and them trash talking each other. 😂

  • @davidbrennan5
    @davidbrennan5 Год назад +125

    I will buy mid-range this time; this is nuts. The card is too big, that connector looks like a fire hazard, and the lack of DisplayPort 2.0 is insane. No wonder EVGA pulled the plug.

    • @schleprok6333
      @schleprok6333 Год назад +6

      I'm waiting for the new AMD RX7000 series... Display port 2.1 confirmed...

    • @primary7208
      @primary7208 Год назад +7

      The mid range cards are laughable, better off skipping this gen or going Amd

    • @dustinharvey7394
      @dustinharvey7394 Год назад

      I wouldn't rush to buy mid-range yet; there are some other cards coming that 1. might have DisplayPort 2.0, 2. actually support PCIe 5.0, and 3. don't need 450 watts for this level of performance.

    • @crispycrusader1
      @crispycrusader1 Год назад

      Why not buy the now-cheaper used (or maybe new) 3090 or 6950 XT, since they are around 1000 now lol

    • @codname125
      @codname125 Год назад

      Why do you need DP2.0? Is 8K 60Hz/4K 120Hz not enough for you?

  • @Phat-Monkey
    @Phat-Monkey Год назад +2

    Even with a Corsair 900D case, the cables coming from the 4090 Suprim X do touch the glass, so I totally agree with your previous video: the design is pants for those with small PC cases.

  • @hiranthadil
    @hiranthadil Год назад

    Good content Jay and the crew!

  • @edwardallenthree
    @edwardallenthree Год назад +83

    The standard has 4 sense lanes. The first two produce a binary number (with 3 being 600 watts). No IC needed; just connect both to ground to produce a 11 (binary), or 3. The other two are power-OK and GPU-connected, and are both optional.
    Edit: just got to the part of the video where you say this and reference the same page of the spec I did. Should never have doubted your journalistic chops, Jay!
    Edit 2: awesome video, explained the sense pins very well and actually made me more excited for the future.

    • @IcecalGamer
      @IcecalGamer Год назад +3

      Can't wait to see pictures of people molex>6pin>8pin>12 pin. And having two speaker wires coming out from the side band, grounding into the case with techscrews (ground IS ground ¯\_(ツ)_/¯ )
      ^not unlike shi**y audio setups in cars :))

    • @edwardallenthree
      @edwardallenthree Год назад

      @@IcecalGamer as long as they include adapter cables with the cards, I don't think we're going to see much of that. I hope I'm right. Just for the sake of people's homes and their neighbors.

    • @blarghmcblarghson1903
      @blarghmcblarghson1903 Год назад +1

      To be fair, Jay isn't always right, no harm in double checking what he says. Literally last video he said these are PCIE 5.0 cards when they're 4.0. At best he meant it figuratively, but he didn't present it as such.

    • @edwardallenthree
      @edwardallenthree Год назад

      @@blarghmcblarghson1903 he did? Weird. It's an issue for me. I use the additional slot for a mellanox (I won't call them Nvidia) infiniband card for my storage network, so being able to use the GPU at 8x without a performance hit is a feature I want.

    • @emilybjoerk
      @emilybjoerk Год назад +1

      A 4-to-1 adapter cable (and a 2-to-1 too) will still need an IC to drive the two sense pins, as it needs to count how many cables are connected on the ATX 2.0 end: any three out of four should produce the same sense signal, because you cannot rely on users populating them in the order prescribed by the engineer. I didn't see an IC on the Corsair cable, and because it's got two inputs I half expect it to incorrectly report the higher mode if only one cable is connected, to the wrong input.
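
Combining the two descriptions above: the card reads SENSE0/SENSE1 as a two-bit value (3 meaning 600 W, per edwardallenthree), and Nvidia's 4-to-1 adapter reportedly drives those pins from an IC that counts how many 8-pin inputs are populated (per emilybjoerk and other comments here). The sketch below is a hypothetical model of that behaviour; the exact wattage table and the count-to-code mapping follow the thread and should be checked against the ATX 3.0 / PCIe CEM spec.

```python
# Sketch combining the two descriptions above: the card reads SENSE0/SENSE1 as
# a two-bit number (3 = 600 W), and the 4-to-1 adapter's IC drives those pins
# based on how many 8-pin inputs are populated. The table and mapping follow
# this comment thread, not the spec text itself.

POWER_BY_CODE = {0: 150, 1: 300, 2: 450, 3: 600}  # two-bit sense code -> watts

def adapter_sense_code(eight_pins_connected: int) -> int:
    """Hypothetical adapter logic: more populated inputs -> higher advertised budget."""
    if eight_pins_connected >= 4:
        return 3          # both sense pins grounded -> 600 W
    if eight_pins_connected == 3:
        return 2          # 450 W, matching the "3 plugs = 450 W" reports above
    if eight_pins_connected == 2:
        return 1          # 300 W
    return 0              # 150 W floor

for n in range(1, 5):
    code = adapter_sense_code(n)
    print(f"{n} x 8-pin connected -> sense code {code} -> {POWER_BY_CODE[code]} W limit")
```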

  • @wile-e-coyote7257
    @wile-e-coyote7257 Год назад +17

    Excellent demonstration video, JayZ. QUESTION: can you please elaborate on how "monitors plays a part in this, too"? Thanks!

    • @abnormallynormal8823
      @abnormallynormal8823 Год назад +1

      You can push all the frames at 4k you want on a 1080p 60hz monitor and you probably won’t notice a difference between that and 1080p 60fps. So if you have a lower end monitor your GPU won’t need to work as hard to run the resolution and frames that are optimal for you. That’s why you should always match your GPU to the monitor you have or plan to have in the near future. It doesn’t make sense to run a 40 series card, or a 70 series card or a 6800xt/6900xt for 1080p gaming because you aren’t utilizing the performance you paid for. As soon as you step up to 1440p or 4k and high refresh rate (120+ hz) it starts to make sense to go with a higher end GPU, but honestly no one needs a 4090, even for 4k gaming. It sits in a weird space where it’s too expensive and overkill for general consumers, and not powerful enough for professional use where Quattro’s dominate the space.

    • @xGOKOPx
      @xGOKOPx 9 месяцев назад

      @@abnormallynormal8823 "It doesn’t make sense to run a 40 series card, or a 70 series card or a 6800xt/6900xt for 1080p gaming because you aren’t utilizing the performance you paid for."
      What? You can't just default assume you aren't utilizing the performance. It depends on the game, and settings. Lower refresh rate and resolution are always an opportunity to increase graphics settings. Now, if you've already maxed them out and still aren't anywhere near 100% GPU usage (assuming appropriately powerful CPU) then yes, your card's potential is kinda wasting. But unless that's the case in all games you play or might wanna play, you can't really say you've overpaid for the GPU.
      That's obviously not touching the possibility that you might also need/want the card's performance for professional use (like rendering, AI and other compute stuff) where your monitor doesn't matter. But that's another story

  • @Kieranh778
    @Kieranh778 Год назад +5

    I would be more curious if there ended up being any functional difference in overclocking headroom, between the official power adapter vs 3rd party

  • @OGxoKANE
    @OGxoKANE Год назад

    Love it jay keep up the great job

  • @misterio10gre
    @misterio10gre Год назад +14

    oh look! its the same Corsair that apparently doesn't know how the 4090's power delivery works lol

    • @Maradiaga23
      @Maradiaga23 Год назад +2

      They issued an apology already. Apparently the comment was made independently by one of their employees.

    • @Apex180
      @Apex180 Год назад

      Typical back-pedaling when found out.

    • @KarolusTemplareV
      @KarolusTemplareV Год назад +3

      Isnt it always the case when an employee fucks up but its the whole company when an employee does something extraordinarily good? 😏

  • @Techno-Tanuki
    @Techno-Tanuki Год назад +3

    Interesting video. Time to see if, once prices come back down (scalpers again), I can decide whether a 4090 is worth it or not. Probably not.

  • @IceSki117
    @IceSki117 Год назад +2

    I think Jay's inner mad scientist is starting to poke its head out again for torture test season.

  • @adamroberts8415
    @adamroberts8415 Год назад +2

    The correct term for the "kickstart" is "inrush." This is common in low-voltage DC electronics as well as in large circuit breakers for industrial motors. Inrush is akin to opening the floodgates of a dam: there is a large starting deluge of water, or power in this instance, and then it tapers off to the supply level.

  • @thracian
    @thracian Год назад +4

    me watching it on my 1060 : mmm yes yes

  • @oOWaschBaerOo
    @oOWaschBaerOo Год назад +10

    Didn't Corsair spread misinformation about the 4th/3rd plug on the adapters? Gamers Nexus made a recent video about it, showing that Corsair has no idea what the fuck they are talking about when they address the new plug and standard.

    • @thestitchedcow5726
      @thestitchedcow5726 Год назад +1

      this^

    • @edwardallenthree
      @edwardallenthree Год назад

      Yes they did, but when you see how they built their own extension you can see why they thought what they thought, why they were wrong, and why their product is still perfectly okay. These cables are fine. I just read the spec, and essentially I think the Corsair person was just confused and didn't realize that Nvidia built their adapter differently than Corsair built theirs, but both will work fine for a 4090 or any other GPU with ATX 3.0 that has ever been produced.

    • @gucky4717
      @gucky4717 Год назад +2

      The Corsair one just uses 2 ground wires, the Nvidia adapter uses 4 ground wires and an IC to check them.

    • @edwardallenthree
      @edwardallenthree Год назад

      @@gucky4717 Exactly. And let's be clear: when we say "IC" we think of some complicated logic, but it's really not doing anything that complicated. It's just counting the number of (old-style) sense pins on the connectors and converting that to a two-digit binary number.

    • @PabzRoz
      @PabzRoz Год назад

      That only applies to Nvidia's cable adapter. That's what Steve from gamersnexus was talking about when calling out Corsair. Not Corsairs own cable. The way this cable works is completely different.

  • @godtiermedic
    @godtiermedic Год назад +1

    Jay your biceps are looking stacked 💪

  • @eliotwildermann
    @eliotwildermann Год назад +1

    That opening promo was fun

  • @hme850
    @hme850 Год назад +17

    Seasonic making a native dual 8-pin to 12-pin cable was mint. So satisfying swapping that in for my 3070 Ti FE

    • @hme850
      @hme850 Год назад

      @@morganlimes cool

    • @gmualum08
      @gmualum08 9 месяцев назад

      Did same thing for my 3080 FE, definitely an awesome perk, and a very well made cable to boot. My only thing is did you notice that the 12 pin connector doesn't seem to look completely seated?

  • @hamstersmash
    @hamstersmash Год назад +4

    That intro, very reminiscent of early era morning TV ad's, very nice

  • @tfoutfou21000
    @tfoutfou21000 Год назад +2

    Even with 4 plugs on the adaptor, you very often use only 2 plugs on the PSU side,
    which is the same as the cable Corsair provided, unless you use 3 or 4 individual cables for your adapter.

  • @JayHijazi
    @JayHijazi Год назад +2

    For the adapter that comes with the 4090, you do not need to connect four 8-pins, just three as per their instructions. The 4th plug is optional and is for overclocking.

    • @sabola1274
      @sabola1274 Год назад +1

      Can I get this cable with the 2 8-pins for the 4090? Or do I need a custom cable that has 4?

  • @salahmed2756
    @salahmed2756 Год назад +5

    Going back to the IDE cabling days lol

  • @lcmattern
    @lcmattern Год назад +7

    Something that der8auer tested is lowering the power limit; he got it down to 50% and only saw a dip of 5-15% for a 200W drop in power usage.
    I think that should be tested by others as well; they would not have needed massive coolers and special connectors. Maybe they're scared of AMD. Interested to see what will come from them.

    • @ledoynier3694
      @ledoynier3694 Год назад

      well no surprise there. 30 series behave the same way :p
      And AMD too actually. They don't show the total board power so they appear to be more efficient, but they also benefit from lower power limits in the same way with a really minimal hit in performance.
      those last FPS are a b**** to get for the benchmarks.
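
For anyone wanting to repeat the power-limit experiment mentioned above on an Nvidia card, the driver exposes a board power limit through nvidia-smi. A minimal sketch, assuming a machine with the Nvidia driver installed and administrator rights; the 300 W value is just an example, and the allowed range depends on the card's VBIOS.

```python
# One way to try the power-limit experiment described above on an Nvidia card:
# the driver exposes a board power limit through nvidia-smi. Requires admin
# rights; the 300 W figure is an example value, not a recommendation.
import subprocess

def set_power_limit(watts: int) -> None:
    # Show the currently enforced limit, then request a new one.
    subprocess.run(["nvidia-smi", "--query-gpu=power.limit", "--format=csv"], check=True)
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

if __name__ == "__main__":
    set_power_limit(300)
```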

  • @M1k3_
    @M1k3_ Год назад

    Thanks Jay

  • @einarabelc5
    @einarabelc5 Год назад

    The iFixit ad is literally my favorite YouTube commercial. It's so dorky it never gets old.

  • @jackoneill1070
    @jackoneill1070 Год назад +12

    The power is defined by what consumes it (GPU).
    What produces the energy (PSU) only defines the power limit it can handle.
    Since the GPU is being told by the sense pins that the available power limit is under what it needs, it doesn't even try to start because it "knows" that it will go over the limit of the PSU and probably damage it.
    Jay, you cannot say the power is "there"; what is there is only the voltage in volts, and only when multiplied by the amperes that the GPU's load draws does it give you power in watts.
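
A quick worked example of that last point, using the 12 V rail and the 600 W / six-conductor figures discussed elsewhere in this thread: power only exists once the rail voltage is multiplied by the current the load actually draws.

```python
# Worked example of the point above: wattage is rail voltage times the current
# the load actually draws. Figures are the 12 V rail and the 600 W /
# six-conductor numbers discussed elsewhere in this thread.

RAIL_VOLTAGE = 12.0
board_power_w = 600.0
conductors = 6  # 12 V wires in a 12VHPWR cable

total_current_a = board_power_w / RAIL_VOLTAGE
per_wire_a = total_current_a / conductors
print(f"{board_power_w:.0f} W at {RAIL_VOLTAGE:.0f} V -> {total_current_a:.0f} A total, "
      f"~{per_wire_a:.1f} A per 12 V conductor")
```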

  • @anthonyc417
    @anthonyc417 Год назад +6

    Also try the cable out on a SF750 lets see if ITX is even viable

  • @Dtr146
    @Dtr146 Год назад +2

    This all reminds me of adapting molex connectors into 6 and 8 pin in older computers.

  • @charlesaston6546
    @charlesaston6546 Год назад

    Best promo ever at the start lmao

  • @YAAMW
    @YAAMW Год назад +3

    There are no tricks involved in the corsair adapter. Some of the cables you get with this PSU give you two 8-pin connectors for each cable coming from the PSU. That means that a single PCIe connector ON THE PSU can supply 300W so two will give you 600W

  • @joakoc.6235
    @joakoc.6235 Год назад +4

    We need a what happens if you trick the voltage regulation IC in the card to push more voltage to the core.

  • @gyanozora3015
    @gyanozora3015 Год назад

    good work king, love you

  • @Crunchifyable2
    @Crunchifyable2 Год назад

    I think Jay found his hidden talent in voice acting and commercials.

  • @Nitrodozer
    @Nitrodozer Год назад +4

    Even though I'm gonna get under four hours of sleep before my hectic day I'm very relaxed because I'm watching JayzTwoCents.

    • @godtiermedic
      @godtiermedic Год назад +1

      Good old Lew unboxing stuff for us on 4 hours of sleep 😴

  • @YoutubeHandlesSuckBalls
    @YoutubeHandlesSuckBalls Год назад +4

    Ah, now it makes sense why Corsair called the GN review 'bullshit' because they made a bespoke cable that tricks the card into thinking it has 4 cables attached, and they didn't want to lose sales.

  • @Mustafa_9628
    @Mustafa_9628 Год назад

    Jay is really all about these pins 😂

  • @moviesandvisualfx5663
    @moviesandvisualfx5663 Год назад

    Your I-fixit ads are LEGEND HAHAHAHAH!

  • @gucky4717
    @gucky4717 Год назад +3

    Sadly all those cables are either not available or instantly sold out.

  • @dwurkz
    @dwurkz Год назад +8

    I think the reason there is a 4x 8-pin adaptor is for PSUs that have multiple rails on the 12V. It is mostly a CYA adaptor, while most Corsair PSUs are single 12V rail by design, so they can basically just use 2 8-pins to the 12-pin high-power connector.

    • @Andychiu845
      @Andychiu845 Год назад

      Yeah most psu can do 1 8 pin psu to 2 x 8pin pcie pigtail

    • @RurouTube
      @RurouTube Год назад +1

      The reason for the 4x 8-pin adaptor is simply that each PCIe 8-pin is rated at 150W (that is the standard), so you need 4 of those to get up to 600W. You don't design a 2x 8-pin adaptor and rate it for 600W, because it would be a fire hazard.
      The adaptor itself can sense how many cables are attached to it, so if you only attach 3, it will tell the GPU that the max power is 450W. If you use a Corsair PSU with the adaptor, you still need to connect all 4 to get the 600W max.
      One of the potential problems in the future is that if people end up breaking or losing their adaptor (maybe buying a 2nd-hand 4090 that doesn't come with the adaptor), they might simply try to purchase the cheapest adaptor, and if that adaptor is not a smart one, it could end up with the GPU trying to pull 600W from 1 8-pin.

    • @bradley3549
      @bradley3549 Год назад +2

      @@RurouTube While the PCIe spec may say 150W per 8-pin ATX, in reality the three positive wires and Molex Mini-Fit Jr. pins are capable of much more than that if properly spec'd. Roughly 108w per pin, or 300W+ per 8-pin connector. Since Corsair specs the cable and the power supply, one must assume they know what their pins and wires are rated for and thus this is a 100% safe and compliant use case at 600w.

    • @RurouTube
      @RurouTube Год назад +4

      @@bradley3549 What I wrote is not about whether Corsair can do 600W safely from a single 8-pin but more about the reasoning why Nvidia included this 4x 8-pin adaptor. It is not about single rail or multi rails; it is simply because of the spec. The spec for an 8-pin is 150W. You can't expect every single PSU on earth to safely do what a Corsair PSU or its cable can. Edit: that is just the safety side. There is also the functional standpoint: while PSUs in this category should be able to supply more than 150W from a single 8-pin, there might be edge cases where they can't. Also, just needing 2x 8-pin might end up giving people a false sense of security, thinking that their PSU with 2x 8-pin should be able to handle a 4090 when in reality it can't.

    • @ledoynier3694
      @ledoynier3694 Год назад +1

      @@RurouTube Spot on... and some PSUs really put in the bare minimum gauge to get 150W through a PCIe cable.
      The cable needs to follow the specifications anyway; you don't deviate from the rules just because "it should be alriiiight" ^^

  • @rashkavar
    @rashkavar Год назад

    Gods, that buzzsaw of a fan in the background. You guys are doing your best to noise gate, so it's only audible when Jay's talking, but holy crap that's loud!

  • @Shadowfax2121
    @Shadowfax2121 Год назад

    Goddamnit Jay, take your updoot. That iFixit ad was pretty great buddy.

  • @DrivenKeys
    @DrivenKeys Год назад +46

    GN just posted a video previewing Nvidia's explanation that the 4-plug adapter has a sensor in the terminal that unlocks 133%. If you use only 3 plugs, it stays at 100%. Four plugs enables 133%. This is how you can use current gen power supplies, and probably a big reason for using 4 instead of 2.

    • @ahs9674
      @ahs9674 Год назад +4

      Two plugs at the power supply side is enough, it's different with the adapter, but with the dedicated cable you can pull 600w no problem.

    • @RageQuitSon
      @RageQuitSon Год назад +15

      and wasn't it Corsair completely not understanding this generation and calling that total bullshit? I'm too lazy to double check, whatever company said it, avoid them this generation cuz they don't understand the product they are building psu's for

    • @madb132
      @madb132 Год назад +20

      @@RageQuitSon This ^^👍 GN Just proved Corsair are spreading mis-information.

    • @blarghmcblarghson1903
      @blarghmcblarghson1903 Год назад +13

      @@RageQuitSon A Corsair PR rep said that, yes, and hopefully he got his ear chewed out for it by his boss. Doesn't mean Corsair's engineering team are as bad though.

    • @Namamiri
      @Namamiri Год назад +3

      @@ahs9674 ah i would still not trust a product from a company who does not understand that plug and spreading misinformation

  • @vedomedo
    @vedomedo Год назад +22

    Got my 4090 a day or two ago, and you are sooo right. The goddamn cablemess is terrible. Ordered a Corsair cable for my Corsair PSU to neaten things up.

    • @klipschorny
      @klipschorny Год назад +1

      Did it work? Which Corsair psu do you have?

    • @Lagann0
      @Lagann0 Год назад +1

      @@klipschorny I have the corsair 600W 12VHPWR cable as well. I have a corsair 1000RMx and works perfectly.

    • @vedomedo
      @vedomedo Год назад

      @@klipschorny yes it worked fine. I have a RM850X

    • @csguak
      @csguak Год назад

      Wait what? This cable will only draw 600w if you have Rmx1200w supply or above. Anything less, it defaults to 450w. But people are ignorant, so they deserve to get scammed.😂🤣😂🤣

    • @StickyKeys187
      @StickyKeys187 Год назад

      @@csguak why the fuck do you need 600 watts of juice to power a graphics card? Are you building a delorean or sum? 😂

  • @axe_rocker
    @axe_rocker Год назад

    5:29 I thought Jay want to be Jay by turning that into bow LOL

  • @paulpitmon6663
    @paulpitmon6663 Год назад

    Recently found your channel, as I recently found myself owning a new PC. You're alright, man. Pretty funny and, best of all, not afraid to laugh at yourself. I'm a subscriber now.

  • @DGPG35
    @DGPG35 Год назад +16

    NOTE: the power that you get from this Corsair cable will vary depending upon the power supply that is used; it calls for a 1200W Corsair PSU to deliver the full 600W through the cable (1000W PSU = 450W and 750W PSU = 300W). I did not see this mentioned in the video.

    • @xGR1MxREAPERx
      @xGR1MxREAPERx Год назад +3

      This is false lol.. do the same test he is doing and you can see the wattage being supplied. I have a 1000 Corsair and this cable and I’ve had my strix card up to 527 watts. So it is definitely over 450.. maybe that’s changed I guess but as of now it isn’t true

    • @crimson7151
      @crimson7151 10 месяцев назад

      I have a rm850x and it goes up to 587 watts in furmark with the cable.

    • @TheCaciKO
      @TheCaciKO 9 месяцев назад

      ​@@crimson7151
      I also have rm850x. This might be the reason why I'm getting a black screen with this cable.

    • @crimson7151
      @crimson7151 9 месяцев назад

      yes its possible if you have a cpu like the 13900k@@TheCaciKO

    • @devel-rh6hy
      @devel-rh6hy 6 месяцев назад

      @@crimson7151 Are you using the Type 4 or Type 5 PCIe cable? And do you connect 3 or 4 PCIe plugs? I have the same power supply but I haven't connected it yet.

  • @crisnmaryfam7344
    @crisnmaryfam7344 Год назад +4

    1:21 After what they said about GamersNexus I dont know if id trust anything from Corsair that has to do with the 40 series.

    • @mitchellp7739
      @mitchellp7739 Год назад +1

      Corsair already responded to that. It was just a dude who messed up. Not a big deal

  • @ErrorCDIV
    @ErrorCDIV Год назад

    3:30 That's really well put. Didn't know that.

  • @PerryCS2
    @PerryCS2 Год назад

    your ifixit commercials are hilarious. LOL!

  • @abdulhkeem.alhadhrami
    @abdulhkeem.alhadhrami Год назад +19

    The iFixit ad never gets old; I like it no matter how many times I've watched it!

    • @zlungbutterz3307
      @zlungbutterz3307 Год назад +6

      And if it does get old, fix it with I-fixit!

    • @superdbsfan8670
      @superdbsfan8670 Год назад

      @@zlungbutterz3307 😂😂❤

    • @5federline
      @5federline Год назад +1

      i buy ifixit at amazon because i watched his video. it really influence me 😅

  • @VadikRamm
    @VadikRamm Год назад +3

    Correct me if i'm wrong, didn't Nvidia explicitly say that 4 pin connectors MUST be connected to individual rails and make sure not to use splitters?

    • @Marin3r101
      @Marin3r101 Год назад +1

      Yeah... im pretty sure i remember this as well. Jays 4 to 1 was also connected only using 3 plus the breakout on one... not a good idea for OC.

    • @VadikRamm
      @VadikRamm Год назад

      @@Marin3r101 I would not be surprised if this is why his STRIX 4090 does not fire up :D

    • @Marin3r101
      @Marin3r101 Год назад

      @@VadikRamm No, the card should still post. Only 450 watts are required, unless the Asus VBIOS requires them all.

  • @AliYassinToma
    @AliYassinToma Год назад +1

    About the start of the video... no, it doesn't only take 2; it still takes 4. If you notice, the regular cables have 1 connector on the PSU side and 2 8-pins on the other side... so this adapter just skips 1 step. It's still the same number of cables or lanes from the PSU.

  • @ChitoWorld
    @ChitoWorld Год назад

    This was a good episode 😂

  • @marcelkanter
    @marcelkanter Год назад +7

    Hi Jay,
    just because you can't bend a cable doesn't mean that it's solid core. The insulation especially makes a big difference in the stiffness of a cable. If you don't want to cut the cable, you can x-ray it to see if it's stranded.
    BTW, the "smart" communication is just passive pins. Maybe later they'll implement something I2C-based, like the old-fashioned PMBus, on the PSU and GPU. They should have done this already. It's like cents for an EEPROM in the PSU.
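
For the I2C/PMBus idea mentioned above, here is a purely hypothetical sketch of what reading a PSU's output power over PMBus could look like, using the smbus2 package on Linux. The bus number and device address are placeholder assumptions; the READ_POUT command code (0x96) and the Linear11 decoding come from the PMBus specification.

```python
# Hypothetical sketch of the I2C/PMBus-style telemetry mentioned above, using
# the smbus2 package on Linux. The bus number and device address are
# placeholders; READ_POUT (0x96) and the Linear11 format come from the PMBus spec.
from smbus2 import SMBus

READ_POUT = 0x96  # PMBus command: output power

def decode_linear11(raw: int) -> float:
    """PMBus Linear11: 5-bit signed exponent (bits 15:11), 11-bit signed mantissa."""
    exponent = (raw >> 11) & 0x1F
    mantissa = raw & 0x7FF
    if exponent > 0x0F:          # sign-extend the exponent
        exponent -= 0x20
    if mantissa > 0x3FF:         # sign-extend the mantissa
        mantissa -= 0x800
    return mantissa * (2.0 ** exponent)

def read_output_power(bus_number: int = 1, address: int = 0x58) -> float:
    with SMBus(bus_number) as bus:
        raw = bus.read_word_data(address, READ_POUT)
        return decode_linear11(raw)

if __name__ == "__main__":
    print(f"PSU output power: {read_output_power():.1f} W")
```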

  • @Hampton_twitch
    @Hampton_twitch Год назад +3

    Corsair lost all my respect for the message they sent to Gamer Nexus regarding power supply. They were very unprofessional......disappointing to hear.

  • @thumbwarriordx
    @thumbwarriordx Год назад +1

    The PCIe power connectors are rated, at 12V, for more than double the 150W GPUs use them for.
    The cables, on the other hand, typically don't quite live up to that.
    But the math all checks out; it's no more insane to do this than to use the 12-pin connector in the first place lol

  • @Rayu25Demon
    @Rayu25Demon Год назад +5

    I think soon we will need a cooler for psu wires.

    • @nostrum6410
      @nostrum6410 Год назад

      no idea why they still use wires so small

  • @HirXeBomb
    @HirXeBomb Год назад +3

    Jayz got 12th place eventually.
    12. JayzTwoCents 28356 NVIDIA GeForce RTX 4090 AMD Ryzen 9 7950X

  • @CannibalToast
    @CannibalToast Год назад +1

    Someone needs to make a right angle adapter for this cable standard.

  • @barneybarney3982
    @barneybarney3982 Год назад

    Normal PCIe cables have it too... one sense pin on the 6-pin connector and a total of 2 pins on the 8-pin connector. They are even a similar setup: ground or open.

  • @argonzeit
    @argonzeit Год назад +38

    I would retest this with the original cables and see if your FE can overclock. Watching GN's video and hearing the Nvidia guy talk about it, you may need their adapter to get the full power.
    While Corsair's adapter may be built differently, the card itself may not want to go past its 450 watts if it cannot detect whatever signals the original adapter gives.
    Not to mention, who knows if it's even safe for the card long-term.

    • @misterio10gre
      @misterio10gre Год назад +1

      power doesn't affect the longevity of cards, voltage does, which may I add Nvidia doesn't even let you increase it this time around

    • @earthtaurus5515
      @earthtaurus5515 Год назад

      Good point and I hope Nvidia hasn't done anything to effectively lock performance and constrain people into using their squid cables...

    • @edwardallenthree
      @edwardallenthree Год назад +2

      There is literally no difference, from the perspective of the graphics card, in the power provided by Corsair's cable and Nvidia's cable when Nvidia's cable has four PCIe 8-pin cables connected.
      As Jay and Gamers Nexus have shown, the card determines its power limit based on what it senses from the cables. Both Jay and Gamers Nexus performed the same experiment, a little bit differently, and got the same result.
      The new standard, in my humble opinion, is simple, easy to implement, and far superior to the hodgepodge of old power standards and pigtailed six-pin and eight-pin connections.

    • @enntense
      @enntense Год назад +3

      @@misterio10gre unsure of this reasoning. Power is just V*I. If currents constant and voltage goes up power increases. Heat in devices is mostly from resistance to current.

    • @misterio10gre
      @misterio10gre Год назад +1

      @@enntense card runs at around 1.045 stock voltage, my 3090 does like 1.2 at max voltage headroom
      Also you are outright incorrect, heat is directly proportional to voltage but inversely proportional to current

  • @armhunter
    @armhunter Год назад +3

    They proved you're losing power if you don't use all four!

  • @MegaAssasin47
    @MegaAssasin47 Год назад +1

    Man, I love these freaking tests. I always learn something new. Thank you, Jay, and your wonderful team.

  • @lilwashingtonmusic4516
    @lilwashingtonmusic4516 4 месяца назад

    You have that tales from the crypt face lol.. not even joking

  • @alexdefelice2448
    @alexdefelice2448 Год назад +4

    Your iFixit ad will never get old. I watch it every time.

  • @joshc7200
    @joshc7200 Год назад +9

    Jay, try to either 1) disable deep sleep for HDMI/DP on your monitor or 2) use a different connector. This is a common issue.

  • @demetriescoad2396
    @demetriescoad2396 Год назад +1

    Hey Jay! I'm building my first ever PC soon. I've been watching a lot of your videos everyday to get a better idea of what I am doing and what exactly it is that I want to get. I have a few questions that maybe you or your community can help me out with:
    1. I'm interested in getting a threadripper CPU, mainly because I love spending a lot of money 😮‍💨, but I also plan on doing video editing as well as gaming. Is a threadripper necessary for what I'm trying to do?
    2. When a threadripper CPU is built into a PC what other parts of the build will need to be changed with it?
    3. What can be done to make sure the noise is kept to a minimum?
    Like I said I'm new to all of this and I plan on building my first PC this holiday season. I have a $5000 budget and I'm shopping for all of my parts on black friday/cyber Monday. Let me know if you have any shopping or build suggestions as well!

  • @lasthopelost9090
    @lasthopelost9090 Год назад

    I enjoy your vendetta and caring about aesthetic’s which seems to be your thing at least to me

  • @Azulath.
    @Azulath. Год назад +3

    Hey Jay, I have a similar issue with the MSI RTX 4090 Suprim X. I don't have any display output on my Gigabyte X570 Aorus Master and the card just won't work. (I have tested two different PSUs, one 1kW and one 1.6kW, both with the GPU's dongle and with the cable you are showing in the video.)
    I'm wondering, though, if the behaviour is identical. On my card, the fans do not spin up and no RGB lights up... Is it the same on the Strix? I think you mentioned the fans do not spin up, but you don't seem to mention RGB.

    • @calop002
      @calop002 Год назад

      Did you manage to solve the issue? I'm having a similar problem.

  • @KobraTrading
    @KobraTrading Год назад +7

    I still don't understand why they didn't just make it a plastic 90 degree bend from factory.
    All these problems (or at least most of them) immediately disappear.

    • @DJ3thenew23
      @DJ3thenew23 Год назад

      There are no problems, he literally bent the cable back and forth and nothing happened

  • @N0N0111
    @N0N0111 Год назад +2

    Ahh, look, Corsair, the brand that screams at other YouTubers.

  • @raylf3141
    @raylf3141 Год назад +1

    this is why i'm waiting for the atx 3.0 power supplies to launch at the end of the year.

  • @keithkamps77
    @keithkamps77 Год назад +8

    Use the supplied Nvidia cable with all 4 ports connected and your power slider will allow up too 133% voltage. Steve from GN demonstrated how this works, you need all 4 side band pins to be active.

  • @erichagen3617
    @erichagen3617 Год назад +6

    I love your vids as an older dude whose hobbies are PCs and working on cars!

  • @ej_tech
    @ej_tech Год назад

    That iFixit ad still hasn't gotten old.

  • @adamalmassudi530
    @adamalmassudi530 Год назад +1

    Does this void the warranty and is it reliable

  • @Krutzo11
    @Krutzo11 Год назад +3

    My question: why isn't there a 4x PCIe cable that goes into this 12-pin connector directly from the PSU? That way all of the conversion happens straight after the PSU, and not right before the GPU. That fixes the whole problem, and you still get 4 PCIe cables into the 12-pin connector.

    • @ItsssJustice
      @ItsssJustice Год назад +1

      Because some people don't have modular PSUs, different modular PSU manufacturers can have different pinouts on the PSU-end, which is why it's advisable to never mix cables from different modular PSUs. Making a single cable to deal with all that mess isn't worth it, it's cheaper to make a short one, if you design it to go all the way to the PSU, there will always be people complaining it's too short/long... While it sucks for the consumer, it would be a silly business decision for Nvidia to do it any other way. You MIGHT however be able to get replacement cables from your PSU manufacturer to address this properly however.

    • @metaleggman18
      @metaleggman18 Год назад +2

      Because by spec, you only need two 8-pin outputs. PCIe 8-pins are 3 power, 3 ground, a sense A, and a sense B. The new 12-pin connector (which is actually a 16-pin, 12 + 4 sense) is essentially just two 8-pins in a new form factor that is designed to work with both older "dumb" PSUs and new "smart" ATX 3.0 PSUs. The same type of connector and cabling is used in the CPU 12V power cables for stuff like the 12900K, and that sucker can draw almost 250W safely from one cable.

  • @mikeoleksa
    @mikeoleksa Год назад +3

    I love it when Jay does stuff "for science".

  • @HolyLandSite
    @HolyLandSite Год назад

    You should have posted a link to these cables. Would have been nice.

  • @brentsmithline3423
    @brentsmithline3423 Год назад +1

    Would be nice if Corsair offered a 90° plug at the end of the cable.

  • @18PSIFORYOU
    @18PSIFORYOU Год назад +4

    450-600 watts for a 4090.....that's what 30 series KINGPIN owners call..... owning a GPU.

    • @The44sasquatch
      @The44sasquatch Год назад +2

      I'm sad there's no 40 series K|NGP|N this go around 😢

  • @VRGamingTherapy
    @VRGamingTherapy Год назад +3

    If I was spending that kind of money on a 4090, then I'd just buy the MSI or Gigabyte PCIe 5.0 power supply that has a dedicated cable. They're only $200.

    • @Taipan303
      @Taipan303 Год назад

      I thought this was the dedicated cable?
      Or do you mean these PSUs have a dedicated port for 600W as well?

    • @VRGamingTherapy
      @VRGamingTherapy Год назад +1

      @@Taipan303 Yes. 1 port 1 cable.

    • @Taipan303
      @Taipan303 Год назад

      @@VRGamingTherapy nice. I'm only on 850W so maybe I'll upgrade my PSU to get one like this. I normally get Corsair or Seasonic though, surprised they don't offer the same.

    • @VRGamingTherapy
      @VRGamingTherapy Год назад

      @@Taipan303 These are the only ones I've seen yet. It's new tech I don't believe the rest of the manufacturers have anything until Jan, but keep checking.

  • @ABaumstumpf
    @ABaumstumpf Год назад

    It is great that you actually reference the spec to see what the sideband does (it isn't "smart," just 2 copper connections that set it to 600W and 2 optional connections that are not needed for power).
    Sadly it is rare that people do that, as we have seen with the outrage about the 30 mating cycles of the connector (people should check what other connectors are rated for; hint: not much more, if at all), or the supposed USB4 Version 2.

  • @BeachClub_1
    @BeachClub_1 Год назад +2

    If your PSU can deliver 300W per socket, this cable should work just fine. It is AWG 16, therefore no more current per conductor than for an ATX 3.0 12-pin cable.

  • @thetinker167
    @thetinker167 Год назад +7

    I never get tired of seeing the I-fix-it ad from this channel. Makes me laugh every time