Onboard graphics is such a blessing to have. When I finished my first build, I noticed the graphics card wasn't being utilized at all. It turned out that all it needed was a driver update. Had it not been for onboard graphics, I'd probably have panicked a little when the display didn't turn on.
How in the hell does a graphics card not work at all without a driver update? In my past experience they happily function with no driver at all other than whatever default driver Windows uses. This sounds like the manufacturer screwed up badly?
If you've got the money to afford a 4080 or a 4090, the power supply isn't your concern. It amazes me that people still cut wires when all you have to do is pull the pin out of the connector.
About the start of the video... no, it doesn't only take 2, it still takes 4. If you notice, the regular cables have 1 connector on the PSU side and two 8-pins on the other side, so this adapter just skips a step. It's still the same number of cables or lanes from the PSU.
I still don't understand why they didn't just make it a plastic 90 degree bend from factory. All these problems (or at least most of them) immediately disappear.
Hey. I'm sorry, my response is going to be completely unrelated to the video. I absolutely love the commercials that Jay and his team come up with for their sponsors. This ad makes you want to just go online and shop for an iFixit kit.
The PCIe power connectors are rated at 12 V for more than double the 150 W that GPUs use them for. The cables, on the other hand, typically don't quite live up to that. But the math all checks out; it's no more insane to do this than to use the 12-pin connector in the first place lol
Gods, that buzzsaw of a fan in the background. You guys are doing your best to noise gate, so it's only audible when Jay's talking, but holy crap that's loud!
Engineering-wise, I love this approach. It is a single universal connector that works for all GPUs and PSUs from massive to tiny and ensures the GPU won't overload the supply willy-nilly. An improvement over just using more and more old clunky 6/8/6+2 connectors. It's just the transition, with all these fugly 4-headed adapters all over the place, that sucks.
I watched the video from Gamers Nexus where Corsair was saying their tests were bullshit, and then they got smashed by Steve, who proved them wrong. So that cable will probably be a failure for Corsair this time.
Over the past couple years I have lost almost all interest in anything from nVidia due to their business practices. That said, I greatly appreciate the way Jay experiments with the new tech that is available to his team. Consistently great content! 👍
LOL, you're just finding this out? I had 3dfx stock back in the day, and Nvidia was going to give 2 shares of Nvidia stock for 1 share of 3dfx stock. So if you had 5,000 shares you'd end up converting them to 10,000 shares of Nvidia. But they found a better idea: they kept 3dfx as a shell company with no value, so they did not have to give the 3dfx shareholders a dime. So a guy with 5,000 shares of 3dfx at $5.00 a share ($25,000) watched them go to zero, so his stake was worth nothing. Lots of people lost all their money. And after that, Nvidia sent those shareholders a letter telling them they could now buy Nvidia shares. LOL, oh the number of angry people they created. They have not changed one bit in 2023. They screwed over so many people with that dirty move.
3:30 that statement just made me want to go on a rant about how people think a higher-amp brick will damage a device that came with a lower-amp one. Amps are pulled, not given.
Even with a Corsair 900D case, the cables coming from the 4090 Suprim X touch the glass, so I totally agree with your previous video: the design is pants for those with small PC cases.
Interesting video. Time to see whether, once prices come back down (scalpers again), I can decide if a 4090 is worth it or not. Probably not.
From what I've seen from GN, the 4 cables having power essentially just tells the gpu how much it can draw. Each one is 150W so 4 is 600W. 2 would only be 300W.
And the card needs 450 to boot. Remember, however, that it is fine to use the same 8 pin line for two connections (which is why they are pigtailed). So, the connection to the power supply when using corsair's cable and nvidia's would be the same: two connectors.
Hey Jay, I have a similar issue with the MSI RTX 4090 Suprim X. I don't have any display output on my Gigabyte X570 Aorus Master and the card just won't work. (I have tested two different PSUs, one 1 kW and one 1.6 kW, both with the GPU's dongle and with the cable you are showing in the video.) I'm wondering though if the behaviour is identical. On my card, the fans do not spin up and no RGB lights up... Is it the same on the Strix? I think you mentioned the fans do not spin up, but you don't seem to mention RGB.
With all these multi-ended cables, like tendrils, coming off the GPU and Corsair's drama with Steve/GN, this could literally be a new Squid Game [EDIT:] Your test @12:39 confirms what Steve said. Corsair told them their testing method was BS lol
FYI what you mentioned at 06:45 - it's fucking AWESOME we're getting onboard graphics for Ryzen now. I couldn't tell you the number of times I couldn't get a screen to appear with my 3080 because my BIOS was in UEFI mode, in which the FE 3080 apparently wouldn't display anything. So I had to get out my 1060, boot into the BIOS and enable legacy mode, then turn the machine off, put the 3080 back in, and everything was fine. With the board I have now, I am at least presented a screen letting me know to go into the BIOS (while using the 3080) and change the mode. Now I've got a 7700X coming in, and just knowing I'll be able to simply boot and mess with the BIOS using the onboard GPU is an amazing feature. Not overlooked in the slightest by this customer - big appreciation to AMD for that.
I agree with that statement 🥰👍💪. I think aesthetics overall are no where near the level of the 30-Series. But still impressive numbers overall, regardless of power draw. CPUs now have to catch up 😇💪
Damn...eventually we will just need a new PSU with a GPU and they will be bundled together if it uses that much power. That cable grouping is atrocious. I thought the double cable that came with my 1070 was awful, just going to run my 1070 into the ground before bothering with a newer GPU. 🤣 Also I'm still running a 3700x 😅🤣
I wonder if the new AMD cards will have the same kind of cable fun going on? I'd imagine we will, but their stuff tends to also be more efficient than team green so things might not be quite as crazy.
@@TalesOfWar yeah I was wondering the same thing when I was commenting. I'm just going to upgrade to a 2070 or idk I will have to research something better than my 1070 I have now if it's even worth doing a 2070 upgrade. I can imagine AMD having something smarter and more innovative though
Will the Corsair adapter allow 600 watts on a 4090? GN has an engineer from Nvidia saying all four are needed for full power. And Steve proved that you would not get 600 watts without all four being connected.
I think that's for the Nvidia adapter; the Corsair one is for their power supply, which should be able to handle that wattage over the cable/connector. Think about PCIe power cables from power supplies: some have more than one 8-pin on a cable, presumably because the power supply can supply that amount of wattage to those connectors. But if you're using the Nvidia adapter, it only sees power from each connector; coming from your power supply, you're probably still only using two separate feeds even if it's four 8-pin PCIe connectors.
Yes. It's because the cable Nvidia provides with their FE cards has an IC in the cable itself, at least according to the GN video. The 16 pin connector is essentially just two 8-pin pcie cables connected together, at least when used on an ATX 2.0 PSU.
Each PSU line can carry 288 W. Nvidia's adapter uses 4 because the old 8-pin GPU CONNECTOR is limited to 150 W; also, you might use a cable with two terminations, which would end up on a single line.
Yup, can't believe so many are confused by this; even youtubers who should know better are spreading misinformation. Has anyone even looked at the back of their PSU? Those 8-pin connectors are clearly marked PCIE 6+2, CPU 4+4, meaning each connector is also rated for an EPS12V 8-pin, capable of about 300 W easily. That's why Corsair only needs 2 connectors for this cable (of course you gotta be mindful of your PSU's total wattage though).
@@Glotttis So can I use two cables and just use the splitters on the end to feed the other two? So I'd only need two PSU cables that each split into two GPU power connectors? Or does this not work?
Yes, 2 PSU cables are enough. A Y-splitter is not OK though, because it would mean 300 W going through a single connector, which is out of spec. If by "splitter" you mean a cable with two GPU-side connectors, that's OK.
So this launch *is not about super exciting, affordable graphics cards!* It's all about the power supplies, cables and stuff. So sad! Does anyone remember RTX 3070 launch?! 2080 Ti performance for 499 bucks? That was cool ... and feels like ages ago.
The reason the PSU end is only 2x 8-pin Molex is because it has 4x 12 V lines (the larger gauge wire helps too), but the PCIe power 8-pin only has 3x 12 V; that's why they use a 3x or 4x 8-pin to one mini Molex Y-connector.
Jesus Christ why doesn't anyone pay attention. That only applies to Nvidia's cable adapter. That's what Steve from gamersnexus was talking about when calling out Corsair. Not Corsairs own cable. The way this cable works is completely different and has absolutely nothing to do with what Corsair and Steve were beefing over.
Didn't Corsair totally get it wrong as far as how that connector works? I just watched the Gamers Nexus video on that. Not sure I trust it now.
Corsair just posted a statement
Corsair posted an apology for disrespectful behavior and language but they did not admit fault or give a technical answer.
@@johnhowarth7216 Nope, GN is right; they also have the specs straight from Nvidia. In the 4x 8-pin adapter there is an extra IC that checks whether EACH 8-pin is plugged in and then sends the right signal to the sense pins.
@@johnhowarth7216 Nope. If you had watched, GN provided the source for their claims (Nvidia), so it looks like Corsair doesn't know squat.
@@johnhowarth7216 Confused too. But my understanding is that anything less than 4 cables can only draw 600 W, whereas all 4 cables can draw 630 W, which is good for overclocking. Also, I did an analysis of the spike testing GN did earlier, and 40% transient spikes on a 450 W power source will go over 600 W.
Hi Jay, Brian from ASRock (Malaysia) here.
For your RTX issue, could you try changing the option under Advanced > AMD PBS > AMD Common Platform Module > PCIe x16 Link Speed?
One of my local distributors had a problem with a Zotac card on our X670E board too; it refused to display, but in the end it worked after some BIOS setting changes.
Bump
@@Frostbite1090 Thanks. Jay didn't provide many details, but I believe it was the same issue. Our case was a ZOTAC GAMING GeForce RTX 4090 AMP Extreme AIRO on an ASRock X670E Steel Legend. For some reason it didn't work at first but later did; we got it settled within 24 hours and it worked perfectly for the RTX 4090 launch event.
UP
@@PlayMates_ Thanks.
We hope he sees this!
Also keep in mind that many PSUs offer two 6+2-pin connectors off of one cable back to the PSU, so two 8-pins back to the PSU should very reasonably be fine anyway. So basically this is not really any different from a fairly common way other people might populate the four 8-pin connectors available.
My PSU does have these kinds of cables (Corsair RM850x), so could I power a 4090 with it? It has 3 double cables... I'm not sure how it really works.
Imagine. My PSU only has one 6+2pin and two 6 pin. To use squid adapter I would need four 8 pin connections. #1 is solved immediately. #2: A two-6 pin to 8 pin adapter. #3: Two molex to 6 pin x2 then a two-6 pin to 8 pin. #4: same as #3. That's a total of 8 adapters. 😂
@@MrMirukou You can power it with the RM850x. Tech Yes City did all of his day-one 4090 reviews on that power supply. As for the wiring of the PSU, I can't say how he had it wired.
@@MrMirukou Cables with two 8-pin connectors only deliver the same amount of power as a single 8-pin cable, which is 150W, IIRC. I would avoid using too many of those.
@@ShmeeGrim That is simply not true.
The PCIe 8 pin standard is essentially implicit communication. It says that if all 8 pins are connected correctly, the device (GPU) can draw 150W. Even if you make a cable and PSU capable of supplying 1.21 jigawatts, the device (GPU) has no way to know that, so will only ever draw up to 150W sustained per 8 pin connector.
Corsair's cables can sustain 300W. But that will only be utilised if provided over two 8 pin PCIe connections, so their cables terminate with those two 8 pin PCIe connections.
At the other end, there's an 8 pin connection to the PSU. That connector IS NOT an 8 pin PCIe connector. It's a connector bespoke to the PSU manufacturer. It can output up to 300W. It only works correctly on the manufacturer's PSU, as the pinout is often not the same. (NEVER use a modular cable from a different manufacturer, unless you fancy risking connecting a ground pin on your GPU to a 12V pin on your PSU.)
So, each connector on the Corsair PSU absolutely CAN deliver up to 300W, and using the twin 8 pin PCIe connectors at the other end CAN allow 300W to be drawn.
The nVidia adaptor needs four 8 pin PCIe connections as each such CONNECTOR only guarantees 150W. You could be attaching four separate 150W PSUs. You could be attaching a nuclear power station over solid copper bars. nVidia can't tell, so they will only ever draw 150W per 8 pin PCIe connector.
Corsair have made a cable for THEIR PSUs. They wire the 12+4 pin connector to tell the GPU that 600W is available. They make the gauge of the cabling capable of delivering that. They make the PSU end of the cable connect to TWO of their 300W PSU outputs. Thus, their single cable CAN sustain 600W from their PSUs.
Note that if you plugged in only one connection at the PSU end, only one of the two sense wires would be grounded (the other would be left "open"), and the GPU would be able to tell. Thus, the GPU would only be "allowed" to draw 300W from that connector.
An 850W Corsair PSU absolutely CAN deliver 600W over either two of corsair's twin headed PCIe cables, or one of their 12+4 pin HPWR cables. As that leaves only 250W for everything else, including transients, I'm not sure I'd want to. I'd rather ensure the GPU be limited to 450W to leave enough capacity for your other components.
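A minimal back-of-the-envelope sketch of that last budgeting point (all wattages are the illustrative figures from this comment, not values from any spec sheet):

```python
# Rough headroom check: how much of an 850 W PSU is left for the rest of the
# system at each GPU power limit. Purely illustrative numbers.
PSU_W = 850

for gpu_limit in (600, 450):
    rest = PSU_W - gpu_limit
    print(f"GPU capped at {gpu_limit} W -> {rest} W left for CPU, drives, fans and transients")
# 600 W leaves 250 W, 450 W leaves 400 W -- hence the suggestion to cap the card at 450 W.
```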
Just as a note, Jay: solid-core cables can break fairly easily if you bend them a lot. Might be something to be cautious of for your testing rigs, in case they start to break and short/overheat.
if you're in there bending your GPU PSU cable a LOT... you got bigger problems you caused lol...
@@wojtek-33 He's talking about over time; that's why most cables have a bend rating.
As history has proven again, Jay has limited knowledge of electronics.
100% that is not a solid-core cable; yes, he misspoke again...
We are going to start to need new cable tech soon, which is kinda gross. 60FPS at 4K is fine, stop it!!!
but...
@@warpedphreak don’t forget, as a reviewer, Jay is regularly swapping components. Regularly unplugging power cables. Yes, in a personal rig, you shouldn’t have that problem.
The standard has 4 sense lanes. The first two produce a binary number (with 3 meaning 600 W). No IC needed; just connect both to ground to produce a 0b11 (i.e. 3). The other two are "power OK" and "GPU connected", and both are optional.
edit: just got to the part of the video where you say this and reference the same page of the spec I did. Should never have doubted your journalistic chops, Jay!
edit 2: awesome video, explained the sense pins very well and made me actually more excited for the future.
Can't wait to see pictures of people molex>6pin>8pin>12 pin. And having two speaker wires coming out from the side band, grounding into the case with techscrews (ground IS ground ¯\_(ツ)_/¯ )
^not unlike shi**y audio setups in cars :))
@@IcecalGamer as long as they include adapter cables with the cards, I don't think we're going to see much of that. I hope I'm right. Just for the sake of people's homes and their neighbors.
To be fair, Jay isn't always right, no harm in double checking what he says. Literally last video he said these are PCIE 5.0 cards when they're 4.0. At best he meant it figuratively, but he didn't present it as such.
@@blarghmcblarghson1903 he did? Weird. It's an issue for me. I use the additional slot for a mellanox (I won't call them Nvidia) infiniband card for my storage network, so being able to use the GPU at 8x without a performance hit is a feature I want.
A 4-to-1 adapter cable (and a 2-to-1 too) will still need an IC to drive the two sense pins, as it needs to count how many cables are connected on the ATX 2.0 end; any three out of four should produce the same sense signal, because you cannot rely on users populating them in the order prescribed by the engineer. I didn't see an IC on the Corsair cable, and because it's got two inputs I half expect it to incorrectly report the higher mode if only one cable is connected, to the wrong input.
Jay's impression of the Side-band sensor talking was hilarious.
Side-band: "Uhm we're only getting 300 watts."
Controller: "Well then you don't get video." 🤣
The power is defined by what consumes it (GPU).
What produces the energy (PSU) only defines the power limit it can handle.
Since the GPU is being told by the sense pins that the available power limit is under what it needs, it doesn't even try to start because it "knows" that it will go over the limit of the PSU and probably damage it.
Jay, you cannot say the power is "there"; what is "there" is only the voltage in volts, which, multiplied by the amperes the GPU's resistance allows, gives you the power in watts.
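As a quick worked example of that relationship (the 37.5 A / 450 W figures are purely illustrative, not measured values):

```latex
P = V \cdot I = V \cdot \frac{V}{R_\text{GPU}},
\qquad \text{e.g. } 12\,\text{V} \times 37.5\,\text{A} = 450\,\text{W}.
```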
Fun fact: if the wire gauge is the same, stranded can handle more amps. Solid core is cheaper to make, and solid core is used in building wiring because it does not get moved around; it is set and forget, so as long as it is cheaper than stranded it makes sense to use. It is also easy to screw solid-core wires to the back of outlets, but outlets were made for solid core, as that is the standard product there.
Jayz got 12th place eventually.
12. JayzTwoCents 28356 NVIDIA GeForce RTX 4090 AMD Ryzen 9 7950X
“Well then you don’t get video!” Made me laugh for real. Jay can always make a boring subject entertaining. That’s why he’s the man.
But Jay, Steve just did this test and found you need all four or you will only get 400 watts.
450
and that Corsair said Steve was full of shit lol, after which he promptly proved them wrong.
Can you link that video?
yeah I wanted to see Jay try to pull 600 with that setup
Ya, but 300 watts will give you 90-95 percent of your performance. Using the full power on 4090s isn't worth the marginal performance gain.
me watching it on my 1060 : mmm yes yes
Seasonic making a native dual 8-pin to 12-pin cable was mint. So satisfying swapping that in for my 3070 Ti FE
@@morganlimes cool
The correct term for the "kickstart" voltage is the "inrush" voltage. This is common throughout low-voltage DC electronics as well as large circuit breakers for industrial motors. The inrush is akin to opening the floodgates of a dam: there is a large starting deluge of water, or power in this instance, and then it tapers down to the steady supply level.
Could anybody please try to run an overclocked 4090 with the NVIDIA adapter using a couple of Y-splitters off a single 8-pin PCIe cable? I want to see the cables melt :-D
I'm sure Dawid will have video of himself doing this unintentionally before the month is out.
@@Eric_Fate Oh god, here we go!
I'm looking forward to someone using molex to 8 pin to 12 pin.
More adapters is more better, right? Right!!
I'm not sure you would. The issue is the connectors, not the wires. Three pairs of 18 AWG wire won't melt at 450 watts, but the connectors might.
@@PREDATEURLT perfectly fine if it uses 12 molex connectors.
NOTE: the power that you get from this Corsair cable will vary depending on the power supply used: it calls for a 1200 W Corsair PSU to deliver the full 600 W through the cable (1000 W PSU = 450 W, 750 W PSU = 300 W). I did not see this mentioned in the video.
This is false lol.. do the same test he is doing and you can see the wattage being supplied. I have a 1000 W Corsair and this cable, and I've had my Strix card up to 527 watts. So it is definitely over 450.. maybe that's changed, I guess, but as of now it isn't true.
I have a rm850x and it goes up to 587 watts in furmark with the cable.
@@crimson7151
I also have rm850x. This might be the reason why I'm getting a black screen with this cable.
@@TheCaciKO Yes, it's possible if you have a CPU like the 13900K.
@@crimson7151 Are you using the Type 4 or Type 5 PCIe cable? And do you connect 3 or 4 PCIe plugs? I have the same power supply but I haven't connected it yet.
I'm curious to see how the super compact power supply cables behave thermally. A thermal camera video or section of reviews would be really helpful, especially since these cards risk igniting their own cords at max load.
Long cables with a lot of connectors are usually way worse than shorter cables without them.
Fortunately these products have specs that are available online, and they have actually been tested for their power rating. Micro-Fit connections from Molex are almost as good in terms of power delivery as Mini-Fit Jr. (with a rating of 8.6 A versus 9 A on the larger connector). When you consider that the standard would normally use pigtails on 8-pin connectors, four 8-pin connectors would use 6 positive wires and 6 grounds for 600 W, and a single 16-pin connector uses 6 and 6 as well.
AWG 18 (~0.8 mm²) cable is the most common across power supplies and can easily carry 12 amps per conductor; in Jay's video it looks like AWG 16 (~1.3 mm²) may now be the standard being moved to, easily carrying 15 amps. On the 12 V rail that works out to roughly 144-180 W per wire, so these cards risking igniting their "own" cords is not a problem.
@@hexdator2934 good point. the reason I didn't bother to talk about wire gauge is that the limiting factor is the pins in the connector at the lengths used in a PC.
@@edwardallenthree You went to an even greater level of detail... 👍 I was replying to the OP.
Sense signals S3 and S4 are not for remote voltage sensing as people might think. They tell the card how much power the PSU can provide. The logical combination of Sense0 and Sense1 tells the card how much power it can use, according to the table below. In the PSU, the appropriate Sense0 and Sense1 signals must be pulled to ground or left open to tell the card what the power limits are. These signals may dynamically change, but only when the power supply is in standby mode; if they change while the 12V to the card is on, the card might not recognize the change and act appropriately on the new setting.
The Sense1 line is pin 4 (the rightmost pin when reading the 600W label) and Sense0 is pin 3.
Sense0   Sense1   Boot power   Maximum sustained power
Gnd      Gnd      375 W        600 W
Open     Gnd      225 W        450 W
Gnd      Open     150 W        300 W
Open     Open     100 W        150 W
If the card supports this feature properly and you remove those lines from the cable, the power limit should drop, because now both Sense0 and Sense1 are open.
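For anyone who prefers to read the table as logic, here is a minimal sketch of that decode; the pin states and wattages come straight from the table above, while the names and structure are just illustrative:

```python
# Decode the two 12VHPWR sense pins into (boot power, maximum sustained power) in watts.
# True = pin pulled to ground, False = pin left open, per the table above.
LIMITS = {
    (True,  True):  (375, 600),
    (False, True):  (225, 450),
    (True,  False): (150, 300),
    (False, False): (100, 150),
}

def power_limits(sense0_grounded: bool, sense1_grounded: bool) -> tuple[int, int]:
    """Return the boot and sustained power limits the card is allowed to assume."""
    return LIMITS[(sense0_grounded, sense1_grounded)]

print(power_limits(True, True))    # (375, 600) -- both pins grounded
print(power_limits(False, False))  # (100, 150) -- both pins open, e.g. sense lines removed
```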
Given the comment they made to GN on the power requirements and how wrong the rep was, I 100% would not use any cable from Corsair. Gonna burn your house down.
No, it'd be fine. The 16 pin cable is essentially just two 8 pin cables combined.
The original Corsair post was incorrect, but it wasn't incorrect about their own product; it was incorrect about Nvidia's product. It was simply a misunderstanding and they have apologized. It had nothing to do with whether or not the Corsair product is technically good. There is no difference in power between using the Nvidia adapter and Corsair's, other than Nvidia's adapter producing a messy bunch of cables which might cause other issues. The IC on Nvidia's adapter is designed to count how many cables are connected to it. Corsair's cable does not need to count how many cables are connected because there is only one configuration option, and that is to plug both cables into your power supply. No need for an IC when the pins will always both be connected to ground.
@@metaleggman18 an eight pin cable has three lanes of 9 amps each, for a max power of around 324 watts, however the spec limits it to 150 watt (this difference is why you see pigtails). The 16 pin cable has 6 lanes of 8.5 amps each, for 612 watts rated power, or limited to 600w by the spec. So, yes, it is exactly like two 8 pin connectors, although it has more 12v wires than two 8 pins, and runs much closer to the spec of the underlying components, in theory.
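Put as arithmetic (using the lane counts and per-pin current figures quoted in that comment, which may vary by connector vendor):

```python
# Rated vs. spec-limited power for the two connector types discussed above.
V = 12  # volts on the +12 V rail

eight_pin_rated   = 3 * 9.0 * V   # 3 current-carrying pairs at 9 A   -> 324 W
sixteen_pin_rated = 6 * 8.5 * V   # 6 current-carrying pairs at 8.5 A -> 612 W

print(f"8-pin:  {eight_pin_rated:.0f} W rated, capped at 150 W by the PCIe spec")
print(f"16-pin: {sixteen_pin_rated:.0f} W rated, capped at 600 W by the 12VHPWR spec")
```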
Got my 4090 a day or two ago, and you are sooo right. The goddamn cablemess is terrible. Ordered a Corsair cable for my Corsair PSU to neaten things up.
Did it work? Which Corsair psu do you have?
@@klipschorny I have the Corsair 600W 12VHPWR cable as well. I have a Corsair 1000RMx and it works perfectly.
@@klipschorny yes it worked fine. I have a RM850X
Wait, what? This cable will only draw 600 W if you have an RMx 1200 W supply or above. Anything less, and it defaults to 450 W. But people are ignorant, so they deserve to get scammed.😂🤣😂🤣
@@csguak why the fuck do you need 600 watts of juice to power a graphics card? Are you building a delorean or sum? 😂
oh look! its the same Corsair that apparently doesn't know how the 4090's power delivery works lol
They issued an apology already. Apparently the comment was made independently by one of their employees.
Typical back-pedaling when found out.
Isn't it always just "one employee" when somebody fucks up, but the whole company when an employee does something extraordinarily good? 😏
I just have to say this: I've seen your iFixit sponsor short like a hundred times, and to this day it still brings a smile to my face xD
It's just so funny and it never gets old xD
Have to admit, man, this is the best iFixit ad ever. You are the best, Jay. I'm a daily visitor now. Your videos are really informative. Massive thanks, man.
Bleeding edge tech!
There are no tricks involved in the Corsair adapter. Some of the cables you get with this PSU give you two 8-pin connectors for each cable coming from the PSU. That means a single PCIe connector ON THE PSU can supply 300 W, so two will give you 600 W.
Something that der8auer tested is lowering the power limit: he got it down to 50% and only saw a 5-15% dip in performance and a 200 W drop in power usage.
I think that should be tested by others as well; they would not have needed massive coolers and special connectors. Maybe they're scared of AMD. Interested to see what will come from them.
well no surprise there. 30 series behave the same way :p
And AMD too actually. They don't show the total board power so they appear to be more efficient, but they also benefit from lower power limits in the same way with a really minimal hit in performance.
those last FPS are a b**** to get for the benchmarks.
That intro, very reminiscent of early-era morning TV ads, very nice.
Those 90s ads give me nostalgia.
I miss the good ole days
Electrical engineering 101:
electrons flow on the outside of the wire, so stranded will carry more current than solid core, making the point you were trying to make around the 2-minute mark invalid.
Jay, try to either 1) disable deep sleep for HDMI/DP on your monitor, or 2) use a different connector. This is a common issue.
For the adapter that comes with the 4090, you do not need to connect four 8-pins, just three, as per their instructions. The 4th plug is optional and is for overclocking.
Can I get this cable with the two 8-pins for the 4090? Or do I need a custom cable that has 4?
Hi Jay,
just because you can't bend a cable doesn't mean it's solid core. The insulation especially makes a big difference in the stiffness of a cable. If you don't want to cut the cable, you can X-ray it to see if it's stranded.
BTW, the "smart" communication is just passive pins. Maybe later they'll implement something I2C-based, like the old-fashioned PMBus on the PSU and GPU. They should have done this already; it's like cents for an EEPROM in the PSU.
I think the reason there is a 4x 8-pin adaptor is for PSUs that have multiple rails on the 12 V. It is mostly a CYA adaptor, while most Corsair PSUs are single 12 V rail by design, so they can basically just use two 8-pins to the 12-pin high-power connector.
Yeah, most PSUs can do one 8-pin PSU-side connector to a 2x 8-pin PCIe pigtail.
The reason for the 4x 8-pin adaptor is simply that each PCIe 8-pin is rated at 150 W (it is the standard), so you need 4 of those to get up to 600 W. You don't design a 2x 8-pin adaptor and rate it for 600 W, because it would be a fire hazard.
The adaptor itself can sense how many cables are attached, so if you only attach 3, it will tell the GPU that the max power is 450 W. Even if you use a Corsair PSU with this adaptor, you still need to connect all 4 to get the 600 W max.
One potential problem in the future is that if people end up breaking or losing their adaptor (maybe buying a 2nd-hand 4090 that doesn't come with the adaptor), they might simply try to purchase the cheapest adaptor, and if that adaptor is not a smart one, it could end up with the GPU trying to pull 600 W from one 8-pin.
@@RurouTube While the PCIe spec may say 150W per 8-pin ATX, in reality the three positive wires and Molex Mini-Fit Jr. pins are capable of much more than that if properly spec'd. Roughly 108w per pin, or 300W+ per 8-pin connector. Since Corsair specs the cable and the power supply, one must assume they know what their pins and wires are rated for and thus this is a 100% safe and compliant use case at 600w.
@@bradley3549 What I wrote is not about whether Corsair can do 600W safely from a single 8pin but more about the reasoning why Nvidia included this 4x8pin adaptor. It is not about single rail or multi rails, it is simply because of the spec. The spec for 8pin is 150W. You can't expect every single PSU on earth to safely do what Corsair PSU or its cable can. Edit: this is just from safety issue. There is also from function stand point where while PSU at this category should be able to supply more than 150W from a single 8pin, there might be edge cases where it can't. Also just having 2x8pin might ended up making people have a false sense of security thinking that their PSU with 2x8pin should be able to handle 4090 when in reality it can't.
@@RurouTube Spot on.. and some PSUs really use the bare minimum gauge to get 150 W through a PCIe cable.
The cable needs to follow the specification anyway. You don't deviate from the rules just because "it should be alriiiight" ^^
Use the supplied Nvidia cable with all 4 ports connected and your power slider will allow up to 133%. Steve from GN demonstrated how this works: you need all 4 sideband pins to be active.
It appears there WAS some fire between Corsair and Gamers Nexus *(Corsair apologized)*. Corsair was calling BS on GN's statement that all four PCIe power plugs need to be connected to the Nvidia FE power adapter to enable the 133% power budget overclock; GN showed in a follow-up video that with only three you can't go over 100% power budget.
This video just proved GN isn't talking BS: you do need the 600 W spec to get the power budget to 133%. Nvidia's adaptor can sense how many PCIe power plugs are connected to it, potentially creating a problem even when the power supply can do 600 W over PCIe.
Didn't Corsair spread misinformation about the 4th/3rd plug on the adapters? Gamers Nexus made a recent video about it, showing that Corsair has no idea what the fuck they are talking about when they address the new plug and standard.
this^
Yes they did, but when you see how they built their own extension you can see why they thought what they thought, why they were wrong, and why their product is still perfectly okay. These cables are fine. I just read the spec, and essentially I think the Corsair person was just confused and didn't realize that Nvidia built their adapter differently than Corsair built theirs, but both will work fine for a 4090 or any other ATX 3.0 GPU that has ever been produced.
The Corsair one just uses 2 ground wires, the Nvidia adapter uses 4 ground wires and an IC to check them.
@@gucky4717 Exactly. And let's be clear: when we say "IC" we think of some complicated logic, but it's really not doing anything that complicated. It's just counting the number of populated (old-style) connectors via their sense pins and converting that to a two-digit binary number.
That only applies to Nvidia's cable adapter. That's what Steve from gamersnexus was talking about when calling out Corsair. Not Corsairs own cable. The way this cable works is completely different.
I'm fairly new to PCing. I upgraded my PSU because I upgraded my GPU. I got a BeQuiet! 12 M 850 W, and all of the cables that came with that were solid-core wires as well. Now, I did grow up building houses, and for those who are worried about the wire stiffness: it's a really good thing for power draw and delivery. The trade-off is that they are harder to work with for cable management. Pros and cons and such.
Edit: I did use the 12-pin PCIe 5.1 cable that was provided to get rid of the dual-to-single pigtail on the 4070 Ti Super OC I upgraded to. Looks a lot better for those who are chasing visual cleanliness.
Going back to the IDE cabling days lol
This is why I'm waiting for the ATX 3.0 power supplies to launch at the end of the year.
I never get tired of seeing the iFixit ad from this channel. Makes me laugh every time.
Seeing Jay hold that "bow" at 5:18 made me realize how perfect he'd be to play Little John in Robin Hood. Hear that, Hollywood?
Even though I'm gonna get under four hours of sleep before my hectic day I'm very relaxed because I'm watching JayzTwoCents.
Good old Lew unboxing stuff for us on 4 hours of sleep 😴
18:30 for a topic that has absolutely zero impact on anything but looks. Well done.
I would retest this with the original cables and see if your FE can overclock. Watching GN's video and hearing the Nvidia guy talk about it, you may need their adapter to get the full power.
While Corsair's adapter may be built differently, the card itself may not want to go past its 450 watts if it cannot detect whatever signals the original adapter gives.
Not to mention, if it's even safe for the card long-term.
Power doesn't affect the longevity of cards, voltage does, and I might add that Nvidia doesn't even let you increase it this time around.
Good point and I hope Nvidia hasn't done anything to effectively lock performance and constrain people into using their squid cables...
There is literally no difference in the power provided by Corsair's cable and Nvidia's cable, from the perspective of the graphics card, when Nvidia's cable has four PCIe 8-pin connectors plugged in.
As Jay and Gamers Nexus have shown, the card determines its power limit based on what it senses from the cables. Both Jay and Gamers Nexus performed the same experiment, a little differently, and got the same result.
The new standard, in my humble opinion, is simple, easy to implement, and far superior to the hodgepodge of old power standards and pigtailed six pin and eight pin connections.
@@misterio10gre Unsure of this reasoning. Power is just V*I. If current's constant and voltage goes up, power increases. Heat in devices is mostly from resistance to current.
@@enntense card runs at around 1.045 stock voltage, my 3090 does like 1.2 at max voltage headroom
Also, you are outright incorrect: heat is directly proportional to voltage but inversely proportional to current.
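For readers following this exchange, the textbook relations are P = V·I for delivered power and P = I²·R for resistive heating in a conductor. A tiny sketch with made-up numbers, just to show how the two quantities scale:

```python
# Illustration of the standard relations being debated above:
# delivered power P = V * I, and resistive heating in a conductor
# P_loss = I^2 * R. All values below are made up for illustration.

def delivered_power(volts: float, amps: float) -> float:
    return volts * amps

def conductor_heat(amps: float, resistance_ohms: float) -> float:
    return amps ** 2 * resistance_ohms

rail_v = 12.0   # nominal GPU supply rail
wire_r = 0.01   # assumed round-trip resistance of one conductor, ohms

for amps in (10, 20, 30):
    print(f"{amps} A -> {delivered_power(rail_v, amps):.0f} W delivered, "
          f"{conductor_heat(amps, wire_r):.1f} W lost as heat in the wire")
```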
If your PSU can deliver 300W per socket, this cable should work just fine. It uses 16 AWG wire, so there is no more current per conductor than with the ATX 3.0 12-pin cable.
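A quick back-of-the-envelope on the per-conductor current, assuming the full 600W is carried over six 12V conductors (my reading of the 12VHPWR layout, not a datasheet):

```python
# Back-of-the-envelope check of per-conductor current, assuming the cable
# carries 600 W over six 12 V conductors (plus returns). Treat the figures
# as a sketch rather than a spec.

total_watts = 600.0
rail_volts = 12.0
live_conductors = 6

total_amps = total_watts / rail_volts           # 50 A overall
amps_per_wire = total_amps / live_conductors    # ~8.3 A per 16 AWG wire

print(f"Total current: {total_amps:.1f} A")
print(f"Per conductor: {amps_per_wire:.1f} A (comfortably below common ratings quoted for 16 AWG)")
```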
Excellent demonstration video, JayZ. QUESTION: can you please elaborate on how "monitors plays a part in this, too"? Thanks!
You can push all the frames at 4K you want on a 1080p 60Hz monitor and you probably won't notice a difference between that and 1080p 60fps. So if you have a lower-end monitor, your GPU won't need to work as hard to run the resolution and frame rate that are optimal for you. That's why you should always match your GPU to the monitor you have or plan to have in the near future. It doesn't make sense to run a 40 series card, or a 70 series card or a 6800xt/6900xt for 1080p gaming because you aren't utilizing the performance you paid for. As soon as you step up to 1440p or 4K and a high refresh rate (120+ Hz) it starts to make sense to go with a higher-end GPU, but honestly no one needs a 4090, even for 4K gaming. It sits in a weird space where it's too expensive and overkill for general consumers, and not powerful enough for professional use where Quadros dominate the space.
@@abnormallynormal8823 "It doesn’t make sense to run a 40 series card, or a 70 series card or a 6800xt/6900xt for 1080p gaming because you aren’t utilizing the performance you paid for."
What? You can't just assume by default that you aren't utilizing the performance. It depends on the game and settings. A lower refresh rate and resolution are always an opportunity to increase graphics settings. Now, if you've already maxed them out and still aren't anywhere near 100% GPU usage (assuming an appropriately powerful CPU), then yes, your card's potential is kinda being wasted. But unless that's the case in all the games you play or might want to play, you can't really say you've overpaid for the GPU.
That's obviously not touching the possibility that you might also need/want the card's performance for professional use (like rendering, AI and other compute stuff) where your monitor doesn't matter. But that's another story
You should have posted a link to these cables. Would have been nice.
Onboard graphics is such a blessing to have. When I finished my first build, I noticed the graphics card wasn't being utilized at all. It turned out that all it needed was a driver update. Had it not been for onboard graphics, I'd probably have panicked a little when the display didn't turn on.
How in the hell does a graphics card not work at all without a driver update? In my past experience they happily function with no driver at all other than whatever default driver Windows uses. This sounds like the manufacturer screwed up badly?
If you've got the money for a 4080 or a 4090, the power supply isn't your concern. It amazes me that people still resort to cutting wires when all you've got to do is pull the pin out of the connector.
I love your vids as an older dude whose hobbies are PCs and working on cars!
About the start of the video... no, it doesn't only take 2, it still takes 4. If you notice, the regular cables have 1 connector on the PSU side and two 8-pins on the other side... so this adapter just skips a step. It's still the same number of cables or lanes from the PSU.
Super excited about the new GPUs, but mostly waiting for another round of the Jay vs. Steve overclocking competition and them trash-talking each other. 😂
The iFixit ad is literally my favorite YouTube commercial. It's so dorky it never gets old.
I still don't understand why they didn't just make it a plastic 90-degree bend from the factory.
All these problems (or at least most of them) immediately disappear.
There are no problems, he literally bent the cable back and forth and nothing happened
Getting my card on Tuesday and I couldn't care less about what it's gonna look like in my case.
Corsair lost all my respect for the message they sent to Gamers Nexus regarding power supplies. They were very unprofessional... disappointing to hear.
5:55 Jay, don't cut the cable, just slide the pins from the connector and put electrical tape on the pins...
Looks like that's what he did at 12:30 and 16:35.
Also try the cable out on an SF750, let's see if ITX is even viable.
Hey. I'm sorry, my response is going to be completely unrelated to the video. I absolutely love the commercials that Jay and his team come up with for their sponsors. This ad makes you want to just go online and shop for an iFixit kit.
Your iFixit ad will never get old. I watch it every time.
This adapter is a huge marketing mistake.
I love it when Jay does stuff "for science".
The PCIe power connectors are physically rated for more than double the 150W at 12V that GPUs use them for.
The cables on the other hand are typically not quite living up to that.
But the math all checks out, it's no more insane to do this than to use the 12 pin connector in the first place lol
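A rough sanity check of that claim, assuming three live 12V pins per 8-pin connector and terminals around 8A per pin (the exact terminal rating depends on the connector and wire gauge, so treat the numbers as illustrative):

```python
# Rough sanity check of the "rated at more than double" claim above.
# An 8-pin PCIe connector has three live 12 V pins; Mini-Fit style
# terminals are often quoted around 8 A per pin (assumed figure).

pins_12v = 3
amps_per_pin = 8.0   # assumed terminal rating
volts = 12.0

physical_capacity = pins_12v * amps_per_pin * volts  # ~288 W
spec_limit = 150.0                                   # what the spec lets a GPU draw

print(f"Physical capacity ~{physical_capacity:.0f} W vs. {spec_limit:.0f} W spec limit")
```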
I think soon we will need a cooler for psu wires.
No idea why they still use such small wires.
Gods, that buzzsaw of a fan in the background. You guys are doing your best to noise gate, so it's only audible when Jay's talking, but holy crap that's loud!
Correct me if I'm wrong, but didn't Nvidia explicitly say that the four 8-pin connectors MUST be connected to individual rails, and to make sure not to use splitters?
Yeah... I'm pretty sure I remember this as well. Jay's 4-to-1 was also connected using only 3 cables plus the breakout on one... not a good idea for OC.
@@Marin3r101 I would not be surprised if this is why his STRIX 4090 does not fire up :D
@@VadikRamm No, it should still POST with the card. Only 450 watts is required, unless the Asus VBIOS requires them all.
Engineering-wise, I love this approach. It is a single universal connector that works for all GPUs and PSUs, from massive to tiny, and ensures the GPU won't overload the supply willy-nilly.
An improvement over just using more and more of the old clunky 6/8/6+2 connectors. It's just the transition, with all these fugly 4-headed adapters all over the place, that sucks.
1:21 After what they said about GamersNexus, I don't know if I'd trust anything from Corsair that has to do with the 40 series.
Corsair already responded to that. It was just a dude who messed up. Not a big deal
I watched the video from Gamers Nexus where Corsair was saying their tests were bullshit, and then they got smashed by Steve and proven wrong, so that cable will probably be a failure for Corsair this time.
Over the past couple years I have lost almost all interest in anything from nVidia due to their business practices. That said, I greatly appreciate the way Jay experiments with the new tech that is available to his team.
Consistently great content! 👍
LOL, are you just finding this out? I had 3dfx stock back in the day, and Nvidia was going to give 2 shares of Nvidia stock for 1 share of 3dfx stock. So if you had 5,000 shares you would end up converting them to 10,000 shares of Nvidia. But they found a better idea: they kept 3dfx as a shell company with no value, so they did not have to give the 3dfx shareholders a dime. So a guy with 5,000 shares of 3dfx at $5.00 a share ($25,000) saw them go to 0, so his worth was 0. Lots of people lost all their money. And after that, Nvidia sent those shareholders a letter asking, or telling, them they could now buy Nvidia shares. LOL, oh the number of angry people they created. They have not changed one bit in 2023. They screwed over so many people with that dirty move.
What are you gonna use instead? Subpar garbage and cards? Good luck.
This all reminds me of adapting molex connectors into 6 and 8 pin in older computers.
We need a "what happens if you trick the voltage regulation IC on the card into pushing more voltage to the core" video.
3:30 That statement just made me want to go on a rant about how people think a higher-amp brick will damage a device that came with a lower-amp one. Amps are pulled, not given.
I would be more curious if there ended up being any functional difference in overclocking headroom, between the official power adapter vs 3rd party
Even with a Corsair 900D case, the cables coming from the 4090 Suprim X do touch the glass, so I totally agree with your previous video: the design is pants for those with small PC cases.
Interesting video. Time to see, once prices come back down (scalpers again), whether I can decide if a 4090 is worth it or not. Probably not.
Liked seeing Jay lose his words on the 300w test out of sheer admiration of the controller. We feel you Jay!
From what I've seen from GN, the 4 cables having power essentially just tells the gpu how much it can draw. Each one is 150W so 4 is 600W. 2 would only be 300W.
And the card needs 450 to boot. Remember, however, that it is fine to use the same 8-pin line for two connections (which is why they are pigtailed). So the connection to the power supply when using Corsair's cable and Nvidia's would be the same: two connectors.
That iFixit ad still hasn't gotten old.
Hey Jay, I have a similar issue with the MSI RTX 4090 Suprim X. I don't have any display output on my Gigabyte X570 Aorus Master and the card just won't work. (I have tested two different PSUs, one 1KW and one 1.6KW both with the GPU's dongle and with the cable you are showing in the video.)
I'm wondering, though, if the behaviour is identical. On my card, the fans do not spin up and no RGB lights up... is it the same on the Strix? I think you mentioned the fans do not spin up, but you don't seem to mention RGB.
Did you manage to solve the issue? I'm having a similar problem.
Solid wire wouldn't need cable management. Just work it back and forth a few times and it'll stiffen and stay wherever you put it
With all these multi-ended cables, like tendrils, coming off the GPU and Corsair's drama with Steve/GN, this could literally be a new Squid Game
[EDIT:] Your test @12:39 confirms what Steve said. Corsair told them their testing method was BS lol
FYI, what you mentioned at 06:45 - it's fucking AWESOME we're getting an onboard iGPU for Ryzen now. I couldn't tell you the number of times I couldn't get a screen to appear with my 3080, because my BIOS was in UEFI mode, in which the FE 3080 apparently wouldn't display anything. So I had to get out my 1060, boot into the BIOS and enable legacy mode, then turn the machine off, put the 3080 back in, and everything was fine.
Now with the board I have, I am at least presented with a screen letting me know to go into the BIOS (while using the 3080) and change the mode. Now I've got a 7700X coming in, and just knowing I'll be able to simply boot and mess with the BIOS using the onboard GPU is an amazing feature. Not overlooked in the slightest by this customer - big appreciation to AMD for that.
I'm sticking with the 30 series, it looks better.
I agree with that statement 🥰👍💪. I think the aesthetics overall are nowhere near the level of the 30 series. But still impressive numbers overall, regardless of power draw. CPUs now have to catch up 😇💪
@@michaelthompson9798 very true
I think Jay found his hidden talent in voice acting and commercials.
Damn...eventually we will just need a new PSU with a GPU and they will be bundled together if it uses that much power. That cable grouping is atrocious. I thought the double cable that came with my 1070 was awful, just going to run my 1070 into the ground before bothering with a newer GPU. 🤣
Also I'm still running a 3700x 😅🤣
I wonder if the new AMD cards will have the same kind of cable fun going on? I'd imagine we will, but their stuff tends to also be more efficient than team green so things might not be quite as crazy.
@@TalesOfWar yeah I was wondering the same thing when I was commenting. I'm just going to upgrade to a 2070 or idk I will have to research something better than my 1070 I have now if it's even worth doing a 2070 upgrade. I can imagine AMD having something smarter and more innovative though
Jay, it's been years 😊. iFixit has to collab and do another explosion ad which interrupts the interruption of the GPU becoming bigger than the whole PC.
Will the Corsair adapter allow 600 watts on a 4090? GN has an engineer from Nvidia saying all four are needed for full power. And Steve proved that you would not get 600 watts without all four being connected.
I think that's for the Nvidia adapter; the Corsair one is for their own power supplies. Their power supply should be able to handle the wattage over that cable/connector. Think about PCIe power cables from power supplies: some have more than one 8-pin on a cable. That should be because the power supply can supply that amount of wattage through those connectors.
But if you're using the Nvidia adapter, it only sees power per connector. Coming from your power supply, you're probably still only using 2 separate feeds from the power supply, even if it's four 8-pin PCIe connectors.
Yes. It's because the cable Nvidia provides with their FE cards has an IC in the cable itself, at least according to the GN video. The 16-pin connector is essentially just two 8-pin PCIe cables connected together, at least when used on an ATX 2.0 PSU.
Each PSU line can handle 288W. Nvidia's adapter uses 4 because the old 8-pin GPU CONNECTOR is limited to 150W. Also, you might use a cable with 2 terminations, which would end up on a single line.
Yup, can't believe so many are confused by this; even YouTubers who should know better are spreading misinformation. Has anyone even looked at the back of their PSU? Those 8-pin sockets are clearly marked PCIe 6+2 / CPU 4+4, meaning each connector is also rated for EPS12V 8-pin, capable of about 300W easily. That's why Corsair only needs 2 connectors for this cable (of course you've got to be mindful of your PSU's total wattage, though).
@@Glotttis So can I use 2 cables and just use the splitters on the end to feed the other 2? So I'd only need 2 PSU cables that each split into 2 GPU power connectors? Or does this not work?
Yes, 2 PSU cables are enough. A Y splitter is not OK though, because it would mean 300W going through a single connector, which is out of spec. If by "splitter" you mean a cable with 2 GPU-side connectors, that's OK.
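A minimal sketch of the two wiring options being discussed, assuming a 150W spec limit per GPU-side 8-pin and roughly 300W of capacity per PSU-side socket (my assumptions, not datasheet figures):

```python
# Sketch of the wiring question above. Assumptions (mine, not from a spec
# sheet): each GPU-side 8-pin may draw up to 150 W, and one PSU-side
# socket/cable on this class of PSU is good for roughly 300 W.

GPU_8PIN_SPEC_W = 150       # per GPU-side 8-pin connector
PSU_CABLE_CAPACITY_W = 300  # assumed per PSU-side socket/cable

# Pigtailed cable: one PSU socket feeds two separate GPU-side 8-pins.
pigtail_load = 2 * GPU_8PIN_SPEC_W
print(f"Pigtail: {pigtail_load} W back through one PSU cable "
      f"({'OK' if pigtail_load <= PSU_CABLE_CAPACITY_W else 'over capacity'})")

# Y splitter: two adapter inputs hang off ONE GPU-side 8-pin connector,
# so up to 300 W would pass through a connector specced for 150 W.
y_split_load = 2 * GPU_8PIN_SPEC_W
print(f"Y splitter: {y_split_load} W through one {GPU_8PIN_SPEC_W} W-spec connector (out of spec)")
```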
Jay your biceps are looking stacked 💪
1:53 Solid wire cable will fail more quickly than stranded especially if flexed/moved frequently.
3:40 Exactly, those are not solid-core cables; they don't hold their shape after one bend.
If it were solid core it would crack pretty easily after a few bends; there's no way it's actually solid.
So this launch *is not about super exciting, affordable graphics cards!* It's all about the power supplies, cables and stuff.
So sad!
Does anyone remember RTX 3070 launch?! 2080 Ti performance for 499 bucks? That was cool ... and feels like ages ago.
After Corsair calling out Steve, then Steve proving them wrong, I'll be unlikely to trust any Corsair PSU's or cables.
Lol ok.
The reason the PSU end is only 2x 8-pin connectors is that each of them has 4x 12V lines, and the larger gauge wire helps too. The PCIe power 8-pin only has 3x 12V, which is why they run 3 or 4x 8-pin into the one small connector on the adapter.
I don't know about this Corsair cable, especially after reading what that Corsair rep stated about how the Nvidia sense pins work, and being wrong!
Yep, you need all 4 connections to get to 133% power; that Corsair cable is trash.
Guess this cable explains why Corsair thought they could get shitty with Gamers Nexus and get away with it.
Jesus Christ, why doesn't anyone pay attention? That only applies to Nvidia's cable adapter. That's what Steve from Gamers Nexus was talking about when calling out Corsair, not Corsair's own cable. The way this cable works is completely different and has absolutely nothing to do with what Corsair and Steve were beefing over.
Goddamnit Jay, take your updoot. That iFixit ad was pretty great buddy.
3:30 That's really well put. Didn't know that.