DON'T Do this with your GPU!

  • Published: 21 Nov 2024

Comments • 1.8K

  • @dano1307
    @dano1307 2 years ago +858

    Didn't Corsair totally get it wrong about how that connector works? I just watched the Gamers Nexus video on that. Not sure I trust them now.

    • @skjetnis
      @skjetnis 2 years ago +140

      Corsair just posted a statement.

    • @phillylove7290
      @phillylove7290 2 years ago +194

      Corsair posted an apology for the disrespectful behavior and language, but they did not admit fault or give a technical answer.

    • @gucky4717
      @gucky4717 2 years ago +193

      @@johnhowarth7216 Nope, GN is right, and they have the specs straight from Nvidia. In the 4x 8-pin adapter there is an extra IC that checks EACH 8-pin to see if it is plugged in, and then sends the right signal to the sense pins.

    • @Apex180
      @Apex180 2 years ago +94

      @@johnhowarth7216 Nope. If you had watched, GN provided the source for their claims (Nvidia), so it looks like Corsair doesn't know squat.

    • @generalgrevous1984
      @generalgrevous1984 2 years ago +10

      @@johnhowarth7216 Confused too. But my understanding is that anything less than 4 cables can only draw 600W, whereas all 4 cables allow 630W, which is good for overclocking. Also, going by the transient-spike analysis GN did earlier, 40% transient spikes on a 450W draw will go over 600W.

  • @goldfries
    @goldfries 2 years ago +1595

    Hi Jay, Brian from ASRock (Malaysia) here.
    For your RTX issue, could you try changing the option in Advanced > AMD PBS > AMD Common Platform Module > PCIe x16 Link Speed?
    One of my local distributors had a problem with a Zotac card on our X670E board too; it refused to display, but in the end it worked after some BIOS setting changes.

    • @Frostbite1090
      @Frostbite1090 2 years ago +52

      Bump

    • @goldfries
      @goldfries 2 years ago +75

      @@Frostbite1090 Thanks. Jay didn't provide many details, but I believe it was the same issue. Our case was a ZOTAC GAMING GeForce RTX 4090 AMP Extreme AIRO on an ASRock X670E Steel Legend. For some reason it didn't work at first but later did; we got it settled within 24 hours and it worked perfectly for the RTX 4090 launch event.

    • @PlayMates_
      @PlayMates_ 2 years ago +12

      UP

    • @goldfries
      @goldfries 2 years ago +10

      @@PlayMates_ Thanks.

    • @LarryGanz
      @LarryGanz 2 years ago +8

      We hope he sees this!

  • @bobbybobman3073
    @bobbybobman3073 2 years ago +59

    Also keep in mind that many PSUs offer two 6+2-pin connectors off of one cable back to the PSU, so two 8-pins running back to the PSU on one cable should very reasonably be fine anyway. So basically this is no different from a fairly common way other people might populate the four 8-pin connectors.

    • @MrMirukou
      @MrMirukou 2 years ago +5

      My PSU does have these kinds of cables: a Corsair RM850x. So could I power a 4090 with it? It has 3 double cables... I'm not sure how it really works.

    • @kaldo_kaldo
      @kaldo_kaldo 2 years ago +2

      Imagine: my PSU only has one 6+2-pin and two 6-pins. To use the squid adapter I would need four 8-pin connections. #1 is solved immediately. #2: a two-6-pin to 8-pin adapter. #3: two molex to 6-pin x2, then a two-6-pin to 8-pin. #4: same as #3. That's a total of 8 adapters. 😂

    • @Koshmar-13
      @Koshmar-13 2 years ago +2

      @@MrMirukou You can power it with the RM850x. Tech Yes City did all of his day-one 4090 reviews on that power supply. As for how he had the PSU wired, I can't say.

    • @ShmeeGrim
      @ShmeeGrim 2 years ago +1

      @@MrMirukou Cables with two 8-pin connectors only deliver the same amount of power as a single 8-pin cable, which is 150W, IIRC. I would avoid using too many of those.

    • @matbailie3816
      @matbailie3816 2 years ago +7

      @@ShmeeGrim That is simply not true.
      The PCIe 8-pin standard is essentially implicit communication. It says that if all 8 pins are connected correctly, the device (GPU) can draw 150W. Even if you make a cable and PSU capable of supplying 1.21 jigawatts, the device (GPU) has no way to know that, so it will only ever draw up to 150W sustained per 8-pin connector.
      Corsair's cables can sustain 300W. But that will only be utilised if provided over two 8-pin PCIe connections, so their cables terminate with those two 8-pin PCIe connections.
      At the other end, there's an 8-pin connection to the PSU. That connector IS NOT an 8-pin PCIe connector. It's a connector bespoke to the PSU manufacturer. It can output up to 300W. It only works correctly on the manufacturer's PSU, as the pinout is often not the same. (NEVER use a modular cable from a different manufacturer, unless you fancy risking connecting a ground pin on your GPU to a 12V pin on your PSU.)
      So, each connector on the Corsair PSU absolutely CAN deliver up to 300W, and using the twin 8-pin PCIe connectors at the other end CAN allow 300W to be drawn.
      The nVidia adaptor needs four 8-pin PCIe connections because each such CONNECTOR only guarantees 150W. You could be attaching four separate 150W PSUs. You could be attaching a nuclear power station over solid copper bars. nVidia can't tell, so they will only ever draw 150W per 8-pin PCIe connector.
      Corsair have made a cable for THEIR PSUs. They wire the 12+4-pin connector to tell the GPU that 600W is available. They make the gauge of the cabling capable of delivering that. They make the PSU end of the cable connect to TWO of their 300W PSU outputs. Thus, their single cable CAN sustain 600W from their PSUs.
      Note that if you plugged in only one connection at the PSU end, only one of the two sense wires would be grounded (the other would be left "open"), and the GPU would be able to tell. Thus, the GPU would only be "allowed" to draw 300W from that connector.
      An 850W Corsair PSU absolutely CAN deliver 600W over either two of Corsair's twin-headed PCIe cables or one of their 12+4-pin HPWR cables. As that leaves only 250W for everything else, including transients, I'm not sure I'd want to. I'd rather limit the GPU to 450W to leave enough capacity for the other components.
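
A quick sketch of the arithmetic in the comment above (a minimal illustration of the 150W-per-connector rule it describes; the function name is made up for this example, and the 300W Corsair figure is taken from the thread, not from an official source):

```python
# Minimal sketch of the budget arithmetic described above. The 150W and
# 300W constants are the figures quoted in the thread; the function
# names are illustrative, not from any real tool.

PCIE_8PIN_GUARANTEED_W = 150   # what a GPU may assume per 8-pin PCIe input
CORSAIR_PSU_OUTPUT_W = 300     # per PSU-side modular output, per the thread

def gpu_budget_from_8pin(connectors: int) -> int:
    """Sustained draw a GPU may assume from n 8-pin PCIe inputs."""
    return connectors * PCIE_8PIN_GUARANTEED_W

print(gpu_budget_from_8pin(4))        # 600 -> why the adapter wants 4 inputs
print(gpu_budget_from_8pin(2))        # 300 -> two 8-pins alone promise only 300W
print(2 * CORSAIR_PSU_OUTPUT_W)       # 600 -> two Corsair PSU-side outputs
print(850 - gpu_budget_from_8pin(4))  # 250 -> headroom left on an 850W unit
```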

  • @seanprzybyla2157
    @seanprzybyla2157 2 years ago +496

    Just as a note, Jay: solid-core cables can break fairly easily if you bend them a lot. Might be something to be cautious of with your testing rigs, in case they start to break and short/overheat.

    • @warpedphreak
      @warpedphreak 2 years ago +15

      If you're in there bending your GPU PSU cable a LOT... you've got bigger problems you caused, lol.

    • @stevenhenson4956
      @stevenhenson4956 2 years ago +9

      @@wojtek-33 He's talking about over time; that's why most cables have a bend rating.

    • @N0N0111
      @N0N0111 2 years ago +55

      As history has proven yet again, Jay's knowledge of electronics is limited.
      That is 100% not a solid-core cable; yes, he misspoke again...

    • @beaniiman
      @beaniiman 2 years ago +1

      We are going to start needing new cable tech soon, which is kinda gross. 60 FPS at 4K is fine, stop it!!!
      but...

    • @whatshouldido22
      @whatshouldido22 2 years ago +16

      @@warpedphreak Don't forget: as a reviewer, Jay is regularly swapping components and regularly unplugging power cables. Yes, in a personal rig you shouldn't have that problem.

  • @edwardallenthree
    @edwardallenthree 2 years ago +83

    The standard has 4 sense lines. The first two produce a binary number (with 3 meaning 600 watts). No IC needed; just connect both to ground to produce 11 (i.e., 3). The other two are "power OK" and "GPU connected", and both are optional. (See the sketch after this thread.)
    edit: just got to the part of the video where you say this and reference the same page of the spec I did. Should never have doubted your journalistic chops, Jay!
    edit 2: awesome video; it explained the sense pins very well and actually made me more excited for the future.

    • @IcecalGamer
      @IcecalGamer 2 years ago +2

      Can't wait to see pictures of people going molex > 6-pin > 8-pin > 12-pin, with two speaker wires coming out of the sideband, grounded to the case with tek screws (ground IS ground ¯\_(ツ)_/¯ )
      ^not unlike shi**y audio setups in cars :))

    • @edwardallenthree
      @edwardallenthree 2 years ago

      @@IcecalGamer As long as they include adapter cables with the cards, I don't think we're going to see much of that. I hope I'm right, for the sake of people's homes and their neighbors.

    • @blarghmcblarghson1903
      @blarghmcblarghson1903 2 years ago +1

      To be fair, Jay isn't always right; no harm in double-checking what he says. Literally last video he said these are PCIe 5.0 cards when they're 4.0. At best he meant it figuratively, but he didn't present it as such.

    • @edwardallenthree
      @edwardallenthree 2 years ago

      @@blarghmcblarghson1903 He did? Weird. It's an issue for me: I use the additional slot for a Mellanox (I won't call them Nvidia) InfiniBand card for my storage network, so being able to run the GPU at x8 without a performance hit is a feature I want.

    • @emilybjoerk
      @emilybjoerk 2 years ago +1

      A 4-to-1 adapter cable (and a 2-to-1 too) will still need an IC to drive the two sense pins, as it needs to count how many cables are connected on the ATX 2.0 end: any three out of four should produce the same sense signal, because you cannot rely on users populating them in the order the engineer prescribed. I didn't see an IC on the Corsair cable, and because it has two inputs I half expect it to incorrectly report the higher mode if only one cable is connected, to the wrong input.
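
The two comments above describe the same mechanism from both ends. A minimal sketch of it follows (the SENSE0/SENSE1 encoding matches the sideband table quoted later on this page by @10100rsn; the adapter-side counting logic is an assumption about what the adapter IC does, not vendor firmware):

```python
# Sketch: a 4x 8-pin to 12VHPWR adapter drives SENSE0/SENSE1 from how
# many of its 8-pin inputs are populated; the GPU then decodes the pins.
# "gnd" = pulled to ground, "open" = left floating. The encoding follows
# the sideband table discussed in this comment section.

GND, OPEN = "gnd", "open"

def adapter_sense_pins(populated_8pins: int) -> tuple:
    """Return (sense0, sense1) advertised by the adapter."""
    if populated_8pins >= 4:
        return (GND, GND)     # advertise 600W
    if populated_8pins == 3:
        return (OPEN, GND)    # advertise 450W
    if populated_8pins == 2:
        return (GND, OPEN)    # advertise 300W
    return (OPEN, OPEN)       # advertise 150W

SUSTAINED_W = {(GND, GND): 600, (OPEN, GND): 450,
               (GND, OPEN): 300, (OPEN, OPEN): 150}

for n in (4, 3, 2, 1):
    print(n, "cables ->", SUSTAINED_W[adapter_sense_pins(n)], "W")
```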

  • @Eremon1
    @Eremon1 2 years ago +171

    Jay's impression of the Side-band sensor talking was hilarious.
    Side-band: "Uhm we're only getting 300 watts."
    Controller: "Well then you don't get video." 🤣

  • @jackoneill1070
    @jackoneill1070 2 years ago +12

    Power is defined by what consumes it (the GPU).
    What produces the energy (the PSU) only defines the power limit it can handle.
    Since the sense pins are telling the GPU that the available power limit is below what it needs, it doesn't even try to start, because it "knows" it would go over the PSU's limit and probably damage it.
    Jay, you cannot say the power is "there": what is there is only the voltage in volts, which, multiplied by the amperes the GPU's load draws, gives you power in watts.
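
A quick worked example of the P = V * I relationship this comment describes (numbers are illustrative):

```python
# P = V * I: the PSU fixes the rail voltage; the GPU's load determines
# the current drawn, and their product is the power. Illustrative numbers.

V = 12.0      # volts on the 12V rail
I = 50.0      # amps the GPU load happens to pull
P = V * I     # watts actually consumed
print(P)      # 600.0 -> a 600W draw is 50A at 12V
```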

  • @chadmckean9026
    @chadmckean9026 2 years ago +5

    Fun fact: if the wire gauge is the same, stranded can handle more amps. Solid core is cheaper to make and is used in buildings because it does not get moved around; it is set and forget, so as long as it is cheaper than stranded it makes sense to use. It's also easy to screw solid-core wires to the back of outlets, but then outlets were made for solid core, as it is the standard product there.

  • @HirXeBomb
      @HirXeBomb 2 years ago +4

    Jayz got 12th place eventually.
    12. JayzTwoCents 28356 NVIDIA GeForce RTX 4090 AMD Ryzen 9 7950X

  • @natron8102
    @natron8102 2 years ago +59

    “Well then you don’t get video!” Made me laugh for real. Jay can always make a boring subject entertaining. That’s why he’s the man.

  • @jakejones9502
    @jakejones9502 2 years ago +60

    But Jay, Steve just did this test and found you need all four, or you will only get 400 watts.

    • @generalgrevous1984
      @generalgrevous1984 2 years ago +10

      450

    • @lyianx
      @lyianx 2 years ago +38

      And Corsair said Steve was full of shit, lol, after which he promptly proved them wrong.

    • @sauyon
      @sauyon 2 years ago +1

      Can you link that video?

    • @marceloconceicao2587
      @marceloconceicao2587 2 years ago +1

      Yeah, I wanted to see Jay try to pull 600 with that setup.

    • @FDFANATIC541
      @FDFANATIC541 2 years ago +4

      Yeah, but 300 watts will give you 90-95 percent of your performance. Using the full power on a 4090 isn't worth the marginal performance gain.

  • @thracian
    @thracian 2 years ago +9

    Me watching this on my 1060: mmm, yes, yes

  • @hme850
    @hme850 2 years ago +16

    Seasonic making a native dual 8-pin to 12-pin cable was mint. So satisfying swapping that in for my 3070 Ti FE.

    • @hme850
      @hme850 2 years ago

      @@morganlimes cool

  • @adamroberts8415
    @adamroberts8415 2 years ago +2

    The correct term for the "kickstart" current is "inrush" current. It is common in low-voltage DC electronics as well as in large circuit breakers for industrial motors. Inrush is akin to opening the floodgates of a dam: there is a large starting deluge of water, or power in this instance, which then tapers off to the steady-state level.
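
For intuition, a minimal sketch of inrush into a discharged bulk capacitor, modeled as a first-order RC circuit (all component values here are illustrative, not measurements of any particular PSU or GPU):

```python
import math

# First-order RC model of inrush: a discharged bulk capacitor looks like
# a near-short at t=0, so current starts at V/R and decays as e^(-t/RC).

V = 12.0    # rail voltage (volts)
R = 0.05    # total series resistance (ohms): cable + connector + ESR
C = 2e-3    # bulk capacitance on the card (farads)

for t in (0.0, R * C, 3 * R * C, 5 * R * C):
    i = (V / R) * math.exp(-t / (R * C))
    print(f"t = {t * 1e6:5.0f} us -> {i:6.1f} A")
# The momentary peak (240A here) dies out within ~5 time constants,
# about half a millisecond with these values.
```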

  • @Kremon86
    @Kremon86 2 years ago +178

    Could anybody please try to run an overclocked 4090 on the NVIDIA adapter fed through a couple of Y-splitters from a single 8-pin PCIe cable? I want to see the cables melt :-D

    • @Eric_Fate
      @Eric_Fate 2 years ago +53

      I'm sure Dawid will have video of himself doing this unintentionally before the month is out.

    • @ArcturusCOG
      @ArcturusCOG 2 years ago

      @@Eric_Fate Oh god, here we go!

    • @ChrispyNut
      @ChrispyNut 2 years ago +17

      I'm looking forward to someone using molex to 8-pin to 12-pin.
      More adapters is more better, right? Right!!

    • @edwardallenthree
      @edwardallenthree 2 years ago +1

      I'm not sure you would. The issue is the connectors, not the wires. Three pairs of 18 AWG wire won't melt at 450 watts, but the connectors might.

    • @edwardallenthree
      @edwardallenthree 2 years ago +3

      @@PREDATEURLT Perfectly fine, if it uses 12 molex connectors.

  • @DGPG35
    @DGPG35 2 years ago +17

    NOTE: the power you get from this Corsair cable varies with the power supply used; it calls for a 1200W Corsair PSU to deliver the full 600W through the cable (1000W PSU = 450W, 750W PSU = 300W). I did not see this mentioned in the video.

    • @xGR1MxREAPERx
      @xGR1MxREAPERx 1 year ago +4

      This is false, lol. Do the same test he is doing and you can see the wattage being supplied. I have a 1000W Corsair and this cable, and I've had my Strix card up to 527 watts, so it definitely goes over 450. Maybe that's changed, I guess, but as of now it isn't true.

    • @crimson7151
      @crimson7151 1 year ago +1

      I have an RM850x and it goes up to 587 watts in FurMark with the cable.

    • @TheCaciKO
      @TheCaciKO 1 year ago

      @@crimson7151 I also have an RM850x. This might be the reason why I'm getting a black screen with this cable.

    • @crimson7151
      @crimson7151 1 year ago

      @@TheCaciKO Yes, that's possible if you have a CPU like the 13900K.

    • @devel-rh6hy
      @devel-rh6hy 11 months ago

      @@crimson7151 Are you using the Type 4 or Type 5 PCIe cables? And do you connect 3 or 4 PCIe plugs? I have the same power supply but I haven't connected it yet.

  • @thefactory7221
    @thefactory7221 2 years ago +209

    I'm curious to see how the super-compact power supply cables behave thermally. A thermal-camera video or a section in reviews would be really helpful, especially since these cards risk igniting their own cords at max load.

    • @MrHaggyy
      @MrHaggyy 2 years ago +13

      Long cables with a lot of connectors are usually way worse than shorter cables without them.

    • @edwardallenthree
      @edwardallenthree 2 years ago +5

      Fortunately these products have specs that are available online and have actually been tested for their power rating. Micro-Fit connections from Molex are almost as good in terms of power delivery as Mini-Fit Jr. (with a rating of 8.6A versus 9A on the larger connector). When you consider that the standard would normally use pigtails on 8-pin connectors, four 8-pin connectors would use 6 positive wires and 6 grounds for 600 watts, and a single 16-pin connector uses 6 and 6 as well.

    • @hexdator2934
      @hexdator2934 2 years ago +17

      AWG 18 (0.75mm²) is the most common wire across power supplies and can easily carry 12 amps (1440 watts @ 120V). In Jay's video it looks like AWG 16 (1mm²) may now be the standard being moved to, which easily carries 15 amps (1800 watts @ 120V). So these cards risking igniting their "own" cords is not a problem. (See the sketch after this thread.)

    • @edwardallenthree
      @edwardallenthree 2 years ago +8

      @@hexdator2934 Good point. The reason I didn't bother to talk about wire gauge is that, at the lengths used in a PC, the limiting factor is the pins in the connector.

    • @hexdator2934
      @hexdator2934 2 years ago +3

      @@edwardallenthree You went to an even greater level of detail... 👍 I was replying to OP.
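
To put this thread's figures side by side at the 12V that actually matters inside a PC, a minimal sketch (the ampacity numbers are the ones quoted by the commenters, not verified datasheet values):

```python
# Per-line power at 12V for the ampacities quoted in this thread, plus
# the total for the six 12V conductors of a 16-pin cable.

RAIL_V = 12.0
AMPACITY = {"18 AWG wire": 12.0, "16 AWG wire": 15.0,
            "Mini-Fit Jr. pin": 9.0, "Micro-Fit pin": 8.5}

for name, amps in AMPACITY.items():
    print(f"{name:16s}: {amps * RAIL_V:5.0f} W per 12V line")

# Six 12V lines through Micro-Fit pins: 6 * 8.5A * 12V = 612W, which is
# why the connector can be specified at 600W sustained.
print("6-line 16-pin   :", 6 * 8.5 * RAIL_V, "W")
```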

  • @10100rsn
    @10100rsn 2 years ago +2

    Sense signals S3 and S4 are not for remote voltage sensing, as people might think. They let the card know how much power the PSU can provide. The logical combination of SENSE0 and SENSE1 tells the card how much power it can use, per the table below (a short decode sketch follows this thread). In the PSU, the SENSE0 and SENSE1 signals must be pulled to ground or left open to tell the card what the power limits are. These signals may change dynamically, but only while the power supply is in standby; if they change while the 12V to the card is on, the card might not recognize the change and act on the new setting.
    The SENSE1 line is pin 4 (the rightmost pin when reading the 600W label) and SENSE0 is pin 3.

    Sense0 | Sense1 | Boot power | Maximum sustained power
    Gnd    | Gnd    | 375W       | 600W
    Open   | Gnd    | 225W       | 450W
    Gnd    | Open   | 150W       | 300W
    Open   | Open   | 100W       | 150W

    • @10100rsn
      @10100rsn 2 years ago

      If the card supports this feature properly and you remove the sense lines from the cable, the power limit should drop, because then both SENSE0 and SENSE1 are open.
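
A minimal decode of the table above (illustrative Python mirroring the listed values):

```python
# Decode SENSE0/SENSE1 into (boot power, max sustained power), mirroring
# the table above. An unplugged sideband reads as open/open.

POWER_TABLE = {
    ("gnd",  "gnd"):  (375, 600),
    ("open", "gnd"):  (225, 450),
    ("gnd",  "open"): (150, 300),
    ("open", "open"): (100, 150),
}

def card_limits(sense0, sense1):
    """Return (boot_W, sustained_W) for the given pin states."""
    return POWER_TABLE[(sense0, sense1)]

print(card_limits("gnd", "gnd"))    # (375, 600): full-power cable
print(card_limits("open", "open"))  # (100, 150): sense wires cut or missing
```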

  • @chrisspellman5952
    @chrisspellman5952 2 years ago +19

    Given the comment they made to GN on the power requirements, and how wrong the rep was, I 100% would not use any cable from Corsair. It's gonna burn your house down.

    • @metaleggman18
      @metaleggman18 2 years ago +3

      No, it'd be fine. The 16-pin cable is essentially just two 8-pin cables combined.

    • @edwardallenthree
      @edwardallenthree 2 years ago +2

      The original Corsair post was incorrect, but it wasn't incorrect about their own product; it was incorrect about Nvidia's product. It was simply a misunderstanding, and they have apologized. It had nothing to do with whether or not the Corsair product is technically good. There is no difference in power between using the Nvidia adapter and Corsair's, other than that Nvidia's adapter will produce a messy bunch of cables, which might cause other issues. The IC on Nvidia's adapter is designed to count how many cables are connected to it. Corsair's cable does not need to count how many cables are connected, because there is only one configuration option: plug both cables into your power supply. No need for an IC when the pins will always both be connected to ground.

    • @edwardallenthree
      @edwardallenthree 2 years ago +1

      @@metaleggman18 An 8-pin cable has three 12V lines of 9 amps each, for a max power of around 324 watts, but the spec limits it to 150 watts (this difference is why you see pigtails). The 16-pin cable has 6 lines of 8.5 amps each, for 612 watts of rated power, limited to 600W by the spec. So yes, it is exactly like two 8-pin connectors, although it has more 12V wires than two 8-pins and, in theory, runs much closer to the spec of the underlying components.
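
The headroom this comment describes, as a quick sketch (the amp and watt figures are taken from the comment itself):

```python
# Physical capability vs spec allowance for the connectors described above.
RAIL_V = 12.0
physical_8pin  = 3 * 9.0 * RAIL_V   # 324W the three 12V pins can carry
spec_8pin      = 150                # what the PCIe spec allots per 8-pin
physical_16pin = 6 * 8.5 * RAIL_V   # 612W across six 12V pins
spec_16pin     = 600                # 12VHPWR spec limit

print(physical_8pin / spec_8pin)    # ~2.16x headroom -> why pigtails exist
print(physical_16pin / spec_16pin)  # ~1.02x -> almost no headroom left
```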

  • @vedomedo
    @vedomedo 1 year ago +22

    Got my 4090 a day or two ago, and you are sooo right. The goddamn cable mess is terrible. Ordered a Corsair cable for my Corsair PSU to neaten things up.

    • @klipschorny
      @klipschorny 1 year ago +1

      Did it work? Which Corsair psu do you have?

    • @Lagann0
      @Lagann0 1 year ago +1

      @@klipschorny I have the Corsair 600W 12VHPWR cable as well. I have a Corsair RM1000x and it works perfectly.

    • @vedomedo
      @vedomedo 1 year ago

      @@klipschorny Yes, it worked fine. I have an RM850x.

    • @csguak
      @csguak 1 year ago

      Wait, what? This cable will only draw 600W if you have an RMx 1200W supply or above. Anything less and it defaults to 450W. But people are ignorant, so they deserve to get scammed. 😂🤣😂🤣

    • @StickyKeys187
      @StickyKeys187 1 year ago

      @@csguak Why the fuck do you need 600 watts of juice to power a graphics card? Are you building a DeLorean or sum? 😂

  • @misterio10gre
    @misterio10gre 2 years ago +14

    Oh look! It's the same Corsair that apparently doesn't know how the 4090's power delivery works, lol.

    • @Maradiaga23
      @Maradiaga23 2 years ago +2

      They issued an apology already. Apparently the comment was made independently by one of their employees.

    • @Apex180
      @Apex180 2 years ago

      Typical backpedaling when found out.

    • @KarolusTemplareV
      @KarolusTemplareV 2 years ago +3

      Isn't it always just one employee when an employee fucks up, but the whole company when an employee does something extraordinarily good? 😏

  • @sergionunez8878
    @sergionunez8878 2 years ago +16

    I just have to say this: I've seen your iFixit sponsor short like a hundred times, and to this day it still brings a smile to my face xD
    It's just so funny and it never gets old xD

  • @tashmikagimanthamusic
    @tashmikagimanthamusic 2 years ago +4

    Have to admit, man, this is the best iFixit ad ever. You are the best, Jay. I'm a daily visitor now. Your videos are really informative. Massive thanks, man.

  • @MistyKathrine
      @MistyKathrine 2 years ago +1

    Bleeding edge tech!

  • @YAAMW
    @YAAMW 2 years ago +3

    There are no tricks involved in the Corsair adapter. Some of the cables you get with these PSUs give you two 8-pin connectors on each cable coming from the PSU. That means a single PCIe connector ON THE PSU can supply 300W, so two will give you 600W.

  • @lcmattern
    @lcmattern 2 years ago +7

    Something der8auer tested is lowering the power limit: he got it down to 50% and only saw a 5-15% dip in performance for a 200W drop in power usage.
    I think that should be tested by others as well; they would not have needed massive coolers and special connectors. Maybe they're scared of AMD. Interested to see what will come from them.

    • @ledoynier3694
      @ledoynier3694 2 years ago

      Well, no surprise there. 30-series cards behave the same way :p
      And AMD too, actually. They don't show the total board power, so they appear to be more efficient, but they also benefit from lower power limits in the same way, with a really minimal hit in performance.
      Those last FPS are a b**** to get for the benchmarks.

  • @hamstersmash
    @hamstersmash 2 years ago +4

    That intro is very reminiscent of early-era morning TV ads. Very nice.

  • @ericdevarney4089
    @ericdevarney4089 2 years ago +1

    Electrical engineering 101:
    Electrons flow on the outside of the wire, so stranded will carry more current than solid core, making the point you were trying to make around the 2-minute mark invalid.

  • @joshc7200
    @joshc7200 2 years ago +9

    Jay, try to either 1) disable deep sleep for HDMI/DP on your monitor, or 2) use a different connector. This is a common issue.

  • @JayHijazi
    @JayHijazi 1 year ago +2

    For the adapter that comes with the 4090, you do not need to connect four 8-pins, just three, as per the instructions. The 4th plug is optional and is for overclocking.

    • @sabola1274
      @sabola1274 1 year ago +1

      Can I get this cable with the two 8-pins for the 4090? Or do I need a custom cable that has 4?

  • @marcelkanter
    @marcelkanter 2 years ago +7

    Hi Jay,
    Just because you can't bend a cable doesn't mean it's solid core. The insulation especially makes a big difference in the stiffness of a cable. If you don't want to cut the cable, you can x-ray it to see if it's stranded.
    BTW, the "smart" communication is just passive pins. Maybe later they'll implement something I2C-based between the PSU and GPU, like the old-fashioned PMBus on PSUs. They should have done this already. An EEPROM in the PSU costs cents.

  • @dwurkz
    @dwurkz 2 years ago +8

    I think the reason there is a 4x 8-pin adapter is for PSUs that had multiple rails on the 12V. It is mostly a CYA adapter; most Corsair PSUs are single-12V-rail by design, so they can basically just use two 8-pins to the 12-pin high-power connector.

    • @Andychiu845
      @Andychiu845 2 years ago

      Yeah, most PSUs can do one PSU-side 8-pin to a 2x 8-pin PCIe pigtail.

    • @RurouTube
      @RurouTube 2 years ago +1

      The reason for the 4x 8-pin adapter is simply that each PCIe 8-pin is rated at 150W (that is the standard), so you need 4 of them to get up to 600W. You don't design a 2x 8-pin adapter and rate it for 600W, because it would be a fire hazard.
      The adapter itself can sense how many cables are attached to it, so if you only attach 3, it will tell the GPU that the max power is 450W. Even if you use a Corsair PSU with the adapter, you still need to connect all 4 to get 600W max.
      One potential problem in the future: if people end up breaking or losing their adapter (say they buy a 2nd-hand 4090 that doesn't come with one), they might simply buy the cheapest adapter, and those adapters are not smart ones; it could end up with the GPU trying to pull 600W through one 8-pin.

    • @bradley3549
      @bradley3549 2 years ago +2

      @@RurouTube While the PCIe spec may say 150W per 8-pin ATX connector, in reality the three positive wires and Molex Mini-Fit Jr. pins are capable of much more than that if properly spec'd: roughly 108W per pin, or 300W+ per 8-pin connector. Since Corsair specs both the cable and the power supply, one must assume they know what their pins and wires are rated for, and thus this is a 100% safe and compliant use case at 600W.

    • @RurouTube
      @RurouTube 2 years ago +4

      @@bradley3549 What I wrote is not about whether Corsair can do 600W safely from a single 8-pin, but about the reasoning behind Nvidia including the 4x 8-pin adapter. It is not about single rail or multiple rails; it is simply the spec. The spec for an 8-pin is 150W. You can't expect every single PSU on earth to safely do what a Corsair PSU or its cable can. Edit: and that's just the safety side. There is also the functional standpoint: while a PSU in this category should be able to supply more than 150W from a single 8-pin, there might be edge cases where it can't. Also, a plain 2x 8-pin adapter might give people a false sense of security, thinking their PSU with 2x 8-pin can handle a 4090 when in reality it can't.

    • @ledoynier3694
      @ledoynier3694 2 years ago +1

      @@RurouTube Spot on... and some PSUs really do use the bare-minimum gauge needed to get 150W through a PCIe cable.
      The cable needs to follow the specification anyway. You don't deviate from the rules just because "it should be alriiiight" ^^

  • @keithkamps77
    @keithkamps77 2 years ago +8

    Use the supplied Nvidia cable with all 4 ports connected and your power slider will allow up to 133% power. Steve from GN demonstrated how this works; you need all 4 sideband pins to be active.

  • @AaronShenghao
    @AaronShenghao 2 years ago +1

    It appears there WAS some fire between Corsair and Gamers Nexus (Corsair apologized). Corsair called BS on GN's statement that all four PCIe power plugs on the Nvidia FE adapter are needed to enable the 133% power-budget overclock; GN's follow-up video showed that with only three you can't go over 100% power budget.
    This video just proves GN wasn't talking BS: you do need the 600W spec to raise the power budget to 133%. Nvidia's adapter can sense how many PCIe power plugs are connected, potentially creating a problem even when the power supply can do 600W of PCIe power.

  • @oOWaschBaerOo
    @oOWaschBaerOo 2 years ago +10

    Didn't Corsair spread misinformation about the 4th/3rd plug on the adapters? Gamers Nexus made a recent video about it, showing that Corsair has no idea what the fuck they are talking about when they address the new plug and standard.

    • @thestitchedcow5726
      @thestitchedcow5726 2 years ago +1

      This ^

    • @edwardallenthree
      @edwardallenthree 2 years ago

      Yes, they did. But when you see how they built their own extension, you can see why they thought what they thought, why they were wrong, and why their product is still perfectly okay. These cables are fine. I just read the spec, and essentially I think the Corsair person was just confused and didn't realize that Nvidia built their adapter differently than Corsair built theirs, but both will work fine for a 4090 or any other ATX 3.0 GPU that has ever been produced.

    • @gucky4717
      @gucky4717 2 years ago +2

      The Corsair one just uses 2 ground wires; the Nvidia adapter uses 4 ground wires and an IC to check them.

    • @edwardallenthree
      @edwardallenthree 2 years ago

      @@gucky4717 Exactly. And let's be clear: when we say "IC" we imagine some complicated logic, but it's really not doing anything complicated. It's just counting the number of (old-style) sense pins on the connectors and converting that to a two-digit binary number.

    • @PabzRoz
      @PabzRoz 2 years ago

      That only applies to Nvidia's cable adapter. That's what Steve from Gamers Nexus was talking about when calling out Corsair, not Corsair's own cable. The way this cable works is completely different.

  • @theshootingbench9116
      @theshootingbench9116 18 days ago

      I'm fairly new to PCs. I upgraded my PSU because I upgraded my GPU. I got a be quiet! 12 M 850W, and all of the cables that came with it were solid-core wires as well. Now, I did grow up building houses, and for those who are worried about the wire stiffness: it's a really good thing for power draw and delivery. The trade-off is that they are harder to work with for cable management. Pros and cons and such.
      Edit: I did use the 12-pin PCIe 5.1 cable that was provided to delete the dual-to-single pigtail on the 4070 Ti Super OC I upgraded to. Looks a lot better for those chasing visual cleanliness.

  • @salahmed2756
    @salahmed2756 2 years ago +5

    Going back to the IDE cabling days, lol.

  • @Ray_LF
    @Ray_LF 2 years ago +1

    This is why I'm waiting for the ATX 3.0 power supplies to launch at the end of the year.

  • @thetinker167
    @thetinker167 2 years ago +7

    I never get tired of seeing the iFixit ad on this channel. It makes me laugh every time.

  • @cyberwomble7524
    @cyberwomble7524 2 years ago

    Seeing Jay hold that "bow" at 5:18 made me realize how perfect he'd be to play Little John in Robin Hood. Hear that, Hollywood?

  • @Nitrodozer
    @Nitrodozer 2 years ago +4

    Even though I'm gonna get under four hours of sleep before my hectic day, I'm very relaxed because I'm watching JayzTwoCents.

    • @godtiermedic
      @godtiermedic 2 years ago +1

      Good old Lew unboxing stuff for us on 4 hours of sleep 😴

  • @mgkpraesi
    @mgkpraesi 1 year ago

    18:30 for a topic that has absolutely zero impact on anything but looks. Well done.

  • @argonzeit
    @argonzeit 2 years ago +38

    I would retest this with the original cables and see if your FE can overclock. Watching GN's video and hearing the Nvidia guy talk about it, you may need their adapter to get the full power.
    While Corsair's adapter may be built differently, the card itself may not want to go past its 450 watts if it cannot detect whatever signals the original adapter gives.
    Not to mention, whether it's even safe for the card long-term.

    • @misterio10gre
      @misterio10gre 2 years ago +1

      Power doesn't affect the longevity of cards; voltage does, which, may I add, Nvidia doesn't even let you increase this time around.

    • @earthtaurus5515
      @earthtaurus5515 2 years ago

      Good point, and I hope Nvidia hasn't done anything to effectively lock performance and force people into using their squid cables...

    • @edwardallenthree
      @edwardallenthree 2 years ago +2

      From the graphics card's perspective, there is literally no difference in the power provided by Corsair's cable and Nvidia's cable when Nvidia's cable has four PCIe 8-pin cables connected.
      As Jay and Gamers Nexus have shown, the card determines its power limit based on what it senses from the cables. Jay and Gamers Nexus performed the same experiment, a little differently, and got the same result.
      The new standard, in my humble opinion, is simple, easy to implement, and far superior to the hodgepodge of old power standards and pigtailed six-pin and eight-pin connections.

    • @enntense
      @enntense 2 years ago +3

      @@misterio10gre Unsure about this reasoning. Power is just V*I. If current is constant and voltage goes up, power increases. Heat in devices is mostly from resistance to current.

    • @misterio10gre
      @misterio10gre 2 years ago +1

      @@enntense The card runs at around 1.045 stock voltage; my 3090 does about 1.2 at max voltage headroom.
      Also, you are outright incorrect: heat is directly proportional to voltage but inversely proportional to current.

  • @BeachClub_1
    @BeachClub_1 2 years ago +2

    If your PSU can deliver 300W per socket, this cable should work just fine. It is AWG 16, so there's no more current per cable core than with the ATX 3.0 12-pin cable.

  • @wile-e-coyote7257
    @wile-e-coyote7257 2 years ago +17

    Excellent demonstration video, JayZ. QUESTION: can you please elaborate on how "monitors play a part in this, too"? Thanks!

    • @abnormallynormal8823
      @abnormallynormal8823 2 years ago +1

      You can push all the frames at 4K you want on a 1080p 60Hz monitor and you probably won't notice a difference between that and 1080p 60fps. So if you have a lower-end monitor, your GPU won't need to work as hard to run the resolution and frames that are optimal for you. That's why you should always match your GPU to the monitor you have or plan to have in the near future. It doesn't make sense to run a 40-series card, or a 70-series card or a 6800 XT/6900 XT, for 1080p gaming, because you aren't utilizing the performance you paid for. As soon as you step up to 1440p or 4K and a high refresh rate (120+ Hz), it starts to make sense to go with a higher-end GPU. But honestly, no one needs a 4090, even for 4K gaming. It sits in a weird space where it's too expensive and overkill for general consumers, and not powerful enough for professional use, where Quadros dominate the space.

    • @xGOKOPx
      @xGOKOPx 1 year ago

      @@abnormallynormal8823 "It doesn't make sense to run a 40 series card, or a 70 series card or a 6800xt/6900xt for 1080p gaming because you aren't utilizing the performance you paid for."
      What? You can't just assume by default that you aren't utilizing the performance. It depends on the game and the settings. A lower refresh rate and resolution are always an opportunity to increase graphics settings. Now, if you've already maxed them out and still aren't anywhere near 100% GPU usage (assuming an appropriately powerful CPU), then yes, your card's potential is kind of wasted. But unless that's the case in every game you play or might want to play, you can't really say you've overpaid for the GPU.
      That's obviously not touching on the possibility that you might also need/want the card's performance for professional use (like rendering, AI, and other compute stuff), where your monitor doesn't matter. But that's another story.

  • @HolyLandSite
    @HolyLandSite 1 year ago

    You should have posted a link to these cables. Would have been nice.

  • @blacksheep7576
    @blacksheep7576 2 years ago +7

    Onboard graphics is such a blessing to have. When I finished my first build, I noticed the graphics card wasn't being utilized at all. It turned out that all it needed was a driver update. Had it not been for onboard graphics, I'd probably have panicked a little when the display didn't turn on.

    • @DGneoseeker1
      @DGneoseeker1 7 months ago

      How in the hell does a graphics card not work at all without a driver update? In my past experience they happily function with no driver at all, other than whatever default driver Windows uses. This sounds like the manufacturer screwed up badly?

  • @jamesm568
    @jamesm568 2 years ago +1

    If you've got the money to afford a 4080 or a 4090, the power supply isn't your concern. It amazes me that people still cut wires when all you have to do is pull the pin out of the connector.

  • @erichagen3617
    @erichagen3617 2 years ago +6

    I love your vids as an older dude whose hobbies are PCs and working on cars!

  • @AliYassinToma
    @AliYassinToma 2 years ago +1

    About the start of the video... no, it doesn't take only 2; it still takes 4. If you notice, the regular cables have 1 connector on the PSU side and two 8-pins on the other side, so this adapter just skips one step. It's still the same number of cables, or lanes, from the PSU.

  • @nordenmike6531
    @nordenmike6531 2 years ago +3

    Super excited about the new GPUs, but mostly waiting for another round of the Jay vs. Steve overclocking competition and them trash-talking each other. 😂

  • @einarabelc5
    @einarabelc5 2 years ago

    The iFixit ad is literally my favorite YouTube commercial. It's so dorky it never gets old.

  • @KobraTrading
    @KobraTrading 2 years ago +7

    I still don't understand why they didn't just make it a molded 90-degree bend from the factory.
    All these problems (or at least most of them) would immediately disappear.

    • @DJ3thenew23
      @DJ3thenew23 2 years ago

      There are no problems; he literally bent the cable back and forth and nothing happened.

  • @tabarnacus5629
    @tabarnacus5629 2 years ago +1

    Getting my card on Tuesday, and I couldn't care less about what it's gonna look like in my case.

  • @Hampton_twitch
    @Hampton_twitch 2 years ago +9

    Corsair lost all my respect with the message they sent to Gamers Nexus regarding the power supply. They were very unprofessional... disappointing to hear.

  • @cac2244
    @cac2244 2 years ago +1

    5:55 Jay, don't cut the cable; just slide the pins out of the connector and put electrical tape on the pins...

    • @ShashankKR
      @ShashankKR 2 years ago

      Looks like that's what he did: 12:30 16:35

  • @anthonyc417
    @anthonyc417 2 years ago +6

    Also try the cable out on an SF750; let's see if ITX is even viable.

  • @orensGT
    @orensGT 2 years ago +1

    Hey, I'm sorry, my comment is going to be completely unrelated to the video. I absolutely love the commercials that Jay and his team come up with for their sponsors. This ad makes you want to just go online and shop for an iFixit kit.

  • @alexdefelice2448
    @alexdefelice2448 2 years ago +4

    Your iFixit ad will never get old. I watch it every time.

  • @skybuck2000
    @skybuck2000 2 years ago +1

    This adapter is a huge marketing mistake.

  • @mikeoleksa
    @mikeoleksa 2 years ago +3

    I love it when Jay does stuff "for science".

  • @thumbwarriordx
    @thumbwarriordx 1 year ago +1

    The PCIe power connectors are rated at more than double the 150W (at 12V) that GPUs use them for.
    The cables, on the other hand, typically don't quite live up to that.
    But the math all checks out; it's no more insane to do this than to use the 12-pin connector in the first place, lol.

  • @Rayu25Demon
    @Rayu25Demon 2 years ago +5

    I think soon we will need a cooler for PSU wires.

    • @nostrum6410
      @nostrum6410 2 years ago

      No idea why they still use such small wires.

  • @rashkavar
    @rashkavar 2 years ago

    Gods, that buzzsaw of a fan in the background. You guys are doing your best to noise-gate it, so it's only audible when Jay's talking, but holy crap that's loud!

  • @VadikRamm
    @VadikRamm 2 years ago +3

    Correct me if I'm wrong, but didn't Nvidia explicitly say that the four 8-pin connectors MUST be connected to individual rails, and to make sure not to use splitters?

    • @Marin3r101
      @Marin3r101 2 years ago +1

      Yeah... I'm pretty sure I remember this as well. Jay's 4-to-1 was also connected using only 3, plus the breakout on one... not a good idea for OC.

    • @VadikRamm
      @VadikRamm 2 years ago

      @@Marin3r101 I would not be surprised if this is why his STRIX 4090 does not fire up :D

    • @Marin3r101
      @Marin3r101 2 years ago

      @@VadikRamm No, the card should still POST. Only 450 watts is required, unless the Asus vBIOS requires them all.

  • @Foxhood
    @Foxhood 2 years ago

    Engineering-wise, I love this approach. It is a single universal connector that works for all GPUs and PSUs, from massive to tiny, and it ensures the GPU won't overload the supply willy-nilly.
    An improvement over just using more and more of the old clunky 6/8/6+2 connectors. It's just the transition, with all these fugly 4-headed adapters all over the place, that sucks.

  • @crisnmaryfam7344
    @crisnmaryfam7344 2 years ago +4

    1:21 After what they said about Gamers Nexus, I don't know if I'd trust anything from Corsair that has to do with the 40 series.

    • @miliyn17
      @miliyn17 2 years ago +1

      Corsair already responded to that. It was just a dude who messed up. Not a big deal.

  • @failomas1443
    @failomas1443 2 years ago +1

    I watched the Gamers Nexus video where Corsair was saying GN's tests were bullshit, and then they got smashed by Steve, who proved them wrong. So this cable will probably be another failure from Corsair.

  • @NicWoods88
    @NicWoods88 2 years ago +10

    Over the past couple of years I have lost almost all interest in anything from Nvidia due to their business practices. That said, I greatly appreciate the way Jay experiments with the new tech that is available to his team.
    Consistently great content! 👍

    • @Physics072
      @Physics072 1 year ago

      LOL, you're just finding this out? I had 3dfx stock back in the day, and Nvidia was going to give 2 shares of Nvidia stock for 1 share of 3dfx stock. So if you had 5,000 shares, you'd end up converting them to 10,000 shares of Nvidia. But they found a better idea: they kept 3dfx as a shell company with no value, so they did not have to give the 3dfx shareholders a dime. A guy with 5,000 shares of 3dfx at $5.00 a share ($25,000) saw them go to 0, so his holding was worth 0. Lots of people lost all their money. And after that, Nvidia sent those shareholders a letter asking them, or telling them, that they could now buy Nvidia shares. LOL, oh, the number of angry people they created. They have not changed one bit in 2023. They screwed over so many people with that dirty move.

    • @xGR1MxREAPERx
      @xGR1MxREAPERx 1 year ago

      What are you gonna use instead? Subpar garbage cards? Good luck.

  • @Dtr146
    @Dtr146 2 years ago +2

    This all reminds me of adapting molex connectors into 6- and 8-pin in older computers.

  • @joakoc.6235
    @joakoc.6235 2 years ago +4

    We need a "what happens if you trick the voltage-regulation IC on the card into pushing more voltage to the core" video.

  • @Dtr146
    @Dtr146 2 years ago

    3:30 That statement just made me want to go on a rant about how people think a higher-amp brick will damage a device that came with a lower-amp one. Amps are pulled, not given.

  • @Kieranh778
    @Kieranh778 2 years ago +5

    I would be more curious whether there ends up being any functional difference in overclocking headroom between the official power adapter and a 3rd-party one.

  • @Phat-Monkey
    @Phat-Monkey 2 years ago +2

    Even with a Corsair 900D case, the cables coming from the 4090 Suprim X do touch the glass, so I totally agree with your previous video: the design is pants for those with small PC cases.

  • @Techno-Tanuki
    @Techno-Tanuki 2 years ago +3

    Interesting video. Time to see if I can ever decide, once prices come back down (scalpers again), whether a 4090 is worth it or not. Probably not.

  • @bitsbfg1810
    @bitsbfg1810 2 years ago

    Liked seeing Jay lose his words on the 300W test out of sheer admiration for the controller. We feel you, Jay!

  • @tristen9736
    @tristen9736 2 years ago +3

    From what I've seen from GN, the 4 cables having power essentially just tells the GPU how much it can draw. Each one is 150W, so 4 is 600W and 2 would only be 300W.

    • @edwardallenthree
      @edwardallenthree 2 years ago +1

      And the card needs 450 to boot. Remember, however, that it is fine to use the same 8-pin line for two connections (which is why they are pigtailed). So the connection to the power supply would be the same two connectors whether you use Corsair's cable or Nvidia's.

  • @ej_tech
    @ej_tech 2 years ago

    That iFixit ad still hasn't gotten old.

  • @Azulath.
    @Azulath. 2 years ago +3

    Hey Jay, I have a similar issue with the MSI RTX 4090 Suprim X. I don't get any display output on my Gigabyte X570 Aorus Master and the card just won't work. (I have tested two different PSUs, one 1kW and one 1.6kW, both with the GPU's dongle and with the cable you are showing in the video.)
    I'm wondering, though, if the behaviour is identical. On my card the fans do not spin up and no RGB lights up... is it the same on the Strix? I think you mentioned the fans do not spin up, but you didn't mention RGB.

    • @calop002
      @calop002 2 years ago

      Did you manage to solve the issue? I'm having a similar problem.

  • @shanent5793
    @shanent5793 2 years ago

    Solid wire wouldn't need cable management. Just work it back and forth a few times and it'll stiffen and stay wherever you put it

  • @MrIrondog55
    @MrIrondog55 2 years ago +3

    With all these multi-ended cables, like tendrils, coming off the GPU, and Corsair's drama with Steve/GN, this could literally be a new Squid Game.
    [EDIT:] Your test @12:39 confirms what Steve said. Corsair told them their testing method was BS, lol.

  • @BoahsLoL
    @BoahsLoL 1 year ago

    FYI, on what you mentioned at 06:45: it's fucking AWESOME that we're getting an onboard iGPU on Ryzen now. I couldn't tell you the number of times I couldn't get a screen to appear with my 3080 because my BIOS was in UEFI mode, in which, apparently, the FE 3080 wouldn't display anything at boot. So I had to get out my 1060, boot into the BIOS and enable legacy mode, then turn the machine off, put the 3080 back in, and everything was fine.
    Now, with the board I have, I am at least presented a screen letting me know to go into the BIOS (while using the 3080) and change the mode. And now I've got a 7700X coming in, and just knowing I'll be able to simply boot and mess with the BIOS on the onboard GPU is an amazing feature. Not overlooked in the slightest by this customer; big appreciation to AMD for that.

  • @Mark_catface
    @Mark_catface 2 years ago +4

    I'm sticking with the 30 series; it looks better.

    • @michaelthompson9798
      @michaelthompson9798 2 years ago +1

      I agree with that statement 🥰👍💪. I think the aesthetics overall are nowhere near the level of the 30 series. But still impressive numbers overall, regardless of power draw. CPUs now have to catch up 😇💪

    • @Mark_catface
      @Mark_catface 2 years ago

      @@michaelthompson9798 Very true.

  • @Crunchifyable2
    @Crunchifyable2 2 years ago

    I think Jay found his hidden talent in voice acting and commercials.

  • @Bloodrocutioner
    @Bloodrocutioner 2 years ago +5

    Damn... if GPUs keep using this much power, eventually a new PSU will just come bundled with the GPU. That cable grouping is atrocious. I thought the double cable that came with my 1070 was awful; I'm just going to run my 1070 into the ground before bothering with a newer GPU. 🤣
    Also, I'm still running a 3700X 😅🤣

    • @TalesOfWar
      @TalesOfWar 2 years ago +1

      I wonder if the new AMD cards will have the same kind of cable fun going on? I'd imagine we will, but their stuff tends to be more efficient than team green's, so things might not be quite as crazy.

    • @Bloodrocutioner
      @Bloodrocutioner 2 years ago

      @@TalesOfWar Yeah, I was wondering the same thing when I was commenting. I'm just going to upgrade to a 2070, or, I don't know, I'll have to research whether something better than my current 1070 is even worth a 2070-level upgrade. I can imagine AMD having something smarter and more innovative, though.

  • @S_Karthik
    @S_Karthik 2 years ago

    Jay, it's been years 😊. iFixit has to collab and do another explosion ad that interrupts the interruption of the GPU becoming bigger than the whole PC.

  • @Bob_Smith19
    @Bob_Smith19 2 years ago +4

    Will the Corsair adapter allow 600 watts on a 4090? GN has an engineer from Nvidia saying all four are needed for full power, and Steve proved that you would not get 600 watts without all four being connected.

    • @joebleed
      @joebleed 2 years ago

      I think that's for using the Nvidia adapter; the Corsair one is for their own power supplies, which should be able to handle that wattage over that cable/connector. Think about PCIe power cables from power supplies: some have more than one 8-pin on a cable, presumably because the power supply can supply that amount of wattage for those connectors.
      But the Nvidia adapter only sees power from each connector. Coming from your power supply, you're probably still only using 2 separate feeds even if it's four 8-pin PCIe connectors.

    • @metaleggman18
      @metaleggman18 2 years ago

      Yes. It's because the cable Nvidia provides with their FE cards has an IC in the cable itself, at least according to the GN video. The 16-pin connector is essentially just two 8-pin PCIe cables connected together, at least when used on an ATX 2.0 PSU.

  • @DavidFregoli
    @DavidFregoli 2 years ago +2

    Each PSU line can carry 288W; Nvidia's adapter uses 4 because the old 8-pin GPU CONNECTOR is limited to 150W. Also, you might use a cable with 2 terminations, which would end up on a single line.

    • @Glotttis
      @Glotttis 2 years ago +2

      Yup. Can't believe so many are confused by this; even youtubers who should know better are spreading misinformation. Has anyone even looked at the back of their PSU? Those 8-pin connectors are clearly marked PCIE 6+2 / CPU 4+4, meaning each connector is also rated for an EPS12V 8-pin, capable of about 300W easily. That's why Corsair only needs 2 connectors for this cable (of course, you've got to be mindful of your PSU's total wattage, though).

    • @JasonSturgess
      @JasonSturgess 2 years ago

      @@Glotttis So can I use 2 cables and just use the splitters on the end to feed the other 2 inputs? So only 2 PSU cables, each splitting into 2 GPU power connectors? Or does this not work?

    • @DavidFregoli
      @DavidFregoli 2 years ago

      Yes, 2 PSU cables are enough. A Y-splitter is not OK, though, because it would mean 300W going through a single connector, which is out of spec. If by "splitter" you mean a cable with 2 GPU-side connectors, that's OK.

  • @godtiermedic
    @godtiermedic 2 years ago +1

    Jay your biceps are looking stacked 💪

  • @CRSolarice
    @CRSolarice 5 months ago

    1:53 Solid-wire cable will fail more quickly than stranded, especially if flexed/moved frequently.

  • @N0N0111
    @N0N0111 2 years ago +1

    3:40 Exactly. Those are not solid-core cables; they don't hold their shape after one bend.

  • @thomasvennekens4137
    @thomasvennekens4137 2 years ago +1

    If it were solid core it would crack pretty easily after a few bends; it's almost certainly not solid.

  • @hassosigbjoernson5738
    @hassosigbjoernson5738 2 years ago +1

    So this launch *is not about super exciting, affordable graphics cards!* It's all about the power supplies, cables and stuff.
    So sad!
    Does anyone remember the RTX 3070 launch?! 2080 Ti performance for 499 bucks? That was cool... and it feels like ages ago.

  • @briant9251
    @briant9251 2 years ago +1

    After Corsair called out Steve, and Steve then proved them wrong, I'm unlikely to trust any Corsair PSUs or cables.

  • @FilipMunk
    @FilipMunk 2 years ago

    The reason the PSU end is only 2x 8-pin is that those connectors each have 4x 12V lines (the larger-gauge wire helps too), while the 8-pin PCIe power connector only has 3x 12V. That's why the other adapter runs 3 or 4 8-pins into one Y connector.

  • @spankbuda7466
    @spankbuda7466 2 years ago +1

    I don't know about this Corsair cable, especially after reading what that Corsair rep stated about how the Nvidia sense pins work, and being wrong!

  • @manuelcardoso4269
    @manuelcardoso4269 2 years ago +1

    Yep, you need all 4 connections to get to 133% power; that Corsair cable is trash.

  • @eljoel89
    @eljoel89 2 years ago +1

    Guess this cable explains why Corsair thought they could get shitty with Gamers Nexus and get away with it.

    • @PabzRoz
      @PabzRoz 2 years ago

      Jesus Christ, why doesn't anyone pay attention? That only applies to Nvidia's cable adapter. That's what Steve from Gamers Nexus was talking about when calling out Corsair, not Corsair's own cable. The way this cable works is completely different and has absolutely nothing to do with what Corsair and Steve were beefing over.

  • @Shadowfax2121
    @Shadowfax2121 2 years ago

    Goddammit Jay, take your updoot. That iFixit ad was pretty great, buddy.

  • @ErrorCDIV
    @ErrorCDIV 2 years ago

    3:30 That's really well put. Didn't know that.