Best way is to find one with a ye olde ExpressCard slot, so you don't have to remove the bottom of your laptop to fit it. But used Ryzen 2nd/3rd-gen and MX150 laptops are cheap enough to make this method a bit obsolete for travel.
I don't know if this matters to you, but honestly, in my opinion the quality of the videos has improved immensely recently. I don't know what you changed, but your videos look a lot more professional.
I have actually tested this. These things do generally use normal USB3 cables that have the proper wiring. They could theoretically use any other type of cable as long as it has the correct number of pins, because all they're doing is using the cable as a transport medium. They're not using the USB protocol, just the cable.
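To put rough numbers on that, here's a small Python sketch comparing the wire counts involved. The lists are only illustrative: the exact pin assignment is board-specific and undocumented on most cheap risers, so the assumption that the shielded SuperSpeed pairs carry the PCIe TX/RX pairs is exactly that, an assumption.

```python
# Rough illustration of why a USB 3.0 cable works as a PCIe x1 transport:
# count the conductors. The actual pin assignment varies by riser board,
# so these lists are only illustrative.

# Signals a PCIe x1 link needs to carry across the cable (plus grounds):
pcie_x1_signals = [
    "PETp0", "PETn0",      # transmit differential pair
    "PERp0", "PERn0",      # receive differential pair
    "REFCLK+", "REFCLK-",  # 100 MHz reference clock pair
    "PERST#",              # reset sideband
]

# Conductors inside a USB 3.0 Standard-A cable:
usb3_conductors = [
    "VBUS", "D-", "D+", "GND",           # the four USB 2.0 legacy wires
    "SSTX-", "SSTX+", "SSRX-", "SSRX+",  # two shielded SuperSpeed pairs
    "GND_DRAIN",                         # shield drain wire
]

print(f"PCIe x1 needs {len(pcie_x1_signals)} signal wires")       # 7
print(f"USB 3.0 cable offers {len(usb3_conductors)} conductors")  # 9
# Presumably the shielded SuperSpeed pairs carry the PCIe TX/RX pairs and
# the leftover wires carry clock and sidebands. No USB protocol is involved.
```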
Fun fact, these are actually super useful for ultra-budget rigs. On eBay, ITX 6th-7th gen Intel i5 CPU/mobo/RAM combos can be as cheap as $80, and the cheapest ones are that cheap because they only have an x1 slot. So you can have that system with a 1050 Ti for under $300, with performance that still beats a stock 5600G in many cases, and those systems can cost double the price. Obviously it's not the best solution, but in an increasingly expensive used market, it's still a solution.
The same 6th/7th-gen rigs in full-size or SFF form, like the Dell OptiPlex, are around the same price now though. They're definitely good shouts for budget gaming rigs. No need to blow $1k when you're just looking to get a good 1080p gaming experience!
This actually worked far better than I thought. I might try this to use a low-end Nvidia card I've got lying around to provide PhysX support to my 'vintage' HD 7950.
I remember years ago in my country they used to sell GPU adapters for laptops, so you could get better graphics for whatever purpose you wanted. I never got to try one, but this gives me an insight into how it could have been.
EE here: the molex cable has such a low impedance that to catch fire you'd need to dump an incredible amount of current through it - more likely whatever chips are doing power regulation would end up torched. Edit - that's why the adapter got warm
I love seeing random hardware stuff like this to extend the life of parts you have on hand. Lots of mini PCs and laptops could put something like this to use. Great video, thanks!
Directly connect the riser card to the laptop's Wi-Fi card port and you can drastically improve performance, and don't forget Windows ReadyBoost with an SD card to gain another 5 to 10 FPS.
The graphics card is not really running on USB 3.0; the actual protocol under the USB 3.0 connector shape is PCIe x1. It cannot be connected directly to a laptop's USB-A port!
Molex should not be used long-term to power this; it's a fire hazard, as the cable cannot handle the power the PCIe connector can demand, which is up to 75 watts. Molex is only rated up to 4.5 amps, or 54 watts on the 12V rail. This probably explains the adapter getting hot and the cracking sound. I learned this with crypto mining; there are newer PCIe adapters that use PCIe power cables.
Now if this had an AGP 8x adapter, so you could plug a slightly more modern card into an old system and maybe deal with some questionable drivers, it could be fun to fiddle around with.
At first I thought this was a way to connect a GPU over a USB 3 bus. That would actually be interesting for Plex servers on small form factor PCs without a PCIe slot.
You are so wrong about this setup. The USB cabling does not have enough bandwidth to transmit enough data for the full graphics spec. However, it has more than enough bandwidth for... *drumroll*... mining. That's right, these were used for MINING rigs, not for adding a GPU to a smaller form factor. The most bandwidth you get out of that is x1-x2... tops.
This is only possible because PCIe is delay- and latency-tolerant. This is the same reason you can plug an x16 card into an x1 slot. The molex connector is just there to send 75W to the slot.
Nvidia, hear me out... RTX 3030 with an option to use both PCIe x1 and x16 slots for: 1) light modern gaming; 2) display adapters for companies that want the hardware with the longest support period possible; 3) access to Tensor cores as a second GPU for content creators, deep learning enthusiasts, and other tech people.
I think the usb3 is just a cable basically for connecting the pins, not doing anything 'usb'.
You can plug a normal USB device into the slot and it will work. I made a PC have "onboard WiFi" using one of these.
@@HardWhereHero I very seriously doubt that.
IIRC early models of this riser used HDMI to connect the PCIe adapter to the tiny PCIe x1 bracket. Whoever was making these probably figured USB ports and cables were cheaper to source than the already-cheap HDMI ones.
USB cables also add latency at that length, especially for heavy workloads like games.
@@DrewWalton If that was the case, PCI USB expansion cards wouldn't be a thing. Windows automatically recognizes USB devices even through PCI expansion.
I think the title of the video is misleading. It's not a USB graphics card adapter; they're just using the USB cable to carry the PCIe x1 signal. People should not try to connect the USB plug to a USB port.
Agreed. This is not USB at all, although the cable used to transfer the PCIe data is a USB cable. Something like "gaming on a PCI Express x1 riser" would be a better title.
I wonder what would happen if you connected it to a USB port.
Yeah, that's what I was thinking. I wonder if that's why it's a USB 3 cable - because it has more wires, not because it's using USB protocols.
There are mining boards where you connect this directly to a USB port.
I just wanted to ask why he didn't think to use this with a laptop. There are laptops with USB 3 and 4 and eGPUs are usually very expensive.
I used this adapter with a Radeon RX 470 on an old 2008 Compaq laptop, connected through an additional Mini PCIe x1 adapter, and realized that it only has PCIe 1.1; the bandwidth limitation was so terrible that performance wasn't much higher than a Radeon HD 5450, which I also tested. But considering that this laptop had an integrated GeForce 8200M G, which is actually the worst DX10-capable GPU on the planet (about 4-5x worse performance than the HD 5450 on Mini PCIe, believe it or not), it was still a good upgrade, although very impractical.
Holy crap, I had the HD5450. And you're saying it's 5 times slower than that??? Damn that's rough
@@detoi4371 Yup, Bioshock Infinite runs at around 30-40 fps on lowest at 800x600 on the HD 5450, while that poor GeForce 8200M G gives less than 10 FPS. You can get it to maybe 20-ish FPS if you overclock the GeForce and set a custom resolution to something under 640x480, but then it looks like crap. It even struggles to maintain 30 fps in GTA SA if you set the resolution higher than 1024x768, and considering the native resolution of that laptop is 1280x800 and it came out 3 years after the game was released, that's terrible.
@@mruczyslaw50 Jesus Christ, that card's terrible 🤣, makes the HD 5450 look like a gaming card.
Yeah, the Mini PCIe slots used for Wi-Fi tend to be really bad ones, otherwise your scheme would be pretty popular with some custom «gaming dock stations» for laptops. But it would probably work way better with the M.2 slots designed for SSDs. Most laptops with multiple M.2 slots are already gaming machines with a dedicated GPU, though.
Btw, I wonder if modern M.2 Wi-Fi modules also use older PCIe standards? Business laptops usually have several secondary M.2 slots, so there's still hope for turning some robust ThinkPad or Latitude into a semi-decent gaming machine every Friday.
Would it be worth it to invest in one of those dedicated eGPU enclosures that have their own SFF PSU and just connect to a laptop or NUC-sized desktop? Asking because on my local used market there's a Gigabyte Aorus one with a GTX 1080 included for around $250, and I've really been debating getting it. I'm just unsure how the bandwidth would be affected, since it's basically a better version of the adapter referenced in this video.
G'day Random,
I love the surprises & giggles your channel gives me, it is always a better day with RGinHD videos
Thanks mate :)
@@RandomGaminginHD Any way I can do this to my laptop? It's a Toshiba Satellite C850.
I wonder how much better a USB Type-C version would be, with its 10 Gbps bandwidth
Edit: Yes guys, I see it is still PCIe, not USB. I was just curious as to whether or not the USB 2.0 would have caused a bottleneck.
I don't think it's really USB. It's just 1x PCI-E that happens to be on a USB cable.
Not much better; you'd need at least Thunderbolt for that to be capable.
@@Alex-qq1gm Exactly, this is not compatible with USB, it's just utilizing the cable. Actually, if you plug either side of the riser into a real USB port, it will short and fry one or both of your components.
Theoretically, that is what Thunderbolt is supposed to be: PCIe over USB.
@@Alex-qq1gm a philosopher of budget gaming. namaste.
This has NOTHING to do with USB; the adapter just repurposes a USB cable because it has the number of wires needed to carry the data signals of a single PCIe lane, and in general such USB cables are made with good enough quality control. It could have easily been two Ethernet patch cables instead of a single USB cable; it would just have looked uglier and taken up more space on that tiny adapter card plugged into the PCIe x1 slot.
You should edit the title of the video; I actually expected to see a USB video card (there are such cards).
Please, link a USB Graphic Card 🤣
@@thrivingentrepreneur Google USB-to-HDMI adapters; they're basically just for getting an extra display output.
There are Thunderbolt 3 external PCIe chassis too, though.
@@thrivingentrepreneur He's right, they exist, they're just not called GPUs. USB > HDMI / Mini HDMI is only one example of them.
These aren't really meant for putting a graphics card in a small PC that can't hold one otherwise; they're made for GPU mining. I'm using 3 of them at the moment in a mining rig. If you need to put a graphics card in a tiny PC, you'll want an x8 or x16 riser. Otherwise, if you can buy these with the cables, they're a cheap way to get good USB 3.0 cables if you need a bunch for some reason.
No shit lol
@Lurch7861 There's some pretty flexible and longer x16 and x8 riser cables available.
Very interesting. Would you consider repeating this with an older-generation graphics card? Perhaps the difference would be smaller for a card like the GTX 1050, which presumably requires less bandwidth from the PCIe slot anyway.
You need a card with a power supply connector, because the adapter slot is only getting about 54W from the molex, not the supposedly industry-standard 75W for an x16 slot.
@@robertkubrick3738 An RX460 could work if it's limited to 50W. A lot of the red bare PCB ones are.
@@robertkubrick3738 You could limit the clocks in MSI Afterburner, with a nice undervolt curve as well. However, I wouldn't be happy running 54W continuously through that molex; I'd want to limit it to something like 40W max. You'd probably still get 70% of the performance.
@@ChrisD__ I think my integrated graphics would beat an RX460 that was limited.
@@elemkay5104 It might not have been 54W through the molex, depends on how much the 6+2 connector supplies. 54W is just the standard molex maximum.
I love how utilitarian your desk setup is. There is something extremely cool about having a no-frills setup.
This is a test bench. Typically they have just what's needed to mount a board and some peripherals. They're open like that to easily swap components for testing. It'd be a terrible idea to leave an actual full-time PC open like this in most cases, as the cooling would be extremely inefficient.
these types of adapters can be a good solution for older laptops
No, they cannot be, because that USB cable still carries a PCIe signal to a PCIe port. It is nothing but an extension cord. If you plug it into a laptop's USB port then in the best-case scenario nothing will happen, but be advised that you may end up frying the data lanes of the GPU.
I bet the performance would be slightly better if you put the adapter in the x16 slot, because that slot is directly wired into the CPU's PCIe lanes.
EDIT: the x1 slot is wired through the chipset and then uses the chipset-to-CPU connection, which will add some latency.
I know, I know, the premise is that you don't have an x16 slot; still, it would be interesting to see the difference, if any.
I guess latency would be reduced a little, but the bandwidth would still produce the bottleneck regardless.
I have a mining rig with this riser. I put it in the x16 slot and there's no difference compared to the x1 slot for gaming.
I'd be really curious to see you test those laptop GPU adapters. There's lots to choose from: expensive ones like the Razer ones, and cheap ones like what you have there. So I'd be really curious to see what you can do with a decent laptop and an external GPU.
Love these random adaptors in HD! It's got me thinking it's been near a decade since I've played GTA IV and I really need to give it another go on my modern system.
I feel that
GTA 4 is badly optimized, so it won't change much.
@@johnnyblaze9217 I originally played GTA IV on PS3, so I suspect I'll be able to tweak it enough to get a better experience than that.
@@Varmint260 Meh, GTA IV is very unoptimized; even a GTX 1080 Ti when it first came out was struggling to handle it, so don't be surprised if it still runs like shit.
@@Varmint260 Play it on PS3 again, it's better than the other versions. Also, the PC version has some quirky bugs, like the final mission being unwinnable unless you cap it to 30 FPS.
There are Mini PCIe (PCMCIA too) adapters on Chinese sites that are quite useful for bringing life into an aging laptop. But they've become quite expensive. Great video!
Actually it's ExpressCard. PCMCIA was based on ISA. Later they replaced the ISA interface with PCI; this was known as CardBus.
@@-x21-
There were no "PCMCIA" adapters, but PCI-derived "CardBus PC Card" can and has been "Bridged" to PCIe.
The kits I've seen, have all been 'industrial' focused. (and very spendy) Note: PLX, ASmedia, and a few other ASIC manufacturers make bi-directional PCI-PCIE bridges.
@@LRK-GT Incorrect. Before the PCI-derived CardBus there were ISA-derived cards. These were 16-bit.
@@-x21- communications breakdown.
That is precisely what I expounded upon:
No, there were not PCMCIA -> PCIe adapters (confirming your comment)
But, 32-bit PCI-derived CardBus PC Card, the direct successor to ISA-based PCMCIA, bridging adapters did/do exist. (expounding upon 'ancient interfaces' being adapted to modern PCIe)
@@LRK-GT Ah, sorry. Yeah, there are a ton of bridge chips for PCI-to-PCIe and PCIe-to-PCI for such applications.
Not USB, just using the USB cable as the connecting interface between the two cards.
Okay, here is another scenario that would totally validate this adapter.
SAS expanders in servers: these are cards you attach to SAS controllers to get more hard drives (SAS is compatible with SATA), but these cards are usually PCIe x8 in length.
Now, these expanders are unique in the sense that they don't use the PCIe bus for anything other than power; all communication is done over a SAS cable from your SAS controller.
So this adapter you have will basically allow these cards to get their power without eating up a PCI-E slot on your motherboard.
I’ve always pondered on this but never put it into practice, well I guess I don’t have to now! Great video as always 🙏🏼
You can try USB-C
@@prateekpanwar646 haha yeah cheers
It's using the USB cable as data wires, not a USB signal.
If you plan on buying some of these, make sure you get good-quality risers. Only buy risers with 6-pin power connectors and avoid the molex/SATA ones. There have been cases where fires were started by them.
It's always awesome watching you do off-the-wall stuff like this, the unthinkable 👍
Yes, the molex adapter is VERY capable of catching fire. Or at least in my case it got so hot that it melted the cable's insulation...
Why would you not just plug the card into the x1 slot if you have it available? (obvs you would need to remove the bit of plastic at the back of the slot to fit it in, a 2 min job)
I have a Dell PowerEdge R710, which has a PCI-E x16 slot but it's limited to 25 watts as opposed to the standard 75 watts on most boards. I thought it would be interesting to use one of these to hook up a 960 or a 1060 6 gig or something to see how it would compare to my desktop. But the performance here seems so bad I feel like you've saved me the hassle, lol.
Great video!
Can you do a second test with the adapter in the upper PCIe slot, which is connected to the CPU? In the lower slot you get performance drops because those slots are served by the chipset, so you measured not only the difference between PCIe 4.0 x1 and PCIe 3.0 x1. If you want, you can limit the PCIe version of the upper slot down to 3.0 in the UEFI for the measurements.
I bought one of these intentionally, but not for use with a GPU. I built a Direct Attached Storage chassis so I could add more disks to my NAS build. The extra chassis used a SAS expander which itself needed to be powered by its pci-e bus connector, but it didn't pass any data over that bus. With this adapter, I could power the card without the need for a whole motherboard in the DAS chassis.
Hilarious... only a few weeks ago I pulled a very similar adapter out of a "discarded" Lenovo ThinkCentre workstation ('cept the power was delivered via a SATA to 6-pin PCIe power cable), and with it an Nvidia T600. The regular x16 slot was populated by some GT730. I wonder what kind of "expert" put this together. LOL
About 5 years ago I used a PCIe x1 riser card to diagnose a strange problem that I had with an old Asus P67 board. P67 boards don't work with the iGPU. The problem was that the PCIe x16-sized slots on the board weren't working. As a troubleshooting step I tried one of these cards in a PCIe x1 slot and I was able to boot the system. The CPUs I used were a 2600K, a 2500K, and a Pentium G645.
What's strange is I tested the system with a 3770, a 3570, and a 3770K, and the PCIe x16 slots worked.
It makes sense that the slots connected to the chipset worked and the slots connected to the CPU didn't. What doesn't make much sense is the slots not working with Sandy Bridge but working fine with Ivy Bridge. My only guess is some of the socket pins on the board were shorted... and IVB was using reserved pins that SB wasn't. The main difference between IVB and SB wasn't the CPU processing power; it was that the iGPU that came with IVB was much more robust.
I've always wondered if this worked! I have a few of these from a mining rig that I never actually built.
Keep up the shenanigans. I enjoy watching them.
Hey, great video! I noticed that you own/owned an i5-9400F and I wanted to ask if I should get one? Is it worth it? And what GPU pairs best with it? Thanks in advance and I hope you have a wonderful day!
It’s a great cpu. Would go nicely with a lot of GPUs. A used 1080 would be a good choice
@@RandomGaminginHD Thanks for the advice! But also, if I pair it with a new-gen GPU, would that be too much of a CPU bottleneck - for example, let's say a 3070? I am sorry for throwing a million questions at you.
@@leecannotbesin483 I know I'm not RGHD, but hopefully I can be helpful anyway. Bottlenecking is hard to determine: higher resolutions/higher graphics settings put less stress on the CPU, as opposed to higher FPS, and sadly some games just lean on certain hardware.
My questions are: what resolution?
And what games?
@@younglingslayer2896 Well, I do have a big game library that I can't fully enjoy at the moment. It's gonna be 1080p esports and triple-A game titles.
@@leecannotbesin483 A 3070 for 1080p is a bit silly, but it'll work fine. If you're not 100% set on that CPU I would suggest the i3-12100F instead, then just cheap out on the card a little bit and grab a 3060 Ti or something from AMD.
This was humbling for me because it still did a hell of a lot better than I thought it would.
So you became humble?
@@Tom_Quixote No, hell no. I was surprised how well it did, but I have long since gone back to my swatting limp-wristed, internet boy bitch days.
I still have like 30 pieces left in my toy box from my last mining operation. They run at PCIe 3.0 x1; the best card you can go with on them will be a 1030 or a 750 Ti.
Yeah, I wondered where the limit is power-wise before you start to see huge performance dips.
@@RandomGaminginHD I did do some research on the power provided by the PCIe slot, and it comes from the 4-pin molex, which most of the time people use a SATA converter on. The "safe" power draw (I mean burning the board instead of the GPU) will be near 50W or so. However, theoretically you can go up to 150W if safety is none of your concern (I fried no fewer than 10 boards with my GTX 1060s at ~100W).
I used a few AMD and Nvidia cards; the Nvidia cards were working normally with 10~30% reduced performance. The GT 1030 was the best for gaming.
They say we don't have answers for all the questions, but honestly you are getting us closer to that very fast.
I have a good video for you: I recently acquired a Lenovo ThinkStation M920q and added a PCIe adapter and a low-profile 1050 Ti. Shockingly, it works!
Good to hear :)
Very important: it is NOT USB. It just uses the USB-A 3.x connectors because they have 9 pins. Some adapters use the HDMI connector because it has even more pins available.
If you connect it to USB or plug a USB device into it, you might damage something.
There are also ExpressCard-to-PCIe slot adapters, which should perform much better since ExpressCard is 2.5 Gbps. Good for older laptops with no Thunderbolt ports.
I own an HP 400 G1 SFF i7 system, and also a full-size 1060 3GB graphics card. I combined the two by obtaining a PCIe extension cable along with a SATA to 6-pin power cable. The metal case top obviously cannot be used with this configuration, so instead I use a sheet of thick cardboard with 2 slits for the cables (no hot spots, so not really a fire hazard, although I will try to find something less flammable). The card sits neatly on the cardboard sheet, I can now use it to its full potential, and it flies.

The 1060 3GB only draws around 110 watts maximum under heavy load, so the 250-watt power supply copes fine, as the rest of the system only draws around 20-30 watts at most. In fact, I have 2 external hard drives connected and everything runs without a hitch. With this setup I can test any number of graphics cards - well, low-powered ones - although the system will still be able to run much more power-hungry cards for normal everyday use; it's only when you put the card under load that it requires the extra power. I might make my own video, come to think of it 😉
Interesting. I wouldn't want to use one of these full-time, but having said that, I've found a use case for one.
If you have a laptop with integrated graphics, you could shove a low-power GPU into this and connect it to the laptop via USB 3.0. There's still the molex question (how to power the thing up), but still - possibly a good way to get a laptop to game, until you can get a gaming laptop or a tower PC with dedicated graphics?
Nope, unfortunately plugging this into an external USB port would do nothing (except perhaps fry your graphics card, USB port, or both). The USB port here is just being used as a cheap way to get a fairly high bandwidth physical connection, it's not operating with anything resembling USB.
Based on what I can see, this has nothing to do with USB. The USB cable is used because it has sufficient signal lines to carry the PCIe x1 lane and is of good quality. This is just a riser that lets you connect the graphics card outside the case on a cable (vs. a ribbon cable). Try connecting the graphics card to a PCIe x1 slot; it should perform exactly the same. Yes, it should work fine.
I tried to do this the other day for a Plex server, because I wanted to use the two x16 slots for an LSI HBA adapter and a 10GbE network card.
Sadly the PCIe x1 slot wasn't enough for the card to do its transcoding duty properly. The video out worked without trouble, so I was optimistic, and it could transcode 1080p source video. 4K>HD was a complete impossibility though :(
So I had to compromise and use a 5GbE network card (which can work in an x1 PCIe 3.0 slot).
I actually ended up using that adapter without plugging it into the PCIe slot (mine had PCIe power sockets as well) for a SAS expander card with 28 additional SATA ports (so that I can have 40 drives in my computer; SAS expanders can be daisy-chained).
Oh, I was thinking of using these with a splitter for a 10Gbe card and a GPU for transcoding (this server only has one PCIe slot sadly). I guess I'll have to find an alternative or move my media server and GPU to my NAS :/
How's the performance with the LSI card tho?
@@mttkl I am extremely happy with the LSI card (I bought a 2nd-hand one that had been flashed to IT mode so that it works as a host bus adapter - otherwise it can only be used for RAID configurations out of the box).
I've had a lot of SMR HDDs that would regularly drop off (until a reboot) using a basic consumer-level PCIe>SATA card. That doesn't happen at all with the LSI controller, and I can barely tell that these are SMR drives compared to how they were before.
The biggest problem is that the LSI card I have is PCIe 2.0, so it pretty much *has* to have access to all of its lanes to work properly. You can probably get new (but really expensive) PCIe 3 or PCIe 4 cards.
10GbE networking is kind of overkill for HDD storage. You can get 5GbE USB adapters that might meet your needs.
I think you need a motherboard that explicitly supports PCIe bifurcation in order to split PCIe slots in the way you're thinking - I don't know of any consumer boards that do.
@@helenFX Thank you for the thorough reply!
Hopefully I'll be able to get a cheap LSI card soon, and a motherboard with more PCIe slots too :)
Never heard good things about those PCIe-to-SATA cards, will definitely avoid, especially when running TrueNAS Core.
Yup, 10GbE is totally overkill for me as I don't even use an SSD write/read cache, but I can't really find any 5GbE card here, and the only other option is going 2.5GbE (or going the more expensive route with SFP).
Thing is, the only 2.5GbE card I can find, while enough for me, is literally the same price as an RJ45 10GbE card, so I might as well just go with the 10GbE one and have that headroom for future storage upgrades.
I think my personal B550 board supports bifurcation, but I'm not so sure about the one in my server, as it's OEM and basic - but I was already planning on replacing it in the future.
I was actually expecting way lower performance than that. Crazy stuff!
Here's the biggest issue: molex is only 52 watts instead of the 75 from PCIe. If anyone gets one, get one with a 6-pin PCIe connector instead.
Does a 3050 require 75w?
Came here to say the same thing; using one of these with a molex is dangerous. @RandomGaminginHD Don't use this setup again mate, you can get ones that have a PCIe power connector on the riser.
@@Loneadmin The RTX 3050 is a 130W TDP card, but the problem is you can never know where all the power is going to be requested from. If the card wants the full 75W from the PCIe slot itself, the molex powering it can only supply 52W, so it cannot safely deliver all the required power to the right place. The 8-pin going into the card can deliver up to 150W, and in this specific case, with the 3050 being a low-powered card, things should not get overloaded. But with a more powerful GPU that asks for more power, using a riser with a molex is really dangerous.
There are much better PCIe x1 risers on the market that connect to a PCIe 6-pin connector.
A molex chain on a high-quality PSU (e.g. EVGA T2/P2, Super Flower Leadex, Corsair AX/HX/RMx) that uses heavier 18-gauge wires 12" in length is good for 120W (safe) or 132W (max, not so safe)... usually 10 to 11 A on the +12V rail. That's for the whole chain, not a single connector.
Using a riser with a cheaper PSU that can only handle 6 A per molex chain would definitely not be a good idea.
@@YorkieKilla Yeah, you'll likely get away with it, because the massive drop in performance means it won't draw close to that 130W TDP. Still not recommended. It would be okay if you undervolted and underclocked, though, and tested to make sure it doesn't consume more than, say, 90W overall. Btw, the power drawn is usually balanced between the slot and the connector.
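To tie the numbers in this thread together, here's a back-of-the-envelope sketch in Python. The ratings are the ones quoted above (≈54W molex, 75W slot, 150W 8-pin, 130W TDP); real limits depend on PSU quality and wire gauge, so treat this as rough arithmetic, not a guarantee.

```python
# Back-of-the-envelope power budget for this riser, using the figures
# quoted in this thread. Connector ratings vary with PSU and wire gauge.

MOLEX_12V_W = 4.5 * 12   # ~54 W: molex pins rated ~4.5 A on the +12 V rail
SLOT_SPEC_W = 75         # max a graphics card may pull from a PCIe x16 slot
EIGHT_PIN_W = 150        # rating of the 8-pin PCIe connector on the card
GPU_TDP_W   = 130        # e.g. an RTX 3050's board power

# Worst case: the card asks the slot (i.e. the molex) for its full allowance.
shortfall = SLOT_SPEC_W - MOLEX_12V_W
print(f"Slot may demand {SLOT_SPEC_W} W but molex supplies ~{MOLEX_12V_W:.0f} W "
      f"-> {shortfall:.0f} W over the rating")

# Friendlier case: the card leans mostly on its 8-pin connector.
min_slot_draw = max(GPU_TDP_W - EIGHT_PIN_W, 0)
print(f"If the 8-pin carries its full {EIGHT_PIN_W} W, slot draw could be "
      f"as low as {min_slot_draw} W")
# The problem, as noted above: you can't choose which rail the card favours.
```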
This is called a riser. Crypto miners used these to connect many GPUs to one motherboard. If you try to connect the 'USB 3.0' into a USB port, it will not do anything. It's basically a bridge or link cable between the motherboard and the riser. Nice video btw.
Bit of gear intended for mining rigs, got a load of them lying around, I did try this exact thing out and it was terrible 😅
Yeah it’s not ideal haha
It is also useful for laptops with bad integrated graphics; even if you lose about 50% of the card's performance, it may still be worth it. I used to game like that long ago with a previous version of this adapter, which used the ExpressCard slot to connect.
I'm actually surprised this ran as well as it did! I mean, the use cases are super limited, but hey, it worked!
You could theoretically use this on a laptop. It would be less jank.
There are NVMe ones that work at x4 speed, and they work a lot better; you can go up to a 2060 with them.
Tried this already, and it performs slower because x16 is faster. But it can still play lots of triple-A games, just not on very high settings, and it can't run every GPU on the market, because that port will bottleneck the powerful GPUs out there.
As others have mentioned, you’d probably be better off going with a more traditional eGPU setup on that. I’ve been using one for some amateur AI work and it functions quite well, especially since that kind of task isn’t as time-sensitive as gaming.
@@pokepress What kind of eGPU, if you don't mind me asking? Because there are the Chinese ones, which are compatible with almost any laptop but have limitations, or the pricey Thunderbolt eGPUs.
Yes, I did that about ten years ago, with a version that used ExpressCard to connect to the laptop. It really enabled my Core 2 Duo laptop to run a lot of games with an external graphics card.
The adapter is used for crypto mining. They basically connect tens of GPUs to the same CPU with many adapters and mine crypto.
You've been doing random gaming for years,
whether that's on the hardware side or the software side.
Keep it up with the random gaming 👍
Always good to see different potential solutions. thank you.
The title is kinda misleading. I was expecting him to connect it via USB to the PC, not to put it in a PCIe adapter and then connect that to the PC.
When running multiple GPUs, some PCIe x1 ports run off the southbridge, which is limited to previous generations of the PCIe protocol. For maximum bandwidth, use the x16 slot with the x1 adapter. Make sure you plug it in the right way - it's possible to REALLY screw this up - as it will ROAST the "USB-like" cable and could cause a fire.
Is it USB? As in, can you just connect it straight to an actual USB port on your system? Or does it just use a USB-style connector for a custom card?
Because I'd expect that the main use of an actual USB graphics card adapter would be to try and plug into a laptop, not using some small PCI-e USB card.
Definitely do NOT plug it into a regular USB port. You'll most likely fry the USB controller or possibly even the board itself due to different voltages and power draw from the card.
@@austinriddick6414 I didn't even think about the fact that it would try to power itself from the port. I guess I just assumed you threw in some sort of adapter to power it from the mains.
It's not USB, it's just using a USB cable to bring the wires to the x1 card.
These were made mainly for crypto mining, as many coins didn't need the bandwidth of x8/x16.
It's not USB. It just uses a USB cable and connectors.
@@austinriddick6414 The power isn't being drawn through USB.
I threw together an EXP GDC Beast eGPU a few years ago for a now 12-year-old all-in-one PC that I daily drive. The hardware itself is comparable to a desktop, the dock connects through mPCIe where the Wi-Fi card would be, and the performance is actually really good. I have no clue how that performance would translate to other machines because I've never transferred it, but I imagine it'd be very hit and miss for a lot of hardware. If these docks had a mobile option like ExpressCard, M.2, or mPCIe, I think they'd be the most beneficial. The beauty of the eGPU is keeping a portable machine portable or giving life to a really old machine that can take it.
That's a good idea, having that adapter around. I've had mobos with dead PCIe slots before... neat.
It's not USB; it's just using a USB cable to connect the first set of PCIe bus lanes to the external board. Technically, you could design a board that has a full PCIe socket with 4 or 8 of those USB cables and have a full external socket. Or maybe you could build a device that translates the output to wirelessly send and receive the bus data to a "Wireless External PCIe Translator-Receiver" board across the room and just sends the data as packets over Wi-Fi or something. Wi-Fi is pretty fast these days; with multiple channels and enough bandwidth, I bet it could be done...
Bearing in mind that over 5 generations PCIe speeds have increased so that an x1 connection on PCIe 5.0 is as fast as x16 on PCIe 1.0, I wouldn't be surprised to see PCIe x1 interface GPUs any time in the next decade. Especially if efficiency becomes the overriding factor.
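That comparison checks out on paper. Here's a minimal back-of-the-envelope sketch in Python, using the published per-generation transfer rates and line encodings and ignoring protocol overhead, showing that one Gen 5 lane lands almost exactly on sixteen Gen 1 lanes:

```python
# Per-lane PCIe bandwidth by generation (published figures).
# Each entry: (transfer rate in GT/s, line-encoding efficiency).
GENS = {
    1: (2.5, 8 / 10),     # 8b/10b encoding
    2: (5.0, 8 / 10),     # 8b/10b encoding
    3: (8.0, 128 / 130),  # 128b/130b encoding
    4: (16.0, 128 / 130),
    5: (32.0, 128 / 130),
}

def usable_mb_per_s(gen: int, lanes: int = 1) -> float:
    """Usable bandwidth in MB/s: GT/s * encoding efficiency / 8 bits."""
    gt_per_s, efficiency = GENS[gen]
    return gt_per_s * efficiency / 8 * 1000 * lanes

print(f"Gen1 x16: {usable_mb_per_s(1, 16):.0f} MB/s")  # ~4000 MB/s
print(f"Gen5 x1 : {usable_mb_per_s(5, 1):.0f} MB/s")   # ~3938 MB/s
```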
It's crazy what an x1 PCIe slot can do! I would have never expected 100 frames through 1 PCIe lane.
What would happen if you plugged one of these into a laptop's USB port?
It can't transmit data meant for PCIe. Might do some damage but probably nothing at all.
I used a couple of these with vertical GPU mounts to free up my x16 slots (2 wired as x8 and 1 wired as x4) for 10GbE networking and RAID on my servers. This way I have display capability through an x1 slot and enough slots for my SAS RAID, HBA, and 10GbE adapters. Windows Server (Hyper-V) doesn't need gaming capabilities, and G210 GPUs don't really need the PCIe bandwidth. These PCIe x1 adapters work great for that purpose.
I wonder what happens if you plug it into an actual USB port
You're missing something important here...
You could replace the PCIe Wi-Fi adapter on (laptop) motherboards and use the slot for a PCIe extender with an x4 connection.
Not ideal, but better than an APU's integrated graphics for sure.
Sapphire TOXIC RX 6900 XT Air Cooled
I ran a similar setup, but for an old HP Compaq Presario laptop (from c. 2009) with an Intel Core 2 Duo T6600, combining it with a Radeon HD 7770 connected to the PCIe x1 slot where the wireless card should have been.
Getting an output via the graphics card was a bit faffy due to the locked HP BIOS, but putting it to sleep, connecting, and waking up did the trick.
It actually ran Left 4 Dead 2 surprisingly well and was very playable! Didn't take any benchmarks or anything, but from what I recall the HD 7770 was a reasonably good pairing as it basically maxed out the bandwidth on the slot.
Mostly was just tinkering for a bit of fun with old hardware though, never really used it in earnest :)
very cool experiment
Honestly I'm here for the intro
To be honest, this is actually pretty cool. If you have an oldish laptop with integrated graphics and can pick up a low-TDP GPU cheap used, this isn't the worst thing I've ever seen if you wanna be able to play some old games when travelling. Easier to carry around than a full PC and much cheaper than a gaming laptop, anyway. Neat!
Best way is to find one with a ye olde ExpressCard slot, so you don't have to remove the bottom of your laptop to fit it. But used 2nd/3rd gen Ryzen and MX150 laptops are cheap enough to make this method a bit obsolete for travel.
That's actually very respectable
This is actually better performance than I expected
Love your video from Jamaica 🇯🇲
Thanks :)
I don't know if this matters to you, but honestly, in my opinion the quality of the videos has improved immensely recently. I don't know what you changed, but your videos look a lot more professional.
These risers are typically used with the 1X dongles.... I'd NEVER risk plugging the "USB3.0" cable into my ports.
I have actually tested this. These things do generally use normal USB3 cables that have the proper wiring. They could theoretically use any other type of cable as long as it has the correct number of pins, because all they're doing is using the cable as a transport medium. They're not using the USB protocol, just the cable.
@@DrewWalton even then, I still won't plug these risers directly into my ports unless the motherboard already has ports in place of PCI slots.
@@BeezyKing99 YMMV, obviously. I can't guarantee that all of them are valid USB3 cables.
Fun fact: these are actually super useful for ultra-budget rigs. On eBay, ITX 6th/7th gen Intel i5 CPU/mobo/RAM combos can be as cheap as $80, and the cheapest ones are cheap because they only have an x1 slot. So you can have that system with a 1050 Ti for under $300, with performance that still beats a stock 5600G in many cases - systems that can cost double the price. Obviously it's not the best solution, but in an increasingly expensive used market, it's still a solution.
The same 6/7th gen rigs full size or SFF size, like Dell Optiplex, are around the same price now though. They're definitely good shouts for budget gaming rigs. No need to blow $1k when you're just looking to get a good 1080p gaming experience!
That adapter is for external hardware, for use with mining rigs or as an external laptop GPU.
This actually worked far better than I thought. I might try this to use a low-end Nvidia card I've got lying around to provide PhysX support to my 'vintage' HD 7950.
This machination deserves more shenanigans... in the form of SLI/CrossFire, as a deconstructed [hanging] PC
Great video. I've seen these a load for mining rigs, but I would never have thought about actually using one as a daily.
I remember years ago in my country they used to sell GPU adapters for laptops, so you could get better graphics for whatever purpose you wanted. Never got to try one, but this gives me an insight into how it could have been.
That worked better than I would have expected it to
We're getting dangerously close to a Random Gaming in HD x Dawid Does Tech Stuff crossover episode
Smoke, dust, and similar scenes that include tons of particles require more bandwidth between the GPU and CPU than normal scenes.
EE here: the Molex cable's impedance is so low that you'd need to dump an incredible amount of current through it to catch fire - more likely whatever chips are doing power regulation would end up torched
Edit - that's why the adapter got warm
I love seeing random hardware stuff like this to extend the life of parts you have on hand. Lots of mini PCs and laptops could put something like this to use. Great video, thanks!
There's an adapter intended for stronger cards called The Beast. It's been around for years now and is the preferred way to do GPU upgrades on laptops.
Directly connect the riser to the laptop's Wi-Fi card port and you can drastically improve performance. And don't forget to use a Windows ReadyBoost SD card to pick up 5 to 10 FPS.
Thank you for this video. I really needed this because my PCIe x16 slot is faulty, and now I can use this method. It may not be perfect, but it works.
This is very good for people who have very small cases and need a big GPU for good FPS
I left a like because you did a review of GTA IV, a forgotten but great game. For real, thanks for that! 😎
The graphics card doesn't really run on USB 3.0; the actual protocol behind the USB 3.0-shaped connector is PCIe x1. It cannot be directly connected to a laptop's USB-A port!
0:23 I thought you were going to say "Cash, stranger!" after the way you said those two familiar words.
It actually isn't USB anything at all, just a cable and a PCIe x1 slot adapter that maybe provides a little 3.3 V. For mining, not a problem.
Molex should not be used long-term to power this; it's a fire hazard, as the cables can't handle the power the PCIe connector may draw, which is up to 75 W. Molex is only rated up to 4.5 A, or 54 W on the 12 V line. This probably explains the adapter getting hot and the cracking sound. Learned this with crypto mining; there are newer PCIe adapters that use PCIe power cables.
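For anyone who wants that arithmetic spelled out, here's a minimal sketch. The 4.5 A rating is the figure the comment above cites (an assumption, not a measured value), and it counts only the 12 V line:

```python
# Rough power math behind the comment above.
MOLEX_RATED_A = 4.5     # amps, per the comment (assumed rating)
RAIL_V = 12.0           # volts, the rail doing the heavy lifting
SLOT_LIMIT_W = 75.0     # watts a card may pull through a PCIe x16 slot

molex_w = MOLEX_RATED_A * RAIL_V        # 4.5 A * 12 V = 54 W
overdraw_w = SLOT_LIMIT_W - molex_w     # 21 W beyond the rating

print(f"Molex can safely supply: {molex_w:.0f} W")
print(f"Card may demand:        {SLOT_LIMIT_W:.0f} W ({overdraw_w:.0f} W over)")
```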
Now if this had an AGP 8x adapter, you could plug a slightly more modern card into an old system and maybe deal with some questionable drivers. Could be fun to fiddle around with.
At first I thought this was a way to connect a GPU over a USB 3 bus. That would actually be interesting for use in Plex servers on small form factor PCs without a PCIe slot.
It runs much better than I expected.
You are so wrong about this setup.
The USB-style link does not have enough bandwidth to transmit the data for the full graphics spec. However, it has more than enough bandwidth for... *drumroll*... mining.
That's right, these were made for MINING rigs... not for adding a GPU to a smaller form factor.
The most bandwidth you get out of that is x1-x2... tops.
That USB cable is only connecting the PCIe x1 signal as a dumb link, though
This is only possible because PCIe is delay- and latency-tolerant. This is the same reason you can plug an x16 card into an x1 slot. The Molex connector is just there to send up to 75 W to the slot.
2:00 what slick keyboard is that? Always been a fan of low profile boards and that one has a numpad!
Logitech k220 :)
@@RandomGaminginHD Thank you!
Nvidia, hear me out... RTX 3030 with an option to use both PCIe x1 and x16 slots for:
1) light modern gaming;
2) display adapters for companies that want the hardware with the longest support period possible;
3) access to Tensor cores as a second GPU for content creators, deep learning enthusiasts, and other tech people.
That was strange and weirdly entertaining. I liked the video.
Can't wait for the USB 4 version.