I had lots of fun back in the CrossFire days with my 2x XFX R570s; I never had the money for a dual card! After years of gaming, my two 570s spent a good year in a mining rig with various other cards, and both still work today!
So cool! Hopefully KrisFix will take it and find the issue. I really want to see a follow-up video. If only manufacturers would still make cards like this.
OK, here's something I found messing with 2 AMD cards: ReBAR and Above 4G Decoding must be disabled (and yes, I know disabling Above 4G also disables ReBAR). I noticed in GPU-Z that ReBAR was enabled for the second GPU. I doubt we'll see another video on this card, but I think it's good for the community to know that turning on Above 4G Decoding causes problems with dual GPUs on Windows 10 and 11 in 3D applications (I don't know why yet). It ran OK for machine learning, but I got some weird problems with PCIe lanes and some things not working properly, like the numpad on my keyboard.
The last PCIe contact pad on the back of the card is a ground line. The one next to it is part of the 16th PCIe lane. Since the first GPU performs as expected, it's unlikely that the lane is damaged or shorted, and missing one ground (of many) won't matter much. Worst case, if lane 16 was taken out, the main link should fall back to x8, but the links between the GPUs and the PLX chip would be fine, and both GPUs should still work identically.
If it isn't missing any components, it's very likely the multiplexer bridge chip either failing or desoldering itself. I had 2 HD 5970 cards: the first one didn't work at all, and after I baked it in the oven it worked for about a month with CrossFire, then died again. The second works with CrossFire, but on the first run of the PC, until it actually warms up, it shows artifacts on screen and freezes the image for seconds, or even freezes the PC completely. If you restart, both GPUs work perfectly. Really weird beasts but pretty cool cards. Cheers from Greece! Jim.
I initially thought they would just be driving the LEDs off the VRM controller's PWM outputs, but it looks like they're instead running it as some kind of bar graph, which is interesting. I can't see a multi-channel comparator package anywhere but there are exactly ten SOT23 packages on the back near the power inputs, which may well be BJTs/FETs set up to switch on sequentially depending on the Vcore line's current sense voltage. Bit of a gimmick but kinda cool for that era.
Yeah, AMD did something to the drivers a few years ago. I had an R9 295X2 that worked perfectly fine, and after needing to reinstall Windows no driver would allow both GPUs to work together. It happened around the first generation where AMD had no CrossFire-supported cards and abandoned the technology. Their GPUs from that and previous eras are built so well, but the software support made me swear never to use AMD GPUs again. SLI, on the other hand, still works fine, and you can even mix different GPUs with that workaround that disables the forced restrictions. I think it was called DifferentSLIAuto.
That "DifferentSLIAuto" only works up to driver 427.89 on Nvidia; it doesn't work on anything past the GTX series, from the RTX 2070 Super on up to the RTX 3090 Ti, even though those have SLI connectors.
I definitely enjoyed Remnant 2! Currently holding the world record for NG+ Apocalypse (only because nobody wants to compete in the harder difficulties). Awesome card, curious to see if it can be repaired.
As someone that ran QuadFire with 2x R9 295x2 cards, I can say that dual GPU cards were pretty freakin' awesome back in the day. I enjoyed a 3 monitor setup with the 4 GPUs so much in the games that supported it. Seeing cards like these being looked at again is fun. Wish they'd make more, but I'm sure the money required to develop them vs the limited sales wouldn't be worth it now - I imagine it would cost a lot of money to design a 7900XT x2 GPU, and because it would likely cost upwards of $1700 I would guess, it wouldn't sell very well, especially since the support in games would be limited unless it was all driver-side Crossfire.
Haha, yeah, I still run my 2x Radeon HD 5870 setup. A supported game like Dirt, Crysis, Tomb Raider, or Mad Max gets almost double the framerate. Combined with Eyefinity and 3 CRTs it was amazing. Too bad no Unreal Engine games support it any longer. Most Unity3D games work when launched with "-window-mode exclusive", but scaling isn't always good, or it runs worse.
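For anyone who wants to try that launch trick: Unity standalone players accept display-mode flags on the command line. Here's a tiny launcher sketch; the game path and script are hypothetical examples, so substitute your own executable:

```shell
# Build the launch command for a Unity standalone game in exclusive
# fullscreen mode, which AFR/CrossFire needs (borderless/windowed
# rendering usually breaks alternate-frame rendering).
build_unity_cmd() {
  # $1 = path to the game executable (made-up example below)
  echo "$1 -window-mode exclusive -screen-fullscreen 1"
}

# Print the command you'd run for a hypothetical game
build_unity_cmd "./MyUnityGame.exe"
```

From there you can paste the printed command into a shortcut or launcher options.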
This brings me back to the days where I first had SLI 670's and then crossfire 290's. All water-cooled with full coverage blocks from EK. The amount of heat that system put out was impressive 🤦♂
I always wanted one of these back in the day. Eventually I was able to get my hands on a Radeon Pro Duo Fiji card, which also draws a significant amount of power.
The last Nvidia dual-GPU card was the behemoth GTX 690. I was looking forward to a future of these types of cards; they seemed like exotic cars to me.
I miss the time of CrossFire and SLI; it was quite something to see one beast of a PC build after another. Now it's just "oh, I'll slap a 4090 in there", boom, beast PC done.
I have been trying to get my hands on one of these for years; how the heck did you get it? Also, you should be able to run the second GPU core standalone to verify whether the core/memory is operational. Windows supports selecting your GPU accelerator per application; it's a default feature on laptops but can be enabled on desktop via regedit, or you can do it through the Crimson software suite.
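For anyone curious about the regedit route: on desktop Windows the per-app GPU choice lives under a DirectX key in HKCU. This sketch just prints the reg.exe command to run in a Windows shell; the key name is from memory and the game path is a made-up example, so double-check both on your own system first:

```shell
# Print the reg.exe command that pins an app to a GPU preference
# (1 = power saving, 2 = high performance). The printed command is
# meant to be run on Windows; verify the key name before relying on it.
gpu_pref_cmd() {
  # $1 = full path to the app's .exe (hypothetical example below)
  # $2 = preference number (1 or 2)
  printf 'reg add "HKCU\\Software\\Microsoft\\DirectX\\UserGpuPreferences" /v "%s" /t REG_SZ /d "GpuPreference=%s;" /f\n' "$1" "$2"
}

gpu_pref_cmd 'C:\Games\game.exe' 2
```

This is the same setting the Settings > Graphics page writes on laptops, just set by hand.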
I bought a Radeon Pro Duo quite some time back for collector reasons. It's a dual Fury Nano (Fiji, 4 GB HBM memory). I stumbled upon the exact same issue: no matter what I tried, I never got them working together as a CrossFire duo.
I always liked these types of cards, super unique and inventive. I think the reason SLI went dead is that you could get two 150-200 dollar cards and have them perform better than a single high-end card. At least that is my understanding of it. I would love to find a card like this.
Haha, I almost bought a Sapphire 4870 X2 in 2008 but instead went with a Diamond one. I ran that card from 2008 to 2014, when I repasted it, then ran it again from 2014 to 2019, when I finally retired that machine. It played ANYTHING I threw at it at max (though I never really threw high-end titles at it), and it ran super well on my XP Pro x64 build with 8 GB DDR2 and an E8400 OC'd to 3.6 GHz on stock voltage.
G'day Shiek, Roman & Makita. WOW, what an AWESOME GPU! PowerColor's Red Devil, along with Sapphire's Toxic & Vapor-X, have always been very good-looking designs. I didn't know these 2x R9s existed, so was this the last single-card CrossFire GPU🤔? Sad it's having problems; I hope Kris can work some "black magic" on it.
I had an AMD HD 4850 dual-GPU card once upon a time. I experienced all the same problems shown in this video. It was a beast though, when it was running properly...
I had 2 separate 290X's. Bought them used for 300€ each. When it worked (not all games supported CrossFire) I managed 60 fps at 4K in some games, on high but not max settings. But the stuttering was pretty annoying when running dual GPUs.
I was running HD 3870 X2s in a pair, then moved up to a pair of HD 4870 X2s. I could see my lights dim really badly. I was running an 840 EE clocked over 4 GHz, 4 GPUs clocked equal, running Talladega C-class trucks at over 500 fps on maxed settings in iRacing back in the day... Good thing the power bill was included in the rent 😊... the good ol' days...
CrossFire, just like the old Nvidia 3D shutter glasses, works only in exclusive fullscreen mode. And you also need compatible PCIe bus settings and drivers. I would try a different mainboard or mess with PCIe settings and timings.
I had an HD 5900-series dual GPU; the thing was a crunching beast when you could get both GPUs to load. The frame tearing and microstutter were terrible. Mine "only" had three 8-pins but was a massive heat generator.
I wonder if the mobo STILL has to support CrossFire, even with a single-slot dual card? Also, Sniper Elite 4 was one of the last CrossFire games with great scaling! I still have a GTX 590 and an HD 6990. EDIT: just a thought, the dodgy PCIe pin might cause issues; stick the slot into x8 mode.
I know the Threadripper boards (TRX40/50, WRX80/90) all still mention support for all of it. No mid-range boards (or what people call consumer boards now) support SLI at all, since none of them are certified for it, and certification is required for SLI. Not so for mGPU on AMD's RDNA lineup: all of them support mGPU, and most AM4/AM5 boards still support CrossFire regardless of certification.
(Looks at the two HD 5970s on the wall) .. nothing new :P I had 2x 5970s and 2x 5870s in my benchmark rig; I'm on 3DMark with it (EVGA SR-2 with dual X5670s, I STILL have it). The motherboard box is as big as my Thermaltake CTE 570 case! :P They got replaced with 4x GTX 570s with EK waterblocks, just for 3DMark :P
The crashing problem is due to the very new driver. Since the card is no longer supported, it runs in a legacy mode. Just find a driver close to the card's release and your problem will be solved. P.S. Finding the appropriate old AMD driver is like winning the lottery 5 times in a row...
Yeah, it's definitely a driver issue, not a hardware issue, unless there's a missing cap on the PCB causing instability. Getting two GPUs to work in tandem on a single PCB requires special driver magic to get them to cooperate, and AMD obviously abandoned the technology if their newer drivers are any indication. He needs to go back to the very original legacy drivers, and maybe start with version 2 or 3 of the first driver update (AMD will have implemented fixes and improvements in the few versions after the original).
@@luminatrixfanfiction I would say these dual GPUs were built like tanks. I've got lots of 5970s still working on a project after more than 14 years, and they never missed a beat. Even those used to mine bitcoin were damaged by heat/fan issues, not by caps blowing up. So it should be a driver problem. He said he spent a couple of days working on the driver problem, but it's the vanity of the newbie to want something working with the "best/latest" drivers, without having spent hours upon hours resolving AMD driver issues... And I'm not even mentioning trying to run multi-GPU stuff on Linux...
I miss SLI. I used it on DiRT 2 and the original Borderlands with dual 970s. Yes, I know those cards were gimped by Nvidia, but they ran great for the games I was playing.
As far as I know, when the main load on a video card is AI work, it balances between chips much better than standard rendering loads (games, video, etc.), and as I understand it, that was the main issue for these cards: SLI, CrossFire, and two-chip designs were very hard to get working with equal loads. But now that AI is part of rendering (DLAA, DLSS, RTX), we could check whether it stabilizes the load between chips. We'd need to test it; if it works the way I think, it would give a huge performance increase.
Was this the successor to the R9 295X2, or a completely different, PowerColor-only design that they Frankensteined together to make an unofficial dual GPU?
I had a Diamond Radeon HD 4870 X2, and man, do I miss that card; at the time, in 2008, it was the top of the GPU pile... Now I am running a 3060 Ti, which is decidedly midrange, haha.
Seeing old cards like this reinforces my belief that current cards like a 4090 are where the equivalent of 2 or 3 GPUs has all been combined onto a single giant card.
They kind of are if you look at the die shots. AD102 has 2 big blocks of SMs on each side of a crossbar and 2 sets of 3x64-bit GDDR controllers. Navi 31 does this as well, but in more smaller groups. ARC is a more homogenized design, but this is probably because the number of Xe cores isn't large enough, and the cores themselves too large, to start forming those higher-level patterns yet. Battlemage will probably look more like the others.
Nuts back then, nuts now. And nuts that, according to TechPowerUp's relative performance numbers, my laptop's iGPU (780M, with the RX 470/570 as reference) theoretically has ~80% of the performance of this thing while consuming way less than 10% of the power.
If it locks your entire PC, peripherals included, it might be a GPU die problem. If the mouse can still react even when it freezes, it might be an electrical circuitry problem.
Could be the PLX chip; they were known to have problems in cards like this because they consumed about 30 W on their own and have paste TIM under the IHS (so two layers of TIM in total) instead of solder. I thought this was fixed on the newer cards, i.e. the 390s, but who knows. Good chance you're right about the die of GPU 2 being the problem: you can see when he started testing that GPU #2 did get to about 30% load and the frequency tried jumping up from 300 MHz. A combination of PowerColor's lack of quality back then and poor lead-free soldering unable to survive the thermal cycling of these crazy-hot-running cards. The GPU at the front is usually the problem on such a long card; the separated cooler and flimsy backplate cause a lot of flex in the PCB.
As you already saw, @der8auer, everything about its thermal design has been tampered with, and not in a good way. Since it's a dual-GPU 600 W card, I suspect it may have broken traces due to heat expansion. Maybe try removing the cooler and baking the GPU to reflow all the traces. This happened twice with my old CrossFired 4870 X2s, which drew about 750 watts. Both cards got heat-damaged due to old paste, and both worked after a reflow.
I feel like dual-GPU solutions, be it on the same card or paired cards, have always been problematic, and that's probably also why they stopped doing it.
I had the HD 7990, which was two 7970s slapped on a board with 6GB VRAM. It was a beast! When it died I replaced it with a GTX 1060, and that was an upgrade lol.
My VIA UD8 has something like that PLX chip between the 2x S3 5400EW chips. The VIA UD8 is only PCIe 2.0 x8, so... umm... 2 GPUs for 8 lanes. 😅 I sold off most of my S3 collection. I probably should have kept one of the earlier UD8 revs I had, because the one I still have needs a UEFI motherboard. Anyway, I didn't know a controller like that could use all its PCIe lanes.
Also a quick thought, Der8auer: try the card on a less security-patched board or an older version of Windows, such as 7? The PLX chip might be triggering security issues due to the 'bypassing', or the newer kernel with the older driver simply isn't stable! AMD started disabling CrossFire for me even back in the early 2010s; the first game to go was StarCraft II, and FPS slowly dropped. As I said in a previous comment, I ended up just using a single card and put the second one in another machine so friends could play games. And all but one of my HD 5870 cards died, as well as a brand-new R9 290X from MSI (they also managed to crush the fans and kill the bearings by packing it too tightly in the box straight from the factory! Two weeks later the price also halved... oh, I was pissed. Hence never using AMD for GPUs again, nor CPUs since 2010).
Dual-GPU cards are cool. I'm trying to get a functional Titan Z for my collection; I have a 690 right now, along with a few other Titan cards, but I'd like to get a Z and a 295X2 dual as well.
I used the Radeon HD 6990 playing Battlefield: Bad Company 2, and I do remember it using both GPUs, though not running very well, because the AMD drivers and CrossFire profiles were terrible.
Dual-GPU cards like this work on literally any motherboard, as the PCIe lanes necessary for multi-GPU are handled on the card itself. Multi-card SLI needs the motherboard to have a license from Nvidia. Multi-card CrossFire just requires two PCIe slots that are at least x4 (it'll even run through the chipset).
I miss these contraptions. They made following PC tech more interesting, as unique cards like this would sometimes pop up. The Asus Mars comes to mind, and even the Titan Z. Not that those were a smart buy, but they were a lot cooler than a bog-standard single-GPU card.
I miss dual-GPU cards... out of my price range, but it was cool that they existed, and seeing some with two LN2 pots on them gave me joy ;)
I had an HD4850X2 it was so silly but awesome.
@@MeakerSE I have 2 dual GPUs in my collection: an HD 6990 (in rough shape) and a GTX 295 (in a little better shape). They're hard to get in Poland, but sometimes you can get lucky and buy them for 20 USD ;)
I never had a dual-chip card, but I did have SLI 980 Tis (I upgraded on a whim after my PSU blew up and took my single 980 with it) and used them with a 1 kW PSU. Gaming in summer was a test of how much I could tolerate 700 W worth of heat being dumped into the room. Swapping them out for a 3080 was well overdue, and undervolting it basically halved my power usage, with a significantly lower heat load.
Those days are over, same as those dual-chip units (I always wanted a 590, and those SLI 980 Tis almost made up for never getting one). I suppose that shows just how much things have improved since then.
I wouldn't mind Jensen having a crack at a dual-GPU 5090; surely with all their tricks, DLSS, frame gen, etc., their engineers could make something that can run Cities: Skylines 2 at 30 fps at 4K.
It's really sad, because they developed explicit multi-GPU as part of DirectX 12, so a dual-chip GPU would be worthwhile if games supported it, and without all the micro-stutter that usually existed with CrossFire.
I had the 7990; that thing was an absolute monster card.
Sort of looks like there's a missing tantalum cap on the VREG of the first die: of the top 6 VRM phases, the 3rd one is missing its cap.
I saw that too. Right around the section where he says the GPUs are exact copies turned 90 degrees, you can see on the other GPU that all those caps are present, but on this GPU that specific one is missing; you can even see a little of the old solder left from where that cap got ripped off cleanly! Since the card fails on one of the GPUs, that'd be my guess as to why, too.
Thumbs-up @TA-mx3zm 's comment, people; get it to the top of the comments so that @Der8auer sees this rather than having to send it off to Chrisfix, as it'll only cost a few $ then and make for a nice repair video too!
One missing phase would certainly make it unstable.
I noticed that as well. I googled pictures of the disassembled card and, from what I can find, yes, it's missing a cap. I actually don't think it was ripped off the card, or it would have destroyed the pads; it looks like it was removed, and no one cleaned up the pads with an iron and copper wick. Maybe they removed that cap to disable that phase because a short was preventing power-on or boot. Or the GPU chip had degraded, wasn't stable, and was completely disabled by removing the cap. Either way, that likely caused a power-on or boot issue regardless of driver settings, so the only solution was to disable that chip in hardware, not software, to get the card stable enough to boot. But I'd replace the cap to see what happens and to troubleshoot the actual problem instead of just disabling half the card. Kinda sucks selling a card as working when you know it's not and you hid the defect by removing a part to make it boot stable; that's something the Chinese AliExpress sellers would do to con people out of cash 💰. The card is pretty good and capable as a single-GPU card, but I want to see it running properly.
That's how it is stock; it's not missing a cap. Look up the PCB.
@@h4x0rm1k35 ChrisFix might not be able to do it during the move; send it to dosdude1, it would be a cakewalk for him.
I had the same issue with a 295X2. At first I thought the second GPU was at fault, but the problem was in the PLX bridge (it has its own thermal interface under the cap; it looks like a PCM, but it had dried to concrete). I was able to get another PLX from a workstation GPU of the same era, fitted an indium sheet between the chip glass and its heatspreader, and replaced the dead PLX bridge with the one I'd prepared. All was fine except the faulty Adrenalin drivers at that point, so I used Radeon PRO enterprise drivers (for workstation cards); they were muuuch more stable back then (yep, they contain drivers for gaming cards too, but no control panel).
Things you never think about, because normally they don't even exist.
The same, pads did that ?? no blocks ??
don't understand, same issues ???
Wow! I suck at soldering PCBs. Of course, I am old and shaky now, so I just gave up soldering anything.
I hope the GPU can be revived, it is always cool to see these monstrosities of GPUs compared to cards that we have now. Bringing all the cool HW stuff, as always, keep it up!
I remember when this came out I wanted it so bad! Powercolor has made such cool dual gpu cards
My first dual GPU card was the Powercolor 3870 X2. That card was amazing! Not only did everything get accelerated by the dual GPU being on one card but it lasted for years until a bad driver burned out the second GPU. Otherwise, it just kept on going long after other peoples cards had died. Yes, it was expensive but the performance and longevity actually made it cheaper than incremental upgrades. I would absolutely love another dual GPU card!
@@DelticEngine I ran the Diamond 4870 X2 that I used to have from 2008 to 2019; I had to repaste it in 2014 as it filled with dust and started overheating. Part of me misses that card and wishes I had kept it as a memento rather than sending it to e-waste.
@@DelticEngine "The Radeon/AMD card was amazing until it catastrophically failed" many such cases
@@evileyeball Interesting you should mention keeping it as a memento as I kept mine, and the box it came in! I still have it to this day.
My own experience is that I didn't have problems until I did a totally fresh driver install which lost something that had kept the card working properly. Up until that time I had been extracting the drivers and using the .inf file to update the driver. This worked great and the card stayed happy.
After the completely fresh install it was like the driver didn't see the card properly and my poor card suffered and died because of the lousy driver. Ever since then I have had a distrust of AMD's ability to write drivers that work properly. At the time I wasn't experienced enough, unfortunately, and didn't know which drivers were okay and which weren't. The only thing that maybe would have worked would be to go with the driver that came with the card and incrementally update as I used to in the hope it would fix things.
Other than that, I (much) later bought a 4870X2 HOT edition, which so far I've never used because the situation changed.
I'm now getting familiar with Linux, Fedora 38 in my case, and watching this and reading the comments has started me thinking whether these old cards would work properly under Linux and maybe even play some (relatively) modern titles. The Vulkan API would likely use the CPU to fill in various missing features of these rather old GPUs.
@@CyberneticArgumentCreator okay team green 😂
Had no idea this existed. What a unit of a GPU at the time. And I just think it's funny that two eight pins with 16ga wire is rated to 720 watts and would be way more than enough to power this if it weren't for stupidly conservative sensing specifications.
Honestly, we could have done a 4-pin connector. Two fat lugs with thick 8awg wire, and a small twisted pair for remote-sensing. Or maybe 10awg, and the stupidly high power cards can use two.
Or use the existing EPS12V connectors that are normally meant to feed the CPU. That is what Nvidia uses for their enterprise cards.
You can tell how fucked the market is now, considering you could get 2 GPUs on one card for the price of a mid-to-high-end card today. Imagine if they had two 4090s on a single PCB; it would cost like £4k+. The GTX 690 was around £700 back when it released, and I thought that was mental at the time, but now that will maybe get you a 70-series card. Nice cat, by the way.
Yup. I had a titan card. The ones that launched at £800
You get the same wattage now from one card as two cards back then (some RTX 4090s go up to 600 watts), & they take up four slots now too. I see no real improvement, only software gimmicks to trick you.
Wow, it really is a dense card. The engineers must have had a fun time designing this.
Having done board design in the past, this thing is a work of art. There looks to be 32 memory ICs on that card, which is absolutely bonkers. On top of that, a pair of GPU dies, a PLX chip, and the PCIE routing to make them work together. Not to mention how you go about routing power from a gigantic monolith of 8-pins like that. I'm sure the power planes of that PCB look incredible.
My dad had this card in his rig (paired with an FX 9590) and it basically turned his office into a sauna when he played games during the summer. He had gotten this to replace his previous Xfire 290s, to pair with his 4K monitor. It's a pretty solid card and, as far as I know, he never had any problems with it. Definitely glad that I talked him into buying a beefy PSU when he built his rig.
I wish they'd bring back dual GPU cards, even if only for a non-gaming market. Some of the coolest pieces of tech.
You know a dual AD103 4090 , and a dual AD102 4090ti makes perfect sense ;)
@@tourmaline07 Good luck designing a cooler for that beast, more heat output compared to some air conditioner units 😂
7950gx2 disagrees. That thing was garbage
All of AMD's RDNA line already supports mGPU (you only lose ReBAR when you turn it on)
Tell developers to quit being bitches & start implementing the stuff we need. An RTX 4090 can't even give good frame rates at 2160p/4K ultra settings without software gimmicks. By the way, how is frame generation any different from AFR? Other than using machine learning to create a picture that you never really interact with, it's basically rendering the pictures the same way, just every other frame.
@@kevinerbs2778 Go try it. I'm not sure it's possible with the 7000 series.
I really wish we had the 4x8 power connectors. They would look so good with custom cables
same here
I'd still prefer a single connector. But not the 12vhpwr connector. And place it on the side instead of the "middle" of the card.
A curtain of strands
Not great for systems with limited space. Sometimes it's possible to fit big GPUs in small cases, but side cables would make it more difficult.
You could run the cables so it looked like there was an explosion coming out of the card. 🤪
I had lots of fun back in the CrossFire days with my 2x XFX RX 570s, never had the money for a dual card! After years of gaming, my two 570s spent a good year in a mining rig with various other cards, and both still work today!
So cool! Hopefully KrisFix will take it and find the issue. I really want to see a follow-up video. If only manufacturers would still make newer cards like this.
OK, here is something I found messing with 2 AMD cards: ReBAR and Above 4G Decode must be disabled (and yes, I know disabling Above 4G disables ReBAR). I've noticed on GPU-Z that ReBAR is enabled for the second GPU.
I doubt we will see another video for this card, but I think it's good for the community to know that turning on Above 4G Decode causes some problems with dual GPU on Windows 10 and 11 for 3D applications (don't know why yet). It runs OK for machine learning, BUT I got some weird problems with PCIe lanes and some stuff not working properly, like the numpad on my keyboard.
I LOVED my Devil 13. A true enthusiast card. I even bought a second and had them in Crossfire for the benchmarks.
Could it be that one damaged PCIE pin causing the problems? seems unlikely but...
I think it's VERY likely to be the prob...but there is only ONE WAY to find out lol
The last PCIe contact pad on the back of the card is a ground line. The one next to it is part of the 16th PCIe lane. Since the first GPU performs as expected, it's unlikely that the lane is damaged or shorted, and missing one ground (of many) won't matter much. Worst case, if lane 16 was taken out, the main link should fall back to x8, but the links between the GPUs and the PLX chip would be fine, and both GPUs should still work identically.
It's the missing cap off the 6-phase VRM... likely because of a degraded GPU chip that was no longer stable, or something was wrong with that VRM phase.
If it isn't missing any components, it's very likely the multiplexer bridge chip either failing or desoldering itself.
I had 2 HD 5970 cards. The first one did not work at all; I baked it in the oven and it worked for about a month with CrossFire working, then died again. The second one works with CrossFire, but on the first run of the PC, until it actually warms up, it shows artifacts on screen and freezes the image for seconds, or even freezes the PC completely. If you restart, it works perfectly on both GPUs.
Really weird beasts but pretty cool cards, cheers from Greece! Jim.
I initially thought they would just be driving the LEDs off the VRM controller's PWM outputs, but it looks like they're instead running it as some kind of bar graph, which is interesting. I can't see a multi-channel comparator package anywhere but there are exactly ten SOT23 packages on the back near the power inputs, which may well be BJTs/FETs set up to switch on sequentially depending on the Vcore line's current sense voltage. Bit of a gimmick but kinda cool for that era.
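If that's the setup, the behavior would be something like this toy model (pure speculation on my part - the 10-stage count comes from counting the SOT23 packages, and the millivolt thresholds are completely made up for illustration):

```python
# Toy model of a current-sense LED bar graph (speculative - thresholds
# are invented; only the "one more LED per threshold crossed" idea is
# what the comment above describes).
def leds_lit(sense_mv, thresholds_mv=(10, 20, 30, 40, 50, 60, 70, 80, 90, 100)):
    """Each comparator/transistor stage switches on once the Vcore
    current-sense voltage crosses its threshold, lighting one more LED."""
    return sum(1 for t in thresholds_mv if sense_mv >= t)

for mv in (0, 35, 100):
    print(f"{mv} mV -> {leds_lit(mv)} LEDs")
```

So at idle nothing lights, and as board power (and thus the sense voltage) climbs, the bar fills up stage by stage.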
Yeah, AMD did something to the drivers a few years ago. I had an R9 295X2 that worked perfectly fine, and after needing to reinstall Windows, no driver would allow both GPUs to work together. It happened around the first generation where AMD had no CrossFire-supported cards and abandoned the technology. Their GPUs from that and previous eras are built so well, but the software support made me swear never to use AMD GPUs again. SLI, on the other hand, still works fine, and you can even use different GPUs with that workaround that disables the forced restrictions. I think it was called Different SLI Auto.
Dude im sure theres a driver on linux that would work perfect. Don't give up on AMD yet😢
That "Different SLI Auto" only works up to driver 427.89 on Nvidia. It doesn't work on anything past the GTX series, from the RTX 2070 Super on up to the RTX 3090 Ti, even though those have SLI connectors.
Thanks for this very detailed and in-depth video.
Great video! And that kitty at 8 minutes 😍
I definitely enjoyed Remnant 2! Currently holding the World Record for NG+ Apocalypse (only because nobody wants to compete in the harder difficulties ): )
Awesome card, curious to see if it can be repaired.
As someone that ran QuadFire with 2x R9 295x2 cards, I can say that dual GPU cards were pretty freakin' awesome back in the day. I enjoyed a 3 monitor setup with the 4 GPUs so much in the games that supported it.
Seeing cards like these being looked at again is fun. Wish they'd make more, but I'm sure the money required to develop them vs the limited sales wouldn't be worth it now - I imagine it would cost a lot of money to design a 7900XT x2 GPU, and because it would likely cost upwards of $1700 I would guess, it wouldn't sell very well, especially since the support in games would be limited unless it was all driver-side Crossfire.
Haha, yeah I still run my 2x Radeon 5870HD setup. A supported game like Dirt, Crysis, Tomb Raider or Mad Max gets almost double the framerate. Combined with Eyefinity and 3 CRTs it was amazing. Too bad no Unreal Engine games support it any longer. Most Unity3d games work when launched with "-window-mode exclusive". But scaling isn't always good. Or it runs worse.
It was missing a cap in the pictures, 8:22, top middle. There is a solder blob there, so I don't think it's a blank unused pad. Maybe?
I always loved the dual-GPU cards; I still have my GTX 690 lying around.
ASUS MARS, lol
I need to find and add this gpu to my collection of past AMD/ATI flagships
This brings me back to the days where I first had SLI 670's and then crossfire 290's. All water-cooled with full coverage blocks from EK. The amount of heat that system put out was impressive 🤦♂
I remember my gainward radeon 4870x2 golden sample !!!
It was a beast!
And the design fantastic
I always wanted one of these back in the day. Eventually I was able to get my hands on a Radeon Pro Duo Fiji card. Which will use a significant amount of power as well.
The last Nvidia dual-GPU card was the behemoth GTX 690. I was looking forward to the future of these types of cards; they seemed like exotic cars to me.
These dual GPUs really inspired people to build, build, BUILD!
Loved this video ❤ I hope to get my hands on one
I missed the time of crossfire and SLi it was such a time to see a beast of a PC build one after another. now it's just oh I'll slap a 4090 in there boom beast PC done.
I think dual GPUs looked really awesome, used to run dual 290's and then furys.
I love these old card deep dives
I have been trying to get my hands on one of these for years, how the heck did you get it? Also you should have been able to run the second GPU core standalone to verify if core / mem is operational. Windows supports selecting your GPU accelerator based on application in use. Default feature on laptops but can be enabled on desktop via regedit. Or you can do it through the Crimson software suite.
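For anyone curious about that regedit route: if I remember right, Windows keeps the per-app GPU preference under `HKCU\Software\Microsoft\DirectX\UserGpuPreferences`, one string value per exe path. A rough sketch (the key path and value format are from memory, so treat them as assumptions and double-check before writing to your registry):

```python
# Sketch of the per-app GPU preference tweak mentioned above.
# ASSUMED: values live in HKCU\Software\Microsoft\DirectX\UserGpuPreferences
# as "GpuPreference=N;" where 0 = auto, 1 = power saving, 2 = high perf.
import sys

def gpu_pref_value(pref: int) -> str:
    """Build the registry value string for a given preference level."""
    return f"GpuPreference={pref};"

def set_app_gpu_pref(exe_path: str, pref: int = 2) -> None:
    """Write the preference for one executable (Windows only)."""
    if sys.platform != "win32":
        raise OSError("This registry tweak only applies on Windows")
    import winreg  # stdlib, available only on Windows
    key = winreg.CreateKey(
        winreg.HKEY_CURRENT_USER,
        r"Software\Microsoft\DirectX\UserGpuPreferences",
    )
    winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ, gpu_pref_value(pref))
    winreg.CloseKey(key)

print(gpu_pref_value(2))
```

On laptops the same thing is exposed in Settings > Display > Graphics; the registry route just makes it available on desktops too.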
Re: The amount of thermal paste - was it Steve (GN) that was the previous owner?
I love Powercolor so much. This is the daddy to the Red Devil Ultimate.
I bought a Radeon Pro Duo quite some time back for collector reasons. It's a dual Fury Nano (Fiji, 4GB HBM memory).
I stumbled upon the exact same issue. No matter what I tried, I never got them working together as a CrossFire duo.
I was aware of the 295 X2. But not aware of these 390 variants. That's pretty cool.
@der8auer I'm not sure if you noticed, but I saw that one of the components of the left GPU's VRM was missing. Maybe check that out.
I think its impressive how that PLX chip was capable of keeping up with having both GPUs talking to it along with all the memory chips.
I found a quick way to test multi gpu setups by using blender and doing the multi render method by frame sharing. Haven't done this in years though.
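The frame-sharing idea is basically just splitting the animation's frame range and handing each chunk to a separate Blender instance, each one configured (in its own preferences, not shown here - that part is an assumption about your setup) to render on a different GPU. A minimal sketch using Blender's real CLI flags (`-b`, `-s`, `-e`, `-a`):

```python
# Split a frame range across N headless Blender instances ("frame
# sharing"). Device pinning per instance is assumed to be done in each
# instance's Cycles preferences and is not handled here.
def split_frames(start, end, workers):
    """Divide [start, end] into near-equal contiguous chunks."""
    total = end - start + 1
    base, extra = divmod(total, workers)
    chunks, cur = [], start
    for i in range(workers):
        size = base + (1 if i < extra else 0)
        chunks.append((cur, cur + size - 1))
        cur += size
    return chunks

def render_commands(blend_file, start, end, workers=2):
    # "blender -b file.blend -s START -e END -a" renders that frame range
    return [
        ["blender", "-b", blend_file, "-s", str(s), "-e", str(e), "-a"]
        for s, e in split_frames(start, end, workers)
    ]

for cmd in render_commands("scene.blend", 1, 100, 2):
    print(" ".join(cmd))
```

Launch each command in parallel (e.g. via `subprocess.Popen`) and both GPUs churn through their half of the frames independently - no CrossFire profile needed.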
I always liked these types of cards, super unique and inventive. I think the reason SLI died is that you could get two $150-200 cards and have them perform better than a single high-end card. At least that's my understanding of it.
I would love to find a card like this.
The reason SLI died for gaming was frame pacing: at high frame rates it got too difficult to coordinate the cards, and they would stutter horrifically.
those individual blocks are not only a genius move but look bad*ss as heck :O
Used to love these dual GPUs. Had a Sapphire 4870X2 in an SFF PC case; one of the dies fused with the heatsink 😅
7950GX2 was my dream GPU
Haha, I almost bought a Sapphire 4870X2 in 2008 but instead went with a Diamond one.
I ran that card from 2008 to 2014, when I repasted it, then ran it some more from 2014 to 2019, when I finally retired that machine.
It played ANYTHING I threw at it at max (I didn't really throw high-end titles at it), and it ran super well on my XP Pro x64 build with 8 GB DDR2 and an E8400 OC'd to 3.6 GHz on stock voltage.
Your cat "Shiek" seems to like sleeping where you work. Now with eyes wide open! He seems to like Devil-type cards on the table...
what a beautiful piece of engineering, yall remember the Lucid Hydra chip?
G'day Shiek, Roman & Makita,
WOW! What an AWESOME GPU! PowerColor's Red Devil, along with Sapphire's Toxic & Vapor-X, have always been very good-looking designs.
I didn't know these 2x R9s existed, so was this the last single-card CrossFire GPU 🤔?
Sad that it's having problems; I hope Kris can work some "black magic" on it.
Those fans are awesome! I would like to have them so much!
I had an HD 4850 X2 dual-GPU card once upon a time. I experienced all the same problems shown in this video. It was a beast, though, when it was running properly...
Can't wait for the fix! Hope that one gets a video too.
I had 2 separate 290Xs. Bought them in used condition for 300€ each. When it worked (not all games supported CrossFire), I managed 60 fps at 4K in some games, high settings but not max. But the stuttering was pretty annoying when running dual GPUs.
I was running two HD 3870X2s, then moved up to two HD 4870X2s. I could see my lights dim really badly. I was running an 840 EE clocked over 4 GHz, 4 GPUs clocked equally, running Talladega C-class trucks at over 500 fps on maxed settings in iRacing back in the day...
Good thing power bill was included in the rent. 😊.. the good ol days.....
Crossfire just like the old nvidia 3d shutter glasses works only in exclusive fullscreen mode. And you also need compatible PCIe bus settings and driver.
I would try a different mainboard or mess with PCIe settings and timings.
I think I saw this somewhere before, but damn! 4 8 pin is crazy
You could almost say that the design aged like fine wine x)
Wuhuu, der8auer started using FPS Monitor!! Do you have performance drops when you use it in some games? I know I do... ;/
OMG I forgot about that card. I remember seeing that when it came out, and we were like "4 slots? How can anyone afford that kind of space"
I had an HD 5900-series dual GPU - the thing was a crunching beast when you could get both GPUs to load.
Frame tearing and microstutter were terrible.
Mine "only" had 3 8-pins but was a massive heat generator.
thanks for reminding me how much I miss my GTX 295 and to never lend anything you value to your friends
I wonder if the mobo STILL has to support crossfire, even with a single slot dual card? Also Sniper Elite 4 was one of the last crossfire games that had great scaling!
I still have a GTX 590 and an HD6990. EDIT: just a thought that the dodgy PCIE pin might cause issues, stick the slot into x8 mode.
I know the Threadripper boards (TRX40/50, WRX80/90) all still mention support for all of it. No mid-range boards (or what people call consumer boards now) support SLI at all, as none of them are certified for SLI, and they must be certified for SLI to work. Not so for mGPU on AMD's RDNA lineup - all of them support mGPU, and most boards on AM4/AM5 still support CrossFire regardless of certification.
thumbs up for the Remnant love
Looking forward to seeing KrisFix repair this beast, and then hopefully seeing it here again with a new look at the benchmarks!
I really miss those old times, which gave us weird and crazy things like this GPU, the Nvidia 9800GX2 and many others.
(Looks at the two HD 5970's on the wall) .. nothing new :P
I had 2x 5970s and 2x 5870s in my benchmark rig :P I'm on 3DMark with it. (EVGA SR-2 with dual X5670s, STILL have it)
The motherboard box is as big as my Thermaltake CTE 570 case ! :P
They got replaced with 4x GTX 570s with EK waterblocks just for 3DMark :P
Damn it. This is one of my dream cards.
Great job Roman 👍✅
At 9:51 Resizable BAR is disabled, but when you go to GPU 2 it's enabled. Why is that?
That's a good question, and it didn't come up in the German video.
Ahhhh as I watch this, I can't help but look at the 295x2 that is hanging on my wall... a trip down memory lane for sure
Have you tried another CPU + MOBO combination? I remember motherboard manufacturers indicating crossfire / sli support
Nice technology with this dual GPU. I hope KrisFix will be able to fix it, or at least tell us what the problem is.
Waiting for a follow up video :)
Can't wait to see the KrisFix video
The crashing problem is due to a very new driver. Since the card is no longer supported, it runs in a legacy mode. Just find a driver close to the card's release and your problem will be solved.
P.S. Finding the appropriate old AMD driver is like winning the lottery 5 times in a row...
Nimez drivers to the rescue.
Yeah, it's definitely a driver issue, not a hardware issue - unless there's a missing cap on the PCB causing instability. Getting two GPUs to work in tandem on a single PCB requires special driver magic, and AMD obviously abandoned the technology if their newer drivers are any indication. He needs to go back to the original legacy drivers, and maybe start with version 2 or 3 of the first driver update (AMD will have implemented fixes and improvements in the few versions after the original).
@@luminatrixfanfiction I would say that these dual GPUs were built like tanks. I've got lots of 5970s still working on a project after more than 14 years, and they never missed a beat. Even the ones used to mine Bitcoin were damaged by heat/fan issues, not by caps blowing up.
So, it should be a driver problem. He said he spent a couple of days working on the driver problem, but it's the vanity of the newbie to want everything working with the "best/latest" drivers, without having spent hours upon hours resolving AMD driver issues... And I'm not even mentioning trying to run multi-GPU stuff on Linux...
@@gt4654 all the people that bought 5870s and 5970s to mine BTC are laughing now
@@gt4654 can confirm 2 5870s used for mining, still play games on them for 13 years now.
I miss sli. I used it on dirt 2 and the original borderlands with dual 970s. yes I know those cards were gimped by nvidia but they ran great for the games that I was playing.
As I understand it, now that the main load for video cards is AI workloads, the load can be balanced between chips much better than with standard rendering (games, video, etc.), and that balancing was the main issue for these cards - SLI, CrossFire, two chips were very hard to keep equally loaded. But now that AI is doing part of the rendering (DLAA, DLSS, RTX), we could check whether it stabilizes the load between the chips. We'd need to test it; if it works the way I think, it would give a huge performance increase.
I really hope Kris can repair it given how this is one of those "one of its kind" GPU's ! I'll be keeping an eyeout for a follow up on this :D
Was this the successor to the R9 295X2? or a completely different PowerColor only dual GPU that they Frankensteined together to make an unofficial dual GPU?
I had a Diamond HD Radeon 4870x2 and Man do I miss this card which at the time was the top of the GPU pile in 2008... Now I am running a 3060TI which is decidedly midrange. haha
"We'll be having one hell of an electric bill." 🤣
I had the old AMD Radeon R9 295X2... gave me so many problems in games, but l loved that damn thing lol
Kitteh watches GPU explanation with much curiosity…
Seeing old cards like this reinforces my belief that current cards like a 4090 are where the equivalent of 2 or 3 GPUs has all been combined onto a single giant card.
They kind of are if you look at the die shots. AD102 has 2 big blocks of SMs on each side of a crossbar and 2 sets of 3x64-bit GDDR controllers. Navi 31 does this as well, but in more smaller groups. ARC is a more homogenized design, but this is probably because the number of Xe cores isn't large enough, and the cores themselves too large, to start forming those higher-level patterns yet. Battlemage will probably look more like the others.
Can't wait for a diagnosis of Chris Fix here.
Nuts back then, nuts now. Nuts that, according to TechPowerUp's relative performance numbers, my laptop's iGPU (780M, with the RX 470/570 as reference) theoretically has ~80% of the performance of this thing while consuming way less than 10% of the power.
If it locks your entire PC including peripherals, it might be a GPU die problem.
If the mouse can still react even when it freezes, it might be an electrical circuitry problem.
Could be the PLX chip - they were known to have problems in cards like this because they consumed about 30 W on their own and have paste TIM under the IHS (so 2 layers of TIM in total) instead of solder. I thought this was fixed on the newer cards, i.e. the 390s, but who knows.
Good chance you are right about the die of card 2 being the problem. You can see when he started testing that GPU #2 did get to about 30% load and the frequency tried jumping up from 300 MHz. A combination of PowerColor's lack of quality back then and poor lead-free soldering unable to survive the thermal cycling of these crazy hot-running cards. The GPU at the front is usually the problem on such a long card, with the separate cooler and flimsy backplate causing a lot of flex in the PCB.
As you already saw @der8auer, everything about its thermal design has been tampered with, and not in a good way.
I suspect that since it's a dual-GPU 600-watt card, it may have broken traces due to heat expansion. Maybe try removing the cooler and baking the GPU to reflow the solder joints. This happened twice with my old crossfired 4870X2s, which drew about 750 watts. Both cards got heat damaged due to old paste, and both worked after a reflow.
I feel like dual GPU solutions, be it on the same card or paired cards, has always been problematic, and that's probably also why they stopped doing it.
There should be a modern dual GPU, like a 4060 x2 - it would actually utilize all the PCIe lanes.
I had the HD 7990, which was two 7970s slapped on a board with 6GB VRAM. It was a beast! When it died I replaced it with a GTX 1060, and that was an upgrade lol.
My VIA UD8 has something like that PLX chip between the 2x S3 5400EW chips. The VIA UD8 is only PCIe 2.0 x8, so... umm... 2 GPUs for 8 lanes. 😅 I sold off most of my S3 collection. I probably should have kept one of the earlier UD8 revs I had, because the one I still have needs a UEFI motherboard.
Anyway, I didn't know that with a controller like that it can use all its PCIe lanes.
Also a quick thought, Der8auer: try the card on a less security-patched board / older version of Windows, such as 7? The PLX chip might be triggering security issues due to the 'bypassing', or the newer kernel with the older driver simply isn't stable! AMD started disabling CrossFire for me even back in the early 2010s - the first one to go was StarCraft II, and FPS slowly dropped. As I said in a previous comment, I ended up just using a single card and put the second one in another machine so friends could play games. And all but one of my HD 5870 cards died, as well as a brand-new R9 290X from MSI (they also managed to crush the fans and kill the bearings by packing it too tightly in the box straight from the factory! Two weeks later, the price also halved... oh, I was pissed. Hence never using AMD for GPUs again, nor their CPUs since 2010)
What the heck do you need 4x 8-pin for?
Even 2 can easily handle 600 W+
did you try changing the port the display is plugged into? i recall that was one of the things amd drivers had problems with back then.
I come for the tech, but I stay for the cat.
Dual-GPU cards are cool. I'm trying to get a functional Titan Z for my collection; I have a 690 right now, along with a few other Titan cards, but would like to get a Z and a 295X2 for my collection.
I used the Radeon HD 6990 playing Battlefield: Bad Company 2, and I do remember it using both GPUs, though not running very well because AMD drivers and CrossFire profiles were terrible.
If I remember correctly one of the drawback of those dual-gpu is it still needed a crossfire compatible motherboard. Maybe something to look into
Dual-GPU cards like this work on literally any motherboard, as the PCIe lanes necessary for multi-GPU are handled on the card itself. Multi-card SLI needs the motherboard to have a license from Nvidia. Multi-card CrossFire just requires that you have 2 PCIe slots that are at least x4 (it'll even run through a chipset).
I miss these contraptions. Made following pc tech more interesting as sometimes unique cards like this would pop out. Asus Mars come to mind, and even the Titan Z. Not that those were a smart choice to buy, but they were a lot cooler than a bog standard single gpu card
Am I missing something? Where was the showcase of the card consuming 580watts ZOMG!?!!