Two GPUs is doable and effective, but not via Crossfire or SLI. Lossless Scaling's frame generation can run entirely on the 2nd GPU, making the 1st one the render GPU, which actually gains you 2x or 3x FPS. Toasty Bros should check it out; LSFG actually makes 2 GPUs in 2024 possible.
Edit: Wow, didn't think this would get upvoted. I might as well provide more info in case the Toasty Bros do see this. The setup is relatively easy, but there's a specific monitor cabling arrangement one must follow; it's pretty simple though.
One just has to make sure the display out (HDMI, DVI, DisplayPort, whatever) HAS to be connected to the GPU that's running LSFG. For example, GPU 1 does the rendering of the game or video (yes, LSFG works on videos too), while GPU 2 does the LSFG exclusively. The cable(s) connected to the monitor(s) all have to go to GPU 2.
After that, you just make sure in Windows 10 or 11 that you set GPU 1 as the main one, and in LSFG point it at GPU 2. If you have two of the same GPU, just use RTSS or some monitoring tool to make sure LSFG is using the correct GPU, which the Toasty Bros are doing already. If they need more info, LSFG has an entire Discord.
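If you're not sure which adapter Windows is treating as which before you point LSFG at it, a quick sketch like the one below can list them (this is just my own helper idea, assuming Python with the third-party wmi package on Windows; it's not anything LSFG ships):

```python
# Rough sketch: list the video adapters Windows knows about, so you can tell
# which one is GPU 1 (render) and which is GPU 2 (the LSFG card).
# Assumes the third-party "wmi" package (pip install wmi); Windows only.
import wmi

for idx, gpu in enumerate(wmi.WMI().Win32_VideoController()):
    print(f"GPU {idx}: {gpu.Name} | driver {gpu.DriverVersion}")
```

The index order here isn't guaranteed to match what LSFG's dropdown shows, so treat it as a sanity check rather than gospel.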
I'd be interested in seeing a video about this ngl, sounds like a cool concept
Actually had a question sort of along those lines. So I'm someone who likes to have other things running on my second monitor, usually some kind of YouTube video or a stream. Doing so with some of my games puts my RX 6700 XT through the wringer and hurts performance.
What I was curious about is: could I grab something like an RX 6400, something super low-powered, and run my second monitor and whatever apps I'm using there off that weaker GPU, so my good GPU can focus solely on gaming? It'd be a lot of money to simply upgrade my 6700, and it does run the games I play perfectly fine when nothing is happening on my other monitor.
I'm curious if this is a viable idea, since I had heard about people throwing the Arc A310 into systems as a second GPU solely to use its newer video encoder, and it got me thinking if that would help my use case, as a small investment in something like an A310 or RX 6400 is a lot better than a huge investment in simply a higher-performing GPU.
The people who made the hybrid PhysX software that lets you use an AMD card as your main and an Nvidia card for PhysX are working on hybrid RTX software that lets you buy a budget RTX card like a 2060, pair it with a 7900 XT, and use Nvidia's feature set like DLSS and ray tracing.
Crazy that doing all that and buying two GPUs is cheaper than one top end card. @@josephdias5859
can i get a gpu please
Me with 0 graphics card and 0 pc
L
I have 1 gpu no pc
F
I have a pc in a shopping cart, but not enough money to purchase😭
I have 1 keyboard
Back in the day I had dual 1070s with a 4770K, I remember a handful of games not working with both GPUs but when games did work it was pretty cool, I played through Resident Evil 7 on a triple monitor setup with that PC, good times.
4770k for 2 1070....
good old SLI days, im freaking OLD.
you're not old and SLI is cool 😎
Did you run Voodoo 2 SLI? That’s where Nvidia got that technology from.
My friend with dual 980TI in sli had the fastest pc at my high school
@@Typhon888 had that card with a SiS video card. I didn't have 2 of them, but even one was like a rocket.
I got into pc building right as it started to die so I never got a chance to try it out
Time to kick it up to some Quad-SLi next!
the best
@@DanielCardei Hey there what's up Daniel? 😀
I waiting for your RX Vega crossfire.
Praise the modern GPU!
What? Why?
nah hook up 4 GTX1080s
I remember running 3 Rx 580's to get the performance of a 1080TI. Those were the days. Great video!
Same here 😂
Lol how did it run
@@smokejumper749 on the games that supported it, it worked great. Once I bought a 1080TI to replace them, I could see how the 1080TI was superior though for sure.
I used to run 2X GTX 660 in SLI back in the day…
Older versions of Fallout 4 worked rather well and the non-special edition of DMC4 is great with multi-GPU setups 😊
I ran Crysis with just a i7-4770 and a GTX 660. That rig should have blown my old system away.
A 970 / 1070 is a perfect match for that processor, you could play a lot of games! I love that processor; it was my first real PC I modified.
Mate, that's a pretty powerful PC compared to what Crysis was supposed to be run on.
I ran it with i7 920 with 2 GTX 285 in SLI. Couldn’t do 1080p max.
@Typhon888 such a cool setup idea
mine i7-7700K and 1060 6gb
The lesson here, kids: if you missed out on the SLI/Crossfire wave, you didn't miss much outside of relaunching your games and reinstalling and wiping your drivers with DDU constantly!! The only SLI config I had no big trouble out of was the 560 Tis... once the cards demand more power, oops!!
@@bornwisedistruction so far my experience with SLI and Crossfire has been pretty seamless. Only problem I've had is power draw. I have maxed my RM1000 out a few times now with my current system configuration.
I ran a 3870x2, two gpus no crossfire needed. 😂
@@GoonyMclinux I used to call them the Ice Cream sandwich gpu's, dont know why my brain always processed them that way. If you look at them long enough you want to take a bite out of them! 😅
SLI 6600GT SLI 8800Ultra and SLI GTX260 😅
PCIE Power.
Slot gives 75 Watts
8 Pin gives 150 Watts
6 Pin gives 75 Watts
So these cards with 2x 8pins can draw 375 Watts safely.
However... you are using daisy-chained power cables... but that shouldn't be a problem, as each cable should be able to supply about 400 watts.
I've had issues with Vegas on daisy chain and some not. I've had a fair amount of them lol
Love to see kf 2 being played, im still playing it on my i7-4790/1070 rig
I’m excited waiting on the new one
Dual running an RX 6600 and an RX 6700XT in the same system for LLM research and frequent gaming. AMD is great for these scenarios, it was seamless for me (I remember the SLI/Crossfire old days, it wasn't always this easy).
I run 8B models on my 6600 XT all day... got my own finetunes I made of Llama 3.1. This is on Linux though, as ROCm is a PITA to get working correctly on Windows. I do my gaming on Linux also.
Does it bring any perks for gaming, or does it just utilize one card at a time?
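If anyone wants to reproduce that split (LLM on one card, the other left alone for games), here's a minimal sketch of how I'd pin the workload to a single GPU under ROCm + PyTorch; the device index is just an assumption for the example, check your own ordering:

```python
# Minimal sketch: pin an LLM workload to one specific GPU on a ROCm system so
# the other card stays free. HIP_VISIBLE_DEVICES is ROCm's counterpart to
# CUDA_VISIBLE_DEVICES and must be set before torch is imported.
import os
os.environ["HIP_VISIBLE_DEVICES"] = "1"  # assumption: the 6600 XT is device 1 here

import torch

assert torch.cuda.is_available()      # ROCm builds of PyTorch expose the CUDA API
print(torch.cuda.get_device_name(0))  # should report the card you pinned above

# From here, load whatever model you like and move it to "cuda:0";
# the other GPU never gets touched.
```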
It worked fantastic. I had 2 r9 390x gpus in crossfire right when they launched, and it was a great experience. Almost every game worked with them, and overall, I had no complaints.
I used twin RX 580s for years. Most games worked well, just used "generic AFR" in the drivers.
I miss those days.
“ oh god i lost a screw” 😭😭 I felt that in my soul
I really wish both AMD and Nvidia would bring back support for Crossfire and SLI at the driver, game, and motherboard levels. I really would like to know what it is like to run current-gen GPUs at their maximum potential.
Apparently there's UALink coming out that will mesh GPUs through the PCIe ports.
Crossfire (AMD) and SLI (Nvidia) only required 2 PCIe x8 Lanes.
To get true full PCIe x16 + x16 Crossfire/SLI, it required Enthusiast (Intel i9/AMD Threadripper) motherboards with a far higher PCIe lane count. They only need to be PCIe Gen3 to work, as no Crossfire/SLI on any PCIe Gen4 cards exist outside of Enterprise level GPUs.
Normal Consumer-level chips only have 20+4 PCIe lanes with motherboards.
Motherboards will no longer specifically state they support SLI/Crossfire... but if it has 2 x8 lanes, it will work.
For instance... ASUS Pro Art B550 Creator motherboard has 2 PCIe 4.0 x16 slots. (x16 or x8/x8)
Will work with this motherboard, confirmed with 2x GTX 1080s in SLI.
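If you're on Linux and want to confirm the two slots really negotiated x8/x8 rather than trusting the manual, something like this reads the link status straight from sysfs (the paths are standard kernel attributes; filtering by the 0x03 display class code is my assumption about what counts as a GPU):

```python
# Quick sketch (Linux only): print the negotiated PCIe link width/speed for
# every display-class device, to confirm you really got x8/x8 and not x16/x4.
from pathlib import Path

for dev in Path("/sys/bus/pci/devices").iterdir():
    try:
        pci_class = (dev / "class").read_text().strip()
        if not pci_class.startswith("0x03"):  # 0x03xxxx = display controllers
            continue
        width = (dev / "current_link_width").read_text().strip()
        speed = (dev / "current_link_speed").read_text().strip()
        print(f"{dev.name}: x{width} @ {speed}")
    except OSError:
        pass  # some devices don't expose link info
```

Keep in mind the link speed can drop at idle, so check it while the cards are under load.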
Back in the day I had a Sapphire 7970 when it came out; years later I got a Sapphire R9 280X (pretty much the same chip/card).
I ran them both in crossfire without a problem.
The only problem however was that crossfire, when it actually worked in a game, was often plagued with rather bad frametimes.
In some games it worked quite well. One I particularly remember was Metro 2033 (the original). It ran so much better in crossfire than with 1 card only.
But in general it was more or less a waste. Replaced them both later with an RX 580 when it came out.
However... there was also the time when PhysX was quite a thing (at least in some games). And adding an Nvidia GPU as a secondary to my RX 580 allowed me to use hardware PhysX while gaming on the RX 580 (shoved in a 560 Ti to make that happen).
I had 2x 7970 CF replaced with one Nvidia 680... same performance. Added a second 680, then replaced with one 980 Ti... same performance. Replaced with a 1070 Ti, and the story ended with a 2070.
1 8-pin can pull 150W max. That's 300 per card. Plus 75 via the PCIe = 375W.
Double that due to the SLI = 750W.
Edit: the official TDP is 295W. So that's 590W in theory.
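Rough math with those figures (the CPU/system allowance and the transient margin are my own guesses, not measured numbers):

```python
# Back-of-the-envelope power budget for two of these cards, using the numbers
# from this thread. The CPU/system figure and 20% spike margin are assumptions.
SLOT_W, EIGHT_PIN_W = 75, 150

per_card_connector_limit = SLOT_W + 2 * EIGHT_PIN_W  # 375 W of delivery per card
per_card_tdp = 295                                   # official Vega 64 TDP
cpu_and_rest = 200                                   # assumption: CPU, board, fans, drives

steady_state = 2 * per_card_tdp + cpu_and_rest       # ~790 W sustained
with_transients = steady_state * 1.2                 # ~950 W during power spikes

print(per_card_connector_limit, steady_state, round(with_transients))
```

Which would explain people maxing out 1000 W units with this kind of setup: on paper it fits, but the spikes don't leave much headroom.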
I have 2 vega 64s in my system ironically. I have already tripped the breaker multiple times and have maxed out my rm1000 on more than 1 occasion. I even fried an outlet with this system. I have engineering samples so they operate at a max of 250 watts while boosting to 1400 mhz and you can't change the clocks either. On the retail bios it has absolutely no limits in place so i can make them pull around 400 watts each if i wanted. Funny enough the engineering sample bioses actually score higher in benchmarks.
I was hoping for the 7970xt but gave up and got the 7900xtx
@@sasquatchcrew leaks are coming out that they are releasing refreshes under those names being 7950xt and 7990xt
It's actually insane how good the quality and the release frequency of these videos are, keep up the great job :D
I remember taking the side of my case off to put a box fan on my 2x r9 290s. Talk about putting some heat out
I sold my Vega 64 for $900 during the crypto boom and bought one of the low end AMD cards at the time and used the profit to upgrade some other components. Hearing you got two of them for $200 tells me just how different the market is now.
I’d be willing to bet two 1080TI’s in SLI would give you better results, especially if you can track down a high bandwidth bridge from the 10 series. I ran two 1070’s back in the day and the bulk of what I played supported SLI with good scaling. The big upgrade from the 9th to the 10th gen was that HB bridge, it smoothed out the majority of the microstuttering I had experienced with the 700 and 900 cards.
Yes you have to roll back driver for them to fully work.
I used to run 1070 sli back in the day just fine with no issues until more games came out without support which ended up making the choice to upgrade to a single card. Loved the performance I got when I had it though
finally a micro center near me
I ran 7300gts in SLI back in 2007, it actually was pretty great for the time.
When I did dual power supplies, both cards got better, showing a dedicated power supply just for the GPU alone was best to allow it to fully unlock.
You guys should have done a prime day budget build , wheel spin vid, maybe being able to use Newegg too.
I had two hd 7870's paired with my i7 980x back in the day was a cool rig
Apply new thermal paste to the GPUs, set a target FPS, and undervolt the GPUs if you can. Those cards run well undervolted.
I used to run Battlefield 4 with dual Radeon HD 7850's running on the Mantle API.... haha... remember Mantle?
I've done this with 2x R9 295X2 in quad CrossfireX. I feel you. It's an achievement when it's working.
heyyyy Daniel 😎
English or spanish
@@danielivanov930 👋
@@danielivanov930 ✌
I had 7970s in cross fire back in the day n was so worth it 💯
I’m here love your videos soon going to custom build one of the 3050 builds you guys did
I'd go with a 40 series card or wait on the 50 series. With how games are taking more and more to run, you're gonna want an upgrade and update path. Also, definitely go with DDR5.
@@MichaelGuertin-kh5no ok thank you
@@Natebate9000Dicamillo Your welcome best of luck to you!!! may your frames be high and your ping be low!!!
Dude buy 3060 not 3050@@Natebate9000Dicamillo
In October of 2009 I built a computer around the AMD Phenom II X4 965 Black Edition, which was fairly new at the time. For graphics cards I ended up getting two XFX HD 5770s, which were new at the time, so I could run them in crossfire; each GPU cost $400 US dollars. For storage I ended up getting two 1TB WD Blacks and ran them in RAID 0. I can't remember which motherboard I had but do remember I had 8GB of RAM.
I clearly remember crossfire being hit and miss when it came to games, to the point that some games would flat-out crash on you if you were using crossfire and I had to disable it to play. In the end crossfire was more of a headache than it was worth. I ended up using that system for like 6-7 years if I recall correctly; I stretched it as far as I could.
Around a week ago I ended up finding one of the XFX HD 5770s I ran back then. I just ended up chucking it in the bin, but perhaps I should have installed the card in my computer to see what it could do in 2024 with my i7 9700K.
Sure, a 4090 is more powerful than five 1080s, but it cost more than twenty, so if there was support for it in games now, it could be a value. IIRC one of the big drawbacks with SLI and Crossfire was that you were limited to the VRAM of a single card, as they only paired the GPUs themselves for processing, and the VRAM was whatever the primary card had.
The 1080 was £575 at launch. 20 x 575 is 11,500, so you are not even close.
In addition to other comments about lossless frame gen... you can also pair an older-gen Nvidia card with a newer one and, in the Nvidia Control Panel, make the older card a dedicated PhysX processor. This removes load from both the CPU and the render card, which gives a noticeable uplift in titles that have physics rendering. It also has the benefit that a low-end, old-gen card is still usually a better PhysX card than your CPU is.
Try running each GPU with its own power supply to see if it's the power draw or crossfire. The other possibility is it could be the motherboard as well.
Ran Nvidia SLI all the time, never had any problems in any game, and SLI made a lot of difference in frame rates and performance. But then again you could just buy a card that was equal to the performance of your two cards in SLI. I concluded that this is how Nvidia figured it anyway when they came up with SLI: just to sell you another card. Or buy the expensive card to skip SLI. All I can say is it was fun doing it, and I even ran 3 Nvidia GTX 260 cards at one time and 2 1080 Ti cards that ran as well as or better than one 3080 of today. These days two cards in one rig is just not worth the trouble or money. Just get a 4090 and fly.
Early Mac Pro setups were specifically designed to run with dual CPUS and GPUs. My daughter's (from her college days) came with dual Xeons and dual AMD GPUs (don't recall which model). Running BootCamp for Windows, there was literally no PC game she couldn't run. Of course, with current generation GPUs, Windows, and games, it's no longer practical.
My last SLI was 2080 Tis. Always ran SLI back in the day; started SLI on 680s, then 780s, then 1080 Tis. Gotta love the good old days.
there is a pin switch on the card that changes the LED colors, it's not because of the crossfire
there's also a pin switch that changes the vbios
I have a 56 and that was the first thing I researched about the card
Man, my first and only SLI build circa 2015ish was a 4790K with 32GB of Corsair Vengeance and 2 970 Windforces. I still have all but the 970s sitting in my closet lol. Upgraded the GPU to the 2080 Ti back in 2018, still have it in my new build; should be upgrading once the 50 series are announced.
In the AI era, we need not only dual GPUs for better load distribution between gaming and studio creative loads but also solving for science and live streaming encoding load. We need not only explicit multi GPU, but also truly tiled MCM GPUs (with multiple GCDs MCDs and IOD) on a single package.
According to the power supply connector overuse definition from SilverStone's warranty information, a single PCIe 8-pin cable and connector's maximum current rating is 12.5A, which is 150W (+12V × 12.5A).
Vega 56 owner here. My card can pull 400 watts-ish. PCIe 8-pins are rated at 150 watts each; these cards have 2, plus 75 watts from the PCIe slot, so 375 watts together. Anything past that is a risk. PCIe slots are over-engineered though, but I still wouldn't push it past that. Your cards are both extremely starved at 200 watts and are running at about half each of what they could pull without an undervolt.
Your black screen is more than likely caused by the starvation:
a. Set the power limit to +50
b. Undervolt
c. The HBM memory tends to begin failing after time; make sure you load them individually to test this
Back in the day I remember running Crysis 1 with my old Radeon 9800 Pro 128MB / Pentium 4 3GHz HT / 256MB DDR, then a GeForce 8800 GTS 640MB, then a Radeon X1950 XTX 512MB, and it ran pretty well with that card at 1024x768 on medium/high.
Betcha one of those old cards (at least) is bad. Try running them each individually. Crossfire/SLI was always problematic but those hard crashes are something else.
thank you for doing this video, i was actually thinking about doing this and spending around $600 instead of saving up for a really expensive GPU
still running a 1050 ti with fans that dont spin from a friends old broken pc
What I would have liked to have seen is the PC running Killing Floor with crossfire turned off, to show the difference in FPS between crossfire on and off.
9:14 new meaning to “whistle while you work” 🤷🏾♂️🤭
Great video guys
What model Zalman case is that ?
Neo # ????
i3 neo
I remember having two 1080 Tis, wasn't really worth it. Good content; was wondering for most of the video why you were looking at the ceiling/past the camera though.
But with a B450 motherboard you are running one card at PCIe 3.0 x8 speeds and the other card at PCIe 2.0 x8 speeds... Wish there was a test video with at least dual PCIe 3.0 x8 speeds or even 4.0 speeds.
Great content as always.
Use pro drivers, adrenalin sucks with crossfire. You'll lose a bit of fps probably like 1-5% but the stability gain makes it worth it. Running two Radeon Vii's on a LGA 2011 board with crossfire support. Works great for me, I run both GPUs in most games pre 2014. I also use a virtual machine half the time, but that's why I built the system.
I am in the process of building a retro XP gaming machine, with a QX9770, 4GB of RAM, and 2 8800 Ultra GPUs in SLI. SLI was a lot of fun with the 8000 series Nvidia cards back in the day.
I've got this mobo and a Sapphire Pulse RX 580 in my living room PC for casual games with people, and it'll still do 60 FPS in Elden Ring on med/high settings. Sapphire Pulse cards come with scaling software called Sapphire TriXX where you can render at 75-99% screen resolution and then upscale it. It's got problems and there are some graphical issues here and there, but for what it is, it's amazing it still holds up with newer games.
Ironically, I had this same thing happen trying to run a 6900xt ultimate stable. I have to set the max frequency to 2000mhz and limit the power to get it to not crash or completely shutdown. Those surges are no joke!!!!!!!!!!!!!
How about 2x 2080 Ti over NVLINK? Curious minds!!
Thx for the great content! 🎉
LTT already did 3090s
I always wanted to dual up my 1070 but by the time I could it was an outdated technology 😢
My last dual gpu set up was dual Evga 8800gt’s in sli
I had one as my first gpu
This ALL depends on your Motherboard x4 x8 x16 and what slots you use.
Crossfire only works with DirectX 11 and Vulkan. I ran crossfire for years and never had it crash like this. Far Cry 5, Tomb Raider, Witcher 3 all run fantastic in crossfire.
I imagine they need to be re-pasted badly, I bet the TJ-Max is high.
I have been using 2 GPUs my whole life. Recently I have a 7800 XT and a 6750 XT, but I do a lot of work in Adobe products, so having 2 GPUs and 3 monitors makes it easy: I can have a GPU dedicated to Adobe and one for gaming on 1 monitor via that card's output. While gaming I recently realized that both GPUs share a certain amount of usage, where the dedicated gaming GPU runs at 90 percent overclocked, and I notice the other GPU sits around 30 percent usage while gaming. For example, in Cyberpunk 2077 on ultra I get 330 FPS all the time while Adobe products are running in the background. PC RAM is 64GB Kingston Renegade. CPU is a 5800X overclocked. Almost all the games I play are over.
My guess is the thermal interface is failing and the cooler is clogged up. Can't wait to see Watt it was!
I'll see myself out.
Man, this was literally the dream PC for us back then 😅
0:01 Hah, I don't have 1 graphics card, I have 0! Integrated Graphics! I am better! I am smarter!
I ran dual GPUs for years with Nvidia: 780 Tis, 980 Tis, then 1080 Tis. Never had these issues on the Nvidia side. It either worked or the second GPU sat idle; never had lockups like these Vega cards are doing.
I remember in 2009 I bought an i9 with 16GB RAM and a 256MB card... and games were so slow. And everyone was telling me to go crossfire... I never did, but that was the thing back then. I don't see anyone doing this anymore.
I have two different retro builds: one is crossfire HD 4870s and the other is SLI GTX 295; both work well with occasional issues. Being retro, I use Win 7 because I had nothing but issues trying Windows 10 with my older multi-GPU setups. I quit using SLI on more modern setups after my GTX 780 SLI setup and R9 280X Crossfire.
I know I'm 2 weeks late to this, but SLI and crossfire work really well on Rust, as my friend upgraded dual 1080 Tis to 3090s (he mostly needs it for work).
I used to run two HD 5970s in crossfire (4 GPUs); as cool as it looked, it was a horrible experience overall. I remember selling them off in a crypto spike and moved on to something else. I never experienced a lot of crashing but the micro stuttering was very real lol.
I'm still using 2x 2080's with a NVLink or whatever and a FX-9590. Honestly not needed one does fine on most games.
Does that mobo support PCIe Gen 4 for the 7600?
B450 has 3.0 so no.
I have a Vega 56 with the backup BIOS switch programmed to run the Vega 64 BIOS. Both are very stable and my HBM clocks up to 960MHz. I bought it for nothing on eBay and paired it with a Xeon E3-1275 v3 and it absolutely smashes. I put it in my main rig and it almost kept up with my overclocked Colorful RTX 3060 Ti! The utilization does the same thing on my build; the card won't hardly ever push the full 1650MHz core clocks. Lol. MAKE SURE THESE CARDS ARE SLOTTED INTO THE PCIE SLOT CORRECTLY; ONLY A FEW PINS ON THESE CARDS ARE REDUNDANT, AND IF IT'S NOT FULLY CONNECTED YOU WILL GET SCREEN GLITCHING CONSTANTLY!!!
Jackson's t-shirt is awesome!
I don't think that SLI or Crossfire will come back, mostly as developers have moved on from it. Also, as someone who used SLI, I found it was dropped real fast when developers did not see it as viable any more; I was on 2 GTX Titans back then. Would never go back to it or Crossfire if it did come back; the hassle it causes for users and also the developers is understandable. This is a relic of the old 3dfx cards, so that shows just how dated and flawed it was.
Interesting video on a footnote of graphics card history even so.
Matty had a sick Spider-Man T-shirt at the end of the video.
Fun fact: DX12 crossfire on 2 RTX 3090s was doable last year. PS: why did we do it? Because we wanted to see if we got an improvement in AI computing (spoiler: you are better off using them individually). Then we were curious about the classic question: can it run Crysis? Yeah it can; on BF3 we had 400+ FPS at 4K all ultra, both GPUs at 84-89%. Was it painless? No, but I'm not sure if it was the system's fault for not supporting it or if it was our "tinkering" for our AI projects.
I bought an SLi motherboard back in the day to be able to run 2 x 560 Ti as a budget GPU upgrade. Can't remember having any issues.
I think the crashes are due to the GPU's spiking at high wattage and causing the power supply to trip.
The Crysis Remastered games disable XFIRE in the later drivers; you will have to use older drivers to get those games to work.
You need something with a strong single 12v rail, like a Corsair TX950. I kind of miss using crossfire and wish there was better driver support. Ah well. Haven't bothered with it since the 390. Anyway good video.
Hi, great vid. Wanted to try something different. Will this work with 2x AMD WX 5100s (8GB each)?
I had crossfire rx 580s and a 2700x cpu on my x470 mobo. Then i swapped to a 3950x and rx 5700xt. Now I'm rocking a 5800x3d and rx 7800xt on the same mobo.
Yeah dual GPU was cool when it worked. Often went to crap once drivers got updated. It was not smooth with my dual R9 270xs.
A lot of the Vega 56s and 64s have been BIOS-swapped; a lot of 64s have been flashed to a Vega 56 BIOS due to them being extremely good mining cards.
I really wish SLI or crossfire didn't die, cause imagine 2 3090s, that would be insane.
Crossfire was already pretty much dead in 2019 when I still rocked with 2x R9 290.
I still play Killing Floor . Great challenge when playing solo.
Crossfire was a nice idea boys, but using pigtail power connectors on the GPUs, really? Use four individual PCI-e cables from the PSU.
I ran 580 crossfire on an ASUS TUF X570-Plus board with a 1000-watt power supply and mine ran stable.
We should try gaming with 2 cpus
I'm still running my two GTX 1080Ti in SLI right now. It looks beautiful to me...
Do they work as two in most games? Doesn’t nvlink work better? Like meshes the gpus into one instead of having a master and slave gpu. So you would have 22gbs of VRAM vs just the vram from the master gpu.
This intro felt funny, as I look to my right and see x2 1050 TI **confused laughter**
First issue: the second PCIe slot is electrically x8.
That's normal for mainstream platforms