I never understood why that was the only IDE 10K rpm drive and that it had such small platters. The drive enclosure was the 3.5" form factor but the platters were 2.5". There were so many more SCSI drives that spun at 10K and 15K rpm back in the day although not sure what their platter size was.
These nForce boards are notorious for dying all the time... the red Ballistix were quite expensive, too! This was extremely expensive back then. Basically the Q9300 was very, very late into socket 775's lifecycle. Very cool find!
I used to own all sorts of full size towers I found in the garbage, they were easy to work with, but a bit heavy. I loved using cold cathode tubes, they looked so cool, especially in the alienware cases.
They do make an “emulation” of that cold cathode light with EL wire (electroluminescent); you could just glue it into a resin block then run it through an acrylic tube, or even mould it into resin. Only problem is it also needs its own power supply/regulator and it usually puts off a high-frequency whine; it can be mitigated, but it’s not good for cooling
That would make for a sick retro-gaming PC. Somebody just threw it out too. Wow. On that power booster PSU... I wonder if they were overclocking and thought the booster's power was "cleaner" for the GPUs? I guess when the CPU draws more power... it could cause ripple for the GPUs?
Had the Q6600 years ago in my first gaming rig. The difference between the Q6's and the Q9's (barely a year apart) was so negligible. Those Q6600's were BEAST for their time.
The main difference was the Q6600 SLACR would go from 2.4GHz to 3.2GHz and (provided you had the cooling) wouldn't break a sweat. Bonkers CPUs for the day.
Yes!!! My first real gaming rig was a q6600 slacr, with a dfi lanparty and a custom cpu cooling loop complete with uv dye, built new in 2007. Ran it at 3.4ghz by bumping the vcore. I ran that setup for 8 years before building my current rig in 2015 (got out of gaming a while ago). Those were the days lol.
In 2015+ there was a clear difference between the Q6000 and Q9000 series. The Q9000 series added SSE4.1 support which was required by many games in mid to late 2010s. I have a Q9550 @ 3.4GHz stock voltage in one of my systems. It had a Q6600 SLACR before but it didn't want to go above ~3.1GHz, probably could have if I had given it higher voltage and had something better than the stock cooler.
@@Aguy6714 similar setup except I had an XFX nforce 680i sli mobo and the CoolerMaster V8 air cooler. Originally I had a GTX 280 in it but I remember keeping that PC up all the way through the GTX 580 before upgrading. Wish I kept that PC.
4:54 I legitimately struggle to get modern LED ram that can do THIS specific effect. It reminds me of background technical readouts from 70s and 80s Scifi and it's an effect that's awesome in a case.
Ngl, loved watching every second of this video, because it really looked like you were an archeologist trying to understand some ancient technology... Ik I'm older than the PC, but ngl I only know about the parts that came after my 2014 PC was built, bc I always wanted to upgrade it but couldn't till I got a job recently. When I got the old PC back then, it was from my father as a gift for the good grades I had😊
I built this exact full EVGA system back in the day, including dual video cards in SLI, the 2nd power supply booster & the cold-cathode lighting. I had around $2,000 into it. I built it to power a set of shutter glasses that turned the internet, movies, and games into a 3-D viewing experience. I also had the biggest computer monitor you could buy at the time; a 19" CRT! It was huge! And to this day I'm still running a huge computer case; The NZXT Phantom in white.
at the time this hardware and case were on the market, you could get a 24" LCD screen (which is what I did when I built mine)- it's just that a 24" LCD back then cost around a grand...
Pretty sure the "Driver Signature Checks" that Windows does to verify drivers are what was preventing the display from working on Windows 10/11. That gear is old; you notice how it worked on Windows 7? Signature enforcement was much laxer back then. Next time, disable driver signature enforcement during boot and it might work.
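For anyone who wants to try what this comment suggests, these are the standard Windows switches for it (the usual caveats apply: test signing weakens security, and Secure Boot can block it):

```bat
:: One-off: reboot to the advanced startup menu, then pick
:: Startup Settings > "Disable driver signature enforcement" (option 7)
shutdown /r /o /t 0

:: Or persistently allow test-signed drivers (elevated prompt)
bcdedit /set testsigning on

:: Undo it afterwards
bcdedit /set testsigning off
```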
Next time you gotta try these games at a resolution appropriate for the time and see how well they run; I imagine 1024x768 at the highest settings would run fine in these games
I hope something goes wrong in the video!
Hello half life man
glad to know there are plenty of bringus enjoyers
omg tyler watches bringus :o
I do as well lol
Sane
8 GB of memory in like 2007 was _baller._ This person spent some real cash on this PC.
Man it also had a quad cpu
And here we are 17 years later Apple still shipping 8 gigs on £1700 macbook pros
2 8800GTS cards was god tier back then
@@fil44ls my main driver laptop still has a quad-core so that's saying smtn😭✋️
AC plug 😂 1000 Ws
The story with the two power supplies is that this was the era of custom PCs where power supply manufacturers lied about their power ratings (before 80 Plus was a thing), so the PSUs couldn't actually handle SLI. Those supplementary power supplies were actually a godsend at the time if you wanted dual GPUs.
Never saw one of those back in the day.
80 PLUS was launched in 2004, the same year that Nvidia introduced SLI in its most recent form.
80 Plus was a thing when this was created lol
Crazy that power supplies weren't regulated up until 2004
I almost guarantee this system had one of those billet aluminum "600WATT" power supplies that was actually like 300W and had a lead weight in it, which was why it needed the booster. And then the original power supply exploded.
850w power supply looked too modern for 2007
@@SteamPlayLEET it's a Thermaltake PSU. From them, 850W is highly questionable even in 2024.
600w is highly questionable from thermaltake these days...oh how the mighty have fallen
I'm usually the last guy to say this (most of the time when people do, it's just a case of "WAH, I DON'T WANNA BE RESPONSIBLE FOR MY OWN PURCHASES!"), but that definitely sounds, like, suuuuper fucking illegal. That's literally just flat-out incorrectly labeling the product and straight-faced lying to people about what they're buying. Was it second-hand shady eBay sellers or something, buying 300W units and reselling them as 600W, or did a first-party manufacturer actually do that and expect to get away with it?
@@robonator2945 Usually it's stupid shit like "yeah, our 800W PSU can only deliver 400W from the 12V rail, but you also get 5V and 3.3V and all that, and it almost kinda maybe adds up to 800 watts, just not at once 'cause then it explodes, but it's your fault for trying to use all 800 watts at once"
NGL, I wish modern computer cases pulled out like that; it would make things a lot simpler for getting deep cleaning done.
I had (still do, but it's buried in the attic) a full-aluminum Lian Li case that did that, and I loved and cherished it for over a decade, until liquid cooling was almost required to game. A separate chassis and case don't play well with liquid cooling.
@@plebiansociety liquid cooling is nowhere close to required. You just want to justify excess.
She bringus on my studio
til i batarong
*Loud incorrect buzzer*
@@christiangomez2496She studio on my bringus til I holo?
bring this studios!!!
She bringus on my studios 'till I leapfrog
i love the “hey smokers, Bringus1 here” intro!! you watched Druaga too! god i miss him so much, i wish he would come back! his tech videos were just like no other, you are literally the closest we’ve got to him since 😢
Man, it's crazy how many of those modern "let's do stupid shit with computers" YouTubers were inspired by Druaga1 (Action Retro, MichaelMJD, Bringus, WindowsG Electronics, even LGR Blerbs to a certain degree, and countless smaller channels); him and Green Ham Gaming really started a whole genre.
I caught that too! I loved druaga, and his long ass videos
@@MrCed122 absolutely. and tbh, i am definitely one of those people who wants to start a channel like this and i have everything i need except a tripod to get started so im only waiting on that. i do have serious plans to make a let’s do stupid shit with tech/gaming channel i wanna add to the community 🙂 i also have a fair amount of tech at my disposal ranging from time periods from all over the place lol
@@JC61990 i used to come home from school and put on the new Druaga upload… good times 🥲
man, what happened with Ian was very unfortunate, we don't talk anymore. But a lot of his fans were very cynical toward him, and he got into Discord too early for proper moderation. The community ate itself from the inside out (even some really weird stalking happened!)
Those cathode lights would have worked with that wiring when first tried (apart from the broken ones) if it had been powered properly. That switch at the back of the PC went to a sound reactive module that the lights were connected to. The lights would have flashed on when hearing music or game sounds. Without that being on the lights didn't work... until he pulled it apart and ignored it.
Hey! The reason those games kept crashing on load is because the medium and high presets kept turning on MSAA 4x or 8x. MSAA is old-school multi-sample AA and it is extremely GPU and VRAM intensive, especially back in that era. I'm willing to bet you could run most of those games at Very High / Ultra just fine with MSAA turned off (assuming the textures all fit into VRAM.) You might consider giving it another go someday.
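To put rough numbers on that VRAM claim, here's a minimal back-of-envelope sketch. It assumes 4 bytes of color plus 4 bytes of depth/stencil per sample and ignores compression, the resolve target, and driver overhead, so real numbers differ:

```python
# Back-of-envelope: raw framebuffer size with MSAA.
def msaa_framebuffer_mb(width, height, samples, bytes_per_sample=8):
    return width * height * samples * bytes_per_sample / 2**20

for samples in (1, 4, 8):
    mb = msaa_framebuffer_mb(1920, 1200, samples)
    print(f"{samples}x MSAA at 1920x1200: ~{mb:.0f} MB")
# 1x ~18 MB, 4x ~70 MB, 8x ~141 MB -- before a single texture is loaded
```

Against a 320 MB 8800 GTS, 8x MSAA leaves very little room for textures, which fits the crashing-on-load behavior described above.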
Yeah, there were some brand-new games back then that literally didn't even have an AA option because it was still a pretty new technology.
he didn't even turn off AA in TF2, just lowered other settings :(
I remember on my old GTX 660 I often turned MSAA off and forced FXAA in the Nvidia control panel. Yes it made games a bit blurry, but it was still better than MSAA just for the sake of performance. I’m glad we’ve got better AA options nowadays, though of course a modern GPU can handle MSAA in old Source games no problem.
I think that wouldn’t give much. The problem with Steam is that games receive updates, including engine changes and fixes, over time. If he used era-correct versions they would run better on old hardware.
I had a load of 2006-2011 GPUs, and what you said about the performance hit applies mostly to pre-HD 4000 series Radeons, which drown like the Titanic once antialiasing is turned on. The GeForce 8000 series was superior at handling antialiasing and had much smaller performance drops.
@@telegummis Some people just consider jaggies at any resolution to be a deal breaker. Ross Scott is like that.
The cool thing about the Crucial RAM LEDs is that they are electrically driven by the input/output going on instead of pure software based. It's true blinkenlights, like the HDD activity light. It's showing real-time activity. Too bad they ditched that for their DDR4 and DDR5 kits. What's insane is they even had a rare kit for DDR1 era. Corsair had a LED activity meter (green going to red) for DDR2 kits. I have the Crucial Ballistix Tracer DDR3 kits (with the black heatsinks) and I gotta say, watching 8 DIMMs blinking on my ASRock Extreme 11 X79 motherboard based on true activity was mesmerizing; you could tell which pair of DIMMs were more active, etc. It was like watching the Connection Machine CM-2/CM-5. It did come with some software with just a few settings (like 4 levels of brightness and patterns) but nothing like the RGB LED kits now. I gotta reiterate it was true electrical activity though; wasn't like the software was just emulating it or something separate on top. The kits came in Blue & Red, Green & Orange combos for the DDR3 I had if I'm remembering right (early 2010s era of color matching PC parts). They were also one of the few DIMMs that showed temperature too at the time (something most people barely thought about). I really wish DDR4/DDR5 had something like them, one of the few things that made me want to stay on the older platform.
You can't activate that in the controller settings?
This is exactly what I thought as well!! It looks like a cool gimmick, something I wonder if we’ll get on this gen of RAM hardware
I'm the kind of person who gets a case with shit tons of RGB, then spends the first day trying to figure out what unholy software I need to install to make them shut the fuck up and just either turn off or be a dim white, but I would *_100%_* buy modern DDR5 sticks of that RAM. Fuck fancy RGB, if you duct taped a dozen tiny red LEDs on a stick of RAM and wired them to react to I/O I'd take it. That's the sort of subtly baller energy that makes a computer look actually good, rather than like it's only legally allowed to be powered on in June. (not to mention, it actually gives a very nice visual indicator of how much your computer is working.)
(and, if it needs to be mentioned, the fucking *_irony_* that that "subtly baller energy" is in this brrrrring billboard of an Nvidia advertisement that was cursed to take the form of a computer case against its will is not lost on me)
edit : 8:32 ... perhaps I judged you too harshly, that too is subtly baller as fuck.
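A purely hypothetical software approximation of that duct-taped-LEDs idea, for anyone curious; the real Tracer DIMMs did this electrically with no software in the loop, and the `psutil` polling and 12-LED bar here are just illustrative:

```python
# Hypothetical sketch: a 12-"LED" activity bar driven by real memory
# stats. The actual Ballistix Tracers needed no software at all.
import time
import psutil  # third-party: pip install psutil

LEDS = 12  # the "dozen tiny red LEDs"

while True:
    used = psutil.virtual_memory().percent   # system RAM usage, 0-100
    lit = round(used / 100 * LEDS)           # how many LEDs to light
    print("\rRAM [" + "●" * lit + "○" * (LEDS - lit) + f"] {used:5.1f}%",
          end="", flush=True)
    time.sleep(0.2)
```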
@@tdata545 I have a bunch of the DDR3 Ballistix Tracers and there is software for it but it's not like what you'd see on other kits, especially RGB LED RAM. In terms of settings, every kit had two colors, and you can switch one or the other (I think the combo from memory is Red & Blue, Green & Orange). You can change the brightness, pattern (4 settings), pick which one of the 2 colors and turn them off. But I want to reiterate that the way it works is like legit based on the electrical activity. Like if you're playing a game or really loading up the RAM, it actually shows. There's no software sync for all of them, so when I had 8 DIMMs installed, it looked like a legit thinking machine, with each DIMM doing its own thing, and you could tell which DIMM slots were paired. Another cool thing, at least for the DDR3 era, was that the software also shows the RAM DIMM temperatures. That was kinda rare. I basically bought a bunch of kits for my X79 board (ASRock Extreme 11 with 8 DIMM slots) because I love the Connection Machine CM-2/CM-5. I love blinkenlights. I even tried to buy dual processor Chinese boards to get even more blinkenlights since in some rare instances non-ECC RAM works (since they do some custom BIOS) but didn't have luck. I still have it all but moved on to faster newer computer hardware.
Yeah, I need these but DDR5.
This was an expensive PC back in 2007, for sure. Plus it has cold cathodes jury-rigged into the system, so you know the owner was a pro modder. FSP is an OEM power supply manufacturer, so that device powered the cathodes.
SLI has always sucked, so this was nothing more than a baller PC that was constantly flexed on forums.
Hell yeah I'm 29 now, but around this era is when I started gaming. Showoff pc for SURE.
lol I was fully expecting an SLI build upon seeing the thumbnail of this video 😂
I hope that wiring mess was done by the 2nd owner then because if not, I wouldn't count the og owner as a pro modder :D
@@Fortzon a pro modder wouldn't care about wire management, just how much they can cram in while having it all function correctly
no, the extra power supply was for the second GPU, but whoever owned this PC must have upgraded the main PSU at some point. nobody had an 850w PSU in 2007.
that guy been waiting on that CS GO server for 10 years, finally someone joins, "Our battle will be Legen... oh he gone"
I'm gonna need the uncompressed flac for that absolute banger of a song at 14:44
real.
same
Literally the “Closing In” of Bringus Studios lol
i have it :) ripped in 320kbps
I could try to do it. The vocals would sound kinda compressed but I can try to remake the instrumentals
As someone who was there: cold cathodes were BRIGHT AF when new, but they faded over time. I would have creamed my jorts for this PC back in the day. I can tell you that 850W PSU was not original to this build; I can guarantee the old PSU needed that extra booster.
>Casually runs Crysis good "Oh I guess this is fine"
Dude, you have absolutely no idea lol. That's what this computer was MADE for... also, for the actual Crysis test you have to get to the sunrise on the hill at the start of the first real level.
@@rushnerd was about to say this hardware is kinda crazy to run crysis
I got a bit too hyped for just seeing the icon there, i just love the franchise
Guy says: "Q9300...That Sounds Good?"
Sir... I'm wounded. That was the budget i3/i5 equivalent with a big delta of OC headroom (yeah, before your time, overclocking was worth it back then), but some minor heat/TDP to contend with at 1.31V and around ~450 FSB x 7.5 clock mult.
It was a budget proc for a MUCH different period in PC hardware releases (we're talking releases every six months at this time). Think 2008: Bioshock, COD4 and Crysis.
This whole system was a budget-proc, higher-end SLI GPU build, so around ~$1k for this. They went HAM. Kudos to the OG owner. You succeeded with Crysis.
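For anyone decoding the "~450 FSB x7.5" shorthand: Core 2 era clock speed was simply front-side bus times multiplier, and the Q9300 shipped with a 333MHz FSB and a locked 7.5x multiplier, so the overclock the commenter describes works out like this:

```python
# Core 2 era: core clock = FSB (MHz) x multiplier.
fsb_stock, fsb_oc, mult = 333, 450, 7.5  # Q9300 stock FSB, commenter's OC

print(f"stock: {fsb_stock * mult / 1000:.2f} GHz")  # 2.50 GHz
print(f"OC:    {fsb_oc * mult / 1000:.2f} GHz")     # ~3.38 GHz, a ~35% bump
```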
👍💯💯
I had a way better PC when Crysis launched and it still only became playable 3 years later on another PC, it was crazy
That Technology Connections reference was TOP TIER! First video I've seen of yours and it's an absolute gem!
I am literally using this case with all modern components to watch this video on. Bought it brand new back in 2006 or whenever it came out for $300, but I had a kid soon after and couldn't build in it. Ended up giving it to a buddy as a birthday gift a couple of years later, and then he gave it back to me in 2020. Last year I cleaned it up, put a new coat of paint on it, and built a new PC in it. My goal was to do SLI when I first got the case, but when the time came to build in it, SLI had been dead for some time. I still like the look of the case and the RGB fans light it up nice! It's one of those cases, rare even for the time, that I will probably never get rid of. A relic of the past!
So where are you from, sir????? And why haven't you changed this cabinet😂😂😂😂😂
Me 2 :) my case is quite modded now with a custom water cooling loop, but the PSU I got with it in 2008 still works perfectly
I want to see this
post an imgur link of it
i have one too, it's awesome! they don't make anything like it anymore. it's built solid!
-Buys SLI PC
-Doesn't do a single thing with SLI
yep thats about right
not installing SLI drivers on an OS where SLI drivers didn't exist anyway to play games that didn't support SLI
But we're not here for results, we're here for the journey, the personality and the jank.
@@rossclutterbuck1060 The sli profiles are in the base nvidia driver, windows 7 supports sli, and most of the games he tested had sli support.
the true SLi experience
It could've been better: he could have installed SLI drivers and applications, yet the games would still have performed better with a single card. The way it was meant to be played.
Ahem, akctually, SLI drivers are baked into the base driver, Win7 supports SLI, and most of the games he tested support or supported SLI. You have to enable it though, and ~0.3GB of VRAM wasn't going to cut it anyway. It would actually be a pretty sick old rig with a Kepler II card or something.
Edit: TWOOO Kepler II cards
Thank you bringus very studio
Lol
true and real
wow
What are we? Some kind of Bringus Studio?
As an "old" at 40... I enjoyed the hell out of this video. Infinite power! :)
Flipping the internal chassis around has a purpose! Around late 2004 Intel introduced what was supposed to be a successor to the ATX form factor, aptly called BTX. In terms of layout, it was essentially a "vertically flipped ATX" and, as such, it was incompatible with previous cases as some elements (like connector and I/O shield placement) wouldn't line up correctly. At the end of the day, BTX never really took off and that's why we're still using ATX to this day. This case, however, was released in a time where both were present on the market, and flipping the internal chassis makes it compatible with both form factors!
EDIT: There are little tabs on the I/O shield and expansion slot plates; after flipping the internal chassis you have to swap those two around as well, and then you're done!
Dell used BTX motherboards in the optiplex computers for a while.
We still have a Compaq PC at work with a BTX motherboard. It's running Windows 10 on 2GB of RAM. The thing is still being used for remote desktop.
@@Compact-Disc_700mb And the Precision workstations for a while in the mid/late 2000s. Got a T7500 and the motherboard is HUGE and inverted.
@@ThisisForTheTV Cool, I did not know the Precision line also had BTX boards. It does make sense; I guess Dell was all-in on BTX boards for a while. Also, nice profile pic!
I feel like I've discovered the remains of a failed civilization
EVGA does not in fact still make motherboards; they laid off that entire department, and what you're looking at is leftover stock. Their LGA 1700 mobos never received BIOS updates for 14th-gen CPUs. That company is on its deathbed; the only thing they still sell consistently is PSUs, and those have skyrocketed in price despite the warranties being cut from 10 years to 3 years.
I never forgave Nvidia for what they did to EVGA. Such a good company that cared about consumers, put to death by a giant in the industry. Wish EVGA would partner up with AMD or Intel and start making GPUs again. It would be so cool to see them back at the top; not many companies fought for consumers like EVGA did, and it killed them.
@Beezkneez187 EVGA left of their own accord. Nvidia just wanted them to not extensively alter the cards; they didn't agree, and left.
@@yasho9008 painting nvidia as just a reasonable, innocent party in the situation is definitely a choice. It's a complex web of corpo bs and personal feelings that were involved over the course of decades that made EVGA drop out the market and very clearly not just that
@@yasho9008 What about AMD? (Radeon)
@@yasho9008 Nvidia drove the prices up ridiculously high for AIB partners making their margins even thinner than they already were. EVGA said enough is enough and told Nvidia that they're done.
we love a Druaga1 reference in our bringle man video
i was looking for this comment
it warms my heart to hear it
rushed to the comments to see if anyone else caught the hey smokers lol
I miss that man's content.
😎 hey smokers
5:03 The 7 Patapon fans are gonna love this one
Literally was looking for this comment lol
The era of Phenom II, Core 2 Quad, the Nvidia 8000 series, and the AMD 4000 series was a fantastic time. After all the updates to the Source games, socket 775 and AM3 systems really do drag.
Good times.
AM3 is fine with DDR3 and modern video cards with over 1 GB of VRAM. The biggest thing was probably the iffy RAID 0 config, and the 8800 GTS being limited to 320 MB of VRAM. There were the G92 8800s with 512 MB - 1 GB of VRAM; those would have fared better too.
@@clack1 What I'm talking about is how CPU-intensive Steam became. No, those systems are not fast enough; from personal experience, Half-Life 2 Deathmatch will stutter on a slow PC like that, DDR3 or not, and it's not HL2.exe that's doing it, it's Steam, steamwebhelper, and so on and so forth. That's why about 6 years ago I moved from Phenom II to a used socket 1155 platform, and that squashed the issue. I have an R7 5700X now, even better. I don't care how fast your Phenom II or Core 2 Quad is; even compared to socket 1155, it sucks.
For I handed on to you as of first importance what I also received: that Christ died for our sins in accordance with the scriptures; that he was buried; that he was raised on the third day in accordance with the scriptures
1 Corinthians 15:3
6:44 There’s no way he got me with the wireless charger in the bios. I thought I was gonna be dumbfounded by some weird bios shenanigans
That was a smooth segue
@@joshuamusser8893And a good segment, ugreen just planned a well timed assault on my wallet 😓
Oh my god this PC is a blast from the past. Custom gaming pcs have come a LONG way in the last ~18 years.
I built a $3k PC with this (non-nVidia variant) case in 2007. I still have the case in my basement. This is still one of the most extraordinary cases ever made, and the removable motherboard tray is epic. Classic.
Having a Bulldozer/Piledriver architecture CPU in your first gaming computer really builds character in a person.
That FX 6300 really carried me for God knows how many years. lol
damn I feel called out for fx 8300
Still running a AMD FX 8120 with dual Radeon HD 6950
The reliability issues with the 8800 GTS cards come from them being notorious for cooking themselves; Nvidia ended up in court with computer manufacturers like Apple over the GPUs of that era.
You can temporarily make them better with the oven trick: balancing the board on four foil balls and baking it in the oven, hoping the solder reflows itself. But it's only temporary, and it gives off some dangerous chemicals that aren't good for your oven.
Yes, I had two of them installed on my rig and both went bad within a year.
I have 3 8800gtx that still work excellently
sorry, what, the OVEN??
@@SunroseStudios yup age old trick
@@SunroseStudios The "oven trick" is equivalent to the tide pod challenge, we had idiots back then too.
The cause of the failure of the 8xxx/9xxx series GPUs was bad ROHS solder that was too brittle. The heat from the GPU ASIC would cook the BGA connecting the ASIC to the PCB and cause the individual connections to crack and fail. The proper fix is to get the GPU reballed with leaded solder, and if you can't do that, keep the card on all the time. The card being at a consistent temperature would significantly prolong the life of the card. I had numerous 8xxx and 9xxx Nvidia cards, and the ones I still have today still work because I always kept them running. The only ones that died were the mobile variants, which is explained below.
All that frying the card in the oven did was cause thermal expansion of the broken BGA joints and reform a brittle weak connection that would fail again in short order.
The lawsuit the other person was talking about was less related to this problem and more related to a different problem with the mobile variants. The bond wires and chip carrier connected to the ASIC die had a manufacturing defect that would cause the connections to break. Either from the bond wires to the chip carrier or the chip carrier to the substrate package where the BGA pads were. This problem was terminal and fatal, it could not be fixed with a reball since the problem is inside the physical package.
Going back to the ROHS solder issue, this plagued the entire industry for the better part of a decade, and is still a problem today. No other metal has the ductility of lead, and early ROHS solders used between 2005-2009ish were horrible in that regard. There's a reason why hardware from that era is scarce today, because so much of it died from either that, or the capacitor plague.
Technology Connections, the greatest technician that's ever lived, Louis Rossmann, DankPods. Never did I ever think I'd hear them all mentioned on a single channel, yet here I am, dreaming of the ultimate collab.
I'm a retired custom PC builder, and back in the day that machine ran around 3500 dollars; those 8800s were about 500 bucks a pop back then, and so on. Good score.
The average resolution back then was ~1600x900; 1080p was on the EXTREMELY HIGH END at that point.
1366×768 was also pretty common.
He was using 8x MSAA lol, this is why the source games were really struggling.
1280x1024 was the high-end resolution at the time. The 5:4 oddball of the 4:3 world.
WHY IS MY 2022 LAPTOP 1366X768 HELP MEEE HP SUCKS
@@oneblacksun this is revisionist as hell; 1600x900 was average and 1080p wasn't out of reach, and 30-60fps was considered acceptable/normal on PC
@@Sniperpro04059 mine too
I just have to comment: this gaming rig is literally top-of-the-line stuff, lol. Holy eff, I would have been incredibly, INCREDIBLY young, but dual 8800 GTS's and a Q9300, holy eff. That's a monster unit. Others already mentioned how 8x MSAA at the resolution you were running was like trying to do 4K/8K today, lol, quite unreasonable.
It totally is, but the DDR2 and low-memory cards are a huge shame. In 2008 I was the insane guy with DDR3 and a GTX 280. I would have felt better about it if BFG (who I bought it from) hadn't gone out of business right away; shortly after, DX11 arrived, and it's the standard to this day. Kids have no idea how crazy PC gaming was back then, NOTHING lasted very long.
@@rushnerd My dad used to say that, as a kid, I could never understand a world where the threat of war with Russia was so high. I'm sure that, like that, this trend will also swing back quickly enough.
@@rushnerd Nowadays nothing really lasts that long either. That aside, there have always been quite a few games that ran on everything; take GTA San Andreas for example, you could easily play that on an office PC bought around the time of its release, and you'd also be able to play quite a few games years after. It's usually a select few that don't really run.
Hell, my uncle's 10-year-old office PC somehow still manages to run F1 2019, but it stutters when using any browser with more than 2 tabs open.
@@rushnerd Damn, I didn't know BFG was out of business.
@@LagrangePoint0 Lol yeah, they closed up in 2009. They were BAR NONE the best in the GPU industry because of that lifetime warranty and they made the best cards.
Was an easy pick for me, but yeah I got washed both on DX10 and their card.
Card was SUPER complex to take apart too, but the cooler (blower style) had major issues about five years in; I replaced it with an aftermarket 3-fan cooler and it was even better after that. Now that EVGA is out of the game too, there is absolutely no top brand anymore that puts customers first, and it's very sad.
18:53
Optical Disc Drive: "You couldn't live with your own failure. Where did that bring you? Back to me."
Optical media good actually. 😄
Oh yeah, it's crazy fun hooking up an optical drive to a modern PC and ripping a whole bunch of DVDs and Blu-rays to build your own personal movie server. Funny to think that modern AAA titles with their 100GB install sizes would take 3-5 Blu-ray discs to install though.
@@Dee_Just_Dee You wouldn't necessarily need multiple discs; there are actually multi-layer Blu-ray discs, much like dual-layer DVDs. They can hold upwards of 100GB on one disc
first video watched and a lot of good references, nice work bringus stu
0:11 oh my god is that a druaga1 reference???
HEY SMOKERS!!!
HEY SMOKERS
I LOVE DRUAGA1
His stuff was awesome!
Man, I miss his content 😪
@@WantBadtime we sure all do miss his content. I hope he's doing well
You fill the void of Druaga1. I'm so happy you exist!
Definitely
26:12 I think the 8x MSAA is eating up all of the VRAM
Also, he didn't realize he had Borderlands 2 still open until the TF2 test, so that might have been affecting it as well
1:15 that's actually how cable management in prebuilt computers used to look; there was none. XD
15:33 create a SoundCloud bro
fr
The Shrek PC has competition
1920x1080 back then was like 4k nowadays.
I remember upgrading from 1366x768 and it was game changing
My eyes are so used to 1366x768 that everything above it looks really weird smh.
Probably more like 1920x1200
Tbh I just upgraded to 1080p this year and I'm still recovering from the pixel overdose
1680x1050 was the most common widescreen resolution. For 24" monitors 1920x1200 was common and 2560x1600 for 30" monitors.
I went straight from a 34 inch JVC I'art HD CRT to a 65 inch 4K samsung QLED TV in my living room in like 2020 💀 felt like my brain was gonna explode
No one has mentioned how much of a banger the build montage part was. 10/10, made me laugh hard
Holy mother of God, I remember hearing and reading about rigs like this back in the day.
People were upgrading their PCs to configurations such as this to go play Crysis when it was released, and quickly found out that the Core2Duo they upgraded from was quicker than the Core2Quad they got thanks to the game being so single-threaded. But the SLI, man the SLI, that worked wonders. The stuff of legend.
I had an SLi system back in the day, it never worked how it was supposed to work, weird framedrops, crap frametimes and so many crashes.
@@petratoth811 skill issue
@@SMlFFY85 no, Nvidia's SLI is why I swapped to ATI for a long time. I loved the SLI on the Voodoo2: 3dfx had the two cards communicate directly, while Nvidia routed the communication through PCI, which relied on the CPU bus speed while it acted as an ambassador between the two cards, and that slowed the whole process down. 3dfx's version of SLI could effectively double pixel generation, which you could spend on twice the fps, twice the resolution, or a mix of both. The CPU in this build would actually have been fine except for textures, since it came out a couple of years after the cards; but with a CPU available at the cards' release, it would have dragged all the clocks down to the bus speed before combining them and putting them back out again. Simple math: the E6600 has a 1066 bus speed; the G80 has a 576 GPU clock, a 1350 shader clock and a 900 memory clock. The way Nvidia did it, each card would be capped at half the bus speed, taking the max GPU clock from 576 down to 533, limiting the shader clock per card from 1350 down to 1066, and the memory speed down to 533. Combined after the fact, that would max the clocks at 1066 GPU (instead of 1152), 2132 shader (instead of 2700) and 1066 memory (instead of 1800). It's why SLI SUCKED at textures: two cards barely put out 10% more memory throughput than a single card. For a long time, chip manufacturers' bus speeds lagged behind doubling the graphics cards' clocks. Nvidia-style SLI would probably do better now that that has lined up more: the 14900KF has a 6GHz clock and the RTX 4090 doesn't come near half that with any of its clocks, so two cards would probably work pretty well now.
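The arithmetic below only reproduces this commenter's own model of SLI clock throttling; it is their theory, not an authoritative description of how Nvidia SLI actually worked:

```python
# The commenter's model: each card's clocks get capped relative to the
# CPU bus (GPU/memory at half bus speed, shaders at full bus speed),
# then the two capped cards are summed.
bus = 1066                                         # E6600 FSB, per the comment
g80 = {"gpu": 576, "shader": 1350, "memory": 900}  # G80 clocks cited above
caps = {"gpu": bus // 2, "shader": bus, "memory": bus // 2}

for name, clock in g80.items():
    combined = 2 * min(clock, caps[name])
    print(f"{name}: two cards -> {combined} (vs {2 * clock} ideal)")
# gpu: 1066 vs 1152, shader: 2132 vs 2700, memory: 1066 vs 1800
```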
Hint for the fake flash drives: "SD Formatter V4" can fix the faked storage numbers on any USB device so it reports its true capacity and works again. Better than having it be landfill like it was destined to be.
PS: "SD Formatter V5" sucks and removed the "size adjustment" option; it has to be the older version.
Noice
No I want to use V5!!@!
15:40 A worthy song to wake up under every day.
Man, this is very similar to my old five-thousand-dollar build from back in the day! I used the nicer version of this case with like 10 huge fans, running all BFG Technologies gear, with the motherboard and dual graphics cards in SLI on a triple-monitor setup. I was running a Core 2 Extreme, Corsair Dominator RAM with cooling fans on the RAM, a Thermaltake modular power supply, and two WD Raptors in RAID! It was literally all the best I could find on Newegg at the time, and it worked great for 10+ years before the motherboard died; by then BFG Technologies was no longer a company and everything was dated, so I moved on. I still have it, and this video just motivated me to get a new motherboard and get it running again, so thank you!
That "mystery box" that connects the switch to the cold cathode is a DC/AC inverter. I had a lot of fun with those back in the day. That day being about 3 years ago, when I built the beefiest computer I could spec with parts from 2008.
15:00 this is fire
fr
This is a certified big rig. My mom's Compaq laptop is also how I played TF2 back in the day 😭
NOT MATORO IN THE JAR, HE IS OUR HERO :0
20:26 should we call the hospital 😂
yes
absolutely
Cold cathode lighting was so badass, it needs to make a comeback.
Edit: Did you even install SLI drivers? Seems like it was running on just one 8800 GTS.
I came here to say the exact same thing. He didn't install SLI.
Said similar before I saw your comment. Only Crysis and the updated Source engine even supported SLI at a game level, and SLI was dead by the time Windows 7 landed, so those weren't even SLI drivers.
I was just about to ask how do you know for sure both GPUs are getting used?
@@CheapSushi the information overlay will show usage for each GPU when more than one is active.
@@rossclutterbuck1060 What do you mean by SLI being dead by the time Windows 7 landed? SLI was definitely around in Win7 era and a lot of PC build guides from around 2010-2013 even recommended mid range cards in SLI for some reason
I was drooling as you were opening this case up. 2 x 8800 GTS, a Core 2 Quad, a removable mobo tray, and cold cathode tubes! This was a beast of a machine back in the day and what I was dreaming of as a teen.
the neon glow is so much better than LED imo
Technically, you can pair an 8800 GTS 320MB with an 8800 GTS 640MB, but the 640MB card will only use 320MB, since SLI mirrors the same data on both cards
I love how much irreparable harm was in this video, keep up the good work!
SO much of this takes me back. I remember Nvidia chipsets causing me no end of grief with Core 2 Quads in my video edit suites, and a pair of 320GB VelociRaptors in RAID 0 generated enough heat to need their own fan.
Pretty sure SLI was never running in any of those games, though. Only Crysis supported it out of the box, and it wasn't until the Lost Coast tech demo that Source got SLI support (although the bloom and lighting options in Half-Life 2 indicated it was that later Source revision). Pretty sure SLI was already dead by the time Windows 7 landed too, so the drivers themselves had no SLI support.
Still, green cathodes FTW.
SLI wasn't dead on Windows 7, no idea why you would think that. I played Skyrim in SLI, and with Nvidia 3D Vision even.
I never understood why that was the only IDE 10K RPM drive, or why it had such small platters. The drive enclosure was the 3.5" form factor but the platters were 2.5".
There were so many more SCSI drives that spun at 10K and 15K RPM back in the day, although I'm not sure what their platter size was.
@@pontiacg445 because Windows 7 never got SLI certified drivers. If you were running SLI on Windows 7 then you were likely running Vista drivers.
I ran 8800 GTS 512's and later 560Ti's in SLI on Windows 7 using up to date drivers right up until upgrading to Windows 10.
Love the RCR references and homage. 😂🔥🙏
As a guy who has his main build inside one of those ancient and massive Coolermaster cases, I'd absolutely kill to own one of these cases.
THE MUSIC SLAPS 🔥🔥🔥🔥🔥
5:28 you forgot to plug the power cable into the GPU
was about to say the same thing
These nForce boards are notorious for dying all the time... the red Ballistix were quite expensive, too! This was extremely expensive back then. Basically the Q9300 came very, very late in Socket 775's lifecycle. Very cool find!
I used to own all sorts of full-size towers I found in the garbage; they were easy to work with, but a bit heavy. I loved using cold cathode tubes, they looked so cool, especially in the Alienware cases.
They do make an "emulation" of that cold cathode light with EL wire (electroluminescent). You could glue it into a resin block, run it through an acrylic tube, or even mould it into resin. The only problem is it also needs its own power supply/regulator, and it usually puts off a high-frequency whine; that can be mitigated, but it's not good for cooling.
5:02 A PATAPON REFERENCE??!! IN THIS DAY AND AGE??!!! BRINGUS???
Patapon mentioned aaaaaa
Good old PSP days 😭
That build montage was peak honestly, love the lyrics!
Bringle my Dingle
(Yeah I deserve the Skulls)
💀
*💀*
💀
💀
💀
21:44 WinBros seething
26:35 - I was not expecting the Technology Connections reference 🤣
*a little bit of criticism*: This song was jamming! Nice! 🙂
I'm not even going to lie, I want that case and the toobs. The green glow would look awesome in my dank dark gamer tech closet dungeon.
23:20 is such a cursed thing to come across out of context
He’s trying to escape containment
Plap plap plap plap get pregnant!
LMFAO
the sound effects when he's taking the things apart are epic
14:42 I absolutely love the musical number here
14:40 this is definitely going in my "songs to game ferociously to" playlist
Better than KSI
before I die I’m trying SLI
Green neons are lit let’s do this shit
same
yay a video at 12am!
That would make for a sick retro-gaming PC. Somebody just threw it out too. Wow. - On that power booster PSU... I wonder if they were overclocking and thought the booster's power was "cleaner" for better GPU power? I guess when the CPU draws more power it could cause ripple for the GPUs?
Editing on point
you should call this the “le nvidiot™️”
Batarong verified
his boy
Kk, 😂😂😂😂😂😂😂😂😂😂😂🎉🎉🎉🎉🎉😅😅❤❤😅 (idk i just spammed emojis lol)
Batarong certified
Eprom
jahoo
Had the Q6600 years ago in my first gaming rig. The difference from the Q6 series to the Q9 series (barely a year apart) was so negligible. Those Q6600s were BEASTS for their time.
The main difference was the Q6600 SLACR would go from 2.4GHz to 3.2GHz and (provided you had the cooling) wouldn't break a sweat. Bonkers CPUs for the day.
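For anyone wondering how that jump worked: these chips had a locked 9x multiplier, so the overclock came from raising the front-side bus. A rough Python sketch of the math (the stock FSB is from the spec sheet; 356 MHz is just an example target):

multiplier = 9          # Q6600's locked multiplier
stock_fsb = 266         # MHz (quad-pumped, marketed as 1066)
oc_fsb = 356            # MHz, example overclocked bus

print(multiplier * stock_fsb)   # 2394 MHz, i.e. the stock 2.4GHz
print(multiplier * oc_fsb)      # 3204 MHz, i.e. the 3.2GHz mentioned above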
Yes!!! My first real gaming rig was a Q6600 SLACR with a DFI LANParty board and a custom CPU cooling loop complete with UV dye, built new in 2007. Ran it at 3.4GHz by bumping the vcore. I ran that setup for 8 years before building my current rig in 2015 (got out of gaming a while ago). Those were the days lol.
From 2015 on there was a clear difference between the Q6x00 and Q9x00 series: the Q9x00s added SSE4.1 support, which was required by many games in the mid-to-late 2010s.
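Easy to check which side of that line a chip falls on. A minimal Python sketch for Linux on x86 (the flag string comes straight from /proc/cpuinfo):

# Check whether the CPU advertises SSE4.1 (Linux x86 only).
with open("/proc/cpuinfo") as f:
    flags = next(line for line in f if line.startswith("flags")).split()
print("SSE4.1 supported" if "sse4_1" in flags else "no SSE4.1 (Q6600-era chip)")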
I have a Q9550 @ 3.4GHz stock voltage in one of my systems. It had a Q6600 SLACR before but it didn't want to go above ~3.1GHz, probably could have if I had given it higher voltage and had something better than the stock cooler.
@@Aguy6714 Similar setup, except I had an XFX nForce 680i SLI mobo and the Cooler Master V8 air cooler. Originally I had a GTX 280 in it, but I remember keeping that PC up all the way through the GTX 580 before upgrading. Wish I'd kept that PC.
This is the best PC video I've seen in my life!
YOU KNOW WE GAMING WIT DIS ONE
bro made a Druaga1 reference and now i miss him =(
Same. Funny lil guy.. :(
4:54 I legitimately struggle to find modern LED RAM that can do THIS specific effect. It reminds me of background technical readouts from '70s and '80s sci-fi, and it's an effect that's awesome in a case.
2001: A Space Odyssey
Ngl, loved watching every second of this video, because it really looked like you were an archeologist trying to understand some ancient technology...
Ik I'm older than the PC, but ngl I only know about the parts that came after my 2014 PC was built, bc I always wanted to upgrade it but couldn't till I got a job recently. When I got the old PC back then, it was a gift from my father for the good grades I had 😊
I built this exact full EVGA system back in the day, including dual video cards in SLI, the 2nd power supply booster, and the cold cathode lighting. I had around $2,000 in it. I built it to power a set of shutter glasses that turned the internet, movies, and games into a 3D viewing experience. I also had the biggest computer monitor you could buy at the time: a 19" CRT! It was huge! And to this day I'm still running a huge computer case, the NZXT Phantom in white.
Maybe this one in the video is yours
At the time this hardware and case were on the market, you could get a 24" LCD screen (which is what I did when I built mine); it's just that a 24" LCD back then cost around a grand...
Pretty sure the "Drive Signature Checks" that windows does to verify drivers is what was preventing the display from working on windows 10/11. Since that gear is old, you notice how it worked on windows 7? We didn't check driver signatures back then. Next time, disable the driver enforcement checks during a boot and it might work.
Windows Vista did...
In a VMware VM, I was trying to install Chrome for Vista, and it said it wasn't safe.
@@Sniperpro04059 Except if you used Vista x86… in the early days x64 was experimental
2:20 No no, server racks generally suck too.
Especially 1U ones. So much tech crammed into so little space... and you need to connect all of it somehow.
14:38 Would you let me put this masterpiece on Spotify??
The fact that you did a Druaga1 reference makes this video an ultimate win in my book.
"So Nvidia **** you" - Linus Torvalds(Creator of Linux)
Torvalds, not techtips, in case people get confused
oh man linus tech tips are gonna pay for this
@@sklungofunk This was another Linus speaking. He created Linux
@@saveliyivanov9943 oh man Linus Sebastian is gonna pay for this
6:44 cleverest sponsor I've ever seen
Not really; if you'd looked at the BIOS menu before that, it would have been obvious it was a sponsor
@@harrisonk8561 ah, you're real fun
@@harrisonk8561 true
I love this guy's humor 😂
And that RAM is insane
Next time you've gotta try these games at a resolution appropriate for the time and see how well they run; I imagine 1024x768 at the highest settings would run fine in these games
"Nvidia, fuck you."
- The Linux guy.
8:25 literal RoboCop footstep sound
Your humor alone got me to subscribe on my first video