The Worst CPU AMD ever made... - Quad FX Upgrade Path Retrospective

  • Published: 14 Jun 2024
  • Get a 15-day free trial for unlimited backup at www.backblaze.com/landing/pod...
    Check out MSI’s epic Summer Sale today using the link below!
    msi.page/3qUXYY9
    AMD and Intel have been competing for years and they've both had their ups and downs. Back in 2006, AMD may as well have been at the bottom of the ocean with how down they were when Quad FX launched. How bad is it now?
    Discuss on the forum: linustechtips.com/topic/15198...
    Purchases made through some store links may provide some compensation to Linus Media Group.
    ► GET MERCH: lttstore.com
    ► LTX 2023 TICKETS AVAILABLE NOW: lmg.gg/ltx23
    ► GET EXCLUSIVE CONTENT ON FLOATPLANE: lmg.gg/lttfloatplane
    ► SPONSORS, AFFILIATES, AND PARTNERS: lmg.gg/partners
    ► EQUIPMENT WE USE TO FILM LTT: lmg.gg/LTTEquipment
    ► OUR WAN PODCAST GEAR: lmg.gg/wanset
    FOLLOW US
    ---------------------------------------------------
    Twitter: / linustech
    Facebook: / linustech
    Instagram: / linustech
    TikTok: / linustech
    Twitch: / linustech
    MUSIC CREDIT
    ---------------------------------------------------
    Intro: Laszlo - Supernova
    Video Link: • [Electro] - Laszlo - S...
    iTunes Download Link: itunes.apple.com/us/album/sup...
    Artist Link: / laszlomusic
    Outro: Approaching Nirvana - Sugar High
    Video Link: • Sugar High - Approachi...
    Listen on Spotify: spoti.fi/UxWkUw
    Artist Link: / approachingnirvana
    Intro animation by MBarek Abdelwassaa / mbarek_abdel
    Monitor And Keyboard by vadimmihalkevich / CC BY 4.0 geni.us/PgGWp
    Mechanical RGB Keyboard by BigBrotherECE / CC BY 4.0 geni.us/mj6pHk4
    Mouse Gamer free Model By Oscar Creativo / CC BY 4.0 geni.us/Ps3XfE
    CHAPTERS
    ---------------------------------------------------
    0:00 Intro
    1:24 AMD in 2006
    2:55 "Quad Core"
    4:05 4X4
    4:55 Quad FX Upgrade Path
    8:45 Fire it up
    9:50 YouTube Playback
    10:45 Gaming
    12:50 TF2
    13:45 Mars GTX 760
    15:45 RTX 3060
    16:22 Conclusion
    18:34 Outro
  • Science

Comments • 1.7K

  • @teflonstestbench · 11 months ago +3032

    As the viewer who lent you that board: it was an amazing video, but there are a couple of remarks I'd like to make about the setup:
    1. Windows 10 works on the board. I was able to install and use it with no problems, without changing anything in the BIOS.
    2. The cards each have two GPU cores and an SLI bridge chip on board, so a card behaves like any other GPU; the two GPUs are joined on the card itself and work in non-SLI setups. You only need SLI if you want to run two of those dual GTX 760 cards.
    3. The green flickering happens because of the output GPU, as you said; just plug into another output and it will work fine. Same with SLI performance and stability: it's not easy to make it work, but it does. I actually included some information about that with the setup.

    • @MuitoDaora · 11 months ago +487

      But who is gonna take time to read your instructions, am I right? /s

    • @Oscar4u69 · 11 months ago +97

      ​@@MuitoDaora
      NOT Linus of course

    • @gstargreen1 · 11 months ago +36

      Brilliant, man... thanks for keeping, tinkering with, and sending this board in. Such interesting content.

    • @ampex189 · 11 months ago +19

      Thanks for sharing this board with us!

    • @HardWhereHero · 11 months ago +18

      I am glad you had one to loan them. I buried all my FX stuff in the back yard.

  • @ipalygames6191 · 11 months ago +351

    My father was an electrical engineer at AMD at that time and he agrees 110% with you; he says it wasn't the product he was most proud of... yeah, he helped design and make it, and he thinks it's garbage...

    • @cal2127 · 11 months ago +20

      Man, as an AMD fanboy I'd love to pick his brain about so much stuff

    • @Freestyle80 · 11 months ago

      my auntie is Lisa Su and said you are saying bullsh*t to farm likes

    • @TheBloodypete · 11 months ago +5

      I'm guessing he was actually an electronics engineer? Electrical is the wires-in-your-house scale of stuff...

    • @guccihorsepiss2406 · 11 months ago +1

      @@TheBloodypete and magnetism or three phase induction

    • @username8644 · 11 months ago +22

      @@TheBloodypete Wrong. An electrical engineer is exactly the type of person who would work on and design processors. You're thinking of electricians.

  • @K31TH3R · 11 months ago +126

    One of my old coworkers bought this platform (FX-72) in 2007 while I was piecing together my Q6600 system. I think he ended up spending almost double what I did, and with AMD being quite competitive in those times, we both had very high expectations and were looking forward to comparing performance. I ended up having good luck with silicon lottery and my Q6600 could run 3.9GHz stable, and I had it paired with some fast Micron D9 DDR2. I cannot overstate just how much faster my Q6600 build was versus his FX-72 build, I actually felt bad for him, because it was a slaughter.

  • @honorablejay · 11 months ago +133

    It would be interesting to see a comparison video showing both AMD's and Intel's failures over the years and how they were able to improve over time. Heck, go all the way back to the 90's when the AMD K6 was introduced as a starting point. Now that's nostalgia.

    • @CommodoreFan64 · 11 months ago +12

      I would not call Super Socket 7 and the AMD K6/II/III lineup a failure; it was cheaper for consumers, and the IPC was just as good as, if not better than, Intel's. I had a K6-II 550 MHz machine I built with money from my first after-school job, and it played every game of the era I threw at it on Win 98 SE and DOS.

    • @honorablejay · 11 months ago +6

      @@CommodoreFan64 I think you misunderstood what I meant. Back in the K6 days, AMD was an up and comer that really pushed Intel hard. However, over the years since then, both companies started making mistakes and had to recover from them. That's the kind of video I'm looking for. Go back to the golden era where they were both squaring off, show their mistakes, and bring everything back to the present where they're now back in contention with each other.

    • @phyde1885 · 11 months ago +2

      @@CommodoreFan64 I had ALL those boards, DAMN near bulletproof! Those were fun boards to play with, the old DOS days too! DLL HELL!! The old times, 😎

    • @xtreamerpt7775 · 7 months ago +2

      @@honorablejay Not an up-and-comer; AMD has been around since the 286 clone, I think the early '80s

    • @EternalxFrost · 6 months ago +2

      @@xtreamerpt7775 Correct. Even before the 286.
      Speaking of that, if you look closely at an AMD 286, it's copyrighted by Intel at the bottom of the chip. Most of them came out in 1982. They also reverse-engineered the Intel 8080 in 1975, without a licence, from a simple picture of the core die, and codenamed it the Am9080. And they were around even before that time (they were founded in 1969, actually). After that, Intel gave AMD a licence to produce CPUs for them as a second source, which lasted until 1984-1985-ish, when AMD started making their own CPUs, like the Am386DX and DXL-40 models, for example.
      But ALL Am286 CPUs were produced under licence and copyrighted by Intel.

  • @rysterstech · 11 months ago +969

    For those who are confused: this is not the FX line of CPUs that launched in 2011 on the Bulldozer architecture. Those were completely different and shared a floating-point unit between every two cores, which later led to a false-advertising lawsuit because they really weren't true eight-core chips, and their performance reflected that

    • @CanIHasThisName · 11 months ago +63

      Yup, people often forget this one. In plain words, the issue was that you had essentially pairs of cores. Each pair shared some resources, and when you were utilizing the second core in a pair, single-core performance of that pair went down significantly. And it needs to be said that even the full single-core performance wasn't great compared to Intel, and at the time there were still a lot of games which actually only ran on a single core.
      I also feel like people often overstate the "budgetness" of these CPUs. They were cheapER than Intel's but also generally had more expensive motherboards if you wanted feature parity. The i5 destroyed the 8-Core on single-core performance, while the i3, although only dual core, had hyperthreading, so it was often a better option than the quad cores from AMD which, thanks to their design, were acceptable dual cores or weak quad cores (depending on the load).

    • @David_Quinn_Photography · 11 months ago +2

      Yep, this wasn't the 2010s FX

    • @ALPHABYTE64 · 11 months ago +12

      But Bulldozer was a true 8-core, just 8 weak cores. Look at the single-core and multi-core scores.

    • @ALPHABYTE64 · 11 months ago +2

      Also:
      not to be confused with AMD FX, and not to be confused with Intel Core 2 Quad

    • @dnakatomiuk · 11 months ago +2

      Wasn't Intel's first attempt at a quad-core CPU also similar, basically two dual cores bolted together?

  • @AlexBoneChannel · 11 months ago +337

    The power consumption of Quad FX is brutal

    • @Q-nt-Tf · 11 months ago +25

      we threw out about 7000 FX chips when they came out. that's how trash they were.

    • @kimnice · 11 months ago +20

      Imagine having Quad FX with a Radeon R9 295X2... Crossfire... during the day?

    • @PQED · 11 months ago +11

      @@kimnice I reacted to this bit myself. Why did they keep mentioning Nvidia cards on an AMD system? The obvious thing, I would have thought, would be to test it with Crossfire.
      Using a couple of 4870X2s would've made more sense.
      This entire video was a waste of what is a piece of rare computing history.

    • @I_enjoy_some_things · 11 months ago +5

      @@Q-nt-Tf You threw away seven-thousand CPUs?

    • @CommodoreFan64 · 11 months ago +1

      @@I_enjoy_some_things Yeah that don't sound sus 😂

  • @aijena · 11 months ago +48

    16:05 The reason the already-installed Nvidia drivers for the GTX 760 didn't work with the RTX 3060 is because, at least for Win7, driver support is divided between the GTX 10 series (and older) and the RTX 20 series (and newer).
    So it is correct that the 3060 needs different drivers.

    • @arnox4554 · 11 months ago +1

      Thanks for explaining this! I had first thought Nvidia just let their drivers for Windows 7 take a shit and got super mad.

    • @aijena · 11 months ago +2

      @@arnox4554 They actually did drop active support a while ago; the latest version for Win7 is 474, and going forward they will only provide security updates, no more regular driver releases.

    • @Fay7666 · 11 months ago

      They also do similar stuff to this for Quadro and GTX. I tried plugging a 1080 into an M5000 laptop and the whole install got borked because of driver conflicts and neither card worked.
      Similar story with a Pro W5700 and a Vega M GL laptop, the Pro drivers installed on top of the normal ones, and so the Vega didn't work until I DDU'd.

    • @arnox4554 · 11 months ago

      @@aijena Oh, that's fine. I meant outright shipping broken drivers for Windows 7.

    • @aijena · 11 months ago

      @@arnox4554 Ah, got it. Yeah, that's not what is happening here :P
      Though some driver versions may still run worse or less stably with some specific games, just like it is with all drivers on all platforms.

  • @Kitek94 · 11 months ago +304

    I super appreciate those seizure warnings. Luckily I never got one before, but I do get some pretty bad headaches from rapid flashing lights etc. Thanks, editing team!

    • @emma70707 · 11 months ago +20

      Agreed! More accessible videos are always better.

    • @collinblack9605 · 11 months ago +13

      Um, they chime as if it's over, and it continues until ANOTHER chime

    • @FluorescentGreen5 · 11 months ago +5

      @@collinblack9605 that's probably the skype call sound you're thinking of

    • @dustinolsen4994 · 11 months ago +2

      NVLink replaced SLI and is only in servers

  • @mraso30 · 11 months ago +31

    I remember "upgrading" from a Core 2 Duo E6600 system to an FX 8350 system, and if it weren't for the actual upgrade from an 8800 GTS 320MB to a GTX 760, I honestly felt like my PC barely got any better, certainly not to an extent that I was expecting going from 2 cores to 8 cores. I recall CS:GO struggling hard, meanwhile watching older Intel i5s that ran it better than my system with the same GPU... Then I switched to an i7-4790k with the same GPU and RAM (I think? pretty sure they were both DDR3) and man, CS:GO FPS tripled and other games easily saw a 25 - 50% boost. That CPU was SO BAD. I gave it to my step dad, he had a "better" use-case for it, with video rendering/editing, rather than gaming, but he still complained of system instability and it feeling slow, even when we got him an SSD to install Windows and programs onto.
    It is hard to find game benchmarks comparing an e6600 to an fx 8350, but one I found did Valorant on a 750 Ti for both... the 8350 got 35 FPS @ 1080p.. the e6600 for six years prior with 6 fewer cores? It got 23 FPS. Obviously both are brutal, the 750 Ti isn't even all that good in comparison to the 760 I wound up with, but still.. I expected so much more after waiting so long to upgrade, and its performance in Windows and programs was just soooo bad and sluggish, especially compared with the 4790k. Night and day differences.
    I am back to AMD now though. Went from a 4790k to a R5 3600X, then I went to a 5600X and now a 5800X3D all on the same X570 motherboard. That i7 was great though. I had it paired with a 760, a 980, a 1080, and a 1080 Ti. Now running 3070 Ti with my 5800X3D, never been happier with my PC, I run 1080p high refresh rate, and it feels like this PC could last me another 3 - 4 years at least honestly. Not worried about the 8GB VRAM, it hasn't stopped me from running any games at 1080p with RT yet, and DLSS means no worries honestly as I've only had to use that in a couple games at Quality setting so far, so going to Balanced/Performance is still an option down the line if needed, or turn off RT in some titles, or turn down a couple settings I don't value much. The thing with my current set up that I like so much though is how flippin stable it is. I *never* blue screen, games never crash, everything just works perfectly. Haven't had this level of stability from a system probably ever. It has been 1.5 years since I reformatted/reinstalled windows even and it is still just smoooooth.

    • @solocamo3654 · 11 months ago +6

      I did the same, had a FX-8120, 8350 and then was gifted a 9590. Got sick of trying to keep the 9590 cool and got a 4790k that lasted me until 2021. Still is a decent gaming cpu even by today's standards but I moved on to a 10900 and then 11900k that I got for a great deal. (11900k + good z590 mobo for barely over $300 NEW)

    • @nater420 · 11 months ago +1

      I went from an (iirc) FX-4100, which was dogshit, not even actually 4 cores, and couldn't run jack shit, to an FX-8350 that was still dogshit. It overheated constantly even after a thermal paste replacement and a box fan blowing at it. Absolutely awful CPUs, and they put me off AMD CPUs for life

    • @Jtwizzle · 11 months ago

      @@nater420 You tried two CPUs from the same generation, had the same bad experience, and now you're put off AMD? Come on now. Ever since Ryzen, AMD has been very good.

    • @Jtwizzle · 11 months ago

      8GB of VRAM is probably still plenty for 1080p for several more years. Even at 2K res I don't have to turn textures down too often on my 3070.

    • @solocamo3654 · 11 months ago

      @@Jtwizzle It already isn't for quite a few titles, unless you're OK with dropping texture quality down a bit (which is one of the biggest visual impacts by far).

  • @Mother_Mercury · 11 months ago +111

    2006 was a time when it was better to have a fast dual core for gaming than a quad core. The problem at that time was that many games could only use a single thread. A frequently used gaming CPU at this time was a core2duo E6700, which was much faster for gaming than, for example, a core2quad Q6600. The AMD fanbase at the time was using the Athlon X2 6000+, but this CPU was a bit slower than the E6700.

    • @ALPHABYTE64 · 11 months ago +3

      But there is a world beyond gaming

    • @Elkarlo77 · 11 months ago +8

      I still have my C2D E8500 with its 9800 GTX+ and 8GB of RAM; surprisingly, it's still fast enough to work on today. It was the second-fastest gaming CPU of its time; only the E8600 was faster, and you had to pay 30% more for 2x160 MHz extra. 3.16 GHz and 6MB of cache.

    • @Mother_Mercury · 11 months ago +17

      @@ALPHABYTE64 That doesn't change the fact that in 2006 there were only a few applications with multicore CPU support.

    • @joshyc2006 · 11 months ago +8

      Ah, the Q6600, I had mine for years. Overclocked like a beast! I remember selling it for £25 after uni... went to a 4770K, which again lasted me forever, but CPUs had stagnated for a long time around then. Just got a 5800X3D and hoping I've struck onto another long-laster!

    • @Tragix80 · 11 months ago

      @@joshyc2006 Well, I pray that your long-lasting journey of CPUs continues; the 5800X3D is also a gaming beast of this era

  • @jayrap94 · 11 months ago +23

    Anyone remember the ASUS MARS GTX 295? 2x GTX 285 spec chips with 4 GB VRAM versus 2 x GTX 280 spec chips with 1792 MB. That thing was a beast back then! A very expensive one I could never afford though.

    • @johncate9541 · 10 months ago

      As I recall, it may have been called Mars, but it ran as hot as Venus!

  • @6XGate · 11 months ago +19

    As someone who owned one of these: the biggest issue with the benchmarks of the day is that they were done on Windows XP, since Windows Vista hadn't yet been released when this came out. I ran Vista and 7 on the board and actually got much better numbers than the reviewers did. This setup wasn't really consumer-ready, but it was neat.

  • @R3AL-AIM · 11 months ago +52

    My first home built desktop was with the FX6300... While it wasn't great at productivity stuff, it held up to High/Ultra 1080p gaming until about 2015/2016. That little chip means something to me. Surely not because it was great, but it was definitely serviceable and was my first time seating a CPU.

    • @daeganj · 11 months ago

      I had that, too! And Crossfire 2 hd6970's lol

    • @Volkswagen_Yeetle · 11 months ago

      Same here, FX 6300 Black Edition. Started out with a GT 610 and later got a GTX 1050. It wasn’t able to run GTA 5 though, would always crash on that first mission when you play as Franklin with Lamar

    • @michaelharry2528 · 11 months ago +1

      I shot myself in the foot by getting a FX 4170. It worked fairly ok with the games I played but took forever to encode my videos. I also didn't need a heater in the winter lol

    • @AgentSmith911 · 11 months ago +12

      Those Bulldozer and Piledriver CPUs are a completely different lineup from the ones in the video though, by about five years. I still have an FX-8350, and I believe Steve from GamersNexus still daily-drives his FX-8370

    • @pavelcuba9260 · 11 months ago

      Same! I loved that CPU! It was 2013 and I was running it with a GT 630 and then an R9 270X; that system put me through a lot! If it weren't for Covid I would never have had to upgrade that poor computer. It really kept up with anything I wanted to play and it aged well... it's now in a family PC, mainly for printing stuff :/, but man, it still works great! I would love to see an FX-6300 video, something like overclocking it to 5 GHz and playing modern games

  • @yankumarrah · 11 months ago +92

    Ah yes, the Quad-Father. I would have killed for this back in the mid-2000s. It was pretty much a ham-fisted attempt at catching up to Intel's Core 2 Quads.

    • @FixedFunction · 11 months ago +2

      AMD developed the platform in the same timeframe as Core 2 Quad's development, and they released 30 days apart from one another. AMD didn't have much chance to tweak anything to stay ahead, and K10 was still a few months away. They banked on the "FASN8" program, where K10 would be a drop-in upgrade and allow for 8 cores, and once 65nm yields for K10 proved to be a nightmare and the whole architecture had to be cut down to fit (axing most of the L3 cache), they cancelled that plan.

  • @jakereding2313 · 11 months ago +12

    I'd love to see a video where the challenge is to get 60fps in a handful of modern games using the oldest hardware possible, including OC and used parts! Good reference to see what still holds up

    • @lucasrem · 10 months ago

      I own both setups, almost; bought the fancy AMD FX 64x myself, and almost bought the dual-socket board with one CPU. Never used the system a lot, too many driver issues. AMD now...?
      Happy I found a Q9650 CPU in 2006, able to build my first DDR3 system on nForce; that was the cheap Intel system, just before the Core launch.
      The Q9650 system we still use, 16 GB; the NVMe drive makes it fast enough to run Win10 x64, and the new RTX card in it was not a waste! 775 can still be modern if you do it right, early 2000s!
      The ASUS MARS was years later, cheap GTX 660 SLI! Wow, how weird; I understand why Linus did this build!

  • @classic_jam · 11 months ago +28

    You could run those on non-SLI boards. That was the main gimmick of the dual-GPU cards... at least as far as I know? My newest one from Nvidia is the GTX 690, so maybe it changed later.

    • @lucasrem · 10 months ago

      jam,
      Running mid-level GK104 chips on one PCIe bus, both the GTX 690 and the ASUS MARS, was the solution back then: slow chips, getting some FPS, able to sink?
      The GTX 690 I still own, the EVGA factory blower model, no fancy sticker; needed for basic 1080p gaming, replaced as soon as I could. The GTX 1080 did that for me; I still own the 1080 Ti.

  • @paulladdie1026 · 11 months ago +31

    Back in the late 90s, I had an Abit BP6 dual-Socket-370 mainboard with two Celeron 466 CPUs overclocked to 525 MHz. The only problem was that at the time the only SMP OS available to me was Windows NT4 Workstation, which barely supported any games. But it was a very cool mobo at the time 🙂

  • @Nate_the_Nobody · 11 months ago +5

    You're forgetting about the Bulldozers. I took the risk of buying the "FIRST EVER 8 CORE DESKTOP CPU!" instead of that lame quad-core 2600K from Intel, and boy did I pay for that: the FX-8120 I got overclocked all the way to 4.9 GHz on air but was still 30-40% slower than the Phenom II X4 955BE I upgraded from

  • @SaperPl1 · 11 months ago +4

    I had a dual Opteron 270 system as my gaming rig around that time, and it ripped through copying data on SATA drives with its two nForce 4 chipsets. For some reason it made a difference against DDR2 dual cores at the time, until quad cores from AMD came out.

  • @CrystalSolvent · 11 months ago +20

    I think you should have gone with the 7950GX2. I always dreamed of that card back in the day.

    • @rare6499 · 11 months ago +15

      Yeah, that or a 9800GX2 would have made sense for the era. A single GTX 760 is overkill, let alone quad GTX 760s 🫠😂

    • @rsweb1993 · 11 months ago +1

      @@rare6499 100% the reason they didn't is that both the 9800GX2 and 7950GX2 are "extremely" unreliable and hard to find working nowadays

  • @alexanderr6106 · 11 months ago +7

    The Radeon VII wasn't a paper launch. It's still a beast for crypto mining, comparable to a more efficient 3080. They were mostly bought out before reaching consumers. I've seen my fair share of those fragile beasts

    • @superskrub4209 · 11 months ago

      I'm amazed a 4-year-old GPU is still so competitive at mining

  • @samurai_8917 · 11 months ago +13

    That double-cut shot under the table about his masochism was great. Keep improving your skit-making, LTT crew, it shows in the product!

  • @CaptPatrick01 · 11 months ago +3

    When my current PC was first built it came with an FX 8350. Compared to the previous PC's Phenom II 1055t, only BeamNG and Space Engineers saw any improvement at all and some applications even got worse outright with stability issues. I didn't hesitate to upgrade it to an R5 2600x the moment affordable Ryzen MBs started being sold in my area. The improvement was night and day. My GPU wasn't getting bottlenecked anymore to the point I was even able to upgrade my monitor setup. (There was a whole other can of worms after that, but it was more due to ASUS related shenanigans than a fault of the platform itself.)

  • @THENOOBISYOU · 11 months ago +4

    9:52 bottom, right image 💀

  • @ljcool17 · 11 months ago +6

    I remember the Pentium 4 days. One of my friends had the 3.0 GHz one and it was a pain to cool. I only had the 2.4 GHz

  • @sergeantseven4240 · 11 months ago +4

    I remember there being some really cool builds back in the early 2000s that used multi-GPU and multi-CPU setups. I would really love a video about what people were doing back then with those.

  • @0to600 · 11 months ago +6

    You should do a video where you build with all the oldest computer components you have in stock and compare it to a new system

  • @GReaper · 11 months ago +6

    I have a still-functional system using the non-ECC version of the board with the FX-74s (the PCIe slots are blue vs red), and I actually liked it overall. Yes, it ran hot, but I had the Thermaltake vortex coolers to handle that (which look pretty killer too, I might add), and I later used liquid cooling specifically for the GPUs (Nvidia Quadro 4000s, later upgraded to K4000s). It can actually play Doom 2016 at 720p on moderate-ish settings (deffs not 60fps, but playable). You can install Windows 10, and even Windows 11 if you remove the TPM requirement. Windows 11 is pretty awful performance-wise though. I think the biggest killer was that the RAM slots and onboard devices were known for going bad or being bad from the factory.
    I remember on cold winter nights I would literally drape a blanket over me and it (at the foot of my chair) and just use the machine to keep me warm. lol.

  • @prestonjmathis · 11 months ago +7

    I had an Athlon X2 Black Edition in an AM2/AM2+ ASUS board. Later upgraded to a Phenom X4 955 Black Edition. Used that for many years with a GTS 250. My first GPU was an 8500 GT.

  • @xpPhantom2 · 11 months ago +81

    Hey Linus, you should make a mouse pad with a schematic of a GPU or CPU or generally pc parts on it, it would look so cool

    • @Crazynin1 · 11 months ago +18

      You mean what Gamers Nexus already did?

    • @Alex_Fishman · 11 months ago +8

      They will just put a giant image of his face across the pad. Just imagine *W I D E* Sad Linus laying across your desk, staring at you from under the keyboard.

    • @draketurtle4169 · 11 months ago +5

      @@Alex_Fishman That's perfection, though I see one flaw... ya know the big booba mouse pads that support the wrist... yes, like that.

    • @xpPhantom_ · 11 months ago

      (same guy different account btw) thx ill check that out

    • @xpPhantom_ · 11 months ago

      shipping to the uk is half the price of the mousepad on gamersnexus 💀💀💀

  • @woodenotaku · 11 months ago +4

    Next up: one of those old (circa 2006) Tyan dual-core Opteron motherboards with 4 CPU sockets plus the Tyan M4881 daughterboard that adds an additional 4 sockets for a total of 8 CPU sockets and 16 CPU cores in one machine

  • @jorismak · 11 months ago +7

    The 760 was the card I bought when I ditched SLI, finally recognizing that the fps counter was meaningless if it played like crap.
    Never thought someone would be making a dual-GPU card in that era... let alone buying one, let alone with midrange chips :).
    I went with a 2x HD 3870 (dual GPU on one card) at the start, then a single HD 4xxx series, an HD 5850, dual HD 5850s, dual GTX 560 Tis. I finally learned my lesson by then to just get a single card, no matter what.

  • @ygdefine322 · 11 months ago +10

    Linus needs to try using a single-core Celeron lol... Then you'll feel the power of wasted silicon

    • @sihamhamda47 · 11 months ago

      It could be 100x more painful for him than his recent "cheapest laptop ever" video

    • @ygdefine322 · 11 months ago

      @@sihamhamda47 I never knew a browser window lagging out was possible on modern CPUs till I experienced the abomination they try to sell

  • @argcades · 11 months ago +10

    I think it would perform a lot better with the 3 GHz dual cores (4 cores effectively), and definitely dual-channel RAM and a 1060 at most

  • @friddevonfrankenstein · 11 months ago +4

    That screen-tearing edit was glorious! That editor needs a raise :D

    • @ImKyleJK · 11 months ago +1

      The edits seem old, I had to check when it was released; like, the music in the background is so loud

  • @knyte6426 · 11 months ago +11

    I remember that era well. I was an AMD fanboy up to that point; went from Athlons to an Opteron 170 as my first dual-core CPU. But yeah, did AMD ever drop off a cliff fast. When it was time for my next upgrade, I really had no choice but to switch to team blue and the stupidly good Core 2 Quad Q6600; that thing was an overclocking beast!

  • @marienvincent9390 · 11 months ago +8

    I have an MSI K8N Master2-FAR with two Opteron 280s, and it is similar to the Quad FX platform. Really good on paper (SLI, 12GB DDR400 ECC, 2x dual-core CPUs, 2x gigabit LAN, on an E-ATX form factor...) but released in 2005... one year too late for it to be interesting.

  • @BReal-10EC
    @BReal-10EC 11 месяцев назад +6

    Based on the performance charts... I wonder how they even marketed Quad FX, since it still wasn't good? Did any applications use more threads back then? Some type of engineering program, for example?

    • @joefish6091
      @joefish6091 11 месяцев назад +1

      Small office file and service server market, not for home. These systems in tower cases would be sitting in closets in offices.

  • @hiddendrifts
    @hiddendrifts 11 месяцев назад

    9:57 this video was a month+ in the works... dam... [this was recorded when the "Custom 'YTX' gaming build" was first uploaded]

  • @Zenefor
    @Zenefor 11 месяцев назад +1

    for sli to work properly you'd need an equal bandwidth pcie bridge aside from the motherboard one (16x bridge for 16x pcie connector).

  • @ChocoXD.
    @ChocoXD. 11 месяцев назад +3

    can u try making the most powerfull 2007- 2008 system?? would be interesting!

  • @Gokul_Yt
    @Gokul_Yt 11 месяцев назад +3

    (10:04) This video was recorded 1 month ago.

  • @TheOriginalFaxon
    @TheOriginalFaxon 11 месяцев назад +1

    I had an old CrossFire setup with a pair of 290X 4GB cards on water that I used until 2018. Once I took it down, one of the GPUs simply would not come back to life on its own; it would be recognized if I had it in a rig with the other one, but it was totally DOA, no picture, nothing, when it was in a rig by itself. When I had it as the primary GPU it would do that same thing with the flickering textures and other nasty problems. Clearly it had some bad VRAM, but I also think the output engine was broken, because one of the outputs didn't work either. I had no idea anything was broken until after I took it apart lol

  • @peptoyo
    @peptoyo 11 месяцев назад +1

    Man, these were the days when overclocking WAS a necessity out of the box, where you'd instantly get a 25% increase... it would be interesting, but I don't know how safe it is to do at its age now.

  • @mikelay5360
    @mikelay5360 11 месяцев назад +5

    Please check into CPU latency ! From both team red and blue

    • @johnscaramis2515
      @johnscaramis2515 11 месяцев назад

      Don't think there's too much difference, as the Intel quad cores at that time were simply two dual core chips on one package with no direct connection. Much like this dual CPU setup from AMD.

  • @lukeb7954
    @lukeb7954 11 месяцев назад +31

    My old PC had an AMD FX-8350 Eight-Core Processor from 2012, and it honestly wasn’t all that bad. Yes, it ran very hot and really raised the temperature of my room, but it was snappy, great for gaming, and handled anything I threw at it. It still performs well today.

    • @C0mmanderX
      @C0mmanderX 11 месяцев назад +1

      if you have a gtx 1060 or lower then yea its fine

    • @lukeb7954
      @lukeb7954 11 месяцев назад +1

      @@C0mmanderX The old PC had a GTX 960

    • @sihamhamda47
      @sihamhamda47 11 месяцев назад +3

      That could be perfect for a winter setup for those who live in very cold climates

    • @jakegog9897
      @jakegog9897 11 месяцев назад +3

      I used to have an FX-6300 and the computer sat near my leg. I could be shirtless with the AC on and still be sweating

    • @xpPhantom_
      @xpPhantom_ 11 месяцев назад +1

      same, to this day i am still running an i7 3rd gen, works fine

  • @DeiviZZZ
    @DeiviZZZ 11 месяцев назад

    I remember back in the day I had a dual-core E6300 1.8GHz overclocked to a rock-solid 3.4GHz. Good days.

  • @philtkaswahl2124
    @philtkaswahl2124 11 месяцев назад

    Retrospectives of components from that era always make me want to look at old archived Maximum PC magazines.

  • @Some_Awe
    @Some_Awe 11 месяцев назад +5

    I'm honestly surprised and impressed that AMD managed to crawl its way back as a competitor

  • @KyberGaming47
    @KyberGaming47 11 месяцев назад +8

    My first computer was a Dell OptiPlex 780 SFF with a Core 2 Duo (unsure on the model), then I upgraded it to a Core 2 Quad Q6600, and damn, it went hard for what it was. Great little computer, and I got it for $30 AUD way back lol

    • @lands1459
      @lands1459 11 месяцев назад +2

      I've got one of these laying around; it comes with an E7500 stock

    • @KyberGaming47
      @KyberGaming47 11 месяцев назад

      @@lands1459 Interesting, the guy I bought it off must've got a different CPU at some point or as an upgrade, because after a double check it had an E8400 duo @ 3GHz

  • @LordvanDarc
    @LordvanDarc 11 месяцев назад +1

    13:58 props to the editor who put in the effect, subtle but nice.

  • @EightPieceBox
    @EightPieceBox 11 месяцев назад

    I just had this running in the background while assembling a bike. I wasn't paying attention, but loved the throwing of Gary under the bus over a past CPU choice!

  • @mysteriousmarvel8413
    @mysteriousmarvel8413 11 месяцев назад +4

    Hey Linus, I still use my Asus KCMA-D8, which runs a dual Opteron 4130 setup with DDR3 RAM, as a home server. A setup from 2009. I'm so glad you are enlightening the world about the world's best platform.

  • @uppyoass
    @uppyoass 11 месяцев назад +21

    Can unfortunately support his claim. That line of CPUs made me stay away from AMD until Ryzen gen 3.
    And I had the very setup he displays here, with custom watercooling. My god, what a mess that was to get working and keep working...

    • @C0mmanderX
      @C0mmanderX 11 месяцев назад

      why Ryzen 3, the lowest of them? Or did you mean Ryzen 3000? Because Ryzen 3 isn't a generation

    • @ridwananhar4418
      @ridwananhar4418 11 месяцев назад +4

      @@C0mmanderX I think he meant that he stayed away from AMD CPUs until their Ryzen CPUs came in.

    • @uppyoass
      @uppyoass 11 месяцев назад +1

      @@C0mmanderX the other guy was correct.
      I meant to type Ryzen generation 3. Not Ryzen 3. My bad! :)

  • @bertnijhof5413
    @bertnijhof5413 11 месяцев назад

    My 15-year-old AMD PC, the HP dc5850, was great. It had an AM2+ socket with a Phenom X3 and later a Phenom II X4 B97 (4x 3.2GHz), max 8GB of DDR2 (800MHz), and 2nd gen PCIe with a 128GB SSD on SATA-2. I used it till April 2019, when I replaced it with the 2nd slowest Ryzen ever (Ryzen 3 2200G), but that was still a huge improvement, 2x the speed of the Phenom II X4. That 2008 HP dc5850 is still in use with a friend.
    I love the AM2+ and AM4 motherboards; they both supported at least 2 generations of CPUs.

  • @David_Quinn_Photography
    @David_Quinn_Photography 11 месяцев назад

    Even in 2010 when I just entered PC gaming I never understood SLI for gaming

  • @jaygrant7263
    @jaygrant7263 11 месяцев назад +8

    I love the dual-socket MBs, they just look cooler

    • @astr0nox
      @astr0nox 11 месяцев назад

      Probably hotter actually

    • @joefish6091
      @joefish6091 11 месяцев назад

      New Chinese dual-Xeon ATX motherboards are 100 or so, and the 4-18 core CPUs pulled from big server farms are very cheap. X79 uses quad-channel DDR3, X99 uses quad-channel DDR4.
      You can buy 19" rack dual-CPU server slabs for 150 upwards from A******n; they generally have two PCIe slots at the rear, plus a power connector for a GPU lead. Fit a USB3 card in the second slot. The biggest caveat is that they have lots of small fans and are jet-engine noisy at startup.

  • @sHoRtBuSseR
    @sHoRtBuSseR 11 месяцев назад +4

    Even though it sucks, it sure is awesome. This is one of those things that you build today just out of pure fun, because you can. Multi GPU and CPU setups were absolutely a blast to play with back in the day. It usually wasn't practical, but it was a blast to tweak.

  • @fatjawns3671
    @fatjawns3671 11 месяцев назад +1

    11:51 Gary looks like Linus’s disappointed father right here 😂

  • @LaczPro
    @LaczPro 11 месяцев назад +1

    Amazing to see Opterons on the channel. I remember I was this close to buying a G34 socket server motherboard to get some of those cheap CPUs... but you can imagine why I didn't go for it. It could've been incredible for server things, but Ryzen getting more affordable and more reasonable (even Threadripper and Epyc) basically negates the whole purpose, unless I go "just for fun".

  • @ShadeAssault
    @ShadeAssault 11 месяцев назад +3

    I remember the Mars 760's. I honestly almost bought one. Kinda wish I had now cause I collect dual GPU cards and finding them is incredibly difficult. I would have hated myself at the time, but I would have one now for my collection.

    • @lucasrem
      @lucasrem 10 месяцев назад

      The MARS I sold; some old guy needed it, and I got way more for it than I ever paid for it.
      Got the GTX from Nvidia, the EVGA 690 model refurbished, an RMA for a dead 590 card; still own that.
      GTX 660s running in SLI were mad, too much overhead; the GTX 690 did the job, just basic 1080p gaming was needed!

  • @harp69420
    @harp69420 11 месяцев назад +5

    time for a best CPU EVER VIDEO

  • @ethanlieske9678
    @ethanlieske9678 11 месяцев назад +1

    I would suspect the NUMA boundary on the RAM between the CPUs would also cause a ton of issues/slowdown.

  • @RylTheValstrax
    @RylTheValstrax 11 месяцев назад

    I have a retrocomputer with one of those! I purchased it in 2017 for a celebratory bonus build in parallel to my TR 1950X build. I have a number of retrocomputers with... intentionally weird and questionable parts, including the infamous nvidia leafblower (FX 5800 ultra) and a 3dfx Voodoo 5500. The Voodoo 6000 and XGI Volari Duo both evade me sadly.
    ...I also have some systems with modern weird and questionable parts too, like a working system with Via/Centaur's last x86 cpu (the CHA/CNS chip) before Intel acquihired Centaur.

  • @mikavbtje
    @mikavbtje 11 месяцев назад +2

    my mom had this in her old pc, I grew up with it playing flash games on it 😆

  • @IT10T
    @IT10T 11 месяцев назад +12

    The editing in this video should be the gold standard for LTT; they even impressed me with the quality of the seizure warning. It is appreciated when they warn people about loud sound or strobing lights.

  • @uattias
    @uattias 3 месяца назад +1

    Imagine if motherboard designers revisited the double CPU idea.
    Just imagine running two i9-14900KFs.

  • @NodokaHanamura
    @NodokaHanamura 10 месяцев назад +2

    FX-series CPUs have always been a fucking dumpster fire - I remember my computer constantly hitting T-junction in the summers and shutting off.

  • @DaEpikMan
    @DaEpikMan 11 месяцев назад +9

    The Quad FX looks like something out of a *fever nightmare*

    • @KhriszB
      @KhriszB 11 месяцев назад

      Your profile picture looks like something out of a fever nightmare

  • @GetShwiftyInHere
    @GetShwiftyInHere 11 месяцев назад +3

    I miss SLI 😢 imagine what games could do running dual 4090ti's

    • @indigomizumi
      @indigomizumi 11 месяцев назад

      Imagine the power you'd need for that. And I doubt many cases could even fit dual 4090s.

    • @indigomizumi
      @indigomizumi 11 месяцев назад

      @@transistorjump919 Fair point. There is even an official water cooled 4090.

  • @OfficialiGamer
    @OfficialiGamer 11 месяцев назад +2

    I didn't recognize Gary till you mentioned ASUS, met him at HardOCP's Gaming Convention thingy. (Met Kyle Bennett too) both awesome guys!
    Also I still have an AMD FX-8350 + GTX 980 as a spare gaming rig. Also have a C2Q 9650 that runs a GTX 960 (before that it was a 560 Ti)

  • @noodleman9945
    @noodleman9945 11 месяцев назад +2

    I don’t think anyone would invest that much in old parts. That’s why LTT did it for us.

  • @kalark
    @kalark 11 месяцев назад +3

    I remember upgrading from an athlon xp to an Athlon 64 FX-74 and it was such an incredible upgrade. I also remember going into fry's and the sales rep trying to sell me on quad fx (or more like trying to sell my dad on buying me quad fx) . Glad that didn't happen lol

    • @FixedFunction
      @FixedFunction 11 месяцев назад

      How did you upgrade to an FX-74 if you didn't buy the Quad FX system? The FX-74 only works on Quad FX, it was made specifically for that board and was only sold in pairs (2 CPUs per box).

    • @kalark
      @kalark 11 месяцев назад +1

      @@FixedFunction lmao you made me go dig out that old pc (in the garage) and to my surprise it's an FX-62, guess I got it mixed up (I was like 12 at the time /shrug)

  • @danavidal8774
    @danavidal8774 11 месяцев назад +5

    Like for the warning!

  • @powerpower-rg7bk
    @powerpower-rg7bk 11 месяцев назад

    The thing with platforms like these is that they don't have a solid upgrade path, which brings things to today with the Threadripper platforms. Essentially every generation of Threadripper. Sure, the first generation got an upgrade from a 16-core to a 32-core max, but they were the same Zen 1(+) architecture. Then there was the sTRX4 socket for the Threadripper 3000 series, which never saw a new chip released for it. Ditching the previous dead-end socket for another dead-end socket.
    Then there was Threadripper Pro, which added yet another socket to the Threadripper history, but hooray, it did provide a generational upgrade with some gotchas. The Threadripper Pro 3000s were mostly OEM only, with off-the-shelf parts for DIY builders only appearing shortly before OEMs had access to the Threadripper Pro 5000s, thus making Threadripper Pro 3000 in the DIY area kind of rare: an upgrade path did exist for the dozen or so people who had one. While the consumer and server segments of AMD's lineup got some 3D V-Cache parts, Threadripper was again neglected. The saving grace for Threadripper Pro 3000 and 5000 is that many of the DIY motherboards can also work with Epyc chips. One downside of Threadripper Pro being mostly an Epyc chip is that they can be locked to an OEM platform, complicating the used market.

  • @prostheticconscience6707
    @prostheticconscience6707 11 месяцев назад

    Anyone else noticed a weird pink line flicker at 16:42 right above the FX-74 frequency?

  • @coonyman10
    @coonyman10 11 месяцев назад +3

    I love these legacy hardware reviews!

  • @asetatlikalem
    @asetatlikalem 11 месяцев назад +7

    That's still weaker than my 10th gen i5 😀 Back then, AMD decided to put more, but not so powerful, cores into their processors. I guess they were like, ohhh mate, if we add more cores it will be more powerful

    • @ItsDeeno69
      @ItsDeeno69 11 месяцев назад

      how

    • @doublevendetta
      @doublevendetta 11 месяцев назад

      Swing and a miss, my guy; those slightly weaker single-core but better multi-core CPUs were coming out when 8th and 9th gen were a thing

    • @asetatlikalem
      @asetatlikalem 11 месяцев назад

      @@doublevendetta Ohh, I don't really recognize this Quad FX thing, nor have I heard of it, but I didn't think it was that new; I thought the multi-core thing was a 2011 thing. So sorry if I was wrong

  • @Nothuman76
    @Nothuman76 11 месяцев назад

    About HTTPS on old hardware: old CPUs lack a feature found on all modern CPUs, and that's hardware AES encryption and decryption (AES-NI). If you go to youtube on old hardware, the CPU must do all the work to decrypt and encrypt the network traffic. That causes the CPU to max out and the video to struggle at the beginning. Other single cores and even some Core 2 Duos would suffer from this problem but never fully recover, because they can't catch up with the encryption (they just stay laggy and drop frames). So it really comes down to that hardware encryption support that makes the modern web work! Imagine opening a website where the processor had to do all the work; it would be like starting up a program every time. Oh, that's kind of like what Linus experienced when doing so...
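    A quick way to check the point made above: on Linux, AES-NI support shows up as an `aes` flag in `/proc/cpuinfo`. Here's a minimal sketch that parses cpuinfo-style text for that flag; the sample strings below are made up for illustration, not real dumps:

```python
def has_aes_ni(cpuinfo_text: str) -> bool:
    """Return True if the 'aes' CPU flag appears in /proc/cpuinfo-style text."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # The flags line is "flags : <space-separated feature names>"
            return "aes" in line.split(":", 1)[1].split()
    return False

# Hypothetical flag lines for illustration:
modern = "flags\t\t: fpu vme sse2 aes avx"   # modern CPU with AES-NI
old = "flags\t\t: fpu vme de pse tsc msr"    # 2006-era CPU, no hardware AES

print(has_aes_ni(modern))  # True
print(has_aes_ni(old))     # False
```

    On a real machine you'd pass in `open("/proc/cpuinfo").read()`; a 2006 chip like the FX-74 predates AES-NI (introduced around 2010), so every TLS handshake and record runs in plain software.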

  • @kalmtraveler
    @kalmtraveler 11 месяцев назад

    Memory is hazy from that far back, but I recall the intention with mid-range cards being able to SLI was that you could buy one now, and later on when you have more money buy another, or a 3rd, 4th, etc., for theoretically increased performance without having to spend as much money up front on a single higher-end card.

  • @generalmisery
    @generalmisery 11 месяцев назад +1

    I did play with the green flickering for some time. I had a Radeon 240, I think. To run games like GTA 5 at 30 fps, I overclocked it to the point that it shouldn't have worked. To fix it, I had to tinker with BIOS settings. I have no clue what I did 7 years ago, but I broke it a year later.

  • @LiamDilley
    @LiamDilley 11 месяцев назад

    You guys should do a video about old-school modding. The AMD K6, for example, where with a silver pen you could reconnect the lines they cut on the die to create the cheaper CPU version. And get some GPUs as well, where "soft modding" was still a thing to unlock the underclocked GPUs, because those failed yield testing and were offered as cheaper versions.
    Those days are gone and sadly missed. Fun times, but it would be good for us older techno bods to enjoy, and to enlighten the newer generations about the "good times"

  • @user-es8bm1zs2s
    @user-es8bm1zs2s 11 месяцев назад

    Anything pre-Core is always going to be horrendous with any web stuff because of the "new" encryption pages are wrapped in, thanks to nonexistent instruction sets.

  • @JessieDoidge
    @JessieDoidge 11 месяцев назад +2

    Still have an FX-8350 around somewhere; it was great fun to overclock (5.1GHz all-core on air, iirc), but while tinkering was fun, getting things to run without frame time issues was near impossible, at least on my chip.

  • @jeph_os
    @jeph_os 11 месяцев назад

    I'd love to see a video on how older components that would typically be used in a server build perform in a different environment such as gaming.

  • @lukelwnmr
    @lukelwnmr 11 месяцев назад

    9:20 wow, that was lowkey pretty beautiful

  • @tipoomaster
    @tipoomaster 11 месяцев назад +1

    Imagine playing Project Offset on this puppy
    If only that hadn't died with Larrabee

  • @cjhawk67
    @cjhawk67 11 месяцев назад

    I had a pair of GTX 260s in SLI ages ago, and in games that supported it well it would render on both GPUs and scale well enough. But it turns out the bottom GPU didn't output video on any of its ports, and I have no idea if it had always been like that, or if not, for how long it was like that lol

  • @Rose.Of.Hizaki
    @Rose.Of.Hizaki 11 месяцев назад +2

    That CPU cooler... Reminds me of the time I used to cool an AMD 939 setup with a Thermaltake Big Typhoon.

  • @eggcopter
    @eggcopter 11 месяцев назад

    With Ampere you probably need to download an older driver. I run Turing, where the latest driver doesn't install but one that's slightly older does. It is probably easier to get the "iCafe" version drivers working too, considering Windows 7 usage in the Chinese iCafe market.

  • @cwhitley.sawlabs
    @cwhitley.sawlabs 11 месяцев назад

    I used to rock an Athlon II X4. I was always scared of the 95W TDP and the fact that my only working cooler was just extruded aluminum with a tiny 80mm fan.

  • @freendysinaga9762
    @freendysinaga9762 11 месяцев назад

    Dayum, I just watched that two-socket Intel video yesterday.
    Just like last time, he didn't watercool it.

  • @linkVIII
    @linkVIII 11 месяцев назад

    YT playback is something I've started to notice problems with on my 4th gen laptop i7. The fans need to kick in if I have the webpage visible, but embeds or minimizing the page are fine.
    The page itself is somehow the problem

  • @apachelives
    @apachelives 11 месяцев назад +2

    You need to review the older dual-Celeron Socket 370 rigs with an ABit BP6 motherboard, from the days when Intel forgot to disable dual-CPU (SMP) support on their cheaper products.

  • @gorillatech5337
    @gorillatech5337 11 месяцев назад +1

    I had an FX8350 till like 2 years ago

  • @Chozo4
    @Chozo4 11 месяцев назад +1

    Could always try a quad Opteron setup based on the Piledriver platform using Socket G34. You can get Opteron 6380s relatively cheap compared to the next step up, the 6386. 🤔

  • @alexisrivera200xable
    @alexisrivera200xable 11 месяцев назад

    Those green-tinged frames in CS:GO seem like one of the cards wasn't doing so hot while the other was working properly. Every time the card with issues rendered a frame, it came out green, creating that strobing effect. Could be a driver glitch.
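    The "every other frame" symptom described above fits how SLI/CrossFire alternate-frame rendering (AFR) distributes work: frames go to the GPUs round-robin, so a single faulty card corrupts alternating frames and the output strobes. A toy sketch of that scheduling (illustrative only, not actual driver code):

```python
def afr_schedule(frame_count: int, gpu_count: int = 2) -> list[int]:
    """Return which GPU renders each frame under simple alternate-frame rendering."""
    return [frame % gpu_count for frame in range(frame_count)]

faulty_gpu = 1  # pretend the second card has bad VRAM
bad_frames = [f for f, gpu in enumerate(afr_schedule(8)) if gpu == faulty_gpu]
print(bad_frames)  # every other frame comes out corrupted: [1, 3, 5, 7]
```

    With a two-GPU AFR setup, half the frames pass through the bad card, which is exactly the on/off flicker pattern described.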

  • @kh_trendy
    @kh_trendy 11 месяцев назад +1

    I'm old, so I really love seeing you guys look at older "high end" stuff I could only dream about back when I was a kid 😂

  • @sashacrossi
    @sashacrossi 11 месяцев назад

    I recently managed to revive an old gaming PC from my dad that I found in my family's basement. I used that case for my setup at home, but wanted to see what the motherboard was all about, since I could find no info whatsoever on what I was holding in my hand. Only the spiders knew. All I knew was that it was an old ASUS MB with dual DDR2 and a 467 GB HDD. So I took the MB and placed it in the Windows 97 case I was trying to mod. I managed to connect an almost-as-old WLAN adapter to the MB as well; I just had to install some drivers. Now it's running Windows 7 with a functioning WLAN module, plus a little bit of other time travel from 1997-2010.
    I am most likely not going to use that thing often. It also had nothing left on its drives. But it was surprisingly fast, even with only HDDs, and I want to find a nice little place for it somewhere, maybe see if I can't dust off an old monitor or something. Maybe take it upstairs for my mum to use, who is at the hospital atm (although she really isn't much of a PC user lol). Have a little present for her when she comes back (she won't care for it). Seeing old technology like that still light up is a special feeling somehow. I am loving it.

  • @dikbozo
    @dikbozo 11 месяцев назад +2

    I have to wonder about the GPU choice for the upgrade that was tried here. I think a more appropriate one would be a 1050 or RX 580. They both would beat the crap out of that 760 and have better support under W7. Consider giving it a whirl, even if just in the lab.