The Ultimate Linux Gaming Trashcan

  • Published: 23 Sep 2023
  • Go to www.piavpn.com/ActionRetro to get 83% off Private Internet Access with 4 months free!
    Today we're upgrading the Linux Trashcan Mac Pro with a new CPU, and an eGPU! I'm sure it will just work, no issues at all. How well can this thing game now?
    LINKS:
    ------------------
    🍎Colin's upgrade video: • Can This Decade-Old Ma...
    (Amazon links are affiliated links)
    ══════════════════════════
    💾 For more vintage Apple stuff, please subscribe: ruclips.net/user/ActionRetro?s...
    💾 Support these retro computing shenanigans on Patreon! / actionretro
    ══════════════════════════
    Check out my Amazon page with links to my tools, adapters, soldering equipment, camera gear and more: www.amazon.com/shop/actionretro
    ══════════════════════════
    💬 Come talk about old computers on the BitBang Social Mastodon! bitbang.social
    ══════════════════════════
    #VCFMW #PowerPC #Macintosh
  • Science

Comments • 444

  • @ActionRetro
    @ActionRetro  9 months ago +11

    Go to www.piavpn.com/ActionRetro to get 83% off Private Internet Access with 4 months free!

    • @chandlerbing7570
      @chandlerbing7570 9 months ago

      Thank you S̶p̶o̶n̶s̶o̶r̶b̶l̶o̶c̶k̶ for blocking today's s̶p̶o̶n̶s̶o̶r̶

    • @dylanlindsay1993
      @dylanlindsay1993 8 months ago

      Please tell me that Sceptre monitor isn't the $75-80 one off Amazon! I had the $80 one for 7 days before I returned it and used the money to buy a 165Hz 1080p LG UltraGear monitor that Walmart had on rollback for like $5 more than the Sceptre. I will never buy a Sceptre monitor again! No cap, my backup monitor, a Dell from 2008 that I got for $8 at Goodwill, had WAY BETTER viewing angles than that Sceptre. I had to sit dead center in front of it because the viewing angles were so bad, and that's why I returned it!

    • @BigKelvPark
      @BigKelvPark 4 months ago

      So if they don't keep any of your data - how do they bill you monthly?

  • @AutumnRain
    @AutumnRain 9 months ago +512

    well i have to compliment the design a bit - you don't often see a computer where installing ram makes you feel like you're replacing uranium fuel cells

    • @raven4k998
      @raven4k998 9 months ago +14

      well he's making that trash can useful at running without apple trash in the can🤣🤣🤣

    • @ChairmanMeow1
      @ChairmanMeow1 2 months ago +1

      the design of the cube was INCREDIBLE. but sadly it was form over function in the end. But turning it over and pulling the CORE out. That felt nuclear as well.

  • @retropuffer2986
    @retropuffer2986 9 months ago +219

    Action Retro: "If you enjoyed taking heavily depreciated professional grade tech and using it entirely unprofessionally..."
    Me: Stares at maxed out NeXT Station Color Turbo...."Yes, yes I do."

    • @BrianMoore-uk6js
      @BrianMoore-uk6js 9 months ago +15

      What to do with a NeXT Station Color Turbo professional workstation in 1991? John Carmack: "let's create Doom on it."

    • @ordinosaurs
      @ordinosaurs 9 months ago +8

      My little LAN of SparcStation 10 (Debian), SparcStation 20 (NeXTStep 3.3) and Alpha Miata (Windows 2K ß) agrees...

    • @davel4030
      @davel4030 14 days ago +1

      🤤😂

  • @peterstrider7303
    @peterstrider7303 9 months ago +229

    @11:42 You don't need a secondary PSU for the 6750 XT. The 650W recommendation is for an ENTIRE COMPUTER, not just the GPU. Unless you're running a 4090, which can draw up to 400W BY ITSELF, almost all lower-tier GPUs will run fine with a 400W PSU. @12:15 A possible reason why you might be unable to use the external RX 6750 XT is that the AMD driver is already in use by the dual-GPU setup inside the Mac Pro; it seems the driver will only engage a single GPU. Not to mention that Thunderbolt support in Linux is sometimes hit or miss. It's unfortunate that this setup did not work as expected, but it wasn't that surprising considering the level of jank that went into it.

    • @TheDemocrab
      @TheDemocrab 9 months ago +9

      The AMD Linux driver should engage with multiple AMD GPUs at once and drive them independently without issues. I use the iGPU on my 7800x3D for a secondary screen alongside my 6700XT, only problem is that I have to hide the iGPU for some titles that don't automatically select the 6700XT to run on.

    • @Astalvistar
      @Astalvistar 9 months ago +15

      @@TheDemocrab The problem will be that the 6700 XT runs on the amdgpu kernel driver while the built-in GPUs run on the old radeon kernel driver. There was a kernel parameter to bring the older GCN GPUs onto amdgpu.

    • @gajbooks
      @gajbooks 9 months ago +1

      @@Astalvistar That may be one of the causes, but it also may have been tried already. That said, there's not really any reason not to use the amdgpu driver; on my system at least, the radeon driver is terrible and doesn't even have display output on GCN 1 GPUs, while amdgpu works wonderfully.

    • @hamsterwolf
      @hamsterwolf 9 months ago +1

      Was just about to type this myself.

    • @salvadorpardinas1870
      @salvadorpardinas1870 9 months ago +1

      @@gajbooks IIRC (it's been almost a decade since I last used GCN 1 on Linux) AMD never managed to get video decoding working as performant on GCN1+AMDGPU as it did on the older Radeon and fglrx. As a whole it absolutely was better to use AMDGPU though, for KMS support, better Vulkan performance, etc.
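The kernel parameter the thread is recalling does exist: GCN 1 ("Southern Islands") cards, like the Mac Pro's FirePro D700s, default to the legacy radeon driver, and amdgpu support for them is opt-in. A hedged sketch for a GRUB-based distro follows — the exact flags depend on the GPU generation (GCN 2 "Sea Islands" uses `cik_support` instead):

```shell
# Sketch: opt GCN 1 (Southern Islands) GPUs into amdgpu instead of radeon.
# Append the parameters to the kernel command line in /etc/default/grub:
#   GRUB_CMDLINE_LINUX_DEFAULT="quiet splash radeon.si_support=0 amdgpu.si_support=1"
sudo update-grub   # regenerate the GRUB config, then reboot

# After rebooting, confirm which kernel driver is bound to each GPU:
lspci -k | grep -EA3 'VGA|Display'
```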

  • @alasdairlumsden670
    @alasdairlumsden670 9 months ago +180

    As someone who had his hands in many tens if not hundreds of Xeon class servers over a period of 15 or so years, I can say that the thermal compound looks totally normal and would have been working fine. It looks crusty as it solidifies with heat, so when you pull it off it leaves a textured appearance. But strapped to the CPU it would have been uniform. The difference in single thread performance is likely down to the larger 30MB cache on the E5-2697v2 vs 10MB of the E5-1620v2.

    • @ahmetrefikeryilmaz4432
      @ahmetrefikeryilmaz4432 9 months ago +7

      Geekbench is extremely touchy with its memory...

    • @alasdairlumsden670
      @alasdairlumsden670 9 months ago +7

      @@ahmetrefikeryilmaz4432 That actually raises a fair point - the test wasn't totally like for like as the memory was changed as well!

    • @ahmetrefikeryilmaz4432
      @ahmetrefikeryilmaz4432 9 months ago +5

      @@alasdairlumsden670 I was actually affirming your statement. Cache is still memory.
      I don't really think there should be a drastic difference between the old and the new RAM kits.

    • @mndlessdrwer
      @mndlessdrwer 9 months ago

      Oof, I remember the v2 series of Xeon E series processors. They were good for the time, but I opted for the v4 version in my desktop build for a good reason. Better ongoing longevity. I've got a 20 core, 40 thread processor and it's rarely ever seeing serious utilization, particularly in Windows where thread allocation routines are kinda borked at the best of times.

    • @Pasi123
      @Pasi123 9 months ago +5

      For some reason his Geekbench 6 results on both CPUs are way lower than they should be. Comparing to CPUs that I've tested even a laptop Core2 Duo P8700 scored higher in single thread (366/550).
      A generation older Xeon E5-2690 8c/16t scored 724/4001 beating the faster E5-2697 v2 in both single and multi thread even though it shouldn't. And no, I'm not confusing Geekbench 6 with Geekbench 5 which has different scoring.

  • @polocatfan
    @polocatfan 9 months ago +60

    when the graphics card is bigger than your computer you're definitely doing something right.

    • @raven4k998
      @raven4k998 9 months ago +2

      nope nope nope you should fear for the cpu's safety then🤣🤣🤣

  • @betonmischer_86
    @betonmischer_86 9 months ago +88

    I'm sure someone has already mentioned this, but plugging the monitor directly into the eGPU is preferable; otherwise display data coming back from the GPU will take up some of the already very limited Thunderbolt 2 bandwidth.

    • @Cyba_IT
      @Cyba_IT 9 months ago +5

      That's what I was thinking, and I thought that was the default way to use eGPUs. It seems very inefficient not to do it that way unless you're just using the eGPU as some sort of cache or hardware accelerator or something.

  • @RonLaws
    @RonLaws 9 months ago +28

    oh no he's installing the nVidia driver from nVidia instead of the gpu repo :(

  • @ok-tr1nw
    @ok-tr1nw 9 months ago +49

    First law for Nvidia on Linux is to never use the Nvidia installer; for Ubuntu, use the graphics PPA

    • @cnr_0778
      @cnr_0778 9 months ago +5

      This.

    • @roccociccone597
      @roccociccone597 8 months ago +8

      I mean the best rule of thumb is to just not use nvidia to begin with. The hours I've wasted fixing broken kernel updates thanks to nvidia's incompetence left me scarred for decades to come.

    • @cnr_0778
      @cnr_0778 8 months ago

      @@roccociccone597 This. If you have a choice you should just not use nVidia to begin with.
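For Ubuntu, the PPA route the thread recommends looks roughly like this — package names vary by release, and `nvidia-driver-535` here is just an example version:

```shell
# Install the NVIDIA driver from the graphics-drivers PPA instead of the
# .run installer, so DKMS rebuilds the module on every kernel update:
sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt update
ubuntu-drivers devices               # list drivers matched to the detected GPU
sudo apt install nvidia-driver-535   # or the version ubuntu-drivers recommends
```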

  • @Astravall
    @Astravall 9 months ago +48

    The 4070 only has a single eight-pin connector, which can supply a maximum of 150 watts, plus 75 watts max from the PCIe slot, so it cannot draw more than 225 watts. The extra 650-watt power supply is overkill for both graphics cards IMHO.

    • @juanignacioaschura9437
      @juanignacioaschura9437 9 months ago +3

      The problem is not the TBP for the nVIDIA Cards, but rather the transient power spikes they have.

    • @nated4wgy
      @nated4wgy 9 months ago

      @@juanignacioaschura9437 This is not an issue on Ada Lovelace cards. It was with Ampere, but not anymore. See Gamers Nexus video reviews of the 40-series cards; it was one of the first things they tested.

    • @Objectorbit
      @Objectorbit 9 months ago +9

      @@juanignacioaschura9437 The 4000 series lessened the issues with those spikes. A 400 watt PSU would have been fine for powering nothing but the GPU

  • @spoopyangie
    @spoopyangie 9 months ago +27

    You're using 22.04 LTS. Its kernel predates RDNA2. That might be a (not the) problem...

    • @JakeR0bH
      @JakeR0bH 9 months ago

      Won't the linux-firmware package contain up-to-date firmware for AMD GPUs? I've been running an RX 6700 XT from 22.04 to 23.04 and it's been working fine.

    • @spoopyangie
      @spoopyangie 9 months ago +1

      @@JakeR0bH Ye 23.04 should be fine. But 22?

    • @egsuslo
      @egsuslo 9 months ago +1

      Kernel 6.2 (which is used in the latest 22.04) supports RDNA2. My rx 6700xt runs just fine on latest Ubuntu 22.04.

    • @spoopyangie
      @spoopyangie 9 months ago +1

      @@egsuslo 22.04.3 LTS does indeed use 6.2. I stand corrected 😅
      Besides blacklisting the old radeon driver to force the amdgpu module, I don't really see a solution or the actual problem...

  • @majinshinsa
    @majinshinsa 9 months ago +5

    That plot twist was unexpected but wicked cool

  • @MR.Peanut2
    @MR.Peanut2 9 months ago +29

    I thought you got your hands on the leaked Xbox

    • @MateoThePro
      @MateoThePro 9 months ago +5

      Xbox Series X Digital Edition 😂😂😂

    • @TechTonic420
      @TechTonic420 9 months ago +2

      It had to run some sort of Windows and some Nvidia GPU in order to qualify as an xbox 😂. Hold on.....
      If you would install the lightest ever linux distro that could emulate the xbox, then it would be an even better xbox

    • @ok-tr1nw
      @ok-tr1nw 9 months ago

      @@TechTonic420 Funny you say Nvidia when the only Xbox with an Nvidia GPU is the very first one

    • @poggerfrogger9327
      @poggerfrogger9327 9 months ago +1

      Xbox 720

  • @bryans8656
    @bryans8656 9 months ago +8

    I did this exact same upgrade (except for the video card) when I had my Pro 13 a few years ago. I liked the challenge of the CPU replacement.

  • @arranmc182
    @arranmc182 9 months ago +11

    When using Thunderbolt 2, if you ask it to send the image back to be output over the built-in graphics, you are throwing away performance. If you want the best performance possible, run a monitor directly from the external GPU; that way you're not chucking away bandwidth sending so much data over the slow Thunderbolt 2 link.

  • @Mitch3D
    @Mitch3D 9 months ago +5

    I like that these systems are becoming so cheap; it's good to know they're actually a great Windows gaming PC for the price.

  • @Somelucky
    @Somelucky 9 months ago +5

    After having watched several Jeff G videos about getting these big gaming cards working over external interfaces, it was pretty clear you were in for a challenge. Glad to see that the integrated video on the can works well in Windows at least.

  • @mausmalone
    @mausmalone 9 months ago +17

    This kind of puts some cold water on my idea of setting up an eGPU for my linux laptop so I could theoretically game when I dock it at home. It seems like the sort of project where I could spend a boat load of money and still come up with absolutely nothing in the end.

    • @RockyPeroxide
      @RockyPeroxide 9 months ago +2

      I think you can get your money back if you decide it's not for you within 2 weeks or something.
      At least, that's the consumer protection law here for online shopping.
      It should be able to work on Linux, but then again, I once tried to do surround sound using different sound cards; that should also work, but I couldn't get it to.

    • @Brant92M
      @Brant92M 9 months ago +4

      It works much better when you use normal hardware

    • @archlinuxrussian
      @archlinuxrussian 9 months ago

      It depends on the hardware and how it's actually hooked up. I'd recommend looking around on Phoronix or another Linux-centric forum (gamingonlinux maybe? or /r/linux?) for information on specific hardware. The setup in this video was quite esoteric and niche, so I wouldn't take it as indicative of all eGPU setups. And I have some reservations regarding his use of "AMDGPU drivers" and how the monitor was hooked up upon first trying the eGPU...

    • @CFWhitman
      @CFWhitman 9 months ago +10

      I wouldn't read too much into this experiment unless you are trying to use a Thunderbolt 3 eGPU with a Thunderbolt 2 device.

    • @tarajoe07
      @tarajoe07 8 months ago +2

      USB-C ones work great

  • @nmihaylove
    @nmihaylove 9 months ago +6

    Can you passthrough the eGPU to a Windows VM? Sounds like more in the spirit of this channel than running Windows native.

  • @DigitalDiabloUK
    @DigitalDiabloUK 9 months ago +6

    I've had a similar fight with a trashcan trying to work with egpus. The problem was that Windows often wouldn't boot with the thunderbolt card connected, and I couldn't get the OSX patches to support a modern eGPU. I might have another go as the weather turns colder and having a space heater would be useful in the office 😂

  • @globalvillagedrunk
    @globalvillagedrunk 9 months ago +5

    Loving this series. I did the same CPU upgrade with a cheap Mac Pro off eBay recently. OpenCore Sonoma guide next?

  • @lucasjones6295
    @lucasjones6295 9 months ago +4

    Just a quick tip: the GPU power supply requirements assume you have one power supply powering the entire computer, so the 400W would be fine for any GPU except something like an x900 AMD GPU or an xx90 Nvidia GPU, assuming the GPU's max power draw is less than the 400 watts the power supply can provide

  • @jwoody8815
    @jwoody8815 9 months ago +2

    IDK, I'm not an Apple guy really, but that "trashcan" upgraded would probably make a damned good Steam box; it even kinda looks the part of some unknown modern game console.
    Wouldn't mind having one myself to tinker with.
    FYI, a 2060 - 3050 would have made a pretty competent gaming setup.

  • @Astravall
    @Astravall 9 months ago +13

    Wait... why do you need a driver for the AMD 6750 XT graphics card? OK, it might be different for Thunderbolt, but the Linux kernel should support the card out of the box. Which Ubuntu and kernel version did you use? I'm certain a recent kernel will support it.
    And I'm pretty sure 650 watts are recommended for a complete system with CPU etc. A Radeon 6750 XT only draws 250 watts maximum. Sure, there might be spikes that are higher, but 400 watts should be enough for powering the card alone.

    • @CommodoreFan64
      @CommodoreFan64 9 months ago +1

      I have an ASRock AMD RX 6650 XT 8GB GPU on my Erying Intel 12900H MATX mobo with 32GB DDR4 3200MHz RAM, and running Manjaro GNOME in traditional desktop mode on the latest kernels it works just fine, with DP-to-HDMI adapters powering two of the same 1080p 75Hz Sceptre monitors he showed in the video. Even Steam works just fine for the most part.
      Having said that, I stopped using Ubuntu and Ubuntu-based distros a few years back when they started doing more and more of their own BS wank, like not having the latest kernels, trying to kill 32-bit libs, snaps, etc. That's part of why companies like Valve moved away from using Debian/Ubuntu as the base for SteamOS and moved to an Arch base. Maybe he should just try installing Manjaro GNOME with the latest updates and newest kernel and see if that doesn't help solve some things (not sure it will for the internal dual GPUs on that trash can, but worth a shot).

    • @RealOny
      @RealOny 9 months ago +2

      @@CommodoreFan64 I think he used Ubuntu 22.04.3, which is running an over-a-year-old kernel; that would explain all the issues. Maybe switching to a bleeding-edge distro like Arch, or any distro offering a recent kernel, would help.

    • @CommodoreFan64
      @CommodoreFan64 9 months ago

      @@RealOny I'm personally not a huge fan of Pop!_OS, or distros based on Ubuntu in general, as once Ubuntu makes major changes that are bad for the community, many of the distros based on it just follow along like sheep. And rolling pure Arch on something like the Apple trash can has its own set of headaches, which is why I suggest Manjaro GNOME as a balance: it still has the latest kernels and is based on Arch, but packages are somewhat more vetted before being pushed to the main repos, and it's easy to set up and get going fast. Plus the Manjaro team has been working really hard to make Manjaro more stable, with far less breakage than in the past.

    • @MashonDev
      @MashonDev 16 days ago

      Just remember to install Mesa and Vulkan!

  • @tralphstreet
    @tralphstreet 9 months ago +11

    I would have gone with Arch, or some other rolling release distro. Having the latest packages really helps when it comes to gaming. Also, why not plug the display to the eGPU directly?

    • @serqetry
      @serqetry 9 months ago +8

      Yes, Ubuntu is a terrible distribution to be doing this kind of stuff with.

    • @peppefailla1630
      @peppefailla1630 9 months ago +2

      Fedora is rolling release

    • @peppefailla1630
      @peppefailla1630 9 months ago +1

      @@initial_kd The situation got better over time. Nowadays Arch even has a TUI installer; even like 3 years ago that would have been CRAZY

    • @Motolav
      @Motolav 9 months ago +1

      @@peppefailla1630 Fedora isn't entirely rolling release; it's a hybrid distro

  • @MadsonOnTheWeb
    @MadsonOnTheWeb 9 months ago +2

    I was watching and suggesting Windows in my mind, then you did it. It works well for a machine of that generation.

  • @GaiusIuliusCaesar1
    @GaiusIuliusCaesar1 9 months ago +6

    I would suspect that there is a weird PCIe 4 to PCIe 3/2 mismatch. I've seen that happen with PCIe 4 compatible cards running on a PCIe 3 riser; I had to force the primary PCIe 16x slot to run at PCIe 3 (default is auto) in the BIOS. Don't know if that can be done for the eGPU enclosure.

  • @gumbyx84
    @gumbyx84 9 months ago +1

    I love these crazy videos. If I had the extra cash, I'd be very tempted to pick up my own trashcan Mac

  • @CodingItWrong
    @CodingItWrong 9 months ago +1

    The constant references to Spindler are especially hilarious with reference to a 2013 machine😂

  • @KiteAndKeyProductions
    @KiteAndKeyProductions 9 months ago +2

    Been thinking of buying a trashcan Mac. It'd only be my second Mac ever after my 07 polycarb, but those little turds always spoke to me.

  • @coyote_den
    @coyote_den 9 months ago +2

    I love the design of the trashcan Mac Pro... but I compared its maxed-out Geekbench scores to the M1 Pro I'm on right now and good god.

  • @jgaming2069
    @jgaming2069 9 months ago

    I like how you integrated the ad into the theme of the video

  • @JARVIS1187
    @JARVIS1187 9 months ago +2

    Had a 970 EVO in my iMac 2019 and was SO disappointed. macOS had trouble with trim, so every restart was a test of patience. Took about 5 minutes each time. Booting into Windows via Bootcamp was no problem, it was just macOS.
    Just for you to consider when you swap back to macOS (maybe possibly eventually) and notice the same: it is the SSD.

  • @VIRACYTV
    @VIRACYTV 9 months ago +2

    Someone needs to make translucent shells for this. Show the beautiful guts it has

  • @nrg753
    @nrg753 9 months ago +3

    The interesting thing about those universal blue images (like bazzite) is that you can switch out to any other image with a single command. Or switch to another ostree based OS like the official Silverblue or Kinoite. You don't even lose your data or flatpak apps.

    • @Cyba_IT
      @Cyba_IT 9 months ago +1

      A SteamOS-like OS would be pretty cool on the can. It would limit its functionality somewhat, but it would be pretty cool next to the TV just for gaming.

  • @cogspace
    @cogspace 4 months ago

    Used one of these as a workstation at my job in like 2014. Always liked the design even if it was hilariously impractical in many ways.

  • @spore2009
    @spore2009 9 months ago

    thanks for the video.

  • @robsquared2
    @robsquared2 9 months ago +2

    Remember, any trash can can be a gaming trash can if you use it as a basketball hoop.

  • @NeverlandSystemZor
    @NeverlandSystemZor 6 months ago

    You are a madman... and I LOVE it. :D

  • @gregzambo6693
    @gregzambo6693 9 months ago +2

    It would look great with a clear cover.

  • @u0aol1
    @u0aol1 9 months ago +1

    Looks like the kind of design a technician would have nightmares about. Looks gorgeous though

  • @goodasdead4303
    @goodasdead4303 9 months ago +2

    I've told people for years these weren't bad computers, they were just expecting way too much from a company like apple.

    • @CommodoreFan64
      @CommodoreFan64 9 months ago

      That's true, but sadly so many people fall for Apple's marketing wank, and get caught up in the hype thus leading to unrealistic expectations.

    • @BridgetSculpts
      @BridgetSculpts 9 months ago

      @@CommodoreFan64 I think he means the toxic PC gamers, not people 'disappointed' with their $3k purchase. Most people that bought this computer at the time were pretty happy with it. I remember when it came out, and all the videos of people trying to build a hackintosh with the same specs for the same price, but it wasn't possible; it usually came out more expensive.
      If you bought this computer back in the day, I'm sure you could justify it if it was for work.

  • @kevinmckenna5682
    @kevinmckenna5682 9 months ago +2

    I would love to see you benchmark a bunch of new-ish games on that Win10 trashcan system.

  • @JapanPop
    @JapanPop 9 months ago +6

    You inspired me to install Ubuntu on an old Dell machine with an Nvidia 1050ti GPU. Problem is, I still can’t figure out how to get Steam to work. Nevertheless, I will prevail!

    • @RukaIXR
      @RukaIXR 9 months ago +2

      Good luck! I hope it goes well for you!

  • @jaredwright5917
    @jaredwright5917 9 months ago +1

    I remember when the trash can came out and was always wondering how the hell it was built, given its weird shape. I thought they might have built the whole thing on a large flexible PCB and rolled it up to fit the case.

  • @arthurswart4436
    @arthurswart4436 9 months ago +3

    The firmware probably doesn't support the 6750xt. Would be nice to try again with a 1080ti.

  • @Cyba_IT
    @Cyba_IT 9 months ago +1

    I would love to have one of these simply because they are very cool computers and they actually run Windows very well. Problem is a reasonably well specced one is still around $900 ($1.4K here in New Zealand) which is a lot for a PC just to have as an ornament! I would use it of course but just for messing around on.

  • @E54OW
    @E54OW 9 months ago +1

    LOVE ITT

  • @SagesCartoon
    @SagesCartoon 3 months ago +1

    For those interested in playing more modern games who want a viable low-to-medium-end computer (as of 2024), I'd recommend getting the version with the 2697 v2 as stock, as well as the dual FirePro D700s. Oh, and you may as well put 128GB of RAM in there.
    Boom! You've got a decent CPU (12c/24t, 3.5GHz), a decent GPU setup (7 teraflops, 12GB of VRAM), and 128GB of RAM!

  • @seshpenguin
    @seshpenguin 9 months ago +2

    I daily drove Arch on my MBP with an eGPU for a while, though it was a whole thing to set up. I learned two things:
    1. You cannot hotplug the GPU; it needs to be there when the system boots.
    2. You need to tell the window manager to use the GPU as the primary graphics device (in KDE/KWin, I had to set the KWIN_DRM_DEVICES environment variable to /dev/dri/card1). Then after logging in, KDE would spring to life on the eGPU.

    • @archlinuxrussian
      @archlinuxrussian 9 months ago +2

      Also, at least in this case, I have a hunch that plugging the monitor directly into the eGPU may have made a difference? Also a bit concerned when he mentioned "AMDGPU drivers"...not sure if he meant the proprietary ones or not (which I'd almost never recommend).
      Also, nice profile picture :)

    • @seshpenguin
      @seshpenguin 9 months ago +2

      @@archlinuxrussian Yea, I don't think I ever got display loopback to work, which was fine because that would decrease performance anyway (especially on TB2 bandwidth on the Mac Pro).
      Also thanks :P
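For reference, the KWin trick described in this thread looks something like the sketch below — card numbering is system-specific, so check which node is the eGPU before pinning anything:

```shell
# Identify which /dev/dri/cardN is the eGPU (by-path names hint at the bus):
ls -l /dev/dri/by-path/

# Then make KWin (Wayland) treat the eGPU as the primary DRM device, e.g. in
# /etc/environment or your session startup — eGPU first, internal GPU second:
KWIN_DRM_DEVICES=/dev/dri/card1:/dev/dri/card0
```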

  • @whophd
    @whophd 9 months ago +1

    Things I’ve done: 1. Upgraded my trashcan to 12-core (with a lot of help, but it was open heart surgery feels!) 2. Use Windows on a trashcan for gaming 3. Use a 6900XT eGPU with massive 750W power supply on a Mac Mini 2018 for AAA gaming, if you can call MSFS 2020 a AAA title? It sure needs the processing power. But what I’ve never tried is to grab my TB2-TB3 adaptor and connect the eGPU to my trashcan. I’ve seen a few articles about it though - you need to add special support to get eGPU working over TB2. Read up online first! And that’s before you even go beyond macOS.

  • @c.n.crowther438
    @c.n.crowther438 4 months ago

    If that shroud were clear with a bit of lighting, that thing would look cool af.

  • @philtkaswahl2124
    @philtkaswahl2124 9 months ago +1

    An Intel Apple product with Linux (then Windows) installed on it, an AMD GPU (then an NVidia one, then stock), and a controller for a Microsoft console.
    Gotta love the mishmash.

  • @Francois_L_7933
    @Francois_L_7933 9 months ago +1

    It's funny because every time you remove the case on this one I get the same feeling I had when I saw Darth Vader remove his helmet in The Empire Strikes Back..

  • @SlowHardware
    @SlowHardware 9 months ago

    Good video!

  • @complexacious
    @complexacious 9 months ago +2

    I wonder if the reason the 6750 is failing is because you're using it in a PCIe3 capable Thunderbolt 3 enclosure so it attempts to operate at PCIe3 specs, but then you've got the thunderbolt 2 to 3 adapter which can't tell the card that the link is only PCIe2 capable. People ran into these kinds of problems with PCIe3 risers on PCIe4 capable motherboards. The card would see the mobo was capable of 4 and try to use it, not knowing that there was a non-compliant riser in the way. You had to tell the motherboard to only use PCIe3 signalling but you had a chicken and egg problem.
    The 1050Ti maybe worked because it was old enough that it was only trying to use a slower specification. Maybe it's only 8x wired? Maybe some quirk made it decide to be a PCIe2 card?
    eGPUs are always fiddly.

  • @Astinsan
    @Astinsan 9 months ago

    8:38 ice trays.. what a great idea

  • @btarg1
    @btarg1 9 months ago +2

    I am very surprised that this machine has an M.2 slot; PCs didn't have those as a common standard for another couple of years

    • @JosephHalder
      @JosephHalder 9 months ago +3

      It's not exactly an M.2 slot; it's proprietary, but still PCIe like M.2, which is why it can be adapted.

  • @dmug
    @dmug 9 months ago

    Good vid, just wanted you to know though there’s a 60Hz hum in the background audio.

  • @FavoritoHJS
    @FavoritoHJS 9 months ago

    about bus speeds, if memory serves capacity by itself doesn't limit clock rates, but 4 sticks of memory tend to have lower clock rates than 2 due to bus issues, and larger capacity sticks might have lower maximum clock rates by themselves.
    of course, apple could also be mucking things up by automatically setting speed based on ram count...

  • @karlc11
    @karlc11 7 months ago

    Just picked one of these up for £100 and I'm stoked. It's much faster than my 3770K and 7970 GHz Edition, and the size and quietness are fantastic. I've always been an Apple hater, but the aesthetics of this thing are great, and so is the build quality

  • @maillouski
    @maillouski 9 months ago

    Sean, it's really funny to see your beard grow throughout the video, showing how long and hard it was to get to something working :P

  • @dgpsf
    @dgpsf 9 months ago

    1:00 I died when you added "Tire"
    LOLLLLLLLL

  • @thavith
    @thavith 9 months ago +2

    I was really hoping the Linux only way would work. I am sure there are some Linux guys out there that have done this.

  • @Arachnoid_of_the_underverse
    @Arachnoid_of_the_underverse 9 months ago +2

    I think it looks great without a cover, real R2-D2-ish. Maybe you could make a clear perspex cover for it.

    • @beenine5557
      @beenine5557 9 months ago +1

      Exactly what I was thinking! Except in my mind, this was clearly an Imperial darkside astromech, given the all black color-scheme.

  • @DouglasWalrath
    @DouglasWalrath 9 months ago +1

    It showing as "NVIDIA Corporation Device" is just because it doesn't have the updated PCI IDs for it. This will not prevent the card from working, since the name is entirely cosmetic and for the user.
    If you want it to show the actual name of the device, run sudo update-pciids and it'll fetch the latest PCI IDs from the internet
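The two commands involved, for anyone following along (the database refresh needs network access and root; the path of the `pci.ids` file varies by distro):

```shell
sudo update-pciids            # refresh the local pci.ids database from the internet
lspci -nn | grep -i nvidia    # numeric vendor:device IDs plus the (now resolved) name
```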

  • @jangosch8647
    @jangosch8647 1 month ago

    If you have two GPUs, the standard setting for the nvidia driver is "On demand". The desktop then always uses the integrated GPU, and the discrete GPU is only used when needed or when you tell the system to use it. Ubuntu has a "Use discrete graphics" option when you launch programs. You can actually run the Steam launcher on the integrated GPU and the games will run on the discrete one, since most games would not run well on the integrated one anyway. At least this is how it works on my Lenovo with a discrete NVIDIA card.
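The same per-application GPU selection can be done from a terminal; a hedged sketch of the environment variables behind that menu option (proprietary NVIDIA driver vs. the generic Mesa mechanism):

```shell
# NVIDIA proprietary driver ("On demand" mode): render one program on the dGPU
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"

# Mesa (AMD/Intel) equivalent:
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"

# In Steam, the same idea goes in a game's Launch Options:
#   DRI_PRIME=1 %command%
```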

  • @tk421dr
    @tk421dr 9 months ago +2

    Installing RAM on this is like swapping isolinear chips on an LCARS terminal. Kinda want one just for that; it could be a neat home server.

  • @hermean
    @hermean 9 months ago +1

    I wanna see gaming on a ThinkPad with an eGPU, myself. I got it to work on my x200, exactly once. The ultimate Linux gamer experience, surely.

  • @yjk_ch
    @yjk_ch 9 months ago +2

    12:16 Depending on how old the kernel is on the distro you've chosen, the amdgpu driver in the kernel may not support your GPU.
    I couldn't get Debian working on my RX 6600 XT, for instance, because the kernel was too old and didn't even attempt to initialize the card (not a single dmesg line). Maybe recent Debian works fine, but I don't know.
    I could have compiled the latest Linux kernel myself, but I didn't want to go that route.
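A quick way to check whether you're in that situation — a diagnostic sketch, assuming a Linux host with pciutils installed (dmesg may need root on some systems):

```shell
# Sanity checks for "the kernel never initialized my GPU".
uname -r                                                    # kernel version; new GPUs need new kernels
lspci -k 2>/dev/null | grep -A3 -i 'vga\|display' || true   # which kernel driver (if any) is bound
dmesg 2>/dev/null | grep -i amdgpu || true                  # no lines at all = amdgpu never touched the card
```

If lspci shows no "Kernel driver in use" line for the card and dmesg is silent, the running kernel simply doesn't know the GPU, and a newer kernel (e.g. from backports) is the usual fix.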

  • @dustinschings7042
    @dustinschings7042 9 months ago +4

    The PSU in the external GPU enclosure should have been more than enough to run any of the GPUs you tried. 400W is plenty for almost any GPU out there. Yes, the AMD card recommends a 650W PSU, but that's accounting for the PSU powering a whole PC as well, not just the GPU. If you couldn't get almost anything to work with only the PSU inside the enclosure, then I would guess that the 400W PSU simply isn't working properly.

  • @icanrunat3200mhz
    @icanrunat3200mhz 9 months ago +4

    I used to be an "old school [K]Ubuntu fanboy" as well, until I tried OpenSUSE. It just seems that little bit more polished. Plus, YaST is super slick for managing things that would otherwise be kinda tedious.

    • @joe--cool
      @joe--cool 9 months ago +3

      Basically anything other than Ubuntu is much slicker, faster and smaller. I still cannot fathom why Canonical clings to its failed snaps.

    • @WyvernDotRed
      @WyvernDotRed 9 months ago +3

      Back before the move to Snaps I got into Linux with Ubuntu MATE, and I was happy enough with the OS not to distro-hop further.
      But as Snaps got rolled out, the modern laptop I ran it on got slow at both system and application startup.
      That led me to hop to Manjaro, and after getting annoyed with their bad management of the distro I somehow ended up moving to Garuda Linux.
      It has its (significant) problems too, but it's a great OS for me and one I'm sticking with long-term, as I like how easy Arch makes installing up-to-date obscure software.
      After trying it on an older laptop, though, openSUSE has become my backup distro, as I can replicate all the setup I like, including most of the obscure stuff, with personal repos.
      openSUSE Tumbleweed is significantly more stable than Arch, both in terms of package churn and breakage on updates, but I'm not changing a system I'm happy with.

  • @fwah23
    @fwah23 16 days ago

    Your videos are allowing me to get on with my life while protecting me from janking up my life too hard with my ADHD brain. Thank you

  • @thedefectivememe
    @thedefectivememe 6 months ago +1

    Keep the Linux dream alive and maybe try using KVM! You could do something tricky by dedicating the integrated GPU to video output on the Linux host and passing the NVIDIA GPU through to a Windows VM.
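The core of that trick is binding the discrete card to vfio-pci at boot so the host never claims it. A minimal sketch of the modprobe config; the 10de:xxxx IDs below are placeholders, so substitute the vendor:device pairs that lspci -nn reports for your GPU and its HDMI audio function:

```
# /etc/modprobe.d/vfio.conf (sketch) -- claim the discrete GPU for vfio-pci
# before the nvidia/nouveau drivers can bind it.
# Replace the placeholder IDs with your card's own from: lspci -nn
options vfio-pci ids=10de:1b81,10de:10f0
softdep nvidia pre: vfio-pci
softdep nouveau pre: vfio-pci
```

After rebuilding the initramfs and rebooting, the card can be handed to the Windows VM as a PCI host device (e.g. in virt-manager), while the integrated GPU keeps driving the Linux desktop.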

  • @jbettcher1
    @jbettcher1 9 months ago +3

    It's probably been said, but that RTX 4070 only draws like 200W. I know, you were probably just addicted to the jank; I get it.

    • @fuelvolts
      @fuelvolts 9 months ago +2

      I was so confused by what he was doing. I don’t know of any consumer GPU that wouldn’t work with 400w when that PSU is only powering the GPU. No need for a janky second PSU.

  • @semperfidelis316
    @semperfidelis316 9 months ago +2

    That CPU swap has been my favorite one to do. Yes it's a crap ton of work and shouldn't be, but I guess I'm weird and like computers like that. I let my kid drill a hole in the old cpu and put a key ring on it. He thinks it's so cool.

  • @ran2wild370
    @ran2wild370 9 months ago +2

    LOL, r/linux is going to fire plasma torch from all exhausting orifices. :-))

  • @mikehensley78
    @mikehensley78 9 months ago

    You tricked me into watching a windows video!? LOL. Man, i love taking shit apart and doing exactly what you do with all the parts you have. Hell yeah.

  • @rog2224
    @rog2224 9 months ago +2

    Without its case, the Trashcan looks like something from the Death Star.

    • @BridgetSculpts
      @BridgetSculpts 9 months ago +1

      That's why I named mine VaderPro.

  • @NaoPb
    @NaoPb 9 months ago

    Quite the plot twist, but we did learn a thing or two.

  • @hahohihi-zm7ou
    @hahohihi-zm7ou 9 months ago +1

    Try connecting the GPU through an M.2 adapter. There are adapters for this, e.g. M.2 to PCIe or M.2 to OCuLink.

  • @AltimaNEO
    @AltimaNEO 9 months ago

    Man, what a plot twist!

  • @iceowl
    @iceowl 9 months ago

    do you ever feel like saying "the plans for that battle station are stored in this droid"?

  • @eljefe7070yt
    @eljefe7070yt 1 month ago

    I think it's time for a capture card, friend. Thanks for the content; however, we're missing a lot of info from when you connected the eGPU.

  • @UpLateGeek
    @UpLateGeek 9 months ago +4

    I'm wondering whether you'd have more luck running it through a native TB2 EGPU (correctly pronounced egg-poo) enclosure, rather than using a TB3 unit with the TB2 adapter?

  • @TylerComptonShow
    @TylerComptonShow 9 months ago +2

    In my experience, installing the Linux drivers directly from Nvidia causes nothing but pain and suffering. In my very early days of using Linux, I ruined a couple installs trying to do that!

  • @BRBTechTalk
    @BRBTechTalk 9 months ago

    18:53 Bravo ha, I love your comment and I agree with you 100%

  • @michaelmullett7327
    @michaelmullett7327 9 months ago

    Love it and I want one bad

  • @beenine5557
    @beenine5557 9 months ago

    Too bad about the plot-twist ending after all that work, but hey! You've saved a bunch of Linux geeks a lot of time by letting us know what _doesn't_ work.

  • @Astinsan
    @Astinsan 9 months ago

    9:35 I found Clorox wipes work great to clean old compound off.

    • @bstock
      @bstock 9 months ago

      That won't do a thorough job though; there are micro-cracks in the copper that actual thermal compound cleaner can clean out.
      Whether it makes that much of a difference, I don't know. Maybe I'm just being sold on marketing, but from what I understand you want to use actual thermal compound cleaner when removing it.

  • @jaimeduncan6167
    @jaimeduncan6167 9 months ago +1

    Sometimes it just doesn't work. Loved the video. I wonder: if Apple built a machine like this, it could have three M2 Ultras and a bus, and it would be easier to clean than the Studio (which I love, don't get me wrong). The thermals simply weren't there. It was evident to me, but for some reason it was not to Apple.

  • @floydlooney6837
    @floydlooney6837 9 months ago

    I would never have remembered how to put that thing back together. And now you have two more expensive GPUs for future projects.

  • @ahmetrefikeryilmaz4432
    @ahmetrefikeryilmaz4432 9 months ago +1

    Dude, the PSU in that external whatever is perfectly fine for anything less than a 3090...

  • @TheGreatMatthias
    @TheGreatMatthias 9 months ago +1

    For a 2012 MacBook Pro:
    What eGPU should I get, and should I run Windows or a Linux distro made for gaming?
    If Linux, which one is best for gaming?

    • @tostadorafuriosa69
      @tostadorafuriosa69 9 months ago

      For gaming, almost any distro is suitable. There are some "gaming"-focused distros, but as far as I know they don't change all that much. If you're still interested in them, a dude named GloriousEggroll made a Fedora spin. But if I were you I would use plain Fedora with a DE you like; for the most part the better choices are GNOME and KDE. That said, if you're going to game, Windows will be the better choice, because most stuff just works aside from shoddy PC ports.

  • @JonneBackhaus
    @JonneBackhaus 9 months ago

    oldschool and ubuntu in the same sentence: made me chuckle.

  • @annieworroll4373
    @annieworroll4373 1 month ago

    Neat, a 970 EVO. I've got two 1TB units: one in my desktop and one that replaced the 256GB Gen 2 SSD my laptop came with.

  • @sirena7116
    @sirena7116 9 months ago

    I so want to do this now! I just have to find a reasonably priced Mac Pro 6,1.

  • @macbuff81
    @macbuff81 5 months ago

    Yes, I did the upgrade to 128GB, and yes, it reduces the RAM speed from 1866MHz down to 1066MHz. The issue here is not the memory, but rather a technical limitation of the Xeon processor's memory controller. However, that is still a lot faster than any swap file, even one on an NVMe storage medium. Everything pretty much runs from memory on my 2013 trash can at this point. I also installed an AngelBoard, which allows me to install regular M.2 drives.
    I also have an eGPU; however, the newest macOS it runs (Mojave) won't allow the use of it. Thanks, Apple. They say it's because of stability issues, which is BS of course. Yes, you won't get the full bandwidth the card is capable of (in my case, a 5700 XT), due to the Thunderbolt 2 port only being capable of 20Gb/s vs. TB3, which can do double that. However, it could still work with the TB3 to TB2 adapter from Apple. Apple simply wants you to buy a new system (their standard strategy). There is a workaround for this which does not require turning off SIP (system integrity protection). It's called "Kryptonite." It is still a bit of a hassle though.

  • @LabCat
    @LabCat 9 months ago

    This is the level of jank that I subscribed for.

  • @hattree
    @hattree 9 months ago

    If you've ever seen Auralnauts Star Wars ...."Thank you, Magic trashcan."