GPUs on a Raspberry Pi Compute Module 4!

  • Published: 26 Oct 2024

Comments • 1.4K

  • @JeffGeerling
    @JeffGeerling  4 years ago +555

    09:32 This is the way

  • @mikespencer3209
    @mikespencer3209 4 years ago +1371

    Dude, doesn't matter if it worked or not, I was able to learn a lot from this video. Thanks for all the videos you make.

    • @GadgetStation
      @GadgetStation 4 years ago +5

      That's the point 😁

    • @achimtillmann234
      @achimtillmann234 4 years ago +2

      That's the way

    • @frogandspanner
      @frogandspanner 4 years ago +2

      The spirit of adventure: that's what made me a scientist.

    • @DetConanEdogawa
      @DetConanEdogawa 4 years ago +4

      Thinking a little more about the hardware: even if you can find a GPU that doesn't need the IO BAR (and I don't know if one exists at all), you have a physical PCIe Gen 2 x1 link, but can it handle the power for a GPU? I mean, a discrete GPU can draw 50-75W from the PCIe slot; can the IO board for the Compute Module 4 provide that?

    • @Chooch1998
      @Chooch1998 4 years ago

      @@DetConanEdogawa I was thinking the same thing.

  • @sandordugalin8951
    @sandordugalin8951 4 years ago +718

    RPF: "We're gonna put a PCI-E slot on the new Raspberry Pi!"
    Users: "That's great! What sort of things will it support?!"
    RPF: "... support?"

    • @JuanPerez-jg1qk
      @JuanPerez-jg1qk 4 years ago +38

      They forgot the Broadcom chip doesn't support PCIe I/O devices that produce video or sound - only plain data transfers.

    • @johanrosenberg6342
      @johanrosenberg6342 4 years ago +16

      From the link in the description of the video it seems like it can use quite a few different PCI-E cards: networking, USB, NVMe, SATA...
      It would have been cool to be able to use other cards too, but I don't know how big the market is for Pi 4 CMs with a GPU.

    • @l3p3
      @l3p3 4 years ago +10

      @@johanrosenberg6342 Actually, for the intended purpose of these tiny computers, things like SATA controllers are ideal.

    • @johanrosenberg6342
      @johanrosenberg6342 4 years ago +7

      @@l3p3 Indeed, I know many people use RPis as servers, and in that case the need for storage far exceeds the need for any kind of graphics.
      However, the video briefly touched on the idea of using GPUs for parallel calculations. I can't say whether that is a viable application, but it was primarily what I was referring to when I said it would be cool to be able to use other cards.
      I'm currently taking a class in shader programming, and it really is amazing what a GPU can carry out with only a few little lines of code and some heavy math.

    • @l3p3
      @l3p3 4 years ago +2

      @@johanrosenberg6342 Actually, there is a GPU integrated into the SoC. I have no idea what kind of GPU it is, but maybe you can abuse that for server stuff too. ;-)

  • @prajullas
    @prajullas 4 years ago +223

    This is a suspense thriller. I held my breath till the end.

    • @Shinjih1
      @Shinjih1 3 years ago

      Have you tried the Windows 10 ARM version?

    • @HimanshuGhadigaonkar
      @HimanshuGhadigaonkar 3 years ago

      Yes.. me too!!

    • @abrahammartinez3603
      @abrahammartinez3603 3 years ago

      Somehow I was curious too. I enjoyed the whole process even if the result wasn't good.

  • @FatheredPuma81
    @FatheredPuma81 4 years ago +48

    Let's rebrand what you just said: "It's not whether it should be done but instead whether it can be done."
    Literally sums up how I use computers, and the story of why I want to RAID 0 my flash drives, run a super slow or secondary GPU, and use that GPU to run games it shouldn't be able to.

  • @SpikeTheBear
    @SpikeTheBear 4 years ago +114

    This is what it felt like installing my wifi drivers on my first Linux laptop...

    • @tissuepaper9962
      @tissuepaper9962 3 years ago +7

      This is what installing wifi drivers always feels like for me lmao

    • @sonichuizcool7445
      @sonichuizcool7445 3 years ago

      This is the reason Linux will never be anything but a fringe OS. It's too user-unfriendly outside of custom Linux-based apps like Batocera and the like, where someone has literally taken the time and frustration to coax something useful out of it.

    • @SpikeTheBear
      @SpikeTheBear 3 years ago +2

      @@sonichuizcool7445 I disagree; there are a lot of different distros and they are constantly being updated. While I agree with you at the moment, I think that will change. People will eventually make more user-friendly versions, but that will probably take a long time.
      #edit I realized I did not fully understand the meaning of "fringe"; I mistook it for something else. My bad. It will probably stay less popular than the other major OS competitors, but I still think it will grow, so I agree with you in the end haha

    • @tissuepaper9962
      @tissuepaper9962 3 years ago +1

      @@sonichuizcool7445 Oh no, I have to actually know how to use a computer to use my computer! The horror!

    • @portlyoldman
      @portlyoldman 3 years ago +1

      @@tissuepaper9962 - sorry, but your comment is truly asinine. The greatest proportion of people who use computers in their day-to-day lives have zero clue what goes on under the GUI and care an order of magnitude less. That huge majority will never know about, use, or care about Linux unless and until they can turn it on, plug in some arbitrary peripheral, and have an almost 100% expectation of it working immediately and flawlessly. My 94-year-old father can navigate Windows, write documents, schedule Zoom meetings, and obtain value from his PC, but he'd never bother trying to make a WiFi dongle work with Linux. Why would he? LibreOffice? Not a hope, when he gets Office 365 for a few euro a month and it all just works.
      I use Linux for a lot of projects, but I would love to be able to avoid throwing a Raspberry Pi at the wall after a full day trying to wrangle with a driver for a seemingly common peripheral, and I've been at this nerdy game for decades.

  • @kayakMike1000
    @kayakMike1000 4 years ago +702

    I'm a firmware engineer. You are probably a better firmware engineer than myself. I would hire you if you interviewed with this...

    • @nowrd2xpln
      @nowrd2xpln 4 years ago +110

      I’m also a firmware engineer, but would not hire you if you interviewed presenting this to a firmware engineering team because of the lack of deeper understanding. I’ll applaud you for making an attempt and being entertaining but I’d grill you for not attempting to debug the Linux device driver code because that’s where you would begin to find and understand why the driver was failing to initialize. In an interview, if I asked why it doesn’t work, the best/simplest answer would be because the gfx card doesn’t support the raspberry pi.

    • @anonymous-rj6ok
      @anonymous-rj6ok 4 years ago +77

      Good thing they don't have firmware engineers in charge of HR.

    • @electricflyer81
      @electricflyer81 4 years ago +6

      @@nowrd2xpln Grammer, please... I don't know if English is your language of choice or not. However, you make no sense.

    • @erathemonologuer1454
      @erathemonologuer1454 4 years ago +98

      @@electricflyer81 His grammar is fine. Attacking someone's grammar in an otherwise unrelated argument is the most strawman thing I've ever seen.

    • @ronisaiba9623
      @ronisaiba9623 4 years ago +83

      @@erathemonologuer1454 He typed 'grammer' while complaining about someone's grammar. What a joke.

  • @kebman
    @kebman 4 years ago +114

    You're literally the only one bugging Nvidia with 32-bit ARM compatibility reports lol. They must _really_ love you over there! xD

  • @markusroth8770
    @markusroth8770 4 years ago +342

    I love how the video escalated more and more. From just plugging in potentially unsupported hardware, to installing drivers, to recompiling a damn kernel.
    Also, Edmund Hillary would be proud. Why try to add a GPU to a Raspberry Pi's PCIe slot? Because it's there!

    • @l3p3
      @l3p3 4 years ago +6

      When people think that recompiling damn kernels is something complicated. o,O

    • @jared4169
      @jared4169 4 years ago +4

      @@l3p3 you do realise that @Markus never said it was complicated to re-compile a kernel. They just stated it was quite an escalation from less difficult things such as installing the hardware and drivers. :)

    • @debaliol
      @debaliol 4 years ago

      Shame you couldn't 'knock the bastard off', as Edmund Hillary would have said.

    • @IsaacMIT
      @IsaacMIT 4 years ago

      @@l3p3 That does NOT mean that compiling/recompiling kernels is as easy as plug and play.

    • @AureliusR
      @AureliusR 4 years ago

      @@IsaacMIT I dunno, it's pretty damn easy. It's like three commands to build a basic kernel. Four if you first copy the current running kernel's .config file.
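
For readers curious what those "three or four commands" look like, a minimal native build is roughly the following (a sketch: the heavy steps are shown commented out, and the `zcat /proc/config.gz` step assumes the running kernel exposes its config):

```shell
# Sketch: minimal native kernel rebuild, assuming you're inside a kernel
# source tree with build tools installed. The slow steps are commented out.
JOBS="$(nproc)"                          # number of parallel build jobs
echo "would build with: make -j${JOBS}"
# zcat /proc/config.gz > .config         # copy the running kernel's config
# make olddefconfig                      # accept defaults for new options
# make -j"${JOBS}"                       # build the kernel and modules
# sudo make modules_install install      # install modules, then the kernel
```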

  • @Oaisus
    @Oaisus 4 years ago +34

    It's part of the PCI Express standard: you can literally plug any size PCI Express card into any size slot, as long as you cut the plastic out of the back of the smaller slots.

    • @W1ldTangent
      @W1ldTangent 4 years ago +11

      It's not a risk-free operation. I've damaged a couple of PCI-E slots attempting it, one to the point it no longer worked; the other just looked fugly as hell.

    • @TheAechBomb
      @TheAechBomb 4 years ago +7

      @@W1ldTangent they also make adapter cables

  • @HaradaSan28
    @HaradaSan28 4 years ago +53

    "I quickly tossed the CD and documentation aside and got right down to business" - story of my life

  • @gbenselum
    @gbenselum 4 years ago +94

    A lot of pain to make a 13:32 video. Congrats. Maximum Effort!

    • @JeffGeerling
      @JeffGeerling  4 years ago +28

      Haha, you have no idea how many times I was like ... if this doesn't work, will I have anything to show? Luckily I did it for the memes - this is the way.

    • @gbenselum
      @gbenselum 4 years ago +5

      This is the way

  • @jekader
    @jekader 4 years ago +120

    Try a Matrox card; they're ancient but still used by all kinds of server motherboards and should have good Linux support.

    • @trevorpowdrell5474
      @trevorpowdrell5474 3 years ago +1

      www.matrox.com/en/video/products/graphics-cards/m-series/m9120-plus-lp-pcie-x1

  • @sudo_rice
    @sudo_rice 4 years ago +21

    Awesome project! These really show how a Pi is versatile enough to become a lightweight daily machine (Pi in a keyboard). Thanks for the great video!

    • @JeffGeerling
      @JeffGeerling  4 years ago +9

      "Pi in a keyboard" - that would be cool; I think the CM4 would make that achievable, just as long as you can get some airflow through it so it doesn't overheat!

    • @nandulalkrishna923
      @nandulalkrishna923 4 years ago +1

      CM4 + mechanical switches + Thunderbolt (wooo) in a keyboard form factor would be really nice

    • @MaxMax-dq1lu
      @MaxMax-dq1lu 4 years ago +2

      Dang - now that's a great idea..!

    • @AMalas
      @AMalas 4 years ago +1

      @@JeffGeerling no need, make the case out of metal and slap thermal pads between

  • @shujaatchand105
    @shujaatchand105 4 years ago +20

    I learned a lot from this video, keep it up. Your way of explaining things is amazing, btw.

  • @stephencarrasquillo3964
    @stephencarrasquillo3964 4 years ago +51

    "I was getting tired and was gonna give up but thought I would try one more thing."
    .... Story of my life

  • @cunninghamster1954
    @cunninghamster1954 4 years ago +4

    @Jeff Geerling - your best video yet! Thanks so much. The jokes at the beginning had me groaning, but your can-do attitude and your clear explanation of your troubleshooting steps are great. Your knowledge is way beyond mine, but your ability to clearly break down what you're doing is helpful in teaching the rest of us how to dig further into the potential of the Pi.

  • @joseaparicio8820
    @joseaparicio8820 4 years ago +3

    I arrived at this video from a YouTube suggestion. When I saw the Trogdor t-shirt my mind was blown!
    Interesting video!

  • @GrigLarson
    @GrigLarson 4 years ago +28

    One of the things I respect about this is *I* have gone down similar paths, like trying to get ancient Iomega Zip 100 drives to work with Linux (they do, my github account is GrigLars if anyone is curious). I didn't have to recompile kernels in that case, but I think the best "why would you do this?" answer is "because it's there." And also, one learns so much ancillary stuff that helps with other projects later on. So even if one gives up after days of all that research, months or even years later, some of these bits and pieces will show up again, and you might have added them with OTHER bits and pieces, which makes one a far better Linux programmer than just "I bought a Raspberry Pi, and it's still unopened in my top desk drawer." I started with Linux in 1993 or so when we tried to compile it on an old 68000 series chip (I think an Atari ST1040). Three days later and three programmers working, we never even got a bootable kernel. But BOY did we learn a lot, and 27 years later, I feel it was still worth it. All of us went on to become big people in the industry.

    • @GrigLarson
      @GrigLarson 4 years ago +3

      Now I want to do "Linux from Scratch" on a Pi... www.linuxfromscratch.org/lfs/

    • @simpletongeek
      @simpletongeek 4 years ago +1

      Didn't Atari try to do their own unix box? I don't think it went to market.

    • @GrigLarson
      @GrigLarson 4 years ago +2

      @@simpletongeek Yes! They had "Atari System V," a short-lived version of Unix System V Release 4.0 (SVR4) developed by UniSoft. It worked on the Atari TT030 workstation. It was originally intended to be a high-end Unix workstation; however, the Tramiels took two years to release a port of Unix SVR4 for the TT. Then the TT was replaced by the Atari Falcon, which was a slower and severely bottlenecked consumer-grade system due to some ... strange ideas about CPU throughput. I don't remember why, but it was some lane channel traffic stupidity on the board, which made ZERO business sense. That era of Atari made some of the worst business decisions; the ST line should have beaten the Amiga, but the Amiga won out because Commodore was just a better-run business.

  • @pamus6242
    @pamus6242 4 years ago +118

    Asking someone why they'd use a GTX with an RPi instead of going with x86 is like asking why one puts air in bicycle tyres.
    HOW DARE YOU!
    "HOW DARE YOU!"

    • @ArtemisKitty
      @ArtemisKitty 4 years ago +9

      I'd say more like "why are you filling your bicycle tires with nitrogen, rather than air?" but... a typically good counter to "Why", in this situation, is simply "Why not?"

    • @jyvben1520
      @jyvben1520 4 years ago

      @@ArtemisKitty First question's answer: less leakage?

    • @ArtemisKitty
      @ArtemisKitty 4 years ago +5

      @@jyvben1520 Right, that was sort of my point. Just because someone may be unaware of why you are doing something they perceive as "not normal" doesn't mean it shouldn't be done, or that you don't have a good reason for doing so.
      Just the fact that it's not what you're familiar with doesn't mean it's wrong! :-p
      I especially love when people try to qualify it with "I've been in X industry for 20+ years..." - yes, so perhaps you're the local resident expert on 20 year old technology? Experience does not equate to skill, lol. I know from my countless failures at certain projects that I'm no expert, even after years of "experience", haha.

    • @hugobracamontesbaltazar
      @hugobracamontesbaltazar 4 years ago

      @@ArtemisKitty Totally agree.

    • @justaguycalledjosh
      @justaguycalledjosh 4 years ago

      "BuT nItRoGeN iS bEtTeR!"

  • @shafnet
    @shafnet 3 years ago +2

    I'm glad you're cross-compiling now, but I seriously felt your pain while watching this video... your adventure is like the ones we took 20 years ago: a lot of frustration and pain, but you learn a lot!

  • @jasnix
    @jasnix 4 years ago +128

    I would see if any of the old PowerMac cards, from when Apple used the PowerPC architecture, get past the IO BAR issue.

    • @noaether
      @noaether 4 years ago +8

      Oh god, I have two of these (two old 1999 PowerMac G4s) with all original parts. Oh god, the memories

    • @madmax2069
      @madmax2069 4 years ago +3

      Oh, that would be interesting to see.

    • @christiaansteenkamp5617
      @christiaansteenkamp5617 4 years ago +8

      This might be an interesting test. Mail Jeff a card and maybe we get a video out of the deal...

    • @stephenbaber1547
      @stephenbaber1547 4 years ago +15

      Needs to be PCI Express, not regular old PCI.

    • @timrb
      @timrb 4 years ago +16

      The PowerMac G5 2.0 / 2.3 / 2.5 used PCI-E and certainly weren't x86-based.

  • @theomegared02
    @theomegared02 4 years ago +2

    My first experience with you and it was amazing! Can't wait to watch more of your videos... I like your sense of humor! 😜

  • @hw5622
    @hw5622 3 years ago

    Wonderful job Jeff! It doesn't matter if it works or not! It's about the fun of exploration!

  • @jacevedo770
    @jacevedo770 4 years ago +7

    Bro, you're my hero! I appreciate you for going through so many obstacles. The first one through always gets the bloodiest.

  • @VaibhhavRajKumar
    @VaibhhavRajKumar 4 years ago +1

    Only a few people would think of implementing a POC on IoT modules; this video is a classic 👌.. keep up the good work.

  • @G_HarishKumar
    @G_HarishKumar 4 years ago +71

    Me: starts watching this video with a lot of expectations.
    One attempt after another fails

    • @JeffGeerling
      @JeffGeerling  4 years ago +22

      This whole project has been a roller coaster! In good news, it seems there *are* a couple of ARM developer boards which can at least output console text over an Nvidia GT 710, so there is some hope... but maybe not with the current generation of Pis :-/

    • @harrydamour7564
      @harrydamour7564 4 years ago +6

      @@JeffGeerling I just appreciate you doing this vid. Subbed 😎 a true pioneer.

    • @TheOleHermit
      @TheOleHermit 4 years ago +4

      One only fails by not trying. One learns from failures and mistakes. Many viewers learned a lot by simply watching Jeff's troubleshooting techniques. Jeff is a Raspberry Pioneer, leading the way for the rest of us, saving us from the frustrations of our own total failure.
      Apparently, you came here with the expectation of Jeff solving your problems, instead of trying yourself. Then you shoot the messenger, who has worked tirelessly and personally endured the 'agony of defeat'? Really?

    • @simpletongeek
      @simpletongeek 4 years ago

      Pretty sure there is *one* graphics card that works. I can't remember the name ATM. Did you try the official Raspberry Pi blog? Pretty sure I got it from there. HTH.

    • @wishusknight3009
      @wishusknight3009 4 years ago

      @@JeffGeerling Spend 500 hours getting an idea to work, getting close, almost there, almost there... One more try... Everything but one little pesky piece of lingering x86 baggage...
      And then ruclips.net/video/Ag1o3koTLWM/видео.html

  • @davephillips5667
    @davephillips5667 3 years ago

    Jeff, I love your out-takes!!!

  • @woodlanditguy2951
    @woodlanditguy2951 4 years ago +8

    Love the Trogdor T-Shirt and the Mando reference, Great video!

    • @JeffGeerling
      @JeffGeerling  4 years ago

      Can't wait until Friday! Mando will be burninatin' the countryside :D

  • @thorntontarr2894
    @thorntontarr2894 4 years ago

    Jeff, while I am not in need of this content, I really enjoyed your presentation and the insights it contained. Your humor is appreciated as is your humility. As one of your clips showed, failure has always been my best teacher. TOTALLY WELL DONE!

  • @nnasab
    @nnasab 4 years ago +6

    I think the problem is power to the PCIe slot, so you could use a mining riser that powers the card externally instead of drawing power from the Pi board. Good luck. Love your videos.

  • @saifal-badri
    @saifal-badri 4 years ago +1

    Jeff, I nominate you pi-guru of the year lol. Love these videos, please keep it up!

  • @nowrd2xpln
    @nowrd2xpln 4 years ago +20

    I get flashbacks from work when I see this type of struggle with Linux device drivers. Linux is amazing and great, but dealing with, or developing, kernel modules (device drivers) can be really painful and tedious. Kernel updates, which happen a lot nowadays, make me nervous because kernel library APIs often change and break the driver compilation.
    @jeffgeerling
    Issue (02:40):
    It's a kernel module build error. The driver code had an error during compilation. At 02:27, when you grab the latest Linux ARM display driver (version 390.138), it lists fixes for the Linux 5.6 Release Candidate in the Release Note Highlights, and at 02:52 I can see you are compiling the driver against a kernel of a different version, Linux 5.4.51-v7l+.
    Solution: Try downloading and compiling version 390.132, or possibly the version released after that. It claims to fix "kernel module build problems with Linux kernel 5.4.0 release candidates" in the release notes. Basically, it should play nicely with the Linux kernel you have. The next suggestion would be to download the 5.6 kernel and build the latest driver against it, but that sounds very painful.
    It would be really interesting to see you connect the Pi to the monitor using a VGA cable. In the very beginning of the video you removed the bracket, along with the VGA plug, from the card and connected it to the monitor using an HDMI cable, but at 01:52 the kernel explicitly recognizes it as a "VGA compatible controller". Could it be possible the video was being output through the VGA port and not HDMI?
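
The version-matching argument in this comment can be sketched mechanically (a hypothetical illustration: the version strings and targeted kernel series are the ones quoted above, not the output of any Nvidia tool):

```shell
# Sketch: extract the kernel's major.minor series and compare it with the
# series each driver release targets, per the release notes quoted above.
kernel="5.4.51-v7l+"
series="${kernel%%-*}"     # strip the local '-v7l+' suffix -> 5.4.51
series="${series%.*}"      # keep major.minor only          -> 5.4
target_390_138="5.6"       # 390.138: fixes for the 5.6 release candidate
target_390_132="5.4"       # 390.132: fixes builds on 5.4.0 RCs
echo "kernel series $series; 390.138 targets $target_390_138, 390.132 targets $target_390_132"
```

Only the release whose target series equals the running kernel's series is likely to build cleanly, which is the comment's reasoning for trying 390.132.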

  • @TmanaokLine
    @TmanaokLine 4 years ago +2

    This is the way. I appreciate all the effort you made to attempt this setup, keep experimenting with this stuff!
    I look forward to your next videos!

  • @dorinxtg
    @dorinxtg 4 years ago +51

    I just wonder: instead of compiling on the Raspberry Pi, why didn't you cross-compile it on an x86 Linux machine?

    • @justuseodysee7348
      @justuseodysee7348 4 years ago +10

      Most likely because setting up a cross-compilation environment requires additional time and effort

    • @JeffGeerling
      @JeffGeerling  4 years ago +30

      Honestly, after the 5th or so recompile I started thinking about setting it up. It would be a bit faster on my i9 laptop, or even the older i7 I have over in the corner of my office to play Halo 3.

    • @MrTolcher
      @MrTolcher 4 years ago

      @@JeffGeerling doooo it 😄

    • @dorinxtg
      @dorinxtg 4 years ago +9

      @starshipeleven Actually, the instructions for compiling the kernel sources in the Raspberry Pi docs include a section on cross-compiling. You'll need to add maybe 4-5 commands

    • @anlumo1
      @anlumo1 4 years ago +7

      @@JeffGeerling I once had to recompile the kernel a few times for another ARM device… in the time it took the kernel to compile once on the hardware itself (Allwinner A20), I was able to research how to do cross-compilation, set up a VM with Ubuntu, install the cross-compilation environment, and compile the kernel once. After that it was a breeze.
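
For reference, the cross-compile setup discussed in this thread looks roughly like this on an x86 host (a sketch: the package name assumes Debian/Ubuntu, and the slow build steps are shown commented out):

```shell
# Sketch: cross-compiling a 32-bit Raspberry Pi kernel on an x86 host.
export ARCH=arm                            # target architecture
export CROSS_COMPILE=arm-linux-gnueabihf-  # toolchain prefix
echo "cross build configured for ARCH=$ARCH"
# sudo apt install gcc-arm-linux-gnueabihf # Debian/Ubuntu toolchain package
# make bcm2711_defconfig                   # Pi 4 / CM4 default config
# make -j"$(nproc)" zImage modules dtbs    # build with the host's cores
```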

  • @anthalas9
    @anthalas9 3 years ago +2

    Loved watching your mind work. I have a very similar workflow, and it was refreshing to see someone else who does things not because you should, but because... you know, why not...

  • @rexcatston8412
    @rexcatston8412 4 years ago +46

    I'm just waiting for the day when I can Thunderbolt a GPU to an Android phone.

    • @ahmedihab9315
      @ahmedihab9315 4 years ago +6

      This is hilariously fantasy-esque, I love it 😂😂❤

    • @GeneralNickles
      @GeneralNickles 4 years ago +7

      Don't external GPU enclosures run on Thunderbolt?
      That might actually work if you can find drivers.

    • @easyaspi31415
      @easyaspi31415 4 years ago +7

      Ah yes, let me hook up my 1080 Ti to my phone.
      *Looks at 1.9GHz CPU* WHO'S THE BOTTLENECK NOW?!

    • @Wacypro
      @Wacypro 4 years ago +7

      Candy Crush on MAX

    • @habag1112
      @habag1112 3 years ago +1

      @@GeneralNickles Isn't Thunderbolt exclusive to x86 Intel CPUs? I mean sure, we have "USB 4" now (currently found on the new Apple computers), but it doesn't seem to support GPUs.

  • @puop
    @puop 3 years ago +2

    Oh god... you just won my heart over with that Jurassic Park reference.

  • @pingtime
    @pingtime 4 years ago +4

    Finally, the video I've been waiting for has arrived

  • @001vgupta
    @001vgupta 4 years ago

    You have gently and nicely explained your points against some of the irrational comments you had on your previous video.

  • @Junji101
    @Junji101 4 years ago +30

    Does the x1 to x16 PCIe adapter need to be actively powered? I know some graphics cards might need more than the Pi can output.

    • @JeffGeerling
      @JeffGeerling  4 years ago +20

      It doesn't, unless the device needs extra power. I actually have a powered adapter as well, and I may do some extra testing with it once I get a separate power supply for it.
      I'm also going to be testing a PCIe splitter/switch so I can see how multiple cards run together!

    • @uptimefpv
      @uptimefpv 4 years ago +2

      @@JeffGeerling I don't think that the Pi could handle PCIe bifurcation (especially not on an x1 slot), but it's worth a try.

    • @christianjb2002
      @christianjb2002 4 years ago +19

      Seeing how the Raspberry Pi 4 uses a 10-15W power supply, and given the relatively huge heatsinks on the cards, I'd say they need more power. An x16 PCIe slot is expected to provide up to 75 watts for a high-powered card. Even if the Raspberry Pi can support that, an x1 slot can only provide 6W.
      TL;DR yes
      Also, for better software support on an AMD card, it might be better to use a GCN GPU such as the Radeon HD 7000 or RX 200 series, because those drivers are much more developed on Linux.

    • @kane587mad
      @kane587mad 4 years ago +2

      The power delivery over PCIe was my first thought too. As the card doesn't have additional power plugs, it needs to get all its power through the PCIe slot. The TDP of the cards is at least some hint at how much power they need to draw over the slot. Or plug them into an x86 PC and look in HWiNFO at how much power they use in idle mode.

    • @nikkopt
      @nikkopt 4 years ago +7

      @@christianjb2002 According to Wikipedia, "A full-sized x1 card may draw up to the 25 W limits after initialization and software configuration as a 'high power device'." That makes sense, since he only started having those problems when the drivers were properly installed. Even if he's not pushing the GPU, I bet it's asking for more power than what the board is giving it.
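
The budget arithmetic in this thread can be made concrete (a sketch using the wattages quoted above; the 19W card TDP is an illustrative figure for a GT 710-class card, not a measurement):

```shell
# Sketch: compare a card's TDP against the PCIe slot power budgets
# quoted in the thread above.
X1_DEFAULT_W=6      # x1 slot before high-power configuration
X1_HIGH_POWER_W=25  # full-size x1 card as a configured "high power device"
X16_W=75            # standard x16 slot budget
CARD_TDP_W=19       # illustrative GT 710-class TDP
if [ "$CARD_TDP_W" -le "$X1_DEFAULT_W" ]; then
  echo "card fits the default x1 budget"
elif [ "$CARD_TDP_W" -le "$X1_HIGH_POWER_W" ]; then
  echo "card needs x1 'high power' mode to be negotiated"
else
  echo "card needs an x16 slot or external power"
fi
```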

  • @Slaegar
    @Slaegar 4 years ago

    Not entirely sure why this was recommended to me, but I'm glad it was. Have a sub for your very impressive, even if unsuccessful, work.

    • @JeffGeerling
      @JeffGeerling  4 years ago +1

      I'm glad you stopped by, hope you enjoy the videos :)

  •  4 years ago +3

    "That's definitely not my head". :D
    I'm almost sure that now that you've paved the way, in a month somebody will figure out how to fix those drivers to work :)

  • @PramodhRachuri
    @PramodhRachuri 4 years ago +1

    Your videos are just awesome!
    I lost interest in RPi and stuff some time back. Watching your videos makes me excited about these topics again.

  • @Rayan-Singh
    @Rayan-Singh 4 years ago +14

    That GPU would've used ATI drivers,
    and there would've been power issues, as you didn't use a powered PCIe expansion card

    • @Rayan-Singh
      @Rayan-Singh 4 years ago +1

      And yeah, try Manjaro or Pop!_OS

    • @ArtemisKitty
      @ArtemisKitty 4 years ago +5

      @@Rayan-Singh Those still wouldn't get around the IO BAR issue, though. That's something that, as mentioned, is involved with VGA ports. IIRC, they had data pins to detect the status and capabilities of the monitor, which is probably what it is trying to access for VESA drivers. In order to support a video output device using said device's drivers, you need it to be VESA compliant. In order to comply with the VESA standards, you must fully support VGA "legacy" devices, which are kept current since servers still tend to use them a lot for diagnostic terminal screens. I know mine has one for iLO. (HP DL380)
      The problem goes back to the point someone else made here - it's a firmware issue, since the developers safely assumed there was no need to include VGA support natively.
      Normally, on a PC, this is something which would be baked into the BIOS, but...the raspberry pi has no BIOS. When it's powered on, it assumes the same hardware is there that it expects and starts running code a certain way right away. No overhead interface to first determine what hardware it has etc (BIOS), as it's assumed it will always be exactly the same hardware, since that's the case for the rPI boards.
      It's more than just a case of "not the right drivers", as there is a more core level component missing that ALL video card drivers require.
      Still... awesome POC! You could still probably find a way to get it to do some basic CUDA coding if you used a dev driver instead and built it WITHOUT video output of any kind, just using the cuda cores for AI processing. That might be kind of useful for the RPi, I think.

    • @JeffGeerling
      @JeffGeerling  4 years ago +2

      @@ArtemisKitty Not only the IO BAR, it seems the Radeon (ATI) driver, which I _did_ use (see 09:10 onward), is also trying to initialize bios on the card, which would go over the IO BAR.

    • @ArtemisKitty
      @ArtemisKitty 4 years ago +1

      @@JeffGeerling Oh yeah, good point. So yeah, a LOT more than insufficient power going on there, lol.
      I’m personally curious about getting the cuda cores up as AI processing devices using tensorflow or something similar, so there might still be a possibility there for them. Even a cheaper GTX series card can do some pretty heavy processing on that front. Still... long uphill road there. Thank you so much for all of this; I’ve learned so much more in just the past 2 days!

  • @AI-xi4jk
    @AI-xi4jk 4 years ago +2

    Your suffering deserves my subscription! This is just like how I feel at work sometimes :), especially working with hardware.

  • @Zerzayar
    @Zerzayar 4 years ago +4

    You're a real hero. If there's a mountain, it has to be climbed. 💪

  • @DjVortex-w
    @DjVortex-w 4 years ago +1

    The genius of the PCI Express standard is that it's completely compatible, backwards, forwards and sideways. Among other things, any PCI Express card, of any length, will work on any PCI Express slot, of any length. It doesn't matter what those lengths are.
    A PCI-Express x16 card will work on a PCI-Express x1 slot just fine (with one sixteenth of the transfer rate, obviously). The only reason why in most cases you can't just plug it in is because they put a physical barrier on the end of the slot. If you were to dremel out this barrier, it would work.
    (I don't know why they always put that barrier in x1 slots. Perhaps to just stop people doing exactly this, and avoid card sagging damaging the slot.)

  • @ArshaansEdits
    @ArshaansEdits 4 years ago +6

    bro, please try WoR (Windows on Raspberry), I think even without the driver you might get display out from the HDMI using the Nvidia graphics card

    • @ArshaansEdits
      @ArshaansEdits 4 года назад

      pls reply

    • @dylan_00
      @dylan_00 4 года назад +1

      Actually this isn't a terrible idea, though the Win10 AMD/NVidia drivers may not run on non-x86

    • @JeffGeerling
      @JeffGeerling  4 года назад +2

      It's something I may try, I've never tried Windows on the Pi yet though... might have some growing pains.

  • @keithmiller9665
    @keithmiller9665 4 года назад +2

    Jeff, hi from the UK. Superb video, really enjoyed it. I was playing with hardware video encoding (ffmpeg h264_omx) early in the lockdown with the RPi 4. Not fast, but very low power consumption, and it can only use bitrate-based h264 encoding, so larger filesizes than with CRF. I then went on to play with ffmpeg h264_nvenc hardware encoding on the NVIDIA GT 710 and 1050 Ti using Ubuntu 20.04 on various PCs. Very interested in your video, and I hope you can progress this in future. Good luck!
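
The encoder split described here (bitrate-only h264_omx on the Pi's VideoCore vs. h264_nvenc on a discrete NVIDIA card) can be made explicit with a tiny command builder. A sketch with illustrative filenames and bitrate; h264_omx and h264_nvenc are real ffmpeg encoder names, everything else is an assumption for the example:

```python
def ffmpeg_cmd(src: str, dst: str, encoder: str, bitrate: str = "4M") -> list[str]:
    """Build an ffmpeg argument list for a hardware H.264 encoder.

    h264_omx (Pi VideoCore) only supports bitrate-targeted encoding,
    so we always pass -b:v rather than a CRF quality target.
    """
    return ["ffmpeg", "-i", src, "-c:v", encoder, "-b:v", bitrate, dst]

# On the Pi itself:          ffmpeg -i in.mp4 -c:v h264_omx -b:v 4M out.mp4
# On an NVIDIA card (NVENC): ffmpeg -i in.mp4 -c:v h264_nvenc -b:v 4M out.mp4
print(" ".join(ffmpeg_cmd("in.mp4", "out.mp4", "h264_omx")))
print(" ".join(ffmpeg_cmd("in.mp4", "out.mp4", "h264_nvenc")))
```

If a discrete GPU ever does come up on the CM4, swapping the encoder string is the only change needed on the ffmpeg side.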

  • @SenpaiSerene
    @SenpaiSerene 4 года назад +14

    Hello to my Pi-Labs and RPi Discord fam!

    • @pilabs3206
      @pilabs3206 4 года назад

      ?? :) Hi Nat

    • @akamadman203
      @akamadman203 4 года назад

      @@pilabs3206 long time no see we should talk on discord some time

  • @WouterVandenneucker
    @WouterVandenneucker 4 года назад +1

    I struggled with ROM BARs myself last week, albeit in a case of GPU passthrough. Lucky for me I wasn't walking uncharted territory like you are. But it baffled my mind for a second as I was trying to get it to work.
    Great work in any case! Your effort and reporting are making a difference for a huge part of the community for years to come!
    (think AWS ARM systems, the new macs that are coming,...)
    Sincerely: thank you!
    "Our greatest glory is not in never failing, but in rising every time we fail." - Confucius

  • @jackpatteeuw9244
    @jackpatteeuw9244 4 года назад +4

    At one time I might have undertaken such a quest. Now I let the "youngsters" take on these challenges !
    Good footwork. I am betting that RPi5 (and whatever chip Broadcom makes for it) will have a lot more PCIe lanes. Someday PCIe will replace I2C and SPI for even mundane peripherals.

    • @TheOleHermit
      @TheOleHermit 4 года назад +1

      I'm putting my bets on USB-C ;-)

    • @jackpatteeuw9244
      @jackpatteeuw9244 4 года назад

      @@TheOleHermit What part of USB-C are you interested in? The VL805 claims to support USB 3.0 at 5 Gb/s. USB 3.1 (10 Gb/s) or USB 3.2 (20 Gb/s) would be nice. You just need a USB-B to USB-C adapter. I don't see RPi ever supporting any of the USB-PD for output, although I do think future versions will require a 15W (5V@3A) or possibly 27W (9V@3A) USB-C power supply. This depends on the cost of supporting USB-PD on input and output.

    • @TheOleHermit
      @TheOleHermit 4 года назад

      @@jackpatteeuw9244 Yes, USB-C is only a connector style, but it is capable of much more than PD, including USB 3.x data rates, DisplayPort, networking, you name it. There are many USB-C adapter cables already on the market that provide these functionalities. A fully functional USB-C port could be broken out into a PD-powered multi-port, multifunctional input/output hub, with the various conventional connectors, just as the PCIe bus can be expanded with different adapter cards.
      I'm a maker who wants to enclose my IoT devices and their CM4 controllers. Legacy PCIe cards only get in my way.

    • @nowrd2xpln
      @nowrd2xpln 4 года назад +2

      Let me clarify the statement, "Someday PCIe will replace i2c and SPI...". This isn't true and will never be true. The PCIe protocol was created for specific purposes, and one of those purposes was not to replace low-level serial communication protocols (i.e. RS232, I2C, SPI, CAN, 1-Wire, etc.). In fact, there are consortiums/groups interested in replacing the PCIe protocol because the design is beginning to reach its limits. New systems such as CXL and CCIX seek to replace PCIe (to a certain extent).

    • @jackpatteeuw9244
      @jackpatteeuw9244 4 года назад +1

      @@nowrd2xpln Well, maybe I was TOO enthusiastic! Here is where I am coming from. Large embedded applications (think automotive EFI) need huge amounts of IO. Traditionally, these were on the SoC. Not only is this not "flexible", but they were running out of pins. The lower voltages on today's chips make it difficult to use a 5V Vref for analog signals.
      i2c and spi are just not fast enough to sample all of the sensors required for a modern car. What is really needed is a high-speed "bus expander" that would directly map all of these analog channels to memory space and their controls to memory space on the chip. The same is true for outputs, but many of those have to be "timed".
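
The "not fast enough" claim above can be sanity-checked with rough arithmetic. A sketch using nominal bus rates (I2C Fast-mode at 400 kbit/s is standard; the 10 Mbit/s SPI link and 16-bit samples are assumptions for illustration; ~4 Gbit/s is the useful payload of a PCIe 2.0 x1 link), ignoring protocol overhead, so these are optimistic ceilings:

```python
def max_channels(bus_bit_s: float, sample_rate_hz: float,
                 bits_per_sample: int = 16) -> int:
    """Ceiling on how many sensor channels a bus could service at a
    given per-channel sample rate, ignoring all protocol overhead."""
    return int(bus_bit_s // (sample_rate_hz * bits_per_sample))

# 1 kHz sampling per channel, 16-bit samples:
for name, rate in [("I2C Fast-mode", 400e3),
                   ("SPI @ 10 MHz", 10e6),
                   ("PCIe 2.0 x1", 4e9)]:
    print(f"{name}: ~{max_channels(rate, 1000)} channels")
```

Even with overhead, the gap is orders of magnitude, which is the argument for a fast memory-mapped "bus expander" over i2c/spi for dense sensor arrays.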

  • @sandymccabe7670
    @sandymccabe7670 3 года назад

    Had to sub simply for your logic, "it's not if you should but if you could". Now that's a mindset for innovation.

  • @letmein606
    @letmein606 4 года назад +5

    You can run external GPUs on a Pi. You just need to do a hardware modification. Although since this is the Compute Module, you just need to do some tinkering on the software side and you should be able to get it to work.
    Sources:
    mloduchowski.com/en/blog/raspberry-pi-4-b-pci-express/
    &
    www.tomshardware.com/amp/news/raspberry-pi-4-pci-express-bridge-is-a-step-closer

    • @heasterian2508
      @heasterian2508 4 года назад

      There's a post on Twitter about the BAR issue that was part of this project.

  • @Kimbp85
    @Kimbp85 4 года назад +1

    Love the storytelling, and details. Good job :-)

  • @nadiasmith3262
    @nadiasmith3262 4 года назад

    Nice. That's a HELL of a lot of watching, waiting, and hoping. Thanks for the serious amount of work and the one-more-try attitude.

  • @domleonhardt4841
    @domleonhardt4841 3 года назад +1

    First time you showed up on my suggestion feed. Glad I tuned in. Thanks for sharing, and thanks for speaking plain English for those of us less experienced in IT and development. I appreciate the dry humor; I have a dry sense of humor myself.

  • @jirehpadua304
    @jirehpadua304 4 года назад

    I salute you, man! The video is somewhat disheartening, but you found a way to get the benefit of the experience.

  • @曹輔
    @曹輔 3 года назад

    Although it didn't work out, I still appreciate your efforts, you are great!

  • @NatoasCode
    @NatoasCode 4 года назад

    The closest I've personally gotten to running ARM with a GPU is clustering a Jetson Nano with a Raspberry Pi as a GPU node using something like Kubernetes or Docker Swarm. Still, a Jetson in an ARM cluster is not the same as a proper ARM GPU!
    I was curious if the hardware resources necessary to add a standard GPU directly to a single Pi were present, but this answers everything I wanted to know in one video. Thank you for the work you do, it's very helpful!

  • @worlukk
    @worlukk 3 года назад

    I've seen a few of your videos, but after watching this one I had to subscribe.

  • @fauzirahman3285
    @fauzirahman3285 4 года назад

    I wonder if the same people who say to "get a better computer instead" to connect a GPU also told people who try to run Doom on odd devices to "get a newer computer" or "get a newer game".
    Anyway, this is great work. Thanks for sharing your experience with us and all your homework. Subscribed.

  • @AdmV0rl0n
    @AdmV0rl0n 4 года назад

    This video and its subject matter took you a lot of time. I appreciate it. Oddly enough, I had a thought yesterday about whether the Pi plus PCIe plus a GPU was something. When I dig into GPUs, they often seem to have low-level things that get in the way. An example today is that a lot of new GPUs will not play with non-UEFI BIOS systems. So you made a good call to go back to older-model GPUs.

  • @boal3806
    @boal3806 4 года назад

    Learned something, and had a good laugh. Great story - thoroughly entertaining. Was really rooting for you to make it work. Thank you!

  • @andrewmcleod9312
    @andrewmcleod9312 3 года назад

    You sir are a champion !! Thank you for this video and thank you for all the effort. Awesome channel

  • @LuisCruz-yg6rs
    @LuisCruz-yg6rs 3 года назад

    After watching this video I had no option but to subscribe. It is just as entertaining as a well-done Netflix film.

  • @deechvogt1589
    @deechvogt1589 4 года назад

    Thanks for another great video, Jeff. I am totally down for this journey and will continue to follow along as you chronicle your adventure for us. Thanks again.

  • @tiberiumihairezus417
    @tiberiumihairezus417 4 года назад +1

    Much appreciation for your struggle; I feel like it won't be in vain.

  • @Kalthorgr
    @Kalthorgr 4 года назад

    It doesn't matter that you didn't figure out how to make this work; we learned a lot from your research. Good job, fella. Let's support this guy; your persistence is inspiring.

  • @julienbietlot3401
    @julienbietlot3401 4 года назад +1

    Indeed this is the way!!! Love your vid Jeff, continue the good work !!!!

  • @Toastmaster_5000
    @Toastmaster_5000 4 года назад

    A few things:
    1. The closed-source ARM Nvidia drivers are meant for the Tegra series, so they're probably not built to look for a wide variety of PCIe GPUs.
    2. There are PCIe riser cards with 12v molex connectors spliced in. These might help alleviate some of the power-related issues you might be facing.
    3. PCIe GPUs are known to work on ARM but perhaps there's a hardware limitation for the RPi.
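
On the power point: the PCIe CEM spec limits what a slot alone can deliver, which is why those molex-spliced risers exist. A rough sketch using commonly cited slot budgets (roughly 25 W for an x1 slot, 75 W for an x16 graphics slot) and the nominal TDPs of the two cards mentioned earlier in the thread (GT 710: 19 W, GTX 1050 Ti: 75 W):

```python
# Approximate post-configuration slot power budgets (PCIe CEM figures)
SLOT_BUDGET_W = {"x1": 25, "x16": 75}

def external_power_needed(card_tdp_w: float, slot: str) -> float:
    """Watts that must come from somewhere other than the slot
    (riser molex, 6-pin aux, etc.); 0 if the slot alone suffices."""
    return max(0.0, card_tdp_w - SLOT_BUDGET_W[slot])

# A 19 W GT 710 vs. a 75 W GTX 1050 Ti, both in an x1 slot:
print(external_power_needed(19, "x1"))   # 0.0 W - slot power is enough
print(external_power_needed(75, "x1"))   # 50.0 W must come from elsewhere
```

This is part of why low-power, passively cooled cards like the GT 710 are the sensible first targets for an experiment like this, independent of the BAR and driver problems.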

  • @r2d2agr
    @r2d2agr 4 года назад +2

    Bro, keep trying, you are a genius! It's amazing, all your work!

  • @HelgeTier1000
    @HelgeTier1000 4 года назад +1

    Nice vid and project. Keep up the good work and make it great :)

  • @roqeyt3566
    @roqeyt3566 3 года назад +1

    I read your blog post about this just yesterday.
    I can't wait till someone figures out how we can add a decent GPU to the Raspberry Pi.

  • @anjuverma4275
    @anjuverma4275 3 года назад

    You've done so much hard work; I really appreciate that.

  • @cybersecurity10
    @cybersecurity10 3 года назад

    I really appreciate your hard work. The video was helpful.

  • @lennytheleopard
    @lennytheleopard 4 года назад

    I'm in total awe of your powers of persistence. I'm sure one day you will discover / invent faster than light travel.

  • @igordasunddas3377
    @igordasunddas3377 4 года назад

    Great video! Too bad your efforts didn't end in a happy ending with nice video output, but the video really was worth my time. Thank you! I love learning stuff, and while I didn't learn the details, I've learned that even those extreme measures didn't lead to the Pi properly displaying output. Cheers, and I'm excited to watch follow-ups if there are or will be any.

  • @haaapypsppl3022
    @haaapypsppl3022 4 года назад

    Bro wish you good luck for you and your channel

  • @chandrashekharsingh8640
    @chandrashekharsingh8640 3 года назад

    Dude I am liking your video for your efforts 👍

  • @tejonBiker
    @tejonBiker 4 года назад

    I didn't expect you to go so far. Support for desktop GPUs outside of the mainstream markets is so broken; thanks for the effort and the info. Keep up the hard work, please.

  • @zxuiji
    @zxuiji 4 года назад

    Never knew how to deal with driver-diagnosis stuff; now I do. Subscribed, since I figure I'll get more of that sort of thing from this channel.

  • @dmoneyballa
    @dmoneyballa 4 года назад

    You did good work. I appreciate learning from your tries so I don't have to try for many hours. Thank you.

  • @bunakkaptan
    @bunakkaptan 3 года назад

    I admire your patience.. Kudos

  • @yuryeuceda8590
    @yuryeuceda8590 4 года назад

    Great research, friend. So sad you couldn't make it work; I was very excited waiting for you to succeed in this adventure.

  • @sertoriusdane735
    @sertoriusdane735 4 года назад

    Dude, I loved your BAR joke. Instant sub for me. Also, you make quality content; keep it up, bro!

  • @diggleboy
    @diggleboy 4 года назад

    Nice deep dive Jeff! Taught me a lot about when to stop bashing my head against a brick wall. Hopefully the community can assist this effort and help you write a linux driver and compile a kernel for it to work with the Raspberry Pi module with the PCIe slot. I'm very interested in this hardware configuration for GPU machine learning in an embedded system for a really low price.

  • @cornebistouille
    @cornebistouille 4 года назад

    As a developer, I really enjoy the way you go through hiccups and bugs, and work out how to manage them by exchanging with others... till you find the solution...

  • @kentvandervelden
    @kentvandervelden 4 года назад

    Fantastic attempt! And thanks for sharing despite not being entirely successful. A lot is learned from attempts. Wish more people would report what didn't go well.

  • @child_of_god_
    @child_of_god_ 3 года назад

    Very good info for us on what you've done here.
    Conclusion: the hammer is in the device driver *and* the OS.
    The driver is not crucial, but the OS needs to be able to do a lot if the OS developer wants to make this work.

  • @maniaenjoyer
    @maniaenjoyer 4 года назад

    The entire video had me at the edge of my seat. Well done!

  • @C138
    @C138 4 года назад

    Even though it did not work out in the end, you gained a subscriber.

  • @RGD2k
    @RGD2k 3 года назад

    Thank you so much! Gave me flashbacks to trying to get gtkmm to compile on a Cirrus Logic armv4tel in 2006! (So much of that 'I'll just try ONE more thing...'.)
    CL had paid a dev to add support for their SoC to mainline gcc... specifically for their non-ARM third-party floating point unit (the v4 ARM arch didn't necessarily come with one; it was an optional extra...), but that dev had gotten the condition register format for the floating point unit confused with the ARM one - gcc would compile code, but it was horribly, horribly broken. Worse, gtkmm (the C++ GTK bindings) was coded by people not suspecting that cross-compilation might exist, and it did things like try to compile and then immediately RUN small applets during configuration. So to enable cross-compilation (with a specially-patched gcc that wasn't mainlined yet) you had to custom-compile Linux kernels on both the dev PC (for the cross-compiler) and the target machine, to turn on miscellaneous binary extensions (on the PC) and a little thing called GAPING_SECURITY_HOLE in the ARM kernel. You then NFS-shared your dev PC root and booted the target up with that kernel, with root mounted from the PC via NFS - which would immediately kernel panic because init was an x86 binary! - but would leave the kernel running and network accessible... whereby that PC kernel extension could be set to use the target machine as an 'ARM coprocessor', using that GAPING_SECURITY_HOLE extension to just load the ARM binary from the PC over the network to the ARM kernel, where it would run... and interact with the same filesystem... Supposedly, this would all be enough to let gtkmm do its 'compile, execute, test' configuration phase OK, even whilst cross-compiling everything to ARM code... although I never did get quite that far. I recall it so well because this was the point at which I gave up and found the boss an x86 SBC for the project. At the time (2006) I concluded that OSS GUI embedded apps on Linux ARM hardware would be possible, but that it would take a team of perhaps 20 engineers at a big company, like, say, Google, to get it all squared away...
    Of course, later, exactly this did happen, only for some reason they went with Java, not C++. Equally bad choice, imho.
    The kicker is, the reason gtkmm was a necessity was that I was working with another guy, a comp.sci graduate, who'd chosen C++ for all his application-level dev work. He could demo it all working on his laptop, so the boss just said 'make it work'. Had he stuck to C and just used GTK GUI elements directly, it would have been OK, because we wouldn't have needed gtkmm.
    I still would have had to have been basically hand-building the cross-compilation toolchain with the custom patch there to fix the floating-point issue.
    It amazes me sometimes how far Linux on ARM has come, but the biggest learning for me was this: leave Linux distro development to the distro development guys. If you have an embedded system to build, just work from a nearly-normal OS (or use something like Angstrom to automate it), but try not to build anything that requires a custom kernel to work. Use an FPGA over USB or PCI to do the 'real time' things and don't get stuck trying to optimise a system that is also hosting an entire OS! Let it do its thing, and keep the system you have to be 100% responsible for as simple as possible!
    I'd really like to see how well an ECP5 FPGA PCIe card would work on the CM4! Nowadays, you should be able to pick a card that has SymbiFlow support, so the CM4 can host the whole OSS FPGA toolchain itself!

  • @syntheticperson
    @syntheticperson 3 года назад

    I applaud your perseverance!

  • @alfredopepitonio5128
    @alfredopepitonio5128 4 года назад

    This channel has helped me a lot, and thanks to it I chose a tech career.

  • @mcskinz
    @mcskinz 4 года назад

    I recently bought a Raspberry Pi 4B 8GB; your video is awesome.

  • @jamescs50
    @jamescs50 3 года назад

    Thank you for making an interesting but very complicated project comprehensible to dummies like me.

  • @vikaskumarojha8616
    @vikaskumarojha8616 4 года назад

    Doesn't matter if it worked or not, you are an inspiration.

  • @DiomedesDominguez
    @DiomedesDominguez 4 года назад +1

    The only channel where I watch the whole video. Great work, as always.