Will ANY GPUs work on the Raspberry Pi?

  • Published: 20 Aug 2024
  • Science

Comments • 1K

  • @JeffGeerling  3 years ago +467

    Coreforge just got a full screen of text (plus a few artifacts)! Hope springs eternal, and thanks again to AMD for making their drivers open source, so we can get this deep into debugging and tweaking them! github.com/geerlingguy/raspberry-pi-pcie-devices/issues/4#issuecomment-826009569
    There's also some new hope for the GT 710; going to see how things go on a new test run with the nouveau driver.

    • @tentative_flora2690 3 years ago +8

      Interesting, so there's a ray of hope. Maybe one day I will build a 2D game console/arcade cabinet with a Pi.

    • @yuryeuceda8590 3 years ago +5

      I have a passive Radeon GPU that would be great to test in your research. Let me search for it.

    • @Cornyspoon884 3 years ago +5

      Try Anthony from Linus Tech Tips.

    • @paulwratt 3 years ago +4

      Meh - another 24 hours and that screen of text could have been in the video (as well as the reason why: turning off combined writes - see the sketch after this thread). I read all 76 messages in that thread on the 23rd, and when I saw that last screenshot... "Hot damn" (with cheesy grin), Quake 3 here we come :)

    • @noahwhipkey6262 3 years ago +1

      Backpacks which create a virtual world with glasses. Normalize the orientation per user and you have a 3D map of the world with enough nodes - a real virtual world you can live in. Users add their own features and dapps for real-life services. Just need the glasses, and some driver support :) lol
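
A note on that fix, since "combined writes" carries the whole explanation: Linux drivers usually map a GPU's framebuffer BAR write-combined for throughput, and the workaround discussed in the linked thread amounted to avoiding that mapping type on the Pi's PCIe bus. A minimal sketch of the idea, in a hypothetical driver context (the BAR index and function names are illustrative, not the actual amdgpu patch):

    #include <linux/pci.h>
    #include <linux/io.h>

    /* The usual fast path: map VRAM write-combined, so the CPU may
     * buffer and merge stores before they hit the PCIe bus. */
    static void __iomem *map_vram_wc(struct pci_dev *pdev)
    {
            return ioremap_wc(pci_resource_start(pdev, 0),
                              pci_resource_len(pdev, 0));
    }

    /* Workaround sketch: a plain uncached device mapping instead,
     * trading speed for stricter ordering on a quirky PCIe bus. */
    static void __iomem *map_vram_uc(struct pci_dev *pdev)
    {
            return ioremap(pci_resource_start(pdev, 0),
                           pci_resource_len(pdev, 0));
    }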

  • @NonsensicalSpudz 3 years ago +1571

    Nice to see you shout out smaller channels like Linus Tech Tips

  • @art_nt_nk8353 3 years ago +452

    1:18 "this little unknown youtube channel linus tech tips" lmao u killed me

    • @tacetan 3 years ago

      LMAO!

    • @art_nt_nk8353 3 years ago +24

      @Suthan Cn it's a joke

    • @notnalin 3 years ago +15

      Linus Tech Tips must be lucky to get this shout-out from him

    • @DrippyHarryPotterObamaSonic 3 years ago +2

      @Suthan Cn clearly a joke.

    • @notnalin 3 years ago +1

      @@AdamBucky21 it's a joke bruh

  • @mario-sz 3 years ago +286

    There is a Polish proverb: if there is something that cannot be done, tell someone who does not know that - he will take it on and do it.

    • @craigsickles7049 3 years ago +11

      Yeah, I always seem to be that guy!
      I always took "you can't do that, it can't be done" as a challenge.
      Just because someone else can't do it doesn't mean I can't!
      I had double-sided 80-track floppies running on the old Apple IIs way before someone said you can't do that. I rented a 400K Mac drive so I could interface it to my Apple II Plus...

    • @joelc3449 3 years ago +4

      Just means: that's not going to be easy

    • @averagejoe9040 3 years ago +4

      Exactly. The thing that all great breakthroughs have in common is that the people making them happen don't know they shouldn't work.

    • @szpecunio 3 years ago +2

      how do you say it in Polish?

    • @mario-sz 3 years ago +2

      @@szpecunio Jeżeli jest coś czego nie da się zrobić, powiedz to komuś kto o tym nie wie - weźmie i to zrobi ;)

  • @the_beefy1986 3 years ago +685

    Great work Jeff. Just remember, failed tests aren't really failures. They're just revealing the dead ends that aren't worth pursuing further.

    • @JeffGeerling  3 years ago +91

      Definitely! That's why I decided to post this video now, instead of letting it linger without resolution for more weeks/months.
      I'm hoping, one of these days, I can post a video with 'success' like I get to for many other cards!

    • @ninline2000 3 years ago +14

      @@JeffGeerling Thank you for your hard work. You have made progress and I believe one day you will have success.

    • @volkankonur2454 3 years ago

      @@JeffGeerling ruclips.net/video/ci7u1fIRVXY/видео.html this guy has got one working.

    • @MohsinExperiments 3 years ago

      @@volkankonur2454 Read the first comments on his video.

  • @Scie 3 years ago +1188

    The joke about Linux is that you write your own drivers
    I didn’t think that someone would actually do it

    • @bloepje 3 years ago +69

      It's the reason why I use linux. If support doesn't exist, fix it.

    • @bjarnestronstrup9122 3 years ago +26

      The GPU drivers were written to work on a little-endian machine with an x86 processor; ARM cores are configurable to work in either little- or big-endian mode. The video drivers probably use some assembly code that is not compatible with the RPi's processors - I don't know if they have looked through the assembly blobs to see whether the drivers use architecture-specific functionality or not. Personally I think it is possible to adapt the drivers to use a custom PCIe controller that could resolve the incompatible functionality and communication on the PCIe bus. (See the endianness sketch after this thread.)

    • @jimbo1531 3 years ago +14

      @@bloepje it's the reason why I use Windows. If it doesn't work, someone has already been there and documented it for you 🤷🏻‍♂️

    • @potaziio7174 3 years ago +17

      @@jimbo1531 it's the reason why I have Windows and Linux on dual boot; whatever can't be done with Linux can be done with Windows, and the other way around lol

    • @BruceHoult 3 years ago

      @@bjarnestronstrup9122 all modern ARM machines are little-endian.
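
For readers wondering what endianness has to do with any of this: the kernel's portable accessors hide most of it, and a driver only breaks if it bypasses them (and, as noted above, the Pi is little-endian anyway). A small sketch of the pattern; REG_STATUS, ring_desc, and the register offset are hypothetical names for illustration:

    #include <linux/io.h>
    #include <linux/types.h>

    #define REG_STATUS 0x04

    struct ring_desc {
            __le32 addr_lo;   /* stored little-endian, as the device expects */
            __le32 addr_hi;
    };

    static u32 read_status(void __iomem *mmio)
    {
            /* readl() is defined as a little-endian access, so it
             * byte-swaps automatically on a big-endian kernel. */
            return readl(mmio + REG_STATUS);
    }

    static u64 desc_addr(const struct ring_desc *d)
    {
            /* Structures shared with the device need explicit conversion. */
            return ((u64)le32_to_cpu(d->addr_hi) << 32) | le32_to_cpu(d->addr_lo);
    }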

  • @bunnysparklzbunnytime5117 3 years ago +539

    elFarto. Now that is an amusing name to be saying haha

  • @reizhi 3 years ago +158

    Working through these issues with you together is a great experience.

  • @wywywywywywywy 3 years ago +63

    Bootlin should be able to help. They've done various open source Linux graphics & kernel development work before, including work on ARM-based SBCs. And I'm sure they'd love the publicity

    • @JeffGeerling  3 years ago +21

      Will try to reach out!

    • @daredevil8888 3 years ago +5

      I was thinking similarly: the open source SoC GPU drivers might give a hint on the differences in desktop CPU dependencies vs ARM. The Tegra should definitely help to get an Nvidia-based card working

    • @johndododoe1411 2 years ago

      @@daredevil8888 I recall the massive disappointment of nVidia pulling the Tegra 1 tools and documentation when I was still using and troubleshooting devices based on that chip. Didn't help that they started removing features from the PC driver for my office class GPU.

  • @SomeTechGuy666 3 years ago +6

    I've done work like this. I can't imagine how many hours you've spent testing all the combinations you've tried. Good work and thanks for sharing.

  • @gendragongfly 3 years ago +21

    Your perseverance is astonishing. I'm sure you or someone else will succeed eventually in getting a GPU to work with the Pi 4 and I'll be watching your channel to see how it was managed. Thank you for being a pillar of the Raspberry Pi community when it comes to PCI support :)

  • @BenMitro 3 years ago +232

    Just remember Jeff, there is an alternative universe (or many) where you were successful. However, with 10^500 such alternatives, it might be a bit difficult to find that Jeff and ask him what he did.

    • @JeffGeerling  3 years ago +71

      I just need to find a way to communicate with alternate-universe-Jeff (maybe set up 5G on a Pi?)

    • @robertd1965 3 years ago +2

      @@JeffGeerling Go watch Frequency. LOL.

    • @zacharywebb7197 3 years ago +2

      @@JeffGeerling speaking of adding cellular to the Pi, have you thought about that 👀

    • @zambonidriver42 3 years ago +2

      Tie-Dye Jeff!

    • @arrangemonk 3 years ago +1

      more importantly, what color is his shirt?

  • @DaPanda19 3 years ago +41

    This stuff is beyond me but I seriously appreciate you and your content, the level of detail you put into what you do really shows that you love this stuff! I may not understand everything but I always watch your videos from beginning to end :)

  • @jackt3603 3 years ago +29

    Sadly neither my knowledge nor my network would be useful to help, but man, this is so exciting! Keep going! If it is down to finances, I am sure there are a lot of us who wouldn't mind chipping in for the good cause.

    • @JeffGeerling  3 years ago +17

      I even considered finding some way to justify an RTX card purchase, but I just can't right now. Not with the way prices are.
      Maybe I can somehow get LTT to loan me one of their dozens of cards from the warehouse :D

    • @xWood4000 3 years ago +1

      @@JeffGeerling They had like four RTX 3000-series cards (maybe 3080s only; anyway, very few) in the inventory in a recent video, so I don't know if that's possible. The VAG program doesn't fit this situation either

  • @MarcoGPUtuber 3 years ago +101

    I can't wait to make my Raspberry Pi start gaming!

    • @Kuratius 3 years ago +5

      Well, YOU can get a GPU.

    • @iHawke 3 years ago +1

      what's stopping you from doing so already?

    • @MarcoGPUtuber 3 years ago +4

      @@iHawke I don't have the Compute Module one. I just have the normal one. Lmao

    • @fetus2280 3 years ago +2

      I already do game on mine; guess it depends what kind of games you want to play? I bought one of those Nintendo cases and dropped my Pi in that and play retro games. Would be awesome to play modern games, but I won't hold my breath on that being a thing anytime soon. Good luck to this young man trying, and I wish him all the success... but ya.

    • @iHawke 3 years ago

      @@MarcoGPUtuber look up Pi Labs on YouTube, he has done some wonderful things with the Pi already

  • @geoffhalsey2184 3 years ago +13

    I think the song "The Impossible Dream" was written with you in mind. Your dedication is exemplary.

    • @geoffhalsey2184 3 years ago

      Incidentally, I recommend the rather old Andy Williams version of this song as the best.

  • @lillywho 3 years ago +42

    7:52 yay for parenting! All the best wishes for the wee bundle of love!

  • @AndrewAHayes 3 years ago +49

    I am beginning to think it may be easier to develop a purpose-made video card for the Compute Module 4 than to try to marry it up with the existing x86-based cards LOL

    • @worldhello1234 3 years ago +1

      IMHO it makes more sense to use a Mali GPU with the Pi's SoC.

  • @flapjacksmike 3 years ago

    Just found this channel and subscribed.
    +1 for editing style
    +1 for showing driver code and hardware steps
    +1 for diet Dr Pepper
    +1 for bloopers at the end

  • @zambonidriver42 3 years ago +3

    I always loved the Matrox G400. Scroll a page with a mouse wheel, and the text was crisp and clear. Great for a machine that wasn’t for 3D games.

  • @boi_doingthings 3 years ago +3

    There are a lot of things in this video that went above my head. However, I must say, what wins my heart (and pretty sure others' too) is the time and effort you put into it, Jeff. I am really a fan of that "not losing hope" attitude of yours. PS: I know you'll figure this out.

  • @wouldntyaliktono 3 years ago +5

    I spit beer all over my keyboard when you said "elFarto" out loud. Thanks for that.

    • @excitedbox5705 3 years ago

      @@elFarto2 The man, the myth, the LEGEND. He is... elFarto.

  • @seriousshuck3484 3 years ago +4

    I am excited for the future of M.2 GPUs.

  • @grahams5871 3 years ago +2

    Don't give up! Thank you for documenting your progress

  • @FunkyBaby01 3 years ago +4

    Deep respect for your tenacity, I stand in awe. How deep can you go figuring this graphics card snafu out? Awesomely deep, as you demonstrate.

  • @TheTechieScientist 3 years ago +12

    Now... I'm not sure which is harder: getting a GPU, or... getting one to work with the Pi 😉

  • @yalopov 3 years ago +1

    Really interesting video. Your Raspberry Pi outreach efforts are so valuable for the entire community, from the people who only use it as a cheap Linux box for a hobby to the ones who actually use it as a work tool!

  • @cesar_otoniel 3 years ago +8

    I love when YouTubers make references to each other 😂 like that one at 5:08 from EEVblog.

  • @jashaswimalyaacharjee9585 3 years ago +72

    Imagine training GAN Models on Raspberry Pi, in the near future.

    • @felosrg1266 3 years ago +7

      I also want GPU support on the RPi for a cheap deep learning training platform, instead of building a complete PC setup.

    • @TV4ELP 3 years ago +2

      You mean something like the Google Coral thingy that already works on the Pi?

    • @felosrg1266 3 years ago

      @@TV4ELP No no, Google Coral can only do inference but cannot train, as far as I know

    • @chachocarajoz 3 years ago

      @@TV4ELP You can't use the Coral Edge TPU through PCIe on the CM4

    • @TV4ELP 3 years ago

      @@chachocarajoz there is a USB version that works perfectly fine

  • @t1mmy13 3 years ago +27

    By the way, a top-of-the-line card totally has a place on a Raspberry Pi: as a mining system

  • @brettmiller7469 3 years ago +1

    Jeff, I really appreciate the effort and articulation... not to mention the masterful editing. Been waiting for this one.

  • @vanudamaster 2 years ago

    Great work Jeff; even if it failed during these tests, you for sure tested all possible ways to get it to work! Next time it will work better!

  • @vmiguel1988 3 years ago +24

    Can't wait until you find a solution and all the crypto miners wipe out the stock of Raspberry Pis to mine 😅😂

  • @hellelujahh 3 years ago +5

    And till the end of time, Jeff Geerling you shall be!

  • @raulhatescheese 3 years ago +1

    Great work! I appreciate how you show the depths into which these issues can go, and your process!

  • @beauregardslim1914 3 years ago +1

    Graphics drivers stuck in staging forever isn't necessarily a bad sign about the driver. Linux switched over to the DRM system and any non-DRM driver that wasn't already "in" at that point is now stuck. Just like with the switch to libinput, you'll often find that the "legacy" driver is 10x better than the new one in terms of compatibility and features.
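
For context, "the DRM system" is the kernel's modern display subsystem, and getting a legacy framebuffer driver out of staging means rebuilding it around a struct drm_driver. A skeletal sketch of what that registration looks like; the names and probe context are illustrative only:

    #include <drm/drm_drv.h>

    static const struct drm_driver sketch_drm_driver = {
            .driver_features = DRIVER_GEM | DRIVER_MODESET | DRIVER_ATOMIC,
            .name            = "sketch",
            .desc            = "illustrative DRM driver skeleton",
            .major           = 1,
            .minor           = 0,
    };

    /* In the PCI/platform probe, a real driver would then do roughly:
     *     ddev = drm_dev_alloc(&sketch_drm_driver, parent);
     *     ...set up memory management, CRTCs, connectors...
     *     drm_dev_register(ddev, 0);
     */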

  • @masterneme 3 years ago +12

    Drake 😣🤚 "Quake 3 @ 4K with CM4 & a graphics card".
    Drake 😊👉 "Doing Doom 3 instead".

    • @proxy1035 3 years ago +1

      nah, let's run Quake II RTX

  • @edwardallenthree 3 years ago +5

    I need that tiny GPU. Hear me out: I already have Windows 10 on a VM with GPU passthrough. I have a Radeon card I use for Linux, an Nvidia for Windows. Wouldn't it be great to have that little M.2 card for Linux, and then have a separate Hackintosh VM?!?!?

  • @echobucket 3 years ago +1

    I want to thank you for continuing this journey. I feel like since they all lock up, maybe there's a single problem causing this. I hope once you find it suddenly everything just starts working for all these cards.

  • @nunyabeeswax3012 3 years ago +2

    It's fascinating watching you have to work through workarounds to things that we now take for granted. Keep up the good work.
    What I will say is that the Redragon PSU you used really isn't a good one. It lacks a bunch of pretty necessary protections, including OTP and OCP (both of which are recommended or required by the latest ATX specification). No OTP means that it has no way of protecting itself from overheating, while no OCP means it has no way of shutting itself off if you overdraw any of the rails. Probably doesn't matter for a GPU this weak, but just thought I'd give some insight.

  • @ArshaansEdits 3 years ago +5

    Which OS are you using for now? You might wanna use WOR (Windows on Raspberry)! Direct Windows drivers might work straight away! If you need any help with WOR you can ask me anytime.
    Thanks and have a nice day :)
    :D

  • @esqueue 3 years ago +4

    If YouTube recommended this video to me just to make me feel stupid, it succeeded. I probably picked up 2% of what was said in this video. That said, it was interesting enough for me to finish watching the whole thing, so that says something.

  • @briangoldberg4439 3 years ago +1

    Thanks for the update on GPUs Jeff! Even though it seems discouraging, please don't give up. :) With any luck, someone from NVidia or AMD will reach out. I wonder who will be the first company to get an external GPU working on the Raspberry Pi?? It would surely be a great honor!!!

  • @noiamhippyman 3 years ago

    Cheers to you for taking on such a crazy task. Good luck with your experiments!

  • @MarcoGPUtuber 3 years ago +7

    I have a MiniPCIe video card. You wanna try it?

    • @JeffGeerling  3 years ago

      Which one is it?

    • @MarcoGPUtuber 3 years ago +3

      @@JeffGeerling EMPV-1201. It's a 2D card, so not technically a GPU. PCIe x1. Not in use at the moment 'cause I am figuring out how to get Ubuntu on the board.

    • @JeffGeerling  3 years ago

      @@MarcoGPUtuber I mean, it can't hurt to at least connect it and see what happens. It looks like there's a driver (www.innodisk.com/en/support_and_service/download), and it lists "Linux Ubuntu 12.04-16.04(32/64bit)"... AND it looks like it might actually use the SM750 that the M2_VGA uses!
      Going to have to see how their driver works with the M2_VGA too now...

    • @MarcoGPUtuber 3 years ago

      @@JeffGeerling Right. Absolutely. And it's ARM-compatible. I'm just a U-Boot n00b trying to get Ubuntu on this ARM board I got. Lmao

  • @andreavergani7414 3 years ago +3

    I really enjoy your vids.

  • @EER0000 3 years ago +1

    Thanks for all the work you are doing on these investigations! It will help the overall efforts move forward even if it’s not directly visible to the current end users :)

  • @jerryknudsen7898 2 years ago

    Being the software and Linux noob I am, I'm really surprised a Maxwell GPU would have so many issues working on a Pi, given how successful the Tegra line has been in the past in both amazing Android TV boxes and old Android gaming tablets.

  • @JohnHollowell 3 years ago +6

    "... I can't even debug it since I don't have the source." Wow, it's almost like having opensource code can actually let the community solve some of your problems for you. I guess Nvidia just doesn't want help

    • @JeffGeerling  3 years ago +4

      I really wish they saw the value in OSS. And that more people would use AMD's stuff, since they're so much better in the OSS ecosystem.

    • @johnbuscher 3 years ago

      @@JeffGeerling Tell AMD to give us better ML acceleration and maybe they'll adopt it. Until then, CUDA reigns, and useful applications for end users like RTX Voice are easily accelerated on Nvidia cards. Don't get me wrong, I want more AMD/OSS vs closed source, but at some point one product has better performance for a specific goal.

  • @izzieb 3 years ago +10

    I hear the Broadcom VideoCore VI GPU works out of the box on the Pi.

    • @RN1441 3 years ago +3

      yeah but it's poorly documented and has no OpenCL support.

    • @user-pk8fr8ix6d 3 years ago +3

      but VideoCore is not a GPU. It's a general-purpose CPU, but with more DSP features

    • @JeffGeerling  3 years ago +13

      And it can't do Quake III at 4K 😆

    • @izzieb 3 years ago +2

      @@JeffGeerling Not with that attitude it can't!

    • @JeffGeerling  3 years ago +10

      @@izzieb Okay, okay... I should specify: at more than 10 seconds per frame :D

  • @zachmeyer620 3 years ago

    Hey, so I know this is an older video, but I find this very inspiring. Absolutely nuts that you can do this. Well done

  • @pwcloete8022 3 years ago +1

    Great effort here. For what it's worth, it's encouraging, maybe even motivating, to see how other people struggle with technical stuff as well. Somehow we all find stuff to hit our heads against 😂. Anyhow, rooting for you... "Why put a graphics card on a Pi?" Because it's freakin' cool! 😁👍

  • @wallrunner87 3 years ago +6

    Wendell from Level1Techs might be able to help; the man is a tech genius.

    • @fetus2280 3 years ago +1

      I was thinking the same thing, you beat me to it :)

  • @ojtechml 3 years ago +6

    I mean with Nvidia acquiring ARM I don't see it being too long before some sort of support is out there. Do ARM based servers have GPUs in them currently?

    • @mattym8 3 years ago

      Yes. But they’re very different ARM cpus. Eg. Annapurna

    • @JeffGeerling  3 years ago +3

      As Matt mentions, yes there are ARM servers with GPUs. And Nvidia even makes the Jetson already, but so far I've only seen a few ARM platforms where video cards actually work, and they're very expensive (at least for my tiny budget).

  • @AdmV0rl0n 3 years ago +2

    I have the gut feeling here that even if you do get it working, the gap between a weak CPU and a difficult 'driver' area will lead to a very 'meh' outcome. For its perf area, I think its current GPU is 'ok', and you may as well just go with that.
    Dumb question of the day: you are looking at PC-industry GPUs. Have you considered taking a look at any possible ARM GPU options that might be left field, and might work better with an ARM platform? I don't think I have seen any, but maybe something is out there.

  • @MrUtak 3 years ago

    Jeff! You’re awesome. I’m so sorry this hasn’t worked. Thanks for trying! The community appreciates it. Your work has already pushed the boundaries of the unknown! You’re a scientist and don’t even know it :)

    • @MrUtak 3 years ago

      Crazy that you're literally trying to do something that even they haven't succeeded at. It is possible... Is it worth it? Maybe the Pi 5 will be best for this type of crap, and your work will have been very important in the development (in creating an easier-to-integrate system)

  • @TheTechieScientist 3 years ago +6

    Yes...yes....yess.......finally after waiting for eternity

    • @JeffGeerling  3 years ago +3

      Haha, I actually had the script mostly-complete a month or so ago... every time I put some words in, I felt compelled to test something a couple more ways to make sure I wasn't saying something that could be easily refuted!

    • @TheTechieScientist 3 years ago +1

      @@JeffGeerling Yeah... but maybe this Pi ain't destined to work with graphics cards... maybe we should just wait for a while

  • @gregoryturner1505 3 years ago +55

    Little-known "Linus Tech Tips". Hahahah, nice dig.

    • @akshat2485 3 years ago +3

      I was shocked that he said "little guys".

  • @geroconcina7718 3 years ago

    I love how research can make awesome things. Maybe it's not the results you want, but it's a good start

  • @SkepticalCaveman 3 years ago

    Nice of you to shout out smaller YouTubers. I'm sure Linus appreciates it.

  • @CptPatch 3 years ago +5

    Oh. This time you're Jeff Geerling "'til the end of time" instead of just "until next time", I'm glad you got your identity crisis worked out!

  • @jashaswimalyaacharjee9585 3 years ago +4

    *This little unknown YouTube channel Linus Tech Tips*.... Laughing my literal A$$ Off... *hahahahahah*

  • @bbconrad92 3 years ago +1

    I learn a hell of a lot more from these failed-attempt videos than anything else. Thanks for the update!

  • @johnlocke9609 3 years ago

    Your daughter before walking: Daddy, don't be silly, the kernel panic and memory error come from 3 lines before your delay; get better at debugging, dad!

  • @wbwarren57 3 years ago +13

    Is it true that your new daughter will be postponing her walking and talking training in order to focus on programming the Raspberry Pi?

  • @tuttocrafting 3 years ago +3

    7:08 That's me while I was debugging a broken kernel module for my Chromebook's ambient light sensor. It's the easiest and most straightforward way to troubleshoot sh!t like that!
    EDIT: In the end I decided to ditch fixing the kernel module and fix the BIOS instead.

    • @JeffGeerling  3 years ago +4

      Print-based debugging is the simplest! (A sketch follows this thread.)

    • @tuttocrafting 3 years ago

      @@JeffGeerling I just discovered that when you edit a comment, the heart goes away! Good to know! :)
      (It makes sense BTW)

    • @JeffGeerling  3 years ago +1

      @@tuttocrafting Ha, true. I'll re-heart it though :D

    • @tuttocrafting 3 years ago

      @@JeffGeerling Thanks man, it was not necessary at all. It was just a funny thing to know. It doesn't often happen that you have to edit a "hearted" comment.
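
Since print-based debugging comes up a few times here, the kernel flavor is just printk and friends. A sketch of how a hang gets bisected; the function and messages are made up for illustration:

    #include <linux/printk.h>

    static int sketch_init_card(void)
    {
            pr_info("sketch: about to POST the card\n");
            /* ...the suspect hardware poke goes here... */
            pr_info("sketch: card POSTed OK\n"); /* never printed? hang is above */
            return 0;
    }

Watch the output with dmesg -w or on a serial console, since a GPU lockup can take the local display down with it.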

  • @75Krusty 3 years ago +1

    Thanks a lot for the video. Impressive how much work you put into getting this working and debugging it. Interesting. Hopefully it will work one day, with more minds focused on this, so we have the big gfx cards on the Raspberry Pi.

  • @1.618_Murphy 3 years ago

    Jeff is just like a scientist who's showing us what we have to do to unify all the technologies

  • @techlover1 3 years ago +7

    Have you considered reaching out to Wendell from Level1Techs? This seems like it might be up his alley

  • @obsoletepowercorrupts 3 years ago +1

    This reminds me of when YouTuber Stephen Jones raised funding for a graphics driver to be written for AROS (the Commodore Amiga-compatible OS on PC) for the Intel GMA, so as to get screen dragging working.

  • @Revoku 3 years ago +2

    I had a similar-looking screen on my PC once when I managed to get two completely different cards running at the same time; the drivers did not love each other. I think it was a GT 710 and an HD 5770. Can't remember what I was trying to do at the time, but the few times it didn't kernel panic, it looked like 11:20

  • @shadowtheimpure 3 years ago +1

    I use one of those little ASRock Rack GPUs in my NAS for head-end video for when I need to troubleshoot.

  • @ManjunathCV 3 years ago +2

    You inspire me to never give up on anything!

  • @unaphiliated5090 3 years ago

    To me the only worthwhile computing endeavour is to create a system with a very high processing-to-power ratio. But I appreciate the work you are doing.

  • @aleksandarlazarov9182 3 years ago

    Thank you for the great content, my maaan!!! I just got recommended this video and I am already in love!
    And amazing bloopers, keep up the great work!!!

  • @avejst 3 years ago +1

    Impressive results 👍
    Thanks for sharing your experience with all of us 👍😀

  • @rednassie1101 3 years ago

    Okay honestly, mad respect for pushing this far. I would have given up a looong time ago :P

  • @shiftynevers 3 years ago

    Keep up the great work!!! I'm looking forward to the vid where you guys get it working!!!

  • @igordasunddas3377 2 years ago +1

    Love it! Keep up the great work! I'm just a mere mortal Java-crap and TypeScript developer, and while I may be able to debug something and compile a Linux kernel, that's the most I can do there.
    But I am excitedly following this and would love to see any GPU on a Pi actually boot, possibly even more.

  • @syntheticperson 3 years ago

    Your perseverance is humbling

  • @JDMACC 3 years ago

    Good job man, don't stop, I'm counting on you

  • @trueriver1950 2 years ago

    Use case:
    I have a computer I use as a fan heater that does sums on the side. I use it with a number of BOINC projects. It has a pair of graphics cards used only for compute purposes (using the CPU's internal GPU to drive the monitor).
    To implement a thermostat feature I have an air temperature sensor on a Pi that sends start and stop scripts to the big box over SSH over Ethernet.
    Downside is that the big box still draws watts when it is idle waiting for the Pi. On a day when the heat is not needed it has to be turned off manually, and then I freeze when the weather changes until I figure out that I turned the whole thing off at the wall.
    It would be nice to have the Pi doing it all (though not, of course, the current one, which is a very early model)
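
A sketch of that thermostat loop, as a Pi could run it end to end. The sensor path, host name, and script names are assumptions, and a real version wants error handling plus SSH keys already in place:

    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    /* Stand-in sysfs path; a real air sensor (e.g. a 1-Wire probe)
     * exposes a similar millidegree reading elsewhere under /sys. */
    #define SENSOR "/sys/class/thermal/thermal_zone0/temp"

    static long read_millicelsius(void)
    {
            long t = -1;
            FILE *f = fopen(SENSOR, "r");

            if (f) {
                    if (fscanf(f, "%ld", &t) != 1)
                            t = -1;
                    fclose(f);
            }
            return t;
    }

    int main(void)
    {
            for (;;) {
                    long t = read_millicelsius();

                    if (t >= 0 && t < 18000)        /* below 18 C: heat */
                            system("ssh bigbox ./start-boinc.sh");
                    else if (t > 21000)             /* above 21 C: stop */
                            system("ssh bigbox ./stop-boinc.sh");
                    sleep(60);
            }
    }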

  • @lubricatedgoat 3 years ago +2

    Hey Jeff: can you give a shout-out to your future daughter with something about how young dad looks?
    She's going to be very amused at your efforts to make retro hardware do something it wasn't made to, and these vids will be so precious one day!

  • @lhonficamilo 3 years ago +1

    It seems to me that the Compute Module is on its way to becoming some sort of CPU (/GPU hybrid) alternative. Maybe in the future we will have commercial mobos that support Compute Module sockets

  • @salvadorliebana4739 3 years ago +1

    Well, I can't deny you have made a major achievement in "solving" the BAR limitations.

    • @JeffGeerling  3 years ago +1

      And this week got my first kernel patch merged into the Pi OS kernel project! (For SATA support, so not directly related to GPU issues.)

    • @salvadorliebana4739 3 years ago

      @@JeffGeerling I have all my hopes for it on the RK3588.

  • @jannikmeissner 3 years ago +2

    What does the implementation of the Nvidia GPU and ARM CPU on the Jetson look like? Maybe that's a pointer

    • @JeffGeerling  3 years ago +3

      The difficulty with Jetson / Nvidia in general is everything happens behind closed doors / is closed source. So even though I know there are engineers who could probably figure things out (or give me a definitive 'yes' or 'no' as to whether it is futile)... I wonder if they'd ever be allowed to help in public :(

  • @BenHanson137 3 years ago

    Damn buddy... Thanks for jumping down this rabbit hole so I don't have to. Been following, I don't think I can help but I'll ask around... As a fellow believer in the application, good luck.

  • @ewookiis 3 years ago +2

    Thank you Jeff, keeping the hacking-spirit alive. I just love your enthusiasm trying to find a way forward!

  • @tecnocriollo 2 years ago

    "Little unkown RUclips Channel Linus Tech Tips" I could'nt stop laughing hahahaha

  • @BradClarke 3 years ago

    Guest appearance by Red Shirt Jeff at 2:56 🤣

  • @nandulalkrishna923 3 years ago +2

    Great work Jeff, you're my inspiration to work with Linux and HW. Please do a video on kernel compiling

    • @JeffGeerling  3 years ago +3

      I would like to do this, might soon!

  • @cattigereyes1 2 years ago +1

    I would love one that is made for the Pi!

  • @EstebanGallardo 3 years ago +1

    I'm trying to make an overclocked Pi 4 to run Saturn games with the Yabause emulator, and some extra juice from that tiny GPU board could do the trick. With Saturn I would be able to have the 32-bit generation era complete (PSX, Saturn, N64) and also (Dreamcast and PSP). Then I would get in touch with people who have the knowledge to 3D print video game cases so I could fit the Pi 4 + cooler + mini GPU into a nice N64 case

  • @Tvngsten 3 years ago +2

    10:22 Glad I'm not the only one to do this with my mouse

    • @JeffGeerling  3 years ago +2

      Getting to the real meat of this video :D

  • @PMARC14 3 years ago

    Man has been working so hard right through the silicon crisis

  • @SpectrumDIY 2 years ago

    Love the outtakes XD, had to add my sub. I'm only just now getting into PCB etching and design, but I got my hands on a Raspberry Pi 4 today (well, ordered anyway). I wish I had the knowledge to debug and help; I would in a heartbeat. Alas, I do not speak the language.

  • @lasbrujazz 3 years ago +1

    I can think of some scenarios where a Pi and an external GPU can work together, such as a Plex server with hardware acceleration, or a render farm. Basically, the Pi just acts as an adapter and redirects the task to the GPU.

    • @ninline2000 3 years ago

      Bitcoin miner.

    • @coler154 3 years ago

      @@ninline2000 you can't mine Bitcoin with GPUs anymore

  • @yagobueno2785 3 years ago

    In the next episode: "We create our own GPU and WORKING DRIVER just for the Raspberry Pi"

  • @Matrxmonky 3 years ago

    HEY! I have one of those 750s! Little Ti SC edition, what a little trooper that card was...

    • @JeffGeerling  3 years ago

      Nowadays it's considered high-end, at least among the cards people could actually find under $1000 😆

  • @javvyjavvy 3 years ago

    Good luck! Doing a Pi thing this weekend. Love your videos.