It took me TWO YEARS to get this working! (GPU on Pi)

  • Published: 14 Jun 2024
  • After TWO YEARS it finally works! Video output through multiple GPUs on the Raspberry Pi. Well... sorta. Check out what we have working in the video, and what's yet to be discovered.
    Special thanks to EVERYONE who's worked so hard on this over the past couple years, especially those in the GitHub issues. See links below if you want to dive deeper into the Pi external GPU rabbit hole!
    Support me on Patreon: / geerlingguy
    Sponsor me on GitHub: github.com/sponsors/geerlingguy
    Merch: redshirtjeff.com
    #RaspberryPi #GPU #AMD
    More info:
    - Raspberry Pi PCIe Device Database: pipci.jeffgeerling.com
    - Blog post with more detail: www.jeffgeerling.com/blog/202...
    Graphics cards used in this video:
    - ASRock Rack M2_VGA (SM750): pipci.jeffgeerling.com/cards_...
    - VisionTek AMD Radeon 5450 1GB: pipci.jeffgeerling.com/cards_...
    - Dell AMD Radeon HD7470 1GB: TODO
    - AMD Radeon RX 6700 XT: pipci.jeffgeerling.com/cards_...
    Adapters and equipment used in this video (some are affiliate links):
    - PCIe x1 to x16 adapter: amzn.to/3MvRQLA
    - Razor Saw blades: amzn.to/3LrgywA
    - BC1 Mini ITX bench table: amzn.to/37JqEtX
    - Seaberry Pi: pipci.jeffgeerling.com/boards...
    - Coolerguys AC to Molex 2A power adapter: amzn.to/3KinmuW
    Contents:
    00:00 - Persistence pays off
    01:21 - M2_VGA works!
    01:57 - AMD Radeon cards
    02:46 - GUIs work!
    03:34 - 3D Benchmarks... kinda work!
    04:13 - Memory woes
    05:45 - Can it run Crysis?
    07:01 - Will other ARM SoCs work better?
    08:05 - What about Windows?
    08:16 - And that RX 6700 XT?
    08:35 - To boldly go.
  • Science

Comments • 708

  • @RaidOwl
    @RaidOwl 2 years ago +678

    That's pretty sweet. Next step is to have GPU manufacturers ship a GPU with a CM4 built into it 😁

    • @samiraperi467
      @samiraperi467 2 years ago +33

      CM4 on an RX6400?

    • @bourne_
      @bourne_ 2 years ago +5

      Compatibility layer? 👀

    • @woutererades933
      @woutererades933 2 years ago +5

      Didn't Nvidia already do something like that?

    • @AntonioNoack
      @AntonioNoack 2 years ago +16

      @@woutererades933 Kind of; there are development boards like the Jetson Nano and Jetson Xavier, which run ARM CPUs and have working CUDA cores.

    • @JerimiahHam797
      @JerimiahHam797 2 years ago +6

      Good luck getting your hands on one right now. They were $99 at launch and now are on eBay for $400. 🤦‍♂️

  • @ergo4216
    @ergo4216 2 years ago +322

    Thank you for contributing to the open source community!
    The process you're showing is part of why we have free things like Blender and Linux. People dedicate their time to making something work for the rest of us.
    In ARM's case, this is particularly exciting because it may one day lead to a desktop PC platform alternative to the red and blue giants.

    • @STORMFIRE07
      @STORMFIRE07 2 years ago +24

      ARM is cool and all, but if we're aiming for a desktop PC platform alternative, I'd rather RISC-V achieve that goal than ARM, since RISC-V is completely open source whereas ARM is proprietary. If the Pi 4 had a RISC-V chip instead of an ARM one, I'm sure someone would have already made the high-end GPU drivers work with the Pi's CPU by now.

    • @ergo4216
      @ergo4216 2 years ago +7

      @@STORMFIRE07 Agreed! RISC-V is an exciting development.

    • @OEFarredondo
      @OEFarredondo 2 years ago

      More so, they are modern-day wizards. These people are techno-wizards. I code and mess with hardware, but this is technopath stuff to me.

    • @xerzy
      @xerzy 2 years ago +2

      @@STORMFIRE07 (well, RISC-V says nothing about PCIe having to be standard-compliant, and you can extend RISC-V with proprietary extensions, as vendors like Esperanto and SiFive do, but it would still be neat)

  • @CognosSquare
    @CognosSquare 2 years ago +163

    Your work on the Pi GPU is the stuff of legends, Jeff.

  • @froid_san
    @froid_san 2 years ago +179

    That was one heck of a wild ride to get a GPU working. Thanks for your contribution to the Pi community!

  • @famitory
    @famitory 2 years ago +178

    seems like a useful tool here would be an open-source PCIe "card" that's actually an SBC on its own, and acts to test which PCIe features are working properly for whichever CPU it's talking to. then you could more rapidly rule out any given SBC platform without having to dig into driver dev

    • @the_retag
      @the_retag 2 years ago +11

      Sounds like a Pi CM4 application to me

    • @famitory
      @famitory 2 years ago +31

      @@the_retag ah, another computational application of the 'Spider-Men pointing at each other' meme.

    • @liquefactor
      @liquefactor 2 years ago

      famitory, I love your music; good to see you here

    • @rarrawer
      @rarrawer 2 years ago +3

      So basically a board with an FPGA programmed with a PCIe IP core, wired for up to x16, to talk to the device under test; some RAM to act as scratch space; a means to reset the device under test; and either a soft or hard SoC to execute tests through that FPGA fast enough to satisfy the PCIe link?
      Probably with the SoC able to write to the FPGA's gate definition/configuration memory, or just plain booting the FPGA from the SoC, so that development of the test hardware becomes less of a nightmare.
      And on top of that, a boatload of tests designed to validate spec compliance, edge cases, and quirks, and to do fuzzing/fault injection in otherwise normal test cycles.
      Limiting the scope to pass/fail validation of specific PCIe features, with no need to diagnose and correct beyond that, might make it manageable for someone who knew what they were doing?
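
The tester idea in the thread above can be approximated in software today: before building any custom hardware, the first question on a new SBC is whether a card trains a link at all, and at what speed and width. A minimal sketch (the function name, regexes, and sample text are mine, written against the common `lspci -vv` field layout, not any project's actual tooling):

```python
import re

def parse_link_status(lspci_vv: str) -> dict:
    """Extract advertised (LnkCap) vs. negotiated (LnkSta) PCIe link
    speed and width from `lspci -vv` output for one device."""
    pat = r"{}:.*?Speed ([\d.]+GT/s).*?Width x(\d+)"
    cap = re.search(pat.format("LnkCap"), lspci_vv)
    sta = re.search(pat.format("LnkSta"), lspci_vv)
    return {
        "max_speed": cap.group(1) if cap else None,
        "max_width": int(cap.group(2)) if cap else None,
        "cur_speed": sta.group(1) if sta else None,
        "cur_width": int(sta.group(2)) if sta else None,
    }

# Example: an x16 card in the Pi's single-lane Gen2 slot trains at x1.
sample = """\
LnkCap: Port #0, Speed 5GT/s, Width x16, ASPM L1, Exit Latency L1 <64us
LnkSta: Speed 5GT/s (ok), Width x1 (downgraded), TrErr- Train- SlotClk+
"""
print(parse_link_status(sample))
# {'max_speed': '5GT/s', 'max_width': 16, 'cur_speed': '5GT/s', 'cur_width': 1}
```

A check like this only confirms link training; it says nothing about the BAR mapping and coherency issues discussed elsewhere in these comments.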

  • @fattomandeibu
    @fattomandeibu 2 years ago +93

    Maybe one day, thanks to your pioneering exploration, we'll get to see the truly epic sight of a Pi 5 (or 6, 7, etc.) model B with a triple-slot RTX 3090 sticking out of the top... or, more likely, watch it fall over. It'd still be pretty fun.

    • @JeffGeerling
      @JeffGeerling 2 years ago +60

      In that case, you would mount the Pi to the graphics card, and not the reverse :D

    • @amirpourghoureiyan1637
      @amirpourghoureiyan1637 2 years ago +11

      Now that the Pi Foundation is making its own custom designs, hopefully they'll consider designing a high-end ARM chip. If they put enough behind it, I can easily see Pis becoming the new standard for desktop ARM.

    • @riccardo1796
      @riccardo1796 2 years ago +2

      @@amirpourghoureiyan1637 that is too cool to ever happen

    • @amirpourghoureiyan1637
      @amirpourghoureiyan1637 2 years ago +5

      @@riccardo1796 At this point with open RISC desktops, it's either them or SiFive. If they hit a wall with ARM, we might even see the Pi Foundation build upon their RISC-V investment and make new Pis with RISC-V.

    • @STORMFIRE07
      @STORMFIRE07 2 years ago +4

      @@amirpourghoureiyan1637 I would love for them to go for RISC-V if they're thinking of making their own CPU for the Pi 5, since ARM is proprietary whereas RISC-V is open source.

  • @RN1441
    @RN1441 2 years ago +49

    Kudos for the hard work and sticking with it. I would have given up when it became obvious the problems were in the underlying PCIe implementations and controllers.

    • @kevolegend
      @kevolegend 2 years ago +3

      Persistence, patience, perseverance.

  • @christopherleadholm6677
    @christopherleadholm6677 2 years ago +13

    Jeff, I have watched you since at least when you began the GPU adventure... it doesn't seem like it's been two years, but hey, I guess it has.
    I've been trying to keep up with your stuff, as well as what a couple of other Pi-related YouTube channels are up to, but it's hard: you and the others tend to move on to new projects and things faster than I can duplicate or even comprehend them sometimes!
    I like to explore each little bit of it on my own, and can get distracted going (seemingly) 'where no man has gone before' all by myself. ;-)
    I've found doing things yourself is the BEST way to learn things you don't know...
    Good work, Jeff. Nice stuff. Keep it up.

  • @EeroafHeurlin
    @EeroafHeurlin 2 years ago +31

    The epitome of a good hack: technically challenging, increases your understanding of low-level things, but mostly useless outside of the learning experience and bragging rights.

  • @elFarto2
    @elFarto2 2 years ago +10

    Great to see you finally got it working!

    • @JeffGeerling
      @JeffGeerling 2 years ago +3

      Great to see you here :D

    • @sandmaster4444
      @sandmaster4444 2 years ago +1

      I enjoyed that he said your username in the video :)

    • @vladduh3164
      @vladduh3164 2 years ago

      came here to find you

  • @alphaLONE
    @alphaLONE 2 years ago +4

    Finally! Loaded the video as soon as it popped up in my notifications. Kudos to you and all the community members that got together to bring this so far.

  • @theprodigal72
    @theprodigal72 2 years ago +50

    Based on what you say, Minecraft might be the first game to test, because it uses OpenGL and Java. I don't know about OpenGL, but Java is definitely supported on ARM.

    • @JeffGeerling
      @JeffGeerling 2 years ago +28

      Definitely! In fact, Coreforge has been testing Minecraft in various scenarios this week, and I just shipped him an upgraded 8GB CM4 so he can do some extra testing with more RAM than he had available in the past.
      Testing with actual applications is helpful because we can also find some architectural flaws in the way we patch things, so in the end it gets even more stable.

    • @XbotcrusherX
      @XbotcrusherX 2 years ago +6

      You can also go back to really old versions. For instance, Beta 1.7.3 ran decently on a 500 MHz PPC with a 128 MB Radeon card.

    • @GSBarlev
      @GSBarlev 2 years ago +3

      With optifine or sodium/phosphor you can already run Minecraft Java at a playable 15-25 fps, but what's more intriguing is that Bedrock under Android performs about the same! I can't imagine that getting those Linux patches into the Android Linux kernel would be the most productive use of anyone's time, but it would be pretty legendary to see a Raspberry Pi playing with "RTX On."

    • @comicsansgreenkirby
      @comicsansgreenkirby 2 years ago

      Weren't there OpenGL drivers being worked on for the Pi 4?

    • @circuit10
      @circuit10 2 years ago

      @@comicsansgreenkirby Minecraft already works on the Pi 4 but not with a dedicated GPU

  • @MichaelKlements
    @MichaelKlements 2 years ago +7

    Such awesome work! The amount of patience and persistence required to get this working is amazing, well done!

  • @TailRecursion
    @TailRecursion 2 years ago +20

    He finally did it, the absolute legend!

    • @alexanderstohr4198
      @alexanderstohr4198 2 years ago

      At present it's just a sneak peek through a slightly opened door.
      But hey, it's still a promising keystone step.

  • @Wickedm1k3
    @Wickedm1k3 2 years ago +1

    I saw your comments and pictures on GitHub a week or so ago on this topic; nice to be able to be part of your journey.

  • @donnhussey568
    @donnhussey568 2 years ago +3

    The best part is that there is someone out there named El Farto, and you said their name in your video. Simple things make my day.

    • @JeffGeerling
      @JeffGeerling 2 years ago +1

      Opportunity: Change your name to DonnFarto!

  • @dragonrider6875
    @dragonrider6875 2 years ago +2

    You and the community are insane!! I have been following this for 2 years, and I hope it continues to grow and that we see more standards come out and be followed by all developers.

  • @notthesameman
    @notthesameman 2 years ago

    Great job, and we'll be standing by for the next big breakthrough. Thanks, Jeff!

  • @TheOrganicartist
    @TheOrganicartist 2 years ago

    Amazing; this is such an awesome project. Glad to see you making progress cracking a very hard problem.

  • @psyolent.
    @psyolent. 2 years ago

    Super well done, bud. Love your work, man.

  • @GuoFuTseng
    @GuoFuTseng 2 years ago

    Thank you Jeff and everyone working on this project. The information is very helpful.

  • @izzieb
    @izzieb 2 years ago +20

    After many recompilations of the Linux kernel, Jeff has finally succeeded!

  • @RazorSkinned86
    @RazorSkinned86 2 years ago +2

    Y'all are total legends for finally pulling this off.

  • @technosurfer92
    @technosurfer92 2 years ago +14

    This is an amazing milestone in a long and arduous journey. Given that the Pi is becoming more PC-viable with every successive generation, I really hope the Pi Foundation and Broadcom consider aiming for SystemReady SR compliance, so that PCIe devices like AMD GPUs just work.

  • @RetroRobotRadio
    @RetroRobotRadio 2 years ago

    Congrats. I have been following your trials.

  • @Megaflare47
    @Megaflare47 2 years ago +3

    Nice work. I'll definitely try this out when I get one of these boards myself.

  • @strawmanfallacy
    @strawmanfallacy 2 years ago +1

    Hey man thanks for doing what you do. Really appreciate it.

  • @MegaManNeo
    @MegaManNeo 2 years ago +41

    It's just the beginning, Jeff.
    You guys made the first step in what could turn the RPi platform into a reliable desktop platform for everyone in the future.

    • @foldionepapyrus3441
      @foldionepapyrus3441 2 years ago +2

      Personally I'd say it's already there, at least for most people. I know I only really turn my main PC on occasionally now, mostly for modern gaming (and to run the backups, as it's got all the big HDDs). A Pi 4/CM4, perhaps with a bit of overclocking and cooling, can easily handle running multiple VMs, FreeCAD (at least up to reasonably complex models), web browsing, video playback, etc., and in many cases isn't any slower doing so than a modern AMD/Intel monster with a giant GPU consuming 3-4x the wattage at idle that the Pi maxes out at. We fleshbags are often the bottleneck, and data access speeds can be too, but the Pi isn't a slouch there either: via USB 3 or PCIe it's able to go about as fast as most common drives. There are of course folks doing more computationally heavy tasks for which the ARM/VideoCore combo either can't (yet) use acceleration features to make things smooth and nice, or just isn't up to doing the task in a timely fashion; Jeff's many, many kernel recompiles would be far better done on something else. But most folks' computing needs are well within range of a Pi 4.
      I do agree, though, that this work really could make it a more attractive proposition for some.

  • @RidgeRacer
    @RidgeRacer 2 years ago +8

    Seeing this video brings me great calm. Like that first bite of my favourite meal after having to prepare it for 2 years.
    (Wonder if I'll ever get to see Half-Life 2 running on a Pi with a dedicated GPU; some madman got it running on the Pi 4's VideoCore GPU. Well, running. But that's about it)

  • @sametayaz4891
    @sametayaz4891 2 years ago

    Keep up the good work bro! And good luck with that 👌🏻

  • @xirtus
    @xirtus 1 year ago

    I've been watching, I've been excited, I'm thrilled for you, and I can't wait for the future of this.

  • @tankfire20
    @tankfire20 2 years ago

    This is some fire content. I look forward to this!

  • @keithwhite2986
    @keithwhite2986 2 years ago

    Fantastic effort, thanks for all you are doing.

  • @scottwilliams895
    @scottwilliams895 2 years ago

    Jeff, so impressed with your dedication!! Keep up the outstanding work 👍

  • @Abishek_Muthian
    @Abishek_Muthian 2 years ago +2

    Congratulations to you and all the other contributors; impressive achievement indeed!

  • @IngwiePhoenix
    @IngwiePhoenix 2 years ago +2

    Woohoo! The madlad did it: video out over PCIe on a GPU :D
    I've been watching for quite a while and sometimes looked at your repos to see what was going on in the project. I am quite happy to see this progress!
    It's also, at least in my opinion, a shining example of why it's a great thing to have open-source GPU drivers. :)

  • @dieSpinnt
    @dieSpinnt 2 years ago +4

    Mind-blowing!
    Reminds me of years long gone by, when DVB cards were new and we had to confront kernel driver programming ourselves.
    Thank you for the great work, your videos, and your contribution to the open source software community. Well done, Jeff! :)

    • @Nick3DvB
      @Nick3DvB 2 years ago +3

      Those were the days, I remember one guy had a Nova-S in a G4 Mac running franken-build PowerPC linux with native Dreambox EMUs like Newcamd, just to update EMMs on an NDS smartcard, in an RS232 phoenix reader!

  • @adventuregamer6499
    @adventuregamer6499 2 years ago +1

    I'm really excited to see how this turns out. Keep it up, Jeff.

  • @nilyaj
    @nilyaj 2 years ago

    So happy for you, man; been waiting for this for a while.

  • @BlenderP984
    @BlenderP984 2 years ago

    This is a massive achievement, congratulations!! I remember the first time you worked on getting a Graphics Card working on the Pi and you’ve made some incredible progress since then. Hope you continue to explore this!!

  • @rmcdudmk212
    @rmcdudmk212 2 years ago +2

    Congrats on finally getting that GPU up and running. The hard work seems to have paid off.

  • @EarnestWilliamsGeofferic
    @EarnestWilliamsGeofferic 2 years ago

    Congratulations on such a huge milestone!

  • @ParisCarper
    @ParisCarper 2 years ago

    This is amazing! Great work

  • @nullmind
    @nullmind 2 years ago

    You finally did it! Was such an incredible journey to watch. Keep it up!

  • @user-hm4xq4mv5v
    @user-hm4xq4mv5v 2 years ago +1

    This work is peak electrical engineering and the video had me giddy and cheering! Congratulations to you and all the other people who worked on this!!! This is a HUGE step for ARM integration and standardisation.

  • @garrettwalker5674
    @garrettwalker5674 2 years ago

    Yes! I've been checking your channel for weeks hoping for this video to pop up

  • @TheRealWulfderay
    @TheRealWulfderay 2 years ago

    What a journey! Nice perseverance!

  • @der.Schtefan
    @der.Schtefan 2 years ago

    I haven't yet seen the video, but already clicking on the description makes my heart jump high :)

  • @glucosefructose
    @glucosefructose 1 year ago

    I first saw this video when waiting around at school on my break period. Thank you good sir for your epic efforts.

  • @eRogue7
    @eRogue7 2 years ago

    Congrats on making progress on your trip down the rabbit hole of trying to get something working for no other purpose than to see if you could!!! It's amazing to see how much you've learned on this journey, and I'm sure by now you are pretty comfortable with compiling custom Linux kernels.

  • @nexxusty
    @nexxusty 2 years ago

    Awesome Jeff.
    Well done, all of you. This has been a long road.

  • @therealjamescobb
    @therealjamescobb 2 years ago +1

    That's awesome to see it working!! I just got a shirt: "It was DNS!" I totally had a DNS issue the other day and needed your shirt because of it, heh. Thanks for doing what you do, Jeff!

  • @ma3oun
    @ma3oun 2 years ago

    Well done !!! Keep up the great work 🙂 !

  • @a4e69636b
    @a4e69636b 2 years ago

    Excellent job, everyone working on this project.

  • @DanielLopez-up6os
    @DanielLopez-up6os 2 years ago +3

    Amazing man, you've finally done it! Also the 5450 used to be my main card for AGES!

  • @wtf028
    @wtf028 2 years ago

    This is really great! Thanks for sharing.

  • @ArronLorenz
    @ArronLorenz 2 years ago

    Fantastic! It's great to see your satisfaction from the hard work.

  • @educastellini
    @educastellini 2 years ago

    Thank you very much for the knowledge and work you do with the Raspberry Pi!

  • @davidg5898
    @davidg5898 2 years ago +8

    Excellent work. Keep pushing the limits!
    As wild as it would be to get modern games running on an RPi, it's the math and AI acceleration that I want to see. Even an older GPU would yield a massive performance boost to those algorithms.
    I think SBC/SoC vendors with consumer grade products do pay attention to tinkerer/maker projects like this, and hopefully they'll now see that a more standardized PCIe implementation is something more people would put to use. I wouldn't expect it to appear in the next generation (since those are probably well into the design phase, if not testing phase), but hopefully the generation after that.

  • @BalancedSpirit79
    @BalancedSpirit79 2 years ago

    Amazing work!!

  • @tobiwonkanogy2975
    @tobiwonkanogy2975 2 years ago

    I have watched nearly every video since the first time you tried. Exciting!!!

  • @ur1friend437
    @ur1friend437 2 years ago

    Thank you for sharing these awesome and exciting journeys with us.

  • @DKali-gn7xj
    @DKali-gn7xj 2 years ago

    DAMN! you da man! bravo for the continued research and banging your head against the wall.

  • @indraneel1080brahma
    @indraneel1080brahma 2 years ago +2

    I was waiting for this video for a long time because the end of the first one was not satisfying. Very nice to see the progress; I hope the future holds more.

    • @JeffGeerling
      @JeffGeerling 2 years ago +2

      It's been a long journey, and it's not over yet!

  • @MINKIN2
    @MINKIN2 2 years ago

    Amazing work!

  • @1918dawson
    @1918dawson 2 years ago

    massive W, keep pushing Jeff

  • @etch_lime
    @etch_lime 2 years ago +1

    Thanks for the video

  • @Ollital
    @Ollital 2 years ago

    It's really fascinating to follow you on this journey.

  • @maxmouse3
    @maxmouse3 2 years ago

    The man really did it! :O 💪🏻 congrats

  • @YouCanHasAccount
    @YouCanHasAccount 2 years ago +22

    We're struggling with write-combining and cache-coherency issues for the Radeons on higher-end aarch64 SoCs too, like the LX2160.

    • @JeffGeerling
      @JeffGeerling 2 years ago +9

      Yeah; from what I've seen, there are four or five issues on the Pi... and at least one or two on _most_ ARM chips. The biggest problem is that x86 device drivers assume certain features that aren't in the 'official' cross-platform PCIe spec, and those quirks are hard to work around on hardware that doesn't support them.

    • @KoenKooi
      @KoenKooi 2 years ago +1

      @@JeffGeerling When I was still at Linaro we used a patch for the DMA subsystem to get GPUs working, to account for the differences in memory models. That was 5 years ago; I hope things have improved since then :)

    • @Knirin
      @Knirin 2 years ago +2

      @@JeffGeerling Or the minimum spec is badly engineered. Given the chicanery I have seen regarding Type-C accessories for cell phones, I suspect both a badly engineered spec and pathetic enforcement of it.
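
The "memory woes" chapter behind threads like this one is partly an address-space problem: every PCI BAR must be placed at an address aligned to its own size, inside the host bridge's outbound memory window, and early Pi 4 firmware exposed a window far smaller than a modern GPU's VRAM BAR. A toy sketch of that constraint (the window and BAR sizes below are illustrative examples, not exact BCM2711 figures, and real kernels allocate BARs with a more involved algorithm):

```python
def bars_fit(window_size: int, bar_sizes: list[int]) -> bool:
    """Naive check: can all BARs be placed inside one outbound window?
    PCI BARs are naturally aligned (a BAR's base must be a multiple of
    its size), so pack largest-first, aligning each base up to the
    BAR's own size before placing it."""
    addr = 0
    for size in sorted(bar_sizes, reverse=True):
        addr = (addr + size - 1) & ~(size - 1)  # round up to the BAR's alignment
        addr += size
    return addr <= window_size

MiB, GiB = 1 << 20, 1 << 30

# An older card with a 256 MiB VRAM BAR plus small register BARs fits...
print(bars_fit(window_size=960 * MiB, bar_sizes=[256 * MiB, 256 * 1024, 128 * 1024]))  # True
# ...but a card asking for an 8 GiB (resizable) BAR does not.
print(bars_fit(window_size=960 * MiB, bar_sizes=[8 * GiB]))  # False
```

This is only the placement half of the problem; the write-combining and coherency issues the commenters mention remain even once the BARs fit.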

  • @OsX86H3AvY
    @OsX86H3AvY 2 years ago

    great job and congratulations!

  • @noahhorler1771
    @noahhorler1771 2 years ago

    Been waiting for so long for this... definitely not disappointed!

  • @lepsycho3691
    @lepsycho3691 2 years ago +1

    Man, if I could press the thumbs-up button twice I would! Amazing work, Jeff; I am amazed by the work you do and the way you do it! You are really building something here!

  • @_guru_1547
    @_guru_1547 2 years ago

    Kudos for your hard work

  • @Castaa
    @Castaa 2 years ago +1

    If anything you learn a lot from these types of adventures. Congrats!

  • @ritaoraodndndbe3419
    @ritaoraodndndbe3419 2 years ago

    ...where no one has gone before...
    Amazing work. Thanks for sharing!

  • @reggiep75
    @reggiep75 2 years ago

    Very entertaining witchcraft and good that many people pitched in to assist when you hit a brick wall. Keep it up! 👍

  • @Mr.Leeroy
    @Mr.Leeroy 2 years ago

    Reminds me of Dracula's struggle to make his children hatch in Van Helsing (2004); getting the PCIe bus on cheap ARM hardware to comply is like reanimating something born without life.

  • @dfbess
    @dfbess 2 years ago

    Congratz Jeff!

  • @pavan13
    @pavan13 2 years ago +4

    So from now on your videos are available in 4K? That's awesome!

    • @JeffGeerling
      @JeffGeerling 2 years ago +5

      Been slowly working up to it. Not every video will be 4K, but I finally upgraded all my equipment so I can at least record most things in 4K. I do like the detail it gives for up-close product shots!

    • @hicknopunk
      @hicknopunk 2 years ago +1

      My internet only does 360p; I usually watch YouTube in 144p. 🤣🤣

  • @ZephaniahNoah
    @ZephaniahNoah 2 years ago

    Finally!! Great work!

  • @hommydc2
    @hommydc2 2 years ago

    excellent work!

  • @alextracythaxton5796
    @alextracythaxton5796 2 years ago

    Congratulations bro

  • @donporter8432
    @donporter8432 2 years ago

    You are a great mentor to others, displaying creativity and persistence. Bravo, Jeff!

  • @MrJoegotbored
    @MrJoegotbored 2 years ago

    Keep going, Jeff!

  • @Leroys_Stuff
    @Leroys_Stuff 2 years ago

    That's awesome; one step at a time. Using things as no one intended them to be used is the most fun.

  • @adrianatanasov9886
    @adrianatanasov9886 2 years ago

    That's so cool, and finally some development on the subject.

  • @richardwatkins6725
    @richardwatkins6725 2 years ago

    Well done; awesome work.

  • @terrawolf
    @terrawolf 2 years ago

    YOU FINALLY DID IT
    good job my guy

  • @yourmanyoshi2503
    @yourmanyoshi2503 1 year ago

    I’m proud of you man😁

  • @dartz7137
    @dartz7137 6 months ago

    He is a man of focus, commitment and sheer freaking will.

  • @paulsama4144
    @paulsama4144 2 years ago

    Hi Jeff! I'm very happy with all the stuff you are doing. I know it isn't an easy job to do all that kernel compiling,
    and I hope to see the Raspberry Pi running big GPUs one day.
    So good luck and never give up. I'll continue to follow your path even though, in the computer science domain, I'm just a SolidWorks user.

  • @helmutzollner5496
    @helmutzollner5496 2 years ago

    Well done, Jeff!
    Interesting results. This has uncovered a few bugs, and maybe it will serve to make future versions of the Pi better.

  • @eccentricOrange
    @eccentricOrange 2 years ago +2

    Congratulations on 100000π subscribers!!

  • @kookwater456
    @kookwater456 2 years ago

    I dunno how monumental this is, but it is pretty monumental and very beautiful to me. Well done Jeff and team!!

  • @TheOleHermit
    @TheOleHermit 2 years ago +1

    Wow! No one can ever call you a quitter, Jeff. I wish I understood about half of what you just ran down, but I'm relieved that I don't need to. Thank you for keeping me humble, Jeff. I need that.
    As it is, I expect a 6-month learning curve ahead to program all the features into my Teensy laser MIDI controller project. Just completed the new RGB projector and Teensy laser synth.
    Yes, I can totally relate to the costs of "going where no one has ever gone before..."
    8 x 600 PPR rotary encoders with 16-NeoPixel RGB level rings, 8 RGB button/rotary encoders, 36 RGB backlit buttons, 2 x 7" Nextion HMIs, a 22" multipoint touchscreen monitor, a mini PC. That list soon blasted through the budget.
    But for some things, neither settling for less nor accepting failure are options worth considering. Right? 😎

  • @stevefulgione4231
    @stevefulgione4231 2 years ago

    OMG! You finally did it!! 💪👍

  • @chloefletcher9612
    @chloefletcher9612 2 years ago

    Wow well done!

  • @syntheticperson
    @syntheticperson 2 years ago

    Congrats!

  • @alexanderstohr4198
    @alexanderstohr4198 2 years ago

    CONGRATS!