Ollama in a RASPI | Running a Large Language Model in a Raspberry Pi

  • Published: 23 Aug 2024
  • Hi and welcome back to the DevXplaining channel! In today's video, we'll go through how to install Ollama and a large language model on a Raspberry Pi. I'll also cover the prerequisites needed to make it happen smoothly, and give you a glimpse of the performance and use cases for this.
    So join me for a bit and, as always, I appreciate any likes, feedback and comments. Feel free to subscribe to my channel for more content like this! I might get inspired to do a follow-up with some coding as well; there's a small Python sketch after the links below to get you started.
    Links in the video:
    - www.raspberryp...
    - ollama.com/
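
    A minimal sketch of what talking to a locally running Ollama server from Python can look like. This is not from the video itself: it assumes Ollama is already installed and serving on its default port (11434), and that a small model has been pulled on the Pi beforehand; the model name "llama3.2" and the prompt are placeholders, not something the video specifies.

    ```python
    # Sketch: query a locally running Ollama server via its HTTP API.
    # Assumes Ollama is installed and listening on the default port 11434,
    # and that the model (here "llama3.2", an assumption) was pulled first,
    # e.g. with `ollama pull llama3.2` on the Raspberry Pi.
    import json
    import urllib.request

    OLLAMA_URL = "http://localhost:11434/api/generate"

    def ask(prompt: str, model: str = "llama3.2") -> str:
        """Send a single non-streaming prompt to the local Ollama API and return the text."""
        payload = json.dumps({
            "model": model,
            "prompt": prompt,
            "stream": False,  # wait for the complete answer instead of streaming tokens
        }).encode("utf-8")
        req = urllib.request.Request(
            OLLAMA_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    if __name__ == "__main__":
        # On a Raspberry Pi this can take a while, but it runs fully locally.
        print(ask("Explain in one sentence what a Raspberry Pi is."))
    ```

    With a non-streaming call like this, expect to wait noticeably longer on a Pi than on a desktop GPU; streaming (the API's default) gives earlier feedback token by token.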

Comments • 2

  • @LanJulio · A month ago · +1

    Thanks for the Video !!! Will try on my Raspberry Pi 5 with 8GB of RAM !!!

    • @DevXplaining · A month ago

      Perfect! It's gonna be slowwww... But fully local too :)