Install Ollama, your own personal LLM, on your Mac

  • Published: 20 May 2024
  • In this video, I'm joined by José Domingo Cruz (the Mac Whisperer) as he helps me troubleshoot getting Homebrew, Docker, and docker-compose installed on his Mac. This makes it possible to use a docker-compose.yaml file to run Ollama with Open-WebUI locally on a Mac.
    Here are the basic steps and commands:
    1. Install Homebrew on the Mac
    /bin/bash -c "$(curl -fsSL raw.githubusercontent.com/Hom...)"
    then later see "07:20 Finishing the Homebrew installation" for some extra commands to be copied and run from inside the terminal.
    2. Install "Docker Desktop for Mac" from docs.docker.com/desktop/insta... (NOTE: there are 2 versions: one for Apple silicon and one for Intel chips. Choose the one for your Mac)
    3. Use Homebrew to install docker-compose
    brew update
    brew install docker-compose
    4. Download the docker-compose.yaml file from Github (repo github.com/adamjenkins/pillama )
    raw.githubusercontent.com/ada... (raw file)
    5. In the same directory as the docker-compose.yaml file, run docker-compose to start the server:
    docker-compose up -d
    (NOTE: to take the server offline use 'docker-compose down')
    6. Visit localhost and download some models to start using your new local Ollama server.
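    The steps above revolve around the docker-compose.yaml file from the repo. As a rough sketch only (the service names, image tags, ports, and volume names below are assumptions, not the repo's actual contents; download the real file from github.com/adamjenkins/pillama as described in step 4), such a file typically pairs the two containers like this:

    ```yaml
    # Hypothetical sketch of a docker-compose.yaml pairing Ollama with Open-WebUI.
    # All names, ports, and volumes here are illustrative assumptions;
    # use the actual file from the repo linked in step 4.
    services:
      ollama:
        image: ollama/ollama              # official Ollama image
        ports:
          - "11434:11434"                 # Ollama's default API port
        volumes:
          - ollama:/root/.ollama          # persist downloaded models
      open-webui:
        image: ghcr.io/open-webui/open-webui:main
        ports:
          - "3000:8080"                   # browse to http://localhost:3000
        environment:
          - OLLAMA_BASE_URL=http://ollama:11434
        depends_on:
          - ollama
    volumes:
      ollama:
    ```

    With a file like this saved in the current directory, 'docker-compose up -d' starts both containers and 'docker-compose down' stops them.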
    ----------------
    Many thanks to the Mac Whisperer, José Domingo Cruz for his help in creating this video. I was lost without you buddy! You can find more of José at his website goldfish365.com/ . Also, if you're an English teacher looking for a way to get your students speaking fluently in the classroom, check his "Verbal Classrooms" page here goldfish365.com/vcdocuments/ !!!
    Timestamps:
    For the first three minutes we chat about what we're trying to do... (Hey, it was our first time ever doing this)
    03:05 Getting started installing Homebrew
    07:20 Finishing the Homebrew installation
    08:45 Running brew update and installing docker-compose
    09:38 YeeHa!
    09:55 Checking that docker-compose is actually installed
    11:05 Getting the docker-compose.yaml file (we actually use nano here to copy, paste and save - you could just download the file)
    13:20 First attempt at running "docker-compose up -d"
    14:00 Troubleshooting the first error message (no docker daemon)
    17:50 Second attempt at running "docker-compose up -d"
    18:20 Deciding to install "Docker Desktop for Mac"
    19:40 Installing "Docker Desktop for Mac"
    21:00 "Docker Desktop for Mac" first launch
    22:35 Going through the first run licensing stuff
    27:50 Third attempt at running "docker-compose up -d"
    24:00 Troubleshooting the new error message (must use ASL logging - the answer was to run the command without "sudo")
    25:20 Finding the answer
    27:50 Fourth attempt at running "docker-compose up -d" (SUCCESS!!!)
    28:15 Opening Ollama in a web browser for the first time and making a user account
    29:45 Getting ready to add a model to the new Ollama
    30:19 Checking the connection to the local Ollama API
    30:48 Adding a model (mistral) to the new Ollama
    33:01 Running the first prompt
    34:30 Finding more models to download and information about each model
    44:05 How to take the server down with 'docker-compose down'
    Then we just chatted and had a look at Activity Monitor while running a query, etc. Basically we basked in the glory of success!
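    Once the containers are up, you can confirm the Ollama server is answering before adding models (this is what we do around the 30:19 mark). A minimal sketch, assuming Ollama is listening on its default port 11434:

    ```shell
    # Probe the local Ollama API; /api/tags lists locally installed models.
    # Assumes the default port mapping 11434:11434 from the compose file.
    if curl -fsS --max-time 3 http://localhost:11434/api/tags >/dev/null 2>&1; then
      echo "Ollama API reachable"
    else
      echo "Ollama API not reachable (is the container running?)"
    fi
    ```

    An empty model list right after first launch is normal; it fills up as you add models through the Open-WebUI interface.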
    Links:
    Ollama ollama.com
    Open-WebUI github.com/open-webui/open-webui
    Open-WebUI Community openwebui.com (modelfiles, prompts and more)

Comments • 6

  • @laresbernardo
    @laresbernardo A month ago +1

    This is fantastic. I am not the most programming-savvy guy and it ran perfectly on my computer. Thanks!

    • @Wise-Cat
      @Wise-Cat  A month ago

      Great to hear! Enjoy your new AI!

  • @LucGougeon
    @LucGougeon A month ago +1

    This is awesome, thanks!

    • @Wise-Cat
      @Wise-Cat  A month ago

      Glad you like it! Definitely felt a follow up to the workshop last Sunday was needed.