How to Run Ollama Docker FastAPI: Step-by-Step Tutorial for Beginners

  • Published: 3 Oct 2024

Comments • 18

  • @Bitfumes
    @Bitfumes  2 months ago

    Please subscribe to Bitfumes channel to level up your coding skills.
    Do follow us on other social platforms:
    Hindi Channel → www.youtube.com/@code-jugaad
    LinkedIn → www.linkedin.com/in/sarthaksavvy/
    Instagram → instagram.com/sarthaksavvy/
    X → x.com/sarthaksavvy
    Telegram → t.me/+BwLR8bKD6iw5MWU1
    Github → github.com/sarthaksavvy
    Newsletter → bitfumes.com/newsletters

  • @ano2028
    @ano2028 2 months ago +2

    Thanks a lot! The whole tutorial is really easy to follow. I had been trying to dockerize my FastAPI container and Ollama container and get them to interact with each other for the last two days, and your video helped me a lot.

    • @Bitfumes
      @Bitfumes  2 months ago

      Wow, that's nice to know.
      And thanks for this amazing comment.
      please subscribe to my newsletter bitfumes.com/newsletters
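
For anyone in the same situation as this thread (a FastAPI container that needs to talk to an Ollama container), a minimal docker-compose sketch is shown below. The service names api and ollama, the app's Dockerfile, and the OLLAMA_HOST variable that the app reads are illustrative assumptions, not the exact setup from the video.

    services:
      ollama:
        image: ollama/ollama            # official Ollama image
        ports:
          - "11434:11434"               # Ollama's default API port
        volumes:
          - ollama:/root/.ollama        # persist downloaded models

      api:
        build: .                        # assumed: a Dockerfile for the FastAPI app
        ports:
          - "8000:8000"
        environment:
          # assumed env var the app reads for the Ollama base URL;
          # between containers, use the service name, not localhost
          - OLLAMA_HOST=http://ollama:11434
        depends_on:
          - ollama

    volumes:
      ollama:

Both services join Compose's default network, so the app can reach Ollama at http://ollama:11434 by service name.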

  • @mochammadrevaldi1790
    @mochammadrevaldi1790 2 months ago +1

    Very helpful, thanks man!

    • @Bitfumes
      @Bitfumes  2 months ago

      cool cool
      please subscribe to my newsletter bitfumes.com/newsletters

  • @shreyarajpal4212
    @shreyarajpal4212 25 days ago +1

    So I can directly host this as a website then, right?

    • @Bitfumes
      @Bitfumes  25 days ago

      Yes and no.
      Yes, you obviously can, but it's not recommended.
      Although you can use AWS ECS to set up Docker and then deploy the same application.

  • @NikolaosPapathanasiou
    @NikolaosPapathanasiou 2 months ago +1

    Hey, nice video man! Since Ollama is running in the Docker container, is it using the GPU?

    • @Bitfumes
      @Bitfumes  2 months ago

      Not in my case, it uses the CPU, but you need to specify the runtime if you want to use the GPU in Docker.
      So yes, you can.
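
To make "specify the runtime" concrete: with the NVIDIA Container Toolkit installed on the host, GPU access can be requested for the Ollama service in Compose roughly as below. This is a sketch of the standard Compose GPU reservation, not the configuration used in the video; the CLI equivalent is docker run --gpus=all ollama/ollama.

    services:
      ollama:
        image: ollama/ollama
        deploy:
          resources:
            reservations:
              devices:
                - driver: nvidia
                  count: all            # or a specific number of GPUs
                  capabilities: [gpu]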

  • @karthikb.s.k.4486
    @karthikb.s.k.4486 2 months ago +1

    Nice. May I know how you are getting suggestions in VS Code? When you type docker, command suggestions come up in VS Code. What is the setting for this? Please let me know.

    • @Bitfumes
      @Bitfumes  2 months ago

      Thanks bhai. Btw, I am using GitHub Copilot, so maybe that's why I get suggestions.

  • @mat15rodrig
    @mat15rodrig 25 days ago

    Thanks for the video!!
    Do you know how to resolve this error?
    ERROR:root:Error during query processing: HTTPConnectionPool(host='localhost', port=11434):
    Max retries exceeded with url: /api/chat (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))

    • @Bitfumes
      @Bitfumes  25 days ago +1

      Make sure your Ollama is running properly, and you must use 0.0.0.0 for the Docker host.
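
To expand on that reply: inside the FastAPI container, localhost:11434 refers to the FastAPI container itself, not to Ollama, which is what produces the "Connection refused" above. The request has to target the Ollama container's service name, or host.docker.internal if Ollama runs directly on the host machine. A sketch, reusing the assumed service names from the earlier compose example:

    services:
      api:
        environment:
          # point the app at the ollama service, not localhost
          - OLLAMA_HOST=http://ollama:11434
          # if Ollama runs on the host machine instead (Docker Desktop):
          # - OLLAMA_HOST=http://host.docker.internal:11434

      ollama:
        image: ollama/ollama
        ports:
          - "11434:11434"               # Ollama's API port, published to the host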

  • @iainmclean6095
    @iainmclean6095 1 month ago +1

    Just so you know, this does not work on Apple Silicon.

    • @Bitfumes
      @Bitfumes  1 month ago

      How much RAM do you have in your Mac?

    • @iainmclean6095
      @iainmclean6095 1 month ago

      @@Bitfumes 128 GB, M3 Max

    • @iainmclean6095
      @iainmclean6095 1 month ago

      @@Bitfumes I have 128 GB RAM and an M3 Max; I think the error is related to Docker and Ollama running on Apple Silicon.
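
On the Apple Silicon point: Docker containers on macOS cannot use the Apple GPU, so Ollama inside Docker runs CPU-only there, and large models can be very slow or fail. A common workaround (an assumption here, not the video's method) is to run Ollama natively on the Mac and keep only the FastAPI app in Docker, pointing it at the host:

    services:
      api:
        build: .                        # assumed Dockerfile for the FastAPI app
        ports:
          - "8000:8000"
        environment:
          # Ollama runs natively on the Mac (e.g. the Ollama app), so the
          # container reaches it via Docker Desktop's host alias
          - OLLAMA_HOST=http://host.docker.internal:11434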