How to Install Ollama, Docker, and Open WebUI on Windows

  • Published: 15 Jan 2025

Comments • 38

  • @NicolasPapalampros 7 days ago +7

    My favorite prompt is "no yapping." You were on point. Thanks!

  • @viktorialis9675 1 month ago +13

    This is the style of tech content I need in my life. Subbed, liked, and commented.

    • @BlueSpork 1 month ago +1

      Thank you, I appreciate it very much.

  • @judgemongaming8435 2 days ago +2

    Seriously, I can't thank you enough for being so direct. All that yapping about unrelated stuff just confuses me.

    • @BlueSpork 2 days ago +1

      Thank you. I’m really glad you like it, and I appreciate you taking the time to make this comment.

  • @davidroberts8800 1 month ago +5

    Finally, someone who gets to the point. THANK YOU. SUBSCRIBED.

  • @olchowikj 16 days ago +2

    This was a really great guide. No BS, quick and easy. I FINALLY managed to make it work. Thanks! 👍

    • @BlueSpork 15 days ago

      Great! I'm really happy to hear it helped you get it working :)

  • @FlorianRasche 25 days ago +1

    I will be giving a talk on the possibilities of local AI in the medical field and will use your video as a quick demonstration of how easy it is to set up Open WebUI. THANK YOU!

    • @BlueSpork 25 days ago +1

      Awesome! Not only easy to set up, but also open source 🙂 I’m making another video for a Linux setup and then one for a macOS setup.

  • @arnaudbatista9946 10 days ago +1

    Hello, thanks for the tutorial.
    Why, when I remove the ollama/ollama image from Docker Desktop, can I no longer access the Llama 3.2 Vision model, even though it is still stored on my machine? How does the ollama/ollama image bridge the connection between the locally installed model and Open WebUI?

    • @BlueSpork 10 days ago

      You’re welcome. You can install and run Ollama natively on your system or inside a Docker container. My tutorial demonstrates the setup where Ollama is installed natively on Windows and only Open WebUI runs in a Docker container (see the command sketch after this thread). From your description, it seems you’re running Ollama in a Docker container. Is that correct?

    • @arnaudbatista9946 10 days ago

      @BlueSpork No, I followed the tutorial exactly like you did. The only container running in Docker Desktop is open-webui.
      But I have another problem: the model takes a long time to answer (I used llama3.2-vision). For example, I just asked it for a one-question quiz about AWS services. I noticed that the CPU usage of the open-webui container is very low (it went up to 85% when I started it, then 0.3% while it's running). Do you think that's normal?

    • @BlueSpork 10 days ago

      How much GPU VRAM do you have, and how much RAM do you have on your computer?

    • @arnaudbatista9946 10 days ago

      @BlueSpork RAM: 16 GB
      GPU VRAM: 7.9 GB

    • @BlueSpork 10 days ago

      What GPU do you have?
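
For reference, the setup the video describes (Ollama installed natively on Windows, with only Open WebUI in a Docker container) comes down to a couple of commands. This is a minimal sketch following the Open WebUI README, not a transcript of the video; the container name, port mapping, and volume name below are the README defaults:

    # pull a model with the native Windows Ollama install
    ollama pull llama3.2-vision

    # run Open WebUI in Docker; host.docker.internal lets the container
    # reach the Ollama server running natively on the Windows host
    docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

On the performance question in this thread: near-zero CPU in the open-webui container is expected, because inference runs in the host's Ollama process, not in the container. The slow answers are more likely a memory issue; the 11B llama3.2-vision model sits right at the limit of an 8 GB card, and whatever doesn't fit in VRAM spills over to CPU and system RAM. Ollama's own CLI shows where a running model is placed:

    ollama ps    # the PROCESSOR column shows e.g. "100% GPU" or a CPU/GPU split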

  • @qualityposts2011 1 month ago +1

    Many thanks for this concise video.

    • @BlueSpork 1 month ago +1

      Glad it was helpful :)

  • @sangramsinghshekhawat5053 1 month ago +1

    You are a great help, dude. Many thanks for the video.

    • @BlueSpork 1 month ago

      You are welcome, my friend.

  • @sidarth404 1 month ago +1

    Thank you very much for going straight to the steps.

  • @burakinan7518 1 month ago +1

    Thank you for this video. Is it possible to fine-tune Ollama on Windows? Could you please make a video about that? And why do developers not recommend using Ollama on Windows? Thank you again.

    • @BlueSpork 1 month ago +2

      You’re welcome! Are you asking about fine-tuning Ollama or models?

    • @burakinan7518 1 month ago +1

      @BlueSpork I am asking about fine-tuning models. I am using Ollama with Open WebUI.

    • @BlueSpork 1 month ago

      I am considering fine-tuning LLMs, but it is time-consuming and currently not necessary, as the LLMs I use provide everything I need.

  • @hydrogpt 1 day ago

    Hi, when I open localhost it says the page isn't working. Any ideas what the problem is?

    • @BlueSpork 1 day ago

      Is your Docker container running, and what is your full localhost address?
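
A few quick checks for this situation, assuming the container was created with the usual name open-webui and the default port mapping (these are standard Docker commands, not steps quoted from the video):

    docker ps                  # is the open-webui container listed and "Up"?
    docker start open-webui    # start it if it isn't running
    docker logs open-webui     # look for errors during startup

With the common -p 3000:8080 mapping, the UI is served at http://localhost:3000; opening plain localhost without the port points at port 80 and will show "page isn't working" even when the container is healthy.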

  • @SavetheRepublic 9 days ago +1

    PERFECT! Sub and a thumbs up!

    • @BlueSpork 9 days ago

      Thanks for the sub and the thumbs up!

  • @digital4people 11 days ago +1

    From RU, with thanks.