How to Install Ollama, Docker, and Open WebUI on Linux (Ubuntu)

  • Published: 3 Feb 2025
  • Science

Comments • 72

  • @TheGlull
    @TheGlull 4 days ago +4

    This is the best video tutorial I have found. The fact that you provide information if something doesn't work is awesome.

  • @ginescap
    @ginescap 21 days ago +10

    Perfect, not a single word is left out, thank you.

    • @BlueSpork
      @BlueSpork  21 days ago +1

      Excellent! Thanks!

  • @andresgo1321
    @andresgo1321 1 month ago +3

    I'd been searching for about two days for a fix for the "models not found" issue, and this worked. Thank you, man!

    • @BlueSpork
      @BlueSpork  1 month ago

      Great! Good timing :)

  • @stuntmotorsports
    @stuntmotorsports 1 month ago +2

    This video was so informative, straightforward, and thorough! Working perfectly on Zorin 17. You definitely got a sub from me.

    • @BlueSpork
      @BlueSpork  1 month ago +1

      Great! I appreciate you taking the time to make this comment.

  • @chichudox
    @chichudox 2 days ago +1

    Brief and clear, excellent tutorial. Can we have a manual installation video?

    • @BlueSpork
      @BlueSpork  1 day ago

      What exactly do you mean by "manual installation"?

  • @thedygiyanto4479
    @thedygiyanto4479 6 hours ago +1

    simple and very helpful tutorial 👍👍👍

  • @jaygat5366
    @jaygat5366 1 month ago +1

    Great video, resolved a problem I'd been facing for days in a few minutes. Thank you.

    • @BlueSpork
      @BlueSpork  1 month ago

      Yeah, it took me a while to figure it out too. I'm glad it helped.

  • @nabildanial00
    @nabildanial00 4 days ago +1

    Thanks for the tutorial, followed everything and it works perfectly. I had bad performance issues on Windows; on Linux it is blazingly fast.

  • @JoaoPauloDev
    @JoaoPauloDev 1 day ago +1

    Well done my friend thank you so much!!!

  • @Fadeout_Alex
    @Fadeout_Alex 4 days ago +1

    Thank you so much! It works perfectly on Pop!_OS.
    I can run DeepSeek-R1 locally on my PC, which doesn't even have a GPU. That's just amazing.
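
    For anyone who wants to try the same thing, a minimal sketch, assuming Ollama is already installed; the 1.5b tag is just an example of a small DeepSeek-R1 variant, so pick a size your RAM can handle:

        # Pull a small DeepSeek-R1 model and chat with it entirely on the CPU
        ollama pull deepseek-r1:1.5b
        ollama run deepseek-r1:1.5b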

  • @PlaidStone
    @PlaidStone 20 days ago +2

    You're direct and to the point. Thanks!

  • @thomaskogler
    @thomaskogler 29 days ago +1

    Awesome quick guide, thanks a lot!!

  • @mwester8
    @mwester8 14 days ago +1

    Thank you, just completed it

  • @filipf1421
    @filipf1421 1 month ago +1

    Great work, finally found a working tutorial! :D

    • @BlueSpork
      @BlueSpork  1 month ago

      Thanks for the feedback! Glad the tutorial helped. All my tutorials are working tutorials :)

  • @RegisOssa
    @RegisOssa 1 month ago +1

    Thanks, I tried it and it works on Debian 12.8.

    • @BlueSpork
      @BlueSpork  1 month ago

      That is good to know. Thanks!

  • @DryUrEyesMate
    @DryUrEyesMate 1 day ago +1

    For some reason I still can't see the model in the web UI, even after following your instructions?

  • @young-tw
    @young-tw 11 days ago +1

    Thanks! This video helps a lot

  • @JimShingler
    @JimShingler 17 days ago +1

    Spot On, ... Great Job

  • @BigOak1669
    @BigOak1669 9 days ago +1

    Wow! Thank you so much for this clear and easy-to-follow tutorial. Everything worked exactly as you described, and I also appreciate you including the extra commands in the description. This is my first time messing with local LLMs, and I'm really excited (although I might melt this old laptop).
    You mention configuring Ollama to listen more broadly: would that enable me to access the machine running Ollama/Docker via the web UI from a different computer on my network? Ideally, I'd like to be able to share the local LLM with others on my home network, setting up a profile for them. I hope that makes sense.
    Thank you again!

    • @BlueSpork
      @BlueSpork  9 days ago

      Thank you for the comment, I really appreciate it :)
      If you set the OLLAMA_HOST environment variable to 0.0.0.0, that should open it to all traffic, but there might be some security concerns if you do this.
      You might find this Reddit post interesting:
      www.reddit.com/r/ollama/comments/1guwg0w/your_ollama_servers_are_so_open_even_my_grandma/

    • @BigOak1669
      @BigOak1669 9 days ago +1

      @BlueSpork Thank you for the reply. I was just coming back to update this, as I've figured some of these things out. I set a static local IP on this old laptop via my router. After that, I could access the web UI via ip:3000 from any browser on any local computer. I was also able to add users via the admin panel settings in Open WebUI. The one trick there was that the models were set to private by default, so I had to make them public. Adding models can also be done through the web UI.
      The AI models are not fast, and the ~3B versions this old laptop can handle are pretty crappy, but this is so cool and I'm really excited to explore and tinker. Thank you again!!
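
    For anyone else who wants to reach the web UI from other machines on the home network, as discussed in the thread above, a rough sketch; the port 3000 mapping and the container name "open-webui" follow the common Open WebUI docker run examples and may differ on your system:

        # On the machine running Docker: confirm the Open WebUI container
        # publishes port 3000 on all interfaces (0.0.0.0:3000->8080/tcp)
        docker ps --filter name=open-webui

        # If the ufw firewall is enabled, allow the published port
        sudo ufw allow 3000/tcp

        # Find this machine's LAN address, then browse to http://<that-ip>:3000
        # from any other computer on the network
        ip -4 addr show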

  • @GenAiWarrior
    @GenAiWarrior 19 days ago +1

    Thank you, brother!!

  • @shimmeringreflection
    @shimmeringreflection 3 days ago

    Thanks. What are the reasons for running it in a Docker container? Why not just run it in a terminal?

  • @jasonmehlhoff8877
    @jasonmehlhoff8877 6 days ago +2

    Hello and thank you. After changing the Environment line and running sudo systemctl daemon-reload and sudo systemctl restart ollama, and then doing an ollama list, I am getting "Error: could not connect to ollama app, is it running?" Then when I run ollama serve I get a really long list of errors in debug mode.
    Is this OK? I do see my model in Open WebUI so it does appear to be working.
    Thanks!!
    Jason Mehlhoff

    • @BlueSpork
      @BlueSpork  6 days ago

      If Open WebUI is working and showing your model, then Ollama is likely running correctly. The error when running ollama list might be due to permission issues or an environment configuration problem.

    • @jasonmehlhoff8877
      @jasonmehlhoff8877 6 days ago +1

      @BlueSpork Cool! Thanks for getting back so quickly. Yes, otherwise it seems to be working very nicely. Have a good evening.
      Jason

    • @BlueSpork
      @BlueSpork  6 days ago

      No problem. You too

    • @soulsborneenjoyer
      @soulsborneenjoyer 6 days ago

      @BlueSpork Unfortunately I also get the error mentioned above... and I don't know how to solve this problem when I try to add a new model... Do you have any idea?

    • @soulsborneenjoyer
      @soulsborneenjoyer 6 days ago +2

      I think I solved the problem when I changed the host to 0.0.0.0, reloaded the system configuration, and restarted Ollama.
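
    A minimal sketch for checking the same thing, assuming Ollama runs as a systemd service on its default port 11434:

        # Is the systemd service running?
        systemctl status ollama

        # Is the API answering locally?
        curl http://127.0.0.1:11434/api/version

        # If the CLI still can't connect, point it at the server explicitly
        OLLAMA_HOST=127.0.0.1:11434 ollama list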

  • @danielltda
    @danielltda 15 days ago +1

    At 2:41, regarding the "permission denied" issue: if it continues to occur even after you log out and log back in, try rebooting the entire OS. That is how I solved this issue on my computer.

    • @BlueSpork
      @BlueSpork  15 days ago

      Thanks for the info
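
    If the "permission denied" in question is Docker's usual "permission denied while trying to connect to the Docker daemon socket" (an assumption; the timestamp refers to the video), the standard fix is adding your user to the docker group, roughly:

        # Add the current user to the docker group
        sudo usermod -aG docker $USER

        # Apply the new group in the current shell
        # (or log out and back in, or reboot as suggested above)
        newgrp docker

        # Verify that docker now works without sudo
        docker ps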

  • @user-zk3px2fk1
    @user-zk3px2fk1 5 days ago

    Hello, am I the only one that gets a black screen after logging in to Open WebUI, where nothing happens? I have tried reinstalling it all but nothing changed. It seems that Ollama can't communicate with Open WebUI. Not sure how to fix it. Any idea?
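
    A minimal troubleshooting sketch for the "Ollama can't communicate with Open WebUI" case, assuming the container is named open-webui and Ollama runs on the host with default settings:

        # Check the Open WebUI container logs for connection errors
        docker logs open-webui --tail 50

        # Confirm Ollama itself is reachable from the host
        curl http://127.0.0.1:11434/api/version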

  • @rrkumar78
    @rrkumar78 25 days ago +1

    Hi, one issue I'm still having is that Ollama won't use my GPU. I'm on Ubuntu 24.04 and I'm using the Llama 3.2 8B model.

    • @BlueSpork
      @BlueSpork  25 days ago

      What is your GPU?

    • @rrkumar78
      @rrkumar78 25 days ago

      I have an RTX 4070 Ti Super.

    • @BlueSpork
      @BlueSpork  25 days ago

      How do you know Ollama is not using your GPU?

    • @rrkumar78
      @rrkumar78 25 days ago

      I'm using nvtop and all my commands are going through the CPU.

    • @BlueSpork
      @BlueSpork  25 days ago

      There could be many different reasons for this issue. Troubleshooting through YouTube comments can be challenging and time-consuming. I recommend researching your specific problem (with your configuration) online to find more targeted solutions.
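
    For readers hitting the same thing, a rough sketch of how to confirm whether Ollama is using an NVIDIA GPU, assuming the NVIDIA driver is installed and a model tag such as llama3.2 (an example) is available locally:

        # Is the driver visible to the system?
        nvidia-smi

        # Load a model, then ask Ollama what it is running on;
        # the PROCESSOR column shows e.g. "100% GPU" or "100% CPU"
        ollama run llama3.2 "hello" >/dev/null
        ollama ps

        # Ollama's startup log also reports any GPUs it detected
        journalctl -u ollama --no-pager | grep -i gpu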

  • @florentmont7261
    @florentmont7261 1 month ago +1

    Nice video. Would it be possible to produce one for lobe-chat as well?

    • @BlueSpork
      @BlueSpork  1 month ago

      Thank you. I'm curious why you're interested in lobe-chat.

    • @florentmont7261
      @florentmont7261 28 days ago

      @BlueSpork Because it's a bullet too 😍

  • @muhammadangga8837
    @muhammadangga8837 26 days ago +1

    Awesome tutorial, very informative. Thanks a lot!!
    How do I run it on Intel(R) Iris(R) Xe Graphics?

    • @BlueSpork
      @BlueSpork  26 days ago

      Intel Iris Xe Graphics can't handle large language models well. If it does work, it'll be really slow. A dedicated GPU would do a much better job.

    • @muhammadangga8837
      @muhammadangga8837 25 days ago +1

      @BlueSpork OK, thank you sir.

  • @adzibilal8526
    @adzibilal8526 20 days ago +1

    Please create a tutorial for Ollama + Open WebUI, but on WSL.

    • @BlueSpork
      @BlueSpork  20 days ago

      Good idea! I’ll look into it after finishing the video I’m currently working on.

    • @DryUrEyesMate
      @DryUrEyesMate 1 day ago

      I just did mine in WSL and the only issue I have is the model not showing in Open WebUI.

  • @mclois79
    @mclois79 1 month ago +1

    Thanks for the tutorial, and merry Christmas!
    Not sure why, but I've been installing open-webui:main for the last week and it gets flagged as unhealthy :D, while open-webui:v0.3.35 works.

    • @BlueSpork
      @BlueSpork  1 month ago

      You are welcome. I’m not sure why a different WebUI version works. I haven’t seen that problem before.

    • @mclois79
      @mclois79 1 month ago

      @BlueSpork No worries. The funny thing is that the first time I installed it, it worked straight away... I did a fresh Pop!_OS install and since then it has failed :D ... probably some stuff that I need to tweak... will let you know when I figure this out.

    • @BlueSpork
      @BlueSpork  1 month ago

      Hmm… yeah, let me know. I’m curious
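
    To see why Docker flags a container as unhealthy, a short sketch (the container name open-webui is assumed):

        # Show the health status and the output of the failing health checks
        docker inspect --format '{{json .State.Health}}' open-webui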

  • @justzlagi
    @justzlagi 29 days ago

    It's not finding any models for me.

    • @BlueSpork
      @BlueSpork  29 days ago

      Did you create a new "OLLAMA_HOST" Environment variable in the Ollama service file?

    • @justzlagi
      @justzlagi 29 days ago +1

      @BlueSpork I fixed it, thank you!
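
    For anyone landing on the same "no models found" issue, a minimal sketch of the environment-variable fix discussed in this thread, assuming Ollama was installed as a systemd service; a drop-in override is one way to set it (editing /etc/systemd/system/ollama.service directly also works):

        # Create a systemd override for the Ollama service
        sudo systemctl edit ollama
        # ...and add these lines in the editor that opens:
        #   [Service]
        #   Environment="OLLAMA_HOST=0.0.0.0"

        # Reload the configuration and restart Ollama
        sudo systemctl daemon-reload
        sudo systemctl restart ollama

        # The local models should now be listed and visible to Open WebUI
        ollama list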