You've done us a great favor by putting everything together step-by-step. It's often very confusing to get things working.
Thank you sooo much! I'm really glad you enjoyed my video ☺️
If you don't have Homebrew or don't know what it is, you have to download it first. HUGE help!
Can I run Ollama in a Docker container as well, instead of on the bare hardware?
Thanks for your service, great video!
Thanks 😊 And great question! Yes, you can run Ollama in Docker. You'll find more information in this Ollama blog post: ollama.com/blog/ollama-is-now-available-as-an-official-docker-image
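If you want to try it, here's a minimal sketch based on the commands from that blog post (CPU-only; the volume and container names, and llama3 as the model, are just examples):

```bash
# Pull and start the official Ollama image (CPU-only).
# The named volume keeps downloaded models across container restarts.
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Download and chat with a model inside the container.
docker exec -it ollama ollama run llama3
```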
Hi, can you offer any assistance with enabling voice in the Docker or Ollama WebUI?
Sorry, no. I haven't tried that out yet.
HTTP won't allow mic/cam access; browsers require HTTPS for that, which is the reason for the ngrok forwarding part in this video. What a mess
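For reference, the forwarding step boils down to this (assuming Open WebUI is published on port 3000, as in the video; adjust to your port):

```bash
# Tunnel the local Open WebUI port through ngrok.
# ngrok returns a public https:// URL, which satisfies the browser's
# secure-context requirement for microphone/camera access.
ngrok http 3000
```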
Thanks. Everything works. I did the last part with Docker. Don't use the Windows version; it fails when extracting the files.
I don't have a Mac. Will this work for me? I'm just trying to configure Ubiquiti equipment, and I was sent an email suggesting WebUI. I am lost.
Both Ollama and Docker work on Linux and Windows, so this setup should work on those operating systems as well.
Excellent, thank you. Would it be possible and reasonable to run this on Lightning AI (CPU), full time, and have permanent mobile access on the go while my desktop machine is sleeping?
Honestly, I'm not sure how it will behave when the studio is sleeping. I'll have to try that out. Also: from what I know, the lightning.ai free tier disconnects after 4 hours (even on CPU). But that is definitely something I will try and put in a future video ☺️ Thanks for the suggestion!
Hi, can you please help answer my query about multiple users: can this URL be used by multiple users simultaneously, or can only one user access it remotely at a time? Kindly reply.
As far as I know, a free Ngrok URL can be accessed by multiple users at the same time, but there are still limits on monthly usage. Just keep that in mind!
Great!
How do you set it up for a local LAN connection ONLY? Can you explain, please?
Of course. I made another video about setting up OpenWebUI locally: ruclips.net/video/0DFJc-oIRQ8/видео.html. Hope this helps ☺️
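The short version: skip the ngrok step and the setup stays on your network. A minimal sketch, assuming the Docker-based Open WebUI install (port 3000 is just the example from the video):

```bash
# Publish Open WebUI on the host's port 3000. Docker binds published
# ports to all interfaces by default, so other machines on the LAN
# can reach it at http://<host-ip>:3000.
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main

# To restrict it to the host machine only, bind to localhost instead:
#   -p 127.0.0.1:3000:8080
```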
So how powerful does my computer have to be?
I would recommend at least 16 GB RAM. However, in this demo, I'm running this on a MacBook Air M2 with 8 GB RAM, and if I don't run any other processes, it works, although very slowly. So 16 GB RAM or more should run much more smoothly.
This can be installed on Linux too?
I'm not a Linux user myself, so I have not tried it, but Ollama offers a Linux version: ollama.com/download/linux. And Open WebUI uses Docker (OS-independent) and mentions Linux, so I assume it should be possible to run it on Linux: github.com/open-webui/open-webui?tab=readme-ov-file#quick-start-with-docker-. Hope this helps ☺️
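For anyone who wants to try: according to the Ollama download page, the Linux install is a one-liner (I haven't tested this on Linux myself), and the Open WebUI Docker command is the same as on other platforms:

```bash
# Official Ollama install script for Linux (from ollama.com/download/linux).
curl -fsSL https://ollama.com/install.sh | sh

# Then pull and chat with a model (llama3 is just an example).
ollama run llama3
```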
thanks
And you can do it on Windows.
Any way to do this on Windows? I saw they have an install via Docker too, but I don't know how to add my local HTTP address.
Maybe one day the Linux community will wake up and start using GUIs instead of CLIs. For some reason, there is no package manager that actually has an effective GUI. It's always missing something or other, and you have to go back to the CLI 😅
☺️
Nah. I like the CLI. I just wish things gave you more control at times. For example, why does Open WebUI only work on localhost and not from another PC on the network?! SO DUMB! Why not listen on all interfaces, or where is the flag for that in the Docker setup?! THAT is what is annoying.
@thegreatcerebral My friend, that's been fixed. I've used it on all my machines and on mobile. Find out the IP of the computer/Docker host running WebUI and try the following ports: 8080, 8000, 3000.
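If it still isn't reachable from another PC, it's usually the Docker port mapping rather than Open WebUI itself. A quick check, assuming a Docker install:

```bash
# List running containers with their published ports. A mapping like
# 0.0.0.0:3000->8080/tcp is reachable from other machines at
# http://<host-ip>:3000; one starting with 127.0.0.1 is localhost-only.
docker ps --format "table {{.Names}}\t{{.Ports}}"
```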
Ngrok isn't free
They have a free tier. What I show in the video uses the free tier
@TechXplainator Yes, I heard it is limited, and once you've used up the bandwidth, it's time to pay.
You're right, the free tier is limited to 1GB and 20,000 requests, which is often sufficient. You can monitor usage on their site, and no payment info is needed to use it.
@TechXplainator Thanks. Does it renew every month, or is it a one-time thing?
It resets every month ☺️
I wouldn't trust a service like this. Honestly... Am I alone?
Why?