Open WebUI has moved on a lot since this video, with functionality that can boost productivity much more. It's time you did a new video on the new features, like functions, prompts, RAG, etc.
It's good, but as someone new to this, we are missing how to use this chatbot on a website. A live demonstration of linking the chatbot with a website (WordPress, etc.) would be helpful.
Hello sir, how does your membership work? Is there a group or something? The YouTube membership tab shows content from 3 months ago; please let me know how I can access your latest live-streamed projects.
Hi Krish, I have a laptop with an Nvidia RTX 3050 GPU, but Docker always gives me an error saying the GPU was not found. Can you show us how you set up your Docker environment to get the GPU/CUDA option?
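For the GPU-not-found question above: the usual cause is that the NVIDIA Container Toolkit is not installed on the host, so Docker cannot hand the GPU to the container. A minimal sketch of a verification command, assuming the public `nvidia/cuda` base image (the helper just builds and prints the command for copying into a terminal):

```python
def gpu_check_cmd(image: str = "nvidia/cuda:12.2.0-base-ubuntu22.04") -> list[str]:
    """Build a `docker run` command that prints GPU info from inside a container.

    `--gpus=all` exposes the host GPU to the container; Docker reports
    "could not select device driver" when the NVIDIA Container Toolkit
    is missing, which matches the error described above.
    """
    return ["docker", "run", "--rm", "--gpus=all", image, "nvidia-smi"]

# Print the command so it can be copied into a terminal and run manually.
print(" ".join(gpu_check_cmd()))
```

If that command prints the `nvidia-smi` table, the same `--gpus=all` flag on the Ollama container should make CUDA available.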
Though the language models are open source and free, I think we need to pay for a Docker Desktop subscription. Can you please confirm? Please also make a video on the minimum configuration required to run these models effectively locally.
There is no one who captures the changing GenAI domain like you, sir.
One of the best channels on YouTube.
I learn a lot from you, sir.
You are such a great teacher; please keep uploading videos.
Great video, I had not heard of it before. Got it working. I had been looking for something just like this.
Thanks, brother, but I would love it if you could cover models that allow API access, because not everyone out here is running $4,000 worth of equipment.
How can I integrate this with other applications, so that it can be exposed as an API?
When I open the Web UI along with my local LLM (Llama 2), I can see GPT-4 and GPT-4o in the model dropdown. Do you know what is causing this?
Is there any way to use FAISS with Pipelines in Open WebUI? Liked and subscribed.
My existing models do not appear like yours do. What did you do so that your existing models show up in the WebUI?
Krish, can you give a complete tutorial on distributed computing?
I did all the steps and localhost opens, but when I look for models in the list, it shows "no models found". How do I fix this?
I think you need to download the models using the command prompt first. Correct me if I am wrong.
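The reply above is right: Open WebUI only lists models that Ollama has already pulled, e.g. with `ollama pull llama2` in a terminal. The same pull can be triggered through Ollama's local REST API; a minimal sketch, assuming Ollama's default port 11434 (the request payload is printed here rather than sent):

```python
import json

OLLAMA_PULL_URL = "http://localhost:11434/api/pull"  # Ollama's default endpoint

def build_pull_request(model: str) -> dict:
    # Payload for Ollama's /api/pull endpoint; "stream": False asks the
    # server to reply with a single final status object instead of a stream.
    return {"name": model, "stream": False}

payload = build_pull_request("llama2")
# Equivalent to: curl http://localhost:11434/api/pull -d '{"name": "llama2", "stream": false}'
print(json.dumps(payload))
```

Once the pull finishes, refreshing the Open WebUI page should show the model in the dropdown.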
Do you have any idea how to use function calling with Open WebUI, please?
@Krish, I intend to use Ollama with VS Code so that it can help me code better. How can I achieve this using the Ollama Code Llama model and VS Code?
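One way to wire Ollama into an editor workflow is to call its local HTTP API directly from a script or extension. A minimal sketch, assuming Ollama is running on its default port 11434 and `codellama` has been pulled; `build_generate_request` and `extract_completion` are hypothetical helper names, not part of any existing extension:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_generate_request(prompt: str, model: str = "codellama") -> urllib.request.Request:
    # Payload shape for Ollama's /api/generate; stream=False makes the server
    # return the whole completion as one JSON object.
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )

def extract_completion(response_body: bytes) -> str:
    # With stream=False the full completion text is in the "response" field.
    return json.loads(response_body)["response"]

# Sending the request would be:
#   with urllib.request.urlopen(build_generate_request("...")) as r:
#       print(extract_completion(r.read()))
# Here we only demonstrate parsing a sample response body offline:
sample = b'{"model": "codellama", "response": "def reverse(s):\\n    return s[::-1]"}'
print(extract_completion(sample))
```

A VS Code task or a small extension could call such a helper and insert the returned text at the cursor; for a ready-made route, extensions that speak to local Ollama servers also exist.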
I was looking for the same kind of solution. Can we collaborate on this to make some kind of npm library or extension for VS Code?
This is nice. How can I pass through a Colab GPU, like I do for Ollama on the command line, so that the Open WebUI app can use the Colab GPU as well?
🙏👍
What configurations are required to run this?
I ran it on a MacBook M1 with 128 GB of storage and 8 GB of RAM, and it is working.
Is there any way to deploy these custom-trained models?
Hi, how do I download models on localhost? Like CodeGuru, Code Llama, etc.
What is the accuracy of this model?
😢 This shows an impact on ChatGPT.
Is it possible on Linux?
Why do you keep shaving now?
Brother, there are many other things to discuss... focus on your learning.
Sorry, sir, if I hurt you.
Hello, sir.