Private LLM Inference: One-Click Open WebUI Setup with Docker

  • Published: 4 Oct 2024
  • Discover how to seamlessly install Open WebUI, the user-friendly WebUI for LLMs, with just a single click using Docker. In this tutorial, I will guide you through the process of setting up and running LLMs locally, ensuring a private and secure environment without the need for an internet connection. Perfect for those who value privacy and want full control over their LLM inference capabilities. (The typical Docker command is sketched just below this description.)
    📌 Don't forget to Like, Comment, and Subscribe for more tutorials!
    Join our DISCORD server: / discord
    Join this channel to get access to perks:
    / @aianytime
    To further support the channel, you can contribute via the following methods:
    Bitcoin Address: 32zhmo5T9jvu8gJDGW3LTuKBM1KPMHoCsW
    UPI: sonu1000raw@ybl
    #ollama #chatgpt #ai
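
For reference, the one-command Docker setup covered in the video usually looks roughly like the sketch below. This is only a sketch: it assumes the official ghcr.io/open-webui/open-webui image, the default port mapping (host port 3000), and an Ollama instance already running on the host; the project's README has the current, authoritative command.

    # run Open WebUI in the background, persisting its data in a named volume
    docker run -d \
      -p 3000:8080 \
      --add-host=host.docker.internal:host-gateway \
      -v open-webui:/app/backend/data \
      --name open-webui \
      --restart always \
      ghcr.io/open-webui/open-webui:main

Once the container is up, the UI is served at http://localhost:3000 and reaches Ollama on the host via host.docker.internal; chats and settings live in the open-webui volume, so they survive container restarts.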

Comments • 11

  • @Seeker_of_sense  3 months ago +1

    Well explained, better than most other channels. Thank you.

    • @AIAnytime  2 months ago

      Glad you think so!

    • @viertekco  12 days ago

      🎉🎉cheers to that

  • @PrabhatKumar-nr6ys  3 months ago +2

    Please suggest the best way to run an LLM locally in production for free. Please suggest a few options that run entirely locally.

  • @deepfarkade1879  1 month ago

    Hi @AIAnytime, could you make a video on this: if we already have our own fine-tuned LLM and the app ready, how do we make it production-ready so users can actually use it?

  • @sneharoy3566  2 months ago

    Nicely explained

  • @latlov  2 months ago

    Also, please explain how to update it, since the GitHub repo is updated very often. (A sketch of the usual update steps follows below this comment.)
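
    A common way to update a Docker-based Open WebUI install is to pull the newer image and recreate the container; data survives as long as it lives in the named volume. A rough sketch, assuming the container was created as in the setup sketch near the top of the page (names and options may differ on your machine):

        # fetch the latest published image
        docker pull ghcr.io/open-webui/open-webui:main

        # stop and remove the old container (data stays in the open-webui volume)
        docker stop open-webui
        docker rm open-webui

        # recreate the container with the same options used originally
        docker run -d -p 3000:8080 -v open-webui:/app/backend/data \
          --name open-webui --restart always ghcr.io/open-webui/open-webui:main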

  • @inrevolt  2 months ago

    Hey, nice video, man! You explained it very well!
    You said you also notice a significant speed difference between the WebUI and the CLI. How bad is it for you? For me it is pretty terrible, actually; the CLI is at least 5x faster than the WebUI.
    Also, do you know if there is an option, like in ooba, to see tokens/s?
    Gj sir! Tysm for the vid!

  • @AltMarc  2 months ago

    Why I hate Docker:
    I just want to set WEBUI_AUTH to false in a .env file, like in the non-Docker version that runs on another computer.
    Two hours of searching and I still don't have a clue. (A sketch of how this is usually done follows below this comment.)
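
    With Docker, environment variables are normally passed to the container at docker run time with -e flags or an --env-file option, rather than read from a .env file inside the container. A rough sketch, assuming WEBUI_AUTH is the same variable the non-Docker install reads (as the comment suggests) and reusing the run options from the setup sketch above:

        # option 1: set the variable directly on the command line
        docker run -d -p 3000:8080 -e WEBUI_AUTH=false \
          -v open-webui:/app/backend/data --name open-webui \
          ghcr.io/open-webui/open-webui:main

        # option 2: keep settings in a file (one KEY=value per line, e.g. WEBUI_AUTH=false)
        # and load them all at once
        docker run -d -p 3000:8080 --env-file ./.env \
          -v open-webui:/app/backend/data --name open-webui \
          ghcr.io/open-webui/open-webui:main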

  • @Scienceproject9237  3 months ago +1

    First comment 🎉

  • @justinfye  2 months ago

    why no link?