Now You Can Easily Host Your Ollama Using Salad Cloud for Just $0.30

  • Published: 23 Sep 2024
  • In this video, let's host your Ollama models in the cloud.
    Create a chatbot for just $0.30 using Ollama, Salad Cloud, and Open WebUI.
    Let me take you through it step by step.
    Let's do this!
    Join the AI Revolution!
    #SALAD #SALADGPU #customollama #custommodels #noushermes #functioncalling #jsonstructuredoutput #AGI #openai #autogen #windows #ollama #ai #llm_selector #auto_llm_selector #localllms #github #streamlit #langchain #webui #python #llm #largelanguagemodels
    CHANNEL LINKS:
    🕵️‍♀️ Join my Patreon: / promptengineer975
    ☕ Buy me a coffee: ko-fi.com/prom...
    📞 Book a call with me ($125) on Calendly: calendly.com/p...
    ❤️ Subscribe: / @promptengineer48
    💀 GitHub Profile: github.com/Pro...
    🔖 Twitter Profile: / prompt48
    TIME STAMPS:
    0:00 Intro
    🎁Subscribe to my channel: / @promptengineer48
    If you have any questions, comments or suggestions, feel free to comment below.
    🔔 Don't forget to hit the bell icon to stay updated on our latest innovations and exciting developments in the world of AI!

Comments • 12

  • @kaviarasana7584 2 months ago +1

    I can't find the Deployment URL as illustrated. Where do I find it?
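
    (A quick way to check: once the container deployment is running with the gateway enabled, the Salad portal shows an access domain name, and that domain is the deployment URL. A minimal sketch for verifying it, assuming a placeholder URL and a standard Ollama server, which answers its root path with "Ollama is running":)

        import requests

        # Placeholder: replace with the access domain name shown for your
        # Salad container deployment.
        BASE_URL = "https://your-deployment.salad.cloud"

        # A running Ollama server replies to its root path with "Ollama is running".
        resp = requests.get(BASE_URL, timeout=10)
        print(resp.status_code, resp.text)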

  • @AClotheswoman 3 months ago +1

    Nice video. One question: do you know how I can set Ollama to allow multiple requests?

    • @PromptEngineer48 3 months ago

      Yes, check out this video:
      ruclips.net/video/8r_8CZqt5yk/видео.htmlsi=TDCcO0gksibb57P_

    • @AClotheswoman 3 months ago

      @PromptEngineer48 Yes, that works. But I don't know how I can set this on Salad.
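
      (Ollama reads OLLAMA_NUM_PARALLEL from its environment to serve multiple requests per loaded model, so on Salad it would go in as an environment variable on the container deployment, assuming the deployment form exposes environment variables. A minimal client-side sketch to verify concurrency, with a placeholder URL and model name:)

          import concurrent.futures
          import requests

          # Placeholder endpoint; substitute your deployment's access domain name.
          URL = "https://your-deployment.salad.cloud/api/generate"

          def ask(prompt):
              # stream=False returns one JSON object instead of a chunk stream.
              r = requests.post(URL, json={
                  "model": "llama3",  # assumption: whichever model you pulled
                  "prompt": prompt,
                  "stream": False,
              }, timeout=120)
              return r.json().get("response", "")

          # Fire three prompts at once; with OLLAMA_NUM_PARALLEL >= 3 they
          # are served concurrently rather than queued.
          prompts = ["Why is the sky blue?", "What is 2 + 2?", "Name a prime."]
          with concurrent.futures.ThreadPoolExecutor(max_workers=3) as pool:
              for answer in pool.map(ask, prompts):
                  print(answer[:80])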

  • @mulkproject687 4 months ago

    Bro, follow-up question. Let's say it's deployed and running. If I'm not using the app, will it still charge me the per-hour rate? And second question: is the token response unlimited? No limit?

    • @PromptEngineer48 4 months ago

      Hi, yes, this will keep charging you the per-hour rate. If you want to be charged only when you actually use it, you need to explore serverless architecture; look into RunPod serverless.
      The response tokens are limited by the LLM, not unlimited. Which LLM are you using?

  • @YashDesai95 5 months ago +1

    Best video

  • @josephj6802 2 months ago +1

    Just $0.394 ... per hour 😏 = $283.68 a month?
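
    (For reference: $0.394/hour × 24 hours × 30 days = $283.68 a month for a container left running continuously; the hourly rate only accrues while the container is running.)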