Ollama on Windows | Run LLMs locally 🔥

  • Published: 5 Jan 2025

Comments • 36

  • @juniorsilva3473
    @juniorsilva3473 9 months ago +1

    Great video 🔥

  • @Justme-dk7vm
    @Justme-dk7vm 7 months ago +2

    I want to create a similar thing using Ollama, but I want my model to answer with complete accuracy from the documents/PDFs I provide locally. Can you please explain how I can implement it, or make a video on it soon?

  • @JaxCavalera
    @JaxCavalera 9 months ago +2

    How are you getting the CPU and MEM usage etc. to appear in the terminal? Is that not Windows Terminal?

    • @j_owatson
      @j_owatson 4 months ago

      The CPU isn't utilised unless you don't have a GPU (or a compatible one) installed. That being said, running it on my RTX 3050 seems to be doing alright. You could easily get away with an RTX 3050 and 16 GB of RAM.

  • @PUBG-e9n
    @PUBG-e9n 9 days ago +1

    How can I connect it with the WhatsApp API?

  • @serdar_1978
    @serdar_1978 1 month ago

    I don't fully understand the models there. Are they large language models? Which language model is phi, and what is it used for? Is OpenAI used to run the language models behind applications such as ChatGPT on my computer? If so, an application such as ChatGPT tells me to get a Plus membership once I hit a limit after a certain period. I think the raw form of these language models is something we could call a kind of database? I don't know whether those limitations will apply here. In short, where can I look up what each of these language models is used for? And please correct me if my explanation above is wrong. Thank you.

  • @souravjha2146
    @souravjha2146 1 month ago

    loved it man, thanksssss

  • @bamh1re318
    @bamh1re318 7 months ago +2

    Thanks for this easy-to-follow tutorial!

  • @AliAlias
    @AliAlias 10 months ago +5

    Getting "Error: could not connect to ollama app, is it running?" on Windows 10.

    • @jansshrimps3785
      @jansshrimps3785 8 months ago

      I'm having exactly the same issue.

    • @sarbajitsarkar9611
      @sarbajitsarkar9611 6 months ago +1

      You can use ollama serve and then ollama run phi. (See the sketch after this thread.)

    • @Pennytechnews
      @Pennytechnews 3 months ago

      @@sarbajitsarkar9611 same error
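
    A minimal sketch of the fix above, in Python: start the server with ollama serve in one terminal, then send a request. It assumes Ollama's default local endpoint (http://localhost:11434), the requests package, and that the phi model has already been pulled with ollama pull phi.

      # check_ollama.py - a hedged sketch, not from the video
      import requests

      try:
          # The root endpoint replies "Ollama is running" when the app is up.
          requests.get("http://localhost:11434", timeout=2).raise_for_status()
      except requests.exceptions.ConnectionError:
          raise SystemExit("Can't reach Ollama - start it first with: ollama serve")

      # With the server up, prompt the phi model over the local API.
      resp = requests.post(
          "http://localhost:11434/api/generate",
          json={"model": "phi", "prompt": "Why is the sky blue?", "stream": False},
      )
      print(resp.json()["response"])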

  • @tdlab11
    @tdlab11 7 months ago +1

    How did you get your cmd prompt to show CPU and memory usage?

  • @DrMacabre
    @DrMacabre 4 months ago

    Hello, any idea how to set keep_alive when running the Windows exe?

    • @rishabincloud
      @rishabincloud  4 months ago +2

      Hey, if you look at the FAQ doc, there are instructions on how to do that (see the sketch after this thread):
      github.com/ollama/ollama/blob/main/docs/faq.md

    • @DrMacabre
      @DrMacabre 4 months ago +1

      @@rishabincloud thank you so much, right under my nose. :)

    • @rishabincloud
      @rishabincloud  4 months ago +1

      @@DrMacabre no worries :)
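
    For anyone landing here: per the FAQ linked above, keep_alive can be set per request on the local API, or globally via the OLLAMA_KEEP_ALIVE environment variable before launching the exe. A sketch in Python, assuming the default endpoint and the phi model:

      # keep_alive values per the Ollama FAQ: a duration string like "10m",
      # 0 to unload immediately, or -1 to keep the model in memory indefinitely.
      import requests

      resp = requests.post(
          "http://localhost:11434/api/generate",
          json={
              "model": "phi",
              "prompt": "hello",
              "stream": False,
              "keep_alive": -1,  # keep phi loaded until the server stops
          },
      )
      print(resp.json()["response"])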

  • @mrgamer3275
    @mrgamer3275 10 months ago

    Hello Rishab, I am studying Computer Systems Technician at St. Lawrence College, and I want to take some courses that would set me apart from others, but I am very confused about where and what course I should take.

  • @hendoitechnologies
    @hendoitechnologies 5 months ago

    Can you post a full course video about the Anthropic AI model Claude 3.5 Sonnet: the web version, the API, fine-tuning, use cases, and applications... a complete Claude 3.5 Sonnet course for developers?

  • @tusharkulange4029
    @tusharkulange4029 10 months ago +1

    Hey brother,
    I followed the same steps, but it is showing "module is not callable".

    • @jatinchawla1680
      @jatinchawla1680 9 months ago +1

      Hey, did you find a fix?

    • @tusharkulange4029
      @tusharkulange4029 9 months ago

      Yes, I fixed it.

    • @jatinchawla1680
      @jatinchawla1680 9 months ago

      @@tusharkulange4029 How did you fix it? Can you please help?

    • @jatinchawla1680
      @jatinchawla1680 9 months ago

      @@tusharkulange4029 how bro??

    • @Rhaevyn-Hart
      @Rhaevyn-Hart 8 months ago +1

      @@jatinchawla1680 Don't you hate that? "Yes, I fixed it." Great, mind sharing? Why, people. Just... why?
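
    Since the fix was never shared, here is a guess at a common cause, not a confirmed fix: with the ollama Python package, import ollama gives you a module, and calling the module itself raises "TypeError: 'module' object is not callable". A sketch assuming pip install ollama and a running server:

      import ollama

      # Wrong - 'ollama' is a module, so this raises
      # TypeError: 'module' object is not callable:
      #   reply = ollama("why is the sky blue?")

      # Right - call a function the module exposes:
      response = ollama.chat(
          model="phi",
          messages=[{"role": "user", "content": "Why is the sky blue?"}],
      )
      print(response["message"]["content"])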

  • @Ajmal_Yazdani
    @Ajmal_Yazdani 10 months ago

    Hi Rishab, this is so cool, thanks for sharing. What are the specs of your Windows machine? And second, how did you make your terminal so beautiful?

    • @rishabincloud
      @rishabincloud  10 months ago +1

      Thank you.
      Here is my current setup - ruclips.net/video/6mZTVpFeyrs/видео.html
      And here is a blog post on how I set up my terminal - blog.rishabkumar.com/beautify-your-terminal-wsl2

  • @bibletruth7014
    @bibletruth7014 2 months ago

    Thanks a lot for the simple info, it worked great on one hand... On the other hand, get this: I was talking to phi and it seemed to have a mind of its own, so I asked its name and it told me it was ChatGPT-3 from OpenAI. I said, what? It turns out it knows nothing about the phi LLM, and then it went on to tell me Mars is in another solar system revolving around a red dwarf, lol.

  • @Grandepau788
    @Grandepau788 8 months ago

    Where does Ollama store models?

    • @rishabincloud
      @rishabincloud  8 months ago +4

      They are stored locally on your machine; for Windows: "C:\Users\Username\.ollama" (see the sketch after this thread).

    • @Grandepau788
      @Grandepau788 8 months ago

      @@rishabincloud cool! thanks bro :D liked n subscribed
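
    A small Python sketch to see what's in that folder yourself. The path follows the reply above; the models subdirectory layout (manifests and blobs) is an assumption based on current Ollama builds, so adjust if yours differs.

      # Lists everything Ollama has stored under the current user's .ollama folder.
      import os

      root = os.path.join(os.path.expanduser("~"), ".ollama", "models")
      for dirpath, _dirs, files in os.walk(root):
          for name in files:
              path = os.path.join(dirpath, name)
              print(f"{os.path.getsize(path) / 1e6:10.1f} MB  {path}")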

  • @BlindArcher0000
    @BlindArcher0000 7 months ago +4

    This crap installs itself on C: (the system volume) with no prompt or option to change the destination.
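
    If it helps: Ollama's Windows docs describe passing an install location to the installer from a terminal (and an OLLAMA_MODELS environment variable for relocating the model folder). The exact destination below is just an example:

      OllamaSetup.exe /DIR="D:\Programs\Ollama"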