Crack Ollama Environment Variables with Ease - Part of the Ollama Course

  • Published: 15 Nov 2024

Comments • 24

  • @DodiInkoTariah
    a month ago +6

    Thanks Matt for your videos. Could you do one with instructions for installing Llama 3.2 11B? It would be very helpful for many people, but no pressure.

    • @sadboi5672
      a month ago

      There's no 11B model for 3.2, is there? 3.2 only has 1B and 3B variants.

    • @technovangelist
      a month ago

      When it works I will. But there isn’t anything special with it.

    • @ShinyBlueBoots
      19 days ago

      @technovangelist Is there an environment variable for a locale/language setting?
      For example, I want my AI responses to come back in UK English.

  • @christopherditto
    a month ago

    Thank you for this video! Question: how do I include environment variables in Ollama responses to prompts? For example, is there something I can add to my modelfile to append the number of tokens used and the general.name environment variable (e.g. "Llama 3.2 3B Instruct") to the end of each response?

  • @emil8367
    a month ago

    Thanks Matt! Is there a list of all env variables, with a description for each, in the Ollama docs?

  • @jimlynch9390
    a month ago +2

    I think OLLAMA_HOST needs a bit more explanation. On the server, the variable that lets you use Ollama from another system looks like this: Environment=OLLAMA_HOST=0.0.0.0. Then you can access the Ollama server by setting a local environment variable, e.g. "export OLLAMA_HOST=192.168.2.41:11434" if the server is on 192.168.2.41. Without the 0.0.0.0, the server system will reject any attempt to connect to port 11434.
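Put together, the commenter's setup looks roughly like this. The IP address is the commenter's example; the systemd override mechanism is an assumption about where Environment=OLLAMA_HOST=0.0.0.0 would typically live on a Linux server.

```shell
# --- On the server (Linux, systemd) ---
# Make Ollama listen on all interfaces, not just localhost.
# This typically goes in a systemd override (e.g. via `systemctl edit ollama`):
#   [Service]
#   Environment=OLLAMA_HOST=0.0.0.0
# then: sudo systemctl daemon-reload && sudo systemctl restart ollama

# --- On the client ---
# Point the local ollama CLI at the remote server (IP from the comment above):
export OLLAMA_HOST=192.168.2.41:11434
# Now commands like `ollama list` talk to the remote server instead of localhost.
```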

  • @emmanuelgoldstein3682
    a month ago +1

    I'm subscribed with all notifications turned on but I didn't get this one for some reason... ☹

  • @ShinyBlueBoots
    19 days ago

    Is there an environment variable for a locale/language setting?
    For example, I want my AI responses to come back in UK English.

    • @technovangelist
      19 days ago

      No. If anything, that would be part of the prompt.

    • @ShinyBlueBoots
      19 days ago

      @technovangelist Thanks Matt!
      That's what I do now... it works for a short period of time, then it forgets :-)

  • @MaxJM74
    a month ago +1

    tks 👍

  • @M24Tom
    a month ago

    What about Ollama running in a Docker container?

    • @technovangelist
      a month ago +1

      What about it? That's the easy one. Just add them to the docker command.
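A minimal sketch of what "add them to the docker command" looks like with `-e` flags; the variable values, volume name, and container name here are illustrative, not from the video.

```shell
# Pass Ollama environment variables to the container with -e flags.
docker run -d \
  -e OLLAMA_HOST=0.0.0.0 \
  -e OLLAMA_KEEP_ALIVE=10m \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama
```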

  • @alx8439
    a month ago

    After feature request 4361, the Ollama team added all the previously missing configuration options to the output of `ollama serve -h`.

  • @alexlee1711
    a month ago

    I run Ollama on macOS, but it only uses the CPU. In "System Monitoring", the GPU sits at 0%. Environment: macOS 14 + Radeon RX 570 (Metal is supported) and AMD Radeon Pro VII (Metal 3 is supported).

    • @technovangelist
      a month ago +1

      GPU on Mac is only supported on Apple Silicon Macs, unfortunately. Since those Intel machines are getting older every day, I don't see that changing.

    • @alexlee1711
      a month ago

      @technovangelist Thank you for your guidance. It seems that I have to use Ubuntu or Windows.

    • @technovangelist
      a month ago

      But even if you do install Ubuntu or Windows on that machine, the GPU isn't supported. I think your best bet is a newer Mac.

  • @marcusk7855
    a month ago

    I just tried to change the temp directory on Linux yesterday. It does not work.

  • @toadlguy
    a month ago +1

    Ha, ha, ha. We understand how YouTube works. We either pay for YT Premium, or we watch ads and you get paid based on views. You don't need to announce it is FREE at the beginning of your video. (Thanks for the content, though 😊)

    • @technovangelist
      a month ago +3

      If that were true I wouldn't be asked so often whether it will stay free. Lots of people put a teaser on YouTube, then move the rest to a paid platform.