Thanks, Matt, for your videos. Are you able to do one with instructions for installing Llama 3.2 11B? It would be very helpful for many people, but no pressure.
There's no 11B model for 3.2, is there? 3.2 only has 1B and 3B variants.
When it works, I will. But there isn't anything special about it.
@technovangelist Is there an EV for a locale/language setting?
For example, I want my AI responses to come back using the UK English dictionary.
Thank you for this video! Question: how do I include environment variables in Ollama responses to prompts? For example, is there something I can add to my modelfile to append the number of tokens used and the general.name variable (e.g. "Llama 3.2 3B Instruct") to the end of each response?
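One possible angle, sketched here with the HTTP API rather than the modelfile: the /api/generate reply already carries the model name and token counts alongside each response, so a small wrapper could append them. This assumes a default local install on port 11434 and that jq is available; the model name and prompt are only examples.

```bash
# Sketch: ask the local Ollama API directly; the JSON reply includes the
# model name and token counts that could be appended to the response text.
# (Assumes ollama is running locally on the default port and jq is installed.)
curl -s http://localhost:11434/api/generate \
  -d '{"model": "llama3.2", "prompt": "Why is the sky blue?", "stream": false}' \
  | jq '{model: .model, response: .response, tokens_used: .eval_count}'
```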
Thanks, Matt! Is there a list of all the env variables, with a description for each, in the Ollama docs?
I think OLLAMA_HOST needs a bit more explanation. The server-side variable that lets you use Ollama from another system looks like this: Environment=OLLAMA_HOST=0.0.0.0. Then you can access the Ollama server by setting a local environment variable, e.g. "export OLLAMA_HOST=192.168.2.41:11434" if the server is on 192.168.2.41. Without the 0.0.0.0, the server will reject any attempt to connect to port 11434.
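For reference, a minimal sketch of that setup, assuming a systemd-managed install on Linux and using the example address 192.168.2.41 from the comment above:

```bash
# On the server (systemd install): make ollama listen on all interfaces
# instead of only localhost, then restart the service.
sudo systemctl edit ollama.service        # add under [Service]:
                                          #   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl restart ollama

# On the client: point the ollama CLI at the remote server.
export OLLAMA_HOST=192.168.2.41:11434
ollama list
```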
I'm subscribed with all notifications turned on but I didn't get this one for some reason... ☹
Is there an EV for a locale/language setting?
For example, I want my AI responses to come back using the UK English dictionary.
No. If anything, that would be part of the prompt.
@technovangelist Thanks, Matt!
That's what I do now... it works for a short period of time and then it forgets :-)
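One way to make that stick, sketched here as a custom Modelfile with a baked-in system prompt. The base model name, the custom name, and the wording are only examples, not anything from the video:

```bash
# Sketch: bake the language preference into a system prompt so it isn't
# forgotten between prompts. Names and wording are examples only.
cat > Modelfile <<'EOF'
FROM llama3.2
SYSTEM "Always respond using British English (UK) spelling and vocabulary."
EOF
ollama create llama3.2-uk -f Modelfile
ollama run llama3.2-uk
```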
tks 👍
What about Ollama running in a Docker container?
What about it? That's the easy one. Just add them to the docker run command.
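A minimal sketch of what that looks like, assuming the standard ollama/ollama image and CPU-only flags; the variable values are just examples:

```bash
# Sketch: pass Ollama environment variables with -e on the docker run line.
# (Values are examples only; adjust for your setup.)
docker run -d \
  -e OLLAMA_KEEP_ALIVE=24h \
  -e OLLAMA_NUM_PARALLEL=2 \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama
```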
After feature request 4361, the Ollama team added all the previously missing configuration options, so they're now shown via `ollama serve -h`.
yup, most are there
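For anyone hunting for that list, a quick way to skim it from a shell, assuming a recent enough version that the help text includes the environment variables mentioned above:

```bash
# List the environment variables ollama recognizes (described in the help
# text for the serve command), then filter to just the variable names.
ollama serve --help
ollama serve --help | grep OLLAMA_
```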
I run Ollama on macOS, but it only uses the CPU. In "System Monitoring", the GPU sits at 0%. Environment: macOS 14 + Radeon RX 570 (Metal is supported) and AMD Radeon Pro VII (Metal 3 is supported).
GPU on the Mac is only supported on Apple Silicon Macs, unfortunately. Since those older machines are getting older every day, I don't see that changing.
@technovangelist Thank you for your guidance. It seems I have to use Ubuntu or Windows.
But even if you install Ubuntu or Windows on that machine, the GPU isn't supported. I think your best bet is a newer Mac.
I just tried to change the temp directory yesterday on Linux. It does not work.
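For context, a sketch of the usual way this gets attempted, assuming OLLAMA_TMPDIR is the variable in question and a systemd-managed install; the path is only an example:

```bash
# Sketch: point the temp directory somewhere else via a systemd override,
# make sure the ollama user can write there, then restart.
# (Assumes OLLAMA_TMPDIR is the intended variable; path is an example.)
sudo systemctl edit ollama.service   # add under [Service]:
                                     #   Environment="OLLAMA_TMPDIR=/mnt/scratch/ollama-tmp"
sudo mkdir -p /mnt/scratch/ollama-tmp
sudo chown ollama:ollama /mnt/scratch/ollama-tmp
sudo systemctl restart ollama
```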
Ha, ha, ha. We understand how YouTube works. We either pay for YT Premium or we watch ads, and you get paid based on views. You don't need to announce that it's FREE at the beginning of your video. (Thanks for the content, though 😊)
If that were true, I wouldn't be asked so often whether it will stay free. Lots of people put a teaser on YouTube and then move the rest to a paid platform.