Since DeepSeek is open source, you can run the actual model on your own computer, unlike ChatGPT. To do this, run "ollama pull deepseek-r1" in the container and that's it!
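A quick sketch of what that looks like from the host, assuming the Ollama container from the tutorial is named "ollama" (adjust the name to match your own docker compose setup):

```shell
# Pull the model inside the running container (hypothetical container name "ollama")
docker exec -it ollama ollama pull deepseek-r1

# Optionally chat with it right in the terminal to verify it works
docker exec -it ollama ollama run deepseek-r1
```

Once the pull finishes, the model should also show up in the Open WebUI model picker.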
great tutorial - clear, concise and straight to the point.
I greatly appreciate it!
Exactly what I was looking for
Great vid!!!! Thanks
Hi Madhu, thanks for the great tutorial! I'm facing one issue with permissions between user and admin accounts. I shared the web URL with my friend and gave him a user account, but he says he can't select the models that I can see as admin. When I later tested with another admin account, he was able to select models. My question is: how do I configure model access permissions for user vs admin accounts? Looking forward to your answer, thanks!
What DE is that?
OK, and how do I use an AMD GPU?
I've never tested it myself since I don't have an AMD card, but you could check out this Reddit post about using Ollama with an AMD GPU in Docker Compose: www.reddit.com/r/ollama/comments/1gec1nx/docker_compose_for_amd_users/
If you are on Linux just run the installation script. It takes care of installing the required ROCm drivers as well.
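For the Docker route, a minimal compose sketch for AMD GPUs might look like the following. This assumes the `rocm` tag of the official `ollama/ollama` image and the standard ROCm device nodes; device paths and image tags may differ on your system, so treat this as a starting point rather than a tested setup:

```yaml
# Hypothetical docker-compose sketch for Ollama on an AMD GPU (ROCm)
services:
  ollama:
    image: ollama/ollama:rocm   # ROCm-enabled image variant
    devices:
      - /dev/kfd                # ROCm compute interface
      - /dev/dri                # GPU render nodes
    volumes:
      - ollama:/root/.ollama    # persist downloaded models
    ports:
      - "11434:11434"           # Ollama API port
volumes:
  ollama:
```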
Hi, I got the error: "Error: write /root/.ollama/models/blobs/sha256-1234-partial: no space left on device". Is it a Docker limitation?
This happens when your actual drive is full. Can you check how much free space you have on your disk?
@@MadhuKraft At least 60 GB free, so it seems to come from the container? Are we limited on Docker?
@MrBenix181 The disk space available in the container should match your host system, unless you're running Docker on Windows, where it creates a Linux VM and runs the Docker container inside that.
@@MadhuKraft I'm on Windows, so it's a VM. Do I have to change the size in the .wslconfig file?
@@MrBenix181 Yeah, changing the disk size of the VM should fix it.
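For reference, a hedged sketch of what a `.wslconfig` (placed in your Windows user profile folder) might look like. Note that `defaultVhdSize` is only honored on recent WSL versions and applies to newly created virtual disks, so check your WSL version (`wsl --version`) and the current WSL docs before relying on it; existing disks may need to be resized separately:

```ini
# Hypothetical ~/.wslconfig sketch (assumptions: recent WSL 2 release)
[wsl2]
memory=8GB            # cap RAM available to the WSL VM
defaultVhdSize=128GB  # default max size for newly created virtual disks
```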