I followed the entire tutorial, but when I type 'llama3' in 'Select a model', the 'Pull "llama3" from Ollama' option does not appear.
Please try the tutorial again from scratch. We have tried it many times with users, and it has worked each time.
Hello, the video was nicely and clearly explained, step by step. I would like to see the same Ollama setup but on a serverless architecture. Could you please post a video on the Ollama serverless setup?
Thanks for sharing. Appreciated. Can you elaborate more…
@@ScaleUpSaaS I mean setting up Ollama on serverless technology on AWS using Lambda or other services. Or maybe on Google Cloud Functions for serverless.
We don’t know if it’s possible, but we will check and let you know 🫡
We tried to find a solution for you. Unfortunately, we haven't found one yet. We will let you know if something comes up...
@@ScaleUpSaaS, Thank You
How do I run it privately? Could someone looking for those endpoints find them on the open web?
You can run it on your computer using Docker, as we showed in the tutorial. Alternatively, do what we did in the video and restrict access to the server to your IP only (configure the security group).
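For reference, the security-group restriction from the video can be sketched with the AWS CLI. This is only a sketch: the security group ID and port below are placeholders (use your own group ID and whatever port your app actually listens on):

```shell
# Sketch: allow inbound TCP only from your current public IP.
# sg-0123456789abcdef0 and port 3000 are placeholders — substitute
# your own security group ID and your app's port.
MY_IP=$(curl -s https://checkip.amazonaws.com)
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp \
  --port 3000 \
  --cidr "${MY_IP}/32"
```

The `/32` suffix limits the rule to that single address, which is what keeps the endpoint private to you.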
@@ScaleUpSaaS but what if your Wi-Fi IP is not static and keeps changing, and you want to access the LLM from any device and any network, but still keep it safe so that it's accessible only to you?
@wagmi614 in that case you can use an Elastic IP address. In this video you can see how we set an Elastic IP address in AWS:
Full Node.js Deployment to AWS - FREE SSL, NGINX | Node js HTTPS Server
ruclips.net/video/yhiuV6cqkNs/видео.html
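As a rough sketch of what the video does, an Elastic IP can be allocated and attached with the AWS CLI. The instance ID here is a placeholder, and this assumes your default AWS region and credentials are already configured:

```shell
# Sketch: allocate an Elastic IP and associate it with your EC2 instance.
# i-0123456789abcdef0 is a placeholder — use your own instance ID.
ALLOC_ID=$(aws ec2 allocate-address --query AllocationId --output text)
aws ec2 associate-address \
  --instance-id i-0123456789abcdef0 \
  --allocation-id "$ALLOC_ID"
```

Note that an Elastic IP pins the server's address, not the client's, so the server stays reachable at one stable IP even after restarts.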
Watch this. Full Node.js Deployment to AWS - FREE SSL, NGINX | Node js HTTPS Server
ruclips.net/video/yhiuV6cqkNs/видео.html
@@ScaleUpSaaS wait, I don't get it. How does an AWS Elastic IP help when it's my IP that's changing, and I want requests to be accepted from any IP?
Is this free to run on AWS? If not, can you comment on the AWS cost incurred to run this application?
Thanks for sharing. Ollama, llama3, or any other LLM you can pull is free to use. But the server will cost you money on AWS, because we are not using a free-tier instance type.