Install & Run Ollama on AWS Linux: Easily Install Llama3 or Any LLM Using Ollama and WebUI

  • Published: 8 Sep 2024

Comments • 17

  • @cjoshy
    @cjoshy 1 month ago +1

    I followed the entire tutorial, but when I type 'llama3' in 'Select a model', the 'Pull "llama3" from Ollama' option does not appear.

    • @ScaleUpSaaS
      @ScaleUpSaaS  1 month ago

      Please try the tutorial again from scratch. We have run it many times with users, and it has worked each time.
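If the Web UI still does not offer the pull option, the model can usually be pulled directly on the server with the Ollama CLI instead. A minimal sketch, assuming Ollama is installed and its daemon is running:

```shell
# Pull the llama3 model directly (assumes the ollama daemon is running)
ollama pull llama3

# Verify the model is now available locally
ollama list

# Optionally chat with it from the terminal
ollama run llama3
```

A model pulled this way should then appear in the Web UI's model selector.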

  • @pushkarsawant9789
    @pushkarsawant9789 1 month ago +1

    Hello, the video was nicely and clearly explained, step by step. I would like to see the same Ollama setup but on a serverless architecture. Could you please post a video on the Ollama serverless setup?

    • @ScaleUpSaaS
      @ScaleUpSaaS  1 month ago

      Thanks for sharing. Appreciated. Can you elaborate more…

    • @pushkarsawant9789
      @pushkarsawant9789 1 month ago +1

      @@ScaleUpSaaS I mean setting up Ollama on serverless technology on AWS using Lambda or other services, or maybe on Google Cloud Functions.

    • @ScaleUpSaaS
      @ScaleUpSaaS  1 month ago +1

      We don’t know if it’s possible. But we will check and let you know 🫡

    • @ScaleUpSaaS
      @ScaleUpSaaS  1 month ago

      We tried to find a solution for you. Unfortunately we haven't found one yet. We will let you know if something comes up...

    • @pushkarsawant9789
      @pushkarsawant9789 27 days ago

      @@ScaleUpSaaS, Thank You

  • @ywueeee
    @ywueeee 1 month ago +1

    How do you run it privately? Could someone looking for those endpoints find them on the open web?

    • @ScaleUpSaaS
      @ScaleUpSaaS  1 month ago

      You can run it on your computer using Docker, as we showed in the tutorial. Alternatively, do what we did in the video and restrict access to the server to your IP only (configure the security group).
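The security-group restriction mentioned above can be done from the AWS console as in the video, or with the AWS CLI. A minimal sketch; the group ID, port, and IP below are placeholders, not values from the video:

```shell
# Allow inbound access to the Web UI (port 3000 assumed here) from one IP only.
# sg-0123456789abcdef0 and 203.0.113.7 are placeholders - substitute your own
# security group ID and public IP. The /32 suffix limits the rule to a single address.
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp \
  --port 3000 \
  --cidr 203.0.113.7/32
```

Any rule allowing 0.0.0.0/0 on that port should be removed, or the endpoint remains reachable from anywhere.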

    • @ywueeee
      @ywueeee 1 month ago +1

      @@ScaleUpSaaS but what if your Wi-Fi IP is not static and keeps changing, and you want access to the LLM from any device and any network while still keeping it accessible only to you?

    • @ScaleUpSaaS
      @ScaleUpSaaS  1 month ago

      @wagmi614 in that case you can use an Elastic IP address. In this video you can see how we set up an Elastic IP address in AWS:
      Full Node.js Deployment to AWS - FREE SSL, NGINX | Node js HTTPS Server
      ruclips.net/video/yhiuV6cqkNs/видео.html
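For reference, an Elastic IP can also be allocated and attached with the AWS CLI rather than the console. A sketch with placeholder IDs; the video may use a different procedure:

```shell
# Allocate a new Elastic IP in the current region (note the AllocationId it returns)
aws ec2 allocate-address --domain vpc

# Associate it with the EC2 instance - both IDs below are placeholders
aws ec2 associate-address \
  --instance-id i-0123456789abcdef0 \
  --allocation-id eipalloc-0123456789abcdef0
```

The Elastic IP gives the server a fixed public address, so the server's IP no longer changes across restarts.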

    • @ScaleUpSaaS
      @ScaleUpSaaS  1 month ago

      Watch this. Full Node.js Deployment to AWS - FREE SSL, NGINX | Node js HTTPS Server
      ruclips.net/video/yhiuV6cqkNs/видео.html

    • @ywueeee
      @ywueeee 1 month ago +1

      @@ScaleUpSaaS wait, I don't get it. How does an AWS Elastic IP help when it's my IP that's changing, and I want connections to be accepted from any IP?

  • @jimjohn1719
    @jimjohn1719 1 month ago +1

    Is this free to run on AWS? If not, can you comment on the AWS cost incurred to run this application?

    • @ScaleUpSaaS
      @ScaleUpSaaS  1 month ago

      Thanks for sharing. Ollama, Llama 3, or any other LLM you can pull is free to use. But the server will cost you money on AWS, because we are not using a free-tier instance type.