Running Ollama in Colab (Free Tier) - Step by Step Tutorial

  • Published: 3 Feb 2025

Comments • 23

  • @HelloIamLauraa
    @HelloIamLauraa 21 days ago +1

    Thank you, I really liked it, very well explained.

  • @khlifimohamedrayen1303
    @khlifimohamedrayen1303 3 months ago +1

    Thank you very much for this tutorial! I was having many problems running the Ollama server on Colab without colabxterm... You're such a life saver!

  • @mohamadadhikasuryahaidar7652
    @mohamadadhikasuryahaidar7652 3 months ago +1

    Thanks for the tutorial.

  • @namankumarmuktha4507
    @namankumarmuktha4507 1 month ago

    Life Saver..

  • @apocart8426
    @apocart8426 1 month ago +1

    Hi, thank you very much for this insightful video. I was also wondering if we could run image generation models such as Flux on Google Colab as well.

    • @AboniaSojasingarayar
      @AboniaSojasingarayar 1 month ago +1

      Hi, yes, absolutely we can.
      Sample notebook: colab.research.google.com/github/NielsRogge/Transformers-Tutorials/blob/master/Flux/Run_Flux_on_an_8GB_machine.ipynb
      Hope this helps.

    • @apocart8426
      @apocart8426 1 month ago +1

      @@AboniaSojasingarayar You're amazing, thank you!

    • @AboniaSojasingarayar
      @AboniaSojasingarayar 1 month ago +1

      Happy to help.

    • @apocart8426
      @apocart8426 1 month ago

      @@AboniaSojasingarayar Hey, so I tried this out. One of the issues I'm having is that it exhausts all of the free resources of Colab. So if you know any good model that we can use on Colab, and how to use it, please do share.

    • @AboniaSojasingarayar
      @AboniaSojasingarayar 1 month ago

      You may try using a quantized model and a GPU runtime.
      To install such a model, I recently published a tutorial on how to pull a GGUF quantized model into Ollama: ruclips.net/video/8MjS0aOV8tE/видео.htmlsi=J5fCzL6lw_zRGJ2p
      Additionally, try using subprocess and threading for better performance. Sample code here:

      import subprocess
      import threading

      def run_ollama():
          subprocess.Popen(["ollama", "serve"])

      ollama_thread = threading.Thread(target=run_ollama)
      ollama_thread.start()

      Hope this helps.
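The snippet above starts the server but gives you no signal when it is ready to accept requests, which matters in a Colab cell where the next step is usually an `ollama pull`. Below is a sketch that fleshes it out; the `wait_for_server` helper, its timeout values, and the polling approach are my own additions (Ollama's server listens on 127.0.0.1:11434 by default):

```python
import subprocess
import threading
import time
import urllib.error
import urllib.request

def run_ollama():
    # Launch the Ollama server; by default it listens on 127.0.0.1:11434.
    subprocess.Popen(["ollama", "serve"])

def wait_for_server(url="http://127.0.0.1:11434", timeout=60.0, interval=1.0):
    # Poll the server's root endpoint until it answers or `timeout` expires.
    # Returns True once the server responds with HTTP 200, False on timeout.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            time.sleep(interval)  # server not up yet; retry shortly
    return False

# Usage in a Colab cell (assumes Ollama is already installed):
# threading.Thread(target=run_ollama, daemon=True).start()
# assert wait_for_server(), "Ollama did not start in time"
```

Marking the thread as a daemon keeps it from blocking notebook shutdown; the polling loop replaces an arbitrary `time.sleep()` between starting the server and pulling a model.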

  • @VenkatesanVenkat-fd4hg
    @VenkatesanVenkat-fd4hg 5 months ago +3

    Great share, insightful as always... Are you using OBS Studio for recording? ...by a Senior Data Scientist...

    • @AboniaSojasingarayar
      @AboniaSojasingarayar 5 months ago +1

      Glad it helped.
      Not really! Just using the built-in recording and iMovie to edit it.

  • @enia123
    @enia123 4 months ago +1

    Thank you. I was studying something related, but my computer's performance was very poor due to lack of money. I had a problem with Ollama not working in Colab, but it was resolved! Thank you. I would like to test a model created in Colab. Is there a way to temporarily run it as a web service?

    • @AboniaSojasingarayar
      @AboniaSojasingarayar 4 months ago +1

      Most welcome.
      Great, and glad to hear it finally worked.
      1. Of course, we can use the Flask API and the ColabCode package to serve your model via an endpoint on a temporary ngrok URL.
      github.com/abhishekkrthakur/colabcode
      2. Another way is using flask and flask-ngrok.
      pypi.org/project/flask-ngrok/
      pypi.org/project/Flask-API/
      Sample code for reference:

      from flask import Flask
      from flask_ngrok import run_with_ngrok

      app = Flask(__name__)
      run_with_ngrok(app)

      @app.route("/")
      def home():
          return "Hello World"

      app.run()

      If needed, I'll try to do a tutorial on this topic in the future.
      Hope this helps :)

    • @enia123
      @enia123 4 months ago

      @@AboniaSojasingarayar Thank you, have a nice day~

  • @Giorgio_Caniglia
    @Giorgio_Caniglia 4 days ago

    Doesn't work, Google gives me an error.

    • @AboniaSojasingarayar
      @AboniaSojasingarayar 3 days ago +1

      @@Giorgio_Caniglia Can you post the error screen here please?
      Cheers

    • @Giorgio_Caniglia
      @Giorgio_Caniglia 3 days ago

      @AboniaSojasingarayar So kind, thank you! Don't worry, I solved it by opening the Colab in Chrome. Thank you!

    • @AboniaSojasingarayar
      @AboniaSojasingarayar 3 days ago

      Great 👍