Thank you, I really liked it, very well explained.
You are most welcome and glad it helped.
Thank you very much for this tutorial! I was having many problems running the Ollama server on Colab without colabxterm... You're such a lifesaver!
You are most welcome!
Glad it helped.
thanks for the tutorial
Happy to help
Life Saver..
Glad to hear that it was helpful.
Hi, thank you very much for this insightful video. I was also wondering if we could run image generation models such as Flux on Google Colab as well.
Hi, yes, absolutely we can.
Sample notebook: colab.research.google.com/github/NielsRogge/Transformers-Tutorials/blob/master/Flux/Run_Flux_on_an_8GB_machine.ipynb
Hope this helps.
@@AboniaSojasingarayar You're amazing, thank you!
Happy to help.
@@AboniaSojasingarayar Hey, so I tried this out. One of the issues I'm having is that this exhausts all of the free resources of Colab. So if you know any good model that we can use on Colab, and how to use it, then please do share.
You may try using a quantized model and a GPU runtime.
To install such a model, I recently published a tutorial on how we can pull a GGUF quantized model into Ollama: ruclips.net/video/8MjS0aOV8tE/видео.htmlsi=J5fCzL6lw_zRGJ2p
Additionally, try using subprocess and threading so the server runs in the background without blocking the notebook. Sample code:

import subprocess
import threading

def run_ollama():
    subprocess.Popen(["ollama", "serve"])

ollama_thread = threading.Thread(target=run_ollama)
ollama_thread.start()

Hope this helps.
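To make the background-server suggestion above a bit more concrete, here is a fuller sketch. Everything beyond the original snippet is an assumption for illustration: it presumes the ollama binary is already installed in the Colab runtime, that the server listens on its default port 11434, and the model tag "llama3.2:1b" is just an example of a small quantized model, not a recommendation from the tutorial.

```python
import subprocess
import threading
import time
import urllib.error
import urllib.request

def start_ollama():
    # Launch the Ollama server in a daemon thread so the notebook
    # cell returns immediately while the server keeps running.
    thread = threading.Thread(
        target=lambda: subprocess.Popen(["ollama", "serve"]),
        daemon=True,
    )
    thread.start()
    return thread

def wait_for_server(url="http://127.0.0.1:11434", timeout=30):
    # Poll the server's root endpoint until it answers or we give up,
    # so we don't try to pull a model before the server is ready.
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2):
                return True
        except (urllib.error.URLError, OSError):
            time.sleep(1)
    return False

def pull_command(model):
    # Build the CLI command that pulls a (quantized) model tag.
    return ["ollama", "pull", model]

# Example usage, only meaningful where the ollama binary exists:
# start_ollama()
# if wait_for_server():
#     subprocess.run(pull_command("llama3.2:1b"), check=True)
```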
Great, insightful share as always... Are you using OBS Studio for recording? ...from a Senior Data Scientist.
Glad it helped.
Not really! Just using the built-in recording and iMovie to edit it.
Thank you! I was studying something related, but my computer's performance was very poor due to lack of money. I had a problem with Ollama not working in Colab, but it was resolved! I would like to test a model created in Colab. Is there a way to temporarily run it as a web service?
Most welcome.
Great, and glad to hear that it finally worked.
1. Of course, we can use the Flask API and the ColabCode package to serve your model via an endpoint on a temporary ngrok URL.
github.com/abhishekkrthakur/colabcode
2. Another way is using Flask with flask-ngrok.
pypi.org/project/flask-ngrok/
pypi.org/project/Flask-API/
Sample code for reference:
from flask import Flask
from flask_ngrok import run_with_ngrok

app = Flask(__name__)
run_with_ngrok(app)

@app.route("/")
def home():
    return "Hello World"

app.run()
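The Hello World route above can be extended so the temporary ngrok URL actually serves the model. The sketch below is only an illustration of the idea, not code from the tutorial: it assumes an Ollama server is already running locally on its default port 11434, and the route name /generate and the model tag "llama3.2:1b" are made up for the example.

```python
import json
import urllib.request

from flask import Flask, jsonify, request

app = Flask(__name__)

def build_generate_payload(model, prompt):
    # JSON body shape for Ollama's /api/generate endpoint;
    # stream=False asks for a single complete response.
    return {"model": model, "prompt": prompt, "stream": False}

@app.route("/generate", methods=["POST"])
def generate():
    body = request.get_json(force=True)
    payload = build_generate_payload(
        body.get("model", "llama3.2:1b"), body["prompt"]
    )
    # Forward the prompt to the local Ollama server and relay its answer.
    req = urllib.request.Request(
        "http://127.0.0.1:11434/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return jsonify(json.loads(resp.read()))
```

In Colab you would wrap this with run_with_ngrok(app) as in the snippet above, call app.run(), and then POST a body like {"prompt": "..."} to <ngrok-url>/generate.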
If needed, I'll try to do a tutorial on this topic in the future.
Hope this helps:)
@@AboniaSojasingarayar Thank you! Have a nice day~
Doesn't work, Google gives me an error
@@Giorgio_Caniglia Can you post a screenshot of the error here, please?
Cheers
@AboniaSojasingarayar So kind, thank you! Don't worry, I solved it by opening the Colab notebook in Chrome.
Great 👍