Using Llama3 with Ollama & LlamaIndex | Generative AI Tools
- Published: 12 Jun 2024
- Link to Playlist: Generative AI Tools
Take your #AI game to the next level with #llama3 , #ollama , and #llamaindex! In this video, we'll show you how to harness the power of these cutting-edge tools to streamline your workflow and achieve incredible results.
From text generation to data analysis, we'll cover it all. Join us as we explore the possibilities of Llama3, Ollama, and LlamaIndex and discover how to get the most out of these revolutionary #ai solutions.
#llama3 #ollama #llamaindex #ai #artificialintelligence #machinelearning #machinelearningwithpython #naturallanguageprocessing #nlptechnology #aitools #productivityhacks #TransformerModel #deeplearning #pythonprogramming #programming #aicommunity #NLPCommunity #machinelearningcommunity #python
Great content. Please continue your gen AI series.
Thank you for the acknowledgement 🙌 Seems like AI isn't going anywhere anytime soon, so neither am I 😅
@@thecodecruise Great. I will be binging your gen AI videos for a long time then 😅
@@parmanandchauhan6182 Surely! If you are interested, I have another dedicated playlist, "GenAI with OpenAI", which will help you get equipped with the OpenAI landscape.
Llama Rulez!
🦙
Do you also have the installation setup instructions for this to run successfully?
Yes, I can list them here in a while along with their sources. In the meantime, you can follow this high-level doc for setup details: docs.llamaindex.ai/en/stable/examples/llm/rungpt/?h=run
@@thecodecruise Thanks, but I am interested in the Ollama + LlamaIndex setup, not rungpt.
@@anuragagrawal9829 For Ollama, here are the setup pointers:
1. Install Ollama from ollama.com/
2. Download llama3 (llama3:8b, llama3:instruct) from Ollama
For LlamaIndex, install it with the following command:
pip install llama-index
If you get errors about missing supporting packages, they are listed here:
docs.llamaindex.ai/en/stable/getting_started/installation/
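The steps above can be condensed into a few shell commands. This is a sketch: the curl install script is the Linux/macOS route, and on Windows you would use the installer from ollama.com instead.

```shell
# Install Ollama (Linux/macOS install script from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Pull the Llama3 models mentioned above
ollama pull llama3:8b
ollama pull llama3:instruct

# Install LlamaIndex
pip install llama-index
```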
To configure Ollama with LlamaIndex, all you need to do is change the MODEL_PROVIDER and MODEL keys in the backend's .env file. I walk through that in this video: ruclips.net/video/4wfXF_CSngM/видео.html
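For reference, the .env change described above would look something like this. The key names are the ones mentioned in the comment; the exact values accepted depend on your create-llama version.

```
MODEL_PROVIDER=ollama
MODEL=llama3
```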
When I ran this line of code:
index = VectorStoreIndex.from_documents(documents)
It gave me an error:
Did not find openai_api_key, please add an environment variable OPENAI_API_KEY which contains it, or pass openai_api_key as a named parameter. (type=value_error)
So I went and created an OpenAI key and set it via os.environ. After this, it gave me an error saying I had exceeded my quota and should look at payment options, etc.
That's why I asked on another video whether it was paid, because I was running into these errors.
Kindly guide me on this.
OpenAI's API is paid, so you'd have to add credits to use it, but you can use open-source models like Llama3 by Meta instead. Here's a video where I explain how you can do it: ruclips.net/video/4wfXF_CSngM/видео.html
Alternatively, you can use `npx create-llama --ask-models`, which will ask you to specify which model provider you want to use. Hope this helps!
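Putting the thread together, here is a minimal sketch of building the index with a local Llama3 model so that VectorStoreIndex never looks for OPENAI_API_KEY. It assumes Ollama is running locally with llama3 pulled, and that the `llama-index-llms-ollama` and `llama-index-embeddings-huggingface` integration packages are installed; package and module names may differ across LlamaIndex versions.

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Point LlamaIndex at the local Llama3 model served by Ollama
# instead of the default OpenAI LLM
Settings.llm = Ollama(model="llama3", request_timeout=120.0)

# Use a local embedding model so no OPENAI_API_KEY is needed
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Load documents from a local folder and build the index
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Query the index with the local model
query_engine = index.as_query_engine()
print(query_engine.query("What is this document about?"))
```

With both the LLM and the embedding model set globally via `Settings`, the quota error from the thread above should not appear, since no OpenAI calls are made at all.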
@@thecodecruise Okay I'll do this. Hopefully it works. Thanks once again!