Build your own Local Mixture of Agents using Llama Index Pack!!!
- Published: 25 Jul 2024
- I built my own Mixture of Agents (MoA), an innovative model-grouping technique for combining multiple LLMs (by Wang et al.), using Ollama and the Llama Index pack.
This tutorial deals specifically with local models: we can leverage Llama 3, Mistral, or any other open model to build this MoA system.
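The core MoA idea described above (several "proposer" models answer in parallel across layers, then an "aggregator" model synthesizes their answers) can be sketched in plain Python. This is a simplified illustration of the technique, not the Llama Index pack's actual implementation; the stub lambdas stand in for real Ollama-served models such as Llama 3 and Mistral.

```python
# Minimal sketch of the Mixture-of-Agents idea (Wang et al.):
# proposer models answer over several layers, each layer seeing the
# previous layer's answers, and an aggregator synthesizes the final
# round of "reference responses". The LLM calls are stubbed here;
# with Ollama you would replace them with real model calls.

def mixture_of_agents(query, proposers, aggregator, num_layers=2):
    """Run `num_layers` rounds of proposals, then aggregate."""
    responses = []
    for _ in range(num_layers):
        # Each proposer sees the query plus the previous round's answers.
        context = "\n".join(responses)
        responses = [propose(query, context) for propose in proposers]
    # The aggregator synthesizes the final round of reference responses.
    return aggregator(query, "\n".join(responses))

# Hypothetical stand-ins for local models served via Ollama.
llama3 = lambda q, ctx: f"llama3 answer to: {q}"
mistral = lambda q, ctx: f"mistral answer to: {q}"
agg = lambda q, refs: f"final answer to '{q}' from {len(refs.splitlines())} references"

print(mixture_of_agents("What is MoA?", [llama3, mistral], agg))
```

Swapping the stubs for real calls (e.g. `llama_index.llms.Ollama(...).complete(...)`) turns this into a fully local MoA loop; the layered structure is what distinguishes MoA from a simple one-shot ensemble.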
🔗 Links 🔗
Local MoA - notebook used in the video - github.com/amrrs/local_moa_ol...
MoA pack from Llama index - llamahub.ai/l/llama-packs/lla...
Ollama.ai - to download Ollama
Ollama Installation tutorial • Ollama on CPU and Priv...
❤️ If you want to support the channel ❤️
Support here:
Patreon - / 1littlecoder
Ko-Fi - ko-fi.com/1littlecoder
🧭 Follow me on 🧭
Twitter - / 1littlecoder
Linkedin - / amrrs
Nice tutorial, thank you.
Can we customize system prompt for both aggregator and proposers?
Fantastic, thanks so much! I have an issue: for some reason I'm getting this error at "Round 3/3: collecting reference responses":
An error occurred:
Great video as always. I have a question.
I'm trying to build a RAG that answers questions about a dataset I have, using create_pandas_dataframe_agent. I also have a long list of sample questions and answers that I want the RAG to imitate, but not copy exactly. These questions contain some domain knowledge, and I've also added some information about the columns at the end of these sample questions and answers.
I'm currently passing this as a prefix parameter, but I'm not sure that's the best way to do it.
The idea is for the pandas agent to also answer questions that don't require pandas. What's the best way to build this? Thanks in advance!
On which GPU did the models run? Integrated GPU?
Yep, integrated, but mostly on CPU.
Thanks! What is the title of the previous video? I hope the next video is about mixture of experts! :)
Could you elaborate? You mean something like mixture of experts with Llama Index?