RAG from scratch: Part 5 (Query Translation -- Multi Query)
- Published: 10 May 2024
- Query rewriting is a popular strategy to improve retrieval. Multi-query is an approach that re-writes a question from multiple perspectives, performs retrieval on each re-written question, and takes the unique union of all docs.
Slides:
docs.google.com/presentation/...
Code:
github.com/langchain-ai/rag-f...
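The description above outlines the method: rewrite the question from several perspectives, retrieve for each rewrite, then take the unique union of all returned docs. As a minimal sketch of that last step (retrieval itself is stubbed out; `fake_retrieve` and the doc IDs are hypothetical stand-ins, not the notebook's code):

```python
# Sketch of the multi-query "unique union" step. `fake_retrieve` is a
# hypothetical stand-in for a real vector-store retriever.

def fake_retrieve(query: str) -> list[str]:
    # Pretend retriever: returns overlapping doc IDs per rewritten query.
    corpus = {
        "what is task decomposition": ["doc1", "doc2"],
        "how do agents break down tasks": ["doc2", "doc3"],
        "explain splitting tasks into subtasks": ["doc1", "doc3", "doc4"],
    }
    return corpus.get(query, [])

def unique_union(doc_lists: list[list[str]]) -> list[str]:
    # Flatten and deduplicate while preserving first-seen order.
    seen: set[str] = set()
    out: list[str] = []
    for docs in doc_lists:
        for d in docs:
            if d not in seen:
                seen.add(d)
                out.append(d)
    return out

queries = [
    "what is task decomposition",
    "how do agents break down tasks",
    "explain splitting tasks into subtasks",
]
docs = unique_union([fake_retrieve(q) for q in queries])
print(docs)  # ['doc1', 'doc2', 'doc3', 'doc4']
```

The deduplicated union then becomes the context passed to the LLM for the final answer.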
i love lance's hand motions, so freakin' entertaining
we love em too!
that shows how intelligent a person is!
God this is amazing series from LangChain and Lance!!! Lance is an angel!!! 🙌🙏
Great piece of knowledge! I am not a professional Python developer (yet), and the syntax for building a chain with " | " broke my brain. Could you either explain it a bit, or use more explicit syntax where possible in the future?
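For anyone else puzzled by the " | " syntax: it is ordinary Python operator overloading. Each chain component defines `__or__`, so `a | b` builds a new object that feeds `a`'s output into `b`. Here is a simplified toy illustrating the idea (this is not LangChain's actual implementation, just the mechanism):

```python
# Toy illustration of LCEL-style "|" chaining via __or__ overloading.
# Not LangChain's real Runnable class -- a simplified sketch of the mechanism.

class Runnable:
    def __init__(self, func):
        self.func = func

    def invoke(self, x):
        return self.func(x)

    def __or__(self, other: "Runnable") -> "Runnable":
        # `self | other` returns a new Runnable: run self, then other.
        return Runnable(lambda x: other.invoke(self.invoke(x)))

strip = Runnable(str.strip)
lower = Runnable(str.lower)
exclaim = Runnable(lambda s: s + "!")

chain = strip | lower | exclaim          # same as exclaim(lower(strip(x)))
print(chain.invoke("  Hello World  "))   # hello world!
```

So `prompt | llm | parser` is just function composition read left to right.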
Thank you for the video. I am enjoying the series. I do have a question regarding this method.
If I think about conversations I have had with others, there are times when either I or the person I am speaking with may say, "I do not understand your question. What are you trying to say? What are you asking me?" Would it be better for the LLM to engage the user to see if there is a deeper meaning to the question or questions? Once the LLM gains an understanding, retrieval can then take place using Multi Query.
An investment expert who asks the question "What was Tesla's performance in FY2022 compared to FY2023?" will have a different expectation of the answer than a layman who asks the same question.
Just thinking out loud.
Correct me if I'm wrong: is this topic about what the class
langchain.retrievers.multi_query.MultiQueryRetriever
does?
And it is very similar to the LlamaIndex SubQuestionQueryEngine; the only difference is that LlamaIndex breaks the original question down into sub-pieces instead of generating similar questions with an LLM.
How did you do the graphics ?
So the charging profit is tracing? Like a log system? What about LlamaIndex?
Can you share the notebook?
How do you use embeddings WITHOUT OpenAI ?????
You don't need OpenAI for embeddings. The only benefits of using it are that it is faster, plug and play, and potentially more accurate.
You can use any embedding model you want, e.g. any embedding model from Hugging Face. For example, you can swap out the OpenAI embeddings for a CLIP model from Hugging Face.
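To make the swap concrete: anything exposing `embed_documents` / `embed_query` (the shape of LangChain's embedding interface) can back the retriever. Below is a dependency-free, hypothetical toy embedder used to demo similarity search; in practice you would replace it with a real model, e.g. a Hugging Face sentence-transformers model:

```python
# Sketch of the embedding interface a retriever needs. `ToyHashEmbeddings`
# is a hypothetical, dependency-free stand-in for a real embedding model.

import math

class ToyHashEmbeddings:
    """Deterministic bag-of-words embedding over a tiny fixed vocabulary."""
    vocab = ["cat", "dog", "car", "road"]

    def embed_query(self, text: str) -> list[float]:
        words = text.lower().split()
        return [float(words.count(w)) for w in self.vocab]

    def embed_documents(self, texts: list[str]) -> list[list[float]]:
        return [self.embed_query(t) for t in texts]

def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity; returns 0.0 for zero vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

emb = ToyHashEmbeddings()
docs = ["the cat and the dog", "a car on the road"]
vecs = emb.embed_documents(docs)
q = emb.embed_query("dog and cat")
scores = [cosine(q, v) for v in vecs]
print(docs[scores.index(max(scores))])  # the cat and the dog
```

Swapping providers is then just a matter of instantiating a different class with the same two methods.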
@theartofwar1750 If I tell you to build those shelves from scratch and then you realize you need a subscription to IKEA, how does that make you feel?
Using other methods would certainly be more "building RAG from scratch", don't you think?
LCEL is the most confusing thing you guys have ever invented... No need for that sh*t