How is it different from caching by UI libraries like Chainlit, where they use Redis to store embeddings of the prompt and, if there's a match, return the previous response without even hitting the LLM API? Which is better?
Howdy! What you're mentioning is embedding caching, which is a complete cache (i.e. the whole answer is stored and returned if there's a match). This here is KV caching, which is a partial cache for LLM inference: when part of a prompt is reused (and it has to be the first part), some intermediate values (the K and V tensors) can be reused in the forward pass that generates the response.
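To make the "first part" point concrete, here's a tiny conceptual sketch (mine, not from the video): each token's K and V depend on every token before it, so cached values are only valid up to the first position where a new prompt diverges from the cached one. The token lists below are made up for illustration.

```python
def reusable_prefix_len(cached_tokens, new_tokens):
    """Count leading tokens whose cached K/V can be reused.

    K/V at position i depend on tokens 0..i, so the cache is only valid
    for the longest common prefix of the two prompts.
    """
    n = 0
    for a, b in zip(cached_tokens, new_tokens):
        if a != b:
            break
        n += 1
    return n

cached = ["<sys>", "You", "are", "helpful", ".", "Doc", "A", "Q1"]
new    = ["<sys>", "You", "are", "helpful", ".", "Doc", "A", "Q2"]
print(reusable_prefix_len(cached, new))  # 7 -> only the final, differing token needs a fresh forward pass
```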
@TrelisResearch got it. Why does it have to be the first part? I couldn't quite get that from the video. Also, is it based on the initial layers or the end layers? How does it help with RAG architectures?
Thank you, as always very useful content!
you're welcome
Bro u a gem
appreciate it
How do we deal with hallucination resulting from our background info?
Take a look at my video on synthetic data generation. I cover it there.
Unless I’m misreading your Q and it relates to caching?
Do you think this will come to open-source, self-hosted models?
Yup, I show SGLang (same approach for vLLM) in this video!
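For anyone trying this at home, here's a hedged sketch of enabling automatic prefix caching in vLLM (the model name and prompts are placeholders, not from the video); SGLang applies its prefix cache automatically.

```python
from vllm import LLM, SamplingParams

# enable_prefix_caching lets vLLM reuse cached KV blocks for prompts that share a prefix.
llm = LLM(
    model="meta-llama/Llama-3.1-8B-Instruct",  # placeholder model, swap in your own
    enable_prefix_caching=True,
)

shared_context = "Background document pasted here...\n"  # identical prefix across requests
params = SamplingParams(temperature=0.0, max_tokens=128)

# The second call's prompt starts with the same tokens as the first,
# so its KV values for that prefix can be served from the cache.
out1 = llm.generate([shared_context + "Question 1: ..."], params)
out2 = llm.generate([shared_context + "Question 2: ..."], params)
```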
Super cool, thank you so much.