How To Build AI Knowledge Assistants with LlamaIndex

  • Published: 4 Nov 2024
  • LLMs are revolutionizing how users search for, interact with, and generate content, leading to a huge wave of developer-led, context-augmented LLM applications. Stacks and toolkits built around RAG have emerged, enabling developers to build applications such as chatbots that use LLMs over their private data. However, while setting up basic RAG-powered QA is straightforward, answering complex questions over large quantities of complex data requires new data, retrieval, and LLM architectures. Jerry Liu, co-founder and CEO of LlamaIndex, provides an overview of these agentic systems: the opportunities they unlock, how to build them, and the challenges that remain.
    This talk was originally delivered at Arize:Observe 2024 at Shack 15 in San Francisco on July 11, 2024.
    Check out LlamaTrace: phoenix.arize....
    Sign up for Paper Readings: arize.com/reso...
    Check out Upcoming Arize AI Events: arize.com/events/
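The "basic RAG-powered QA" the abstract calls straightforward follows one core pattern: retrieve the document chunks most relevant to a question, then feed them to an LLM as context. A minimal sketch of that pattern in plain Python is below; the function names and the naive word-overlap scorer are illustrative assumptions, not LlamaIndex's actual API, and the final LLM call is stubbed out.

```python
# Toy sketch of the basic RAG pattern: retrieve relevant chunks,
# then assemble a context-augmented prompt for an LLM.
# All names here are illustrative, not LlamaIndex's real API.

def score(chunk: str, question: str) -> int:
    """Naive relevance score: number of shared lowercase words."""
    return len(set(chunk.lower().split()) & set(question.lower().split()))

def retrieve(chunks: list[str], question: str, top_k: int = 2) -> list[str]:
    """Return the top_k chunks ranked by word overlap with the question."""
    return sorted(chunks, key=lambda c: score(c, question), reverse=True)[:top_k]

def build_prompt(chunks: list[str], question: str) -> str:
    """Assemble a context-augmented prompt; a real system would
    send this string to an LLM instead of returning it."""
    context = "\n".join(retrieve(chunks, question))
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"

docs = [
    "LlamaIndex connects LLMs to private data sources.",
    "RAG retrieves relevant chunks before generation.",
    "Bananas are yellow.",
]
prompt = build_prompt(docs, "How does RAG use private data with LLMs?")
```

In a real LlamaIndex application the scoring step would typically use embedding similarity over a vector index rather than word overlap, and the agentic systems the talk covers layer tool use and multi-step reasoning on top of this retrieve-then-generate loop.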
