Discover LlamaIndex: Ask Complex Queries over Multiple Documents

  • Published: 6 Sep 2024
  • In this video, we show how to ask complex comparison queries over multiple documents with LlamaIndex. Specifically, we show how to use our SubQuestionQueryEngine object which can break down complex queries into a query plan over subsets of documents.
    LlamaIndex: github.com/jer...
    Docs: gpt-index.read...
    Questions? Hop on our Discord: / discord
    Twitter: / gpt_index
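The decomposition idea described above can be sketched in plain Python. This is a toy conceptual model only, not the real llama-index API: all names here (`make_engine`, `sub_question_query`, the toy corpora) are hypothetical, and the real `SubQuestionQueryEngine` uses an LLM to generate a tailored sub-question per document tool before synthesizing the answers.

```python
# Toy sketch of the SubQuestionQueryEngine idea, written without llama-index.
# Hypothetical helper names; a real engine plans sub-questions with an LLM.

def make_engine(corpus):
    """A pretend per-document query engine: looks up keywords in its corpus."""
    def query(question):
        hits = [fact for key, fact in corpus.items() if key in question.lower()]
        return "; ".join(hits) if hits else "no answer"
    return query

# One engine per document, mirroring the per-document indexes in the video.
tools = {
    "doc_a": make_engine({"revenue": "doc_a reports rising revenue"}),
    "doc_b": make_engine({"revenue": "doc_b reports flat revenue"}),
}

def sub_question_query(question):
    # Query plan: route the question to every document engine as its own
    # sub-question (the real engine generates these with an LLM).
    answers = {name: engine(question) for name, engine in tools.items()}
    # Synthesis: combine partial answers into a single comparative response.
    return " | ".join(f"{name}: {ans}" for name, ans in answers.items())

print(sub_question_query("Compare the revenue trends across the documents"))
```

The point of the pattern is that each sub-question runs against a small per-document index, so a comparison query never has to fit every document into one retrieval pass.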

Comments • 22

  • @nicholas-axilla · 1 year ago · +10

    Enjoyed the tutorial. It would be ideal if you could also upload in 1080p in the future.

  • @SivakumarSaminathan-s2o · 1 month ago

    This tutorial is so good and easy to grasp compared to the ones I have watched previously in this playlist.

  • @HarikrishnanK-p9n · 1 month ago · +1

    The Sub Question Query Engine does not work with open-source LLMs because there is a dependency on OpenAI. One of your pages says that some LLMs support the Sub Question Query Engine, such as Zephyr, phi3-mini-4k-instruct, and llama2:70b, but it did not work with any of them when I tried. Is there an alternative tool that does the same thing, or will one come in the future? If I use an open LLM and an OpenAI key is present in the .env file, the Sub Question Query Engine picks up the key and silently works with OpenAI anyway. It's a curse: `pip install llama-index` automatically pulls in three OpenAI-related libraries, and if you remove them manually, nothing works.

  • @AndrewGruskin · 1 year ago · +2

    An excellent addition to querying. There are many use cases that require querying over multiple documents in a “parallel” manner. I’m just curious how the service_context variable specifying the language model to be used is passed to the query.

    • @cameronmarcus5493 · 10 months ago

      I am also curious why service_context is not given to GPTVectorStoreIndex()

  • @AndrewMagee01 · 1 year ago · +4

    The documentation and the videos are lacking in quality and specificity. I would focus on listing more parameters in the documentation and fewer emojis.

  • @mariozupan1662 · 11 months ago · +1

    Simple and excellent. Though I have a question about the library choice: why didn't you choose LlamaIndex with llama2? Why did you choose LangChain's OpenAI?

    • @mariozupan1662 · 11 months ago · +3

      Where is service_context used after it was defined?

  • @aa-xn5hc · 10 months ago

    Great! Please make more tutorials like this one.

  • @user-tq8zm7yq2b · 1 year ago · +4

    What if we have thousands of documents? Do we categorize them into some classes and build a query engine for each class?

    • @planplay5921 · 1 year ago · +4

      I have the same question; do you know the answer now?

    • @RedditGuy-22 · 5 months ago · +1

      No, you use a vector store.

  • @plashless3406 · 6 months ago

    Amazing stuff. Thanks for sharing.

  • @manaranjanpradhan2278 · 1 year ago · +1

    Can you please provide the tutorial link for this?

  • @basantsingh6404 · 6 months ago

    I am getting the error below:
    ImportError: cannot import name 'ServiceContext' from 'llama_index'

  • 6 months ago

    Could this be done as multimodal, with an image index and a text index?

  • @ShubhamThakur-rt6tn · 5 months ago

    Can we use it for CSV files too?

  • @longhaowang9710 · 5 months ago

    This doesn't seem like a very scalable solution. If you have 1,000 docs, will you build 1,000 query engines? Doesn't seem workable ...

  • @jaminthalaivaa8179 · 1 year ago · +3

    bruh how can u get away with this fab video but no link to the notebook my g

  • @paulhetherington3854 · 3 months ago

    You became; that muslim then!
