Fast-track RAG: Chat with SQL Databases using Few-Shot Learning and Gemini | Streamlit | LangChain

  • Published: 10 Dec 2024

Comments • 16

  • @keilavasquez728
    @keilavasquez728 7 months ago +1

    Finally!!! I was waiting for it!!

    • @eduardov01
      @eduardov01  7 months ago

      I hope you find the tutorial helpful!

  • @jim02377
    @jim02377 6 months ago +1

    I like the idea of putting the few-shot examples into a vector database. That would be a nice video to make.

    • @eduardov01
      @eduardov01  6 months ago

      I'll definitely consider making it. Stay tuned!
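The idea discussed in this thread is to retrieve only the few-shot examples most similar to the incoming question, instead of hard-coding all of them into the prompt. In LangChain this is what a semantic example selector backed by a vector store does; the sketch below illustrates the concept with a toy bag-of-words "embedding" and cosine similarity, using only the standard library. The example questions and SQL are made up for illustration.

```python
from collections import Counter
from math import sqrt

# Hypothetical few-shot examples; in a real setup these question -> SQL
# pairs would live in a vector database with proper embeddings.
EXAMPLES = [
    {"question": "How many artists are there?",
     "sql": "SELECT COUNT(*) FROM artists;"},
    {"question": "List the top 5 tracks by duration.",
     "sql": "SELECT name FROM tracks ORDER BY milliseconds DESC LIMIT 5;"},
    {"question": "Total invoice amount per customer?",
     "sql": "SELECT customer_id, SUM(total) FROM invoices GROUP BY customer_id;"},
]

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; a real setup would use a model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def select_examples(question: str, k: int = 1) -> list:
    """Return the k stored examples most similar to the incoming question."""
    q = embed(question)
    ranked = sorted(EXAMPLES,
                    key=lambda ex: cosine(q, embed(ex["question"])),
                    reverse=True)
    return ranked[:k]

best = select_examples("How many customers are there?")[0]
print(best["sql"])  # the COUNT example is the closest match
```

Only the selected examples are then interpolated into the SQL-generation prompt, which keeps the prompt short as the example pool grows.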

  • @NishantRoutray-ug1qt
    @NishantRoutray-ug1qt 6 months ago +1

    Please upload the next part, adding the few-shot examples to a vector DB; it would be really helpful :-)

    • @eduardov01
      @eduardov01  6 months ago +1

      Thank you for the comment! I'll be making this video soon.

  • @joulong
    @joulong 6 months ago +1

    The teacher's explanations are really excellent.

    • @eduardov01
      @eduardov01  6 months ago

      Thank you so much!

  • @JrTech-rw6wj
    @JrTech-rw6wj 6 months ago +1

    Will it work if I have more tables in the database?

    • @eduardov01
      @eduardov01  6 months ago

      Yes, you can add as many tables as you like. The function that retrieves the schema will provide all the columns and tables as input to the LLM. You only need to add a few example SQL queries (few shots) for those tables so the LLM can understand how to JOIN them if necessary.
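The schema-retrieval step described in the reply can be sketched with nothing but `sqlite3` from the standard library: collect every table's `CREATE TABLE` statement and pass the whole text to the LLM. The table names, columns, and the few-shot JOIN example below are illustrative, not the video's actual database.

```python
import sqlite3

# Small in-memory database with two related tables (illustrative schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(id),
    total REAL
);
""")

def get_schema(conn) -> str:
    """Collect the CREATE TABLE statement of every table so the full
    schema can be supplied to the LLM as prompt context."""
    rows = conn.execute(
        "SELECT sql FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    return "\n".join(r[0] for r in rows)

# A few-shot example (assumed format) showing the model how to JOIN
# the two tables when a question spans both of them.
FEW_SHOT = {
    "question": "Total spent per customer?",
    "sql": ("SELECT c.name, SUM(o.total) FROM customers c "
            "JOIN orders o ON o.customer_id = c.id GROUP BY c.name;"),
}

print(get_schema(conn))
```

Because `get_schema` enumerates `sqlite_master`, adding more tables requires no code changes; only the few-shot examples need extending so the model sees how new tables join.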

  • @antoniotameirao1703
    @antoniotameirao1703 6 months ago +1

    How does the second model know the initial question if only the SQL response was provided?

    • @eduardov01
      @eduardov01  6 months ago +1

      That's a good remark. Currently, the second model makes an assumption about the initial question based solely on the SQL response provided. For a robust approach, the initial question needs to be added to the prompt of the chain_query function. By including both the initial question and the SQL response as input fields, the final answer will be more accurate.
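The fix described in the reply amounts to giving the answer-generation prompt two input fields instead of one. Here is a minimal sketch; the template wording and the variable names (`question`, `sql_response`) are assumptions for illustration, not the video's actual `chain_query` implementation.

```python
# Hypothetical prompt template for the answer-generation step, taking
# both the user's original question and the raw SQL result.
ANSWER_TEMPLATE = """Given the user's question and the SQL query result,
write a concise natural-language answer.

Question: {question}
SQL result: {sql_response}
Answer:"""

def build_answer_prompt(question: str, sql_response: str) -> str:
    """Interpolate both fields, so the second model no longer has to
    guess what was originally asked."""
    return ANSWER_TEMPLATE.format(question=question,
                                  sql_response=sql_response)

prompt = build_answer_prompt(
    "How many customers are there?",
    "[(59,)]",
)
print(prompt)
```

With both fields present, the second model can ground its phrasing in the actual question instead of inferring it from the result rows alone.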

  • @rajupresingu2805
    @rajupresingu2805 6 months ago +1

    Can you come up with a SQL agent chat with Llama3?

    • @eduardov01
      @eduardov01  6 months ago

      Yes, that's a valid approach.