LangChain Expression Language (LCEL) | Langchain Tutorial | Code

  • Published: 3 Feb 2025

Comments •

  • @techwithsaketh
    @techwithsaketh 3 months ago +1

    Great tutorial - keep up the good work

  • @prasad_yt
    @prasad_yt 8 months ago +1

    Nice simplified explanation ❤

  • @andaldana
    @andaldana 1 year ago +1

    Great tutorial - thanks!

  • @MuhammadFaizanMumtaz3
    @MuhammadFaizanMumtaz3 1 year ago +1

    Sir! You're doing a great job

  • @benepstein3970
    @benepstein3970 1 year ago +1

    Thanks, subscribed!

  • @KEVALKANKRECHA
    @KEVALKANKRECHA 7 months ago

    Great video!

  • @mushinart
    @mushinart 10 months ago

    First of all, thank you for the amazing way you explain stuff... it's elegant. Now I have only one question: when we used the retriever to assign its value to the context variable, I assume the retriever will pass all of the documents in the RAG list to the LLM, so that when it answers the question it uses all of the RAG documents as context in the prompt. Am I right? Would it be a better idea to create a chain that first uses the question to query the RAG and get just the needed context, then passes that context together with the question to the LLM to form a human-understandable answer? I'm asking to check whether I'm understanding it right... Thanks man
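
    A note for readers with the same question: the retriever does not forward the whole document list. It runs a similarity search with the incoming question, and only the top matches are formatted into the {context} variable, which is exactly the "query first, then answer" flow the comment describes. Below is a minimal sketch of that pattern, assuming a small FAISS store and OpenAI models rather than the exact code from the video:

      # Minimal LCEL RAG sketch: the retriever fills {context}, the question passes through.
      # The toy documents, FAISS store, and OpenAI models are assumptions for illustration.
      from langchain_community.vectorstores import FAISS
      from langchain_openai import OpenAIEmbeddings, ChatOpenAI
      from langchain_core.prompts import ChatPromptTemplate
      from langchain_core.output_parsers import StrOutputParser
      from langchain_core.runnables import RunnablePassthrough

      docs = ["Harrison worked at Kensho.", "Bears like to eat honey."]
      retriever = FAISS.from_texts(docs, embedding=OpenAIEmbeddings()).as_retriever()

      prompt = ChatPromptTemplate.from_template(
          "Answer the question using only this context:\n{context}\n\nQuestion: {question}"
      )

      # The dict runs its branches in parallel: the retriever receives the question and
      # its top matches become {context}; RunnablePassthrough forwards the question itself.
      chain = (
          {"context": retriever, "question": RunnablePassthrough()}
          | prompt
          | ChatOpenAI()
          | StrOutputParser()
      )

      print(chain.invoke("Where did Harrison work?"))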

  • @Tushii
    @Tushii 1 year ago

    Is there a way I could batch-invoke a list of text files?
    I want to extract certain text from each file using OpenAI.
    Or would I have to do it one by one in a loop?

    • @FutureSmartAI
      @FutureSmartAI  1 year ago

      You can extract the full text of each file and pass them all in one batch (see the sketch after this thread).

    • @Tushii
      @Tushii 1 year ago

      @FutureSmartAI cool, thanks, I shall try it out
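
      Every LCEL chain exposes a .batch() method, so the per-file loop can be replaced with a single call that runs the requests concurrently. A rough sketch of the approach suggested above, assuming each file is small enough to fit in one prompt (the file names and extraction task are hypothetical):

        # Batching several text files through one LCEL chain with .batch().
        # File paths and the extraction prompt are made-up placeholders.
        from pathlib import Path
        from langchain_openai import ChatOpenAI
        from langchain_core.prompts import ChatPromptTemplate
        from langchain_core.output_parsers import StrOutputParser

        prompt = ChatPromptTemplate.from_template(
            "Extract every email address from the text below, one per line.\n\n{text}"
        )
        chain = prompt | ChatOpenAI() | StrOutputParser()

        # Read the full text of each file, then send all of them in one batch call.
        files = ["report1.txt", "report2.txt", "report3.txt"]  # hypothetical inputs
        inputs = [{"text": Path(f).read_text()} for f in files]

        results = chain.batch(inputs)  # runs the requests concurrently
        for name, result in zip(files, results):
            print(name, "->", result)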

  • @muhammedaslama9908
    @muhammedaslama9908 1 year ago

    The AzureOpenAI chat model doesn't seem to support LCEL. Am I doing something wrong?
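
    For anyone hitting the same issue: AzureChatOpenAI implements the same Runnable interface as the other chat models, so it composes with the pipe operator like any other LCEL component; the problem is usually configuration rather than LCEL support. A hedged sketch, where the deployment name, API version, and environment variables are assumptions about a typical Azure setup:

      # AzureChatOpenAI used in an ordinary LCEL chain.
      # Deployment name and API version below are placeholders for your own Azure resource.
      from langchain_openai import AzureChatOpenAI
      from langchain_core.prompts import ChatPromptTemplate
      from langchain_core.output_parsers import StrOutputParser

      # Expects AZURE_OPENAI_API_KEY and AZURE_OPENAI_ENDPOINT in the environment.
      llm = AzureChatOpenAI(
          azure_deployment="my-gpt-4o-deployment",  # hypothetical deployment name
          api_version="2024-02-01",
      )

      prompt = ChatPromptTemplate.from_template("Tell me a short fact about {topic}.")
      chain = prompt | llm | StrOutputParser()

      print(chain.invoke({"topic": "LCEL"}))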

  • @thumarzeel
    @thumarzeel 11 months ago

    Awesome buddy