LangGraph Agents with Structured Output

  • Published: Feb 4, 2025

Comments • 30

  • @blueapollo3982
    @blueapollo3982 5 months ago +73

    can we for the love of god please do a different use case than grabbing the weather in sf every demo?

    • @carbonneutralindiana8009
      @carbonneutralindiana8009 5 months ago +14

      Maybe it's the new hello world.

    • @sarthakkaushik17
      @sarthakkaushik17 5 months ago +2

      Hahaha, same thought!!!

    • @frag_it
      @frag_it 5 months ago

      Or just a DuckDuckGo search

    • @sprobertson
      @sprobertson 5 months ago +2

      maybe the weather in LA?

    • @J3R3MI6
      @J3R3MI6 5 months ago +1

      Same with the snake game… it’s a conspiracy at this point

  • @i2c_jason
    @i2c_jason 4 months ago +4

    Is it possible to bake more instruction into the input prompt of the first single-LLM example so it just follows the JSON format for its output? It's very easy to get GPT-4o to provide direct JSON output, for example. Just append ", answering in JSON format with the following structure {}" to whatever the input message to the LLM is? -Jason asking about JSON
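
    A minimal sketch of the prompt-appending idea above, assuming the langchain-openai package; the instruction text and JSON keys are illustrative, and unlike with_structured_output this gives no guarantee that the reply is valid JSON:

        import json
        from langchain_openai import ChatOpenAI  # assumes langchain-openai is installed

        llm = ChatOpenAI(model="gpt-4o", temperature=0)

        user_input = "What's the weather in SF?"
        # Append a format instruction to the user's message, as suggested above.
        prompt = (
            user_input
            + ", answering in JSON format with the following structure "
            + '{"answer": "<string>", "confidence": "<high|medium|low>"}'
        )

        reply = llm.invoke(prompt)
        try:
            # Plain prompting may still return prose or fenced markdown, so parsing can fail.
            data = json.loads(reply.content)
        except json.JSONDecodeError:
            data = None
        print(data)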

  • @gitmaxd
    @gitmaxd 5 months ago +1

    Massive Applause 👏
    The best introduction to structured output with LangChain out there!
    Appreciate you keeping it so simple and explanatory with both examples!

    • @xspydazx
      @xspydazx 5 months ago

      These are just the docs, bro!
      In fact, these guys are making the video docs.

  • @Criszusep
    @Criszusep 5 months ago +3

    In the second example, if we are extracting the content of the tool message from the ToolNode and passing it as a HumanMessage in "respond", couldn't you have created a direct edge from the "tools" node to the "respond" node, without coming back to the "agent" node, for less token usage? Or am I missing a possible error?
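
    For reference, a rough sketch of the rewiring the question describes, with hypothetical stand-in node functions and a simplified state (the video's conditional routing from "agent" is omitted); whether this is safe depends on the real "respond" node only needing the tool output:

        from typing import Annotated, TypedDict
        from langchain_core.messages import AIMessage, HumanMessage
        from langgraph.graph import END, StateGraph
        from langgraph.graph.message import add_messages

        class AgentState(TypedDict):
            messages: Annotated[list, add_messages]

        # Stand-ins for the video's nodes: the agent calls the model, the tool
        # node runs the weather tool, and respond builds the structured answer.
        def agent_node(state: AgentState):
            return {"messages": [AIMessage(content="calling the weather tool")]}

        def tool_node(state: AgentState):
            return {"messages": [HumanMessage(content="tool result: 60F and foggy")]}

        def respond_node(state: AgentState):
            return {"messages": [AIMessage(content="final structured answer")]}

        workflow = StateGraph(AgentState)
        workflow.add_node("agent", agent_node)
        workflow.add_node("tools", tool_node)
        workflow.add_node("respond", respond_node)
        workflow.set_entry_point("agent")
        workflow.add_edge("agent", "tools")
        # The edge asked about: go straight from "tools" to "respond",
        # skipping the extra round trip through "agent" and its token cost.
        workflow.add_edge("tools", "respond")
        workflow.add_edge("respond", END)

        graph = workflow.compile()
        print(graph.invoke({"messages": [HumanMessage(content="What's the weather in SF?")]}))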

  • @esatakkasoglu6381
    @esatakkasoglu6381 5 months ago +6

    Are there any real-world problem examples?

  • @IdPreferNot1
    @IdPreferNot1 5 months ago +3

    Is it true that only Mac users can use langgraph?
    Is Agent Studio going to be available on other platforms?

    • @OrestisStefanis
      @OrestisStefanis 5 months ago

      No, the langgraph Python library is available on Windows and Linux as well

    • @xspydazx
      @xspydazx 5 months ago

      How do you get LangGraph Studio working locally on Windows? @@OrestisStefanis

  • @keenanfernandes1130
    @keenanfernandes1130 3 months ago

    Is there a way you guys could provide a TypeScript version of the code, or mention in the videos how we could port it ourselves? I love LangGraph, but doing anything in TypeScript is a bit difficult compared to the Python version.

  • @edendjanashvili2963
    @edendjanashvili2963 5 months ago +1

    Is this using the native Structured Outputs support that OpenAI provides?

    • @Alexkiller95
      @Alexkiller95 4 months ago

      Nope, this is using the with_structured_output method from LangChain, which basically adds a tool and extends the prompt used by the LLM.
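
      For reference, a minimal sketch of that method, assuming langchain-openai is installed and an OpenAI API key is configured; the WeatherResponse schema is illustrative:

          from pydantic import BaseModel, Field
          from langchain_openai import ChatOpenAI

          # Illustrative schema; with_structured_output binds it for the model
          # and parses the reply back into an instance of this class.
          class WeatherResponse(BaseModel):
              conditions: str = Field(description="Short weather summary")
              temperature_f: float = Field(description="Temperature in Fahrenheit")

          llm = ChatOpenAI(model="gpt-4o", temperature=0)
          structured_llm = llm.with_structured_output(WeatherResponse)

          result = structured_llm.invoke("What's the weather in SF right now?")
          print(result.conditions, result.temperature_f)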

  • @kuhajeyangunaratnam8652
    @kuhajeyangunaratnam8652 4 months ago +1

    Please kindly provide notebook links for all these videos. Thanks

  • @sprobertson
    @sprobertson 5 months ago +1

    Pro tip for the first diagram: put the nodes that are the same in the same place, so I don't have to parse the whole thing to see there is only one extra node

  • @BoredOnTheWeekend
    @BoredOnTheWeekend 5 months ago

    If the output is longer than 4000 tokens, how can we generate longer output? I have an expected structured output of around 5000 tokens since the JSON is so large.

    • @danieldvali9128
      @danieldvali9128 5 months ago

      Generate the first 4k tokens, append them to the messages with role == assistant, and rerun inference. The model will continue completing its first output. (A sketch of this is at the end of the thread.)

    • @Alexkiller95
      @Alexkiller95 4 months ago

      @@danieldvali9128 Could you please expand on that? I have the same problem.
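
      A rough sketch of the continuation trick described above, using the OpenAI Python client; the prompt is illustrative, and the model is not guaranteed to resume its truncated JSON cleanly, so the stitched output still needs validation:

          import json
          from openai import OpenAI

          client = OpenAI()  # assumes OPENAI_API_KEY is set
          messages = [{"role": "user", "content": "Return the full report as one large JSON object."}]

          # First pass: the model stops when it hits the token limit (finish_reason == "length").
          first = client.chat.completions.create(model="gpt-4o", messages=messages, max_tokens=4000)
          partial = first.choices[0].message.content

          # Append the truncated output as an assistant message and rerun inference;
          # the model is expected to keep completing where it left off.
          if first.choices[0].finish_reason == "length":
              messages.append({"role": "assistant", "content": partial})
              second = client.chat.completions.create(model="gpt-4o", messages=messages, max_tokens=4000)
              partial += second.choices[0].message.content

          data = json.loads(partial)  # validate the stitched JSON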