LangGraph Agents with Structured Output

  • Published: 23 Nov 2024

Comments • 29

  • @blueapollo3982 · 2 months ago · +59

    Can we, for the love of god, please do a different use case than grabbing the weather in SF in every demo?

    • @carbonneutralindiana8009 · 2 months ago · +11

      Maybe it's the new hello world.

    • @sarthakkaushik17 · 2 months ago · +1

      Hahaha, same thought!!!

    • @frag_it · 2 months ago

      Or just a DuckDuckGo search.

    • @sprobertson · 2 months ago · +2

      maybe the weather in LA?

    • @J3R3MI6 · 2 months ago · +1

      Same with the snake game… it’s a conspiracy at this point

  • @i2c_jason · 1 month ago · +2

    Is it possible to bake more instruction into the single-LLM first example's input prompt so it just follows the JSON format for its output? It's very easy to get GPT-4o to produce JSON directly, for example. Could you just append ", answering in JSON format with the following structure {}" to whatever the input message to the LLM is? -Jason asking about JSON
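
    For what it's worth, the prompt-only approach Jason describes does work much of the time. Here is a minimal sketch, assuming a LangChain ChatOpenAI model; the model name and the JSON schema in the instruction are illustrative:

    ```python
    import json

    from langchain_core.messages import HumanMessage
    from langchain_openai import ChatOpenAI

    llm = ChatOpenAI(model="gpt-4o")  # model name is illustrative

    # Append a formatting instruction to whatever the user asked.
    user_input = "What's the weather in SF?"
    prompt = (
        user_input
        + ", answering in JSON format with the following structure: "
        + '{"conditions": "<string>", "temperature": "<number>"}'
    )

    reply = llm.invoke([HumanMessage(content=prompt)])

    # GPT-4o usually complies, but nothing enforces the schema -- the model
    # can still wrap the JSON in prose or drift from the field names, which
    # is why the video reaches for with_structured_output / tool calling.
    data = json.loads(reply.content)
    print(data)
    ```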

  • @Criszusep · 2 months ago · +2

    In the second example, if we are extracting the content of the tool message from the ToolNode and passing it as a HumanMessage in "respond", couldn't you have created a direct edge from the "tools" node to the "respond" node without coming back to the "agent" node, for less token usage? Or am I missing a possible error?
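
    A sketch of the rewiring being asked about, assuming the node names ("agent", "tools", "respond") and the router from the video's example; the state and node functions below are placeholders. Skipping the second pass through "agent" does save one LLM call, but it only works if a single tool call is always enough; the loop back to "agent" is what lets the model decide to call tools again.

    ```python
    from typing import TypedDict

    from langgraph.graph import END, StateGraph

    # Placeholder state and nodes -- stand-ins for the video's agent,
    # ToolNode, and structured-output "respond" node.
    class AgentState(TypedDict):
        messages: list

    def call_model(state): ...
    def call_tools(state): ...
    def respond(state): ...
    def should_continue(state): ...

    workflow = StateGraph(AgentState)
    workflow.add_node("agent", call_model)
    workflow.add_node("tools", call_tools)
    workflow.add_node("respond", respond)

    workflow.set_entry_point("agent")
    workflow.add_conditional_edges(
        "agent",
        should_continue,
        {"continue": "tools", "respond": "respond"},
    )

    # Instead of "tools" -> "agent" -> "respond", wire the tool output
    # straight into the structured-output node to skip one model call.
    workflow.add_edge("tools", "respond")
    workflow.add_edge("respond", END)

    graph = workflow.compile()
    ```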

  • @sprobertson · 2 months ago · +1

    Pro tip for the first diagram: keep the nodes that are the same in the same place, so I don't have to parse the whole thing to see there is only one extra node.

  • @esatakkasoglu6381 · 2 months ago · +5

    Are there any real-world problem examples?

  • @keenanfernandes1130 · 20 days ago

    Is there a way you guys could provide a TypeScript version of the code, or mention in the videos how we could port it ourselves? I love LangGraph, but doing anything in TypeScript is a bit difficult compared to the Python version.

  • @IdPreferNot1 · 2 months ago · +3

    Is it true that only Mac users can use langgraph?
    Is Agent Studio going to be available on other platforms?

    • @OrestisStefanis · 2 months ago

      No, the LangGraph Python library is available on Windows and Linux as well.

    • @xspydazx · 2 months ago

      @OrestisStefanis How do you get LangGraph Studio working locally on Windows?

  • @kuhajeyangunaratnam8652 · 2 months ago

    Please kindly provide notebook links for all these videos. Thx

  • @edendjanashvili2963 · 2 months ago · +1

    Is this using the native Structured Outputs support that OpenAI provides?

    • @Alexkiller95 · 2 months ago

      Nope, this is using the with_structured_output method from LangChain, which basically adds a tool and extends the prompt used by the LLM.
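
      Roughly what that looks like in code; a minimal sketch in which the model name and the WeatherResponse schema are made up for illustration:

      ```python
      from pydantic import BaseModel, Field
      from langchain_openai import ChatOpenAI

      class WeatherResponse(BaseModel):
          """Schema the model's answer must conform to."""
          conditions: str = Field(description="e.g. 'sunny', 'foggy'")
          temperature: float = Field(description="Temperature in Fahrenheit")

      llm = ChatOpenAI(model="gpt-4o")  # model name is illustrative
      structured_llm = llm.with_structured_output(WeatherResponse)

      result = structured_llm.invoke("It is 68F and sunny in SF. Summarize the weather.")
      print(result)        # WeatherResponse(conditions='sunny', temperature=68.0)
      print(type(result))  # a validated Pydantic object, not raw text
      ```

      As far as I know, recent langchain-openai versions also accept method="json_schema" in with_structured_output, which switches from tool calling to OpenAI's native Structured Outputs mode mentioned in the question above.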

  • @BoredOnTheWeekend · 2 months ago

    If the output is longer than 4,000 tokens, how can we generate longer output? I expect a structured output of about 5,000 tokens since the JSON is so large.

    • @danieldvali9128 · 2 months ago

      Infer the first 4k tokens, append them to the messages with role == assistant, and rerun inference. It will continue completing its first output. (Sketched below.)

    • @Alexkiller95 · 2 months ago

      @danieldvali9128 Could you please expand on that? I have the same problem.
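
      A minimal sketch of the continuation approach @danieldvali9128 describes, using the raw OpenAI SDK; the model name, token cap, and prompt are illustrative, and the stitched output may still need repair at the seam before it parses as JSON:

      ```python
      from openai import OpenAI

      client = OpenAI()
      messages = [{"role": "user", "content": "Return the full product catalog as JSON."}]

      chunks = []
      while True:
          resp = client.chat.completions.create(
              model="gpt-4o",      # illustrative
              messages=messages,
              max_tokens=4000,     # per-call output cap
          )
          choice = resp.choices[0]
          chunks.append(choice.message.content)
          if choice.finish_reason != "length":
              break  # the model finished on its own
          # Hit the output cap: feed the partial text back as an assistant
          # message so the next call picks up where the last one stopped.
          messages.append({"role": "assistant", "content": choice.message.content})

      full_output = "".join(chunks)
      ```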

  • @gitmaxd · 2 months ago · +1

    Massive Applause 👏
    The best introduction to structured output with LangChain out there!
    Appreciate you keeping it so simple and explanatory with both examples!

    • @xspydazx · 2 months ago

      These are just the docs, bro!
      In fact, these guys are making the video docs.