Hands on with LangGraph Agent Workflows: Build a LangChain Coding Agent with Custom Tools
- Published: 28 May 2024
- In this video, I walk through how to build a LangChain-writing agent using LangGraph. I'll build up from the basics of manually managing a conversation with OpenAI + Tools and then walk through how to handle the same workflow with a custom agent built with LangGraph.
Interested in talking about a project? Reach out!
Email: christian@botany-ai.com
LinkedIn: linkedin.com/in/christianerice
Part 1: Running a manual conversation with OpenAI.
Part 2: Running a manual conversation with OpenAI plus Tools.
Part 3: Building a LangChain Agent to automate that process.
Part 4: Rebuilding the LangChain Agent with LangGraph.
Part 5: Building a LangGraph agent from scratch that accesses a vector database and writes LangChain code.
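The manual loop from Parts 1-2 can be sketched roughly like this. This is a minimal sketch, not the video's exact code: the `get_weather` tool, its schema, and the model name are illustrative assumptions.

```python
import json

# 1. Describe a tool in the OpenAI tools schema (hypothetical example tool).
TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

def get_weather(city: str) -> str:
    # Stub implementation for the sketch.
    return f"Sunny in {city}"

def run_tool_call(name: str, arguments: str) -> str:
    """Dispatch a model-requested tool call to the matching local function."""
    args = json.loads(arguments)
    if name == "get_weather":
        return get_weather(**args)
    raise ValueError(f"Unknown tool: {name}")

# 2. The manual conversation loop: send the messages, execute any tool calls
#    the model requests, append the results as "tool" messages, and ask again.
#    (Requires an OpenAI client and API key to actually run.)
def converse(client, messages):
    while True:
        resp = client.chat.completions.create(
            model="gpt-4o", messages=messages, tools=TOOLS)
        msg = resp.choices[0].message
        if not msg.tool_calls:
            return msg.content
        messages.append(msg)
        for call in msg.tool_calls:
            result = run_tool_call(call.function.name, call.function.arguments)
            messages.append({
                "role": "tool",
                "tool_call_id": call.id,
                "content": result,
            })
```

The agent frameworks in Parts 3-5 automate exactly this send / execute-tools / append / resend cycle.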
Video Chapters:
0:00 Intro
1:19 Manual Conversation
4:54 Manual Conversation w/ Tools
8:39 LangChain Agent w/ Custom Tools
12:10 LangGraph Agent w/ Custom Tools
20:25 LangGraph Developer Agent
Follow along in the code: github.com/christianrice/demo...
Hey I'm building a health tech app rn and your videos are so helpful. Thank you !
I really appreciate all of your videos. For some reason the volume on all of the content I've seen you post is about 50% quieter than other videos on YouTube. The "why" of how these things are put together is great in your videos, and I'm always glad to have a better understanding of the underlying context.
Thanks for the feedback, I appreciate it! I'll have to look into the audio issue, I obviously haven't spent any time on production quality but hopefully that's a quick fix.
Exactly the video I was looking for. Could you make a video on
building a RAG pipeline over the schema of a SQL database like Postgres: chunking and embedding the schema into the pgvector extension so only the relevant schema is loaded (to optimize tokens and prompt size), then translating natural language to SQL with Defog's SQLCoder and surfacing insights, the way this video does? Thanks in advance.
Thank you so much, it's a great explanation: starting with plain Python, then LangChain, then LangGraph, and ending with a real-world scenario. Amazing!
Thanks for the note and thanks for watching!
Yeah. Actual machine learning steps ha
Brilliant! Thanks heaps
This LangChain code works well for a single Q&A, but how do I modify it to allow interactive conversations while still handling the tools? Thanks!
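One common pattern for the interactive case asked about above is to carry the message history across turns and pass it back in on every call. This is only a sketch under assumptions: `agent_step` is a hypothetical stand-in for whatever agent or executor invocation you use, which is assumed to handle tool calls internally.

```python
def chat_loop(agent_step, turns):
    """Run a multi-turn conversation, accumulating history between turns.

    agent_step: callable taking the full message history and returning the
                assistant's reply (tool handling happens inside it).
    turns:      iterable of user inputs (stands in for an input() loop).
    """
    history = []
    replies = []
    for user_input in turns:
        history.append({"role": "user", "content": user_input})
        reply = agent_step(history)  # the agent may call tools internally
        history.append({"role": "assistant", "content": reply})
        replies.append(reply)
    return replies
```

The key point is that the tool-handling loop stays unchanged; only the outer loop grows, re-sending the accumulated history each turn.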
Helpful indeed, thank you very much. I'll be looking out for your more sophisticated hierarchical tutorial involving the Agent Supervisor and task-specific Agents. If you could include an Agent that creates Python code on the fly that the Agent would use to fulfill its task, that would be instructive.
That was excellent
What code examples did you push into Weaviate?
Hi, can I use the vector database sample you used in this video?
Hi, thanks for the video. What extension or setting provides the colored blocks for your indents?
I use indent-rainbow, you can find it here: marketplace.visualstudio.com/items?itemName=oderwat.indent-rainbow
awesome
Very helpful. Could you please post your notebook so we can follow along?
Thanks for watching! I added a link to the notebooks in the description.
@deployingai Great breakdown here 👌🏾- really appreciate the detail, particularly when LangChain docs are like navigating the desert - everything looks the same, but look again and everything has changed!
I couldn't help but notice the token counts in LangSmith @19:02
>>> your AgentExecutor version cost 102 tokens
>>> the LangGraph version cost 621 tokens😩
I'm guessing this is purely because it goes back to the Agent each cycle in LangGraph - or something else? Can this be avoided?
That's a great callout! The much larger token count in my LangGraph version was because I was using function calls rather than tool calls (tools can run in parallel; functions cannot, and they are now a legacy option). By switching to tools, we can remove the additional step you noticed between function calls, which brought the token count down to 46. I have to jump, but I'll post the code, and maybe a video, when I have some time.
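The practical difference behind the reply above is that legacy function calling returns at most one function call per assistant turn, while the tools API can return several tool calls in a single message, so they can all be executed before the next round trip. A stdlib-only sketch of that batched execution (the message shapes here are simplified assumptions, not the real SDK objects):

```python
import json

def execute_tool_calls(tool_calls, registry):
    """Run every tool call from one assistant message and return the list of
    'tool' messages to append to the conversation before the next model call."""
    results = []
    for call in tool_calls:
        fn = registry[call["name"]]
        out = fn(**json.loads(call["arguments"]))
        results.append({
            "role": "tool",
            "tool_call_id": call["id"],
            "content": str(out),
        })
    return results

# Two tool calls arriving in one assistant message: one round trip, not two.
registry = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b}
msgs = execute_tool_calls(
    [{"id": "1", "name": "add", "arguments": '{"a": 2, "b": 3}'},
     {"id": "2", "name": "mul", "arguments": '{"a": 2, "b": 3}'}],
    registry)
```

With legacy function calling, each of those calls would have required its own model round trip, which is where the extra tokens came from.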
If you share your base notebook, that would help a lot.
Thanks for watching! I added a link to the notebooks in the description.
Thanks, you are the best, dude! @deployingai