very well explained, thank you! (still watching)
Your code looks great: well structured, accurate, and professional. Congratulations!!
Thank you Argi, I appreciate it a lot!!
Tutorial as per the proper definition of it! Excellent
I am trying to scale up on this and build a bigger orchestrating graph with several agents made out of the graph you made.
It's not that easy; would you use subgraphs to do so?
Thank you very much! Could you elaborate what you mean by subgraphs?
@ColeMedin A subgraph is a fully functional graph that you can call as a node in LangGraph:
ruclips.net/video/Z8l7C031xkM/видео.htmlsi=S-yA1tHSMcEqfagh
The graph you built could be treated as one subgraph representing a single agent, and we could scale it by having more of them called by an orchestrator node/graph.
There might be other methods, but this is what I have in mind.
Oh wow that is super cool!
@@ColeMedin It is! The next logical upgrade of your excellent work.
Great video, thank you! What about using Flowiseai - also available open source - instead of LangChain/LangGraph?
Thank you! Great question! I prefer coding things myself and having that level of control which is why I love using LangChain/LangGraph itself instead of FlowiseAI which is just a wrapper on top of LangChain. But it is a great tool to build things faster without having to code!
Thank you very much for sharing this incredible knowledge with all of us, and so well explained, thanks a lot, ... wonderful work !!
Thank you Argi, I appreciate the kind words a ton!
Spaghetti is my JS framework of choice
spaghetti is yummy with the right tomato sauce and fresh succulent veggies in it 🫑🍄🥦🧄🧅...🍝
great work
Good work.
Thank you!!
I am using miniconda for environments, what version of python do you recommend for production? 3.10, 3.11, 3.12, or ???
LangGraph recommends 3.11 I believe.
@aaagaming2023 is correct, 3.11 is recommended! With lower versions of Python, you'll have issues streaming LLM tokens with LangGraph (their documentation says this and I tested it myself as well!).
Is there any fine-tuned model that works best for tool calling?
@@varg92 Great question! Not for closed source models, but for local LLMs there is a version of Llama 3.1 fine-tuned for function calling by Groq. I'd check that one out! You can search for "Groq function calling Llama 3.1".
Tool-ACE
When is a multi-agent example coming? 😅 Pleeeease ...
@@ArgiSanchez don't worry, that's coming soon! 😉
how will we pass the database session from node to node?
Great question! There are a few ways to do this, but the best way would be to leverage LangGraph's built in support for context objects (they even specifically call out DB connections!):
langchain-ai.github.io/langgraph/how-tos/state-context-key/
@@ColeMedin The previous (or prevalent) way to do this delegates the DB functions entirely to tools, possibly using ToolNode. I have to test both for performance, but this is a SQLAlchemy or REST API (say, FastAPI) problem and is independent of LangGraph.