Thank you. I gave up on langchain functions and used output parsers. Was not comfortable doing this, but wanted to get around this issue. Now, I’ll go back and make updates using your work around. 🎉
I am building autonomous agents. Sometimes when the chatbot must transfer to a human, it writes the function call into the content and sends it to the customer instead of executing it.
Around 10% of requests have this problem. Do you know how to solve it?
Probably add a fallback as the default, if I understand you correctly. Not perfect, of course.
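One pragmatic workaround for the leaked-call problem above, sketched below: before forwarding the assistant's reply to the customer, check whether the content is actually a serialized function call and, if so, treat it as one. The helper name `extract_leaked_call` and the message shape are assumptions, not an official API.

```python
import json
from typing import Optional

def extract_leaked_call(message: dict) -> Optional[dict]:
    """Return a function-call dict if the assistant leaked one, else None.

    Checks the proper `function_call` field first, then falls back to
    parsing `content` in case the model wrote the call as plain JSON.
    """
    if message.get("function_call"):
        return message["function_call"]
    content = (message.get("content") or "").strip()
    if content.startswith("{"):
        try:
            data = json.loads(content)
        except json.JSONDecodeError:
            return None
        if "name" in data:  # looks like a serialized function call
            return {"name": data["name"],
                    "arguments": json.dumps(data.get("arguments", {}))}
    return None

# A leaked call written into content instead of function_call:
leaked = {"role": "assistant",
          "content": '{"name": "transfer_to_human", "arguments": {}}'}
print(extract_leaked_call(leaked)["name"])  # transfer_to_human
```

If this returns `None`, the content is ordinary prose and can be sent to the customer as usual.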
Ingenious solution, just one question: would it be feasible to integrate a vector database into this solution? Let's say FAISS. In langchain that is easy, but I've never seen it done directly with the OpenAI API.
I would probably just integrate the langchain functionality into your custom functions. That should not be a problem :)
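A minimal sketch of what that integration could look like without langchain: a retrieval function the model can call like any other custom function. numpy stands in for FAISS here so the sketch runs offline (swapping in `faiss.IndexFlatIP` is the natural upgrade for large corpora), and the vectors are stubbed because a real embedding call needs an API key. The documents and dimensions are made up.

```python
import numpy as np

# In production these vectors would come from an embeddings endpoint
# (e.g. OpenAI's embeddings API); here they are stubbed 2-D vectors
# so the example is self-contained.
docs = ["refund policy", "shipping times", "account deletion"]
doc_vecs = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]])

def search(query_vec: np.ndarray, k: int = 1) -> list:
    """Cosine-similarity top-k lookup; FAISS does the same at scale."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q
    return [docs[i] for i in np.argsort(-scores)[:k]]

# Register `search` as an OpenAI function/tool and the model can
# call it exactly like any other custom function.
print(search(np.array([0.9, 0.1])))  # ['refund policy']
```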
yes
Hey, thanks for your value!
Do you know if it's still that bad, or is it usable by now?
Have to Check that :)
It's been 8 months since this video was posted and I'm pleased to say that function calling is pretty bad in pretty much every popular platform.
No, it's not anymore. I released a new video this week. If you are interested, watch it :). Didn't know this video still gets any views.
Can you recommend some approaches of evaluating the Function Calling responses?
Can you provide a small example of what you want to evaluate?
Does NeMo Guardrails offset this issue?
First of all, never heard of this before ;-). The docs state it is for influencing the way a model behaves, but the example I showed is an issue in the langchain code, so it probably won't fix it. Or can you force the LLM to ALWAYS return a function_call attribute and set the content to null?
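On that last question: the raw OpenAI chat completions API does let you force a specific call by naming the function in the `function_call` parameter instead of passing `"auto"`. A sketch of the request payload only (no network call is made here); the `transfer_to_human` function is a hypothetical example.

```python
# Request payload for the (legacy) functions API; with the newer tools
# API the equivalent is tool_choice={"type": "function",
# "function": {"name": ...}}.
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "I want to talk to a person"}],
    "functions": [{
        "name": "transfer_to_human",  # hypothetical function
        "description": "Hand the conversation over to a human agent.",
        "parameters": {"type": "object", "properties": {}},
    }],
    # "auto" lets the model decide; naming the function forces the call,
    # so the response carries a function_call and no prose content:
    "function_call": {"name": "transfer_to_human"},
}
print(payload["function_call"]["name"])  # transfer_to_human
```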
Hi, great content!
I was dabbling with the idea of using function calling at the top level and can see how it works, but if it sits behind the chatbot, it can either call a function, fall back to a default, or continue the conversation it has for a session.
I haven't been able to devise a strategy for how to make it fall back on the default and how to make it use its conversation history.
Will you have some pointers for me? Ty!
Not sure if I understand your question. Normally the flow is: if a function call is needed, do the call; otherwise continue the conversation. The default has to be defined as the default in the description of the function.
@@codingcrashcourses8533 Here is what I mean.
I have 3 main functions: book order (which requires book name and author), order pen, and book a library visit.
So when I call in a prompt and function calling figures out only one of the 2 params for book order, how do I maintain state so that the user's next response is treated as the missing function param and not as a generic query?
So how do we orchestrate that?
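One way to orchestrate the param-collection problem above, as a sketch: keep a small "pending call" object per session, and while required params are missing, route user replies into the next empty slot instead of treating them as fresh queries. The function names come from the comment above; the `date` param and the `PendingCall` class are illustrative assumptions.

```python
# Required params per function (contents are illustrative).
REQUIRED = {"book_order": ["book_name", "author"],
            "order_pen": [],
            "book_library_visit": ["date"]}

class PendingCall:
    """Tracks which params of a chosen function are still missing."""

    def __init__(self, name):
        self.name = name
        self.args = {}

    def missing(self):
        return [p for p in REQUIRED[self.name] if p not in self.args]

    def fill(self, value):
        # While a call is pending, the next user reply answers the
        # next missing slot rather than starting a new query.
        self.args[self.missing()[0]] = value

pending = PendingCall("book_order")   # model picked book_order, 0/2 params
pending.fill("Dune")                  # user reply -> book_name
print(pending.missing())              # ['author']
pending.fill("Frank Herbert")         # user reply -> author
print(pending.missing())              # [] -> ready to execute the call
```

Once `missing()` is empty, the dispatcher executes the function and clears the pending state, so subsequent messages go back to the normal routing.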
How do you load thousands of functions?
Why thousands? Your app probably does too many things if you have that many functions. You won't be able to do that anyway due to the token limit of the model.
You could in theory give a list of function descriptions to the LLM and let it pick the most relevant one.
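That two-stage idea can be sketched as: score all stored function descriptions against the user query, then pass only the top few definitions to the model. The catalog entries below are made up, and the crude word-overlap scoring stands in for what would normally be an embedding search.

```python
def shortlist(query, catalog, k=2):
    """Crude word-overlap ranking; stands in for an embedding search."""
    q = set(query.lower().split())
    scores = {name: len(q & set(desc.lower().split()))
              for name, desc in catalog.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

catalog = {  # could live in MongoDB; descriptions are illustrative
    "send_email": "send an email message to a contact",
    "post_tweet": "post a message on twitter",
    "create_invoice": "create and send an invoice to a customer",
}
top = shortlist("post this message on twitter", catalog)
print(top[0])  # post_tweet
# Only the top-k function schemas are then put into the `functions`
# array of the actual chat completion request, keeping the prompt
# small and staying under the per-request function limit.
```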
@@randotkatsenko5157 I got it working, but not with langchain. I load all the functions into MongoDB and create a search function, described to the model as: "The 'searchFunctions' utility acts as a central hub to locate and load various functionalities, ranging from communication tools like email, social media platforms such as Twitter and Facebook, to a vast array of other capabilities. Think of it as an extension of your digital limbs, providing you with the means to achieve a multitude of tasks."
@@DikkeKoelie That's awesome, good info!
A maximum of 128 functions is supported.
But why do you need a thousand functions?
Seems like such a basic thing; what a glaring oversight.
Finally someone said it
I just used langchain for function calling and have exactly the same thoughts. Wasted my time on it.