How to Make a Local Function Calling LLM (ollama, local llm)
- Published: 29 Sep 2024
- Make a completely local function calling LLM using a local or external model of your choice.
Function callers are LLM agents that pick the best function to call for a given prompt.
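The core idea above can be sketched in a few lines. This is a minimal, hypothetical example (not the library from the video): the model is asked to answer with JSON naming one function from a registry, and the code parses that choice and dispatches to it. The `example` string below stands in for what a local model (e.g. served by Ollama) might return; the function names and signatures are made up for illustration.

```python
import json

# Illustrative registry of functions the function caller may choose from.
def get_weather(city: str) -> str:
    return f"It is sunny in {city}"

def get_time(timezone: str) -> str:
    return f"12:00 in {timezone}"

FUNCTIONS = {"get_weather": get_weather, "get_time": get_time}

def dispatch(model_output: str) -> str:
    """Parse the model's JSON function choice and call the matching function."""
    call = json.loads(model_output)
    fn = FUNCTIONS[call["function"]]
    return fn(**call["arguments"])

# Stand-in for a local model's reply to "What's the weather in Paris?".
example = '{"function": "get_weather", "arguments": {"city": "Paris"}}'
print(dispatch(example))  # It is sunny in Paris
```

In a real setup the `example` string would come from a chat request to your local model, with the function registry described in the prompt or passed as tools.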
Join the Discord: / discord
Library Used:
github.com/emi...
Code used in the video:
github.com/emi...
You're coding at the same time I'm watching -- after midnight.
Great stuff! 👍
Thank you! I've been searching for a solution, and everything online is either irrelevant or consists of 300+ lines of code that are hard to understand. This library makes things much easier. Thanks for sharing! :)
Glad to hear that you found it useful!
Is there a solution for prompts that are not related to any of the functions? Right now it shows an error. I actually need a model that answers the user's prompt whether or not a function call is needed.
I'm not sure I understand your question, but you could always fall back to a normal agent if the function caller fails to answer.
Feel free to join the discord in the description if you want to chat about this.
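The fallback suggested above can be sketched like this. Both agents here are stubs standing in for real model calls; the pattern is simply to try the function caller first and route to a plain chat agent when it raises.

```python
import json

def function_caller(prompt: str) -> str:
    # Stand-in for a function-calling model; raises when no function fits.
    raise ValueError("no matching function")

def normal_agent(prompt: str) -> str:
    # Stand-in for a plain chat model answering directly.
    return f"(chat) answering: {prompt}"

def answer(prompt: str) -> str:
    try:
        return function_caller(prompt)
    except (ValueError, json.JSONDecodeError):
        # No function applied (or the model's JSON was malformed):
        # fall back to a normal chat response.
        return normal_agent(prompt)

print(answer("Tell me a joke"))  # (chat) answering: Tell me a joke
```

Which exceptions to catch depends on how the function-calling library signals "no suitable function"; the two shown here are assumptions.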