How to Make a Local Function-Calling LLM (Ollama, local LLM)

  • Published: 29 Sep 2024
  • Make a completely local function-calling LLM using a local or external model of your choice.
    Function callers are LLM agents that pick the best function to call for a given prompt.
    Join the Discord: / discord
    Library Used:
    github.com/emi...
    Code used in the video:
    github.com/emi...
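The idea described above — an agent that picks the best function for a prompt and then invokes it — can be sketched in a few lines. This is a minimal illustration, not the library from the video: the tool names (`get_weather`, `get_time`) are hypothetical, and with a live Ollama server you would send the built prompt to its HTTP API and feed the model's reply to `dispatch`.

```python
import json

# Registry of callable tools the model may choose from (hypothetical examples).
TOOLS = {
    "get_weather": lambda city: f"Sunny in {city}",
    "get_time": lambda tz: f"12:00 in {tz}",
}

def build_prompt(user_prompt: str) -> str:
    """Ask the model to answer with a JSON object naming exactly one tool."""
    names = ", ".join(TOOLS)
    return (
        "Pick the best function for the request below.\n"
        f"Available functions: {names}\n"
        'Reply with JSON only: {"function": "...", "argument": "..."}\n'
        f"Request: {user_prompt}"
    )

def dispatch(model_reply: str) -> str:
    """Parse the model's JSON reply and invoke the chosen tool."""
    call = json.loads(model_reply)
    fn = TOOLS[call["function"]]
    return fn(call["argument"])

# With a local Ollama server you would POST build_prompt(...) to
# http://localhost:11434/api/generate and pass the response text here.
print(dispatch('{"function": "get_weather", "argument": "Paris"}'))  # Sunny in Paris
```

The key design point is constraining the model to a fixed JSON schema over a known function registry, so the surrounding code stays a simple parse-and-dispatch loop.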

Comments • 6

  • @dr.mikeybee · 3 months ago

    You're coding the same time I'm watching -- after midnight.

  • @mikew2883 · 3 months ago

    Great stuff! 👍

  • @OffsecNinja · 5 months ago

    Thank you! I've been searching for a solution, and everything online is either irrelevant or consists of 300+ lines of code that are hard to understand. This library makes things much easier. Thanks for sharing! :)

    • @polymir9053 · 5 months ago

      Glad to hear that you found it useful!

  • @maths_physique_informatiqu2925 · 4 months ago

    Is there a solution for prompts that are not related to any of the functions? It currently shows an error. I actually need a model that answers the user's prompt whether or not it requires a function call.

    • @polymir9053 · 4 months ago

      I'm not sure I understand your question, but you could always default to a normal agent if the function caller fails to answer.
      Feel free to join the Discord in the description if you want to chat about this.
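The fallback suggested in the reply above can be sketched as follows. `function_caller` and `chat_model` are hypothetical stand-ins for your actual agents (e.g. ones backed by a local Ollama model); the point is only the control flow: try function calling first, and fall back to a plain chat agent when no function matches.

```python
def function_caller(prompt: str) -> str:
    """Stand-in function caller: only handles weather-related prompts."""
    if "weather" not in prompt:
        raise ValueError("no matching function")
    return "get_weather()"

def chat_model(prompt: str) -> str:
    """Stand-in plain chat agent that always produces an answer."""
    return f"chat answer to: {prompt}"

def answer(prompt: str) -> str:
    """Default to the normal agent when the function caller fails."""
    try:
        return function_caller(prompt)
    except ValueError:
        return chat_model(prompt)

print(answer("tell me a joke"))  # falls through to the chat model
```

This way unrelated prompts no longer raise an error; they are simply answered by the ordinary chat agent.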