LangChain & OpenAI Function Calling - WHY it is pretty BAD!

  • Published: 18 Nov 2024

Comments • 27

  • @Pure_Science_and_Technology · 1 year ago +1

    Thank you. I gave up on LangChain functions and used output parsers. I wasn't comfortable doing this, but wanted to get around the issue. Now I'll go back and make updates using your workaround. 🎉

  • @ThiagoHenrique-tc7mr · 1 year ago

    I am building autonomous agents. Sometimes, when the chatbot must transfer to a human, it writes the function call into the content and sends it to the customer instead of executing it.
    Around 10% of requests have a problem like this. Do you know how to solve it?

    • @codingcrashcourses8533 · 1 year ago

      Probably a fallback as default, if I understand you correctly. Not perfect, of course.
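
Editor's note: the fallback suggested above can be sketched as a router that inspects the assistant message before it reaches the customer. This is a minimal sketch assuming OpenAI-style message dicts; `route_message` and `extract_leaked_call` are hypothetical helper names, and the regex heuristic would need tuning for your model's actual failure output.

```python
import json
import re

def extract_leaked_call(content: str):
    """Detect a function call the model leaked into the text content.

    Returns (name, arguments) if the content contains a JSON object that
    looks like a function call, otherwise None. Heuristic sketch only.
    """
    match = re.search(r"\{.*\}", content, re.DOTALL)
    if not match:
        return None
    try:
        payload = json.loads(match.group(0))
    except json.JSONDecodeError:
        return None
    if "name" in payload and ("arguments" in payload or "parameters" in payload):
        args = payload.get("arguments", payload.get("parameters", {}))
        if isinstance(args, str):  # arguments are sometimes double-encoded
            args = json.loads(args)
        return payload["name"], args
    return None

def route_message(message: dict) -> dict:
    """Fallback router: execute leaked calls instead of showing them."""
    if message.get("tool_calls"):          # the normal, well-formed path
        return {"action": "execute", "calls": message["tool_calls"]}
    leaked = extract_leaked_call(message.get("content") or "")
    if leaked:                             # the ~10% failure case
        name, args = leaked
        return {"action": "execute_fallback", "name": name, "args": args}
    return {"action": "reply", "text": message["content"]}
```

The key design choice is that the customer-facing reply path runs last, so a leaked call never reaches the user even when the model mislabels it.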

  • @Marcelo-qc8lt · 1 year ago +1

    Ingenious solution, just one question: would it be feasible to integrate a vector database into this solution, say FAISS? In LangChain that is easy, but I have never seen it done directly with the OpenAI API.

    • @codingcrashcourses8533 · 1 year ago +1

      I would probably just integrate the LangChain functionality into your custom functions. That should not be a problem :)
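
Editor's note: a sketch of what "integrate it into your custom functions" can look like. The retrieval backend stays internal to the function; the model only sees the tool schema. The toy `embed`/`cosine` helpers below are hypothetical stand-ins for a real embedding model plus a FAISS index (where you would call `index.search(query_vector, k)` instead).

```python
import math

DOCS = ["Items can be returned within 30 days.",
        "Standard shipping takes 3-5 days."]

def embed(text: str):
    """Toy embedding: keyword indicator vector. A real setup would call
    an embedding model here and store the vectors in FAISS."""
    vocab = ["return", "shipping", "refund"]
    lowered = text.lower()
    return [1.0 if w in lowered else 0.0 for w in vocab]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def search_documents(query: str, k: int = 1):
    """Custom function the model can call; wraps the vector search."""
    qv = embed(query)
    return sorted(DOCS, key=lambda d: cosine(qv, embed(d)), reverse=True)[:k]

# Tool schema advertised to the OpenAI API. FAISS, a LangChain retriever,
# or anything else can sit behind search_documents without the model knowing.
SEARCH_TOOL = {
    "type": "function",
    "function": {
        "name": "search_documents",
        "description": "Search the knowledge base for relevant passages.",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}
```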

    • @RanaAnas-s2h · 5 months ago

      yes

  • @levius_24 · 8 months ago +1

    Hey, thanks for the value!
    Do you know if it's still that bad, or has it become usable in the meantime?

  • @supertransformer · 6 months ago +1

    It's been 8 months since this video was posted and I'm pleased to say that function calling is pretty bad in pretty much every popular platform.

    • @codingcrashcourses8533 · 6 months ago

      No, it's not anymore. I released a new video this week. If you are interested, watch it :). Didn't know this video still gets any views.

  • @marcocheng9754 · 1 year ago

    Can you recommend some approaches for evaluating the function calling responses?

  • @reticentrex8446 · 1 year ago

    Does NeMo Guardrails offset this issue?

    • @codingcrashcourses8533 · 1 year ago +1

      First of all, never heard of this before ;-). The docs state it is for influencing the way a model behaves, but the example I showed is an issue in the LangChain code, so it probably won't fix it. Or can you force the LLM to ALWAYS return a function_call attribute and set the content to null?
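
Editor's note: on the question of forcing a function call, the current OpenAI tools API does support this via `tool_choice` (the legacy API used a `function_call={"name": ...}` parameter instead). A minimal sketch of building such a request; `forced_tool_request` is a hypothetical helper name, and the kwargs would be passed to `client.chat.completions.create(**kwargs)`.

```python
def forced_tool_request(model: str, messages: list, tools: list,
                        tool_name: str) -> dict:
    """Build kwargs for client.chat.completions.create that force the
    model to call one specific tool. The reply then arrives in the
    message's tool_calls, with content set to None, not as free text."""
    return {
        "model": model,
        "messages": messages,
        "tools": tools,
        # Naming a function here forces that exact call;
        # "required" would force *some* tool call instead.
        "tool_choice": {"type": "function", "function": {"name": tool_name}},
    }
```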

  • @prasenjitgiri919 · 1 year ago

    Hi, great content!
    I was dabbling with the idea of using function calling at the top level and can see how it works, but if it sits behind a chatbot, it can either call a function, fall back to a default, or continue the conversation it has for a session.
    I haven't been able to devise a strategy for how to make it fall back on a default and how to make it use its conversation history.
    Would you have some pointers for me? Thanks!

    • @codingcrashcourses8533 · 1 year ago

      Not sure if I understand your question. Normally the flow is: if a function call is needed, do the call, otherwise continue the conversation. The default has to be defined as a default in the description of the function.
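
Editor's note: the flow described above can be sketched as one dispatch step over the assistant message, assuming the OpenAI tools message format. `run_turn` and the `"default"` registry key are hypothetical names for illustration.

```python
import json

def run_turn(message: dict, registry: dict) -> dict:
    """One dispatch step: if the assistant message requests tool calls,
    execute them (falling back to a default handler for unknown names);
    otherwise pass the conversational text through."""
    calls = message.get("tool_calls") or []
    if not calls:
        return {"type": "text", "content": message.get("content", "")}
    results = []
    for call in calls:
        fn = call["function"]
        handler = registry.get(fn["name"], registry.get("default"))
        args = json.loads(fn["arguments"]) if fn.get("arguments") else {}
        results.append({"tool_call_id": call.get("id"),
                        "role": "tool",
                        "content": str(handler(**args))})
    return {"type": "tool_results", "messages": results}
```

The tool-result messages would be appended to the history and sent back to the model for the final user-facing answer.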

    • @prasenjitgiri919 · 1 year ago

      @@codingcrashcourses8533 Here is what I mean.
      I have 3 main functions: book an order (which requires book name and author), order a pen, and book a library visit.
      When I send a prompt and function calling figures out that only one of the two params for the book-order function is present, how do I maintain state so that the user's next response is treated as the missing function param and not as a generic query?
      So how do we orchestrate that?

  • @DikkeKoelie · 1 year ago

    How do you load thousands of functions?

    • @codingcrashcourses8533 · 1 year ago

      Why thousands? Your app probably does too many things if you have that many functions. You won't be able to do that anyway due to the token limit of the model.

    • @randotkatsenko5157 · 1 year ago

      You could, in theory, give a list of function calls to the LLM and have it pick the most relevant one.

    • @DikkeKoelie · 1 year ago +2

      @@randotkatsenko5157 I got it working, but not with LangChain. I load all the functions into MongoDB and create a search function: "The 'searchFunctions' utility acts as a central hub to locate and load various functionalities, ranging from communication tools like email, social media platforms such as Twitter and Facebook, to a vast array of other capabilities. Think of it as an extension of your digital limbs, providing you with the means to achieve a multitude of tasks."
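
Editor's note: the pattern described above (retrieve a handful of relevant function specs per request instead of sending all of them) can be sketched with a simple keyword-overlap search. The store, function names, and keyword sets below are hypothetical; the commenter's production version queries MongoDB, and a vector index would also work.

```python
# Hypothetical function store; in the setup described above this
# would live in MongoDB, one document per function spec.
FUNCTION_STORE = [
    {"name": "send_email", "description": "Send an email to a recipient",
     "keywords": {"email", "mail", "send"}},
    {"name": "post_tweet", "description": "Post a message on Twitter",
     "keywords": {"twitter", "tweet", "post"}},
    {"name": "create_event", "description": "Create a calendar event",
     "keywords": {"calendar", "event", "meeting"}},
]

def search_functions(query: str, k: int = 2):
    """Keyword-overlap retrieval over the function store; returns the
    top-k function names to expose as tools for this one request,
    keeping the prompt well under the model's tool limit."""
    words = set(query.lower().split())
    scored = sorted(FUNCTION_STORE,
                    key=lambda f: len(words & f["keywords"]),
                    reverse=True)
    return [f["name"] for f in scored[:k]]
```

Only the retrieved specs are then passed as `tools` in the API call, so thousands of stored functions never hit the token limit at once.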

    • @randotkatsenko5157 · 1 year ago

      @@DikkeKoelie That's awesome, good info!

    • @RanaAnas-s2h · 5 months ago

      A max of 128 functions is supported.
      But why do you need a thousand functions?

  • @bertobertoberto3 · 9 months ago

    Seems like such a basic thing; a glaring oversight.

  • @JethiyaChampakGada · 10 months ago

    Finally someone said it

  • @robertmazurowski5974 · 7 months ago

    I just used LangChain for function calling and have exactly the same thoughts. Wasted my time on it.