Multi-Agent Function Calling LLMs - CODE for AI Agents

  • Published: 4 Nov 2024

Comments • 11

  • @cesarchicaiza4831 · 1 year ago

    This is so amazing. The way you explain things is not just understandable and fun, it also brings back the explanation of how it actually works! It's super helpful (13:13). In any case, amazing work! Please keep it up.

  • @snarkyboojum · 1 year ago

    It's interesting to see how OpenAI are effectively replacing the orchestrator or planner in libraries like LangChain or Semantic Kernel. This feature looks like it natively replaces Skills or Plugins in Semantic Kernel for example. It'll be interesting to see how this evolves, and if it will slowly provide other planning features to provide a full cognitive architecture for applications to consume.

    • @code4AI · 1 year ago +1

      Simple. Now an AI agent can have access to one or multiple plugins for its specific tasks. Think about the logic of a hierarchy of agents, each specialized and interconnected with the others, or with a control agent supervising the jobs to be done by the other agents. You can activate a multitude of GPT-4 instances in parallel and simulate a human hierarchy (e.g. the management of a company). Then run optimizations (100, 1000 runs) and find a solution within your parameters.
      New video on multi-AI agents with GPT-4 today.
      Smile: And AI agents can create other AI agents, given the specific nature of the task.
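The hierarchy described in this reply can be sketched in a few lines. This is a minimal illustration only: the class names, the `run`/`delegate` interface, and the worker specialties are all hypothetical, and the placeholder string stands in for an actual GPT-4 call per agent.

```python
# Sketch of a control agent supervising specialized worker agents.
# All names here are illustrative, not a real library API.

class WorkerAgent:
    """A specialized agent; in practice each would wrap its own LLM call."""
    def __init__(self, specialty):
        self.specialty = specialty

    def run(self, task):
        # Placeholder for a GPT-4 call scoped to this agent's specialty.
        return f"[{self.specialty}] result for: {task}"

class ControlAgent:
    """Supervises workers, dispatching each sub-task to the right specialist."""
    def __init__(self, workers):
        self.workers = {w.specialty: w for w in workers}

    def delegate(self, specialty, task):
        return self.workers[specialty].run(task)

controller = ControlAgent([WorkerAgent("research"), WorkerAgent("coding")])
print(controller.delegate("coding", "write a parser"))
```

Running many such controllers in parallel (the "100, 1000 runs" above) would just mean instantiating this hierarchy per run and comparing the results against your parameters.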

  • @siddheshwarpandhare1698 · 1 month ago

    Hi, I am developing an AI agent using function calling. After getting data from the API via my function, the LLM needs to perform a series of steps on the function-call response data, but the LLM only does the first manipulation step. Can you help with this issue? I am using the GPT-4 model.
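The usual remedy for the single-step behavior described in this question is to loop: append each function result back into the conversation and re-invoke the model until it stops requesting tool calls. Below is a runnable sketch of that loop; the model is stubbed so it works offline, and the message/tool-call shapes are simplified rather than the exact OpenAI wire format.

```python
# Sketch of a multi-step function-calling loop. The "model" is stubbed so the
# example runs offline; in practice call_model would be a chat-completions call.

def run_agent(call_model, dispatch, messages):
    """Keep invoking the model until it stops requesting tool calls."""
    while True:
        reply = call_model(messages)
        if not reply.get("tool_calls"):
            return reply["content"]  # final answer, no more steps requested
        for call in reply["tool_calls"]:
            result = dispatch(call["name"], call["args"])
            # Feed the tool result back so the NEXT model turn can carry out
            # the remaining manipulation steps instead of stopping after one.
            messages.append({"role": "tool", "name": call["name"],
                             "content": result})

# Stub model: first turn requests a tool, second turn gives the final answer.
def stub_model(messages):
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_calls": [{"name": "get_data", "args": {}}]}
    return {"content": "processed", "tool_calls": None}

def stub_dispatch(name, args):
    return "raw data"

answer = run_agent(stub_model, stub_dispatch,
                   [{"role": "user", "content": "fetch and process"}])
print(answer)  # prints "processed"
```

If the model only performs the first step, it is often because the loop stops after one round trip; re-sending the conversation with the tool result appended lets the model continue the chain.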

  • @skylark8828 · 1 year ago

    So do I have to match my (user) prompt to the function description / function arguments every time I want to use this specific function? If the LLM matching process is done once, then I should be able to type another (user) prompt, send it to the same LLM, and have the LLM work out which function to match it to.

  • @VOKorporation · 1 year ago

    Thanks, please make more videos on this theme.

  • @GodbornNoven · 1 year ago +1

    You should cover vLLM

    • @code4AI · 1 year ago

      Excellent idea! New video coming soon!

    • @shotelco · 1 year ago

      vLLM = Apache License Version 2.0 - You got my attention!!

  • @pensiveintrovert4318 · 1 year ago

    Yeah, why ask people if their opinions may have changed, just base research on a static model that has not seen new ideas in a couple of years.

    • @code4AI · 1 year ago +1

      You are missing the development of plugins for GPT-4, and any development in prompt engineering with ICL. You are missing few-shot examples and a ton of other developments. And even so: when did you last change your opinion about the two main political parties in the US? People in general hold on to their beliefs.