Function Calling in Gemini: A Framework for Connecting LLMs to Real-Time Data

  • Published: 16 Oct 2024
  • Function calling is a powerful built-in feature of Gemini that gives developers structured outputs and full control when connecting large language models (LLMs) to real-time data in external systems via APIs. It's an intuitive way to build generative AI applications that can access real-time information in databases, CRMs, document repositories, customer support platforms, and other types of systems.
    In this session, we’ll give an overview of function calling and show you how to define functions, tools, and parameters that enable Gemini to interact with external systems. We’ll show sample code and apps that can interact with BigQuery via natural language prompts, handle RAG operations to retrieve and summarize documents from vector databases, or (optionally) integrate with OSS frameworks such as LangChain and LlamaIndex.
    We'll also discuss best practices to follow when using function calling, including how to write descriptive function and parameter definitions, how to work with structured outputs, how to handle API calls and function responses, and how to work with function calling in text and chat modalities. By the end of this session, you’ll have a good understanding of how function calling in Gemini works and how you can use it to build apps that interact with your own APIs and services.
    We’ll cover the following topics:
    How function calling solves limitations in LLMs and generative models
    The benefits of using function calling vs. traditional approaches
    How to use function calling and how it works at runtime
    Interacting with SQL databases, vector databases, and other systems
    How to get started with function calling with your own systems
    Codelab: codelabs.devel...
    Sample notebook: github.com/Goo...
    Documentation: cloud.google.c...
    Join, learn, and get your questions answered with the Google Cloud AI Community: goo.gle/ai-com...
    Share your feedback and suggestions for future topics: goo.gle/gcc-ev...
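The round trip described above (define a function declaration, let the model emit a structured call, execute it, and return the result) can be sketched without any SDK. This is a minimal, hedged illustration: the declaration mirrors the OpenAPI-style schema that function calling uses, but the function name `get_order_status` and the dispatcher are hypothetical stand-ins, not part of the Gemini API.

```python
# Hypothetical function declaration in the OpenAPI-style schema used for
# function calling. Descriptive names and parameter descriptions are the
# "best practices" the session refers to: the model picks functions by them.
get_order_status_decl = {
    "name": "get_order_status",
    "description": "Look up the shipping status of a customer order by its ID.",
    "parameters": {
        "type": "object",
        "properties": {
            "order_id": {
                "type": "string",
                "description": "Unique order identifier, e.g. 'A1001'.",
            },
        },
        "required": ["order_id"],
    },
}

def get_order_status(order_id: str) -> dict:
    """Stand-in for a real API or database call in your external system."""
    return {"order_id": order_id, "status": "shipped"}

def handle_function_call(call: dict) -> dict:
    """Dispatch a structured call {'name': ..., 'args': {...}} to local code.

    In a real app, `call` comes from the model's response; your code executes
    it and sends the result back so the model can ground its final answer.
    """
    registry = {"get_order_status": get_order_status}
    return registry[call["name"]](**call["args"])

# Simulated model output: the model returns a function name plus typed
# arguments extracted from the user's natural-language prompt.
model_call = {"name": "get_order_status", "args": {"order_id": "A1001"}}
api_result = handle_function_call(model_call)
print(api_result)  # {'order_id': 'A1001', 'status': 'shipped'}
```

The key design point is that the model never executes anything itself: it only proposes a call as structured data, and your application stays in full control of which API actually runs.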

Comments • 9

  • @mohamedkarim-p7j 5 days ago +1

    Thanks for sharing 👍

  • @gemini_537 6 months ago +2

    Gemini: This video is about Function Calling in Gemini, a framework for connecting large language models (LLMs) to real-time data and APIs.
    The video starts with an overview of generative AI in Google Cloud. Generative AI has been rapidly advancing, and Google Cloud is at the forefront of this innovation. Google Cloud offers a variety of generative AI models, including the latest Gemini series. Gemini models are not only powerful, but also easy to integrate with other Google Cloud services like Vertex AI.
    Then, the video dives into the challenges of using LLMs and how Function Calling in Gemini can address them. LLMs can produce inconsistent outputs, lack access to information after their training date, and are disconnected from the world. Function Calling in Gemini solves these problems by providing a structured framework for generating consistent outputs, incorporating external data and APIs, and enabling LLMs to interact with the real world.
    The core functionality of Function Calling in Gemini is to allow LLMs to call external functions and APIs. This enables LLMs to access and process information from various sources, leading to more informative and actionable outputs. The video showcases several use cases that demonstrate the power of Function Calling in Gemini, including:
    * Building employee assistance applications to improve team productivity
    * Enhancing customer experiences by integrating LLMs into chatbots and virtual assistants
    * Automating process flows in security and supply chain management
    Overall, Function Calling in Gemini is a powerful tool that can help developers unlock the full potential of LLMs by enabling them to connect to real-time data and APIs.
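    The interaction loop this summary describes (the model may emit several function calls before producing a final text answer) can be sketched as follows. This is an offline sketch under stated assumptions: `run_chat_turn`, the scripted model stub, and the `lookup_doc` tool are all hypothetical, standing in for real model responses and real APIs.

    ```python
    # Hedged sketch of a multi-turn function-calling loop: keep executing the
    # model's proposed calls and feeding results back until it returns text.
    def run_chat_turn(model_step, tools, max_calls=5):
        """Drive one conversation turn until the model answers with text.

        `model_step` stands in for an LLM invocation; here it is a scripted
        stub so the loop can run without network access.
        """
        history = []
        for _ in range(max_calls):
            step = model_step(history)
            if step["type"] == "text":
                return step["content"], history
            # Execute the proposed call locally and record the round trip.
            result = tools[step["name"]](**step["args"])
            history.append({"call": step, "response": result})
        raise RuntimeError("Model kept calling functions without answering")

    # Scripted stub: first it asks for data, then it answers using the result.
    def scripted_model(history):
        if not history:
            return {"type": "call", "name": "lookup_doc",
                    "args": {"query": "refund policy"}}
        return {"type": "text", "content": "Refunds are processed within 5 days."}

    tools = {"lookup_doc": lambda query: {"snippet": "Refunds: 5 business days."}}
    answer, trace = run_chat_turn(scripted_model, tools)
    print(answer)  # Refunds are processed within 5 days.
    ```

    The `max_calls` cap is a practical guard: without it, a model that keeps requesting tools would loop indefinitely.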

  • @cdtavijit 3 months ago +1

    Amazing session. I had been struggling with multi-turn chat with function calling for a while. This gives some clarity and direction.

  • @darshansharma_ 7 months ago +1

    Great lecture on function calling in LLMs. It also covered how LLMs aren't updated with new real-world events, a major drawback of LLMs.

  • @DonBranson1 6 months ago +1

    Great session. Appreciate the real-world use cases and hands-on examples.

  • @lancemarchetti8673 7 months ago +2

    Excellent

  • @kirilchi 7 months ago

    How many functions can we attach to an LLM invocation before performance starts to drop?
    I believe attaching all tools to every invocation is not recommended. It would be great to have a higher-order framework that selects and attaches only the tools relevant to the user query.

  • @eyoeldefare4426 6 months ago

    Magnificent

  • @anniehuang7855 4 months ago

    This does not work the same way when I use it