Ollama Tool Calling: Integrate AI with ANY Python Function in 7 mins!

  • Published: 28 Nov 2024

Comments • 18

  • @renierdelacruz4652 2 days ago +1

    Great improvement to Ollama, thanks for your video.

  • @Techonsapevole 2 days ago +1

    I like that it works with a small model, thanks for the update.

  • @iyerasri 22 hours ago

    Thanks for the update and a great video! Two questions: 1) How can this be integrated into the Open WebUI tool, so that I can use its fantastic chat interface to chat with the local LLMs, and the LLMs can in turn call the necessary tools? 2) An example with Streamlit would be great!
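
    A minimal sketch of the Streamlit idea, assuming the ollama Python
    package and an illustrative model name (llama3.2); Open WebUI has its
    own tool-plugin mechanism and isn't covered by this snippet:

    import ollama
    import streamlit as st

    st.title("Local LLM Chat")

    # Keep the running conversation across Streamlit reruns
    if "messages" not in st.session_state:
        st.session_state.messages = []

    # Replay the history on each rerun
    for msg in st.session_state.messages:
        st.chat_message(msg["role"]).write(msg["content"])

    if prompt := st.chat_input("Ask something"):
        st.session_state.messages.append({"role": "user", "content": prompt})
        st.chat_message("user").write(prompt)

        # Illustrative model name; any local chat model works
        response = ollama.chat(model="llama3.2", messages=st.session_state.messages)
        reply = response.message.content

        st.session_state.messages.append({"role": "assistant", "content": reply})
        st.chat_message("assistant").write(reply)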

  • @gregorysun6204 1 day ago

    Wow, thanks a lot, great video on the new Ollama updates. One question about the function-call demo: the function call has output, while response.message.content is an empty string, like message=Message(role='assistant', content=' ', so if I use function calling, do I need to decode every function output myself?
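
    A sketch of the usual pattern, assuming the ollama Python package
    (tool and model names are illustrative): when the model decides to
    call a tool, message.content is empty and message.tool_calls carries
    the call; you run the function yourself, and if you want a
    natural-language answer you feed the result back as a 'tool' message:

    import ollama

    def get_temperature(city: str) -> str:
        """
        Get the current temperature for a city.

        Args:
            city (str): The city name.
        """
        return "22 C"  # stub for illustration

    messages = [{"role": "user", "content": "How warm is it in Paris?"}]
    response = ollama.chat(model="llama3.2", messages=messages, tools=[get_temperature])

    if response.message.tool_calls:
        messages.append(response.message)  # keep the assistant turn in the history
        for call in response.message.tool_calls:
            result = get_temperature(**call.function.arguments)
            messages.append({"role": "tool", "content": str(result), "name": call.function.name})
        # Second pass: the model now writes a normal reply using the tool output
        final = ollama.chat(model="llama3.2", messages=messages)
        print(final.message.content)
    else:
        print(response.message.content)  # no tool needed, the model answered directly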

  • @sethjchandler 2 days ago +3

    Interesting video. Maybe someone can explain in the comments how the LLM would know which tool to use in the event there were multiple tools. That is, does the LLM sort of do a first pass through the names of the tools and try to think which one is best suited to handle the query? Is there a way to provide additional information so that the LLM makes a good choice about which tool to call under what circumstances?

    • @HassanAllaham 1 day ago +1

      Under the hood, the Ollama Python library uses Pydantic and docstring parsing to generate the JSON schema that previously had to be provided manually as a tool definition.
      That schema is what helps the LLM choose the needed tool (function) to use.
      This means that if you want more reliable results, it is better to add a docstring to each function (tool).
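
      A sketch of what that looks like in practice (function names and
      model are illustrative): the library turns each signature and
      docstring into a JSON schema, and those descriptions are what the
      model compares against the query when picking a tool:

      import ollama

      def get_stock_price(ticker: str) -> float:
          """
          Get the latest stock price for a ticker symbol.

          Args:
              ticker (str): The stock ticker symbol, e.g. 'AAPL'.
          """
          return 123.45  # stub for illustration

      def get_weather(city: str) -> str:
          """
          Get the current weather for a city.

          Args:
              city (str): The city name.
          """
          return "sunny"  # stub for illustration

      response = ollama.chat(
          model="llama3.2",
          messages=[{"role": "user", "content": "What is AAPL trading at?"}],
          tools=[get_stock_price, get_weather],
      )
      # The model should pick get_stock_price based on its description
      print(response.message.tool_calls)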

    • @sethjchandler 1 day ago

      @ thanks for your very helpful response

  • @hannesg.3586 2 days ago

    Thanks for the video, it's really interesting.

  • @blueapple2428 2 days ago +1

    Can I make it use function calling only when it needs to, and skip it when it doesn't?

  • @venkateshPremium 2 days ago

    If I ask for the stock price of a recently listed company, what will happen? 🙂

  • @SonGoku-pc7jl 2 days ago

    cool, very cool!!!

  • @Anas099X 2 days ago

    Cool video. One question though: is there a way to make the tool's response be included in the AI's response? For example, on ChatGPT you can ask a custom GPT to create a custom PDF and it will return "here is the custom PDF you requested {pdf file}", and the wording might differ sometimes.

    • @HassanAllaham 1 day ago

      Instead of returning the tool response itself to the LLM so it can build a formatted answer, wrap the function inside another function that returns the response you want the user to receive, and send that formatted response directly to the user rather than to the LLM. In that case you still need to add the tool message to the chat history array if you want the LLM to remember the response (e.g. when there might be follow-up questions that depend on it).
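
      A rough sketch of that wrapper idea (names are illustrative): the
      formatted string goes straight to the user, while a tool message is
      still appended to the history so the model remembers it:

      import ollama

      def create_pdf(title: str) -> str:
          """
          Create a PDF document with the given title and return its file path.

          Args:
              title (str): Title for the PDF.
          """
          return f"/tmp/{title}.pdf"  # stub for illustration

      def create_pdf_for_user(title: str) -> str:
          # Fixed user-facing wording wraps the raw tool output
          return f"Here is the custom PDF you requested: {create_pdf(title)}"

      messages = [{"role": "user", "content": "Make me a PDF called report"}]
      response = ollama.chat(model="llama3.2", messages=messages, tools=[create_pdf])

      if response.message.tool_calls:
          messages.append(response.message)
          for call in response.message.tool_calls:
              reply = create_pdf_for_user(**call.function.arguments)
              print(reply)  # shown directly to the user, no second LLM pass
              # Record the result for possible follow-up questions
              messages.append({"role": "tool", "content": reply, "name": call.function.name})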

  • @Jerrel.A 6 hours ago

    Nice video. However, what I have come to know is that Ollama is built to be used in conjunction with an Nvidia GPU; running it on a CPU will be a painful experience.

  • @nufh 2 days ago +2

    This is super cool.

  • @A_Me_Amy 18 hours ago

    I want to give the AI the tool of Python and have it make a program to figure out the math that it needs to do, and then scrap the program... And the same with everything else, you know...

  • @A_Me_Amy 18 hours ago

    Here comes the programmers, saving the day xD it's the real life version of the stories hahaha. Raise the Python serpent lol, hail Satan

  • @A_Me_Amy 18 hours ago

    import subprocess

    def execute_dynamic_python_script(code):
        """
        Takes a Python code snippet as input, writes it to a temporary file,
        executes it, and returns the output.
        """
        # Write the code to a temporary Python file
        with open("dynamic_script.py", "w") as f:
            f.write(code)
        # Execute the Python script and capture the output
        try:
            result = subprocess.check_output(["python", "dynamic_script.py"], universal_newlines=True)
            return result
        except subprocess.CalledProcessError as e:
            return f"Error in script execution: {e.output}"

    # Example: create a Python script dynamically
    user_code = """
    # Example Python script
    def calculate():
        return 5 + 7 * 3

    result = calculate()
    print(result)
    """

    output = execute_dynamic_python_script(user_code)
    print(f"Output from the script:\n{output}")

    Would this work to get the AI to program whatever tool it needs?
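
    Wiring that up as a tool would look roughly like this (a sketch,
    assuming the ollama package, an illustrative model name, and the
    function above; note that executing model-generated code without a
    sandbox is risky):

    import ollama

    # Assumes execute_dynamic_python_script from the snippet above is in scope;
    # its docstring is what the generated JSON schema shows the model.
    response = ollama.chat(
        model="llama3.2",
        messages=[{"role": "user", "content": "Write and run a script that computes 5 + 7 * 3."}],
        tools=[execute_dynamic_python_script],
    )

    for call in response.message.tool_calls or []:
        if call.function.name == "execute_dynamic_python_script":
            print(execute_dynamic_python_script(**call.function.arguments))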