Function Calling with Llama 3.1 and Ollama | Langchain

  • Published: 10 Sep 2024
  • In this video, we will explore how to implement function (or tool) calling with Llama 3.1 and Ollama locally.
    Code: github.com/The...
    Setting up Ollama: • Build an SQL Agent wit...
    #llama3.1 #functioncalling #ollama #llama3 #llm #langchain #opensource #nlp #machinelearning #coding #python #datascience #ai
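The core idea in the video is that the model does not run tools itself: it returns a structured tool call (a name plus arguments), and your code looks the function up and executes it. The sketch below illustrates just that dispatch step with the Python standard library; the tool names, the stub implementations, and the simulated model output are all hypothetical. In the actual tutorial, the tool call would come from Llama 3.1 via LangChain's `ChatOllama` with `bind_tools`, surfaced as `AIMessage.tool_calls` in the same `{"name": ..., "args": {...}}` shape.

```python
# Minimal sketch of the tool-dispatch side of function calling.
# The tools and the simulated model response here are illustrative stand-ins,
# not part of the original tutorial's code.

def get_current_weather(city: str) -> str:
    # Hypothetical stub; a real tool would call a weather API.
    return f"It is 22°C and sunny in {city}."

def add_numbers(a: float, b: float) -> float:
    return a + b

# Registry mapping the tool names the model may emit to Python callables.
TOOLS = {
    "get_current_weather": get_current_weather,
    "add_numbers": add_numbers,
}

def dispatch_tool_call(tool_call: dict):
    """Execute one tool call of the form {"name": ..., "args": {...}}."""
    fn = TOOLS[tool_call["name"]]
    return fn(**tool_call["args"])

# Simulated model output, shaped like a LangChain tool_calls entry.
simulated_call = {"name": "add_numbers", "args": {"a": 2, "b": 40}}
print(dispatch_tool_call(simulated_call))  # -> 42
```

In the LangChain version shown in the video, the loop is the same: call the bound model, read each entry in `tool_calls`, dispatch it, and feed the result back to the model as a tool message.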

Comments • 4

  • @jofjofjof • 14 days ago

    Great video. God bless you bro for kindly sharing your knowledge.

  • @user-en1eh5ux7d • 27 days ago

    Thank you so much for the videos!

  • @siddhubhai2508 • 1 month ago • +2

    Cool bro, I really wanted this, thank you so much! Please add a logo to your channel and stay consistent with the same quality; your content is very nice.

    Actually, I'm also building a revolutionary AI tool, and I could use your help. I built an AI agent using the ollama library and other libraries (including my own package, which isn't public), and it got even better thanks to this tutorial. The last thing I need: please make a video on how to add unlimited memory (i.e. context) to my AI agent. I use ollama.generate, so I can't use the LLM's built-in context, and even if I could, I wouldn't, because it's very short. I watched AI Austin's unlimited-memory video, but I couldn't apply it to my agent, since he implemented it on his own agent instead of showing how to do it in our own projects. Please, can you do that for me today? I need it today. ALSO, MAIN THING: please use local storage instead of any cloud vector database or any other kind of cloud database.

    Thank you, I hope you can do this for me today. It's an emergency, otherwise I wouldn't ask you to do it instantly. Please, I join my hands. Please bro! 🙏🙏😥😥