Run Local AI Agents With Any LLM Provider - Anything LLM Agents Tutorial
- Published: 25 Dec 2024
- #anythingllm #aiagent #ollama
AnythingLLM is an all-in-one AI application that supports multiple closed- and open-source AI providers, including OpenAI, Ollama, LM Studio, Groq, Text Generation WebUI, and Anthropic Claude.
Explore its robust capabilities, including seamless document processing, speech-to-text features, and built-in agent tools for web scraping, web browsing, chart generation, RAG memory, and summarization, all running autonomously and locally. Join us for a live demo and discover how to harness the full potential of this evolving AI tool!
🔗 Links
👉 Want to reach out? Join my Discord by clicking here - / discord
Patreon Exclusives - / thelocallab (1- Click Windows Installer)
Support the Channel - cash.app/$TheL...
Local Lab Twitter - / thelocallab_
Instagram - / thelocallabchannel
Business Email - info@locallabdigest.com
Github Repo - github.com/Min...
AnythingLLM website - anythingllm.co...
--------------------------------------------
V I D E O S T O W A T C H N E X T :
Free Text to Speech AI App - Local Install Tutorial - • Best Free Text to Spee...
How To Run Your Llama 3 1 Models With Open WebUI Web Search Locally - • How To Run Your Llama ...
Easy Open-WebUI + LM Studio Tutorial: Free & Local ChatGPT Alternative - • Easy Open-WebUI + LM S...
--------------------------------------------
#anythingllm
#ollama #lmstudio #LLM #textgenerationwebui #openai #AI #aiagent
Excellent video. It is very useful. Thanks for sharing it!
Can you show us the SQL Connector? Mine never succeeds, the AI just can't connect to the DB.
Using gpt-4o, it works successfully for me.
Thank you for the brief overview! Does it support image input?
Yes, it does, as long as you're using a vision model as your LLM.
@TheLocalLab Is it possible to export the code?
@KalaivaniAruchamy Code for what exactly?
@TheLocalLab If a code export option is there, it will be easy for non-coders to reuse the code for a similar use case. If an export option is there, we can export and use the same code to complete the task in a short period of time.
🔴Create A Local AI Voice Assistant With A Customizable Persona 👉 ruclips.net/video/PltEIx3LOLg/видео.html
👉 Want to reach out? Join my Discord by clicking here - discord.gg/5hmB4N4JFc
Can you search the web using a local LLM? It seems like you're not using a locally hosted LLM to search the web.
Well, the thing is, you can use local LLM models with the web search feature if you like, through projects like Ollama, LM Studio, or any local provider with an OpenAI-compatible API, but from my testing, models smaller than 70B are not that great at using the tools. I was using the Groq API in this video with the Llama 70B model, which can function call and work with these kinds of tools better.
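As a rough illustration of that setup (not AnythingLLM's internal code), here is a minimal Python sketch that calls a local OpenAI-compatible endpoint and offers it a tool definition, assuming the openai package and an Ollama server on its default port; the web_search tool schema and the model tag are hypothetical placeholders.

```python
# Minimal sketch: call a local OpenAI-compatible endpoint (e.g. Ollama)
# and offer it a single tool, the way an agent framework would.
from openai import OpenAI

# Ollama serves an OpenAI-compatible API at this address by default;
# the api_key value is ignored by Ollama but required by the client.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

# Hypothetical tool definition -- not AnythingLLM's actual schema.
tools = [{
    "type": "function",
    "function": {
        "name": "web_search",
        "description": "Search the web and return result snippets.",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}]

response = client.chat.completions.create(
    model="llama3.1:70b",  # placeholder tag; smaller models often handle tool calls poorly
    messages=[{"role": "user", "content": "Find today's AI news."}],
    tools=tools,
)

# If the model decided to call the tool, the call (name + JSON arguments) appears here.
print(response.choices[0].message.tool_calls)
```

Whether the model actually emits a usable tool call is exactly the capability gap mentioned above: larger, function-calling-tuned models do this far more reliably.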
You can integrate it with the Google Search API.
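For reference, the request behind a Google search integration is just a call to the Custom Search JSON API. Below is a rough sketch assuming the requests package; the API key and Programmable Search Engine ID are placeholders you would supply from your own Google account.

```python
# Rough sketch of a Google Programmable Search (Custom Search JSON API) query.
import requests

API_KEY = "YOUR_API_KEY"          # placeholder: Google API key
SEARCH_ENGINE_ID = "YOUR_CX_ID"   # placeholder: Programmable Search Engine ID

def google_search(query: str, num: int = 5) -> list[tuple[str, str]]:
    """Return (title, link) pairs for the top results."""
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": SEARCH_ENGINE_ID, "q": query, "num": num},
        timeout=30,
    )
    resp.raise_for_status()
    return [(item["title"], item["link"]) for item in resp.json().get("items", [])]

if __name__ == "__main__":
    for title, link in google_search("AnythingLLM agents"):
        print(title, "-", link)
```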