Hi Daniel! Appreciate your work. You are really providing a lot of value, and I'm always thankful for that.
I have a request: many people who use Voiceflow don't come from a coding background, so videos like this one are very useful.
Could you make a similar video on the Hugging Face API? Since that API is free, it could help a lot of people, and many would find it interesting. Pick any model on Hugging Face that is similar to ChatGPT (I recommend mistralai/Mistral-7B-Instruct-v0.2) and show how to build a function around it.
Btw, thank you for the value, Daniel. Keep going.
Love these videos! I was wondering if there is a way to include "memory" in the function. Sometimes VF knowledge gets the answer, and sometimes the "chunks" come back blank.
Hey Daniel, great vid as always and nice to see an example of functions
Glad you liked it! Let us know what you build - we'd love to showcase it!
Thank you very much, that is very interesting. How exactly does it work if I want to use an API key directly from OpenAI instead of the tokens from Voiceflow?
This tutorial uses a brand new feature called functions!
Learn more about functions here: ruclips.net/video/RohkPUw8EJ8/видео.html
Why would I want to use an open-source AI instead of the AI functions within Voiceflow itself (e.g. Set and Response)?
How does use of an outsourced model count against token usage in VoiceFlow? I’m guessing like with other features the tokens only accumulate when an AI model in VF is used?
You only use AI tokens when you use the native AI models within Voiceflow (the tokens are just passed through via OpenAI or Anthropic). If you're using a 3rd-party model like this, you're paying for tokens directly with the provider.
@@Voiceflow Thanks for the quick response! You guys are the best
Happy to help!
Could you PLEASE make a tutorial on how to integrate Voiceflow with locally hosted LLMs?
You would do it the same way! You'll just need to expose an API endpoint for your locally hosted LLM.
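For anyone wondering what that endpoint might look like: tools such as Ollama and llama.cpp's server expose an OpenAI-compatible chat completions endpoint, so the body of a Voiceflow function could be sketched roughly like this. The localhost URL, port, and model name below are assumptions based on a default Ollama setup, not anything from the video:

```javascript
// Build the request body for an OpenAI-compatible chat endpoint.
// Kept as a pure helper so it can be checked without a running server.
function buildChatBody(model, question) {
  return {
    model, // e.g. "mistral" pulled into Ollama (assumed model name)
    messages: [{ role: "user", content: question }],
    stream: false,
  };
}

// Call the locally hosted LLM. Ollama's default OpenAI-compatible
// endpoint is assumed to be http://localhost:11434/v1/chat/completions.
async function askLocalLLM(question) {
  const res = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatBody("mistral", question)),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Inside a Voiceflow function you would then await `askLocalLLM(...)` and write the result to an output variable, the same as with a hosted provider.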
@@Voiceflow I figured it out; however, it would be nice to have more control over the knowledge base function.
Great value! So to create an agent-like experience with Mistral models, or any model we want, we can just take the memory object, save it to a variable, and put it in the content/instructions or in the question/prompt, am I right?
Yup!
Good video!
My bots are light speed!