Using Voiceflow with Mistral AI or Open Sourced AI Model

  • Published: Dec 11, 2024

Comments • 19

  • @rahulkrishnapagoti1706 · 9 months ago +2

    Hi Daniel! Appreciate your work. You are really providing a lot of value, and I'm always thankful for that.
    I have a request: many people who use Voiceflow are not from a coding background, so videos like this one are very useful.
    Could you make a similar video on the Hugging Face API? The API is free, so it could help a lot of people and many would find it interesting. Pick any model on Hugging Face that is similar to ChatGPT (I recommend mistralai/Mistral-7B-Instruct-v0.2) and show how to build a function on that model.
    Btw, thank you for the value, Daniel. Keep going.

  • @alexhuottt · 10 months ago +1

    Love these videos! I was wondering if there is a way to include "memory" in the function. Sometimes the VF knowledge base gets the answer and sometimes the "chunks" come back blank.

  • @joshbone9899 · 10 months ago +1

    Hey Daniel, great vid as always and nice to see an example of functions

    • @Voiceflow · 10 months ago

      Glad you liked it! Let us know what you build - we’d love to showcase it!

  • @FeanorCC · 8 months ago

    Thank you very much, that is very interesting. How exactly does it work if I want to use an API key directly from OpenAI instead of the tokens from Voiceflow?

  • @Voiceflow · 10 months ago

    This tutorial uses a brand new feature called functions!
    Learn more about functions here: ruclips.net/video/RohkPUw8EJ8/видео.html

  • @kelshaffei · 7 months ago

    Why would I want to use an open-source AI instead of the AI functions within Voiceflow itself (e.g. Set and Response)?

  • @Levicandoit · 10 months ago +1

    How does use of an external model count against token usage in Voiceflow? I’m guessing that, as with other features, the tokens only accumulate when an AI model in VF is used?

    • @Voiceflow · 10 months ago +1

      You only use AI tokens when you use the native AI models within Voiceflow (the tokens are just passed through via OpenAI or Anthropic). If you're using a 3rd party model like this, you're using tokens directly with the provider.

    • @Levicandoit · 10 months ago +1

      @Voiceflow Thanks for the quick response! You guys are the best

    • @DanielVoiceflow · 10 months ago

      Happy to help!

  • @xcalibills · 5 months ago

    Could you PLEASE make a tutorial on how to integrate Voiceflow with locally hosted LLMs?

    • @Voiceflow · 5 months ago

      You would do it the same way! You'll just need to expose an API endpoint for your locally hosted LLM (there's a rough sketch after this thread).

    • @xcalibills · 5 months ago

      @Voiceflow I figured it out; however, it would be nice to give us more control over the knowledge base function.
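
A minimal sketch of what that reply describes, assuming an OpenAI-compatible local server (for example Ollama's /v1 chat completions endpoint) and the Function return shape (next / outputVars / trace) described in Voiceflow's docs - the URL, model name, and the question/answer variables are placeholders, not anything shown in the video:

    // Hypothetical Voiceflow Function: forward the user's question to a
    // locally hosted, OpenAI-compatible chat completions endpoint.
    export default async function main(args) {
      const { question } = args.inputVars;

      // Voiceflow's runtime cannot reach your machine's localhost directly;
      // in practice this would be a tunnel (ngrok, Cloudflare Tunnel, ...)
      // or a server with a public URL.
      const LOCAL_LLM_URL = 'http://localhost:11434/v1/chat/completions';

      const response = await fetch(LOCAL_LLM_URL, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          model: 'mistral', // whichever model the local server is running
          messages: [{ role: 'user', content: question }],
        }),
      });

      if (!response.ok) {
        return { next: { path: 'error' }, outputVars: { answer: '' } };
      }

      const data = await response.json();
      const answer = data.choices[0].message.content;

      return {
        next: { path: 'success' },
        outputVars: { answer },
        trace: [{ type: 'text', payload: { message: answer } }],
      };
    }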

  • @marcometelli8455 · 10 months ago

    Great value! So to create an agent-like experience with Mistral models or any model we want, we can just take the memory object, save it in a variable, and put it in the content/instructions or in the question/prompt, am I right?
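
Roughly, yes - one hedged way to sketch that pattern, assuming the conversation history is kept in a JSON-string variable (here called memory) and sent to Mistral's chat completions API; the variable names, model name, and system prompt are placeholders to adapt to your own agent:

    // Hypothetical sketch: replay earlier turns as chat messages so the model
    // has "memory", then save the updated history back into a variable.
    export default async function main(args) {
      const { memory, question, mistralApiKey } = args.inputVars;

      // memory is assumed to hold a JSON array of { role, content } turns.
      const history = memory ? JSON.parse(memory) : [];

      const messages = [
        { role: 'system', content: 'You are a helpful assistant.' }, // instructions
        ...history,
        { role: 'user', content: question },
      ];

      const response = await fetch('https://api.mistral.ai/v1/chat/completions', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          Authorization: `Bearer ${mistralApiKey}`,
        },
        body: JSON.stringify({ model: 'open-mistral-7b', messages }),
      });

      const data = await response.json();
      const answer = data.choices[0].message.content;

      // Append this turn so the next call sees it.
      history.push({ role: 'user', content: question });
      history.push({ role: 'assistant', content: answer });

      return {
        next: { path: 'success' },
        outputVars: { answer, memory: JSON.stringify(history) },
      };
    }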

  • @EasyAINow · 9 months ago

    Good video!

  • @RoniBliss · 8 months ago

    My bots are light speed!