Build a Perplexity-Inspired Answer Engine Using Groq, Mixtral, Langchain, Brave & OpenAI in 10 Min

  • Published: Sep 4, 2024
  • In this video, I walk you through the step-by-step process of creating your own answer engine, much like Perplexity, but built with cutting-edge technologies including Groq, Mistral AI's Mixtral 8x7B, LangChain, OpenAI Embeddings, and the Brave Search API. This tutorial is designed for those interested in implementing such a system within a JavaScript or Node.js framework. I show you how to configure the engine to deliver not just answers but also sources and potential follow-up questions in response to queries.
    The journey begins with the initial setup of the project, where I guide you through managing API keys for OpenAI, Groq, and the Brave Search API. From there, we move on to initializing an Express server to handle incoming requests. I place a strong emphasis on the speed of the inference pipeline and share insights on optimizing its components: the embeddings model, how search engine requests are handled, the text-chunking strategy, and query processing.
    As we progress, I demonstrate how to curate the response content, introduce streaming for more dynamic answers, and automate the generation of insightful follow-up questions. The tutorial rounds off with the final touches needed to get the server up and running smoothly. (A rough code sketch of this flow follows at the end of this description.)
    For those eager to dive in and start experimenting on your own, I'll be providing a link to download the entire repository from the video description soon.
    This is your chance to get hands-on experience and truly understand the ins and outs of building an advanced answer engine. And if you find this video helpful, don't forget to support the channel by subscribing and sharing it with others who might benefit from this tutorial. Stay tuned for more updates and happy coding!
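
    Before diving into the video, here is a minimal sketch of the pipeline the description outlines: an Express route that queries the Brave Search API, chunks and embeds the results with LangChain and OpenAI embeddings, and asks Mixtral on Groq to answer from the most relevant chunks. This is an illustrative reconstruction, not the repository's actual code: it assumes Node 18+ (for the global fetch), Groq's OpenAI-compatible endpoint, and LangChain's JS packages, and the route path, model id, and chunk sizes are placeholder choices. Streaming and follow-up-question generation are omitted here for brevity.

    ```js
    // Sketch of the answer-engine flow: Express route -> Brave search ->
    // chunk the source snippets -> OpenAI embeddings -> pick relevant chunks -> Groq answer.
    import 'dotenv/config';
    import express from 'express';
    import OpenAI from 'openai';
    import { RecursiveCharacterTextSplitter } from 'langchain/text_splitter';
    import { OpenAIEmbeddings } from '@langchain/openai';
    import { MemoryVectorStore } from 'langchain/vectorstores/memory';

    // Groq exposes an OpenAI-compatible API, so the OpenAI SDK can talk to it.
    const groq = new OpenAI({
      apiKey: process.env.GROQ_API_KEY,
      baseURL: 'https://api.groq.com/openai/v1',
    });

    const app = express();
    app.use(express.json());

    // Query Brave Search and keep the top results as sources.
    async function braveSearch(query) {
      const res = await fetch(
        `https://api.search.brave.com/res/v1/web/search?q=${encodeURIComponent(query)}&count=5`,
        { headers: { 'X-Subscription-Token': process.env.BRAVE_SEARCH_API_KEY } }
      );
      const data = await res.json();
      return (data.web?.results ?? []).map((r) => ({
        title: r.title,
        url: r.url,
        snippet: r.description,
      }));
    }

    app.post('/', async (req, res) => {
      const { message } = req.body;

      // 1. Search the web and collect sources.
      const sources = await braveSearch(message);

      // 2. Chunk the source text and embed it into an in-memory vector store.
      const splitter = new RecursiveCharacterTextSplitter({ chunkSize: 400, chunkOverlap: 40 });
      const docs = await splitter.createDocuments(sources.map((s) => `${s.title}\n${s.snippet}`));
      const store = await MemoryVectorStore.fromDocuments(docs, new OpenAIEmbeddings());

      // 3. Pull the chunks most similar to the question.
      const context = (await store.similaritySearch(message, 4))
        .map((d) => d.pageContent)
        .join('\n\n');

      // 4. Ask Mixtral on Groq to answer from that context.
      const completion = await groq.chat.completions.create({
        model: 'mixtral-8x7b-32768',
        messages: [
          { role: 'system', content: 'Answer using only the provided context. Be concise.' },
          { role: 'user', content: `Context:\n${context}\n\nQuestion: ${message}` },
        ],
      });

      res.json({ answer: completion.choices[0].message.content, sources });
    });

    app.listen(3000, () => console.log('Answer engine listening on http://localhost:3000'));
    ```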

Comments • 37

  • @DevelopersDigest
    @DevelopersDigest  6 months ago +15

    Repo is now live: github.com/developersdigest/llm-answer-engine! 🤗

  • @ToddDunning
    @ToddDunning 6 months ago +4

    Thanks a ton for this, I hope you get to 100k subs soon. Looking forward to your repo on this one.

    • @DevelopersDigest
      @DevelopersDigest  6 months ago

      That is kind of you to say - thank you so much, Todd!

    • @DevelopersDigest
      @DevelopersDigest  6 months ago +1

      Repo is now live: github.com/developersdigest/llm-answer-engine

  • @Cyberspider76
    @Cyberspider76 6 months ago +1

    Thank you so much for putting this out! I am a Node dev by day, and this helps accelerate learning about LLMs and all the killer tech around them right now!
    Not to mention I have been wanting to play with Groq, so win-win all around! Wicked cool content, thanks again!

    • @DevelopersDigest
      @DevelopersDigest  6 months ago

      Thank you so much for your comment - I love hearing that!

  • @user-bz5wt1nn5c
    @user-bz5wt1nn5c 3 months ago +1

    Extremely impressive! Thank you!

  • @StephenHarbin-sg9yr
    @StephenHarbin-sg9yr 2 months ago +2

    Nice

  • @lucasastorian1
    @lucasastorian1 5 months ago +1

    Really excellent demo!

  • @prashlovessamosa
    @prashlovessamosa 6 months ago +1

    Hey bud, thanks for sharing.

  • @shinchima
    @shinchima 4 months ago

    DD, another good video. Out of curiosity, is the description LLM-generated? It's not often that people pay attention to proper prose in their descriptions.

  • @FaysalAhmed515
    @FaysalAhmed515 6 months ago +1

    I think it makes more sense when you get the streaming in the response as well. Otherwise the client doesn't get any benefit from the streaming.

    • @DevelopersDigest
      @DevelopersDigest  6 months ago

      Absolutely - I will have more full-stack examples with exactly this coming soon... Stay tuned
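
      For readers wondering what that looks like on the server side, a rough sketch of piping Groq's streamed tokens straight through an Express response is below, reusing the app and groq client from the sketch under the video description. The route name and model id are illustrative assumptions, not the repository's code.

      ```js
      // Stream the model's tokens to the HTTP client as they arrive,
      // instead of waiting for the full completion before responding.
      app.post('/stream', async (req, res) => {
        res.setHeader('Content-Type', 'text/plain; charset=utf-8');

        const stream = await groq.chat.completions.create({
          model: 'mixtral-8x7b-32768',
          messages: [{ role: 'user', content: req.body.message }],
          stream: true,
        });

        for await (const chunk of stream) {
          res.write(chunk.choices[0]?.delta?.content ?? '');
        }
        res.end();
      });
      ```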

  • @carldomond166
    @carldomond166 5 months ago

    Great video, learned so much! I had a question concerning the answer portion of the app. Is Groq only referencing sources, or is it able to access those sources and produce an enhanced response? It seems that, currently, the sources are links that Groq can only present to the user.

  • @AIEntusiast_
    @AIEntusiast_ 5 months ago +1

    Can this also accept uploading files? That would be a great feature.

  • @square_and_compass
    @square_and_compass 5 months ago +1

    Thank you Sir!

  • @DanielLevy-cl6ht
    @DanielLevy-cl6ht 6 months ago +3

    Thanks. Any chance to get the promised repo?

    • @DevelopersDigest
      @DevelopersDigest  6 months ago

      Yes! Repo is now live: github.com/developersdigest/llm-answer-engine :)

  • @webdancer
    @webdancer 5 months ago +1

    Wonderful, thanks.
    What will be the best frontend for this?

    • @DevelopersDigest
      @DevelopersDigest  5 months ago +2

      Building an example now. Should be out this week. Stay tuned! :)

    • @webdancer
      @webdancer 5 months ago

      @@DevelopersDigest That'll be great. Thanks.

    • @AIEntusiast_
      @AIEntusiast_ 5 months ago

      @@DevelopersDigest A complete guide to also running this live would be awesome. Your content is fantastic for a newbie; I'm learning a ton.

    • @AIEntusiast_
      @AIEntusiast_ 5 months ago

      Any update on this?

    • @DevelopersDigest
      @DevelopersDigest  5 months ago +2

      @@AIEntusiast_ It's taking a little bit longer than I expected... hoping to have it ready and recorded this week!

  • @AIEntusiast_
    @AIEntusiast_ 6 months ago +1

    How do you get the POST to localhost and everything else running there?

    • @DevelopersDigest
      @DevelopersDigest  6 months ago +1

      I use Thunder Client as a plugin for VS Code for my POST requests when testing endpoints; something like Postman can be used as well!
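
      For anyone without Thunder Client or Postman handy, the same kind of test request can also be sent from Node 18+ itself, which ships a global fetch. The port, route, and body field below are assumptions matching the sketch under the video description, not necessarily what the repository expects.

      ```js
      // Hypothetical test POST against the local answer-engine endpoint.
      // Run with Node 18+ as an ES module (e.g. node test.mjs) so top-level await works.
      const res = await fetch('http://localhost:3000/', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ message: 'What is Groq, and why is it fast?' }),
      });
      console.log(await res.json()); // expects { answer, sources }
      ```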

  • @casekingz2273
    @casekingz2273 6 months ago +1

    When will you drop the code?

    • @DevelopersDigest
      @DevelopersDigest  6 months ago

      Repo is now live: github.com/developersdigest/llm-answer-engine :)

  • @user-oi8ls4uh5b
    @user-oi8ls4uh5b 6 months ago +1

    Can you provide the code please?

    • @DevelopersDigest
      @DevelopersDigest  6 months ago

      Repo is now live: github.com/developersdigest/llm-answer-engine :)

    • @user-oi8ls4uh5b
      @user-oi8ls4uh5b 6 months ago

      @@DevelopersDigest thank you!