Build an LLM-powered Chrome Extension 🔥

  • Published: 17 Nov 2024

Comments • 16

  • @seththunder2077 · 8 months ago · +1

    Would be very cool to see you continue this by implementing RAG, along with Stripe so we could try to sell it. It's quite unique — pretty much no one has done that yet — and that kind of content could come from someone like you, with a solid background in Gen AI and the rest of the tech stack.

  • @muhammedajmalg6426 · 8 months ago · +2

    Thanks for sharing. I have a special use case, a "form auto-filler": the extension should be able to fill out forms automatically.
    Some useful points:
    * store user data in a vector database
    * when the data is needed, pull it from the database
    * make some changes and fill in the form
    Can you help me do it?
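The steps listed in the comment above can be sketched minimally. This is an illustrative JavaScript sketch, not something shown in the video: `embed()` is a toy stand-in (a real build would call an embedding model or API), the "vector database" is a plain in-memory array, and the `lookup` helper and sample profile fields are hypothetical names chosen for the example.

```javascript
// Sketch of the retrieval step for a form auto-filler:
// stored user-profile fields are embedded as vectors, and a detected
// form field's label is matched against them by cosine similarity.

function embed(text) {
  // Toy character-frequency "embedding", used only for this sketch.
  const v = new Array(26).fill(0);
  for (const ch of text.toLowerCase()) {
    const i = ch.charCodeAt(0) - 97;
    if (i >= 0 && i < 26) v[i] += 1;
  }
  return v;
}

function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// "Vector database": stored user data with precomputed embeddings.
const store = [
  { key: "full name", value: "Jane Doe" },
  { key: "email address", value: "jane@example.com" },
  { key: "phone number", value: "+1 555 0100" },
].map((e) => ({ ...e, vec: embed(e.key) }));

// Given a form field's label, pull the closest stored entry;
// a content script would then write it into the field's value.
function lookup(label) {
  const q = embed(label);
  let best = store[0];
  for (const e of store) {
    if (cosine(q, e.vec) > cosine(q, best.vec)) best = e;
  }
  return best.value;
}
```

In a real extension, a content script would scan the page for `<input>` labels, call `lookup` for each, and set the field values; the store itself would live in `chrome.storage` or a proper embedded vector index rather than a hard-coded array.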

  • @danush_idk2493 · 8 months ago · +1

    Can you please make more extension videos with a free LLM?

  • @manojnagabandi9779 · 8 months ago · +2

    Thank you for the amazing video :) I am coding along with you. Nice T-shirt! 👍

    • @AIAnytime · 8 months ago

      Happy to hear that!

  • @jitheshdsouza98 · 23 days ago

    Is it possible to write a Chrome extension that runs an AI model without the backend?

  • @SonGoku-pc7jl · 8 months ago · +1

    Great tutorial! But if possible, could you make another one with Ollama and a different function, such as summarizing or chatting with GitHub? I don't know how Ollama models can interact with the web. Thanks!

    • @AIAnytime · 8 months ago

      Great suggestion!

  • @m4tthias · 8 months ago

    How can we verify whether the LLM response is accurate when checking if a site is a phishing site? Would there be response bias, given that most of the sites/examples used for this use case are not phishing sites? Thanks. Great video as always.

    • @m4tthias · 8 months ago

      I tried using two local models. TinyLlama responded No, but the response became Yes after I logged into the OpenAI website and Groq (actual examples). Llama2 responded No to both. In short, the response is not definitive and depends on model quality. Just sharing. 😄

    • @danush_idk2493 · 8 months ago

      Is it possible to try with any other LLM model? Could you give me a reference, please? @@m4tthias

  • @user4-j1w · 8 months ago

    Thank you sir

  • @AngelWhite007 · 8 months ago

    Please make a video on creating an LLM-powered mobile app.

  • @ablaze.2238 · 8 months ago

    Bro, I need some help, please.

  • @ablaze.2238 · 8 months ago

    Please reply 😢