Flowise/Langflow: How to build a no-code PDF chatbot with LangChain.

  • Published: 12 Sep 2024
  • • Flowise: How to chat w... This video walks through the Flowise example used in the video above. Please feel free to check it out.
    Thank you for watching. Hope you have a nice day!
    If you like this video, please give it a like 👍 or subscribe.
    A simple example to explain the code behind Flowise and Langflow (a rough LangChain sketch follows at the end of this description).
    Code in this video
    GitHub: TechWithRay
    Repo: demystify_flowise_langflow
    github.com/Tec...
    Please check the code on GitHub; RUclips does not display the full link above. Thanks!!
    ==============
    About me:
    Welcome to my RUclips channel! As an AI engineer working at a tech giant, I am excited to share my knowledge and insights with you. Join me as we delve into the fascinating world of artificial intelligence, machine learning, and everything related to cutting-edge technology.
    On this channel, you can expect in-depth tutorials, demonstrations, and discussions about various AI topics. From overviewing popular AI packages and frameworks to diving into the latest research papers, I aim to provide valuable content that helps you stay updated and empowered in the fast-paced AI industry.
    Make sure to hit the subscribe button so you won't miss any of my upcoming videos.
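
    Below is a minimal, hedged sketch of the kind of LangChain code a Flowise/Langflow PDF-chatbot flow typically maps to: load a PDF, split it, embed the chunks, index them, and answer over the index. It is not the repo's code; the file name and model choices are placeholders, and LangChain's module layout changes between versions, so check the GitHub repo above for the actual example.
    ```python
    # A rough sketch (not the repo's code) of the LangChain steps behind a PDF-QA flow.
    # "example.pdf" is a placeholder path.
    # Assumes: pip install langchain langchain-openai langchain-community pypdf faiss-cpu
    from langchain_community.document_loaders import PyPDFLoader
    from langchain_text_splitters import RecursiveCharacterTextSplitter
    from langchain_openai import OpenAIEmbeddings, ChatOpenAI
    from langchain_community.vectorstores import FAISS
    from langchain.chains import RetrievalQA

    # 1. Load the PDF and split it into overlapping chunks (Document Loader + Text Splitter nodes).
    docs = PyPDFLoader("example.pdf").load()
    chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(docs)

    # 2. Embed the chunks and index them in a vector store (Embeddings + Vector Store nodes).
    vectorstore = FAISS.from_documents(chunks, OpenAIEmbeddings())

    # 3. Answer questions over the index (Retrieval QA Chain node).
    qa = RetrievalQA.from_chain_type(
        llm=ChatOpenAI(temperature=0),
        retriever=vectorstore.as_retriever(search_kwargs={"k": 4}),
    )
    print(qa.invoke({"query": "What is this PDF about?"})["result"])
    ```
    Each numbered step roughly corresponds to a node you would drag onto the Flowise or Langflow canvas.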

Comments • 14

  • @alexanderroodt5052 • A year ago

    Thanks for sharing!

  • @BrainHealthcare • A year ago

    Thanks!!

  • @lilylau3918 • A year ago • +1

    Very helpful. Thank you, please keep posting!!!

  • @jaden9401 • A year ago

    Thank you!! This is very informative. Please make more content like this.

    • @techwithray8943 • A year ago

      Thank you so much for your support!!! Will keep working on good content!

  • @liuchuchenliu2802 • A year ago

    It's great. If I want Flowise to partly use the information in the LangChain database and partly use real-time information from the SERP, how can I achieve this?

    • @techwithray8943 • A year ago

      Thank you so much for watching! Do you mean you want to search for info from your database and the SERP, then send all the relevant content to the LLM (OpenAI)?

    • @liuchuchenliu2802 • A year ago

      @techwithray8943 That's roughly what I mean. A lot of the knowledge in OpenAI is outdated or incorrect, so it needs to be re-aligned with the local database. If you only use the SERP to search, you will also get errors. For specific problems, using a dedicated database first and then the LLM would be a better choice.
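
      A minimal sketch of that "dedicated database first, then SERP, then LLM" idea in plain LangChain (not a Flowise export). The index path is a placeholder, SerpAPIWrapper needs a SERPAPI_API_KEY and the google-search-results package, and module paths depend on the installed LangChain version.
      ```python
      # Hedged sketch: local vector DB first, live SERP second, then both contexts to the LLM.
      from langchain_community.vectorstores import FAISS
      from langchain_community.utilities import SerpAPIWrapper
      from langchain_openai import OpenAIEmbeddings, ChatOpenAI

      # "my_index" is a placeholder for a FAISS index you built earlier.
      vectorstore = FAISS.load_local("my_index", OpenAIEmbeddings(),
                                     allow_dangerous_deserialization=True)
      serp = SerpAPIWrapper()   # reads SERPAPI_API_KEY from the environment
      llm = ChatOpenAI(temperature=0)

      def answer(question: str) -> str:
          # 1. Dedicated database first: pull the most relevant local chunks.
          local_context = "\n".join(
              d.page_content for d in vectorstore.similarity_search(question, k=4)
          )
          # 2. SERP for real-time information the local DB (and the model) may lack.
          live_context = serp.run(question)
          # 3. Send both to the LLM, preferring the local database on conflicts.
          prompt = (
              "Answer using the context below. Prefer LOCAL DATABASE facts over "
              "SEARCH RESULTS when they disagree.\n\n"
              f"LOCAL DATABASE:\n{local_context}\n\n"
              f"SEARCH RESULTS:\n{live_context}\n\n"
              f"Question: {question}"
          )
          return llm.invoke(prompt).content

      print(answer("What changed in our pricing this week?"))
      ```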

    • @mordokai597 • A year ago

      @liuchuchenliu2802 ***I spent $9 in OpenAI API calls building a custom DB yesterday - doing it this way through Flowise is not optimal.*** Add "conversation summary memory" to have persistent memory between queries, plus an in-memory vector store, a vector store retriever, Hugging Face embeddings, a local-file-system FAISS vector retriever, and a FAISS vector store to write your DB to local disk... I think that's it. Use the text file and text splitter modules if you want to give it guidance on the stored vector embeddings, and have a really long conversation or prompt chain that stays on the topic you would like it to "learn" (add a vector store retrieval prompt template and tell it it is obsessed with a certain topic to laser-focus on specific data)... When you want to implement it, use a FAISS retriever, a context memory store of some kind, a conversational QA chain, and a vector DB QA chain...
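
      For reference, a rough LangChain sketch of the setup described above: Hugging Face embeddings (so indexing costs no API calls), a FAISS index written to local disk, conversation summary memory, and a conversational QA chain. Class locations, file paths, and the embedding model are assumptions that vary by LangChain version.
      ```python
      # Hedged sketch: text file -> splitter -> HF embeddings -> FAISS on disk -> chat with memory.
      # Assumes: pip install langchain langchain-community langchain-openai sentence-transformers faiss-cpu
      from langchain_community.document_loaders import TextLoader
      from langchain_text_splitters import RecursiveCharacterTextSplitter
      from langchain_community.embeddings import HuggingFaceEmbeddings
      from langchain_community.vectorstores import FAISS
      from langchain_openai import ChatOpenAI
      from langchain.memory import ConversationSummaryMemory
      from langchain.chains import ConversationalRetrievalChain

      # Build once: split a local text file and embed it for free with a Hugging Face model.
      docs = TextLoader("notes.txt").load()                      # placeholder source file
      chunks = RecursiveCharacterTextSplitter(chunk_size=800, chunk_overlap=80).split_documents(docs)
      embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
      FAISS.from_documents(chunks, embeddings).save_local("faiss_index")   # persist the DB to disk

      # At chat time: reload the index and converse over it with summarized history.
      db = FAISS.load_local("faiss_index", embeddings, allow_dangerous_deserialization=True)
      llm = ChatOpenAI(temperature=0)
      memory = ConversationSummaryMemory(llm=llm, memory_key="chat_history", return_messages=True)
      chat = ConversationalRetrievalChain.from_llm(llm, retriever=db.as_retriever(), memory=memory)

      print(chat.invoke({"question": "What do the notes say about FAISS?"})["answer"])
      ```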

  • @user-oh2uc9sd1c • A year ago

    Thank you for the great content! Can you please tell me how I can use free LLMs without OpenAI? Thank you.

    • @techwithray8943 • A year ago • +1

      Thank you so much for watching. I will record another video on how to use LLMs from Hugging Face to replace OpenAI. Stay tuned. Thanks.
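
      For reference, one hedged way to do this in LangChain today: run a free Hugging Face model locally through HuggingFacePipeline and drop it in wherever ChatOpenAI was used. The model id below is only a small CPU-friendly example, not a recommendation from the video.
      ```python
      # Hedged sketch: a free, locally-run Hugging Face model as a drop-in LLM.
      # Assumes: pip install langchain-community transformers torch
      from langchain_community.llms import HuggingFacePipeline

      llm = HuggingFacePipeline.from_model_id(
          model_id="google/flan-t5-base",            # small, free, CPU-friendly example
          task="text2text-generation",
          pipeline_kwargs={"max_new_tokens": 256},
      )

      # Use it anywhere an OpenAI LLM was used before, e.g.
      # RetrievalQA.from_chain_type(llm=llm, retriever=vectorstore.as_retriever())
      print(llm.invoke("In one sentence, what is a vector store?"))
      ```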

    • @mordokai597 • A year ago

      @techwithray8943 I'd prefer to be able to use oobabooga/textgen as my local API, but I'd settle for anything that lets me run a local Flowise pipeline using a WizardLM GPTQ model (preferably WizardLM-Uncensored-Falcon-7B-GPTQ, but again, I'll take anything that works)... PLEASE GOD HELP! xD
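
      For what it's worth, a hedged sketch: recent text-generation-webui (oobabooga) builds can expose an OpenAI-compatible endpoint when launched with the --api flag, so one option is to point an OpenAI-style client at it from LangChain. The URL, port, and model name below are assumptions that depend on the webui version and launch flags.
      ```python
      # Hedged sketch: talk to a local oobabooga/text-generation-webui instance through
      # its OpenAI-compatible API. Endpoint and model name are assumptions, not guarantees.
      from langchain_openai import ChatOpenAI

      local_llm = ChatOpenAI(
          base_url="http://127.0.0.1:5000/v1",            # assumed local endpoint (--api flag)
          api_key="not-needed",                           # local servers usually ignore the key
          model="WizardLM-Uncensored-Falcon-7B-GPTQ",     # whatever model the webui has loaded
          temperature=0.7,
      )

      print(local_llm.invoke("Say hello from the local model.").content)
      ```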