Transformers Agent - Is this Hugging Face's LangChain Competitor?

  • Published: 4 Nov 2024

Comments • 46

  • @kenfink9997
    @kenfink9997 A year ago +22

    Future video request - a simple demo of having two agents (ideally with and without OpenAI) talk to each other using agent tools (HF or Langchain). Essentially a super-scaled-down AutoGPT in Colab, to get a handle on how to manage multiple LLM instances that can iterate on each other. For example, have a chat agent use a StarCoder agent to write python code, have an agent test the code locally, have a chat agent decide if the output was correct, revise instructions and re-ask StarCoder for code. Or even just give two chat agents a simple goal and watch them talk to each other to advance toward that goal. Thanks!!!

    • @samwitteveenai
      @samwitteveenai  A year ago +2

      This is an interesting idea. I have been working on an AutoGPT for coding websites, but still have some issues. I don't think I've made a video of AutoGPT with LangChain yet, so I'll perhaps work your idea into one of those.

    • @Nuninecko
      @Nuninecko A year ago

      Maybe simpler with tools on top of LangChain, such as Flowise or LangFlow?

  • @soppliger
    @soppliger A year ago

    Great overview of the Transformers Agent. I actually found your video searching for 'transformer agent vs langchain' because I was working with the transformer agent class and thinking - wow, there seems to be a lot of overlap with langchain!

  • @georgesanchez8051
    @georgesanchez8051 A year ago +4

    Literally 45 seconds ago, I had the thought “I hope Sam makes a video on the new HuggingFace Transformers.” Open up YouTube and this is the first thing on my Recommended. Can’t be easy to be on top of things as much as you are. Thanks for all your work!

  • @micbab-vg2mu
    @micbab-vg2mu A year ago +1

    Thank you - great video as always.

  • @RobertJohnson-xg5kh
    @RobertJohnson-xg5kh A year ago +1

    Sam, add a "Thanks" button to your video pages so we can buy you a coffee or Colab credits or something.....Anyways, really enjoy your content and appreciate all your efforts.

    • @samwitteveenai
      @samwitteveenai  A year ago +1

      Thanks for the thought Robert, I still haven't gotten to turning on monetization.

  • @kenfink9997
    @kenfink9997 A year ago +5

    This is Great! It looks like all of the tests have been using OpenAI as the primary LLM. Could you possibly make a Colab replacing "Testing with OpenAI" with a "Testing WITHOUT OpenAI" section? Ideally running a model locally, but at least using SOME replacement service as an example. I'm really trying to use OpenAI as a last resort.

    • @samwitteveenai
      @samwitteveenai  A year ago +6

      You can just comment out the OpenAI cell and use the StarCoder cell or OpenAssistant cell.

    • @kenfink9997
      @kenfink9997 A year ago +1

      @@samwitteveenai Thanks! I now see that you're just doing three different cells and initializing all of them as "agent" so they're interchangeable in the rest of the Colab cells. This is great for flexibility!

    • @picklenickil
      @picklenickil A year ago +1

      Exactly, would love something like that... great content @OG
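
The "three cells, one name" pattern described in this thread can be sketched structurally as follows. Note this is a minimal stand-in sketch: `StubAgent` is a hypothetical placeholder for the agent classes the Colab actually constructs (OpenAI, StarCoder, OpenAssistant backends); only the naming pattern is the point.

```python
# Structural sketch of the Colab pattern: each backend "cell" binds its
# result to the same name `agent`, so the rest of the notebook never changes.
# StubAgent is a hypothetical stand-in for the real agent classes.

class StubAgent:
    """Placeholder for an agent backend (OpenAI, StarCoder, OpenAssistant, ...)."""
    def __init__(self, backend: str):
        self.backend = backend

    def run(self, task: str) -> str:
        # A real agent would generate and execute tool-calling code here.
        return f"[{self.backend}] would handle: {task}"

# "Cell 1" (paid OpenAI backend) -- commented out, as suggested above:
# agent = StubAgent("openai")

# "Cell 2" (free StarCoder backend) -- the one left active:
agent = StubAgent("starcoder")

# "Cell 3" (free OpenAssistant backend) -- also commented out:
# agent = StubAgent("openassistant")

# Every downstream cell talks only to `agent`, so swapping backends is a
# one-line change:
print(agent.run("Generate an image of a boat"))
```

Swapping to a non-OpenAI backend then just means uncommenting a different cell; none of the later cells need to change.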

  • @sany2k8
    @sany2k8 A year ago +2

    If I haven't misunderstood, it does the job more easily with Hugging Face Transformers compared to LangChain? So which one should we learn, LangChain or HF Transformers?

    • @samwitteveenai
      @samwitteveenai  A year ago +2

      It's doing part of what LangChain can do. It's still missing a lot of what LangChain has currently.

  • @ShawnFumo
    @ShawnFumo A year ago +1

    Out of curiosity, how do you feel this compares to HuggingGPT/JARVIS? I looked at the JARVIS repo quickly, and it seems like it may be a bit more rough around the edges. And in Transformers Agent, when it is downloading, is it downloading a model or downloading the library to access a particular API? You mentioned it just using an API, but maybe that was just for the language model?

  • @CrypticConsole
    @CrypticConsole A year ago +6

    Do you know the best alternatives to OpenAI for these kinds of CoT reasoning agents? I'm ideally looking for something open source. Ignoring actual hardware costs, what is the best model?

    • @qunlizhou3485
      @qunlizhou3485 A year ago +1

      I have the same question. Want to know which open-source LLM performs best when running the ReAct framework.

    • @Smytjf11
      @Smytjf11 A year ago +1

      Let me know if you find out! I'm on the hunt as well. My feeling is OpenAssistant will get there; I think any of the most recent OS chat models will work, provided they're large enough. I don't know about the huge-context-window ones, but I think they're still pretty small.
      You'll probably need to write an adapter for the prompt format, but they're hitting GPT-3, so the instructions must be pretty model-robust.
      Eventually we will probably want to use self-instruct to fine-tune something bespoke.

    • @samwitteveenai
      @samwitteveenai  A year ago +2

      The code models can be good at some of the CoT and ReAct stuff. Planning to do some more vids about all of this this week.

    • @samwitteveenai
      @samwitteveenai  A year ago +5

      For work we do exactly what you said, Jason: we fine-tune bespoke models for this task. I am looking at the possibility of releasing a model like that in the near future.

    • @qunlizhou3485
      @qunlizhou3485 A year ago +1

      @@samwitteveenai Looking forward to the video on this topic!

  • @rafaeldelrey9239
    @rafaeldelrey9239 A year ago +1

    This is promising, but in its current state it barely works with anything outside those specific examples. All I got was errors, exceptions, or things that just don't work. Even the Colab didn't work on the play-audio part.

  • @hassankhalil3923
    @hassankhalil3923 A year ago

    Great video. Just wanted to ask how you can use these agents on your own loaded documents? Just like in LangChain, where we store our text and embeddings in a vectorstore, create a chain, and then use that chain as a tool. How can we do that with Hugging Face agents and tools? By the way, I have a text file I want to load.

  • @nikitastaf1996
    @nikitastaf1996 A year ago +1

    Today I've been trying to run a LangChain agent on free Colab with AutoGPTQ, but ultimately failed. It tends to hallucinate very strange things. Maybe you would be interested; it's very easy to set up.

  • @hiranga
    @hiranga A year ago +1

    Super interesting. How do you rate the performance between this and LangChain for predictable/reliable orchestration of answering questions?

    • @samwitteveenai
      @samwitteveenai  A year ago +3

      It comes down to which backend model you choose more than HF or LangChain. With GPT-4 etc most of them work very well.

    • @hiranga
      @hiranga A year ago +1

      @@samwitteveenai Very interesting! Trying to keep the costs down at the same time is a challenge. I noted some Inference API models on Hugging Face disappear every now and then; Bloomz, for example.

  • @clray123
    @clray123 A year ago

    I will use the tools `vi` and `gcc` to rewrite myself to destroy the world?

  • @swannschilling474
    @swannschilling474 A year ago +2

    I had trouble running it offline! Any good tutorial on how to get started?

    • @samwitteveenai
      @samwitteveenai  A year ago +1

      It needs to be online to download the models and to use the big LLM, whether OpenAI, StarCoder, OpenAssistant, etc.

    • @swannschilling474
      @swannschilling474 A year ago +1

      @@samwitteveenai Sorry, I meant locally... and not in a Colab!

    • @samwitteveenai
      @samwitteveenai  A year ago +1

      @@swannschilling474 Do you mean setting up an environment etc?

    • @swannschilling474
      @swannschilling474 A year ago +1

      @@samwitteveenai Yes, when I tried to run it on my local machine it would not work at all... I only tried text-to-speech though! I mean, it's pretty much in development; maybe we still need to give it a bit of time?

    • @samwitteveenai
      @samwitteveenai  A year ago +1

      It worked fine on the Colab for most of the tasks. Do you have a recent version of Python, PyTorch, and Transformers installed?

  • @alp1234alp1234
    @alp1234alp1234 A year ago +2

    So my question is (a dumb one for sure 😢): how much is it going to cost to run this notebook using the OpenAI API key?

    • @alp1234alp1234
      @alp1234alp1234 A year ago +1

      And thank you for the up to date content! 🎉

    • @samwitteveenai
      @samwitteveenai  A year ago +1

      You can just use the free StarCoder or OpenAssistant models; then there is no cost at all.

  • @rautebuff09
    @rautebuff09 A year ago +2

    You have had an audio problem for some time. You should decrease the 'threshold' or increase the 'release' value of your gate plugin.

    • @samwitteveenai
      @samwitteveenai  A year ago

      Sorry about the audio; we are trying to fix it. It seems Descript has changed their auto denoiser. I am looking at some different software tomorrow to use for the next recording.

    • @rautebuff09
      @rautebuff09 A year ago

      @@samwitteveenai Also, I apologize for not being clear. The gate plugin is not particularly about noise; it's about closing the audio above a certain threshold with a specific release value. Your major problem is related to your gate plugin.

  • @andrewdunbar828
    @andrewdunbar828 A year ago

    good name

  • @saratbhargavachinni5544
    @saratbhargavachinni5544 A year ago

    More like AutoGPT

  • @emmanuelkolawole6720
    @emmanuelkolawole6720 A year ago +1

    We need a local cpp for the LLM "HuggingFaceH4/starchat-alpha". It is much better than any other open-source chat model at coding, and it does very well at chatting too. This is better than Vicuna.