Huggingface.js: Step-by-Step Guide to Getting Started

  • Published: Nov 3, 2024

Comments • 67

  • @williamturner6876
    @williamturner6876 1 year ago +6

    This was absolutely excellent. Your style is simple and clear. Don't stop, you're a rare breed.

    • @DevelopersDigest
      @DevelopersDigest  1 year ago

      Thank you William, that means a lot!

    • @sparshjain3724
      @sparshjain3724 3 months ago

      @@DevelopersDigest Hello, can you explain why we had to load the image separately instead of just using a URL for it, like giving src to an img tag?

  • @ryanroman6589
    @ryanroman6589 1 year ago +5

    Keep pumping out these incredible videos my man!! Thank you!

  • @dannygamer9995
    @dannygamer9995 4 months ago +1

    Excellent and straight to the point, thank you.

  • @قناة_لحظة_إدراك
    @قناة_لحظة_إدراك 1 year ago +1

    Do you have a video that explains how to add ready-made templates to my WordPress site, for example?

    • @DevelopersDigest
      @DevelopersDigest  1 year ago

      I don’t, but I can absolutely build some WordPress examples. Stay tuned!

  • @vladyslav-py-js-cs
    @vladyslav-py-js-cs 28 days ago

    Thanks ! Great explanation!😁

  • @millermcdonell291
    @millermcdonell291 9 months ago +1

    short and simple, luv it

  • @sw-ln1hh
    @sw-ln1hh 11 months ago +1

    thank you for your JS AI content and hands-on programming videos

  • @JohnyIIOh
    @JohnyIIOh 1 year ago +2

    Nice, exactly what I needed, gonna watch it next w/e

  • @animeshbhatt3383
    @animeshbhatt3383 1 month ago

    What is the advantage of using the interface as against pipeline?

  • @TobiasMunger
    @TobiasMunger 1 year ago +1

    Thanks for the video! For scalability, I would need to deploy a Hugging Face inference API from a model of my choosing, right? I think in this case, it's using free resources for playgrounding?

  • @priyacrypto
    @priyacrypto 1 year ago +2

    Great video to kick off with HFJS. Can you do one for using MPT-7B with Node.js?

    • @DevelopersDigest
      @DevelopersDigest  1 year ago +2

      That’s a great idea, let me dig into this model. It sounds pretty compelling !

  • @Yangmeista
    @Yangmeista 3 days ago +1

    helpful! thank you

    • @DevelopersDigest
      @DevelopersDigest  3 days ago +1

      Thank you for watching!

    • @Yangmeista
      @Yangmeista 2 days ago

      @@DevelopersDigest :) Are you using groq for your projects? How is your experience with it so far?

  • @hmtbt4122
    @hmtbt4122 1 year ago +2

    Thanks for sharing this. 😊

  • @roiborromeo7921
    @roiborromeo7921 6 months ago +1

    This is definitely helpful.

  • @animeEnjoyer2002
    @animeEnjoyer2002 2 months ago

    help!!!
    transformer.js error SyntaxError: Unexpected token '

  • @elleryfamilia8291
    @elleryfamilia8291 1 year ago +1

    What's unclear to me at the moment is: is the HF library just downloading the models at runtime and running them locally? Are all the models already embedded in the HF library? If I use an LLM from HF, does it download and run it locally? Does HF provide any sort of hosting, or is it just a model repo?

    • @DevelopersDigest
      @DevelopersDigest  1 year ago +1

      1. You can set up Hugging Face to download models and run them locally. If you're interested in doing that with JS, check out transformers.js on HF.
      2. Not all models are embedded in the HF library; it's mostly open-source models with varying permissions on how you can use them in your applications. For example, if you want to build a commercial app, make sure to check the license first.
      3. If you use an LLM from HF, you can choose to download and run it locally if you have the hardware for the selected model, or deploy it in the cloud, which in some cases may be your only option. With that said, some models are available for inference through an API, which is what I demonstrate here with Huggingface.js. You can also choose to host a model from Hugging Face with billing directly through HF; those models are hosted on cloud infrastructure like AWS, Azure, or GCP. huggingface.co/inference-endpoints
      I hope that helps!

    • @saarza9991
      @saarza9991 28 days ago

      It's running a cloud hosted instance of the models. But it's still as fast as if you'd run it on your server, maybe even faster depending on your GPU
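
To make the hosted-inference option described in this thread concrete, here is a minimal sketch in Node 18+ (which ships a global fetch) that calls the hosted Inference API over plain HTTP. The model name (gpt2) and the HF_TOKEN environment variable are illustrative assumptions, not prescribed by the thread:

```javascript
// Minimal sketch of calling the hosted Hugging Face Inference API from Node 18+.
// Assumptions: HF_TOKEN env var holds an API token; "gpt2" stands in for
// whatever model you actually want to query.
const HF_API = "https://api-inference.huggingface.co/models";

// Build the request for a text-generation call (pure, so it is easy to test).
function buildRequest(model, inputs, token) {
  return {
    url: `${HF_API}/${model}`,
    options: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${token}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ inputs }),
    },
  };
}

async function generate(model, inputs) {
  const { url, options } = buildRequest(model, inputs, process.env.HF_TOKEN);
  const res = await fetch(url, options);
  if (!res.ok) throw new Error(`HF API error: ${res.status}`);
  return res.json();
}

// Only hit the network when a token is actually configured.
if (process.env.HF_TOKEN) {
  generate("gpt2", "Hello, world").then(console.log).catch(console.error);
}
```

The official @huggingface/inference client wraps the same endpoint; this raw-fetch version just makes the request shape visible.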

  • @froggy472
    @froggy472 6 months ago

    Thanks for such an amazing video, I love it.
    Have you faced this issue before?
    `Error: Authorization header is correct, but the token seems invalid`
    Do I need to register the billing information first?

    • @froggy472
      @froggy472 6 months ago

      This is the full error:
      file:///Users/usr/Documents/hugging_face/node_modules/@huggingface/inference/dist/index.js:169
      throw new Error(output.error);
      ^
      Error: unknown error
      at request (file:///Users/froggy/Documents/hugging_face/node_modules/@huggingface/inference/dist/index.js:169:15)
      at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
      at async textGeneration (file:///Users/froggy/Documents/hugging_face/node_modules/@huggingface/inference/dist/index.js:655:15)
      at async file:///Users/froggy/Documents/hugging_face/huggingface.js:16:1

  • @karamjittech
    @karamjittech 1 year ago

    Great video. How would you train the model for a few specific tasks? By using the API, they will also charge... yes?

    • @DevelopersDigest
      @DevelopersDigest  1 year ago

      There are free tiers as well as paid tiers for using their inference APIs. Training a model would take a bit longer to explain than a comment block allows. I will be making a video on training models in future videos! Stay tuned

  • @sudoPrivileges
    @sudoPrivileges 6 months ago

    How do I display the result in the HTML? I keep getting the error "document is not defined", since it's Node and not the browser.

  • @hmtbt4122
    @hmtbt4122 1 year ago +2

    Sir, it's a request to please upload the code to the repo you provided previously. It would be very helpful, sir.

  • @TirthRadadiya-hp9sq
    @TirthRadadiya-hp9sq 9 months ago +1

    I have tried this for img2img models but I am unable to get an image result back. Can you tell me what I could do to get the image?

    • @DevelopersDigest
      @DevelopersDigest  9 months ago

      You may have to write the result of the image to disk or to something like an S3 bucket or equivalent. I would have to take a look at what the particular model returns! Hopefully this helps!

  • @SD-rg5mj
    @SD-rg5mj 1 year ago +1

    Hello, I would like the text generated by Hugging Face, when I do image-to-text, to go directly into a cell of my Google Sheet.
    Can we copy and paste our 10-digit pay keys into my Google Sheets like I do with ChatGPT?
    Anyway, thank you very much for your videos

    • @DevelopersDigest
      @DevelopersDigest  1 year ago

      You could leverage the Google Sheets API to do this!

    • @SD-rg5mj
      @SD-rg5mj 1 year ago

      @DevelopersDigest OK, thank you very much for your answer. Can you suggest a tutorial for someone like me who doesn't know anything about code?
      Otherwise, I saw a tutorial on how to use ChatGPT directly in Google Sheets, without touching the code, just with the ChatGPT API and a Google Sheets module,
      but all of this is with text.
      Do you think I can ask ChatGPT to describe the images that are in my Google Sheet?
      Anyway, thank you very much for your answer
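
To make the Google Sheets suggestion concrete: the Sheets REST API has a values:append endpoint that adds a row to a range. Everything specific here (spreadsheet ID, range, and how you obtain the OAuth access token) is an illustrative assumption; in practice the token comes from Google's OAuth flow or a service account:

```javascript
// Build the request for the Google Sheets values:append endpoint
// (pure function, so the payload shape is easy to test).
function buildAppendRequest(spreadsheetId, range, row, accessToken) {
  return {
    url:
      `https://sheets.googleapis.com/v4/spreadsheets/${spreadsheetId}` +
      `/values/${encodeURIComponent(range)}:append?valueInputOption=RAW`,
    options: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json",
      },
      // The API expects a 2-D array: one inner array per appended row.
      body: JSON.stringify({ values: [row] }),
    },
  };
}

// Append a generated image caption as a new row in column A.
async function appendCaption(spreadsheetId, caption, accessToken) {
  const { url, options } = buildAppendRequest(
    spreadsheetId,
    "Sheet1!A:A",
    [caption],
    accessToken
  );
  const res = await fetch(url, options);
  if (!res.ok) throw new Error(`Sheets API error: ${res.status}`);
  return res.json();
}
```

The googleapis npm client wraps this same endpoint if you prefer not to build requests by hand.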

  • @aakashkhamaru9403
    @aakashkhamaru9403 9 months ago +1

    I need a model that can convert text into MongoDB queries. I have searched Hugging Face but could not find such a model. Could anyone help me?

    • @DevelopersDigest
      @DevelopersDigest  9 months ago

      Check out python.langchain.com/docs/use_cases/sql/
      LangChain might be able to help. I am not sure if there is a MongoDB equivalent, but using a framework like LangChain + a model could help you accomplish what you are looking for - cheers 🥂

  • @OPGAMING-rd3qu
    @OPGAMING-rd3qu 1 year ago +1

    One day you'll have 10 million subs

    • @DevelopersDigest
      @DevelopersDigest  1 year ago

      I won’t believe it if I make it to 10,000! Thank you 🙏

  • @sahilsaini9976
    @sahilsaini9976 9 months ago +1

    This gives me an error on the first import line. Please help me with this...

  • @shreerajshinde9715
    @shreerajshinde9715 8 months ago +1

    Can we use Hugging Face embeddings?
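
Hugging Face does host embedding (feature-extraction) models on the Inference API. A hedged sketch of getting sentence embeddings and comparing them with cosine similarity; the model choice (sentence-transformers/all-MiniLM-L6-v2) and the assumption that the API returns one vector per input text are illustrative:

```javascript
// Cosine similarity between two embedding vectors (pure, easy to test).
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Fetch embeddings from the hosted Inference API.
// Assumptions: HF_TOKEN env var is set; this sentence-transformers model
// returns an array of vectors, one per input string.
async function embed(texts) {
  const res = await fetch(
    "https://api-inference.huggingface.co/models/sentence-transformers/all-MiniLM-L6-v2",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${process.env.HF_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ inputs: texts }),
    }
  );
  if (!res.ok) throw new Error(`HF API error: ${res.status}`);
  return res.json();
}
```

With two embeddings in hand, `cosine(vecA, vecB)` gives a similarity score near 1 for semantically close sentences.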

  • @x_game_x
    @x_game_x 9 months ago +1

    Please do a video on image to image

    • @DevelopersDigest
      @DevelopersDigest  9 months ago

      Thank you for the suggestion. What sort of image to image generation did you have in mind?

    • @aricwilliamsdeveloper
      @aricwilliamsdeveloper 9 months ago

      Upload two images and combine them... also, how can I deploy to a server? @@DevelopersDigest

  • @askcoachmarty
    @askcoachmarty 1 year ago +1

    Love your videos. The format of starting with the outline and dropping the code in underneath is brilliant in its simplicity. I like the JavaScript examples. Have you used LangChain with Hugging Face to retrieve information from a vector DB? I used this ruclips.net/video/vpU_6x3jowg/видео.html to create embeddings in Pinecone and I'm trying to convert the client to query it. I'd be happy to share the working code once it's complete. Any pointers to code or advice are welcomed! Have fun!

    • @DevelopersDigest
      @DevelopersDigest  1 year ago +1

      Cheers thank you. If you are interested, I have used pinecone in my video here: m.ruclips.net/video/CF5buEVrYwo/видео.html

    • @askcoachmarty
      @askcoachmarty 1 year ago

      @@DevelopersDigest Thanks! I'll go check it out...

    • @askcoachmarty
      @askcoachmarty 1 year ago

      In case anybody comes back to read this thread. I had to create a HuggingFaceInferenceEmbeddings for the model: "sentence-transformers/multi-qa-mpnet-base-dot-v1" to match the pinecone index. Then I used Langchain's Pinecone wrappers and did a vectorStore.similaritySearch, and everything worked great. (haha - like that explanation? now you see why we like the way you explain things ;) )

  • @chrisder1814
    @chrisder1814 13 days ago

    hello

  • @Sagan1995
    @Sagan1995 1 year ago +1

    I did everything like you did in the video, but I can't use many of the models available. When I try to use the summarization model, I get the following error: Error: Could not load model facebook/bart-large-cnn with any of the following classes: (, ).
    at request (file:///C:/Users/user/Desktop/hugging/node_modules/@huggingface/inference/dist/index.mjs:106:15)

    • @DevelopersDigest
      @DevelopersDigest  1 year ago

      The inference API free tier does have limits. With that said, if you are trying to run models locally it is a bit more involved - you'll have to make sure everything is installed and downloaded accordingly! Cheers!

  • @justin9494
    @justin9494 1 year ago +1

    Is it possible to get "returnTimestamps: true" from the Whisper model with the Huggingface.js Inference API? I'm transcribing speech and I want it to return timestamps. It seems to be supported in Python, so I thought it would be in JavaScript too. This is what I have now:

    const transcriptionResult = await hf.automaticSpeechRecognition({
      model: 'openai/whisper-base',
      data: fs.readFileSync(chunkPath),
      //@ts-ignore
      parameters: {
        return_timestamps: true // this does not work. It says that parameters does not exist in automaticSpeechRecognition
      }
    });

    I can't switch to Python as my backend heavily depends on Node.js. I can't figure out how to do this; I've looked in the documentation, and it seems like the Inference API is not that good yet. Thank you!

    • @DevelopersDigest
      @DevelopersDigest  1 year ago

      I am not familiar with whether whisper-base supports timestamps. If you have a working example of it through Hugging Face, my thought would be to log the payload in your Python request and ensure the same payload is being sent from JS. I have only used Whisper directly through OpenAI's API, so I am not familiar with the HF implementation, sorry! 😞

  • @anasosama7558
    @anasosama7558 9 months ago +1

    Too technical for me 🫤

    • @DevelopersDigest
      @DevelopersDigest  9 months ago

      I can absolutely appreciate that - I think it should only be getting easier in time! Cheers