Inference API: The easiest way to integrate NLP models for inference!

  • Published: 31 Oct 2022
  • Learn how to use the Hugging Face Inference API to easily integrate NLP models for inference via simple API calls.
    Code: github.com/PradipNichite/Yout...
    NLP Beginner to Advanced Playlist:
    • NLP Beginner to Advanced
    I am a Freelance Data Scientist working on Natural Language Processing (NLP) and building end-to-end NLP applications.
    I have over 7 years of experience in the industry, including as a Lead Data Scientist at Oracle, where I worked on NLP and MLOps.
    I share practical, hands-on tutorials on NLP and bite-sized information and knowledge related to Artificial Intelligence.
    LinkedIn: / pradipnichite
    #nlp #bert #transformers #machinelearning #artificialintelligence #datascience
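
    For reference, a call to the hosted Inference API is just one authenticated POST request. A minimal sketch in Python, assuming the distilbert-base-uncased-finetuned-sst-2-english sentiment model and a placeholder API token:

    import requests

    API_URL = "https://api-inference.huggingface.co/models/distilbert-base-uncased-finetuned-sst-2-english"
    headers = {"Authorization": "Bearer YOUR_HF_API_TOKEN"}  # placeholder: use a token from your Hugging Face account settings

    def query(payload):
        # POST the input to the hosted model and return the parsed JSON response
        response = requests.post(API_URL, headers=headers, json=payload)
        return response.json()

    output = query({"inputs": "I like you. I love you"})
    print(output)  # typically a list of label/score pairs (e.g. POSITIVE vs NEGATIVE)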

Comments • 42

  • @FutureSmartAI
    @FutureSmartAI  1 year ago +1

    📌 Hey everyone! Enjoying these NLP tutorials? Check out my other project, AI Demos, for quick 1-2 min AI tool demos! 🤖🚀
    🔗 YouTube: www.youtube.com/@aidemos.futuresmart
    We aim to educate and inform you about AI's incredible possibilities. Don't miss our AI Demos YouTube channel and website for amazing demos!
    🌐 AI Demos Website: www.aidemos.com/
    Subscribe to AI Demos and explore the future of AI with us!

  • @johnknapp5048
    @johnknapp5048 1 year ago +2

    Hey Pradip, that's very well presented. Good job!

  • @Nandhakumar_Arumugam8
    @Nandhakumar_Arumugam8 11 months ago +2

    Simple one, but valuable for someone making the transition from intermediate (codes their own model) to advanced (uses a pretrained model). Kudos🎉

  • @almerdiazbondoc8373
    @almerdiazbondoc8373 9 months ago +1

    SO USEFUL. THANK YOU SO MUCH!

  • @alijawwad1763
    @alijawwad1763 1 year ago +1

    Fantastic. The likes on the video don't do justice to how great and helpful the video is.

  • @sohailpatel7549
    @sohailpatel7549 7 months ago +2

    Thanks man! Didn't know it was this easy.

  • @mjacfardk
    @mjacfardk 1 year ago +1

    Thank you brother for the great tutorial 🙏

  • @saman__fatima
    @saman__fatima 4 months ago +1

    thanks for sharing :)

  • @SD-rg5mj
    @SD-rg5mj 1 year ago

    Hello, I would like the photos he describes to go directly to Google. Could it work with the API?
    Thanks a lot

  • @allandclive
    @allandclive 1 year ago

    Hello, can it handle real-time speech-to-text with ASR models?
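
    For context, the hosted Inference API works request/response on complete audio clips rather than over a streaming connection, so strictly real-time transcription is not what it offers out of the box. A minimal sketch of clip-based speech-to-text, assuming the facebook/wav2vec2-base-960h model, a placeholder token, and a local file sample.flac:

    import requests

    API_URL = "https://api-inference.huggingface.co/models/facebook/wav2vec2-base-960h"
    headers = {"Authorization": "Bearer YOUR_HF_API_TOKEN"}  # placeholder token

    # For audio tasks the raw bytes go in the request body instead of a JSON "inputs" field
    with open("sample.flac", "rb") as f:
        response = requests.post(API_URL, headers=headers, data=f.read())

    print(response.json())  # typically {"text": "..."}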

  • @user-zx3zk1xn6h
    @user-zx3zk1xn6h 6 months ago

    Please reply, I don't see the Deploy button on any model page.

  • @loadsofpizza4446
    @loadsofpizza4446 1 year ago

    If you can't find the code snippet, what does that mean?

  • @swapnilcodes
    @swapnilcodes 11 months ago +1

    Nice

  • @bal4350
    @bal4350 1 month ago

    Can you explain read and write tokens?

  • @karthikdatta7143
    @karthikdatta7143 1 year ago

    Hey, can you please answer this: if we want to use a custom model for inference, where do I specify my inputs and outputs? Where do I host the code for the model?

    • @FutureSmartAI
      @FutureSmartAI  1 year ago

      You can push the model to the Hugging Face Hub. I have shown this in this video: ruclips.net/video/HRGc6QFA_YU/видео.html
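
      A minimal sketch of that step, assuming a fine-tuned transformers checkpoint saved locally and a hypothetical repo name your-username/my-custom-model (you also need to be logged in, e.g. via huggingface-cli login):

      from transformers import AutoModelForSequenceClassification, AutoTokenizer

      # Load the fine-tuned model and tokenizer you want to host
      model = AutoModelForSequenceClassification.from_pretrained("./my-finetuned-checkpoint")
      tokenizer = AutoTokenizer.from_pretrained("./my-finetuned-checkpoint")

      # Push both to the Hub; the resulting repo can then be queried through the Inference API
      model.push_to_hub("your-username/my-custom-model")
      tokenizer.push_to_hub("your-username/my-custom-model")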

  • @mikiallen7733
    @mikiallen7733 5 months ago

    Thanks, sir, for the work you do. However, I want to know:
    1. How can one integrate a specific set of pre-trained models into RStudio, so that one can simply run examples on data (proprietary in my case) locally within R?
    2. Is there a way to ask the Inference API for tasks different from the typical sentiment classification of text, for example "multi-entity tagging", "modalities", etc.?
    Your input is highly appreciated.

    • @FutureSmartAI
      @FutureSmartAI  5 months ago

      Hi, not sure about RStudio, but since it's an API we should be able to use it anywhere we want.
      Regarding different tasks: if we use the pipeline class, then we have to use one of the valid tasks, but you can always use model inference and then add your own custom logic.
      You should fine-tune the model if you have different requirements.
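
      As an illustration of the second question, the same query pattern covers other task types by pointing at a model trained for that task. A minimal sketch of multi-entity tagging (token classification), assuming the dslim/bert-base-NER model and a placeholder token:

      import requests

      API_URL = "https://api-inference.huggingface.co/models/dslim/bert-base-NER"  # a named-entity-recognition model
      headers = {"Authorization": "Bearer YOUR_HF_API_TOKEN"}  # placeholder token

      payload = {"inputs": "My name is Pradip and I work at Oracle."}
      response = requests.post(API_URL, headers=headers, json=payload)
      print(response.json())  # typically a list of entities with fields like entity_group, word, and score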

  • @pariworecords1386
    @pariworecords1386 1 year ago

    Thank you so much, brother! My only question is that I don't have a Colab notebook. Do you know where I could access it, is it also usable for Mac users, or are there any alternatives? Blessings

    • @FutureSmartAI
      @FutureSmartAI  1 year ago

      You can use a Colab notebook, which is free if you log in with Google.

  • @zaidshaikh9284
    @zaidshaikh9284 1 year ago +1

    Hey, I am using an API which takes text as input and gives its summary, but I am not getting the correct output.
    output = query({
        "inputs": text,
    })
    summary = output
    But I am not getting the output in this case.
    Even when I use this:
    summary = output[0]
    I am still not getting it.
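
    For what it's worth, summarization models on the Inference API usually return a list with one dict per input, keyed by summary_text, so the text has to be pulled out of that structure. A minimal sketch, assuming the facebook/bart-large-cnn model and a placeholder token:

    import requests

    API_URL = "https://api-inference.huggingface.co/models/facebook/bart-large-cnn"
    headers = {"Authorization": "Bearer YOUR_HF_API_TOKEN"}  # placeholder token

    def query(payload):
        response = requests.post(API_URL, headers=headers, json=payload)
        return response.json()

    text = "The Hugging Face Inference API lets you call thousands of hosted models with a simple HTTP request, without managing any infrastructure yourself."
    output = query({"inputs": text})

    # Success typically looks like [{"summary_text": "..."}]; errors come back as a dict with an "error" key
    if isinstance(output, list):
        print(output[0]["summary_text"])
    else:
        print(output)  # e.g. the model may still be loading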

  • @kettleghost3721
    @kettleghost3721 1 year ago

    Hello sir, can you please illustrate how to use a question-answering model like T5 via JS inference? Thank you
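
    The JavaScript and Python calls share the same request shape, so here is a minimal sketch in Python of the question-answering payload, assuming an extractive QA model (deepset/roberta-base-squad2) rather than T5, since extractive models take a question/context pair directly, plus a placeholder token:

    import requests

    API_URL = "https://api-inference.huggingface.co/models/deepset/roberta-base-squad2"
    headers = {"Authorization": "Bearer YOUR_HF_API_TOKEN"}  # placeholder token

    payload = {
        "inputs": {
            "question": "What does the Inference API provide?",
            "context": "The Hugging Face Inference API lets you run hosted models through simple HTTP calls.",
        }
    }
    response = requests.post(API_URL, headers=headers, json=payload)
    print(response.json())  # typically {"answer": ..., "score": ..., "start": ..., "end": ...}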

  • @user-lm8er8yl1o
    @user-lm8er8yl1o 6 months ago

    Can you please show how I can use a code generation model from Hugging Face, like DeepSeek Coder or StarCoder, using the Inference API? I am getting an error.

  • @Madiha456
    @Madiha456 1 year ago +1

    Hello sir, can you please tell how we can generate a Hugging Face API key for building a React app? Please reply with an elaborated answer, sir.

    • @FutureSmartAI
      @FutureSmartAI  1 year ago

      Can you ask this in Discord?
      discord.gg/teBNbKQ2

  • @arvindratnu5621
    @arvindratnu5621 1 year ago

    How to give multiple inputs in query:
    output = query({
        "inputs": "I like you. I love you",
    })
    I am trying this:
    output = query({
        "inputs": "I like you. I love you", "I hate you",
    })
    But I am getting an error.
    Please help me.

    • @FutureSmartAI
      @FutureSmartAI  1 year ago

      Can you ask this in Discord? We also published a blog post on the Inference API: blog.futuresmart.ai/mastering-hugging-face-inference-api-integrating-nlp-models-for-real-time-predictions
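
      For reference, the usual way to send several inputs in one call is a JSON list under "inputs", rather than extra comma-separated strings. A minimal sketch, assuming the distilbert-base-uncased-finetuned-sst-2-english sentiment model and a placeholder token:

      import requests

      API_URL = "https://api-inference.huggingface.co/models/distilbert-base-uncased-finetuned-sst-2-english"
      headers = {"Authorization": "Bearer YOUR_HF_API_TOKEN"}  # placeholder token

      def query(payload):
          response = requests.post(API_URL, headers=headers, json=payload)
          return response.json()

      # A list under "inputs" typically yields one prediction per element
      output = query({"inputs": ["I like you. I love you", "I hate you"]})
      print(output)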

  • @Venkatesh-vm4ll
    @Venkatesh-vm4ll 1 year ago

    How about the pricing? I didn't understand. I'd like to make it live; what if 10,000 users use this API, how much would it cost?

    • @FutureSmartAI
      @FutureSmartAI  1 year ago +1

      The Inference API is free to use, and rate limited. If you need an inference solution for production, check out our Inference Endpoints service.
      huggingface.co/docs/inference-endpoints/index

  • @HasimFN
    @HasimFN 18 days ago

    I tried a model, and the API said max 10 candidates.

  • @mmet5772
    @mmet5772 11 months ago +1

    Hi, how can I use a Hugging Face API in my Flutter app?

    • @FutureSmartAI
      @FutureSmartAI  11 months ago

      It's an API, so we can use it in any language. Here is an example in Python and JavaScript: huggingface.co/docs/api-inference/quicktour

    • @mmet5772
      @mmet5772 11 months ago

      @@FutureSmartAI Thank you so much for your reply, and I really find your video so interesting!

  • @droponmoon4689
    @droponmoon4689 1 year ago

    You left it halfway; it is difficult for new users.

    • @FutureSmartAI
      @FutureSmartAI  1 year ago

      Hi, you can ask in Discord if you have any doubts. We also recently published a blog on the Inference API that should be helpful:
      blog.futuresmart.ai/mastering-hugging-face-inference-api-integrating-nlp-models-for-real-time-predictions