📌 Hey everyone! Enjoying these NLP tutorials? Check out my other project, AI Demos, for quick 1-2 min AI tool demos! 🤖🚀
🔗 YouTube: www.youtube.com/@aidemos.futuresmart
We aim to educate and inform you about AI's incredible possibilities. Don't miss our AI Demos YouTube channel and website for amazing demos!
🌐 AI Demos Website: www.aidemos.com/
Subscribe to AI Demos and explore the future of AI with us!
Fantastic. The likes on the video don't do justice to how helpful it is.
Glad it was helpful!
Simple one, but valuable for someone making the transition from intermediate (codes their own model) to advanced (uses pretrained models). Kudos 🎉
Thanks man! Didn't know it was this easy.
Glad I could help!
Hey Pradip, that's very well presented. Good job!
Thank you so much 🙂
SO USEFUL. THANK YOU SO MUCH!
Great bro
Thank you brother for the great tutorial 🙏
Glad it was helpful!
Helpful❤
Hey, I am using an API that takes text as input and returns its summary, but I am not getting the correct output.
output = query({
    "inputs": text,
})
summary = output
I don't get the output in this case.
Even when I use
summary = output[0]
I still don't get it.
Can you post this in Discord?
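For anyone hitting the same issue, here is a minimal sketch of how summarization responses usually come back from the hosted Inference API. It assumes an example summarization model (facebook/bart-large-cnn) and a placeholder YOUR_HF_TOKEN; the response format can vary, so treat this as a starting point rather than a guaranteed fix:

```python
import requests

# facebook/bart-large-cnn is just an example summarization model; YOUR_HF_TOKEN is a placeholder
API_URL = "https://api-inference.huggingface.co/models/facebook/bart-large-cnn"
headers = {"Authorization": "Bearer YOUR_HF_TOKEN"}

def query(payload):
    # Send the text to the hosted Inference API and parse the JSON response
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

text = "Long article text to summarize..."
output = query({"inputs": text})

# Summarization responses are typically a list of dicts, e.g. [{"summary_text": "..."}].
# If the model is still loading or the request failed, the response is a dict with an "error" key.
if isinstance(output, list):
    summary = output[0]["summary_text"]
    print(summary)
else:
    print("API error:", output)
```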
Is there any free text-to-text generation API? I want it for my hackathon project.
Thanks, sir, for the work you do. However, I want to know:
1. How can one integrate a specific set of pre-trained models into RStudio, so that one can simply run examples on data (proprietary in my case) locally within R?
2. Is there a way to ask the Inference API for tasks different from the typical sentiment classification of text, for example "multi-entity tagging", "modalities", etc.?
Your input is highly appreciated.
Hi, I'm not sure about RStudio, but since it's an API, you should be able to use it from anywhere you want.
Regarding different tasks: if you use the pipeline class, you have to use one of the valid tasks, but you can always run model inference directly and apply your own custom logic.
You should fine-tune a model if you have different requirements.
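For example, here is a minimal sketch of asking the Inference API for entity tagging rather than sentiment classification. It assumes the publicly hosted token-classification model dslim/bert-base-NER and a placeholder YOUR_HF_TOKEN; other task types follow the same URL pattern with a different model ID:

```python
import requests

# dslim/bert-base-NER is one publicly hosted token-classification (NER) model; YOUR_HF_TOKEN is a placeholder
API_URL = "https://api-inference.huggingface.co/models/dslim/bert-base-NER"
headers = {"Authorization": "Bearer YOUR_HF_TOKEN"}

def query(payload):
    # POST the input text and return the parsed JSON response
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

# Token-classification responses typically contain one dict per detected entity,
# with fields like entity_group, word, and score.
output = query({"inputs": "Pradip lives in Mumbai and works on Hugging Face models."})
print(output)
```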
Please reply, I don't see the Deploy button on any model page.
Hello sir, can you please tell us how to generate a Hugging Face API key for building a React app? Please reply with an elaborated answer, sir.
Can you ask this in Discord?
discord.gg/teBNbKQ2
thanks for sharing :)
Please explain read and write tokens.
Hey, can you please answer this: if we want to use a custom model for inference, where do I specify my inputs and outputs? Where do I host the code for the model?
You can push the model to the Hugging Face Hub. I have shown this in this video: ruclips.net/video/HRGc6QFA_YU/видео.html
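As a rough sketch of what pushing a model to the Hub looks like with the transformers library (the base model and repo name below are placeholders, and you need to be logged in with `huggingface-cli login` first):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Load (or fine-tune) a model locally; distilbert-base-uncased is just an example base model
model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

# Push both the model weights and the tokenizer to your Hub repo;
# "your-username/your-custom-model" is a placeholder repo name
model.push_to_hub("your-username/your-custom-model")
tokenizer.push_to_hub("your-username/your-custom-model")
```

Once the model is on the Hub, it can typically be called through the Inference API using the same URL pattern, with your repo ID as the model name.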
Hello, I would like the photos it describes for me to go directly to Google. Could that work with the API?
Thanks a lot
Hello, can it handle real-time speech-to-text with ASR models?
Hi, how can I use the Hugging Face API in my Flutter app?
It's an HTTP API, so you can use it from any language. Here is an example in Python and JavaScript: huggingface.co/docs/api-inference/quicktour
@FutureSmartAI Thank you so much for your reply, and I really find your videos so interesting!
Nice
Thank you so much, brother! My only question is that I don't have a Colab notebook. Do you know where I could access it, and is it also usable for Mac users, or are there any alternatives? Blessings.
You can use a Colab notebook, which is free if you log in with Google.
If you can't find the code snippet, what does that mean?
Can you please show how I can use a code-generation model from Hugging Face, like DeepSeek-Coder or StarCoder, via the Inference API? I am getting an error.
Can you share the error?
How about the pricing? I didn't understand it. I'd like to make this live; if 10,000 users use this API, how much would it cost?
The Inference API is free to use, and rate limited. If you need an inference solution for production, check out our Inference Endpoints service.
huggingface.co/docs/inference-endpoints/index
How do I give multiple inputs in the query? The original example is:
output = query({
    "inputs": "I like you. I love you",
})
I am trying this:
output = query({
    "inputs": "I like you. I love you", "I hate you",
})
But I am getting an error. Please help me.
Can you ask this in Discord? We also published a blog post on the Inference API: blog.futuresmart.ai/mastering-hugging-face-inference-api-integrating-nlp-models-for-real-time-predictions
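For reference, a minimal sketch of one way batching is usually handled, assuming the example sentiment model accepts a list of strings under the single "inputs" key (many hosted text-classification models do); YOUR_HF_TOKEN is a placeholder:

```python
import requests

# distilbert-base-uncased-finetuned-sst-2-english is an example sentiment model; YOUR_HF_TOKEN is a placeholder
API_URL = "https://api-inference.huggingface.co/models/distilbert-base-uncased-finetuned-sst-2-english"
headers = {"Authorization": "Bearer YOUR_HF_TOKEN"}

def query(payload):
    # POST the JSON payload to the hosted Inference API and return the parsed response
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

# Pass multiple texts as a list under the single "inputs" key,
# rather than as extra comma-separated strings (which is a syntax error).
output = query({"inputs": ["I like you. I love you", "I hate you"]})
print(output)  # typically one list of label/score dicts per input text
```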
Hello sir, can you please illustrate how to use a question-answering model like T5 via JS inference? Thank you.
you mean JavaScript inference?
thanks
I tried a model, and the API said max 10 candidates.
You stopped halfway; it is difficult for new users.
Hi, you can ask in Discord if you have any doubts. We also recently published a blog on the Inference API that should be helpful.
blog.futuresmart.ai/mastering-hugging-face-inference-api-integrating-nlp-models-for-real-time-predictions