👉More on LLMs: ruclips.net/p/PLz-ep5RbHosU2hnz5ejezwaYpdMutMVB0
Your teaching method is exceptional, providing a balanced blend of theory and practical implementation. I would greatly appreciate it if you could create a video tutorial on training InsightFace using a custom dataset. I'm undertaking my thesis on this topic but struggling to find adequate instructional resources.
Thanks for the suggestion. I added it to my list :)
Appreciate you walking through this in an easy-to-understand style.
Happy to help! I’m glad it was clear 😁
I think your video about LLMs is the best one on YouTube, thanks very much!!!
I didn't know about Hugging Face Spaces. It is amazing. Thank you for this
You’re welcome, I’m glad it was helpful :)
I’ve been living under a rock and just learned about both Hugging Face and Jupyter. Consider my mind blown 😂
Very useful for me as a beginner, keep creating great tutorial videos. Thanks :)
Great work Shaw!!!
Thanks! :)
Thanks for compiling the stuff....amazing!
Happy to help!
It's very helpful. Thanks for making this video, Shaw!
Thanks for this fantastic playlist.
Thank you Shawhin for this really helpful and informative video.
Thanks! Glad it helped :)
Really useful for beginners like me, thank you for this
Glad it was helpful :)
Extremely useful. I haven't used Hugging Face before, so this really helps!
Glad it was helpful!
Love this video, this is what I want to do for news
Great introduction. Thanks for putting this together.
Nice video buddy. Keep going and growing your videos. Thanks
Great video as usual! SO helpful 🙌🏾
Thanks, glad it was helpful!
Nice, informative, and practical video. I learned a lot; please keep sharing videos like this.
I love that the chat bot is hard working
Amazing Stuff!
Thanks, I love how simple and helpful it was
Thank you for sharing, have a great day :-)
Really great content!!
Glad you enjoyed it :)
you really deserve a subscribe
The Conversation module is no longer available, for those following these tutorials.
Thanks for calling that out! While I can't update the video, updated code is available on GitHub: github.com/ShawhinT/RUclips-Blog/tree/main/LLMs/hugging-face
@@ShawhinTalebi Thank you! I was looking for this comment and your answer.
you are amazing man !!!
Amazing! Thank you so much.
Happy to help!
Great Intro Shaw
Thanks, glad it helped!
Thanks, excellent 👍
Great video, Thank you!
Happy to help!
Hi, dear Farzad, thank you for the good information you provide us. I want to extract a Persian document from a PDF file, but the vector is not correct and the correct result cannot be returned. Have you worked with Persian documents, and do you know their challenges?
This is so helpful! Can you record a video guiding us on how data scientists can work with the Transformers library when they work with low-resource languages?
Thanks for the suggestion! What's an example of a low-resource language?
super helpful
That was excellent, thanks a lot
I want to create an LLM for medical health detection. Can you please point me to any references related to this kind of model?
Hello... When I'm trying out the Gradio chat interface with the vanilla_chatbox function, I run into an error saying that the conversation variable is not defined. Why is this the case?
I believe there was a transformers library update which broke the code shown in the video. But I have an updated version on GitHub that should work: github.com/ShawhinT/RUclips-Blog/blob/main/LLMs/hugging-face/hf-sandbox.ipynb
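For anyone hitting the same error, here's a rough sketch of the general idea (not the exact notebook code): it assumes recent gradio and transformers releases, and BlenderBot is just an example model.

```python
# Minimal chatbot sketch without the deprecated Conversation class.
import gradio as gr
from transformers import pipeline

# Seq2seq chat model; downloads to the local cache on first run.
chatbot = pipeline("text2text-generation", model="facebook/blenderbot-400M-distill")

def vanilla_chatbot(message, history):
    # gr.ChatInterface passes the latest message plus the chat history;
    # this simple version only uses the latest message.
    output = chatbot(message, max_new_tokens=128)
    return output[0]["generated_text"]

demo = gr.ChatInterface(fn=vanilla_chatbot, title="Vanilla Chatbot")
demo.launch()
```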
Really nice video with practical use. I am running the same code in Colab, but when I click submit I get an error screen. Are there any specific steps I need to follow, since you used conda? Can the same code be implemented with something other than conda? Can you explain the steps in detail? I'm new to this and want to learn.
There was an update to HF since I posted this video. You can find a new version of the example code here: github.com/ShawhinT/RUclips-Blog/blob/main/LLMs/hugging-face/hf-sandbox.ipynb
Thank you for pulling all the actionable code into one place with the appropriate amount of technical detail. Many YouTube videos' lab notebooks are stale given version changes in the libraries.
Yours is the first one that worked!
In a span of 12 minutes, I could complete two of your exercises. While trying the chatbot example, importing gradio in Colab also pulls in PyTorch, memory spikes, and sometimes the Colab free VM collapses.
BTW, I want to implement a RAG example to query my PDF with tables. Since my laptop is under-resourced, I was looking for links or videos on how to push a RAG app to Hugging Face. I would appreciate it if you could point me to one if you have it, or a better link.
Great to hear! I'm not a Colab user, so I'm not sure what's going on with that Gradio import.
I found this video helpful on RAG with docs: ruclips.net/video/WL7V9JUy2sE/видео.html
Thank you Shaw! Do you host or attend live AI-related meetups in the Dallas area? @@ShawhinTalebi
Yes, I host at least 1 meet-up a quarter via The Data Entrepreneurs community
Events: lu.ma/tde
Thanks !
Great videos... do you have something similar for AWS Bedrock?
I do not but that's a great topic for a future video :)
@@ShawhinTalebi Sure it is. Waiting eagerly for that one too. :)
Very useful information. I cannot clone the git repository. Got error: invalid path 'TDA/persistent_homology/homology_example_Homology-Changes-Predicted-Crash??.png', Could you please take a look?
I wasn't able to reproduce this error. Note that GitHub doesn't let you clone a part of a repo so you have to download the whole thing: github.com/ShawhinT/RUclips-Blog.git
Thank you for sharing. Please make the same video using the HF API.
Thanks for the rec! Anything specific you'd like to see?
@@ShawhinTalebi When we use the transformers pipeline, the model first downloads every time and then it works. Is there any way we can call the above models with an HF API/token instead?
HF's endpoints might help: huggingface.co/inference-endpoints/dedicated
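As a rough sketch, hosted inference calls usually look something like this (the model name and token are placeholders; the exact URL and response format depend on the endpoint you use):

```python
# Call a hosted model over HTTP with an HF access token instead of
# downloading the weights locally. Token and model name are placeholders.
import requests

API_URL = "https://api-inference.huggingface.co/models/distilbert-base-uncased-finetuned-sst-2-english"
headers = {"Authorization": "Bearer hf_xxx"}  # your HF token here

def query(payload):
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

print(query({"inputs": "I love this video!"}))
```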
Firstly, this is great, so thanks. A couple of things: 1) To run on Windows, I had ChatGPT create a new .yml file that removed the Mac libraries and made the library calls less release-dependent; and 2) I think the Conversation object from the transformers library has been deprecated by Hugging Face, so I updated the pipelines to text2text-generation. I'm newer to this, so I could be wrong; if someone could confirm, that would be great.
Thanks for your comment! The code in the video is outdated, but I've updated the example code on GitHub: github.com/ShawhinT/RUclips-Blog/tree/main/LLMs/hugging-face. For Windows, you can install the packages in the requirements.txt file instead of the .yml file.
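For what it's worth, I believe recent transformers releases also let a text-generation pipeline take chat-style messages directly, which removes the need for the Conversation class entirely. A rough sketch (TinyLlama is just an example chat model; check the output format against your installed version):

```python
# Chat without the deprecated Conversation class, assuming a recent
# transformers release with chat-template support in pipelines.
from transformers import pipeline

pipe = pipeline("text-generation", model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")

messages = [{"role": "user", "content": "Hi, how are you today?"}]
result = pipe(messages, max_new_tokens=64)

# With chat-style input, generated_text holds the whole conversation;
# the last entry should be the assistant's reply.
print(result[0]["generated_text"][-1]["content"])
```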
We don't need to create and use Hugging Face tokens to use these models?
Not if you are running these models locally!
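Roughly, the local flow looks like this (a minimal sketch; the default model is public, so no token is involved and the weights just get cached on your machine):

```python
# Public models run locally: the pipeline downloads weights to the local
# Hugging Face cache (~/.cache/huggingface by default) on first use, no token needed.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # falls back to a default public model
print(classifier("This tutorial made Hugging Face click for me!"))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```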
Hello, I can't seem to make the conda env create command work. It shows me:
"Could not solve for environment specs
The following packages are incompatible"
followed by a list of the incompatible packages... Can anyone help me solve this?
Sorry, it might only work on macOS since that's what I used to make it. You can alternatively try installing the requirements in a fresh env via pip and the requirements.txt file from the repo: github.com/ShawhinT/RUclips-Blog/blob/main/LLMs/hugging-face/requirements.txt
How did you get the Conversation class? I was unable to get it from transformers...
Good question. That class has been deprecated. Check out the updated example code here: github.com/ShawhinT/RUclips-Blog/blob/main/LLMs/hugging-face/hf-sandbox.ipynb
@@ShawhinTalebi I don't see the new code. I believe the link is still importing Conversations. Do you have the updated code?
It seems the vanilla chatbot is showing an error as of the current date.
Thanks for pointing that out! It seems like Hugging Face updated the library. I fixed the code in the GitHub repo.
I am getting an error while using the Conversation module: cannot import name 'Conversation' from 'transformers'. Did anyone else run into this error? Please help.
Thanks for raising this! There was an update to the lib since I posted this video. You can find an updated version of the example code here: github.com/ShawhinT/RUclips-Blog/blob/main/LLMs/hugging-face/hf-sandbox.ipynb
Where did you get the Conversation class?
Good question. I forgot to show the imports in the slides, but Conversation() comes from the Transformers library. Here's the full notebook: github.com/ShawhinT/RUclips-Blog/blob/main/LLMs/hugging-face/hf-sandbox.ipynb
This helped me big time. I am working on a client project, and I'm surprised this app is general-purpose enough to answer fitness-related questions, which is exactly what I'm working on. Big thanks! Is there a way to fine-tune the model further for my fitness AI?
That’s awesome, glad it helped!
I talk about fine-tuning in another video: ruclips.net/video/eC6Hd1hFvos/видео.html
@@ShawhinTalebi Cool, let me check that out! I was actually looking into how to fine-tune this, since the first version is a bit simple!
Please display the full code and mention the Python version and system configuration, folders, files, etc.
Code and env files are available here: github.com/ShawhinT/RUclips-Blog/tree/main/LLMs/hugging-face
Anaconda doesn't find all the dependencies when I try to create the environment. What should I do?
Are you using the .yml file from the GitHub repo?
If that's not working, you can try making a fresh env and installing transformers following this guide: huggingface.co/docs/transformers/en/installation#install-with-conda
@@ShawhinTalebi Yes, I used your GitHub link and your .yml file.
Using an M1 Mac.
Thank you. I will try that.
The sentiment analysis is strange. I put in: "It is fun to hurt people" and it came back with a .99 POSITIVE. I guess it has no clue about morality and values. Also, I have not used Python in a while, and I suppose it has trouble dealing with dependencies, because I had to install tensorflow and tf-keras separately. But it works.
Thanks for raising this, it's a super important point. While I suspect larger and more recent models will handle that example better, accounting for these edge cases is a major challenge when working with language models.
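If you want to dig into cases like this, one option is to return the full score distribution and compare a couple of models on the same input. A rough sketch (the model names are just examples; top_k=None assumes a recent transformers release):

```python
# Compare how different sentiment models score the same tricky input.
from transformers import pipeline

text = "It is fun to hurt people"

for model_name in [
    "distilbert-base-uncased-finetuned-sst-2-english",
    "cardiffnlp/twitter-roberta-base-sentiment-latest",
]:
    clf = pipeline("text-classification", model=model_name, top_k=None)
    print(model_name, clf(text))
```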
Transformers and PyTorch are memory-hungry.
6:41
too much effort