I spent the past two days trying to study word2vec. I visited all the favourite YouTube channels, read articles on Medium, and even tried to read the paper from arXiv. This video covered and explained word2vec most beautifully. Amazing; it made me feel I actually learned word2vec with no point left out. The diagrammatic explanation of CBOW and skip-gram stole the show.
He's the best
I agree. All videos are really helpful.
I just wanted to drop a quick note to say how much I appreciate your RUclips videos. Your teaching style is fantastic, and the content is incredibly helpful. Thank you for making learning so enjoyable and accessible.
I don't know why, but this guy is so underrated! His content deserves to be promoted; can't believe I'm getting to see such content for free! Thank you, Sir.
I have never seen a video that goes through the minute details of word2vec except yours. Thanks a lot.
@00:57 - word embeddings
@04:48 - what is word2vec?
@09:34 - how is w2v built?
@10:00 - w2v Demo
@21:04 - w2v intuition
@34:33 - Types of w2v architectures
@35:52 - CBoW
@50:28 - Skip-gram
@56:23 - Training own model - GoT data
@01:14:23 - assignment
The world needs more people like you, bro 🫡🫡🫡
@campusX you can add these timestamps/chapters to your video for better reach. Thanks for the awesome content, Sir.
Hi Utkarsh, I am new to this course. May I know why we get the final vector only by subtraction and addition? Are there other ways to do the same? Kindly explain.
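For context, gensim's most_similar implements exactly this king - man + woman style query: it adds the positive vectors, subtracts the negative ones, and ranks the vocabulary by cosine similarity, so addition/subtraction is the standard demo rather than the only possibility. A minimal sketch, assuming the pretrained GoogleNews vectors from the video:

from gensim.models import KeyedVectors

# load the pretrained vectors used in the demo (path assumed from the video)
wv = KeyedVectors.load_word2vec_format('GoogleNews-vectors-negative300.bin.gz', binary=True)

# king - man + woman: positives are added, negatives subtracted,
# and the result is ranked against the vocabulary by cosine similarity
print(wv.most_similar(positive=['king', 'woman'], negative=['man'], topn=3))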
Speak English. I'm not an Indian.
I never thought that one day the knowledge of "Game of Thrones" would help me understand an ML concept :D By the way, great explanation 🙌🏻🙌🏻 Like always, thank you so much :)
This is insanely good.
I was studying embeddings from an Andrew Ng course. I understood stuff but wasn't sure.
Then went for Krish Naik. I understood more, but not fully.
Then a friend recommended this video.
Seems like I have a new go-to data science RUclipsr.
Subscribed. Keep up the good work.
Best explanation of w2v that I have ever come across. I was struggling to find the actual detailed info, but this laid the whole case to rest. Thank you very much for making such a detailed explanation.
Underrated channel... no one is teaching like you... even though higher education is difficult.
Sir, thank you so much for your number-one, superb lectures. Please keep making lectures like these. After watching your lecture, I am very satisfied and happy, with a complete understanding of the concept.
I like how clearly the statements are addressed... a lot to learn. Thanks.
@Campusx Thank you so much, sir. I really love your videos; please continue this playlist through BERT, ALBERT, DistilBERT, and GPT as well.
Yes, much needed... but I guess his last video on this playlist was a year ago... Sir, please continue... nobody can match your teaching skills.
While training, where have we chosen the architecture? CBOW or skip-gram?
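If this is about gensim, the choice is the sg flag in the Word2Vec constructor (0 = CBOW, which is the default; 1 = skip-gram) — a minimal sketch, assuming the tokenized sentences from the video live in a variable named story (hypothetical name):

from gensim.models import Word2Vec

# sg=0 -> CBOW (gensim's default), sg=1 -> skip-gram
cbow_model = Word2Vec(sentences=story, vector_size=100, window=10, min_count=2, sg=0)
sg_model = Word2Vec(sentences=story, vector_size=100, window=10, min_count=2, sg=1)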
Never found such an easy-to-understand explanation on the whole internet.
The intuition is very well explained; now I can finally say I know word2vec.
Thank you so much for putting such effort.
Big Thank You Sir (Guru). This NLP series is very helpful for us.
Came here after watching Krish Naik's video; must say the explanation was just perfect.
OMG... man, you have 100% followed the "Feynman technique" of learning. Awesome!!!
Very well explained. It's a blessing to see content from such a great and passionate teacher, that too free of cost... I don't think even paid courses can explain things with such great clarity.
Hi, I just wanted to know: have you tried running the code? While downloading that 1.5 GB file I'm getting an error; please let me know if you were able to download it.
I can say your way of explaining the concepts is so good and easy to understand. Your content is the best among most YouTube channels. Hope your videos reach a larger audience in future 👍
You teach very well 👍👍
Please continue, sir.
Very nicely explained. Many concepts got cleared in a single video.
Sir, watching every new video of yours makes me feel: I have only one heart, how many times will you win it?
Immensely helpful content, very well structured and explained. Thank you for sharing this on YouTube while people are charging hefty amounts for the same. I appreciate your intentions of social welfare.
I have no words for this brilliant explanation.
A very in-depth and great explanation of word2vec... the best one by far!! Thanks for the videos 👍👍
Sir, you are a really amazing person... Your way of explaining is great and easily understandable... one of the best channels, CampusX ☺
Finally! After waiting for two weeks... it has arrived. Hope your holiday went well, sir... will watch and then comment later if I have any doubts.
Nitish, your explanations are mind-boggling! Superb!!
Amazing explanation, bro, superb. Clear and complete concepts about word2vec; thanks for making such great content...
Amazing, you wonderfully explained all aspects of the topic
In a word, this tutorial is easy, easy, and easy... ♥♥
I am glad I found you on RUclips :) More power to your work!!
I like the video first, then watch it, because I know it will be the best explanation.
Bro, did you find the Friends dataset?
53:45 small correction: skip-gram with smaller data and CBOW with larger data.
Does anyone else feel like it's illegal to watch this content for free? No way; this guy explains in such a way that literally even a newborn child could understand easily if it learned from him.
Please continue this course.
Please cover topics like named entity recognition, topic modelling, sentiment analysis, etc.
I really appreciate your work; I love watching your videos. I found a small mistake at 54:15: it should be skip-gram for a small dataset and CBOW for a large dataset.
Thanks
Waiting for this lecture 😍😍❤
Nice explanation; I've got clarity on word2vec and the other techniques now.
Thanks, Nitish, for the detailed explanation! 😀
Excellent..! Please continue this playlist
Thank you. Watching in 2024, but it's still very informative, and your teaching style is very good.
Your explanation is too good 👍❤
This series is extremely helpful!
Sir, please think about taking some online NLP classes where we can complete the course on a fast track. These videos are very useful.
Game of Thrones is my favourite series, at 13 years old.
Thanks, Nitish, for this; loving the series!! Two queries:
1. In the GoT example, did we use skip-gram or CBOW?
2. While selecting CBOW vs skip-gram, how can we decide how much data counts as large/small?
According to his explanation, it is CBOW.
That was fun. You are amazing...
Best teacher in the domain of data science, ML, NLP, OpenCV & gen AI!!!
Perfectly explained
Sir, please continue this playlist. Please make videos on transformers, BERT, etc. 😵
Can you please confirm the writing pad that you use for the videos? Thanks in advance.
At 12:38 in the video, the given URL is not working now. Please provide some source to get the GoogleNews-300 dataset.
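For anyone hitting the dead URL: gensim ships a downloader that fetches the same pretrained GoogleNews vectors from its own mirror, which sidesteps the broken link — a workaround sketch, assuming a reasonably recent gensim install:

import gensim.downloader as api

# downloads ~1.5 GB on first use and caches it locally; returns KeyedVectors
wv = api.load('word2vec-google-news-300')
print(wv.most_similar('king', topn=5))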
AOA brother, please make a video on GloVe, because you have great skill in explaining concepts.
Sir, please make a complete playlist on Computer Vision as well... Eagerly waiting.
First of all, thanks for all these amazing videos; they really help so much. I just had one question: is this NLP playlist enough? I mean, does it cover all of NLP, or is there anything more I should explore by myself? Please, sir, just clear this one doubt; it would be a big help.
ruclips.net/video/PKv_okm1H-k/видео.html
@@campusx-official Thanks a lot, sir. I completed the entire playlist and am watching this roadmap now 😅
@@campusx-official Sir, the data execution on the Jupyter notebook is taking a long time 😞
Great explanation 👍
The Google data link at 12:39 isn't working, right?
Kaggle
Also, make a video on self-supervised learning for computer vision applications.
Thank you, Sir ji.
Best lecture ❤️
very informative
Rarest knowledge ❤
Oh, Ben Stokes........ that was great fun.
You have made my day. 🤩
Sir, your content and explanation are mind-blowing, so thank you, sir. One request: can you provide your OneNote notes file so that we can refer to it when needed?
Nicely explained, sir.
Stopped following Krish Naik after I started watching your videos 😄 Great content!
Referring many friends to your playlist 💪🏽
AttributeError: 'Word2VecKeyedVectors' object has no attribute 'get_normed_vectors'
Getting this error while executing model.wv.get_normed_vectors(); any suggestions, brother?
Problem solved (this uses the older gensim 3.x API, where model.wv.vocab still exists):
my_dict = {}
for key in model.wv.vocab:  # iterate over the vocabulary words
    my_dict[key] = model.wv[key]  # map each word to its learned vector
X = list(my_dict.values())  # the vectors
y = list(model.wv.vocab.keys())  # the words (labels)
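Conversely, if it's index_to_key or get_normed_vectors that are missing, that points to an older gensim; on gensim >= 4.0 the newer API works directly — a sketch, assuming the same model variable:

# gensim >= 4.0: wv.vocab was removed in favour of index_to_key / key_to_index
y = model.wv.index_to_key  # words, ordered by frequency
X = model.wv.get_normed_vectors()  # unit-length vectors, shape (len(y), vector_size)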
True
Same with me bro.
Codebasics + Krish Naik are doing a CCP job, nothing else... just hype; all sound, no proper content or flow 😏😏😏
Very well explained, Sir!
Hello Nitish sir,
I am a fan of your machine learning series and I keep watching. My question is that
(1) model.wv.index_to_key and
(2) model.wv.get_normed_vectors()
are not working.
Thank you so much for this video.
🙏🙏👌👌😊😊
In skip-gram, which weights will be used for the vector representation?
Sir, when we import the file, this issue comes up: BadGzipFile: Not a gzipped file (b'
Hi, I was having issues getting the Google Colab notebook to work; I consistently got a 'not found' error when trying to fetch the pre-existing Google model.
I still don't understand what happens with the two context words after the multiplication in CBOW. Do we average them?
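Replying as a fellow learner: in the original CBOW formulation the projected context vectors are averaged before the output layer — a toy numpy sketch with made-up sizes:

import numpy as np

V, N = 5, 3  # vocab size, embedding size (toy numbers)
W_in = np.random.rand(V, N)  # input-side weights: one embedding row per word
W_out = np.random.rand(N, V)  # output-side weights
context = [0, 2]  # row indices of the two context words
h = W_in[context].mean(axis=0)  # hidden layer = average of the context embeddings
scores = h @ W_out  # one score per vocabulary word
probs = np.exp(scores) / np.exp(scores).sum()  # softmax over the centre word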
Now this GoogleNews link has only 94 entries; is there a different link?
Sir,
1. Why have we used sent_tokenize? Can't we just tokenize individual words directly?
2. Which way was the model created: CBOW or skip-gram?
3. Could you add links to the deep learning lectures that you kept referring to in the video?
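On question 1, my understanding: gensim's Word2Vec expects a list of token lists, one per sentence, so that context windows never cross sentence boundaries; hence the sentence split first, then word tokenization. A sketch, assuming the raw corpus sits in a variable named raw_text (hypothetical name):

from nltk.tokenize import sent_tokenize  # (may require nltk.download('punkt') once)
from gensim.utils import simple_preprocess

story = []
for sent in sent_tokenize(raw_text):  # split the corpus into sentences first...
    story.append(simple_preprocess(sent))  # ...then into lowercase word tokens
# story is now the list-of-token-lists format that Word2Vec trains on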
Nitish, I wanted to know whether diligently following the assignments will do the trick for us, or do we need to study the NLTK and spaCy libraries in detail? Just asking!
Sir, you perhaps forgot to give the blog link at 29:00.
brilliant explanation
According to Wikipedia, CBOW is much faster than skip-gram, so how can we use skip-gram for a large dataset?
In the CBOW section, I think the dimensions are wrong: there are 5 words in total, and the weights coming in from the prior (hidden) layer are 3, so the dimension of the last layer's weight matrix should be 5x3.
That Amazon AWS GoogleNews vectors file is giving a 404 Not Found error in the Colab notebook.
Did you find any solution?
So when you say a 300-dimensional vector is used in word2vec, does that mean it has 300 nodes in the second-to-last layer?
Sir, I have some doubts.
1. Why are there differences between upper-case and lower-case words? If we look at the vectors of the two words "bag" and "Bag", the two vectors are different. 2. Why is there only one hidden layer?
And one more thing: Google has its own GCP, yet the file "GoogleNews-vectors-negative300.bin.gz" is stored on an AWS S3 bucket. Why?
1. Because each character has different binary values.
2. The creators must have tried different architectures.
3. It's not Google who hosted it; it's the creator of gensim, I guess.
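Adding to point 1: word2vec keys vectors by the raw token string, so 'bag' and 'Bag' are learned as two unrelated words unless the text is lowercased during preprocessing — a quick check, assuming gensim's simple_preprocess:

from gensim.utils import simple_preprocess

print(simple_preprocess("The Bag and the bag"))  # ['the', 'bag', 'and', 'the', 'bag']
# after lowercasing, both spellings collapse into one token and share one vector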
Sir, is a linear activation function used in the hidden layer in CBOW and skip-gram?
The same word can be present in different sentences, right? So do we calculate a word vector for each sentence and take the average?
Sir, please share the links for the basics of neural networks.
That Amazon AWS GoogleNews vectors file is giving a 404 Not Found error in the Colab notebook. How can I solve this to get that file?? Anybody, please help me.
Why did you take 10 nodes in the neural network input?
Because, sir, we are learning from you here, but for interviews, especially on how to explain a project (particularly an ML one), if you make a video on that it would be a big help, sir.
Great 😍
Sir, please make a video on an object detection model.
'My machine started crying' 😂
Sir, can you do one on BERT, please?
Super, sir.
Thank you so much, Sir, for this nicely explained tutorial. I have just one doubt, Sir: if a sentence has only 3 words, but while training the model on the GoT data at the end we used a window size of 10, what context words will it consider other than its two actual context words? (Sir, can we assume the whole corpus is treated as one single sentence containing all the words in the vocabulary?)
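For what it's worth, gensim's window parameter is a maximum distance, not a requirement: a 3-word sentence simply contributes at most two context words per centre word, and windows never extend across sentence boundaries, so the corpus is not treated as one long sentence. A sketch, assuming the tokenized story list from earlier:

from gensim.models import Word2Vec

# window=10 caps the context at 10 words on each side; shorter
# sentences just yield fewer (centre, context) training pairs
model = Word2Vec(sentences=story, vector_size=100, window=10, min_count=2)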
Thanks a lot sir 🙏
Where can I find the AWS link for the GoogleNews file?
I loved your videos; please make a video on transformers too.