I'm literally binge watching your videos like Netflix!
This is amazing! You literally made the topic interesting, I never got bored throughout the entire video!
I'll definitely stick to your channel in preparing for my thesis!
This is the first subscription I have made on YouTube... and that's only because of your teaching skills. You make any topic very interesting and provide excellent information on research and its techniques. Please upload videos on autoencoders.
Same here! 😊😊
Really a great explanation. I played only one video, but after that I watched 4 more videos in a row, which helped me a lot to prepare for the interview I was going to have the next day. Share your divine knowledge with us, sir!
you are 200% better than my professor in explaining Word Embedding
I thought I would never learn Deep Learning because it is too complicated, but you have explained it in such an effective way. Within a week, LSTM will be added as a skill on my resume!
The hottest topic in NLP. I learnt it after putting in a lot of time, and I am sharing what I learnt.
For embedding we use:
the one-hot encoding technique,
the Word2vec technique,
or the embedding layer of Keras, applied after pad sequencing.
You can then use the result with a recurrent neural network.
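The pipeline described above can be sketched in a few lines. This is a minimal illustration using numpy to stand in for what a tokenizer, pad_sequences, and an embedding layer actually do; the vocabulary, sentence, and dimensions here are all made up for the example:

```python
import numpy as np

# Toy vocabulary; a real pipeline would build this with a tokenizer
vocab = {"<pad>": 0, "the": 1, "food": 2, "is": 3, "good": 4, "bad": 5}

def encode(sentence, max_len=4):
    """Map words to integer ids, then pad/truncate to a fixed length."""
    ids = [vocab.get(w, 0) for w in sentence.lower().split()]
    return ids[:max_len] + [0] * (max_len - len(ids))

embedding_dim = 3
rng = np.random.default_rng(0)
# An embedding layer is just a trainable lookup table: one row per vocab id
embedding_matrix = rng.normal(size=(len(vocab), embedding_dim))

ids = encode("the food is good")   # -> [1, 2, 3, 4]
vectors = embedding_matrix[ids]    # shape (4, 3): one dense vector per timestep
# `vectors` is the kind of sequence an RNN/LSTM layer would then consume
```

In Keras the same three steps are the Tokenizer, pad_sequences, and the Embedding layer, whose weight matrix is learned during training rather than fixed at random.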
Word Embedding is such a masterpiece!
His channel and StatQuest are 2 of the best resources for ML and Data Science on YouTube.
Nice, clean description. Well done, Krish!!
I don't usually subscribe to youtube channels ... but this first video I watched from you got me.
Can't wait for the next video.....
Thanks A Lot .....
Excellent! Love from USA
Your speciality is teaching from scratch sir
Keep posting great content. It's worth sharing your content with everyone! Thank you!!!
Amazing way of teaching, sir!! Great work. Thanks a lot.
Thank you for sharing this wealth of knowledge with us. TBH I've been having trouble grasping how word embeddings work. It's clear they are not as straightforward as basic vectorization approaches like BoW and TF-IDF. Nevertheless, I gained something from this video.
@Krish - How did you get these features (like Gender, Royal, Age, etc.)? Are these features specified anywhere in the word embedding?
Krish, you save my life every time
Awesome video!! You are an excellent teacher. Wish I had a teacher like you in my Master's program right now!
Super great! Thanks very much, Krish. It is a very good learning video; instructor Krish is great, passionate, and exciting. The lesson is very interesting.
I think I found a gem! Thank you Krish! You really make it so easy to understand.
Sir your explanations are fantastic
Try showing embedding projector. It’s an interesting way to visualise embedding and sparks interest.
Super, sir, I enjoyed it before the exam.
You are simply amazing sir....hats off to you💯💯
These videos are adding to my knowledge. Thank you, @krish naik sir!
you are a gem man , love your style
What an awesome explanation. Thank you @Krish
You have done a great job. There are students like me who really need such an explanation to get a rough image, at least. Thanks a lot.
"Rough image" means I got an idea, and now I can study more on it and try to sharpen the image created by your wonderful explanation.
I'm watching at least 5-6 videos every day these days.
Thanks krish
applauding for you !! thank you again!
Thank you, sir. Please make a video explaining how a sentence is translated with a neural system, but with an example. Thank you again, you are amazing.
Perfect explanation... thanks a lot
Thanks for the tutorial on word embedding models. I wonder how features are selected in these models? I think in some particular cases, having control over customizing these features might improve the chance of getting more similar words than just using the pretrained ones.
Good content, keep doing videos like this!
Subscribed !
Thanks a lot. Hopefully your videos will be helpful for my thesis project!
Sir, kindly guide: how can I use pretrained word embedding models for local languages (or languages written in Roman script) that are not available/trained in the pretrained model? Do I have to use a (non-pretrained) embedding layer to create embedding matrices for a local language? How can I benefit from pretrained models for a local language?
Wonderful Video Krish
Sir, can you please tell me how to choose the number of embedding dimensions?
For example, with vocab_size = 10000 and max_length = 120, what should embedding_dim be?
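There is no single correct answer to the question above. One rule of thumb sometimes quoted is the fourth root of the vocabulary size, while hand-picked values in the 50-300 range are common in practice; the sequence length (max_length) does not affect the choice at all. A sketch with the numbers from the question:

```python
vocab_size = 10000   # from the question above
max_length = 120     # sequence length; irrelevant to embedding_dim

# Rule of thumb (a heuristic, not a law): embedding_dim ~ vocab_size ** 0.25
embedding_dim = int(round(vocab_size ** 0.25))
print(embedding_dim)  # 10 for a 10,000-word vocabulary
```

In practice, treat embedding_dim as a hyperparameter: start small, and increase it if the model underfits.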
Your explanations are great, thank you
Nicely explained. Thanks for making things clear.
Excellent tutorials .
Sir that's amazing 😍😊
Good explanation krish sir
How do we decide the features (gender, age, ...) in this technique?
no words for you sir.. 👌👌
Would you please make a video on the gradient boosting and XGBoost ML algorithms, with all the math?
Will upload soon
An LSTM practical session is also required, sir.
You haven't made any video on gradient boosting yet... actually, the boosting series is incomplete.
Please make a video on the GB technique.
amazing brother thank you
Good day, may I ask how to define specific feature dimensions (for example, I want to extract linguistic features such as part-of-speech tags, word density, and word frequency) that are going to be vectorized?
Amazing explanation thank you!
Please make a video on dimensionality reduction techniques, to reduce from many dimensions down to 2 dimensions (with coding).
Superb... Is the word embedding fixed, or generated anew for every dataset given for training?
You are amazing❤️
Very well explained
good explanation sir
Thanks krish for the video.
Great explanation
Nice explanation, sir, always waiting for new videos. We are looking forward to your book publication.
Sir please make a video on sentiment analysis using VADER.
thanks this is awesome
Hi Krish, first of all, thanks for making this content.
I have some doubts:
1. In predicting the analogy (King -> Queen), if we take up some other feature instead of gender, then the result may not necessarily be true (or does it select the best feature to give results?)
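The analogy prediction the doubt above refers to is usually done with vector arithmetic plus cosine similarity over all dimensions at once, rather than by picking one feature. A tiny sketch with hypothetical 3-feature vectors (gender, royal, age); all the numbers are invented for illustration:

```python
import numpy as np

# Hypothetical embeddings with made-up (gender, royal, age) coordinates
emb = {
    "king":  np.array([-1.0, 0.95, 0.7]),
    "queen": np.array([ 1.0, 0.96, 0.6]),
    "man":   np.array([-1.0, 0.01, 0.5]),
    "woman": np.array([ 1.0, 0.02, 0.5]),
    "apple": np.array([ 0.0, 0.00, 0.1]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# king - man + woman should land near queen in the embedding space
target = emb["king"] - emb["man"] + emb["woman"]
# As in gensim's most_similar, the input words are excluded from candidates
best = max((w for w in emb if w not in {"king", "man", "woman"}),
           key=lambda w: cosine(emb[w], target))
print(best)  # "queen"
```

Because the whole vector is compared, no single "best feature" is selected; the answer is whichever word is closest to the arithmetic result across all dimensions.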
very good, thank you sir !
thank you so much sir
It was really helpful. Can you make videos on grammar correction using rule-based methods, language models & classifiers?
It's really hard to understand otherwise.
I'm proud of you... But please don't join Byju's or Vedantu, etc. You are the best...
Waiting for your FaceNet embedding... and the clustering process.
Simply awesome.
What kind of mathematics and statistics is required for a data science career? Sir, please make a video on this topic.
Basic level
Thank you sir
excellent
I would like to know how those coefficients in the vertical columns can be obtained automatically, and what scientific assumptions and premises should be used to do it. As I see it now, it can be done manually by logical consideration. Thanks!
Many thanks
Hi krish,
What is the parameter update equation in SVM and logistic regression?
Sir, please make a video on whether a degree is required for a data science career, or a certification is enough to get a job.
Sir, you left out the most important part, word2vec. Please do cover it.
And please also cover the "maximum 300 dimensions" idea; it's really difficult for me to get.
Pytorch or Tensorflow 2.0 which one is better for beginners?
Keras
Hey, in word embedding, how are the features defined? Are they extracted from the document itself, or is there a predefined feature set available for the related domain or the whole language?
MAKING NO ONE ANY WISER.
Sir, kindly make a video on how we can embed source code into vectors and use it for training a DL model.
Thanks!
@Krish Naik, at 7:08, I would like to know if you can share some material on the technique we use to relate features such as gender to words like boy and girl, but not to apple and mango. I have doubts about what technique makes it possible for a machine to learn the relation between features and words. Thanks.
Can you explain the last video in this "Introduction to Word Embeddings" series? "Embeddings matrix"
Are you able to use NLP to build a Siri-type app in your own foreign language? Thanks for the tutorial.
Hello, suppose we need to add more features to our X which are not text, i.e., suppose we get a sparse matrix after CountVectorizer and now we have one more feature, document length, and we want both features. How do we combine them?
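One common answer to the question above (a sketch with made-up numbers, assuming scipy is available) is to keep the count matrix sparse and append the extra column with scipy.sparse.hstack:

```python
import numpy as np
from scipy.sparse import csr_matrix, hstack

# Stand-in for CountVectorizer output: 2 documents, 3 vocabulary terms
X_text = csr_matrix(np.array([[1, 0, 2],
                              [0, 1, 1]]))

# The extra non-text feature: one document length per row, as a column vector
doc_len = np.array([[12], [7]])

# hstack appends the new column while keeping the whole matrix sparse
X = hstack([X_text, csr_matrix(doc_len)]).tocsr()
print(X.shape)  # (2, 4): 3 count columns + 1 length column
```

In practice the length column is usually scaled first, since raw counts and document lengths live on very different ranges.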
Can you please make a video on a 'Hybrid Deep Learning Model for Sentiment Classification', that is, an implementation of CNN and LSTM together for sentiment classification?
What is the difference between a word embedding layer and the Word2Vec class?
Sir, how will these parameters be decided, as you did, like gender, royal, age, food, etc.?
Can you make a video on balancing an imbalanced text dataset?
This is really, really amazing. Thank you for your efforts. Can you make a video on sentence embedding as well?
There is no such thing as sentence embedding as per my knowledge
Is word embedding similar to a pandas pivot table, except that we provide the features here?
I don't really get word embedding. Does it work outside mainstream English? For example, medical language is different. If I am studying medical literature, a lot of my main vocabulary consists of medical words. What is your opinion on this?
I am facing an issue: trying to install nltk and spacy, it's asking me to downgrade TensorFlow from version 2 to version 1.x. What can be done to install them without downgrading TF?
Sir,kindly make some basic videos on Pandas.
Check my complete ML playlist
Sir, when you said that we related gender to boy, is this done by the machine, or is it predefined? I mean to say, on what basis does it all come about that gender is related to boy and girl, and royal is related to queen and king? Thank you, sir.
Hey, can you please explain on what basis you are deciding these vector values, like 0.01, 0.03, etc.?
Refer to the word2vec video on the ritvikmath YouTube channel. Thank me later😇😆
Using the Keras Embedding layer, can I embed words from any language?
👏👏