Word Embedding - Natural Language Processing | Deep Learning
- Published: 14 Oct 2024
- A word embedding is a learned representation for text where words that have the same meaning have a similar representation. It is this approach to representing words and documents that may be considered one of the key breakthroughs of deep learning on challenging natural language processing problems.
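As a minimal sketch of this idea (not taken from the video): each word maps to a dense vector, and "similar meaning" becomes "similar direction", measured with cosine similarity. The feature names and all vector values below are made up purely for illustration; real embeddings are learned, and their dimensions usually have no human-readable names.

```python
import math

# Toy hand-crafted vectors over four made-up features [gender, royal, age, food].
embeddings = {
    "boy":   [-1.0, 0.00, 0.1, 0.0],
    "girl":  [ 1.0, 0.00, 0.1, 0.0],
    "king":  [-1.0, 0.95, 0.8, 0.0],
    "queen": [ 1.0, 0.95, 0.8, 0.0],
    "apple": [ 0.0, 0.00, 0.0, 0.95],
}

def cosine(u, v):
    """Cosine similarity: close to 1 for similar directions, 0 for unrelated."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Words with related meanings end up closer than unrelated ones:
print(cosine(embeddings["king"], embeddings["queen"]))  # noticeably higher than...
print(cosine(embeddings["king"], embeddings["apple"]))  # ...this (here exactly 0.0)
```

In a real model these vectors come out of training on a corpus (word2vec, GloVe, or a Keras Embedding layer), not from hand-picked features.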
Please join my channel as a member to get additional benefits like Data Science materials, live streaming for members, and more
/ @krishnaik06
Please do subscribe to my other channel too
/ @krishnaikhindi
If you want to give a donation to support my channel, below is the GPay ID
GPay: krishnaik06@okicici
Connect with me here:
Twitter: / krishnaik06
Facebook: / krishnaik06
Instagram: / krishnaik06
Really a great explanation. I played just one video, but after that I watched 4 more continuous videos, which helped me a lot to prepare for the interview I was going to have tomorrow. Share your divine knowledge with us, sir!
This is amazing! You literally made the topic interesting, I never got bored throughout the entire video!
I'll definitely stick to your channel in preparing for my thesis!
I'm literally binge watching your videos like Netflix!
This is the first subscription I have made on YouTube... that's only because of your teaching skills. You make any topic very interesting and provide excellent information on research and its techniques. Please upload videos on autoencoders.
Same here! 😊😊
I thought I would not learn Deep Learning because it is too complicated, but you have explained it in such an effective way; within a week, LSTM will be added as a skill on my resume!
you are 200% better than my professor in explaining Word Embedding
@Krish - How did you get these features (like gender, royal, age, etc.)? Are these features mentioned anywhere in word embedding?
Amazing way of teaching, sir!! Great work. Thanks a lot.
Keep posting great content. It's worth sharing your content with everyone! Thank you!!!
Try showing embedding projector. It’s an interesting way to visualise embedding and sparks interest.
I think I found a gem! Thank you Krish! You really make it so easy to understand.
Super great! Thanks very much. Krish. It is very good learning video, instructor Krish is great, passionate, exciting. The lesson is very interesting.
His channel and StatQuest are 2 of the best resources for ML and Data Science on YouTube.
Awesome video!! You are an excellent teacher. I wish I had a teacher like you in my Master's program right now!
I don't usually subscribe to youtube channels ... but this first video I watched from you got me.
The hottest topic of NLP; I learnt it after giving it a lot of time.
I am sharing what I learnt.
For embedding we use:
The one-hot encoding technique
The word2vec technique
The Embedding layer of Keras, after pad sequencing
You can use it for recurrent neural networks
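The pipeline the comment lists can be sketched from scratch in plain Python (this is an illustration of what Keras's Tokenizer, pad_sequences, and Embedding layer do internally, not Keras itself; the sentences and dimension are made up):

```python
import random

sentences = ["the glass of milk", "the cup of tea", "good boy"]

# 1. Build a vocabulary and integer-encode each word (index 0 reserved for padding).
vocab = {w: i + 1 for i, w in enumerate(sorted({w for s in sentences for w in s.split()}))}
encoded = [[vocab[w] for w in s.split()] for s in sentences]

# 2. Pad every sequence to the same length (pad sequencing).
max_length = max(len(seq) for seq in encoded)
padded = [seq + [0] * (max_length - len(seq)) for seq in encoded]

# 3. An embedding layer is a lookup table: one (trainable) vector per word index.
embedding_dim = 4
random.seed(0)
embedding_matrix = [[random.uniform(-1, 1) for _ in range(embedding_dim)]
                    for _ in range(len(vocab) + 1)]

# Replacing each index with its vector gives the 3-D input an RNN would consume.
embedded = [[embedding_matrix[idx] for idx in seq] for seq in padded]
print(len(embedded), len(embedded[0]), len(embedded[0][0]))  # 3 4 4
```

In Keras itself, the training process would update the rows of the embedding matrix by backpropagation instead of leaving them random.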
Nice, clean description. Well done, Krish!!!
Sir your explanations are fantastic
Your speciality is teaching from scratch sir
Thank you, sir. Please make a video explaining how a sentence is being translated with a neural system, with an example. Thank you again; you are amazing.
These videos are adding to my knowledge. Thank you, @krish naik sir!
Thanks for the tutorial on word embedding models. I wonder how features are selected in these models? I think in some particular cases, having control over customizing these features might improve the chance of getting more similar words than just using the pretrained ones.
Thanks krish
Superb... Is the word embedding fixed, or generated anew for every dataset given for training?
Word Embedding is such a masterpiece!
Thank you for sharing this wealth of knowledge with us. TBH, I've been having issues grasping how word embeddings work. It's clear that they are not as straightforward as basic vectorization approaches like BoW and TF-IDF. Nevertheless, I gained something from this video.
You are simply amazing sir....hats off to you💯💯
Super, sir, I enjoyed it before the exam.
Subscribed !
Thanks a lot. Hopefully your videos will be helpful for my thesis project!
no words for you sir.. 👌👌
Would you please make a video on the gradient boosting and XGBoost ML algorithms, with all the maths...
Will upload soon
Can't wait for the next video.....
Thanks A Lot .....
Sir, kindly guide: How can I use pretrained word embedding models for local languages (or languages written in Roman script) that are not available/trained in the pretrained model? Do I have to use a (non-pretrained) embedding layer to create embedding matrices for a local language? How can I benefit from pretrained models for a local language?
Hi Krish, First of all thanks for making this content.
I have some doubts:
1. In predicting an analogy (king -> queen), if we take some other feature instead of gender, then the result may not necessarily be true (or does it select the best feature to give results?)
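The king -> queen analogy can be sketched as vector arithmetic (all vectors below are made up for illustration over hypothetical features [gender, royal, age, food]; learned embeddings behave this way only approximately):

```python
import math

emb = {
    "boy":   [-1.0, 0.00, 0.1, 0.0],
    "girl":  [ 1.0, 0.00, 0.1, 0.0],
    "king":  [-1.0, 0.95, 0.8, 0.0],
    "queen": [ 1.0, 0.95, 0.8, 0.0],
    "apple": [ 0.0, 0.00, 0.0, 0.95],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

# king - boy + girl flips only the gender component, keeping royal/age intact.
target = [k - b + g for k, b, g in zip(emb["king"], emb["boy"], emb["girl"])]

# Nearest remaining word by cosine similarity:
best = max((w for w in emb if w not in {"king", "boy", "girl"}),
           key=lambda w: cosine(target, emb[w]))
print(best)  # queen
```

The analogy only works along directions the embedding has actually encoded; with a feature unrelated to the word pair, the offset would point nowhere meaningful, which is exactly the doubt raised above.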
These days I'm watching at least 5-6 videos every day.
What an awesome explanation. Thank you @Krish
Excellent! Love from USA
you are a gem man , love your style
Perfect explanation... thanks a lot
Krish, you save my life every time
Sir, can you please tell me how to choose the number of embedding dimensions?
For e.g., vocab_size = 10000, max_length = 120, and embedding_dim = ??
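One common heuristic (not from the video, and not a hard rule) is to take roughly the fourth root of the vocabulary size; in practice people also just try values in the 50-300 range and tune:

```python
# Heuristic: embedding_dim ≈ vocab_size ** 0.25.
# Note max_length does not affect the embedding dimension, only sequence padding.
vocab_size = 10000
embedding_dim = round(vocab_size ** 0.25)
print(embedding_dim)  # 10
```

The right value ultimately depends on the task and dataset size, so treat this only as a starting point for tuning.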
You haven't made any video on gradient boosting yet... actually, the boosting series is incomplete.
Please make a video on the GB technique.
Nicely explained. Thanks for making things clear.
Your explanations are great, thank you
You have done a great job. There are students like me who really need such an explanation to get a rough image... at least. Thanks a lot.
Rough image... means I got an idea, and now I can study more on it and try to sharpen the image created by your wonderful explanation.
amazing brother thank you
Excellent tutorials .
Very well explained
An LSTM practical session is also required, sir.
Please make a video on dimensionality reduction techniques for reducing higher-dimensional data down to 2 dimensions (coding).
Good day, may I ask how to define specific dimensions of features (for example, I want to extract linguistic features such as part-of-speech tagging, word density, and word frequency) that are going to be vectorized?
Good explanation krish sir
Good content, keep doing videos like this!
Nice explanation, sir; always waiting for new videos. We are looking forward to your book publication.
It was really helpful. Can you make videos on grammar correction using rule-based methods, language models & classifiers?
It's really hard to understand otherwise.
Sir that's amazing 😍😊
Amazing explanation thank you!
applauding for you !! thank you again!
Wonderful Video Krish
good explanation sir
thanks this is awesome
thank you so much sir
Many thanks
Sir please make a video on sentiment analysis using VADER.
Thank you sir
You are amazing❤️
How do we decide on the features (gender, age...) in this technique?
excellent
Waiting for your FaceNet embedding... and clustering process.
Great explanation
Hey, in word embedding, how are the features defined? Are they extracted from the document itself, or is there a predefined feature set available for the related domain or the whole language?
Thanks!
Sir, please make a video on whether a degree is required for a data scientist career, or whether a certification is enough for a job.
This is really, really amazing. Thank you for your efforts. Can you make a video on sentence embedding as well?
There is no such thing as sentence embedding as per my knowledge
Are you able to use NLP to build a Siri-type app in your own foreign language? Thanks for the tutorial.
Thanks krish for the video.
I would like to know how those coefficients in the vertical columns can be obtained automatically, and what scientific assumptions and premises should be used to do it. As I see it now, it can be done manually by logical consideration. Thanks!
very good, thank you sir !
What kind of mathematics and statistics are required for a data science career? Sir, please make a video on this topic.
Basic level
@Krish Naik, at 7:08 I would like to know if you can share some material on the technique that relates features such as gender to words like boy and girl, but not to apple and mango. I have doubts about what technique makes it possible for a machine to learn the relation between features and words. Thanks.
Sir, how will these parameters be decided as you did, like gender, royal, age, food, etc.?
Hi krish,
What is the parameter update equation in SVM and logistic regression?
Sir, you left out the most important part, word2vec; please do cover it.
And also cover the "maximum 300 dimensions" concept; it's really difficult for me to get.
Sir, kindly make a video on how we can embed source code into vectors and use it for training a DL model.
Sir, when you said that we related gender to boy: is this done by the machine, or is it predefined? I mean to say, on what basis does it come about that gender is related to boy and girl, and royal is related to queen and king??? Thank you, sir.
PyTorch or TensorFlow 2.0: which one is better for beginners?
Keras
Can you please make a video on a 'Hybrid Deep Learning Model for Sentiment Classification', that is, an implementation of CNN and LSTM together for sentiment classification?
Is word embedding similar to a pandas pivot table, except that we provide features here?
Can you explain the last video in this "Introduction to Word Embeddings" series, "Embeddings matrix"?
Can you make a video on balancing an imbalanced text dataset?
Simply awesome.
I don't really get word embedding. Does it work outside mainstream English? For example, medical language is different. If I am studying medical literature, a lot of my main vocabulary is medical words. What is your opinion on this?
What is the difference between a word embedding layer and the Word2Vec class?
Hello, suppose we need to add more features to our X which are not text, i.e., suppose we get a sparse matrix after CountVectorizer, and now we have one more feature, length, and we want both features. How do we combine both?
👏👏
Hey, can you please explain on what basis you are deciding these vector values, like 0.01, 0.03, etc.?
Refer to the word2vec video on the ritvikmath YouTube channel. Thank me later 😇😆
Sir,kindly make some basic videos on Pandas.
Check my complete ML playlist
I have one doubt; can you clear it up? My doubt is how it assigns the values for opposite genders, like -1 for boy and 1 for girl.
I am facing an issue trying to install nltk and spacy; it's asking me to downgrade TensorFlow from version 2 to version 1.X. What can be done to install them without downgrading TF?
Sir, videos no. 35 and 36 of this series are the same video; would you like to check?
Using the Keras Embedding layer, can I embed words from any language?