Word2Vec Simplified|Word2Vec explained in simple language|CBOW and Skip-gram methods in word2vec

  • Published: 7 Nov 2024

Comments • 153

  • @abdullahilawal3220
    @abdullahilawal3220 3 years ago +19

    Masha Allah, I have searched for three years for a simple explanation of w2v until I found your video today. You have done a great job. Thanks a lot.

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago +2

      Your comment made my day, Abdullahi. Thanks for such motivation. Keep watching.

  • @debbs_io
    @debbs_io 6 months ago +1

    Finally a simple, clear, and complete explanation of word embeddings. Thank you so much. I've studied and searched for months and you've just made the whole thing clear and straightforward.

  • @victorsolomonmachinelearning
    @victorsolomonmachinelearning 2 years ago +4

    I've been searching for a comprehensive explanation without the technical jargon. You are the best teacher.

  • @metalhamster14
    @metalhamster14 2 years ago +2

    You have a gift.
    A natural teacher.

  • @LocksRevenge
    @LocksRevenge 11 months ago +1

    Simplified and concise. Thank you, sir.

  • @caoilim
    @caoilim 2 years ago

    I rarely comment, but this is just great. It really helped me with my college assignment; I was tearing my hair out before this. Genuinely, thank you.

  • @pankajuprety9216
    @pankajuprety9216 3 years ago +1

    Your explanation is simple, clear, and the best.

  • @henriquemaulerborges1390
    @henriquemaulerborges1390 2 years ago +1

    Your video is very simple to understand, thank you, man!

  • @SaHaRaSquad
    @SaHaRaSquad 1 year ago

    Finally an understandable explanation of how those methods work. Short and to the point, thanks for this video.

  • @nikitasharma4957
    @nikitasharma4957 2 years ago

    The moment I understood the difference between a word and a vector, I 👍 ❤️ liked it, seriously. After that I watched the full video. Awesome explanation.

  • @shivnathsinghgaur5972
    @shivnathsinghgaur5972 2 years ago +1

    I spent too much time trying to get the logic behind word2vec, but now I am satisfied.

  • @bekiofficialchannel9746
    @bekiofficialchannel9746 2 years ago

    Thank you! I got the basic concepts of word2vec.

  • @uwaisahamedimad556
    @uwaisahamedimad556 2 years ago

    I wrote the same on one of your videos I watched previously, but I still want to write it: you are really unfolding data science.

  • @RezoanurRahman
    @RezoanurRahman 3 years ago

    You literally cannot make it simpler than this.

  • @dub161
    @dub161 3 months ago

    Crazy, you make things so simple!

  • @arcanepersona1676
    @arcanepersona1676 1 year ago +2

    May Allah increase your knowledge, my friend. I am just starting with NLP and you helped me learn the basics well. I didn't know what one-hot encoding meant. Thanks, and keep it up.
    Also, could you please do an end-to-end NLP project, for example sentiment analysis?
    I want to see these techniques in real action.
    Thanks again :)

  • @SACHINKUMAR-zo5pm
    @SACHINKUMAR-zo5pm 2 years ago

    I find your videos very helpful. For many topics that are not explained well on other channels, I come to your channel to see if a video is available, and you have never disappointed me. Keep up the good work. Thanks.

  • @krishnab6444
    @krishnab6444 2 years ago

    Perfectly explained, hats off to you!! Thank you.

  • @fatemeazizi2224
    @fatemeazizi2224 1 year ago

    Simply explained, thank you.

  • @mrdh927
    @mrdh927 3 years ago +1

    Thank you, Aman, I really appreciate your simple explanation.

  • @RohanPaul-AI
    @RohanPaul-AI 3 years ago +1

    As always, superbly simple and great explanation. Many thanks.

  • @SallyMwalrMusic
    @SallyMwalrMusic 6 months ago

    Just sending love and regards. You seem like a good person, and I appreciate your efforts to teach others.

  • @vijaym4598
    @vijaym4598 3 years ago

    Very simplified explanation of Word2Vec. Thanks, and all the best.

  • @pranjalgupta9427
    @pranjalgupta9427 3 years ago +1

    Nice explanation

  • @jialaiz9101
    @jialaiz9101 3 years ago +2

    Very intuitive, thanks man.

  • @piyalikarmakar5979
    @piyalikarmakar5979 3 years ago

    Thank you so much, sir. The documentation had me confused. Thanks a lot for such a clear and easy explanation.

  • @sandipansarkar9211
    @sandipansarkar9211 2 years ago

    Finished watching.

  • @ShahzadKhan-zq9ty
    @ShahzadKhan-zq9ty 3 years ago +1

    One of the finest explanations, thanks bro.

  • @SibanandaPattanaik
    @SibanandaPattanaik 2 years ago +1

    @Unfold Data Science: the encoding you do at the start, I don't think it is one-hot encoding; instead it's a BOW representation. Correct me if I'm wrong.
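
    (A minimal sketch of the distinction, using a toy vocabulary; this is an editorial illustration, not code from the video. One-hot encodes a single word as a vector with exactly one 1, while a BOW vector counts every vocabulary word in a document.)

        # One-hot vs. bag-of-words over a toy vocabulary
        vocab = ["apple", "mango", "is", "a", "fruit"]
        word_index = {w: i for i, w in enumerate(vocab)}

        def one_hot(word):
            # Exactly one position is 1: represents a single word
            vec = [0] * len(vocab)
            vec[word_index[word]] = 1
            return vec

        def bag_of_words(document):
            # Counts every vocabulary word: represents a whole document
            vec = [0] * len(vocab)
            for word in document:
                vec[word_index[word]] += 1
            return vec

        print(one_hot("apple"))                             # [1, 0, 0, 0, 0]
        print(bag_of_words(["apple", "is", "a", "fruit"]))  # [1, 0, 1, 1, 1]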

  • @vittalhundarad1644
    @vittalhundarad1644 2 years ago

    Thank you, sir, nice explanation.

  • @mahikhan5716
    @mahikhan5716 2 years ago

    Great, Aman. Your teaching technique is really tremendous.

  • @MrBansal907
    @MrBansal907 4 months ago

    Good job, sir, thank you.

  • @karthickd537
    @karthickd537 3 years ago

    Very nicely explained in a very simple way.

  • @suhanshupatel9204
    @suhanshupatel9204 3 years ago +1

    Thank you, brother.

  • @gurmindersinghmalhotra8184
    @gurmindersinghmalhotra8184 3 years ago

    Aman bhai, thanks for the excellent explanation.

  • @udayradhe6537
    @udayradhe6537 2 years ago

    Amazing! So well explained in simple terms.

  • @NightSurge.
    @NightSurge. 3 years ago

    Great video, straight to the point and easy to understand.

  • @pakistantechknowledge3911
    @pakistantechknowledge3911 2 years ago

    You mean the sparse vector of a word will give a dense vector? How are the features selected, and how are the values assigned?
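
    (For readers with the same question: the dense values are not selected by hand; they are weights learned by the network during training. A minimal sketch with gensim 4.x, an assumed library choice since the video does not name one:)

        # Train a tiny Word2Vec model and inspect a learned dense vector
        from gensim.models import Word2Vec

        sentences = [
            ["apple", "is", "a", "fruit"],
            ["mango", "is", "a", "fruit"],
            ["paris", "is", "a", "city"],
        ]
        model = Word2Vec(sentences, vector_size=10, window=2, min_count=1, epochs=100)

        # These 10 numbers are learned weights, not hand-assigned features
        print(model.wv["apple"])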

  • @learn-and-live
    @learn-and-live 2 years ago

    Thank you for this clear explanation!

  • @rejithmadhavan5735
    @rejithmadhavan5735 2 years ago

    Precise explanation. Very useful. Thank you, Aman. Just one suggestion: visually it would be more soothing if you could write in straight lines on your whiteboard.

  • @exuberantyouth8765
    @exuberantyouth8765 1 year ago

    Thanks, Aman.

  • @AbhishekKumar-hh5oh
    @AbhishekKumar-hh5oh 3 years ago

    Well explained and in a simplified way. Thanks a lot for this video.

  • @sharatchandra2045
    @sharatchandra2045 4 years ago

    Good and simple explanation.

  • @linxin3811
    @linxin3811 4 years ago +3

    Very good introduction to Word2Vec. I would have preferred a bigger writing space, but it was nonetheless very clear.

  • @anurodhchoudhary1689
    @anurodhchoudhary1689 3 years ago

    When legends start to teach, everything becomes easy!

  • @aniketwagh570
    @aniketwagh570 3 years ago

    Awesome video, sir, really awesome.

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago

      So nice of you, Aniket. Your comments are precious to me.

  • @omkarjadhav13
    @omkarjadhav13 4 years ago +1

    Good work.

  • @sunilupadhyay7948
    @sunilupadhyay7948 3 years ago

    Very informative, please continue.

  • @sandipansarkar9211
    @sandipansarkar9211 3 years ago

    Great explanation.

  • @ribamarsantarosa4465
    @ribamarsantarosa4465 1 year ago

    Humble and nice presentation :)

  • @RaghavaIndra
    @RaghavaIndra 3 years ago +1

    Very good explanation!

  • @LearningWithFun333
    @LearningWithFun333 3 years ago

    Great explanation. Thanks a lot.

  • @saqibcs
    @saqibcs 2 years ago

    Brother, great explanation. I would appreciate it if you could tell me where I can find the next video.

  • @dhlpaket2247
    @dhlpaket2247 3 years ago +1

    Best man

  • @vaishnavibhosale2272
    @vaishnavibhosale2272 2 years ago

    Great work 👍👍

  • @claudeclaude9991
    @claudeclaude9991 3 years ago

    Man, you are great. Thanks a lot. Do you have any doc2vec video or playlist?

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago

      Thanks a lot. A doc2vec video I will create; I have not created one yet.

  • @findyourpassion1677
    @findyourpassion1677 4 years ago

    Very helpful explanation. Looking forward to seeing more. Thanks in advance.

  • @sudiptamang1018
    @sudiptamang1018 3 years ago

    Thank you.

  • @suriebc6779
    @suriebc6779 3 years ago

    Hi,
    I have 8 years of experience in IT (programming) and have started preparing for a career transition into data science.
    My query is: how should I present this on my resume?
    Should I tell them that I am new to this, know all the concepts, and have worked on Kaggle datasets?
    Will they accept me, or reject me since I have no past experience in data science?
    Looking forward to your valuable comments.

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago

      Hi,
      Answering this query needs some understanding of your profile before giving a suggestion.
      I see you are looking for some career guidance.
      If yes, you can reach me directly for one-on-one mentorship.
      You can go to the "About" section of the channel and click the "One to One Mentorship Link".

  • @unexpected2004
    @unexpected2004 4 years ago

    Great stuff! I couldn't find a better video with such well-assembled content.

  • @שחרכהן-פ6ד
    @שחרכהן-פ6ד 1 year ago

    Thank you!

  • @binaykumar1616
    @binaykumar1616 3 years ago

    You explained nicely. Please use a bigger screen.

  • @shaheenullah3212
    @shaheenullah3212 3 years ago

    Great, sir. Please also share about POS tagging using this approach.

  • @aqsashahzad5139
    @aqsashahzad5139 3 years ago

    Informative video... (y) And of these, which algorithm is best for text classification tasks, CBOW or skip-gram?

  • @juniormedjeufopa3780
    @juniormedjeufopa3780 3 years ago +1

    Thanks :) Great explanation.

  • @piyalikarmakar5979
    @piyalikarmakar5979 3 years ago

    Sir, I have one query: as input, do we need to feed the one-hot representations of the surrounding words including the one-hot vector of the target word, or excluding it?

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago +1

      Hi Piyali. Excluding. For the target, you might need to do separate encoding.

    • @piyalikarmakar5979
      @piyalikarmakar5979 3 years ago

      @@UnfoldDataScience Okay, sir. Then, to get all the word embeddings of an input sentence: the one-hot encodings of the words under the window are passed through CBOW to predict the middle target word from the feature-based information, then the window slides and the same repeats until the end. Is that the process, sir?
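
      (A rough sketch of that sliding window in plain Python, as an editorial illustration rather than the video's code: at each position the surrounding words form the input context and the centre word is the target.)

          # Generate (context, target) training pairs for CBOW, window size 1
          sentence = ["i", "like", "to", "eat", "mango"]
          window = 1

          pairs = []
          for i, target in enumerate(sentence):
              # Context = neighbours on each side, excluding the target itself
              context = sentence[max(0, i - window):i] + sentence[i + 1:i + 1 + window]
              pairs.append((context, target))

          for context, target in pairs:
              print(context, "->", target)  # e.g. ['i', 'to'] -> like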

  • @the_street_coder4433
    @the_street_coder4433 3 years ago

    Awesome, man, I loved the explanation.

  • @omarwaleedsalihalrassam905
    @omarwaleedsalihalrassam905 3 years ago

    Great explanation, well done. My question is: how are the numbers selected to set the features? I mean, in your example at time (5:00) of your video, you assigned 0.9 to apple and 0.85 to mango, etc. How are these numbers assigned?
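
    (For readers with the same question: the 0.9 and 0.85 in the video are illustrative; in a real model such values are weights learned during training, and you read relationships off them rather than assigning them. A small sketch, again assuming gensim 4.x:)

        # Similarities between learned vectors are computed, not assigned
        from gensim.models import Word2Vec

        sentences = [
            ["apple", "is", "a", "sweet", "fruit"],
            ["mango", "is", "a", "sweet", "fruit"],
            ["car", "is", "a", "fast", "vehicle"],
        ] * 50  # repeat the toy corpus so training has enough examples
        model = Word2Vec(sentences, vector_size=20, window=2, min_count=1, epochs=20)

        print(model.wv.similarity("apple", "mango"))  # typically higher
        print(model.wv.similarity("apple", "car"))    # typically lower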

  • @nagithareddy1798
    @nagithareddy1798 3 years ago

    How do I write a program to implement a continuous bag-of-words model with the KNN algorithm (using Python)?

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago

      Hi Nagitha, you can convert words to numbers (vectors) and run KNN.

    • @nagithareddy1798
      @nagithareddy1798 3 years ago

      @@UnfoldDataScience Could you please explain in detail? I'm new to this course and don't know how to do it. If possible, please provide the code with an explanation.
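
      (One way to read the reply above, sketched with gensim and scikit-learn as assumed libraries and made-up labels: represent each text by the average of its word vectors, then classify with KNN.)

          # Word2Vec features + KNN classifier (toy data, hypothetical labels)
          import numpy as np
          from gensim.models import Word2Vec
          from sklearn.neighbors import KNeighborsClassifier

          texts = [["great", "movie"], ["awful", "movie"],
                   ["great", "film"], ["awful", "film"]]
          labels = [1, 0, 1, 0]  # made-up sentiment labels

          w2v = Word2Vec(texts, vector_size=10, window=2, min_count=1, epochs=50)

          def doc_vector(tokens):
              # Average the vectors of the tokens the model knows
              vecs = [w2v.wv[t] for t in tokens if t in w2v.wv]
              return np.mean(vecs, axis=0) if vecs else np.zeros(10)

          X = np.array([doc_vector(t) for t in texts])
          knn = KNeighborsClassifier(n_neighbors=1).fit(X, labels)
          print(knn.predict([doc_vector(["great", "movie"])]))  # -> [1]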

  • @mahikhan5716
    @mahikhan5716 2 years ago

    I have one question: since bag-of-words has issues, isn't it used anywhere?

  • @omarsalam7586
    @omarsalam7586 2 years ago

  • @gururajraykar9254
    @gururajraykar9254 3 years ago

    How will Word2Vec handle the test data set?

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago

      We need to create word vectors for test and train in a similar way.
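
      (A small sketch of one common reading of this, assuming gensim: the model is trained on the training corpus only, and test words are looked up in the learned vocabulary, skipping any word the model has never seen.)

          # Train on training text only; reuse the same model for test text
          from gensim.models import Word2Vec

          train_sentences = [["apple", "is", "a", "fruit"], ["mango", "is", "sweet"]]
          model = Word2Vec(train_sentences, vector_size=10, window=2, min_count=1)

          test_sentence = ["apple", "is", "tasty"]
          # Out-of-vocabulary test words ("tasty" here) are simply skipped
          test_vectors = [model.wv[w] for w in test_sentence if w in model.wv]
          print(len(test_vectors))  # 2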

  • @anagham2413
    @anagham2413 4 years ago +1

    How do you increase the dimensionality of the feature vector?

    • @vijethrai2747
      @vijethrai2747 3 years ago +1

      By increasing the output size beyond 300 units you can get vectors with higher dimensions, which may increase model quality.

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago

      Thanks, Vijeth.
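
      (In gensim 4.x terms, an assumed library since the thread does not name one, the dimensionality is just a constructor argument:)

          # The embedding dimensionality is fixed when the model is created
          from gensim.models import Word2Vec

          sentences = [["apple", "is", "a", "fruit"]] * 10
          model = Word2Vec(sentences, vector_size=500, min_count=1)  # 500-dim
          print(model.wv["apple"].shape)  # (500,)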

  • @prashanthkolaneru3178
    @prashanthkolaneru3178 4 years ago

    Great stuff. Where is this mostly used?

    • @UnfoldDataScience
      @UnfoldDataScience  4 years ago +1

      Thanks, Prashanth. In many NLP applications where we need to capture the semantic meaning of a sentence.

  • @sidhantgi899
    @sidhantgi899 4 years ago +1

    Sir, which subject did you do your graduation in?

  • @paragpujari3961
    @paragpujari3961 2 years ago

    Please do not put ads in between; it creates distraction. Otherwise the lecture is good.

  • @smartrider.strider
    @smartrider.strider 4 years ago

    Great stuff!

  • @preranatiwary7690
    @preranatiwary7690 4 years ago

    Good one again!

  • @syedharis3235
    @syedharis3235 3 years ago

    Great job bro!

  • @maYYidtS
    @maYYidtS 3 years ago

    Bro, how do we decide the vector space size, e.g. (5000 words × 300 dimensions)? Here 5000 is our vocab size, but how do we decide on the length 300?

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago +1

      Normally we take 300 dimensions; however, if the vocab size is huge, processing can be difficult with a larger number of dimensions, hence we try limiting it to 200 or so.
      If you do not have infrastructure constraints, you can go for 500 or even more.

    • @maYYidtS
      @maYYidtS 3 years ago

      @@UnfoldDataScience Thanks, bro.

    • @maYYidtS
      @maYYidtS 3 years ago

      @@UnfoldDataScience So it is considered a hyperparameter? Can we change this in tuning? And please make a video on Optuna and Bayesian hyperparameter tuning.
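
      (Yes, in that sense the dimensionality behaves like a hyperparameter. A rough tuning sketch with a placeholder metric and a made-up corpus; real tuning would score a downstream task instead:)

          # Treat vector_size as a hyperparameter: train once per candidate
          from gensim.models import Word2Vec

          sentences = [["apple", "is", "a", "fruit"], ["mango", "is", "sweet"]] * 20

          def evaluate(model):
              # Placeholder metric; in practice, score a downstream task
              return model.wv.similarity("apple", "mango")

          for size in [50, 100, 300]:
              model = Word2Vec(sentences, vector_size=size, min_count=1, epochs=20)
              print(size, evaluate(model))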

  • @vihangafernando6644
    @vihangafernando6644 3 years ago

    Are you using CBOW or skip-gram?
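
    (For reference: in gensim's Word2Vec, an assumption since the video does not show library code, the choice between the two is a single flag.)

        # sg=0 trains CBOW (gensim's default); sg=1 trains skip-gram
        from gensim.models import Word2Vec

        sentences = [["apple", "is", "a", "fruit"]] * 10
        cbow = Word2Vec(sentences, sg=0, vector_size=50, min_count=1)
        skipgram = Word2Vec(sentences, sg=1, vector_size=50, min_count=1)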

  • @trexmidnite
    @trexmidnite 3 years ago +1

    is nothing BUTT!