GloVe: Global Vector - Introduction | Glove Word - Word Co-Occurrence Matrix | NLP | #18

  • Published: 1 Oct 2024
  • "Fundamental of GloVe", "What is GloVe", "What is GloVe word embedding", "What is GloVe in NLP" — all of these are explained like never before.
    Next Video - • GloVe: Global Vector -...
    Previous Video- • NLP | Implementing Wor...
    You may also like to watch -
    NLP Playlist Complete - • NLP - Natural Language...
    Sentiment Analysis - • Sentiment Analysis Pro...
    Regular Expressions - • Regular Expressions in...
    Pandas all in one - • Python Pandas Complete...
    Pandas Full Playlist - • Python Pandas Tutorial...
    NumPy Full Playlist - • NumPy
    Matplotlib Full Playlist- • Python Matplotlib Tuto...
    Seaborn Full Playlist - • Seaborn Beginner to Pr...
    Download the Google News data -
    www.kaggle.com...
    tags -
    GloVe,
    TF-IDF,
    Automating the model,
    Naive Bayes,
    Bag of Words,
    Stop Words,
    Stemming,
    Lemmatization,
    Stemming and Lemmatization,
    Tokenization,
    NLP,
    Natural Language Processing,
    Read PDF in Python,
    #DataScience #NLP #NaturalLanguageProcessing #PythonProgramming #Python #learnerea

Comments • 23

  • @dan12340987
    @dan12340987 7 months ago +5

    For anyone who is watching it: this is NOT GloVe

  • @aalekhshrivastava7945
    @aalekhshrivastava7945 9 months ago +5

    This is not GloVe. It's a simple co-occurrence matrix
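
A note on the distinction these commenters draw: a co-occurrence matrix just counts how often each pair of words appears within a context window, and GloVe then trains vectors on those counts — so the matrix is GloVe's *input*, not the embedding itself. A minimal sketch of the counting step (the function name and toy sentence are illustrative, not from the video):

```python
from collections import defaultdict

def cooccurrence_matrix(tokens, window=2):
    """Count how often each word occurs within `window` positions of another word."""
    counts = defaultdict(lambda: defaultdict(int))
    for i, word in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:                       # skip the word itself
                counts[word][tokens[j]] += 1
    return {w: dict(ctx) for w, ctx in counts.items()}

tokens = "the cat sat on the mat".split()
m = cooccurrence_matrix(tokens, window=1)
# m["cat"] == {"the": 1, "sat": 1}
```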

  • @MD_FAISAL_
    @MD_FAISAL_ 8 months ago +1

    this video is nothing but a combination of nothing

    • @learnerea
      @learnerea  7 months ago

      Thanks for the feedback, we can discuss to help further.
      Please feel free to share your thoughts in detail

    • @thisisnonpractice
      @thisisnonpractice 2 months ago +1

      @@learnerea Looks like this guy was angry because you mentioned the Ramayan at the end 😀

  • @aterribleyoutuber9039
    @aterribleyoutuber9039 1 year ago +1

    This is just a co-occurrence matrix, not GloVe embeddings. The misleading title wasted my time

    • @learnerea
      @learnerea  1 year ago

      Thanks for sharing the thought

  • @Linda___7y6r
    @Linda___7y6r 2 months ago

    Hello, I have some marvelous news that will make you happy!

  • @suyogkhadke4755
    @suyogkhadke4755 11 months ago +1

    wow! Thank you for teaching it in such a simple manner.

    • @learnerea
      @learnerea  11 months ago

      Glad it was helpful!

  • @shubha07m
    @shubha07m 4 months ago +1

    Great job, clearly explained!

    • @learnerea
      @learnerea  1 month ago

      Glad it was helpful!

  • @diczst
    @diczst 1 year ago +1

    this is gold, thank you so much sir

    • @learnerea
      @learnerea  11 months ago

      Glad it helped!

  • @modemnaveen6240
    @modemnaveen6240 1 year ago

    With the GloVe method we are calculating the probabilities of a word given another word, right? But how exactly can we use those probabilities as embeddings?

    • @learnerea
      @learnerea  1 year ago

      Well, that can be used, but it depends on the scenario/situation
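
To expand on this exchange: GloVe does not use the probabilities or counts directly as embeddings. It fits word and context vectors so that their dot product (plus biases) approximates the log of the co-occurrence count, minimizing a weighted least-squares loss over all nonzero entries of the matrix. A rough gradient-descent sketch under that objective, assuming a small dense count matrix `X` (all names and hyperparameters here are illustrative):

```python
import numpy as np

def glove_train(X, dim=4, epochs=100, lr=0.05, x_max=100.0, alpha=0.75, seed=0):
    """Fit vectors so that W[i] @ Wc[j] + b[i] + bc[j] ~ log X[i, j] for X[i, j] > 0."""
    rng = np.random.default_rng(seed)
    V = X.shape[0]
    W = rng.normal(scale=0.1, size=(V, dim))   # word vectors
    Wc = rng.normal(scale=0.1, size=(V, dim))  # context vectors
    b = np.zeros(V)                            # word biases
    bc = np.zeros(V)                           # context biases
    for _ in range(epochs):
        for i, j in zip(*np.nonzero(X)):
            weight = min(1.0, (X[i, j] / x_max) ** alpha)  # GloVe's weighting f(X_ij)
            err = W[i] @ Wc[j] + b[i] + bc[j] - np.log(X[i, j])
            g = weight * err
            gi, gj = g * Wc[j], g * W[i]       # gradients before updating either side
            W[i] -= lr * gi
            Wc[j] -= lr * gj
            b[i] -= lr * g
            bc[j] -= lr * g
    return W + Wc  # common practice: sum word and context vectors as final embeddings

# Toy symmetric co-occurrence counts for a 3-word vocabulary
X = np.array([[0.0, 4.0, 1.0],
              [4.0, 0.0, 2.0],
              [1.0, 2.0, 0.0]])
emb = glove_train(X)
```

The trained rows of `emb` — not the raw probabilities — are what you would use as word embeddings downstream.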

  • @yemam4156
    @yemam4156 1 year ago +1

    Thanks for the explanation

  • @mrezuanul
    @mrezuanul 9 months ago

    Now I understand. Thanks

    • @learnerea
      @learnerea  8 months ago

      Glad it was helpful

  • @OM_BANNA_1312
    @OM_BANNA_1312 10 months ago +1

    આભાર ભઈલા

    • @learnerea
      @learnerea  10 months ago

      Not sure if I understood this

    • @toxoreed4313
      @toxoreed4313 8 months ago

      @@learnerea "Thanks, brother" is what it says