Understanding GloVe method for generating word embeddings

  • Published: 30 Sep 2024
  • #nlp #machinelearning #datascience #artificialintelligence
    It's the W matrix that gives us the word embeddings in the last equation.
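The description's point is that the embeddings come out of the W matrix once the GloVe objective has been minimized. A minimal sketch of that objective, J = Σ f(X_ik)(w_i·w̃_k + b_i + b̃_k − log X_ik)², in plain Python (the toy matrix and vector values below are illustrative assumptions, not the video's code):

```python
import math

def glove_loss(W, W_ctx, b, b_ctx, X, x_max=100.0, alpha=0.75):
    """GloVe weighted least-squares loss over nonzero co-occurrence counts.

    W, W_ctx : lists of word / context vectors (the rows of W become the embeddings)
    b, b_ctx : word / context bias lists
    X        : co-occurrence count matrix as a list of lists
    """
    total = 0.0
    for i, row in enumerate(X):
        for k, x in enumerate(row):
            if x == 0:
                continue                               # only nonzero counts contribute
            weight = min((x / x_max) ** alpha, 1.0)    # weighting function f(X_ik)
            dot = sum(wi * wk for wi, wk in zip(W[i], W_ctx[k]))
            diff = dot + b[i] + b_ctx[k] - math.log(x)
            total += weight * diff ** 2
    return total

# Hypothetical 3-word vocabulary with 2-d vectors; after minimizing this
# loss (the GloVe paper uses AdaGrad), the rows of W are the embeddings.
X = [[0, 2, 1], [2, 0, 3], [1, 3, 0]]
W     = [[0.1, 0.2], [0.0, -0.1], [0.3, 0.1]]
W_ctx = [[0.2, 0.0], [0.1, 0.1], [-0.2, 0.3]]
b, b_ctx = [0.0] * 3, [0.0] * 3
print(glove_loss(W, W_ctx, b, b_ctx, X))
```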

Comments • 6

  • @kindaeasy9797
    @kindaeasy9797 1 month ago

    1:37 Your matrix is wrong. Assuming the window size is 1, for 'a', 'good' should have the value 2 and 'is' should also have the value 2; you count both the left and right neighbors of the word whose row you are filling in. Come on, man!!

  • @kindaeasy9797
    @kindaeasy9797 1 month ago

    How did we get Wi and Wk, the word vector representations of words i and k?

    • @kindaeasy9797
      @kindaeasy9797 1 month ago

      Ooh, we get them after optimizing that equation, right?

  • @micahbragg3356
    @micahbragg3356 8 months ago +1

    Very helpful! Thank you!!

  • @sasidharreddykatikam328
    @sasidharreddykatikam328 1 year ago +1

    Super explanation, dude. I love the explanation of this topic.
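The first comment's correction is about symmetric windows: with window size 1, both the left and right neighbor of each word contribute to its row, so the co-occurrence matrix comes out symmetric. A minimal sketch (the example sentence is an assumption, not taken from the video):

```python
from collections import defaultdict

def cooccurrence(tokens, window=1):
    """Symmetric co-occurrence counts: every neighbor within `window`
    positions to the LEFT and RIGHT of each word increments its count."""
    counts = defaultdict(float)
    for i, word in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                counts[(word, tokens[j])] += 1.0
    return counts

# Hypothetical sentence repeated twice, so 'a' sits next to both
# 'is' (left neighbor) and 'good' (right neighbor) two times each.
toks = "this is a good movie this is a good movie".split()
X = cooccurrence(toks, window=1)
print(X[("a", "good")], X[("a", "is")])  # → 2.0 2.0
```

Note the symmetry: `X[("good", "a")]` equals `X[("a", "good")]`, which is why filling the table from only one side, as the comment points out, gives the wrong counts.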