Word2Vec, GloVe, FastText- EXPLAINED!

  • Published: 25 Dec 2024

Comments • 31

  • @uk_with_jatin3512
    @uk_with_jatin3512 a year ago +2

    Just love your videos, man! Thanks for the help! I was scared of learning attention mechanisms at first, but now everything is clear. It helps a lot while working on projects when you know what is going on behind the scenes.

  • @RodrigoChimelliSilva
    @RodrigoChimelliSilva 21 days ago +1

    This YouTube channel slays!

  • @rohitchamp
    @rohitchamp a year ago +1

    Your explanation is super, brother, one of the best.

  • @Techie-time
    @Techie-time 4 months ago +4

    This video is so good, if you already know 50% of the subject.

  • @RivenOmg
    @RivenOmg a year ago +1

    That’s such a great explanation, thank you!

  • @roastmaker1233
    @roastmaker1233 a year ago +2

    ❤ from Bengaluru

    • @CodeEmporium
      @CodeEmporium  a year ago

      Hearts from me too! Thanks for commenting fellow Kannadiga :)

  • @joudjoud1947
    @joudjoud1947 a year ago

    Thank you for all the effort you put into making these subjects easy and accessible. As a newbie, could you please tell me where I should start?

  • @alexanderbrandt6781
    @alexanderbrandt6781 5 months ago

    Hi, I love your videos! Just a minor thing: you mention a vocabulary of size 100 and a dense representation with 256 dimensions. What you meant is something like a vocabulary of 100k entries, right? On the other hand, the algorithm would work with 100 entries, too, I guess.
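
For what it's worth, the lookup mechanics are the same at either scale; here is a minimal NumPy sketch, assuming the hypothetical 100k-entry vocabulary and 256-dimensional dense space discussed above (the sizes are illustrative, not from the video's code):

```python
import numpy as np

# Hypothetical sizes from the comment above: a 100k-entry
# vocabulary embedded into a 256-dimensional dense space.
vocab_size = 100_000
embedding_dim = 256

# The embedding table: one 256-dim row per vocabulary entry.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(vocab_size, embedding_dim))

# Looking up a word's dense vector is just row indexing -- the
# mechanics are identical whether vocab_size is 100 or 100_000.
word_index = 42
vector = embeddings[word_index]
print(vector.shape)  # (256,)
```

The vocabulary size only changes the number of rows; the algorithm itself is indifferent to it, which matches the commenter's guess.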

  • @samson6707
    @samson6707 8 months ago

    Quality video. Thanks 👍

  • @AnakarParida
    @AnakarParida a year ago

    I am a newbie to ML, so forgive me if I am wrong. I was going through your video and something didn't make sense to me. At 2:09 you said it is a 100 x 1 vector, but later at 2:38 you said it's a 1 x 100 vector. I think the latter is correct, right @CodeEmporium?
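
For what it's worth, the two shapes are just transposes of each other, so both conventions are valid; a quick sketch, assuming the toy 100-word vocabulary from the video (the embedding values here are made up):

```python
import numpy as np

vocab_size = 100      # toy vocabulary size from the video
embedding_dim = 256

# A one-hot vector can be written as 100 x 1 (column) or 1 x 100 (row);
# they are transposes of each other, so either convention works -- what
# changes is which side of the embedding matrix you multiply on.
one_hot_row = np.zeros((1, vocab_size))
one_hot_row[0, 7] = 1.0  # word number 7 is "on"

# Toy embedding matrix with deterministic values, for illustration.
embeddings = np.arange(vocab_size * embedding_dim, dtype=float)
embeddings = embeddings.reshape(vocab_size, embedding_dim)

# Row convention: (1 x 100) @ (100 x 256) -> (1 x 256),
# which selects row 7 of the embedding matrix.
dense = one_hot_row @ embeddings
print(dense.shape)  # (1, 256)
```

With the column convention you would instead compute `embeddings.T @ one_hot_row.T` and get a 256 x 1 result; the numbers are identical.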

  • @krzysztofjarek6476
    @krzysztofjarek6476 a year ago

    Thank you for this video :)

    • @CodeEmporium
      @CodeEmporium  a year ago

      My pleasure :)

    • @krzysztofjarek6476
      @krzysztofjarek6476 a year ago

      @@CodeEmporium Your new, more historical series is a great contribution to yt :D

  • @Patapom3
    @Patapom3 a year ago

    Great video!

  • @s8x.
    @s8x. 7 months ago

    What makes these different from tokenizers?
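
One way to see the distinction, as a minimal sketch (the three-word vocabulary and toy table below are invented for illustration): a tokenizer maps text to integer ids, while an embedding such as Word2Vec maps those ids to dense vectors that carry meaning.

```python
import numpy as np

# Hypothetical toy vocabulary, for illustration only.
vocab = {"the": 0, "cat": 1, "sat": 2}

def tokenize(text):
    # Tokenizer: string -> list of integer ids. No meaning yet,
    # just a lookup of symbols.
    return [vocab[word] for word in text.split()]

# Embedding: integer ids -> dense vectors (toy 4-dim table).
# In Word2Vec/GloVe/FastText these values are *learned* so that
# similar words end up with similar vectors.
embedding_table = np.arange(12, dtype=float).reshape(3, 4)

ids = tokenize("the cat sat")
vectors = embedding_table[ids]

print(ids)            # [0, 1, 2]
print(vectors.shape)  # (3, 4)
```

So the two are complementary stages of a pipeline, not competitors: tokenize first, embed second.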

  • @XpLoeRe
    @XpLoeRe a year ago

    my saviour.

  • @BromaniJones
    @BromaniJones a year ago

    When you say “n-gram vector”, do you mean “bag of words vector”? I always thought it was the latter and haven't heard the former.
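
For reference, the two are related but not identical: a bag-of-words vector counts single tokens and discards order, while an n-gram vector counts short token sequences and so preserves some local order. A minimal sketch with a made-up sentence:

```python
from collections import Counter

tokens = "the cat sat on the mat".split()

# Bag-of-words: counts of individual tokens; word order is discarded.
bag_of_words = Counter(tokens)

# Bigrams (n-grams with n = 2): counts of consecutive token pairs,
# so some local word order is kept.
bigrams = Counter(zip(tokens, tokens[1:]))

print(bag_of_words["the"])        # 2
print(bigrams[("the", "cat")])    # 1
```

A bag-of-words vector is just the n-gram case with n = 1, which is likely why the terms get used interchangeably.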

  • @jcl6861
    @jcl6861 5 months ago

    Why did you use 256 dimensions? What's so special about 256?

  • @kinanradaideh5479
    @kinanradaideh5479 a year ago

    Didn't this guy have a Discord? Does anyone know if it's still up, or could anyone send it? I'd love to be a part of this wonderful YouTuber's community.

  • @phlip00
    @phlip00 a year ago

    Thanks, man!

  • @bhagavanprasad
    @bhagavanprasad 3 months ago

    Thank you

  • @ghostrider9084
    @ghostrider9084 8 months ago

    Sir, are you a Kannadiga? Wow! Do you work in the USA, sir?
    Or are you pursuing a degree?

  • @prabhu__why_not
    @prabhu__why_not a year ago +1

    Sir r u from Karnataka

  • @telugumoviesfunnycuts5310
    @telugumoviesfunnycuts5310 6 months ago +2

    Could not get anything from this. Too complex.