Just love your videos, man! Thanks for helping! I was scared of learning attention mechanisms at first, but now everything is clear. It just helps a lot while working on projects when you know what's going on behind the scenes.
This YouTube channel slays!
Your explanation is super, brother, one of the best!
This video is so good, if you already know 50% of the subject.
That’s such a great explanation, thank you!
❤ from Bengaluru
Hearts from me too! Thanks for commenting, fellow Kannadiga :)
Thank you for all the effort you put in to make these subjects easy and accessible. As a newbie, could you please tell me where I should start?
Hi, I love your videos! Just a minor thing: you mention a vocabulary of size 100 and a dense representation with 256 dimensions. What you meant is something like a vocabulary of 100k entries, right? On the other hand, the algorithm would work with 100 entries, too, I guess.
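For anyone following along: with a 100k-entry vocabulary and 256-dimensional dense vectors, the embedding is just a lookup into a big learnable matrix. Here's a minimal PyTorch sketch (sizes and names are my own illustration, not necessarily the video's exact setup):

```python
import torch
import torch.nn as nn

VOCAB_SIZE = 100_000  # assumed: ~100k vocabulary entries, as discussed above
EMBED_DIM = 256       # dense representation size mentioned in the video

# The embedding layer is a learnable VOCAB_SIZE x EMBED_DIM matrix.
embedding = nn.Embedding(num_embeddings=VOCAB_SIZE, embedding_dim=EMBED_DIM)

# Look up the dense vector for one token id (an index into the vocabulary).
token_id = torch.tensor([42])  # hypothetical token index
dense = embedding(token_id)
print(dense.shape)  # torch.Size([1, 256])
```

And yes, the same code runs with VOCAB_SIZE = 100; the dense dimension would just be larger than the vocabulary, which is unusual but not invalid.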
Quality video. Thanks 👍
I am a newbie to ML, so forgive me if I am wrong. I was going through your video and something didn't make sense to me. At 2:09 you said it is a 100 x 1 vector, but later, at 2:38, you mentioned that it's a 1 x 100 vector. I think the latter is correct, right @CodeEmporium?
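To make the shape question concrete, here's a tiny NumPy sketch assuming the row-vector convention, where the one-hot vector multiplies the embedding matrix from the left (my own illustration, not taken from the video):

```python
import numpy as np

VOCAB_SIZE = 100  # toy vocabulary size from the video
EMBED_DIM = 8     # small dense size, just for illustration

# One-hot row vector for token index 3: shape (1, 100).
one_hot = np.zeros((1, VOCAB_SIZE))
one_hot[0, 3] = 1.0

# Embedding matrix: one dense row per vocabulary entry, shape (100, 8).
E = np.random.randn(VOCAB_SIZE, EMBED_DIM)

# (1, 100) @ (100, 8) -> (1, 8): the product selects row 3 of E.
dense = one_hot @ E
assert np.allclose(dense, E[3])
print(dense.shape)  # (1, 8)
```

With the column convention, the same lookup is written E.T @ one_hot.T on a 100 x 1 vector, so the two forms are equivalent; everything is just transposed.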
Thank you for this video :)
My pleasure :)
@CodeEmporium Your new, more historical series is a great contribution to YouTube :D
Great video!
What makes these different from tokenizers?
My saviour.
Anytime haha
When you say "n-gram vector", do you mean "bag-of-words vector"? I always thought it was the latter and haven't heard the former.
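For anyone else wondering about the terminology: a bag-of-words vector is the n = 1 special case of an n-gram count vector, so the two terms coincide for unigrams. A minimal scikit-learn sketch showing the difference (my own example, not from the video):

```python
from sklearn.feature_extraction.text import CountVectorizer

docs = ["the cat sat", "the cat sat on the mat"]

# Bag of words: unigram counts only, word order is discarded.
bow = CountVectorizer(ngram_range=(1, 1))
print(bow.fit_transform(docs).toarray())
print(bow.get_feature_names_out())  # ['cat' 'mat' 'on' 'sat' 'the']

# n-gram counts: unigrams + bigrams, so local word order matters.
ngrams = CountVectorizer(ngram_range=(1, 2))
print(ngrams.fit_transform(docs).toarray())
print(ngrams.get_feature_names_out())  # adds 'cat sat', 'the cat', ...
```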
Why did you use 256 dimensions? What's so special about 256?
Didn't this guy have a Discord? Does anyone know if it's still up, or could anyone send it? I'd love to be a part of this wonderful YouTuber's community.
Thanks, man!
Thank you
Sir, are you a Kannadiga? Wow!! Do you work in the USA, sir?
Or are you pursuing a degree?
Sir, are you from Karnataka?
Yep :)
Could not get anything from this. Too complex.