For more details and code on building a translator using a transformer neural network, check out my playlist "Transformers from scratch": ruclips.net/video/QCJQG4DuHT0/видео.html
what a hugely underrated video. You did a much better job at explaining this on multiple abstraction layers in such a short video than most videos I could find on the topic, which were more than twice as long.
Thanks a ton Jeffrey! Means a lot. I've come to realize (fairly recently) that only speaking in jargon isn't going to help. Peeling it down from highly abstract to more technical goes a long way for viewers and myself. I understand more when I break the jargon down. I'll be using this more in future videos
@@CodeEmporium Of course everyone has a different approach to understanding a topic. I recently had to get into a few topics quite quickly, and for me the best way of getting there fast was to start out with very general videos to get a sort of feel for the general ideas and how everything works together on a high level. Then I would watch some more detailed videos or switch to reading more detailed articles, until finally reading the actual papers and looking at the formulas and all that stuff.
Having an understanding of the bigger picture helped me comprehend the details better.
I also think no explanation can ever be "too simple", because when explanations try to save time by glossing over parts or taking things for granted, you spend way more time rewinding and trying to wrap your head around some small detail just because you might be missing some needed knowledge.
I think an explanation is like spices on food: better to keep it simple and easy. Individuals can always skip parts for themselves. Same with the spices: better not to add too much thinking everyone will be able to take it; rather add a little, and if it's not enough for someone they can add more themselves.
So true man !! i scraped the net to find a simple explanation !! you are a genius :)
True, I did not understand the use of attention until watching this video
@@CodeEmporium Great approach that you've taken. A high level understanding followed by deeper understanding of the topic pretty much clears up the concept. Subscribed.
Great video! Watched it a few times already so these timestamps will help me out:
0:00 Problems with RNNs and LSTMs
3:34 First pass overview over transformer architecture
8:10 Second pass with more detail
10:34 Third pass focusing on attention networks and normalization
11:57 Other resources (code & blog posts)
Thanks for this! It'll help others watching too.
@@CodeEmporium Pin this comment.
Thank you so much! I planned to watch this a few times for reference as I delve into transformer code. This will be very useful.
2:03 I died at the Vsauce reference. Well played.
Me too, I had to scroll back around 30s because he just continued explaining and I completely lost focus :D
:v
Wow.
I've seen lectures that are 45m+ long trying to understand this architecture, even lectures from the original authors.
Your video was hands-down the best, really helped me piece some key missing intuition pieces together.
You have a gift for teaching and explaining -- I wholeheartedly hope you're able to leverage that in your professional career!
Incredibly well explained and concise. I can't believe you pulled off such a complete explanation in just 13 minutes!
Thank you for the kind words. Super glad you liked it :)
I bought a Udemy course on Transformers and BERT, but it was no help and wasted my time, money, and energy. This video and your BERT video made my day. Thanks. Now I may explain it well in my interview. :)
By far, the MOST comprehensible explanation of the Transformer available in the whole internet space.
You deserve 1M subscribers at least.
Thank you for the kind words! Maybe one day
This is by far the best explanation of the Transformers architecture that I have ever seen! Thanks a lot!
I love the multi-pass way of explanation so that the viewer can process high level concepts and then build upon that knowledge, great job.
The only thing that surprised me more than this high quality explanation is how you pronounce "mahtrix"
We live in the mahtrix...
.
.
Sorry I'll leave.
.
.
Psych. I'm staying since it's my channel. Haha.
.
.
Please don't leave (Thanks for watching and reading this pointless rant)
I saw Yanik's explanation and now I've seen yours. Yanik does a terrible job at explaining papers; he usually just jokes around. Your explanation is probably one of the best I have seen so far. Thanks man.
The multi-pass approach to progressively explaining the internals worked well. Thanks for your content!
The understanding converges!
@2:02 that vsauce thing was
cool
Ikr 😅
...or was it?
Love that so many people got the vsauce ref
I came here to say that. Excelsior
Great video!! I am taking a course at my university and one of the lectures was about RNNs and transformers. Your 13-minute video explains it way better than the 100-minute lecture I attended. Thank you!
I had to make 2 passes of your video to fully understand and appreciate the underlying mathematics and working of the model. You have put a great effort in making it simpler to understand with illustration and animation.
This was a phenomenal video. You managed to explain transformers in 13 minutes better than my professor could in three hours.
Thank you and keep on creating content!
Bro
TBH, no words to appreciate such a well-structured video in such a short time, and the explanation was easily understandable even for people with less knowledge.
Thanks for the video man.
2:04 or are we... not gonna lie, this is the best channel and best explanation ever...
With videos like this one you should be having 100,000+ subscribers soon. Adding a bit of humor to uncompromising technical content is a very good way to go.
This is awesome!!! Thank you for breaking it down concisely, understandably, and deeply! It’s hard to find explanations that aren’t so simplistic they’re useless, or so involved they don’t save time in achieving understanding. Thank you!!
My pleasure! If you are into building the transformer piece by piece from scratch, I suggest checking out the “Transformers from scratch” playlist.
One of the cleanest explanation for transformers without dabbling too much into the theory!! Thanks man
I watched this video four times. After each time, I feel I understand this topic better than the previous one.
what a guide for the transformer in just 13 min.
thanks a lot for this simplicity.
Thank you very much for your effort! You just adjusted my attention layer; the pieces start to fall into place and I have a much better understanding of why TNNs are so revolutionary.
Went through several videos on 'Attention is all you need' paper before this, all the details you managed to cover in thirteen minutes is amazing. Could not find explanation that is so easy to understand anywhere else. Great job!
right? I couldn't believe this video is only 13 minutes! That's a very good talent to have.
The easiest understanding video about this topic, as I can see! Thank you
You are very welcome :)
Best Channel for ai and ml on RUclips
Wonderful video, this clears up so much for me.
Glad! I am currently building a transformer from scratch for additional context in my playlist “Transformers from scratch”
This is heavily underrated!! Such an awesome video! Thanks Man!
I paused every 10 seconds and took notes, such an excellent video!
One of the best or probably the best explanation I've seen. Thank you very much for the effort.
I don't know how many attention videos I have watched. But I swear I'm getting there. Repetition is the key to understanding this stuff
How do you pass the output tokens when they are something we want to predict? I think you pass whatever has been generated so far as the output, and since nothing has been generated at the start, you pass the start-of-sequence token.
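That intuition matches how autoregressive decoding is usually described: start from a start-of-sequence token and feed each prediction back in. A minimal sketch of the loop (the `model.predict_next` API here is a hypothetical stand-in, not something from the video):

```python
# Hypothetical greedy-decoding loop for a transformer decoder.
# `model.predict_next` is an assumed API that returns the id of the
# most likely next token given the source and the tokens so far.

def greedy_decode(model, src_tokens, sos_id=1, eos_id=2, max_len=50):
    """Begin with only the <SOS> token; append each predicted token
    and feed the growing sequence back in as the decoder input."""
    out = [sos_id]
    for _ in range(max_len):
        next_id = model.predict_next(src_tokens, out)
        out.append(next_id)
        if next_id == eos_id:  # stop once end-of-sequence is produced
            break
    return out
```

During training, by contrast, the whole target sequence is fed in at once with a causal mask (teacher forcing), so this loop only applies at inference time.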
TOP OF TOP clear explanation you provided !!!
This might be the best explanation for someone who has some experience with ANN and CNN but want to understand Transformers. Thanks!
Thanks for the kind words!
Really great video. As someone transitioning from pure math into machine learning and AI, I find the language barrier to be the biggest hurdle and you broke down these concepts in a really clear way. I love the multiple layer approach you took to this video, I think it worked really well to first give a big picture overview of the architecture before delving deeper.
Super happy this helped!
How is it possible such a beautiful video doesn't have more views/likes? Thank you CodeEmporium :)
watching your videos for a last min revision. You're awesome
Awesome! And Thank you! I do not deserve such compliments
I am halfway through this video. I have not finished it. But I felt like giving a like before watching the next half. Thank you so much.
Thank you for making this! As a curious outsider I have been anxious about falling behind in recent years and this was perfect to bring me up to speed - at least enough to follow the conversation.
You are very welcome! Thanks for watching! :)
Had to watch it like 2-3 times to get the entire thing, but worth it. Somehow so concise, yet every relevant detail is included.
Thanks a lot for watching and commenting! Super happy this video gave you value
LOVE the multipass strategy for explaining the architecture. I don't think I've seen this approach used with ML, and it's a shame as this is an incredibly useful strategy for people like me trying to play catch up. I hopped on the ML train a little late, but stuff like this makes me feel not nearly as lost.
Your way of breaking things down step by step is very effective! Congrats and thanks. School systems should use it more
A very good explanation about transformers, Thank you :)
Didn't know what the transformer hype was about until I landed on this video. Thanks a lot! Subscribed. Gotta check out more content on this channel now
At 8:50, where did those ‘8’ attention vectors come from?
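For anyone else wondering: in the original paper those 8 vectors come from multi-head attention, which splits the model dimension (d_model = 512) into h = 8 heads of 64 dimensions each. A rough numpy sketch of the shapes involved (the random weights here are placeholders, not trained values):

```python
import numpy as np

# Where the 8 attention vectors come from: one 512-dim projection per
# token, viewed as 8 independent heads of 64 dims each.
d_model, h = 512, 8
d_k = d_model // h                            # 64 dims per head
seq_len = 4                                   # toy sentence of 4 tokens
rng = np.random.default_rng(0)

x = rng.normal(size=(seq_len, d_model))       # token embeddings
W_q = rng.normal(size=(d_model, d_model))     # query projection (placeholder)
Q = x @ W_q                                   # (4, 512)
heads = Q.reshape(seq_len, h, d_k)            # reshaped into 8 heads
print(heads.shape)                            # (4, 8, 64)
```

Each head then runs scaled dot-product attention independently, and the 8 outputs are concatenated back to 512 dims.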
Great video bro! You're underrated af. One of the best if not the best explanation of some neural network architecture. Keep up the good work. Kudos!
Thanks so much! I am making more related videos. So do check ‘em out :)
Good job CodeEmporium! Very well made overview. thanks.
BEAUTIFUL......deep, concise, pithy, each word is meaningful.....well done.
Finally a basic explanation I can understand. I tried reading the original "Attention is all you need" paper, but it felt like it was assuming I was already familiar with the basics of NLP, like the encoder-decoder setup. Which I wasn't.
This is by far the most comprehensive yet short video I have seen
Best explanation ever. I have an interview with Microsoft tomorrow. This was the best brushing up I could get.
KEEP IT UP! Please! This is outstanding :) love the diagrams, dry crisp active speaking, & overview technique.
Great video! I love how you go through multiple passes, each getting into deeper specifics!
Oh man I’m going through this now! Didn’t even realise you had a vid on it, this is brilliant. Love that you did a bird’s eye pass then dove in.
You're saying you liked my content without watching this video? You must be a true fan of mine mwahahaa..also thanks for the kind words :)
@@CodeEmporium I mean, tbf, I clicked like before watching it because it's you, then I was like dayyyyuuumm this is great.
Best explanation ! "Multi pass" way of explaining is genius.
Wow. Amazing video. Better than anything I've watched on the topic, all in thirteen minutes.
That was the best way of explaining things, in my opinion. Start with the big picture, getting more detailed over time.
I wonder why we don't have so many foundational LLMs if there are so many people like this guy who know ML so well? Is it purely because of financial reasons (huge cost of training these models)?
dude you absolutely deserve each and every subscriber. thank you very much for your highly helpful and quality content.
Thanks so much for the lovely comment! And also for subscribing! More to come!
I watched many videos related to this topic, but this video taught me in the easiest way...
Thanks for watching! And for the compliments :) I try my best for my more recent videos too
9:36 masked attention block
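For context, the masked attention block at that timestamp hides future positions so the decoder can't peek ahead during training. A small numpy sketch of the causal mask (the zero scores here are dummies standing in for real attention scores):

```python
import numpy as np

# Causal mask for masked attention: position i may only attend to
# positions <= i. Future entries get a large negative value so that
# softmax sends their weights to ~0.
seq_len = 4
scores = np.zeros((seq_len, seq_len))             # dummy attention scores
mask = np.triu(np.ones((seq_len, seq_len)), k=1)  # 1s strictly above diagonal
masked = scores + mask * -1e9                     # "-inf" on future tokens

# Row-wise softmax over the masked scores
weights = np.exp(masked) / np.exp(masked).sum(-1, keepdims=True)
print(np.round(weights, 2))
```

Row 0 attends only to token 0, row 1 spreads weight over tokens 0-1, and so on; each row still sums to 1.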
Thanks, man. This is a really clear and high-level explanation. Really helpful for people like me who have just stepped into this area.
I read many explanations online. They give tons of details but fail to explain these abstract items. They always use other abstract definitions to explain the one at hand, and the same problem happens again in the explanation of the "other abstract item". Sometimes I just forget what I originally wanted to understand. Or even worse, the definitions form a circle...
Thank you so much! This video helped me a lot in understanding the paper
best video on Transformers on yt imo
prolly the best explanation on this topic on youtube
Just want to say this video is amazing. Watched like three other 30+ mins videos but they all failed to train my stupid brain. This 13 minutes video is intuitive, detailed, and beginner-friendly. Thank you :3
THE BEST one yet. Take a bow. You saved me.
Thank You man.
The underrated AI channel.
Thanks homie. Spread the word if you can! :)
Watched Andrew Ng, watched this: you got me to stick through the video, and Andrew, who I consider one of the best in this field, did not manage to express it as clearly as you did! Cheers man, amazing video!
It always blows my mind how anyone figures out any of this.
I agree 😂
I am now more confident to talk about transformers, it had been abstract to me before. Thank you so much, great explanation.
Glad and you're welcome! :)
Really great video. I have taken so many classes that have discussed this model and never understood it until now. Thanks!
Perfect! Glad this helped!
This earned a subscription! Excellent explanation!
Outstanding. Best explanation of transformers that I’ve seen by far.
Thanks a ton for watching :)
Simple and easy to understand. Great job!
Really nice explanation, although even after the video most of Transformers still seems like a big magic black box. Fascinating stuff
Wow, the best explanation on youtube! Had to subscribe after watching!
This is one of the best explanations for transformers I've come across online! Awesome job, man! Thanks. I'll totally recommend your channel to some classmates!! :)
Thanks! And Glad it's helpful! Spreading the word of my channel is the best thing you can do to help :)
great job putting attention in simple words. Very intuitive!
Thanks so much!
Wow, I am so glad I found your channel. The concept is clearly explained and assumes an intelligent audience. Well done!
I cant believe you explained so complicated things in crystal clear format. Excellent job dude
This is an amazingly good video - thanks so much!
Brilliant piece of work! You nailed it in few mins!
Thanks for watching!
Great video! I love the layered approach for explaining the concepts. Very well done. Thank you!
I am super glad that approach helped
Even though this definitely assumes some intermediate working level knowledge of ML, best layman explanation on Transformers/Attention so far.
Best channel on the topic! Soo glad that I found this channel
Thanks a ton :)
It's been a year and I still come back to this video to revise my transformer knowledge :)
Thanks man!
Thanks for coming back :)
Outstanding explanations: to the point and well illustrated. Thank you.
This channel is so under-rated. Amazing video
I was searching for this particular explanation for a long time! thanks for this!
this video was one of the best learning videos i EVER SAW
first you give a high-level overview, then you step in deeper
every step with an understandable example
THANK YOU SO MUCH!!!
You are very welcome. Thank you for the compliments :)
that was an amazing content heavy explanation for just 13 minutes. thanks a lot!
You are super welcome!
Best explanation so far! Keep up the good work!
Superb explanation in 13 minutes. I have been watching videos over 1 hour long to get this concept. Well done and keep it up. Regards
The best video I have seen on this topic. Great job
Glad it was helpful! And Thanks for watching :)
This is the best explanation! I came in search for transformers but I found Gold.
Very well explained. Keep it up
Hey, thanks a lot, the explanation is great, the video explains much clearer than the lecture in the university.
Thanks for the high praise
Best video on Transformers!!! Thanks a mill!!!!!