Attention Is All You Need

  • Published: 24 Dec 2024

Comments • 311

  • @tanmayjain6791
    @tanmayjain6791 9 months ago +160

    Nobody knew this paper would change the world

    • @blitz1867
      @blitz1867 7 months ago

      Did it?

    • @yanniammari1491
      @yanniammari1491 7 months ago +34

      @@blitz1867 hell yeah it did, it's cited 124k times so it definitely did

    • @JohnDoe-pq8yw
      @JohnDoe-pq8yw 7 months ago +4

      @@yanniammari1491 And we're just getting started.

    • @AmanSingh-xk2lv
      @AmanSingh-xk2lv 6 months ago +1

      @@blitz1867 yes, definitely.

    • @MM-by6qq
      @MM-by6qq 5 months ago

      very true

  • @finlayl2505
    @finlayl2505 4 years ago +385

    Friendship ended with LSTM, transformer is now my best friend.

    • @EvgenSuit
      @EvgenSuit 2 years ago +2

      As far as I know there are only a small number of transformers for audio problems

    • @electric_mind
      @electric_mind 2 years ago +6

      LSTMs generally perform better when it comes to short sequences, and remember LSTM is the revolution that led to the birth of the Transformer. I love both of them!

    • @klam77
      @klam77 1 year ago +2

      LSTM sequentialization is kludged inside transformers. Pay attention

    • @jamesbedwell4715
      @jamesbedwell4715 1 year ago

      Same but with GRU

    • @st0a
      @st0a 1 year ago +1

      Friendship ended with Transformers, Retentive Networks are now my best friend.

  • @RobotProctor
    @RobotProctor 3 years ago +297

    I've watched this maybe 5 times over 1 year, each time getting more and more from it. I think I finally intuitively understand how this works. Thanks for your work and your time!

    • @niedas3426
      @niedas3426 2 years ago +25

      This has been my experience with ML in general: I have to re-read papers and books over and over again, and each time I understand more. It's hard, but it pays off to finally grasp such an almost mystical concept.

    • @StoutProper
      @StoutProper 1 year ago +4

      It’s a little bit more complicated than just predicting the next word based on the last, which is the take a lot of people have on it.

    • @electric_sand
      @electric_sand 1 year ago

      @@niedas3426 How's it going...honestly this is how I feel sometimes, having to go through multiple videos and blogposts just to grasp concepts.

    • @niedas3426
      @niedas3426 1 year ago +3

      @@electric_sand Honestly, still making steady progress. I am now at a place where I am much, much further. I've mainly been preoccupied with datasets (e.g. reducing file storage size, faster reading and calculations, pytorch iterdatapipes) and realised it'd help me to go back more to the fundamentals (linear algebra, calculus, probability, pandas, numpy, data structures, builtin methods etc). It's been fun, overall. :)

    • @electric_sand
      @electric_sand 1 year ago

      @@niedas3426 Thanks for your response. Same here, I decided to go back to the fundamentals as well...I simply got tired of struggling through papers. Wish you the best mate.

  • @TimKaseyMythHealer
    @TimKaseyMythHealer 1 year ago +15

    Finally, someone is drawing vectors to describe what is meant by encoding with vectors, and how the vectors relate to one another. So many talk about this, but barely understand the details.

  • @languagemodeler
    @languagemodeler 1 year ago +8

    It's amazing to have this explanation of the paper that is responsible for all of the AI interest and innovation happening now, described as 'interesting' shortly after it came out. I love it.

  • @herp_derpingson
    @herp_derpingson 5 years ago +132

    I was searching for a channel like "Two Minute Papers", but longer than two minutes and more in depth. I think I found it!
    Subbed!

  • @kema8628
    @kema8628 6 years ago +61

    The explanation of querying a key-value pair is really nice

    • @gorgolyt
      @gorgolyt 3 years ago +5

      I recommend looking at the paper, because they use exactly this analogy. I found their description very helpful:
      "An attention function can be described as mapping a query and a set of key-value pairs to an output,
      where the query, keys, values, and output are all vectors. The output is computed as a weighted sum
      of the values, where the weight assigned to each value is computed by a compatibility function of the
      query with the corresponding key."
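      The description quoted above translates almost line-for-line into code. A minimal NumPy sketch of scaled dot-product attention (all dimensions here are made up for illustration):

```python
import numpy as np

def softmax(x, axis=-1):
    # subtract the max for numerical stability before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # compatibility of each query with each key: scaled dot product
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores)   # each row sums to 1
    return weights @ V          # output = weighted sum of the values

# 3 queries against 4 key-value pairs, all vectors of dimension 8
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8): one output vector per query
```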

  • @prashanthkurella4500
    @prashanthkurella4500 1 year ago +9

    Who knew this paper would change how we look at sequences forever

  • @dariodemattiesreyes3788
    @dariodemattiesreyes3788 5 years ago +91

    Really good explanation. You know how to provide the essence without getting lost in details. Details might be important later, but the most important thing at first is the very nature of the strategy, and you provided it crystal clear. Thanks!!!

  • @jugsma6676
    @jugsma6676 7 years ago +317

    By far the best explanation of the paper "Attention Is All You Need". Well explained. Thanks Yannic Kilcher

    • @jugsma6676
      @jugsma6676 4 years ago +5

      @bunch of nerds , BERT is also a transformer, but bidirectional (forward and backward) over the input sentence.
      And GPT is a generative (autoregressive) version of the Transformer.
      Both are language models able to predict and understand the input sentence/s.

    • @jugsma6676
      @jugsma6676 4 years ago

      @bunch of nerds , it's much better to use built-in, stable tools than to build from scratch.


    • @clray123
      @clray123 4 years ago +12

      It's still like listening to a bad student struggling to explain something they don't themselves understand. And shame on the Google researchers for doing such a shit job explaining themselves, but I guess that's typical of the majority of research papers out there - these people just don't care to teach their ideas to others (except maybe a very narrow circle with whom they have already communicated via other channels).

    • @rednas195
      @rednas195 1 year ago +2

      @@clray123 What parts do you think are explained poorly? To me it feels like Yannic understands the paper quite well, but I'm interested in what you think he might not understand all too well.

  • @mdnayemuddin5595
    @mdnayemuddin5595 3 years ago +4

    I just got a clear understanding of how the positional encoder works here. Kudos to you. Great Explanation!

  • @owenmarkley446
    @owenmarkley446 1 year ago +6

    This is by far the best explanation I've seen of this paper. I'm writing a review of this paper for a class and wouldn't have been able to do it without your video! Immensely grateful!

  • @TijsZwinkels
    @TijsZwinkels 1 year ago +1

    Yeah, I'm late to the party, but I'd say that this video is still very relevant. I've read the paper several times and watched multiple blog posts and videos, but especially the Q,K,V mechanism never really clicked until watching this. Using dot products between Q and K as a lookup mechanism. Ingenious! - Thanks for this video!

  • @deleteme924
    @deleteme924 7 years ago +248

    You have the best videos about machine learning I've seen, comparable perhaps only to 3blue1brown, but his videos aren't about such advanced topics. It would be really nice if you could make more!

    • @snippletrap
      @snippletrap 4 years ago +8

      Arxiv Insights and Henry AI are pretty good too

    • @ambujmittal6824
      @ambujmittal6824 4 years ago +8

      @@snippletrap Arxiv sadly stopped posting a long time back and I personally find Henry AI's discussion to be superficial. Try Chris McCormick and the reading groups by Rachael though. :)

    • @peterhojnos6705
      @peterhojnos6705 3 years ago +5

      Who do you mean by “Rachael reading groups”?

    • @48956l
      @48956l 3 years ago +3

      Definitely both great channels but the comparison doesn't do justice to just how good 3b1b's animations are. Here Yannic writes on a tablet lol. Not really comparable.

  • @akhilvenkataraju7791
    @akhilvenkataraju7791 4 years ago +18

    Thank you so much Yannic Kilcher, the paper seemed complex but you "encoded", performed "multi-head attention" and "decoded" it in such a simple way (: An amazing job! Undoubtedly the best explanation

  • @lleger
    @lleger 5 months ago

    i can't believe i just learned the intuition behind softmax, Yannic ur videos are pure gold, i hope life is treating u well!

  • @YtongT
    @YtongT 4 years ago +7

    an amazing explanation. truly amazing. I can't say how much I appreciate you putting dot product and softmax into intuitive and easy to understand words. very grateful

  • @Luxcium
    @Luxcium 9 months ago +2

    This sounds like someone reading a paper without realizing it would be the third biggest thing to happen to humanity, after a pandemic and, from my own perspective, an invasive war in Europe: the spark of AI that came with ChatGPT and the expansion of generative imaging like Stable Diffusion and Midjourney 😅🎉🎉🎉🎉 I would love to know how many subscribers you got from back then to just before ChatGPT, and from ChatGPT up to nowadays 😅😅😅😅 You are such an amazing communicator ❤

  • @rommeltito123
    @rommeltito123 3 years ago +2

    Good that you were interrupted at 17:15. I had to strain my ears and go full volume to hear you. After that it was better.

  • @magnuswiklander8204
    @magnuswiklander8204 2 years ago +2

    Fun to see this today after all the recent successful transformer results! (June 2022) Thanks Yannic, keep it up!!

  • @shandou5276
    @shandou5276 5 years ago +3

    Very well done! I agree with the other comments that this is the clearest explanation I have seen so far. Thanks for the great work!

  • @BrettHannigan
    @BrettHannigan 2 years ago +3

    Excellent explanation of Transformers. Clear, easy to follow, and great information. Thanks!

  • @tassoskat8623
    @tassoskat8623 4 years ago +2

    Great video and very unique amongst most machine learning videos on youtube.
    Thank you!

  • @vijeta268
    @vijeta268 4 years ago +2

    You have done an excellent job in explaining attention method in simple words. Thanks so much!

  • @patpearce8221
    @patpearce8221 1 year ago

    1. So the words are converted into vector embeddings, then positionally encoded using the sine and cosine functions.
    2. This vector is copied, with one copy passed through the multi-head attention layer to be contextualised by projecting each word into a key, query and value, which are learned. There are separate key, query and value vectors for each word.
    3. The query is matrix-multiplied by the key after being passed through a linear layer, divided by the square root of the dimensionality of the key, passed through a softmax function and then matrix-multiplied by the value.
    4. It is then added to the other copy of the positionally encoded vector.
    5. It is then normalised.
    6. It then passes into the FFNN, which has three layers.
    7. In the dense layers it is matrix-multiplied with the weights and then has a bias added, both of which are learnt.
    8. It has the ReLU function applied to each word.
    9. It is added to the residual (the vector before it passed through the FFNN).
    10. It is then normalised again...
    Then voilà... what am I missing here?
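    For what it's worth, the steps listed above can be sketched as a single simplified encoder layer (one attention head, no learned scale/shift in the normalisation, no dropout; all sizes and weights below are made-up placeholders):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def layer_norm(x, eps=1e-5):
    # normalise each token vector to zero mean and unit variance
    return (x - x.mean(axis=-1, keepdims=True)) / np.sqrt(x.var(axis=-1, keepdims=True) + eps)

def encoder_layer(x, Wq, Wk, Wv, W1, b1, W2, b2):
    # steps 2-3: project to Q, K, V and apply scaled dot-product attention
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    attn = softmax(Q @ K.T / np.sqrt(K.shape[-1])) @ V
    # steps 4-5: add the residual copy, then normalise
    x = layer_norm(x + attn)
    # steps 6-8: position-wise feed-forward network with ReLU
    ff = np.maximum(0.0, x @ W1 + b1) @ W2 + b2
    # steps 9-10: second residual add and normalisation
    return layer_norm(x + ff)

d, d_ff, n = 8, 32, 5  # model width, feed-forward width, sequence length
rng = np.random.default_rng(1)
shapes = [(d, d), (d, d), (d, d), (d, d_ff), (d_ff,), (d_ff, d), (d,)]
params = [rng.normal(size=s) * 0.1 for s in shapes]
x = rng.normal(size=(n, d))  # stand-in for positionally encoded embeddings
y = encoder_layer(x, *params)
print(y.shape)  # (5, 8)
```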

  • @fahds2583
    @fahds2583 4 years ago +1

    you have such a cool state of mind ... really adds to making your teaching style more interesting

  • @renehaas7866
    @renehaas7866 4 years ago +1

    I really appreciate that you are making these videos.

  • @Don-gk9ss
    @Don-gk9ss 5 years ago +2

    the best transformer video I have watched. Well explained

  • @deaths1l3nce
    @deaths1l3nce 4 years ago +5

    Thank you very much! This has helped me a lot. All I could find on this specific paper was confusing and hard to understand, I think it was explained extremely well in your video! Please make more of these, I think you might help lots of people :D

  • @spinner4
    @spinner4 1 year ago +2

    Why could there not be such a YouTube explanation from the authors of the paper: it would be very helpful for humanity right now. But this is quite helpful.

    • @kevinelezi7089
      @kevinelezi7089 1 month ago

      because publishing a paper doesn't directly prove the skill of being able to explain it in this way

  • @astrobearmusic1977
    @astrobearmusic1977 3 years ago

    I had to revisit this video several times, but I think transformers finally clicked for me. Thank you!

  • @WaylonFlinn
    @WaylonFlinn 2 years ago

    You need to remake this video. You've gotten so much better at doing this since you made this video and this topic is so foundational.

  • @sophiaxia3240
    @sophiaxia3240 3 years ago

    by far the most intuitive explanation. Thanks!

  • @aidangomez6004
    @aidangomez6004 3 years ago +3

    This is a great summary, thanks for making this!!

  • @revanthvejju3727
    @revanthvejju3727 1 year ago +3

    the paper that changed everything

  • @ostensibly531
    @ostensibly531 1 year ago +1

    Love the relaxing voice. Way better than reading the paper myself. Now I can be on the elliptical and still ingest the gist of papers. Thank you for making this!

  • @RobotProctor
    @RobotProctor 4 years ago +4

    Question: if there is a finite max length of an input/output sequence, why do you need a positional encoding? Wouldn't the network have a static place for the 1st word, 2nd word, ..., nth word in its inputs? I'm struggling to understand the need for the positional encoder.

    • @RobotProctor
      @RobotProctor 4 years ago +5

      nevermind, I think I understand. Since the multi-head attention mechanism uses the same scaled dot product computation for each word in the sequence (and not different parts of a NN for example), the positional encoding is necessary in order to get different answers for the same word at different locations in the sequence.

    • @YannicKilcher
      @YannicKilcher  3 years ago +2

      you got it :)
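      The sinusoidal encoding discussed above can be written down in a few lines (sequence length and model dimension below are arbitrary):

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # PE[pos, 2i]   = sin(pos / 10000**(2i / d_model))
    # PE[pos, 2i+1] = cos(pos / 10000**(2i / d_model))
    pos = np.arange(seq_len)[:, None]
    i = np.arange(0, d_model, 2)[None, :]
    angles = pos / 10000 ** (i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

pe = positional_encoding(50, 16)
# adding pe to the embeddings makes the same word look different
# to the (position-agnostic) attention computation at each position
print(pe.shape)  # (50, 16)
```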

  • @teddy5474
    @teddy5474 2 years ago

    Best explanation I've seen on this topic!

  • @RealStonedApe
    @RealStonedApe 11 months ago +1

    Love how 'All you need is attention' also applies to me in terms of understanding this video. Time to chug down some Adderall and take notes!!
    Also, probably not a good start when I have no idea what a vector even is... Take it in bit by bit.
    Robert Pirsig reading Thoreau style, ya dig?!
    Anyways, plz pray for me. Any God, Joseph Smith, Allah, Bill Murray, etc. And he will do. Pray for my attention, pray for my soul, pray for Bill Murray. Love❤🎉

  • @qidichen1756
    @qidichen1756 4 years ago +1

    One of the best explanations !!!

  • @PaulFidika
    @PaulFidika 1 year ago +7

    I'm watching the history of AGI being built right here

  • @alexandrostsagkaropoulos
    @alexandrostsagkaropoulos 1 year ago

    Just exceptional explanation. You clear things up so much!

  • @anisakhlyan8581
    @anisakhlyan8581 4 years ago +2

    Thank you! This is a very good explanation which I actually used in presenting this paper. Cheers man!

  • @chandlerclement1365
    @chandlerclement1365 6 years ago +2

    Excellent video, thank you so much for illustrating these concepts so clearly.

  • @nchahine
    @nchahine 3 years ago

    I always thought about doing a youtube channel like this, but I guess I don't need to because you are so good at this thanks!

  • @Julian-tf8nj
    @Julian-tf8nj 5 years ago +9

    VERY helpful, thanks! I'd love to see a "part 2" ...

  • @jsphyan
    @jsphyan 1 year ago

    This is beautiful, I really appreciate your work! Thank you

  • @arslanali900
    @arslanali900 9 months ago +1

    You are amazing!

  • @VinBhaskara_
    @VinBhaskara_ 7 years ago +1

    great explanation. please keep posting such summaries of great papers thanks!

  • @bayesianlee6447
    @bayesianlee6447 1 year ago

    This, 6 years after "Attention Is All You Need", is just crazy rapid growth

  • @simons6512
    @simons6512 2 years ago

    Super explanation, one of the best I've found so far.
    I find it cool to see someone from Switzerland engaging on this platform like this.
    Keep it up!

  • @Darthvanger
    @Darthvanger 2 years ago

    21:30 - thanks for the great softmax explanation! I've had the "aha" moment :)
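    For anyone else chasing the same "aha": softmax just exponentiates the scores and normalises them so they sum to 1, which means larger scores get disproportionately more attention weight. A tiny self-contained sketch:

```python
import math

def softmax(scores):
    # exponentiate (shifted by the max for numerical stability),
    # then normalise so the outputs form a probability distribution
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

weights = softmax([2.0, 1.0, 0.1])
print([round(w, 3) for w in weights])  # [0.659, 0.242, 0.099]
print(sum(weights))                    # 1.0 up to floating point
```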

  • @10x_discovery
    @10x_discovery 8 months ago

    Man. I just found your channel. All the best insh'Allah

  • @olegshpynov
    @olegshpynov 5 years ago +1

    Great explanation of the transformer model. Thanks a lot!

  • @MrChristian331
    @MrChristian331 3 years ago +2

    What does "Add and Norm" mean at each step of the network in the architecture?

  • @starlite5097
    @starlite5097 1 year ago

    Thanks, nice video. You've come a long way since then, I'm sure, especially with the open assistant stuff.

  • @julinamaharjan6987
    @julinamaharjan6987 4 years ago +1

    Very intuitive explanation. Thank you!

  • @carlkenner4581
    @carlkenner4581 1 year ago +8

    This will never catch on. (Kidding)

  • @lsqshr
    @lsqshr 7 years ago +9

    Really awesome job! I was puzzling about what the key, value pairs are. Thanks a lot!

  • @fisherh9111
    @fisherh9111 1 year ago

    this is excellent. Thank you so much for sharing!

  • @thearianrobben
    @thearianrobben 4 years ago +3

    So: a representation in one natural language, into the universal language of math, into another natural language.

  • @eaglesofmai
    @eaglesofmai 3 years ago

    It's very well explained; by 9 mins I'd already got the answer about Attention vs LSTMs. I was searching for it on Google for a long time.

  • @prasitamukherjee5864
    @prasitamukherjee5864 3 years ago

    Thank you for the super neat explanation- Cleared a lot of stuff.

  • @ziku8910
    @ziku8910 1 year ago

    Very intuitive explanation here, thank you!

  • @goelnikhils
    @goelnikhils 1 year ago

    Such a clear explanation of attention. I was struggling to understand attention and must have watched over 20 videos on it but got no clarity.

  • @bernhardvoggenberger9850
    @bernhardvoggenberger9850 3 years ago

    As a student your videos are very helpful!

  • @bpolat
    @bpolat 7 months ago +3

    This paper changed the world. The AI revolution started after this paper.

  • @ankitbhardwaj1956
    @ankitbhardwaj1956 5 years ago +1

    Thanks a lot for this explanation video!!

  • @faysoufox
    @faysoufox 5 years ago

    Nice explanation. Note that by the time somebody gets to the part where you explain dot products, he or she would likely already know what a dot product is.

  • @tobiaszb
    @tobiaszb 3 years ago

    Thank You for the overview.
    (Let's take care of human attention, train its span.)

  • @vikaskumarjha9
    @vikaskumarjha9 5 years ago

    Thank you so much. You explain it so well in very simple terms.

  • @hamedgholami261
    @hamedgholami261 2 years ago

    I really understood the subject, thanks for your clear explanation.

  • @michaelmuller136
    @michaelmuller136 5 years ago +3

    Thank you, that was very informative and explained well!

  • @xingyubian5654
    @xingyubian5654 2 years ago

    Always wondered what keys, values, queries are. Thank you for the clear explanation!

  • @suyashshrivastava8317
    @suyashshrivastava8317 3 years ago

    Thank you so much for this. Excellent explanation

  • @sahanagk4011
    @sahanagk4011 3 years ago

    This explanation is amazing!! Thank you for this

  • @이효건-o4o
    @이효건-o4o 6 years ago +1

    Thank you so much. Your videos are so helpful.

  • @dailygrowth7967
    @dailygrowth7967 3 years ago

    Thanks, i really enjoy your content!

  • @MadhavanSureshRobos
    @MadhavanSureshRobos 1 year ago +1

    How far we've come in 5 years!

  • @arunantony3207
    @arunantony3207 4 years ago +1

    Great explanation !

  • @simplyalec
    @simplyalec 1 month ago

    The amount of times this paper has been cited is astounding (rightfully so)

  • @sebchap24
    @sebchap24 1 year ago

    Quite an amazing explanation! thanks a lot

  • @sebastianreyes8025
    @sebastianreyes8025 2 years ago

    Protip, print out this paper and take notes on it as you try to follow along with this, also read on your own and take note of questions you have.

  • @marjansherafati6913
    @marjansherafati6913 4 years ago

    Thank you very much, amazing explanation! 🙏🏼🙏🏼

  • @nguyentrung1452
    @nguyentrung1452 2 years ago

    great explanation. Thank you, master

  • @aminzaiwardak6750
    @aminzaiwardak6750 5 years ago +1

    Thanks a lot, you explained it very well.

  • @danecchio6621
    @danecchio6621 3 years ago

    Thank you so much for the explanation.

  • @alfred17686
    @alfred17686 5 years ago +7

    This was such a good explanation. I've been trying to really understand these, but until now I haven't found a good resource. Cheers man!

  • @kevind.shabahang
    @kevind.shabahang 4 years ago

    Thank you. Very clear.

  • @Scranny
    @Scranny 4 years ago +1

    I'm confused how the decoder produces the output keys. First, is the key vector always the size of the lexicon of the source language? I (maybe) understand that K are the keys that are used to index V, where the real information is stored. But how does the encoder know how to create V and K separately?

    • @YannicKilcher
      @YannicKilcher  4 years ago +1

      No, the key vector is a distributed representation, its size is up to the developer. The model creates V and K separately by having different heads to produce each.
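      To illustrate the reply above: a rough sketch of the same input being projected into keys and values by two separately learned matrices; all sizes below are arbitrary developer choices, not tied to any lexicon:

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, d_k, n = 16, 4, 6  # illustrative sizes only

x = rng.normal(size=(n, d_model))  # encoder states, one row per token

# two separately learned projection matrices (random stand-ins here)
W_K = rng.normal(size=(d_model, d_k))
W_V = rng.normal(size=(d_model, d_k))

K = x @ W_K  # keys: what each position can be looked up by
V = x @ W_V  # values: the information returned for a match
print(K.shape, V.shape)  # (6, 4) (6, 4)
```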

  • @halehdamirchi146
    @halehdamirchi146 4 years ago +1

    This was really helpful, thank you!

  • @AlimamiHD
    @AlimamiHD 6 months ago +1

    Bro reminded me of that one dude who posted a video in 2011 begging people to buy $1 of bitcoin

  • @YtongT
    @YtongT 4 years ago +2

    "that, that looks ugly" 23:53
    with such an innocent voice, that software did you dirty

  • @rupjitchakraborty8012
    @rupjitchakraborty8012 4 years ago +1

    This is a great video. Please make a video on Hierarchical Neural Networks.

  • @tommykelly6840
    @tommykelly6840 2 years ago

    You are literally amazing

  • @tyfoodsforthought
    @tyfoodsforthought 4 years ago

    This was wonderful. Thank you!!!

  • @sarahpanda1167
    @sarahpanda1167 5 years ago +1

    Very helpful !! Thank you !

  • @shrikanthsingh8243
    @shrikanthsingh8243 5 years ago +1

    Thank you so much it was a very good explanation

  • @garrettosborne4364
    @garrettosborne4364 3 years ago

    Thanks Yannic, great videos.

  • @andreipokrovsky8376
    @andreipokrovsky8376 2 months ago

    By "traditional RNN" he really means RNNs that were practically used from 2012-2016 :) Although I guess BPTT is 1970s tech.