Ins and Outs of Attention

  • Published: 14 Nov 2024
  • This video covers attention models intuitively and practically, including minute details that are often left out or forgotten. Attention forms the core of transformers, which are currently everywhere. I have tried to cover the heart of transformers both mathematically and conceptually. Let me know in the comments if the code and slides are required.
    Feedback is always welcome. 😁😁
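
The attention mechanism the video describes can be sketched as scaled dot-product attention; the sketch below is a minimal illustrative assumption (shapes, names, and random inputs are not taken from the video):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Q, K: (seq_len, d_k); V: (seq_len, d_v).
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # each query's similarity to each key
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V                  # weighted average of the values

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

The division by sqrt(d_k) keeps the dot-product scores from growing with dimension, which would otherwise push the softmax into near one-hot regions with vanishing gradients.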

Comments • 13

  • @mrrock7229
    @mrrock7229 7 months ago

    Perfect explanation

  • @sneakpeakt22108
    @sneakpeakt22108 7 months ago

    Nice explanation ❤

  • @prashantkulkarni6344
    @prashantkulkarni6344 6 months ago

    Thanks a ton for breaking down transformer implementation!!

  • @not_amanullah
    @not_amanullah 7 months ago

    Thanks ❤

  • @inugr8
    @inugr8 7 months ago

    Wow, this is incredibly captivating!

  • @anujshukla104
    @anujshukla104 7 months ago

    Good work... keep making videos ❤❤❤❤

  • @UsernameDisplay
    @UsernameDisplay 1 month ago

    It would be helpful if you mentioned the prerequisites and target audience for these videos in the description,
    or at the start of the video.

  • @RakhiPandey-sm6cc
    @RakhiPandey-sm6cc 7 months ago +1

    All the best👍

  • @ShreyaSingh-wj8ie
    @ShreyaSingh-wj8ie 7 months ago +1

    Keep Growing✨ All the best.