Ali Ghodsi, Deep Learning, Attention mechanism, self-attention, S2S, Fall 2023, Lecture 9

  • Published: 8 Sep 2024
  • Attention mechanism and self-attention,
    Sequence-to-sequence models
    This video provides an in-depth exploration of the attention mechanism and self-attention, crucial concepts that have revolutionized the field of Natural Language Processing (NLP). Transformers, the game-changers in NLP, rely heavily on self-attention. Join us as we unravel the fundamentals of attention and self-attention in the context of NLP, and gain a brief insight into their application in image processing. (A minimal illustrative sketch of self-attention follows below.)
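
For context, here is a minimal sketch of single-head scaled dot-product self-attention, the core operation the lecture builds up to. This is an illustrative example only, not code from the video; the function name self_attention and the projection matrices W_q, W_k, W_v are assumptions made for the sketch.

    import numpy as np

    def self_attention(X, W_q, W_k, W_v):
        # Project the input embeddings into queries, keys, and values.
        Q = X @ W_q
        K = X @ W_k
        V = X @ W_v
        # Scaled dot-product scores: how much each token attends to every token.
        scores = Q @ K.T / np.sqrt(K.shape[-1])
        # Row-wise softmax turns scores into attention weights.
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights = weights / weights.sum(axis=-1, keepdims=True)
        # Each output is an attention-weighted mixture of the value vectors.
        return weights @ V

    # Toy usage: 5 tokens with 8-dimensional embeddings, projected to 4 dimensions.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 8))
    W_q, W_k, W_v = (rng.normal(size=(8, 4)) for _ in range(3))
    out = self_attention(X, W_q, W_k, W_v)
    print(out.shape)  # (5, 4)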

Comments • 4

  • @olabintanibraheem8111 10 months ago

    Thanks for the class, sir.

  • @nguyenple 10 months ago

    Thank you

  • @user-us1jf8zd8e 9 months ago

    Detailed mathematical explanation of the formulas starts at 47:00

  • @bsementmath6750 8 months ago

    Prof., you used to be very verbose and expansive on the board. Why this hybrid mode of PPT slides and some board work? Love from Pakistan!