BERT for pretraining Transformers

  • Published: 17 Nov 2024

Comments • 13

  • @erdi749
    @erdi749 1 year ago +6

    One of the most underrated Transformers tutorials on YouTube, please keep up the great work!

  • @sachavanweeren9578
    @sachavanweeren9578 2 years ago +6

    This was exactly the missing detail I was looking for. Thanks for the very clear explanation!

  • @szymonpogodzinach2495
    @szymonpogodzinach2495 1 year ago

    This is the best video on BERT.

  • @md.mushfiqurrahman4925
    @md.mushfiqurrahman4925 1 year ago

    This is the best tutorial video on transformers. Thank you so much

  • @darkingdarlingguy
    @darkingdarlingguy 3 years ago +1

    Very nice and clear knowledge. Thank you for your contribution.

  • @Lonely_trader
    @Lonely_trader 1 month ago

    Great explanation

  • @user-wr4yl7tx3w
    @user-wr4yl7tx3w 11 months ago

    This is an excellent presentation

  • @iiirannn1
    @iiirannn1 1 year ago

    great and simple explanation

  • @johnsmith-mp4pr
    @johnsmith-mp4pr 2 years ago +2

    great explanation!

  • @asifpervezpolok2243
    @asifpervezpolok2243 2 years ago +1

    thank you for a great video

  • @unicarn4475
    @unicarn4475 2 years ago +1

    Learned a lot.

  • @zhaoxiao2002
    @zhaoxiao2002 2 years ago +1

    excellent!

  • @maj46978
    @maj46978 1 year ago

    ❤❤❤ One simple query: if every word has an embedding, what would be the embeddings for the [CLS] and [SEP] tokens? Are they just randomly initialized vectors?
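
    In standard BERT, [CLS] and [SEP] are ordinary entries in the WordPiece vocabulary, so they get rows in the same learned token-embedding table as every word: random at initialization, then trained by gradient descent during pretraining. A minimal PyTorch sketch (the toy vocabulary and dimensions here are illustrative, not BERT's real ids or sizes):

    ```python
    import torch
    import torch.nn as nn

    # Toy vocabulary: special tokens sit alongside regular words.
    # (In the real BERT vocab, [CLS] is id 101 and [SEP] is id 102
    # among ~30k WordPiece entries.)
    vocab = {"[PAD]": 0, "[CLS]": 1, "[SEP]": 2, "hello": 3, "world": 4}

    # One shared embedding table covers words AND special tokens.
    # At init the [CLS]/[SEP] rows are random, exactly like any word's
    # row; pretraining then updates them, so the final vectors are
    # learned, not hand-designed.
    emb = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)

    ids = torch.tensor([vocab["[CLS]"], vocab["hello"],
                        vocab["world"], vocab["[SEP]"]])
    vectors = emb(ids)  # shape (4, 8): one learned vector per token
    print(vectors.shape)
    ```

    Because [CLS] appears at the start of every input during pretraining, its output-side hidden state ends up aggregating sentence-level information, which is why it is used for classification heads.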