Implement BERT From Scratch - PyTorch

  • Published: 15 Nov 2024

Comments • 22

  • @nhatnguyenviet8083 1 year ago +3

    Great, I hope there will be more "from scratch" videos like this. Thanks very much!

  • @learntestenglish 1 year ago

    I want to learn about this topic but haven't been able to find good resources. This is great for me. Thanks 🙏🙏

  • @goktankurnaz 1 year ago +2

    Great video with great explanations!

  • @islevkurt1532 1 year ago +1

    This is great for me👍🏻
    Thank you🙏🏻

  • @Ece-kx6qk 1 year ago +1

    This was a very useful video, thanks

  • @user-wr4yl7tx3w 1 year ago

    This is really helpful. Thanks!

  • @thomasfathy4769 6 months ago

    You're a genius, bro 🤩🤩🤩

  • @wilfredomartel7781 8 months ago

    😊 Does this new implementation include flash attention?

    • @uygarkurtai 8 months ago

      Hey, this is just the vanilla implementation. However, I believe you can enable flash attention while loading the model. I suggest you check the huggingface documentation.
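      One hedged way to pick up flash-attention-style kernels in a vanilla from-scratch implementation is to swap the hand-written softmax(QKᵀ/√d)V attention for PyTorch's fused `scaled_dot_product_attention`, which dispatches to FlashAttention kernels on supported GPUs. The shapes and names below are illustrative, not from the video:

      ```python
      import torch
      import torch.nn.functional as F

      # A hand-written attention like the one in a from-scratch BERT.
      def manual_attention(q, k, v):
          scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)
          return torch.softmax(scores, dim=-1) @ v

      torch.manual_seed(0)
      # (batch, heads, seq_len, head_dim) — illustrative shapes
      q = torch.randn(2, 8, 16, 64)
      k = torch.randn(2, 8, 16, 64)
      v = torch.randn(2, 8, 16, 64)

      # Fused path: same math, but uses flash/efficient kernels on GPU.
      fused = F.scaled_dot_product_attention(q, k, v)
      ref = manual_attention(q, k, v)
      print(torch.allclose(fused, ref, atol=1e-5))
      ```

      The two produce the same result up to floating-point tolerance; the fused call is the drop-in replacement.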

  • @cartoonsondemand_ 11 months ago

    Please make a video on implementing GPT-2 using PyTorch

    • @uygarkurtai 11 months ago +1

      I'll try to. Thank you for the recommendation!

  • @TheAnistop007 1 year ago

    Hello! Great video, thanks! Is there code where the model is trained? Like with a loss and an optimizer?

    • @uygarkurtai 1 year ago +1

      Hi, thank you :) I only built the model for this video. Maybe I'll do the full training in the future. If you want to try it yourself, here's a tip: train a BERT as usual (with huggingface, maybe), but instead of importing the model from a library, use this one. It should work.
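      The tip above can be sketched as a minimal masked-LM training loop with a loss and an optimizer. `TinyBert` below is a hypothetical stand-in trained on random toy tokens; swap in the from-scratch BERT class from the video and real data instead:

      ```python
      import torch
      import torch.nn as nn

      # Hypothetical stand-in for the video's from-scratch BERT.
      class TinyBert(nn.Module):
          def __init__(self, vocab_size=100, d_model=32):
              super().__init__()
              self.embed = nn.Embedding(vocab_size, d_model)
              self.encoder = nn.TransformerEncoderLayer(
                  d_model, nhead=4, dim_feedforward=64,
                  dropout=0.0, batch_first=True,
              )
              self.lm_head = nn.Linear(d_model, vocab_size)

          def forward(self, ids):
              return self.lm_head(self.encoder(self.embed(ids)))

      torch.manual_seed(0)
      model = TinyBert()
      optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
      loss_fn = nn.CrossEntropyLoss(ignore_index=-100)  # skip unmasked positions

      ids = torch.randint(1, 100, (8, 16))   # toy batch of token ids
      labels = ids.clone()
      mask = torch.rand(ids.shape) < 0.15    # mask ~15% of positions
      labels[~mask] = -100                   # loss only on masked tokens
      ids[mask] = 0                          # id 0 plays the role of [MASK] here

      losses = []
      for step in range(50):
          optimizer.zero_grad()
          logits = model(ids)                # (batch, seq_len, vocab)
          loss = loss_fn(logits.view(-1, logits.size(-1)), labels.view(-1))
          loss.backward()
          optimizer.step()
          losses.append(loss.item())
      ```

      Repeatedly fitting one fixed batch like this is only a smoke test, but the loss should drop, which confirms the model, loss, and optimizer are wired together correctly.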