Google BERT Architecture Explained 1/3 - (BERT, Seq2Seq, Encoder Decoder)

  • Published: 1 Oct 2024
  • Google BERT (Bidirectional Encoder Representations from Transformers), a machine learning model for NLP, has been a breakthrough. In this video series I am going to explain the architecture and help reduce the time needed to understand this complex architecture.
    Paper reference: Attention Is All You Need
    Reference used in this part of Video:
    ai.google/rese...
    rajpurkar.gith...
    google.github....
    All References:
    arxiv.org/pdf/...
    github.com/hug...
    mlexplained.com...
    towardsdatasci...
    towardsdatasci...
    ai.google/rese...
    rajpurkar.gith...
    google.github....
    lilianweng.git...
    stats.stackexc...
    Thanks to our training partner, TechieGlobus: www.techieglobu...

Comments • 33

  • @adohreemas · 5 years ago · +7

    Finally someone made a good video on BERT. Hope more details will follow

  • @carmelitamaia8535 · 5 years ago · +6

    Next time I recommend removing the background music; it just creates noise in the presentation.

    • @SandeepBhutani · 5 years ago · +3

      Sure, will take care.. Hope the content and delivery were relevant

  • @ValeriaTiourina · 5 years ago · +8

    At 5:00 you say the language is German? It's Chinese :p Thanks for the video!

    • @SandeepBhutani · 5 years ago · +1

      Thanks.. I will learn both languages now 😉😊

    • @ChauNguyen-kg4ok · 4 years ago · +2

      Japanese actually :)

    • @epireve · 4 years ago · +1

      True, it’s Japanese.

  • @mohdazam1404 · 4 years ago

    It's not German, it's Chinese :) "Sehr gut" ("very good"). My comment is in German :)

    • @SandeepBhutani · 4 years ago · +1

      I hope the contents were useful to you

  • @propropiro · 5 years ago · +4

    A bit confusing and unconfident... many words but few takeaways

  • @aqibfayyaz1619 · 3 years ago · +1

    Very well explained, sir. Awesome.

  • @bharatbajoria · 4 years ago · +1

    Can you please explain building BERT for a summarization task?

    • @SandeepBhutani · 4 years ago

      It is like clustering on BERT embeddings.. Did you face issues with links on Google?
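
    A minimal sketch of that clustering idea for extractive summarization follows. It assumes the sentence-transformers and scikit-learn packages; the model name, sample sentences, and cluster count are illustrative, not from the video.

    ```python
    # Extractive summarization sketch: embed sentences with a BERT-style
    # encoder, cluster the embeddings, and keep the sentence nearest to
    # each cluster centroid. All names and sizes here are illustrative.
    import numpy as np
    from sentence_transformers import SentenceTransformer
    from sklearn.cluster import KMeans

    sentences = [
        "BERT is a bidirectional Transformer encoder.",
        "It is pre-trained with masked language modeling.",
        "Fine-tuning adapts it to downstream tasks.",
        "Sentence embeddings can also drive summarization.",
    ]

    model = SentenceTransformer("all-MiniLM-L6-v2")  # any BERT-style encoder
    embeddings = model.encode(sentences)             # shape: (n_sentences, dim)

    n_clusters = 2
    kmeans = KMeans(n_clusters=n_clusters, n_init=10).fit(embeddings)

    # The summary keeps one representative sentence per cluster.
    summary = []
    for c in range(n_clusters):
        idx = np.where(kmeans.labels_ == c)[0]
        dists = np.linalg.norm(embeddings[idx] - kmeans.cluster_centers_[c], axis=1)
        summary.append(sentences[idx[np.argmin(dists)]])

    print(" ".join(summary))
    ```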

  • @DeepakChauhan-mn5jw · 3 years ago · +1

    You have an Indian flavor and a good style of explanation. Looking forward to your videos.

    • @SandeepBhutani · 3 years ago

      Thanks

    • @SandeepBhutani · 3 years ago

      Any specific content you are interested in?

    • @DeepakChauhan-mn5jw · 3 years ago · +2

      @@SandeepBhutani I saw your playlist, and most of the videos are on NLP. I think you could work on a playlist that educates people on applying machine learning and deep learning to NLP problems. If you structure it well with an incremental storyline, I think it will have more value than university degrees or research work. Most people are scared away just by the equations.

  • @achintyaagarwal869 · 5 years ago · +1

    Great Video.
    I would also love to see the sequel to this video, in which @Sandeep Bhutani practically implements BERT in a use case/project/example.
    Keep up the good work.

  • @umadevig.r1759 · 5 years ago · +1

    Hi Sandeep, I liked your explanation. Would you please make a video on how to implement this for a Q&A task? Please explain the code. Thank you

    • @SandeepBhutani · 5 years ago

      Sure.. There is another video on QnA using AllenNLP, in case you are interested (a code sketch follows this thread)

    • @umadevig.r1759 · 5 years ago

      Thank you, I will go through it
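
    As a quick sketch of what such a Q&A implementation can look like: the snippet below uses the Hugging Face transformers pipeline (not AllenNLP, which the reply above refers to) with a public BERT checkpoint fine-tuned on SQuAD. The checkpoint name, question, and context are illustrative.

    ```python
    # Extractive question answering with a BERT model fine-tuned on SQuAD.
    # Uses the Hugging Face transformers pipeline; the checkpoint, question,
    # and context below are illustrative examples.
    from transformers import pipeline

    qa = pipeline(
        "question-answering",
        model="bert-large-uncased-whole-word-masking-finetuned-squad",
    )

    context = (
        "BERT is pre-trained with masked language modeling and next-sentence "
        "prediction, and is then fine-tuned for tasks such as SQuAD-style QA."
    )

    result = qa(question="What is BERT pre-trained with?", context=context)
    print(result["answer"], result["score"])  # answer span + confidence score
    ```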

  • @shreyasvenugopalan1315 · 5 years ago · +1

    Good overview. Thanks for the video.

  • @Furcifer · 5 years ago

    Not many technical aspects there; also try to improve your presentation style - it's very confusing. Thumbs up for the effort.

    • @SandeepBhutani · 5 years ago

      Thanks for the feedback.. What are you looking for? Code? Data values? Values at different layers?

  • @RMCMC_AmitNikhade · 4 years ago

    How can I feed multiple inputs to the seq2seq?

    • @SandeepBhutani · 4 years ago

      The question is not clear.. Can you please elaborate on your use case?

    • @RMCMC_AmitNikhade · 4 years ago

      In a seq2seq model we just feed a single input and get a single output... but is it possible to pass multiple inputs and get a single output?

    • @SandeepBhutani · 4 years ago

      You can check a caption generator.. There, multiple inputs are being fed
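
    For reference, here is a minimal sketch of such a merge-style caption generator, in which an image feature vector and a partial caption are two separate inputs combined to predict the next word. The layer sizes, vocabulary size, and sequence length are illustrative assumptions, not taken from the video.

    ```python
    # Merge-style caption generator sketch: two inputs (image features and a
    # partial caption) are combined to predict the next word. All sizes here
    # are illustrative assumptions.
    from tensorflow.keras.layers import Input, Dense, Embedding, LSTM, add
    from tensorflow.keras.models import Model

    vocab_size, max_len, feat_dim = 5000, 30, 2048

    # Input 1: an image feature vector (e.g., from a pre-trained CNN).
    img_in = Input(shape=(feat_dim,))
    img_vec = Dense(256, activation="relu")(img_in)

    # Input 2: the caption generated so far, as padded token ids.
    txt_in = Input(shape=(max_len,))
    txt_vec = LSTM(256)(Embedding(vocab_size, 128, mask_zero=True)(txt_in))

    # Merge both branches and predict the next word over the vocabulary.
    merged = Dense(256, activation="relu")(add([img_vec, txt_vec]))
    out = Dense(vocab_size, activation="softmax")(merged)

    model = Model(inputs=[img_in, txt_in], outputs=out)
    model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
    model.summary()
    ```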

  • @kunalr_ai · 5 years ago

    Fake video

    • @SandeepBhutani · 5 years ago · +1

      Thanks for your expert comment. It would be helpful if you could point out where and what went wrong in the video.. Will take care in future