BERT Neural Network - EXPLAINED!

  • Published: 14 Nov 2024

Comments • 331

  • @CodeEmporium
    @CodeEmporium  1 year ago +10

    For details and code on building a translator using a transformer neural network, check out my playlist "Transformers from scratch": ruclips.net/video/QCJQG4DuHT0/видео.html

  • @brilliantdirectoriestraining
    @brilliantdirectoriestraining 3 years ago +146

    I have studied AI, NLP and neural networks for several years. But the way you explained this was lovely, friendly and very simple, which is why I am pretty sure you are BERT.

  • @AmandeepSingh-xk4yv
    @AmandeepSingh-xk4yv 4 years ago +249

    Just watched a video on Transformers, and now this. I am astounded at how you explained such complex notions with such ease!
    Hugely underrated channel!

    • @CodeEmporium
      @CodeEmporium  4 years ago +7

      Thanks a lot! Glad you liked it 😊

  • @xJFGames
    @xJFGames 2 years ago +6

    I'm honestly amazed at how you managed to turn a complex algorithm into a simple 10-minute video. Many thanks to you; my final thesis appreciates you.

  • @prasannabiswas2727
    @prasannabiswas2727 4 years ago +1

    I read many blogs on BERT, but they were more focused on how to use BERT rather than what BERT actually is. This video helped me clear all my doubts regarding how BERT is trained. Clear and concise explanation.

  • @jeenakk7827
    @jeenakk7827 4 years ago +24

    I wish I had come across this channel earlier. You have a wonderful skill in explaining complicated concepts. I love your "3 pass" approach!!

  • @healthertsy1863
    @healthertsy1863 1 year ago

    Don't hesitate, this is the best BERT explanation video for sure!

  • @erfanshayegani3693
    @erfanshayegani3693 1 year ago

    Every time I watch this video I gain a better understanding of the procedure. Thanks a lot for the great content!!!

  • @ziangtian
    @ziangtian 1 year ago +1

    OMG!!!! This vid is a life-saver! It just elucidated so many aspects of NLP to me (a 3-month beginner who still understands nothing).

  • @usamahussain4461
    @usamahussain4461 2 years ago +4

    Phenomenal the way you condense such a complicated concept into a few minutes, clearly explained.

    • @CodeEmporium
      @CodeEmporium  1 year ago

      Thanks so much for the compliments:)

  • @NadavBenedek
    @NadavBenedek 2 years ago

    I love the "pass1" "pass2" concept of how you explain things. It's great.

  • @madhubagroy
    @madhubagroy 3 years ago +1

    The BEST explanation on BERT. Simply outstanding!

  • @michaelherrera4450
    @michaelherrera4450 3 years ago +5

    Dude this video is incredible. I cannot express how good you are at explaining

    • @CodeEmporium
      @CodeEmporium  3 years ago

      Thanks for watching! Super glad it is useful

    • @path1024
      @path1024 1 year ago

      Good try though!

  • @akhileshpandey123
    @akhileshpandey123 1 year ago

    I always come back to your explanation whenever I want to refresh BERT concepts. Thanks for the effort.

  • @rohanmirchandani9726
    @rohanmirchandani9726 3 years ago

    This is one of the best resources explaining BERT available online.

  • @mauriciolandos4712
    @mauriciolandos4712 1 year ago

    Best explainer on YouTube: you have a good mix of simplifying so it can be understood, but not over-simplifying, so we learn deeply enough. The idea of three passes going deeper was great as well.

  • @BogdanSalyp
    @BogdanSalyp 4 years ago +1

    Extremely underrated channel, didn't find any other good explanation on YouTube/Medium/Google.

  • @pallavijog912
    @pallavijog912 3 years ago

    Wow.. just switched from another BERT explained video to this.. stark difference.. excellent explanation indeed.. thanks..

  • @rajeshve7211
    @rajeshve7211 5 months ago

    Best ever explanation of BERT! Finally understood how it works :)

  • @ydas9125
    @ydas9125 1 year ago

    I am very impressed by the clarity and core focus of your explanations to describe such complex processes. Thank you.

    • @CodeEmporium
      @CodeEmporium  1 year ago

      You are very welcome. Thanks for watching and commenting :)

  • @AbdulWahab-cy9ib
    @AbdulWahab-cy9ib 4 years ago +1

    I was struggling to understand the basics of BERT after going through the Transformer model. This video was indeed helpful.

  • @mays7n
    @mays7n 1 year ago

    This must be one of the best explanation videos on the internet, thank you!

  • @goelnikhils
    @goelnikhils 1 year ago

    Excellent explanation. The main thing to note is the finer point around the loss functions that BERT uses, as not many other videos on the same topic cover this. Too good.

  • @aeirya1686
    @aeirya1686 1 year ago

    Very very friendly, clear and masterful explanation. This is exactly what I was after. Thank you!

  • @TSP-NPC
    @TSP-NPC 1 year ago

    Thanks for the great explanation of Transformers and the architecture of BERT.

    • @CodeEmporium
      @CodeEmporium  1 year ago

      My pleasure and thank you for the super thanks :)

  • @krishnaik06
    @krishnaik06 4 years ago +210

    Amazing Explanation :)

    • @doyourealise
      @doyourealise 4 years ago +1

      big fan sir

    • @richarda1630
      @richarda1630 3 years ago

      hear hear! so agree!

    • @User-nq9ee
      @User-nq9ee 2 years ago +2

      To teach us, you study and explore... really grateful for your efforts, Krish.

    • @indgaming5452
      @indgaming5452 2 years ago

      Wherever Krish sir is, I will come... Sir, I found you here also...
      We are learning together 😇

    • @itsme1674
      @itsme1674 1 year ago

      Nice to meet you sir

  • @jan-rw2qx
    @jan-rw2qx 1 year ago

    First half is exactly how much I need to understand right now, thank you :)

  • @deepeshkumar4945
    @deepeshkumar4945 1 year ago

    Dude, you are amazing. You explained a state-of-the-art NLP model in such a well-explained and concise video. Thanks a ton for this video!

    • @CodeEmporium
      @CodeEmporium  1 year ago +1

      You are super welcome. Thanks so much for commenting this!

  • @maverick3069
    @maverick3069 4 years ago +1

    The multiple passes of explanation is an absolutely brilliant way to explain! Thanks man.

  • @andrewlai3358
    @andrewlai3358 4 years ago +3

    Thank you for the explanation. You really have a knack for explaining NLP concepts clearly without losing much fidelity. Please keep posting!

  • @ankit_khanna
    @ankit_khanna 3 years ago

    One of the best videos on BERT.
    Great work!
    Wishing you loads of success!

  • @rahuldey6369
    @rahuldey6369 3 years ago

    I've read 4 articles before coming here. Couldn't connect the dots. This single video showed me the way.. Thanks a lottt

  • @bidyapatip
    @bidyapatip 4 years ago

    After reading lots of blogs and videos, I thought it was such a difficult network. But after going through this, I find it so easy to understand BERT (and its variants available in the transformers library).

  • @somerset006
    @somerset006 1 year ago

    Nice job, man! Especially the multi-phase approach of explaining things, top to bottom.

    • @CodeEmporium
      @CodeEmporium  1 year ago

      Super happy you liked the approach. Thanks for commenting

  • @josephselwan1652
    @josephselwan1652 3 years ago +1

    Amazing stuff. For visualization purposes, when you get into a deeper pass, I would recommend always adding the zooming effect for intuitive understanding. I am not sure about others, but when you do that, I instantly know "OK, now we are within this 'box' "

    • @CodeEmporium
      @CodeEmporium  2 years ago +2

      Good thought. I'll try to make this apparent in the future. Thanks!

  • @JP-gk6pi
    @JP-gk6pi 4 years ago

    3 pass explanation is a really good approach to explain this complex concept. Best video on BERT

    • @beteaberra
      @beteaberra 3 years ago

      Great video! But what is pass 1, pass 2 and pass 3?

  • @andymoses95
    @andymoses95 4 years ago +1

    The one I was looking for for the past 6 months! Thanks a lot for making this.

  • @harshavardhany2970
    @harshavardhany2970 3 years ago +1

    Simple and clear explanations (which shows you know what you're talking about). And cool graphics. Will be back for more videos :)

  • @muhannadobeidat
    @muhannadobeidat 1 year ago

    Excellent introduction, visualizations and step by step approach to explain this. Thanks a ton.

    • @CodeEmporium
      @CodeEmporium  1 year ago

      You are oh-so welcome. Thank you for watching and commenting:)

  • @aashnavaid6918
    @aashnavaid6918 2 years ago

    thank you so very much! one video was enough to get the basics clear

  • @aruncjohn
    @aruncjohn 4 years ago

    Excellent explanation! Will never miss a video of yours from now on!

  • @alidi5616
    @alidi5616 3 years ago

    That's it. The best explanation I came across. Receive my upvote and subscription 😁

    • @CodeEmporium
      @CodeEmporium  3 years ago

      Many thanks. Join the discord too :3

  • @ilyasaroui7745
    @ilyasaroui7745 4 years ago

    Good touch to put the references in the description instead of on the slides.

  • @seeunkim4185
    @seeunkim4185 2 years ago

    Thank you so much for the clear explanation, I get the grip of BERT now!

  • @shrawansahu9500
    @shrawansahu9500 2 years ago

    Wow, the best explanation of BERT on YouTube, and free too. Thanks man, you made NLP easy. 🍺

  • @dannysimon2965
    @dannysimon2965 4 years ago

    Wow, thanks!! I tried watching many videos and couldn't understand a single thing. But yours was truly concise and informative.

  • @abhikc8108
    @abhikc8108 4 years ago +3

    Great explanation, I really like the three pass idea it breaks down a lot of complications to simple concepts.

  • @aitrends8901
    @aitrends8901 2 years ago

    Very nice high level understanding of Transformers...

  • @20.nguyenvongockhue80
    @20.nguyenvongockhue80 7 months ago

    Wow, AMAZING EXPLANATION. Thank you very much.

  • @ParthivShah
    @ParthivShah 3 months ago

    Thank You. Love from India.

  • @BiranchiNarayanNayak
    @BiranchiNarayanNayak 4 years ago

    Best explanation i have seen so far on BERT.

  • @pinakjani4281
    @pinakjani4281 4 years ago +1

    No one explains DL models better than this guy.

  • @ngavu8750
    @ngavu8750 2 years ago

    So simple and easy to understand, thanks a lot

  • @utkarshujwal3286
    @utkarshujwal3286 1 year ago

    The NSP task does not directly involve bidirectional modeling in the same way as masked language modeling (MLM) does. Instead, it serves as a supplementary objective during BERT's pre-training phase. The purpose of the NSP task is to help the model understand relationships between sentences and capture broader context beyond adjacent words.
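
    A minimal sketch of those two pre-training objectives side by side, assuming the Hugging Face transformers library and the bert-base-uncased checkpoint (an editorial illustration, not code from the video):

        from transformers import BertTokenizer, BertForPreTraining
        import torch

        tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
        model = BertForPreTraining.from_pretrained("bert-base-uncased")

        # A sentence pair (for NSP) with one token masked (for MLM).
        inputs = tokenizer("The man went to the [MASK].",
                           "He bought a gallon of milk.",
                           return_tensors="pt")
        with torch.no_grad():
            outputs = model(**inputs)

        # MLM head: a score over the vocabulary for every token position.
        print(outputs.prediction_logits.shape)        # (1, seq_len, vocab_size)
        # NSP head: two logits (IsNext / NotNext) computed from the [CLS] position.
        print(outputs.seq_relationship_logits.shape)  # (1, 2)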

  • @keyrellousadib6124
    @keyrellousadib6124 2 years ago

    This is an excellent summary. Very clear and super well organized. Thanks very much

    • @CodeEmporium
      @CodeEmporium  1 year ago

      Thank you so much for watching ! And for the wonderful comment :$

  • @pathuri86
    @pathuri86 2 years ago

    Excellent and concise explanation. Loved it. Thanks for this fantastic video.

  • @amreshgiri
    @amreshgiri 4 years ago

    Probably the best (easiest to understand in one go) video on BERT. Thanks ❤️

  • @akashsaha3921
    @akashsaha3921 4 years ago

    Well explained. Short and to the point

  • @Kumar08
    @Kumar08 4 years ago

    Fantastic explanation, covered each and every point of BERT.
    Looking forward to more videos on NLP.

  • @susantachary7456
    @susantachary7456 3 years ago

    No more reading after this. Loved IT. 😊

  • @jx7433
    @jx7433 8 months ago

    Excellent Explanation! Thank you!

  • @LoLelfy
    @LoLelfy 3 years ago

    Omg your videos are so good! So happy I found your channel, I'm binge watching everything :D

    • @CodeEmporium
      @CodeEmporium  3 years ago +1

      Glad you found my channel too! Thank you! Hope you enjoy them!

  • @maryamaziz3841
    @maryamaziz3841 3 years ago

    Thanks for the wonderful explanation of the BERT architecture 🍀🌹

  • @felixakwerh5189
    @felixakwerh5189 4 years ago

    great video, i like the 3-pass method you used to explain the concepts

  • @josephpareti9156
    @josephpareti9156 2 years ago

    awesome introduction to a very challenging topic

    • @CodeEmporium
      @CodeEmporium  2 years ago

      Thank you. Uploading a related video on this soon too :)

  • @ruyanshou805
    @ruyanshou805 3 years ago

    Well explained! I have been looking for something like this for quite long!

  • @sathyakumarn7619
    @sathyakumarn7619 3 years ago

    Not bad! Loved the video. Please add a little more explanation in upcoming vids if preferable.

  • @MarketingLeap
    @MarketingLeap 4 years ago

    Loved how you explained BERT really well. Great job!

  • @magelauditore333
    @magelauditore333 4 years ago

    Such an underrated channel. Keep it up man.

  • @adityaj9984
    @adityaj9984 1 year ago

    BEST EXPLANATION EVERRRR

  • @Deddiward
    @Deddiward 3 years ago

    Wow I've just discovered your channel, it's full of resources, very nice!

  • @sabirahmed6191
    @sabirahmed6191 3 years ago

    Firstly, thanks for the really cool explanation. I would like to ask, please remove the text animation, as it causes a huge distraction for some people. I had to watch this with multiple breaks because my head was aching due to the text animation.

  • @arvindvasudevan45
    @arvindvasudevan45 4 years ago

    Excellent articulation of the concept. Thank you.

  • @urd4651
    @urd4651 2 years ago

    thank you for sharing the video! very clear and helpful!!

  • @dupontremi5638
    @dupontremi5638 1 year ago

    great explanation, I understood everything ! thanks a lot

    • @CodeEmporium
      @CodeEmporium  1 year ago

      Thanks so much for watching and commenting :)

  • @gauravchatterjee794
    @gauravchatterjee794 3 years ago

    Best Explanation by far!

  • @JillRhoads
    @JillRhoads 1 year ago

    I'm a teacher with a comp-sci master's and am just diving into AI. This was absolutely great! I studied natural language in college about 20 years ago and this video really helped form a mental bridge to the new technologies. Language chatbots have been around forever! And the ones I studied used Markov chains trained on small corpora. So of course students would dump porn novels, the Bible and whatnot into them and then start them talking. We never laughed so hard!

  • @sajaal-dabet148
    @sajaal-dabet148 3 years ago +1

    This is amazing! Crystal clear explanation, thanks a lot.

  • @rashmikuchhal5339
    @rashmikuchhal5339 4 years ago

    I always watch your videos and appreciate the effort you put into making complicated topics so easy and clear. Thank you for all your work. I really like the way you explain in 3 passes... great work.

  • @쉔송이
    @쉔송이 4 years ago

    Hey, your explanation and presentation of complicated concepts made TF and BERT clear to me.
    I hope you upload more exciting videos.

  • @mohsennayebi86
    @mohsennayebi86 2 years ago

    Amazing Explanation! I am speechless :)

  • @atamir8339
    @atamir8339 4 months ago

    loved your explanation bro, earned yourself a sub

  • @themaninjork
    @themaninjork 3 years ago

    Your videos are very well explained!

  • @arturjaroszewicz8424
    @arturjaroszewicz8424 8 months ago

    Beautiful explanation!

  • @caoshixing7954
    @caoshixing7954 4 years ago

    Very good explanation, easy to understand! Keep it up!

  • @NachoProblems
    @NachoProblems 1 year ago

    Do you realize yours is the only good description of how exactly fine-tuning works that I have found, and I've been researching for months? Thank you!!!

    • @CodeEmporium
      @CodeEmporium  1 year ago +1

      You are too kind. Thank you for the donation. You didn't have to, but it is appreciated. Also super glad this content was useful! More of this to come.

  • @kogocher
    @kogocher 3 years ago

    explained the concepts clearly

  • @DrRavi_TutorialsforEngineering

    Hi dear,
    How are you? I've seen your video. It's your teacher and friend from SJCE Mysore.

  • @noumaaaan
    @noumaaaan 3 years ago

    This explanation is pretty amazing! I have a presentation on this soon. Thank you so much!

  • @nooshnabi9248
    @nooshnabi9248 3 years ago

    Great Video! So easy to follow!

  • @pratyushnegi4082
    @pratyushnegi4082 2 years ago

    Thanks for explaining this in such a simple way :)

  • @antoinefdu
    @antoinefdu 3 years ago +1

    @CodeEmporium I have a question for you. Imagine the sentence:
    "My dad went to the dentist yesterday and he told him that he needs to floss more."
    Can BERT understand that in this context "he" is probably the dentist, and "him" my dad?

    • @CodeEmporium
      @CodeEmporium  3 years ago +2

      Great question. It's actually a studied problem in NLP called "cataphoric resolution" and "anaphoric resolution". I have seen examples of it online, so I think we should be able to do this to some degree.

    • @lumos309
      @lumos309 3 years ago +2

      It's actually even more complex than you mentioned, since the first "he" is the dentist but the second "he" is your dad! It would be fascinating to see if this can actually be done.
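
    A crude way to poke at the question above is a cloze probe with a masked-language-model pipeline (a sketch assuming the Hugging Face transformers library; this is not true coreference resolution, and whether "dentist" actually outranks "dad" depends on the checkpoint):

        from transformers import pipeline

        unmasker = pipeline("fill-mask", model="bert-base-uncased")
        sentence = ("My dad went to the dentist yesterday and the [MASK] "
                    "told him that he needs to floss more.")
        # Print BERT's top candidates for the noun doing the telling.
        for pred in unmasker(sentence, top_k=5):
            print(pred["token_str"], round(pred["score"], 3))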

  • @dipanjanghosal6164
    @dipanjanghosal6164 4 years ago

    Thanks for such a nice overview. Very well done animation, which augments the understanding further. Maybe you could have touched on RoBERTa here and the differences. Maybe a separate video for that?

    • @CodeEmporium
      @CodeEmporium  4 years ago +1

      Thanks! RoBERTa is on my list. I wasn't kidding when I said there is a lot on BERT. Will make that in another video.

  • @AbdulWahab-cy9ib
    @AbdulWahab-cy9ib 4 years ago

    In the video, you mentioned that during training C is a binary output, but in the paper it is mentioned that it is a vector.
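
    Both statements are consistent: in the paper, C is the final hidden vector of the [CLS] token, and the binary IsNext / NotNext decision comes from a small classification layer applied to that vector. A rough PyTorch sketch (the layer name here is illustrative, not BERT's actual module name):

        import torch
        import torch.nn as nn

        hidden_size = 768                 # BERT-base hidden size
        C = torch.randn(1, hidden_size)   # stands in for the final [CLS] hidden state (a vector)

        nsp_head = nn.Linear(hidden_size, 2)         # maps C to two logits: IsNext vs. NotNext
        logits = nsp_head(C)
        print(logits.shape, logits.argmax(dim=-1))   # the "binary output" is the argmax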

  • @mohamedsaaou9258
    @mohamedsaaou9258 4 years ago

    Wonderful, and the best explanation of BERT ever :). Good job, and a big thank you!

  • @blue_smoothie
    @blue_smoothie 3 years ago

    Great explanation! I am currently writing my master's thesis in computer science and was wondering how I could cite you as a source? Your videos really helped me! Thanks in advance!

    • @CodeEmporium
      @CodeEmporium  3 years ago +1

      Awesome! I know some conferences/journals require different formats. As long as "CodeEmporium" and the video title are there, that's good. If they want a URL, you can include that too.

    • @blue_smoothie
      @blue_smoothie 3 years ago

      @@CodeEmporium Great! Thank you so much for the quick reply!

  • @dhananjaysonawane1996
    @dhananjaysonawane1996 3 years ago

    @11:08 Please go on forever :):D... the best explanation I have seen so far, Thanks man!

  • @TheMehrdadIE
    @TheMehrdadIE 4 years ago

    Awesome! How simply you explain these models :)

  • @AIMLDLNLP-TECH
    @AIMLDLNLP-TECH 1 year ago

    sentence: "The cat sat on the mat."
    BERT reads sentences bidirectionally, which means it looks at the words both before and after each word to understand its meaning in the context of the whole sentence.
    For example, let's say you want to know what the word "mat" means in the sentence: "The cat sat on the mat." BERT understands "mat" not just by itself, but by considering the words around it. It knows that a cat can sit on something, and "mat" here means a flat piece of fabric on the floor.
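
    A small sketch of that bidirectional point, assuming the Hugging Face transformers library (not code from the video): the vector BERT assigns to "mat" is computed from the words on both sides of it, so the same word gets a different representation in a different sentence.

        from transformers import BertTokenizer, BertModel
        import torch

        tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
        model = BertModel.from_pretrained("bert-base-uncased")

        def mat_vector(sentence):
            # Return the contextual vector BERT produces for the token "mat".
            enc = tokenizer(sentence, return_tensors="pt")
            with torch.no_grad():
                hidden = model(**enc).last_hidden_state[0]
            idx = enc.input_ids[0].tolist().index(tokenizer.convert_tokens_to_ids("mat"))
            return hidden[idx]

        v1 = mat_vector("The cat sat on the mat.")
        v2 = mat_vector("He pinned his opponent to the mat.")
        print(torch.cosine_similarity(v1, v2, dim=0))  # < 1.0: context changes the vector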

  • @yuri-nehc
    @yuri-nehc 1 year ago

    very good explanation! thank you!