Residual Networks and Skip Connections (DL 15)

  • Published: Jan 3, 2025

Comments • 105

  • @GisleGaren
    @GisleGaren 1 month ago +6

    I can't remember the last time I commented on a YouTube video, as it was much too long ago, but I just had to, because your videos on deep learning are CRIMINALLY underrated. I have yet to find another resource that explains ResNet so intuitively; you break each concept down into layman's terms and take your time explaining them. You have an amazing way of explaining concepts, and I sincerely hope your videos get all the recognition they deserve!

  • @ashishbhong5901
    @ashishbhong5901 1 year ago +10

    I have seen a lot of online lectures, but you are the best for two reasons: the way you speak is not monotonous, which gives time to comprehend and process what you are explaining, and second is the effort put into video editing to speed up the writing on the board, which doesn't break the flow of the lecture. Liked your video. Thanks🙂!

  • @anirudhsarma937
    @anirudhsarma937 1 year ago +22

    Very, very good explanation. Almost all explanations of this forget about the influence of random weights on the forward propagation and focus solely on the backward gradient multiplication, which is why I never understood why you needed to feed forward the input. Thanks a lot.

  • @alexei.domorev
    @alexei.domorev 1 year ago +30

    ResNets are tricky to conceptualise as there are many nuances to consider. Dr Bryce, you have done a great job here offering such a brilliant explanation that is both logical and easy to follow. You definitely have a gift of explaining complex ideas. Thank you!

  • @priyanshpal4412
    @priyanshpal4412 17 days ago

    Thank you, professor. The best explanation; he includes the influence of random weights on the forward propagation.

  • @AdityaSingh-qk4qe
    @AdityaSingh-qk4qe 10 months ago +4

    This is the clearest video I've ever seen explaining ResNet for a layman, while at the same time conveying all the important and relevant information about it. I couldn't understand the paper, but with this video I finally understood it. Thanks a lot, Professor Bryce. I hope you create more such videos on deep learning.

  • @vernonmascarenhas1801
    @vernonmascarenhas1801 7 months ago +4

    I am writing a thesis on content-based image retrieval and had to understand the ResNet architecture in depth, and by far this is the most transparent explanation ever!!

  • @Engrbrain
    @Engrbrain 1 year ago +10

    I am going to complete the entire playlist. Thanks, Bryce, you are a life saver

  • @lallama202
    @lallama202 11 months ago +3

    Love your explanation, very easy to understand the concept and the flow of the ResNet in 17 mins! Really appreciate it

  • @crowsnest6753
    @crowsnest6753 1 month ago

    Thank you for the clear and concise explanation.

  • @giordano_vitale
    @giordano_vitale 11 months ago +1

    Every single second of this video conveys an invaluable amount of information to properly understand these topics. Thanks a lot!

  • @garydalley2349
    @garydalley2349 9 months ago

    Awesome explanation. Got me through a learning hurdle that several others could not.

  • @andychess
    @andychess 2 months ago

    Great explanation, it helped me a lot. Thank you for taking the time to make this video!

  • @akashnayak3752
    @akashnayak3752 1 month ago

    The best explanation ever. Thank you professor

  • @puyushgupta1768
    @puyushgupta1768 11 months ago +1

    16 golden minutes.❤

  • @alissabrave424
    @alissabrave424 7 months ago +1

    Brilliant explanation! Thank you so much, Professor Bryce!

  • @eeThial5
    @eeThial5 2 months ago

    Wow, so clear! That was stellar, thank you!

  • @beatbustersindia3641
    @beatbustersindia3641 1 year ago +1

    Brilliant explanation.

  • @noumanahmad308
    @noumanahmad308 3 months ago

    That was amazing! Such a clear and concise explanation. Thanks!

  • @pariyajebreili
    @pariyajebreili 4 months ago

    Your explanation is great.

  • @strictly-ai
    @strictly-ai 9 months ago +1

    Best explanation of ResNet on the internet

  • @subramanianiyer3300
    @subramanianiyer3300 1 year ago

    Thank you, Prof. Bryce, for explaining this with minimal technical complexity

  • @ali57555
    @ali57555 10 months ago

    Thank you very much for putting the time and effort. This is one of the best explanations I've seen (including US uni. professors)

  • @huyngo9507
    @huyngo9507 25 days ago

    Thank you for such a clear explanation

  • @rabindhakal
    @rabindhakal 10 months ago

    You have my respect, Professor.

  • @mohamedsidibe9876
    @mohamedsidibe9876 3 months ago

    Thank you for your good explanation, it helped me a lot on my journey toward a deep understanding of all these mechanisms 😊

  • @kindness_mushroom
    @kindness_mushroom 1 year ago

    Thank you for the clear, concise, yet comprehensive explanation!

  • @zhen_zhong
    @zhen_zhong 8 months ago

    This tutorial is so clear that I can follow along as a non-native English speaker. Thanks a lot!

  • @赵赵宇哲
    @赵赵宇哲 1 year ago +1

    Very nice video!

  • @rohithr2071
    @rohithr2071 8 months ago

    Best explanation of ResNet I have come across so far.

  • @shobhitsrivastava9112
    @shobhitsrivastava9112 1 year ago +1

    Until now, this is the best Residual Network tutorial I have found. As constructive feedback, I would like you to dive more deeply into how shape mismatches are handled, because that part is not on par with the rest of the highly intuitive explanations of the various things happening in a ResNet.
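    For reference, the standard fix from the ResNet paper is a "projection shortcut": when a block changes the spatial size or depth, the skip path applies a strided 1x1 convolution so the two tensors match before the addition. A minimal sketch, assuming PyTorch (the layer sizes here are illustrative, not taken from the video):

        import torch
        import torch.nn as nn

        # Main path halves the spatial size and doubles the depth.
        block = nn.Sequential(
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv2d(128, 128, kernel_size=3, padding=1),
        )
        # The skip path can't be a plain identity here, so it is
        # projected with a strided 1x1 convolution to the same shape.
        shortcut = nn.Conv2d(64, 128, kernel_size=1, stride=2)

        x = torch.randn(1, 64, 56, 56)
        y = block(x) + shortcut(x)   # both terms are (1, 128, 28, 28)
        print(y.shape)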

  • @rishabhagarwal4702
    @rishabhagarwal4702 7 months ago

    Brilliant explanation, the 3D diagrams were excellent and I could understand some tricky concepts, thank you so much!

  • @michaelkern7154
    @michaelkern7154 4 months ago

    Such a great explanation. Love this!

  • @raulpena9865
    @raulpena9865 1 year ago +1

    Thank you, Professor Bryce, ResNets were brilliantly explained by you. I am looking forward to new videos on more recent deep learning architectures!

  • @lhdtomlee
    @lhdtomlee 5 months ago

    Thank you Professor! This introduction is really helpful and detailed!

  • @sanjeevjangra84
    @sanjeevjangra84 8 months ago

    So clear and well explained. Thank you!

  • @davar_d
    @davar_d 1 year ago

    Brilliant explanation. Thank you!

  • @abdulsaboorkhan8337
    @abdulsaboorkhan8337 11 months ago

    Thank you so much Mr Bryce.

  • @nikhilthapa9300
    @nikhilthapa9300 1 year ago

    Your explanations are very clear and well structured. Please never stop teaching.

  • @lalop4258
    @lalop4258 1 year ago

    Excellent class! I watched many videos before I came to this video and none explained the concept of residual networks as clearly as you did.
    Greetings from México!

  • @nguyentranconghuy6965
    @nguyentranconghuy6965 7 months ago

    nice explanation, thank you very much Professor Bryce

  • @luisaruquipac.381
    @luisaruquipac.381 6 months ago

    Awesome explanation! Thanks a lot.

  • @AqsaChappalwala
    @AqsaChappalwala 8 months ago

    What an explanation

  • @MrMiguelDonate
    @MrMiguelDonate 8 months ago

    Brilliant explanation!!!

  • @swethanandyala
    @swethanandyala 7 months ago

    Amazing explanation. Thank you sir

  • @oliverFree-s5v
    @oliverFree-s5v 1 year ago

    Thanks so much! Very informative, brief explanation

  • @vaibhavnakrani2983
    @vaibhavnakrani2983 1 year ago

    Awesome. Loved it, clear and concise!

  • @business_central
    @business_central 1 year ago +2

    Omg this is so helpful! Thank you so much!!!

  • @Mewgu_studio
    @Mewgu_studio 7 months ago

    Thanks for your video.

  • @jonathanzkoch
    @jonathanzkoch 1 year ago +1

    Great video on this, super informative.

  • @JavierMorales-v2n
    @JavierMorales-v2n 1 year ago

    Great explanation, congrats.

  • @thelife5628
    @thelife5628 7 months ago +2

    Another example of a random YouTuber with very few subscribers explaining a complex topic so brilliantly...
    Thank you so much, sir

  • @schmiede1998
    @schmiede1998 1 year ago

    Thank you so much for this video!

  • @rhysm8167
    @rhysm8167 1 year ago

    this was fantastic - thank you

  • @sam-vv6gl
    @sam-vv6gl 9 months ago

    thank you for the great explanation

  • @EDward-u1f6i
    @EDward-u1f6i 1 year ago

    your explanation is clear and concise! Thank you so much

  • @ArtJug
    @ArtJug 1 year ago

    Wow, this explanation is amazing. So clear! I saw some videos about ResNets, but none of them describes what skip connections actually do inside: their internal structure and working logic. Your explanation gives me much more. You explained the way of thinking, the internal structure, and the advantages. Wow!

  • @Bachelorarbeit-op4he
    @Bachelorarbeit-op4he 1 year ago

    great explanation, thank you!

  • @trivendrareddy8236
    @trivendrareddy8236 6 months ago

    Thank you sir great explanation

  • @minkijung3
    @minkijung3 1 year ago

    Amazing. Thanks a lot. Your explanation is so clear. Please keep making videos professor!🙏

  • @عمرعلام-ز7د
    @عمرعلام-ز7د 1 year ago

    Really Great explanation. Thanks Prof. ♥

  • @DarkGoatLord
    @DarkGoatLord 6 months ago

    you saved my life

  • @kevinkevin7900
    @kevinkevin7900 3 months ago

    AMAZING!!

  • @itmesneha
    @itmesneha 2 months ago

    thank you so so much for this video!

  • @genericchannel8589
    @genericchannel8589 1 year ago

    Awesome explanation!! Thank you for your effort :)

  • @bakhoinguyen5156
    @bakhoinguyen5156 1 year ago

    Thank you!!!

  • @sharmashikhashikha3
    @sharmashikhashikha3 1 year ago

    You are a star!

  • @AymanFakri-ou8ro
    @AymanFakri-ou8ro 11 months ago

    very nice! thank you!

  • @gnull
    @gnull 4 months ago

    tfw you click on a video and it's your old college professor lmao

    • @fluffsquirrel
      @fluffsquirrel 4 months ago

      That's awesome! Haven't seen anything by my professors other than what they shared in class, but you never know

  • @amitabhachakraborty497
    @amitabhachakraborty497 1 year ago

    Best Explanation

  • @adityabhatt4173
    @adityabhatt4173 11 months ago

    Great Explanation !!!!

  • @efeburako.9670
    @efeburako.9670 5 months ago

    Thx dude u are awesome!

  • @nilishamp245
    @nilishamp245 1 year ago

    you are brilliant!! Thank you for explaining this so well!!!!❤❤❤

  • @happyvioloniste08
    @happyvioloniste08 1 year ago

    Thank you 👏👏

  • @charlesd4572
    @charlesd4572 1 year ago

    Superb!

  • @SatyamAnand-ow4ub
    @SatyamAnand-ow4ub 1 year ago

    Awesome explanation

  • @zanzmeraankit4820
    @zanzmeraankit4820 1 year ago

    Got meaningful insights from this video

  • @sashimiPv
    @sashimiPv 1 year ago

    Prof. Bryce is the GOAT!

  • @wouladjecabrelwen1006
    @wouladjecabrelwen1006 1 year ago

    Who is this teacher? Damn he is good. Thank you

  • @axe863
    @axe863 1 year ago

    Loss landscape looking super smooth .....

  • @sajedehtalebi902
    @sajedehtalebi902 1 year ago

    It was clear and useful. Tnx a lot

  • @lovenyajain6026
    @lovenyajain6026 11 months ago

    Wow. Thank you

  • @kkjun7157
    @kkjun7157 2 years ago +2

    This is such a clean and helpful video! Thank you very much! The only thing I still don't know is: during backpropagation, we now have two sets of gradients for each block, one for going through the layers and one for going around the layers, so how do we know which one to use to update the weights and biases?

    • @csprof
      @csprof  2 years ago +1

      Good question. For any given weight (or bias), its partial derivative expresses how it affects the loss along *all* paths. That means we have to use both the around- and through-paths to calculate the gradient. Luckily, this is easy to compute because the way to combine those paths is just to add up their contributions!
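
      To see the two contributions adding up, here is a minimal sketch, assuming PyTorch (an illustration, not the video's code): with y = Wx + x, the gradient that reaches x is the through-path term plus the identity term from the skip.

        import torch

        # One-layer "residual block": y = W x + x, with W fixed to the identity.
        x = torch.ones(3, requires_grad=True)
        W = torch.eye(3)
        y = W @ x + x            # through-path plus around-path
        y.sum().backward()

        # d(sum(Wx + x))/dx = W^T 1 + 1; with W = I, every entry is 2.
        print(x.grad)            # tensor([2., 2., 2.])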

  • @AsilKhalifa
    @AsilKhalifa 6 months ago

    Thanks

  • @newbie8051
    @newbie8051 1 year ago

    Couldn't understand how we can treat the shape mismatch at 13:40.
    Great lecture nonetheless, thank you sir!! Understood what Residual Networks are 🙏

  • @paulocezarcunha
    @paulocezarcunha 7 months ago

    great!

  • @praveshbudhathoki736
    @praveshbudhathoki736 5 months ago

    Thanks for the nice explanation.
    But I have one query: at 16:00, where you said "each output neuron gets input from every neuron across the depth of the previous layer", doesn't that make each output-depth neuron the same?

  • @kranthikumar9998
    @kranthikumar9998 1 year ago

    @csprof, By consistently including the original information alongside the features obtained from each residual block, are we inadvertently constraining our ResNet model to closely adhere to the input data, possibly leading to a form of over-memorization?

  • @wege8409
    @wege8409 9 months ago

    10:10
    Concerns: shape mismatch
    nervous sweating

  • @mohammadyahya78
    @mohammadyahya78 1 year ago

    Thank you very much. I am not sure yet how residual blocks lead to faster gradient passing when the gradient has to go through both paths. As I understand it, this adds more overhead to computing the gradient; please correct me if I am wrong. Also, can you please explain more about how 1x1 convolutions reduce the depth, or make a video if possible? For example, I am not sure how the entire depth, say of size 255, gives output to one neuron.

    • @csprof
      @csprof  1 year ago +3

      You're right that the residual connections mean more-complicated gradient calculations, which are therefore slower to compute for one pass. The sense in which it's faster is that it takes fewer training iterations for the network to learn something useful, because each update is more informative. Another way to think about it is that the function you're trying to learn with a residual architecture is simpler, so your random starting point is a lot more likely to be in a place where gradient descent can make rapid downhill progress.
      For the second part of your question, whenever we have 2D convolutions applied to a 3D tensor (whether the third dimension is color channels in the initial image, or different outputs from a preceding convolutional layer) we generally have a connection from *every* input along that third dimension to each of the neurons. If you do 1x1 convolution, each neuron gets input from a 1x1 patch in the first two dimensions, so the *only* thing it's doing is computing some function over all the third-dimension inputs. And then by choosing how many output channels you want, you can change the size on that dimension. For example, say that you have a 20x20x3 image. If you use 1x1 convolution with 8 output channels, then each neuron will get input from a 1x1x3 sub-image, but you'll have 8 different functions computed on that same patch, resulting in a 20x20x8 output.

  • @spotlessmind9263
    @spotlessmind9263 6 months ago

    Isn't this similar to RNNs, where subsets of data are used for each epoch? And in a residual network, a block of layers is injected with a fresh signal, much like boosting.

  • @anirudhsarma937
    @anirudhsarma937 1 year ago

    Can you please talk about GANs and, if possible, Stable Diffusion?

  • @maindepth8830
    @maindepth8830 4 months ago

    I am still confused.
    What are the prerequisites for understanding this video?

    • @VenchislavCodes
      @VenchislavCodes 4 months ago +1

      I would recommend reading the official paper, "Deep Residual Learning for Image Recognition". I found the explanations there pretty clear, plus there are videos on YouTube explaining this paper.

  • @rayananwar8106
    @rayananwar8106 7 months ago

    Do you mean that ResNet is just a skip connection, not an individual network?

  • @WorkStation-t7n
    @WorkStation-t7n 1 year ago

    👍

  • @johnberkcg
    @johnberkcg 3 months ago

    Speaking about the error value and calling it a "loss" value, using that term out of its original context, makes this confusing to the new learner...

  • @EeniyahShelmon
    @EeniyahShelmon 5 months ago

    Free books

  • @dapphari007
    @dapphari007 17 days ago

    Anna university

  • @1991liuyangyang
    @1991liuyangyang 8 months ago

    Great explanation, simple and straightforward.

  • @chetan-g9t
    @chetan-g9t 1 year ago

    Brilliant explanation. Thank you!