Backpropagation in CNN - Part 1

  • Published: 5 Aug 2024
  • Backpropagation in CNN is one of the most difficult concepts to understand, and I have seen very few people actually producing content on this topic.
    So here in this video, we will understand Backpropagation in CNN properly. This is part 1 of the tutorial, and in it we will look only at Backpropagation for the Convolution Operation. In part 2, we will see how the gradients propagate backward through the entire architecture.
    All Deep Learning frameworks implement Backpropagation for CNNs automatically. But as we humans are curious, we want to know how it works rather than leave it as something done automatically.
    So buckle up! And let's understand Backpropagation in CNN.
    ➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
    Timestamp:
    0:00 Intro
    1:49 What to obtain
    4:22 dL/dK
    11:46 dL/dB
    13:20 dL/dX
    18:51 End
    ➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
    📕 PDF notes for this video: bit.ly/BackPropCNNP1
    ➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
    Follow my entire playlist on Convolutional Neural Network (CNN) :
    📕 CNN Playlist: • What is CNN in deep le...
    At the end of some videos, you will also find quizzes 📑 that can help you to understand the concept and retain your learning.
    ➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
    ✔ Complete Neural Network Playlist: • How Neural Networks wo...
    ✔ Complete Logistic Regression Playlist: • Logistic Regression Ma...
    ✔ Complete Linear Regression Playlist: • What is Linear Regress...
    ➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
    If you want to ride on the Lane of Machine Learning, then Subscribe ▶ to my channel here: / @codinglane

Comments • 139

  • @PranjS 10 months ago +27

    You will be remembered by many future developers as the person who helped them clear their concepts and probably also the person who helped them crack their job in this field. A big thanks for your work!

    • @CodingLane 10 months ago +1

      Wow… so glad to hear this Pranjal. Thank you so much for your words 😇

  • @waleedrafi7977 2 years ago +46

    Even though I have already watched your videos, because of your hard work and the time you spend on research, I also watch the whole video from my other accounts just to support you and increase your watch time, so you don't stop making videos. I highly encourage you to continue your hard work; one day you will have a big audience.

    • @CodingLane 2 years ago +6

      I don't even know how to react to this great comment 😄. Probably the best comment I have seen. Thank you so much. It really means a lot to me. I will keep on making such videos!

  • @daniteka 8 months ago +3

    I wish I could show my appreciation by liking your videos multiple times, but unfortunately, the system limits us to a single like per video.
    Thank you so much for sharing your knowledge and expertise!

  • @muskanmahajan04 2 years ago +11

    Probably the best explanation of CNN Backprop there is on YouTube, thank you Jay!

    • @CodingLane 2 years ago

      You're welcome. Glad to help!

  • @user-pg4ui4us1u 9 months ago +1

    An absolutely perfect playlist that provided a wealth of insights - a huge thanks!

  • @nadiaaouadi4266 1 year ago

    Thanks for the whole series. I hope you continue uploading videos in the future because they are amazing! *I never comment on videos but this was totally worth it*

  • @nurabba3273 10 months ago +1

    Hello brother. I know I am not the first to be enlightened by your educational videos. But I will say this: you have saved my days by giving me so much insight into machine learning. Thank you

  • @locostineverything4810 7 months ago

    Thanks for all the content so far, appreciate it!

  • @shrotkumarsrivastava2441 1 year ago

    This is awesome and rare. Really appreciate the work you have put in here. Thanks for it. Keep rocking.

  • @jordiwang 1 year ago

    Again bro, it's quite clear and I am loving it, appreciate it

  • @mariap.9768 1 year ago

    Excellent work, I like how you get to the point quickly.

  • @avisinh7249 1 year ago

    Best example and explanation I have seen. Thanks!

  • @MrAMerang 5 months ago

    BEST YouTube video on this topic, thank you very much.

  • @WilliamJung98 1 year ago

    This is incredibly helpful. Thank you so much.

  • @devvratsahai2068 1 year ago +1

    you're a life saver man!!

  • @bevel1702 2 years ago +2

    Very down to the point and descriptive, didn't think I'd be able to understand this but the way you described it made it crystal clear. Good work!

  • @jaiminjariwala5 1 year ago

    Best Explanation Ever!
    Thank You so much Brother!

  • @user-ob5ts8yc7v 1 year ago

    I'm a Korean subscriber; your content is very helpful for me. Thank you!

  • @Maciek17PL 1 year ago

    Awesome video, crystal clear explanation!!!

  • @kumkumupreti 7 months ago

    Thank you so much... you really make every concept crystal clear ❤

  • @farrugiamarc0 4 months ago

    Very well explained despite this being a very complex topic and very challenging to teach. Well done!

  • @Pierredefermatteee 2 years ago +1

    Thank you for the excellent content that you provide at no cost. Please continue making such precious videos; I know you will rock it.

    • @CodingLane 2 years ago

      Thank You for your kind words!

  • @thienlu7011 1 year ago

    Thank you a lot! Very intuitive explanation.

  • @arashroshanpoor1682 3 months ago

    This is the best explanation; just what I was looking for. Thanks

  • @lz-ym5eq 8 months ago

    Thank you for the explanation. It helped me a lot

  • @emanuel8418 2 years ago +1

    I just can't describe how much you helped me with this video. Thank you so much.

  • @user-oq7ju6vp7j 9 months ago

    Thank you for your videos! They are very helpful for those of us who are building NNs from scratch.

  • @shahnewazchowdhury4175 1 year ago

    This is a fantastic video. Keep up the great work!
    It's surprising this does not have several hundred thousand views already.

  • @AnkurKumar-dc3db 1 year ago +1

    Finally, after watching this video, I now fully understand backpropagation through CNNs. Thanks man for creating this video.

  • @jackrozmaryn7905 3 months ago

    Jay, excellent, excellent, excellent, and excellent. Thank you!

  • @Aca99100 2 years ago +8

    Hey Coding Lane! This tutorial was a life saver for me and just what I was looking for. As you said, there aren't many sources on the internet that explain backprop in CNNs, especially in this depth.
    Thank you for this video, you got yourself likes on both parts and a new sub! Keep doing what you're doing!

    • @CodingLane 2 years ago

      Happy to help! And thank you for the comment 😇

  • @alexsemchenkov5740 2 years ago +1

    The best explanation on the internet! Thanks!

    • @CodingLane 1 year ago

      Thank you. Much appreciate your words!! 🙂😄

  • @vandanvirani167 2 years ago +1

    Dude, really great. I was searching all over the internet to find a perfect explanation of CNN backpropagation, but no one explains it like you, not even the best teachers on YouTube.

    • @CodingLane 2 years ago

      Thank you. Glad I could help 😇

  • @anshulsaini5401 2 years ago +2

    Man, this is probably the best video I have seen so far. Literally, thanks for this video. I had been struggling with backpropagation for 2 weeks and today I feel like I know everything lmao. Hats off to you brother, keep grinding, keep rising. Definitely sharing this playlist (especially this video) in my circle!!

    • @CodingLane 2 years ago +1

      Thank you so much for your comment. I feel very good seeing this. Glad to see that it helped you. Comments like these keep me going and make me create more such content 😇.

  • @josephj1643 1 year ago +3

    Hey, thank you for making this video. Most people on YouTube have explained CNNs but have not explained how backprop works in a CNN, saying it is similar to a DNN.
    I had been looking for this video for a week; it really helped. Do continue making such videos!!!

    • @CodingLane 1 year ago

      Glad it was helpful! Sure, will keep uploading more videos. 🙂

  • @BlackmetalSM 1 year ago

    Even ChatGPT was not as clear as you. Great job!

  • @SharathsooryaBCS 1 year ago

    Sir, really wonderful explanation. I got the concept on the very first attempt. Thank you so much, sir.

  • @naseeruddin832 2 years ago +1

    At last, found the real backprop of CNN. Thanks buddy

    • @CodingLane 2 years ago

      You're welcome! Glad it helped.

  • @tnmyk_ 2 years ago +1

    Amazing lecture! Very well explained! Keep up the good work, man!

  • @learningfoundation6601 1 year ago

    Amazing content, keep uploading

  • @user-do3mw8yx4y 1 year ago

    Thank you so much!!

  • @jameshaochi0824 1 year ago

    Thank you so much.

  • @adelzier4264 11 months ago +1

    You are the best, bro! Wishing you all the best

    • @CodingLane 11 months ago +1

      Hey… thank you!

  • @getisbhai 1 year ago

    amazing video bhai

  • @omerihtizaz9043 2 years ago +1

    Great video, Keep up the good work!

  • @semon00 15 days ago +2

    That's a very good explanation

  • @sushantregmi2126 1 year ago +1

    your content is unreal, thank you very much...

    • @CodingLane 1 year ago

      Thank you so much. Glad you find it helpful! 😀

  • @mukandrathee 4 months ago +1

    You made it so easy. Thanks

  • @ce108meetsaraiya4 4 months ago +1

    This is the best explanation of backpropagation in CNN

  • @ummayhaney4162 2 months ago

    Thank you❤💙

  • @enricollen 1 year ago

    Thanks sir, appreciate it

  • @VR-fh4im 1 year ago

    Brilliant.

  • @shreedevisindagi888 6 months ago

    Thank you.

  • @ahmedhesham7537 26 days ago

    Thank you so much, bro

  • @a.k.103 2 years ago +1

    Bhai, tere liye mere paas koi shabd nahi hai (I don't have any words for you). I have never seen content like this. I could write a comment like this on every video, but it would still be too little compared to your hard work. Keep this up, I will always support you. Love you, bro.

    • @CodingLane 2 years ago

      Thank you so much 😊. I am really glad that my content is this valuable. This is one of the most loving comments I have seen.

  • @firstkaransingh 1 year ago

    Good explanation bruh

  • @cobblin_gock 2 years ago +1

    Awesome work. Well explained.

  • @qaiserali6773 1 year ago +1

    Lovely!!

  • @jgg0207 2 years ago +1

    'preciate this!! Very helpful

  • @rimeblb573 2 months ago

    Thank you

  • @waleedrafi1509 2 years ago +1

    Great Video.

  • @joelrcha3368 2 years ago +1

    Great explanation

  • @dhawalpatil7779 1 year ago +1

    Hey Coding Lane, I have only 2 words to say: YOU ROCK 🎉

    • @CodingLane 1 year ago

      Hahaha… Thanks a lot! Cheers 🎉

  • @RH-mk3rp 1 year ago +1

    You are an AMAZING PERSON, THANK YOU.

    • @CodingLane 1 year ago

      You are so welcome! Thank you! 🙂

    • @RH-mk3rp 1 year ago +1

      @@CodingLane I retract my statement. Your convolution backpropagation assumes batch_size=1 and channels=1, perhaps to simplify the problem, but in practice this is never the case. If you explain all that in another video then I apologize, but thus far I have found none.

    • @CodingLane 1 year ago

      @@RH-mk3rp Hi, it would have been difficult to understand the backpropagation if batch_size were taken greater than 1. The purpose of this video was to help viewers understand how backpropagation works. But I hope you find and understand the solution for batch_size > 1 and channels > 1 😇
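
      For anyone looking for that generalization: below is an illustrative sketch (not code from the video; stride 1, "valid" convolution, and an NHWC layout are assumed) showing that the batched, multi-channel case mostly adds loops and sums — kernel and bias gradients accumulate over the batch, and each (input channel, filter) pair gets its own 2D gradient slice.

      ```python
      import numpy as np
      from scipy.signal import correlate2d, convolve2d

      # Illustrative sketch (assumptions: stride 1, "valid" convolution, NHWC layout).
      # X:  (batch, H, W, C_in)             layer input
      # K:  (kh, kw, C_in, C_out)           kernels
      # dZ: (batch, H-kh+1, W-kw+1, C_out)  upstream gradient dL/dZ
      def conv_backward_batched(X, K, dZ):
          dK = np.zeros_like(K)
          dX = np.zeros_like(X)
          db = dZ.sum(axis=(0, 1, 2))          # one bias gradient per filter
          batch, C_in, C_out = X.shape[0], K.shape[2], K.shape[3]
          for b in range(batch):               # gradients accumulate over the batch
              for f in range(C_out):
                  for c in range(C_in):
                      dK[:, :, c, f] += correlate2d(X[b, :, :, c], dZ[b, :, :, f], mode="valid")
                      dX[b, :, :, c] += convolve2d(dZ[b, :, :, f], K[:, :, c, f], mode="full")
          return dK, db, dX
      ```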

  • @John-wx3zn 3 months ago

    Thank you. Why is dL/dX being found when backpropagation only updates the kernel weight matrix and the bias scalar?

  • @ASdASd-kr1ft 1 year ago

    Why does the sum appear in the chain rule for the derivative of a matrix with respect to another one?
    Do you know any source where I can figure out how this works? Thanks

  • @gauravshinde8767 6 months ago

    Doing a Master's in AI in Ireland.
    Lectures: understood nothing
    Coding Lane videos: understood everything
    Teaching is a skill, and you've got it, bro

  • @user-fz3ui4zp6h 2 years ago +1

    I caught this logic after investing a lot of time. However, I can't understand one aspect:
    In the fully connected model, the model receives the INPUT, we do the forward pass to get the loss, and based on it we fix each weight in the model using backpropagation.
    During this process we never adjust the INPUT, because the INPUT can vary, and what we need is to train our model to predict.
    As I understand it, in a convolutional model the kernel plays the same role as the weights in the fully connected model, and during training our goal is to configure the kernel for better prediction.
    What I can't really understand is why we would fix the INPUT data in the convolutional model: the INPUT always changes, while the kernel stays. Based on this logic, we only need to configure the kernel.
    Could you explain, please?

  • @DangNguyen-yh5mm 2 years ago +2

    Amazing job. I hope you make an example of a simple CNN with weight updates across these layers (conv -> relu -> pool -> ..., just a few layers). Thank you for this video.

    • @CodingLane 2 years ago +1

      Yes... that is coming in the next video

  • @punamkhandar2678 4 months ago

    Informative

  • @gajendrasinghdhaked 8 months ago +1

    insane content

  • @lodemrakesh7092 2 years ago +1

    Good one

  • @user-lv2pg5so2j 1 year ago +1

    Nice video, sir

  • @ness3963 2 years ago +1

    Thank you bhai 🙁

  • @John-wx3zn 3 months ago

    Thank you. Why didn't you show the activation function step?

  • @harryt5878 1 year ago +2

    This is super useful, thank you! However, I am slightly stuck finding dL/dK. For example, in my neural network the previous layer has output 14x14x20 and the convolutional layer uses 20 filters of size 3x3x20, so dL/dK needs to be of size 3x3x20x20. But applying convolution(X, dL/dZ) gives an output of size 3x3x20, since dL/dZ has size 12x12x20. How do I fix this?

    • @user-oq7ju6vp7j 9 months ago

      Hi. Did you find out how to solve it? I have the same problem
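
      For later readers: the 3x3x20x20 shape works out if each (input channel, filter) pair gets its own "valid" cross-correlation of the corresponding input slice with that filter's dL/dZ slice. An illustrative sketch (not from the video), assuming stride 1:

      ```python
      import numpy as np
      from scipy.signal import correlate2d

      # Shapes from the question above: X is the 14x14x20 layer input, dZ the
      # 12x12x20 upstream gradient, and 20 filters of size 3x3x20 give a dK of
      # shape (3, 3, 20, 20). Assumes stride 1 and "valid" convolution.
      X = np.random.randn(14, 14, 20)
      dZ = np.random.randn(12, 12, 20)
      dK = np.zeros((3, 3, 20, 20))
      for f in range(20):          # loop over the 20 filters
          for c in range(20):      # loop over the 20 input channels
              dK[:, :, c, f] = correlate2d(X[:, :, c], dZ[:, :, f], mode="valid")
      print(dK.shape)              # (3, 3, 20, 20)
      ```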

  • @arpit743 2 years ago +1

    Hi Jay! Can you please explain the Einstein summation convention part? My issue is that the convention says M_ij = Σ_k (A_ik · B_kj), but in the video you wrote dL/dK_mn = Σ (dL/dZ_ij · dZ_ij/dK_mn). Is this equation consistent with the Einstein convention, given that ij appears in both factors?

    • @CodingLane 2 years ago +2

      Hello... yes, it is still consistent with the Einstein convention. The convention just says that if you have ij appearing in cross positions (the numerator of one factor and the denominator of the other), it implies a summation over those terms.
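
      Written out (with a 1-based indexing chosen here for illustration, not necessarily the video's exact notation), the sum appears because L depends on K_mn through every Z_ij:

      ```latex
      \frac{\partial L}{\partial K_{mn}}
        = \sum_{i}\sum_{j}
          \frac{\partial L}{\partial Z_{ij}}\,
          \frac{\partial Z_{ij}}{\partial K_{mn}},
      \qquad
      Z_{ij} = \sum_{p}\sum_{q} X_{i+p-1,\,j+q-1}\,K_{pq} + B
      \;\Rightarrow\;
      \frac{\partial L}{\partial K_{mn}}
        = \sum_{i}\sum_{j}
          \frac{\partial L}{\partial Z_{ij}}\, X_{i+m-1,\,j+n-1}
      ```

      That last expression is exactly a "valid" cross-correlation of X with dL/dZ. Note the dummy indices p, q inside Z_ij: the derivative with respect to the free indices m, n picks out a single term, which is why ij remains as the only summed pair.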

  • @fitrinailahanwar4102 1 year ago +1

    Clear explanation, thank you sir, but please add subtitles so it can be understood easily. Thank you if you read this suggestion

    • @CodingLane 1 year ago

      Hi, YouTube somehow ran into an error adding subtitles to this video. Sorry for the inconvenience. But glad you found it helpful!

  • @spyder2374 2 years ago +2

    Ultra nice explanation... 👍
    Q: why do we update kernels during backpropagation? Kernels should be fixed, right?
    Say a kernel is a vertical edge detector; after backpropagation through it, what will it become?

    • @CodingLane 1 year ago +1

      Hi, kernels should not be fixed, because you don't know which kernel is detecting which kind of feature. Vertical edge detection was just an example to show that kernels detect edges; in a real model, we don't know which kernel detects which kind of edge. The model decides that itself. That is why we update kernels through backpropagation, so that they automatically take appropriate values to identify features. Manually setting the values of so many kernels would be a very tedious job, and if you change your model and add new kernels, you would have to set those values as well. Hope the answer helps. And sorry for this really late reply.

  • @siddhanthsridhar4742 1 year ago

    Hi,
    at 15:30 the equation for dL/dX12 is written incorrectly: it should be dL/dZ11 · K11 instead of dL/dZ12 · K11
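
    For readers re-deriving this themselves: under the 1-based, stride-1, "valid" indexing used in the sketch above (an assumed convention, with a 3x3 input X, a 2x2 kernel K, and a 2x2 output Z where Z_ij = Σ_{p,q} X_{i+p-1, j+q-1} K_pq + B), the entry X_12 is touched by Z_11 through K_12 and by Z_12 through K_11, giving

    ```latex
    \frac{\partial L}{\partial X_{12}}
      = \frac{\partial L}{\partial Z_{11}}\,K_{12}
      + \frac{\partial L}{\partial Z_{12}}\,K_{11}
    ```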

  • @rahulkumarjha2404 1 year ago

    Great video.
    I just have one doubt:
    why are we calculating dL/dX? I mean, using backpropagation we only update the weights, biases, and the terms in the filter matrix.
    Please answer.

    • @CodingLane 1 year ago

      Hi, yes, the end goal is to update weights and biases. dL/dX helps us calculate dL/dW and dL/dB of the previous layer. Check out the 3:27 timestamp: dL/dX becomes dL/dZ for the previous layer.
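
      To make this concrete, here is a minimal numerical sketch (illustrative only, not the video's code) of the single-channel, stride-1 case: dL/dK and dL/dB update this layer, while dL/dX is handed back as the previous layer's dL/dZ.

      ```python
      import numpy as np
      from scipy.signal import correlate2d, convolve2d

      # Minimal sketch, assuming Z = correlate2d(X, K, "valid") + B with a
      # scalar bias B (single channel, stride 1).
      def conv_backward(X, K, dZ):
          dK = correlate2d(X, dZ, mode="valid")   # dL/dK: "valid" cross-correlation of X with dL/dZ
          dB = dZ.sum()                           # dL/dB: sum of all entries of dL/dZ
          dX = convolve2d(dZ, K, mode="full")     # dL/dX: "full" convolution, i.e. correlation
                                                  # with the 180°-rotated kernel
          return dK, dB, dX

      X = np.random.randn(5, 5)                   # this layer's input
      K = np.random.randn(3, 3)                   # kernel
      dZ = np.random.randn(3, 3)                  # upstream gradient dL/dZ
      dK, dB, dX = conv_backward(X, K, dZ)
      print(dK.shape, dX.shape)                   # (3, 3) (5, 5): dX is what the
                                                  # previous layer receives as dL/dZ
      ```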

  • @mounmountain141 1 year ago +1

    I would like to ask if there is any material for learning what happens when stride != 1 or there is dilation

    • @CodingLane 1 year ago

      I found one article which showed that… but unfortunately I have lost it now. Maybe you can find it on Google.

    • @mounmountain141 1 year ago

      ​@@CodingLane OK, Thank you for this very helpful video.

  • @axitamohanta6743 1 year ago

    And what about dL/dZ?

  • @samueljohanes5219 1 year ago

    What is L?

  • @puchaharinathreddy5556 1 year ago

    Can you give us an example?

  • @ramchandhablani9834 1 year ago

    Hey Jay, at 3:34 I think you have written a wrong equation: Z11 = X11·K11 + X12·K12 + X21·K21 + X22·K22 + B. If B is a 2x2 matrix, you cannot add it to the scalar Z11 😣

  • @semon00 15 days ago +1

    Likeeeeee ❤

  • @surender_kovvuri 6 months ago

    Bro, can you provide the Python code for this CNN model?

  • @haidarrmehsen 10 months ago

    Am I missing something, or did he not mention what L is?

    • @gamerx3582 3 months ago

      It's the loss function

  • @Jeffrey-uw8un 1 year ago

    MATHHHHHHHHHHHHHHHHHHHHH TOOOOOO MUCH MATTTTTTTTTTTTTHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH

  • @malanosi2869 6 months ago +1

    For God's sake, please buy a mic.
    I love your lectures but can't hear a damn thing in some clips

    • @CodingLane 6 months ago

      Hey, thanks. I have got one, and will use it when I create new videos 😊

  • @berwinamir5325 2 years ago +1

    Thank you a lot, this is really very, very helpful 🤩🙌

    • @CodingLane 1 year ago

      You're welcome. Glad I could be of help! 😄🙂