Backpropagation in RNN | Backpropagation through time

  • Published: 24 Dec 2024

Comments • 38

  • @meklitmaki5605 • 2 years ago +3

    Finally I got an easy-to-understand explanation of BPTT. Thank you so much!

  • @junwang3375 • 2 years ago +3

    Thanks! This video deserves more exposure.

  • @kamrankhan-kk3en • 1 year ago +2

    Thank you Jay Patel, it is really interesting to watch your videos.

  • @hyukppen • 2 years ago +3

    Very helpful to me. Thank you!!

  • @yachiru9336 • 2 years ago +3

    Hey your videos are helping me and my friends study for our exam. Thank you for creating these!

  • @ducgia1493 • 1 year ago +2

    It's just the concatenation game 😮. We're just concatenating two feedforward neural networks here.
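
A minimal NumPy sketch of the "concatenation" view in the comment above, using hypothetical weight names Wax, Waa, Wya (not necessarily the video's notation): unrolled in time, the RNN is the same feedforward layer applied repeatedly, each step taking the current input together with the previous hidden state.

    import numpy as np

    def rnn_forward(x_seq, Wax, Waa, Wya, a0):
        # Unrolled RNN: one feedforward step per time step, chained together.
        a = a0
        for x in x_seq:
            a = np.tanh(Wax @ x + Waa @ a)  # previous state feeds the next "layer"
        return Wya @ a                      # many-to-one output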

  • @tianyixu2649 • 1 year ago +1

    OMG I JUST GOT THIS!!! Been looking everywhere for how adding and multiplying work together in BPTT... OMG THANKS!

  • @korrapatisrujan7943 • 9 months ago +1

    After innumerable days of watching countless videos trying to understand this concept, I stumbled upon the right one.

  • @zemariamm • 1 year ago +2

    THANK YOU!! Very, very helpful for understanding the summation while calculating Waa.
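
The "adding and multiplying" and the summation mentioned in the comments above can be made concrete in a scalar toy model (a sketch under assumed names waa, wax, not the video's exact notation): each time step contributes one term to dL/dwaa, the terms are added, and each term is a chain of da_t/da_{t-1} factors multiplied back through time.

    # Scalar RNN: a_t = waa * a_{t-1} + wax * x_t, loss L = 0.5 * (a_T - y)**2.
    def dL_dwaa(x_seq, waa, wax, y, a0=0.0):
        a = [a0]
        for x in x_seq:                       # forward pass, storing states
            a.append(waa * a[-1] + wax * x)
        dL_da = a[-1] - y                     # dL/da_T
        grad, chain = 0.0, 1.0
        for t in range(len(x_seq), 0, -1):
            grad += dL_da * chain * a[t - 1]  # ADD one term per time step
            chain *= waa                      # MULTIPLY in da_t/da_{t-1} = waa
        return grad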

  • @v1hana350 • 2 years ago +2

    Can you explain the attention model and why it uses the words 'query, key, value'?

    • @MachineLearningWithJay • 2 years ago +1

      I will try to cover that topic if I can. Thanks for the suggestion.

    • @v1hana350 • 2 years ago

      Thanks for your reply. I am waiting for the video to clarify my doubts about the attention model.

    • @michaelvangulik85 • 2 years ago

      lolololsdfnasldfjsdilgjspdjgkx, I DIDN'T HEAR ANY OF THOSE IMPORTANT KEY TERMS GET MENTIONED AT ALL? PLEASE STOP, YOU'RE GOING NUTS, OK? THIS IS ALL IMPORTANT VERY SMALL WORDS BEING PUT ACROSS MAKE SURE YOU PULL BACK AND STOP BLOWING A COVER GASKET! IF I CAN LET YOU IN, PLEASE DON'T TALK TO ANYONE OK THIS IS IMPORTTANT LMFCO LFOASHOEJGAEGJKDKDIJFKDLOLOLOLO
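
The video does not cover attention, but for the query/key/value question in this thread, here is a minimal NumPy sketch of scaled dot-product attention: each query is scored against every key, and the softmaxed scores weight the values.

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)                     # query-key similarity
        w = np.exp(scores - scores.max(-1, keepdims=True))  # numerically stable softmax
        w /= w.sum(-1, keepdims=True)
        return w @ V                                        # weighted sum of values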

  • @VR-fh4im • 1 year ago +1

    You made it easy.

  • @VSUKUMARBCE • 1 year ago

    Very helpful bro, for remembering RNNs and understanding what's going on inside the RNN. Thank you so much for creating the best videos on deep learning!

  • @ArchitStark • 5 months ago +1

    Can you change the order of the videos in the playlist? The LSTM video comes first and backpropagation comes next, but the LSTM video requires one to watch the backpropagation video first.

  • @jayasreeravi4580 • 4 months ago

    So, in this example, there is no hidden layer and no non-linear function at the previous time steps. Am I right?

  • @VR-fh4im • 1 year ago

    How will you do backpropagation for a many-to-many RNN? My loss function changes with each RNN cell.

  • @TomChenyangJI • 9 months ago

    I think there is something wrong with the dL/dW_aa equation. Based on the explanation earlier in the video, I think it should be SUM_i of the accumulated product (dL/dO * dO/da * ... * da_i/dW_aa). Am I right?

  • @convolutionalnn2582 • 2 years ago

    If there is more than one output y, what is the equation for it?
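
For the three many-to-many questions above, one common convention (an assumption here, not something stated in the video) is to define the total loss as the sum of the per-step losses, L = SUM_t L_t. Each L_t backpropagates through its own chain, and the gradients for the shared weights simply add across time steps, as in this sketch with hypothetical names Wax, Waa, Wya:

    import numpy as np

    # Many-to-many toy: a_t = tanh(Wax @ x_t + Waa @ a_{t-1}), y_t = Wya @ a_t,
    # total loss L = sum_t 0.5 * ||y_t - target_t||^2.
    def dL_dWaa(xs, targets, Wax, Waa, Wya, a0):
        a, states = a0, [a0]
        for x in xs:                               # forward pass, storing states
            a = np.tanh(Wax @ x + Waa @ a)
            states.append(a)
        dWaa = np.zeros_like(Waa)
        da_next = np.zeros_like(a0)                # gradient arriving from later steps
        for t in range(len(xs), 0, -1):
            dy = Wya @ states[t] - targets[t - 1]  # dL_t/dy_t for this step's loss
            da = Wya.T @ dy + da_next              # local loss term + later steps
            dz = (1 - states[t] ** 2) * da         # back through tanh
            dWaa += np.outer(dz, states[t - 1])    # shared weights: gradients ADD
            da_next = Waa.T @ dz                   # pass back through Waa
        return dWaa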

  • @-JunaidKhalidi • 2 years ago +2

    Bro, please make some videos on RNN projects (with Python coding), applying all these concepts. You are doing a great job. Thanks a lot!

  • @wuyirenn • 3 months ago +3

    It took me way too long to find this BPTT video... THANK YOU!

  • @tueikfdig • 5 months ago +1

    Thanks bro!

  • @bhavanisankarlenka • 4 months ago +2

    Thanks broo, you made it easier to understand than Krish Naik's video on this.

  • @arjundhar7729 • 1 year ago +2

    The maths is always the easy part, especially when writing equations. Conceptually, I feel much is left to be desired by the video. For instance, at what INSTANT is the backpropagation done? Nothing about sequence length is mentioned. There is no effort to run an actual example for a few iterations to show the working and map it to the concepts and then the maths. Neural architectures require a conceptual grasp.
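
On the "at what instant" point raised above, a common answer (an assumption here; the video does not specify it) is that gradients are computed once per sequence, or once per fixed-length window in truncated BPTT, where the window length is exactly the sequence-length knob the comment asks about. A scalar sketch:

    import numpy as np

    # Truncated BPTT: unroll k steps, backpropagate, update, then carry the
    # hidden state into the next window while cutting the gradient at the boundary.
    def train_truncated_bptt(xs, ys, k=4, lr=0.05, epochs=50):
        waa = wax = wya = 0.1
        for _ in range(epochs):
            a = 0.0
            for s in range(0, len(xs), k):
                states = [a]
                for x in xs[s:s + k]:                    # forward one window
                    states.append(np.tanh(wax * x + waa * states[-1]))
                dwaa = dwax = dwya = da_next = 0.0
                for t in range(len(states) - 1, 0, -1):  # backward one window
                    dy = wya * states[t] - ys[s + t - 1]
                    dwya += dy * states[t]
                    dz = (1 - states[t] ** 2) * (wya * dy + da_next)
                    dwaa += dz * states[t - 1]
                    dwax += dz * xs[s + t - 1]
                    da_next = waa * dz
                waa -= lr * dwaa
                wax -= lr * dwax
                wya -= lr * dwya
                a = states[-1]                           # carry state, not gradient
        return waa, wax, wya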

  • @spoc.mnmjecspringboardmnmjec • 1 year ago

    Please share the PDF.