Recurrent Neural Networks (RNNs), Clearly Explained!!!

  • Published: 23 Nov 2024

Comments • 791

  • @statquest
    @statquest  2 years ago +33

    To learn more about Lightning: github.com/PyTorchLightning/pytorch-lightning
    To learn more about Grid: www.grid.ai/
    Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/

    • @rathnakumarv3956
      @rathnakumarv3956 2 years ago

      Does the book include LSTM and RNN for bivariate time series input data as well?

    • @statquest
      @statquest  2 years ago

      @@rathnakumarv3956 Nope, just basic neural networks. My next book on deep learning will have more stuff about fancy neural networks.

    • @rathnakumarv3956
      @rathnakumarv3956 2 years ago +2

      @@statquest
      please include CNN, RNN and LSTM problems with multivariate time series as input and a continuous variable as output; these are very useful in climate change studies.

    • @statquest
      @statquest  2 years ago +1

      @Tech What time point, minutes and seconds, is confusing? And have you watched the entire series on Neural Networks before watching this one?

    • @Lyr00
      @Lyr00 1 year ago

      @statquest I have a question about the vanishing gradient problem. I understand that input1 from the first timestep has less and less impact on the output the more steps we take, but isn’t the gradient also relying on the new inputs at every timestep? I don’t understand why the gradient is vanishing if the new inputs aren’t discounted as heavily as the older timesteps. I imagined it’s more like the old inputs are less impactful and the network is more focused on the newer inputs but can still train normally. Is there something I’m missing?

  • @s0meus3r
    @s0meus3r 8 months ago +118

    The only place on the internet where you can actually grasp a complex topic before diving deeper into it. I am so grateful people like you exist. Thank you!

  • @wildpjah
    @wildpjah 1 year ago +75

    I'm in a deep learning class right now and the amount of straight math that my teacher throws at me is overwhelming. Videos like yours are incredible and I'm so thankful for the help and the color coding and the fun that makes it worth watching! It is super helpful as I'm studying for my midterm and just want to get a more definite grasp of what all this math actually means without reading someone's Mathematics PhD dissertation or something

    • @statquest
      @statquest  1 year ago +5

      Good luck on your midterm! :)

  • @mohammadhamed-ro9tx
    @mohammadhamed-ro9tx 1 year ago +17

    Every time I watch one of your lessons, I become sooo happy, because you make all the subjects easy to understand in a magical way. Thank you for your effort

  • @artem_isakow
    @artem_isakow 1 year ago +6

    Thanks to your series of videos on neural networks, I was able to pass the entrance exam for the PhD program at St. Petersburg State University.

    • @statquest
      @statquest  1 year ago +5

      Congratulations!!! TRIPLE BAM!!! :)

  • @jamesyoun1143
    @jamesyoun1143 2 years ago +11

    One of the most underrated channels. Never once have I had trouble understanding the intuition of whatever you explain. I'd donate money to you if I weren't a broke college student.

  • @toddgillies3380
    @toddgillies3380 1 year ago +27

    Never quite understood RNNs until I watched this video, thank you! A hand-calculated example of a one-to-one RNN is extremely hard to find online, so this was perfect. The only one out there, I believe.

  • @kirannbhavaraju5978
    @kirannbhavaraju5978 2 years ago +4

    This is the CLEAREST explanation of RNNs.

  • @elmehditalbi8972
    @elmehditalbi8972 1 year ago +3

    You can't understand how good this is. I've spent all of yesterday trying to understand these concepts but I couldn't grasp them. THANK YOU!!!

  • @sushankmishra53
    @sushankmishra53 8 months ago +1

    With this level of simplicity in teaching, even a high schooler could grasp these concepts, probably quicker than me!
    Scared of the future now....

  • @thecomputerpal221
    @thecomputerpal221 1 year ago +17

    Josh, I found your channel yesterday and have been binge watching. Incredible work in democratizing knowledge. Thankful for your work.

  • @charlescoult
    @charlescoult 1 year ago +3

    This explanation covers some very important points that were missed in several other lectures I've watched on this subject. Thank you for clearing things up for me.

    • @charlescoult
      @charlescoult 1 year ago +2

      For example, the note at 10:31

    • @statquest
      @statquest  1 year ago +1

      Thank you! Yes, that little detail is important, but often glossed over.

  • @LuizHenrique-qr3lt
    @LuizHenrique-qr3lt 2 years ago +78

    Josh!!!! I love u!!! I can't wait to learn about the Transformers!! thank you very much for your content

    • @statquest
      @statquest  2 years ago +4

      Thank you!

    • @capyk5455
      @capyk5455 2 years ago +4

      Josh teaching about transformers would be a blessing

    • @statquest
      @statquest  2 years ago +4

      @@capyk5455 I'm looking forward to it!

    • @shaiguitar
      @shaiguitar 2 years ago +1

      Transformers out yet or some ETA to expect?

    • @statquest
      @statquest  2 years ago +9

      @@shaiguitar The LSTM video comes out in the next week or so. Then I'll start working on transformers.

  • @shivanshgupta934
    @shivanshgupta934 2 months ago +1

    wow, just wow! 2 days of headache solved by a 17 min video! thank you for existing.😊

  • @CodingwithRayyan
    @CodingwithRayyan 6 months ago +2

    This is not fair, I literally am addicted to your style of teaching and find it quite hard to learn from other sources now.

    • @statquest
      @statquest  6 months ago +1

      Hopefully, in the long term, what you learn from these videos will make it easier to understand other sources. At least, that's starting to happen for me. It's still hard, though.

  • @ZaidAlhusainy
    @ZaidAlhusainy 1 year ago +7

    Your channel should be mandatory for all universities teaching AI 💖

  • @rw7154
    @rw7154 5 months ago +1

    StatQuest’s stylized scalar-based numerical examples are amazing even for learning beyond the introductory level. To get the full vectorized-matrix version of the algorithms, I just mentally swap x, w, b etc with the corresponding vectors and matrices, then it’s golden!

  • @svensvensson3679
    @svensvensson3679 6 months ago +1

    Our lecturer at the uni recommended us this video. I am amazed at how simply it is put. Great job! Both funny and informative ❤

  • @Ahmad_Alhasanat
    @Ahmad_Alhasanat 2 years ago +3

    I was looking for a small thing in RNNs, but your way of explaining kept me watching the entire video! And I subscribed to your channel!!

    • @statquest
      @statquest  2 years ago +2

      Hooray! Thank you very much! :)

  • @samama251
    @samama251 3 months ago +1

    First time commenting on a youtube video. You are a living legend. I spent two days trying to understand RNNs, then came across your video and Baam. The RNN concept is clear now.

    • @statquest
      @statquest  3 months ago

      Glad I could help!

  • @carleanoravelzawongso9786
    @carleanoravelzawongso9786 2 years ago +48

    I'm just in love with your content. I've watched your neural network series and it was just so easy to understand. You really deserve more subs and views Josh!

    • @statquest
      @statquest  2 years ago +3

      Thank you very much! :)

  • @chaitanyasharma6270
    @chaitanyasharma6270 2 years ago +9

    you definitely are the best teacher for machine learning and deep learning

  • @DavidBlayvas-wo4lj
    @DavidBlayvas-wo4lj 1 year ago +2

    You're gonna carry me through my neural networks class, what a godsend

  • @wongkitlongmarcus9310
    @wongkitlongmarcus9310 9 months ago +1

    Josh, you are the person who makes ML theory so understandable!

  • @pixel_yeast
    @pixel_yeast 7 months ago +52

    i come to listen to " peep poop poop"

  • @anonymousertugrul5858
    @anonymousertugrul5858 9 months ago +1

    I like how clearly and easily you explain concepts. Thank you very much!

    • @statquest
      @statquest  9 months ago

      You're very welcome!

  • @qwertz2167
    @qwertz2167 2 years ago +2

    True hero. I have an exam on the 29th about RNNs, LSTMs and transformers.

  • @v-sig2389
    @v-sig2389 2 years ago +2

    Thank you so much 😭
    People like you are the real MVPs of humanity!

  • @santiagocalvo
    @santiagocalvo 8 months ago +1

    honestly your channel is one of, if not the, best channels on all of youtube, thank you so much for this!

    • @statquest
      @statquest  8 months ago +1

      Wow, thank you!

  • @deepikasheshabutter4790
    @deepikasheshabutter4790 6 months ago +1

    i literally was having a mental breakdown because i was unable to understand things. your video helped me a lot and also brought a smile to my face :))

    • @statquest
      @statquest  6 months ago +1

      Glad I could help!

  • @TheChair610
    @TheChair610 3 months ago +2

    OH MY GOD THIS IS EXACTLY WHAT I WAS LOOKING FOR THANK YOU SO MUCH

  • @manishnarang6490
    @manishnarang6490 2 years ago +31

    Really looking forward to your LSTM video. You are a very good teacher!!

    • @statquest
      @statquest  2 years ago +2

      Thank you!

    • @WonPeace94
      @WonPeace94 2 years ago +5

      @@statquest when will you make the next vid? I have an exam in two weeks and I need your LSTM video

    • @statquest
      @statquest  2 years ago +2

      @@WonPeace94 :)

  • @curiosityspace8635
    @curiosityspace8635 10 months ago

    Oh man, I literally watch his videos like a web series. It's very fun and very easy to understand. Thank you very much sir !!!!😭😭

  • @cat-a-lyst
    @cat-a-lyst 1 year ago +1

    you are a very very very very very brilliant teacher! you are my low variance and low bias position.

  • @huruynegash4847
    @huruynegash4847 1 year ago +1

    Hello guys, let's always make this man happy, as he did for us!!!!!!!!!!!!! Nothing to say, just thanks a lot.

  • @anashaat95
    @anashaat95 2 years ago +8

    Very high level explanation. Waiting for the next video on Long Short-Term Memory networks. Thank you so much.

  • @BrunoJr09
    @BrunoJr09 1 year ago +1

    OMG, finally I understand the Vanishing/Exploding Gradient problem. Thank you StatQuest!

    • @statquest
      @statquest  1 year ago +1

      HOORAY!!! Thanks for supporting StatQuest!!! TRIPLE BAM! :)

  • @waizwafiq9481
    @waizwafiq9481 2 years ago +65

    Amazing video as always, professor! I can't wait for the video on LSTMs

    • @statquest
      @statquest  2 years ago +15

      You and me both!

    • @usamsersultanov689
      @usamsersultanov689 2 years ago

      @@statquest when it comes to applications of RNNs, the LSTM is sometimes a must-have :) that's why it would be great to have a clear explanation of LSTMs. But these are little things. In any case, thank you very much for the valuable knowledge that we can get here.

    • @statquest
      @statquest  2 years ago +13

      @@usamsersultanov689 I hope to have a video on LSTMs soon.

    • @james199879
      @james199879 2 years ago +5

      @@statquest One on Transformers and their variations would be even greater :D

    • @statquest
      @statquest  2 years ago +13

      @@james199879 That's the master plan, but we'll get there one step at a time.

  • @saharshayegan
    @saharshayegan 1 year ago +1

    This was the best explanation I've heard for RNNs!

  • @brandonso5477
    @brandonso5477 1 year ago +1

    why are you a master of everything???? I have been watching your videos for two years throughout my university course

  • @IsaacKLusuku
    @IsaacKLusuku 5 months ago +1

    Hey Josh...the way you made this so easy is mind blowing, I love you man, keep being awesome 😊

    • @statquest
      @statquest  5 months ago

      Thank you so much 😀!

  • @KayYesYouTuber
    @KayYesYouTuber 2 years ago +2

    Hi Josh, you are the best. Nobody has explained exploding gradients like you have. Thank you

  • @zytriesthings4540
    @zytriesthings4540 4 months ago +1

    Thank you brother, this was really intuitive and easy to understand

  • @broccoli322
    @broccoli322 2 years ago +1

    I'm a simple man, I see StatQuest, I click like. Can't wait for the videos on transformers.

  • @CharlieYou823
    @CharlieYou823 7 months ago +1

    PENTA BAM!!! The best pre-course !

  • @solotop5916
    @solotop5916 6 months ago +1

    Dude, that DOUBLE BAMM and TRIPLE BAMMM kill me. Actually a fun way to get info. Also great video, very easy to understand

  • @gabbye165
    @gabbye165 2 years ago +1

    12:22 is probably the cutest bam I've heard
    Also thank you for your videos! They have definitely been helping me get through my Bioinformatics grad course. You are AWESOME

    • @statquest
      @statquest  1 year ago +1

      Thank you so much and good luck with your course! :)

  • @8bit-ascii
    @8bit-ascii 2 years ago +1

    The way you explained RNNs made me so excited for LSTMs. Can't wait to see it!

  • @lore10010
    @lore10010 1 year ago +1

    Excellent project! I didn't think it would be so entertaining and informative with drawings. Definitely a very good video to start with!

  • @HEYTHERE-ko6we
    @HEYTHERE-ko6we 7 months ago +1

    Those tones, won won, bam, double bam, kaboom, and the fun way of learning open up the mind for grasping things real quick, and we can think freely without becoming nervous. You lord🙌

  • @sheltonsong6120
    @sheltonsong6120 2 years ago +2

    Clearly explained the difference between RNN and normal network, gradient vanishing/exploding! Looking forward to the LSTM and Transformer videos!!!

  • @osamahabdullah3715
    @osamahabdullah3715 2 years ago +1

    Literally before I watched your video, I hit like; that's how much I trust your information and knowledge. Thank you for your time and effort explaining this to us

  • @ai_station_fa
    @ai_station_fa 2 years ago +1

    Best explanation I've seen of RNNs. Thanks.

  • @AI_ML_DL_LLM
    @AI_ML_DL_LLM 2 years ago +1

    Great video, you must become the President of the ClearlyExplainedLand

  • @felipela2227
    @felipela2227 11 months ago +1

    The video was very well explained, I understood it easily. Thanks

    • @statquest
      @statquest  11 months ago

      Thank you very much! :)

  • @chrism3440
    @chrism3440 4 months ago +1

    Oh my gosh that was an amazing explanation. I'm quite literally flabbergasted. Thanks, mate!!

    • @statquest
      @statquest  4 months ago +1

      Glad you liked it!

  • @indusairaman2126
    @indusairaman2126 2 years ago +2

    You have an amazing way of explaining with ad-libs, loved it. Thank you so much; I was not able to understand at all, but now it is very clear

  • @AhmadAbuNassar
    @AhmadAbuNassar 7 months ago +1

    Any darn fool can make something complex; it takes a genius to make something simple (" Albert Einstein"), and you made it very very simple. Thanks!

    • @statquest
      @statquest  7 months ago

      Thank you very much! :)

  • @mahu1203
    @mahu1203 1 year ago +1

    Thank you so much. You make great videos... Just great teaching. Thanks a lot.

  • @loreii1982
    @loreii1982 1 year ago +1

    I saw a light turn on in my head! great video

  • @safiyajd
    @safiyajd 2 years ago +1

    Josh, you are amazing! Thank God you exist

  • @CarlosCaetanoJr
    @CarlosCaetanoJr 2 years ago +1

    Anxious for the LSTM, Transformers and Attention StatQuests!

    • @statquest
      @statquest  2 years ago +1

      It's already available to early access patreon supporters and will be available to everyone else soon.

  • @justLu__
    @justLu__ 10 months ago +1

    Hey, hopefully this will save my Deep Learning exam. And... love the sound effects.

    • @statquest
      @statquest  10 months ago

      Best of luck!

  • @TheGibberingGoblin
    @TheGibberingGoblin 1 year ago +1

    ... you sir are a timeless legend!

  • @iReaperYo
    @iReaperYo 2 years ago +3

    You don’t understand how good the timing of this is. Been struggling to explain the concept in detail in my MSc project.
    Are you doing a video on LSTM / GRU soon ??

  • @giorda77
    @giorda77 8 months ago +1

    Clear as day!!! Hooray!!!! Thank you Josh

  • @harishbattula9881
    @harishbattula9881 2 years ago +1

    Thanks a lot Josh. Every concept explained by you is a BAM!!!!!!!!!!

  • @exxzxxe
    @exxzxxe 2 years ago +1

    Your professorial ability is only exceeded by your singing!!

    • @statquest
      @statquest  2 years ago

      Thanks!

    • @exxzxxe
      @exxzxxe 2 years ago +1

      @@statquest When are you going to release your, surely to be a best-selling, book on singing lessons?

    • @statquest
      @statquest  2 years ago

      @@exxzxxe Just as soon as I win Eurovision.

  • @shofiyabootwala2094
    @shofiyabootwala2094 11 months ago

    the vanishing/exploding gradient problem is analogous to choosing the right value for alpha (the learning rate): choosing too large a value leaves us bouncing around, and choosing too small a value leads to more iterations of gradient descent

  • @sarahk13peace
    @sarahk13peace 1 year ago +1

    OMG I love you for your teaching style

  • @JulietNovember9
    @JulietNovember9 2 years ago +1

    Oh man! This has been super tough for me to wrap my head around. I knew this was going to be a great weekend! Thank you for the drop! :D

  • @WALID0306
    @WALID0306 1 year ago +1

    Thanks!! It was excellent ✨✨✨✨ Blessings

  • @mohammadahmedbasri3067
    @mohammadahmedbasri3067 2 years ago +3

    Thank you for this amazing explanation! Waiting for the video on LSTM! :)

  • @mostafamarwanmostafa9975
    @mostafamarwanmostafa9975 1 year ago +1

    Thank YOUUU, clearly explained!! I had been struggling with it!

  • @NockyLucky
    @NockyLucky 8 months ago +1

    Really liked the video. Quite creative and straight to the point!

  • @hamzaahmad1224
    @hamzaahmad1224 1 year ago +1

    Thank you for the video. I believe it was a clear explanation.

  • @lukastoral5059
    @lukastoral5059 8 months ago +1

    Thank you for making these videos! They are very helpful.

    • @statquest
      @statquest  8 months ago

      TRIPLE BAM!!! Thank you so much for supporting StatQuest!!! :)

  • @AuthsKads
    @AuthsKads 1 year ago +1

    you are a king, my friend
    perfect explanation with a simple example

  • @luistalavera6392
    @luistalavera6392 1 year ago +1

    Man, this is awesome. I wasn't understanding anything about RNNs in my course, but thanks to this video it's all clear now.
    Thank you Josh Starmer :D

  • @sinaro93
    @sinaro93 2 years ago

    KA-BAAAM! Thank you for all these amazing videos. I wish you had different series about CNNs and RNNs separately.

    • @statquest
      @statquest  2 years ago +1

      If you want to learn about CNNs, see: ruclips.net/video/HGwBXDKFk9I/видео.html

    • @sinaro93
      @sinaro93 2 years ago

      @@statquest How could I not have seen this?! By different series, I mean it would be great if you could create more videos covering each of these topics in more detail. But of course, you've already done so much, and I'm so grateful to you for sharing your knowledge in such a good way.

  • @Tapsthequant
    @Tapsthequant 2 years ago +1

    Been waiting and waiting, the waiting is BAM!!!

  • @dikshagupta3500
    @dikshagupta3500 1 year ago +1

    Your book on Machine Learning was excellent. I am looking forward to reading your book on deep learning.

  • @Xayuap
    @Xayuap 1 year ago +1

    gonna be honest,
    I got here looking for backprop.
    Instead, I found myself doing the whole course. Now I'm taking the elective courses 🧘🏽
    I do feel like Neo wanting more Shaolin.

  • @chelvynchristsonimmanuel1453
    @chelvynchristsonimmanuel1453 1 year ago +1

    Amazing explanation with a simple, easy-to-absorb, engaging method that still keeps the concept clear. 🙂 Kabaam... nice job

  • @chriskong7418
    @chriskong7418 11 months ago +1

    Love the little embarrassing singing during the videos. Subscribed. Great videos!

  • @maximus2978
    @maximus2978 10 months ago +1

    You are insane, man, very clear and understandable explanation!! Thanks a ton 🎉

    • @statquest
      @statquest  10 months ago

      Happy to help!

  • @Raulvic
    @Raulvic 2 years ago +1

    Thank you for sharing 🙂 super excited for the Transformers StatQuest!

  • @harshshah3797
    @harshshah3797 1 year ago +1

    Summary
    The problem with a regular neural network: it takes a fixed-length input.
    Here comes the RNN to help: it can take a variable-length input.
    How is an RNN made?
    Input + Previous Output == Output
    Why is the RNN not popular?
    It has one problem, called the Vanishing / Exploding Gradient Problem.
    Because we unroll long chains, this problem naturally arises.
    Let's say the weight that multiplies the previous output is greater than 1. Because the chain is long, we multiply by it many times and the numbers become huge (the output varies too much).
    If the weight is less than 1, the numbers become very small (the output barely changes at all).
    Here is an analogy that might help you understand the Vanishing / Exploding Gradient Problem:
    Imagine that you are trying to find your way through a forest. You have a map, but the map is very old and the trails are not marked very well. As you walk through the forest, you make a lot of small decisions about which way to turn. These decisions are based on the gradients of the map.
    If the gradients are very large, you might make a big turn and end up very far away from your destination. If the gradients are very small, you might make a small turn and not make much progress.
    The Vanishing / Exploding Gradient Problem is like having a map with very large or very small gradients. In either case, it would be very difficult to find your way through the forest.
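    The repeated-multiplication idea in that summary can be sketched numerically. This is a hypothetical illustration, not code from the video: `rnn_unroll` is a made-up scalar toy, and the activation function is omitted for simplicity.

    ```python
    # Toy scalar "RNN": each step feeds the previous output back in, so the
    # recurrent weight w_rec gets multiplied in once per time step.
    # (All names here are hypothetical; activation function omitted.)

    def rnn_unroll(inputs, w_in=1.0, w_rec=2.0, b=0.0):
        """output_t = w_in * input_t + w_rec * output_{t-1} + b"""
        y = 0.0
        for x in inputs:
            y = w_in * x + w_rec * y + b
        return y

    # After 50 steps, the first input's contribution (and its gradient)
    # is scaled by w_rec ** 50:
    print(2.0 ** 50)  # ≈ 1.1e15  -> exploding when |w_rec| > 1
    print(0.5 ** 50)  # ≈ 8.9e-16 -> vanishing when |w_rec| < 1
    ```

    This blow-up or decay of the factor applied to early inputs is exactly why the LSTMs discussed throughout these comments were developed.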

  • @mermich
    @mermich 1 year ago +1

    You graced us as a stats saviour :))) Sending love from Australia

  • @alekseishkurin4590
    @alekseishkurin4590 1 year ago +1

    Omg, that intro jingle is gold!

  • @krishnaphanindra1841
    @krishnaphanindra1841 1 year ago +1

    Beautiful and succinct explanations!! So glad I found your channel....lots of love

  • @TPLCompany
    @TPLCompany 2 years ago +2

    Great video!! I can't wait for LSTM and transformer videos!

  • @ClemensPutz-ist-der-beste
    @ClemensPutz-ist-der-beste 1 year ago +1

    I wish you were my math teacher! The whole class would have sung like you while calculating🤣

  • @kanisrini01
    @kanisrini01 5 months ago +1

    Great video & explanation 👏🌟. Thank you so much.

    • @statquest
      @statquest  5 months ago

      Glad it was helpful!

  • @mohammedjaddoa9783
    @mohammedjaddoa9783 1 year ago +1

    appreciate your effort & work, THANK YOU

  • @zeljkobalanovic8900
    @zeljkobalanovic8900 2 years ago +1

    That intro was sick. I smashed the like button immediately :D

  • @Naveedahmed-bq5iz
    @Naveedahmed-bq5iz 2 years ago +1

    Amazing, this is one of the best and coolest learning tutorials I have ever watched, great work Josh, keep it up. Thanks

  • @FahimAhmed-gq4rh
    @FahimAhmed-gq4rh 8 months ago +1

    Awesome explanation by the creator ❤

  • @BlizzgamesAlive
    @BlizzgamesAlive 2 years ago +1

    these videos just get better and better

  • @hasansoufan
    @hasansoufan 1 year ago +1

    You're the best, thanks from the heart❤