To learn more about Lightning: github.com/PyTorchLightning/pytorch-lightning
To learn more about Grid: www.grid.ai/
Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
Does the book include LSTM and RNN for bivariate time series input data as well?
@@rathnakumarv3956 Nope, just basic neural networks. My next book on deep learning will have more stuff about fancy neural networks.
@@statquest Please include CNN, RNN, and LSTM for problems with multivariate time series as input and a continuous variable as output; these are very useful in climate change studies.
@Tech What time point, minutes and seconds, is confusing? And have you watched the entire series on Neural Networks before watching this one?
@statquest I have a question about the vanishing gradient problem. I understand that input 1 from the first timestep has less and less impact on the output the more steps we take, but isn't the gradient also relying on the new inputs at every timestep? I don't understand why the gradient vanishes if the new inputs aren't discounted as heavily as the older timesteps. I imagined it's more that the old inputs are less impactful and the network focuses on the newer inputs, but can still train normally. Is there something I'm missing?
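A rough scalar sketch of why the older contributions shrink while the newer ones do not (assuming a simple linear RNN with a single recurrent weight; the names and values below are illustrative, not from the video):

```python
# Minimal sketch of a linear RNN: h_t = w_rec * h_prev + w_in * x_t.
# The derivative of the final hidden state with respect to the first input
# picks up a factor of w_rec for every remaining timestep, while the
# derivative with respect to the most recent input does not.
w_rec, w_in, T = 0.5, 1.0, 20                    # illustrative values

grad_wrt_first_input = w_in * w_rec ** (T - 1)   # about 1.9e-06
grad_wrt_latest_input = w_in                     # 1.0

print(grad_wrt_first_input, grad_wrt_latest_input)
```

So learning from the most recent inputs still works, but the gradient signal for anything many steps back effectively disappears (and it would blow up instead if w_rec were greater than 1), which is why long-range dependencies are hard to train.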
The only place on the internet where you can actually grasp a complex topic before diving deeper into the topic. I am so grateful people like you exist. Thank you!
Thanks!
@@statquest np
I'm in a deep learning class right now and the amount of straight math that my teacher throws at me is overwhelming. Videos like yours are incredible and I'm so thankful for the help and the color coding and the fun that makes it worth watching! It is super helpful as I'm studying for my midterm and just want to get a more definite grasp of what all this math actually means without reading someone's Mathematics PhD dissertation or something
Good luck on your midterm! :)
Every time I watch one of your lessons, I become sooo happy, because you make all the subjects easy to understand in a magical way. Thank you for your effort.
Wow, thank you!
Thanks to your series of videos on neural networks, I was able to pass the entrance exam for PhD program at St. Petersburg State University.
Congratulations!!! TRIPLE BAM!!! :)
One of the most underrated channels. Never once have I had trouble understanding the intuition of whatever you explain. I'd donate money to you if I weren't a broke college student.
Thanks!
Never quite understood RNNs until I watched this video, thank you! A hand-calculated example of a one-to-one RNN is extremely hard to find online, so this was perfect. The only one out there, I believe.
Thanks!
This is the CLEAREST explanation of RNNs.
Thank you!
You can't understand how good this is. I've spent all of yesterday trying to understand these concepts but I couldn't grasp them. THANK YOU!!!
Glad it helped!
With this level of simplicity in teaching, even a high schooler could grasp these concepts, probably quicker than me!
Scared of the future now....
:)
Josh, I found your channel yesterday and have been binge watching. Incredible work in democratizing knowledge. Thankful for your work.
Thank you!
democratizing knowledge -- exactly !!!
This explanation covers some very important points that were missed in several other lectures I've watched on this subject. Thank you for clearing things up for me.
For example, the note at 10:31
Thank you! Yes, that little detail is important, but often glossed over.
Josh!!!! I love u!!! I can't wait to learn about the Transformers!! thank you very much for your content
Thank you!
Josh teaching about transformers would be a blessing
@@capyk5455 I'm looking forward to it!
Transformers out yet or some ETA to expect?
@@shaiguitar LSTMs comes out in the next week or so. Then I'll start working on transformers.
wow, just wow! 2 days of headache solved by a 17 min video! thank you for existing.😊
bam!
This is not fair, I literally am addicted to your style of teaching and find it quite hard to learn from other sources now.
Hopefully, in the long term, what you learn from these videos will make it easier to understand other sources. At least, that's starting to happen for me. It's still hard, though.
Your channel should be mandatory for all universities teaching AI 💖
Maybe one day!
It will be for sure 😊
StatQuest’s stylized scalar-based numerical examples are amazing even for learning beyond the introductory level. To get the full vectorized-matrix version of the algorithms, I just mentally swap x, w, b, etc. with the corresponding vectors and matrices, and then it’s golden!
Bam! :)
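As a hedged sketch of the scalar-to-matrix swap mentioned above (the function and variable names are made up for illustration, and a generic tanh update is assumed, which may differ from the video's exact activation):

```python
import numpy as np

# Scalar version, in the spirit of the video's worked examples:
# one input value, one hidden value, scalar weights.
def rnn_step_scalar(x, h_prev, w_in, w_rec, b):
    return np.tanh(w_in * x + w_rec * h_prev + b)

# Vectorized version: the same update with x and h as vectors
# and the weights as matrices.
def rnn_step_vector(x, h_prev, W_in, W_rec, b):
    return np.tanh(W_in @ x + W_rec @ h_prev + b)

# Tiny usage example with made-up shapes: 2 input features, 3 hidden units.
x, h = np.array([0.2, -0.1]), np.zeros(3)
W_in, W_rec, b = np.ones((3, 2)), np.eye(3), np.zeros(3)
print(rnn_step_vector(x, h, W_in, W_rec, b))
```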
Our lecturer at the uni recommended this video to us. I am amazed at how simply it is put. Great job! Both funny and informative ❤
Thank you!
I was looking for a small thing in RNN, but your way of explanation forced me to keep watching the entire video! and I subscribed to your channel!!
Hooray! Thank you very much! :)
First time commenting on a YouTube video. You are a living legend. I had been trying to understand RNNs for two days, then came across your video and Baam. The RNN concept became clear.
Glad I could help!
I'm just in love with your content. I've watched your neural network series and it was just so easy to understand. You really deserve more subs and views Josh!
Thank you very much! :)
you definitely are the best teacher for machine learning and deep learning
Thank you!
You're gonna carry me through my neural networks class, what a godsend
You can do it!
Josh, you are the person who makes ML theory so understandable!
Thank you!
i come to listen to " peep poop poop"
bam! :)
I like the way how clearly and easily you explain concepts. Thank you very much!
You're very welcome!
True hero. I have an exam on the 29th about RNNs, LSTMs, and transformers.
Good luck! :)
Thank you so much 😭
People like you are the real mvp of humanity !
Thanks!
honestly your channel is one of if not the best channel on all youtube, thank you so much for this!
Wow, thank you!
I literally was having a mental breakdown because I was unable to understand things. Your video helped me a lot and also brought a smile to my face :))
Glad I could help!
OH MY GOD THIS IS EXACTLY WHAT I WAS LOOKING FOR THANK YOU SO MUCH
bam! :)
Really looking forward to your LSTM video.. You are a very good teacher !!
Thank you!
@@statquest When will you make the next video? I have an exam in two weeks and I need your LSTM video.
@@WonPeace94 :)
Oh man, I literally watch his videos like a web series. It's very fun and very easy to understand. Thank you very much, sir!!!!😭😭
Thanks!
You are a very, very brilliant teacher! You are my low-variance and low-bias position.
bam! :)
Hello guys, let's always make this man as happy as he has made us!!!!!!!!!!!!! Nothing more to say, just thanks a lot.
BAM! Thanks!
Very high-level explanation. Waiting for the next video on Long Short-Term Memory networks. Thank you so much.
Thanks! :)
OMG, Finally I understand Vanishing Exploding Gradient, Thank you StatQuest!
HOORAY!!! Thanks for supporting StatQuest!!! TRIPLE BAM! :)
Amazing video as always, professor! I cant wait for the video on LSTM
You and me both!
@@statquest When it comes to applications of RNNs, an LSTM is sometimes a must-have :) that's why it would be great to have a clear explanation of LSTMs. But these are little things. In any case, thank you very much for the valuable knowledge that we can get here.
@@usamsersultanov689 I hope to have a video on LSTMs soon.
@@statquest One on Transformers and their variations would be even greater :D
@@james199879 That's the master plan, but we'll get there one step at a time.
This was the best explanation I've heard for RNNs!
Thank you! :)
Why are you the master of everything???? I have been watching your videos for two years, throughout my university course.
Ha! Thank you! :)
Hey Josh...the way you made this so easy is mind blowing, I love you man, keep being awesome 😊
Thank you so much 😀!
Hi Josh, You are the best. Nobody has explained exploding gradient like you have, Thank you
Wow, thanks!
Thank you brother, this was really intuitive and easy to understand
Thanks!
Im a simple man, I see statquest, I click like. Can't wait for the videos on transformers.
bam! :)
PENTA BAM!!! The best pre-course !
Thanks!
Dude, that DOUBLE BAMM and TRIPLE BAMMM kills me. An actually fun way to get the info. Also a great video, very easy to understand.
Thank you! :)
12:22 is probably the cutest bam I've heard
Also thank you for your videos! They have definitely been helping me get through my Bioinformatics grad course. You are AWESOME
Thank you so much and good luck with your course! :)
The way you explained RNNs made me so excited for LSTMs. Can‘t wait to see it!
bam!
Excellent project! I didn't think that with drawings it would be so entertaining and informative. Definitely a very good video to start with!
Thank you very much! :)
Those tones, the won won, bam, double bam, and kaboom, and the fun way of learning open up the mind to grasp things really quickly, and we can think freely without becoming nervous. You lord 🙌
Thanks!
Clearly explained the difference between RNN and normal network, gradient vanishing/exploding! Looking forward to the LSTM and Transformer videos!!!
Thanks!
I literally liked the video before even watching it, that's how much I trust your information and knowledge. Thank you for your time and effort in explaining this to us.
Thanks!
Best explanation of RNNs I've seen. Thanks.
Thank you!
Great video, you must become the President of the ClearlyExplainedLand
:)
The video was very well explained, I understood it easily. Thank you.
Thank you very much! :)
Oh my gosh that was an amazing explanation. I'm quite literally flabbergasted. Thanks, mate!!
Glad you liked it!
You have an amazing way of explaining, with ad-libs. Loved it, and thank you so much; I was not able to understand this at all, but now it is very clear.
Glad I could help!
Any darn fool can make something complex; it takes a genius to make something simple ("Albert Einstein"), and you made it very, very simple. Thanks!
Thank you very much! :)
Thank you so much. You make great videos... Just great teaching. Thanks a lot.
Glad you like them!
I saw a light turning on my head! great video
bam! :)
Josh, you are amazing! Thank God you exist
Thank you!
Anxious for the LSTM, Transformers and Attention StatQuests!
It's already available to early access patreon supporters and will be available to everyone else soon.
Hey, hopefully this will save my Deep Learning exam. And... love the sound effects.
Best of luck!
... you sir are a timeless legend!
Thanks!
You don’t understand how good the timing of this is. Been struggling to explain the concept in detail on my MSc project.
Are you doing a video on LSTM / GRU soon?
Yes, I hope so.
Clear as day!!! Hooray!!!! Thank you Josh
Thank you!
Thanks a lot Josh. Every concept explained by you is a BAM!!!!!!!!!!
Glad you think so!
Your professorial ability is only exceeded by your singing!!
Thanks!
@@statquest When are you going to release your, surely to be a best-selling, book on singing lessons?
@@exxzxxe Just as soon as I win Eurovision.
The vanishing/exploding gradient problem is analogous to choosing the right value for alpha (the learning rate): choosing too large a value leaves us bouncing around, while choosing too small a value leads to many more iterations of gradient descent.
Nice
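A tiny sketch of the learning-rate intuition above (gradient descent on a made-up one-parameter quadratic loss, not anything from the video):

```python
# Gradient descent on a made-up loss, loss(w) = w**2, whose gradient is 2*w.
def run(alpha, steps=10, w=1.0):
    for _ in range(steps):
        w -= alpha * 2 * w       # standard gradient descent update
    return w

print(run(alpha=1.5))    # too large: each step overshoots and |w| doubles, so w bounces away
print(run(alpha=0.001))  # too small: w barely moves after 10 steps
print(run(alpha=0.3))    # reasonable: w heads toward the minimum at w = 0
```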
OMG I love you for your teaching style
Thank you! 😃
Oh man! This has been super tough for me to wrap my head around. I knew this was going to be a great weekend! Thank you for the drop! :D
BAM! :)
Thank you!! It was excellent ✨✨✨✨ Blessings
Thank you very much!
Thank you for this amazing explanation! Waiting for the video on LSTM! :)
Coming soon!
Thank YOUUU Clearly explained !! I have been struggling with it !
BAM! :)
Really liked the video. Quite creative and straight to the point!
Thanks!
Thank you for the video. I believe it was a clear explanation.
Glad it was helpful!
Thank you for making these videos! They are very helpful.
TRIPLE BAM!!! Thank you so much for supporting StatQuest!!! :)
You are a king, my friend.
Perfect explanation with a simple example.
Thanks!
Man, this is awesome. I wasn't understanding anything about RNNs in my course but thanks to this video is all clear now.
Thank you Josh Starmer :D
BAM! :)
KA-BAAAM! Thank you for all these amazing videos. I wish you had different series about CNNs and RNNs separately.
If you want to learn about CNNs, see: ruclips.net/video/HGwBXDKFk9I/видео.html
@@statquest How could I not have seen this?! By different series, I mean it would be great if you could create more videos covering each of these topics in more detail. But of course, you've already done so much, and I'm so grateful to you for sharing your knowledge in such a good way.
Been waiting and waiting, the waiting is BAM!!!
:)
Your book on Machine Learning was excellent. I am looking forward to reading your book on deep learning.
Thank you very much!
Gonna be honest, I got here looking for backprop. Instead, I found myself doing the whole course, and now I'm taking the elective courses 🧘🏽 I do feel like Neo wanting more Shaolin.
That's awesome!
An amazing explanation, with a simple, easy-to-absorb, engaging method that still keeps the concepts clear. 🙂 Kabaam... nice job
Thank you! :)
Love the little embarrassing bits of singing during the videos. Subscribed. Great videos!
Thanks!
You are insane, man; a very clear and understandable explanation!! Thanks a ton 🎉
Happy to help!
Thank you for sharing 🙂 Super excited for the Transformers StatQuest!
Thanks!
Summary
Problem with a regular neural network:
It takes a fixed length of input.
Here comes the RNN to help:
It can take a variable length of input.
How is an RNN made?
Input + Previous Output == Output
Why is the RNN not popular?
Because it has a problem called the Vanishing / Exploding Gradient Problem.
With long chains, it is natural for this problem to arise.
Let's say the weight that multiplies the previous output is greater than 1. Because the chain is long, we multiply by that weight many times and the numbers become huge (the output varies too much).
If the weight is less than 1, the repeated multiplications make the numbers tiny (the output barely changes at all); see the numerical sketch after this summary.
Here is an analogy that might help you understand the Vanishing / Exploding Gradient Problem:
Imagine that you are trying to find your way through a forest. You have a map, but the map is very old and the trails are not marked very well. As you walk through the forest, you make a lot of small decisions about which way to turn. These decisions are based on the gradients of the map.
If the gradients are very large, you might make a big turn and end up very far away from your destination. If the gradients are very small, you might make a small turn and not make much progress.
The Vanishing / Exploding Gradient Problem is like having a map with very large or very small gradients. In either case, it would be very difficult to find your way through the forest.
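A minimal numerical sketch of that repeated-multiplication argument (the weight values and the number of timesteps are made up for illustration):

```python
# Repeatedly multiplying by the same weight across 50 timesteps.
timesteps = 50

for w in (1.2, 0.8):                  # illustrative weights, one > 1 and one < 1
    factor = w ** timesteps           # the factor picked up along the long chain
    print(f"w = {w}: factor after {timesteps} steps = {factor:.6g}")

# w = 1.2 gives a factor of roughly 9100 (exploding),
# w = 0.8 gives a factor of roughly 0.000014 (vanishing).
```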
bam
You have graced us as a stats saviour :))) Sending love from Australia
Awesome! Thank you!
Omg, that intro jingle is gold!
bam!
Beautiful and succinct explanations!! So glad I found your channel....lots of love
Thanks!
Great video!! I can't wait for LSTM and transformer videos!
Coming soon!
I wish you were my math teacher! The whole class would have sung like you while calculating🤣
That would be awesome!
Great video & explanation 👏🌟. Thank you so much.
Glad it was helpful!
Appreciate your effort & work, THANK YOU
Thank you!
That intro was sick. I smashed the like button immediately :D
bam! :)
Amazing, this is one of the best and coolest learning tutorials I have ever watched. Great work, Josh, keep it up. Thanks
Thanks, will do!
Awesome explanation by the creator ❤
Thank you!
These videos just get better and better
Thank you!
You're the best, thanks from the heart❤
Thank you! :)