"If you can't explain it simply, you don't understand it well enough..."
You proved it can be done...!
if you can't explain it with at least one bad pun, then you don't understand the concept of humour well enough!
Who else thinks Luis Serrano is a genius teacher? Wow!
Thank you. :)
You are welcome professor !
He explained quite complex problems in a very intuitive and easy-to-understand way
Amazing explanation
@SerranoAcademy Really great lecture :) Thanks
Watched 7:30 mins and before i complete the rest of the video i felt an overwhelming need to tell you that you taught this concept in a brilliant manner
Lol I actually did the same and went straight to the comment section in the same minute
Thank you Luis. It's a rare talent, to explain things in such a clear and simple way.
I've been "just getting through" my machine learning class for the last 9 weeks and now after watching this video I finally feel like I understand these concepts!
I have been through every single RNN video trying to understand it and you are the only one that has explained it well. I am sick of abstract topics such as NNs being unapproachable because of teachers who don't know how to explain things with a tamer vocabulary and EXAMPLES. Lots and lots of examples.
I really love your ability to convert extremely complex concepts into simple things by giving day-to-day life examples. Hats off to you!!!!
Best video on the topic so far. First explaining the topic in an almost oversimplified manner, then gradually increasing complexity and difficult terminology. Perfect teaching style!
I have no words for this guy, what a legend! Thank you for being such a great teacher!
I can't believe I got to learn this for free, thank you!
I have looked at many videos and I rarely comment so my words carry a lot of weight. This is hands down the best tutorial I have seen yet for machine learning.
Intelligence, simplicity, and didactics: three ingredients of a brilliant Machine Learning teacher!
Watching this video for the first time exactly after 6 years.
Simplest and most amazing explanation, thank you sir.
By far the clearest and most approachable intro to recurrent NNs I've come across!
This is hands down one of the best tutorials I've ever seen on a Machine Learning topic. The quality and the ease of explanation with which the video was made and presented really helped me understand the scary concept of RNN in a very uncomplicated way. Thank you very much.
My hunt for clarity on RNNs ended with this video. I had read many Medium articles and watched the videos too; all of those put together can't match this video. Thank you Luis Serrano
All the other tutorials just explained NNs as a black box. Your use of matrices in the explanation really helped strengthen my understanding! :)
This is easily the best RNN explanation on the internet.
"The Vector of the Chicken." I wonder how many times in the history of humanity that phrase has been uttered.
hahaha, I wonder if it's the first time! :)
I feel like there is a joke or pun somewhere in there, but I can't find it...
Sir, you have a talent for presenting complicated things in the simplest manner.
God! This one is a saviour. It changed my perspective towards NNs.
Brilliant. I love how you spell out the matrices that implement the rules of the neural network. Great job pulling back the curtain on the Wizard.
Also Mt Kilimanjerror!
And thanks for the shoutout :)
Thanks! Coming from you, this is very high praise, higher than Mt. Kilimanjerror! (actually, was between that one and Mt. Rainierror... maybe for the next error function) :)
Before reading this comment I was just about to say that it's a cool approach with matrices!
Dude you can teach this supposedly extremely advanced theory to a primary school kid with your brilliant way of explanation. Respect and thank you so much!
Fantastic!! By presenting simple neural network operations as matrix multiplications you have explained the basics of RNNs to me in a way that no one on YouTube was able to do! You're fantastic Luis💛
3 years in AI engineering and I've never seen an interpretation of neural networks like this. Amazing, Sir!
Before I go any further, I really liked how you stated what Machine Learning does to us.
Genius!!!
This is a kickass explanation of RNNs. You are a genius at teaching. Trust me, I am a student at one of the renowned institutes of the world, but I never got to hear such a simple and effective way of teaching.
Just want to leave a comment so that more people could learn from your amazing videos! Many thanks for the wonderful and fun creation!!!
The most intuitive introduction to RNNs that I've come across thus far! Thank you!
Wow! This is the exact tutorial I've been looking for for ages: starting with examples and motivation, then moving into matrix multiplication. Now I think even a dummy like me can understand how RNNs work...
One of the best videos for beginning DNNs. It sets our psyche properly for all the things to come in Deep Neural Networks.
Real classic intro to a complicated topic. Love the smooth introduction. Just perfect.
After watching 50 videos about RNNs, this one finally taught me the idea.
17:50 is the most important figure of the whole video. The explanation was very good, simple and easy!
Now I can practically teach my students about gradient descent; very intuitive lessons here. Thanks a lot.
Bingo ! My journey to understand RNN intuitively finally ends, thanks for this great video.
Seriously, thanks for putting it in such an easy way. You have given a great idea of how it works internally through matrices rather than nodes and edges, which are way too difficult to understand. Thank you for making it so easy.
Luis, great video as always! I am in the Udacity machine learning nanodegree program and I love your teaching style. Please keep making videos; you are making a big difference for people like us.
Thank you for your message, Avinash! Great to hear that you enjoy the program! Definitely, as time permits, I'll keep adding videos here. Cheers!
A must-watch video for everybody trying to understand RNNs. I really appreciate your work to make basic concepts simpler for the audience.
You are the best. All of your videos are awesome. Even a 5-year-old can understand what you are saying. I respect your contribution !!
I am normally very lazy in commenting, but this guy made me do it. You Sir are awesome!!!
Luis Serrano, you have an incredible ability to represent tough concepts in such an interesting way
I wish I had teachers like you in UNI. Thank you!!!
Best explanation of RNNs I found on YouTube. Thanks a tonne.
This is the best video I have seen that explains complex things in a simple way.
Of all the explanations I've seen on the internet so far, yours is by far the best. Unlike the conventional approach, you try to explain clearly this concept, which is initially extremely abstract. A lot of complex material that I couldn't say I understood, I now realize I'm starting to understand. Thank you very much!
Your PyTorch Udacity courses helped me out, but without fully understanding the topic I went to YouTube and got help again. Thanks for the course. You explain so that even a child can understand.
Best, easy, and simple explanation of RNNs. Keep up the great work. Thanks!
4:01 shows how a NN can map simple inputs to specific outputs.
It uses the same notation as Stanford for feed-forward: Wx.
In the weights matrix, each row is the weights coming into a node in the hidden/output layer.
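The matrix view this comment describes can be sketched in a few lines of NumPy. The weight values below are made up purely for illustration; they are not the matrices from the video:

```python
import numpy as np

# Hypothetical weights, just to illustrate the point above.
# Each row of W holds the weights coming into one node of the next layer.
W = np.array([[1.0, 0.0],   # weights into node 1
              [0.0, 1.0],   # weights into node 2
              [1.0, 1.0]])  # weights into node 3
x = np.array([1.0, 0.0])    # a simple one-hot input vector

# A feed-forward layer (before the non-linearity) is just W @ x:
h = W @ x
print(h)  # [1. 0. 1.]
```

Each row of W produces one entry of h, which is exactly the "weights coming into a node" reading described above.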
Your method of teaching with all those images is really awesome
the best RNN tutorial period. Thanks
Hello Luis, please don't stop making these videos. Your NN series are awesome. I had to come back to comment on this. Thanks a lot man.
You are one of the best tutors here. You make complex things look damn easy. Thanks a lot for all your videos.
This is the best and most easily understood introduction I have ever heard. Fantastic!
Your voice and teaching skill are both soothing.
It really was an amazing video. It was really nice to see such an esoteric topic presented in a really simple way. Keep it going, dude!
Hi. I think the thing that makes RNNs useful is when your output is the same type as the input. Whether the input is a sequence is not that crucial, because a regular CNN can also extract features from sequential input properly.
This is gold! How do you like a YT video more than once?
The Errorest and Kilimanjerror pun was perfect!
Thank you so much brother, you're a genius. You've enlightened my mind about RNNs. I've watched plenty of videos trying to figure out what's going on, but your video gave me hope. Thank you so much.
Best explanation I found for RNN.
Thanks for your video, I did get a friendly introduction to RNNs :) It reminds me so much of Hidden Markov Models (HMMs for short): here, what he cooks is the hidden state and the weather is the observation in your diagram at 10:39 of this video. I guess I'll do some searching on how HMMs and RNNs are connected! Your comments are most welcome here!
Love it! I discovered that a NN can be represented as a matrix.
Didn't know this either! Is this always the case?
Best explanation on RNN I have seen so far. Thanks for doing this
Great explanation, sir, in very simple language; that's the sign of the best teacher.
Thank you very much for that prompt response, Luis!! You really have the knack for clarifying these fundamental issues. I understand clearly now the motivation for recurrent neural networks.
This is the best for beginners. You deserve more likes!
I loved the introduction with your pictures; exactly what happened to me.
The best YouTube video I've ever seen
This tutorial has reinforced my understanding and let me see it in a new light. Superb explanation. Very, very clear. Thanks very much.
Aha, never mind, this was answered at 20:55. (My original question: at 11:50, when you show the food & weather matrices, practically speaking, how would these parameters be found? Via training your network, right? It's my understanding that these matrices represent the weights to what I guess I can call the first hidden layer nodes; is this correct? Do you mind clarifying this a bit more please? I'd really like to make sure I understand the material, but I was born a visual learner haha :D)
hey Jab
Thank you for making this video! Most articles on RNNs didn't explicitly explain how two inputs were added to make a proper output
Congrats Luis, what an awesome video! The concept of RNN was broken down to the bare minimum and the rest of the explanation stemmed from this simple principle, brilliant!
Nice video for people to understand one-layer neural networks, multi-layer neural networks, and deep learning.
This is a very easy-to-understand explanation of Recurrent Neural Networks! Good job to you!
Easily the best video on RNN intro out there
Luis! I love your NN series, but a question that threw me off a bit: at 18:06 when you add the inputs, how did you get [0,1,0,1,2,1] when the first node is 1+0 and the second node is 0+0? Shouldn't it equal [1,0,0,1,2,1], or is there some other input that I am missing? I mean, ultimately it's irrelevant because after the non-linear function it transforms into a 0, but I just want to make sure I am not missing anything there, aha.
Dang! Yes you're totally right, that's a typo, it should be 1,0,0,1,2,1... Thank you!
And yes, also right that the non-linear function makes it 0 anyway, but yeah, I put the one in the wrong place.
Phew, okay cool. And thanks for the video! I'd yet to come across a simple explainer on how to write an LSTM RNN, and this did the trick for me. Keep up the great work!
Jabrils I appreciate your recommendation for this video ;)
Amazing... wonderful... What a great teacher you are!! A lot of prep is required to explain a complicated subject in a few minutes with an easy example.
Great way of explaining! Why don't you make a video about LSTMs next?
Thank you so much for the neural network series. Such simple explanations without the mathematics.
Indeed a great teacher. Loved your explanation.
I was looking for this the whole day.
Whoa, this helps a lot. I watched a bunch of videos about this and I kept getting confused. Glad I found this video. Thank you!
Thank you for demystifying the RNN. It is really beginner-friendly, thank you.
“If you can't explain it to a six year old, you don't understand it yourself.”
― Albert Einstein
Luis, you are a great teacher.👌
I must say, what an excellent teacher you are. I had been searching this topic again and again and was confused. Today I watched your video. Well done, it was clearly described. I am impressed; keep it up.
This was an outstanding explanation of RNN... Thanks for making this :)
This is just wow! Such a lucid explanation of RNN
Definitely one of the best videos, if not the best, on RNN concepts. It would be a great addition to link the intuition with mathematical rigor: how could you map the last example in particular to mathematical notation (input x, hidden states h(t), and predicted output ŷ(t)), and more importantly, how do you relate the manually drafted weight matrices to Whh, Wxh, and the known equations of the vanilla RNN? Finally, it would be a great pedagogical addition if you could train an RNN using Keras/TF on this toy example, extract the weights, and compare them with the manually presented weights. This would greatly add value to your explanations.
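For readers wondering about the notation this comment refers to: the vanilla RNN equations are conventionally written as follows (this is the common textbook convention, not taken from the video, so the correspondence to the video's hand-built matrices is only suggestive):

```latex
h_t = \tanh\left(W_{xh}\, x_t + W_{hh}\, h_{t-1} + b_h\right), \qquad
\hat{y}_t = \operatorname{softmax}\left(W_{hy}\, h_t + b_y\right)
```

Here W_{xh} maps the current input into the hidden state and W_{hh} carries the previous hidden state forward, which is where the manually drafted matrices in the video would plausibly live.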
This video is great and a must-view for those interested in AI. But it's good to watch it while having some food 🥘 in case the examples of the perfect roommate make you jealous or hungry.
Congrats Luis! You explained quite complex problems in a very intuitive and easy-to-understand way
Thank you for the video, your explanation is clear as crystal
Sir, this is really amazing. Loved this example in general because Neural Networks as a linear transformation in general sounds so cool!
Especially the mapping between the operations on matrices and the network of nodes helps visualize the topic. Great job, sir, indeed! Thank you
I have searched and searched and searched, and at last: an absolutely fantastic introductory suite of courses to dip your toes into that explains and shows what ML and RNNs are all about. My next challenge is to find the easiest way to build my own RNN or LSTM, and I would welcome any suggestions on how best to go about this challenge. Again, thank you so much for the time and effort you have clearly put into this.
Thank you for making this video! It's allowed me to understand RNNs in terms of matrices much more clearly!
Luis, you are the man! You demystified that brilliantly, Richard Feynman style.
Hands down the best vid I have ever seen. Great job mate. Great job.
Best video I found that explained RNN, thank you🙏🙏🙏
I am really enjoying learning from your Neural Networks playlist. Thank you so much for such amazing teaching and great quality content.
Excellent video. I understood it only after watching this video; I had tried many earlier. Good service!