Sir, please don't leave this playlist incomplete. You can't even imagine how much of a blessing it has become for many; it's my heartfelt request, and I'm sure I'm not alone in asking this, there are many others too. Please do update this playlist. I have given up many paid courses to study from your masterpiece videos. Please sir, it's a humble request 🙏🙏🙏🙏
Not abandoning it, bhai. Was travelling this weekend so could not upload. Will upload the next video tomorrow.
@@campusx-official thnks a lot sir, for everything..!
@@campusx-official 🥲 Why are you not uploading the next video?
I gave up on paid courses too to follow this channel.
@@campusx-official Sir, when will the remaining videos come? Please upload them too.
Sir, you are genuinely the best data science teacher ... I feel sooooo blessed
No words to express your teaching style, Sir. So simplified and in depth. Hats off to your knowledge and presentation. God bless you!
This is the best explanation video of RNN . Thank you sir.
Beautiful explanation !!!! You are doing a great service to community !!!!
This is really a blessing love your videos sir and how simply and gracefully you explain these topics.
You are just amazing. Please share video on LLM or make LLM's part of your mentorship program.
very simplified and logical lectures
Thank you bhaiya for your hard work. This series means a lot to me, as it was so helpful in learning deep algorithms and everything. The content you provide is pretty commendable. Virtual hugs to you. Love from Ghaziabad 🙂❤
Amazing explanation 💯💯
Sir, it would be really helpful if you could provide the remaining videos. After watching your videos, we have become addicted to them and are not comfortable with any other videos.
It breaks my heart how shadow-banned you are. I literally have to search for you by typing the entire name because this doesn't show up in the first 20 videos or so. You are the best teacher ever. I can't thank you enough, sir!
Thank u so much sir.
Great Video Sir
Around 36:00: weight sharing, input dimension in Keras; the layer can store a sequence of 10 time steps.
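A minimal sketch of the weight-sharing note above (all sizes and random values here are illustrative assumptions, not the video's numbers): a SimpleRNN-style layer reuses the same input and feedback weights at every time step, so a 10-step and a 100-step sequence run through exactly the same parameters.

```python
import numpy as np

np.random.seed(2)
n_in, n_hid = 5, 3
W_xh = np.random.randn(n_in, n_hid) * 0.1   # input -> hidden weights
W_hh = np.random.randn(n_hid, n_hid) * 0.1  # hidden -> hidden weights
b = np.zeros(n_hid)

def run(x):
    h = np.zeros(n_hid)
    for x_t in x:  # the same weights are reused at every step
        h = np.tanh(x_t @ W_xh + h @ W_hh + b)
    return h

h10 = run(np.random.randn(10, n_in))    # 10 time steps
h100 = run(np.random.randn(100, n_in))  # 100 time steps, no extra parameters
n_params = W_xh.size + W_hh.size + b.size
print(n_params)  # 27 parameters, independent of sequence length
```

This is why the parameter count of a recurrent layer depends only on the input and hidden sizes, never on how many time steps it is unrolled for.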
Amazing explanation i wish I could have the notebook
I have never ever commented on any video. Simplifying such concepts, you are the best teacher I have ever seen in my exploration of Data Science.
Super Amazing video...
Hi Nitish. Amazing explanation! Breaking down such a complex topic into simple bits! But please do upload backpropagation for RNN too.
The next video is exactly that.
Amazing
I have studied RNN from many other teachers, but after seeing this video, I really understand how it works. I now have the confidence to explain this topic to anyone. Thank you very much, sir.
Thank you for very clear explanation! Could you please tell me the book you usually follow to understand concepts like RNN?
Learning new concepts.
August 13, 2023😊
Sir, I've watched all the videos... Waiting for more videos. Please sir, make them quickly.
Thank you so much!
Sir, how easily you clarify all the difficult concepts!
Hi, first of all, great content! I was looking forward to this playlist, but I see it hasn't been updated in 6 months. I was hoping to learn more about LSTMs, GRUs, Transformers, BERT, GPT, LLMs etc. through this playlist. I hope you continue this series soon. Thanks anyway :)
Yes even I want the full playlist
Where is the link for this Google Colab notebook?
Please bring videos on LSTM and BERT
Input format in RNN -> (timesteps, input_features) ... when the first word is sent, time = 1
"Movie was good" ... sent in as a tensor of (3, 5), where timesteps = 3 & input_features = 5
Unfolding through time -> every node has an activation function, which by default is tanh
The output of one cycle is fed back to the nodes in the next cycle
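The notes above can be sketched in plain NumPy (a hedged illustration: the hidden size of 4 and the random values are assumptions, not the video's numbers):

```python
import numpy as np

np.random.seed(0)
timesteps, input_features, hidden_units = 3, 5, 4  # "Movie was good" -> (3, 5)

x = np.random.randn(timesteps, input_features)              # one word per time step
W_xh = np.random.randn(input_features, hidden_units) * 0.1  # input weights
W_hh = np.random.randn(hidden_units, hidden_units) * 0.1    # feedback weights
b_h = np.zeros(hidden_units)

h = np.zeros(hidden_units)  # initial hidden state
for t in range(timesteps):
    # tanh is the default activation; the previous cycle's output h
    # is fed back into the nodes on the next cycle
    h = np.tanh(x[t] @ W_xh + h @ W_hh + b_h)

print(h.shape)  # (4,): the final hidden state after unfolding 3 time steps
```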
Thanks
Hi Nitish, I appreciate the way you explain RNN, most importantly the mathematical calculation. Can you suggest a book that explains deep learning in your style, yes, in your style? Because I started reading many DL books, but their mathematical explanations were confusing and complex for me... Please respond.
Maybe Deep Learning with Python will suit your need.
It's been 8 days, no video yet. Please upload 2 videos per week and finish the playlist.
Will upload the next video on Tuesday. Was travelling this week so could not shoot.
@@campusx-official Thanks bhai. Please finish feature selection, XGBoost, CatBoost from ML, DL and advanced NLP along with Python... If possible, start MLOps as well.
So good! We want at least 1M subscribers for you!
when will this playlist restart
Sir, please update this playlist; we are eagerly waiting for its completion.
11:33, for anyone having the problem that the input sizes here are different and wondering how unequal inputs would be used: the answer is in the next lecture.
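A hedged sketch of the usual fix (the token ids below are made up): unequal-length sequences are zero-padded to a common length before being fed to the RNN, which is what Keras's `pad_sequences` utility does with pre-padding.

```python
# Three token-id sequences of unequal length, padded at the front with 0s
seqs = [[3, 7], [1, 4, 9, 2], [5]]
maxlen = max(len(s) for s in seqs)
padded = [[0] * (maxlen - len(s)) + s for s in seqs]
print(padded)  # [[0, 0, 3, 7], [1, 4, 9, 2], [0, 0, 0, 5]]
```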
I had already subscribed to the channel, but the video didn't appear in the search results 😑 Because it's not in English? Maybe!
You said around 39:40 that the final output layer (activated by softmax/sigmoid) is used only for the last timestep. I guess that's only for this RNN (sentiment analysis); otherwise, for each recurrent block, we could have used a final output layer along with the hidden layers too?
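A small NumPy sketch of that distinction (sizes and weights are illustrative assumptions): a many-to-many RNN applies the output layer at every time step, while a many-to-one RNN, like the sentiment example, keeps only the last one.

```python
import numpy as np

np.random.seed(1)
T, n_in, n_hid, n_out = 4, 3, 5, 2
x = np.random.randn(T, n_in)
W_xh = np.random.randn(n_in, n_hid) * 0.1
W_hh = np.random.randn(n_hid, n_hid) * 0.1
W_hy = np.random.randn(n_hid, n_out) * 0.1  # hidden -> output layer

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

h = np.zeros(n_hid)
per_step = []
for t in range(T):
    h = np.tanh(x[t] @ W_xh + h @ W_hh)
    per_step.append(softmax(h @ W_hy))  # many-to-many: an output every step

last_only = per_step[-1]  # many-to-one (sentiment): only the final output
print(len(per_step), last_only.shape)  # 4 (2,)
```

In Keras terms, this roughly corresponds to `return_sequences=True` (output at every step) versus the default `return_sequences=False` (last step only).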
At t=2, why is the input connected to only 1 hidden neuron? Shouldn't it be connected to all (3) hidden neurons?
Sir, I hope you are feeling good.
Sir, please tell me how I can have a question-and-answer session with you.
plzzzzzz
best
best
Hi Nitish, can you upload videos on encoders, decoders, attention models and Transformers? By the way, your explanation is great. Good work.
11:35 --> 14:35: the 2 main differences between RNN and ANN
This playlist is amazing! One of the best for understanding deep learning. Thanks a lot! 🎉
Can you make a playlist on CNN with an LSTM block?
Sir please continue This playlist sir. Pleasee
please upload xgboost video sir
Sir, how did you find the vectors, like 10000 for "movie", 01000 for "was", etc.?
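A tiny sketch of those vectors (the vocabulary below is made up, not the video's): this is one-hot encoding, where each word in the vocabulary gets a vector with a 1 at its own index and 0 everywhere else, so "movie" becomes 10000 and "was" becomes 01000.

```python
vocab = ["movie", "was", "good", "bad", "boring"]

def one_hot(word):
    vec = [0] * len(vocab)
    vec[vocab.index(word)] = 1  # 1 at the word's index, 0 elsewhere
    return vec

print(one_hot("movie"))  # [1, 0, 0, 0, 0]
print(one_hot("was"))    # [0, 1, 0, 0, 0]
```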
This is hands down the best deep learning playlist out there. I am blindly following it because I just know sir will surely cover all the content in depth, and I don't have to go looking anywhere else.
Thank you Sir. Can you please suggest a book for more information?
Sir, the way you explain things is simply amazing. I was eagerly searching for the mathematical intuition of RNN, and finally, when I watched your lecture, I stopped searching. You made the concept more understandable, step by step... Please keep uploading these kinds of concepts so that upcoming students can learn from you. Sir, it's my humble request...
Please make tutorials on PINN. Thank you
Sir please please complete this playlist, thank you so much for the videos uploaded till now...
Sir, I still don't understand one thing. The working you explained doesn't show how you're able to retain the context or semantic meaning; it only shows how you're able to maintain the order of words, i.e. which word comes before which, but not the importance of each word, how much impact each word has on the other words.
The semantics are retained because we follow this feedback approach: at every time step during unrolling, we use the output of the previous step as an input to the next step. Unlike an ANN, all the inputs don't go into the network together, so if we change the order of the inputs at the 1st step, the corresponding O1, O2, O3 will all change... thus maintaining the semantics of the original words you input.
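A quick NumPy check of the point above (random illustrative weights): feeding the same vectors in a different order produces a different final hidden state, which is exactly why an RNN is sensitive to word order while a plain ANN that sees all inputs at once is not.

```python
import numpy as np

np.random.seed(3)
n_in, n_hid = 4, 3
W_xh = np.random.randn(n_in, n_hid) * 0.5   # input -> hidden
W_hh = np.random.randn(n_hid, n_hid) * 0.5  # hidden -> hidden feedback

def final_state(x):
    h = np.zeros(n_hid)
    for x_t in x:  # each step's output is fed back into the next step
        h = np.tanh(x_t @ W_xh + h @ W_hh)
    return h

x = np.random.randn(3, n_in)      # e.g. the 3 word vectors of "movie was good"
h_fwd = final_state(x)
h_rev = final_state(x[::-1])      # same words, reversed order
print(np.allclose(h_fwd, h_rev))  # False: changing the order changes the state
```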
Pls upload next video sir
😭😭😭 how many thanks should we say to you ❤️❤️❤️❤️
@17:37, please let me know why the output of one neuron should go to another neuron in the same layer.
Thank You Sir.
Could you please update the ML Roadmap 2022 with the CNN and RNN videos?
Hi sir, when will you upload videos on transformers?
best ever explanation of RNN architecture. Thanks a ton
Simply the best. I have been recommending this channel to all
Thanks sir, for the simple explanation.
Hey Bhai, The way you go into depth is great. Whatever deep learning I know is because of you only.
Plus the hinglish hits different when learning new things. At least for me 😅
The best❤
Most beautiful explanation sir ❤❤❤❤
really great effort sir , please complete this playlist
best
Oh, complicated RNN, bhaiya ji has started it! So much happiness, so much happiness 🤡. Time series next 👀👀
super
Does anyone have complete notes for this playlist?
finished watching and coding
So humbled by your dedication, sir. Kudos for the great teaching!!
wonderful
great explanation..
💚💚💚💚
Good explanation! But how does the past information affect the future ones?
Thank u Sir for your great efforts.
Sir, when will we get the new video?
Sir, literally crying while typing this; I have tears in my eyes.
thanks a lot sir
Shall I lend you a shoulder?
Awesome. Simple and detailed explanation of RNN.
Nitish sir, amazing!!!!
god level content! simply amazing!
so beautiful and stay blessed
Badshah, hats off to you. You've nailed it.
Sir please provide these notes and code links.
Superb explanation!!!!
No need to rewatch (repeat the video) once it's interpreted this neatly.
Thanks
Are the weights the same across all time steps?
Sir very good video
Best explanation ever :)
Amazing explanation!
Thank you Nitish sir 💯
🙏
Sir your tutorials are like : watch it once and you are done with understanding concept.
later all we need is a revision for couple of times and done with the topic.
Thank you for everything 🙏
This man is a legend ❤🔥❤🔥
This is exactly what I needed to understand RNN better. Thanks!