Cancelled Netflix and Hotstar subscriptions. Binge-watching CampusX these days! A magnificent 100/10 rating!
Me too 😂 Sir is really an amazing teacher. He speaks with such clarity and in such an interesting way that it's no less than a movie 😁 May God bless him for his selfless work 🥰
Exactly 😊 @DebjaniMajumder-q5p
😅😅😅
Me too
Sir, just complete the Deep Learning series. The way you teach is magnificent; eventually people will find this treasure. Just keep doing the good work.
Yes bro, but I am stuck on the NLP playlist. I have been following bhaiya's videos for the last year, and I don't understand from other teachers' teaching; that's why I commented.
Started with a small doubt about the hard margin problem in SVM and watched the whole ML playlist at 2x in 2 days. It connected all the missing dots and missed concepts like a story.
I am pursuing an MS in Data Science and still had never found such a good connection between deep learning and ML, or such a concise explanation. Hats off. 🙌🙌
Day 4 and I am in the middle of revising deep learning concepts with such good explanations.
Bro, how did you complete the whole ML playlist at 2x in just 2 days 🤔
I was not able to understand this from my professors at fancy universities in the UK. A big salute to you, Sir.
Wow, well done.
He's got money, bro.
He is the best teacher. His way of teaching is outstanding and better than at any university in Canada.
Sir, your 1-hour video feels like 10 minutes, and it's as satisfying as watching an interesting web series. Thanks a lot, the best teacher 🙏
Niteshji... I am speechless trying to appreciate your efforts for us...
I never thought that even in higher-level studies we could find such good teachers...
Your work is awesome... and unique...
Nitish Sir, can't thank you enough for creating this masterpiece of a playlist for free. It's the best resource. Will always be indebted to you.
I am very happy to see such a detailed derivation. At 41:25 it should be O12; by typo you kept O21. Please have a look.
Yeah, I agree with you.
I was so scared to learn backprop, but this video makes it so easy and is so good. Excellent! 1000/10
There are no words, such a brilliant explanation.. a lot of thanks, sir.
Sir, what are you made of?? Inch by inch, point by point, you explained this concept just like baby steps... Sir, in my entire 30+ years of learning journey I have never seen such a teacher / mentor... I mean, YOU ARE GOD!! :-)
You must have become old by now.
He is the best teacher. His way of teaching is outstanding and better than at any university in Pakistan.
Never seen such a simple explanation of backpropagation. You are an exceptionally gifted teacher. Thank you for your hard work and clear teaching, Nitish Sir.
The way you explained the whole thing is literally awesome! With this video I am sure every student will understand backpropagation clearly and in an insightful way.
Watched your videos before my ML exam and my exam went so well, Alhamdulillah. Can't thank you enough 😊 A true gem in ML.
I have seen all the YouTube channels for backpropagation, but your explanation is the best.
Rather than saying this is one of the best channels I have come across, I'd say this one is the best ❤
Lectures are just great, I mean just great... At 41:25 and 42:02, in place of O21 there should be O12. Btw, you can ignore this small thing; I was working it out on my own, which is how I found it and wanted to comment something, and finding even this little error is so hard because your teaching is just mind-blowing. Thank you for being there and happy days, I will continue... Sayonara 💃💃💃💃
Yes, I also caught this.
Was searching for this comment.
You are a billion times better than my university professors.
love you Nitish sir from Pakistan😍
From machine learning to deep learning,
great and fully detailed stuff.
You are the BEST teacher on YouTube.
Superlative explanation. I have gone through many videos on backprop; this is the best explanation by far. Thanks Nitish ji.
See how things like derivatives and partial derivatives work in practice and solve real-world problems. This man shows us by explaining them so easily. A big like to you, sir. ❤
Best explanation of backpropagation so far; a truly good way of teaching.
One of the finest videos; any rating would be too little. Thank you so much.
Best Data science teacher in the world. Period!!
CampusX is an Addiction🙏😍.
I found your video during my final-year project, sir. Since then I have been following and learning from you. Still many things to learn.
The greatest explanation of backpropagation and the chain rule. Thanks a lot.
Excellent Sir Excellent!
Currently I am pursuing my PhD and was here for a quick recap, and Sir you made my day. Keep on creating such wonderful content.
Big fan of this deep learning series and the way you explain. Please upload daily in this series so I can finally get an internship 😂😂😂
Very nice explanation. Thank God for gifting us such a good teacher.
I have no words for you, sir!!! Amazing. Love from Pakistan.
25:55 What is the reason behind finding the gradient?
48:30 What is the backpropagation algorithm?
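For anyone else pausing at these timestamps, a rough sketch of the answer (my own summary in generic notation, not a quote from the video): the gradient tells gradient descent how much, and in which direction, to nudge each parameter to reduce the loss L, and backpropagation is simply the chain rule applied layer by layer to compute those gradients efficiently, starting from the output.
w \leftarrow w - \eta \frac{\partial L}{\partial w}, \qquad b \leftarrow b - \eta \frac{\partial L}{\partial b}
Here \eta is the learning rate; backpropagation supplies \partial L / \partial w for every weight and bias.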
Best & clear compilation of all the important loss functions. Thank you for this video.
Absolutely amazed by your teachings!!
Your lecture is not boring; it's like a story.. ❤❤❤❤❤
@28:00 -- Chain Rule Of Differentiation (Computing Gradient of the Loss Function)
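To make this timestamp note concrete, a minimal sketch of the chain rule as it shows up in backpropagation, assuming the notation used in these comments (W1_11 is a first-layer weight feeding the hidden output O11, which in turn feeds the prediction ŷ; the exact symbols are my assumption, not taken from the video):
\frac{\partial L}{\partial W^{1}_{11}} = \frac{\partial L}{\partial \hat{y}} \cdot \frac{\partial \hat{y}}{\partial O_{11}} \cdot \frac{\partial O_{11}}{\partial W^{1}_{11}}
Each factor is a simple local derivative; multiplying them chains the effect of that weight through the network all the way to the loss.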
Just one word, Fantastic! Thanks a lot!
Mind-blowing, now everything is clear in backpropagation.
Among the most amazing video parts! Touching the neurons of DL.
Very well explained. I am from a Java background and have understood the Deep Learning concepts very well so far.
Best video on backpropagation, sir. Thanks to you.
Amazed after seeing this video
I like your teaching method.. thank you sir.. you explain in the very best way.... you give an answer to every "why" question and I like it.. thanks again!!
Thank you so much sir for such clear explanation, the way you explained backpropagation is literally very awesome.
Wow... really sir, hats off to you... what an explanation ❤❤❤❤❤❤❤
That's the reason why I am following Nitish sir.
Thanks for this much clarity! Very, very grateful for your efforts.
your explanation is amazing, sir 🖤
You are a legend, sir. Seriously, you made things so easy to understand.. Hats off and thanks for all the knowledge.
Bro. Just Incredible.😍
Thanks, sir, for this excellent content. Now I understand backpropagation.
This man takes you deep into every topic. Hats off 🎉🎉🎉
My good luck! I finally found the gold mine ❤
There is a mistake at 42:02: it should be O12 instead of O21 in the ŷ equation.
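For context, a sketch of the corrected output equation under the notation used in these comments (O11 and O12 are the two hidden-layer outputs, and W2_11, W2_12, b21 are the output neuron's parameters; treat these symbol names as assumptions):
\hat{y} = O_{11} W^{2}_{11} + O_{12} W^{2}_{12} + b_{21}
That is, the second hidden output is O12, not O21.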
One of the best teachers🙏🙏
Thanks for the wonderful explanation!
You made it a blast, sir. My heart is absolutely delighted 😊
Awesome explanation.. I don't think anyone else can explain this so perfectly. Hats off.
Amazing video. Nice way of explaining. Thanks a lot, sir, for this amazing video.
Great, hats off to you Sir, your teaching style is really awesome 🙏🙂
Sir, even after having studied all this before, studying from you has its own joy. Thanks a lot for this great content.
Also, a request: please make some videos in the future on transformers using Hugging Face. I really need this. Thanks.
You are going to be famous soon, remember my words. The way you explain things takes so much effort ❤
One of the best explanations, sir.
Now that's what you call a lecture from first principles.
I'm at IIT, yet I still watch sir's videos to understand the concepts.... the rest, you're smart enough to figure out.
53:57
I hope I was able to explain it ❌
I hope your brain was able to understand it ✅
What an amazing explanation 🫡👏🏻
Thank you so much sir 🙏🙏🙏
Sir, nice explanation of backpropagation, but I request that you please upload videos regularly.
What an explanation!!!!!!!!!!!!!
YouTube's best video, even better than Andrew Ng.
You are exceptional, Great teaching skills. Thanks sir
Great explanation. Thank you so much for this. Keep up the great work.
amazing explanation for backprop , thanks sir 🙌🏻
Well explained sir 🙏🏻
No one has ever explained backpropagation like you 🔥
Bhaiya, two requests: the first is "Our Community" and the second one is the "NLP Playlist".
Best explanation ever. Thanks a thousand times for your effort in making such a complex thing clear to us in such an easy way :)
Just awesome sir🎉❤
What an explanation.
Thanks a lot 😀
Niteshji, I'm heartily grateful and thankful to you for teaching us. While teaching the derivatives used to find the update values of the weights & biases, if you had used real-valued examples for the calculation, it would have been even more beneficial for our understanding.
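As an illustration of what such a real-valued calculation could look like (the numbers, the squared-error loss, and the linear output are all a made-up example, not values from the video): suppose y = 1, ŷ = 0.5, O11 = 0.8, learning rate η = 0.1, and L = ½(y − ŷ)². Then
\frac{\partial L}{\partial W^{2}_{11}} = \frac{\partial L}{\partial \hat{y}} \cdot \frac{\partial \hat{y}}{\partial W^{2}_{11}} = -(y - \hat{y}) \cdot O_{11} = -0.5 \times 0.8 = -0.4
so the update W2_11 ← W2_11 − η · (−0.4) increases the weight by 0.04, e.g. from 0.30 to 0.34.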
Incredible!!!!!!!Hats Off to you...
Thanks sir for the simple and wonderful explanation.
Thank you so much for explanation sir 🙏
unparalleled explanation. Amazing
Thank you sir 😇
Thank you so much sir 😊❤
Guys, please watch the gradient descent video first because it is very important.
Damn, that was too easy.. you made it look so simple that anyone can understand. Again, thank you so much for your hard work.
Very good tutorial. Thank you very much for such a good tutorial.
huge effort. amazing
At 33:36, shouldn't ∂ŷ/∂b21 = 0, since the derivative of a constant is zero?
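A possible resolution, sketched under the assumption (taken from the surrounding comments) that the output is ŷ = O11·W2_11 + O12·W2_12 + b21: when differentiating with respect to b21, the weight terms are constants and contribute 0, but b21 itself is the variable of differentiation, so its derivative is 1 rather than 0:
\frac{\partial \hat{y}}{\partial b_{21}} = \frac{\partial}{\partial b_{21}} \left( O_{11} W^{2}_{11} + O_{12} W^{2}_{12} + b_{21} \right) = 0 + 0 + 1 = 1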
Thank you sir you explained in detail 🔥🔥🔥
Whattttaa great explanation, thank you Sir! 🙏❤
Sir! you are great!!!
@CampusX
I have a question regarding the calculation at 43:16 in your video. When determining the derivative of O11 with respect to W1_11 , you seem to be using the pre-activation value. However, shouldn't O11 be the post-activation output (i.e., after applying the activation function)? Thank you very much for your efforts.
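To make the distinction in this question concrete, a small sketch in assumed notation (x1 and x2 are the inputs feeding this hidden neuron; the weight name W1_21 and bias b11 are hypothetical): write the pre-activation as z11 = x1·W1_11 + x2·W1_21 + b11 and the post-activation output as O11 = f(z11). Then
\frac{\partial O_{11}}{\partial W^{1}_{11}} = f'(z_{11}) \cdot \frac{\partial z_{11}}{\partial W^{1}_{11}} = f'(z_{11}) \cdot x_{1}
which reduces to just x1 when f is the identity (a linear activation), in which case using the pre-activation value gives the same result.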
Legend of Deep learning
Amazing!!🤯🤯
Very well explained.