Hi Nitish, this is Abhinav. I accidentally discovered your channel one day, and since then I've been binge-watching your videos about machine learning. Thanks for keeping the spark alive in me with your beautiful explanations; you are the teacher I always wanted :)
There are no accidents - Master Oogway
I just wanted to take a moment to share something that means a lot to me. After a long journey filled with ups and downs, I recently landed a full-time off-campus offer as an ML Engineer at a startup, earning $1200/month. It might not seem like a huge milestone to everyone, but for me it's proof of what sincere effort can achieve. Couldn't have done this without you, and I am incredibly grateful. Thank you for believing in me!
@@susdoge3767 Can you share your ML journey and how you landed the job?
no one cares
Your way of teaching is easy to understand. Glad I came across this channel. All I can give you is a single like :)
Again a masterpiece, described so well.
It's a great YouTube channel for data science; the way you explain is awesome.
One little drawback: the pointer or cursor is not visible while you're explaining, so it's hard to tell where you are pointing.
Indeed a masterpiece, hats off.
Thank you for your hard work.
insanely good playlist
What can I say... I even learned the chain rule right here; I never got it in 12th grade 😀. You make things so easy, sir. Salute to you, sir 👏
Simply superb! Thanks a lot, Nitish bhai.
Wonderful explanation, Nitish bhai.
Thanks
Another amazing lecture. Thank you so much, sir!
Concept described so well.
If you see my comment, reply with something; it helps this guy get more subscribers. He deserves it; he is working hard.
Great explanation of multi-layer NNs, thanks.
Nice explanation. Even Dr No fails on this aspect.
Thanks a lot for all your hard work, sir
Sir, the bias of the last node of the neural network should be b31.
Correct me if I am wrong!
Btw, great video. Thank you so much!
Yeah... you are correct.
25:22 Sir, why do you thank us when we should be thanking you? Thank you so much, sir. You are a legend.
Really helpful. Thanks for the playlist.
Very good explanation.
You are most welcome, Nitish sir!
The background sound of birds and squirrels is nice. ❤
Already used every adjective to praise Nitish's teaching ability...
Good explanation. I wonder how complicated updating the weights must get in super complex architectures.
Very well explained
Sir, 10:24 would this example count as 4 layers? According to Andrew Ng's convention, the input layer is not counted when counting layers.
It's a 3-layer network.
Sir, you're better than Krish Naik.
Nitish sir is a gem; please don't compare him with anyone else.
Yes, Nitish sir deserves 1M subscribers within 1 year
@@nahidulislam5889 Let him do whatever he wants; what's it to you? Don't talk so much.
Nice cover. 😊
Amazing content and a very beautiful explanation.
Amazing playlist
Do we need to write the chain rule for the derivative of w[1]11 like this:
L -> yhat -> w[3]11 -> o21 -> w[2]11 -> o11 -> w[1]11?
Why are you not considering the weights in the chain rule while calculating the derivative of w[1]11?
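In this notation the chain rule is usually written through the activations only; the weights are not skipped, they show up inside the partial derivatives rather than as separate links in the chain. A sketch of this (my own reading, assuming o_ij denotes the output of node j in layer i and linear activations so the activation-derivative factors drop out; not necessarily the video's exact derivation):

$$
\frac{\partial L}{\partial w^{[1]}_{11}}
= \frac{\partial L}{\partial \hat{y}}
\cdot \frac{\partial \hat{y}}{\partial o_{21}}
\cdot \frac{\partial o_{21}}{\partial o_{11}}
\cdot \frac{\partial o_{11}}{\partial w^{[1]}_{11}},
\qquad
\frac{\partial \hat{y}}{\partial o_{21}} = w^{[3]}_{11},
\quad
\frac{\partial o_{21}}{\partial o_{11}} = w^{[2]}_{11}.
$$

So w[3]11 and w[2]11 do appear; they are exactly what those inner partial derivatives evaluate to.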
Best teacher in AI, ever.
Thank You Sir.
Very informative
Sir, can you please make a video on hypothesis testing? It seems like a very difficult concept.
I have watched many videos on this topic and am still blank; I couldn't understand where to use it! Please, Nitish, one video on this topic too!
@@rutvigohel5255 Watch Brandon Foltz; he has a complete playlist on this topic.
@@rutvigohel5255 One question: do we need hypothesis testing in a deep learning project?
@@carti8778 One question: do we need hypothesis testing in a deep learning project?
@@near_. Hypothesis testing is a generic topic; it's needed before starting any project in any domain, not just in data science. For example: when you step out of your house, you expect a road in front of you; that's a hypothesis.
One more quick suggestion: when you are explaining and pointing your pen at something, in a few places we can't see the pointer you are referring to, and we have to search for where you are actually pointing.
Good job; this is just feedback for your future videos.
Hi Nitish, there might be a mistake in the NN diagram: the labels O31 (yhat) and O23, and likewise b31 and b23, seem to be mixed up, which can be confusing. Could you please let me know if I am thinking wrong, or should it be O31 and not O23 for the output layer you are explaining during the memoization at 10:50? Thanks!
I think for the last node it should be b31 and O31...
Revising my concepts.
August 11, 2023
Thanks a lot, sir. Thanks a thousand times :)
The 26 letters of the alphabet aren't enough to praise Nitish sir's teaching style.
Sir, is memoization built into backpropagation when you code it, or should we implement the memoization explicitly in the code?
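If you write the network from scratch, the memoization is nothing more than caching every layer's output during the forward pass and reusing those cached values in the backward pass; if you use a framework such as Keras or PyTorch, that caching is done for you inside the autograd machinery, so you never write it explicitly. A minimal NumPy sketch of the from-scratch case (my own illustration with a made-up 2-2-1 sigmoid network and an MSE-style output error, not the code from the video):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weights, biases):
    # Forward pass that caches every layer's activation -- this list is the "memo".
    activations = [x]
    for W, b in zip(weights, biases):
        x = sigmoid(W @ x + b)
        activations.append(x)
    return activations

def backward(activations, weights, y):
    # Backward pass that reuses the cached activations instead of recomputing them.
    grads_W, grads_b = [], []
    y_hat = activations[-1]
    delta = (y_hat - y) * y_hat * (1 - y_hat)          # output error * sigmoid'
    for layer in reversed(range(len(weights))):
        grads_W.insert(0, np.outer(delta, activations[layer]))
        grads_b.insert(0, delta)
        if layer > 0:
            o = activations[layer]
            delta = (weights[layer].T @ delta) * o * (1 - o)
    return grads_W, grads_b

# Hypothetical 2-2-1 network, just to show the flow.
rng = np.random.default_rng(0)
weights = [rng.normal(size=(2, 2)), rng.normal(size=(1, 2))]
biases = [np.zeros(2), np.zeros(1)]
acts = forward(np.array([0.5, -0.2]), weights, biases)
gW, gb = backward(acts, weights, np.array([1.0]))
print([g.shape for g in gW])                           # [(2, 2), (1, 2)]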
Sir, which playlist is this video from? Please also add the playlist link in the description; it's convenient for us to follow. Thank you.
100 Days of Deep Learning: ruclips.net/p/PLKnIA16_RmvYuZauWaPlRTC54KxSNLtNn
Thank you very much. I hope your channel grows by leaps and bounds.
Sir, please also provide your handwritten notes
Thanks :)
completed
Is it possible to get the notes, please?
Thank you sir …
Good explanation, sir ji.
Sir, can you tell me, so is backpropagation last(
23:07
finished watching
while True:
    print("Thank you")
CFBR
Sir, can you please give the link to your website?
Amazing again
Ha, so deep learning is DP (dynamic programming) then.
I always wasted my time watching Code With Harry; this man is great.
Love from Pakistan❤❤❤❤❤
Great video sir
great content
Amazing
best
Sir, you should apply data science to your own channel too. One of my teachers applied a different technique and got 100k subscribers within 48 hours; he is a Kaggle Grandmaster from Pakistan. You should try this too. If you want the link to that video, I can give it to you as well; do check it out.
Just go ahead and share it, man.
@@campusx-official Here you go, sir.
ruclips.net/video/NaVMlzTk9Vo/видео.html
Once again, a god-level teacher ❤️ 🤌🏻