Dude....your videos are FABULOUS!!
Keep going!! We need you!!
Bro, your content and you as well are AWESOME.
Liked and subbed, keep it up!
It's a pleasure to learn with your lesson!
Hi, glad they are helpful! It’s a pleasure to create these videos!
Bro, this is the simplest explanation among all I have seen.
You and Josh Starmer (StatQuest) totally demystify DNNs. Thanks!!!
Hey.. thats a big compliment. Thanks!
Straight forward and to the point! Good video
Thanks bro! You are simply a great teacher.
Hey, thank you so much!
Thank you, it's the best and clearest video I have watched 😍 You're an extremely handsome man!!!
Good stuff. No videos for a year? please keep uploading. Thank you
Awesome Video man!!!
Hey, great playlists! I can now say I really understand deep learning. Can you please make a video explaining the perceptron algorithm and its complexity, along with kernels?
I really liked it. I would also like for you to create a video about LSTMs and Transformers (from scratch).
Thanks for the suggestion :)
you're back....damnnnnn💥💥
Haha… Thanks! 😁
Very helpful. Please make a playlist for GAN and transformers like you made for CNN.
Too Good man....Thank u so much!
You’re welcome!
Train_x data is not working...
Thank you so much sir, very much helpful 🙂
Welcome!
@@MachineLearningWithJay please make a playlist on YOLO algorithm
Very decent explanation! Could you do the same for CNN?
thanks a lot bro your videos really helped me
Hey, I implemented backpropagation on a CNN. dL/dZ = dL/dA dot product with dA/dZ, according to your video. In my implementation, dL/dA and dA/dZ both have the shape (training size, image height, image width, channel size). If this is correct, how should we dot product them?
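A note on the question above: for an elementwise activation, dA/dZ has the same shape as dL/dA, so the chain rule here is an elementwise (Hadamard) product rather than a matrix dot product. A minimal NumPy sketch with hypothetical (batch, height, width, channels) shapes, assuming ReLU as the activation:

```python
import numpy as np

# Hypothetical shapes: (batch, height, width, channels).
rng = np.random.default_rng(0)
Z = rng.standard_normal((4, 8, 8, 3))      # pre-activations
dL_dA = rng.standard_normal((4, 8, 8, 3))  # upstream gradient, same shape

dA_dZ = (Z > 0).astype(Z.dtype)  # ReLU derivative, elementwise, same shape as Z
dL_dZ = dL_dA * dA_dZ            # elementwise product; shape is preserved

print(dL_dZ.shape)  # (4, 8, 8, 3)
```

A dot product would only appear when propagating through the weights themselves, not through the activation.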
I had a problem with the output in Spyder: it only shows one iteration, which is
iter:1 cost: 0.697567606727616 train_acc:0.65 test_acc:0.3
May I know what the error could possibly be?
I keep getting this error: ValueError: shapes (1,1) and (10,1000) not aligned: 1 (dim 1) != 10 (dim 0), in reference to this line: grads["dZ" + str(l)] = np.dot(parameters['W' + str(l+1)].T, grads["dZ" + str(l+1)]) * derivative_relu(forward_parameters['A' + str(l)]). Does anyone know why?
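A minimal sketch reproducing that ValueError (with the shapes from the error message, which are assumptions about the poster's setup): np.dot(W.T, dZ) needs W.T's column count to match dZ's row count, so an error like this usually means W or dZ for that layer was stored or indexed with the wrong layer's dimensions.

```python
import numpy as np

W = np.ones((1, 1))       # from the error: W.T has shape (1, 1)
dZ = np.ones((10, 1000))  # from the error: dZ has shape (10, 1000)

try:
    np.dot(W.T, dZ)       # inner dimensions 1 and 10 do not match
except ValueError as e:
    print(e)              # "shapes (1,1) and (10,1000) not aligned ..."

# With consistent shapes the same line works:
# W of shape (n_next, n_curr) = (10, 1), dZ of shape (n_next, m) = (10, 1000)
W_ok = np.ones((10, 1))
out = np.dot(W_ok.T, dZ)
print(out.shape)          # (1, 1000)
```

Checking parameters['W' + str(l+1)].shape against grads["dZ" + str(l+1)].shape at each layer is usually the quickest way to locate the mismatch.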
Hi, Mr. Jay Patel!
Thanks a lot for such a clear explanation!
Why don't you use the derivative of the sigmoid for the output layer (AL) during the backward pass?
Can we say that the weights of the last layer (WL) learn without passing the output error (AL-Y) back through the sigmoid?
If yes, why don't you and other guys use it?
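On the question above: for a sigmoid output with binary cross-entropy loss, the sigmoid derivative is not skipped, it is already folded in. dL/dA = -(Y/A - (1-Y)/(1-A)) and dA/dZ = A(1-A), and their product simplifies algebraically to A - Y. A quick numerical sanity check of that simplification:

```python
import numpy as np

rng = np.random.default_rng(1)
Z = rng.standard_normal((1, 5))                     # pre-activations of the last layer
Y = rng.integers(0, 2, size=(1, 5)).astype(float)   # binary labels

A = 1.0 / (1.0 + np.exp(-Z))              # sigmoid(Z)
dL_dA = -(Y / A - (1 - Y) / (1 - A))      # binary cross-entropy derivative w.r.t. A
dA_dZ = A * (1 - A)                       # sigmoid derivative

# The explicit chain-rule product equals the shortcut AL - Y:
print(np.allclose(dL_dA * dA_dZ, A - Y))  # True
```

So writing dZ_L = AL - Y is the same computation, just with the cancellation done on paper instead of in code.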
Is this video suitable for beginners? If not, recommend me what to watch before jumping into this.
keep it up✌️💯
Thanks Shubham 😄
Hi bro, try making videos on pre-trained Hugging Face transformers.
Thanks for your suggestion… will try to make a video on it
Great content! Could you do the same for RNN?
Hi, what is the name of your dataset?
Please make a video on COVID-19 detection using chest X Ray
Thanks bro
You made a small mistake while typing the code for derivative_tanh(x) function.
✔ The correct code will be :
def derivative_tanh(x):
return 1 - np.power(np.tanh(x), 2)
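The corrected function above can be sanity-checked against a central-difference numerical gradient, a standard way to verify any hand-written derivative:

```python
import numpy as np

def derivative_tanh(x):
    # d/dx tanh(x) = 1 - tanh(x)^2
    return 1 - np.power(np.tanh(x), 2)

x = np.linspace(-2, 2, 9)
eps = 1e-6
# Central difference: (f(x+eps) - f(x-eps)) / (2*eps) approximates f'(x)
numeric = (np.tanh(x + eps) - np.tanh(x - eps)) / (2 * eps)

print(np.allclose(derivative_tanh(x), numeric, atol=1e-8))  # True
```

If the two disagree beyond the tolerance, the analytic derivative has a bug.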
You're an SVNIT passout, right?
Hi Pranav. Yea, I am from SVNIT
copied from coursera
Haha.. yeah, I learned from that only. It's a very good source, tbh.
I made a video on YouTube because they don't have any video on it.