Check out our premium machine learning course with 2 Industry projects: codebasics.io/courses/machine-learning-for-data-science-beginners-to-advanced
Sir, can you kindly post the slides in the description so that we can understand the videos more clearly?
@@honeymilongton8401 no
@@narutoshippuden7024 I was looking for the same; did you figure out where to get that dataset from?
People are just wasting money on online courses; this man has given you an excellent tutorial.
I agree
One of the great playlist on deep learning.. You make it so simple
☺️👍
Agreed 100%.. taught really well
yes make it so simple, thank a lot
Completely agree
I watched more than 30 videos on this concept... none of them gave me confidence in it. Thanks, sir...
Deepak you can do it. I wish you all the best 👍☺️
My first comment on YouTube; what a great explanation! I spent more than one lakh to get a certification in AI/ML, but ultimately I am learning the concepts from you. Your teaching is like training where the weight (knowledge) improves but the loss (fee) is zero.
The best thing on YouTube for a data science learner is your channel. Thank you for making these videos, which are like holy scriptures for learners out here. I am blown away by the level of simplicity with which you make these daunting topics appear so easy to understand. You never disappoint in helping us grasp these concepts. Big thanks, Dhaval.
29:39 log loss function for each epoch (all rows at a time). Nice video.
This is one of the best explanation over gradient descent. Thanks a lot Sir
Totally deserve a lot of recognition since the concepts are explained so nicely and it gives a chance to code along too.
Hi @codebasics, where can I find the raw files that you are using? I am coding it alongside
Man, you just taught me so well.
I had trained a few models and tried some machine learning projects before watching your videos.
But the truth is I didn't really spend time on "really understanding" the math in machine learning.
Today I finally understand it. Thank you very much.
Great series; I promise I will finish as many of your machine learning videos as I can.
As a matter of fact, I would have written the same comment, but 陳翰儒 mentioned it before me.
Yarrrr, hats off to this man who gets into such details and implementations.
Hi Dhaval, This is a great explanation, can't thank you enough for the effort you are putting in to help the data science community.
This summed up a bachelor's degree of knowledge, thx
I have never seen such simplicity in teaching. Thanks 👍
you are the best teacher I have ever seen. Thank you very much
You have the best and intuitive explanation of concepts and code, EVER. Thanks!!
Glad it was helpful Vishal!
Superb explanation and practical session. Never seen anything like it on YouTube.
When I started deep learning I didn't understand the logic behind the formula Σ(weights*input + bias)... thank you, sir ❤️❤️
I just want to say I really really like the expression on your face when explaining something. Anyway, thank you for make thing so simple for beginner
You are simply superb, sir....
There are so many who know gradient descent, but very few like you can teach how gradient descent works... Hats off, sir.. Thank you so much..... Thanks a lot....
Your teaching skills are awesome. Thank You for making these great tutorials
Best lecture I have had up till now in my data science course, tysm sir 💗
You've opened my mind with this video. Very helpful for deep learning understanding. Thank you from Argentina.
☺️🙏
Excellent teacher!!! Great scientist!!! Very nice video... the best on YouTube related to machine learning.
You make it quite easy to understand
Hats off! for the Explanation ,
Hats off! for the Effort ,
Hats off! for the Content ,
Thanks a lot Sir : )
The way of teaching is awesome. Great explanation. Your effort is appreciated
Haha, at first I was not gonna watch because it was 40 mins, but you just made it so engaging. I loved how you explained it and compared it in Python.
I am from Bangladesh. Great explanation with a combination of math & code, which is what I really wanted.
Great explanation.... never thought I'd understand gradient descent so well, and that too with code.... you gave a very granular explanation of how we can write Python code to get the same output as the model.... that's awesome... you are building confidence in many people... great work.. Thanks a lot...
It is the best course I have ever seen. Thank you very much!
Hi, great class. Congratulations from Brazil-Teresina-PI
glad you liked it Ocean
@@codebasics how did TensorFlow know that it has to stop at loss 0.4631?
@@SudhanshuKumar-lp1nr that value is variable; if you increase the epochs, the loss will dip even more.
@@subhamsekharpradhan297 thanks
Loved the way you implemented gradient descent yourself; I started feeling ML :-)
Glad to hear that!
The way of explaining is just awesome, I say. You explain topics like deep learning in such an engaging way that I never get enough of watching your videos.
Your presentations are superb, making everything easy for us to understand and retain for a longer time.
Thanks for your kind words. If you found this useful, please share it with your friends through LinkedIn, WhatsApp or Facebook; that way this content can reach more people who are studying this topic.
@@codebasics sure sir,
wow Dhaval sir,
you can turn anybody into a potential data scientist
Thank you so much bhaiya for helping... I was feeling so lost before I stumbled onto your playlist
Wow This was amazing
Thank You Dhaval Patel ji 🙏❤
Hi, I have been studying your tutorial series for the last 15 days. It is super and is helping me a lot in learning ML and DL. Thank you very much for this excellent teaching and presentation with exercises. I hope your health is doing well. Praying to God for your speedy recovery...
Hey Uday thanks for your prayers my friend. Yes my health is improving steadily 😊👍
Genius trainer❤❤ You are gifted brother 😊
Simply superb. No more questions. Your videos are like medicine with no side effects.
👍😊😊🙏
Very nice explanation.
I was always confused by gradient descent, but now I understand it well. 👍 👍
Thanks
I am happy this was helpful to you.
@@codebasics please share dataset link
Excellent videos! They are very detailed. Thank you for sharing your knowledge.
Your detailed explanation helped me understand how neural nets actually work. Thank you for such good content; it helped a lot.
Clear explanation. You made it so simple to grasp. Kudos
👍☺️
Excellent, sir.. the way of your teaching is very good.. I got it easily.. thank you infinite times, sir
Nice, the initial iteration using one value at a time is very good. Loved it.
Nice lucid explanation with simple codes. Great effort. Thank you so much.
Your explanations are simple and in-depth.. thank you
Crazy, excellent playlist going! I love it
Sir, hats off; this is complex yet seems so simple as you explain it. Thanks won't be enough.
Great explanation. A wise man once said that real intelligence is making difficult things simple. You are an example, sir. Thank you very much.
where did you get the dataset?
Every single second of this video is crucial.
Big fan of yours. What an explanation, man!!!!! Kudos.. keep it up.
The way of teaching is awesome
I am happy this was helpful to you.
Thumbs up, you proved that a neural network is not a “Black Box”
Excellent explanation.. a complex concept made so simple
Sir.. great tutorials... I just hope you also start giving certificates to people who finish watching these videos :)
Favorite videos for learning very complicated topic!
Glad you think so!
thank you for such a simple explanation of this gradient descent for neural network.
☺️👍
Actually, your course is one of the best deep learning tutorials ever. love it. How can I download the data insurance for training in my own jupyter notebook?
From the GitHub link in the description.
Sir, your tutorials are too good.
Excellent. Simply awesome!
hello sir,
This is Krishvin. In 'def gradient_descent(age, affordability, y_true, epochs, loss_thresold):', the 'print(f'Epoch:{i}, w1:{w1}, w2:{w2}, bias:{bias}, loss:{loss}')' comes after w1, w2, and bias have been changed, so for each epoch the *next* weights get printed. This is why we don't see w1=1, w2=1, b=0 for the first epoch; the printed weights are shifted by one update. I'm a huge fan of your tutorials; you teach complex things in a simple way, which makes me really curious to learn more....... Thank you, sir.
Thank you. Seeing your video, even a 10th grader can understand deep learning.
🤓🤓👍
Nicely done! I am following your tutorials since the beginning, and they are fantastic! Eager to check your next video.
☺️👍
An awesome person you are...cheers from Nigeria
👍😊
Beautiful explanations.
Thank you.
i just cannot thank you enough. Blessings
Your explanation is very, very nice, sir. It helped me a lot. Thank you so much, sir.
For real, this is great video and great effort.
I have a small question: why did you use np.transpose(age)?
age is a 1-D array, so I think the transpose is unnecessary in this case.
We can just do this:
diff = y_predicted-y_true
w1d = np.mean(age*diff)
w2d = np.mean(affordability*diff)
as we did with bias:
bias_d = np.mean(y_predicted-y_true)
Thank you very much.
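For what it's worth, the commenter is right: np.transpose is a no-op on a 1-D array, so the two forms compute the same derivative. A quick check with made-up values:

```python
import numpy as np

age = np.array([0.22, 0.25, 0.47])       # 1-D feature array, as in the tutorial
diff = np.array([0.1, -0.2, 0.3])        # hypothetical y_predicted - y_true

# Transposing a 1-D array returns it unchanged, so both expressions agree
w1d_with_transpose = np.mean(np.transpose(age) * diff)
w1d_without = np.mean(age * diff)

print(np.array_equal(np.transpose(age), age))        # True
print(w1d_with_transpose == w1d_without)             # True
```

The transpose only starts to matter once the features become a 2-D matrix (multiple columns), where X.T is needed to align shapes for matrix multiplication.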
Thank you so much, I enjoyed this a lot.
You are simply great. Thank you
Great explaination Sir !
thank you for the video, it gave a deep level of insight into the topic as well as practice
Glad it was helpful!
Sir, You are doing a great job.....deep respect !!!
You're the greatest. Thank you for these.
Glad you like them!
Perfect. You are a genius, sir.
Superb explanation 👌🏻👌🏻👌🏻
Next-level explanation, awesome bro.
Superb explanation once again Thanks a lot.
Glad you liked it
Sir, please upload the dataset. That video really helps me a lot.
🎯 Key points for quick navigation:
00:00 *🧠 Introduction to Gradient Descent*
- Understanding the importance of gradient descent in machine learning
- Overview of the theory behind gradient descent
- The relevance of gradient descent for data scientists and machine learning engineers
03:00 *📊 Predictive Functions in Machine Learning*
- Explanation of prediction functions in supervised learning
- Definition and role of weights and biases in prediction functions
- Importance of establishing a prediction function for supervised learning techniques
05:43 *📈 Training a Neural Network with Gradient Descent*
- Walkthrough of training a neural network using gradient descent
- Explanation of forward pass and error calculation during training
- Importance of adjusting weights and biases using derivative and learning rate for training optimization
17:20 *💻 Implementing Gradient Descent in Python*
- Demonstration of using TensorFlow to build a simple neural network for prediction
- Explanation of model compilation and fitting in TensorFlow
- Interpretation of model accuracy evaluation and prediction results
26:22 *🧠 Overview of prediction function and calculation*
- Explanation of the prediction function combining weights, age, affordability, and bias.
29:00 *📉 Setting up helper methods and implementing gradient descent*
- Introduction to helper methods like log loss and numpy sigmoid function.
- Description of implementing the gradient descent function in Python from scratch with specified epochs, initialization, and learning rate.
33:15 *🔄 Calculating derivatives for updating weights in gradient descent*
- Demonstration of calculating derivatives for weights, affordability, and bias in the context of updating weights during gradient descent.
Made with HARPA AI
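The helper methods mentioned in the 29:00 chapter can be sketched along these lines (a rough reconstruction of what the video describes, not its exact code; the epsilon clipping is there so np.log never receives an exact 0):

```python
import numpy as np

def sigmoid_numpy(x):
    """Element-wise sigmoid, squashing any real input into (0, 1)."""
    return 1 / (1 + np.exp(-x))

def log_loss(y_true, y_predicted):
    """Binary cross-entropy (log loss), averaged over all samples."""
    eps = 1e-15
    # Clip predictions away from exact 0 and 1 to keep np.log finite
    y_predicted = np.clip(y_predicted, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_predicted)
                    + (1 - y_true) * np.log(1 - y_predicted))
```

Here sigmoid_numpy(0) is exactly 0.5, and the loss of a near-perfect prediction (0.99 for a true label of 1) is close to 0, matching the behavior discussed in the video.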
Sir,
Videos are good with all basic information.
If possible, can you make a combined video on the math concepts related to AI-ML-DNN-NLP?
Hi,
I must admit that I really enjoyed this video. However, I have a question about the math analysis of the error/log loss (binary cross-entropy). If we have a predicted value of y_predict = 0.99 and an actual value of y_label = 1, applying the log loss equation according to your video gives 0.01. My calculation applying the formula does not give me 0.01. Can you throw light on this? Maybe my math sucks 😃 or maybe you just used a random value for explanation purposes. Your response will help me assimilate the procedure better. Thanks again for your hard work. I look forward to hearing from you.
Regards
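For reference on the question above, the numbers do work out if the natural logarithm is used: with y_label = 1 the binary cross-entropy reduces to -ln(y_predict), and -ln(0.99) ≈ 0.01005, which rounds to the 0.01 quoted in the video. A two-line check:

```python
import math

y_label, y_predict = 1, 0.99

# Binary cross-entropy: -[y*ln(p) + (1-y)*ln(1-p)]; the second term vanishes for y=1
loss = -(y_label * math.log(y_predict) + (1 - y_label) * math.log(1 - y_predict))
print(loss)  # ≈ 0.01005
```

With log base 10 instead, -log10(0.99) ≈ 0.0044, which would not match, so the video's figure is consistent with the natural log.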
This lecture was very in-depth and nicely explained. Just one question: how do we implement gradient descent with a PyTorch or TensorFlow model? Here you built gradient descent from scratch, but building this by hand for almost 100 inputs is difficult, so can we use something like this with an already-built machine learning model?
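On the "almost 100 inputs" concern above: the per-weight updates from the video generalize to any number of features once the weights live in a single vector and the derivatives are computed with matrix operations; this is essentially what TensorFlow and PyTorch optimizers do for you under the hood. A minimal numpy sketch with synthetic data (illustrative only, not the video's code):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def gradient_descent_vectorized(X, y, epochs=1000, lr=0.1):
    """Gradient descent for logistic regression with any number of features.
    X has shape (n_samples, n_features); w holds one weight per feature."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    bias = 0.0
    for _ in range(epochs):
        y_pred = sigmoid(X @ w + bias)
        diff = y_pred - y
        w -= lr * (X.T @ diff) / n_samples   # one line updates all the weights
        bias -= lr * np.mean(diff)
    return w, bias

# The same code handles 2 features or 100 without modification
rng = np.random.default_rng(0)
X = rng.random((50, 100))                   # 50 samples, 100 features
y = (X[:, 0] > 0.5).astype(float)           # synthetic labels driven by feature 0
w, bias = gradient_descent_vectorized(X, y)
```

In Keras the equivalent would be compiling the model with the built-in SGD optimizer instead of writing the loop yourself; the framework then computes and applies all the per-weight gradients automatically.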
Great explanation, sir. Very good content. Thank you so much, sir.
Glad you enjoyed it
beautiful explanation
4.31 is to be pronounced as "four point three one", not "four point thirty-one". Just a suggestion for improvement. I am using this series to revise knowledge acquired from another reputed source, so I can join many others in stating that your style of teaching is very good.
Thanks Sowmyan for the feedback 🙏🏼 I will keep this in mind and wish you all the best!
Great video, a lot of concepts to go through, thanks a lot!
Very nice explanation. Do you have any real-life example of a multi-layer neural network?
Thank u very much from all my heart ♥️
Wow, great tutorial. I have learned a lot. Thanks.
Great Explanation. I love it. Thank You
Glad you enjoyed it!
Great job. Really useful
Hi,
I need a small clarification. One of the samples has Age 25, affordability 0, bought_insurance 1 (without affordability he bought the insurance; that's fine).
But our model predicts the Age 26, affordability 0 sample from X_test as 0 (meaning insurance not bought). I am a bit confused. Can you please clarify why the two behave in exactly opposite ways?
Meanwhile, I am learning ML and AI from this channel. Your explanations are too good. Thanks for the dedication.
It is not necessary that it will always predict correctly. An ML model generates a prediction function that can *generalize most of the cases* it has seen in the training dataset. It is OK to have errors; the idea is that on the *majority* of samples in the test dataset it will predict correctly (this is measured by the *accuracy* metric, or model.score()).
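The *accuracy* measure mentioned in the reply above is simply the fraction of test samples predicted correctly; a tiny sketch with hypothetical labels and predictions:

```python
import numpy as np

y_true = np.array([1, 0, 1, 1, 0, 1])   # hypothetical test labels
y_pred = np.array([1, 0, 1, 0, 0, 1])   # hypothetical model predictions

# Accuracy = correct predictions / total predictions; one miss out of six here
accuracy = np.mean(y_pred == y_true)
print(accuracy)
```

model.score() in scikit-learn and the 'accuracy' metric in Keras report this same fraction for classification tasks.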
Amazing playlist, sir.... I request you to teach how to implement Levenberg-Marquardt using Keras... Please, sir..
Very well explained, sir. Thank you so much.
I am happy this was helpful to you.
40:28 Awesome🙌
It's simple. Thank you sir.
Sir nicely explained
Mix this series with Andrew Ng's course and Jay Alammar's blogs, and brudda, you are set for an ML job. Trust my word.