Machine Learning - Bias and Variance In-Depth Intuition | Overfitting & Underfitting
- Published: 26 Jun 2024
- In statistics and machine learning, the bias-variance tradeoff is the property of a set of predictive models whereby models with lower bias in parameter estimation have higher variance of the parameter estimates across samples, and vice versa.
Please join my channel as a member to get additional benefits like Data Science materials, live streams for members, and much more
/ @krishnaik06
Please do subscribe to my other channel too
/ @krishnaikhindi
If you want to give a donation to support my channel, below is the GPay id
GPay: krishnaik06@okicici
Connect with me here:
Twitter: / krishnaik06
Facebook: / krishnaik06
Instagram: / krishnaik06
Good tutorial. My thoughts below (hope it adds to someone's understanding):
We perform cross-validation to make sure that the model has a good accuracy rate and can be used for prediction on unseen/new or test data. To do so, we split our dataset properly into train and test data, for example 80% for training and 20% for testing the model. This can be performed using train_test_split or K-fold cross-validation (K-fold is mostly used to avoid underfitting and overfitting problems).
A model is considered a good model when it gives high accuracy on training as well as testing data. Good accuracy on test data means the model will have good accuracy when making predictions on new or unseen data, for example data which is not included in the training set.
Good accuracy also means that the values predicted by the model will be very close to the actual values.
Bias will be low and variance will be high when the model performs well on the training data but performs badly or poorly on the test data. High variance means the model cannot generalize to new or unseen data. (This is the case of overfitting.)
If the model performs poorly (meaning it is less accurate and cannot generalize) on both training data and test data, it has high bias; in the classic framing its variance is low, since such a simple model's predictions barely change across datasets. (This is the case of underfitting.)
If the model performs well on both training and test data (performing well meaning predictions are close to actual values for unseen data, so accuracy will be high), then bias will be low and variance will also be low.
The best model must have low bias (low error rate on training data) and low variance (it can generalize and has a low error rate on new or test data).
(This is the case of the best-fit model,) so always aim for low bias and low variance in your models.
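The split-and-compare idea above can be sketched in a few lines of numpy. (A toy illustration I made up, not from the video: the quadratic data, noise level, and polynomial degrees are all arbitrary choices.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: quadratic signal plus noise.
x = np.linspace(-1, 1, 40)
y = x**2 + rng.normal(scale=0.1, size=x.size)

# 80/20 split, as described above.
idx = rng.permutation(x.size)
train, test = idx[:32], idx[32:]

def mse(degree):
    """Fit a polynomial on the training split; report train and test MSE."""
    coeffs = np.polyfit(x[train], y[train], degree)
    pred = np.polyval(coeffs, x)
    return (np.mean((pred[train] - y[train]) ** 2),
            np.mean((pred[test] - y[test]) ** 2))

simple_train, simple_test = mse(1)  # underfits: both errors high
good_train, good_test = mse(2)      # matches the signal: both errors low
flex_train, flex_test = mse(9)      # overfits: tiny train error, worse test error
```

A large gap between train and test error signals overfitting; large errors on both signal underfitting. That is exactly the diagnosis the comment describes.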
Wonderful summary!
You should probably create articles coz you are good at summarising concepts!
If you have one please do share!
Great
Very well written 👍🏻
Thanks for sharing
👍🏻 Consider writing blogs
Really very nice and well written. After watching the video, if we go through your summary, it's a stamp on our brains. Thanks to both of you for your efforts.
This video needs to be watched again and again. Machine learning is nothing but a proper understanding of overfitting and underfitting. Watching it the second time. Thanks Krish
Agreed!
This is what they asked me in OLA interview. And the interviewer covered great depth on this topic only. It's pretty fundamental to ML. Sad to report they rejected me though.
@@batman9937 hi man, please help by sharing what other questions they asked.
@@ashishbomble8547 buy the book: Ace the Data Science Interview by Kevin Huo and Nick Singh.
This was my biggest doubt and you clarified it in so easy terms. Thank you so much Krish.
For XGBoost the answer can't be simple, but what happens is: when dealing with high bias, do better feature engineering and decrease regularization. So in XGBoost we increase the depth of each tree and use other techniques to handle it and minimize the loss. You can conclude that if proper parameters are defined (including regularization etc.), it will yield low bias and low variance.
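The regularization knob described above behaves the same way outside XGBoost too. Here is a tiny ridge-regression sketch in plain numpy (illustrative data and lambda values, not XGBoost itself): turning regularization up shrinks the weights and raises training error (more bias), while turning it down lets the model fit the training data more closely (less bias, more variance on noisy data).

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative regression problem with known weights.
X = rng.normal(size=(50, 5))
true_w = np.array([2.0, -1.0, 0.5, 0.0, 3.0])
y = X @ true_w + rng.normal(scale=0.3, size=50)

def ridge_fit(lam):
    """Closed-form ridge solution: (X'X + lam*I)^-1 X'y."""
    w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
    train_mse = np.mean((X @ w - y) ** 2)
    return w, train_mse

w_low, err_low = ridge_fit(0.01)    # weak regularization: fits closely
w_high, err_high = ridge_fit(100.0) # strong regularization: shrunk weights, higher train error
```

The same trade-off is what XGBoost's depth and regularization parameters control, just with trees instead of linear weights.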
Hi Krish, thanks for the explanation... at 6:02 it should be high bias and low variance in the case of underfitting.
Yes exactly, I was looking for this comment.
Amazing video by Krish. Thanks for pointing this out. @Krish Naik please make a note of this.
yess!!!
yess
Exactly! I searched for this comment :)
Can't express my gratitude enough ! Thank you for explaining it so well
You can't get a clearer explanation than this, hats off mate
The best explanation among the whole youtube channels 👏. I love the way how you always keep things simple. Glad to find out about your channel, sir.
Providing this info makes you a great teacher... the way you explain, everything goes straight into the brain.
What an excellent explanation on bias and variance. I finally understood both terms. Thank you so much for the video and keep up the good work!
This guy is really great...Thank you so much for effort you put for us.
krish sir i hope God bless you with whole heart you are doing great job and thanks for the INEURON it made my life easy.
I had been trying to understand this concept for a long time... but never knew it was this simple 😀 Thank you Krish for this amazingly simple explanation.
Very good. Revised my concepts perfectly 🔥🔥
I really love his in-depth intuition videos ... compared to his plethora of videos!
at 6:10 you made it all clear to me in just 2 lines!! Thank you for this video :)
One video, all clear content... thanks bro, it was really a nice session. You really belong to the low-bias and low-variance category of humans. Keep posting such clear ML videos.
Great I learnt by watching your entire playlist.
I really was in great need of such an excellent explanation of Bias and variance. great help!
You nailed it man ! Great work ! Respect your time and effort!
Brother, you are spot on. What I couldn't easily understand after paying 2.80 lakh in fees, you explained in 16 minutes. Kudos, amazing work dear, all the very best.
Thank you for a detailed explanation of bias and variance. Great teaching!!
Beautifully explained. My concepts are now clear on overfitting and underfitting models. 👍 Thanks 🍻
This is an awesome video - was fully confused earlier - this video made it all clear !! Thanks a lot sir !!
Sir, after watching this video my confusion between bias and variance got cleared in one go. Awesome explanation.
After watching this video my doubt is clear; this really helps. Thanks for giving your precious time...
Thanks for revising these important concepts
Thanks for this. Amazing explanation.
One of the best explanations of Bias and Variance w.r.t. overfitting and underfitting...
Very important discussion on important words in ML. Thanks. Easy explanation on hard words.
Awesome video. It explained many concepts significantly.
Krish, you are a master in statistics and machine learning
You make one of the best tech videos on youtube !!!!
Thank you , This video is really helpful to understand the Bias and Variance concepts
Thank you for good explanation of bias & variance..❤️
Awesome video, thank you so much for these wonderful explanations, they are much needed!
You are really great sir... your explanation is crystal clear.
You explained it really well!! Thank you!
Thanks Krish, I had scoured the net, but this understanding was great. Good memory hook! Thanks for this.
The most clear and precise information 🎉 thank you sir❤
Thank you sooo much for making it so easy to understand.
Please give a video on some mathematical terminology like gradient descent etc. You are really doing a great job.
Thank you, this video cleared all my doubts :)
Superbbb explained..it connected my dots. Thank u
Brilliantly explained !! Thank you !!
An ultimate discussion, and an ultimate person who led it.
Very succinct explanation of the very fundamental ML concept. Thank you for the video!
Very well explained. Thank you so much sir.
Today I got clarity about this topic, thank you sir.
Krish, thank you so much. This is the best channel for data science that I have ever seen. Great efforts Krish. Thanks again.
Great Explanation Sir, thanks a lot for the video
Love watching your video’s..You explain very well.
Best Explanation on Bias and Variance!
you made my work easy by this explanation. thanks.
It was really good video and it clears all the doubts I have.
Insanely good video. Also this has amazing energy!
Awesome explanation.Thanks a lot for the video
Thanks Mr. Krish for your best explanation, now I can clearly understand Bias and Variance :D
Thank you so much Sir. Very valuable information.
Thank You so much Krish Sir..!!
Watched this again. Each time I find something new.
Well articulated, thank you Krish
tbh, best video on youtube about Bias And Variance.
At 06:08 it is said that for the underfitted data, the model has high bias and high variability. To my understanding, that information is not correct.
Variance is the complexity of a model that can capture the internal distribution of the data points in the training set. When variance is high, the model will be fitted to most (even all) of the training data points. It will result in high training accuracy and low test accuracy.
So in summary:
When the model is overfitted: low bias and high variance
When the model is underfitted: high bias and low variance
Bias: the INABILITY of the model to fit the training data
Variance: the complexity of the model which helps the model fit the training data
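A quick numerical way to see the "low variance" side of underfitting (a toy numpy sketch I made up, not from the video): refit a simple and a flexible model on many freshly-resampled datasets and compare how much their predictions at one point move around.

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(-1, 1, 30)

def predictions_at_half(degree, n_datasets=200):
    """Refit on fresh noisy datasets; collect each fit's prediction at x=0.5."""
    preds = []
    for _ in range(n_datasets):
        y = np.sin(2 * x) + rng.normal(scale=0.2, size=x.size)
        coeffs = np.polyfit(x, y, degree)
        preds.append(np.polyval(coeffs, 0.5))
    return np.array(preds)

# The straight line barely changes between datasets (low variance,
# but it misses the curve: high bias).  The wiggly fit chases the
# noise, so its prediction jumps around (high variance, low bias).
spread_simple = predictions_at_half(1).std()
spread_flexible = predictions_at_half(9).std()
```

The simple model's prediction spread stays small across resamples; that is why the underfitted model is described as high bias, low variance.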
yes bro, you are correct
I also have same doubt. @Krish Naik sir , please have a look on it.
But underfitting is supposed to have low accuracy on training data, no? Confusing!!
Have I learned the wrong definition of bias and variance from Krish sir's explanation? Now I am confused 😑
@prachi... not at all, the concept is the same in the end.
you explain everything so well :)
Way of explanation is woww.
your are so awesome, I love your teaching
Thank you very much for the simple and proper explanation...
XGBoost should have low bias & low variance !
Not really, it will depend on how you tune the hyperparameters of the model. For this reason it is important to tune a model in order to find a compromise that ensures low bias (the capacity of the model to fit a theoretical function) and low variance (the capacity of the model to generalize).
Thanks a lot for the wonderful explanation
brilliant video!!!!! explained everything to the point.
Unbelievably amazing 👏
Man, even though I am studying AI in my college, this is probably the easiest explanation to understand. Thanks man..
Very useful lecture; it helps me a lot to understand this topic in a simple and easy way. Please keep going.
You are amazing, thanks a lot for your wisdom
Good pedagogy and easy explanation. Thanks a lot
Excellent tutorial. Better than the IIT professors who are teaching machine learning.
Very good video, easiest video for understanding logic of bias & variance.
Great explanation. Thank you so much!
Thank you so much bro ! So clear !!!
GREAT SIR I GOT IT, THANKS FOR YOUR EFFORT.
I just love this guy, - from PH.
very niceee, filled the gap in my knowledge
Very well explained. Thanks
Simple and crisp explanation.
Clear explanation. @krish sir thanks for making this video
Thanks for updating this video
Simple, easy to understand. Thanks
Wow, awesome. Great work done in one single video. Insightful.
Perfectly explained, sir
Thanks for such a nice explanation
XGBoost has the property of low bias and high variance, however it can be regularised and turned into low bias and low variance. Useful video indeed.
Perfect Lecture!!! Thanks Krish :)
Excellent teaching
This is a beautiful explanation👏
Sir superb explanation 🙏🙏
Thanks..very useful..perfectly understood sir.
well-done sir ....keep it up
Good video sir, its a great help for learners like me.
Underfitting: High Bias and Low Variance
Overfitting: Low Bias and High Variance
and Generalized Model: Low Bias & Low Variance.
Bias: Error from Training Data
Variance: Error from Testing Data
@Krish Please confirm
I am confused ...
It means that underfitted model has high accuracy on testing data?
Underfitting: High Bias and HIGH Variance
@@videoinfluencers3415 I mean an underfitting model has low accuracy on both testing and training data, and the difference between the training accuracy and the test accuracy is very small; that's why we get low variance and high bias in underfitting models.
You are correct bro, I checked on Wikipedia and in some other sources too.
@Krish Please Confirm.
If it makes it any clear for other learners, here's my explanation...
BIAS is the simplifying assumptions made by a model to make the target function (the underlying function that the ML model is trying to learn) easier to learn.
VARIANCE refers to the changes to the estimate of the target function that occur if the dataset is changed when implementing the model.
Considering the linear model in the example, it makes the assumption that the input and output are related linearly, causing the model to underfit and hence giving a HIGH BIAS ERROR.
But the same model, when used with similar test data, will give quite similar results, hence giving a LOW VARIANCE ERROR.
I hope this clears the doubt.
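The two definitions above can even be measured directly. A toy numpy sketch (my own illustration; the quadratic target and degrees are arbitrary): simulate many training sets, refit a model on each, and estimate bias² and variance of its prediction at one point.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(-1, 1, 25)
f = x**2            # the target function the models try to learn
x0, f0 = 0.0, 0.0   # evaluate bias/variance at this single point

def bias_variance(degree, n_datasets=500):
    """Monte Carlo estimate of bias^2 and variance of the fit at x0."""
    preds = []
    for _ in range(n_datasets):
        y = f + rng.normal(scale=0.2, size=x.size)  # a fresh noisy dataset
        preds.append(np.polyval(np.polyfit(x, y, degree), x0))
    preds = np.array(preds)
    return (preds.mean() - f0) ** 2, preds.var()

bias2_line, var_line = bias_variance(1)  # can't bend: biased, but stable
bias2_poly, var_poly = bias_variance(5)  # can match x^2: unbiased, but jumpier
```

The straight line's simplifying assumption shows up as a large bias² term; the flexible polynomial's sensitivity to each dataset shows up as a larger variance term, matching the two definitions given above.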