Sir, I am so happy for the students who don't have a good financial background, or who couldn't learn Data Science because of English. This channel brings new hope for them. You are an inspiration for us.
For the first time I was actually able to understand the real use of these metrics, after learning for 3 years 😂. Nice.
It's very good that you answer the questions immediately as you pose them; that is a good teaching technique.
Dear sir, great teaching method. You deserve a lot of subscribers ❤
This is the only video that gives a detailed and simple explanation in 23 minutes.
This helped me cover the evaluation metrics quickly, definitely a nice video to watch before an interview. Thanks for teaching in a simple manner.
That's the best ML video... best, best, best ❤❤❤❤❤
For the first time, I got a feel for Machine Learning. THANK YOU SO MUCH.
Thank you so much, Krish sir, for making our concepts crystal clear. Thank you again for doing this hard work for us.
Great explanation sir, as well as great examples. I was just looking for your videos in order to understand this concept. Couldn't find this topic in English, so I came here.
Excellent, got a very good understanding of all the terms with proper examples
Explained so wonderfully; it made me understand fully.
Crystal clear 👍
Brother, great video. Thank you for the contribution.
Best explanation one can expect!!! Excellent.
Best way to make dumb people like me understand the performance measurement of ML models. I was always confused between Recall and Precision. Kudos to you Krish!!
I love the way you teach, but everything is in bits and pieces. A single playlist for data science with video numbers would have been great to follow.
Thanks sir, for the first time I got clarity on this topic.
Your videos are always helpful sir 🙌🏼
Just Awesome!
Excellent, best wishes ever, Thanks
Sir you are great, Love from Pakistan
Greatest teacher ever.
superb explanation.
Absolutely clear, sir.
Sir, please make an end-to-end Machine Learning project, up to deployment, in Hindi. It will be very helpful for us.
great video
Thank you sir ♥️
In the F-beta score, the beta^2 in the denominator should be multiplied only by precision, not by the whole of (precision + recall).
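For reference, the standard definition places β² on the precision term only (P = precision, R = recall):

```latex
F_\beta = \frac{(1+\beta^{2}) \cdot P \cdot R}{\beta^{2} \cdot P + R}
```

With β = 1 this reduces to the familiar F1 = 2PR/(P+R).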
Amazing sir thanks a lot
18:57 recall is important
Sir, aren't all these metrics then meant just for logistic regression? If we use LR or something in which we have multiple options, won't the confusion matrix stop working?
Thank you so much
Well done
Sir, how did you decide the beta value here, 1 or 0.5? I mean, why is it 0.5 for FP?
Very amazing
Brother, in precision, is the TP taken from all the actual values (y) or from the predicted values (ŷ)?
Make it for multiclass classification
Sir, make more videos and keep it up.
Amazing tutorial I wish I had watched it before my exams 🫡
Sir, if I join your full-stack data science course, will you teach in the same way as in this video??
I think you have taught very well !!
Yes sir
@@krishnaikhindi in this video, which drawing-board tool are you using? Is it Microsoft Whiteboard?
06:00 Sir, you forgot to cut this 😄
18:13 Sir, in this case, if the model declares everyone a cancer patient, wouldn't that still be the best model by your logic, since, as you said, the person would at least go and get tested :P I was asked this question in an interview and I was not able to answer it.
In this case, although the model's recall would become perfect, its precision would drop, because the sum FP + TP grows. And if we think about it logically: my model would declare everyone a cancer patient, so everyone would go and get checked, but then why did we build the model in the first place? Precisely to close that gap, right...
Search for Type 1 and Type 2 errors and read about them. Telling a person who does not have cancer that they do have cancer is not as big an error as telling a person who does have cancer that they do not.
@@prakashraushan2621 nice explanation
@@rasengan4480 nice explanation
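To put numbers on the discussion above, a minimal sketch (made-up data, roughly 10% cancer prevalence) of the "flag everyone" model: recall becomes perfect, but precision and accuracy collapse to the prevalence rate.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_score, recall_score

rng = np.random.default_rng(0)
y_true = (rng.random(1000) < 0.10).astype(int)  # ~10% of people actually have cancer
y_pred = np.ones(1000, dtype=int)               # degenerate model: flag everyone

print(recall_score(y_true, y_pred))     # 1.0   -> no cancer case is ever missed
print(precision_score(y_true, y_pred))  # ~0.10 -> 9 out of 10 alarms are false
print(accuracy_score(y_true, y_pred))   # ~0.10 -> far below the ~90% all-negative baseline
```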
Hi Krish, I am a fresher in data science and I want to know how I can get a job?
I understood the theory; where should I refer to for the practical part? Is there any paid video??
Yes same for me
So brother, what is a proper example of a balanced dataset? Is there any method/algorithm to balance these datasets? Also, if we get an imbalanced dataset, does it mean the accuracy is low?
In an imbalanced dataset, it's not accurate to say that the model's accuracy will definitely be low or high. What we can say is that accuracy alone is not a reliable metric for evaluating performance in such cases.
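On the "is there a method to balance it" part: common options are resampling (e.g., SMOTE from the separate imbalanced-learn package) or class weighting. A minimal sketch of the class-weighting route on synthetic 90:10 data (all numbers here are illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

# Synthetic imbalanced binary data: roughly 90% class 0, 10% class 1
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=42)

plain = LogisticRegression().fit(X_tr, y_tr)
weighted = LogisticRegression(class_weight="balanced").fit(X_tr, y_tr)

# Class weighting typically trades some precision for better minority-class recall
print(recall_score(y_te, plain.predict(X_te)))
print(recall_score(y_te, weighted.predict(X_te)))
```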
Where are the lecture notes for this?
Sir, please provide a PDF file for this video lecture.
Brother, what is "support" in the F-beta score output?
It's the number of samples that actually belong to each class in the dataset (for the positive class, TP + FN), not the count of all four cells.
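A quick check with sklearn's classification_report and made-up labels; the support column prints 6 and 4, the true counts of each class:

```python
from sklearn.metrics import classification_report

y_true = [0, 0, 0, 0, 0, 0, 1, 1, 1, 1]
y_pred = [0, 0, 0, 0, 0, 1, 1, 1, 0, 1]

# The "support" column prints 6 for class 0 and 4 for class 1:
# how many samples truly belong to each class, not the TP/TN/FP/FN cells.
print(classification_report(y_true, y_pred))
```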
Hello Krish sir, can you tell me which drawing app or software you are using?
Scrible, available in the Microsoft Store.
@@krishnaikhindi thanks
"Tomorrow the stock market is going to crash": in that scenario I would use recall, because the costly case is when the market actually crashes but the model says it won't. Please sir, correct or not? Reply to me.
1. Start with Recall: Focus on maximizing recall to ensure you capture as many potential crashes as possible.
The primary goal is to ensure that as many actual crashes as possible are detected.
Missing a crash (high FN) could lead to significant financial losses. By maximizing recall, you reduce the risk of overlooking a critical downturn. This helps in avoiding missed opportunities.
2. Optimize Precision: Once you've achieved a reasonable recall, work on improving precision to reduce the number of false positives. This ensures that when your model predicts a crash, it is more likely to be accurate, reducing unnecessary panic or overreaction in the market (see the threshold-tuning sketch below).
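A minimal sketch of that recall-first, precision-second recipe, assuming a model that outputs crash probabilities (all scores below are made up): sweep the decision threshold, keep thresholds that meet a recall floor, then take the one with the best precision.

```python
import numpy as np
from sklearn.metrics import precision_recall_curve

# Hypothetical crash labels and predicted crash probabilities
y_true   = np.array([0, 0, 0, 1, 0, 1, 1, 0, 1, 0])
y_scores = np.array([0.10, 0.20, 0.35, 0.55, 0.30, 0.80, 0.65, 0.45, 0.70, 0.15])

precision, recall, thresholds = precision_recall_curve(y_true, y_scores)

# Step 1 (recall): keep thresholds that still catch >= 90% of real crashes.
# precision/recall have one more entry than thresholds, so drop the last point.
ok = recall[:-1] >= 0.90

# Step 2 (precision): among those, pick the threshold with the best precision.
best = np.argmax(np.where(ok, precision[:-1], -1.0))
print(thresholds[best], precision[best], recall[best])
```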
precision
In the confusion matrix (as drawn in the video), the top line / x-axis holds the actual values while the vertical line / y-axis holds the predicted values.

Accuracy is not used in the case of imbalanced data, e.g., 0:900 and 1:100, i.e., 900 zeros and 100 ones. On such a dataset accuracy comes out high regardless, which gives a false signal. Suppose we build a model on this data that only ever outputs 0: by the formula (TP+TN)/total it still scores 90%, because every actual 0 becomes a true negative while TP stays zero. The imbalance alone makes the accuracy look good.

Hence different performance metrics are used for imbalanced datasets: precision and recall. Precision is TP/(TP+FP), as in a spam email model, and recall is TP/(TP+FN), as in a cancer detection model. For something like stock market prediction, where both false positives and false negatives must be reduced, the F-beta score is used: when FP and FN are equally important, beta = 1 (the harmonic mean, 2PR/(P+R)); when FP is more important than FN, beta = 0.5; and when FN is more important than FP, beta = 2.
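The 900:100 example above is easy to check; a minimal sketch where the "model" is just a constant all-zeros array:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_score, recall_score

y_true = np.array([0] * 900 + [1] * 100)  # imbalanced: 900 zeros, 100 ones
y_pred = np.zeros(1000, dtype=int)        # model that only ever outputs 0

print(accuracy_score(y_true, y_pred))                    # 0.9 -> misleadingly high
print(precision_score(y_true, y_pred, zero_division=0))  # 0.0 -> nothing predicted positive
print(recall_score(y_true, y_pred))                      # 0.0 -> every actual 1 is missed
```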
The imbalanced-dataset part got missed in the video because of lagging.
This has a small correction: rows represent the actual class and columns represent the predicted class.
But if they ask why I gave more importance to FP or FN, or why I gave them equal importance, then what will be the answer?
recall
👍
Are the F1 score and the F-beta score the same?
The F1 score is basically F-beta with beta = 1.
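Yes; easy to confirm with sklearn on any toy labels:

```python
from sklearn.metrics import f1_score, fbeta_score

y_true = [0, 1, 1, 0, 1, 1]
y_pred = [0, 1, 0, 0, 1, 1]

# F-beta with beta = 1 is the F1 score (harmonic mean of precision and recall)
assert f1_score(y_true, y_pred) == fbeta_score(y_true, y_pred, beta=1)
```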
Confusion Matrix: ruclips.net/video/FVlqyKJMhy4/видео.html
Hello sir. We use precision when FP is important. Then what is the need of the F-beta score, like beta = 0.5 when FP > FN? Could you please explain it.
We can use any one of them
@@krishnaikhindi thank you for replying and clearing my doubt. Great teacher, great teaching skills, and a great person too ❤️😇
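One way to see the difference: precision alone can be gamed by predicting almost nothing, whereas F0.5 still charges for the missed positives while weighting precision more. A toy illustration (labels made up):

```python
from sklearn.metrics import precision_score, fbeta_score

y_true  = [1, 1, 1, 1, 0, 0, 0, 0]
y_timid = [1, 0, 0, 0, 0, 0, 0, 0]  # one safe positive call, three positives missed

print(precision_score(y_true, y_timid))        # 1.0   -> looks perfect
print(fbeta_score(y_true, y_timid, beta=0.5))  # 0.625 -> recall of 0.25 still hurts
```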
Hello sir
Sir, why did you enlarge the video at timing 08:34?
The answer to this?
Perfect!
There is a mistake in the F-beta score formula; it should be:
((1 + Beta^2) * precision * recall) / (Beta^2 * precision + recall)
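A quick numeric check of the corrected formula against sklearn's fbeta_score (toy labels made up):

```python
from sklearn.metrics import fbeta_score, precision_score, recall_score

y_true = [0, 1, 1, 0, 1, 1]
y_pred = [0, 1, 0, 0, 1, 1]
beta = 0.5

p = precision_score(y_true, y_pred)  # 1.0
r = recall_score(y_true, y_pred)     # 0.75
manual = (1 + beta**2) * p * r / (beta**2 * p + r)

assert abs(manual - fbeta_score(y_true, y_pred, beta=beta)) < 1e-12
print(manual)  # 0.9375
```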
Sir, what if the precision score and the recall score both become 0?
Thanks
Actually, in the confusion matrix you labelled FP and FN the wrong way round; just swap them and it is correct, sir. [1,0] = FN and [0,1] = FP.
I didn't enjoy it.
recall
Thank you sir😊