Performance measures for multiclass classification [accuracy, F1 score, precision, recall]

  • Published: 29 Jul 2024
  • In this video I describe accuracy, precision, recall, and F1 score for measuring the performance of your machine learning model.
    How do you select the single best model among the many machine learning models you have built? Whether to use accuracy or F1 score depends on how balanced your dataset is.
    All my machine learning videos:
    • Machine Learning
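
  • A minimal sketch of these metrics on a hypothetical multiclass example, assuming scikit-learn (the labels below are made up for illustration):

        from sklearn.metrics import (accuracy_score, precision_score,
                                     recall_score, f1_score)

        # Hypothetical ground truth and predictions for a 3-class problem
        y_true = ["cat", "dog", "cat", "bird", "dog", "cat", "bird", "bird"]
        y_pred = ["cat", "cat", "cat", "bird", "dog", "dog", "bird", "cat"]

        # Accuracy: fraction of all predictions that are correct
        print("accuracy :", accuracy_score(y_true, y_pred))

        # Macro averaging (per the discussion below, the style used in the
        # video): compute the metric per class, then take the unweighted
        # mean across classes
        print("precision:", precision_score(y_true, y_pred, average="macro"))
        print("recall   :", recall_score(y_true, y_pred, average="macro"))
        print("f1       :", f1_score(y_true, y_pred, average="macro"))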

Comments • 124

  • @Alchemist10241
    @Alchemist10241 3 months ago +1

    Other tutorials explain the confusion matrix only with a two-by-two table, which is incomplete, but with this video we can understand what the confusion matrix and the harmonic mean really are. Epic work.

  • @KnowledgeVideos234
    @KnowledgeVideos234 5 years ago +2

    Best explanation, just before my B.Tech project presentation. Man, this type of content needs to get good views. YouTube, recommend it to everyone, please!

  • @sankusaha9486
    @sankusaha9486 4 years ago +1

    Minsuk, this is a great tutorial for understanding the confusion matrix. Well done.

  • @MohitSingh-ke8fh
    @MohitSingh-ke8fh 6 years ago +6

    Thank you for this tutorial, it was really good!

  • @BiranchiNarayanNayak
    @BiranchiNarayanNayak 5 years ago +3

    This is the best explanation I have seen so far. Thanks for the awesome video tutorial.

  • @GMCvancouver
    @GMCvancouver 4 years ago +2

    Best YouTube video ever. Many thanks, Minsuk Heo 허민석, for this very clear and simple explanation. I love you, man; I passed my project :)

  • @kkevinluke
    @kkevinluke 3 years ago +3

    Amazing, you saved me. Brilliant explanation. This video basically cleared all my doubts on this topic. 🙏

  • @shivaborusu
    @shivaborusu 4 years ago +1

    Thank you Minsuk. Keep posting comparisons like which regression models to use under various scenarios, classification models, performance metrics. Good content (y)

  • @tarajano1980
    @tarajano1980 4 years ago +1

    Absolutely great explanation! Wonderful examples!

  • @guptaachin
    @guptaachin 5 years ago +1

    This is absolutely amazing. Thanks @Minsuk

  • @rakeshkumarkuwar6053
    @rakeshkumarkuwar6053 5 years ago +2

    Thank you very much for such a wonderful explanation.

  • @MrRushin95
    @MrRushin95 5 years ago +1

    Nailed it. Liked the method of explanation.

  • @mohammadalshawabkeh5791
    @mohammadalshawabkeh5791 3 years ago +1

    Thank you for the intuitive explanation!

  • @Alexhoony
    @Alexhoony 5 years ago +2

    Thank you so much, this helped a lot!

  • @yolandatorres2103
    @yolandatorres2103 5 years ago +1

    Fantastic, very clear! Congrats :-)

  • @MayurMadhekar
    @MayurMadhekar 3 years ago +1

    Thanks!! Simple and interpretable explanation!!

  • @majdiflah
    @majdiflah 4 years ago +1

    Very smooth presentation! Keep it up!!

  • @gloriamacia1120
    @gloriamacia1120 5 years ago +1

    Such an amazing video!!!

  • @prakashd842
    @prakashd842 4 years ago +1

    Hi Minsuk, this truly is the best video. I am a subscriber now.

  • @russianrightnow
    @russianrightnow 3 years ago +1

    OMG, this is the best video! Thank you for helping me prepare for my machine learning exam.

  • @yasnaseri193
    @yasnaseri193 5 years ago +1

    Great work, buddy!

  • @Syaidafirzana
    @Syaidafirzana 3 years ago +1

    Awesome, clear explanation video on this topic. Thanks!

  • @linasuhaili1409
    @linasuhaili1409 4 years ago

    Very clear! 감사합니다 (thank you)!

  • @muhammadmujtabanawaz111
    @muhammadmujtabanawaz111 4 years ago +1

    Thank you so much.

  • @steveg93
    @steveg93 6 years ago +1

    A very concise explanation of classification metrics.

  • @tatisc4185
    @tatisc4185 5 years ago +1

    I love it! Thanks

  • @danishzahid2797
    @danishzahid2797 3 years ago +1

    Loved it!

  • @larissaalves6737
    @larissaalves6737 2 years ago +1

    Great!! Thanks for your help.

  • @HazemAzim
    @HazemAzim 3 years ago +1

    Best explanation I have seen of multiclass performance measures... ++ Thanks

  • @tahamagdy4932
    @tahamagdy4932 6 years ago +2

    You have made my day!
    Thank you

    • @TheEasyoung
      @TheEasyoung  6 years ago +1

      Taha Magdy, thanks for the cheerful comment; I will keep up the good work!

  • @ezbitz23
    @ezbitz23 4 years ago +1

    Great video. Very clear explanations.

  • @kevinmcinerney9552
    @kevinmcinerney9552 6 years ago +1

    Very clear. Thank you

  • @afsanaahsanjeny2065
    @afsanaahsanjeny2065 6 years ago +1

    Just an excellent explanation.

  • @adelinevoon422
    @adelinevoon422 2 years ago +1

    Thank you, you make it so easy to understand! ❤️

  • @betultultay1615
    @betultultay1615 3 years ago +1

    Nice video and explanation. Thank you!

  • @rajorshibhattachary
    @rajorshibhattachary 4 years ago +2

    Excellent!

  • @VirtusRex48
    @VirtusRex48 4 years ago +2

    Very well done. Thank you

  • @ertanuysal5890
    @ertanuysal5890 4 years ago +1

    It was very clear, thank you!

  • @Dan-wq8id
    @Dan-wq8id 3 years ago

    Brilliantly explained; liked and subscribed!

  • @emmanuel.obaga.001
    @emmanuel.obaga.001 6 years ago +1

    Awesome!!!!

  • @kazijahidurrahamanriyad9245
    @kazijahidurrahamanriyad9245 3 years ago

    Thank you so much for this content.

  • @RedShipsofSpainAgain
    @RedShipsofSpainAgain 6 years ago +1

    Best explanation of when to use F1 score vs. accuracy. Thank you!

    • @TheEasyoung
      @TheEasyoung  6 years ago

      Thank you very much!

    • @RedShipsofSpainAgain
      @RedShipsofSpainAgain 6 years ago +1

      The visual/geometric explanation of the harmonic mean was particularly helpful. Thanks again!

  • @MarsLanding91
    @MarsLanding91 3 years ago +1

    Thank you!

  • @jubjubfriend
    @jubjubfriend 4 years ago +1

    Very good explanation, so clear and well made.

  • @AnkitSingh-wq2rk
    @AnkitSingh-wq2rk 5 years ago +1

    Thank you so much.

  • @k23raj2
    @k23raj2 5 years ago +1

    Step-by-step, clear, and wonderful explanation. Thanks a lot @Minsuk Heo

    • @AvinashKunamneni
      @AvinashKunamneni 5 years ago

      Hi, could you please tell me how the values in the rows can be figured out if I have 5 classes (agree, strongly agree, disagree, strongly disagree, neither agree nor disagree)?

  • @dr.amarnadhs5262
    @dr.amarnadhs5262 4 years ago

    Clearly explained each and every step... thank you so much... :-)

  • @CogaxCH
    @CogaxCH 4 years ago +1

    So good!

  • @TheWesley1412
    @TheWesley1412 2 years ago +1

    You saved me, thank you!!

  • @akashpoudel571
    @akashpoudel571 5 years ago +1

    Thank you, sir... remarkably clear explanation.

  • @user-hu7pk6tx7p
    @user-hu7pk6tx7p 2 years ago +1

    Thank you. Perfect explanation style.

  • @MartinDelia
    @MartinDelia 2 years ago +1

    Thanks for a clear explanation.

  • @ravindarmadishetty736
    @ravindarmadishetty736 6 years ago +2

    Thanks Minsuk, that was a very nice explanation. You have explained a very important concept.

    • @TheEasyoung
      @TheEasyoung  6 years ago

      ravindar madishetty, thanks for commenting your thoughts. I appreciate it.

  • @muhamadbayu5889
    @muhamadbayu5889 6 years ago +3

    Very nice explanation, sir! A ton of thanks to you.

    • @TheEasyoung
      @TheEasyoung  6 years ago

      Muhamad Bayu, my pleasure. Thanks!

  • @tatendatasara
    @tatendatasara 2 years ago +1

    Thank you, finally I understand.

  • @MohammadFarhadBulbul
    @MohammadFarhadBulbul 6 years ago +1

    Excellent

  • @bdscorpioking
    @bdscorpioking 3 years ago +1

    Best explanation.

  • @captain27pal
    @captain27pal 4 years ago +1

    Very good explanation... thank you.

  • @Samuel-wl4fw
    @Samuel-wl4fw 3 years ago +1

    Very, very good. I'll leave another comment for the algorithm ;)

  • @claudiomarcio7579
    @claudiomarcio7579 3 years ago +1

    Very good explanation.

  • @mohammadkaramikram8186
    @mohammadkaramikram8186 4 years ago +1

    Great, good job. Thanks, very helpful.

  • @breakdancerQ
    @breakdancerQ 4 years ago +1

    Great, GREAT video.

  • @richerite
    @richerite 3 years ago +1

    Superb explanation.

  • @digitalkosmos8004
    @digitalkosmos8004 4 years ago +1

    Great video.

  • @Romba2020
    @Romba2020 5 years ago +1

    Very helpful.

  • @artworkofficial2423
    @artworkofficial2423 4 years ago +1

    Excellent.

  • @birhanewondmaneh8260
    @birhanewondmaneh8260 5 years ago +1

    Great, thanks.

  • @oozzar2841
    @oozzar2841 4 years ago +1

    How do you find the kappa value for a multiclass (4x4) confusion matrix?
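
    One possible way to get kappa for a multiclass problem is scikit-learn's cohen_kappa_score, sketched here on hypothetical 4-class labels:

        from sklearn.metrics import cohen_kappa_score, confusion_matrix

        # Hypothetical labels for a 4-class problem (gives a 4x4 matrix)
        y_true = [0, 1, 2, 3, 0, 1, 2, 3, 0, 1, 2, 3]
        y_pred = [0, 1, 2, 2, 0, 1, 3, 3, 0, 2, 2, 3]

        print(confusion_matrix(y_true, y_pred))   # the 4x4 confusion matrix
        # Kappa = (observed agreement - chance agreement) / (1 - chance agreement);
        # the same call works for any number of classes
        print(cohen_kappa_score(y_true, y_pred))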

  • @osvaldofigo8698
    @osvaldofigo8698 2 years ago +2

    Hi, thank you for the explanation. I was wondering about the case where the data is balanced: is it better to use F1 score or accuracy?

    • @TheEasyoung
      @TheEasyoung  2 years ago

      Accuracy is good for balanced data. F1 is also good for balanced data.

  • @shaukataliabbasi2942
    @shaukataliabbasi2942 6 years ago +1

    Nice

  • @nadimpallijyothi7108
    @nadimpallijyothi7108 5 years ago +1

    Super

  • @theinternetcash
    @theinternetcash 5 years ago

    Can you point me to a C# implementation of this concept?

  • @arielle-cheriepaterson3317
    @arielle-cheriepaterson3317 4 years ago

    How were the values in the matrix determined?

  • @yontenjamtsho1539
    @yontenjamtsho1539 5 years ago

    I think splitting the dataset into an equal number of samples per class solves the problem. I tried it with simple accuracy and the F1 score; the output is the same.

  • @amnakhan8516
    @amnakhan8516 5 years ago +1

    Great work! Please make more videos in English too, as some are not in English!

  • @hamfat4515
    @hamfat4515 3 years ago

    Your video @ 2:37 shows only sensitivity, right? Because sensitivity = TP/(TP + FN), but accuracy = (TP + TN)/(TP + FN + TN + FP).
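
    For reference, the standard definitions behind this question (stated here independently, not transcribed from the video): per-class sensitivity (recall) uses only that class's row of the confusion matrix, while overall multiclass accuracy sums the diagonal over all samples,

        \text{recall}_c = \frac{TP_c}{TP_c + FN_c},
        \qquad
        \text{accuracy} = \frac{\sum_c TP_c}{N}.

    In the binary case the diagonal is TP + TN, recovering the familiar (TP + TN)/(TP + FN + TN + FP); in the multiclass case each class's TN is accounted for through the other classes' entries.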

  • @rsokhan44984
    @rsokhan44984 5 years ago +1

    Outstanding explanation. I was dealing with an imbalanced dataset and could not explain the high accuracy I was getting. I also noticed that the kappa value was very low on the imbalanced dataset. Do you have a video that explains the kappa value clearly? Thanks for the great videos.

    • @TheEasyoung
      @TheEasyoung  5 years ago

      Ro Ro, thanks. I don't have a kappa video, though. Please feel free to share a kappa video or blog post in this thread if you find a good one!

    • @ismbil
      @ismbil 5 years ago

      In an unbalanced dataset there is a bias towards the majority class, so the model classifies most of the samples as the majority class. This inflates the accuracy, since most of the samples belong to that class. Examine how the samples of the other classes are classified in the confusion matrix. This is why there are other important performance metrics you should use on unbalanced data, like sensitivity and specificity.
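
    A small sketch of the effect described in this thread, assuming scikit-learn and hypothetical imbalanced data: a model that always predicts the majority class gets high accuracy, while macro F1 exposes the bias.

        from sklearn.metrics import accuracy_score, f1_score

        # Hypothetical imbalanced data: 90 samples of class 0, 10 of class 1
        y_true = [0] * 90 + [1] * 10
        y_pred = [0] * 100   # a "model" that always predicts the majority class

        print(accuracy_score(y_true, y_pred))   # 0.90 -- looks great
        # zero_division=0 scores the never-predicted class as 0 instead of warning
        print(f1_score(y_true, y_pred, average="macro", zero_division=0))  # ~0.47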

  • @vineetpatnaik2979
    @vineetpatnaik2979 4 years ago

    Can you tell me what A, B, C, and D are?

  • @darchcruise
    @darchcruise 5 years ago +1

    On a large dataset, how can I find out whether the data is balanced or not?

    • @TheEasyoung
      @TheEasyoung  5 years ago

      darchz, you can divide and conquer; MapReduce can help find the count of each class. It depends on where your data is: MapReduce is the answer for Hadoop, value_counts for a pandas DataFrame, a query for a database. Thanks!
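
    For the pandas case mentioned in the reply, a minimal sketch (the DataFrame and its 'label' column are hypothetical):

        import pandas as pd

        # Hypothetical DataFrame with a class-label column
        df = pd.DataFrame({"label": ["a", "a", "a", "b", "c", "a", "b", "a"]})

        # Count of each class; normalize=True gives the class proportions instead
        print(df["label"].value_counts())
        print(df["label"].value_counts(normalize=True))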

  • @nevilparekh6400
    @nevilparekh6400 4 years ago +2

    The formula for accuracy is:
    Accuracy = (TP + TN) / N.
    In your case the true negatives are missing. Can you please clarify?

    • @TheEasyoung
      @TheEasyoung  4 years ago

      TN is TP from the other classes. Summing the TPs of all the classes automatically includes the TNs. Thanks!

    • @user-or6mz4gy6i
      @user-or6mz4gy6i 3 years ago +1

      @@TheEasyoung In this video: ruclips.net/video/FAr2GmWNbT0/видео.html the TN is defined as all the cells of the matrix except those in the row and column of your class. My first instinct agrees with you, yet since for each class we treat all the other data as aggregated, I think that other view matches this setting better. What do you think, please?
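
    A numeric sketch of the point debated in this thread, on a hypothetical 3x3 confusion matrix: overall accuracy is just the diagonal (each class's TP) over the total, so the TNs never need to be added separately.

        import numpy as np

        # Hypothetical 3x3 confusion matrix: rows = true class, cols = predicted
        cm = np.array([[5, 1, 0],
                       [2, 6, 1],
                       [0, 1, 4]])

        accuracy = np.trace(cm) / cm.sum()   # sum of per-class TPs over N
        print(accuracy)                      # 0.75

        # For class 0, TN = every cell outside class 0's row and column,
        # i.e. the other classes' TPs plus the errors among them
        print(cm[1:, 1:].sum())              # 12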

  • @adityanjsg99
    @adityanjsg99 4 years ago +1

    You are a god...!

  • @harryflores1092
    @harryflores1092 5 years ago +1

    Sir, how do you determine whether an F1 score is acceptable or not? I mean, it is said that it is 0 at worst and 1 at best. If, for example, the F1 score is 0.7 or 0.4, how do you show whether that is acceptable?

    • @TheEasyoung
      @TheEasyoung  5 years ago

      Harry Flores, hi. The rule of thumb is to compare against your base (simplest) model's F1 score. Or you can compare the accuracy against your existing data distribution: if yours is binary classification and your data is 70% true, your model's accuracy must be higher than 70%, since your model is supposed to be better than just predicting true for all data. Hope this helps!

    • @harryflores1092
      @harryflores1092 5 years ago +1

      @@TheEasyoung Thank you so much, Sir. Actually, I am only testing one model's accuracy on an unbalanced class distribution, which is why I chose the F1 score as the accuracy metric. What I am looking for is a baseline for interpreting the F1 score (whether it is acceptable or not). I have no model to compare it with, since I am working on a single model and proving its efficiency at making correct predictions; that's why I am looking for a baseline. If I am not mistaken, this metric is best for comparing two or more models, not for evaluating a single one? I don't know whether this question is relevant; I am still new to machine learning.

    • @TheEasyoung
      @TheEasyoung  5 years ago

      Harry Flores, even with multiple classes and unbalanced data, you can derive TP, TN, FP, and FN from your dataset by taking the base model to be one that always predicts the majority class. For example, if you classify digits 0 to 9, you have 100 samples, and 70 of them have label 5, you can assume the base model always predicts label 5. From that you get TP, TN, FP, FN, and an F1 score, and you can compare your model's F1 score against it. But I suggest you simply compare accuracy with the base model until you have another machine learning model to compare with. Thanks!

    • @harryflores2219
      @harryflores2219 5 years ago

      @@TheEasyoung That's pretty clear. Thank you so much, Sir Minsuk.
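
    A sketch of the baseline comparison described in these replies, using scikit-learn's DummyClassifier as the "always predict the majority class" base model (the data is hypothetical):

        import numpy as np
        from sklearn.dummy import DummyClassifier
        from sklearn.metrics import f1_score

        # Hypothetical unbalanced labels: 70 samples of class 5, 30 others
        X = np.zeros((100, 1))   # features are irrelevant to the dummy model
        y = np.array([5] * 70 + [0, 1, 2, 3, 4, 6, 7, 8, 9] * 3 + [0, 1, 2])

        base = DummyClassifier(strategy="most_frequent").fit(X, y)
        baseline = f1_score(y, base.predict(X), average="macro", zero_division=0)
        print(baseline)   # your real model's macro F1 should beat this number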

  • @dinasamir2778
    @dinasamir2778 6 years ago +1

    Very good video. Can you share the presentation, please?

    • @TheEasyoung
      @TheEasyoung  6 years ago

      Thanks; unfortunately, I can't share the PPT!

  • @mayurikarne5417
    @mayurikarne5417 5 years ago +1

    Superb explanation.

  • @mustafasalah9491
    @mustafasalah9491 6 years ago +1

    Thanks for the nice presentation. Can you share a citation, please?

    • @TheEasyoung
      @TheEasyoung  6 years ago

      mustafa salah, thanks for the comment. I don't share the PPT yet. Sorry about that!

    • @mustafasalah9491
      @mustafasalah9491 6 years ago +1

      Please, do you have any book I can use as a citation in my thesis?

    • @TheEasyoung
      @TheEasyoung  6 years ago

      mustafa salah, nope, my knowledge is not from a book. :)

  • @ayushi6424
    @ayushi6424 4 years ago +1

    Hey Minsuk, if our recall, precision, and F1 score all come out as 1.00 and the teacher asks, "so this is 100%?", how should we explain it?

    • @TheEasyoung
      @TheEasyoung  4 years ago

      If recall and precision are 1, then F1 is 1, meaning the ML model was 100% correct on your test data. I don't quite understand your question.

    • @ayushi6424
      @ayushi6424 4 years ago

      @@TheEasyoung Hi... I mean I am doing my M.Tech, and my recall, precision, and F1 score all come out as 1.00. In the viva my teachers question how it is possible that the model is 100% accurate, and they do not accept the fact that these are all 100%.

  • @mehmeteminm
    @mehmeteminm 5 years ago +1

    I thought, wow, what a beautiful explanation for an Indian (at 1.25x speed). Then I realized he is Chinese :D
    Thank you, that is a really good video.

  • @TrencTolize
    @TrencTolize 5 years ago

    Maybe somebody can help me: I just read something about micro-average precision vs. macro-average precision. The precision used in this video matches the definition of macro: you take the sum of the per-class precisions and divide it by the number of classes. When calculating micro-average precision, though, you take the sum of all true positives per class and divide it by the sum of all true positives per class PLUS all false positives per class. And here comes my question: isn't the sum of all true positives per class plus all false positives per class equal to the total count of the dataset, and thus the micro-average precision the same as the accuracy value? I applied both the accuracy formula and the micro-average precision formula to the examples used in this video and always got exactly the same result. => Micro-average precision = accuracy. Can somebody confirm this?

    • @TheEasyoung
      @TheEasyoung  5 years ago

      TrencTolize, hmm, I believe that unless accuracy is 100% or 0%, macro and micro are normally different, much like Simpson's paradox. In this video I covered only macro.
      An example comparing the two is here:
      datascience.stackexchange.com/questions/15989/micro-average-vs-macro-average-performance-in-a-multiclass-classification-settin/16001
      Hope this helps!

    • @TrencTolize
      @TrencTolize 5 years ago +1

      @@TheEasyoung Thank you for your answer! Yes, I read the article, and I understood that macro- and micro-average precision are usually two different values. My question, though, is whether micro-average precision is always equal to accuracy. I applied the micro-average precision formula from the stackexchange discussion to the examples you used in the video and always got a value equal to the accuracy. Maybe I'm wrong, just wondering. Because if I'm right, why would anyone need the micro-average precision formula, since it always matches the accuracy value?

    • @TheEasyoung
      @TheEasyoung  5 years ago +1

      TrencTolize, I get your point. The formula from the link is the same as accuracy. Sorry for not giving you a clear answer on micro-average precision.

    • @TrencTolize
      @TrencTolize 5 years ago +1

      @@TheEasyoung No problem. The whole topic can get a little confusing, so my explanation kind of reflected that.
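
    The observation in this thread can be checked directly: in single-label multiclass classification every sample receives exactly one predicted label, so the pooled TP + FP in the micro-precision denominator equals N, and micro-averaged precision (along with micro recall and micro F1) coincides with accuracy. A sketch on hypothetical labels, assuming scikit-learn:

        from sklearn.metrics import accuracy_score, precision_score

        y_true = [0, 1, 2, 2, 1, 0, 2, 1, 0, 2]
        y_pred = [0, 2, 2, 1, 1, 0, 2, 0, 0, 2]

        # Micro averaging pools TPs and FPs over all classes before dividing
        print(accuracy_score(y_true, y_pred))                    # 0.7
        print(precision_score(y_true, y_pred, average="micro"))  # 0.7 -- identical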

  • @alexkoh9060
    @alexkoh9060 4 years ago +2

    More English videos, please.

  • @rho992
    @rho992 8 months ago

    But accuracy considers both true positives and true negatives, doesn't it? Here only true positives are used.

  • @ajax9486
    @ajax9486 4 years ago +1

    nitc piller

  • @user-rj4sj9uk8s
    @user-rj4sj9uk8s 4 years ago

    The pronunciation sounded familiar somehow, and on a closer look he's Korean, wow.

  • @peterv.276
    @peterv.276 5 years ago +1

    The true negatives are missing from the accuracy calculation.

  • @maybeinsha
    @maybeinsha 2 years ago +1

    Gamsahamnida (thank you)

  • @harshitsinghai1395
    @harshitsinghai1395 5 years ago

    Anyone from Bennett University...?

  • @ocean694
    @ocean694 4 years ago +2

    Good lecture, but poor English.

  • @rubennadevi
    @rubennadevi 3 years ago

    Thank you!