Performance measures for multiclass classification [accuracy, F1 score, precision, recall]

  • Published: 1 Dec 2024

Comments • 124

  • @Alchemist10241
    @Alchemist10241 7 months ago +1

    Other tutorials explain the confusion matrix only with a two-by-two table, which is incomplete, but with this video we can understand what the confusion matrix and the harmonic mean really are. Epic work.
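
A minimal sketch (mine, not from the video) of what the comment above describes: per-class precision, recall, and F1 (the harmonic mean of precision and recall) computed from a small, made-up 3x3 confusion matrix, with rows as true labels and columns as predicted labels.

```python
# Per-class precision, recall, and F1 from a multiclass confusion matrix.
# The 3x3 matrix below is invented for illustration.
import numpy as np

cm = np.array([[8, 1, 1],   # true class A
               [2, 7, 1],   # true class B
               [0, 2, 8]])  # true class C

for i, name in enumerate(["A", "B", "C"]):
    tp = cm[i, i]
    precision = tp / cm[:, i].sum()  # TP / (TP + FP): column sum
    recall = tp / cm[i, :].sum()     # TP / (TP + FN): row sum
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    print(f"{name}: precision={precision:.2f} recall={recall:.2f} f1={f1:.2f}")
```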

  • @BiranchiNarayanNayak
    @BiranchiNarayanNayak 5 years ago +3

    This is the best explanation I have seen so far. Thanks for the awesome video tutorial.

  • @KnowledgeVideos234
    @KnowledgeVideos234 5 years ago +2

    Best explanation, right before my B.Tech project presentation. Man, this type of content needs to get good views. YouTube, please recommend it to everyone!

  • @HazemAzim
    @HazemAzim 3 years ago +1

    Best Explanation I have seen on Multi-class performance measures ... ++ Thanks

  • @kkevinluke
    @kkevinluke 3 years ago +3

    Amazing, you saved me. Brilliant explanation. This video basically cleared all my doubts on this topic. 🙏

  • @ЭнесЭсветКузуджу
    @ЭнесЭсветКузуджу 3 years ago +1

    Thank you. Perfect explanation style.

  • @jubjubfriend
    @jubjubfriend 4 years ago +1

    Very good explanation, so clear and well made

  • @sankusaha9486
    @sankusaha9486 4 years ago +1

    Minsuk, this is a great tutorial for understanding the confusion matrix. Well done!

  • @GMCvancouver
    @GMCvancouver 4 years ago +2

    Best YouTube video ever, many thanks,
    Minsuk Heo 허민석, for this very clear and simple explanation. I love you man, I passed my project :)

  • @MayurMadhekar
    @MayurMadhekar 3 years ago +1

    Thanks!! Simplified and interpretable explanation!!

  • @RedShipsofSpainAgain
    @RedShipsofSpainAgain 6 years ago +1

    Best explanation of when to use the F-score vs. accuracy. Thank you!

    • @TheEasyoung
      @TheEasyoung  6 years ago

      thank you very much!

    • @RedShipsofSpainAgain
      @RedShipsofSpainAgain 6 years ago +1

      The visual/geometric explanation of the harmonic mean was particularly helpful. Thanks again!

  • @prakashd842
    @prakashd842 4 years ago +1

    Hi Minsuk, this truly is the best video. I am a subscriber now.

  • @betultultay1615
    @betultultay1615 4 years ago +1

    Nice video and explanation. Thank you!

  • @steveg93
    @steveg93 6 years ago +1

    A very concise explanation of classification metrics

  • @russianrightnow
    @russianrightnow 3 years ago +1

    OMG, this is the best video! Thank you for helping me prepare for my machine learning exam!

  • @ravindarmadishetty736
    @ravindarmadishetty736 6 years ago +2

    Thanks Minsuk, it was a very nice explanation. You have explained a very important concept.

    • @TheEasyoung
      @TheEasyoung  6 years ago

      ravindar madishetty thanks for commenting your thoughts. I appreciate it.

  • @ezbitz23
    @ezbitz23 4 years ago +1

    Great video. Very clear explanations.

  • @captain27pal
    @captain27pal 4 years ago +1

    Very good explanation, thank you.

  • @Dan-wq8id
    @Dan-wq8id 3 years ago

    Brilliantly explained, liked and subscribed!

  • @claudiomarcio7579
    @claudiomarcio7579 3 years ago +1

    Very good explanation.

  • @Syaidafirzana
    @Syaidafirzana 4 years ago +1

    Awesome, clear explanation video on this topic. Thanks!

  • @mohammadalshawabkeh5791
    @mohammadalshawabkeh5791 3 years ago +1

    Thank you for the intuitive explanation!

  • @tarajano1980
    @tarajano1980 4 years ago +1

    Absolutely great explanation! Wonderful examples!

  • @richerite
    @richerite 4 years ago +1

    Superb explanation

  • @majdiflah
    @majdiflah 4 years ago +1

    Very smooth presentation! Continue like this!!

  • @MrRushin95
    @MrRushin95 6 years ago +1

    Nailed it. Liked the method of explanation.

  • @adelinevoon422
    @adelinevoon422 2 years ago +1

    Thank you, you make it so easy to understand! ❤️

  • @bdscorpioking
    @bdscorpioking 4 years ago +1

    Best explanation.

  • @k23raj2
    @k23raj2 6 years ago +1

    Step-by-step, clear, and wonderful explanation. Thanks a lot @Minsuk Heo

    • @AvinashKunamneni
      @AvinashKunamneni 6 years ago

      Hi, could you please tell me how the values in the rows can be figured out if I have 5 classes (agree, strongly agree, disagree, strongly disagree, neither agree nor disagree)?

  • @MohitSingh-ke8fh
    @MohitSingh-ke8fh 6 years ago +6

    Thank you for this tutorial, it was really good!

  • @muhamadbayu5889
    @muhamadbayu5889 6 years ago +3

    Very nice explanation, sir! A ton of thanks to you.

    • @TheEasyoung
      @TheEasyoung  6 years ago

      Muhamad Bayu My pleasure. Thanks!

  • @MartinDelia
    @MartinDelia 3 years ago +1

    Thanks for a clear explanation.

  • @TheWesley1412
    @TheWesley1412 3 years ago +1

    you saved me, thank you!!

  • @ertanuysal5890
    @ertanuysal5890 4 years ago +1

    It was very clear, thank you!

  • @Samuel-wl4fw
    @Samuel-wl4fw 3 years ago +1

    Very, very good. I'll leave another comment for the algorithm ;)

  • @larissaalves6737
    @larissaalves6737 2 years ago +1

    Great!! Thanks for your help.

  • @mohammadkaramikram8186
    @mohammadkaramikram8186 4 years ago +1

    Great, good job, thanks, very helpful!

  • @VirtusRex48
    @VirtusRex48 4 years ago +2

    Very well done. Thank you

  • @tatendatasara
    @tatendatasara 3 years ago +1

    Thank you, finally I understand!

  • @danishzahid2797
    @danishzahid2797 4 years ago +1

    Loved it!

  • @digitalkosmos8004
    @digitalkosmos8004 5 years ago +1

    great video

  • @osvaldofigo8698
    @osvaldofigo8698 3 years ago +2

    Hi, thank you for the explanation. I was wondering: if the data is balanced, is it better to use the F1-score or accuracy?

    • @TheEasyoung
      @TheEasyoung  3 years ago

      Accuracy is good for balanced data. F1 is also good for balanced data.
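
A hedged illustration of the reply above, with invented toy labels: on balanced classes, accuracy and macro-F1 land close together, so either is a reasonable choice.

```python
# Accuracy vs. macro-F1 on a small balanced toy example (labels invented).
from sklearn.metrics import accuracy_score, f1_score

y_true = [0, 0, 1, 1, 2, 2]  # balanced: two samples per class
y_pred = [0, 1, 1, 1, 2, 0]

print("accuracy:", accuracy_score(y_true, y_pred))             # ~0.67
print("macro F1:", f1_score(y_true, y_pred, average="macro"))  # ~0.66
```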

  • @rakeshkumarkuwar6053
    @rakeshkumarkuwar6053 5 years ago +2

    Thank you very much for such a wonderful explanation.

  • @rajorshibhattachary
    @rajorshibhattachary 5 years ago +2

    excellent!

  • @tahamagdy4932
    @tahamagdy4932 6 years ago +2

    You have made my day!
    Thank You

    • @TheEasyoung
      @TheEasyoung  6 years ago +1

      Taha Magdy, thanks for the cheerful comment; I will keep up the good work!

  • @shivaborusu
    @shivaborusu 4 years ago +1

    Thank you Minsuk. Keep making posts with comparisons like this: which regression models to use under different scenarios, classification models, performance metrics. Good content (y)

  • @mehmeteminm
    @mehmeteminm 5 years ago +1

    I thought, wow! What a beautiful explanation for an Indian (at 1.25x speed). Then I realized he is Chinese :D
    Thank you, that is a really good video.

  • @yasnaseri193
    @yasnaseri193 5 years ago +1

    Great work buddy!

  • @afsanaahsanjeny2065
    @afsanaahsanjeny2065 6 years ago +1

    Just an excellent explanation.

  • @breakdancerQ
    @breakdancerQ 4 years ago +1

    great GREAT video

  • @gloriamaciam
    @gloriamaciam 5 years ago +1

    Such an amazing video!!!

  • @nevilparekh6400
    @nevilparekh6400 4 years ago +2

    The formula for accuracy is as below:
    Accuracy = (TP+TN)/N.
    In your case, the true negatives are missing. Can you please clarify?

    • @TheEasyoung
      @TheEasyoung  4 years ago

      TN is TP from the other classes. Joining the TPs of all classes automatically includes the TNs. Thanks!

    • @user-or6mz4gy6i
      @user-or6mz4gy6i 4 years ago +1

      @@TheEasyoung In this video: ruclips.net/video/FAr2GmWNbT0/видео.html the TN is defined as all the cells of the matrix except those in the row and column of your class. I think you are right, as that was my instinct too; yet since for each class we consider all the other data as aggregated, I think that other view matches this consideration better. What do you think, please?
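
A small sketch of the author's point above ("TN is TP from the other classes"): for single-label multiclass data, overall accuracy is simply the diagonal of the confusion matrix divided by the total count, so the true negatives never need to be tallied separately. The matrix values are made up.

```python
# Multiclass accuracy as the confusion-matrix diagonal over all samples.
import numpy as np

cm = np.array([[8, 1, 1],
               [2, 7, 1],
               [0, 2, 8]])

accuracy = np.trace(cm) / cm.sum()  # (TP_A + TP_B + TP_C) / N
print("accuracy:", accuracy)        # 23 / 30 ≈ 0.77
```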

  • @CogaxCH
    @CogaxCH 4 years ago +1

    So good!

  • @oozzar2841
    @oozzar2841 5 years ago +1

    How do you find the kappa value for multiclass (a 4x4 confusion matrix)?
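
The video does not cover kappa, but as a pointer, here is a minimal sketch using scikit-learn's cohen_kappa_score on an invented 4-class example:

```python
# Cohen's kappa for a 4-class problem (labels invented for illustration).
from sklearn.metrics import cohen_kappa_score, confusion_matrix

y_true = [0, 0, 1, 1, 2, 2, 3, 3]
y_pred = [0, 1, 1, 1, 2, 3, 3, 3]

print(confusion_matrix(y_true, y_pred))            # the 4x4 matrix
print("kappa:", cohen_kappa_score(y_true, y_pred))
```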

  • @linasuhaili1409
    @linasuhaili1409 4 years ago

    Very clear! 감사합니다 (thank you)!

  • @artworkofficial2423
    @artworkofficial2423 5 years ago +1

    excellent

  • @guptaachin
    @guptaachin 6 years ago +1

    This is absolutely amazing. Thanks @Minsuk

  • @muhammadmujtabanawaz111
    @muhammadmujtabanawaz111 4 years ago +1

    Thank you so much

  • @yontenjamtsho1539
    @yontenjamtsho1539 5 years ago

    I think splitting the dataset so each class has an equal number of samples solves the problem. I have tried it with simple accuracy and the F1-score. The output is the same.

  • @mayurikarne5417
    @mayurikarne5417 5 years ago +1

    Superb explanation, best ever.

  • @yolandatorres2103
    @yolandatorres2103 5 years ago +1

    Fantastic, very clear! Congrats :-)

  • @kazijahidurrahamanriyad9245
    @kazijahidurrahamanriyad9245 4 years ago

    Thank you so much for this content.

  • @akashpoudel571
    @akashpoudel571 5 years ago +1

    Thank you, sir... damn clear explanation.

  • @amnakhan8516
    @amnakhan8516 6 years ago +1

    Great work! Please make more videos in English too, as some are not in English!

  • @rsokhan44984
    @rsokhan44984 6 years ago +1

    Outstanding explanation. I was dealing with an imbalanced dataset but could not explain the high accuracy I was getting. I also noticed that the kappa value was very low when dealing with an imbalanced dataset. Do you have a video that explains the kappa value clearly? Thanks for the great videos.

    • @TheEasyoung
      @TheEasyoung  6 years ago

      Ro Ro, thanks. I don't have a kappa video, though. Please feel free to share a kappa video or blog in this thread if you find a good one!

    • @ismbil
      @ismbil 5 years ago

      In an unbalanced dataset, there is a bias towards the majority class. Therefore, the model classifies most of the samples as the majority class. This increases the accuracy, since most of the samples belong to that class. Examine how the samples of the other classes are classified in the confusion matrix. This is why there are other important performance metrics you should use on unbalanced data, like sensitivity and specificity.
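
A quick sketch of the effect described in this thread, on synthetic 90/10 data: a model that always predicts the majority class scores 90% accuracy, while its minority-class recall and its kappa are both 0.

```python
# High accuracy, useless model: the classic imbalanced-data trap.
from sklearn.metrics import accuracy_score, recall_score, cohen_kappa_score

y_true = [0] * 90 + [1] * 10   # 90% majority class, 10% minority class
y_pred = [0] * 100             # "always predict the majority" model

print("accuracy:", accuracy_score(y_true, y_pred))                    # 0.90
print("minority recall:", recall_score(y_true, y_pred, pos_label=1))  # 0.0
print("kappa:", cohen_kappa_score(y_true, y_pred))                    # 0.0
```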

  • @dinasamir2778
    @dinasamir2778 7 years ago +1

    Very good video. Can you share the presentation, please?

    • @TheEasyoung
      @TheEasyoung  7 years ago

      Thanks. Unfortunately, I can't share the ppt, though!

  • @dr.amarnadhs5262
    @dr.amarnadhs5262 5 years ago

    Clearly explained each and every step.... thank you so much.....:-)

  • @mustafasalah9491
    @mustafasalah9491 6 years ago +1

    Thanks for the nice presentation. Can you share a citation, please?

    • @TheEasyoung
      @TheEasyoung  6 years ago

      mustafa salah, thanks for the comment. I don't share the ppt yet. Sorry for that!

    • @mustafasalah9491
      @mustafasalah9491 6 years ago +1

      Please, do you have any book I can use as a citation in my thesis?

    • @TheEasyoung
      @TheEasyoung  6 years ago

      mustafa salah, nope, my knowledge is not from a book. :)

  • @Alexhoony
    @Alexhoony 5 years ago +2

    thank you so much, helped a lot!

  • @tatisc4185
    @tatisc4185 5 years ago +1

    I love it! Thanks

  • @rubennadevi
    @rubennadevi 4 years ago

    Thank you!

  • @hamfat4515
    @hamfat4515 3 years ago

    Your video @ 2:37 shows only sensitivity, right? Because sensitivity = TP/(TP+FN), but accuracy = (TP+TN)/(TP+FN+TN+FP).

  • @kevinmcinerney9552
    @kevinmcinerney9552 6 years ago +1

    Very clear. Thank you

  • @ayushi6424
    @ayushi6424 4 years ago +1

    Hey Minsuk, if our recall, precision, and F1 score come out as 1.00, and the teacher asks whether this is really 100%, how should we explain it?

    • @TheEasyoung
      @TheEasyoung  4 years ago

      If recall and precision are 1, then F1 is 1, meaning the ML model was 100% correct on your test data. I don't quite understand your question.

    • @ayushi6424
      @ayushi6424 4 years ago

      @@TheEasyoung Heya... I mean, I am doing my M.Tech, and my recall, precision, and F1 score come out to 1.00. In the viva, my teachers are questioning how this is possible; it means the model is 100% accurate, and they are not accepting the fact that these are all 100%.

  • @alexkoh9060
    @alexkoh9060 4 years ago +2

    More English videos, please.

  • @arielle-cheriepaterson3317
    @arielle-cheriepaterson3317 5 years ago

    How were the values in the matrix determined?

  • @harryflores1092
    @harryflores1092 5 years ago +1

    Sir, how do you determine whether an F1 score is acceptable or not? I mean, it is said that 0 is its worst and 1 is its best. If, for example, the F1 score is 0.7 or 0.4, how do you prove whether it is acceptable?

    • @TheEasyoung
      @TheEasyoung  5 years ago

      Harry Flores, hi. The rule of thumb is to compare against your base (simplest) model's F1 score. Or you can just compare its accuracy against your existing data distribution. Say yours is binary classification and your data is 70% true: your model's accuracy must be higher than 70%, since your model is supposed to be better than just saying true for all data. Hope this helps!

    • @harryflores1092
      @harryflores1092 5 years ago +1

      @@TheEasyoung Thank you so much, Sir. Actually, I am only trying to test a model's accuracy with an unbalanced class distribution; that's why I chose the F1 score as the accuracy metric. What I am looking for is a baseline to interpret the F1 score (whether it is acceptable or not). I have no model to compare it with, since I am only working on one model and proving its efficiency in terms of making correct predictions; that's why I am looking for a baseline. If I'm not wrong, this metric is best when comparing two or more models, but not for evaluating one model's accuracy? I don't know if this question is relevant, though; I am still new to machine learning concepts.

    • @TheEasyoung
      @TheEasyoung  5 years ago

      Harry Flores, even if you have multiple classes and unbalanced data, you can find TP, TN, FP, and FN from your dataset if you take the base model to be one that just predicts the majority class. For example, if you classify numbers 0 to 9, you have 100 data points, and 70 of them are labeled 5, you can assume the base model always predicts label 5. Then you get TP, TN, FP, FN, and an F1 score from it, and you will be able to compare your model's F1 score against it. But I suggest you just compare accuracy against the base until you have another machine learning model to compare with. Thanks!

    • @harryflores2219
      @harryflores2219 5 years ago

      @@TheEasyoung That's pretty clear. Thank you so much Sir Minsuk
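
A sketch of the baseline idea from this thread: scikit-learn's DummyClassifier can stand in for the "always predict the majority class" base model, producing an F1 score a real model should beat. The features and labels below are placeholders.

```python
# Majority-class baseline F1 via DummyClassifier (toy placeholder data).
from sklearn.dummy import DummyClassifier
from sklearn.metrics import f1_score

X = [[0], [1], [2], [3], [4], [5], [6], [7], [8], [9]]
y = [0, 0, 0, 0, 0, 0, 0, 1, 1, 2]  # imbalanced toy labels

baseline = DummyClassifier(strategy="most_frequent").fit(X, y)
baseline_f1 = f1_score(y, baseline.predict(X), average="macro",
                       zero_division=0)  # classes never predicted score 0
print("baseline macro F1:", baseline_f1)  # your model should beat this
```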

  • @darchcruise
    @darchcruise 6 years ago +1

    On a large dataset, how can I find out whether the data is balanced or not?

    • @TheEasyoung
      @TheEasyoung  6 years ago

      darchz, you can divide and conquer. MapReduce can help find the count of each class. It depends on where your data is: MapReduce is the answer for Hadoop, value_counts for a pandas DataFrame, a query for a DB. Thanks!
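
For the pandas case mentioned in the reply, a minimal sketch (the column name "label" is a placeholder):

```python
# Checking class balance with pandas value_counts.
import pandas as pd

df = pd.DataFrame({"label": ["a", "a", "a", "b", "c", "a", "b", "a"]})
print(df["label"].value_counts())                # absolute counts per class
print(df["label"].value_counts(normalize=True))  # class proportions
```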

  • @MohammadFarhadBulbul
    @MohammadFarhadBulbul 6 years ago +1

    Excellent

  • @vineetpatnaik2979
    @vineetpatnaik2979 5 years ago

    Can you tell me what A, B, C, and D are?

  • @AnkitSingh-wq2rk
    @AnkitSingh-wq2rk 5 years ago +1

    thank you soooo much

  • @emmanuel.obaga.001
    @emmanuel.obaga.001 6 years ago +1

    Awesome!!!!

  • @birhanewondmaneh8260
    @birhanewondmaneh8260 5 years ago +1

    great thanks

  • @Romba2020
    @Romba2020 6 years ago +1

    Very helpful

  • @adityanjsg99
    @adityanjsg99 4 years ago +1

    You R God...!

  • @nadimpallijyothi7108
    @nadimpallijyothi7108 5 years ago +1

    super

  • @theinternetcash
    @theinternetcash 5 years ago

    Can you point me to a C# implementation of this concept?

  • @shaukataliabbasi2942
    @shaukataliabbasi2942 6 years ago +1

    nice

  • @TrencTolize
    @TrencTolize 6 years ago

    Maybe somebody can help me: I just read something about micro-average precision vs. macro-average precision. The precision used in this video matches the definition of macro: you take the sum of all per-class precisions and divide by the number of classes. When calculating micro-average precision, though, you take the sum of all true positives per class and divide it by the sum of all true positives per class PLUS all false positives per class. And here comes my question: isn't the sum of all true positives per class plus all false positives per class equal to the count of the total dataset, and thus the result of micro-average precision the same as the accuracy value? I applied both the formula for accuracy and the formula for micro-average precision to the examples used in this video and always got the exact same result. => Micro-average precision = accuracy. Can somebody confirm this?

    • @TheEasyoung
      @TheEasyoung  6 years ago

      TrencTolize, hmm, I believe that unless accuracy is 100% or 0%, macro and micro are normally different, just like Simpson's paradox. And in this video I covered only macro.
      An example comparing the two is here:
      datascience.stackexchange.com/questions/15989/micro-average-vs-macro-average-performance-in-a-multiclass-classification-settin/16001
      Hope this helps!

    • @TrencTolize
      @TrencTolize 6 years ago +1

      @@TheEasyoung Thank you for your answer! Yes, I read the article, and I understood that macro- and micro-average precision are usually two different values. My question, though, is: is micro-average precision always equal to accuracy? I applied the micro-average precision formula from the stackexchange discussion to the examples you used in the video and always got a value equal to the accuracy value. Maybe I'm wrong, just wondering. Because if I'm right, why would anyone need the micro-average precision formula, since it always matches the accuracy value?

    • @TheEasyoung
      @TheEasyoung  6 years ago +1

      TrencTolize, I got your point. The formula from the link is the same as accuracy. Sorry for not giving you a clear answer on micro-average precision.

    • @TrencTolize
      @TrencTolize 6 years ago +1

      @@TheEasyoung No problem. The whole topic can get a little bit confusing, so my explanation kinda reflected that.
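
A quick check of the conclusion this thread reaches: for single-label multiclass problems, micro-averaged precision equals accuracy, because every false positive for one class is also a false negative for another, so the micro denominator is just the total sample count. The labels below are invented.

```python
# Micro-averaged precision coincides with accuracy in single-label multiclass.
from sklearn.metrics import accuracy_score, precision_score

y_true = [0, 0, 1, 1, 2, 2, 2]
y_pred = [0, 1, 1, 2, 2, 2, 0]

print("accuracy:       ", accuracy_score(y_true, y_pred))                    # 4/7
print("micro precision:", precision_score(y_true, y_pred, average="micro"))  # 4/7
```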

  • @rho992
    @rho992 1 year ago

    But accuracy considers both true positives and true negatives... doesn't it? Here only true positives are used.

  • @peterv.276
    @peterv.276 5 years ago +1

    The true negatives are missing from the accuracy formula.

  • @김준호-s5i
    @김준호-s5i 4 years ago

    The pronunciation felt somehow familiar, and looking closely, he's Korean ㄷㄷ (wow)

  • @ajax9486
    @ajax9486 4 years ago +1

    nitc piller

  • @silent.whisker07
    @silent.whisker07 2 years ago +1

    ghamsamida (Korean for "thank you")

  • @ocean694
    @ocean694 5 years ago +2

    Good lecture, but poor English

  • @harshitsinghai1395
    @harshitsinghai1395 6 years ago

    Anyone from Bennett University... ?

  • @MarsLanding91
    @MarsLanding91 4 years ago +1

    Thank you!