Cohen's Kappa (Inter-Rater-Reliability)

  • Published: 15 Jan 2025

Comments • 60

  • @Natasha-vz8dd
    @Natasha-vz8dd 3 months ago +1

    Thank you for great explanations. Found you searching for Fleiss Kappa and now keep watching other videos for the sheer pleasure.

  • @amalanassar
    @amalanassar 2 years ago +3

    No one can explain things as clearly as you, perfect!

    • @datatab
      @datatab  2 years ago

      Many, many thanks for your nice feedback! Regards, Hannah

  • @pghjmuhammadfirdauspghjrus8870
    @pghjmuhammadfirdauspghjrus8870 1 year ago +4

    Thank you for such an excellent yet simple explanation 👍

  • @jorgeulloagonzalez550
    @jorgeulloagonzalez550 10 months ago +2

    Excellent, many thanks

    • @datatab
      @datatab  9 months ago

      Glad it was helpful and many thanks for your feedback! Regards Hannah

  • @lulisboa81
    @lulisboa81 1 year ago +2

    Beautifully explained!

    • @datatab
      @datatab  1 year ago

      Glad it was helpful!

  • @user-yw1ul4xq1g
    @user-yw1ul4xq1g 1 year ago +1

    Many thanks for the clear explanation!

  • @sanikagodbole8874
    @sanikagodbole8874 1 year ago +3

    Such a perfect explanation! Thank you ❤

    • @datatab
      @datatab  1 year ago

      Glad it was helpful!

  • @justice3530
    @justice3530 2 years ago +4

    Thank you~!! Such an easy and good explanation!

    • @datatab
      @datatab  2 years ago

      Glad it was helpful!

  • @alonsamuel7106
    @alonsamuel7106 2 years ago +1

    Great explanation of the kappa, thank you very much!!!!!! :)

    • @datatab
      @datatab  2 years ago

      Glad you liked it!

  • @dinukabimsarabodaragama716
    @dinukabimsarabodaragama716 2 years ago +2

    Thank you so muchhhhhh!!!!! A great explanation!

    • @datatab
      @datatab  2 years ago +1

      Many thanks!

  • @kebenny
    @kebenny 8 months ago +1

    Thanks for your video. It is really helpful.

    • @datatab
      @datatab  8 months ago +1

      Glad it was helpful! Regards Hannah

  • @musasilahl463
    @musasilahl463 3 months ago

    great explanation

  • @bongekamkhize3244
    @bongekamkhize3244 2 years ago +3

    Thanks for the great video. Could you perhaps do a video where you use articles to calculate the kappa?

    • @datatab
      @datatab  2 years ago

      Great suggestion! I will put it on my to-do list!

  • @jhulialeigho.climaco9292
    @jhulialeigho.climaco9292 2 years ago +3

    Hello. Can you do an example where you use Cohen's Kappa but there are 3 raters? Thank you.

    • @datatab
      @datatab  2 years ago

      Hi, then you use Fleiss Kappa, here is our video: ruclips.net/video/ga-bamq7Qcs/видео.html Regards, Hannah

    • @shishi-g6l
      @shishi-g6l 1 month ago

      @@datatab Can you do an example where you use Cohen's Kappa for 6 raters?
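For readers landing on this thread: with three or more raters, Fleiss' kappa (the statistic in the linked video) is the usual choice, and it works the same way for 5 or 6 raters. As a rough sketch, it can be computed directly from a table whose rows are subjects and whose columns count how many raters chose each category; the ratings below are made up purely for illustration.

```python
# Fleiss' kappa for m raters and k categories, computed from a table whose
# rows are subjects and whose columns count how many raters assigned that
# subject to each category.

def fleiss_kappa(table):
    n_raters = sum(table[0])          # raters per subject (same for all rows)
    n_subjects = len(table)
    total = n_subjects * n_raters

    # Observed agreement: mean of the per-subject agreement values P_i
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in table
    ) / n_subjects

    # Chance agreement P_e from the overall category proportions p_j
    n_categories = len(table[0])
    p_e = sum(
        (sum(row[j] for row in table) / total) ** 2
        for j in range(n_categories)
    )
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical data: 4 subjects, 3 raters, 2 categories ("yes"/"no").
ratings = [
    [3, 0],   # all three raters said "yes"
    [0, 3],   # all three said "no"
    [2, 1],   # split decision
    [3, 0],
]
print(round(fleiss_kappa(ratings), 3))  # → 0.625
```

With two raters and two categories this reduces to a close relative of Cohen's kappa, which is why Fleiss' kappa is the standard generalisation to many raters.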

  • @ericdizon3320
    @ericdizon3320 2 years ago +1

    Can Cohen's Kappa be used for 5 raters?

  • @dentistryhm1804
    @dentistryhm1804 1 year ago +1

    Excellent, thank you very much

    • @datatab
      @datatab  1 year ago

      Glad it was helpful!

  • @rohanirohan9756
    @rohanirohan9756 2 years ago

    Wonderful Explanation!!! Thanks

  • @SoniaBiatrizCarlosNhancale
    @SoniaBiatrizCarlosNhancale 4 months ago

    Hi, excellent lesson. I would like to know how to proceed when one of the raters' responses is only yes or only no for all questions.

  • @mukasamohammed716
    @mukasamohammed716 1 year ago +1

    Can you use Cohen's Kappa when two different instruments are used to measure the same thing, rather than two individuals?

  • @surayuthpintawong8332
    @surayuthpintawong8332 2 years ago +1

    Great video! Thanks.

    • @datatab
      @datatab  2 years ago +1

      Glad it was helpful! Thanks for your nice feedback! Regards Hannah

  • @nitinraturi2289
    @nitinraturi2289 1 year ago +1

    well explained

  • @themightyenglish5310
    @themightyenglish5310 9 months ago +1

    What if I have four levels? Thank you.
    1 = definitely exclude
    2 = not sure, but tendency to exclude
    3 = not sure, but tendency to include
    4 = definitely include

    • @datatab
      @datatab  9 months ago

      Hi, you can still use Cohen's Kappa there; since your four levels are ordered, the weighted Cohen's Kappa may be even more appropriate.
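To illustrate the reply: on an ordered scale like the 1–4 exclude/include levels above, the weighted kappa penalises a 1-vs-4 disagreement more than a near-miss like 2-vs-3. A minimal sketch with linear weights, on made-up ratings:

```python
# Weighted Cohen's kappa with linear weights w_ij = |i - j| / (k - 1), so
# near-misses on an ordinal scale (e.g. 2 vs 3) cost less than extreme
# disagreements (e.g. 1 vs 4).

def weighted_kappa(r1, r2, levels):
    k = len(levels)
    idx = {lvl: i for i, lvl in enumerate(levels)}
    n = len(r1)

    # Observed contingency table and marginal counts
    obs = [[0] * k for _ in range(k)]
    for a, b in zip(r1, r2):
        obs[idx[a]][idx[b]] += 1
    row = [sum(obs[i]) for i in range(k)]
    col = [sum(obs[i][j] for i in range(k)) for j in range(k)]

    # Weighted observed vs. chance-expected disagreement
    w = lambda i, j: abs(i - j) / (k - 1)
    d_obs = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k)) / n
    d_exp = sum(w(i, j) * row[i] * col[j]
                for i in range(k) for j in range(k)) / n**2
    return 1 - d_obs / d_exp

# Hypothetical ratings on the 1-4 exclude/include scale from the comment.
rater1 = [1, 2, 3, 4, 2, 3]
rater2 = [1, 2, 3, 4, 3, 2]
print(round(weighted_kappa(rater1, rater2, [1, 2, 3, 4]), 3))  # → 0.684
```

With quadratic weights (`(i - j)**2` instead of `|i - j|`) the same scheme penalises extreme disagreements even more strongly; which weighting is appropriate depends on the study.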

  • @originalvonster
    @originalvonster 1 year ago +1

    2:15 inter-rater reliability
    5:45 calculating

  • @larissacury7714
    @larissacury7714 2 years ago +1

    Thank you!

    • @datatab
      @datatab  2 years ago

      You're welcome! : ) Regards Hannah

  • @irispham8807
    @irispham8807 2 years ago +1

    Excellent explanation. Thanks a lot!

    • @datatab
      @datatab  2 years ago +1

      Glad you liked it!

    • @irispham8807
      @irispham8807 2 years ago

      @@datatab You are my life-saver for my statistics subject. Thanks so much from the bottom of my heart.

  • @archanakamble5212
    @archanakamble5212 1 year ago

    Very informative video; however, I have a question. When calculated manually, Kappa was found to be 0.44, while with DATAtab it was 0.23. Why this difference?

  • @KOTESWARARAOMAKKENAPHD
    @KOTESWARARAOMAKKENAPHD 1 year ago

    very informative

  • @prathyu2559
    @prathyu2559 2 years ago +1

    Superb mam

  • @javedkalva659
    @javedkalva659 2 years ago +1

    Perfect

  • @qqqq-mh5if
    @qqqq-mh5if 5 months ago

    Can this test be used for more than 2 categories rated by two raters, e.g. depressed, not depressed, unknown?

  • @janitosalipdan6002
    @janitosalipdan6002 2 years ago

    How is Cohen's Kappa different from the Cronbach's Alpha?

  • @SupeRafa500
    @SupeRafa500 2 years ago +2

    Very good video.

    • @datatab
      @datatab  2 years ago +1

      Thank you!

  • @chrischan-e2s
    @chrischan-e2s 1 year ago

    How did you get 0.72?
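For anyone with the same question: any Cohen's kappa value comes from the same recipe, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance from the marginals. Without reproducing the video's exact table, here is a sketch on a hypothetical 2×2 contingency table:

```python
# Cohen's kappa from a contingency table: observed agreement p_o on the
# diagonal, chance agreement p_e from the row and column marginals.

def cohens_kappa(table):
    n = sum(sum(row) for row in table)
    k = len(table)
    p_o = sum(table[i][i] for i in range(k)) / n          # observed agreement
    row = [sum(table[i]) for i in range(k)]               # rater 1 marginals
    col = [sum(table[i][j] for i in range(k)) for j in range(k)]  # rater 2
    p_e = sum(row[i] * col[i] for i in range(k)) / n**2   # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical table: rows = rater 1, columns = rater 2.
table = [
    [20,  5],   # rater 1 "yes": 20 joint yes, 5 disagreements
    [10, 15],   # rater 1 "no": 10 disagreements, 15 joint no
]
print(round(cohens_kappa(table), 3))  # p_o = 0.7, p_e = 0.5 → kappa = 0.4
```

Plugging the counts from the video's own table into this formula should reproduce its result; if it does not, the usual culprit is a transposed table or a mixed-up marginal.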

  • @dr.jeevamathan718
    @dr.jeevamathan718 2 years ago

    Randolph's kappa = 0.84, what does it mean?

  • @arberg200x
    @arberg200x 10 months ago +1

    Why did you make an example where the sums of the different variables are identical (two times 25)? It gave me a headache.

    • @datatab
      @datatab  9 months ago

      Oh sorry!

  • @nourchenekharrat3860
    @nourchenekharrat3860 2 years ago

    Thank you so much for your effort. I have 2 questions:
    Is 0.081 a good Kappa value?
    Can I run Cronbach's alpha test to assess the agreement between two raters?

    • @Cell1808
      @Cell1808 2 years ago +1

      0.081 is very bad. There are debates about what counts as good, and it also depends on the research context. In general, I'd say that in most cases everything above 0.5 is the range where it gets interesting.

    • @Cell1808
      @Cell1808 2 years ago +2

      The video actually talks about good and bad values from 9:25 onwards.