Calculating Inter Rater Reliability/Agreement in Excel

  • Published: 22 Aug 2024
  • A brief description of how to calculate inter-rater reliability or agreement in Excel.

Comments • 25

  • @Simmy56 · 8 years ago · +36

    Technically this should not be called inter-rater reliability, as you are presenting only inter-rater agreement (absolute consensus) as opposed to inter-rater consistency (moving in the same direction, or maintenance of rank order across judges). Further, if reporting % agreement, it would be beneficial to correct for chance using Cohen's kappa; or, since your data appears to be continuous, it might be simpler to just calculate a correlation between the two raters (a quick sketch of both options follows below).

    • @drkayotu · 7 years ago · +2

      Yes - both good suggestions. I suppose I have simply followed the model I was taught and read most in education. But these are good options. Thank you.
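
    A minimal Python sketch of both suggestions (chance-corrected agreement via Cohen's kappa, and a simple correlation between the two raters), using made-up scores rather than the video's data:

        # Cohen's kappa treats each score as a category and corrects percent
        # agreement for chance; Pearson's r checks consistency of continuous scores.
        # Standard library only (statistics.correlation needs Python 3.10+).
        from collections import Counter
        from statistics import correlation

        rater1 = [7, 8, 6, 9, 5, 7, 8, 6]   # illustrative scores only
        rater2 = [7, 7, 6, 9, 5, 8, 8, 6]
        n = len(rater1)

        # Observed agreement: proportion of items the two raters score identically.
        p_o = sum(a == b for a, b in zip(rater1, rater2)) / n

        # Expected chance agreement from each rater's marginal score frequencies.
        c1, c2 = Counter(rater1), Counter(rater2)
        p_e = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n**2

        kappa = (p_o - p_e) / (1 - p_e)
        print(f"Percent agreement: {p_o:.2f}, Cohen's kappa: {kappa:.2f}")
        print(f"Pearson r: {correlation(rater1, rater2):.2f}")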

  • @FooodConfusion · 3 years ago · +2

    Thanks for making it so easy, but can you explain rwg(j) or rwg, which is used for team-level variables?

  • @DvdBdjzDvl · 4 years ago · +4

    What would you recommend for when you have 10 coders and each variable has 4-7 values (nominal)?

  • @SPORTSCIENCEps · 3 years ago

    Thank you for uploading it!

  • @donnaderose4959 · 4 years ago · +1

    Thank you.
    Could you share a video on how to calculate inter-rater reliability using SPSS?

  • @Machina3123 · 4 years ago · +4

    Excuse me sir, I would like to ask: how high should the result be to be considered reliable? Is it above 80%?

    • @drkayotu · 4 years ago

      It depends, and I do not believe there are set guidelines, but I like to get 80-95%. Remember, you can always go back and discuss with your rating partner.

  • @Ammar__Ninja7973 · 1 year ago · +1

    thank youuu

  • @soukainaaziz1623 · 8 months ago

    Thank you. What formula is used, please?

  • @subhayumitra5417 · 4 years ago · +1

    Thank you 😊

    • @drkayotu · 4 years ago · +1

      You are very welcome

  • @madelineespunkt6240 · 8 years ago · +2

    What if I have more than 2 raters? Should I give a 1 if everyone rates the question with the same number? Is the procedure the same for a nominal scale? And which measure is this (Pearson correlation, Spearman correlation, or neither)?

    • @drkayotu · 8 years ago · +2

      +Madeline Espunkt Good question. There are different ways. You could require all three raters to agree to score a 1, but that seems a bit strict. You could instead calculate and present IRR for each pair (see the pairwise sketch below this thread). The procedure is the same for any scale.

    • @madelineespunkt6240 · 8 years ago

      +Robin Kay Thanks for your quick reply :)! I plan to run a test manual with my fellow students, so I have about 20 raters for one object. There are 30 points to reach; the raters watch a video of the test subject and then rate it, but I am not sure how to calculate it. :/
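
    A rough Python sketch of the pairwise approach for three or more raters (the rater names and scores below are invented for illustration):

        # Pairwise percent agreement: compare every pair of raters on the same items.
        from itertools import combinations

        ratings = {                          # rater -> scores on the same items
            "rater_a": [3, 5, 4, 2, 5],
            "rater_b": [3, 4, 4, 2, 5],
            "rater_c": [3, 5, 4, 1, 5],
        }

        for (name1, s1), (name2, s2) in combinations(ratings.items(), 2):
            agreement = sum(a == b for a, b in zip(s1, s2)) / len(s1)
            print(f"{name1} vs {name2}: {agreement:.0%} agreement")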

  • @sanjnathakur6489 · 3 years ago

    Sir, please tell me: is inter-rater reliability used for a Likert scale or not?
    And can test-retest be used for a checklist?
    Or is it the other way around?
    Please, it's urgent.

  • @jegathiswarymou3117 · 2 years ago

    What if the scores do not all match?

  • @SufraDIYCrafts1 · 3 years ago

    This is not how we should calculate inter-rater reliability on a 10-point scale. The chances of disagreement are much higher on a scale of 10; there should be some normalization.

  • @shurooqa.a.3245 · 8 years ago

    Please, I need a reference that explains this procedure.
    Thank you

    • @drkayotu · 7 years ago

      Well, there are a number of references; here is one quick one: www.statisticshowto.com/inter-rater-reliability/

  • @dr.yousufmoosa2561 · 8 years ago

    Please show us how to calculate intra-examiner reliability.

    • @drkayotu · 8 years ago

      +Dr. Yousuf Moosa I think the same procedure would apply - Rater 2 would simply be the scores for the same person on a second occasion. The calculations would be the same.

    • @dr.yousufmoosa2561 · 8 years ago

      Thank you

  • @lindsyrichardson2708 · 8 years ago

    Hi Robin. Great video! Is this Cohen's Kappa unweighted? Lindsy

    • @drkayotu · 8 years ago

      +Lindsy Richardson No - this is a simple percent formula, no fancy name. Basically, the number of agreements divided by the total number of answers.
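
      As a worked illustration of that percent formula (a sketch, not the video's exact spreadsheet), with made-up scores:

          # Flag each item the two raters score identically, then divide by the
          # total number of answers: percent agreement = agreements / total.
          rater1 = [4, 5, 3, 5, 2, 4, 4, 5, 3, 1]   # made-up scores
          rater2 = [4, 5, 3, 4, 2, 4, 4, 5, 3, 1]

          agrees = sum(a == b for a, b in zip(rater1, rater2))
          print(f"{agrees} of {len(rater1)} items agree -> {agrees / len(rater1):.0%}")   # 90%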