Checking Intercoder Reliability in your Content Analysis

  • Published: 21 Aug 2024
  • In this video, Dr. Sweetser showcases one way to determine intercoder reliability for a media assessment content analysis project. Having coded the items in Google Forms and cleaned the data in Google Sheets, she shows how she "grades" a coder's category determinations against a key coder. She walks through Holsti's formula for percentage of agreement as a means to compute a reliability score for each article.
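    The "grading" workflow described above can be sketched in code. This is an illustrative example, not Dr. Sweetser's spreadsheet; the function and data names are assumptions. Holsti's formula is 2M / (N1 + N2), where M is the number of coding decisions the two coders agree on and N1, N2 are the number of decisions each coder made.

    ```python
    def holsti(coder1, coder2):
        """Holsti's percentage of agreement: 2M / (N1 + N2)."""
        if len(coder1) != len(coder2):
            raise ValueError("both coders must rate the same items")
        # M: number of category determinations that match
        m = sum(a == b for a, b in zip(coder1, coder2))
        return 2 * m / (len(coder1) + len(coder2))

    # Hypothetical category calls: a coder "graded" against the key coder.
    key   = ["positive", "neutral", "negative", "neutral", "positive"]
    coder = ["positive", "neutral", "positive", "neutral", "positive"]
    print(holsti(key, coder))  # 4 of 5 decisions agree -> 0.8
    ```

    When both coders rate the same set of items, the formula reduces to simple percentage agreement (agreements divided by total items).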

Comments • 2

  • @mic4308 • A month ago

    My supervisor asked me to do Kappa statistics for intercoder reliability. Kathleen, is that the same as what you mentioned?

    • @DrSweetser • A month ago

      No, this video does not demonstrate Cohen's Kappa. I used Holsti's formula for percentage of agreement for intercoder reliability in this video.
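      To illustrate the distinction raised in this thread: unlike Holsti's percentage of agreement, Cohen's Kappa corrects for the agreement two coders would reach by chance. The sketch below is a generic implementation with made-up example data, not anything from the video.

      ```python
      from collections import Counter

      def cohens_kappa(coder1, coder2):
          """Cohen's Kappa: (p_o - p_e) / (1 - p_e), chance-corrected agreement."""
          n = len(coder1)
          # Observed agreement
          p_o = sum(a == b for a, b in zip(coder1, coder2)) / n
          # Expected chance agreement from each coder's marginal category counts
          m1, m2 = Counter(coder1), Counter(coder2)
          p_e = sum(m1[c] * m2[c] for c in set(coder1) | set(coder2)) / (n * n)
          return (p_o - p_e) / (1 - p_e)

      # Hypothetical category calls for six items
      key   = ["pos", "neg", "pos", "neutral", "pos", "neg"]
      coder = ["pos", "neg", "neutral", "neutral", "pos", "pos"]
      print(cohens_kappa(key, coder))  # 11/23, roughly 0.478
      ```

      Here the raw (Holsti-style) agreement is 4/6, but Kappa is lower because some of that agreement would occur by chance given each coder's category frequencies.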