Published in: Quality & Quantity 5/2013

01.08.2013

Underlying determinants driving agreement among coders

Author: Guangchao Charles Feng



Abstract

Many intercoder reliability indices exist, and the choice among them has long been debated. Using a Monte Carlo simulation, the determinants of these agreement indices were tested empirically. The chance agreement of Bennett's S is found to be affected only by the number of categories; S is therefore a category-based index. The chance agreements of Krippendorff's \(\alpha \), Scott's \(\pi \) and Cohen's \(\kappa \) are affected by the marginal distribution, the level of difficulty and the interaction between the two, although the difficulty level influences their chance agreements in an anomalous way. These three indices are therefore, in general, distribution-based indices. Gwet's \(AC_1\) reverses the direction of the three aforementioned indices, but its chance agreement is additionally affected by the number of categories and by the interaction between the number of categories and the marginal distribution; \(AC_1\) thus belongs to a class based on the number of categories, the marginal distribution and the level of difficulty. Both theoretical and practical implications are discussed.
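The contrast drawn in the abstract can be illustrated with a short sketch. The formulas below are the standard two-coder definitions of each index's chance-agreement term as given in the reliability literature (background knowledge, not code reproduced from the article's simulation):

```python
from collections import Counter

def chance_agreements(ratings_a, ratings_b, categories):
    """Chance-agreement terms (P_e) of four indices for two coders.

    Standard two-coder textbook formulas (not the article's own code):
      S     : P_e = 1/k                          -- depends only on k
      pi    : P_e = sum(m_i**2)                  -- pooled marginals m_i
      kappa : P_e = sum(p_i * q_i)               -- per-coder marginals
      AC1   : P_e = sum(m_i*(1-m_i)) / (k-1)     -- reversed direction
    """
    n = len(ratings_a)
    k = len(categories)
    count_a, count_b = Counter(ratings_a), Counter(ratings_b)
    p = {c: count_a[c] / n for c in categories}        # coder A's marginals
    q = {c: count_b[c] / n for c in categories}        # coder B's marginals
    m = {c: (p[c] + q[c]) / 2 for c in categories}     # pooled marginals
    return {
        "S": 1 / k,
        "pi": sum(m[c] ** 2 for c in categories),
        "kappa": sum(p[c] * q[c] for c in categories),
        "AC1": sum(m[c] * (1 - m[c]) for c in categories) / (k - 1),
    }

# Skewed marginals: S stays at 1/k, while pi/kappa rise and AC1 falls.
a = ["x"] * 9 + ["y"]
b = ["x"] * 8 + ["y"] * 2
print(chance_agreements(a, b, ["x", "y"]))
```

With such a skewed distribution, S remains at \(1/k\), \(\pi \) and \(\kappa \) inflate toward 1, and \(AC_1\) shrinks, which is the reversal of direction the abstract attributes to \(AC_1\).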


Footnotes
1
Finn’s r is designed for continuous ratings, but it can be used for binary nominal data.
 
2
Its chance-agreement formula is \(\frac{1}{k^{(n-1)}}\), where k is the number of categories and n is the number of raters. It is therefore the only member of the S family applicable to multiple raters.
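The multi-rater formula in this footnote follows from a simple counting argument: if all n raters pick among k categories uniformly at random, the probability that they coincide is \(k \cdot (1/k)^n = 1/k^{(n-1)}\). A minimal sketch:

```python
def s_chance_agreement(k, n):
    """Chance agreement of the S family generalized to n raters.

    Each of n raters independently picks one of k categories uniformly
    at random; the probability that all n coincide on some category is
    k * (1/k)**n = 1 / k**(n - 1).
    """
    return 1 / k ** (n - 1)

# Four categories: two raters agree by chance 1/4 of the time,
# three raters only 1/16 of the time.
print(s_chance_agreement(4, 2), s_chance_agreement(4, 3))
```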
 
3
Although \(I_r\) does not belong to the S family according to Feng (2012), the two share the same chance-agreement formula; the results for S therefore also apply to \(I_r\).
 
4
The correlation with the unfolded distribution is higher, but the relationship is actually nonlinear, so correlation is not an appropriate measure here.
 
References
Brennan, R., Prediger, D.: Coefficient kappa: some uses, misuses, and alternatives. Educ. Psychol. Meas. 41(3), 687–699 (1981)
Gwet, K.: Inter-rater reliability: dependency on trait prevalence and marginal homogeneity. Stat. Methods Inter Rater Reliab. Assess. Ser. 2, 1–9 (2002)
Gwet, K.: Computing inter-rater reliability and its variance in the presence of high agreement. Br. J. Math. Stat. Psychol. 61(1), 29–48 (2008)
Gwet, K.: Handbook of Inter-rater Reliability: A Definitive Guide to Measuring the Extent of Agreement Among Multiple Raters. Advanced Analytics, LLC, Gaithersburg (2010)
Hayes, A., Krippendorff, K.: Answering the call for a standard reliability measure for coding data. Commun. Methods Meas. 1(1), 77–89 (2007)
Holsti, O.: Content Analysis for the Social Sciences and Humanities. Addison-Wesley, Reading (1969)
Krippendorff, K.: Content Analysis: An Introduction to Its Methodology, 2nd edn. Sage Publications, Thousand Oaks (2004a)
Krippendorff, K.: A dissenting view on so-called paradoxes of reliability coefficients. Commun. Yearbook 36, 519–591 (2012)
Lord, F., Novick, M., Birnbaum, A.: Statistical Theories of Mental Test Scores. Addison-Wesley, Don Mills (1968)
Partchev, I.: irtoys: simple interface to the estimation and plotting of IRT models. R package version 0.1-2 (2009)
Potter, W., Levine-Donnerstein, D.: Rethinking validity and reliability in content analysis. J. Appl. Commun. Res. 27(3), 258–284 (1999)
Shrout, P.: Measurement reliability and agreement in psychiatry. Stat. Methods Med. Res. 7(3), 301–317 (1998)
Tinsley, H.E., Weiss, D.J.: Interrater reliability and agreement. In: Tinsley, H., Brown, S. (eds.) Handbook of Applied Multivariate Statistics and Mathematical Modeling, chap. 4, pp. 95–124. Academic Press, San Diego (2000)
Whitehurst, G.: Interrater agreement for journal manuscript reviews. Am. Psychol. 39(1), 22 (1984)
Zhao, X.: A Reliability Index (ai) that Assumes Honest Coders and Variable Randomness. Association for Education in Journalism and Mass Communication, Chicago (2012)
Metadata
Title
Underlying determinants driving agreement among coders
Author
Guangchao Charles Feng
Publication date
01.08.2013
Publisher
Springer Netherlands
Published in
Quality & Quantity / Issue 5/2013
Print ISSN: 0033-5177
Electronic ISSN: 1573-7845
DOI
https://doi.org/10.1007/s11135-012-9807-z
