Inter-rater reliability using Cohen's Kappa

 
 
Hannah51789
Reply Wed 7 Oct, 2015 09:38 am
Hi,

My question is:

I have a very large data file with many scales that were filled in by hand and then copied into SPSS/Excel by a first rater. Every tenth case (so 10% of the cases) was entered again by a second rater to check whether the first rater had entered it correctly. Can I use Cohen's kappa to test the inter-rater reliability between the first and second rater on the 10% of re-entered cases? And also: can I use Cohen's kappa on the means of the scales, or do I have to use it for each separate question? (There are a LOT of individual questions and only about 20 scales.)
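
In case it makes the question clearer, this is roughly what I picture for the per-question version (just a sketch in Python with made-up file and column names; my real data are in SPSS/Excel):

    # Sketch only, with invented file and column names: assumes the full
    # entry and the re-entered 10% are two tables with the same item
    # columns, keyed by a case ID.
    import pandas as pd
    from sklearn.metrics import cohen_kappa_score

    rater1 = pd.read_excel("entry_rater1.xlsx", index_col="case_id")  # first rater, all cases
    rater2 = pd.read_excel("entry_rater2.xlsx", index_col="case_id")  # second rater, every 10th case

    # Only the cases that were entered twice can be compared
    common = rater1.index.intersection(rater2.index)

    # Kappa per individual question: kappa treats answers as unordered
    # categories, so it is computed item by item rather than on scale means
    kappas = {
        item: cohen_kappa_score(rater1.loc[common, item], rater2.loc[common, item])
        for item in rater2.columns
    }
    print(pd.Series(kappas).sort_values())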

Thank you,
Hannah

 
Hannah51789
 
Reply Thu 8 Oct, 2015 05:36 am
@Hannah51789,
Edit: I have realized in the meantime that I believe I should be using intra-class correlations (ICC) instead of Cohen's kappa to calculate the IRR between the raters on the 10% of re-entered cases, because the data are ordinal rather than categorical. And I believe this can be done using the averages of the scales instead of each individual item. I'd love to hear from someone whether I'm on the right path here... it would be greatly appreciated!
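
If it helps, here is a minimal sketch of the ICC calculation I have in mind (Python with pandas and pingouin; the case IDs and scores below are invented):

    # Minimal sketch with made-up numbers: pingouin's ICC wants long
    # format, one row per case x rater, with the scale mean in 'score'.
    import pandas as pd
    import pingouin as pg

    df = pd.DataFrame({
        "case_id": [1, 1, 2, 2, 3, 3, 4, 4],
        "rater":   ["r1", "r2"] * 4,
        "score":   [2.4, 2.4, 3.1, 3.0, 4.2, 4.2, 1.8, 1.9],
    })

    # pingouin reports ICC1 through ICC3k; since I care about the two
    # raters agreeing exactly, I think the absolute-agreement form (ICC2)
    # is the one to look at, but I'm not certain
    icc = pg.intraclass_corr(data=df, targets="case_id", raters="rater", ratings="score")
    print(icc[["Type", "ICC", "CI95%"]])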