[Corpora-List] kappas

Katja Hofmann K.Hofmann at uva.nl
Wed Oct 1 14:53:27 UTC 2008


The topic is discussed in a forthcoming article in Computational 
Linguistics:

Ron Artstein and Massimo Poesio: Inter-Coder Agreement for 
Computational Linguistics
http://www.mitpressjournals.org/doi/abs/10.1162/coli.07-034-R2

-- Katja




Steen, G.J. wrote:
> Dear all,
>  
> One way to measure inter-coder reliability is Cohen's kappa, but in 
> its standard form this can only be applied to pairs of raters.
>  
> One solution to the problem of having more than two coders is to 
> average Cohen's kappas across all possible pairs of raters, but I am 
> not sure how this practice is regarded in the testing community.
>  
> Another solution to this problem appears to be Fleiss' kappa, which 
> can accommodate more than two raters in a single reliability 
> analysis. What sort of experience do you have with this statistic? 
> And are there any software packages that include it (since SPSS does 
> not seem to have it)?
>  
> Any advice will be greatly appreciated.
>  
> Many thanks,
>  
> Gerard Steen
>  
>  
>  
> Professor of Language Use and Cognition
> Director, Language, Cognition, Communication program
> Faculty of Arts, 11A-35
> Department of Language and Communication
> VU University Amsterdam
> De Boelelaan 1105
> 1081 HV Amsterdam
> 
> T: +31-20-5986433
> F: +31-20-5986500
>  
> http://www.let.vu.nl/staf/gj.steen/
> 
> 
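
On the practical side, here is a minimal pure-Python sketch of Cohen's
kappa for two raters, straight from the standard formula
kappa = (p_o - p_e) / (1 - p_e); the function name and the toy labels
are illustrative, not taken from any particular package:

from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters labelling the same items."""
    n = len(rater1)
    # Observed agreement: proportion of items with identical labels.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: product of the two raters' marginal
    # proportions, summed over categories (Counter returns 0 for
    # categories a rater never used).
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (p_o - p_e) / (1 - p_e)

print(cohens_kappa(list("AABBC"), list("AABBA")))  # (0.8-0.4)/0.6 = 0.667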

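Averaging Cohen's kappa over all rater pairs, as Gerard suggests, is
sometimes called Light's kappa. A sketch that builds on the
cohens_kappa function above, assuming every rater labelled the same
items:

from itertools import combinations
from statistics import mean

def mean_pairwise_kappa(ratings):
    """Average Cohen's kappa over all pairs of raters.

    ratings: one label sequence per rater, all over the same items.
    Relies on the cohens_kappa function from the previous sketch.
    """
    return mean(cohens_kappa(a, b) for a, b in combinations(ratings, 2))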

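Finally, a bare-bones sketch of Fleiss' kappa itself, again the
textbook formula rather than production code; for a tested
implementation outside SPSS, R's irr package provides kappam.fleiss:

def fleiss_kappa(table):
    """Fleiss' kappa from an items x categories table of counts.

    table[i][j] = number of raters assigning item i to category j;
    every row must sum to the same number of raters.
    """
    n_items = len(table)
    n_raters = sum(table[0])
    # Mean per-item agreement P-bar.
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in table
    ) / n_items
    # Chance agreement from the pooled category proportions.
    total = n_items * n_raters
    p_e = sum((sum(row[j] for row in table) / total) ** 2
              for j in range(len(table[0])))
    return (p_bar - p_e) / (1 - p_e)

# Three raters, three items, two categories.
print(fleiss_kappa([[3, 0], [0, 3], [2, 1]]))  # 22/40 = 0.55
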
_______________________________________________
Corpora mailing list
Corpora at uib.no
http://mailman.uib.no/listinfo/corpora


