Publisher: Книга по требованию
Release date: July 2011
ISBN: 978-6-1328-1695-5
Length: 68 pages
Weight: 123 g
Dimensions (H x W x D), cm: 23 x 16 x 1
Cohen's kappa coefficient is a statistical measure of inter-rater agreement or inter-annotator agreement for qualitative (categorical) items. It is generally thought to be a more robust measure than a simple percent agreement calculation, since κ takes into account the agreement occurring by chance. Some researchers (e.g. Strijbos, Martens, Prins, & Jochems, 2006) have expressed concern over κ's tendency to take the observed categories' frequencies as givens, which can have the effect of underestimating agreement for a category that is also commonly used; for this reason, κ is considered an overly conservative measure of agreement.
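To make the definition concrete, below is a minimal Python sketch (not from the source; the function name cohens_kappa and the toy data are illustrative) that computes κ = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e is the agreement expected by chance from each rater's marginal category frequencies.

# A minimal sketch of Cohen's kappa for two raters; assumes ratings
# arrive as two equal-length lists of category labels. Illustrative
# only, not an implementation taken from the book.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Return (p_o - p_e) / (1 - p_e) for two label sequences."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items where the raters agree.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of the raters' marginal frequencies,
    # summed over categories (Counter returns 0 for unseen labels).
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Example: 10 items labeled "yes"/"no" by two annotators.
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]
print(round(cohens_kappa(a, b), 3))  # 0.583: p_o = 0.8, p_e = 0.52

Note how the example illustrates the critique quoted above: p_e is computed from the observed marginal frequencies themselves, so frequently used categories drive up the chance-agreement term and pull κ below the raw 80% agreement.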
This edition is not an original. The book is produced with print-on-demand technology after an order is received.