What is IRR
Inter-rater reliability (IRR) is the degree of agreement among different raters assessing the same subjects or objects with the same scale, classification, instrument, or procedure.
The most basic measure of inter-rater reliability is the percent agreement between raters.
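As a sketch, percent agreement is just the fraction of items on which raters assign the same category. Because it does not correct for agreement expected by chance, a chance-corrected coefficient such as Cohen's kappa (for two raters) is often reported alongside it. The ratings below are invented for illustration:

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Fraction of items on which two raters assigned the same category."""
    assert len(r1) == len(r2)
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Chance-corrected agreement between two raters (Cohen's kappa)."""
    n = len(r1)
    p_o = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    # expected chance agreement, from each rater's marginal category frequencies
    p_e = sum((c1[c] / n) * (c2[c] / n) for c in set(r1) | set(r2))
    return (p_o - p_e) / (1 - p_e)

# hypothetical data: two raters classify 6 items as "yes"/"no"
r1 = ["yes", "yes", "no", "no", "yes", "no"]
r2 = ["yes", "no", "no", "no", "yes", "no"]
print(round(percent_agreement(r1, r2), 3))  # 0.833
print(round(cohens_kappa(r1, r2), 3))       # 0.667
```

Note how the chance correction pulls the coefficient well below the raw agreement: the two raters agree on 5 of 6 items, but half of that agreement would be expected by chance alone given their marginal frequencies.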
ICC values less than 0.5 are indicative of poor reliability, values between 0.5 and 0.75 indicate moderate reliability, values between 0.75 and 0.9 indicate good reliability, and values greater than 0.9 indicate excellent reliability.
Psychologists consider three types of consistency: over time (test-retest reliability), across items (internal consistency), and across different researchers (inter-rater reliability).
Intra-rater vs. inter-rater reliability: while inter-rater reliability involves two or more raters, intra-rater reliability is the consistency of grading by a single rater, i.e. scores on a test are rated by a single rater/judge at different times. In statistics, intra-rater reliability is the degree of agreement among repeated administrations of a diagnostic test performed by a single rater.
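For continuous scores, one simple way to quantify this single-rater consistency is the correlation between the two scoring sessions (an ICC is preferred when absolute agreement, not just rank order, matters). A minimal sketch with hypothetical essay scores:

```python
import numpy as np

# hypothetical scores: one rater grades the same 8 essays twice, weeks apart
session1 = np.array([72, 85, 60, 90, 78, 66, 88, 75])
session2 = np.array([70, 87, 62, 91, 75, 68, 85, 77])

# Pearson correlation as a simple index of intra-rater consistency;
# note it ignores systematic shifts (e.g. the rater scoring harsher overall)
r = np.corrcoef(session1, session2)[0, 1]
print(round(r, 3))
```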
The Intraclass Correlation Coefficient (ICC) is a measure of the reliability of measurements or ratings.
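A minimal sketch of the one-way random-effects ICC (ICC(1,1) in Shrout and Fleiss's notation), computed from the between-subject and within-subject mean squares of a one-way ANOVA; the ratings below are hypothetical:

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC, i.e. ICC(1,1) in Shrout & Fleiss's notation.

    `ratings` is an (n_subjects, k_raters) array; raters are treated as
    randomly drawn and need not be the same across subjects."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)
    # between-subjects and within-subjects mean squares (one-way ANOVA)
    ms_between = k * np.sum((row_means - grand_mean) ** 2) / (n - 1)
    ms_within = np.sum((ratings - row_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# hypothetical data: 5 subjects rated on a 1-10 scale by 3 raters
ratings = [[7, 8, 7],
           [5, 5, 6],
           [9, 9, 8],
           [3, 4, 3],
           [6, 7, 6]]
print(round(icc_oneway(ratings), 3))  # 0.923
```

By the rule of thumb above, an ICC of 0.923 would fall in the "excellent reliability" band. Other ICC forms (two-way models, consistency vs. absolute agreement, average of k raters) exist and can give different values on the same data, so reports should state which form was used.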
Why is it important
What ways to measure
Medical examples
In the social sciences, coding is an analytical process in which data, in either quantitative form (such as questionnaire results) or qualitative form (such as interview transcripts), are categorized to facilitate analysis. One purpose of coding is to transform the data into a form suitable for computer-aided analysis.
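The reliability of such coding is often checked with Krippendorff's alpha (see the references below). A minimal sketch for the simplest case (two coders, nominal categories, no missing data), built from the coincidence-matrix formulation; the coder data are invented:

```python
from collections import Counter
from itertools import product

def krippendorff_alpha_nominal(coder1, coder2):
    """Krippendorff's alpha for the simplest case: two coders,
    nominal categories, no missing data."""
    assert len(coder1) == len(coder2)
    n = 2 * len(coder1)  # total number of pairable values
    # coincidence matrix: each coded unit contributes both ordered pairs
    o = Counter()
    for a, b in zip(coder1, coder2):
        o[(a, b)] += 1
        o[(b, a)] += 1
    n_c = Counter()  # marginal totals per category
    for (c, _), count in o.items():
        n_c[c] += count
    cats = list(n_c)
    # observed and expected disagreement (nominal difference function)
    d_o = sum(o[(c, k)] for c, k in product(cats, cats) if c != k) / n
    d_e = sum(n_c[c] * n_c[k] for c, k in product(cats, cats) if c != k) / (n * (n - 1))
    return 1 - d_o / d_e

# hypothetical data: two coders assign 5 interview passages to categories a/b
coder1 = ["a", "a", "b", "b", "a"]
coder2 = ["a", "b", "b", "b", "a"]
print(round(krippendorff_alpha_nominal(coder1, coder2), 3))  # 0.64
```

Unlike Cohen's kappa, alpha generalizes to any number of coders, missing data, and ordinal/interval/ratio difference functions; for production use, a tested implementation such as the tools in the references is preferable to this sketch.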
Providing quality data in health care - almost perfect inter-rater agreement in the Norwegian tonsil surgery register
https://bmcmedresmethodol.biomedcentral.com/articles/10.1186/s12874-018-0651-2
Measuring hospital adverse events: assessing inter-rater reliability and trigger performance of the Global Trigger Tool.
https://www.ncbi.nlm.nih.gov/pubmed/20534607
Inter-Rater Reliability Testing For Utilization Management Staff
https://www.managedcaremag.com/archives/2001/6/inter-rater-reliability-testing-utilization-management-staff
Inter-rater reliability in performance status assessment among health care professionals: a systematic review (palliative medicine)
http://apm.amegroups.com/article/view/9595/10790
References:
- Hayes AF, Krippendorff K. Answering the call for a standard reliability measure for coding data. Communication Methods and Measures 2007; 1(1): 77-89. (afhayes.com)
- Freelon D. ReCal: reliability calculation for the masses. An online utility that computes intercoder/interrater reliability coefficients for nominal, ordinal, interval, or ratio-level data. (dfreelon.org)
- Löfgren K. Ordinal data: Krippendorff alpha inter-rater reliability test. How to use a statistical test (Krippendorff alpha) to check the reliability of a variable with ordinal data. (Windows PC & SPSS.) (www.youtube.com)
- Löfgren K. Nominal dichotomous yes/no data: Krippendorff alpha inter-rater reliability. How to use a statistical test (Krippendorff alpha) to check the reliability of a variable with nominal/dichotomous data. (Windows PC & SPSS.) (www.youtube.com)
- Zaiontz C. Real Statistics Using Excel: Krippendorff's alpha. How to use a statistical test (Krippendorff alpha) to check the reliability of a variable. (Windows PC & Excel.) (www.youtube.com)