Scoring reading parameters: An inter-rater reliability study using the MNREAD chart
Linnaeus University, Faculty of Health and Life Sciences, Department of Medicine and Optometry (REVO). ORCID iD: 0000-0002-3745-0035
Linnaeus University, Faculty of Health and Life Sciences, Department of Medicine and Optometry; University of Minho, Braga, Portugal. ORCID iD: 0000-0003-3436-2010
University of Minnesota, USA.
University of Minho, Braga, Portugal.
2019 (English). In: PLoS ONE, ISSN 1932-6203, E-ISSN 1932-6203, Vol. 14, no. 6, p. 1-14, article id e0216775. Article in journal (Refereed). Published.
Abstract [en]

Purpose: First, to evaluate inter-rater reliability when human raters estimate the reading performance of visually impaired individuals using the MNREAD acuity chart. Second, to evaluate the agreement between computer-based scoring algorithms and compare them with human rating.

Methods: Reading performance was measured for 101 individuals with low vision, using the Portuguese version of the MNREAD test. Seven raters estimated the maximum reading speed (MRS) and critical print size (CPS) of each individual MNREAD curve. MRS and CPS were also calculated automatically for each curve using two different algorithms: the original standard deviation method (SDev) and non-linear mixed-effects (NLME) modeling. Intra-class correlation coefficients (ICC) were used to estimate absolute agreement between raters and/or algorithms.

Results: Absolute agreement between raters was ‘excellent’ for MRS (ICC = 0.97; 95% CI [0.96, 0.98]) and ‘moderate’ to ‘good’ for CPS (ICC = 0.77; 95% CI [0.69, 0.83]). For CPS, inter-rater reliability was poorer among less experienced raters (ICC = 0.70; 95% CI [0.57, 0.80]) than among experienced ones (ICC = 0.82; 95% CI [0.76, 0.88]). Absolute agreement between the two algorithms was ‘excellent’ for MRS (ICC = 0.96; 95% CI [0.91, 0.98]). For CPS, the best agreement was obtained when CPS was defined as the print size sustaining 80% of MRS (ICC = 0.77; 95% CI [0.68, 0.84]). Absolute agreement between raters and automated methods was ‘excellent’ for MRS (ICC = 0.96; 95% CI [0.88, 0.98] for SDev; ICC = 0.97; 95% CI [0.95, 0.98] for NLME). For CPS, absolute agreement between raters and SDev ranged from ‘poor’ to ‘good’ (ICC = 0.66; 95% CI [0.30, 0.80]), while agreement between raters and NLME was ‘good’ (ICC = 0.83; 95% CI [0.76, 0.88]).

Conclusion: For MRS, inter-rater reliability is excellent, even considering the possibility of noisy and/or incomplete data collected in low-vision individuals. For CPS, inter-rater reliability is lower, which may be problematic, for instance in multisite investigations or follow-up examinations. The NLME method showed better agreement with the raters than the SDev method for both reading parameters. Establishing consensual guidelines for dealing with ambiguous curves may help improve reliability. While the exact definition of CPS should be chosen case by case depending on the clinician's or researcher's motivations, the evidence suggests that estimating CPS as the smallest print size sustaining about 80% of MRS would increase inter-rater reliability.
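To make the two reading parameters concrete, here is a minimal Python sketch that scores a single hypothetical MNREAD curve. All data values are invented, and the maximum-reading-speed estimate is a simple plateau average rather than the SDev or NLME algorithms evaluated in the study; only the critical-print-size rule (smallest print size sustaining about 80% of MRS) follows the definition recommended in the abstract.

# Illustrative sketch only (not from the paper): scoring one hypothetical MNREAD curve.
import numpy as np

# Hypothetical measurements for one reader: print size (logMAR),
# reading time per sentence (seconds), and number of misread words.
print_size = np.array([1.3, 1.2, 1.1, 1.0, 0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3])
time_s = np.array([3.1, 3.0, 3.2, 3.1, 3.3, 3.5, 3.9, 4.8, 6.5, 10.2, 21.0])
errors = np.array([0, 0, 0, 0, 0, 0, 1, 1, 2, 3, 5])

# Each MNREAD sentence contains 10 standard-length words, so
# reading speed (words per minute) = 60 * (10 - errors) / reading time.
speed_wpm = 60.0 * (10 - errors) / time_s

# Maximum reading speed (MRS): here a simple geometric mean over the plateau,
# taken as all sentences read at >= 90% of the fastest observed speed.
# (A simplification; the study compares SDev and NLME estimates instead.)
plateau = speed_wpm >= 0.9 * speed_wpm.max()
mrs = 10 ** np.log10(speed_wpm[plateau]).mean()

# Critical print size (CPS): smallest print size still sustaining 80% of MRS,
# the definition the abstract suggests improves inter-rater reliability.
cps = print_size[speed_wpm >= 0.8 * mrs].min()

print(f"MRS ~ {mrs:.0f} words/min, CPS ~ {cps:.1f} logMAR")

On these made-up numbers the sketch reports an MRS of about 191 words per minute and a CPS of 0.8 logMAR; with real low-vision data the plateau is often noisier, which is exactly the ambiguity the study quantifies.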

Place, publisher, year, edition, pages
Public Library of Science, 2019. Vol. 14, no. 6, p. 1-14, article id e0216775
Keywords [en]
MNREAD acuity chart, visual impairment, algorithms
National Category
Ophthalmology
Research subject
Natural Science, Optometry
Identifiers
URN: urn:nbn:se:lnu:diva-84759
DOI: 10.1371/journal.pone.0216775
ISI: 000470658500005
PubMedID: 31173587
Scopus ID: 2-s2.0-85067111247
OAI: oai:DiVA.org:lnu-84759
DiVA, id: diva2:1321533
Available from: 2019-06-08. Created: 2019-06-08. Last updated: 2019-08-29. Bibliographically approved.

Open Access in DiVA

fulltext (2079 kB)
File information
File name: FULLTEXT01.pdf
File size: 2079 kB
Checksum (SHA-512): a0e879054d5e2a169de454ac0f7d9b7c9f1e5405f37c01d23f601695e4392799b49ac6c3a668d9e4436bd99fec5b058bd9cfe0692aa5994f537973c920bff154
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text | PubMed | Scopus

Authority records

Baskaran, Karthikeyan; Macedo, António Filipe
