Asynchronous interpretation of manual and automated audiometry: agreement and reliability

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

Introduction: Remote interpretation of automated audiometry offers the potential to enable asynchronous tele-audiology assessment and diagnosis in areas where synchronous tele-audiometry may not be possible or practical. The aim of this study was to compare remote interpretation of manual and automated audiometry.

Methods: Five audiologists each interpreted manual and automated audiograms obtained from 42 patients. The main outcome variable was the audiologist’s recommended patient management (treatment, referral or discharge) for each of the manual and automated audiometry tests. Cohen’s Kappa and Krippendorff’s Alpha were used to quantify intra- and inter-observer agreement, respectively, and McNemar’s test was used to assess the audiologist-rated accuracy of audiograms. Audiograms were randomised and audiologists were blinded as to whether they were interpreting a manual or automated audiogram.

Results: Intra-observer agreement was substantial for management outcomes when comparing interpretations of manual and automated audiograms. Inter-observer agreement between clinicians was moderate for management decisions when interpreting both manual and automated audiograms. Audiologists were 2.8 times more likely to question the accuracy of an automated audiogram than a manual audiogram.

Discussion: There is a lack of agreement between audiologists when interpreting audiograms, whether recorded with automated or manual audiometry. The main source of variability in remote audiogram interpretation is likely to be individual clinician variation, rather than automation. © The Author(s) 2016.
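The agreement statistics in the abstract can be illustrated with a small sketch. Cohen's kappa corrects raw percent agreement for the agreement expected by chance, and the qualitative labels "moderate" and "substantial" follow the conventional Landis & Koch (1977) interpretation bands. The management labels below are hypothetical examples, not the study's data, and this is an illustrative implementation rather than the authors' analysis code.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Expected agreement if each rater assigned labels independently
    # according to their own marginal label frequencies.
    expected = sum(counts_a[lab] * counts_b[lab] / n**2
                   for lab in set(rater_a) | set(rater_b))
    return (observed - expected) / (1 - expected)

def landis_koch(kappa):
    """Conventional Landis & Koch (1977) interpretation bands."""
    if kappa < 0:
        return "poor"
    for upper, label in [(0.20, "slight"), (0.40, "fair"),
                         (0.60, "moderate"), (0.80, "substantial"),
                         (1.00, "almost perfect")]:
        if kappa <= upper:
            return label
    return "almost perfect"

# Hypothetical management decisions for the same four audiograms.
manual    = ["refer", "refer", "discharge", "treat"]
automated = ["refer", "discharge", "discharge", "treat"]
k = cohens_kappa(manual, automated)
print(f"kappa = {k:.2f} ({landis_koch(k)} agreement)")  # kappa = 0.64 (substantial agreement)
```

Krippendorff's alpha, used in the study for inter-observer agreement, generalises this idea to more than two raters and missing ratings, which kappa does not handle.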
Original language: English
Pages (from-to): 37–43
Number of pages: 7
Journal: Journal of Telemedicine and Telecare
Volume: 24
Issue number: 1
DOIs: 10.1177/1357633X16669899
Publication status: Published - Jan 2018

Fingerprint

Audiometry
Audiology
Automation
Audiologists
Referral and Consultation

Cite this

@article{a634e2616a9047e49180f8d104475b4c,
title = "Asynchronous interpretation of manual and automated audiometry: agreement and reliability",
abstract = "Introduction: Remote interpretation of automated audiometry offers the potential to enable asynchronous tele-audiology assessment and diagnosis in areas where synchronous tele-audiometry may not be possible or practical. The aim of this study was to compare remote interpretation of manual and automated audiometry. Methods: Five audiologists each interpreted manual and automated audiograms obtained from 42 patients. The main outcome variable was the audiologist’s recommendation for patient management (which included treatment recommendations, referral or discharge) between the manual and automated audiometry test. Cohen’s Kappa and Krippendorff’s Alpha were used to calculate and quantify the intra- and inter-observer agreement, respectively, and McNemar’s test was used to assess the audiologist-rated accuracy of audiograms. Audiograms were randomised and audiologists were blinded as to whether they were interpreting a manual or automated audiogram. Results: Intra-observer agreement was substantial for management outcomes when comparing interpretations for manual and automated audiograms. Inter-observer agreement was moderate between clinicians for determining management decisions when interpreting both manual and automated audiograms. Audiologists were 2.8 times more likely to question the accuracy of an automated audiogram compared to a manual audiogram. Discussion: There is a lack of agreement between audiologists when interpreting audiograms, whether recorded with automated or manual audiometry. The main variability in remote audiogram interpretation is likely to be individual clinician variation, rather than automation. {\circledC} 2016, {\circledC} The Author(s) 2016.",
author = "Brennan-Jones, {Christopher G.} and Eikelboom, {Robert H.} and Bennett, {Rebecca J.} and Tao, {Karina F.M.} and Swanepoel, {De Wet}",
year = "2018",
month = jan,
doi = "10.1177/1357633X16669899",
language = "English",
volume = "24",
pages = "37--43",
journal = "Journal of Telemedicine and Telecare",
issn = "1357-633X",
publisher = "SAGE Publications Ltd",
number = "1",
}

TY - JOUR

T1 - Asynchronous interpretation of manual and automated audiometry: agreement and reliability

AU - Brennan-Jones, Christopher G.

AU - Eikelboom, Robert H.

AU - Bennett, Rebecca J.

AU - Tao, Karina F.M.

AU - Swanepoel, De Wet

PY - 2018/1

Y1 - 2018/1

N2 - Introduction: Remote interpretation of automated audiometry offers the potential to enable asynchronous tele-audiology assessment and diagnosis in areas where synchronous tele-audiometry may not be possible or practical. The aim of this study was to compare remote interpretation of manual and automated audiometry. Methods: Five audiologists each interpreted manual and automated audiograms obtained from 42 patients. The main outcome variable was the audiologist’s recommendation for patient management (which included treatment recommendations, referral or discharge) between the manual and automated audiometry test. Cohen’s Kappa and Krippendorff’s Alpha were used to calculate and quantify the intra- and inter-observer agreement, respectively, and McNemar’s test was used to assess the audiologist-rated accuracy of audiograms. Audiograms were randomised and audiologists were blinded as to whether they were interpreting a manual or automated audiogram. Results: Intra-observer agreement was substantial for management outcomes when comparing interpretations for manual and automated audiograms. Inter-observer agreement was moderate between clinicians for determining management decisions when interpreting both manual and automated audiograms. Audiologists were 2.8 times more likely to question the accuracy of an automated audiogram compared to a manual audiogram. Discussion: There is a lack of agreement between audiologists when interpreting audiograms, whether recorded with automated or manual audiometry. The main variability in remote audiogram interpretation is likely to be individual clinician variation, rather than automation. © 2016, © The Author(s) 2016.

AB - Introduction: Remote interpretation of automated audiometry offers the potential to enable asynchronous tele-audiology assessment and diagnosis in areas where synchronous tele-audiometry may not be possible or practical. The aim of this study was to compare remote interpretation of manual and automated audiometry. Methods: Five audiologists each interpreted manual and automated audiograms obtained from 42 patients. The main outcome variable was the audiologist’s recommendation for patient management (which included treatment recommendations, referral or discharge) between the manual and automated audiometry test. Cohen’s Kappa and Krippendorff’s Alpha were used to calculate and quantify the intra- and inter-observer agreement, respectively, and McNemar’s test was used to assess the audiologist-rated accuracy of audiograms. Audiograms were randomised and audiologists were blinded as to whether they were interpreting a manual or automated audiogram. Results: Intra-observer agreement was substantial for management outcomes when comparing interpretations for manual and automated audiograms. Inter-observer agreement was moderate between clinicians for determining management decisions when interpreting both manual and automated audiograms. Audiologists were 2.8 times more likely to question the accuracy of an automated audiogram compared to a manual audiogram. Discussion: There is a lack of agreement between audiologists when interpreting audiograms, whether recorded with automated or manual audiometry. The main variability in remote audiogram interpretation is likely to be individual clinician variation, rather than automation. © 2016, © The Author(s) 2016.

U2 - 10.1177/1357633X16669899

DO - 10.1177/1357633X16669899

M3 - Article

VL - 24

SP - 37

EP - 43

JO - Journal of Telemedicine & Telecare

JF - Journal of Telemedicine & Telecare

SN - 1357-633X

IS - 1

ER -