Variation in tone presentation by Pure Tone Audiometers: the potential for error in screening audiometry

Christopher Barlow, Lee Davison, Mark Ashmore

    Research output: Chapter in Book/Report/Published conference proceeding › Conference contribution › peer-review


    Abstract

    Manual pure tone audiometry has been in consistent use for a long period of time and is considered the ‘gold standard’ for the assessment of hearing thresholds. Increasing legislative requirements, and the substantial global cost of noise-induced hearing loss, mean that considerable reliance is placed on this tool for diagnosis.
    A number of questions remain regarding the accuracy of pure tone audiometry undertaken in field conditions, particularly relating to the difference between laboratory calibration conditions and clinical or industrial screening use.
    This study assessed the test-retest variability of a number of commercially available screening pure tone audiometers, all with recent calibration, using both laboratory tests and clinical tests in accordance with ISO 8253-1.
    The results of both the laboratory and clinical studies showed a high level of test-retest variability, with a maximum between-test variation of 21 dB at some frequencies in the laboratory tests and 35 dB in the clinical tests. Considerable variation occurred at all frequencies, with particularly high variation at 6 kHz for all meters. The levels of variation measured in this study suggest a high potential for diagnostic error when using screening pure tone audiometry.
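    The between-test variation reported above can be read as the spread (maximum minus minimum) of repeated presentation-level measurements at each test frequency. A minimal sketch of that metric follows; the dB values are invented for illustration, as the study's raw data are not reproduced here.

    ```python
    # Hypothetical illustration of the between-test variation metric described
    # in the abstract: for each frequency, the variation is the spread
    # (max - min) of presentation levels across repeated tests of one meter.
    # All dB values below are invented, not data from the study.

    def between_test_variation(levels_db):
        """Spread (max - min, in dB) of repeated level measurements."""
        return max(levels_db) - min(levels_db)

    # Invented example: presentation levels (dB) from four repeat tests
    # of the same screening audiometer at two frequencies.
    measurements = {
        "1 kHz": [40.0, 42.5, 39.0, 41.0],
        "6 kHz": [38.0, 52.0, 45.5, 49.0],  # 6 kHz showed the most variation
    }

    for freq, levels in measurements.items():
        print(f"{freq}: {between_test_variation(levels):.1f} dB")
    ```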
    Original language: English
    Title of host publication: Proceedings of Euronoise 2015
    Publication status: Published - Jul 2015
