Variation in tone presentation by Pure Tone Audiometers: the potential for error in screening audiometry.

Christopher Barlow, Lee Davison, Mark Ashmore

Research output: Chapter in Book/Report/Published conference proceeding › Conference contribution

Abstract

Manual pure tone audiometry has been in consistent use for many decades and is considered the ‘gold standard’ for the assessment of hearing thresholds. Increasing legislative requirements, together with the significant global cost of noise-induced hearing loss, mean that considerable reliance is placed on this tool for diagnosis.
A number of questions remain regarding the accuracy of pure tone audiometry undertaken in field conditions, particularly relating to the difference between the conditions of laboratory calibration and those of clinical or industrial screening use.
This study assessed the test-retest variability of a number of commercially available screening pure tone audiometers, all recently calibrated, using both laboratory tests and clinical tests in accordance with ISO 8253-1.
Both the laboratory and clinical studies showed a high level of test-retest variability, with a maximum between-test variation of 21 dB at some frequencies in the laboratory tests and 35 dB in the clinical tests. Considerable variation occurred at all frequencies, with particularly high variation at 6 kHz for all instruments. The levels of variation measured in this study suggest a high potential for diagnostic error when using screening pure tone audiometry.
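The between-test variation reported above is the spread of presentation levels measured across repeated tests at each frequency. A minimal sketch of that calculation, using made-up measurement values (the data below are illustrative only, not from the paper):

```python
# Illustrative sketch: maximum between-test variation per frequency from
# repeated measurements of an audiometer's presentation level.
# All frequencies and dB values here are hypothetical examples.

def max_between_test_variation(measurements):
    """For each frequency (Hz), return the spread (max - min) of the
    measured presentation levels across repeated tests, in dB."""
    return {freq: max(levels) - min(levels)
            for freq, levels in measurements.items()}

# Hypothetical repeated measurements (dB) for one screening audiometer
measurements = {
    1000: [30.0, 31.5, 29.5],
    4000: [28.0, 35.0, 30.5],
    6000: [25.0, 46.0, 33.0],  # the abstract reports 6 kHz as the most variable
}

variation = max_between_test_variation(measurements)
print(variation)  # {1000: 2.0, 4000: 7.0, 6000: 21.0}
```

With this metric, a 21 dB spread means two tests on the same nominally calibrated instrument could place the same tone more than four 5 dB audiometric steps apart.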
Original language: English
Title of host publication: Proceedings of Euronoise 2015
Publication status: Published - Jul 2015



