Abstract:
Auditory adaptation is characterized by a decrease in apparent loudness over several minutes. This is true for magnitude-estimated loudness adaptation measured either ipsilaterally [Weiler et al., Br. J. Audiol. 15, 201--204 (1981)] or induced binaurally [Botte et al., J. Acoust. Soc. Am. 72, 727--739 (1982)]. The extent of ipsilateral adaptation is also a function of time of day [Sandman et al., J. Aud. Res. 22, 65--69 (1982)] as well as of duration of exposure [Weiler and Cobb, J. Aud. Res. 22, 233--239 (1982)]. Adaptation in dB measured by the classic Simultaneous Dichotic Loudness Balance procedure likewise progresses over time [Hood, Acta Oto-Laryngol. Suppl. 92, 1--57 (1950)]. In 1976, Davis and Weiler [Br. J. Audiol. 10, 102--106] found that simple reaction time (RT) to a constant intensity increased reliably after 7 min of exposure, as if the intensity had decreased. Goldman et al. [J. Aud. Res. 21, 13--16 (1981)] and Weiler et al. [J. Gen. Psychol. 114 (1987; errata, 1988)] confirmed this effect. However, in a modified design [T. Goldman, Ph.D. dissertation, University of Cincinnati (1985)], reaction time increased significantly only on the second trial, after 15 s of exposure; thereafter, RT values did not differ significantly from baseline. Hick's law, together with procedural differences among the RT adaptation studies, is offered as an explanation.
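For reference, one standard statement of Hick's law (the Hick--Hyman law) expresses mean choice reaction time as a function of the number $n$ of equally likely stimulus alternatives, with $a$ and $b$ empirically fitted constants (symbols as conventionally used, not taken from the studies cited above): $\mathrm{RT} = a + b\,\log_2(n + 1)$.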