5pPP13. Lateralization of nonspeech audio-visual stimulus combinations.

Session: Friday Afternoon, May 17


Author: N. J. Holt
Location: Dept. of Psych., Univ. of York, Heslington, York YO1 5DD, England

Abstract:

The ability to lateralize stimuli was measured in eight normally hearing subjects. In experiment 1, auditory or visual stimuli were presented, and subjects responded with an auditory or visual pointer in conditions where the stimulus and response modalities were the same (uni-modal) or different (cross-modal). A linear relationship was found between the position of the target stimuli and the perceived lateral position, establishing the correspondence between auditorily and visually presented positions, consistent with Yost [J. Acoust. Soc. Am. 70, 397--409 (1981)]. Mean judgments of lateral position were independent of stimulus or response modality. In experiment 2, subjects were presented with bi-modal audio-visual stimuli whose modal components were spatially and temporally corresponding, and responded with an auditory pointer. Mean judgments of position were similar to those in experiment 1, but standard deviations were significantly smaller for the bi-modal stimuli than for the uni-modal stimuli. Experiments 3 and 4 manipulated the spatial or temporal relationship between the modal components of the bi-modal stimuli. Although the relative importance of the visual modality was confirmed [Colavita, Percept. Psychophys. 15(2), 409--412 (1974)], the results of both experiments indicated that perception of the location of an audio-visual stimulus is influenced by information conveyed in both modalities. [Work supported by the UK BBSRC.]
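
As a rough illustration of the bi-modal result in experiment 2 (smaller standard deviations for audio-visual than for uni-modal stimuli), the Python sketch below simulates independent, noisy auditory and visual position estimates and combines them with a reliability-weighted average. The noise levels, the weighting rule, and all names used here are illustrative assumptions, not the paradigm or analysis reported in the abstract.

    # Illustrative sketch only: shows why combining two independent, spatially
    # coincident position cues can yield judgments with a smaller standard
    # deviation than either cue alone. Noise levels and the variance-weighted
    # combination rule are assumptions, not values taken from the study.
    import numpy as np

    rng = np.random.default_rng(0)

    n_trials = 10_000
    target_deg = 10.0            # hypothetical lateral target position (degrees)
    sigma_a, sigma_v = 6.0, 3.0  # assumed auditory and visual noise (degrees)

    # Independent uni-modal position estimates on each simulated trial.
    est_a = rng.normal(target_deg, sigma_a, n_trials)
    est_v = rng.normal(target_deg, sigma_v, n_trials)

    # Reliability-weighted combination of the two cues (weight ~ 1 / variance).
    w_a = sigma_v**2 / (sigma_a**2 + sigma_v**2)
    w_v = sigma_a**2 / (sigma_a**2 + sigma_v**2)
    est_av = w_a * est_a + w_v * est_v

    print(f"auditory-only SD: {est_a.std(ddof=1):.2f} deg")
    print(f"visual-only   SD: {est_v.std(ddof=1):.2f} deg")
    print(f"audio-visual  SD: {est_av.std(ddof=1):.2f} deg")
    # Predicted combined SD: sqrt(sigma_a^2 * sigma_v^2 / (sigma_a^2 + sigma_v^2))
    # is about 2.68 deg here, smaller than either uni-modal SD, while the mean
    # judgment stays at the target position.

Under these assumptions the combined estimate keeps the same mean as the uni-modal estimates but has a lower standard deviation, which is the qualitative pattern the abstract reports for the bi-modal condition.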


From the ASA 131st Meeting, Indianapolis, May 1996