3aSP2. Aurally aided visual search in depth.

Session: Wednesday Morning, December 3


Authors: Raymond Lim, David R. Perrott, Jose A. Gallegos
Location: Dept. of Psych., California State Univ., 5151 State Univ. Dr., Los Angeles, CA 90032

Abstract:

Minimum latencies required to locate and identify a visual target (visual search) were measured in a task in which the target could appear at a number of locations along the depth plane (0-deg azimuth). The visual target appeared among visual distracters on a finite array segregated at three distances, with the target's placement varied within the session from 24 to 232 in. from the subject. The auditory stimulus was broadband noise (70 dBA). To account for each speaker's distance from the subject, the inverse-square law was factored into this paradigm. Four subjects were tested in three conditions: aurally aided search (sound spatially correlated with the target), search with spatially uncorrelated sound (sound present but providing no spatial information), and unaided search (no auditory stimulus). Spatially correlated auditory cues enhanced the ability to localize the target [F(38,84)=2.11, p<0.005], lessening the workload on the visual system when distracters were in the visual field. Theoretical aspects and applications will be discussed.
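As a rough illustration of the inverse-square-law adjustment mentioned in the abstract, the sketch below (an assumption for illustration, not taken from the study's actual calibration procedure) computes the relative level change in dB for a point source at the nearest and farthest target distances given in the abstract, using the standard -20*log10(d/d_ref) relation.

```python
import math

def relative_level_db(distance_in, ref_distance_in=24.0):
    """Level change (dB) of a point source at `distance_in`, relative to a
    reference distance, per the inverse-square law: -20*log10(d/d_ref).
    The 24-in. reference is the nearest target distance from the abstract."""
    return -20.0 * math.log10(distance_in / ref_distance_in)

# Nearest (24 in.) and farthest (232 in.) target distances from the abstract:
for d in (24.0, 232.0):
    print(f"{d:6.1f} in. -> {relative_level_db(d):+6.2f} dB")
```

Under this relation a source at 232 in. arrives roughly 20 dB weaker than the same source at 24 in., which is the kind of distance-dependent level difference the paradigm would need to compensate for.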


ASA 134th Meeting - San Diego CA, December 1997