ASA 124th Meeting New Orleans 1992 October

4aAA1. Computational aspects of immersive acoustical environments.

Bruce K. Sawhill

Santa Fe Inst., Santa Fe, NM 87501

and Human Interface Technol. Lab., Univ. of Washington, Seattle, WA 98105

The auditory channel has emerged as being of comparable importance to the much better understood visual channel in creating realistic virtual environments. The auditory channel presents special computational challenges because of its bandwidth and update-rate requirements. These requirements have so far limited the synthesis of virtual sound to the placement of multiple sources using head-related transfer functions (HRTFs), acoustical ranging, and simple echo processing using ray-tracing techniques borrowed from computer graphics. Recent advances in microprocessor technology, coupled with advances in computational algorithms, have brought the possibility of true 3-D sound-field synthesis to the fore. The essential advance required to make such a synthesis possible is the real-time diffractive treatment of sound. The mathematics and digital signal processing of diffractive sound are discussed, along with possible applications. [Work supported by the Human Interface Technology Laboratory, University of Washington, Seattle.]
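
The abstract does not state which formulation of diffraction is used; one standard mathematical foundation for diffractive sound-field computation, sketched here only for reference, is the Helmholtz equation together with the Kirchhoff-Helmholtz integral, which expresses the pressure in a source-free listening volume in terms of the pressure and its normal derivative on the bounding surface S (outward normal n, time convention e^{-i\omega t}):

\[ \nabla^2 p + k^2 p = 0, \qquad k = \omega / c, \]

\[ p(\mathbf{r}) = \oint_S \left[ G(\mathbf{r},\mathbf{r}_s)\,\frac{\partial p(\mathbf{r}_s)}{\partial n} - p(\mathbf{r}_s)\,\frac{\partial G(\mathbf{r},\mathbf{r}_s)}{\partial n} \right] dS, \qquad G(\mathbf{r},\mathbf{r}_s) = \frac{e^{ikR}}{4\pi R}, \quad R = |\mathbf{r}-\mathbf{r}_s|. \]

Evaluating boundary integrals of this kind across the audio band, and updating them as the listener moves, is the class of computation whose real-time feasibility the advances cited above would address.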