Head Motion Experiments
I thought maybe our lab (Steve Colburn's Binaural Hearing Lab at Boston
University http://www.bu.edu/dbin/binaural) could bring something to the
conversation about head motion experiments. We have been thinking about
these issues from the perspective of localization of spatially dynamic
sound sources (moving sources, moving head) but I think a lot of the
issues we have confronted apply to any study using head motion.
I first would like to address the issue of implementation. There are two
systems (FREE for anyone to implement) I would like to bring to
everyone's attention. Both are software systems that require only a
standard computer and a head tracker (if head motion is required).
1) Sound Lab (SLAB) from NASA (http://human-factors.arc.nasa.gov/SLAB/)
is a very well put together real-time virtual acoustic environment
rendering system that runs on Windows. I don't know as much about this
system, but I do know that it is easy to use and may be appropriate for
many of the experiments in question. There is a lot of documentation and
information about the characteristics of the system on their website.
2) We have also developed a real-time virtual auditory space system in
our lab: http://www.bu.edu/dbin/binaural/pubs/SCW05.pdf
Our system has some advantages in temporal stability (which may not
matter for most experiments). Although I feel our system is just as easy
to use once set up, the setup process is trickier. In particular, the
system runs on top of a standard Linux system that has been patched
with a real-time kernel. This is the difficult part of the setup
process and requires someone with expertise in installing Linux systems.
One of the more important issues when choosing a real-time system is the
temporal latency from when the head moves to when the spatialization is
updated. Although I don't feel the question is fully answered yet, there
has been some work on what latencies are "good enough" (Brungart et al.
2005, Brungart et al. 2004, Wenzel 1999; references/links below). Both of
the systems above perform within the latencies recommended by these
papers. However, I would also point out that "good enough" latency
depends on the task and stimuli, so I would still leave the question open.
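To make the latency issue concrete, here is a minimal sketch of a motion-to-sound latency budget. All of the component values and the detectability threshold below are illustrative assumptions, not measured figures for either system mentioned above:

```python
# Rough motion-to-sound latency budget for a head-tracked virtual
# auditory space system. All numbers below are assumed for illustration.

def total_latency_ms(tracker_ms, transfer_ms, dsp_block_ms, buffer_ms):
    """End-to-end latency from head movement to updated spatialization."""
    return tracker_ms + transfer_ms + dsp_block_ms + buffer_ms

# Example budget (assumed values):
budget = total_latency_ms(
    tracker_ms=12.0,   # head-tracker update interval
    transfer_ms=2.0,   # serial/USB transfer to the host
    dsp_block_ms=2.9,  # one 128-sample processing block at 44.1 kHz
    buffer_ms=5.8,     # two 128-sample output buffers
)

# Rough detectability figure, on the order reported in the latency
# studies cited above (assumed round number, not a quoted threshold):
THRESHOLD_MS = 60.0
print(f"total latency: {budget:.1f} ms "
      f"({'within' if budget < THRESHOLD_MS else 'above'} threshold)")
```

The point of the exercise is that the tracker and audio buffering each contribute, so a system can miss a latency target even when its DSP is fast.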
I agree that the Wallach experiment (Wallach 1940) and the Wightman and
Kistler front-back resolution experiment (Wightman and Kistler 1999)
show that there are interesting effects of head dynamics. To that I will
add an anecdote: simple cues such as ITD can give us different percepts
during head motion. For example, the rate at which the ITD changes
provides an elevation cue. Although I have not explored this
experimentally, I have subjectively experienced reliable elevation
percepts when using only ITD. We should be open to cues (which have
mostly been studied statically) contributing differently when dynamics
are involved.
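The geometry behind that anecdote can be sketched with a simple spherical-head (Woodworth-style) ITD model: for a fixed head rotation, the ITD of an elevated source changes more slowly than that of a source on the horizontal plane, so the rate of ITD change carries elevation information. The head radius and model below are generic textbook assumptions, not parameters of our system:

```python
import math

A = 0.0875  # assumed head radius in meters
C = 343.0   # speed of sound in m/s

def itd(az_deg, el_deg):
    """Woodworth spherical-head ITD (seconds) for a source at
    (azimuth, elevation); the lateral angle g satisfies
    sin(g) = sin(azimuth) * cos(elevation)."""
    g = math.asin(math.sin(math.radians(az_deg)) *
                  math.cos(math.radians(el_deg)))
    return (A / C) * (g + math.sin(g))

def itd_rate(az_deg, el_deg, d=0.5):
    """Numerical d(ITD)/d(azimuth), in seconds per degree of rotation."""
    return (itd(az_deg + d, el_deg) - itd(az_deg - d, el_deg)) / (2 * d)

# Same head rotation, two elevations: the elevated source's ITD changes
# at roughly half the rate (it scales with cos(elevation) near az = 0).
for el in (0.0, 60.0):
    print(f"elevation {el:4.1f} deg: "
          f"d(ITD)/d(az) = {itd_rate(0.0, el) * 1e6:.1f} us/deg")
```

Under this model the ITD itself is ambiguous between elevations, but its rate of change during a known head rotation is not, which is one way to rationalize the elevation percepts described above.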
I think that there are a lot of interesting questions that still need to
be answered and a virtual system will allow us to control the variables
and stimuli in question. I also know that there is a lot of time
involved in setting up a system to run the experiments. If someone has a
pet experiment they would like to do (such as the questions raised
earlier on the list), our lab is open to collaboration and may have
students interested in doing short- or long-term exploration projects.
Please contact H. Steven Colburn or myself if you would like to discuss
this further.
Jacob Scarpaci
Binaural Hearing Lab
Boston University
REFERENCES:
----------------------------------------------------------------------------
Brungart, D. S., Simpson, B. D., & Kordik, J. A. (2005, July). The
detectability of headtracker latency in virtual audio displays. In
Proceedings of the Eleventh Meeting of the International Conference on
Auditory Display.
http://www.idc.ul.ie/icad2005/downloads/f51.pdf
Brungart, D. S., Simpson, B. D., McKinley, R. L., Kordik, J. A.,
Dallman, R. C., & Ovenshire, D. A. (2004, July). The interaction between
head-tracker latency, source duration, and response time in the
localization of virtual sound sources. In Proceedings of the Tenth
Meeting of the International Conference on Auditory Display.
http://www.icad.org/websiteV2.0/Conferences/ICAD2004/papers/brungart_etal.pdf
Scarpaci, J. W., Colburn, H. S., & White, J. A. (2005). A system for
real-time virtual auditory space. In Proceedings of the Eleventh
Meeting of the International Conference on Auditory Display.
http://www.bu.edu/dbin/binaural/pubs/SCW05.pdf
Wallach, H. (1940, October). The role of head movements and vestibular
and visual cues in sound localization. Journal of Experimental
Psychology, 27 (4), 339-368.
Wenzel, E. (1999). Effect of increasing system latency on localization
of virtual sounds. In Proceedings of the AES 16th International
Conference on Spatial Sound Reproduction (pp. 42-50).
Wightman, F. L., & Kistler, D. J. (1999). Resolution of front-back
ambiguity in spatial hearing by listener and source movement. J. Acoust.
Soc. Am., 105 (5), 2841-2853.