Dear Frederico,

Difficult to say at this stage from what you have presented. These could be protocol issues, Ambisonic decoder issues, coordinate system issues, etc.

For protocol issues, we had similar difficulties with simply identifying directional differences between targets and with accounting for front/back confusions, in this study: L. Simon, A. Andreopoulou, and B. F. G. Katz, "Investigation of perceptual interaural time difference evaluation protocols in a binaural context," Acta Acust united Ac, vol. 102, pp. 129–140, 2016, doi:10.3813/AAA.918930. Left/right needs to be well defined, as left/right rotation is not the same as lateral displacement. Are subjects expressly told that sources are only in the front, or not, etc.?

I would then invite you to be sure that the sounds you are rendering over Ambisonics correspond to what you think. Meaning, if you decode over a virtual speaker array, are the right speakers getting the signals at the right time, in the right direction? As different decoders can sometimes use different coordinate systems (e.g. where is (0°, 0°)?), different channel formats (ACN, FuMa, etc.), and different units (degrees or radians), this is a basic test to run before any perceptual issues can be examined.

Finally, verify how the Ambisonic-to-binaural rendering is being carried out: which HRTFs, which method, which Ambisonic decoder options (if any), is there a room effect being added, etc.

Best regards,
-Brian FG Katz
--
Brian FG Katz, Research Director, CNRS
Groupe Lutheries - Acoustique – Musique
Sorbonne Université, CNRS, UMR 7190, Institut Jean Le Rond ∂'Alembert
http://www.dalembert.upmc.fr/home/katz

From: AUDITORY - Research in Auditory Perception <AUDITORY@xxxxxxxxxxxxxxx> On Behalf Of Frederico Pereira

Dear Auditory List,

Hoping this email finds you all well. My colleagues and I have been conducting experiments with participants, with the aim of characterizing the perception of movement of a virtual auditory target stimulus.
The experiment is fundamentally simple: the participant listens over headphones to a 600 ms band-passed noise signal (150 to 8000 Hz) and reports whether a leftward or a rightward movement direction was perceived. Signals are always coded to be in the frontal hemisphere, in the horizontal plane, describing different arc lengths (varying angular velocity). We are running these experiments at various orders of Ambisonic encoding. Guided by the "snapshot" theory, under which motion perception emerges from successive discriminations of target location over time, and by the confirmation that higher encoding orders yield better localisation of fixed targets, we expect better discrimination at higher orders, and thus a reduction of the minimum audible movement angle (MAMA) at finer encoding resolutions, but:
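For concreteness, a moving target of this kind can be coded by applying time-varying Ambisonic gains to the noise burst. This is only a minimal sketch, assuming first-order encoding in ACN channel order with SN3D normalisation, a linear azimuth trajectory, and an azimuth convention of 0° at the front with positive angles to the left; the function and parameter names are illustrative, and band-limiting to 150–8000 Hz would be applied to the noise beforehand.

```python
import numpy as np

def moving_source_foa(signal, az_start_deg, az_end_deg):
    """Encode a mono signal as a first-order Ambisonic (ACN/SN3D)
    source sweeping linearly from az_start_deg to az_end_deg over
    the signal's duration (horizontal plane, 0 deg = front,
    positive = left). Returns channels [W, Y, Z, X]."""
    az = np.radians(np.linspace(az_start_deg, az_end_deg, len(signal)))
    w = signal                    # omnidirectional component
    y = signal * np.sin(az)       # left-right component
    z = np.zeros_like(signal)     # height (unused in horizontal plane)
    x = signal * np.cos(az)       # front-back component
    return np.stack([w, y, z, x])

fs = 48000
dur = 0.600                       # 600 ms burst
rng = np.random.default_rng(0)
noise = rng.standard_normal(int(fs * dur))

# e.g. a 20-degree rightward arc centred on the front
b = moving_source_foa(noise, +10.0, -10.0)
print(b.shape)  # (4, 28800)
```

Higher orders add further spherical-harmonic channels in the same way; the sign of the Y channel at the start and end of the sweep is a quick check that the left/right convention matches the intended direction of motion.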
There is limited literature on the perceptual evaluation of auditory moving targets, and even less on virtual audio environments (stimuli presented over headphones). Have any of you come across experiences or studies reporting similar hurdles? I'd be very interested in hearing from you if you have any comments or further questions, or if you are simply willing to discuss this facet of spatial hearing.

Best,
- Frederico
--
Frederico Pereira |
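The basic encode/decode sanity check suggested earlier in the thread (are the right virtual speakers getting the signal for a source at a known direction?) can be sketched in a few lines. This assumes first-order Ambisonics in ACN/SN3D, azimuth in degrees with 0° at the front and positive to the left, and a naive projection ("sampling") decoder; any of these conventions may differ in a given renderer, which is precisely what the test is meant to expose.

```python
import numpy as np

def encode_foa(azimuth_deg):
    """Encode a unit-amplitude source at the given azimuth into
    first-order Ambisonics, ACN order / SN3D normalisation:
    channels [W, Y, Z, X], horizontal plane only."""
    az = np.radians(azimuth_deg)
    return np.array([1.0, np.sin(az), 0.0, np.cos(az)])

def decode_projection(b, speaker_az_deg):
    """Naive projection decoder onto a horizontal ring of virtual
    loudspeakers; returns one gain per speaker."""
    az = np.radians(np.asarray(speaker_az_deg, dtype=float))
    # Re-encode each speaker direction and take the dot product.
    y = np.stack([np.ones_like(az), np.sin(az),
                  np.zeros_like(az), np.cos(az)])
    return b @ y

# Virtual ring of 8 speakers, one every 45 degrees.
speakers = np.arange(0, 360, 45)

# A source panned 90 degrees to the left should peak
# at the 90-degree speaker.
gains = decode_projection(encode_foa(90.0), speakers)
loudest = int(speakers[int(np.argmax(gains))])
print(loudest)  # expect 90
```

If the loudest speaker does not match the intended direction, the mismatch pattern (mirrored, rotated by 90°, scrambled) usually points directly to the coordinate-system, channel-order, or unit discrepancy involved.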