
[AUDITORY] Call for Contributions - Special issue on Psychoacoustics for Extended Reality (XR)



Hi everyone,

I would like to invite you to contribute to the Special Issue "Psychoacoustics for Extended Reality (XR)" of the journal Applied Sciences (IF 2.474, Q1). The submission deadline is 30 Apr 2021. Please email me directly at h.lee@xxxxxxxxx if you would like to express your intention to publish your work in this Special Issue or to discuss potential topics.

https://www.mdpi.com/journal/applsci/special_issues/Psychoacoustics_Extended_Reality

Dear Colleagues,

Extended reality (XR), which embraces the concepts of virtual reality, augmented reality and mixed reality, is a rapidly growing area of research and development. XR technologies are now being adopted in many sectors of industry (music and film entertainment, medical training, military training, architectural simulation, virtual tourism, virtual education, etc.). XR ultimately aims to provide the user with realistic, engaging and interactive virtual experiences in three degrees of freedom (3DOF) or six degrees of freedom (6DOF), and for this it is important to achieve high-quality dynamic rendering of audio as well as visual information.

Traditional psychoacoustics research has focused mainly on investigating specific auditory cues in controlled listening environments. However, to provide the user with a more plausible multimodal sensory experience in XR, psychoacoustics research needs to evolve and provide more ecologically valid experimental data and theories about how human auditory perception works in various practical XR scenarios. Against this background, this Special Issue aims to introduce recent developments in psychoacoustics-based research focusing on XR and to provide insights into future directions of research and development in this field. The issue aims to collect more than 10 papers and will be published as a book collection.

Research topics of interest include, but are not limited to, the following:

  *   Dynamic sound localisation
  *   Auditory spatial perception
  *   Binaural processing with head-tracking and/or motion-tracking
  *   Auditory-visual interaction/multimodal perception
  *   Rendering and perception of virtual acoustics
  *   Sound recording and mixing techniques
  *   Sound synthesis and design
  *   Interactive and immersive storytelling
  *   Hearing aids
  *   Assistive listening
  *   Auditory(–visual) simulation and training

Prof. Dr. Hyunkook Lee
Guest Editor

=========================================
Dr Hyunkook Lee, BMus(Tonmeister), PhD, FAES, FHEA
Reader (Associate Professor) in Music Technology
Director of the Centre for Audio and Psychoacoustic Engineering (CAPE)
Founder/Leader of the Applied Psychoacoustics Laboratory (APL)
http://www.hud.ac.uk/apl
Phone: +44 (0)1484 471893
Email: h.lee@xxxxxxxxx
Office: HA2/06 
School of Computing and Engineering
University of Huddersfield
Huddersfield
HD1 3DH
United Kingdom
