
[AUDITORY] CfP: Special Issue on Interactive Sonification in the Journal on Multimodal User Interfaces (JMUI)



*** Apologies for cross-posting ***

Following the ISon 2016 Workshop on December 15-16, 2016 (proceedings can be found at http://interactive-sonification.org/ISon2016/), we would like to announce the Journal on Multimodal User Interfaces (JMUI) Special Issue on "Interactive Sonification".

This special issue will address computational models, techniques, methods, and systems for Interactive Sonification and their evaluation. It is the second JMUI special issue dedicated to Interactive Sonification; the first was published in 2012.

Sonification and Auditory Displays are increasingly becoming established technologies for exploring data, monitoring complex processes, and assisting the exploration and navigation of data spaces. Sonification addresses the auditory sense by transforming data into sound, allowing users to extract valuable information from data using their natural listening skills.
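
To make the basic idea concrete, here is a minimal, purely illustrative sketch of parameter-mapping sonification (not a system from this call; the frequency range and tone duration are arbitrary assumptions). Each data value is mapped to the pitch of a short sine tone, and the result is rendered to a WAV file using only the Python standard library:

    import math
    import struct
    import wave

    RATE = 44100        # audio sample rate in Hz
    TONE_SEC = 0.15     # duration of the tone per data point (assumption)

    def sonify(data, path="sonification.wav", f_lo=220.0, f_hi=880.0):
        lo, hi = min(data), max(data)
        span = (hi - lo) or 1.0
        frames = bytearray()
        for value in data:
            # The mapping strategy: linear data range -> frequency range.
            freq = f_lo + (value - lo) / span * (f_hi - f_lo)
            for n in range(int(RATE * TONE_SEC)):
                sample = 0.4 * math.sin(2 * math.pi * freq * n / RATE)
                frames += struct.pack("<h", int(sample * 32767))
        with wave.open(path, "wb") as w:
            w.setnchannels(1)
            w.setsampwidth(2)
            w.setframerate(RATE)
            w.writeframes(bytes(frames))

    sonify([1, 3, 2, 5, 4, 8, 6, 9])   # rising data is heard as rising pitch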

The main advantages of sound displays over visual displays are that sound can:

  • Represent frequency responses in an instant (as timbral characteristics)
  • Represent changes over time, naturally
  • Allow microstructure to be perceived
  • Rapidly portray large amounts of data
  • Alert the listener to events outside the current visual focus
  • Holistically bring together many channels of information 

Auditory displays typically evolve over time, since sound is inherently a temporal phenomenon. Interaction thus becomes an integral part of the process of selecting, manipulating, exciting, or controlling the display, and this has implications for the interface between humans and computers. In recent years, it has become clear that research needs to address interaction with auditory displays more explicitly. Interactive Sonification is the specialized research topic concerned with the use of sound to portray data where a human being is at the heart of an interactive control loop (a minimal code sketch of such a loop follows the list below). Specifically, it deals with the following areas (among others), in which we invite submissions of research papers:

  • interfaces between humans and auditory displays
  • mapping strategies and models for creating coherency between action and reaction (e.g. acoustic feedback, but also combined with haptic or visual feedback)
  • perceptual aspects of the display (how to relate actions and sound, e.g. cross-modal effects, importance of synchronisation)
  • applications of Interactive Sonification
  • evaluation of performance, usability and multimodal interactive systems including auditory feedback
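
As referenced above, here is a minimal, purely illustrative sketch of the human-in-the-loop idea (the control stream is a hypothetical stand-in for live sensor or gesture input; a real system would run in real time and stream the samples to a sound device). A stream of control values continuously steers the frequency of a phase-continuous oscillator, so the user's actions and the resulting sound stay tightly coupled:

    import math

    RATE = 8000   # rate in Hz (assumption: one audio sample per control value)

    def control_stream():
        # Hypothetical stand-in for live user input, e.g. hand-position
        # samples in [0, 1]; here a slow 0.5 Hz oscillation over two seconds.
        for t in range(RATE * 2):
            yield 0.5 + 0.5 * math.sin(2 * math.pi * 0.5 * t / RATE)

    def sonification_loop(controls, f_lo=200.0, f_hi=1000.0):
        phase = 0.0
        for c in controls:
            freq = f_lo + c * (f_hi - f_lo)     # action -> sound mapping
            phase += 2 * math.pi * freq / RATE  # phase-continuous oscillator
            yield 0.3 * math.sin(phase)         # one sample per control value

    # In a real-time system these samples would be streamed to the sound card
    # while the user keeps acting on what they hear, closing the control loop.
    samples = list(sonification_loop(control_stream()))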

For this special issue, we particularly invite works focusing on Adaptivity and Scaffolding in Interactive Sonification, i.e. how auditory feedback and Interactive Sonification provide a scaffolding for becoming familiar with an interaction and learning to interact, and how users adapt their activity patterns according to the feedback and their level of experience. For example, a sports movement sonification could initially focus the displayed information on the most basic pattern (e.g. the active arm) and, once the user progresses (i.e. feedback indicates that they understand and utilize this information), increasingly emphasize subtler cues (e.g. the knees) by making such auditory streams more salient (a toy sketch of this adaptive salience follows the list below). This feeds into the important question of how we can evaluate the complex and temporally developing interrelationship between the human user and an interactive system that is coupled to the user by means of Interactive Sonification. To make a sustainable contribution, we strongly encourage a reproducible research approach in Interactive Sonification to:

  • allow for the formal evaluation and comparison of Interactive Sonification systems,
  • establish standards in Interactive Sonification.
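
As referenced above, here is a toy sketch of the scaffolding idea (the stream names "arm" and "knees", the proficiency measure, and the linear cross-fade are hypothetical assumptions, not a method from this call). As the estimated proficiency grows, the gain of the basic stream is reduced and the subtler stream is made more salient:

    def stream_gains(proficiency):
        # proficiency: hypothetical skill estimate in [0, 1], e.g. derived
        # from how well the user already exploits the basic cue.
        p = min(max(proficiency, 0.0), 1.0)
        return {
            "arm": 1.0 - 0.5 * p,    # basic cue fades but stays audible
            "knees": 0.2 + 0.8 * p,  # subtle cue becomes more salient
        }

    for p in (0.0, 0.5, 1.0):
        print(p, stream_gains(p))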

Schedule (subject to change):

     Submission deadline: December 15, 2017

     Notification of acceptance: March 16, 2018

     Final paper submission: April 20, 2018

     Tentative Publication: June/July 2018

 

Guest Editors:

Roberto Bresin, KTH School of Computer Science and Communication, Stockholm, Sweden; roberto@xxxxxx

Thomas Hermann, Bielefeld University, Ambient Intelligence Group, Bielefeld, Germany; thermann@xxxxxxxxxxxxxxxxxxxxxxxx

Jiajun Yang, Bielefeld University, Ambient Intelligence Group, Bielefeld, Germany; jyang@xxxxxxxxxxx-bielefeld.de
 

Authors are requested to follow the instructions for manuscript submission to the Journal on Multimodal User Interfaces (http://www.springer.com/computer/hci/journal/12193) and to submit manuscripts at the following link: http://www.editorialmanager.com/jmui/. The article type to be selected is "S.I.: Interactive Sonification - 2017".

Link to the PDF file with the call for papers: 

http://static.springer.com/sgw/documents/1620044/application/pdf/CFP_JMUI_Special+Issue_InteractiveSonification_2017.pdf

Best regards,

Roberto Bresin, Thomas Hermann, Jiajun Yang