
2nd CfP: Extended submission deadline for Special Issue on Interactive Sonification - Springer Journal on Multimodal User Interfaces (JMUI)



[apologies for cross-posting]
[please circulate] 

********************************************************************************************************************
Special Issue on Interactive Sonification
Springer Journal on Multimodal User Interfaces (JMUI)

2nd Call for Papers
New deadline for paper submission: 17th January 2011 (previous deadline was 17th December 2010)
********************************************************************************************************************

This special issue will address computational models, techniques, methods, and systems for Interactive Sonification and their evaluation.

Sonification and Auditory Displays are increasingly becoming established technologies for exploring data, monitoring complex processes, and assisting the exploration and navigation of data spaces. Sonification addresses the auditory sense by transforming data into sound, allowing the human user to extract valuable information from data using their natural listening skills (see the short illustrative sketch after the list below). The main advantages of sound displays over visual displays are that sound can:

	• Represent frequency responses in an instant (as timbral characteristics)
	• Represent changes over time, naturally
	• Allow microstructure to be perceived
	• Rapidly portray large amounts of data
	• Alert the listener to events outside the current visual focus
	• Holistically bring together many channels of information
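
For readers new to the field, the data-to-sound transformation described above can be illustrated with a minimal parameter-mapping sketch. This is purely illustrative and not part of the call; the function name sonify and all parameter defaults are assumptions made for this example. It maps each value of a data series to the pitch of a short sine tone and writes the result to a WAV file using only the Python standard library:

# Minimal parameter-mapping sonification sketch (illustrative assumption, not from the CfP):
# each data value is mapped to the pitch of a short sine tone and the result is
# written to a WAV file using only the Python standard library.
import math
import struct
import wave

def sonify(data, filename="sonification.wav",
           rate=44100, tone_dur=0.15, f_min=220.0, f_max=880.0):
    lo, hi = min(data), max(data)
    span = (hi - lo) or 1.0
    frames = bytearray()
    for value in data:
        # Linear mapping: data value -> frequency between f_min and f_max
        freq = f_min + (value - lo) / span * (f_max - f_min)
        for n in range(int(rate * tone_dur)):
            sample = 0.5 * math.sin(2 * math.pi * freq * n / rate)
            frames += struct.pack("<h", int(sample * 32767))
    with wave.open(filename, "wb") as wav:
        wav.setnchannels(1)      # mono
        wav.setsampwidth(2)      # 16-bit samples
        wav.setframerate(rate)
        wav.writeframes(bytes(frames))

# Example: listen to a noisy upward trend
sonify([i + (i % 7) * 0.5 for i in range(40)])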

Auditory displays typically evolve over time, since sound is an inherently temporal phenomenon. Interaction thus becomes an integral part of the process of selecting, manipulating, exciting, or controlling the display, and this has implications for the interface between humans and computers. In recent years it has become clear that there is an important need for research that addresses interaction with auditory displays more explicitly. Interactive Sonification is the specialized research topic concerned with the use of sound to portray data where a human being is at the heart of an interactive control loop (a second illustrative sketch of such a loop follows the list below). Specifically, it deals with the following areas, among others, in which we invite submissions of research papers:

	• interfaces between humans and auditory displays
	• mapping strategies and models for creating coherency between action and reaction (e.g. acoustic feedback, but also combined with haptic or visual feedback)
	• perceptual aspects of the display (how to relate actions and sound, e.g. cross-modal effects, importance of synchronisation)
	• applications of Interactive Sonification
	• evaluation of performance, usability and multi-modal interactive systems including auditory feedback
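
The interactive control loop mentioned above can likewise be sketched in a few lines. Again, this is purely illustrative and assumes the hypothetical sonify() helper from the earlier sketch: the user repeatedly adjusts the pitch-mapping range and the display is re-rendered, closing the action-to-sound loop.

# Illustrative interactive control loop (assumes the hypothetical sonify() helper above):
# the user adjusts the mapping range, the display is re-rendered, and the cycle repeats.
def interactive_loop(data):
    f_min, f_max = 220.0, 880.0
    while True:
        sonify(data, "current.wav", f_min=f_min, f_max=f_max)
        print(f"Rendered current.wav with pitch range {f_min:.0f}-{f_max:.0f} Hz")
        cmd = input("New range as 'min max' (blank to quit): ").strip()
        if not cmd:
            break
        f_min, f_max = (float(x) for x in cmd.split())

interactive_loop([i % 10 for i in range(30)])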

IMPORTANT DATES:
	• Deadline for paper submission: 17th January 2011 (previous deadline was 17th December 2010)
	• Notification of acceptance: 7th March 2011
	• Camera-ready version of accepted papers: 9th May 2011
	• Publication date: July/August 2011


GUEST EDITORS:
Roberto Bresin
KTH School of Computer Science and Communication, Stockholm, Sweden; roberto@xxxxxx
Thomas Hermann
Bielefeld University, Ambient Intelligence Group, Bielefeld, Germany; thermann@xxxxxxxxxxxxxxxxxxxxxxxx
Andy Hunt
University of York, Electronics Dept., York, UK; adh@xxxxxxxxxxxxxx

INSTRUCTIONS FOR AUTHORS:
Submissions should be 6 to 12 pages long and must be written in English.
Formatting instructions and templates are available at: http://www.jmui.org

Authors should register and upload their submission on the following website: http://www.editorialmanager.com/jmui/
During the submission process, please select "SI (Special Issue) - Interactive Sonification" as article type.

Authors are encouraged to send a brief email to Roberto Bresin (roberto@xxxxxx) as soon as possible, indicating their intention to participate and including their contact information and the topic they intend to address in their submission.

The Journal on Multimodal User Interfaces is a publication of OpenInterface (www.openinterface.org)
Editor in Chief: Jean-Claude Martin
Founding Editor: Benoit Macq
More information at www.springer.com/12193 or www.jmui.org



______________________________________________
Roberto Bresin, PhD
Associate Professor
KTH Royal Institute of Technology
School of Computer Science and Communication
Dept of Speech, Music and Hearing
Lindstedtsv. 24
SE - 114 28  Stockholm
Sweden	

url		: www.speech.kth.se/~roberto
skype	: robertobresin
tel		: +46-8-790 78 76
mob	: +46-70-795 78 76
fax 		: +46-8-790 78 54