CONNECTIONIST MODELLING OF AUDITORY SCENE ANALYSIS (Guy Brown)


Subject: CONNECTIONIST MODELLING OF AUDITORY SCENE ANALYSIS
From:    Guy Brown  <G.Brown(at)dcs.shef.ac.uk>
Date:    Wed, 11 Sep 1996 17:38:34 +0100

                          CALL FOR SPEAKERS

                   NIPS'96 Postconference Workshop

         CONNECTIONIST MODELLING OF AUDITORY SCENE ANALYSIS

                   Snowmass (Aspen), Colorado, USA
                      Friday, December 6th, 1996

Guy J. Brown
Department of Computer Science
University of Sheffield
Regent Court, 211 Portobello St.
Sheffield S1 4DP, U.K.
Fax: +44 (0)114 2780972
Email: guy(at)dcs.shef.ac.uk
http://www.dcs.shef.ac.uk/~guy

DeLiang Wang
Department of Computer & Information Sci. and Center for Cognitive Sci.
The Ohio State University
Columbus, OH 43210-1277, USA
Fax: (614) 292-2911
Email: dwang(at)cis.ohio-state.edu
http://www.cis.ohio-state.edu/~dwang

OBJECTIVES

Auditory scene analysis describes the ability of listeners to separate the
acoustic events arriving from different environmental sources into distinct
perceptual representations (streams). It is related to, but more general
than, the well-known "cocktail party effect", which refers to the ability
of listeners to segregate one voice from a mixture of many other voices.
Computational models of auditory scene analysis are likely to play an
important role in building speech recognition systems that work in
realistic acoustic environments. However, many aspects of this important
modelling problem remain largely unsolved.

There has been significant growth in neural modelling of auditory scene
analysis since Albert Bregman published his book "Auditory Scene Analysis"
in 1990. This workshop seeks to bring together a diverse group of
researchers to critically examine the progress made so far in this
challenging research area, and to discuss unsolved problems. In particular,
we intend to address the following issues:

* Whether attention is involved in primitive (bottom-up) auditory scene
  analysis
* How primitive auditory scene analysis is coupled with schema-based
  (knowledge-based) auditory scene analysis
* The utility of the oscillatory approach

In addition to reviewing these issues, we would like to chart, if possible,
a neural network framework for segmenting simultaneously presented auditory
patterns.

WORKSHOP FORMAT

This one-day workshop will be organised into two three-hour sessions, one
in the early morning and one in the late afternoon. The intervening time is
reserved for skiing or free-wheeling interaction between participants. Each
session consists of two hours of oral presentations and one hour of panel
discussion.

SUBMISSION OF ABSTRACTS

A group of invited experts, including Albert Bregman, will speak at the
workshop. We are seeking a few more speakers to contribute. If you have
done work on this or related topics and would like to contribute, please
send an abstract as soon as possible to:

GUY J. BROWN
Department of Computer Science
University of Sheffield
Regent Court, 211 Portobello Street
Sheffield S1 4DP, UK
Phone: +44 (0)114 2825568; Fax: +44 (0)114 2780972
Email: guy(at)dcs.shef.ac.uk

Abstracts may be sent by email or by fax.

Important Dates:
Deadline for submission of abstracts: 27 September, 1996
Notification of acceptance: 7 October, 1996

A set of workshop notes will be produced. Please contact the workshop
organizers for further information, or consult the NIPS WWW home page:
http://www.cs.cmu.edu/afs/cs.cmu.edu/Web/Groups/NIPS/

PS: For those of you who are not familiar with NIPS (Neural Information
Processing Systems), NIPS is the premier annual interdisciplinary
conference covering all aspects of neural processing and computation. NIPS
brings together neuroscientists, computer scientists, cognitive scientists,
engineers, physicists, mathematicians, and statisticians with interests in
natural and artificial neural systems. This year's NIPS is the tenth
meeting. NIPS postconference workshops are held at a world-class ski
resort.

