[AUDITORY] Call for Papers: VIHAR-2017 (Ricard Marxer)


Subject: [AUDITORY] Call for Papers: VIHAR-2017
From:    Ricard Marxer  <ricardmp@xxxxxxxx>
Date:    Tue, 21 Feb 2017 20:37:40 +0000
List-Archive: <http://lists.mcgill.ca/scripts/wa.exe?LIST=AUDITORY>

-------------------------------------------------------------------------------------------------
(apologies for any unintended cross-mailing)

CALL FOR PARTICIPATION/PAPERS

1st International Workshop on Vocal Interactivity in-and-between Humans, Animals and Robots (VIHAR-2017)

25-26 August 2017
University of Skövde, Sweden

http://vihar-2017.vihar.org

Almost all animals exploit vocal signals for a range of ecologically-motivated purposes: from detecting predators/prey and marking territory, to expressing emotions, establishing social relations and sharing information. Whether it's a bird raising an alarm, a whale calling to potential partners, a dog responding to human commands, a parent reading a story with a child, or a businessperson accessing stock prices using Siri on an iPhone, vocalisation provides a valuable communications channel through which behaviour may be coordinated and controlled, and information may be distributed and acquired. Indeed, the ubiquity of vocal interaction has led to research across an extremely diverse array of fields, from assessing animal welfare, to understanding the precursors of human language, to developing voice-based human-machine interaction.

Clearly, there is potential for cross-fertilisation between disciplines; for example, using robots to investigate contemporary theories of language grounding, using machine learning to analyse different habitats, or adding vocal expressivity to the next generation of autonomous social agents. However, many opportunities remain unexplored, not least due to the lack of a suitable forum.

VIHAR-2017 aims to bring together researchers studying vocalisations and speech-based interaction in-and-between humans, animals and robots from a variety of different fields. The workshop is being organised by participants of a recent Dagstuhl Seminar on the same topic, and follows the publication of a short review article: http://journal.frontiersin.org/article/10.3389/frobt.2016.00061/full. VIHAR-2017 will provide a timely opportunity for scientists and engineers from different fields to share theoretical insights, best practices, tools and methodologies, and to identify common principles underpinning vocal behaviour in a multi-disciplinary environment.

We invite original submissions of four-page INTERSPEECH-style papers (plus one page of references) or two-page extended abstracts (including references) in all areas of vocal interactivity. Suggested workshop topics include, but are not limited to, the following areas:

- physiological and morphological similarities/differences between vocal systems in animals
- properties and functions of animal signals
- evolution of vocal interactivity
- vocal imitation and learning
- conveyance of emotion
- comparative analyses of human and animal vocalisations
- use of vocalisation
- vocal interactivity between non-conspecifics
- spoken language systems
- technology-based research methods

Paper submissions should conform to the format detailed in the INTERSPEECH 2017 Author's Kit (http://www.interspeech2017.org/wp-content/uploads/2016/11/IS2017_paper_kit.zip).
Key Dates:
Submission deadline - 2 June 2017
Notification of acceptance - 30 June 2017
Workshop - 25-26 August 2017

Organisers:
Robert Eklund, Linköping University
Angela Dassow, Carthage College
Ricard Marxer, University of Sheffield
Roger K. Moore, University of Sheffield
Bhiksha Raj, Carnegie Mellon University
Rita Singh, Carnegie Mellon University
Serge Thill, University of Skövde
Benjamin Weiss, Technical University of Berlin

VIHAR-2017 takes place immediately after INTERSPEECH in Stockholm (http://www.interspeech2017.org) and is supported by the International Speech Communication Association (http://www.isca-speech.org).

For details, visit the VIHAR-2017 website: http://vihar-2017.vihar.org

If you wish to join the VIHAR community, please subscribe to our mailing list by entering your e-mail address at http://www.freelists.org/list/vihar and then responding to the confirmation e-mail that you will receive from the list server.

Please forward this CfP to anyone who you think might be interested in contributing/attending.
-------------------------------------------------------------------------------------------------

