Call for Chapters - Multimodal Interface Design for Biometric Applications (IGI Global) ("Flaithri.Neff" )


Subject: Call for Chapters - Multimodal Interface Design for Biometric Applications (IGI Global)
From:    "Flaithri.Neff"  <Flaithri.Neff@xxxxxxxx>
Date:    Sun, 11 Dec 2016 22:04:11 +0000
List-Archive:<http://lists.mcgill.ca/scripts/wa.exe?LIST=AUDITORY>

Dear Auditory List Members,

Apologies for cross-posting.

We would like to invite chapter proposals for an upcoming book publication - Multimodal Interface Design for Biometric Applications (IGI Global). We envisage that a significant portion of this publication will include research relating to the auditory system and auditory interface design as applied to biometric applications.

Important Dates
January 15, 2017: Proposal Submission Deadline (1,000 to 2,000 words)
February 15, 2017: Notification of Acceptance
April 15, 2017: Full Chapter Submission
June 15, 2017: Review Results Returned
July 30, 2017: Final Acceptance Notification
August 15, 2017: Final Chapter Submission

Objective
This book will aim to provide relevant theoretical frameworks and the latest empirical research findings in the area of multimodal interface design for biometric applications. It will be written for professionals who want to improve their understanding of multimodal design concepts and how these are applied to complex biometric systems. It will aim to introduce novel user-interface models based on perceptual and cognitive principles across the auditory, visual, tactile, and gestural modalities.

Editors
Dr. Flaithri Neff, Interactive Systems Research Group, Limerick Institute of Technology. Flaithri.Neff@xxxxxxxx
Dr. Katie Crowley, Trinity College Institute of Neuroscience, Trinity College Dublin. k.crowley@xxxxxxxx

For more information: http://www.igi-global.com/publish/call-for-papers/call-details/2459

Kind regards,

Flaithri Neff, PhD.


This message came from the mail archive
/var/www/html/postings/2016/
maintained by:
DAn Ellis <dpwe@ee.columbia.edu>
Electrical Engineering Dept., Columbia University