CfP INTERSPEECH 2013 ComParE: Autism Sub-Challenge opened (Bjoern Schuller)


Subject: CfP INTERSPEECH 2013 ComParE: Autism Sub-Challenge opened
From:    Bjoern Schuller  <schuller@xxxxxxxx>
Date:    Wed, 9 Jan 2013 00:17:33 +0000
List-Archive:<http://lists.mcgill.ca/scripts/wa.exe?LIST=AUDITORY>

Dear List,

For those of you interested, let us announce the opening of the third Sub-Challenge of the INTERSPEECH 2013 Computational Paralinguistics Challenge:


Call for Participation
INTERSPEECH 2013 ComParE:
COMPUTATIONAL PARALINGUISTICS CHALLENGE
Social Signals, Conflict, Emotion, Autism

http://emotion-research.net/sigs/speech-sig/is13-compare


Third Sub-Challenge now open - obtain the data:
http://emotion-research.net/sigs/speech-sig/IS13-Challenge-Agreements-CPSD.pdf


The Challenge

After four consecutive Challenges at INTERSPEECH, many highly relevant paralinguistic phenomena remain uncovered. Previous instalments focused on single speakers; with the new Conflict Sub-Challenge, we broaden the scope to the analysis of discussions among multiple speakers. The Social Signals Sub-Challenge introduces a further novelty: for the first time, non-linguistic events - laughter and fillers - have to be classified and localised. In the Emotion Sub-Challenge we are literally "going back to the roots"; however, we intentionally use acted material for the first time, to fuel the ongoing discussion on the differences between naturalistic and acted material and to highlight those differences. Finally, this year's Autism Sub-Challenge addresses Autism Spectrum Condition in children's speech. Apart from intelligent and socially competent future agents and robots, the main applications lie in the medical domain and in surveillance. The Challenge corpora feature rich annotation, including speaker meta-data, orthographic transcripts, phonemic transcripts, and segmentation. All four corpora come with distinct test, development, and training partitions, incorporating the speaker independence needed in most real-life settings. Benchmark results of the most popular approaches will be provided, as in previous years. In these respects, the INTERSPEECH 2013 COMPUTATIONAL PARALINGUISTICS CHALLENGE (ComParE) shall help bridge the gap between excellent research on paralinguistic information in spoken language and the low compatibility of results.

In summary, four Sub-Challenges are addressed:

*       In the Social Signals Sub-Challenge, non-linguistic events - laughter and fillers - of a speaker have to be classified and localised based on acoustics.
*       In the Conflict Sub-Challenge, group discussions have to be evaluated automatically, aiming at the retrieval of conflicts.
*       In the Emotion Sub-Challenge, the emotion in a speaker's voice has to be determined by a suitable learning algorithm and acoustic features.
*       In the Autism Sub-Challenge, the type of pathology of a speaker has to be determined by a suitable classification algorithm and acoustic features.

The measures of competition will be the Unweighted Average Area Under the receiver operating Curve and the Unweighted Average Recall. All Sub-Challenges allow contributors to use their own features and their own machine learning algorithms; however, a standard feature set will be provided per corpus and may be used. Participants have to stick to the definition of the training, development, and test sets. They may report results obtained on the development set, but have only five trials to upload their results on the test sets, whose labels are unknown to them.
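As an aside for newcomers, both measures are standard and can be reproduced, for instance, with scikit-learn; note that this is only a minimal sketch, not the official scoring tool, and all labels and scores below are invented for illustration:

import numpy as np
from sklearn.metrics import recall_score, roc_auc_score

# Unweighted Average Recall (UAR): recall computed per class, then
# averaged without weighting by class frequency ("macro" averaging).
y_true = np.array([0, 0, 1, 1, 2, 2])   # hypothetical reference labels
y_pred = np.array([0, 1, 1, 1, 2, 0])   # hypothetical system predictions
print("UAR:", recall_score(y_true, y_pred, average="macro"))

# Unweighted Average Area Under the ROC Curve: AUC per event class
# (here: laughter and fillers), averaged without weighting.
events = {
    "laughter": (np.array([0, 0, 1, 1]), np.array([0.1, 0.4, 0.35, 0.8])),
    "filler":   (np.array([0, 1, 0, 1]), np.array([0.2, 0.7, 0.3, 0.9])),
}
print("UAAUC:", np.mean([roc_auc_score(y, s) for y, s in events.values()]))

The official evaluation will of course be carried out by the organisers on the held-out test labels; the snippet merely illustrates what the reported numbers measure.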
Each participation must be accompanied by a paper presenting the results; the paper undergoes peer review and has to be accepted for the conference in order for the entry to count in the Challenge. The organisers reserve the right to re-evaluate the findings, but will not participate in the Challenge themselves. Participants are encouraged to compete in all Sub-Challenges.

Overall, contributions using the provided or equivalent data are sought, including (but not limited to):

*       Participation in a Sub-Challenge
*       Contributions focussing on Computational Paralinguistics centred around the Challenge topics

The results of the Challenge will be presented at Interspeech 2013 in Lyon, France. Prizes will be awarded to the Sub-Challenge winners. If you are interested in participating in INTERSPEECH 2013 ComParE, or if you want to be kept informed about the Challenge, please send the organisers an e-mail indicating your interest and visit the homepage: http://emotion-research.net/sigs/speech-sig/is13-compare


Organisers:

Björn Schuller (TUM, Germany)
Stefan Steidl (FAU Erlangen-Nuremberg, Germany)
Anton Batliner (TUM, Germany)
Alessandro Vinciarelli (University of Glasgow, UK)
Klaus Scherer (Swiss Center for Affective Sciences, Switzerland)
Fabien Ringeval (University of Fribourg, Switzerland)
Mohamed Chetouani (Université Pierre et Marie Curie, France)


Dates:

Paper Submission       18 March 2013
Final Result Upload    24 May 2013
Camera-ready Paper     29 May 2013


Sponsors:

HUMAINE Association    (http://emotion-research.net/)
SSPNet                 (http://sspnet.eu/)
ASC-Inclusion          (http://www.asc-inclusion.eu/)


___________________________________________

PD Dr. habil. DI Björn W. Schuller

Head Machine Intelligence & Signal Processing Group
Institute for Human-Machine Communication
Technische Universität München
D-80333 München
Germany

+49-(0)89-289-28548

schuller@xxxxxxxx
www.mmk.ei.tum.de/~sch
___________________________________________

