[AUDITORY] Non-intrusive Objective Speech Quality Assessment (NISQA) Challenge for Online Conferencing Applications (Conferencing Speech 2022) ("Williamson, Donald S." )


Subject: [AUDITORY] Non-intrusive Objective Speech Quality Assessment (NISQA) Challenge for Online Conferencing Applications (Conferencing Speech 2022)
From:    "Williamson, Donald S."  <williads@xxxxxxxx>
Date:    Sun, 30 Jan 2022 21:31:47 +0000

Dear all,

In order to stimulate research on non-intrusive objective speech quality assessment for online conferencing applications, we will hold the "Non-intrusive Objective Speech Quality Assessment (NISQA) Challenge for Online Conferencing Applications (Conferencing Speech 2022)" as a special session of INTERSPEECH 2022. We sincerely invite you to participate in this challenge.

With the advances in speech communication systems such as online conferencing applications, we can work seamlessly with people regardless of where they are. During a meeting, however, speech quality is significantly affected by background noise, reverberation, packet loss, network jitter, and many other factors. An effective objective assessment approach is therefore needed to evaluate or monitor the speech quality of an ongoing call. If severely degraded speech quality is detected, the statistical information can be shared with the service provider in time, enabling the provider to find the root cause and improve speech quality to guarantee a better user experience in the future.

This challenge will provide previously unpublished speech datasets with subjective ratings. These datasets contain hundreds of hours of speech samples with subjective Mean Opinion Scores (MOS), covering most of the impairment scenarios users might face in online speech communication. The final ranking of this challenge will be decided by the accuracy of the MOS scores predicted by the submitted model or algorithm on the test dataset. Each registered team may participate in the above task.
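The accuracy of predicted MOS against subjective MOS is commonly quantified with metrics such as RMSE and Pearson correlation. The challenge's official metric is defined in its evaluation plan, so the following is only an illustrative sketch of how such a comparison might be computed:

```python
import math

def mos_accuracy(predicted, subjective):
    """Compare predicted MOS against subjective ground-truth MOS.

    Returns (rmse, pearson_r). The metric choice here is illustrative;
    the challenge's official metric is defined in its evaluation plan.
    """
    n = len(predicted)
    # Root-mean-square error between predictions and ratings
    rmse = math.sqrt(sum((p - s) ** 2 for p, s in zip(predicted, subjective)) / n)
    # Pearson linear correlation coefficient
    mp = sum(predicted) / n
    ms = sum(subjective) / n
    cov = sum((p - mp) * (s - ms) for p, s in zip(predicted, subjective))
    sd_p = math.sqrt(sum((p - mp) ** 2 for p in predicted))
    sd_s = math.sqrt(sum((s - ms) ** 2 for s in subjective))
    return rmse, cov / (sd_p * sd_s)

# Hypothetical example: predictions close to the subjective ratings
rmse, r = mos_accuracy([3.1, 4.0, 2.2, 4.8], [3.0, 4.2, 2.0, 5.0])
```

Lower RMSE and higher correlation both indicate a better non-intrusive quality predictor.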
The top two winning teams will be awarded prizes provided by Tencent Ethereal Audio Lab: the First Prize is 1500 USD and the Second Prize is 800 USD. Internship opportunities at Tencent Ethereal Audio Lab will be offered to students with excellent performance. We plan to use the webpage of Tencent Ethereal Audio Lab (https://tea-lab.qq.com/) as the competition registration entry point, and a baseline system will be provided by the organizing committee.

A tentative schedule is as follows:
    Challenge registration opens: January 19, 2022
    Release of the evaluation plan, the list of training data, and the development test set: January 27, 2022
    Release of the baseline system: February 17, 2022
    Deadline for challenge registration: March 12, 2022
    Release of the evaluation test set: March 14, 2022
    Deadline for submitting results: March 17, 2022
    Notification of results to participants: March 20, 2022
    Interspeech paper submission deadline: March 21, 2022

For more detail, you can visit the webpage (https://tea-lab.qq.com/conferencingspeech-2022).

If you are interested in participating in the challenge and writing a paper for the special session, please do not reply to this address but to ConferencingSpeech@xxxxxxxx.

Regards,

The Conferencing Speech 2022 organizing committee

Gaoxiong Yi, Wei Xiao, Yiming Xiao, Babak Naderi, Sebastian Möller, Gabriel Mittag, Ross Cutler, Zhuohuang Zhang, Donald S. Williamson, Fei Chen, Fuzheng Yang and Shidong Shang

--

Donald S. Williamson
Assistant Professor
Department of Computer Science
Luddy School of Informatics, Computing and Engineering

Personal page: https://homes.luddy.indiana.edu/williads
ASPIRE Research Group: https://aspire.sice.indiana.edu
https://twitter.com/IUAspireRschGrp
https://www.instagram.com/iuaspirerschgrp/
https://www.youtube.com/channel/UCw6r8rLx5RypKBMeBjEzsjw/


This message came from the mail archive
src/postings/2022/
maintained by:
DAn Ellis <dpwe@ee.columbia.edu>
Electrical Engineering Dept., Columbia University