Re: [AUDITORY] Measuring perceptual similarity ("Weisser, Adam")


Subject: Re: [AUDITORY] Measuring perceptual similarity
From:    "Weisser, Adam"  <Adam.Weisser@xxxxxxxx>
Date:    Thu, 22 Mar 2018 08:12:40 +0000
List-Archive:<http://lists.mcgill.ca/scripts/wa.exe?LIST=AUDITORY>

Dear Pat and all,

I agree with the last comments about the inadequacy of the suggested tests for evaluating similarity, but would like to add some more. Statistically speaking, testing difference and testing similarity are two different hypotheses, even if they appear the same as far as the measurement procedures go. For a difference test (e.g., a pairwise comparison), failing to reject the null hypothesis (no difference between stimuli) does not establish that the stimuli are similar. I'm unaware of studies in hearing that have used these methods, but it is a well-established type of study in sensory science (showing that two ingredients taste the same in a mix) and in pharmacology (showing equivalence between generic and original drugs), for example. In equivalence tests, the hypotheses should be inverted, which is best done using an interval type of hypothesis, depending on the expected distribution from the specific test design.
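
For concreteness, such an interval hypothesis can be run as two one-sided t-tests (TOST). Below is a minimal sketch in Python, assuming SciPy; the paired ratings and the equivalence margin delta are fabricated for illustration, and a real margin would have to be justified for the task at hand (see the reference below):

    import numpy as np
    from scipy import stats

    def tost_paired(x, y, delta):
        """Reject H0: |mean(x - y)| >= delta in favour of equivalence."""
        d = np.asarray(x, float) - np.asarray(y, float)
        se = d.std(ddof=1) / np.sqrt(d.size)
        df = d.size - 1
        p_lo = stats.t.sf((d.mean() + delta) / se, df)   # H0a: mean <= -delta
        p_hi = stats.t.cdf((d.mean() - delta) / se, df)  # H0b: mean >= +delta
        return max(p_lo, p_hi)  # equivalence claimed only if BOTH reject

    rng = np.random.default_rng(0)
    a = rng.normal(5.0, 1.0, 30)      # fabricated ratings of stimulus A
    b = a + rng.normal(0.0, 0.3, 30)  # fabricated ratings of stimulus B
    print("TOST p =", tost_paired(a, b, delta=0.5))  # < .05 -> equivalent

If both one-sided tests reject at the chosen alpha (i.e., the returned p-value falls below alpha), the mean difference is confined to (-delta, +delta) and equivalence can be claimed; statsmodels ships a ready-made version of this test as ttost_paired.
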
A good reference about this is:

Bi, J. (2005). Similarity testing in sensory and consumer research. Food Quality and Preference, 16(2), 139-149.

All the best,
Adam Weisser.

_______________________________________________
Adam Weisser
PhD Candidate in Acoustics and Hearing, Macquarie University
Australian Hearing Hub
16 University Ave
Macquarie University NSW 2109 Australia

________________________________
From: AUDITORY - Research in Auditory Perception [AUDITORY@xxxxxxxx] on behalf of Pat Savage [patsavagenz@xxxxxxxx]
Sent: Thursday, March 22, 2018 6:19 PM
To: AUDITORY@xxxxxxxx
Subject: Re: [AUDITORY] Measuring perceptual similarity

Hi Bruno, Sabine, Daniel, and all,

Thanks for the clarification. I was indeed wondering how the MUSHRA approach would deal with multidimensionality, but since several people had independently suggested it, I figured I just needed to read more to see how it was done…

I think for now we will probably stick with pairwise dissimilarity ratings to start with, using a small sample of <= 20 stimuli, while also thinking about other possible options…

Cheers,
Pat
---
Dr. Patrick Savage
http://PatrickESavage.com

On Mar 21, 2018, at 19:29, Bruno L. Giordano <brungio@xxxxxxxx> wrote:

Dear Pat,

In addition to what was pointed out by Daniel, I personally favour pairwise dissimilarity ratings over sorting, unless the number of stimuli is so large that it is not possible to acquire a full dissimilarity matrix in one humane experimental session (<= ~40 stimuli). As you might have read, dissimilarity ratings produce estimates of the distances that are much more reliable (because of the larger number of stimulus playbacks involved, I suppose), much less distorted (cf. the binarization of dissimilarities in free sorting and the skewed distribution of hierarchical-sorting dissimilarities), and much more indicative of stimulus features than the more efficient sorting methods.

Alternative methods come to mind that rely on the placement of stimuli in a visual space and take the inter-stimulus distances as estimates of the perceptual dissimilarities (e.g., Harbke, 2003; Kriegeskorte and Mur, 2012). Importantly and unsurprisingly, these "direct MDS" methods bias the perceptual space towards a 2D representation (see Harbke, 2003) and for this reason are a suboptimal choice for the discovery of perceptually relevant stimulus features.

In short, there is no free lunch in the behavioural estimation of distances: if your goal is accuracy, methods that are less efficient from the time-allocation point of view are, in my opinion, still the best option.
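
To make this concrete: once a full dissimilarity matrix has been rated, the perceptual configuration can be recovered with non-metric MDS. A minimal sketch, assuming scikit-learn and a fabricated noisy matrix; note that n stimuli require n(n-1)/2 ratings per listener, so 20 stimuli mean 190 pairs and ~40 stimuli already mean 780:

    import numpy as np
    from sklearn.manifold import MDS

    rng = np.random.default_rng(1)
    n = 20                                  # 20 stimuli -> 190 pairs to rate
    truth = rng.normal(size=(n, 2))         # fabricated "perceptual" layout
    d = np.linalg.norm(truth[:, None] - truth[None, :], axis=-1)
    d += rng.normal(0.0, 0.05, (n, n))      # mimic rating noise
    d = np.clip((d + d.T) / 2, 0.0, None)   # symmetric and non-negative
    np.fill_diagonal(d, 0.0)                # zero self-dissimilarity

    mds = MDS(n_components=2, metric=False,  # non-metric: uses ranks only
              dissimilarity="precomputed", random_state=0)
    space = mds.fit_transform(d)             # n x 2 perceptual coordinates
    print("stress:", round(mds.stress_, 3))

Comparing the stress of fits across several values of n_components is the usual check that a 2D picture reflects the data rather than the display; the bias documented by Harbke (2003) concerns the direct placement collection methods, not the MDS fit itself.
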
Best,

     Bruno

@xxxxxxxx{harbke2003evaluation,
  author = {C. R. Harbke},
  title  = {Evaluation of data collection techniques for multidimensional scaling with large stimulus sets},
  school = {Washington State University, Department of Psychology},
  year   = {2003},
}

@xxxxxxxx{kriegeskorte2012inverse,
  title     = {Inverse MDS: Inferring dissimilarity structure from multiple item arrangements},
  author    = {Kriegeskorte, Nikolaus and Mur, Marieke},
  journal   = {Frontiers in Psychology},
  volume    = {3},
  pages     = {245},
  year      = {2012},
  publisher = {Frontiers}
}

On 20 March 2018 at 12:29, Oberfeld-Twistel, Daniel <oberfeld@xxxxxxxx> wrote:

Thanks for sharing the references!

In my view, MUSHRA cannot be recommended for studying musical similarity.

The method is designed to identify differences between stimuli on a defined dimension (which is audio quality in the MUSHRA recommendation, although this rating method could also be used to evaluate other perceptual dimensions).

In the MUSHRA method, however, listeners are NOT asked to rate the similarity of the stimuli. While in principle information about similarity could be deduced indirectly from the ratings obtained with MUSHRA (similar mean ratings = high similarity), this would require that you can specify the perceptual dimension on which your stimuli differ or are similar (say, rhythm, tempo, consonance/dissonance, mood, etc.).

If that is not possible, the other approaches that were suggested, such as triadic tests or MDS, can be used *without* having to specify which exact dimension the similarity judgments should refer to, and can identify structures in the (dis)similarity ratings.

In addition, I could imagine that the MUSHRA concepts of a high-quality "reference" and a low-quality "anchor" do not easily apply to the experiments you have in mind.

Best

Daniel

---------------------------------
Dr. Daniel Oberfeld-Twistel
Associate Professor
Johannes Gutenberg-Universitaet Mainz
Institute of Psychology
Experimental Psychology
Wallstrasse 3
55122 Mainz
Germany
Phone +49 (0) 6131 39 39274
Fax +49 (0) 6131 39 39268
http://www.staff.uni-mainz.de/oberfeld/
https://www.facebook.com/WahrnehmungUndPsychophysikUniMainz

From: AUDITORY - Research in Auditory Perception <AUDITORY@xxxxxxxx> On Behalf Of Pat Savage
Sent: Tuesday, March 20, 2018 6:19 AM
To: AUDITORY@xxxxxxxx
Subject: Re: Measuring perceptual similarity

Dear list,

Thanks very much for all of your responses. I'm summarizing below all the reference recommendations I received.

I still want to read some of these more fully, but so far my impression is that the Giordano et al. (2011) paper gives a good review of the benefits and drawbacks of previous methods, and that since it was published MUSHRA seems to have become the standard method for these types of subjective perceptual similarity ratings.

Please let me know if I seem to be misunderstanding anything here.

Cheers,
Pat
--
Flexer, A., & Grill, T. (2016). The problem of limited inter-rater agreement in modelling music similarity. Journal of New Music Research, 45(3), 1-13.

Susini, P., McAdams, S., & Winsberg, S. (1999). A multidimensional technique for sound quality assessment. Acta Acustica united with Acustica, 85, 650-656.

Novello, A., McKinney, M. F., & Kohlrausch, A. (2006). Perceptual evaluation of music similarity. In Proceedings of the 7th International Conference on Music Information Retrieval. Retrieved from http://ismir2006.ismir.net/PAPERS/ISMIR06148_Paper.pdf

Michaud, P. Y., Meunier, S., Herzog, P., Lavandier, M., & d'Aubigny, G. D. (2013). Perceptual evaluation of dissimilarity between auditory stimuli: An alternative to the paired comparison. Acta Acustica united with Acustica, 99(5), 806-815.

Wolff, D., & Weyde, T. (2011). Adapting metrics for music similarity using comparative ratings. In Proceedings of the 12th International Society for Music Information Retrieval Conference (ISMIR '11), 73-78.

Giordano, B. L., Guastavino, C., Murphy, E., Ogg, M., Smith, B. K., & McAdams, S. (2011). Comparison of methods for collecting and modeling dissimilarity data: Applications to complex sound stimuli. Multivariate Behavioral Research, 46, 779-811.

Collett, E., Marx, M., Gaillard, P., Roby, B., Fraysse, B., & Deguine, O. (2016). Categorization of common sounds by cochlear implanted and normal hearing adults. Hearing Research, 335, 207-219.

International Telecommunication Union. (2015). Method for the subjective assessment of intermediate quality level of audio systems (ITU-R Recommendation BS.1534-3). Retrieved from https://www.itu.int/dms_pubrec/itu-r/rec/bs/R-REC-BS.1534-3-201510-I!!PDF-E.pdf

Lavandier, M., Meunier, S., & Herzog, P. (2008). Identification of some perceptual dimensions underlying loudspeaker dissimilarities. Journal of the Acoustical Society of America, 123(6), 4186-4198.

Dzhafarov, E. N., & Colonius, H. (2006). Reconstructing distances among objects from their discriminability. Psychometrika, 71(2), 365-386.

Software:

Various: http://www.ak.tu-berlin.de/menue/digitale_ressourcen/research_tools/whisper_listening_test_toolbox_for_matlab/

MUSHRA: https://github.com/audiolabs/webMUSHRA

Free-sorting: http://petra.univ-tlse2.fr/spip.php?rubrique49&lang=en

---
Dr. Patrick Savage
Project Associate Professor
Faculty of Environment and Information Studies
Keio University SFC (Shonan Fujisawa Campus)
http://PatrickESavage.com

--
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Bruno L. Giordano, PhD - CR1
Institut de Neurosciences de la Timone (INT)
UMR 7289, CNRS and Aix Marseille Université
http://www.int.univ-amu.fr/
Campus santé Timone
27, boulevard Jean Moulin
13385 Marseille cedex 5
Www: http://www.brunolgiordano.net

