Re: [AUDITORY] Measuring perceptual similarity ("Bruno L. Giordano")


Subject: Re: [AUDITORY] Measuring perceptual similarity
From:    "Bruno L. Giordano"  <brungio@xxxxxxxx>
Date:    Wed, 21 Mar 2018 11:29:47 +0100
List-Archive:<http://lists.mcgill.ca/scripts/wa.exe?LIST=AUDITORY>

Dear Pat,

in addition to what was pointed out by Daniel, I personally favour pairwise dissimilarity ratings over sorting unless the number of stimuli is so large that it is not possible to acquire a full dissimilarity matrix in one humane experimental session (<= ~40 stimuli). As you might have read, dissimilarity ratings produce estimates of the distances that are much more reliable (because of the larger number of stimulus playbacks involved, I suppose), much less distorted (cf. the binarization of dissimilarities in free sorting and the skewed distribution of hierarchical-sorting dissimilarities), and much more indicative of stimulus features than the more efficient sorting methods.

Alternative methods come to mind that rely on the placement of stimuli in a visual space and take the inter-stimulus distances as estimates of the perceptual dissimilarities (e.g., Harbke, 2003; Kriegeskorte and Mur, 2012). Importantly and unsurprisingly, these "direct MDS" methods bias the perceptual space towards a 2D representation (see Harbke, 2003) and for this reason are a suboptimal choice for the discovery of perceptually relevant stimulus features.

In short, there is no free meal in the behavioural estimation of distances: if your goal is accuracy, methods that are less efficient in terms of time allocation are, in my opinion, still the best option.

Best,

     Bruno

@xxxxxxxx{harbke2003evaluation,
  author = {C. R. Harbke},
  title  = {Evaluation of data collection techniques for multidimensional scaling with large stimulus sets},
  school = {Washington State University, Department of Psychology},
  year   = {2003},
}

@xxxxxxxx{kriegeskorte2012inverse,
  title     = {Inverse MDS: Inferring dissimilarity structure from multiple item arrangements},
  author    = {Kriegeskorte, Nikolaus and Mur, Marieke},
  journal   = {Frontiers in Psychology},
  volume    = {3},
  pages     = {245},
  year      = {2012},
  publisher = {Frontiers}
}
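As a rough illustration of the trade-off described above: a full matrix for 40 stimuli already requires 40 x 39 / 2 = 780 pairwise ratings, and the collected dissimilarities are typically submitted to non-metric MDS. Below is a minimal analysis sketch, assuming group-averaged ratings in a symmetric matrix (random placeholder values here) and using scikit-learn's MDS as one possible tool; it is an illustration of the general approach, not code from any of the cited studies.

import numpy as np
from sklearn.manifold import MDS

n_stimuli = 40                                # roughly one humane session
n_pairs = n_stimuli * (n_stimuli - 1) // 2    # 780 pairwise ratings
print(n_pairs, "pairs needed for a full dissimilarity matrix")

# D: symmetric n_stimuli x n_stimuli matrix of averaged dissimilarity
# ratings, zeros on the diagonal (random placeholder values here).
rng = np.random.default_rng(0)
upper = np.triu(rng.random((n_stimuli, n_stimuli)), k=1)
D = upper + upper.T

# Non-metric MDS on the precomputed dissimilarities; compare stress across
# dimensionalities instead of assuming a 2D solution.
for n_dim in (1, 2, 3, 4):
    mds = MDS(n_components=n_dim, metric=False,
              dissimilarity="precomputed", random_state=0)
    mds.fit(D)
    print(n_dim, "dimensions, stress =", round(float(mds.stress_), 3))

The sketch only covers analysis of an already-collected matrix; the data-collection side is handled by tools such as those listed further down the thread.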
On 20 March 2018 at 12:29, Oberfeld-Twistel, Daniel <oberfeld@xxxxxxxx> wrote:

> Thanks for sharing the references!
>
> In my view, MUSHRA cannot be recommended for studying musical similarity.
>
> The method is designed to identify differences between stimuli on a defined dimension (which is audio quality in the MUSHRA recommendation, although this rating method could also be used for evaluating other perceptual dimensions).
>
> In the MUSHRA method, however, listeners are NOT asked to rate the similarity of the stimuli. While in principle information about similarity could be deduced in an indirect manner from the ratings obtained with MUSHRA (similar mean ratings = high similarity), this would require that you can specify the perceptual dimension on which your stimuli differ or are similar (say, rhythm, tempo, consonance/dissonance, mood, etc.).
>
> If that is not possible, the other approaches that were suggested, such as triadic tests or MDS, can be used *without* having to specify which exact dimension the similarity judgments should refer to, and can identify structures in the (dis)similarity ratings.
>
> In addition, I could imagine that the MUSHRA concepts of a high-quality "reference" and a low-quality "anchor" do not easily apply to the experiments you have in mind.
>
> Best
>
> Daniel
>
> ---------------------------------
> Dr. Daniel Oberfeld-Twistel
> Associate Professor
> Johannes Gutenberg - Universitaet Mainz
> Institute of Psychology
> Experimental Psychology
> Wallstrasse 3
> 55122 Mainz
> Germany
>
> Phone ++49 (0) 6131 39 39274
> Fax ++49 (0) 6131 39 39268
> http://www.staff.uni-mainz.de/oberfeld/
> https://www.facebook.com/WahrnehmungUndPsychophysikUniMainz
>
> *From:* AUDITORY - Research in Auditory Perception <AUDITORY@xxxxxxxx> *On Behalf Of* Pat Savage
> *Sent:* Tuesday, March 20, 2018 6:19 AM
> *To:* AUDITORY@xxxxxxxx
> *Subject:* Re: Measuring perceptual similarity
>
> Dear list,
>
> Thanks very much for all of your responses. I'm summarizing below all the reference recommendations I received.
>
> I still want to read some of these more fully, but so far my impression is that the Giordano et al. (2011) paper gives a good review of the benefits and drawbacks of previous methods, but that since it was published MUSHRA seems to have become the standard method for these types of subjective perceptual similarity ratings.
>
> Please let me know if I seem to be misunderstanding anything here.
>
> Cheers,
> Pat
>
> --
>
> Flexer, A., & Grill, T. (2016). The problem of limited inter-rater agreement in modelling music similarity. Journal of New Music Research, 45(3), 1–13.
>
> Susini, P., McAdams, S., & Winsberg, S. (1999). A multidimensional technique for sound quality assessment. Acta Acustica united with Acustica, 85, 650–656.
>
> Novello, A., McKinney, M. F., & Kohlrausch, A. (2006). Perceptual evaluation of music similarity. In Proceedings of the 7th International Conference on Music Information Retrieval. Retrieved from http://ismir2006.ismir.net/PAPERS/ISMIR06148_Paper.pdf
>
> Michaud, P. Y., Meunier, S., Herzog, P., Lavandier, M., & D'Aubigny, G. D. (2013). Perceptual evaluation of dissimilarity between auditory stimuli: An alternative to the paired comparison. Acta Acustica United with Acustica, 99(5), 806–815.
>
> Wolff, D., & Weyde, T. (2011). Adapting metrics for music similarity using comparative ratings. In Proceedings of the 12th International Society for Music Information Retrieval Conference (ISMIR 2011), 73–78.
>
> Giordano, B. L., Guastavino, C., Murphy, E., Ogg, M., Smith, B. K., & McAdams, S. (2011). Comparison of methods for collecting and modeling dissimilarity data: Applications to complex sound stimuli. Multivariate Behavioral Research, 46, 779–811.
>
> Collett, E., Marx, M., Gaillard, P., Roby, B., Fraysse, B., & Deguine, O. (2016). Categorization of common sounds by cochlear implanted and normal hearing adults. Hearing Research, 335, 207–219.
>
> International Telecommunication Union. (2015). Recommendation ITU-R BS.1534-3: Method for the subjective assessment of intermediate quality level of audio systems. Retrieved from https://www.itu.int/dms_pubrec/itu-r/rec/bs/R-REC-BS.1534-3-201510-I!!PDF-E.pdf
>
> Lavandier, M., Meunier, S., & Herzog, P. (2008). Identification of some perceptual dimensions underlying loudspeaker dissimilarities. Journal of the Acoustical Society of America, 123(6), 4186–4198.
>
> Dzhafarov, E. N., & Oldenburg, H. C. (2006). Reconstructing distances among objects from their discriminability. Psychometrika, 71(2), 365–386.
>
> Software:
>
> Various:
> http://www.ak.tu-berlin.de/menue/digitale_ressourcen/research_tools/whisper_listening_test_toolbox_for_matlab/
>
> MUSHRA:
> https://github.com/audiolabs/webMUSHRA
>
> Free-sorting:
> http://petra.univ-tlse2.fr/spip.php?rubrique49&lang=en
>
> ---
> Dr. Patrick Savage
> Project Associate Professor
> Faculty of Environment and Information Studies
> Keio University SFC (Shonan Fujisawa Campus)
> http://PatrickESavage.com

--
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Bruno L. Giordano, PhD – CR1
Institut de Neurosciences de la Timone (INT)
UMR 7289, CNRS and Aix Marseille Université
http://www.int.univ-amu.fr/
Campus santé Timone
27, boulevard Jean Moulin
13385 Marseille cedex 5
Www: http://www.brunolgiordano.net
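As a small companion note to the "binarization" point raised at the top of the thread: in free sorting, each participant contributes only a same-group/different-group (0/1) judgment per stimulus pair, so the group-averaged dissimilarity matrix can take at most one more distinct value than there are participants. A minimal sketch with hypothetical sorting labels (not data from any cited study):

import numpy as np

# Each row: one participant's sorting, i.e. a group label per stimulus
# (hypothetical data: 3 participants sorting 6 stimuli).
sortings = np.array([
    [0, 0, 1, 1, 2, 2],
    [0, 1, 1, 1, 2, 2],
    [0, 0, 0, 1, 2, 2],
])

n_subjects, n_stimuli = sortings.shape
D = np.zeros((n_stimuli, n_stimuli))
for labels in sortings:
    # Per participant, dissimilarity is binary: 0 = same group, 1 = different group.
    D += (labels[:, None] != labels[None, :]).astype(float)
D /= n_subjects  # averaged matrix has at most n_subjects + 1 distinct values

print(np.round(D, 2))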

