Re: [AUDITORY] Software for internet-based auditory testing (Mark Cartwright )


Subject: Re: [AUDITORY] Software for internet-based auditory testing
From:    Mark Cartwright  <mcartwright@xxxxxxxx>
Date:    Wed, 4 Oct 2017 18:04:23 +0000
List-Archive:<http://lists.mcgill.ca/scripts/wa.exe?LIST=AUDITORY>

To add to the list of tools, here are some relevant audio crowdsourcing
tools that I've worked on:

1) CAQE - a tool for running crowdsourced audio quality evaluation (both
pairwise and multi-stimulus tests), which is set up for easy use with MTurk
workers and Heroku hosting.
https://github.com/interactiveaudiolab/CAQE/issues

2) Audio Annotator - a browser-based sound-event annotation tool that we've
used in crowdsourced experiments
https://github.com/CrowdCurio/audio-annotator

3) HearingScreening.js - a browser-based implementation of a simple
tone-counting hearing screening that ensures a wide-bandwidth listening
environment (i.e. not laptop speakers) for crowdsourced participants
https://github.com/mcartwright/hearing-screening.js

Mark

On Wed, Oct 4, 2017 at 3:49 AM Brecht De Man <b.deman@xxxxxxxx> wrote:

> Our 'Web Audio Evaluation Tool' aims to address several of the points
> raised here, e.g.:
> - "inexpensive and simple to program": free, open source, and with an
> optional GUI test creator
> - "ideally with response times": all timing information (clicks, plays, ...)
> is logged and can, for instance, be visualised as a timeline
> (https://github.com/BrechtDeMan/WebAudioEvaluationTool/wiki/Features#metrics)
> - "good functionality for auditory playback": based on the Web Audio API
> (HTML), so no Flash, Java or other 3rd-party software is needed; very fast
> response and seamless switching; very widely compatible, including mobile
> devices
> - "can be used for all kinds of experiments": implements a wide variety of
> standards as presets, based on a few elementary interfaces: vertical and
> horizontal sliders, Likert, AB(CD...), AB(CD...)X, ranking, and waveform
> annotation
> (https://github.com/BrechtDeMan/WebAudioEvaluationTool/wiki/Interfaces).
> Not so much 'method of adjustment' at this time.
>
> We welcome any contributions and feature requests, as we aim to make a
> maximally comprehensive yet elegant and easy-to-use listening test tool
> through community effort.
>
> I am not aware of any published use of it on Mechanical Turk - though it's
> something I want to try myself soon - but others have integrated it into
> systems which track the progress of several experiments, for instance.
> We've included some functionality to facilitate this, like the 'returnURL'
> attribute, which specifies the page to direct to upon test completion.
>
> All info on
> https://github.com/BrechtDeMan/WebAudioEvaluationTool
> and
> Nicholas Jillings, Brecht De Man, David Moffat and Joshua D. Reiss, "Web
> Audio Evaluation Tool: A Browser-Based Listening Test Environment," 12th
> Sound and Music Computing Conference, July 2015.
> (http://smcnetwork.org/system/files/SMC2015_submission_88.pdf)
>
> Please send any questions, suggestions or comments you may have to
> b.deman@xxxxxxxx.
>
> Best wishes,
>
> Brecht
>
> ________________________________________________
>
> Brecht De Man
> Postdoctoral researcher
> Centre for Digital Music
> Queen Mary University of London
>
> School of Electronic Engineering and Computer Science
> Mile End Road
> London E1 4NS
> United Kingdom
>
> www.brechtdeman.com
> b.deman@xxxxxxxx
> Twitter <http://twitter.com/BrechtDeMan> | LinkedIn
> <http://www.linkedin.com/in/BrechtDeMan> | GitHub
> <https://github.com/BrechtDeMan>
> Google Scholar <https://scholar.google.com/citations?user=3ndnho0AAAAJ> |
> ResearchGate <https://www.researchgate.net/profile/Brecht_De_Man> |
> Academia <http://qmul.academia.edu/BrechtDeMan>
>
>
> On 4 Oct 2017, at 06:38, Richard F. Lyon <dicklyon@xxxxxxxx> wrote:
>
> Many thanks, Sam and Bryan and Kevin and all those who replied privately.
>
> I can see many possible ways forward; I just need to get pecking at some...
>
> Dick
>
> On Tue, Oct 3, 2017 at 7:06 PM, kevin woods <kevinwoods@xxxxxxxx> wrote:
>
>> Further to Sam's email, here is a link to a code package we put together
>> to implement our headphone screening task (intended to improve the
>> quality of crowdsourced data): http://mcdermottlab.mit.edu/downloads.html
>>
>> We have generally found that the quality of data obtained online with
>> our screening procedure is comparable to that of data obtained in the
>> lab on the same experiments. For obvious reasons, we have only run
>> experiments where precise stimulus control seems unlikely to be critical.
>>
>> Please feel free to contact us at kwoods@xxxxxxxx with questions.
>>
>> Sincerely,
>>
>> Kevin Woods (on behalf of the McDermott Lab, Department of Brain and
>> Cognitive Sciences, MIT)
>>
>> On Tue, Oct 3, 2017 at 12:59 AM, Samuel Mehr <sam@xxxxxxxx> wrote:
>>
>>> Dear Dick,
>>>
>>> Lots of folks do successful audio-based experiments on Turk, and I
>>> generally find it to be a good platform for the sort of work you're
>>> describing (which is not really what I do, but is experimentally
>>> similar enough for the purposes of your question). I've done a few
>>> simple listening experiments of the form "listen to this thing, answer
>>> some questions about it", and the results directly replicate parallel
>>> in-person experiments in my lab, even when Turkers geolocate to lots of
>>> far-flung countries. I require subjects to wear headphones and validate
>>> that requirement with this great task from Josh McDermott's lab:
>>>
>>> Woods, K. J. P., Siegel, M. H., Traer, J., & McDermott, J. H. (2017).
>>> Headphone screening to facilitate web-based auditory experiments.
>>> Attention, Perception, & Psychophysics, 1-9.
>>> https://doi.org/10.3758/s13414-017-1361-2
>>>
>>> In a bunch of piloting, passing the headphone screener correlates
>>> positively with a bunch of other checks on Turker compliance - things
>>> like "What color is the sky? Please answer incorrectly, on purpose" and
>>> "Tell us honestly how carefully you completed this HIT". Basically, if
>>> you have a few metrics in an experiment that capture variance on some
>>> dimension related to participant quality, you should be able to easily
>>> tell which Turkers are actually doing good work and which aren't.
>>> Depending on how your ethics approval is set up, you can either pay
>>> everyone and filter out bad subjects, or require them to pass some
>>> level of quality control to receive payment.
>>>
>>> best
>>> Sam
>>>
>>> --
>>> Samuel Mehr
>>> Department of Psychology
>>> Harvard University
>>> themusiclab.org
>>> naturalhistoryofsong.org
>>>
>>> On Tue, Oct 3, 2017 at 8:57 AM, Richard F. Lyon <dicklyon@xxxxxxxx>
>>> wrote:
>>>
>>>> Five years on, are there any updates on experience using Mechanical
>>>> Turk and such for sound perception experiments?
>>>>
>>>> I've never conducted psychoacoustic experiments myself (other than
>>>> informal ones on myself), but now I think I have some modeling ideas
>>>> that need to be tuned and tested with corresponding experimental data.
>>>> Is MTurk the way to go? If it is, are IRB approvals still needed? I
>>>> don't even know if that applies to me; probably my company has
>>>> corresponding approval requirements.
>>>>
>>>> I'm interested in things like SNR thresholds for binaural detection
>>>> and localization of different types of signals and noises -- 2AFC
>>>> tests whose relative results across conditions would hopefully not be
>>>> strongly dependent on level or headphone quality. Are there good MTurk
>>>> task structures that motivate people to do a good job on these, e.g.
>>>> by making their space quieter, paying attention, getting more pay as
>>>> the task gets harder, or just getting to do more similar tasks, etc.?
>>>> Can the pay depend on performance? Or just cut them off when the SNR
>>>> has been lowered to threshold, so that people with lower thresholds
>>>> stay on and get paid longer?
>>>>
>>>> If anyone in academia has a good setup for human experiments and an
>>>> interest in collaborating on binaural model improvements, I'd love to
>>>> discuss that, too, either privately or on the list.
>>>>
>>>> Dick

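[Editor's note] Dick's idea of cutting subjects off once the SNR reaches
threshold is essentially an adaptive staircase. A minimal 2-down-1-up
sketch follows; this is the standard transformed up-down procedure
(converging on the 70.7%-correct point), not code from any of the tools
mentioned above, and the starting SNR and step size are arbitrary.

```python
def staircase_2afc(respond, start_snr=20.0, step=2.0, max_reversals=8):
    """2-down-1-up adaptive staircase for a 2AFC task.

    `respond(snr)` runs one trial at the given SNR (dB) and returns
    True if the response was correct. SNR is lowered after every two
    consecutive correct responses and raised after each error; the
    threshold is estimated as the mean SNR at the reversal points.
    """
    snr = start_snr
    streak = 0              # consecutive correct responses
    last_direction = None   # 'up' or 'down'
    reversals = []
    while len(reversals) < max_reversals:
        if respond(snr):
            streak += 1
            if streak < 2:
                continue
            streak = 0
            if last_direction == 'up':      # direction change: reversal
                reversals.append(snr)
            last_direction = 'down'
            snr -= step                     # harder
        else:
            streak = 0
            if last_direction == 'down':    # direction change: reversal
                reversals.append(snr)
            last_direction = 'up'
            snr += step                     # easier
    return sum(reversals) / len(reversals)
```

With a simulated ideal observer whose true threshold is 5 dB, e.g.
`staircase_2afc(lambda snr: snr >= 5)`, the staircase oscillates around
5 dB and the reversal mean recovers it. Pay-per-trial at threshold, as
Dick suggests, then rewards lower-threshold listeners with longer runs.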

This message came from the mail archive
../postings/2017/
maintained by:
DAn Ellis <dpwe@ee.columbia.edu>
Electrical Engineering Dept., Columbia University