Re: [AUDITORY] Software for internet-based auditory testing (Julia Strand )


Subject: Re: [AUDITORY] Software for internet-based auditory testing
From:    Julia Strand  <jstrand@xxxxxxxx>
Date:    Wed, 4 Oct 2017 07:56:59 +0000
List-Archive: <http://lists.mcgill.ca/scripts/wa.exe?LIST=AUDITORY>

Hi Dick,

My lab also has a paper with some recommendations about collecting spoken word recognition data online using mTurk:

Slote, J., & Strand, J. (2016). Conducting spoken word recognition research online: Validation and a new timing method. Behavior Research Methods, 48(2), 553-566. doi: 10.3758/s13428-015-0599-7
https://www.dropbox.com/s/0lniloy1cwit2rh/Slote%2C%20Strand.%202016.%20Conducting%20spoken%20word%20recognition%20research%20online%20Validation%20and%20a%20new%20timing%20method.pdf?dl=0

Hope this is helpful.
Best,
Julia

On Wed, Oct 4, 2017 at 2:52 AM Brecht De Man <b.deman@xxxxxxxx> wrote:
> Our 'Web Audio Evaluation Tool' aims to address several of the points
> raised here, e.g.:
> - "inexpensive and simple to program": free, open source, and with an
>   optional GUI test creator
> - "ideally with response times": all timing information (clicks, plays, …)
>   is logged and can, for instance, be visualised as a timeline
>   (https://github.com/BrechtDeMan/WebAudioEvaluationTool/wiki/Features#metrics)
> - "good functionality for auditory playback": based on the Web Audio API
>   (HTML5), so no Flash, Java or other third-party software is needed; very
>   fast response and seamless switching; very widely compatible, including
>   mobile devices
> - "can be used for all kinds of experiments": implements a wide variety of
>   standards as presets, based on a few elementary interfaces: vertical and
>   horizontal sliders, Likert, AB(CD…), AB(CD…)X, ranking, and waveform
>   annotation
>   (https://github.com/BrechtDeMan/WebAudioEvaluationTool/wiki/Interfaces).
> Not so much 'method of adjustment' at this time.
>
> We welcome any contributions and feature requests, as we aim to make a
> maximally comprehensive yet elegant and easy-to-use listening test tool
> through community effort.
>
> I am not aware of any published use of it on Mechanical Turk (though it's
> something I want to try myself soon), but others have integrated it into
> systems which track the progress of several experiments, for instance.
> We've included some functionality to facilitate this, like the 'returnURL'
> attribute, which specifies the page to redirect to upon test completion.
>
> All info is at
> https://github.com/BrechtDeMan/WebAudioEvaluationTool
> and in
> Nicholas Jillings, Brecht De Man, David Moffat and Joshua D. Reiss, "Web
> Audio Evaluation Tool: A Browser-Based Listening Test Environment," 12th
> Sound and Music Computing Conference, July 2015.
> (http://smcnetwork.org/system/files/SMC2015_submission_88.pdf)
>
> Please send any questions, suggestions or comments you may have to
> b.deman@xxxxxxxx.
>
> Best wishes,
>
> Brecht
>
> ________________________________________________
>
> Brecht De Man
> Postdoctoral researcher
> Centre for Digital Music
> Queen Mary University of London
>
> School of Electronic Engineering and Computer Science
> Mile End Road
> London E1 4NS
> United Kingdom
>
> www.brechtdeman.com
> b.deman@xxxxxxxx
> Twitter <http://twitter.com/BrechtDeMan> | LinkedIn
> <http://www.linkedin.com/in/BrechtDeMan> | GitHub
> <https://github.com/BrechtDeMan>
> Google Scholar <https://scholar.google.com/citations?user=3ndnho0AAAAJ> |
> ResearchGate <https://www.researchgate.net/profile/Brecht_De_Man> |
> Academia <http://qmul.academia.edu/BrechtDeMan>
>
>
> On 4 Oct 2017, at 06:38, Richard F. Lyon <dicklyon@xxxxxxxx> wrote:
>
> Many thanks, Sam and Bryan and Kevin and all those who replied privately.
>
> I can see many possible ways forward; just need to get pecking at some...
>
> Dick
>
> On Tue, Oct 3, 2017 at 7:06 PM, kevin woods <kevinwoods@xxxxxxxx>
> wrote:
>
>> Further to Sam's email, here is a link to a code package we put together
>> to implement our headphone screening task (intended to improve the quality
>> of crowdsourced data): http://mcdermottlab.mit.edu/downloads.html
>>
>> We have generally found that the quality of data obtained online with our
>> screening procedure is comparable to that of data obtained in the lab on
>> the same experiments. For obvious reasons, we have only run experiments
>> where precise stimulus control seems unlikely to be critical.
>>
>> Please feel free to contact us at kwoods@xxxxxxxx with questions.
>>
>> Sincerely,
>>
>> Kevin Woods (on behalf of the McDermott Lab, Department of Brain and
>> Cognitive Sciences, MIT)
>>
>>
>> On Tue, Oct 3, 2017 at 12:59 AM, Samuel Mehr <sam@xxxxxxxx> wrote:
>>
>>> Dear Dick,
>>>
>>> Lots of folks do successful audio-based experiments on Turk, and I
>>> generally find it to be a good platform for the sort of work you're
>>> describing (which is not really what I do, but it is experimentally
>>> similar enough for the purposes of your question). I've done a few simple
>>> listening experiments of the form "listen to this thing, answer some
>>> questions about it", and the results directly replicate parallel
>>> in-person experiments in my lab, even when Turkers geolocate to lots of
>>> far-flung countries. I require subjects to wear headphones and validate
>>> that requirement with this great task from Josh McDermott's lab:
>>>
>>> Woods, K. J. P., Siegel, M. H., Traer, J., & McDermott, J. H. (2017).
>>> Headphone screening to facilitate web-based auditory experiments.
>>> Attention, Perception, & Psychophysics, 1-9.
>>> https://doi.org/10.3758/s13414-017-1361-2
>>>
>>> In a bunch of piloting, passing the headphone screener correlates
>>> positively with a bunch of other checks on Turker compliance, things like
>>> "What color is the sky? Please answer incorrectly, on purpose" and "Tell
>>> us honestly how carefully you completed this HIT". Basically, if you have
>>> a few metrics in an experiment that capture variance on some dimension
>>> related to participant quality, you should be able to easily tell which
>>> Turkers are actually doing good work and which aren't. Depending on how
>>> your ethics approval is set up, you can either pay everyone and filter
>>> out bad subjects, or require them to pass some level of quality control
>>> to receive payment.
>>>
>>> best
>>> Sam
>>>
>>>
>>> --
>>> Samuel Mehr
>>> Department of Psychology
>>> Harvard University
>>> themusiclab.org
>>> naturalhistoryofsong.org
>>>
>>>
>>>
>>> On Tue, Oct 3, 2017 at 8:57 AM, Richard F. Lyon <dicklyon@xxxxxxxx>
>>> wrote:
>>>
>>>> Five years on, are there any updates on experience using Mechanical
>>>> Turk and such for sound perception experiments?
>>>>
>>>> I've never conducted psychoacoustic experiments myself (other than
>>>> informal ones on myself), but now I think I have some modeling ideas
>>>> that need to be tuned and tested with corresponding experimental data.
>>>> Is MTurk the way to go? If it is, are IRB approvals still needed? I
>>>> don't even know if that applies to me; probably my company has
>>>> corresponding approval requirements.
>>>>
>>>> I'm interested in things like SNR thresholds for binaural detection and
>>>> localization of different types of signals and noises -- 2AFC tests
>>>> whose relative results across conditions would hopefully not be strongly
>>>> dependent on level or headphone quality. Are there good MTurk task
>>>> structures that motivate people to do a good job on these, e.g. by
>>>> making their space quieter, paying attention, getting more pay as the
>>>> task gets harder, or just getting to do more similar tasks, etc.? Can
>>>> the pay depend on performance? Or just cut them off when the SNR has
>>>> been lowered to threshold, so that people with lower thresholds stay on
>>>> and get paid longer?
>>>>
>>>> If anyone in academia has a good setup for human experiments and an
>>>> interest in collaborating on binaural model improvements, I'd love to
>>>> discuss that, too, either privately or on the list.
>>>>
>>>> Dick

--
Julia Strand, PhD
Assistant Professor of Psychology
Carleton College
One North College Street
Northfield, Minnesota 55057
507-222-5637 (office)
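[Editor's note] The McDermott-lab headphone screen cited throughout this thread relies on the principle (per Woods et al., 2017) that a stereo tone whose channels are in antiphase cancels acoustically when a single loudspeaker radiates both channels, but arrives at full level at each ear over headphones. A minimal NumPy illustration of that principle only; the frequency, duration, and levels here are illustrative, not the published stimulus parameters:

```python
import numpy as np

SR = 44_100    # sample rate in Hz (illustrative)
FREQ = 200.0   # tone frequency in Hz (illustrative)

def antiphase_tone(dur_s=0.5):
    """Stereo pure tone with the right channel phase-inverted."""
    t = np.arange(int(SR * dur_s)) / SR
    left = np.sin(2 * np.pi * FREQ * t)
    return left, -left

left, right = antiphase_tone()

# Over headphones, each ear receives a full-level tone
# (RMS of a unit-amplitude sine is about 0.707).
headphone_rms = np.sqrt(np.mean(left ** 2))

# A loudspeaker radiating the summed channels produces silence, so the
# level judgments this kind of screen asks for break down without headphones.
speaker_mix = left + right
print(float(np.max(np.abs(speaker_mix))))  # -> 0.0
```

This is why the screen separates headphone users from loudspeaker users: the same waveform is trivially audible in one playback condition and (nearly) inaudible in the other.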
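[Editor's note] Dick's closing idea, lowering the SNR until listeners reach threshold and paying those who last longer, is essentially an adaptive staircase. A minimal 2-down-1-up sketch (Levitt, 1971), not taken from any of the tools mentioned above; `respond` is a hypothetical callback standing in for a real 2AFC trial:

```python
def run_staircase(respond, start_db=20.0, step_db=2.0,
                  max_reversals=8, max_trials=500):
    """2-down-1-up adaptive staircase for a 2AFC task.

    `respond(snr_db)` returns True when the (real or simulated) listener
    answers a trial at that SNR correctly. The 2-down-1-up rule converges
    on the SNR yielding roughly 70.7% correct.
    """
    snr = start_db
    streak = 0         # consecutive correct responses
    direction = None   # -1: track descending (harder), +1: ascending (easier)
    reversals = []     # SNRs at which the track changed direction
    for _ in range(max_trials):
        if len(reversals) >= max_reversals:
            break
        if respond(snr):
            streak += 1
            if streak == 2:          # two correct in a row -> lower the SNR
                streak = 0
                if direction == +1:
                    reversals.append(snr)
                direction = -1
                snr -= step_db
        else:                        # any error -> raise the SNR
            streak = 0
            if direction == -1:
                reversals.append(snr)
            direction = +1
            snr += step_db
    last = reversals[-6:]            # average the final reversals
    return sum(last) / len(last)

# Idealized listener who is correct whenever the SNR is at least 5 dB;
# the staircase recovers that 5 dB threshold.
print(run_staircase(lambda snr_db: snr_db >= 5))  # -> 5.0
```

Stopping the track after a fixed number of reversals (rather than a fixed number of trials) is what makes "people with lower thresholds stay on longer" a natural payment scheme.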


This message came from the mail archive
../postings/2017/
maintained by:
DAn Ellis <dpwe@ee.columbia.edu>
Electrical Engineering Dept., Columbia University