
[AUDITORY] Online rhythm production experiments: Update




Thanks for the suggestions. Below is a brief summary of the responses I received. These came in three flavors:

1) solutions suggesting specific hardware at the client side (e.g., a two-channel audio card)
2) solutions using client-side software (e.g., JavaScript)
3) offline and/or post-processing solutions

For our purpose (relatively large-scale online rhythm production experiments), solution type 1 is unrealistic.
[Input from Werner Hemmert and others]

Solution type 2 has been tried by several researchers/institutes (using, e.g., PsychoPy, JavaScript, etc.). However, most report, as expected, relatively large timing errors, largely due to keyboard scan rates, drivers, and/or the operating system (as reported in the references mentioned in the original message), despite psychopy.org's claim of <4 ms precision in online studies.
[Input from Ignacio Spiousas, Nick Haywood, Ben Schultz, Kyle Jasmin and others]
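
For intuition on why keyboard scan rates alone can dominate the error budget: a key press is only registered at the next polling tick, so a scan period of T ms adds a delay that is roughly uniform on [0, T), i.e. a mean latency of T/2 and a jitter (SD) of T/sqrt(12), about 2.3 ms for T = 8 ms, before driver and OS delays are added on top. A minimal Python simulation of this quantization effect alone (the 8 ms period is an illustrative assumption, not a measurement from any of the studies mentioned):

    import numpy as np

    rng = np.random.default_rng(0)

    scan_period_ms = 8.0  # assumed keyboard polling interval (illustrative)
    # Simulated "true" tap times: one tap roughly every 500 ms.
    true_taps = np.cumsum(rng.normal(500.0, 10.0, size=1000))

    # A tap is only registered at the next polling tick, so the measured
    # time is the true time rounded up to a multiple of the scan period.
    measured = np.ceil(true_taps / scan_period_ms) * scan_period_ms

    error = measured - true_taps
    print(f"mean added latency: {error.mean():.2f} ms")  # ~ T/2 = 4.0 ms
    print(f"added jitter (SD):  {error.std():.2f} ms")   # ~ T/sqrt(12) = 2.31 ms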
N.B. PeerJ recently published a comparative study [1].

Solution type 3 was suggested by some: i.e., record the rhythmic pattern at the client side, along with the streamed sound, by tapping (e.g., with a pencil on your desk) near the device microphone, upload the resulting audio file using a standard browser, and analyse it at the server side using onset detection and cross-correlation techniques (see the sketch below). Depending on the sampling rate, latencies can be reduced to 1 ms or less: at 44.1 kHz a single sample spans ~0.023 ms, so the precision is limited by the onset detector rather than by the recording itself.
[Input from Roger Dannenberg, Krzysztof Basiński, Justin London and others]
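
To make this pipeline concrete, here is a minimal sketch of the server-side step, assuming the uploaded recording contains both the streamed stimulus and the taps, and that the original stimulus waveform is available on the server. The file names, the threshold, and the crude energy-based onset picker are placeholders; a real pipeline would use a proper onset detector and would separate tap transients from stimulus energy.

    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import correlate

    # Hypothetical file names: the client-side recording (stimulus plus taps,
    # picked up by the device microphone) and the stimulus that was streamed.
    sr_rec, recording = wavfile.read("upload.wav")      # assumed mono
    sr_stim, stimulus = wavfile.read("stimulus.wav")
    assert sr_rec == sr_stim, "resample first if the rates differ"
    recording = recording.astype(float)
    stimulus = stimulus.astype(float)

    # 1) Align the recording to the stimulus by cross-correlation. The lag
    #    of the correlation peak estimates the playback/recording offset
    #    with one-sample resolution (~0.023 ms at 44.1 kHz).
    corr = correlate(recording, stimulus, mode="full", method="fft")
    offset = np.argmax(corr) - (len(stimulus) - 1)  # samples recording lags by

    # 2) Crude energy-based onset picking for the taps: flag the first
    #    samples where a short-term amplitude envelope crosses a threshold.
    envelope = np.convolve(np.abs(recording), np.ones(64) / 64, mode="same")
    threshold = 5.0 * np.median(envelope)  # placeholder value
    above = envelope > threshold
    onsets = np.flatnonzero(above & ~np.roll(above, 1))

    # Tap times expressed on the stimulus clock, in milliseconds.
    tap_times_ms = (onsets - offset) / sr_rec * 1000.0
    print(tap_times_ms[:10])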

N.B.1 Ben Schultz announced that he will make his version of Solution type 2 available as open source (repeated below).
N.B.2 Nori Jacoby announced that his group will make their version of Solution type 3 available as an appendix to a forthcoming paper (repeated below).

Nevertheless, my hope still rests on an elegant solution of type 2. If you have one, please let us know.

Best,

Henkjan Honing

University of Amsterdam
Faculty of Humanities 
Faculty of Science

——

Subject: RE: Online rhythm production experiments
Date: 20 October 2020 at 07:16:47 CEST

Hi Henkjan and list,
 
I managed to get the latency and variability, synced with audio/video, down to the variability of the input device (~8 ms for keyboards; larger for touch screens, depending on the model). I have integrated this with HTML and JavaScript in Qualtrics and performed benchmark tests using an automated responder. Response times do not appear to be affected by internet connection speed (but I have not yet tried dial-up).
 
I am in the process of writing the manuscript with the benchmarks for publication, and the scripts will be open source. These could be adapted for any webpage.
 
Best regards,
Ben


From: "Jacoby, Nori" <nori.jacoby@xxxxxxxxx>
Subject: Re: Online rhythm production experiments
Date: 20 October 2020 at 16:10:20 CEST
Reply-To: "Jacoby, Nori" <nori.jacoby@xxxxxxxxx>

Hi Henkjan and everybody,

My research group has developed a technology that solves this problem and has allowed us to collect reliable tapping data in an online setup. We've successfully collected large tapping datasets this way, and we believe that our method fully addresses the issues mentioned in this thread (low latency and jitter) while also being practical for realistic online data collection. We plan to publish a preprint by the end of the year, thereby making the details of the technology accessible to everyone soon. If you are interested in using the technology earlier, please contact me.

Very best,
Nori Jacoby

Max Planck Group Leader, “Computational Auditory Perception”
Max Planck Institute for Empirical Aesthetics
Grüneburgweg 14, 60322 Frankfurt am Main, Germany
nori.jacoby@xxxxxxxxx +49 69 8300479-820