
Re: [AUDITORY] Online rhythm production experiments: Update



Dear Jonna,

Inquisit Web does indeed seem to be a type 2 solution (not open source, but still).

Thanks very much,

Henkjan

On 26 Oct 2020, at 13:15, Jonna Katariina Vuoskoski <j.k.vuoskoski@xxxxxxxxxx> wrote:

Dear Henkjan,

Has anyone so far suggested Inquisit Web (https://www.millisecond.com/products/inquisit6/weboverview.aspx) or E-Prime Go (https://pstnet.com/introducing-e-prime-go-for-remote-data-collection/)? Both solutions require the participant to download stand-alone experiment software to their own devices, so timing data is recorded within the native system of the participant’s device. We here at the RITMO Centre are planning to use Inquisit Web for a series of studies that rely on collecting precise timing information. Our laboratory engineer has run some preliminary tests of the timing accuracy, and the results so far seem promising.

Kind regards,

Jonna


****************************
Jonna Vuoskoski
Associate Professor in Music Cognition
RITMO Centre for Interdisciplinary Studies in Rhythm, Time, and Motion
Department of Musicology & Department of Psychology
University of Oslo





On 25 Oct 2020, at 12:32, Prof. dr Henkjan Honing <honing@xxxxxx> wrote:


Thanks for the suggestions. Below is a brief summary of the responses I received. These came in three flavors:

1) solutions suggesting specific hardware at the client side (e.g., using a two-channel audio card)
2) solutions using client-side software (e.g., JavaScript)
3) offline and/or post-processing solutions

For our purpose (relatively large-scale online rhythm production experiments), solution type 1 is unrealistic.
[input from Werner Hemmert and others]

Solution type 2 was tried by several researchers/institutes (using, e.g., PsychoPy, JavaScript, etc.). However, most report, as expected, relatively large timing errors, largely due to keyboard scan rates, drivers, and/or the operating system (as reported in the references mentioned in the original message), despite psychopy.org's claim of <4 ms precision in online studies.
[Input from Ignacio Spiousas, Nick Haywood, Ben Schultz, Kyle Jasmin and others]
N.B. PeerJ recently published a comparative study [1].
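
For concreteness, here is a minimal sketch of what such a type 2 solution looks like in the browser (TypeScript; the names are illustrative and not taken from any of the packages mentioned above). Note that even with high-resolution timestamps, keydown events still pass through the keyboard scan, driver, and OS stages that produce the errors reported above:

// Minimal sketch of client-side (type 2) tap capture; assumes a modern
// browser, and all names here are illustrative.
const audioCtx = new AudioContext();
const tapTimesMs: number[] = [];
let stimulusOnsetMs = 0; // stimulus onset on the performance.now() clock

async function playStimulus(url: string): Promise<void> {
  const data = await (await fetch(url)).arrayBuffer();
  const buffer = await audioCtx.decodeAudioData(data);
  const source = audioCtx.createBufferSource();
  source.buffer = buffer;
  source.connect(audioCtx.destination);
  // Schedule playback 100 ms ahead on the audio clock and note the
  // matching wall-clock time; the mapping between the two clocks is
  // only approximate, which is itself one source of jitter.
  source.start(audioCtx.currentTime + 0.1);
  stimulusOnsetMs = performance.now() + 100;
}

document.addEventListener("keydown", (event: KeyboardEvent) => {
  if (event.repeat) return; // ignore auto-repeat of a held key
  // event.timeStamp shares performance.now()'s clock in current
  // browsers, but still inherits keyboard scan-rate, driver, and OS
  // latency before the event reaches the page.
  tapTimesMs.push(event.timeStamp - stimulusOnsetMs);
});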

Solution type 3 was suggested by some: i.e., record the rhythmic pattern by tapping (e.g., with a pencil) on your desk near the device microphone, along with the streamed sound, at the client side; upload the resulting audio file using a standard browser; and analyse it at the server side using onset detection and cross-correlation techniques. Depending on the sampling rate, latencies can be reduced to 1 ms or less.
[Input from Roger Dannenberg, Krzysztof Basiński, Justin London and others] 
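
For concreteness, here is a hedged sketch of the server-side step (TypeScript; function names and thresholds are hypothetical, not from any of the tools mentioned): simple amplitude-threshold onset detection, plus a brute-force cross-correlation to recover the constant offset between the uploaded recording and the known stimulus:

// Hypothetical sketch of the type 3 server-side analysis; threshold and
// refractory values would need tuning per recording.
function detectOnsets(samples: Float32Array, sampleRate: number,
                      threshold = 0.1, refractoryMs = 80): number[] {
  const onsetsSec: number[] = [];
  const refractory = (refractoryMs / 1000) * sampleRate;
  let last = -Infinity;
  for (let i = 1; i < samples.length; i++) {
    const rising =
      Math.abs(samples[i]) >= threshold && Math.abs(samples[i - 1]) < threshold;
    if (rising && i - last > refractory) {
      onsetsSec.push(i / sampleRate); // onset time in seconds
      last = i;
    }
  }
  return onsetsSec;
}

// Lag (in samples) at which the recording best matches the reference
// stimulus, found by brute-force cross-correlation.
function bestLag(recorded: Float32Array, reference: Float32Array,
                 maxLag: number): number {
  let best = 0;
  let bestScore = -Infinity;
  for (let lag = 0; lag <= maxLag; lag++) {
    let score = 0;
    for (let i = 0; i + lag < recorded.length && i < reference.length; i++) {
      score += recorded[i + lag] * reference[i];
    }
    if (score > bestScore) {
      bestScore = score;
      best = lag;
    }
  }
  return best;
}

Subtracting bestLag(recorded, reference, maxLag) / sampleRate from each detected onset puts the taps on the stimulus timeline; one sample at 44.1 kHz is about 0.023 ms, so the residual error is dominated by onset detection rather than by the clock, consistent with the ~1 ms figure above.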

N.B.1 Ben Schultz announced that he will make his version of solution type 2 available as open source (message repeated below).
N.B.2 Nori Jacoby announced that they will make their version of solution type 3 available as an appendix to a forthcoming paper (message repeated below).

Nevertheless, my hope still rests on some elegant solution of type 2. If you have one, please let us know.

Best,

Henkjan Honing


——

Subject: RE: Online rhythm production experiments
Date: 20 October 2020 at 07:16:47 CEST

Hi Henkjan and list,
 
I managed to get the latency and variability of responses synced with audio/video down to the variability of the input device (~8 ms for keyboards; larger for touch screens, and dependent on the model). I have integrated this with HTML and JavaScript in Qualtrics and performed benchmark tests using an automated responder. Response times do not appear to be affected by internet connection speed (but I have not yet tried dial-up).
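
As a point of reference, a software-only check (not the automated-responder setup described above, which is as yet unpublished) could dispatch synthetic key events at known times and compare them with what the page logs. Synthetic events bypass the keyboard hardware latency that a physical responder captures, so this bounds only the in-page portion of the delay (TypeScript sketch; all names are illustrative):

// Hypothetical software-only check, NOT the hardware benchmark described
// above: synthetic events skip keyboard scan-rate and driver latency.
const loggedMs: number[] = [];
document.addEventListener("keydown", (e) => loggedMs.push(e.timeStamp));

const dispatchedMs: number[] = [];
for (let i = 1; i <= 10; i++) {
  setTimeout(() => {
    dispatchedMs.push(performance.now());
    document.dispatchEvent(new KeyboardEvent("keydown", { key: " " }));
  }, 500 * i);
}
// Afterwards, pairing dispatchedMs[i] with loggedMs[i] estimates the
// dispatch-to-handler jitter within the page itself.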
 
I am in the process of writing the manuscript with the benchmarks for publication and the scripts will be open-source. These could be adapted for any webpage. 
 
Best regards,
Ben


From: "Jacoby, Nori" <nori.jacoby@xxxxxxxxx>
Subject: Re: Online rhythm production experiments
Date: 20 October 2020 at 16:10:20 CEST
Reply-To: "Jacoby, Nori" <nori.jacoby@xxxxxxxxx>

Hi Henkjan and everybody,

My research group has developed a technology that has solved this problem and allowed us to collect reliable tapping data in an online setup. We’ve successfully collected large tapping datasets this way, and we believe that our method fully addresses the issues mentioned in this thread (low latency and jitter) while also being practical in terms of realistic online data collection. We plan to publish a preprint by the end of the year and therefore make the details of the technology accessible to everyone soon. If you are interested in using the technology earlier, please contact me.

Very best,
Nori Jacoby

Max Planck Group Leader, “Computational Auditory Perception”
Max Planck Institute for Empirical Aesthetics
Grüneburgweg 14, 60322 Frankfurt am Main, Germany
nori.jacoby@xxxxxxxxx +49 69 8300479-820


University of Amsterdam
Faculty of Humanities 
Faculty of Science
 
Prof. dr Henkjan Honing
Professor of Music Cognition
 
Music Cognition Group (MCG)
Amsterdam Brain & Cognition (ABC)
Institute for Logic, Language and Computation (ILLC)

Academic: www.mcg.uva.nl  