[AUDITORY] Fwd: Open access and visualization: Repovizz and the QUARTET dataset (Esteban Maestre)


Subject: [AUDITORY] Fwd: Open access and visualization: Repovizz and the QUARTET dataset
From:    Esteban Maestre  <esteban.maestre@xxxxxxxx>
Date:    Thu, 16 Feb 2017 12:28:28 -0500
List-Archive: <http://lists.mcgill.ca/scripts/wa.exe?LIST=AUDITORY>

(apologies for cross-postings)

I believe this could also be relevant for the Auditory community...

Best,

Esteban

-------- Forwarded Message --------
Subject: Open access and visualization: Repovizz and the QUARTET dataset
Date:    Wed, 15 Feb 2017 11:30:01 -0500
From:    Esteban Maestre <esteban.maestre@xxxxxxxx>
To:      community@xxxxxxxx

Dear ISMIR Community,

Over the past few years, this list has seen an increasingly active discussion about publishing and accessing datasets for reuse in academic research. Although sometimes driven by the concrete needs of a particular dataset or project, this topic is not incidental. In a data-driven research community like ours, it is very healthy to exchange ideas and perspectives on flexible means of making our data and results accessible--a valuable pursuit in support of research reproducibility.

The Music Technology Group of UPF hosts and provides free access to a number of datasets for music and audio research. As with most published datasets, one needs to download the data files and set up the means to explore them locally. For audio files or annotations alone, this generally poses no difficulty beyond data volume. However, as the number and nature of modalities, extracted descriptors, and annotations grow (think of motion capture, video, physiological signals, time series at different sample rates, etc.), difficulties arise not only in designing or adopting formatting schemes, but also in finding platforms that facilitate exchange by providing simple ways to visualize or explore the data remotely before downloading.

In the context of several recent projects on music performance analysis and multimodal interaction, we had to collect, process, and annotate music performance recordings that included dense data streams of different modalities. Envisioning the future release of our dataset to the research community, we realized the need for better means to explore and exchange data. Since then, at UPF we have been developing Repovizz, a remote hosting platform for multimodal data storage, visualization, annotation, and selective retrieval via a web interface and a dedicated API.

By way of the recently published article

E. Maestre, P. Papiotis, M. Marchini, Q. Llimona, O. Mayor, A. Pérez, M. Wanderley, "Enriched Multimodal Representations of Music Performances: Online Access and Visualization," IEEE MultiMedia, vol. 24, no. 1, pp. 24-34, 2017 (http://ieeexplore.ieee.org/document/7849104/),

we introduce Repovizz to the ISMIR Community and open access to the QUARTET dataset, a fully annotated collection of multimodal string quartet recordings released through Repovizz.

For a short, unadorned video demonstrating Repovizz, please go to http://www.youtube.com/watch?v=JcHbGtltuG4. Although still under development, Repovizz can be used by anyone in the academic community. The Repovizz website can be found at http://repovizz.upf.edu (enter as "guest" to check the documentation). The only requirement is to use Chrome or Safari (there is no mobile app yet).
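As a rough illustration of the kind of selective retrieval such an API is meant to support, here is a minimal Python sketch. The endpoint path, query parameters, and JSON field names below are hypothetical placeholders for illustration only, not the documented Repovizz API; please consult the documentation on the website for the real interface.

    # Minimal sketch of selective retrieval over a REST-style API.
    # NOTE: the base URL, query parameters, and JSON fields are
    # hypothetical placeholders; see http://repovizz.upf.edu for the
    # actual API documentation.
    import requests

    API_BASE = "http://repovizz.upf.edu/api"  # assumed, for illustration

    def fetch_stream(datapack_id, node_path):
        """Retrieve a single data stream (e.g. one audio track or one
        mocap signal) from a datapack, rather than the whole bundle."""
        resp = requests.get(
            "%s/datapacks/%s/stream" % (API_BASE, datapack_id),
            params={"node": node_path},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()  # e.g. {"samplerate": 240, "data": [...]}

    stream = fetch_stream("QUARTET_example", "MoCap/Violin1/BowVelocity")
    print(len(stream["data"]))

The point of stream-level access is that a study of, say, bowing gestures alone need not pull down the video and multichannel audio of an entire recording session first.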
The QUARTET dataset comprises 96 recordings of string quartet exercises in solo and ensemble conditions, containing multichannel audio (ambient microphones and piezoelectric pickups), video, motion capture (optical and magnetic) of instrumental gestures and of the musicians' upper bodies, computed bowing gesture signals, extracted audio descriptors, and multitrack score-performance alignment. The dataset, processed and curated over the past years partly in the context of Panos Papiotis's PhD dissertation on ensemble interdependence (http://www.tdx.cat/bitstream/10803/361107/2/tpp.pdf), is now freely available to the research community. A description of its contents, including links to the corresponding Repovizz entries, can be explored at http://mtg.upf.edu/download/datasets/quartet-dataset.
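Since these modalities arrive at very different rates (audio at tens of kHz, motion capture and derived bowing signals at a few hundred Hz, frame-based audio descriptors lower still), joint analysis typically begins by resampling the streams onto a common time base. Below is a minimal sketch of that step using NumPy; the sample rates and signal names are illustrative examples, not the dataset's exact specification.

    # Sketch: aligning two streams of different sample rates onto a
    # common time grid via linear interpolation. The rates shown are
    # illustrative; they are not the dataset's exact specification.
    import numpy as np

    def resample_to(t_target, t_source, x_source):
        """Linearly interpolate x_source (sampled at times t_source)
        onto the time grid t_target."""
        return np.interp(t_target, t_source, x_source)

    # e.g. a bowing signal at 240 Hz and an audio descriptor computed
    # with a 512-sample hop at 44.1 kHz (about 86 Hz)
    t_bow = np.arange(0.0, 10.0, 1.0 / 240.0)
    t_rms = np.arange(0.0, 10.0, 512.0 / 44100.0)
    bow_velocity = np.random.randn(t_bow.size)  # placeholder data
    rms_energy = np.random.rand(t_rms.size)     # placeholder data

    rms_on_bow_grid = resample_to(t_bow, t_rms, rms_energy)
    print(np.corrcoef(bow_velocity, rms_on_bow_grid)[0, 1])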
Repovizz and the QUARTET dataset represent only our own attempt at expanding the possibilities for exchanging and visualizing multimodal music performance data online. Although many issues remain to be resolved as we continue developing Repovizz, we hope that this initiative triggers further discussion and inspires others to propose and develop future means of nourishing a digital library ecosystem that enables rich interaction with multimodal music resources.

Best regards,

Esteban Maestre
...also on behalf of Panos Papiotis, Quim Llimona, and Oscar Mayor.

--
Esteban Maestre
Music Technology Group, Universitat Pompeu Fabra
Center for Interdisciplinary Research in Music Media and Technology, McGill University

