Re: [AUDITORY] Ambisonic Room Impulse Responses (Andrés Pérez López)


Subject: Re: [AUDITORY] Ambisonic Room Impulse Responses
From:    Andrés Pérez López
Date:    Fri, 14 Sep 2018 18:56:48 +0200
List-Archive: <http://lists.mcgill.ca/scripts/wa.exe?LIST=AUDITORY>

Dear Gauthier,

Regarding recorded Ambisonics RIRs, there are several publicly available resources. You can probably find one that fits your needs.

- P. Coleman and colleagues from Surrey University have been working in this field for a while. Within their S3A project, they gathered ambisonics RIRs from 5 different enclosures [3]. You can download the data from here [4] (upon registration).

- The Centre for Digital Music at QMUL provides ambisonic IR recordings [1] from 4 different rooms. (These are the isophonics datasets that you mention.)

- The Open Acoustic Impulse Response Library (OpenAIR [5]) from York University features ambisonics IR recordings from around 15 different rooms.

- Adavanne and colleagues from Tampere University have recently released a dataset of reverberant ambisonics recordings [6] (along with a synthetic IR version [7]) for sound event detection (SED). While the actual IRs are currently not available, it should be possible to contact the authors about that.

- My colleague J. de Muynke has just released a collection of ambisonic IRs gathered within the BINCI project [8]. It consists of 23 sets of recordings at three different European tourist sites, using three different microphones (including an Eigenmike). The files are stored in the recently proposed AmbisonicsDRIR SOFA convention [9] and can be used with the Matlab [10], C++ [11] and Python [12] APIs.

Given the growing interest in this kind of dataset, I hope that we can centralise the information in the near future. Personally, my bet would be on the OpenAIR library...

Best,

Andres

On 2018-09-03 14:35, Gauthier Berthomieu wrote:
> Dear list,
>
> I am currently working on an experiment that will take place in a
> virtual environment.
> In order to simulate a loudspeaker that renders
> sounds within a given distance range (from 1 to about 20 m) from the
> subject in a reverberant environment, I am looking for Ambisonic RIRs
> recorded at several distances from a sound source in the same
> environment.
>
> I already found datasets on isophonics [1] that almost match what I am
> looking for, but the closest RIRs are measured at a distance of 2 m,
> and I would need them to start at 1 m.
>
> Does someone know where such a dataset would be available?
>
> Thank you in advance,
> Gauthier
>
> --
> [2]
>
> Links:
> ------
> [1] http://isophonics.net/content/room-impulse-response-data-set
> [2] http://www.univ-brest.fr

Links:
------
[1] http://isophonics.net/content/room-impulse-response-data-set
[2] http://www.univ-brest.fr
[3] http://epubs.surrey.ac.uk/808465/
[4] http://cvssp.org/data/s3a/public/s3arir_register2.php
[5] http://www.openairlib.net/
[6] https://zenodo.org/record/1237793
[7] https://zenodo.org/record/1237703
[8] https://zenodo.org/record/1417727#.W5uO2pMzZ24
[9] https://zenodo.org/record/1299894
[10] https://github.com/jdemuynke/API_MO
[11] https://github.com/andresperezlopez/API_Cpp
[12] https://andresperezlopez.github.io/pysofaconventions/


This message came from the mail archive
src/postings/2018/
maintained by:
DAn Ellis <dpwe@ee.columbia.edu>
Electrical Engineering Dept., Columbia University