[AUDITORY] Final call for submissions to a Special Collection of Music & Science on "Explaining Music with AI" (David Meredith)


Subject: [AUDITORY] Final call for submissions to a Special Collection of Music & Science on "Explaining Music with AI"
From:    David Meredith  <dave@xxxxxxxx>
Date:    Sat, 2 Jul 2022 20:32:44 +0200

Dear list members,

Please find below our final call for submissions to our special collection of Music & Science on “Explaining music with AI: Advancing the scientific understanding of music through computation”.

Note that there is no need to send an expression of interest at this stage. Note also that the deadline for submission is 31 August 2022 and that we expect to have the collection published by May 2023.

Kind regards,

David Meredith, Anja Volk and Tom Collins


Final Call for Papers for a Special Collection of Music & Science on
“Explaining music with AI: Advancing the scientific understanding of music through computation”

Guest edited by David Meredith, Anja Volk & Tom Collins

In recent years, a huge number of publications, particularly in the areas of music information retrieval and music generation, have reported on projects in which deep learning neural network models have been used successfully to carry out a wide variety of generation, regression and classification tasks on musical data. This work has significantly contributed to the arsenal of computational tools that we have at our disposal if we want to explore, organise, create or simply enjoy digital music resources.

However, the majority of such models are “black box” models that have thousands or even millions of free parameters, whose values are determined through training on, typically, large amounts of data. The computing pioneer John von Neumann allegedly joked, “With four free variables I can fit an elephant and with five I can make him wiggle his trunk” (Freeman Dyson, 2004, “A meeting with Enrico Fermi”, Nature, 427:297). Such considerations prompt us to question whether such black-box deep learning models make a significant contribution to our scientific understanding of music, musical processes and musical behaviour.
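To make von Neumann’s point concrete, here is a minimal, purely illustrative Python sketch (not part of the work described in this call; the data values are invented for the example). A polynomial with five free coefficients can pass exactly through any five data points, however arbitrary, so a perfect fit by itself demonstrates nothing:

    import numpy as np

    # Five arbitrary "observations" -- any five points would do.
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([3.1, -4.1, 5.9, -2.6, 5.3])  # made-up values

    # A degree-4 polynomial has five free coefficients, so it can
    # interpolate all five points exactly -- "fitting the elephant".
    coeffs = np.polyfit(x, y, deg=4)
    fitted = np.polyval(coeffs, x)

    print(np.allclose(fitted, y))  # True: zero training error, no insight

A model that reproduces its training data perfectly need not explain anything about the process that generated the data; the same concern applies, at far larger scale, to black-box deep learning models of music.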
For this special collection, we seek high quality contributions that report on recent research in which any computational method has been used to advance our understanding of how and why music is created, communicated and received. We are particularly interested in shining a light on computational methods that have perhaps not received the attention they deserve because of the dominance of deep learning in recent years. At the same time, contributions in which deep learning and other neural network models have been shown to advance the scientific understanding of music are also very welcome.

Submissions may address any aspect of musical behaviour, including, but not limited to, composition, improvisation, performance, listening, musical gestures and dance. Contributions may also focus on any aspect of music, e.g., rhythm, harmony, melody, counterpoint, instrumentation or timbre. We likewise set no constraints on the considered music’s style, genre, period or place of origin. However, the reported work must have adopted a computational approach that has led to an advancement in our scientific understanding of music.

We are also keen to cover a variety of application areas where music is put to use, not only for pure entertainment or artistic purposes, but also, for example, in healthcare, in rituals and ceremonies, in meditation, in film soundtracks or video games, or in politics or advertising. We welcome, in particular, contributions where a computational approach has been employed in conjunction with methodologies and knowledge from other fields, such as psychology, neuroscience, musicology, sociology, biology, physics, anthropology or ethnomusicology.

The deadline for submission of full manuscripts is Wednesday 31 August 2022. Full manuscripts must be submitted using the online submission system at https://mc.manuscriptcentral.com/mns and should follow the Submission Guidelines, which can be accessed at https://journals.sagepub.com/author-instructions/MNS.

Accepted papers in this collection will be published with open access by the end of May 2023.

Important dates

- Submission of first version of full manuscript: by Wednesday 31 August 2022
- First decision on manuscript sent to authors: by Wednesday 30 November 2022
- Submission of revised manuscripts: by Tuesday 28 February 2023
- Results of reviews of revised manuscripts sent to authors: by Tuesday 2 May 2023
- Publication of accepted papers online: by Wednesday 31 May 2023

Kind regards,

David Meredith
Department of Architecture, Design, and Media Technology, Aalborg University, Denmark

Anja Volk
Department of Information and Computing Sciences, Utrecht University, The Netherlands

Tom Collins
Department of Music, University of York, United Kingdom

