Subject: REMINDER: DEADLINE 31 Jan 2012 for digital music PhD studentships at Queen Mary University of London
From: Marcus Pearce <marcus.pearce@xxxxxxxx>
Date: Sat, 28 Jan 2012 12:47:08 +0000
List-Archive: <http://lists.mcgill.ca/scripts/wa.exe?LIST=AUDITORY>

Quick reminder: the DEADLINE for C4DM PhD studentships is *next Tuesday, 31 Jan 2012*.

-- ----- - -- ---- --- --- - - ------ - -

*Funded PhD studentships at the Centre for Digital Music <http://www.elec.qmul.ac.uk/digitalmusic/>, Queen Mary University of London* (details also here <http://www.eecs.qmul.ac.uk/%7Eeniale/c4dm-studentships.html>). NB: Studentships marked with an asterisk (*) have UK residency <http://www.epsrc.ac.uk/funding/students/pages/eligibility.aspx> requirements.

*Mathematical Models for Musical Prosodic Gestures <http://www.eecs.qmul.ac.uk/phd/docs/Mathematical_models_for_musical_prosodic_gestures-chewbandtlow.pdf>*
*Supervisor:* Prof. Elaine Chew (Centre for Digital Music)
*Co-supervisor:* Dr. Oscar Bandtlow (School of Mathematical Sciences)
*Application deadline:* 31st January 2012

In music performance studies, prosody is the musician-specific timing, stress, and sometimes intonation added when interpreting a notated score. Mid- to high-level musical prosodic gestures, for example tempo trajectories, often evoke parallels in the physical world, such as a damped oscillator. This project seeks to identify and mathematically model such gestures. The mathematical descriptors will form the basis for a vocabulary of prosodic gestures for music.

*Ensemble Interaction Over Distance <http://www.eecs.qmul.ac.uk/phd/docs/Ensemble_interaction_over_distance-chewhealey-my.pdf>*
*Supervisor:* Prof. Elaine Chew (Centre for Digital Music)
*Co-supervisor:* Prof. Patrick Healey (Interaction Media and Communication)
*Application deadline:* 31st January 2012

When a small group of musicians negotiates the shaping and execution of a collective interpretation during performance (i.e. in real time), the communication is non-verbal; some of the cues are embedded in the musical prosody, and some are demonstrated through gestures. This project aims to capture, analyze, quantify, and model the cues necessary for effective and engaging ensemble performance, by studying both co-located and distributed (over the Internet) ensembles.

*Adaptive, personalised digital musical instruments <http://www.eecs.qmul.ac.uk/phd/docs/Adaptive_personalized_digital_music_instruments-mcphersonchew.pdf>*
*Supervisor:* Dr. Andrew McPherson (Centre for Digital Music)
*Co-supervisor:* Prof. Elaine Chew (Centre for Digital Music)
*Application deadline:* 31st January 2012

A performer can take decades to learn a musical instrument. This studentship will focus on creating instruments that learn the capabilities and artistic preferences of the individual performer, with a particular focus on the relationship between physical gesture and sound production. The successful candidate will develop intelligent gesture-sound mapping strategies that update dynamically based on feedback from the performer. User studies with professional and amateur musicians will be integral to all stages of the project, and the successful candidate will take a leading role in designing and conducting these studies.
The project aims to make performance more accessible to beginning musicians while enabling new modes of expression for experts.

*Probabilistic Modelling of Temporal Expectations in Music <http://www.eecs.qmul.ac.uk/phd/docs/PhDadvertpearcepurver.pdf>*
*Supervisor:* Dr. Marcus Pearce (Centre for Digital Music)
*Co-supervisor:* Dr. Matthew Purver (Interaction Media and Communication)
*Application deadline:* 31st January 2012

The project's goal is to construct and evaluate computational models of human temporal expectation. It involves developing probabilistic models of temporal prediction that take representational account of rhythm and metre. The models and their parameters will be optimised to maximise prediction performance and compared to human temporal expectations in empirical studies of listeners.

*Intelligent Interfaces for Accelerating Intermediate Piano Learning <http://www.eecs.qmul.ac.uk/phd/docs/PhDadvert-Intelligentinterfacesforpianolearning.pdf>* (*)
*Supervisor:* Prof. Elaine Chew (Centre for Digital Music)
*Sponsoring Company:* Yamaha R&D Centre London
*Application deadline:* 31st January 2012

Ubiquitous access to digital music, together with the hours of practice required to master new pieces, has led to a decline in amateur instrument playing. Cognisance of musical structure can facilitate planning and sequence production, and enhance the pleasure of music making. Machine intelligence can help diagnose areas of difficulty and offer targeted, constructive assistance. The studentship will propose and evaluate score-based visualisations of music structure and gesture that accelerate mastery of intermediate piano pieces by young and adult learners. Candidates should be proficient at programming, experienced with user interface design, have some background in statistics, and possess at least amateur-level piano playing ability.

*Semantic Audio: bringing audio signal analysis together with future internet technologies <http://www.eecs.qmul.ac.uk/phd/docs/semanticAudioSandler.pdf>* (*)
*Supervisor:* Prof. Mark Sandler (Centre for Digital Music)
*Sponsoring Company:* Focusrite/Novation
*Application deadline:* 31st January 2012

The project is concerned with analysing musical content where it is created (typically in studios), affording a much cleaner computer representation of the musicological information in the music, which can then be used both to enhance consumer experiences and to improve recording studio practices. We base the representation on RDF and ontologies, the technologies that underpin Open Data, the Semantic Web, and the Internet of Things. We have collaborated with organisations such as the BBC, MusicBrainz, and the British Library in developing these principles.

Content recommendation and discovery (e.g. for music, film, and TV) is reaching a level of maturity (for example, last.fm and Genius). But today, these content descriptions and semantics are derived from the /finished product/ (e.g. CD, MP3, DVD). The research question explored in this PhD relies on performing the audio signal content analysis at the /point of content creation/. Using ontologies and RDF (the Resource Description Framework, a graph-based data model commonly serialised as XML) enables many new user modalities, for example new and complex user queries of the form "find songs in A minor, with lead and rhythm guitars, less than 2m30s, and a tempo that modulates between 90 and 120 beats per minute" (a rough sketch of such a query follows below).
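As an informal illustration (not part of the official advert), the sketch below shows how a structured query of that kind might be expressed over RDF music metadata in Python using the rdflib library. The ex: namespace and its property names (ex:Song, ex:key, ex:durationSeconds) are hypothetical stand-ins for a real music ontology, chosen only to make the example self-contained:

    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import RDF

    # Hypothetical vocabulary for illustration only; a real deployment
    # would draw its terms from an established music ontology.
    EX = Namespace("http://example.org/music#")

    # Build a tiny in-memory graph describing one made-up track.
    g = Graph()
    track = URIRef("http://example.org/tracks/1")
    g.add((track, RDF.type, EX.Song))
    g.add((track, EX.key, Literal("A minor")))
    g.add((track, EX.durationSeconds, Literal(142)))

    # "Find songs in A minor shorter than 2m30s (150 s)" as SPARQL.
    results = g.query("""
        PREFIX ex: <http://example.org/music#>
        SELECT ?song WHERE {
            ?song a ex:Song ;
                  ex:key "A minor" ;
                  ex:durationSeconds ?d .
            FILTER (?d < 150)
        }
    """)
    for row in results:
        print(row.song)

The same pattern would extend to the richer queries described above (instrumentation, tempo trajectories) once those descriptors are extracted at the point of content creation.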
Not only can content semantics enhance the listener/consumer experience, they can also enhance the workflow in the recording studio. This is where Focusrite's interest lies, especially integration with OSC.

------

In addition, there are 10 (*) + 1 or 2 funded studentships through the *Media and Arts Technology* <http://www.mat.qmul.ac.uk/scholarships.html> doctoral training centre. Limited funding may be available for other PhD research at the *Centre for Digital Music* <http://www.elec.qmul.ac.uk/digitalmusic/phdstudy.html>.

Marcus Pearce
--
Lecturer in Sound and Music Processing
Centre for Digital Music
School of Electronic Engineering & Computer Science
Queen Mary, University of London
Mile End Road, London E1 4NS, UK
Tel: +44 (0)20 7882 5352
http://webprojects.eecs.qmul.ac.uk/marcusp

MSc in Digital Music Processing http://bit.ly/zb21lm
MSc in Digital Signal Processing http://bit.ly/zdZ8cP
BEng / MEng in Audio Systems Engineering http://bit.ly/zL1TDS

Ends of Audience Wkshp 30-31 May http://bit.ly/zHYDdj ** Deadline 30 Jan 2012 **
CMMR 2012 Conference 19-22 Jun http://bit.ly/xJdUIa ** Deadline 1 Feb 2012 **