Research positions in visual and auditory modelling and psychophysics (Sue Denham)


Subject: Research positions in visual and auditory modelling and psychophysics
From:    Sue Denham  <sue(at)SOC.PLYM.AC.UK>
Date:    Sun, 30 Jun 2002 12:30:21 +0100

Plymouth Institute of Neuroscience

Applications are invited for the following positions: 2 postdoctoral research fellowships and 2 research studentships in visual and auditory modelling and psychophysics.

The open positions are part of a project, "Attend-to-learn and learn-to-attend with neuromorphic VLSI", funded by the European Community (IST-2001-38099), which combines perceptual, computational, and hardware research. In addition to the University of Plymouth, UK, the project also involves ETH Zurich and the University of Bern, Switzerland, the National Institute of Health, Rome, Italy, Siemens AG, Germany, and UC Davis, USA. Research at the Plymouth Institute of Neuroscience received a rating of 5 in the 2001 RAE exercise.

The aim of the project is to develop a general architecture for memory-guided attention using both software modelling and neuromorphic VLSI technology. The architecture will be tested on natural visual and auditory stimuli and its performance compared to that of human observers/listeners. Plymouth will be responsible for developing visual and auditory stimuli that are restricted enough to be tractable by an artificial system, yet rich enough for psychophysical tests of attention and learning in human observers/listeners.

Specific tasks will include:
a. designing feature spaces that efficiently encode natural images and sounds
b. conducting psychophysical experiments to characterise attention and learning with novel classes of synthetic images and sounds
c. developing computational models for visual and auditory feature saliency and feature tracking (see the sketch below)
d. providing performance benchmarks that are suitable for testing the models and hardware components developed by project partners
e. disseminating the results at international conferences.

Applicants for the postdoctoral research fellowships should have, or expect to obtain, a Ph.D. in Psychophysics, Computational Modelling, or a related field. Experience in information-theoretic analysis is also desirable.

Applicants for research studentships should have, or expect to obtain, a good honours degree in Psychology, Neuroscience, Physics, Computing, or a related field. The ideal candidate will possess good analytical, experimental, and computational skills.

All posts are available from 1 September 2002 for 3 years. Salaries will be internationally competitive and will reflect the successful candidate's qualifications.

For further information, please contact Prof. Jochen Braun (visual psychophysics and modelling, +44 1752 232 711, achim(at)pion.ac.uk) or Dr. Sue Denham (auditory psychophysics and modelling, +44 1752 232610, sue(at)pion.ac.uk).
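For readers unfamiliar with the saliency modelling named in task (c), the following minimal Python sketch shows one classical way such a model can be built, via centre-surround differencing in the spirit of Itti & Koch. This is not project code; the function name, sigma values, and normalisation are illustrative assumptions only.

    # Illustrative sketch: a minimal centre-surround intensity-saliency map.
    # All parameter choices here are assumptions, not the project's method.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def saliency_map(image, centre_sigma=1.0, surround_sigma=8.0):
        """Return a simple saliency map for a 2-D greyscale image."""
        img = image.astype(float)
        centre = gaussian_filter(img, centre_sigma)      # fine-scale blur
        surround = gaussian_filter(img, surround_sigma)  # coarse-scale blur
        # Centre-surround difference highlights locally distinctive regions.
        contrast = np.abs(centre - surround)
        # Normalise to [0, 1] so maps from different features are comparable.
        rng = contrast.max() - contrast.min()
        return (contrast - contrast.min()) / rng if rng > 0 else contrast

    # Usage example with a random image standing in for a natural stimulus.
    test_image = np.random.rand(128, 128)
    smap = saliency_map(test_image)
    print(smap.shape, smap.min(), smap.max())

In a full model, maps like this would be computed for several features (intensity, colour, orientation, or their auditory analogues) and combined into a single master saliency map that guides attention.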

