[AUDITORY] Postdoc Position in Auditory Neuro - University of Memphis (Gavin Bidelman)


Subject: [AUDITORY] Postdoc Position in Auditory Neuro - University of Memphis
From:    Gavin Bidelman  <gmbdlman@xxxxxxxx>
Date:    Wed, 27 Jun 2018 10:05:30 -0500
List-Archive: <http://lists.mcgill.ca/scripts/wa.exe?LIST=AUDITORY>

POSTDOCTORAL POSITION IN AUDITORY NEUROSCIENCE
UNIVERSITY OF MEMPHIS

The Auditory Cognitive Neuroscience Laboratory (ACNL) (http://www.memphis.edu/acnl/), located at the University of Memphis, invites applicants for a fully funded postdoctoral fellow position in human auditory neuroscience. The ACNL, directed by Dr. Gavin Bidelman, is affiliated with the Institute for Intelligent Systems (http://www.memphis.edu/iis/) and is housed in the School of Communication Sciences & Disorders (CSD) (http://www.memphis.edu/csd/). The position has an immediate start date, depending on applicant availability.

This position is part of a 5-year, NIH/NIDCD-funded R01 project that examines the central neural mechanisms and plasticity of speech perception, auditory categorization, and novel sound learning. The successful candidate will work in a highly interdisciplinary and collaborative team including Speech-Hearing Scientists, Electrical-Computer Engineers, and Cognitive Psychologists to spearhead neuroimaging research on the brain dynamics underlying complex auditory perception-cognition.

Experimental approaches include multichannel EEG, brainstem and cortical ERPs, eye-tracking, and lab vs. real-world auditory listening paradigms. Training opportunities include state-of-the-art techniques such as EEG source imaging, functional connectivity, and “big data science” approaches to decode electrical brain activity in association with auditory perceptual outcomes and individual differences in listening skills. The candidate will be expected to contribute to all stages of the research, including designing experiments, supervising PhD and AuD students for data collection, analyzing data, and disseminating results.

The lab is housed in a new building with state-of-the-art teaching, research, and clinical facilities (www.memphis.edu/chbuilding/). Additional information on research in the department can be found here: http://www.memphis.edu/csd/research/index.php.

Candidates should have a PhD in speech/hearing science, cognitive neuroscience, psychology, or a related field. Individuals with expertise in electrical engineering, signal processing, and MATLAB programming are particularly encouraged to apply. Previous background in human neuroimaging techniques (EEG, MRI) and behavioral speech testing is highly desirable but not required. Salary is commensurate with the NIH pay schedule.

To apply, complete an online application at https://workforum.memphis.edu/postings/19235 or search for position #021195. Applications should include a (1) cover letter describing research experience & interests, (2) CV, (3) representative publication reprints, and (4) contact information for three references. Inquiries can be directed to the project's PI: Gavin Bidelman, PhD [g.bidelman@xxxxxxxx]. EE/AA Employer.

Gavin M. Bidelman, PhD
Associate Professor
University of Memphis
Institute for Intelligent Systems
School of Comm. Sciences and Disorders
4055 North Park Loop
Memphis, TN 38152
901.678.5826 | www.memphis.edu/acnl


This message came from the mail archive
src/postings/2018/
maintained by:
DAn Ellis <dpwe@ee.columbia.edu>
Electrical Engineering Dept., Columbia University