[AUDITORY] Seminar Announcement - February 21 - E.A.R.S. (Electronic Auditory Research Seminars) (Vogler, Nathan)


Subject: [AUDITORY] Seminar Announcement - February 21 - E.A.R.S. (Electronic Auditory Research Seminars)
From:    "Vogler, Nathan"  <Nathan.Vogler@xxxxxxxx>
Date:    Thu, 16 Feb 2023 19:38:38 +0000

Dear fellow neuroscientists,

Please join us on Tuesday, February 21 at 1:00 pm EST (UTC-5) for the next edition of E.A.R.S. (Electronic Auditory Research Seminars), a monthly auditory seminar series focused on central auditory processing and circuits.

Speakers:

Sam Norman-Haignere (University of Rochester): "Neural integration in the human auditory cortex"

To derive meaning from sound, the brain must integrate information across many timescales spanning tens (e.g., phonemes) to hundreds (e.g., words) of milliseconds. Yet, identifying the integration timescales of auditory neural populations has been challenging, in part due to their complex, nonlinear tuning for natural sound structure. In this talk, I will describe a new method to estimate neural integration windows using natural stimuli (the temporal context invariance paradigm). Our method is conceptually simple and general, and thus applicable to virtually any brain region, sensory domain, or temporally precise recording modality. By applying this method to intracranial recordings from human neurosurgical patients, we have found that the human auditory cortex integrates hierarchically across diverse timescales spanning approximately 50 to 400 ms, with substantially longer integration windows in non-primary regions, bilaterally. Moreover, we have found that neural populations with short and long integration windows exhibit distinct functional properties: short-integration electrodes (less than ~200 ms) show prominent spectrotemporal modulation selectivity, while long-integration electrodes (greater than ~200 ms) show prominent category selectivity. These findings reveal how multiscale integration organizes auditory computation in the human brain.

Ross Williamson (University of Pittsburgh): "Brain-wide neural circuits for sensory-guided behavior"

Auditory-guided behavior is ubiquitous in everyday life, whenever auditory information is used to guide the decisions we make and the actions we take. One such behavior is auditory categorization, a process that reflects the ability to transform bottom-up sensory stimuli into discrete perceptual categories and use these perceptual categories to drive a subsequent action. Although this process is well-documented at the behavioral and cognitive levels, surprisingly little is known about the explicit neural circuit mechanisms that underlie categorical computation and how the result of this computation drives behavioral outcomes. We believe that the transformation of auditory information into an appropriate behavioral response is necessarily a brain-wide endeavor. The deep layers of the auditory cortex give rise to several massive projection systems that exert influence over many downstream brain areas. Here, I will discuss our efforts towards understanding how these distinct systems differentially contribute to auditory-guided behavior.

The seminar will be hosted on Zoom. You can access the seminars here: https://pennmedicine.zoom.us/j/95396120820. This link is also posted on our website: https://sites.google.com/view/ears2020/home. The E.A.R.S. subscriber list is the ears-seminar Google group, which you can join by emailing ears2022+subscribe@xxxxxxxx or by visiting the following link: https://groups.google.com/g/ears2022.

Additional upcoming E.A.R.S. seminars (1:00 pm ET):

* 03/14/2023: Professional Development session
* 04/18/2023: Trainee Talks
* 05/09/2023: Charles Anderson (West Virginia University) & Dayo Adewole (University of Pennsylvania)

With kind wishes,
Maria Geffen
Yale Cohen
Steve Eliades
Stephen David
Alexandria Lesicko
Nathan Vogler
Jean-Hugues Lestang
Huaizhen Cai


This message came from the mail archive
src/postings/2023/
maintained by:
DAn Ellis <dpwe@ee.columbia.edu>
Electrical Engineering Dept., Columbia University