Re: [AUDITORY] Seeking advice on using ANF firing rate to resolve front-back confusion in sound localization model (Jonathan Z Simon)


Subject: Re: [AUDITORY] Seeking advice on using ANF firing rate to resolve front-back confusion in sound localization model
From:    Jonathan Z Simon  <jzsimon@xxxxxxxx>
Date:    Fri, 28 Feb 2025 05:27:51 -0500

To continue on the benefits of even a small amount of head rotation, we were able to demonstrate that the brain can indeed learn its own head's entire HRTF by making use of even tiny rotations.

Aytekin, M., C. F. Moss, and J. Z. Simon (2008). A Sensorimotor Approach to Sound Localization. Neural Computation, 20, 603-635. http://dx.doi.org/10.1162/neco.2007.12-05-094

PDF at http://cansl.isr.umd.edu/simonlab/pubs/SensoryMotorSoundLocalization.pdf

Jonathan

--
Jonathan Z. Simon (he/him)
University of Maryland
Dept. of Electrical & Computer Engineering / Dept. of Biology / Institute for Systems Research
8223 Paint Branch Dr.
College Park, MD 20742 USA
Office: 1-301-405-3645, Lab: 1-301-405-9604, Fax: 1-301-314-9281
http://www.isr.umd.edu/Labs/CSSL/simonlab/

On Feb 28, 2025, at 12:13 AM, Richard F. Lyon <0000030301ff4bce-dmarc-request@xxxxxxxxgill.ca> wrote:
Qin,

The rate-vs-place profiles from the two ears may have most of what you need to supplement the MSO's output that represents ITD, which is mostly a left-right cue. The cues for elevation, including front-back, are generally thought to be more subtle spectral features, related to the individual's HRTF, and are not as robust as the ITD cues. ILD cues are of intermediate robustness, I think, but still primarily left-right.

I hadn't thought about doing what Jan Schnupp suggested, looking at slightly different cones of confusion for different frequencies, but that sounds like another way to conceptualize the subtle HRTF-dependent spectral cues.

So you don't have to use "HRTF template matching", but you do have to use HRTFs.
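For concreteness, here is a minimal Python sketch of that rate-vs-place comparison. The filterbank, band spacing, and RMS-as-rate proxy are illustrative assumptions, not anything specified in this thread: it reduces each ear's signal to a per-band energy profile and takes the difference as a band-wise ILD.

import numpy as np
from scipy.signal import butter, sosfiltfilt

def rate_place_profiles(left, right, fs, n_bands=16, f_lo=500.0, f_hi=8000.0):
    """Per-band RMS level for each ear, a crude stand-in for an ANF
    rate-vs-place profile, plus the band-wise ILD in dB.
    Requires fs comfortably above 2 * f_hi (e.g. 44.1 kHz)."""
    centers = np.geomspace(f_lo, f_hi, n_bands)      # assumed grid of CFs
    prof = np.empty((2, n_bands))
    for i, cf in enumerate(centers):
        band = [cf / 2 ** 0.25, cf * 2 ** 0.25]      # half-octave band
        sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
        for ch, x in enumerate((left, right)):
            prof[ch, i] = np.sqrt(np.mean(sosfiltfilt(sos, x) ** 2))
    ild_db = 20.0 * np.log10(prof[0] / prof[1])      # > 0: louder at left ear
    return centers, prof, ild_db

The ild_db output is the left-right cue; the front-back information, per the above, hides in how each profile's spectral shape is colored by the listener's HRTF, so some HRTF knowledge still has to enter the comparison.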
If you want to do this in anything like the real world, as opposed to an anechoic environment, you'll need a strong precedence effect to pay attention to the first arrival and ignore echoes, or something along those lines.

Also, in the real world, we usually resolve front-back confusion quickly and easily by rotating our heads a little. The effect of rotation on ITD is opposite for front vs. back, so this gives a very robust front-back cue; up-down is still hard.

Dick
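To make the rotation cue concrete, here is a small self-contained Python sketch. The spherical-head, Woodworth-style ITD model, the head radius, and the sign conventions are assumptions chosen purely for illustration: sources at 10 degrees (front) and 170 degrees (back) produce identical ITDs, but a 5 degree head turn pushes the two ITDs in opposite directions.

import numpy as np

HEAD_RADIUS = 0.0875     # m, assumed rigid spherical head
SPEED_OF_SOUND = 343.0   # m/s

def itd(az_deg):
    """Woodworth-style ITD in seconds, positive for sources to the right.
    It depends only on the lateral angle, so azimuths az and 180 - az
    give identical values: the cone of confusion."""
    lateral = np.arcsin(np.sin(np.radians(az_deg)))
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (lateral + np.sin(lateral))

def front_or_back(itd_before, itd_after, rot_deg):
    """Turning the head right (rot_deg > 0) moves a front source toward
    the head's left (ITD shrinks) and a rear source toward its right
    (ITD grows), so the sign of the ITD change resolves front vs. back."""
    return "front" if (itd_after - itd_before) * rot_deg < 0 else "back"

for az in (10.0, 170.0):          # two sources on the same cone of confusion
    rot = 5.0                     # small head turn to the right, in degrees
    print(az, front_or_back(itd(az), itd(az - rot), rot))
    # prints: 10.0 front, then 170.0 back

Near the interaural axis the ITD change vanishes, and rotation about a single axis says nothing about elevation, consistent with up-down remaining hard.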
On Wed, Feb 26, 2025 at 4:21 PM Qin Liu <000003c563e12bd3-dmarc-request@xxxxxxxx> wrote:

Dear auditory list,

I am currently working on a project involving sound localization using firing rates from auditory nerve fibers (ANFs) and the medial superior olive (MSO). However, I have encountered an issue: using MSO firing rates alone, I can distinguish left from right but not front from back.

I am considering whether ANF firing rates might provide a solution, but I am uncertain how to utilize them effectively. For instance, I have experimented with analyzing the positive gradients of ANF firing rates but have not yet achieved meaningful results.

Could anyone suggest an auditory metric derived from binaural signals, ANF firing rates, or the MSO that could classify front/back sources without relying on HRTF template matching? Any insights or alternative approaches would be invaluable to my work.

Thank you in advance. I sincerely appreciate any guidance you can offer.

Best regards,

Qin Liu
Doctoral Student
Laboratory of Wave Engineering, École Polytechnique Fédérale de Lausanne (EPFL)
Email: qin.liu@xxxxxxxx


This message came from the mail archive postings/2025/
maintained by:
DAn Ellis <dpwe@ee.columbia.edu>
Electrical Engineering Dept., Columbia University