Re: [AUDITORY] Seeking Advice on Auditory Models for Hearing Loss (Alexander Brewer)


Subject: Re: [AUDITORY] Seeking Advice on Auditory Models for Hearing Loss
From:    Alexander Brewer <afb8252@xxxxxxxx>
Date:    Sat, 14 Dec 2024 17:54:05 -0500

I have used the 3D Tune-In toolkit cited above in prior work, and it is quite robust for interactive use. For programmatic hearing loss simulation, I found myself relying on the various stock audiograms from the Clarity/Cadenza Challenge <https://cadenzachallenge.org/>:

https://github.com/claritychallenge/clarity/blob/main/clarity/utils/audiogram.py

You can see an example of how this might be applied to an audio array here:

https://github.com/hyve9/samplifi/blob/main/samplifi.py#L475

Regards,
Xander
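For anyone wanting a starting point, below is a minimal sketch of the idea, assuming the stock constant AUDIOGRAM_MODERATE and its frequencies/levels fields from the audiogram.py file linked above (names worth verifying against the current repo). It only applies the audiogram thresholds as straight per-band attenuation; the samplifi example linked above goes through the full ear simulation in the same package, which is the better route for realistic simulation.

    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    # Stock audiogram from the Clarity/Cadenza utilities (assumed constant name).
    from clarity.utils.audiogram import AUDIOGRAM_MODERATE


    def crude_hearing_loss(signal: np.ndarray, fs: int) -> np.ndarray:
        """Attenuate each audiogram band of a 1-D (mono) signal by its dB HL threshold.

        Crude illustration only: dB HL is treated as plain attenuation, with no
        loudness recruitment or spectral smearing as in a real simulator.
        """
        freqs = np.asarray(AUDIOGRAM_MODERATE.frequencies, dtype=float)
        levels = np.asarray(AUDIOGRAM_MODERATE.levels, dtype=float)

        # Band edges placed geometrically between audiogram frequencies.
        edges = np.sqrt(freqs[:-1] * freqs[1:])
        edges = np.concatenate(([0.0], edges, [fs / 2.0]))

        out = np.zeros_like(signal, dtype=float)
        for lo, hi, loss_db in zip(edges[:-1], edges[1:], levels):
            if lo >= fs / 2.0:          # band lies entirely above Nyquist
                continue
            hi = min(hi, fs / 2.0 * 0.999)
            if hi <= lo:
                continue
            if lo <= 0.0:
                sos = butter(4, hi, btype="lowpass", fs=fs, output="sos")
            else:
                sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
            band = sosfiltfilt(sos, signal)
            out += band * 10.0 ** (-loss_db / 20.0)   # threshold as attenuation
        return out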
On Sat, Dec 14, 2024 at 12:42 AM Richard F. Lyon <0000030301ff4bce-dmarc-request@xxxxxxxx> wrote:

> Consider CARFAC v2 for this. See the paper: https://arxiv.org/abs/2404.17490
>
> It's free, open-source, well documented, small, efficient, available in
> multiple languages, and can model sensorineural hearing loss via reduction
> of OHC function. We have a v3 to release soon that can also model
> synaptopathy.
>
> Dick
>
> On Thu, Dec 12, 2024 at 11:14 PM Sathish Kumar <sathish.sreeni58@xxxxxxxx> wrote:
>
>> Dear members of the auditory list,
>>
>> I've recently been working on behavioral auditory source localization
>> performance in hearing aid users. To support my behavioral findings, I am
>> exploring computational auditory models for various configurations of
>> hearing loss.
>>
>> Through my review of the literature, I found that the direction-of-arrival
>> models proposed by Dietz et al. (2011) and Kayser et al. (2015) are
>> suitable for my experiment, where interpretation is based on pre-estimated
>> IPD-to-azimuth mappings with respect to interaural coherence and
>> probability distributions. However, neither of these models accounts for
>> hearing loss characteristics.
>>
>> I would greatly appreciate suggestions for other auditory models that
>> might be suitable for my purpose. Additionally, I am wondering whether I
>> can replace the cochlear and basilar membrane processing steps in Dietz
>> et al. (2011) with hearing loss models. If anyone has experience with this
>> approach, I would be grateful for your advice.
>>
>> Thank you in advance for your suggestions!
>>
>> Best regards,
>> --
>> Sathish Kumar
>> PhD Research Scholar
>> Department of Audiology and SLP
>> Kasturba Medical College (MAHE), Mangalore
>> Mob: 9789447666 || E-mail: sathish.sreeni58@xxxxxxxx
>> LinkedIn: https://www.linkedin.com/in/sathish-kumar-0815121a9/
>> ResearchGate: https://www.researchgate.net/profile/Sathish-Kumar-163
>> GitHub: https://github.com/SATHish64103


This message came from the mail archive
postings/2024/
maintained by:
DAn Ellis <dpwe@ee.columbia.edu>
Electrical Engineering Dept., Columbia University