
Subject: Auditory Illusions



Dear Nedra,
I have a slightly different understanding in the regard of auditory illusions such as the Mc Gurk effect not contributing to hearing aid design. I think these illusions may have more to contribute to not only hearing aid design but also to psycho-acoustics. One of the key uses of the Mc Gurk effect was in terms of studying the re-organisation/remapping process in the auditory and visual cortex which has been recently reported due to auditory deprivation (deafness) in hearing impaired children and the changes to the same with cochlear implantation and I guess the same would apply to hearing aids as well (especially the ones with severe hearing loss). One of the other interesting illusions (although not purely an auditory illusion â or maybe with the current discussion that I have noticed in the list may be not even an illusion) I remember working with hearing aid and cochlear implanted users is the Kiki-Bouba effect which helps us study the sound-shape correspondences in the brain. One of the key missing link I think in the current wave of hearing aid technology which is mainly driven via psycho-acoustical difficulties that a hearing impaired user faces is the overall auditory processing (central auditory processing as well as Auditory- visual processing) difficulties that a hearing loss impinges along-with the lower level psycho-acoustic deficits like poor frequency selectivity, temporal resolution etc. Maybe broadening the definition of psycho-acoustics a bit these aspects may also be well under its domain. The subtle but existing difference between two hearing aid users with the same type of hearing loss characteristics may well be in terms of their central auditory visual processing abilities which might have been differently affected by the loss (although there may be alternative aspects such as difference in some relatively less explored low level psycho-acoustic abilities adding to it). In a nutshell what I want to express is that studying these illusions is to somehow ensure that we reach limitations which are imposed by an impaired ear and not those imposed by inadequate amplification systems and may be try to find out and compensate for the additional auditory-visual processing/integration limitations that the hearing loss has added using appropriate rehabilitative/training measures.

I don't know why those strange characters appeared in my previous post on this topic, so I have attached that previous message at the end as a paragraph for better readability. Apologies to all for any inconvenience caused.
 
Although I am not an expert in this area, I have done some preliminary (unpublished) work on the use of the McGurk effect for the evaluation of current-day multichannel digital hearing aids. A few of the basic assumptions behind the rationale and implications were as follows. One of the main benefits of audio-visual integration is in difficult listening situations, when either the speech is degraded or there is background noise. The visual cue in an auditory-visual integration task is relatively unaffected by noise. Hearing-impaired listeners, especially those with relatively long periods of auditory deprivation, rely more on visual cues in speech perception than normal-hearing listeners (the AV balance is tilted slightly towards visual dominance), making them better speech readers but relatively poorer AV integrators. The psychoacoustical consequences of sensorineural hearing loss suggest a reduced ability to perceive certain classes of speech sounds, especially in the presence of noise. Moreover, some speech sounds may not be heard in noisy environments no matter how intensely they are amplified by the hearing aids. Digital multichannel hearing aids may also introduce some internal distortion and delay due to digital processing and filtering. In such scenarios the hearing aid user may benefit from the visual cues provided by a speaker's facial and lip movements, which should be relatively unaffected by noise and thus boost speech perception under energetic masking at poor signal-to-noise ratios as well as under informational masking.

We presented hearing aid users (a relatively homogeneous group in terms of the hearing aids used) with McGurk stimuli in both congruent and incongruent conditions, in quiet and in noise (three different SNRs), at a comfortable level at which the subjects scored > 70% on a screening test using PB words. A criterion of 3/4 fusion responses was used to determine the presence of the McGurk effect. The results indicated that subjects with normal hearing performed better than the subjects using HAs in all conditions. The simultaneous presence of auditory and visual information in the congruent condition was beneficial for speech perception in quiet and in noise. Noise reduced the subjects' ability to perceive speech at poor SNRs and had a more severe impact on the performance of the HA users than on the normal-hearing subjects. The McGurk effect was absent in the HA users at the poorer SNRs compared to normal listeners, and in these conditions the HA users' responses to the McGurk stimuli were mainly visually dominated.

Background noise and increased listening effort are significant factors influencing hearing-aid satisfaction and are among the major reasons for rejection of HAs. Testing the McGurk effect in a noisy environment may be a useful way to understand auditory-visual speech perception in HA users and to verify the benefits of aided AV speech perception in noise. Some of the implications for hearing aid rehabilitation may be in terms of optimising HA fitting to achieve not only good auditory perception in noise but also good auditory-visual perception in noise, and an emphasis on auditory training and the use of speech-reading skills. The study mentioned above was by no means free of limitations, such as a small sample size, and the hearing aids were left at the programs the HA users wore in their everyday listening environments, so all the users had different program settings.
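In case it helps make the scoring concrete, here is a minimal sketch of how the 3/4 fusion-response criterion described above could be applied per subject and condition. This is only an illustration of the criterion, not the analysis code from the study; the condition names, SNR labels, and response categories ("auditory", "visual", "fusion") are my own assumptions for the example.

from collections import defaultdict

# Each trial: (subject_id, condition, response), where response is the
# category of the listener's report to an incongruent McGurk stimulus.
# These example trials are invented for illustration only.
trials = [
    ("S01", "quiet", "fusion"),
    ("S01", "quiet", "fusion"),
    ("S01", "quiet", "fusion"),
    ("S01", "quiet", "auditory"),
    ("S01", "SNR_0dB", "visual"),
    ("S01", "SNR_0dB", "visual"),
    ("S01", "SNR_0dB", "fusion"),
    ("S01", "SNR_0dB", "visual"),
]

FUSION_CRITERION = 0.75  # McGurk effect counted as present at >= 3/4 fusion responses


def mcgurk_presence(trials):
    """Return {(subject, condition): True/False} for presence of the McGurk effect."""
    counts = defaultdict(lambda: [0, 0])  # maps (subject, condition) -> [fusion, total]
    for subject, condition, response in trials:
        counts[(subject, condition)][0] += response == "fusion"
        counts[(subject, condition)][1] += 1
    return {
        key: fusion / total >= FUSION_CRITERION
        for key, (fusion, total) in counts.items()
    }


if __name__ == "__main__":
    for (subject, condition), present in mcgurk_presence(trials).items():
        print(f"{subject} / {condition}: McGurk effect {'present' if present else 'absent'}")

With these invented responses, S01 would be classified as showing the McGurk effect in quiet (3/4 fusion responses) but not at the poorer SNR, where the responses are visually dominated, which mirrors the pattern described above.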
Maybe the rationale and implications might be of some interest to you, though.
 
Regards,
Imran Dhamani