Bimodal Cochlear Implant Listeners' Ability to Perceive Minimal Audible Angle Differences.

Ashley Zaleski-King, Matthew J Goupell, Dragana Barac-Cikoja, Matthew Bakke
Author Information
  1. Ashley Zaleski-King: Department of Speech and Hearing Sciences, Gallaudet University, Washington, DC.
  2. Matthew J Goupell: Department of Hearing and Speech Sciences, University of Maryland College Park, College Park, MD.
  3. Dragana Barac-Cikoja: Department of Speech and Hearing Sciences, Gallaudet University, Washington, DC.
  4. Matthew Bakke: Department of Speech and Hearing Sciences, Gallaudet University, Washington, DC.

Abstract

BACKGROUND: Bilateral inputs should ideally improve sound localization and speech understanding in noise. However, for many bimodal listeners [i.e., individuals using a cochlear implant (CI) with a contralateral hearing aid (HA)], such bilateral benefits are, at best, inconsistent. The degree to which clinically available HA and CI devices can function together to preserve interaural time and level differences (ITDs and ILDs, respectively) well enough to support sound-source localization is a question with important ramifications for speech understanding in complex acoustic environments.
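
For a sense of the time scales at stake (an illustration, not a value from this study): under Woodworth's spherical-head approximation, the largest naturally occurring ITD for an average adult head (radius r ≈ 0.0875 m, speed of sound c ≈ 343 m/sec) is on the order of 0.7 msec:

  \mathrm{ITD}(\theta) \approx \frac{r}{c}\left(\theta + \sin\theta\right), \qquad \mathrm{ITD}\!\left(\frac{\pi}{2}\right) \approx \frac{0.0875}{343}\left(\frac{\pi}{2} + 1\right) \approx 0.66\ \text{msec},

where θ is the source azimuth in radians. Interdevice processing-delay mismatches of comparable or greater magnitude can therefore distort or overwhelm the natural ITD cue.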
PURPOSE: To determine whether bimodal listeners are sensitive to changes in spatial location in a minimum audible angle (MAA) task.
RESEARCH DESIGN: Repeated-measures design.
STUDY SAMPLE: Seven adult bimodal CI users (28-62 years). All listeners reported regular use of digital HA technology in the nonimplanted ear.
DATA COLLECTION AND ANALYSIS: Listeners were first asked to balance the loudness of prerecorded single-syllable utterances across the two ears. The loudness-balanced stimuli were then presented via the direct audio inputs of the two devices with an ITD applied, and the listener's lateralization judgments were used to determine the perceived difference in processing delay (the interdevice delay [IDD]) between the CI and HA. Finally, virtual free-field MAA performance was measured at different spatial locations, both with and without an IDD correction intended to perceptually synchronize the devices.
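
As an aside on mechanics (an illustrative sketch, not the authors' stimulus code): imposing an ITD, or compensating for an IDD, amounts to delaying one channel of a two-channel stimulus by a fixed number of samples. The function name and parameter values below are assumptions for demonstration only.

  # Illustrative sketch: impose a fixed delay on one channel of a
  # two-channel stimulus (e.g., to apply an ITD or offset an IDD).
  import numpy as np

  def apply_delay(signal, delay_ms, fs=44100, delay_left=True):
      """Return a 2-column (left, right) array with one channel
      delayed by delay_ms milliseconds via zero-padding."""
      n = int(round(delay_ms * 1e-3 * fs))  # delay in whole samples
      delayed = np.concatenate([np.zeros(n), signal])
      reference = np.concatenate([signal, np.zeros(n)])  # pad to equal length
      channels = (delayed, reference) if delay_left else (reference, delayed)
      return np.column_stack(channels)

  # e.g., delay one channel by 0.7 msec, near the IDDs reported below
  tone = np.sin(2 * np.pi * 500 * np.arange(0, 0.3, 1 / 44100))
  stimulus = apply_delay(tone, delay_ms=0.7)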
RESULTS: During the loudness-balancing task, all listeners required increased acoustic input to the HA, relative to the CI's most comfortable level, to achieve equal interaural loudness. During the ITD task, three listeners could perceive changes in intracranial position, distinguishing sounds in the left hemifield from those in the right; when the CI was delayed by 0.73, 0.67, or 1.7 msec (one value per listener), the signal lateralized from one side to the other. When MAA localization performance was assessed, only three of the seven listeners consistently achieved above-chance performance, even when an IDD correction was included. It is not clear whether the listeners who consistently completed the MAA task did so via binaural comparison or by extracting monaural loudness cues. Four listeners could not perform the MAA task, even though a monaural loudness cue strategy was available to them.
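
For context on the above-chance criterion (an illustrative convention, not the study's reported analysis): in a two-alternative left/right discrimination task, performance is commonly tested against chance (0.5) with a one-sided binomial test. A minimal sketch, with an assumed trial count:

  # Illustrative sketch: smallest number of correct responses that
  # exceeds chance (p = 0.5) at the given alpha in a 2-alternative task.
  from scipy.stats import binom

  def min_correct_above_chance(n_trials, chance=0.5, alpha=0.05):
      for k in range(n_trials + 1):
          if binom.sf(k - 1, n_trials, chance) < alpha:  # P(X >= k)
              return k
      return None

  # e.g., with an assumed 50 trials, >= 32 correct (64%) beats chance
  print(min_correct_above_chance(50))  # -> 32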
CONCLUSIONS: These data suggest that sound localization is extremely difficult for most bimodal listeners, and that this difficulty does not appear to be caused by large loudness imbalances or IDDs. Sound localization is best performed via a binaural comparison, in which frequency-matched inputs convey ITD and ILD information. Although low-frequency acoustic amplification from a HA, combined with a CI, may produce an overlapping region of frequency-matched inputs and thus an opportunity for binaural comparison in some bimodal listeners, our results suggest that this overlap may not benefit spatial location discrimination tasks. The inability of our listeners to use monaural level cues to perform the MAA task further highlights the difficulty of using a HA and CI together to glean information about the direction of a sound source.

Grants

  1. R01 DC014948/NIDCD NIH HHS

MeSH Terms

Adult
Auditory Perception
Cochlear Implants
Deafness
Female
Hearing Aids
Humans
Male
Middle Aged
Sound Localization
