Frequency change detection and speech perception in cochlear implant users.

Fawen Zhang, Gabrielle Underwood, Kelli McGuire, Chun Liang, David R Moore, Qian-Jie Fu
Author Information
  1. Fawen Zhang: Department of Communication Sciences and Disorders, University of Cincinnati, Ohio, USA. Electronic address: Fawen.Zhang@uc.edu.
  2. Gabrielle Underwood: Department of Communication Sciences and Disorders, University of Cincinnati, Ohio, USA.
  3. Kelli McGuire: Department of Communication Sciences and Disorders, University of Cincinnati, Ohio, USA.
  4. Chun Liang: Department of Communication Sciences and Disorders, University of Cincinnati, Ohio, USA; Shenzhen Maternity & Child Healthcare Hospital, Shenzhen, China.
  5. David R Moore: Communication Sciences Research Center, Cincinnati Children's Hospital Medical Center, Cincinnati, OH, USA; Department of Otolaryngology, University of Cincinnati, Ohio, USA.
  6. Qian-Jie Fu: Department of Head and Neck Surgery, University of California, Los Angeles, Los Angeles, CA, USA.

Abstract

Dynamic frequency changes in sound provide critical cues for speech perception. Most previous studies examining frequency discrimination in cochlear implant (CI) users have employed behavioral tasks in which target and reference tones (differing in frequency) are presented statically in separate time intervals. Participants are required to identify the target frequency by comparing stimuli across these time intervals. However, perceiving dynamic frequency changes in speech requires detection of within-interval frequency change. This study explored the relationship between detection of within-interval frequency changes and speech perception performance in CI users. Frequency change detection thresholds (FCDTs) were measured in 20 adult CI users using a 3-alternative forced-choice (3AFC) procedure. Stimuli were 1-s pure tones (base frequencies of 0.25, 1, and 4 kHz) with frequency changes occurring 0.5 s after tone onset. Speech tests were 1) Consonant-Nucleus-Consonant (CNC) monosyllabic word recognition, 2) Arizona Biomedical Sentence Recognition (AzBio) in Quiet, 3) AzBio in Noise (AzBio-N, +10 dB signal-to-noise ratio, SNR), and 4) Digits-in-Noise (DIN). Participants' subjective satisfaction with the CI was also obtained. Results showed that correlations between FCDTs and speech perception were all statistically significant. Satisfaction with CI use was not related to FCDTs after controlling for major demographic factors. DIN speech reception thresholds were significantly correlated with AzBio-N scores. The current findings suggest that the ability to detect within-interval frequency changes may play an important role in the speech perception performance of CI users. FCDT and DIN can serve as simple and rapid tests that require no or minimal linguistic background for the prediction of CI speech outcomes.
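The FCDT stimulus described above (a 1-s pure tone whose frequency steps upward partway through) can be sketched as follows. This is an illustrative reconstruction, not the authors' stimulus code; the sampling rate, function name, and phase-continuous step implementation are assumptions for the sketch.

```python
import numpy as np

def fcdt_stimulus(base_hz, delta_hz, fs=44100, dur=1.0, change_at=0.5):
    """Illustrative FCDT-style stimulus: a 1-s pure tone whose frequency
    steps from base_hz to base_hz + delta_hz at change_at seconds.
    The phase is kept continuous across the step by integrating the
    instantaneous frequency (avoids an audible click at the transition)."""
    t = np.arange(int(dur * fs)) / fs
    inst_hz = np.where(t < change_at, base_hz, base_hz + delta_hz)
    phase = 2 * np.pi * np.cumsum(inst_hz) / fs  # integrate frequency -> phase
    return np.sin(phase)

# 1-kHz base frequency with a hypothetical 20-Hz upward change at 0.5 s
tone = fcdt_stimulus(1000.0, 20.0)
```

In a 3AFC trial, two intervals would contain a steady tone (`delta_hz = 0`) and one the changing tone; an adaptive rule on `delta_hz` would then track the detection threshold.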

Grants

  1. R01 DC004792/NIDCD NIH HHS
  2. R15 DC011004/NIDCD NIH HHS
  3. R15 DC016463/NIDCD NIH HHS

MeSH Term

Acoustic Stimulation
Adult
Aged
Aged, 80 and over
Audiometry, Pure-Tone
Auditory Threshold
Cochlear Implants
Deafness
Female
Humans
Male
Middle Aged
Patient Satisfaction
Pitch Discrimination
Psychoacoustics
Signal-To-Noise Ratio
Speech Acoustics
Speech Perception
Young Adult
