Facial Expressions as an Index of Listening Difficulty and Emotional Response.

Soumya Venkitakrishnan, Yu-Hsiang Wu
Author Information
  1. Soumya Venkitakrishnan: Department of Communication Sciences and Disorders, California State University, Sacramento, California.
  2. Yu-Hsiang Wu: Department of Communication Sciences and Disorders, University of Iowa, Iowa City, Iowa.

Abstract

Knowledge about the listening difficulty experienced during a task can be used to better understand speech perception processes, to guide amplification outcomes, and by individuals to decide whether to participate in communication. Another factor affecting these decisions is an individual's emotional response, which has not previously been measured objectively. In this study, we describe a novel method of measuring the listening difficulty and affect of individuals in adverse listening situations using an automatic facial expression algorithm. The purpose of our study was to determine whether facial expressions of confusion and frustration are sensitive to changes in listening difficulty. We recorded speech recognition scores, facial expressions, subjective listening effort scores, and subjective emotional responses in 33 young participants with normal hearing. We varied the difficulty level using signal-to-noise ratios of -1, +2, and +5 dB SNR as well as a quiet condition. We found that facial expressions of confusion and frustration increased as overall difficulty increased, but not with each change in level. We also found a relationship between facial expressions and both subjective emotion ratings and subjective listening effort. Emotional responses in the form of facial expressions show promise as a measure of affect and listening difficulty. Further research is needed to determine the specific contribution of affect to communication in challenging listening environments.
