Eye Movements During Visual Speech Perception in Deaf and Hearing Children.

Elizabeth Worster, Hannah Pimperton, Amelia Ralph-Lewis, Laura Monroy, Charles Hulme, Mairéad MacSweeney
Author Information
  1. Elizabeth Worster: Institute of Cognitive Neuroscience, University College London.
  2. Hannah Pimperton: Institute of Cognitive Neuroscience, University College London.
  3. Amelia Ralph-Lewis: Deafness, Cognition and Language Research Centre, University College London.
  4. Laura Monroy: Institute of Cognitive Neuroscience, University College London.
  5. Charles Hulme: University of Oxford.
  6. Mairéad MacSweeney: Institute of Cognitive Neuroscience, University College London.

Abstract

For children who are born deaf, lipreading (speechreading) is an important source of access to spoken language. We used eye tracking to investigate the strategies used by deaf (n = 33) and hearing (n = 59) 5- to 8-year-olds during a sentence speechreading task. The proportion of time spent looking at the mouth during speech correlated positively with speechreading accuracy. In addition, all children showed a tendency to watch the mouth during speech and to watch the eyes when the model was not speaking. The extent to which the children used this communicative pattern, which we refer to as social-tuning, positively predicted their speechreading performance, with the deaf children showing a stronger relationship than the hearing children. These data suggest that better speechreading skills are seen in those children, both deaf and hearing, who are able to guide their visual attention to the appropriate part of the image and who have a good understanding of conversational turn-taking.
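As a concrete illustration of the gaze measure described above, the sketch below shows one way a per-child proportion of speaking-time looks to the mouth could be computed from AOI-coded eye-tracking samples and correlated with speechreading accuracy. This is a minimal Python sketch, not the authors' analysis code; the data frame layout, the column names (child_id, aoi, speaking, accuracy), and all values are assumptions made for illustration.

```python
# Illustrative sketch only (not the authors' analysis pipeline).
# Assumed inputs: one eye-tracking sample per row, coded for a hypothetical
# "mouth"/"eyes" area of interest (AOI) and for whether the model was speaking,
# plus one speechreading accuracy score per child. All names and values are made up.
import pandas as pd
from scipy.stats import pearsonr

samples = pd.DataFrame({
    "child_id": [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "aoi":      ["mouth", "mouth", "eyes",
                 "mouth", "eyes",  "eyes",
                 "mouth", "mouth", "mouth",
                 "eyes",  "eyes",  "mouth"],
    "speaking": [True] * 12,  # keep only speaking-time samples for this measure
})

scores = pd.DataFrame({
    "child_id": [1, 2, 3, 4],
    "accuracy": [0.66, 0.42, 0.80, 0.35],  # proportion of sentences speechread correctly
})

# Proportion of speaking-time samples that fell on the mouth AOI, per child.
speaking = samples[samples["speaking"]]
mouth_prop = (speaking.assign(on_mouth=speaking["aoi"].eq("mouth"))
                      .groupby("child_id")["on_mouth"]
                      .mean()
                      .rename("mouth_proportion")
                      .reset_index())

merged = mouth_prop.merge(scores, on="child_id")
r, p = pearsonr(merged["mouth_proportion"], merged["accuracy"])
print(merged)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```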
