Cortical responses time-locked to continuous speech in the high-gamma band depend on selective attention.

Vrishab Commuri, Joshua P Kulasingham, Jonathan Z Simon
Author Information
  1. Vrishab Commuri: Department of Electrical and Computer Engineering, University of Maryland, College Park, MD, United States.
  2. Joshua P Kulasingham: Department of Electrical Engineering, Linköping University, Linköping, Sweden.
  3. Jonathan Z Simon: Department of Electrical and Computer Engineering, University of Maryland, College Park, MD, United States.

Abstract

Auditory cortical responses to speech obtained by magnetoencephalography (MEG) show robust tracking of the speaker's fundamental frequency in the high-gamma band (70-200 Hz), but little is currently known about whether such responses depend on the focus of selective attention. In this study, 22 human subjects listened to concurrent, fixed-rate speech from a male and a female speaker, and were asked to selectively attend to one speaker at a time while their neural responses were recorded with MEG. The male speaker's pitch range coincided with the lower range of the high-gamma band, whereas the female speaker's higher pitch range had much less overlap, and only at the upper end of the high-gamma band. Neural responses were analyzed using the temporal response function (TRF) framework. As expected, the responses demonstrate robust tracking of the fundamental frequency in the high-gamma band, but only for the male speaker's speech, with a peak latency of ~40 ms. Critically, the response magnitude depends on selective attention: the response to the male speech is significantly greater when that speech is attended than when it is not, under acoustically identical conditions. This is a clear demonstration that even very early cortical auditory responses are influenced by top-down, cognitive neural processing mechanisms.
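
The TRF framework named above models the recorded neural signal as a linear convolution of a stimulus feature (here, a predictor derived from the speech fundamental frequency) with an unknown response kernel, whose weight at each lag gives the latency profile of the response. The Python sketch below fits such a kernel by ridge regression on synthetic data; the sampling rate, lag range, regularization value, and estimator are illustrative assumptions, not the study's actual pipeline (which is not specified in this record).

    # Minimal TRF sketch: ridge regression of a band-limited response onto
    # lagged copies of a stimulus predictor. All parameters are assumed.
    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    fs = 1000                                  # sampling rate (Hz), assumed
    rng = np.random.default_rng(0)

    # Synthetic stand-ins for real data (60 s of signal):
    stimulus = rng.standard_normal(60 * fs)    # e.g., an f0-band speech carrier
    response = rng.standard_normal(60 * fs)    # e.g., one MEG channel

    # Band-limit both signals to the high-gamma range (70-200 Hz) used above.
    sos = butter(4, [70, 200], btype="bandpass", fs=fs, output="sos")
    stimulus = sosfiltfilt(sos, stimulus)
    response = sosfiltfilt(sos, response)

    # Design matrix of lagged stimulus copies, spanning 0-100 ms of delay.
    lags = np.arange(int(0.100 * fs))          # delays in samples
    X = np.column_stack([np.roll(stimulus, lag) for lag in lags])
    X[:lags[-1]] = 0                           # discard wrapped-around samples

    # Ridge-regularized least squares: w = (X'X + lambda I)^{-1} X'y.
    lam = 1e2                                  # regularization strength, assumed
    w = np.linalg.solve(X.T @ X + lam * np.eye(len(lags)), X.T @ response)

    # w[k] is the TRF weight at latency lags[k]/fs seconds; with real data,
    # a peak near 40 ms would match the latency reported in the abstract.
    print("peak |TRF| latency (ms):", 1000 * lags[np.abs(w).argmax()] / fs)

In practice, multichannel MEG TRFs are typically estimated with dedicated toolboxes and cross-validated regularization; the closed-form ridge solution above is only the simplest instance of the idea.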

Keywords

  1. cocktail party
  2. FFR
  3. phase-locked
  4. primary cortex

Grants

  1. R01 DC019394/NIDCD NIH HHS
  2. T32 DC000046/NIDCD NIH HHS
