UPV-EHU ADDI
Neocortical activity tracks the hierarchical linguistic structures of self-produced speech during reading aloud

View/Open
Neocortical activity tracksl2020.pdf (2.641Mb)
Date
2020
Author
Bourguignon, Mathieu
Molinaro, Nicola
Lizarazu, Mikel
Taulu, Samu
Jousmäki, Veikko
Lallier, Marie
Carreiras, Manuel
De Tiège, Xavier
Citation
Mathieu Bourguignon, Nicola Molinaro, Mikel Lizarazu, Samu Taulu, Veikko Jousmäki, Marie Lallier, Manuel Carreiras, Xavier De Tiège, Neocortical activity tracks the hierarchical linguistic structures of self-produced speech during reading aloud, NeuroImage, Volume 216, 2020, 116788, ISSN 1053-8119, https://doi.org/10.1016/j.neuroimage.2020.116788.
URI
http://hdl.handle.net/10810/43352
Abstract
How the human brain uses self-generated auditory information during speech production remains unsettled. Current theories of language production posit a feedback monitoring system that monitors the auditory consequences of speech output and an internal monitoring system that makes predictions about the auditory consequences of speech before its production. To gain novel insights into the underlying neural processes, we investigated the coupling between neuromagnetic activity and the temporal envelope of the heard speech sounds (i.e., cortical tracking of speech) in a group of adults who 1) read a text aloud, 2) listened to a recording of their own speech (i.e., playback), and 3) listened to another speech recording. Reading aloud was used here as a particular form of speech production that shares various processes with natural speech. During reading aloud, the reader’s brain tracked the slow temporal fluctuations of the speech output. Specifically, auditory cortices tracked phrases (<1 Hz), but to a lesser extent than during the two speech listening conditions. Also, the tracking of words (2–4 Hz) and syllables (4–8 Hz) occurred at parietal opercula during reading aloud and at auditory cortices during listening. Directionality analyses were then used to gain insight into the monitoring systems involved in the processing of self-generated auditory information. These analyses revealed that the cortical tracking of speech at <1 Hz, 2–4 Hz and 4–8 Hz is dominated by speech-to-brain directional coupling during both reading aloud and listening; that is, the cortical tracking of speech during reading aloud mainly entails auditory feedback processing. Nevertheless, brain-to-speech directional coupling at 4–8 Hz was enhanced during reading aloud compared with listening, likely reflecting the establishment of predictions about the auditory consequences of speech before production. These data bring novel insights into how auditory verbal information is tracked by the human brain during perception and self-generation of connected speech.
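
The abstract describes cortical tracking of speech as frequency-specific coupling between neural activity and the temporal envelope of heard speech. The sketch below is only an illustration of that general idea, not the authors' MEG analysis pipeline: it computes magnitude-squared coherence (scipy.signal.coherence) between a synthetic speech envelope and a synthetic "neural" signal, then averages it over the phrase (<1 Hz), word (2–4 Hz) and syllable (4–8 Hz) bands named above. The sampling rate, signal durations and variable names are assumptions made for the example.

```python
# Illustrative sketch only: envelope-to-signal coherence in the frequency
# bands mentioned in the abstract. Not the authors' method; all signals
# here are synthetic and the 1 kHz sampling rate is an assumption.
import numpy as np
from scipy.signal import hilbert, coherence

fs = 1000.0                       # assumed sampling rate (Hz)
t = np.arange(0, 60.0, 1.0 / fs)  # 60 s of synthetic data

# Synthetic stand-ins: a "speech" waveform whose amplitude is modulated at
# ~3 Hz, and a "neural" signal that partly follows the speech envelope, so
# the example yields non-trivial coherence in the word-rate band.
speech = np.random.randn(t.size) * (1.0 + 0.5 * np.sin(2 * np.pi * 3.0 * t))
envelope = np.abs(hilbert(speech))            # temporal envelope of the speech
neural = 0.5 * envelope + np.random.randn(t.size)

# Magnitude-squared coherence between the envelope and the neural signal.
f, cxy = coherence(envelope, neural, fs=fs, nperseg=int(8 * fs))

# Average coherence in the frequency bands named in the abstract.
bands = {"phrases (<1 Hz)": (0.1, 1.0),
         "words (2-4 Hz)": (2.0, 4.0),
         "syllables (4-8 Hz)": (4.0, 8.0)}
for name, (lo, hi) in bands.items():
    band = (f >= lo) & (f <= hi)
    print(f"{name}: mean coherence = {cxy[band].mean():.3f}")
```

In real data the neural signal would be a sensor- or source-level MEG time series, and the directionality analyses mentioned in the abstract would require directed coupling measures beyond the symmetric coherence shown here.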
Collections
  • BCBL-Publications
