
dc.contributor.author: Bourguignon, Mathieu
dc.contributor.author: Baart, Martijn
dc.contributor.author: Kapnoula, Efthymia C.
dc.contributor.author: Molinaro, Nicola
dc.date.accessioned: 2020-02-19T10:13:51Z
dc.date.available: 2020-02-19T10:13:51Z
dc.date.issued: 2020
dc.identifier.citation: Lip-Reading Enables the Brain to Synthesize Auditory Features of Unknown Silent Speech. Mathieu Bourguignon, Martijn Baart, Efthymia C. Kapnoula, Nicola Molinaro. Journal of Neuroscience, 29 January 2020, 40 (5) 1053-1065; DOI: 10.1523/JNEUROSCI.1101-19.2019
dc.identifier.issn: 0270-6474
dc.identifier.uri: http://hdl.handle.net/10810/41323
dc.description: Accepted December 4, 2019.
dc.description.abstract: Lip-reading is crucial for understanding speech in challenging conditions. But how the brain extracts meaning from silent, visual speech is still under debate. Lip-reading in silence activates the auditory cortices, but it is not known whether such activation reflects immediate synthesis of the corresponding auditory stimulus or imagery of unrelated sounds. To disentangle these possibilities, we used magnetoencephalography to evaluate how cortical activity in 28 healthy adult humans (17 females) entrained to the auditory speech envelope and lip movements (mouth opening) when listening to a spoken story without visual input (audio-only), and when seeing a silent video of a speaker articulating another story (video-only). In video-only, auditory cortical activity entrained to the absent auditory signal at frequencies below 1 Hz more than to the seen lip movements. This entrainment process was characterized by an auditory-speech-to-brain delay of 70 ms in the left hemisphere, compared with 20 ms in audio-only. Entrainment to mouth opening was found in the right angular gyrus at frequencies below 1 Hz, and in early visual cortices at 1–8 Hz. These findings demonstrate that the brain can use a silent lip-read signal to synthesize a coarse-grained auditory speech representation in early auditory cortices. Our data indicate the following underlying oscillatory mechanism: seeing lip movements first modulates neuronal activity in early visual cortices at frequencies that match articulatory lip movements; the right angular gyrus then extracts slower features of lip movements, mapping them onto the corresponding speech sound features; this information is fed to auditory cortices, most likely facilitating speech parsing.
dc.description.sponsorship: This work was supported by the Innoviris Attract program (Grant 2015-BB2B-10), by the Spanish Ministry of Economy and Competitiveness (Grant PSI2016-77175-P), and by the Marie Skłodowska-Curie Action of the European Commission (Grant 743562) to M. Bourguignon; by the Netherlands Organization for Scientific Research (VENI Grant 275-89-027) to M. Baart; by the Spanish Ministry of Economy and Competitiveness, through the Juan de la Cierva-Formación fellowship, and by the Spanish Ministry of Economy and Competitiveness (Grant PSI2017-82563-P) to E.C.K.; by the Spanish Ministry of Science, Innovation and Universities (Grant RTI2018-096311-B-I00), the Agencia Estatal de Investigación, the Fondo Europeo de Desarrollo Regional, and by the Basque government (Grant PI_2016_1_0014) to N.M.; and by the Spanish Ministry of Economy and Competitiveness, through the "Severo Ochoa" Programme for Centres/Units of Excellence in R&D (SEV-2015-0490), awarded to the BCBL.
dc.language.iso: eng
dc.publisher: The Journal of Neuroscience
dc.relation: info:eu-repo/grantAgreement/MINECO/PSI2016-77175-P
dc.relation: info:eu-repo/grantAgreement/EC/H2020/MC/743562
dc.relation: info:eu-repo/grantAgreement/MINECO/PSI2017-82563-P
dc.relation: info:eu-repo/grantAgreement/MINECO/RTI2018-096311-B-I00
dc.relation: info:eu-repo/grantAgreement/MINECO/SEV-2015-0490
dc.rights: info:eu-repo/semantics/openAccess
dc.subject: audiovisual integration
dc.subject: lip-reading
dc.subject: magnetoencephalography
dc.subject: silent speech
dc.subject: speech entrainment
dc.title: Lip-Reading Enables the Brain to Synthesize Auditory Features of Unknown Silent Speech
dc.type: info:eu-repo/semantics/article
dc.rights.holder: Copyright © 2020 the authors
dc.relation.publisherversion: https://www.jneurosci.org/

