Written sentence context effects on acoustic-phonetic perception: fMRI reveals cross-modal semantic-perceptual interactions
Date
2019
Author
Guediche, Sara
Zhu, Yuli
Minicucci, Domenic
Blumstein, Sheila E.
Sara Guediche, Yuli Zhu, Domenic Minicucci, Sheila E. Blumstein, Written sentence context effects on acoustic-phonetic perception: fMRI reveals cross-modal semantic-perceptual interactions, Brain and Language, Volume 199, 2019, 104698, ISSN 0093-934X, https://doi.org/10.1016/j.bandl.2019.104698
Abstract
This study examines cross-modality effects of a semantically biased written sentence context on the perception of an acoustically ambiguous word target, identifying neural areas sensitive to interactions between sentential bias and phonetic ambiguity. Of interest is whether the locus or nature of these interactions resembles those previously demonstrated for auditory-only effects. fMRI results show significant interaction effects in the right mid-middle temporal gyrus (RmMTG) and bilateral anterior superior temporal gyri (aSTG), regions along the ventral language comprehension stream that map sound onto meaning. These regions are more anterior than those previously identified for auditory-only effects; however, the same cross-over interaction pattern emerged, implying that similar underlying computations are at play. The findings suggest that the mechanisms that integrate information across modalities and across sentence and phonetic levels of processing recruit amodal areas where written and spoken lexical and semantic access converge. Taken together, the results support interactive accounts of speech and language processing.