Audio-visual speech perception: a developmental ERP investigation
Authors
Abstract
Being able to see a talking face confers a considerable advantage for speech perception in adulthood. However, behavioural data currently suggest that children fail to make full use of these available visual speech cues until age 8 or 9. This is particularly surprising given the potential utility of multiple informational cues during language learning. We therefore explored this development at the neural level. The event-related potential (ERP) technique has been used to assess the mechanisms of audio-visual speech perception in adults, with visual cues reliably modulating auditory ERP responses to speech. Previous work has shown congruence-dependent shortening of auditory N1/P2 latency and congruence-independent attenuation of N1/P2 amplitude when auditory and visual speech signals are presented together, compared to auditory speech alone. The aim of this study was to chart the development of these well-established modulatory effects over mid-to-late childhood. Experiment 1 employed an adult sample to validate a child-friendly stimulus set and paradigm by replicating the previously observed modulation of N1/P2 amplitude and latency by visual speech cues; it also revealed greater attenuation of component amplitude for incongruent audio-visual stimuli, pointing to a new interpretation of the amplitude modulation effect. Experiment 2 used the same paradigm to map cross-sectional developmental change in these ERP responses between 6 and 11 years of age. The amplitude modulation effect emerged over development, whereas the latency modulation effect was stable across the child sample. These data suggest that auditory ERP modulation by visual speech reflects separable underlying cognitive processes, some of which mature earlier than others over the course of development.
Similar articles
Language/Culture Modulates Brain and Gaze Processes in Audiovisual Speech Perception
Several behavioural studies have shown that the interplay between voice and face information in audiovisual speech perception is not universal. Native English speakers (ESs) are influenced by visual mouth movement to a greater degree than native Japanese speakers (JSs) when listening to speech. However, the biological basis of these group differences is unknown. Here, we demonstrate the time-va...
Speech and Non-Speech Audio-Visual Illusions: A Developmental Study
It is well known that simultaneous presentation of incongruent audio and visual stimuli can lead to illusory percepts. Recent data suggest that distinct processes underlie intersensory perception of speech as opposed to non-speech stimuli. However, the development of both speech and non-speech intersensory perception across childhood and adolescence remains poorly defined. Thirty-eight observ...
An ERP examination of audiovisual speech perception in Japanese younger and older adults
We studied differences between Japanese younger (YA) and older adults (OA) by recording event-related brain potentials (ERPs). Participants were asked to identify audio-only (AO) and congruent audiovisual (AV) syllables as /ba/ or /ga/. We found age-related ERP changes (N1, P2, and N2 latencies) in Japanese audiovisual speech perception. Whereas the visual influence was sustained (maintained fr...
Impact of language on development of auditory-visual speech perception.
The McGurk effect paradigm was used to examine the developmental onset of inter-language differences between Japanese and English in auditory-visual speech perception. Participants were asked to identify syllables in audiovisual (with congruent or discrepant auditory and visual components), audio-only, and video-only presentations at various signal-to-noise levels. In Experiment 1 with two grou...
Exploring early developmental changes in face scanning patterns during the perception of audio-visual mismatch of speech cues
Young infants are capable of integrating auditory and visual information, and their speech perception can be influenced by visual cues; 5-month-olds are able to detect a mismatch between mouth articulation and the speech sound. From 6 months of age, infants gradually shift their attention away from the eyes and towards the mouth of articulating faces, potentially to benefit from intersensory re...