Phonetic information is integrated across intervening nonlinguistic sounds.

Authors

  • D H Whalen
  • A G Samuel
Abstract

When the fricative noise of a fricative-vowel syllable is replaced by a noise from a different vocalic context, listeners experience delays in identifying both the fricative and the vowel (Whalen, 1984): mismatching the information in the fricative noise for vowel and consonant identity with the information in the vocalic segment appears to hamper processing. This effect was argued to be due to integration of the information relevant to phonetic categorization. The present study was intended to eliminate an alternative explanation based on acoustic discontinuities. Noises and vowels were again cross-spliced, but, in addition, the first 60 msec of the vocalic segment (which comprised the consonant-vowel transitions) either had a nonlinguistic noise added to it or was replaced by that noise. The fricative noise and the majority of the vocalic segment were left intact, and both were quite identifiable. Mismatched consonant information caused delays both for original stimuli and for ones with the noise added to the transitions. Mismatched vowel information caused delays for all stimuli, both originals and ones with the noise. Additionally, syllables with a portion replaced by noise took longer to identify than those that had the noise added to them. When asked explicitly to tell the added versions from the replaced, subjects were unable to do so. The results indicate that listeners integrate all relevant information, even across a nonlinguistic noise. Completely replacing the signal delayed identifications more than did adding the noise to the original signal. This was true despite the fact that the subjects were not aware of any difference.


Similar articles

Temporally nonadjacent nonlinguistic sounds affect speech categorization.

Speech perception is an ecologically important example of the highly context-dependent nature of perception; adjacent speech, and even nonspeech, sounds influence how listeners categorize speech. Some theories emphasize linguistic or articulation-based processes in speech-elicited context effects and peripheral (cochlear) auditory perceptual interactions in non-speech-elicited context effects. ...


Hemispheric asymmetries in children's perception of nonlinguistic human affective sounds.

In the present work, we developed a database of nonlinguistic sounds that mirror prosodic characteristics typical of language and thus carry affective information, but do not convey linguistic information. In a dichotic-listening task, we used these novel stimuli as a means of disambiguating the relative contributions of linguistic and affective processing across the hemispheres. This method wa...


Anatomical Correlates of Learning Novel Speech Sounds

We examined the relationship between brain anatomy and the ability to learn nonnative speech sounds, as well as rapidly changing and steady-state nonlinguistic sounds, using voxel-based morphometry in 59 healthy adults. Faster phonetic learners appeared to have more white matter in parietal regions, especially in the left hemisphere. The pattern of results was similar for the rapidly changing b...


Central locus for nonspeech context effects on phonetic identification (L)

Recently, Holt and Lotto [Hear. Res. 167, 156–169 (2002)] reported that preceding speech sounds can influence phonetic identification of a target syllable even when the context sounds are presented to the opposite ear or when there is a long intervening silence. These results led them to conclude that phonetic context effects are mostly due to nonperipheral auditory interactions. In the present...




Journal:
  • Perception & psychophysics

Volume 37, Issue 6

Pages: -

Publication year: 1985