Language contexts in speech categorization: testing the double phonetic standard in bilinguals
Speech sounds are typically perceived categorically: the acoustic information in speech sounds is perceptually grouped into phonetic categories. It is widely known that language influences the way speech sounds are categorized; that is, one's native language influences where category boundaries are placed. However, it is less well understood how bilingual listeners categorize speech sounds. There is evidence that bilinguals have category boundaries that differ from those of monolinguals, but there is also evidence suggesting that bilinguals' category boundaries shift depending on the language they are using at the moment. This phenomenon has been referred to as the double phonemic boundary. The goal of this investigation was to verify the existence of the double phonemic boundary in bilingual listeners. As in previous studies, bilingual speakers of Spanish and English were asked to identify the speech sound /ta/ from a 10-token speech continuum ranging in voice onset time (VOT) from /da/ to /ta/, in two language contexts. In this study, however, two additional procedures were carried out. First, English monolinguals were asked to identify the continuum in the two language contexts; it was expected that bilinguals, but not monolinguals, would show a double phonemic boundary. Second, while participants' behavioral measures were assessed, electrophysiological measures (event-related potentials, ERPs) were also recorded, in order to observe how speech sounds are represented in the brain. It was expected that bilinguals, but not monolinguals, would show different ERP amplitudes across language contexts. The behavioral results showed that phonemic boundaries did not differ across language contexts for either bilinguals or monolinguals. Further analyses showed that bilinguals, but not monolinguals, perceived specific speech sounds (those in the "ambiguous zone") differently across language contexts.
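As a concrete illustration of the identification task described above, a phonemic boundary along a VOT continuum is conventionally estimated as the VOT at which listeners label half the tokens as /ta/. The sketch below finds that crossover point by linear interpolation between adjacent continuum steps; the VOT values and response proportions are illustrative only, not the study's data.

```python
# Hedged sketch: estimating a phonemic boundary from identification data.
# The continuum values and /ta/-response proportions below are illustrative.

def boundary_from_identification(vots, prop_ta):
    """Return the VOT (ms) at which the proportion of /ta/ responses
    first crosses 0.5, interpolating linearly between adjacent steps."""
    pairs = list(zip(vots, prop_ta))
    for (v0, p0), (v1, p1) in zip(pairs, pairs[1:]):
        if p0 < 0.5 <= p1:
            # interpolate inside the "ambiguous zone"
            return v0 + (0.5 - p0) / (p1 - p0) * (v1 - v0)
    return None  # responses never cross 0.5

# A 10-step continuum from /da/-like (short VOT) to /ta/-like (long VOT).
vots = [0, 10, 20, 30, 40, 50, 60, 70, 80, 90]          # ms, illustrative
prop_ta = [0.0, 0.0, 0.05, 0.10, 0.30, 0.70, 0.90, 0.95, 1.0, 1.0]

print(boundary_from_identification(vots, prop_ta))       # → 45.0
```

Comparing the boundary estimated under an English context with the one estimated under a Spanish context, per listener, is the core test of a double phonemic boundary: a reliable context-dependent shift in bilinguals but not monolinguals.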
The electrophysiological results showed that the ERPs of bilinguals, but not monolinguals, differed across language contexts. Interestingly, behavioral measures correlated significantly with electrophysiological measures only in bilinguals: ERP amplitude tracked the number of sounds perceived as /ta/ across language contexts. The challenges of testing the double phonemic boundary are discussed, along with the limitations of the methodology used in this study.
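The brain-behavior relationship reported here is a correlation between a per-participant ERP measure and a per-participant behavioral count. A minimal sketch of that computation, using a hand-rolled Pearson coefficient on invented illustrative values (context differences in ERP amplitude vs. context differences in the number of tokens labeled /ta/), follows; the numbers are hypothetical, not the study's results.

```python
# Hedged sketch: correlating an electrophysiological measure with a
# behavioral measure across participants. All values are illustrative.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-participant values: context difference in ERP amplitude
# (microvolts) vs. context difference in tokens identified as /ta/.
erp_diff = [0.8, 1.2, 0.3, 1.5, 0.9, 0.2]
ta_diff = [1, 2, 0, 3, 2, 0]

print(round(pearson_r(erp_diff, ta_diff), 2))
```

A strong positive coefficient in the bilingual group, absent in the monolingual group, would correspond to the pattern the abstract describes: amplitude shifts moving in step with labeling shifts across language contexts.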