Published September 18, 2022 | Version v1
Conference paper

Neural correlates of acoustic and semantic cues during speech segmentation in French

Description

Natural speech is highly complex and variable. In particular, spoken language, unlike written language, has no clear word boundaries. Adult listeners can exploit different types of information, such as acoustic and semantic cues, to segment the continuous speech stream. However, the relative weight of these cues, when they co-occur, remains to be determined. Behavioural tasks are not conclusive on this point, as they focus participants' attention on certain sources of information and thus bias the results. Here, we looked at the processing of homophonic utterances such as l'amie vs la mie (both /lami/), which contain fine acoustic differences and whose meaning changes depending on segmentation. To examine the perceptual resolution of such ambiguities when semantic information is available, we measured the online processing of sentences containing such sequences in an ERP experiment involving no active task. In the congruent condition, semantic information matched the acoustic signal of the word amie, while in the incongruent condition the semantic information carried by the sentence and the acoustic signal led to different lexical candidates. No clear neural markers for the use of acoustic cues were found. Our results suggest a preponderant weight of semantic information over acoustic information during natural spoken sentence processing.

Additional details

Created:
February 22, 2023
Modified:
November 29, 2023