Infant-directed speech enhances neural activity during face perception.

Sirri, L., Parise, E., & Reid, V. (2018). Infant-directed speech enhances neural activity during face perception. Poster presented at the 21st Biennial International Congress of Infant Studies, Philadelphia, PA, USA.

Abstract:

Developmental studies have shown that infants are sensitive to communicative cues, such as infant-directed speech (IDS) and eye gaze. IDS is commonly used by adults when interacting with infants. Compared to adult-directed speech (ADS), it has a higher and more variable pitch, a limited vocabulary, shorter utterances, and altered vowels. IDS increases the allocation of attention to language and fosters social interactions between infants and caregivers (Golinkoff, Can, Soderstrom, & Hirsh-Pasek, 2015). IDS also elicits increased neural activity compared to ADS (Naoi et al., 2012), particularly in response to familiar words (Zangl & Mills, 2007). As for eye gaze, infants exhibit an enhanced N170 (a component associated with face processing) in response to faces with direct rather than averted eye gaze (Farroni, Csibra, Simion, & Johnson, 2002). The aim of the present study was therefore to extend these findings and determine whether IDS, as a communicative cue, enhances face processing while eye gaze is held constant.

Thirty-five infants (age range: 3 months 21 days to 5 months 24 days) took part in the study. Infants sat on their caregivers' laps facing a monitor and heard the word "hello" pronounced in either IDS or ADS, followed by a woman's face (from the NimStim set), while their event-related brain potentials (ERPs) were recorded. Offline, the data were segmented into 1000 ms epochs from sound and picture onset and baseline-corrected relative to a 200 ms pre-stimulus interval. Trials containing artifacts exceeding ±150 µV were rejected. Segments were averaged and re-referenced to the average reference. Two ERP components of interest were analysed: the auditory N600-800, sensitive to IDS (Zangl & Mills, 2007), and the visual P100, known to reflect early visual processing. For both components, mean amplitudes of five posterior channels (including O1, O2, PO7, and PO8) over the left and right hemispheres were averaged. For the auditory N600-800 (n = 18), the mean number of trials was 15 (range: 10 to 27) in IDS and 13 (range: 8 to 25) in ADS; for the visual P100 (n = 17), it was 10 (range: 8 to 17) in IDS and 9 (range: 8 to 21) in ADS.

Preliminary results show that the auditory N600-800 was modulated by speech type (F(1,17) = 6.66, p = .02): mean amplitudes were significantly larger in the IDS (-20.30 µV) than in the ADS (-15.12 µV) condition (t(17) = 2.58, p = .02). The visual response to faces was also modulated by speech type (F(1,16) = 5.14, p = .04), with a larger P100 mean amplitude in the IDS (14.44 µV) than in the ADS (11.36 µV) condition (t(16) = 2.27, p = .04). There was no significant hemispheric difference between conditions. These findings suggest that IDS recruits more attention than ADS and enhances subsequent face processing. A control study is currently under way to determine whether the effects obtained are specific to faces.
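
The preprocessing steps described above (epoching from stimulus onset, 200 ms pre-stimulus baseline correction, artifact rejection, per-condition averaging, and average re-referencing) can be illustrated with a minimal MNE-Python sketch. The file name and trigger codes are hypothetical, and this is an illustration of the reported pipeline, not the authors' actual analysis scripts:

    import mne

    # Hypothetical input file and trigger codes -- illustrative only.
    raw = mne.io.read_raw_fif("infant_eeg_raw.fif", preload=True)
    events = mne.find_events(raw)
    event_id = {"IDS": 1, "ADS": 2}  # assumed codes for the two speech types

    # 1000 ms epochs from stimulus onset with a 200 ms pre-stimulus baseline.
    # MNE's `reject` criterion is peak-to-peak, so 300 uV peak-to-peak is used
    # here to approximate the +/-150 uV absolute threshold reported above.
    epochs = mne.Epochs(
        raw, events, event_id,
        tmin=-0.2, tmax=1.0,
        baseline=(None, 0),
        reject=dict(eeg=300e-6),
        preload=True,
    )

    # Re-reference to the average of all electrodes, then average per condition.
    epochs.set_eeg_reference("average")
    evoked_ids = epochs["IDS"].average()
    evoked_ads = epochs["ADS"].average()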
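
The statistical contrast (per-infant mean amplitudes over posterior channels in a component window, compared across conditions with a paired t-test, as in the t(17) and t(16) tests reported) could be sketched as follows. The helper names, and the per-infant lists of Evoked objects they expect, are hypothetical; the channel list covers only the four of the five posterior electrodes named explicitly in the abstract:

    import numpy as np
    from scipy.stats import ttest_rel

    POSTERIOR = ["O1", "O2", "PO7", "PO8"]  # four of the five channels named above

    def mean_amplitude_uv(evoked, tmin=0.6, tmax=0.8, picks=POSTERIOR):
        """Mean amplitude (microvolts) over the given channels and time window."""
        data = evoked.copy().pick(picks).crop(tmin, tmax).data  # (n_channels, n_times)
        return float(data.mean()) * 1e6  # MNE stores volts internally

    def compare_conditions(evokeds_ids, evokeds_ads):
        """Paired t-test on per-infant mean amplitudes, IDS vs ADS."""
        ids = np.array([mean_amplitude_uv(ev) for ev in evokeds_ids])
        ads = np.array([mean_amplitude_uv(ev) for ev in evokeds_ads])
        t, p = ttest_rel(ids, ads)
        return t, p, len(ids) - 1  # t value, p value, degrees of freedom

The default window matches the N600-800 component; the same helper would be called with an earlier window (around 100 ms) for the visual P100.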