
News

Nov 12, 2019
New paper accepted in Journal of Experimental Child Psychology

(Crollen V., Noël M., Honoré N., Degroote V., Collignon O.)

Recent studies have suggested that multisensory redundancy may improve cognitive learning. According to this view, information simultaneously available across two or more modalities is highly salient and, therefore, may be learned and remembered better than the same information presented to only one modality. In the current study, we wanted to evaluate whether training arithmetic with a multisensory intervention could induce larger learning improvements than a visual intervention alone. Moreover, because a left-to-right-oriented mental number line was long considered a core feature of numerical representation, we also wanted to compare left-to-right-organized and randomly organized arithmetic training. Therefore, five training programs were created and called (a) multisensory linear, (b) multisensory random, (c) visual linear, (d) visual random, and (e) control. A total of 85 preschoolers were randomly assigned to one of these five training conditions. Whereas children were trained to solve simple addition and subtraction operations in the first four training conditions, story understanding was the focus of the control training. Several numerical tasks (arithmetic, number-to-position, number comparison, counting, and subitizing) were used as pre- and post-test measures. Although the effect of spatial disposition was not significant, results demonstrated that the multisensory training condition led to a significantly larger performance improvement than the visual training and control conditions. This result was specific to the trained ability (arithmetic) and is discussed in light of the multisensory redundancy hypothesis.
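For readers curious about how such a randomized pre/post training design is typically analyzed, here is a minimal sketch, not the authors' actual analysis code: it randomly assigns 85 children to the five conditions and compares pre-to-post gain scores across groups with a one-way ANOVA. The data are simulated and all variable names are illustrative assumptions.

```python
# Illustrative sketch of a pre/post training comparison (not the authors' code).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
conditions = ["multisensory_linear", "multisensory_random",
              "visual_linear", "visual_random", "control"]

# Randomly assign 85 preschoolers (17 per group) to the five conditions.
assignment = rng.permutation(np.repeat(conditions, 17))

# Simulated pre- and post-test arithmetic scores (placeholder data only).
pre = rng.normal(10, 2, size=85)
post = pre + rng.normal(1, 1, size=85)

# Gain scores per condition, compared with a one-way ANOVA.
gains = post - pre
groups = [gains[assignment == c] for c in conditions]
f, p = stats.f_oneway(*groups)
print(f"F = {f:.2f}, p = {p:.3f}")
```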

May 08, 2019
New paper accepted in Cortex

Time-resolved discrimination of audio-visual emotion expressions

(Falagiarda F. & Collignon O.)

Humans seamlessly extract and integrate the emotional content delivered by the face and the voice of others. It is, however, poorly understood how perceptual decisions unfold in time when people discriminate the expression of emotions transmitted using dynamic facial and vocal signals, as in natural social contexts. In this study, we relied on a gating paradigm to track how the recognition of emotion expressions across the senses unfolds over exposure time. We first demonstrate that, across all emotions tested, a discriminatory decision is reached earlier with faces than with voices. Importantly, multisensory stimulation consistently reduced the accumulation of perceptual evidence needed to reach correct discrimination (the isolation point). We also observed that expressions with different emotional content provide cumulative evidence at different speeds, with “fear” being the expression with the fastest isolation point across the senses. Finally, the lack of correlation between the confusion patterns in response to facial and vocal signals across time suggests distinct relations between the discriminative features extracted from the two signals. Altogether, these results provide a comprehensive view of how auditory, visual, and audiovisual information related to different emotion expressions accumulates over time, highlighting how a multisensory context can speed up the discrimination process when minimal information is available.
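In gating studies, the isolation point is conventionally the earliest gate (exposure duration) from which a participant's response is correct and stays correct at every longer gate. A minimal sketch of that computation follows; the function and example data are illustrative assumptions, not the authors' code.

```python
# Illustrative computation of an isolation point in a gating paradigm.
# `responses` holds one response per gate (increasing exposure durations);
# the isolation point is the first gate from which responses stay correct.
def isolation_point(responses, correct_label):
    for gate in range(len(responses)):
        if all(r == correct_label for r in responses[gate:]):
            return gate  # earliest gate with stable correct discrimination
    return None  # expression never reliably discriminated

# Example: stably correct from gate 3 on, despite an early lucky guess at gate 1.
print(isolation_point(["anger", "fear", "anger", "fear", "fear", "fear"],
                      "fear"))  # -> 3
```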

Apr 15, 2019
Olivier participates in Pint of Science on May 22nd
Olivier will talk about his research in the session “Insane in the main brain”, with the talk “Building a brain in the dark”.

The human brain evolved highly specialised regions dedicated to the refined processing of visual information. What happens to these regions if you are born blind? Are they simply left dormant and unused? No! In the case of blindness, the brain reorganises itself, and the regions normally dedicated to vision become involved in processing information from the remaining senses. This demonstrates the fascinating ability of the brain to change the tuning of its neurons through experience, a mechanism called brain plasticity. But what happens then if a blind person recovers sight?

Feb 04, 2019
New paper published in JNeurosci
Representation of auditory motion directions and sound source locations in the human planum temporale

(Battal C., Rezk M., Mattioni S., Vadlamudi J., & Collignon O.)

The ability to compute the location and direction of sounds is a crucial perceptual skill for efficiently interacting with dynamic environments. How the human brain implements spatial hearing is, however, poorly understood. In our study, we used fMRI to characterize the brain activity of male and female humans listening to sounds moving to the left, right, up, and down, as well as static sounds. Whole-brain univariate results contrasting moving and static sounds varying in their location revealed a robust functional preference for auditory motion in bilateral human Planum Temporale (hPT). Using independently localized hPT, we show that this region contains information about auditory motion directions and, to a lesser extent, sound source locations. Moreover, hPT showed an axis-of-motion organization reminiscent of the functional organization of the middle-temporal cortex (hMT+/V5) for vision. Importantly, although motion direction and location rely on partially shared pattern geometries in hPT, as demonstrated by successful cross-condition decoding, the responses elicited by static and moving sounds remained significantly distinct. Altogether, our results demonstrate that the hPT codes for auditory motion and location, but that the underlying neural computation linked to motion processing is more reliable and partially distinct from the one supporting sound source location.
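Cross-condition decoding of the kind reported here trains a classifier on the voxel patterns of one condition (e.g., motion directions) and tests it on another (e.g., static locations); above-chance transfer indicates shared pattern geometry. A hedged sketch with scikit-learn on simulated patterns; the array shapes, noise levels, and names are assumptions, not the study's pipeline.

```python
# Sketch of cross-condition decoding (train on motion, test on static).
# Simulated "hPT voxel patterns" stand in for real beta estimates.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
n_trials, n_voxels = 40, 200
labels = rng.integers(0, 4, size=n_trials)   # 4 directions/locations

# Shared pattern geometry: each class has a prototype reused (noisily)
# in both the motion and the static condition.
prototypes = rng.normal(size=(4, n_voxels))
motion = prototypes[labels] + rng.normal(scale=1.0, size=(n_trials, n_voxels))
static = prototypes[labels] + rng.normal(scale=1.0, size=(n_trials, n_voxels))

clf = LinearSVC(dual=False).fit(motion, labels)  # train on moving sounds
acc = clf.score(static, labels)                  # test on static sounds
print(f"cross-condition accuracy: {acc:.2f} (chance = 0.25)")
```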

Jan 18, 2019
New paper published in Cognition.
Sound symbolism in sighted and blind. The role of vision and orthography in sound-shape correspondences

(Bottini R., Barilari M., & Collignon O.)

Non-arbitrary sound-shape correspondences (SSC), such as the “bouba-kiki” effect, have been consistently observed across languages and, together with other sound-symbolic phenomena, challenge the classic linguistic dictum of the arbitrariness of the sign. Yet, it is unclear what makes a sound “round” or “spiky” to the human mind. Here we tested the hypothesis that visual experience is necessary for the emergence of SSC, supported by empirical evidence showing reduced SSC in visually impaired people. Results of two experiments comparing early blind and sighted individuals showed that SSC emerged strongly in both groups. Experiment 2, however, showed a partially different pattern of SSC in the sighted and the blind, which was mostly explained by a different effect of orthographic letter shape: the shape of written letters (spontaneously activated by spoken words) influenced SSC in the sighted, but not in the blind, who are exposed to an orthography (Braille) in which letters do not have spiky or round outlines. In sum, early blindness does not prevent the emergence of SSC, and differences between sighted and visually impaired people may be due to the indirect influence (or lack thereof) of orthographic letter shape.

Dec 14, 2018
New paper published in Neuroimage.
Recruitment of the occipital cortex by arithmetic processing follows computational bias in the congenitally blind

Arithmetic reasoning activates the occipital cortex of congenitally blind people (CB). This activation of visual areas may highlight the functional flexibility of occipital regions deprived of their dominant inputs, or relate to the intrinsic computational role of specific occipital regions. We contrasted these competing hypotheses by characterising the brain activity of CB and sighted participants while performing subtraction, multiplication, and a control letter task. In both groups, subtraction selectively activated a bilateral dorsal network commonly activated during spatial processing. Multiplication triggered activity in temporal regions thought to participate in memory retrieval. No between-group difference was observed for the multiplication task, whereas subtraction induced enhanced activity in the right dorsal occipital cortex of blind individuals only. As this area overlaps with regions showing selective tuning to auditory spatial processing and exhibits increased functional connectivity with a dorsal “spatial” network, our results suggest that the recruitment of occipital regions during high-level cognition in the blind actually relates to the intrinsic computational role of the activated regions.
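At the group level, the univariate contrasts described here amount to comparing, voxel by voxel, the response estimates for one task against another (e.g., subtraction versus the letter control). A minimal illustrative sketch on simulated beta values; this is a generic random-effects comparison, not the authors' actual pipeline.

```python
# Voxelwise contrast of subtraction vs. letter-control betas (illustration).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_subjects, n_voxels = 20, 1000
beta_subtraction = rng.normal(0.2, 1.0, size=(n_subjects, n_voxels))
beta_letters = rng.normal(0.0, 1.0, size=(n_subjects, n_voxels))

# Paired t-test across subjects at every voxel (random-effects group analysis).
t, p = stats.ttest_rel(beta_subtraction, beta_letters, axis=0)
print(f"{(p < 0.001).sum()} voxels pass an uncorrected p < .001 threshold")
```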

Nov 13, 2018
New paper published in Cerebral Cortex
Hierarchical brain network for face and voice integration of emotion expression.

(Davies-Thompson J., Elli G., Rezk M., Benetti S., van Ackeren M., Collignon O.)

The brain has separate specialized computational units to process faces and voices, located in occipital and temporal cortices. However, humans seamlessly integrate signals from the faces and voices of others for optimal social interaction. How are emotional expressions, when delivered by different sensory modalities (faces and voices), integrated in the brain? In this study, we characterized the brain’s response to faces, voices, and combined face-voice information (congruent, incongruent), which varied in expression (neutral, fearful). Using a whole-brain approach, we found that only the right posterior superior temporal sulcus (rpSTS) responded more to bimodal stimuli than to face or voice alone, but only when the stimuli contained emotional expression. Face- and voice-selective regions of interest, extracted from independent functional localizers, similarly revealed multisensory integration in the face-selective rpSTS only; further, this was the only face-selective region that also responded significantly to voices. Dynamic Causal Modeling revealed that the rpSTS receives unidirectional information from the face-selective fusiform face area (FFA) and the voice-selective temporal voice area (TVA), with emotional expression affecting the strength of these connections. Our study supports a hierarchical model of face and voice integration, with convergence in the rpSTS, and suggests that such integration depends on the (emotional) salience of the stimuli.

Sep 10, 2018
New paper accepted in Journal of Experimental Psychology: Human Perception and Performance
How visual experience and task context modulate the use of internal and external spatial coordinates for perception and action.

(Crollen V, Spruyt T, Mahau P, Bottini R, & Collignon O)

Recent studies have proposed that the use of internal and external coordinate systems may be more flexible in congenitally blind than in sighted individuals. To investigate this hypothesis further, we asked congenitally blind and sighted people to perform, with the hands uncrossed and crossed over the body midline, a tactile temporal order judgment (TOJ) task and an auditory Simon task. Crucially, both tasks were carried out under instructions favoring the use of either an internal (left vs. right hand) or an external (left vs. right hemispace) frame of reference. In the internal condition of the TOJ task, our results replicated previous findings (Röder et al., 2004) showing that hand crossing only impaired sighted participants’ performance, suggesting that blind people did not activate by default a (conflicting) external frame of reference. However, under external instructions, a decrease in performance was observed in both groups, suggesting that even blind people activated an external coordinate system in this condition. In the Simon task, and in contrast with a previous study (Röder et al., 2007), both groups responded more efficiently when the sound was presented on the same side as the response (the “Simon effect”), independently of hand position. This was true under both the internal and external conditions, suggesting that blind and sighted participants activated an external coordinate system by default in this task. Altogether, these data comprehensively demonstrate how visual experience shapes the default weight attributed to internal and external coordinate systems for action and perception, depending on task demands.
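The Simon effect itself is simply the reaction-time cost of responding when the stimulus side and the response side mismatch. A toy sketch of how it might be computed from trial-level data; the column names and values are assumptions, not the authors' data format.

```python
# Toy computation of a Simon effect from trial-level reaction times.
import pandas as pd

trials = pd.DataFrame({
    "sound_side":    ["left", "left", "right", "right", "left", "right"],
    "response_side": ["left", "right", "right", "left", "left", "right"],
    "rt_ms":         [430, 495, 441, 508, 425, 437],
})
trials["congruent"] = trials["sound_side"] == trials["response_side"]

# Simon effect = mean RT on incongruent minus congruent trials.
rt_incongruent = trials.loc[~trials["congruent"], "rt_ms"].mean()
rt_congruent = trials.loc[trials["congruent"], "rt_ms"].mean()
print(f"Simon effect: {rt_incongruent - rt_congruent:.0f} ms")
```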


Jul 06, 2018
New paper accepted in Scientific Reports
Light modulates oscillatory alpha activity in the occipital cortex of totally visually blind individuals with intact non-image-forming photoreception.

(Vandewalle G, van Ackeren M, Daneault V, Hull J, Albouy G, Lepore F, Doyon J, Czeisler C, Dumont M, Carrier J, Lockley S, and Collignon O)

The discovery of intrinsically photosensitive retinal ganglion cells (ipRGCs) marked a major shift in our understanding of how light information is processed by the mammalian brain. These ipRGCs influence multiple functions not directly related to image formation, such as circadian resetting and entrainment, pupil constriction, enhancement of alertness, as well as the modulation of cognition. More recently, it was demonstrated that ipRGCs may also contribute to basic visual functions. The impact of ipRGCs on visual function, independently of image-forming photoreceptors, remains difficult to isolate, however, particularly in humans. We previously showed that exposure to intense monochromatic blue light (465 nm) induced non-conscious light perception in a forced-choice task in three rare totally visually blind individuals without detectable rod and cone function, but who retained non-image-forming responses to light, very likely via ipRGCs. The neural foundation of such light perception in the absence of conscious vision is unknown, however. In this study, we characterized the brain activity of these three participants using electroencephalography (EEG), and demonstrate that unconsciously perceived light triggers an early and reliable transient desynchronization (i.e., decreased power) of the alpha EEG rhythm (8-14 Hz) over the occipital cortex. These results provide compelling insight into how ipRGCs may contribute to transient changes in ongoing brain activity. They suggest that occipital alpha rhythm synchrony, which is typically linked to the visual system, is modulated by ipRGC photoreception; a process that may contribute to the non-conscious light perception in these blind individuals.
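Event-related desynchronization of the alpha rhythm can be quantified as the drop in 8-14 Hz power after light onset relative to a pre-stimulus baseline. A hedged sketch with SciPy on a simulated occipital trace; the sampling rate, epoch layout, and attenuation profile are assumptions made for illustration.

```python
# Sketch of alpha (8-14 Hz) event-related desynchronization on simulated EEG.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500                            # assumed sampling rate (Hz)
t = np.arange(-1.0, 2.0, 1 / fs)    # one epoch: 1 s baseline, 2 s post-onset
rng = np.random.default_rng(3)

# Simulated occipital trace: alpha that attenuates after light onset (t = 0).
alpha_amp = np.where(t < 0, 1.0, 0.4)
eeg = alpha_amp * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)

# Band-pass 8-14 Hz, then take the analytic amplitude envelope.
b, a = butter(4, [8, 14], btype="bandpass", fs=fs)
power = np.abs(hilbert(filtfilt(b, a, eeg))) ** 2

baseline = power[t < 0].mean()
post = power[(t > 0.2) & (t < 1.0)].mean()
print(f"alpha power change: {100 * (post - baseline) / baseline:.0f} %")
```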
