

Nov 13, 2018
New paper published in Cerebral Cortex
Hierarchical brain network for face and voice integration of emotion expression.

(Davies-Thompson J., Elli G., Rezk M., Benetti S., van Ackeren M., Collignon O.)

The brain has separate specialized computational units for processing faces and voices, located in occipital and temporal cortices respectively. However, humans seamlessly integrate signals from the faces and voices of others for optimal social interaction. How are emotional expressions integrated in the brain when delivered by different sensory modalities (faces and voices)? In this study, we characterized the brain's response to faces, voices, and combined face-voice information (congruent, incongruent), which varied in expression (neutral, fearful). Using a whole-brain approach, we found that only the right posterior superior temporal sulcus (rpSTS) responded more to bimodal stimuli than to face or voice alone, and only when the stimuli contained emotional expression. Face- and voice-selective regions of interest extracted from independent functional localizers similarly revealed multisensory integration in the face-selective rpSTS only; further, this was the only face-selective region that also responded significantly to voices. Dynamic Causal Modeling revealed that the rpSTS receives unidirectional information from the face-selective fusiform face area (FFA) and the voice-selective temporal voice area (TVA), with emotional expression affecting the strength of these connections. Our study supports a hierarchical model of face and voice integration, with convergence in the rpSTS, in which such integration depends on the (emotional) salience of the stimuli.

Sep 10, 2018
New paper accepted in Journal of Experimental Psychology: Human Perception and Performance
How visual experience and task context modulate the use of internal and external spatial coordinates for perception and action.

(Crollen V, Spruyt T, Mahau P, Bottini R, & Collignon O)

Recent studies proposed that the use of internal and external coordinate systems may be more flexible in congenitally blind individuals than in sighted individuals. To investigate this hypothesis further, we asked congenitally blind and sighted people to perform a tactile temporal order judgment (TOJ) task and an auditory Simon task, with the hands either uncrossed or crossed over the body midline. Crucially, both tasks were carried out under task instructions favoring the use of either an internal (left vs. right hand) or an external (left vs. right hemispace) frame of reference. In the internal condition of the TOJ task, our results replicated previous findings (Röder et al., 2004) showing that hand crossing impaired only sighted participants' performance, suggesting that blind people did not activate a (conflicting) external frame of reference by default. However, under external instructions, a decrease in performance was observed in both groups, suggesting that even blind people activated an external coordinate system in this condition. In the Simon task, and in contrast with a previous study (Röder et al., 2007), both groups responded more efficiently when the sound was presented from the same side as the response ("Simon effect"), independently of hand position. This was true under both the internal and external conditions, suggesting that blind and sighted participants alike activated an external coordinate system by default in this task. Altogether, these data demonstrate how visual experience shapes the default weight attributed to internal and external coordinate systems for action and perception, depending on task demands.


Jul 06, 2018
New paper accepted in Scientific Reports
Light modulates oscillatory alpha activity in the occipital cortex of totally visually blind individuals with intact non-image-forming photoreception.

(Vandewalle G, van Ackeren M, Daneault V, Hull J, Albouy G, Lepore F, Doyon J, Czeisler C, Dumont M, Carrier J, Lockley S, and Collignon O)

The discovery of intrinsically photosensitive retinal ganglion cells (ipRGCs) marked a major shift in our understanding of how light information is processed by the mammalian brain. These ipRGCs influence multiple functions not directly related to image formation, such as circadian resetting and entrainment, pupil constriction, enhancement of alertness, and the modulation of cognition. More recently, it was demonstrated that ipRGCs may also contribute to basic visual functions. The impact of ipRGCs on visual function, independently of image-forming photoreceptors, remains difficult to isolate, however, particularly in humans. We previously showed that exposure to intense monochromatic blue light (465 nm) induced non-conscious light perception in a forced-choice task in three rare totally visually blind individuals without detectable rod and cone function, but who retained non-image-forming responses to light, very likely via ipRGCs. The neural foundation of such light perception in the absence of conscious vision is unknown, however. In this study, we characterized the brain activity of these three participants using electroencephalography (EEG), and demonstrate that unconsciously perceived light triggers an early and reliable transient desynchronization (i.e. decreased power) of the alpha EEG rhythm (8-14 Hz) over the occipital cortex. These results provide compelling insight into how ipRGCs may contribute to transient changes in ongoing brain activity. They suggest that occipital alpha rhythm synchrony, which is typically linked to the visual system, is modulated by ipRGC photoreception, a process that may contribute to non-conscious light perception in these blind individuals.
