

Apr 22, 2020
New paper accepted in Nature

Variability in the analysis of a single neuroimaging dataset by many teams

Botvinik-Nezer R., Holzmeister F., … Barilari M., … Collignon O., … Gau R., et al.

Data analysis workflows in many scientific domains have become increasingly complex and flexible. To assess the impact of this flexibility on functional magnetic resonance imaging (fMRI) results, the same dataset was independently analyzed by 70 teams, testing nine ex-ante hypotheses. The flexibility of analytic approaches is exemplified by the fact that no two teams chose identical workflows to analyze the data. This flexibility resulted in sizeable variation in hypothesis test results, even for teams whose statistical maps were highly correlated at intermediate stages of their analysis pipeline. Variation in reported results was related to several aspects of analysis methodology. Importantly, meta-analytic approaches that aggregated information across teams yielded significant consensus in activated regions. Furthermore, prediction markets of researchers in the field revealed an overestimation of the likelihood of significant findings, even by researchers with direct knowledge of the dataset. Our findings show that analytic flexibility can have substantial effects on scientific conclusions, and identify factors related to variability in fMRI results. They emphasize the importance of validating and sharing complex analysis workflows, and demonstrate the need for multiple analyses of the same data. Potential approaches to mitigate issues related to analytical variability are discussed.
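As an aside, the consensus the abstract describes can be illustrated with one common image-based meta-analytic approach, Stouffer's method, applied here to fully synthetic team-level z-maps (the paper's exact aggregation procedure may differ; all numbers below are made up for illustration):

```python
# Illustrative sketch: aggregating statistic maps from many teams with
# Stouffer's method. All data are synthetic; this is not the paper's pipeline.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
n_teams, n_voxels = 70, 1000

# Teams agree on a weak effect in the first 100 voxels, plus team-specific noise
true_effect = np.zeros(n_voxels)
true_effect[:100] = 0.5
z_maps = true_effect + rng.normal(size=(n_teams, n_voxels))

# Stouffer combination: sum of z-scores scaled by sqrt(number of teams)
consensus_z = z_maps.sum(axis=0) / np.sqrt(n_teams)
p = norm.sf(consensus_z)  # one-sided p-values of the consensus map

# A weak per-team effect becomes a strong consensus signal across 70 teams
print(f"significant effect voxels: {(p[:100] < 0.05).mean():.2f}")
```

The point of the sketch is that a per-team effect too weak to survive any single analysis can still yield a highly significant consensus once evidence is pooled across teams.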


Feb 07, 2020
New paper accepted in eLife

Categorical representation from sound and sight in the ventral occipito-temporal cortex of sighted and blind (a 10 min. talk on the paper is available here)

Mattioni S., Rezk M., Battal C., Bottini R., Cuculiza Mendoza K.E., Oosterhof N. N., Collignon O.

Is vision necessary for the development of the categorical organization of the Ventral Occipito-Temporal Cortex (VOTC)? We used fMRI to characterize VOTC responses to eight categories presented acoustically in sighted and early blind individuals, and visually in a separate sighted group. We observed that VOTC reliably encodes sound categories in sighted and blind people using a representational structure and connectivity partially similar to the one found in vision. Sound categories were, however, more reliably encoded in the blind than the sighted group, using a representational format closer to the one found in vision. Crucially, VOTC in blind represents the categorical membership of sounds rather than their acoustic features. Our results suggest that sounds trigger categorical responses in the VOTC of congenitally blind and sighted people that partially match the topography and functional profile of the visual response, despite qualitative nuances in the categorical organization of VOTC between modalities and groups.

You can find the paper here.


Dec 11, 2019
Olivier participates in BE Neuroscience & Technology Meetup (December 18th, 7pm-9pm)
Olivier will talk about his research in the BE Neuroscience & Technology Meetup, with the talk: Building a brain in the dark.

What happens to the “visual cortex” of someone born blind? Are these regions left unused because they do not receive their preferred sensory input? No. On the contrary, I will show that these regions reorganise to process non-visual inputs in an organised fashion. These data shed new light on the old ‘nature versus nurture’ debate on brain development: while the recruitment of occipital (visual) regions by non-visual inputs in blind individuals highlights the ability of the brain to remodel itself through experience (nurture influence), the observation of specialized cognitive modules in the reorganised occipital cortex of blind individuals, similar to those observed in the sighted, highlights the intrinsic constraints imposed on such plasticity (nature influence).
What would happen, then, if a congenitally blind individual was given the gift of sight? Would those reorganised regions switch back to their natural dedication to vision? We had the unique opportunity to track the behavioral and neurophysiological changes taking place in the occipital cortex of an early and severely visually impaired patient before, as well as 1.5 and 7 months after, sight restoration. An in-depth study of this exceptional patient highlighted the dynamic nature of the occipital cortex facing visual deprivation and restoration. Finally, I will present data demonstrating that even a short period of visual deprivation (only a few weeks) during the early sensitive period of brain development leads to enduring large-scale crossmodal reorganization of the brain circuitry typically dedicated to vision, even years after visual inputs were restored.

Nov 12, 2019
New paper accepted in Journal of Experimental Child Psychology

(Crollen V., Noël M., Honoré N., Degroote V., Collignon O.)

Recent studies have suggested that multisensory redundancy may improve cognitive learning. According to this view, information simultaneously available across two or more modalities is highly salient and, therefore, may be learned and remembered better than the same information presented to only one modality. In the current study, we wanted to evaluate whether training arithmetic with a multisensory intervention could induce larger learning improvements than a visual intervention alone. Moreover, because a left-to-right-oriented mental number line was for a long time considered as a core feature of numerical representation, we also wanted to compare left-to-right-organized and randomly organized arithmetic training. Therefore, five training programs were created and called (a) multisensory linear, (b) multisensory random, (c) visual linear, (d) visual random, and (e) control. A total of 85 preschoolers were randomly assigned to one of these five training conditions. Whereas children were trained to solve simple addition and subtraction operations in the first four training conditions, story understanding was the focus of the control training. Several numerical tasks (arithmetic, number-to-position, number comparison, counting, and subitizing) were used as pre- and post-test measures. Although the effect of spatial disposition was not significant, results demonstrated that the multisensory training condition led to a significantly larger performance improvement than the visual training and control conditions. This result was specific to the trained ability (arithmetic) and is discussed in light of the multisensory redundancy hypothesis.

Apr 15, 2019
Olivier participates in Pint of Science on May 22nd
Olivier will talk about his research in the section Insane in the main brain, with the talk: Building a brain in the dark.

The human brain evolved highly specialised regions dedicated to the refined processing of visual information. What happens to these regions if you are born blind? Are they simply left dormant and unused? No! In case of blindness, the brain reorganises itself, and the regions normally dedicated to vision become involved in the processing of information from the remaining senses. This demonstrates the fascinating ability of the brain to change the tuning of its neurons through experience, a mechanism called brain plasticity. But what happens if a blind person recovers sight?

Feb 04, 2019
New paper published in JNeurosci
Representation of auditory motion directions and sound source locations in the human planum temporale

(Battal C., Rezk M., Mattioni S., Vadlamudi J., & Collignon O.)

The ability to compute the location and direction of sounds is a crucial perceptual skill for interacting efficiently with dynamic environments. How the human brain implements spatial hearing is, however, poorly understood. In our study, we used fMRI to characterize the brain activity of male and female humans listening to sounds moving left, right, up, and down, as well as to static sounds. Whole-brain univariate results contrasting moving and static sounds varying in their location revealed a robust functional preference for auditory motion in the bilateral human planum temporale (hPT). Using an independently localized hPT, we show that this region contains information about auditory motion directions and, to a lesser extent, sound source locations. Moreover, hPT showed an axis-of-motion organization reminiscent of the functional organization of the middle-temporal cortex (hMT+/V5) for vision. Importantly, whereas motion direction and location rely on partially shared pattern geometries in hPT, as demonstrated by successful cross-condition decoding, the responses elicited by static and moving sounds were nonetheless significantly distinct. Altogether, our results demonstrate that hPT codes for auditory motion and location, but that the underlying neural computation linked to motion processing is more reliable and partially distinct from the one supporting sound source location.
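The cross-condition decoding logic mentioned in the abstract can be sketched on synthetic "voxel" patterns: a classifier trained to discriminate one condition (moving sounds) is tested on another (static sounds), and above-chance transfer indicates a shared pattern geometry. This is a toy illustration, not the authors' analysis pipeline:

```python
# Illustrative sketch of cross-condition decoding on synthetic voxel patterns.
# Train on left vs. right *moving* sounds, test on left vs. right *static*
# sounds; above-chance transfer implies a shared spatial pattern geometry.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 40, 50

# A common voxel pattern (shared axis) distinguishes left from right
shared = rng.normal(size=n_voxels)

def simulate(labels, noise=1.0):
    # Each trial = label-dependent shift along the shared axis + Gaussian noise
    signs = np.where(labels == "left", 1.0, -1.0)
    return signs[:, None] * shared + noise * rng.normal(size=(len(labels), n_voxels))

labels = np.array(["left", "right"] * (n_trials // 2))
X_motion = simulate(labels)  # moving-sound trials
X_static = simulate(labels)  # static-sound trials

clf = LinearSVC().fit(X_motion, labels)  # train on motion directions
acc = clf.score(X_static, labels)        # test on static locations
print(f"cross-condition accuracy: {acc:.2f}")  # well above chance (0.5)
```

Because both conditions here share the same underlying axis by construction, the classifier transfers almost perfectly; in real data, partial transfer is what licenses the "partially shared pattern geometries" conclusion.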

Jan 18, 2019
New paper published in Cognition.
Sound symbolism in sighted and blind. The role of vision and orthography in sound-shape correspondences

(Bottini R., Barilari M., & Collignon O.)

Non-arbitrary sound-shape correspondences (SSC), such as the “bouba-kiki” effect, have been consistently observed across languages and, together with other sound-symbolic phenomena, challenge the classic linguistic dictum of the arbitrariness of the sign. Yet, it is unclear what makes a sound “round” or “spiky” to the human mind. Here we tested the hypothesis that visual experience is necessary for the emergence of SSC, supported by empirical evidence showing reduced SSC in visually impaired people. Results of two experiments comparing early blind and sighted individuals showed that SSC emerged strongly in both groups. Experiment 2, however, showed a partially different pattern of SSC in the sighted and the blind, mostly explained by a different effect of orthographic letter shape: the shape of written letters (spontaneously activated by spoken words) influenced SSC in the sighted, but not in the blind, who are exposed to an orthography (Braille) in which letters do not have spiky or round outlines. In sum, early blindness does not prevent the emergence of SSC, and differences between sighted and visually impaired people may be due to the indirect influence (or lack thereof) of orthographic letter shape.

Dec 14, 2018
New paper published in Neuroimage.
Recruitment of the occipital cortex by arithmetic processing follows computational bias in the congenitally blind

Arithmetic reasoning activates the occipital cortex of congenitally blind people (CB). This activation of visual areas may highlight the functional flexibility of occipital regions deprived of their dominant inputs or relate to the intrinsic computational role of specific occipital regions. We contrasted these competing hypotheses by characterising the brain activity of CB and sighted participants while performing subtraction, multiplication and a control letter task. In both groups, subtraction selectively activated a bilateral dorsal network commonly activated during spatial processing. Multiplication triggered activity in temporal regions thought to participate in memory retrieval. No between-group difference was observed for the multiplication task whereas subtraction induced enhanced activity in the right dorsal occipital cortex of the blind individuals only. As this area overlaps with regions showing selective tuning to auditory spatial processing and exhibits increased functional connectivity with a dorsal “spatial” network, our results suggest that the recruitment of occipital regions during high-level cognition in the blind actually relates to the intrinsic computational role of the activated regions.

Nov 13, 2018
New paper published in Cerebral Cortex
Hierarchical brain network for face and voice integration of emotion expression.

(Davies-Thompson J., Elli G., Rezk M., Benetti S., van Ackeren M., Collignon O.)

The brain has separate specialized computational units for processing faces and voices, located in occipital and temporal cortices. However, humans seamlessly integrate signals from the faces and voices of others for optimal social interaction. How are emotional expressions, when delivered by different sensory modalities (faces and voices), integrated in the brain? In this study, we characterized the brain's response to faces, voices, and combined face-voice information (congruent, incongruent), which varied in expression (neutral, fearful). Using a whole-brain approach, we found that only the right posterior superior temporal sulcus (rpSTS) responded more to bimodal stimuli than to face or voice alone, but only when the stimuli contained emotional expression. Face- and voice-selective regions of interest, extracted from independent functional localizers, similarly revealed multisensory integration in the face-selective rpSTS only; further, this was the only face-selective region that also responded significantly to voices. Dynamic Causal Modeling revealed that the rpSTS receives unidirectional information from the face-selective fusiform face area (FFA) and the voice-selective temporal voice area (TVA), with emotional expression affecting the connection strengths. Our study supports a hierarchical model of face and voice integration, with convergence in the rpSTS, and suggests that such integration depends on the (emotional) salience of the stimuli.

Sep 10, 2018
New paper accepted in Journal of Experimental Psychology: Human Perception and Performance
How visual experience and task context modulate the use of internal and external spatial coordinates for perception and action.

(Crollen V, Spruyt T, Mahau P, Bottini R, & Collignon O)

Recent studies proposed that the use of internal and external coordinate systems may be more flexible in congenitally blind than in sighted individuals. To investigate this hypothesis further, we asked congenitally blind and sighted people to perform, with the hands uncrossed and crossed over the body midline, a tactile temporal order judgment (TOJ) task and an auditory Simon task. Crucially, both tasks were carried out under task instructions favoring the use of either an internal (left vs. right hand) or an external (left vs. right hemispace) frame of reference. In the internal condition of the TOJ task, our results replicated previous findings (Röder et al., 2004) showing that hand crossing impaired only sighted participants' performance, suggesting that blind people did not activate a (conflicting) external frame of reference by default. However, under external instructions, a decrease in performance was observed in both groups, suggesting that even blind people activated an external coordinate system in this condition. In the Simon task, and in contrast with a previous study (Röder et al., 2007), both groups responded more efficiently when the sound was presented from the same side as the response (the “Simon effect”), independently of hand position. This was true under both the internal and external conditions, suggesting that blind and sighted people alike activated an external coordinate system by default in this task. Altogether, these data demonstrate how visual experience shapes the default weight attributed to internal and external coordinate systems for action and perception, depending on task demands.


Jul 06, 2018
New paper accepted in Scientific Reports
Light modulates oscillatory alpha activity in the occipital cortex of totally visually blind individuals with intact non-image-forming photoreception.

(Vandewalle G, van Ackeren M, Daneault V, Hull J, Albouy G, Lepore F, Doyon J, Czeisler C, Dumont M, Carrier J, Lockley S, and Collignon O)

The discovery of intrinsically photosensitive retinal ganglion cells (ipRGCs) marked a major shift in our understanding of how light information is processed by the mammalian brain. These ipRGCs influence multiple functions not directly related to image formation, such as circadian resetting and entrainment, pupil constriction, enhancement of alertness, and the modulation of cognition. More recently, it was demonstrated that ipRGCs may also contribute to basic visual functions. The impact of ipRGCs on visual function, independently of image-forming photoreceptors, remains difficult to isolate, however, particularly in humans. We previously showed that exposure to intense monochromatic blue light (465 nm) induced non-conscious light perception in a forced-choice task in three rare totally visually blind individuals without detectable rod and cone function, but who retained non-image-forming responses to light, very likely via ipRGCs. The neural foundation of such light perception in the absence of conscious vision is unknown, however. In this study, we characterized the brain activity of these three participants using electroencephalography (EEG), and demonstrate that unconsciously perceived light triggers an early and reliable transient desynchronization (i.e., decreased power) of the alpha EEG rhythm (8-14 Hz) over the occipital cortex. These results provide compelling insight into how ipRGCs may contribute to transient changes in ongoing brain activity. They suggest that occipital alpha rhythm synchrony, which is typically linked to the visual system, is modulated by ipRGC photoreception, a process that may contribute to the non-conscious light perception in these blind individuals.
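The alpha desynchronization measure described above (a transient drop in 8-14 Hz power after light onset) can be sketched on a synthetic occipital trace, by band-pass filtering and taking the analytic amplitude envelope. This is a generic textbook recipe on simulated data, not the authors' EEG pipeline, and all parameters (sampling rate, onset time) are assumed for illustration:

```python
# Illustrative sketch: quantifying alpha-band (8-14 Hz) desynchronization in a
# synthetic occipital EEG trace. Not the authors' pipeline; parameters assumed.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250.0                             # sampling rate in Hz (assumed)
t = np.arange(0, 4, 1 / fs)            # 4 s trace, simulated light onset at t = 2 s
alpha_amp = np.where(t < 2, 1.0, 0.5)  # desynchronization: alpha amplitude halves
noise = 0.1 * np.random.default_rng(1).normal(size=t.size)
eeg = alpha_amp * np.sin(2 * np.pi * 10 * t) + noise  # 10 Hz alpha + noise

# Band-pass 8-14 Hz, then take the analytic (Hilbert) amplitude envelope
b, a = butter(4, [8 / (fs / 2), 14 / (fs / 2)], btype="band")
envelope = np.abs(hilbert(filtfilt(b, a, eeg)))

pre = envelope[(t > 0.5) & (t < 1.5)].mean()   # baseline alpha amplitude
post = envelope[(t > 2.5) & (t < 3.5)].mean()  # post-onset alpha amplitude
print(f"alpha envelope: pre={pre:.2f}, post={post:.2f}")
```

On this simulated trace the post-onset envelope is roughly half the baseline, which is the signature of the event-related desynchronization the study reports.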
