Balancing the Senses: Electrophysiological Responses Reveal the Interplay between Somatosensory and Visual Processing During Body-Related Multisensory Conflict
Rossi Sebastiano, Alice; Poles, Karol; Gualtiero, Stefano; Romeo, Marcella; Galigani, Mattia; Bruno, Valentina; Fossataro, Carlotta; Garbarini, Francesca
2024-01-01
Abstract
In the study of bodily awareness, predictive coding theory has revealed that our brain continuously modulates sensory experiences to integrate them into a unitary body representation. Indeed, during multisensory illusions (e.g., the rubber hand illusion, RHI), the synchronous stroking of the participant's concealed hand and a visible fake one creates a visuotactile conflict, generating a prediction error. Within the predictive coding framework, prediction errors are resolved through the modulation of sensory processing, inducing participants to feel as if touches originated from the fake hand, thus ascribing the fake hand to their own body. Here, we aimed to address sensory processing modulation under multisensory conflict by disentangling the processing of somatosensory and visual stimuli, which are intrinsically associated during illusion induction. To this aim, we designed two EEG experiments in which somatosensory-evoked potentials (SEPs; Experiment 1; N = 18; F = 10) and visual-evoked potentials (VEPs; Experiment 2; N = 18; F = 9) were recorded in human males and females following the RHI. Our results show that, in both experiments, ERP amplitude is significantly modulated in the illusion condition as compared with both control and baseline conditions, with a modality-dependent diametrical pattern: decreased SEP amplitude and increased VEP amplitude. Importantly, both somatosensory and visual modulations occur in long-latency time windows previously associated with tactile and visual awareness, thus explaining the illusion of perceiving touch at the sight location. In conclusion, we describe a diametrical modulation of somatosensory and visual processing as the neural mechanism that maintains a stable body representation by restoring visuotactile congruency when multisensory conflicts occur.
Significance Statement: Given the inherent relationship between touch and body, the literature on body representation has mainly focused on the somatosensory system, whereas less attention has been paid to the visual system. Here, we investigate the modulation of both somatosensory and visual processing during a well-known multisensory illusion (i.e., the rubber hand illusion, RHI), in which a visuotactile conflict is employed to challenge body representation. By recording electroencephalography, we show how the processing of somatosensory and visual stimuli is diametrically modulated during the RHI, with a decrease of the former and an increase of the latter. We conclude that this neural mechanism is triggered to restore visuotactile congruency, leading to the illusory feeling of perceiving touch at the sight location.