
An analysis of audio-visual crossmodal integration by means of event-related potential (ERP) recordings

DI RUSSO F;
2002-01-01

Abstract

Crossmodal integration was studied in humans by presenting random sequences of auditory (brief noise bursts), visual (flashes), and audiovisual (simultaneous noise bursts and flashes) stimuli from a central location at irregular intervals between 600 and 800 ms. The subjects' task was to press a button to infrequent and unpredictable (P=0.15) target stimuli that could be either a more intense noise burst, a brighter flash, or a combination of the two. In accordance with previous studies, behavioral data showed that bimodal target stimuli were responded to much faster and were identified more accurately than the unimodal target stimuli. The neural basis of this crossmodal interaction was investigated by subtracting the ERPs to the auditory (A) and the visual (V) stimuli alone from the ERP to the combined audiovisual (AV) stimuli (i.e., interaction = AV − (A + V)). Using this approach, we replicated previous reports of both early (at around 40 ms) and late (after 100 ms) ERP interaction effects. However, it appears that the very early interaction effects can be largely accounted for by an anticipatory ERP that precedes both the unimodal and bimodal stimuli. In calculating the ERP interaction this slow shift is subtracted twice, resulting in an apparent shift of the opposite polarity that may be confounded with actual crossmodal interactions.
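The anticipatory-potential confound described above is a matter of simple arithmetic: any component common to all three ERPs enters the interaction term AV − (A + V) once positively and twice negatively, leaving a residual of opposite polarity even when no true interaction exists. The following numpy sketch illustrates this with synthetic signals (not the study's data; waveform shapes and the −2 µV/s drift are arbitrary assumptions for illustration):

```python
import numpy as np

# Synthetic 0-300 ms epoch; all values are illustrative, not recorded data.
t = np.linspace(0.0, 0.3, 301)

# Anticipatory slow shift present in every condition (hypothetical ramp, in uV).
drift = -2.0 * t

# Toy unimodal responses, each riding on the same anticipatory drift.
resp_a = np.sin(2 * np.pi * 10 * t)   # auditory-evoked component
resp_v = np.cos(2 * np.pi * 8 * t)    # visual-evoked component
erp_a = drift + resp_a
erp_v = drift + resp_v

# Bimodal ERP assuming NO crossmodal interaction: both evoked components
# plus a single copy of the anticipatory shift.
erp_av = drift + resp_a + resp_v

# The standard interaction term subtracts the drift twice but adds it once,
# so the residual is -drift rather than zero.
interaction = erp_av - (erp_a + erp_v)
print(np.allclose(interaction, -drift))  # True: a spurious opposite-polarity shift
```

Because the residual equals the anticipatory shift with inverted sign, a slow negative expectancy wave (such as a CNV-like potential) would masquerade as an early positive "interaction" effect in the difference waveform.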
Keywords: ERP

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14244/5146
Citations
  • Scopus: 275