
How Do Sound and Vision Integrate? Visual ERPs With and Without Simultaneous Auditory Inputs

Presenter Name: Shelby Howlett

School/Affiliation: Brock University

Co-Authors: Sidney J. Segalowitz

Abstract:

Auditory and visual inputs normally travel segregated paths to their sensory cortices. We examined ERPs to a visual stimulus stream (a simple oddball task with squares and diamonds) presented at an SOA of 500 ms (2 Hz), paired with an auditory stimulus stream (repeated tones) at 1.42, 2, or 2.82 Hz, along with unimodal control conditions. We asked whether auditory input alters visual sensory ERP responses, focusing on the P1, N170, and P2 components. Participants responded to visual oddballs and ignored the auditory sequence. As expected, auditory sequences presented alone yielded no visual ERP components. ERPs to the visual sequence alone were then compared with ERPs to the visual stimuli coupled with faster, simultaneous, or slower auditory sequences, testing the effects of conflicting and nonconflicting contexts against the visual-alone baseline. The visual-alone condition yielded the largest amplitudes. Regarding whether auditory input influences visual ERPs, the modalities appeared temporally segregated: the slower and faster auditory sequences did not alter the timing of the visual ERP components. The signals were nonetheless integrated in amplitude: the asynchronous auditory streams elicited ERP amplitudes that were, unexpectedly, larger than those for simultaneous presentation and mostly indistinguishable from the visual-alone condition (only the N170, and only at a few sites, was significantly larger for visual-alone). Thus visual ERP amplitudes were affected only when the two modalities were coordinated in time.
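As a minimal sketch (not part of the poster) of the arithmetic behind the reported rates, the snippet below converts each presentation rate into its SOA and checks, over a short illustrative window, which auditory onsets coincide with the 2 Hz visual onsets. The 4-second window and the rounding to whole milliseconds are assumptions made purely for illustration.

```python
def soa_ms(rate_hz):
    """SOA in milliseconds for a given presentation rate in Hz."""
    return 1000.0 / rate_hz

def coincident_onsets(aud_rate_hz, vis_rate_hz=2.0, window_ms=4000):
    """Onsets (ms, rounded) at which auditory and visual streams coincide.

    window_ms is an illustrative assumption, not a parameter of the study.
    """
    aud = {round(n * soa_ms(aud_rate_hz))
           for n in range(int(window_ms / soa_ms(aud_rate_hz)) + 1)}
    vis = {round(n * soa_ms(vis_rate_hz))
           for n in range(int(window_ms / soa_ms(vis_rate_hz)) + 1)}
    return sorted(aud & vis)

# Rates from the abstract: slower, simultaneous, and faster auditory streams
for rate in (1.42, 2.0, 2.82):
    print(f"{rate} Hz: SOA {soa_ms(rate):.0f} ms, "
          f"coincident onsets (ms): {coincident_onsets(rate)}")
```

Only the 2 Hz auditory stream stays locked to the visual onsets; at 1.42 and 2.82 Hz the streams share only their initial onset, which is what makes those conditions asynchronous.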
