The ability to make accurate and timely decisions, such as judging when it is safe to cross
the road, is the foundation of adaptive behaviour. While the computational and neural
processes supporting simple decisions on isolated stimuli have been well characterised, in
the real world, decision-making often requires the integration of discrete sensory events over
time and space. When crossing the road, for example, the locations and speeds of several
cars must be considered. It remains unclear how such integrative perceptual decisions are
regulated computationally. Here we used psychophysics, electroencephalography and
computational modelling to understand how the human brain combines visual motion
signals across space. We directly tested competing predictions arising from influential serial
and parallel accounts of visual processing. Using a biologically plausible model of motion
filtering, we find evidence in favour of parallel integration as the fundamental
computational mechanism regulating integrated perceptual decisions.
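The contrast between serial and parallel accounts can be illustrated with a minimal bounded-accumulator simulation. This is a hedged sketch, not the model used in the study: the drift, noise, and bound values are arbitrary illustrative choices, and the alternating-attention rule for the serial case is one simple assumption among many possible ones. It shows why the two accounts predict different decision times when two motion streams must be combined.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(mode, drift=(0.05, 0.05), noise=1.0, bound=10.0, max_t=5000):
    """Accumulate evidence from two motion streams until a bound is crossed.

    mode='parallel': both streams feed the accumulator on every step.
    mode='serial':   attention alternates, sampling one stream per step
                     (a hypothetical serial-sampling rule for illustration).
    Returns the decision time in steps, or max_t if no crossing occurs.
    """
    x = 0.0
    for t in range(max_t):
        if mode == "parallel":
            # sum the momentary evidence from both streams
            x += sum(d + noise * rng.standard_normal() for d in drift)
        else:
            # sample only one stream this step
            d = drift[t % 2]
            x += d + noise * rng.standard_normal()
        if abs(x) >= bound:
            return t + 1
    return max_t

par = np.mean([simulate("parallel") for _ in range(200)])
ser = np.mean([simulate("serial") for _ in range(200)])
print(f"mean decision time, parallel: {par:.1f} steps")
print(f"mean decision time, serial:   {ser:.1f} steps")
```

Under these assumptions the serial sampler receives half the drift and half the variance per step, so its decision times are roughly double those of the parallel accumulator; behavioural and EEG signatures of this kind of timing difference are what allow the two accounts to be tested against each other.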