One major challenge in determining how the brain categorizes objects is to tease apart the contributions of low-level and high-level visual properties to behavioral and brain imaging data. So far, studies using stimuli with equated amplitude spectra have shown that the visual system relies mostly on localized information, such as edges and contours, carried by phase information. However, some researchers have argued that some event-related potential (ERP) and blood-oxygen-level-dependent (BOLD) categorical differences could be driven by nonlocalized information contained in the amplitude spectrum. The goal of this study was to provide the first systematic quantification of the contributions of the phase and amplitude spectra to early ERPs to faces and objects. We conducted two experiments in which we recorded the electroencephalogram (EEG) from eight subjects, each tested in two sessions. In the first experiment, participants viewed images of faces and houses containing original or scrambled phase spectra combined with original, averaged, or swapped amplitude spectra. In the second experiment, we parametrically manipulated image phase and amplitude in 10% intervals. We performed a range of analyses, including detailed single-subject general linear modeling of ERP data, test-retest reliability, and unique variance analyses. Our results suggest that early ERPs to faces and objects are driven by phase information, with almost no contribution from the amplitude spectrum. Importantly, our results should not be used to justify uncontrolled stimuli; to the contrary, they emphasize the need for stimulus control (including the amplitude spectrum), parametric designs, and systematic data analyses, of which we have seen far too little in ERP vision research.
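The parametric phase manipulation described above can be illustrated with a short sketch. The abstract does not specify the authors' exact implementation; the function below is a common, hypothetical approach using NumPy: decompose an image with a 2D FFT, keep the amplitude spectrum intact, and linearly mix the original phase with a random phase field in adjustable steps (e.g., 10% intervals). The random phase is taken from the FFT of real-valued noise so that it retains the conjugate symmetry needed to reconstruct a (nearly) real image.

```python
import numpy as np

def scramble_phase(image, mix=1.0, rng=None):
    """Scramble a fraction `mix` (0.0-1.0) of an image's phase spectrum
    while leaving its amplitude spectrum untouched.

    This is a hypothetical sketch of the kind of parametric phase
    manipulation described in the abstract, not the authors' code.
    """
    rng = np.random.default_rng() if rng is None else rng
    f = np.fft.fft2(image)
    amplitude = np.abs(f)          # kept constant across mix levels
    phase = np.angle(f)
    # Phase of real noise is Hermitian-antisymmetric, so the
    # reconstruction stays (approximately) real-valued.
    random_phase = np.angle(np.fft.fft2(rng.random(image.shape)))
    # Linear interpolation between original and random phase;
    # mix=0.1, 0.2, ... mimics manipulation in 10% steps.
    new_phase = (1.0 - mix) * phase + mix * random_phase
    scrambled = np.fft.ifft2(amplitude * np.exp(1j * new_phase))
    return np.real(scrambled)
```

A usage note: calling `scramble_phase(img, mix=m)` for `m` in `np.arange(0, 1.1, 0.1)` yields a stimulus series in which local structure (edges, contours) degrades progressively while the amplitude spectrum, and hence the global spatial-frequency content, is held constant; at intermediate `mix` values the linear interpolation of wrapped phase is only an approximation, which is one reason this should be read as a sketch rather than a reference implementation.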