Traditionally, a large proportion of perceptual research has assumed that cortical regions are specialized for processing stimuli in a single sensory modality. Perception in everyday life, however, usually draws on inputs from multiple sensory channels. Recently, the question of how the brain integrates multisensory information has become the focus of a growing number of neuroscientific investigations. This work has identified both dedicated multisensory integration regions and crossmodal influences in brain areas traditionally thought to be specific to one sensory modality. Several factors that enhance integration have also been identified, such as spatio-temporal stimulus coincidence and semantic congruency.

Multisensory Object Perception in the Primate Brain elucidates the mechanisms of multisensory integration of object-related information, with a focus on the visual, auditory, and tactile modalities. Evidence is presented in four sections (methodological considerations, audio-visual processing, visuo-tactile processing, and plasticity) and includes studies of both human and nonhuman primates at different levels of analysis. Approaches range from intracranial electrophysiological recordings to non-invasive electro- and magnetoencephalography, functional magnetic resonance imaging, behavioral methods, and computational modeling.