Auditory behavior, perception, and cognition are all shaped by information from other sensory systems. This volume examines this multisensory view of auditory function at levels of analysis ranging from the single neuron to neuroimaging in human clinical populations.
Visual Influence on Auditory Perception Adrian K.C. Lee and Mark T. Wallace
Cue Combination within a Bayesian Framework David Alais and David Burr
Toward a Model of Auditory-Visual Speech Intelligibility Ken W. Grant and Joshua G. W. Bernstein
An Object-based Interpretation of Audiovisual Processing Adrian K.C. Lee, Ross K. Maddox, and Jennifer K. Bizley
Hearing in a “Moving” Visual World: Coordinate Transformations Along the Auditory Pathway Shawn M. Willett, Jennifer M. Groh, and Ross K. Maddox
Multisensory Processing in the Auditory Cortex Andrew J. King, Amy Hammond-Kenny, and Fernando R. Nodal
Audiovisual Integration in the Primate Prefrontal Cortex Bethany Plakke and Lizabeth M. Romanski
Using Multisensory Integration to Understand Human Auditory Cortex Michael S. Beauchamp
Combining Voice and Face Content in the Primate Temporal Lobe Catherine Perrodin and Christopher I. Petkov
Neural Network Dynamics and Audiovisual Integration Julian Keil and Daniel Senkowski
Cross-Modal Learning in the Auditory System Patrick Bruns and Brigitte Röder
Multisensory Processing Differences in Individuals with Autism Spectrum Disorder Sarah H. Baum Miller and Mark T. Wallace
Adrian K.C. Lee is Associate Professor in the Department of Speech & Hearing Sciences and the Institute for Learning and Brain Sciences at the University of Washington, Seattle
Mark T. Wallace is the Louise B. McGavock Endowed Chair and Professor in the Departments of Hearing and Speech Sciences, Psychiatry, and Psychology, and Director of the Vanderbilt Brain Institute at Vanderbilt University, Nashville
Allison B. Coffin is Associate Professor in the Department of Integrative Physiology and Neuroscience at Washington State University, Vancouver, WA
Arthur N. Popper is Professor Emeritus and Research Professor in the Department of Biology at the University of Maryland, College Park
Richard R. Fay is Distinguished Research Professor of Psychology at Loyola University, Chicago
Table of Contents
Preface.- Visual Influence on Auditory Perception.- Cue Combination Within a Bayesian Framework.- Toward a Model of Auditory-Visual Speech Intelligibility.- An Object-Based Interpretation of Audiovisual Processing.- Hearing in a “Moving” Visual World: Coordinate Transformations Along the Auditory Pathway.- Multisensory Processing in the Auditory Cortex.- Audiovisual Integration in the Primate Prefrontal Cortex.- Using Multisensory Integration to Understand the Human Auditory Cortex.- Combining Voice and Face Content in the Primate Temporal Lobe.- Neural Network Dynamics and Audiovisual Integration.- Cross-Modal Learning in the Auditory System.- Multisensory Processing Differences in Individuals with Autism Spectrum Disorder.
About the Authors
Dr. Adrian K.C. Lee is Director of the Center for Auditory Neuroimaging and Assistant Professor in the Department of Speech & Hearing Sciences and the Institute for Learning & Brain Sciences at the University of Washington. The Lee lab is interested in mapping the cortical dynamics associated with auditory attention, as well as in how the oculomotor and visual attentional networks interact with auditory attention.
Dr. Mark T. Wallace is Director of the Vanderbilt Brain Institute and Professor of Hearing and Speech Sciences, Psychology, and Psychiatry at Vanderbilt University. The Wallace lab is interested in better understanding how the brain synthesizes information from multiple sensory systems and employs an array of approaches ranging from neurophysiology in animal models to neuroimaging in “typical” and clinical populations.