IFB Integrated Center for Research and Treatment of Vertigo, Balance, and Oculomotor Disorders
Spatial orientation is our sense of self-motion through the world and of our orientation relative to the constant force of gravity. Many of our daily activities, such as walking, driving, or riding a bike, depend critically on spatial orientation estimates that are continuously and effortlessly maintained by dedicated sensory and cognitive processes that most of us take for granted. The broad aim of research in my lab is to achieve a better understanding of these processes.
The best sensory information for estimating spatial orientation comes from visual and vestibular sensory systems. The vestibular system is composed of tiny, biomechanical linear and angular accelerometers in the inner ear. Visual information relevant to spatial orientation is derived primarily from optic flow, the characteristic pattern of visual motion generated when the eye moves relative to the stationary world.
A particular focus of research in the lab is to determine how precisely the nervous system can estimate motion and orientation based on visual and vestibular stimuli. We therefore measure detection and discrimination performance using a virtual reality motion simulator consisting of a hexapod motion platform and attached visual display.
Another focus is to quantify the natural statistical properties of visual and vestibular stimulation in real-world environments. These measurements are made using a custom head-mounted device that records synchronized information about head motion, eye movements, and visual stimulation during everyday activities such as walking or riding a bike.
Together, these measurements allow us to determine the dynamic range of natural stimulation and the corresponding dynamic range of human sensitivity. Such measurements are prerequisites for modeling spatial orientation perception and action in a probabilistic (e.g., Bayesian) framework.
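A standard building block of such probabilistic models is reliability-weighted (maximum-likelihood) cue combination, in which visual and vestibular estimates are averaged in proportion to their precision. The sketch below is a generic illustration of that rule with made-up numbers, not a model or result from the lab.

```python
# Hypothetical sketch of the standard Bayesian (maximum-likelihood) rule
# for combining two independent Gaussian cues, as often applied to
# visual-vestibular integration. All numbers are illustrative.
def combine(mu_vis, sigma_vis, mu_vest, sigma_vest):
    # Each cue is weighted by its relative reliability (inverse variance).
    w_vis = sigma_vest**2 / (sigma_vis**2 + sigma_vest**2)
    mu = w_vis * mu_vis + (1.0 - w_vis) * mu_vest
    # The combined estimate is more precise than either cue alone.
    sigma = (sigma_vis**-2 + sigma_vest**-2) ** -0.5
    return mu, sigma

# Visual heading estimate of 2 deg (sigma 1) vs. vestibular 0 deg (sigma 2).
mu, sigma = combine(2.0, 1.0, 0.0, 2.0)
print(mu, sigma)  # the estimate is pulled toward the more reliable visual cue
```

Measured thresholds supply the sigma values, which is why characterizing sensitivity across the natural stimulus range is a prerequisite for this kind of modeling.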
Cuturi LF, MacNeilage PR (2014) Optic flow induces nonvisual self-motion aftereffects. Curr Biol 24(23):2817-2821.
Cuturi LF, MacNeilage PR (2013) Systematic biases in human heading estimation. PLoS One 8(2): e56862. doi: 10.1371/journal.pone.0056862.
MacNeilage PR, Zhang Z, DeAngelis GC, Angelaki DE (2012) Vestibular facilitation of optic flow parsing. PLoS One 7(7): e40264. doi: 10.1371/journal.pone.0040264.
MacNeilage PR, Dokka K, DeAngelis GC, Angelaki DE (2011) Estimating distance during self-motion: a role for visual-vestibular interactions. J Vis 11(13).
MacNeilage PR, Banks MS, DeAngelis GC, Angelaki DE (2010) Vestibular heading discrimination and sensitivity to linear acceleration in head and world coordinates. J Neurosci 30(27):9084-9094.
MacNeilage PR, Turner AH, Angelaki DE (2010) Canal-otolith interactions and detection thresholds of linear and angular components during curved-path self-motion. J Neurophysiol 104(2):765-773.
MacNeilage PR, Ganesan N, Angelaki DE (2008) Computational approaches to spatial orientation: from transfer functions to dynamic Bayesian inference. J Neurophysiol 100(6):2981-2996.
MacNeilage PR, Banks MS, Berger DR, Bülthoff HH (2007) A Bayesian model of the disambiguation of gravitoinertial force by visual cues. Exp Brain Res 179(2):263-290.
Primary Technique(s): virtual reality motion simulator (hexapod motion platform and attached visual display), head-mounted device measurements, modeling spatial orientation perception, Bayesian probabilistic framework
Model Organism(s): human sensory system