The ability to navigate in the world and execute appropriate behavioral responses depends critically on the contribution of the vestibular system to the detection of motion and spatial orientation. A complicating factor is that otolith afferents equivalently encode inertial and gravitational accelerations. Recent studies have demonstrated that the brain can resolve this sensory ambiguity by combining signals from both the otoliths and semicircular canal sensors, although it remains unknown how the brain integrates these sensory contributions to perform the nonlinear vector computations required to accurately detect head movement in space. Here, we illustrate how a physiologically relevant, nonlinear, integrative neural network could be used to perform the required computations for inertial motion detection along the interaural head axis. The proposed model not only can simulate recent behavioral observations, including a translational vestibuloocular reflex driven by the semicircular canals, but also accounts for several previously unexplained characteristics of central neural responses such as complex otolith-canal convergence patterns and the prevalence of dynamically processed otolith signals. A key model prediction, implied by the required computations for tilt-translation discrimination, is a coordinate transformation of canal signals from a head-fixed to a spatial reference frame. As a result, cell responses may reflect canal signal contributions that cannot be easily detected or distinguished from otolith signals. New experimental protocols are proposed to characterize these cells and identify their contributions to spatial motion estimation. The proposed theoretical framework makes an essential first link between the computations for inertial acceleration detection derived from the physical laws of motion and the neural response properties predicted in a physiologically realistic network implementation.
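The core computation the abstract refers to can be illustrated with the standard internal-model formulation of tilt-translation discrimination: otolith afferents measure the gravito-inertial specific force f = a − g, which is ambiguous on its own, but integrating the canal angular-velocity signal ω allows the brain to track the gravity vector in head coordinates (dĝ/dt = −ω × ĝ) and recover the inertial acceleration as â = f + ĝ. The sketch below is a minimal illustration of that computation, not the paper's network model; the function names, the Euler integration scheme, and the simulated motion profiles are assumptions introduced here for demonstration.

```python
import numpy as np

def disambiguate(f_series, omega_series, g0, dt):
    """Estimate inertial acceleration from otolith + canal signals.

    f_series:     (T, 3) gravito-inertial specific force in head
                  coordinates (f = a - g), the otolith signal.
    omega_series: (T, 3) head angular velocity in head coordinates,
                  the semicircular-canal signal.
    g0:           (3,) initial gravity vector in head coordinates.
    Returns (T, 3) estimated inertial acceleration a_hat = f + g_hat.
    """
    g_hat = np.asarray(g0, dtype=float).copy()
    a_hat = np.empty_like(np.asarray(f_series, dtype=float))
    for t in range(len(f_series)):
        # A space-fixed vector expressed in head coordinates rotates
        # opposite to the head: dg/dt = -omega x g (Euler step).
        g_hat = g_hat + dt * (-np.cross(omega_series[t], g_hat))
        a_hat[t] = f_series[t] + g_hat
    return a_hat

dt, T, g_mag = 1e-3, 1000, 9.81
g0 = np.array([0.0, 0.0, -g_mag])  # gravity points down (-z)

# Case 1: pure roll tilt at 30 deg/s, no translation (a = 0, f = -g).
omega_tilt = np.tile([np.deg2rad(30.0), 0.0, 0.0], (T, 1))
g_true = np.zeros((T, 3))
gv = g0.copy()
for t in range(T):
    gv = gv + dt * (-np.cross(omega_tilt[t], gv))
    g_true[t] = gv
a_hat_tilt = disambiguate(-g_true, omega_tilt, g0, dt)

# Case 2: pure interaural (y-axis) translation, head stationary.
a_true = np.tile([0.0, 2.0, 0.0], (T, 1))
a_hat_trans = disambiguate(a_true - g0, np.zeros((T, 3)), g0, dt)
```

With the canal signal available, the estimated inertial acceleration stays at zero throughout the tilt even though the otolith signal changes, while during translation it matches the true 2 m/s² interaural acceleration; without the ω term the tilt would be misread as translation, which is exactly the ambiguity the abstract describes.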