Modeling auditory-visual evoked eye-head gaze shifts in dynamic multisteps
Author(s) -
Bahadir Kasap,
A. John Van Opstal
Publication year - 2018
Publication title -
Journal of Neurophysiology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.302
H-Index - 245
eISSN - 1522-1598
pISSN - 0022-3077
DOI - 10.1152/jn.00502.2017
Subject(s) - gaze , computer science , eye movement , superior colliculus , saccadic masking , sensory system , stimulus (psychology) , computer vision , artificial intelligence , psychology , neuroscience , cognitive psychology
In dynamic visual or auditory gaze double-steps, a brief target flash or sound burst is presented in midflight of an ongoing eye-head gaze shift. Behavioral experiments in humans and monkeys have indicated that the subsequent eye and head movements to the target are goal-directed, regardless of stimulus timing, first gaze shift characteristics, and initial conditions. This remarkable behavior requires that the gaze-control system 1) has continuous access to accurate signals about eye-in-head position and ongoing eye-head movements, 2) accounts for different internal signal delays, and 3) is able to update the retinal (T_E) and head-centric (T_H) target coordinates into appropriate eye-centered and head-centered motor commands on millisecond time scales. As predictive, feedforward remapping of targets cannot account for this behavior, we propose that targets are transformed and stored into a stable reference frame as soon as their sensory information becomes available. We present a computational model, in which recruited cells in the midbrain superior colliculus drive eyes and head to the stored target location through a common dynamic oculocentric gaze-velocity command, which is continuously updated from the stable goal and transformed into appropriate oculocentric and craniocentric motor commands. We describe two equivalent, yet conceptually different, implementations that both account for the complex, but accurate, kinematic behaviors and trajectories of eye-head gaze shifts under a variety of challenging multisensory conditions, such as in dynamic visual-auditory multisteps.
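The abstract's core mechanism can be illustrated with a minimal one-dimensional sketch: a target flashed at retinal coordinates T_E during an ongoing gaze shift is immediately stored in a stable (world-centered) frame by adding the current gaze position, and the dynamic gaze-velocity command is then recomputed every millisecond from that stored goal. All variable names, gains, and numbers below are illustrative assumptions, not the paper's actual model equations.

```python
# Hedged 1-D sketch of dynamic target updating during a gaze shift.
# Assumed values: feedback gain, goal positions, and flash timing are
# illustrative only; the paper's model is far richer (eye + head plants,
# superior colliculus population code, signal delays).

DT = 0.001           # 1 ms time step (s)
GAIN = 50.0          # assumed feedback gain for gaze velocity (1/s)
FIRST_GOAL = 30.0    # deg, goal of the initial gaze shift
FLASH_TIME = 0.040   # s, target flash occurs mid-flight
T_E_AT_FLASH = 15.0  # deg, retinal eccentricity of the flash (assumed)

gaze = 0.0           # current gaze-in-space position (deg)
stored_target = None # world-centered goal, set once the flash arrives
t = 0.0

while t < 0.5:
    if stored_target is None and t >= FLASH_TIME:
        # Spatial updating: retinal error + current gaze position
        # yields a goal in a stable, gaze-independent frame.
        stored_target = gaze + T_E_AT_FLASH
    goal = FIRST_GOAL if stored_target is None else stored_target
    # Dynamic motor error, recomputed each millisecond from the
    # stored goal; gaze velocity is proportional to this error.
    motor_error = goal - gaze
    gaze += GAIN * motor_error * DT
    t += DT
```

Because the goal is stored in a gaze-independent frame and the motor error is recomputed continuously, the simulated gaze shift lands on the flashed target regardless of where along the first movement the flash occurred, which is the qualitative behavior the model is built to explain.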