Open Access
Predicting human visuomotor behaviour in a driving task
Author(s) -
Leif Johnson,
Brian Sullivan,
Mary Hayhoe,
Dana H. Ballard
Publication year - 2014
Publication title -
Philosophical Transactions of the Royal Society B: Biological Sciences
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.753
H-Index - 272
eISSN - 1471-2970
pISSN - 0962-8436
DOI - 10.1098/rstb.2013.0044
Subject(s) - gaze , computer science , task (project management) , softmax function , human–computer interaction , artificial intelligence , eye tracking , visual search , artificial neural network
The sequential deployment of gaze to regions of interest is an integral part of human visual function. Owing to its central importance, decades of research have focused on predicting gaze locations, but there has been relatively little formal attempt to predict the temporal aspects of gaze deployment in natural multi-tasking situations. We approach this problem by decomposing complex visual behaviour into individual task modules that require independent sources of visual information for control, in order to model human gaze deployment on different task-relevant objects. We introduce a softmax barrier model for gaze selection that uses two key elements: a priority parameter that represents task importance per module, and noise estimates that allow modules to represent uncertainty about the state of task-relevant visual information. Comparisons with human gaze data gathered in a virtual driving environment show that the model closely approximates human performance.
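The softmax barrier model described above can be sketched in a few lines of code. The sketch below is illustrative only, assuming a simple formulation in which each task module accumulates uncertainty (driven by its noise estimate) while unfixated, and gaze is allocated by a softmax over priority-weighted uncertainty; all names and parameters (`TaskModule`, `priority`, `noise_sd`, the linear uncertainty growth) are hypothetical stand-ins, not the paper's actual implementation.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a score vector."""
    z = x - np.max(x)
    e = np.exp(z)
    return e / e.sum()

class TaskModule:
    """One task module (e.g. lane-keeping or hazard monitoring) with a
    task-importance priority and a noise-driven uncertainty estimate
    about its task-relevant visual state. Hypothetical sketch."""
    def __init__(self, name, priority, noise_sd):
        self.name = name
        self.priority = priority      # task importance
        self.noise_sd = noise_sd      # noise estimate for this module's state
        self.uncertainty = 0.0

    def grow_uncertainty(self, dt=1.0):
        # Uncertainty accumulates while the module is not being fixated.
        self.uncertainty += self.noise_sd * dt

    def reset_uncertainty(self):
        # A fixation restores this module's state estimate.
        self.uncertainty = 0.0

def select_gaze(modules, rng):
    """Pick which module receives the next fixation: softmax over
    priority-weighted uncertainty gives a gaze probability per module."""
    scores = np.array([m.priority * m.uncertainty for m in modules])
    p = softmax(scores)
    return rng.choice(len(modules), p=p)

# Example: three concurrent driving sub-tasks competing for gaze.
rng = np.random.default_rng(0)
modules = [TaskModule("lane", 2.0, 0.5),
           TaskModule("lead_car", 1.5, 0.8),
           TaskModule("speedometer", 1.0, 0.3)]
for _ in range(10):
    for m in modules:
        m.grow_uncertainty()
    chosen = select_gaze(modules, rng)
    modules[chosen].reset_uncertainty()
```

Under this sketch, high-priority modules and modules whose state has grown uncertain are fixated more often, which is the qualitative behaviour the model is compared against human gaze data to test.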
