Motion perception by a moving observer in a three-dimensional environment
Author(s) -
Lucile Dupin,
Mark Wexler
Publication year - 2013
Publication title -
Journal of Vision
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.126
H-Index - 113
ISSN - 1534-7362
DOI - 10.1167/13.2.15
Subject(s) - observer (physics) , parallax , motion perception , computer vision , motion (physics) , artificial intelligence , perception , movement (music) , computer science , optical flow , parsing , communication , mathematics , psychology , physics , acoustics , quantum mechanics , neuroscience , image (mathematics)
Perceiving three-dimensional object motion while moving through the world is hard: not only must optic flow be segmented and parallax resolved into shape and motion, but observer motion must also be taken into account in order to perceive absolute, rather than observer-relative, motion. To simplify this last step, it has recently been suggested that if the visual background is stationary, then foreground object motion, computed relative to the background, directly yields absolute motion. A series of studies with immobile observers and optic flow simulating observer movement has provided evidence that observers actually use this so-called "flow parsing" strategy (Rushton & Warren, 2005). We test this hypothesis using mobile observers (as well as immobile ones) who judge the motion in depth of a foreground object in the presence of a stationary or moving background. We find that background movement does influence motion perception, but not as much as predicted by the flow-parsing hypothesis. Thus, we find evidence that, in order to perceive absolute motion, observers partly use flow parsing but also compensate for egocentric motion using a global self-motion estimate.
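As a rough sketch of the partial-compensation account summarized in the abstract (our notation, not the authors'; the gain terms \alpha and \beta are hypothetical), the perceived absolute motion of the object can be written as a weighted correction of its retinal motion:

\hat{v}_{\text{object}} = v_{\text{retinal}} - \alpha \, v_{\text{background}} - \beta \, \hat{v}_{\text{self}}

Here pure flow parsing corresponds to \alpha = 1, \beta = 0 (motion referenced entirely to the visual background), while full self-motion compensation corresponds to \alpha = 0, \beta = 1; the results described above suggest an intermediate weighting between the two.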