Open Access
Did I do that? Detecting a perturbation to visual feedback in a reaching task
Author(s) -
Elon Gaffin-Cahn,
Todd E. Hudson,
Michael S. Landy
Publication year - 2019
Publication title -
Journal of Vision
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.126
H-Index - 113
ISSN - 1534-7362
DOI - 10.1167/19.1.5
Subject(s) - proprioception , visual feedback , hand position , perturbation (astronomy) , computer science , leverage (statistics) , psychology , physical medicine and rehabilitation , control theory (sociology) , computer vision , artificial intelligence , neuroscience , physics , control (management) , medicine , quantum mechanics
The motor system executes actions in a highly stereotyped manner despite the high number of degrees of freedom available. Studies of motor adaptation leverage this fact by disrupting, or perturbing, visual feedback to measure how the motor system compensates. To elicit detectable effects, perturbations are often large compared to trial-to-trial reach endpoint variability. However, awareness of large perturbations can elicit qualitatively different compensation processes than unnoticeable ones can. The current experiment measures the perturbation detection threshold, and investigates how humans combine proprioception and vision to decide whether displayed reach endpoint errors are self-generated only, or are due to experimenter-imposed perturbation. We scaled or rotated the position of the visual feedback of center-out reaches to targets and asked subjects to indicate whether visual feedback was perturbed. Subjects detected perturbations when they were at least 1.5 times the standard deviation of trial-to-trial endpoint variability. In contrast to previous studies, subjects suboptimally combined vision and proprioception. Instead of using proprioceptive input, they responded based on the final (possibly perturbed) visual feedback. These results inform methodology in motor system experimentation, and more broadly highlight the ability to attribute errors to one's own motor output and combine visual and proprioceptive feedback to make decisions.
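The sketch below is a minimal illustration, not the authors' code or model: it mimics the manipulation and decision rule described in the abstract (scaling/rotating the displayed reach endpoint, and reporting a perturbation whenever the displayed error exceeds about 1.5 times the trial-to-trial endpoint variability, i.e., a vision-only strategy that ignores proprioception). The function names, target distance, motor-noise standard deviation, and criterion value are all assumptions chosen for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb_feedback(endpoint, scale=1.0, rotation_deg=0.0):
    """Scale and/or rotate the displayed reach endpoint about the start
    position (the origin), mimicking the feedback manipulation described."""
    theta = np.deg2rad(rotation_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return scale * (rot @ endpoint)

def vision_only_report(visual_endpoint, target, sigma, criterion=1.5):
    """Report 'perturbed' when the displayed endpoint error exceeds a
    criterion multiple of trial-to-trial endpoint variability (sigma),
    ignoring proprioception -- the suboptimal strategy the abstract reports."""
    return np.linalg.norm(visual_endpoint - target) > criterion * sigma

# One simulated center-out reach: target 10 cm to the right, isotropic motor noise.
target = np.array([10.0, 0.0])
sigma = 0.8                                    # assumed endpoint SD (cm)
true_endpoint = target + rng.normal(0.0, sigma, size=2)

# Apply a gain perturbation that displaces the feedback by roughly 1.5 * sigma,
# i.e., near the detection threshold reported in the abstract.
feedback = perturb_feedback(true_endpoint,
                            scale=1.0 + 1.5 * sigma / np.linalg.norm(target))

print("subject reports perturbation:", vision_only_report(feedback, target, sigma))
```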
