Perspective Modulates Temporal Synchrony Discrimination of Visual and Proprioceptive Information in Self-Generated Movements
Author(s) -
Adria E. N. Hoover,
Laurence R. Harris
Publication year - 2011
Publication title -
i-perception
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.64
H-Index - 26
ISSN - 2041-6695
DOI - 10.1068/ic926
Subject(s) - proprioception, asynchrony (computer programming), perspective (graphical), efferent, movement (music), psychology, afferent, communication, sensory system, multisensory integration, efference copy, cognitive psychology, computer vision, computer science, artificial intelligence, neuroscience, asynchronous communication, computer network, philosophy, aesthetics
Temporally congruent sensory information during a movement and the visual perspective in which we see our movement provide cues for self-recognition. Here we measured threshold and sensitivity for delay detection between a self-generated finger movement (proprioceptive and efferent copy) and the visual image of that movement under differing perspectives. Asynchrony was detected more easily (45–60 ms) when the hand was viewed in an egocentric perspective, even if its image was mirrored so that it appeared to be the other hand. Significantly longer delays (80–100 ms) were needed to detect asynchrony when the hand was viewed in an allocentric (or inverted) perspective. These effects were replicated when the movement was seen as if looking in a mirror and when the non-dominant hand was used. We conclude that the tolerance for temporally matching visual, proprioceptive, and efferent copy information that informs about the perceived position of body parts depends on whether one is viewing one's own body or someone else's.
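The abstract reports delay-detection thresholds and sensitivity. As a hedged illustration (not the authors' analysis code), the sketch below shows one common way such quantities could be estimated: fitting a cumulative-Gaussian psychometric function to the proportion of "asynchronous" responses across visual delays, and computing sensitivity (d') from hit and false-alarm rates. All delay values and response counts in the example are hypothetical placeholders.

```python
# Minimal sketch of threshold and sensitivity estimation for delay detection.
# Assumption: hypothetical data; this is not the procedure used in the paper.
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

# Hypothetical data: imposed visual delays (ms) and proportion of
# "asynchronous" responses at each delay.
delays_ms = np.array([0, 33, 66, 100, 133, 166, 200])
p_detect = np.array([0.05, 0.20, 0.45, 0.70, 0.85, 0.95, 0.98])

# Psychometric function: cumulative Gaussian with mean mu (50% threshold)
# and standard deviation sigma (inverse slope).
def psychometric(x, mu, sigma):
    return norm.cdf(x, loc=mu, scale=sigma)

(mu, sigma), _ = curve_fit(psychometric, delays_ms, p_detect, p0=[80, 40])
print(f"50% detection threshold: {mu:.1f} ms (sigma = {sigma:.1f} ms)")

# Sensitivity (d') from signal detection theory, using hypothetical counts:
# hits/misses on delayed trials, false alarms/correct rejections on
# synchronous trials.
hits, misses = 42, 8
fas, crs = 6, 44
hit_rate = hits / (hits + misses)
fa_rate = fas / (fas + crs)
d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
print(f"d' = {d_prime:.2f}")
```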