Multisensory information about changing object properties can be used to quickly correct predictive force scaling for object lifting
Author(s) - Vonne van Polanen
Publication year - 2022
Publication title - Experimental Brain Research
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.782
H-Index - 172
eISSN - 1432-1106
pISSN - 0014-4819
DOI - 10.1007/s00221-022-06404-9
Subject(s) - grasp , haptic technology , sensory system , stimulus modality , lift , modality (human–computer interaction) , psychology , human–computer interaction , cognitive psychology
Sensory information about object properties, such as size or material, can be used to estimate object weight and to generate an accurate motor plan for lifting the object. When object properties change, the motor plan must be corrected based on the new information. The current study investigated whether such corrections could be made quickly, after the movement was initiated. Participants had to grasp and lift objects of different weights, where the weight was indicated by different cues. During the reaching phase, the cue could change to indicate a different weight, and participants had to quickly adjust their planned forces in order to lift the object skilfully. The object weight was cued with different object sizes (Experiment 1) or materials (Experiment 2), and the cue was presented in different sensory modality conditions: visual, haptic or both (visuohaptic). Results showed that participants could adjust their planned forces based on both size and material. Furthermore, corrections could be made in the visual, haptic and visuohaptic conditions, although the multisensory condition did not outperform the unisensory conditions. These results suggest that motor plans can be quickly corrected based on sensory information about object properties from different sensory modalities. These findings provide insights into the information that can be shared between brain areas for the online control of hand-object interactions.
Accelerating Research
Robert Robinson Avenue,
Oxford Science Park, Oxford
OX4 4GP, United Kingdom