Perception of Human Interaction Based on Motion Trajectories: From Aerial Videos to Decontextualized Animations
Author(s) - Tianmin Shu, Yujia Peng, Lifeng Fan, Hongjing Lu, Song-Chun Zhu
Publication year - 2018
Publication title - Topics in Cognitive Science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.191
H-Index - 56
eISSN - 1756-8765
pISSN - 1756-8757
DOI - 10.1111/tops.12313
Subject(s) - motion (physics), perception, artificial intelligence, computer vision, animation, motion perception, computer science, biological motion, psychology, communication, computer graphics (images), neuroscience
People are adept at perceiving interactions from movements of simple shapes, but the underlying mechanism remains unknown. Previous studies have often used object movements defined by experimenters. The present study used aerial videos recorded by drones in a real-life environment to generate decontextualized motion stimuli. Motion trajectories of displayed elements were the only visual input. We measured human judgments of interactiveness between two moving elements, as well as the dynamic change in such judgments over time. A hierarchical model was developed to account for human performance in this task. The model represents interactivity using latent variables and learns the distribution of critical movement features that signal potential interactivity. The model provides a good fit to human judgments and generalizes to the original Heider and Simmel (1944) animations. It can also synthesize decontextualized animations with a controlled degree of interactiveness, providing a viable tool for studying animacy and social perception.
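
To make the abstract's modeling idea concrete, below is a minimal Python sketch of the general approach it describes: extract movement features from two trajectories, then recursively update a posterior over a latent "interactive vs. independent" variable so the judgment evolves over time. This is not the paper's actual hierarchical model; the specific features (relative distance, approach rate, heading alignment), the diagonal-Gaussian likelihoods, and the class parameters below are illustrative assumptions, whereas the paper learns its feature distributions from aerial-video data.

```python
import numpy as np

def movement_features(a, b):
    """Per-frame features from two trajectories a, b of shape (T, 2):
    relative distance, its rate of change, and heading alignment.
    (Illustrative feature set, not the paper's.)"""
    dist = np.linalg.norm(a - b, axis=1)                 # (T,)
    ddist = np.gradient(dist)                            # approach/retreat rate
    va, vb = np.gradient(a, axis=0), np.gradient(b, axis=0)
    na = va / (np.linalg.norm(va, axis=1, keepdims=True) + 1e-8)
    nb = vb / (np.linalg.norm(vb, axis=1, keepdims=True) + 1e-8)
    align = np.sum(na * nb, axis=1)                      # cosine of heading angle
    return np.stack([dist, ddist, align], axis=1)        # (T, 3)

def online_interactivity_posterior(feats, mu, var, prior=0.5):
    """Recursive Bayesian update of P(interactive | features up to t),
    assuming conditionally independent diagonal-Gaussian likelihoods
    for class 0 (independent) and class 1 (interactive)."""
    log_odds = np.log(prior / (1 - prior))
    posts = []
    for f in feats:
        def loglik(k):
            return -0.5 * np.sum((f - mu[k]) ** 2 / var[k]
                                 + np.log(2 * np.pi * var[k]))
        log_odds += loglik(1) - loglik(0)
        posts.append(1.0 / (1.0 + np.exp(-log_odds)))
    return np.array(posts)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T = 200
    a = np.cumsum(rng.normal(0, 0.1, (T, 2)), axis=0)    # wandering agent
    b = np.zeros((T, 2))
    for t in range(1, T):                                 # second agent chases the first
        b[t] = b[t - 1] + 0.05 * (a[t - 1] - b[t - 1]) + rng.normal(0, 0.05, 2)
    feats = movement_features(a, b)
    # Hypothetical per-class feature statistics; learned from data in the paper.
    mu = {0: np.array([5.0, 0.0, 0.0]), 1: np.array([1.0, -0.05, 0.5])}
    var = {0: np.array([9.0, 0.1, 0.3]), 1: np.array([1.0, 0.05, 0.2])}
    post = online_interactivity_posterior(feats, mu, var)
    print(f"final P(interactive) = {post[-1]:.3f}")
```

The per-frame posterior trace mirrors the study's measurement of how interactiveness judgments change over time; the real model additionally structures the latent variables hierarchically rather than as a single binary class.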