Open Access
Panoptic Studio: A Massively Multiview System for Social Interaction Capture
Author(s) -
Hanbyul Joo,
Tomas Simon,
Xulong Li,
Hao Liu,
Lei Tan,
Lin Gui,
Sean Banerjee,
Timothy Godisart,
Bart Nabbe,
Iain Matthews,
Takeo Kanade,
Shohei Nobuhara,
Yaser Sheikh
Publication year - 2017
Publication title -
IEEE Transactions on Pattern Analysis and Machine Intelligence
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 3.811
H-Index - 372
eISSN - 1939-3539
pISSN - 0162-8828
DOI - 10.1109/TPAMI.2017.2782743
Subject(s) - computer science , computer graphics (images) , studio , artificial intelligence , computer vision , panopticon , massively parallel , human–computer interaction , multimedia , politics , political science , law , telecommunications , parallel computing
We present an approach to capture the 3D motion of a group of people engaged in a social interaction. The core challenges in capturing social interactions are: (1) occlusion is functional and frequent; (2) subtle motion needs to be measured over a space large enough to host a social group; (3) human appearance and configuration variation is immense; and (4) attaching markers to the body may prime the nature of interactions. The Panoptic Studio is a system organized around the thesis that social interactions should be measured through the integration of perceptual analyses over a large variety of viewpoints. We present a modularized system designed around this principle, consisting of integrated structural, hardware, and software innovations. The system takes, as input, 480 synchronized video streams of multiple people engaged in social activities, and produces, as output, the labeled time-varying 3D structure of anatomical landmarks on individuals in the space. Our algorithm is designed to fuse the "weak" perceptual processes across the large number of views by progressively generating skeletal proposals from low-level appearance cues; we also present a framework for temporal refinement that associates body parts with a reconstructed dense 3D trajectory stream. Our system and method are the first to reconstruct the full-body motion of more than five people engaged in social interactions without using markers. We also empirically demonstrate the impact of the number of views in achieving this goal.
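The abstract describes fusing weak 2D detections across hundreds of calibrated views into labeled 3D landmarks. A core building block of any such multiview pipeline is triangulating one landmark from many noisy 2D detections. The sketch below is not the paper's actual algorithm; it is a minimal, hedged illustration of standard confidence-weighted linear (DLT) triangulation, with synthetic cameras and an illustrative focal length standing in for real studio calibration.

```python
import numpy as np

def triangulate(P_list, pts2d, weights=None):
    """Weighted linear (DLT) triangulation of one 3D point from many views.

    P_list  : list of 3x4 camera projection matrices
    pts2d   : (N, 2) array of 2D detections, one per view
    weights : optional per-view confidences (e.g. detector scores)
    """
    if weights is None:
        weights = np.ones(len(P_list))
    rows = []
    for P, (u, v), w in zip(P_list, pts2d, weights):
        # Each view contributes two linear constraints on the homogeneous point.
        rows.append(w * (u * P[2] - P[0]))
        rows.append(w * (v * P[2] - P[1]))
    A = np.stack(rows)
    # Least-squares solution: right singular vector with smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def look_at(C, target=np.zeros(3)):
    """Rotation for a camera at C whose optical axis points at `target`."""
    z = target - C
    z = z / np.linalg.norm(z)
    x = np.cross(z, [0.0, 1.0, 0.0])
    x = x / np.linalg.norm(x)
    y = np.cross(z, x)
    return np.stack([x, y, z])

# Synthetic ring of 8 cameras around a point near the origin
# (the focal length and image center below are illustrative, not the studio's).
K = np.array([[1000.0, 0.0, 960.0],
              [0.0, 1000.0, 540.0],
              [0.0, 0.0, 1.0]])
X_true = np.array([0.2, -0.1, 0.3])
Ps, pts = [], []
for ang in np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False):
    C = 3.0 * np.array([np.sin(ang), 0.2, np.cos(ang)])
    R = look_at(C)
    P = K @ np.hstack([R, (-R @ C)[:, None]])
    x = P @ np.append(X_true, 1.0)
    pts.append(x[:2] / x[2])  # perfect 2D detection in this view
    Ps.append(P)

X_hat = triangulate(Ps, np.array(pts))
```

With noise-free detections the recovered point matches `X_true` to numerical precision; with real detector output, the per-view weights let strong views dominate weak ones, which is the spirit of fusing many weak perceptual processes.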
