Summarization of Wearable Videos Based on User Activity Analysis
Author(s) -
Ravi Katpelly,
Tiecheng Liu,
Chin-Tser Huang
Publication year - 2007
Publication title -
Ninth IEEE International Symposium on Multimedia (ISM 2007)
Language(s) - English
DOI - 10.1109/ISM.2007.16
This paper presents a model for automatic summarization of videos recorded by wearable cameras. The proposed model detects various user activities by computing the transform of matching image features among video frames. Four basic types of user activities are proposed: "moving closer/farther", "panning", "making a turn", and "rotation". Different summarization techniques are provided for different activity types, and a wearable video sequence can be summarized as a compact set of panoramic images. The user activity analysis is based solely on image analysis, without relying on data from other sensors. Experimental results on a 19-minute video sequence demonstrate the effectiveness of the proposed model.
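The activity-detection idea in the abstract — classifying user motion from the geometric transform between matched features in consecutive frames — can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a 2x3 similarity transform has already been estimated elsewhere (e.g., from matched keypoints), and the thresholds and the `classify_activity` function are illustrative choices, not values from the paper.

```python
import math

def classify_activity(M, scale_thresh=0.02, rot_thresh=2.0, trans_thresh=5.0):
    """Classify inter-frame camera motion from a 2x3 similarity transform
    M = [[s*cos(t), -s*sin(t), tx],
         [s*sin(t),  s*cos(t), ty]].
    Thresholds are illustrative assumptions, not taken from the paper."""
    a, b = M[0][0], M[1][0]
    s = math.hypot(a, b)                    # overall scale factor
    theta = math.degrees(math.atan2(b, a))  # in-plane rotation, degrees
    tx, ty = M[0][2], M[1][2]               # translation components

    if abs(s - 1.0) > scale_thresh:
        # Scale change dominates: the wearer approaches or retreats.
        return "moving closer/farther"
    if abs(theta) > rot_thresh:
        # In-plane rotation of the image dominates.
        return "rotation"
    if abs(tx) > trans_thresh and abs(tx) > 2 * abs(ty):
        # Sustained horizontal shift; in the paper a "turn" would be
        # a longer run of such shifts across many frames.
        return "panning"
    return "stationary"
```

A summarizer along these lines could then stitch frames labeled "panning" into panoramic images, which matches the abstract's description of the output as a compact set of panoramas.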