Open Access
Online dense activity detection
Author(s) - Weiqi Li, Jianming Wang, Jiayu Liang, Guanghao Jin, TaeSun Chung
Publication year - 2021
Publication title - IET Computer Vision
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.38
H-Index - 37
eISSN - 1751-9640
pISSN - 1751-9632
DOI - 10.1049/cvi2.12049
Subject(s) - computer science, event (particle physics), a priori and a posteriori, activity detection, artificial intelligence, object detection, clips, online and offline, relation (database), change detection, data mining, machine learning, pattern recognition (psychology), computer vision, philosophy, physics, epistemology, quantum mechanics, operating system
Dense activity detection is a subtask of activity detection that aims to localise and identify multiple human activities in video clips. Existing methods adopt offline frameworks that require all video frames to be available before activity detection begins, so they cannot be applied to online scenarios. An online framework for dense activity detection is proposed. The framework has two stages: warm-up and detection. The warm-up stage initialises dense activity detection by generating a contextual model called the online aggregated-event. The method then moves into the detection stage, which consists of two modules: coarse label prediction and refined label prediction. Coarse label prediction predicts activity labels by taking the online aggregated-event as a prior; the prediction is then refined by two techniques, human–object interaction detection and online relation reasoning. The proposed method is evaluated on two dense activity datasets, Charades and AVA. The experimental results show that, once the whole video has been processed, the proposed method outperforms existing offline methods.
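
The abstract describes a two-stage online pipeline: a warm-up stage that builds the aggregated-event context from the first frames, followed by per-frame coarse prediction conditioned on that context and a refinement step. The sketch below illustrates this control flow only; the feature extractor, classifier, aggregation rule, thresholds, and all constants (FEAT_DIM, WARMUP_FRAMES, NUM_CLASSES) are placeholder assumptions, not the paper's actual models.

# Minimal Python sketch of the online two-stage pipeline, under the assumptions above.
import numpy as np

FEAT_DIM = 128          # assumed per-frame feature dimension
WARMUP_FRAMES = 30      # assumed length of the warm-up stage
NUM_CLASSES = 157       # e.g. Charades defines 157 activity classes

def extract_features(frame: np.ndarray) -> np.ndarray:
    """Stand-in for a per-frame feature extractor (e.g. a CNN backbone)."""
    return np.resize(frame.astype(np.float32).ravel(), FEAT_DIM)

def coarse_label_prediction(feat: np.ndarray, aggregated_event: np.ndarray) -> np.ndarray:
    """Predict coarse per-class scores, using the aggregated event as a prior."""
    scores = np.random.rand(NUM_CLASSES)                   # placeholder classifier output
    prior = np.full(NUM_CLASSES, aggregated_event.mean())  # placeholder contextual prior
    return 0.7 * scores + 0.3 * prior

def refine_labels(coarse_scores: np.ndarray, feat: np.ndarray) -> np.ndarray:
    """Placeholder for refinement via human-object interaction detection
    and online relation reasoning over previously seen activities."""
    return np.clip(coarse_scores, 0.0, 1.0)

def online_dense_detection(frame_stream):
    aggregated_event = np.zeros(FEAT_DIM)
    detections = []
    for t, frame in enumerate(frame_stream):
        feat = extract_features(frame)
        if t < WARMUP_FRAMES:
            # Warm-up stage: accumulate the online aggregated-event contextual model.
            aggregated_event += feat / WARMUP_FRAMES
            continue
        # Detection stage: coarse prediction conditioned on the prior, then refinement.
        coarse = coarse_label_prediction(feat, aggregated_event)
        refined = refine_labels(coarse, feat)
        detections.append((t, np.where(refined > 0.5)[0]))   # frame index + active classes
    return detections

if __name__ == "__main__":
    fake_stream = (np.random.randint(0, 255, (64, 64, 3)) for _ in range(100))
    print(online_dense_detection(fake_stream)[:3])

Because every frame is processed as it arrives and only past context is used, the loop can run on a live stream, which is the distinction the abstract draws between this online framework and offline methods that need the full video in advance.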
