Open Access
Dual attention convolutional network for action recognition
Author(s) -
Li Xiaoqiang,
Xie Miao,
Zhang Yin,
Ding Guangtai,
Tong Weiqin
Publication year - 2020
Publication title -
iet image processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/iet-ipr.2019.0963
Subject(s) - computer science, artificial intelligence, computer vision, pattern recognition, action recognition, optical flow, RGB color model, attention network, discriminative model
Action recognition has been an active research area for many years. Extracting discriminative spatial and temporal features of different actions plays a key role in this task. Current popular methods of action recognition are mainly based on two‐stream Convolutional Networks (ConvNets) or 3D ConvNets. However, two‐stream ConvNets are computationally expensive because they require optical flow, while 3D ConvNets consume too much memory because they have a large number of parameters. To alleviate these problems, the authors propose a Dual Attention ConvNet (DANet) based on a dual attention mechanism consisting of spatial attention and temporal attention. The former concentrates on the main moving objects in a video frame using a ConvNet structure, and the latter captures related information across multiple video frames using self‐attention. The network is built entirely on 2D ConvNets and takes only RGB frames as input. Experimental results on the UCF‐101 and HMDB‐51 benchmarks show that DANet achieves results comparable to leading methods, demonstrating the effectiveness of the dual attention mechanism.
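The two attention components in the abstract can be sketched in code. The following is a minimal NumPy illustration, not the paper's actual layers: it assumes spatial attention produces a per-location gate over a frame's feature map, and that temporal attention is standard scaled dot-product self-attention over per-frame feature vectors. All function names, shapes, and projection matrices here are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def spatial_attention(feat_map, w):
    """Gate each spatial location of one frame's feature map.
    feat_map: (H, W, C) features; w: (C,) hypothetical projection to a scalar score.
    Returns the feature map re-weighted by a sigmoid attention mask."""
    scores = feat_map @ w                      # (H, W) per-location scores
    mask = 1.0 / (1.0 + np.exp(-scores))       # sigmoid gate in [0, 1]
    return feat_map * mask[..., None]          # emphasize main motion regions

def temporal_self_attention(frames, Wq, Wk, Wv):
    """Scaled dot-product self-attention across T per-frame feature vectors.
    frames: (T, D); Wq, Wk, Wv: (D, D) hypothetical learned projections.
    Returns context-mixed frame features capturing cross-frame relations."""
    Q, K, V = frames @ Wq, frames @ Wk, frames @ Wv
    attn = softmax(Q @ K.T / np.sqrt(K.shape[-1]))  # (T, T) frame affinities
    return attn @ V                                  # (T, D) mixed features
```

In this reading, each RGB frame's 2D-ConvNet feature map would first be re-weighted spatially, pooled to a vector, and the resulting sequence of frame vectors mixed by temporal self-attention, avoiding both optical flow and 3D convolutions.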
