
Video stabilisation with total warping variation model
Author(s) - Wu Huicong, Xiao Liang, Shim Hiuk Jae, Tang Songze
Publication year - 2017
Publication title - IET Image Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/iet-ipr.2016.0645
Subject(s) - image warping , computer science , computer vision , artificial intelligence , computer graphics (images)
This study proposes a robust approach to stabilise videos with a new variational minimisation model. In video stabilisation, accumulation error often occurs in methods based on a cascaded chain of transformations. To alleviate this accumulation error, a new total warping variation (TWV) model is proposed, which describes the smoothness of the stabilised camera motion and calculates all the warping transformations efficiently. After estimating the original motion parameters with a 2D similarity transformation model, the corresponding warping parameters are computed under the TWV minimisation framework, where the separable property of the motion parameters is exploited to obtain a closed‐form solution. The proposed method yields robust, smooth and precise motion trajectories after stabilisation. Furthermore, an iterative TWV method is introduced to reduce high‐frequency jitter as well as low‐frequency motion. Moreover, an online TWV method is presented for long streaming video sequences by adopting a sliding‐window approach. Experimental results on various shaky video sequences show the effectiveness of the proposed method.
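The pipeline the abstract describes (estimate per-frame 2D similarity parameters, then smooth each parameter channel independently with a closed-form variational solve) can be sketched as follows. This is a minimal illustration, not the paper's method: it substitutes a quadratic variation penalty ||s − c||² + λ||Ds||² for the paper's total warping variation, because the quadratic case has a simple closed form (I + λDᵀD)s = c; the function names and the λ value are invented for the example.

```python
import numpy as np

def smooth_trajectory(c, lam=50.0):
    """Smooth a 1-D camera-parameter trajectory c by minimising
    ||s - c||^2 + lam * ||D s||^2, where D is the forward-difference
    operator. This quadratic stand-in for the paper's TWV penalty has
    the closed-form solution (I + lam * D^T D) s = c.
    """
    n = len(c)
    D = np.diff(np.eye(n), axis=0)       # (n-1) x n forward differences
    A = np.eye(n) + lam * (D.T @ D)
    return np.linalg.solve(A, c)

def stabilising_warps(params):
    """params: (n_frames, 4) cumulative similarity parameters per frame,
    e.g. columns (dx, dy, angle, log-scale). Each column is smoothed
    independently -- the separable property noted in the abstract --
    and the returned array is the per-frame correction that maps the
    shaky path onto the smoothed one.
    """
    smoothed = np.column_stack(
        [smooth_trajectory(params[:, k]) for k in range(params.shape[1])]
    )
    return smoothed - params  # apply as a corrective warp per frame
```

In an online, sliding-window variant such as the abstract mentions, the same solve would be applied to the most recent window of frames only, keeping the per-window cost fixed as the stream grows.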