
Zero‐quantised discrete cosine transform coefficients prediction technique for intra‐frame video encoding
Author(s) -
Maher Jridi,
Pramod Kumar Meher,
Ayman Alfalou
Publication year - 2013
Publication title -
IET Image Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/iet-ipr.2012.0145
Subject(s) - discrete cosine transform , frame (networking) , encoding (memory) , zero (linguistics) , computer science , modified discrete cosine transform , algorithm , trigonometric functions , transform coding , artificial intelligence , computer vision , mathematics , image (mathematics) , telecommunications , geometry , linguistics , philosophy
One promising way to reduce the computational complexity of the discrete cosine transform (DCT) is to identify redundant computations and eliminate them. In this study, the authors present a new method to predict zero‐quantised DCT coefficients for efficient implementation of intra‐frame video encoding by identifying such redundant computations. Traditional methods use a Gaussian statistical model of the residual pixels to predict all‐zero or partial‐zero blocks. The proposed method is based on two key ideas. First, bounds on the DCT coefficients are derived from the intermediate signals of the Loeffler DCT algorithm instead of from the sum of absolute differences (SAD) of the residual pixels. Sufficiency conditions are then suitably chosen to predict the zero‐quantised coefficients, reducing the arithmetic complexity without degrading the video quality. Simulation results validate the analytical model and show that the proposed prediction eliminates more redundant computations than existing methods. Moreover, the authors derive a pipelined VLSI architecture of the proposed prediction scheme, which saves more than 63% and 91% of the multiplications in the second stage of the one‐dimensional DCT for high and low bit‐rate intra‐video encoding, respectively.
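The general principle behind zero‐quantised coefficient prediction can be illustrated with a minimal sketch. The code below does not reproduce the paper's Loeffler‐based bounds; instead it uses the coarser, well‐known SAD bound |X_k| ≤ c_k·SAD (valid because |cos| ≤ 1 in the DCT‐II basis), closer in spirit to the traditional methods the abstract contrasts against. The function names and the dead‐zone threshold q/2 are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def dct_1d(x):
    """Reference orthonormal 1-D DCT-II (no skipping)."""
    N = len(x)
    n = np.arange(N)
    X = np.empty(N)
    for k in range(N):
        c = np.sqrt(1.0 / N) if k == 0 else np.sqrt(2.0 / N)
        X[k] = c * np.sum(x * np.cos(np.pi * (2 * n + 1) * k / (2 * N)))
    return X

def predict_zero_coeffs(x, q_step):
    """Boolean mask of coefficients guaranteed to quantise to zero.

    Sufficiency condition (illustrative): since |cos| <= 1,
    |X_k| <= c_k * SAD with SAD = sum(|x_n|). If that bound is
    below q_step / 2, then round(X_k / q_step) == 0, so computing
    X_k at all is a redundant computation that can be skipped.
    """
    N = len(x)
    sad = np.sum(np.abs(x))
    mask = np.empty(N, dtype=bool)
    for k in range(N):
        c = np.sqrt(1.0 / N) if k == 0 else np.sqrt(2.0 / N)
        mask[k] = c * sad < q_step / 2.0
    return mask
```

Because the condition is sufficient rather than necessary, every coefficient the predictor flags is truly zero after quantisation (no quality loss), while some zero‐quantised coefficients may still be computed. The paper's contribution is a tighter bound, derived from Loeffler intermediate signals, that flags more of them.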