Open Access
Adaptive and Proadaptive Image Compression
Author(s) -
V. N. Oulianov
Publication year - 2001
Language(s) - English
DOI - 10.1109/dcc.2001.10045
Most well-known algorithms use "past" (already processed) samples to refine estimates of the local properties of the image and to adjust their own parameters. During decoding, the same decisions are made on the basis of the restored samples. This is the classical adaptive scheme.

In the proposed proadaptive approach, a prolongation of the signal is used to tune the parameters of the algorithm. The statistical properties of the sample being encoded (the current sample) are estimated from "past" samples, "future" samples, and even the current sample itself, which certainly increases the reliability of the estimate. Information about the "future", however, has a price: some additional information must be saved. The proadaptive algorithm tunes its parameters only during encoding; during decoding the saved parameter values are used.

The larger the amount of additional data, the more often and/or more precisely the parameters of the algorithm are adjusted, and the more efficiently the main data stream is encoded. If the coder parameters are adjusted in each image fragment of size W×W, the entropies of the code words in the main and additional data streams depend on the fragment size as shown in figure 1, and under some conditions their sum has an optimum. Experiments on real and artificial images have shown that the optimum is always present, but the conditions for reaching it depend essentially on the image content.
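The trade-off the abstract describes can be illustrated with a minimal sketch (not the paper's algorithm): for each candidate fragment size W, the main-stream cost is approximated by the sum of empirical zeroth-order entropies of the W×W blocks (parameters tuned per block), and the side-stream cost by an assumed fixed number of bits per block for the stored parameters. The synthetic image, the block-entropy proxy, and the per-block side cost are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic nonstationary "image": regions with different variances (assumption).
img = np.concatenate([rng.normal(0, s, 4096) for s in (1, 8, 2, 16)])
img = np.round(img).astype(int).reshape(128, 128)

SIDE_BITS_PER_BLOCK = 16  # assumed cost of storing the tuned parameters of one block


def entropy_bits(block):
    """Empirical zeroth-order entropy of the block, in total bits."""
    _, counts = np.unique(block, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum() * block.size)


def stream_costs(image, w):
    """Return (main_bits, side_bits) when parameters are tuned per w-by-w block."""
    n = image.shape[0]
    main = sum(entropy_bits(image[r:r + w, c:c + w])
               for r in range(0, n, w) for c in range(0, n, w))
    side = (n // w) ** 2 * SIDE_BITS_PER_BLOCK
    return main, side

for W in (8, 16, 32, 64, 128):
    main, side = stream_costs(img, W)
    print(f"W={W:3d}  main={main:10.0f}  side={side:6d}  total={main + side:10.0f}")
```

Smaller fragments lower the main-stream entropy (better local adaptation) but inflate the side stream, so the total typically has a minimum at some intermediate W, mirroring figure 1.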
