On fixed-database universal data compression with limited memory
Y Hershkovits, J Ziv - IEEE Transactions on Information Theory, 1997 - ieeexplore.ieee.org
The amount of fixed side information required for lossless data compression is discussed.
Nonasymptotic coding and converse theorems are derived for data-compression algorithms
with fixed statistical side information ("training sequence") that is not large enough to
yield the ultimate compression, namely, the entropy of the source.
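The paper's setting can be illustrated concretely: a fixed "training sequence" shared by encoder and decoder plays the same role as a preset dictionary in a practical compressor. A minimal sketch using zlib's `zdict` parameter (an analogy chosen here for illustration, not the paper's own construction):

```python
import zlib

# A fixed "training sequence" acting as shared statistical side information.
# Both encoder and decoder must hold the same database in memory.
training = b"the quick brown fox jumps over the lazy dog " * 4
message = b"the quick brown fox jumps over the lazy dog"

# Compress using the shared dictionary.
c = zlib.compressobj(zdict=training)
compressed = c.compress(message) + c.flush()

# Compress without side information, for comparison.
c0 = zlib.compressobj()
baseline = c0.compress(message) + c0.flush()

# Decompression requires the identical dictionary.
d = zlib.decompressobj(zdict=training)
restored = d.decompress(compressed) + d.flush()

assert restored == message
print(len(compressed), len(baseline))
```

When the training data matches the source statistics well, the dictionary-aided output is markedly shorter than the baseline; the paper quantifies how far such a limited-size database falls short of the source entropy.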