Optimal alphabet partitioning for semi-adaptive coding of sources of unknown sparse distributions

Dan Chen, Yi Jen Chiang, Nasir Memon, Xiaolin Wu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Practical applications that employ entropy coding for large alphabets often partition the alphabet set into two or more layers, and each symbol is encoded using a suitable prefix code for its layer. The problem of optimal alphabet partitioning is formulated for the design of a two-layer semi-adaptive code, and a solution based on dynamic programming is given. However, the complexity of the dynamic-programming approach can be prohibitive for long sequences and very large alphabet sizes. Hence, a simple greedy heuristic is presented whose running time is linear in the number of symbols being encoded, irrespective of the underlying alphabet size. Experimental results demonstrate that superior prefix coding schemes for large alphabets can be designed using this approach, as opposed to the ad-hoc partitioning typically applied in the literature.
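The two-layer idea in the abstract can be sketched on a toy scale. The snippet below is an illustrative assumption, not the authors' algorithm: the first layer codes the group index at its ideal (entropy) cost, the second layer codes the offset within the group with a fixed-length code, and a brute-force search over a single cut point stands in for the paper's dynamic program.

```python
import math

def partition_cost(counts, boundaries):
    """Total bits for a two-layer code: each symbol is sent as
    (group index, offset within group).  The group index is charged its
    ideal entropy cost; the offset is charged a fixed-length cost.
    Illustrative model only, not the paper's exact cost function."""
    total = sum(counts)
    cuts = [0] + sorted(boundaries) + [len(counts)]
    bits = 0.0
    for lo, hi in zip(cuts, cuts[1:]):
        n = sum(counts[lo:hi])          # mass falling in this group
        if n == 0:
            continue
        bits += n * -math.log2(n / total)        # layer 1: group index
        if hi - lo > 1:
            bits += n * math.ceil(math.log2(hi - lo))  # layer 2: offset
    return bits

def best_single_cut(counts):
    """Exhaustively try every single boundary (two groups) and keep the
    cheapest one -- a toy stand-in for the dynamic program."""
    return min(range(1, len(counts)),
               key=lambda b: partition_cost(counts, [b]))

# A sparse, skewed distribution: a short frequent head and a long rare tail.
counts = [60, 20, 8, 4, 2] + [1] * 11
cut = best_single_cut(counts)
print(cut, partition_cost(counts, [cut]), partition_cost(counts, []))
```

Splitting the frequent head from the rare tail beats coding the whole 16-symbol alphabet as one fixed-length group, which is the intuition behind partitioning in the first place.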

Original language: English (US)
Title of host publication: Proceedings - DCC 2003
Subtitle of host publication: Data Compression Conference
Editors: James A. Storer, Martin Cohn
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 372-381
Number of pages: 10
ISBN (Electronic): 0769518966
State: Published - 2003
Event: Data Compression Conference, DCC 2003 - Snowbird, United States
Duration: Mar 25 2003 - Mar 27 2003

Publication series

Name: Data Compression Conference Proceedings
Volume: 2003-January
ISSN (Print): 1068-0314


Keywords

  • Data compression

ASJC Scopus subject areas

  • Computer Networks and Communications

