Video coding using 3D dual-tree wavelet transform

Beibei Wang, Yao Wang, Ivan Selesnick, Anthony Vetro

Research output: Contribution to journal › Article

Abstract

This work investigates the use of the 3D dual-tree discrete wavelet transform (DDWT) for video coding. The 3D DDWT is an attractive video representation because it isolates image patterns with different spatial orientations, motion directions, and motion speeds in separate subbands. However, it is an overcomplete transform with 4:1 redundancy when only the real parts are used. We apply the noise-shaping algorithm proposed by Kingsbury to reduce the number of coefficients. To code the remaining significant coefficients, we propose two video codecs. The first applies separate 3D set partitioning in hierarchical trees (SPIHT) to each subset of the DDWT coefficients (each forming a standard isotropic tree). The second codec exploits the correlation between redundant subbands and codes the subbands jointly. Neither codec requires motion compensation, and both provide better performance than the 3D SPIHT codec using the standard DWT, both objectively and subjectively. Furthermore, both codecs provide full scalability in the spatial, temporal, and quality dimensions. Beyond the standard isotropic decomposition, we propose an anisotropic DDWT, which extends the standard DDWT with more directional subbands without adding to the redundancy. This anisotropic structure requires significantly fewer coefficients to represent a video after noise shaping. Finally, we also explore the benefits of combining the 3D DDWT with the standard DWT to capture a wider set of orientations.
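The noise-shaping step mentioned above is the key to making an overcomplete transform practical for coding: reconstruction error from discarding small coefficients is iteratively fed back into the retained coefficients, so fewer coefficients are needed for the same quality. The sketch below is an illustrative toy, not the paper's codec: it replaces the 3D DDWT with a hypothetical 2x-redundant Parseval frame built from a random orthonormal basis, and the threshold, feedback gain `k`, and iteration count are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2x-redundant Parseval frame standing in for the (4:1 redundant) 3D DDWT:
# the identity stacked on a random orthonormal basis, scaled so T.T @ T = I.
n = 64
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
T = np.vstack([np.eye(n), Q.T]) / np.sqrt(2)  # forward transform, R^n -> R^2n
# Because T is a Parseval frame, T.T is its pseudo-inverse (perfect
# reconstruction when no coefficients are discarded).

def noise_shape(x, threshold, iters=20, k=0.9):
    """Kingsbury-style noise shaping (simplified sketch).

    Repeatedly threshold the coefficients, measure the signal-domain
    reconstruction error, and feed a fraction k of that error back into
    the coefficient domain so the retained coefficients absorb it.
    """
    w = T @ x
    for _ in range(iters):
        w_kept = np.where(np.abs(w) >= threshold, w, 0.0)
        err = x - T.T @ w_kept        # reconstruction error in signal domain
        w = w_kept + k * (T @ err)    # error feedback into coefficient domain
    return np.where(np.abs(w) >= threshold, w, 0.0)

x = rng.standard_normal(n)
w0 = T @ x                            # plain overcomplete coefficients
w_ns = noise_shape(x, threshold=0.5)  # noise-shaped, sparser representation
```

Compared with simply thresholding `w0` at the same level, the feedback loop typically yields a smaller reconstruction error for a comparable coefficient count, which is why the abstract's codecs apply it before SPIHT coding.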

Original language: English (US)
Article number: 42761
Journal: EURASIP Journal on Image and Video Processing
Volume: 2007
DOIs
State: Published - 2007

ASJC Scopus subject areas

  • Signal Processing
  • Information Systems
  • Electrical and Electronic Engineering

