
FGS coding using cycle-based leaky prediction through multiple leaky factors

  • Xiangyang Ji*
  • Yanyan Zheng
  • Debin Zhao
  • Feng Wu
  • Wen Gao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

This paper proposes a fine granularity scalable (FGS) coding scheme using cycle-based leaky prediction, in which multiple leaky factors are used to form the enhancement-layer prediction and thereby strike a good compromise between coding efficiency and drift error. In the proposed method, the error propagation of leaky prediction with two leaky factors is first analyzed theoretically for the case where only the base-layer bitstream and part of the enhancement-layer bitstream are available at the decoder. Based on this analysis, we investigate how to effectively introduce enhancement-layer information into the prediction loop for enhancement-layer coding, choosing leaky factors that constrain drift error while preserving high coding efficiency. Furthermore, a coefficient-scaling approach in the transform domain is proposed to address the decoding-complexity issue of reconstructing partial enhancement layers at multiple quality levels. Finally, an encoder optimization approach is presented to further control drift error when coding multiple FGS layers. Experimental results show that, compared to AR-FGS in JSVM, the proposed method significantly improves coding performance over a wide range of bitrates.
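The core idea of leaky prediction can be sketched as follows. A minimal, hypothetical illustration (not the paper's implementation): the enhancement-layer prediction is a blend of the base-layer and enhancement-layer reconstructions controlled by a leaky factor alpha in [0, 1]; alpha = 0 predicts from the base layer only (no drift, lower efficiency), while alpha = 1 predicts from the full enhancement layer (best efficiency, worst drift). Because any mismatch introduced at the decoder is multiplied by alpha at each subsequent frame, drift decays geometrically when alpha < 1. All names below are illustrative assumptions.

```python
def leaky_prediction(base_recon, enh_recon, alpha):
    """Blend base- and enhancement-layer reconstructions with leaky factor
    alpha in [0, 1]. Illustrative sketch, not the paper's exact scheme."""
    assert 0.0 <= alpha <= 1.0
    return [(1.0 - alpha) * b + alpha * e
            for b, e in zip(base_recon, enh_recon)]


def drift_after(n_frames, alpha, initial_error):
    """Residual decoder mismatch after n frames: each prediction step
    attenuates the error by alpha, so drift decays geometrically."""
    return initial_error * alpha ** n_frames


# With alpha = 0.5, an enhancement-layer mismatch of 8 shrinks to 1 after
# three frames instead of persisting (as it would with alpha = 1).
pred = leaky_prediction([0.0, 0.0], [8.0, 4.0], 0.5)   # [4.0, 2.0]
residual = drift_after(3, 0.5, 8.0)                    # 1.0
```

This trade-off is why a single fixed alpha is unsatisfactory; the paper's cycle-based approach applies different leaky factors to different parts of the enhancement layer, keeping efficiency high where drift risk is low.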

Original language: English
Article number: 4498426
Pages (from-to): 1201-1211
Number of pages: 11
Journal: IEEE Transactions on Circuits and Systems for Video Technology
Volume: 18
Issue number: 9
DOIs
State: Published - Sep 2008

Keywords

  • Drift error
  • Fine granularity scalability (FGS)
  • Leaky prediction
  • Video coding
