A local texture refinement and adaptive background construction method for free viewpoint video

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

View synthesis is an important technique for free viewpoint video because of its capability to render arbitrary virtual viewpoints. Producing desirable synthesized images involves several technical challenges, such as suppressing various kinds of artifacts and filling empty holes. In this paper, a novel two-stage synthesis framework is proposed. Within this framework, four original tools are employed to address these problems: boundary region detection, an adaptive filter, adaptive background construction, and a depth map processing method. The proposed algorithm is implemented and tested on the Microsoft test video sequences. Experimental results show that the synthesized output has better visual quality, and a subjective gain can be observed in comparison with state-of-the-art approaches.

Original language: English
Title of host publication: ICMSSP 2018 - 2018 3rd International Conference on Multimedia Systems and Signal Processing
Publisher: Association for Computing Machinery
Pages: 6-11
Number of pages: 6
ISBN (Electronic): 9781450364577
State: Published - 28 Apr 2018
Externally published: Yes
Event: 3rd International Conference on Multimedia Systems and Signal Processing, ICMSSP 2018 - Shenzhen, China
Duration: 28 Apr 2018 - 30 Apr 2018

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 3rd International Conference on Multimedia Systems and Signal Processing, ICMSSP 2018
Country/Territory: China
City: Shenzhen
Period: 28/04/18 - 30/04/18

Keywords

  • Adaptive background construction
  • Adaptive filter
  • Boundary noise detection
  • Depth map processing method
  • Free viewpoint video
  • Two-stage framework
  • View synthesis

