
A new sparse representation of seismic data using adaptive easy-path wavelet transform

  • Jianwei Ma* (Tsinghua University)
  • Gerlind Plonka (University of Duisburg-Essen)
  • Hervé Chauris (Mines ParisTech, Centre des Matériaux/CNRS, UMR 7633)

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Sparse representation of seismic data is a crucial step for seismic forward modeling and for seismic processing tasks such as coherent noise separation, imaging, and sparsity-promoting data recovery. In this letter, a new locally adaptive wavelet transform, called the easy-path wavelet transform (EPWT), is applied to the sparse representation of seismic data. The EPWT is an adaptive geometric wavelet transform that works along a series of special pathways through the input data and exploits the local correlations of the data. The transform consists of two steps: reorganizing the data along pathways determined by the data values, and then applying a 1-D wavelet transform along those pathways. This leads to a very sparse wavelet representation. In comparison with conventional wavelets, the EPWT concentrates most of the signal energy at smooth scales and needs fewer significant wavelet coefficients to represent signals. Numerical experiments show that the new method is superior to conventional wavelets and curvelets in terms of sparse representation and compression of seismic data.
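To make the two-step idea concrete, the following is a minimal Python sketch, not the published algorithm: the paper's EPWT rebuilds a new path at every decomposition level, whereas this sketch builds a single path with a greedy 8-neighbour rule and then applies a standard multi-level 1-D transform (via PyWavelets) along it. The Haar wavelet, the starting pixel, and the greedy rule are illustrative assumptions.

```python
# Sketch of the path-then-transform idea behind the EPWT (simplified).
# NOT the authors' implementation: single path, greedy neighbour rule,
# Haar wavelet and level count are illustrative choices.
import numpy as np
import pywt

def greedy_path(image):
    """Order all pixels along one path, always stepping to the unvisited
    spatial neighbour whose value is closest to the current pixel."""
    rows, cols = image.shape
    visited = np.zeros((rows, cols), dtype=bool)
    r, c = 0, 0                      # arbitrary starting pixel (assumption)
    path = [(r, c)]
    visited[r, c] = True
    for _ in range(rows * cols - 1):
        # unvisited 8-neighbours of the current pixel
        cands = [(r + dr, c + dc)
                 for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                 if (dr, dc) != (0, 0)
                 and 0 <= r + dr < rows and 0 <= c + dc < cols
                 and not visited[r + dr, c + dc]]
        if not cands:
            # dead end: jump to the unvisited pixel with the closest value
            idx = np.argwhere(~visited)
            diffs = np.abs(image[idx[:, 0], idx[:, 1]] - image[r, c])
            r, c = idx[int(np.argmin(diffs))]
        else:
            diffs = [abs(image[p] - image[r, c]) for p in cands]
            r, c = cands[int(np.argmin(diffs))]
        visited[r, c] = True
        path.append((r, c))
    return path

def epwt_like_transform(image, wavelet='haar', level=3):
    """Reorder the data along the path, then run a 1-D wavelet transform."""
    path = greedy_path(image)
    signal = np.array([image[p] for p in path], dtype=float)
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return path, coeffs

if __name__ == '__main__':
    # smooth synthetic section: most detail coefficients come out tiny,
    # illustrating the sparsity gained by ordering similar values together
    x, y = np.meshgrid(np.linspace(0, 1, 32), np.linspace(0, 1, 32))
    section = np.sin(6 * x + 4 * y)
    path, coeffs = epwt_like_transform(section)
    detail = np.concatenate(coeffs[1:])
    print('fraction of detail coeffs above 1e-2:',
          float(np.mean(np.abs(detail) > 1e-2)))
```

Because pixels with similar values end up adjacent on the path, the 1-D detail coefficients along the path are small almost everywhere, which is the source of the sparsity the abstract describes.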

Original language: English
Article number: 5439772
Pages (from-to): 540-544
Number of pages: 5
Journal: IEEE Geoscience and Remote Sensing Letters
Volume: 7
Issue number: 3
DOIs
State: Published - Jul 2010
Externally published: Yes

Keywords

  • Adaptive wavelets
  • curvelets
  • easy-path wavelet transform (EPWT)
  • seismic processing
  • sparse representation
