Hyperspectral feature selection based on mutual information and nonlinear correlation coefficient

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Mutual information (MI) has clear potential for feature selection, but this potential has not been fully exploited. To make numerical computation easier and more accurate, the MI of the whole multi-dimensional data set can be decomposed into a sum of one-dimensional MI and one-dimensional conditional MI components. This paper shows that replacing the one-dimensional conditional MI components with one-dimensional MI components can be problematic when the features are highly correlated, and we propose a method that uses the nonlinear correlation coefficient (NCC) to replace some of the one-dimensional MI components, including the conditional ones. Simulations on the AVIRIS 92AV3C dataset show considerable potential for improving classification accuracy.
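The abstract's idea of combining one-dimensional MI relevance terms with NCC-measured redundancy can be illustrated with a greedy band-selection sketch. This is not the paper's exact decomposition (which is not reproduced in this record); the scoring criterion, bin counts, and the `beta` trade-off weight below are illustrative assumptions. The NCC here follows the standard rank-based definition, NCC = 2 + Σ p_ij log_b p_ij over a b×b rank grid, which ranges from 0 (independent) to 1 (perfectly dependent).

```python
import numpy as np

def mutual_info(x, y, bins=16):
    # Histogram-based estimate of the one-dimensional MI I(X;Y) in nats.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def ncc(x, y, b=8):
    # Nonlinear correlation coefficient: rank both variables into b
    # equal-frequency groups, then NCC = 2 + sum p_ij * log_b(p_ij).
    def rank_bins(v):
        ranks = np.argsort(np.argsort(v))
        return (ranks * b) // len(v)
    rx, ry = rank_bins(x), rank_bins(y)
    counts = np.zeros((b, b))
    np.add.at(counts, (rx, ry), 1)
    p = counts / counts.sum()
    nz = p > 0
    return float(2.0 + (p[nz] * (np.log(p[nz]) / np.log(b))).sum())

def select_features(X, labels, k, beta=1.0):
    # Illustrative greedy criterion (an assumption, not the paper's exact
    # formula): maximize 1-D MI with the class label minus the average
    # NCC redundancy with already-selected bands.
    n_features = X.shape[1]
    relevance = [mutual_info(X[:, j], labels) for j in range(n_features)]
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            redundancy = np.mean([ncc(X[:, j], X[:, s]) for s in selected])
            score = relevance[j] - beta * redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected
```

Because NCC is rank-based, it penalizes monotone nonlinear dependence between bands that a linear correlation coefficient would understate, which is the motivation the abstract gives for preferring it over plain one-dimensional MI components when features are highly correlated.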

Original language: English
Title of host publication: IIH-MSP 2009 - 2009 5th International Conference on Intelligent Information Hiding and Multimedia Signal Processing
Pages: 965-968
Number of pages: 4
DOIs
State: Published - 2009
Event: IIH-MSP 2009 - 2009 5th International Conference on Intelligent Information Hiding and Multimedia Signal Processing - Kyoto, Japan
Duration: 12 Sep 2009 - 14 Sep 2009

Publication series

Name: IIH-MSP 2009 - 2009 5th International Conference on Intelligent Information Hiding and Multimedia Signal Processing

Conference

Conference: IIH-MSP 2009 - 2009 5th International Conference on Intelligent Information Hiding and Multimedia Signal Processing
Country/Territory: Japan
City: Kyoto
Period: 12/09/09 - 14/09/09

Keywords

  • Feature selection
  • Hyperspectral data
  • Mutual information
  • Nonlinear correlation coefficient
