
Robust feature selection based on regularized brownboost loss

  • Pan Wei
  • Qinghua Hu*
  • Peijun Ma
  • Xiaohong Su

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Feature selection is an important preprocessing step in machine learning and pattern recognition, and it is also a data mining task in some real-world applications. Feature quality evaluation is a key issue when designing an algorithm for feature selection, and in recent years the classification margin has been widely used for this purpose. In this study, we introduce a robust loss function, called Brownboost loss, which computes feature quality and selects optimal feature subsets to enhance robustness. We compute the classification loss in a feature space with the hypothesis-margin and minimize the loss by optimizing the weights of features. An algorithm is developed based on gradient descent using L2-norm regularization techniques. The proposed algorithm is tested on UCI datasets and gene expression datasets. The experimental results show that the proposed algorithm is effective in improving classification robustness.
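The abstract's recipe — weight features, measure a Relief-style hypothesis-margin per sample, pass the margins through a bounded BrownBoost-style loss, add an L2 penalty, and descend the gradient — can be sketched as below. This is an illustrative reconstruction, not the authors' exact formulation: the distance metric, the erf-based loss form, the constant `c`, and all hyperparameters are assumptions.

```python
# Sketch of margin-based feature weighting with a BrownBoost-style robust
# loss and L2 regularization. Illustrative only: the distances, loss shape,
# and hyperparameters are assumptions, not the paper's exact algorithm.
import numpy as np
from math import erf

def hypothesis_margin(X, y, w):
    """Weighted hypothesis-margin per sample: distance to the nearest
    miss minus distance to the nearest hit, under feature weights w."""
    n = len(X)
    margins = np.zeros(n)
    Xw = X * w  # scale each feature by its weight
    for i in range(n):
        d = np.abs(Xw - Xw[i]).sum(axis=1)  # weighted L1 distances
        d[i] = np.inf                       # exclude the sample itself
        hit = d[y == y[i]].min()            # nearest same-class sample
        miss = d[y != y[i]].min()           # nearest other-class sample
        margins[i] = miss - hit
    return margins

def brownboost_loss(m, c=1.0):
    """BrownBoost-style robust loss 1 - erf(m / sqrt(c)). It is bounded,
    so outliers with large negative margins contribute at most 2 each."""
    return np.array([1.0 - erf(mi / np.sqrt(c)) for mi in m])

def objective(X, y, w, lam=0.1, c=1.0):
    """Total robust margin loss plus an L2 penalty on the weights."""
    return brownboost_loss(hypothesis_margin(X, y, w), c).sum() + lam * w @ w

def select_weights(X, y, lam=0.1, c=1.0, lr=0.05, steps=100):
    """Minimize the objective by (numerical) gradient descent; larger
    final weights indicate more useful features."""
    w = np.ones(X.shape[1])
    eps = 1e-4
    for _ in range(steps):
        grad = np.zeros_like(w)
        for j in range(len(w)):  # central-difference gradient
            wp, wm = w.copy(), w.copy()
            wp[j] += eps
            wm[j] -= eps
            grad[j] = (objective(X, y, wp, lam, c)
                       - objective(X, y, wm, lam, c)) / (2 * eps)
        w = np.clip(w - lr * grad, 0.0, None)  # keep weights non-negative
    return w

# Toy data: feature 0 separates the classes, feature 1 is pure noise.
rng = np.random.default_rng(0)
X = rng.normal(0, 1, (40, 2))
X[:20, 0] -= 3.0
X[20:, 0] += 3.0
y = np.array([0] * 20 + [1] * 20)
w = select_weights(X, y)
print(w)  # the informative feature's weight should dominate
```

Ranking features by the learned weights (or thresholding them) then yields the selected subset; the bounded loss is what limits the influence of mislabeled or outlying samples compared with an unbounded exponential loss.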

Original language: English
Pages (from-to): 180-198
Number of pages: 19
Journal: Knowledge-Based Systems
Volume: 54
DOIs
State: Published - Dec 2013
Externally published: Yes

Keywords

  • Brownboost loss
  • Feature selection
  • Margin
  • Regularization
  • Robustness
