Abstract
Feature selection is an important preprocessing step in machine learning and pattern recognition, and it is also a data mining task in some real-world applications. Feature quality evaluation is a key issue when designing an algorithm for feature selection. The classification margin has been widely used to evaluate feature quality in recent years. In this study, we introduce a robust loss function, called Brownboost loss, to evaluate feature quality and select optimal feature subsets, thereby enhancing robustness. We compute the classification loss in a feature space with the hypothesis margin and minimize the loss by optimizing the weights of features. An algorithm is developed based on gradient descent with an L2-norm regularization technique. The proposed algorithm is tested on UCI datasets and gene expression datasets. The experimental results show that the proposed algorithm is effective in improving classification robustness.
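The abstract describes the approach only at a high level; the following is a minimal sketch, not the authors' implementation, of margin-based feature weighting with a Brownboost-style robust loss. It assumes a Relief-style hypothesis margin computed in a weighted L1 space, a bounded loss of the form 1 − erf(ρ/c), nearest hit/miss neighbours held fixed within each gradient step, and L2 regularization on the feature weights; the function names and parameters (`c`, `lam`, `lr`) are illustrative choices, not taken from the paper.

```python
# Sketch of robust, margin-based feature weighting (assumptions noted above).
import numpy as np
from scipy.special import erf


def weighted_l1(a, b, w):
    """Weighted Manhattan distance between two samples."""
    return np.sum(w * np.abs(a - b))


def hypothesis_margins(X, y, w):
    """Relief-style hypothesis margins rho_i and, for each sample, the
    per-feature gap |x - nearmiss| - |x - nearhit| used in the gradient."""
    n, d = X.shape
    margins = np.empty(n)
    gaps = np.empty((n, d))
    for i in range(n):
        dists = np.array([weighted_l1(X[i], X[j], w) for j in range(n)])
        dists[i] = np.inf                              # exclude the sample itself
        same = (y == y[i])
        hit = np.argmin(np.where(same, dists, np.inf))   # nearest same-class sample
        miss = np.argmin(np.where(~same, dists, np.inf)) # nearest other-class sample
        gaps[i] = np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])
        margins[i] = np.sum(w * gaps[i])
    return margins, gaps


def fit_weights(X, y, c=1.0, lam=0.1, lr=0.01, n_iter=100):
    """Gradient descent on sum_i (1 - erf(rho_i / c)) + lam * ||w||_2^2."""
    n, d = X.shape
    w = np.ones(d)
    history = []
    for _ in range(n_iter):
        rho, gaps = hypothesis_margins(X, y, w)
        history.append(np.sum(1.0 - erf(rho / c)) + lam * np.dot(w, w))
        # d/d rho of (1 - erf(rho / c)) = -(2 / (c * sqrt(pi))) * exp(-(rho / c)^2)
        dloss = -(2.0 / (c * np.sqrt(np.pi))) * np.exp(-(rho / c) ** 2)
        grad = gaps.T @ dloss + 2.0 * lam * w          # neighbours treated as fixed
        w = np.maximum(w - lr * grad, 0.0)             # non-negative weights (an extra assumption)
    return w, history


# Usage: rank features by the learned weights and keep the top k.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 10))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)      # only two informative features
    w, hist = fit_weights(X, y)
    print("feature ranking:", np.argsort(-w))
```

Because 1 − erf(ρ/c) is bounded, badly mislabelled or outlying samples contribute at most a constant to the objective, which is the robustness property the abstract refers to; the L2 penalty keeps the feature weights from growing without bound.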
| Original language | English |
|---|---|
| Pages (from-to) | 180-198 |
| Number of pages | 19 |
| Journal | Knowledge-Based Systems |
| Volume | 54 |
| DOIs | |
| State | Published - Dec 2013 |
| Externally published | Yes |
Keywords
- Brownboost loss
- Feature selection
- Margin
- Regularization
- Robustness