A new weighted least squares support vector machines and its sequential minimal optimization algorithm

  • Liguo Wang*
  • Ye Zhang
  • Junping Zhang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Least squares support vector machines (LSSVM) have been widely used in pattern recognition and artificial intelligence in recent years for their efficiency in classification and regression. The LSSVM solution is an optimization problem with a sum-of-squared-error (SSE) cost function and only equality constraints, so it can be obtained by solving a simple linear system. However, its generalization performance is sensitive to the noise points and outliers that are often present in training datasets. To endow LSSVM with robustness, a new method for computing the weight vector of the errors is proposed, and substituting this weighted error vector for the original error vector in LSSVM yields a new weighted LSSVM. The method obtains each weight factor from the distance between a sample and its corresponding class center. The sequential minimal optimization (SMO) algorithm is also extended to the new method for efficient application. Comparison experiments show the superiority of the new method in terms of generalization performance, robustness, and sparse approximation. In particular, the new method is much faster than the compared method for large numbers of samples.
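The weighted LSSVM idea described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the paper's algorithm: the weight formula (`class_center_weights`) is a hypothetical down-weighting based on distance to the class center, the paper's exact formula is not reproduced here, and the linear system is solved directly rather than with the extended SMO solver the paper proposes.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between row sets A and B.
    d2 = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def class_center_weights(X, y):
    # Hypothetical weighting: samples far from their class center
    # (likely noise points or outliers) receive smaller weights.
    w = np.empty(len(y))
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        d = np.linalg.norm(X[idx] - X[idx].mean(axis=0), axis=1)
        w[idx] = 1.0 / (1.0 + d / (d.mean() + 1e-12))
    return w

def weighted_lssvm_fit(X, y, gamma=1.0, sigma=1.0):
    # Standard weighted-LSSVM dual system (classification):
    #   [ 0     y^T            ] [b]     [0]
    #   [ y     Omega + V^-1   ] [alpha] = [1]
    # with Omega_ij = y_i y_j K(x_i, x_j) and V = gamma * diag(v),
    # so larger weights v_i penalize errors on sample i more.
    n = len(y)
    v = class_center_weights(X, y)
    Omega = (y[:, None] * y[None, :]) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.diag(1.0 / (gamma * v))
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # alpha, b

def weighted_lssvm_predict(X_train, y_train, alpha, b, X_test, sigma=1.0):
    # Decision function: sign(sum_i alpha_i y_i K(x, x_i) + b).
    K = rbf_kernel(X_test, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)
```

For large training sets a direct solve costs O(n^3); the paper's point is that an SMO-style decomposition avoids forming and factoring this full system.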

Original language: English
Pages (from-to): 285-288
Number of pages: 4
Journal: Chinese Journal of Electronics
Volume: 17
Issue number: 2
State: Published - Apr 2008

Keywords

  • Robust property
  • Sequential minimal optimization (SMO) algorithm
  • Sparse approximation
  • Weighted least squares support vector machines (WLSSVM)
