Computer Engineering and Applications ›› 2007, Vol. 43 ›› Issue (32): 168-170.

• Database and Information Processing •


Classification algorithm of fast least squares support vector machine

KONG Rui, ZHANG Bing

  1. Department of Computer Science, Zhuhai College, Jinan University, Zhuhai, Guangdong 519070, China
  • Online:2007-11-11 Published:2007-11-11
  • Contact: KONG Rui


Abstract: Least Squares Support Vector Machines (LS-SVM) obtain the optimal classification surface by solving a set of linear equations instead of a convex quadratic programming problem. However, the LS-SVM solution loses the sparsity property, so when the training set is large the computational cost becomes substantial. This paper presents a Fast Least Squares Support Vector Machine (FLS-SVM) algorithm that preserves the generalization ability of the SVM while improving training speed; the speed advantage is most pronounced on large training sets. The new algorithm first selects from the full training set those samples with larger support values to form a reduced training set, and then trains an LS-SVM on the reduced set to obtain an approximately optimal solution. Experimental results verify that the new algorithm achieves the same generalization ability as the original LS-SVM while training faster.
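The procedure described in the abstract can be sketched in code. The LS-SVM training step solves the standard linear system in the bias b and the support values alpha; the reduction step keeps the samples whose support values have the largest magnitude and retrains on them. This is a minimal illustration, not the paper's implementation: the RBF kernel, the hyperparameter values, the fixed keep fraction, and the assumption that an initial full-set LS-SVM pass supplies the support values are all choices made here for the sketch.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel (kernel choice assumed here)
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    # LS-SVM replaces the QP with one linear system:
    #   [ 0   1^T            ] [ b     ]   [ 0 ]
    #   [ 1   K + I / gamma  ] [ alpha ] = [ y ]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # bias b, support values alpha

def lssvm_predict(X_train, alpha, b, X_test, sigma=1.0):
    return np.sign(rbf_kernel(X_test, X_train, sigma) @ alpha + b)

def fls_svm(X, y, keep=0.3, gamma=10.0, sigma=1.0):
    # Step 1 (assumed): a full-set LS-SVM pass yields the support values.
    _, alpha = lssvm_train(X, y, gamma, sigma)
    # Step 2: keep the fraction of samples with the largest |alpha|.
    m = max(2, int(keep * len(y)))
    idx = np.argsort(-np.abs(alpha))[:m]
    # Step 3: retrain LS-SVM on the reduced training set only.
    b_r, alpha_r = lssvm_train(X[idx], y[idx], gamma, sigma)
    return X[idx], alpha_r, b_r
```

The speed gain comes from step 3: solving the (m+1)-by-(m+1) system on the reduced set costs O(m^3) instead of O(n^3), which dominates when n is large.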

Key words: sparsity property, Least Squares Support Vector Machine (LS-SVM), kernel function, Support Vector Machine (SVM)