Computer Engineering and Applications ›› 2018, Vol. 54 ›› Issue (5): 138-143. DOI: 10.3778/j.issn.1002-8331.1610-0104


Information entropy-based RVM-AdaBoost ensemble classifier

ZHAI Xiyang, WANG Xiaodan, LI Rui, JIA Qi   

  1. Institute of Air Defense and Anti-Missile, Air Force Engineering University, Xi’an 710051, China
  • Online: 2018-03-01    Published: 2018-03-13

Abstract: To address the problem that AdaBoost fails to effectively improve the classification performance of the Relevance Vector Machine (RVM), a new ensemble classifier combining RVM and AdaBoost on the basis of information entropy is proposed. The information entropy of a sample is defined from the posterior probability output by the RVM; the higher a sample's entropy, the more likely it is to be misclassified. An adaptive information entropy threshold filters out the high-entropy samples, which are then classified by an AdaBoost-based ensemble classifier, and the final decision is given jointly by the RVM and the ensemble. The very few samples that are not filtered out yet are still misclassified are treated as noise, which improves the stability of the combined classifier and avoids the degradation of the ensemble as the number of AdaBoost iterations grows. Experiments on UCI data sets evaluate the proposed classifier in terms of classification accuracy, efficiency and stability, and show that it effectively improves the performance of RVM and outperforms the AdaBoost-RVM and AdaBoost-ARVM classifiers.
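The abstract does not spell out the entropy definition, but for a two-class RVM, whose output is the posterior probability p(x) = p(y = 1 | x), the information entropy of a sample x is presumably the binary entropy

    H(x) = -p(x) \log_2 p(x) - (1 - p(x)) \log_2 (1 - p(x)),

which is largest when p(x) = 0.5, i.e. exactly when the RVM is least certain about a sample and the sample is most easily misclassified.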

Key words: Relevance Vector Machine (RVM), AdaBoost, information entropy, ensemble learning
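As a rough illustration of the combination scheme described in the abstract, the sketch below keeps the base classifier's decision for low-entropy samples and routes high-entropy samples to an AdaBoost ensemble. It is only a sketch under stated assumptions: scikit-learn's LogisticRegression stands in for the RVM (any estimator with predict_proba, such as an actual RVM implementation, could be substituted), the entropy threshold is fixed rather than adaptive, and the Breast Cancer Wisconsin data set (one of the UCI sets) is used; the data set, threshold value and stand-in classifier are illustrative choices, not taken from the paper.

    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = load_breast_cancer(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # Stage 1: a probabilistic classifier (stand-in for the RVM) gives p(y=1|x).
    base = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)).fit(X_tr, y_tr)
    p = base.predict_proba(X_te)[:, 1]

    # Information entropy of each test sample, computed from its posterior probability.
    eps = 1e-12
    H = -(p * np.log2(p + eps) + (1.0 - p) * np.log2(1.0 - p + eps))

    # Stage 2: an AdaBoost ensemble classifies only the high-entropy (uncertain) samples.
    ada = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

    threshold = 0.5                      # fixed here; the paper adapts this threshold
    uncertain = H > threshold
    y_pred = (p >= 0.5).astype(int)      # low-entropy samples keep the base decision
    if uncertain.any():
        y_pred[uncertain] = ada.predict(X_te[uncertain])

    print("combined accuracy:", (y_pred == y_te).mean())
    print("samples routed to AdaBoost:", int(uncertain.sum()), "of", len(y_te))

Confident decisions are kept from the base classifier and only the uncertain samples are handed to the ensemble, which is the division of labour the abstract credits for the gains in efficiency and stability.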
