Computer Engineering and Applications ›› 2018, Vol. 54 ›› Issue (18): 174-179.DOI: 10.3778/j.issn.1002-8331.1705-0338
KONG Xiangxin, ZHOU Wei, WANG Xiaodan, YU Mingqiu
Abstract: In incremental learning based on Support Vector Data Description (SVDD), existing methods often discard samples that are useful for constructing the new model, which leads to a serious drop in the accuracy of incremental learning. To address this problem, a new removing algorithm for incremental SVDD learning (NISVDD) is proposed. Based on an analysis of how the support vector set changes during incremental learning, an adaptive learning threshold α is defined to filter out the samples that may become new support vectors. Furthermore, a sample removing mechanism is proposed to avoid repeatedly training on useless samples. Experiments on UCI benchmark datasets and simulated warhead-target HRRP data show that the new algorithm achieves higher classification accuracy than conventional incremental algorithms while improving the training efficiency of the model.
Key words: Support Vector Data Description(SVDD), incremental learning, adaptive threshold, removing method
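To make the filtering idea concrete, the following is a minimal sketch of one incremental round, not the paper's implementation. It uses scikit-learn's OneClassSVM as a stand-in for an SVDD trainer (SVDD with a Gaussian kernel is equivalent to the one-class SVM), and the particular quantile-based choice of the threshold α is an illustrative assumption, since the paper's exact adaptive definition is not given in the abstract:

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Initial batch and an incremental batch drawn from the same target class.
X_old = rng.normal(0.0, 1.0, size=(200, 2))
X_new = rng.normal(0.0, 1.0, size=(100, 2))

# SVDD with a Gaussian kernel is equivalent to the one-class SVM,
# so OneClassSVM serves as an SVDD stand-in here.
model = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.1).fit(X_old)

# Removing mechanism: keep only the old support vectors; the remaining
# old samples are discarded and never retrained on.
sv_old = X_old[model.support_]

# Adaptive threshold alpha -- here a quantile of the old support
# vectors' decision values (an assumed, illustrative definition).
alpha = np.quantile(model.decision_function(sv_old), 0.75)

# Filter: keep only new samples near or outside the current boundary,
# i.e. those likely to become support vectors of the updated model.
keep = model.decision_function(X_new) <= alpha
work_set = np.vstack([sv_old, X_new[keep]])

# Retrain on the reduced working set instead of all 300 samples.
model = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.1).fit(work_set)
print(len(work_set), "of", len(X_old) + len(X_new), "samples retrained")
```

The intended benefit mirrors the abstract's claim: the working set stays much smaller than the accumulated data, so retraining is cheaper, while boundary-relevant samples are retained.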
KONG Xiangxin, ZHOU Wei, WANG Xiaodan, YU Mingqiu. Removing algorithm for incremental SVDD learning[J]. Computer Engineering and Applications, 2018, 54(18): 174-179.
URL: http://cea.ceaj.org/EN/10.3778/j.issn.1002-8331.1705-0338
http://cea.ceaj.org/EN/Y2018/V54/I18/174