Computer Engineering and Applications (计算机工程与应用), 2010, Vol. 46, Issue (20): 138-141. DOI: 10.3778/j.issn.1002-8331.2010.20.039

• Artificial Intelligence •

Improvement of BP neural networks based on fuzzy entropy

HUA Qiang, ZHAO Bo-yi, GAO Yue

  1. Machine Learning Center, College of Mathematics and Computer Science, Hebei University, Baoding, Hebei 071002, China
  • Received: 2010-04-15  Revised: 2010-05-17  Online: 2010-07-11  Published: 2010-07-11
  • Contact: HUA Qiang

Abstract: Due to their simple topological structure and universal approximation ability, backpropagation (BP) neural networks have been widely used in prediction and nonlinear system modeling. However, deficiencies of the algorithm itself give rise to many problems in practical applications, so the optimization of BP networks has become an important research topic. To improve the generalization ability of BP networks, a fuzzy entropy term is added to the performance function of the network, and a BP algorithm based on fuzzy entropy is proposed. The two algorithms are compared in experiments, and the results show that the improved algorithm effectively improves test accuracy and avoids over-fitting.
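
As a rough illustration of the idea described in the abstract, and not the paper's exact formulation, the Python sketch below augments an ordinary mean-squared-error performance function with a fuzzy entropy term. It assumes the standard De Luca-Termini fuzzy entropy H(mu) = -Σ_i [mu_i ln(mu_i) + (1 - mu_i) ln(1 - mu_i)], treats sigmoid hidden activations as fuzzy membership degrees, and introduces a hypothetical weighting coefficient lam; the sign and weighting actually used in the paper may differ.

import numpy as np

def fuzzy_entropy(mu, eps=1e-12):
    # De Luca-Termini fuzzy entropy of membership degrees mu in (0, 1);
    # eps guards the logarithms at the interval boundaries.
    mu = np.clip(mu, eps, 1.0 - eps)
    return -np.sum(mu * np.log(mu) + (1.0 - mu) * np.log(1.0 - mu))

def performance(y_pred, y_true, memberships, lam=0.01):
    # Illustrative performance function: mean squared error plus a
    # fuzzy-entropy term weighted by the hypothetical coefficient lam.
    mse = 0.5 * np.mean((y_pred - y_true) ** 2)
    return mse + lam * fuzzy_entropy(memberships)

# Toy usage: sigmoid hidden activations are read as membership degrees.
rng = np.random.default_rng(0)
hidden = 1.0 / (1.0 + np.exp(-rng.normal(size=(4, 8))))
y_pred = rng.normal(size=(4, 1))
y_true = rng.normal(size=(4, 1))
print(performance(y_pred, y_true, hidden))

In this sketch the entropy term is simply added with a positive weight, which penalizes ambiguous activations near 0.5 and leaves the trade-off between data fit and the regularizing entropy term to lam; this is only one plausible way of combining the two quantities.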

Key words: backpropagation neural networks, over-fitting, fuzzy entropy

CLC Number: