Computer Engineering and Applications, 2009, Vol. 45, Issue (29): 34-36. DOI: 10.3778/j.issn.1002-8331.2009.29.010

• Research and Discussion •

Research on the PSO Particle Swarm Algorithm for Neural Network Generalization Ability

LIU Jun1, QIU Xiao-hong1,2, WANG Zhi-yong1, YANG Peng1

  1. College of Computer, Jiangxi Normal University, Nanchang 330022, China
    2. School of Software, Jiangxi Agricultural University, Nanchang 330045, China
  • Received: 2008-11-12  Revised: 2009-02-05  Online: 2009-10-11  Published: 2009-10-11
  • Contact: LIU Jun

Research on PSO algorithm in neural network generalization

LIU Jun1, QIU Xiao-hong1,2, WANG Zhi-yong1, YANG Peng1

  1. College of Computer, Jiangxi Normal University, Nanchang 330022, China
    2. School of Software, Jiangxi Agricultural University, Nanchang 330045, China
  • Received: 2008-11-12  Revised: 2009-02-05  Online: 2009-10-11  Published: 2009-10-11
  • Contact: LIU Jun

Abstract: The PSO (particle swarm optimization) algorithm is used to optimize the weights and thresholds of a neural network as well as the transfer-function coefficients of the neurons in the hidden layer. To address the case where the network trains well but generalizes poorly, the mean squared error on the training samples is combined with the sum of the squared weights to form the objective function of the PSO algorithm. Experiments show that this method outperforms the inertia-weight PSO-BP algorithm and the basic gradient descent method: it is more stable, achieves higher prediction accuracy, and its generalization ability is clearly strengthened.
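
A minimal sketch of the combined objective described above (the weighting factor λ is an assumption; the abstract does not state how the two terms are balanced):

    F(w) = MSE_train(w) + λ · Σ_i w_i²

where MSE_train(w) is the mean squared error over the training samples and the second term penalizes large weights, which is what strengthens generalization.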

Keywords: BP network, PSO particle swarm algorithm, transfer function, approximation, generalization

Abstract: This paper employs the PSO algorithm to update the weights, the biases, and the transfer-function coefficients of the hidden layer in a neural network. To address the phenomenon of good approximation but poor generalization, the MSE of the training set and the MSW of the weights are combined into the fitness function. In the experiments, the GPSO-BP algorithm, which optimizes the transfer-function coefficients and keeps the weights and thresholds small, outperforms the BP algorithm and the PSO-BP algorithm in terms of mean correct recognition rate and stability.
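
For illustration only, the following Python sketch shows one way such a PSO fitness function could be assembled. It is not the authors' code: the network structure (one hidden layer with a per-neuron sigmoid slope coefficient), the variable names, and the weighting factor lambda_w are all assumptions made for the sketch.

    import numpy as np

    def unpack(particle, n_in, n_hid, n_out):
        """Split a flat particle vector into weights, biases and the assumed
        per-neuron transfer-function (sigmoid slope) coefficients of the hidden layer."""
        i = 0
        w1 = particle[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
        b1 = particle[i:i + n_hid]; i += n_hid
        w2 = particle[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
        b2 = particle[i:i + n_out]; i += n_out
        c = particle[i:i + n_hid]  # assumed transfer-function coefficients, one per hidden neuron
        return w1, b1, w2, b2, c

    def fitness(particle, X, y, n_in, n_hid, n_out, lambda_w=0.1):
        """Fitness = training MSE + lambda_w * mean squared weights (MSW)."""
        w1, b1, w2, b2, c = unpack(particle, n_in, n_hid, n_out)
        hidden = 1.0 / (1.0 + np.exp(-c * (X @ w1 + b1)))  # sigmoid with tunable slope c
        y_hat = hidden @ w2 + b2                           # linear output layer
        mse = np.mean((y_hat - y) ** 2)                    # training-sample mean squared error
        msw = np.mean(np.concatenate([w1.ravel(), w2.ravel()]) ** 2)  # mean squared weights
        return mse + lambda_w * msw                        # objective minimized by PSO

    # Example usage with random data and a random particle (dimension check only)
    rng = np.random.default_rng(0)
    n_in, n_hid, n_out = 3, 5, 1
    dim = n_in * n_hid + n_hid + n_hid * n_out + n_out + n_hid
    X, y = rng.normal(size=(20, n_in)), rng.normal(size=(20, n_out))
    print(fitness(rng.normal(size=dim), X, y, n_in, n_hid, n_out))

Any PSO routine that minimizes a scalar function of a flat parameter vector could be pointed at fitness; the weight-penalty term is what discourages the large weights associated with overfitting.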

Key words: Back Propagation (BP) neural network, Particle Swarm Optimization (PSO) algorithm, transfer function, approximation, generalization
