Computer Engineering and Applications ›› 2007, Vol. 43 ›› Issue (35): 49-51.

• Academic Discussion •


New efficient computational optimization algorithm

BAO Jiang-hong, LI Jiong-cheng

  1. School of Mathematical Sciences, South China University of Technology, Guangzhou 510641, China
  • Received:1900-01-01 Revised:1900-01-01 Online:2007-12-11 Published:2007-12-11
  • Contact: BAO Jiang-hong


Abstract: A new computational optimization algorithm for unconstrained problems is proposed. The algorithm drives the Euclidean norm of the gradient of the objective function toward zero, thereby minimizing the objective function; it is accordingly named the gradient shrink method. It combines the advantages of Newton's method and the conjugate gradient method: by exploiting second-order derivatives of the objective function it converges rapidly, and it retains the quadratic termination property of Newton's method. When the Hessian matrix is singular, Newton's method cannot proceed, but the gradient shrink method avoids this drawback and can quickly determine whether the iteration has converged to a saddle point.
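The abstract does not give the paper's exact update rule, so the following is only a minimal illustrative sketch of the general idea it describes: shrink the Euclidean norm of the gradient to zero with Newton steps, with a fallback step when the Hessian is singular. The function names, the plain steepest-descent fallback (used here as a stand-in for the paper's conjugate-gradient component), and the test function are all assumptions for illustration, not the authors' method.

```python
# Hedged sketch of a Newton-type iteration that monitors the gradient norm.
# NOT the paper's "gradient shrink method" itself; the fallback below is a
# simple stand-in for its unspecified conjugate-gradient component.
import numpy as np

def minimize_gradient_shrink(f_grad, f_hess, x0, tol=1e-8, max_iter=100):
    """Iterate until the Euclidean norm of the gradient shrinks below tol."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = f_grad(x)
        if np.linalg.norm(g) < tol:        # gradient norm has shrunk to ~0
            break
        try:
            step = np.linalg.solve(f_hess(x), -g)  # Newton step: H s = -g
        except np.linalg.LinAlgError:
            step = -g                      # singular Hessian: steepest-descent fallback
        x = x + step
    return x

# Example: minimize f(x, y) = (x - 1)^2 + 2*(y + 3)^2, minimum at (1, -3).
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 3)])
hess = lambda x: np.array([[2.0, 0.0], [0.0, 4.0]])
x_min = minimize_gradient_shrink(grad, hess, [0.0, 0.0])
print(x_min)  # reaches (1, -3) in one Newton step: quadratic termination
```

On this quadratic the single Newton step lands exactly on the minimizer, which is the "quadratic termination" property the abstract attributes to Newton's method and to the proposed algorithm.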

Key words: computational optimization, Newton’s method, conjugate gradient method, Hessian matrix, gradient