Computer Engineering and Applications ›› 2022, Vol. 58 ›› Issue (17): 34-49. DOI: 10.3778/j.issn.1002-8331.2203-0195

• Research Hotspots and Reviews •

Research Progress of Neural Network Based on Non-Gradient Optimization Methods

SHENG Lei, CHEN Xiliang, KANG Kai

  1. College of Command and Control Engineering, Army Engineering University, Nanjing 210007, China
  • Online: 2022-09-01  Published: 2022-09-01

Abstract: Neural network optimization is a fundamental frontier topic in the field of machine learning. Compared with purely gradient-based neural network optimization algorithms, non-gradient algorithms show greater advantages in addressing problems such as slow convergence, susceptibility to local optima, and the inability to handle non-differentiable objectives. After analyzing the advantages and disadvantages of gradient-based neural network methods, this paper first reviews representative non-gradient optimization methods, covering feedforward neural network optimization and stochastic search optimization. It then analyzes the strengths, weaknesses, and applications of non-gradient optimization methods from the perspectives of basic theory, the steps of neural network training, and convergence. Finally, the theoretical and practical challenges facing non-gradient neural network training algorithms are summarized and future research directions are outlined.
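
As an illustrative sketch of the contrast the abstract draws (not an algorithm from the reviewed paper), the following minimal Python example trains a tiny feedforward network by pure random search: candidate weights are perturbed with Gaussian noise and accepted only when the loss decreases, so no gradients or differentiability are required. The toy task, network size, and perturbation scale are assumptions chosen for brevity.

    # Illustrative sketch only: random-search (hill-climbing) training of a tiny
    # feedforward network. Hyperparameters and the toy task are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy regression data: y = sin(x) on [-3, 3]
    X = np.linspace(-3, 3, 64).reshape(-1, 1)
    y = np.sin(X)

    def init_params(hidden=16):
        return {
            "W1": rng.normal(0, 0.5, (1, hidden)),
            "b1": np.zeros(hidden),
            "W2": rng.normal(0, 0.5, (hidden, 1)),
            "b2": np.zeros(1),
        }

    def forward(p, X):
        h = np.tanh(X @ p["W1"] + p["b1"])   # hidden layer
        return h @ p["W2"] + p["b2"]         # linear output

    def loss(p):
        return float(np.mean((forward(p, X) - y) ** 2))

    # Non-gradient training loop: perturb all weights with Gaussian noise and
    # keep the candidate only if it lowers the loss (no derivatives needed).
    params = init_params()
    best = loss(params)
    for step in range(5000):
        cand = {k: v + rng.normal(0, 0.05, v.shape) for k, v in params.items()}
        cand_loss = loss(cand)
        if cand_loss < best:
            params, best = cand, cand_loss

    print(f"final MSE: {best:.4f}")

A gradient-based trainer would instead backpropagate through forward() and move the weights along the negative gradient; the random-search loop trades per-step efficiency for independence from differentiability, which is the trade-off the survey examines.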

Key words: deep learning, neural network, training algorithm, optimization theory, non-gradient optimization algorithm
