Abstract
In the fields of machine learning and artificial intelligence, the gradient descent method is widely used to train parameters so that a network model reaches its expected goal. However, the conventional algorithm not only suffers from slow convergence and long training time, but also easily falls into local extrema, causing premature termination and preventing the result from reaching the expected accuracy. To address these problems, this paper proposes a new stochastic gradient method based on Gaussian noise perturbation. During training, Gaussian noise is injected to improve the random search ability; combined with advanced optimization strategies, the algorithm gains strong global search capability while still converging quickly to the desired goal, thereby meeting the requirements of speed and high accuracy in various application scenarios. Simulation results show that the proposed algorithm substantially outperforms the conventional algorithm and has lower time complexity.
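The abstract does not give the paper's exact update rule, but the idea it describes can be sketched generically: add zero-mean Gaussian noise to each gradient step and anneal the noise toward zero, so the search is global early on and behaves like plain gradient descent later. Everything below (the function `noisy_gradient_descent`, the learning rate, the geometric noise decay, and the 1-D test function) is an illustrative assumption, not the authors' algorithm.

```python
import random
import math

def noisy_gradient_descent(grad, x0, lr=0.05, sigma0=1.0, decay=0.99,
                           steps=2000, seed=0):
    """Gradient descent with Gaussian noise perturbation (generic sketch).

    Each step applies  x <- x - lr * grad(x) + N(0, sigma_t^2),
    where sigma_t shrinks geometrically, so early iterations explore
    widely and late iterations reduce to ordinary gradient descent.
    This is an illustrative scheme, not the update rule from the paper.
    """
    rng = random.Random(seed)
    x, sigma = x0, sigma0
    for _ in range(steps):
        x = x - lr * grad(x) + rng.gauss(0.0, sigma)
        sigma *= decay  # anneal the perturbation toward zero
    return x

# Hypothetical multimodal objective: f(x) = x^2 + 10*sin(x) has a
# local minimum near x ~ 3.84 and the global minimum near x ~ -1.31.
f = lambda x: x * x + 10.0 * math.sin(x)
df = lambda x: 2.0 * x + 10.0 * math.cos(x)

# With sigma0 = 0 this reduces to plain gradient descent, which started
# at x0 = 4 settles in the local basin; with noise it can escape.
x_plain = noisy_gradient_descent(df, x0=4.0, sigma0=0.0)
x_noisy = noisy_gradient_descent(df, x0=4.0)
```

Because the escape is stochastic, any single seed may or may not reach the global basin; in practice one would run several seeds or tune the initial noise scale and decay rate for the problem at hand.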
Authors
ZHU Zhiguang; WANG Yong (School of Information Science and Technology, University of Science and Technology of China, Anhui 230022, China)
Source
Electronic Technology (Shanghai), 2021, No. 8, pp. 4-7 (4 pages)
Keywords
Gaussian noise
gradient descent method
algorithm optimization
global search
artificial neural network
About the Authors
ZHU Zhiguang, School of Information Science and Technology, University of Science and Technology of China, master's student; research interest: algorithm optimization in machine learning. Corresponding author: WANG Yong, School of Information Science and Technology, University of Science and Technology of China, professor and doctoral supervisor; research interests: control of moving bodies, active vibration control, robot control, and information fusion.