Paper
24 October 2017
Adaptive learning rate method based on Nesterov accelerated gradient
Zhenxing Xu, Ping Yang, Bing Xu, and Heping Li
Proceedings Volume 10462, AOPC 2017: Optical Sensing and Imaging Technology and Applications; 104622R (2017) https://doi.org/10.1117/12.2284928
Event: Applied Optics and Photonics China (AOPC2017), 2017, Beijing, China
Abstract
An adaptive-learning-rate stochastic gradient descent method is proposed in this paper, building on the Nesterov accelerated gradient (NAG) optimization algorithm. A second-order derivative approximation of the cost function is first computed; the final update direction is then corrected through the adaptive learning rate, and the convergence of the method is analyzed theoretically. The method requires no manual tuning of the learning rate, is robust to noisy gradient information and to the choice of hyper-parameters, and features high computational efficiency and low memory overhead. Finally, the method is compared with other stochastic gradient descent methods on the MNIST digit classification task; the experimental results show that the proposed method (Adan) converges faster and outperforms the other stochastic gradient descent optimization methods.
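The abstract does not give the exact update equations, so the following is only a minimal sketch of the general idea: NAG's look-ahead gradient combined with an RMSProp/Adam-style running estimate of squared gradients serving as a per-parameter adaptive learning rate. The function name, hyper-parameter values, and the specific accumulation rule are all assumptions for illustration, not the authors' published method.

```python
import numpy as np

def nag_adaptive_step(params, grad_fn, velocity, second_moment, t,
                      lr=0.1, momentum=0.9, beta2=0.999, eps=1e-8):
    """One hypothetical NAG step with a per-parameter adaptive learning rate.

    A sketch under stated assumptions: the "second derivative approximation"
    from the abstract is stood in for by a bias-corrected running average of
    squared gradients, as in RMSProp/Adam.
    """
    # NAG look-ahead: evaluate the gradient at the extrapolated point.
    lookahead = params + momentum * velocity
    grad = grad_fn(lookahead)

    # Running estimate of the squared gradient, with bias correction.
    second_moment = beta2 * second_moment + (1.0 - beta2) * grad ** 2
    sm_hat = second_moment / (1.0 - beta2 ** t)

    # Momentum update; the step is rescaled per parameter by the
    # adaptive denominator before being applied.
    velocity = momentum * velocity - lr * grad / (np.sqrt(sm_hat) + eps)
    params = params + velocity
    return params, velocity, second_moment

# Usage example: minimize f(x) = ||x||^2, whose gradient is 2x.
x = np.array([5.0, -3.0])
v = np.zeros_like(x)
s = np.zeros_like(x)
for t in range(1, 201):
    x, v, s = nag_adaptive_step(x, lambda p: 2.0 * p, v, s, t)
print(x)  # close to the minimizer [0, 0]
```

In this reading, the bias-corrected second-moment term plays the role of the abstract's second-derivative approximation, shrinking steps along directions with consistently large gradients while leaving flat directions comparatively large steps.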
© (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Zhenxing Xu, Ping Yang, Bing Xu, and Heping Li "Adaptive learning rate method based on Nesterov accelerated gradient", Proc. SPIE 10462, AOPC 2017: Optical Sensing and Imaging Technology and Applications, 104622R (24 October 2017); https://doi.org/10.1117/12.2284928