Title | An improved Adam algorithm using look-ahead |
Creator | Zhu, An; Meng, Yu; Zhang, Changjiang |
Date Issued | 2017-06-02 |
Conference Name | 2017 International Conference on Deep Learning Technologies, ICDLT 2017 |
Source Publication | ACM International Conference Proceeding Series |
Volume | Part F128535 |
Pages | 19-22 |
Conference Date | 2 June 2017 to 4 June 2017 |
Conference Place | Chengdu |
Abstract | Adam is a state-of-the-art algorithm for optimizing stochastic objective functions. In this paper we propose Adam with Look-ahead (AWL), an updated version of Adam that applies a look-ahead method controlled by an additional hyperparameter. We first perform a convergence analysis, showing that AWL has convergence properties similar to Adam's. We then compare AWL with Adam experimentally on two models: logistic regression and a two-layer fully connected neural network. The results demonstrate that AWL outperforms Adam, achieving higher accuracy in less convergence time. Our newly proposed algorithm AWL therefore has great potential to be widely applied in many fields of science and engineering. |
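The abstract describes AWL only at a high level and does not reproduce the update rule. The sketch below is a hypothetical illustration of how a look-ahead step with a single interpolation hyperparameter (here called `alpha`, with a sync period `k`) could be layered on top of standard Adam; the function name, parameter names, and the periodic-interpolation scheme are assumptions, not the paper's actual AWL algorithm.

```python
import numpy as np

def adam_with_lookahead(grad_fn, w, steps=1000, lr=1e-3,
                        beta1=0.9, beta2=0.999, eps=1e-8,
                        k=5, alpha=0.5):
    """Hypothetical sketch: standard Adam inner updates plus a
    periodic look-ahead interpolation toward the fast weights.
    `alpha` and `k` stand in for the paper's unstated hyperparameter."""
    m = np.zeros_like(w)   # first-moment (mean) estimate
    v = np.zeros_like(w)   # second-moment (uncentered variance) estimate
    slow = w.copy()        # look-ahead ("slow") weights
    for t in range(1, steps + 1):
        g = grad_fn(w)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)   # bias-corrected first moment
        v_hat = v / (1 - beta2 ** t)   # bias-corrected second moment
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)  # Adam step
        if t % k == 0:                 # look-ahead synchronization
            slow = slow + alpha * (w - slow)
            w = slow.copy()
    return w
```

As a quick sanity check, running this on a convex quadratic (gradient `2 * w`) drives the weights toward the minimizer at the origin.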
Keyword | Adam; Gradient-based optimizer; Look-ahead; Machine learning |
DOI | 10.1145/3094243.3094249 |
Language | English |
Scopus ID | 2-s2.0-85025123354 |
Citation statistics | Cited Times [WOS]: 0 |
Document Type | Conference paper |
Identifier | http://repository.uic.edu.cn/handle/39GCC9TT/9207 |
Collection | Research outside affiliated institution |
Affiliation | Dept. of Computer Science, Wenzhou-Kean University, 88 Daxue Road, Wenzhou, China |
Recommended Citation GB/T 7714 | Zhu, An,Meng, Yu,Zhang, Changjiang. An improved Adam algorithm using look-ahead[C], 2017: 19-22. |
Files in This Item: | There are no files associated with this item. |
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.