Title | An improved Adam algorithm using look-ahead |
Authors | Zhu, An; Meng, Yu; Zhang, Changjiang |
Publication Date | 2017-06-02 |
Conference Name | 2017 International Conference on Deep Learning Technologies, ICDLT 2017 |
Proceedings Title | ACM International Conference Proceeding Series |
Volume | Part F128535 |
Pages | 19-22 |
Conference Dates | 2 June 2017 to 4 June 2017 |
Conference Location | Chengdu |
Abstract | Adam is a state-of-the-art algorithm for optimizing stochastic objective functions. In this paper we propose Adam with Look-ahead (AWL), an updated version obtained by applying a look-ahead method controlled by a hyperparameter. We first perform a convergence analysis, showing that AWL has convergence properties similar to those of Adam. We then conduct experiments comparing AWL with Adam on two models: logistic regression and a two-layer fully connected neural network. The results demonstrate that AWL outperforms Adam, achieving higher accuracy in less convergence time. Our newly proposed AWL algorithm therefore has great potential for wide use in many fields of science and engineering. |
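The abstract describes AWL only at a high level (Adam plus a look-ahead step governed by a hyperparameter), so the following is a minimal illustrative sketch, not the paper's exact update rule. It combines the standard Adam moment estimates with a Nesterov-style look-ahead: the gradient is evaluated at a point shifted along the current momentum direction, scaled by a hypothetical hyperparameter `la`.

```python
import numpy as np

def adam_lookahead_step(theta, grad_fn, m, v, t, lr=0.001,
                        beta1=0.9, beta2=0.999, eps=1e-8, la=0.5):
    """One Adam step with a Nesterov-style look-ahead.

    Illustrative sketch only: `la` and the placement of the look-ahead
    are assumptions, not the AWL update from the paper.
    """
    # Look-ahead: evaluate the gradient at a point shifted along the
    # current (bias-uncorrected) momentum direction.
    theta_ahead = theta - la * lr * m / (np.sqrt(v) + eps)
    g = grad_fn(theta_ahead)

    # Standard Adam moment updates with bias correction.
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)

    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

For example, minimizing f(x) = x² (gradient 2x) from x = 1 drives the iterate toward 0; setting `la = 0` recovers plain Adam, so the hyperparameter interpolates between the two behaviors.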
Keywords | Adam; Gradient-based optimizer; Look-ahead; Machine learning |
DOI | 10.1145/3094243.3094249 |
URL | View source |
Language | English |
Scopus Accession Number | 2-s2.0-85025123354 |
Document Type | Conference Paper |
Item Identifier | https://repository.uic.edu.cn/handle/39GCC9TT/9207 |
Collection | Personal research output produced outside this institution |
Author Affiliation | Dept. of Computer Science, Wenzhou-Kean University, 88 Daxue Road, Wenzhou, China |
Recommended Citation (GB/T 7714) | Zhu, An, Meng, Yu, Zhang, Changjiang. An improved Adam algorithm using look-ahead[C], 2017: 19-22. |
Files in This Item | No files associated with this item. |
Unless otherwise stated, all content in this repository is protected by copyright, with all rights reserved.