Publication status | Published |
Title | Gradient amplification: An efficient way to train deep neural networks |
Authors | Basodi, Sunitha; Ji, Chunyan; Zhang, Haiping; Pan, Yi |
Publication date | 2020-09-01 |
Journal | Big Data Mining and Analytics |
ISSN/eISSN | 2096-0654 |
Volume/Issue/Pages | Vol. 3, Issue 3, pp. 196-207 |
Abstract | Improving the performance of deep learning models and reducing their training times are ongoing challenges in deep neural networks. Several approaches have been proposed to address these challenges, one of which is to increase the depth of the neural networks. Such deeper networks not only increase training times but also suffer from the vanishing gradient problem during training. In this work, we propose a gradient amplification approach for training deep learning models to prevent vanishing gradients, and we also develop a training strategy that enables or disables gradient amplification across epochs with different learning rates. We perform experiments on VGG-19 and ResNet models (ResNet-18 and ResNet-34) and study the impact of the amplification parameters on these models in detail. Our proposed approach improves the performance of these deep learning models even at higher learning rates, thereby allowing them to achieve higher performance with reduced training time. |
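The core idea in the abstract, scaling up gradients of selected layers during backpropagation and toggling this on or off across epochs, can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact method: the layer selection, amplification factor, and epoch schedule below are all hypothetical assumptions.

```python
import numpy as np

def amplify_gradients(grads, amp_layers, factor):
    # Scale the gradients of selected layers by a constant factor so that
    # small early-layer gradients are less prone to vanishing.
    return [g * factor if i in amp_layers else g for i, g in enumerate(grads)]

def amplification_enabled(epoch, amp_epochs):
    # Hypothetical schedule: amplification is active only during the
    # given set of epochs (e.g., early epochs with a high learning rate).
    return epoch in amp_epochs

# Toy per-layer gradients for a three-layer network (earliest layer first);
# note how much smaller the early-layer gradients are.
grads = [np.array([1e-4, -2e-4]), np.array([1e-2]), np.array([0.1, 0.2])]

epoch = 2
if amplification_enabled(epoch, amp_epochs=set(range(5))):
    grads = amplify_gradients(grads, amp_layers={0, 1}, factor=10.0)

print(grads[0])  # early-layer gradients scaled up tenfold
```

In practice one would apply such scaling inside the training loop before the optimizer step; the sketch only shows the amplification and scheduling logic in isolation.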
Keywords | Backpropagation; Deep learning; Gradient amplification; Learning rate; Vanishing gradients |
DOI | 10.26599/BDMA.2020.9020004 |
Indexed by | ESCI |
Language | English |
WoS research area | Computer Science |
WoS categories | Computer Science, Artificial Intelligence; Computer Science, Information Systems |
WoS accession number | WOS:000895960100004 |
Scopus EID | 2-s2.0-85094645075 |
Citation statistics | |
Document type | Journal article |
Item identifier | https://repository.uic.edu.cn/handle/39GCC9TT/13017 |
Collection | Personal research output produced outside this institution |
Corresponding author | Pan, Yi |
Author affiliations | 1. Department of Computer Science, Georgia State University, Atlanta, 30302, United States; 2. Center for High Performance Computing, Joint Engineering Research Center for Health Big Data Intelligent Analysis Technology, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China |
Recommended citation (GB/T 7714) | Basodi, Sunitha, Ji, Chunyan, Zhang, Haiping, et al. Gradient amplification: An efficient way to train deep neural networks[J]. Big Data Mining and Analytics, 2020, 3(3): 196-207. |
APA | Basodi, S., Ji, C., Zhang, H., & Pan, Y. (2020). Gradient amplification: An efficient way to train deep neural networks. Big Data Mining and Analytics, 3(3), 196-207. |
MLA | Basodi, Sunitha, et al. "Gradient amplification: An efficient way to train deep neural networks". Big Data Mining and Analytics 3.3 (2020): 196-207. |
Files in this item | No files are associated with this item. |
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.