Research Output Details

Publication Status: Published
Title: Gradient amplification: An efficient way to train deep neural networks
Authors: Basodi, Sunitha; Ji, Chunyan; Zhang, Haiping; Pan, Yi
Date Published: 2020-09-01
Journal: Big Data Mining and Analytics
ISSN/eISSN: 2096-0654
Volume: 3, Issue: 3, Pages: 196-207
Abstract

Improving the performance of deep learning models and reducing their training times are ongoing challenges in deep neural networks. Several approaches have been proposed to address these challenges, one of which is to increase the depth of the networks. Such deeper networks not only take longer to train, but also suffer from the vanishing gradient problem during training. In this work, we propose a gradient amplification approach for training deep learning models to prevent vanishing gradients, and we develop a training strategy that enables or disables gradient amplification across several epochs with different learning rates. We perform experiments on VGG-19 and ResNet models (ResNet-18 and ResNet-34) and study the impact of the amplification parameters on these models in detail. Our proposed approach improves the performance of these deep learning models even at higher learning rates, thereby allowing the models to achieve higher performance with reduced training time.
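The abstract outlines the core mechanism: gradients are scaled up as they propagate backward through selected layers, and this amplification is toggled on or off across epochs in coordination with the learning-rate schedule. The following is a minimal Python/PyTorch sketch of that general idea, not the authors' implementation; the GradientAmplifier class, the amplification factor, the choice of hooked layer, and the on/off schedule are all illustrative assumptions.

import torch
import torch.nn as nn

class GradientAmplifier:
    """Hypothetical sketch: scales gradients passing backward through
    the given layers by `factor`; can be toggled between epochs."""

    def __init__(self, layers, factor=2.0):
        self.factor = factor
        self.enabled = False
        for layer in layers:
            layer.register_full_backward_hook(self._hook)

    def _hook(self, module, grad_input, grad_output):
        if not self.enabled:
            return None  # leave gradients unchanged
        # grad_input holds the gradient w.r.t. the layer's forward input;
        # scaling it amplifies the signal reaching all earlier layers.
        return tuple(g * self.factor if g is not None else None
                     for g in grad_input)

# Toy usage: hook a deeper layer so that earlier layers receive amplified
# gradients, and enable amplification only during the early epochs.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
amplifier = GradientAmplifier([model[2]], factor=2.0)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(4):
    amplifier.enabled = epoch < 2  # the paper ties this to the LR schedule
    x, y = torch.randn(16, 10), torch.randint(0, 2, (16,))
    loss = nn.functional.cross_entropy(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

Hooking a deeper layer means the scaled gradient propagates to every earlier layer, which is where vanishing gradients hit hardest; the paper studies in detail how the amplification parameters and the enable/disable schedule interact with the learning rate.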

Keywords: Backpropagation; Deep learning; Gradient amplification; Learning rate; Vanishing gradients
DOI: 10.26599/BDMA.2020.9020004
Indexed By: ESCI
Language: English
WOS Research Area: Computer Science
WOS Categories: Computer Science, Artificial Intelligence; Computer Science, Information Systems
WOS Accession Number: WOS:000895960100004
Scopus Accession Number: 2-s2.0-85094645075
Citation Statistics
Document Type: Journal Article
Item Identifier: https://repository.uic.edu.cn/handle/39GCC9TT/13017
Collection: Personal research output produced outside this institution
Corresponding Author: Pan, Yi
Author Affiliations:
1. Department of Computer Science, Georgia State University, Atlanta, 30302, United States
2. Center for High Performance Computing, Joint Engineering Research Center for Health Big Data Intelligent Analysis Technology, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China
Recommended Citation
GB/T 7714:
Basodi, Sunitha, Ji, Chunyan, Zhang, Haiping, et al. Gradient amplification: An efficient way to train deep neural networks[J]. Big Data Mining and Analytics, 2020, 3(3): 196-207.
APA:
Basodi, S., Ji, C., Zhang, H., & Pan, Y. (2020). Gradient amplification: An efficient way to train deep neural networks. Big Data Mining and Analytics, 3(3), 196-207.
MLA:
Basodi, Sunitha, et al. "Gradient amplification: An efficient way to train deep neural networks." Big Data Mining and Analytics 3.3 (2020): 196-207.
Files in This Item:
No files associated with this item.
Rights Policy: No data available.

Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.