Research Output Details

Title: RMLANet: Random Multi-Level Attention Network for Shadow Detection
Authors: Jie, Leiping; Zhang, Hui
Publication Date: 2022
Conference: 2022 IEEE International Conference on Multimedia and Expo (ICME)
Proceedings: ICME 2022 Conference Proceedings
ISBN: 9781665485630
Conference Dates: 18-22 July 2022
Conference Location: Taipei
Host Country: China
Publisher: IEEE
Abstract

This paper addresses the problem of shadow detection from a single image. Previous approaches have shown that exploiting both global and local contexts in deep convolutional neural network layers can greatly improve performance. However, multi-level contexts remain underexplored. To address this, we propose RMLANet, a novel Random Multi-Level Attention Network. Specifically, we leverage shuffled multi-level features together with guiding features, and employ a transformer to capture global context. Furthermore, to reduce the computational and memory overhead caused by the self-attention mechanism in the vanilla transformer, we propose a random sampling strategy that reduces the number of inputs to the transformer. This is motivated by the local consistency of images, which suggests that dense attention is unnecessary. Extensive experimental results demonstrate that our method outperforms current state-of-the-art methods on three widely used benchmark datasets: SBU, ISTD, and UCF. © 2022 IEEE.
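The core efficiency idea in the abstract, randomly sampling tokens so that attention is computed over a subset rather than all pairs, can be illustrated with a minimal NumPy sketch. This is a hypothetical simplification for illustration, not the authors' implementation: `random_sampled_attention`, its arguments, and the single-head formulation are all assumptions; the paper's actual network operates on shuffled multi-level CNN features inside a full transformer.

```python
import numpy as np

def random_sampled_attention(features, num_samples, rng=None):
    """Single-head attention where keys/values are a random token subset.

    features: (N, d) array of flattened feature tokens.
    num_samples: number of tokens kept as keys/values (num_samples <= N).
    Illustrative sketch of the random-sampling idea, not the paper's code.
    """
    rng = np.random.default_rng(rng)
    n, d = features.shape
    # Randomly sample token indices; local consistency in images suggests
    # a sparse set of keys can approximate dense attention.
    idx = rng.choice(n, size=num_samples, replace=False)
    q = features                   # every token still acts as a query
    k = v = features[idx]          # only sampled tokens act as keys/values
    scores = q @ k.T / np.sqrt(d)  # (N, num_samples) instead of (N, N)
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)      # softmax over samples
    return attn @ v                # (N, d) attended features

tokens = np.random.default_rng(0).standard_normal((64, 16))
out = random_sampled_attention(tokens, num_samples=8, rng=0)
```

The score matrix shrinks from N×N to N×num_samples, which is the source of the claimed computational and memory savings.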

Keywords: Multi-level; Random Sampling; Shadow Detection; Transformer
DOI: 10.1109/ICME52920.2022.9860013
Language: English
Scopus ID: 2-s2.0-85137744268
Document Type: Conference Paper
Identifier: https://repository.uic.edu.cn/handle/39GCC9TT/11535
Collection: Faculty of Science and Technology
Corresponding Author: Zhang, Hui
Affiliations:
1. Department of Computer Science, Hong Kong Baptist University, Hong Kong SAR, China
2. Division of Science and Technology, BNU-HKBU United International College, Zhuhai, China
First Author Affiliation: BNU-HKBU United International College
Corresponding Author Affiliation: BNU-HKBU United International College
Recommended Citation
GB/T 7714
Jie, Leiping, Zhang, Hui. RMLANet: Random Multi-Level Attention Network for Shadow Detection[C]//2022 IEEE International Conference on Multimedia and Expo (ICME). IEEE, 2022.
Files in This Item:
No files are associated with this item.

Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.