Details of Research Outputs

Status: Forthcoming
Title: Deep Into the Domain Shift: Transfer Learning Through Dependence Regularization
Creator: Ma, Shumin; Yuan, Zhiri; Wu, Qi; Huang, Yiyan; Hu, Xixu; Huang, Zhixiang
Date Issued: 2023
Source Publication: IEEE Transactions on Neural Networks and Learning Systems
ISSN: 2162-237X
Abstract

Classical domain adaptation methods acquire transferability by regularizing the overall distributional discrepancies between features in the source domain (labeled) and features in the target domain (unlabeled). They often do not differentiate whether the domain differences come from the marginals or from the dependence structures. In many business and financial applications, the labeling function has different sensitivities to changes in the marginals versus changes in the dependence structures. Measuring the overall distributional differences is therefore not discriminative enough for acquiring transferability. Without the needed structural resolution, the learned transfer is suboptimal. This article proposes a new domain adaptation approach in which one can measure the differences in the internal dependence structure separately from those in the marginals. By optimizing the relative weights among them, the new regularization strategy greatly relaxes the rigidity of the existing approaches. It allows a learning machine to pay special attention to the places where the differences matter the most. Experiments on three real-world datasets show that the improvements are notable and robust compared to various benchmark domain adaptation models.
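The separation the abstract describes can be illustrated with a minimal sketch. This is not the authors' implementation: the function names are invented here, and the specific choices of per-feature Kolmogorov-Smirnov distance for the marginal gap and Spearman (rank) correlation matrices for the dependence gap are illustrative assumptions, echoing the copula view that a joint distribution factors into marginals plus a dependence structure.

```python
# Illustrative sketch: measuring the marginal and dependence components
# of a domain shift separately, then combining them with tunable weights.
import numpy as np

def ks_distance(x, y):
    """Two-sample Kolmogorov-Smirnov statistic for 1-D samples."""
    grid = np.sort(np.concatenate([x, y]))
    cdf_x = np.searchsorted(np.sort(x), grid, side="right") / len(x)
    cdf_y = np.searchsorted(np.sort(y), grid, side="right") / len(y)
    return np.max(np.abs(cdf_x - cdf_y))

def rank_corr(X):
    """Spearman correlation matrix: Pearson correlation of the ranks,
    which depends only on the dependence structure, not the marginals."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0).astype(float)
    return np.corrcoef(ranks, rowvar=False)

def shift_components(Xs, Xt):
    """Return (marginal gap, dependence gap) between two domains."""
    marginal = np.mean([ks_distance(Xs[:, j], Xt[:, j])
                        for j in range(Xs.shape[1])])
    dependence = np.linalg.norm(rank_corr(Xs) - rank_corr(Xt))
    return marginal, dependence

def regularizer(Xs, Xt, w_marginal=1.0, w_dependence=1.0):
    """Weighted penalty: tuning the two weights lets a learner focus on
    whichever component of the shift matters most for the task."""
    m, d = shift_components(Xs, Xt)
    return w_marginal * m + w_dependence * d
```

Note that a pure marginal shift (for example, rescaling every feature) leaves the ranks, and hence the dependence gap, unchanged; an overall distributional distance would conflate the two effects, which is the rigidity the paper's regularization strategy relaxes.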

Keywords: Adaptation models; Copula; Covariance matrices; Data science; Domain adaptation; Domain divergence; Neural networks; Regularization; Sun; Transfer learning; Urban areas
DOI: 10.1109/TNNLS.2023.3279099
Indexed By: SCIE
Language: English
WOS Research Area: Computer Science; Engineering
WOS Subject: Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods; Engineering, Electrical & Electronic
WOS ID: WOS:001006310500001
Scopus ID: 2-s2.0-85161564453
Citation Statistics
Cited Times (WOS): 4
Document Type: Journal article
Identifier: http://repository.uic.edu.cn/handle/39GCC9TT/11665
Collection: Faculty of Science and Technology
Corresponding Author: Wu, Qi
Affiliation
1.Guangdong Provincial Key Laboratory of Interdisciplinary Research and Application for Data Science and the Division of Science and Technology, BNU-HKBU United International College, Zhuhai, China
2.CityU-JD Digits Joint Laboratory in Financial Technology and Engineering and the School of Data Science, City University of Hong Kong, Hong Kong, Hong Kong
3.School of Data Science, the CityU-JD Digits Joint Laboratory in Financial Technology and Engineering, and the Institute of Data Science, City University of Hong Kong, Hong Kong, China
4.JD Digits Technology, Beijing, China
First Author Affiliation: Beijing Normal-Hong Kong Baptist University
Recommended Citation
GB/T 7714
Ma, Shumin, Yuan, Zhiri, Wu, Qi, et al. Deep Into the Domain Shift: Transfer Learning Through Dependence Regularization[J]. IEEE Transactions on Neural Networks and Learning Systems, 2023.
APA Ma, Shumin, Yuan, Zhiri, Wu, Qi, Huang, Yiyan, Hu, Xixu, & Huang, Zhixiang. (2023). Deep Into the Domain Shift: Transfer Learning Through Dependence Regularization. IEEE Transactions on Neural Networks and Learning Systems.
MLA Ma, Shumin, et al. "Deep Into the Domain Shift: Transfer Learning Through Dependence Regularization." IEEE Transactions on Neural Networks and Learning Systems (2023).
Files in This Item:
There are no files associated with this item.
Related Services
Google Scholar
Similar articles in Google Scholar
[Ma, Shumin]'s Articles
[Yuan, Zhiri]'s Articles
[Wu, Qi]'s Articles
Baidu academic
Similar articles in Baidu academic
Bing Scholar
Similar articles in Bing Scholar

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.