Status | Forthcoming |
Title | Deep Into the Domain Shift: Transfer Learning Through Dependence Regularization |
Creator | |
Date Issued | 2023 |
Source Publication | IEEE Transactions on Neural Networks and Learning Systems |
ISSN | 2162-237X |
Abstract | Classical domain adaptation methods acquire transferability by regularizing the overall distributional discrepancies between features in the source domain (labeled) and features in the target domain (unlabeled). They often do not differentiate whether the domain differences come from the marginals or from the dependence structures. In many business and financial applications, the labeling function has different sensitivities to changes in the marginals versus changes in the dependence structures. Measuring the overall distributional differences is therefore not discriminative enough for acquiring transferability, and without the needed structural resolution the learned transfer is suboptimal. This article proposes a new domain adaptation approach in which the differences in the internal dependence structure are measured separately from those in the marginals. By optimizing the relative weights between them, the new regularization strategy relaxes the rigidity of existing approaches and allows a learning machine to pay special attention to the places where the differences matter most. Experiments on three real-world datasets show that the improvements are notable and robust compared with various benchmark domain adaptation models. |
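A minimal sketch of the idea summarized in the abstract: penalize the marginal discrepancy and the dependence discrepancy between source and target features separately, with tunable relative weights. This is an illustrative stand-in, not the paper's copula-based formulation; the per-dimension MMD, the correlation-matrix distance, and the names `rbf_mmd_1d`, `marginal_discrepancy`, `dependence_discrepancy`, and `transfer_loss` are assumptions made for this sketch.

```python
# Hypothetical sketch (PyTorch): weight marginal vs. dependence discrepancies
# separately instead of regularizing one overall distributional distance.
import torch


def rbf_mmd_1d(x, y, bandwidth=1.0):
    """Biased squared MMD between two 1-D samples with an RBF kernel."""
    x = x.view(-1, 1)
    y = y.view(-1, 1)

    def k(a, b):
        return torch.exp(-(a - b.T) ** 2 / (2 * bandwidth ** 2))

    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()


def marginal_discrepancy(fs, ft):
    """Sum of per-dimension MMDs: sensitive only to shifts in the marginals."""
    return sum(rbf_mmd_1d(fs[:, j], ft[:, j]) for j in range(fs.shape[1]))


def dependence_discrepancy(fs, ft, eps=1e-6):
    """Frobenius distance between correlation matrices: a simple proxy
    for differences in the internal dependence structure."""
    def corr(f):
        f = (f - f.mean(0)) / (f.std(0) + eps)
        return (f.T @ f) / (f.shape[0] - 1)

    return torch.norm(corr(fs) - corr(ft), p="fro") ** 2


def transfer_loss(task_loss, fs, ft, lam_marginal=1.0, lam_dependence=1.0):
    """Labeled-source task loss plus separately weighted marginal and
    dependence regularizers (the weights are hyperparameters)."""
    return (task_loss
            + lam_marginal * marginal_discrepancy(fs, ft)
            + lam_dependence * dependence_discrepancy(fs, ft))
```

Raising `lam_dependence` relative to `lam_marginal` makes feature learning prioritize matching the dependence structure over the marginals (or vice versa), which is the kind of structural resolution the abstract argues the overall-discrepancy regularizers lack.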
Keyword | Adaptation models; Copula; Covariance matrices; Data science; domain adaptation; domain divergence; Neural networks; regularization; Sun; Transfer learning; Urban areas |
DOI | 10.1109/TNNLS.2023.3279099 |
Indexed By | SCIE |
Language | English |
WOS Research Area | Computer Science ; Engineering |
WOS Subject | Computer Science, Artificial Intelligence ; Computer Science, Hardware & Architecture ; Computer Science, Theory & Methods ; Engineering, Electrical & Electronic |
WOS ID | WOS:001006310500001 |
Scopus ID | 2-s2.0-85161564453 |
Document Type | Journal article |
Identifier | http://repository.uic.edu.cn/handle/39GCC9TT/11665 |
Collection | Faculty of Science and Technology |
Corresponding Author | Wu, Qi |
Affiliation | 1. Guangdong Provincial Key Laboratory of Interdisciplinary Research and Application for Data Science and the Division of Science and Technology, BNU-HKBU United International College, Zhuhai, China 2. CityU-JD Digits Joint Laboratory in Financial Technology and Engineering and the School of Data Science, City University of Hong Kong, Hong Kong, Hong Kong 3. School of Data Science, the CityU-JD Digits Joint Laboratory in Financial Technology and Engineering, and the Institute of Data Science, City University of Hong Kong, Hong Kong, China 4. JD Digits Technology, Beijing, China |
First Author Affiliation | Beijing Normal-Hong Kong Baptist University |
Recommended Citation GB/T 7714 | Ma, Shumin, Yuan, Zhiri, Wu, Qi, et al. Deep Into the Domain Shift: Transfer Learning Through Dependence Regularization[J]. IEEE Transactions on Neural Networks and Learning Systems, 2023. |
APA | Ma, Shumin, Yuan, Zhiri, Wu, Qi, Huang, Yiyan, Hu, Xixu, ... & Huang, Zhixiang. (2023). Deep Into the Domain Shift: Transfer Learning Through Dependence Regularization. IEEE Transactions on Neural Networks and Learning Systems. |
MLA | Ma, Shumin, et al. "Deep Into the Domain Shift: Transfer Learning Through Dependence Regularization." IEEE Transactions on Neural Networks and Learning Systems (2023). |
Files in This Item: | There are no files associated with this item. |
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.