Details of Research Outputs

Status: Published
Title: Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications
Creator: Luo, Zhiwen; Amayri, Manar; Fan, Wentao; Bouguila, Nizar
Date Issued: 2023-07-01
Source Publication: Applied Intelligence
ISSN: 0924-669X
Volume: 53
Issue: 14
Pages: 17824-17848
Abstract

Cross-collection topic models extend single-collection topic models, such as Latent Dirichlet Allocation (LDA), to multiple collections. The purpose of cross-collection topic modeling is to learn document-topic representations and to reveal, for each topic, the commonalities shared across collections and the differences among them. However, the restrictive Dirichlet prior and significant privacy risks have hampered these models' performance and utility. In particular, training a cross-collection topic model may leak sensitive information from the training dataset. To address these two issues, we propose a novel model, cross-collection latent Beta-Liouville allocation (ccLBLA), which adopts a more powerful prior, the Beta-Liouville distribution, whose more general covariance structure enhances topic correlation analysis. To provide privacy protection for the ccLBLA model, we leverage the inherent differential-privacy guarantee of the Collapsed Gibbs Sampling (CGS) inference scheme and propose a hybrid privacy-protection algorithm for ccLBLA (HPP-ccLBLA) that prevents the inference of training data from intermediate statistics produced during CGS, without sacrificing utility. More crucially, our work is the first attempt to apply a cross-collection topic model to image classification, investigating the capabilities of cross-collection topic models beyond text analysis. Experimental results on comparative text mining and image classification demonstrate the merits of the proposed approach.
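To make the training pipeline described above concrete, the following is a minimal sketch, not the authors' implementation, of collapsed Gibbs sampling for a cross-collection topic model followed by a noisy release of the learned statistics. The sketch substitutes symmetric Dirichlet priors for the paper's Beta-Liouville prior, uses a simplified cross-collection conditional that averages a shared and a per-collection word distribution, and mirrors the HPP idea only loosely by adding Laplace noise to the released topic-word counts. All names, hyperparameters (K, alpha, beta, epsilon), and the toy corpus are illustrative assumptions, not the paper's settings.

```python
# Minimal sketch (assumptions labeled; not the ccLBLA algorithm itself):
# collapsed Gibbs sampling for a cross-collection topic model with a
# Dirichlet stand-in for the Beta-Liouville prior, plus a Laplace-noise
# release step that loosely illustrates the differential-privacy idea.
import numpy as np

rng = np.random.default_rng(0)

K, V, C = 4, 50, 2        # topics, vocabulary size, collections (illustrative)
alpha, beta = 0.1, 0.01   # illustrative symmetric Dirichlet hyperparameters

# Toy corpus: docs[c] is a list of documents; each document is a word-id array.
docs = [[rng.integers(0, V, size=20) for _ in range(10)] for _ in range(C)]

# Count tables maintained by collapsed Gibbs sampling.
n_dk = [np.zeros((len(docs[c]), K)) for c in range(C)]   # doc-topic counts
n_kw = np.zeros((K, V))                                  # shared topic-word counts
n_ckw = np.zeros((C, K, V))                              # per-collection topic-word counts
z = [[rng.integers(0, K, size=len(doc)) for doc in docs[c]] for c in range(C)]

# Initialize the counts from the random topic assignments.
for c in range(C):
    for d, doc in enumerate(docs[c]):
        for i, w in enumerate(doc):
            k = z[c][d][i]
            n_dk[c][d, k] += 1; n_kw[k, w] += 1; n_ckw[c, k, w] += 1

def gibbs_sweep():
    """One collapsed Gibbs sweep: resample every token's topic assignment."""
    for c in range(C):
        for d, doc in enumerate(docs[c]):
            for i, w in enumerate(doc):
                k = z[c][d][i]
                # Remove the token's current assignment from the counts.
                n_dk[c][d, k] -= 1; n_kw[k, w] -= 1; n_ckw[c, k, w] -= 1
                # Simplified cross-collection conditional: average the shared
                # and collection-specific word distributions (the paper's
                # Beta-Liouville conditional differs).
                shared = (n_kw[:, w] + beta) / (n_kw.sum(axis=1) + V * beta)
                local = (n_ckw[c, :, w] + beta) / (n_ckw[c].sum(axis=1) + V * beta)
                p = (n_dk[c][d] + alpha) * 0.5 * (shared + local)
                k = rng.choice(K, p=p / p.sum())
                z[c][d][i] = k
                n_dk[c][d, k] += 1; n_kw[k, w] += 1; n_ckw[c, k, w] += 1

for _ in range(50):
    gibbs_sweep()

# Illustrative privacy step: perturb the released topic-word statistics with
# Laplace noise (epsilon is a made-up budget; the paper's HPP mechanism is
# more involved than this single perturbation).
epsilon = 1.0
noisy_n_kw = n_kw + rng.laplace(scale=1.0 / epsilon, size=n_kw.shape)
phi = np.clip(noisy_n_kw, 0, None) + beta
phi /= phi.sum(axis=1, keepdims=True)
print("Released topic-word distributions, shape:", phi.shape)
```

The point the sketch illustrates is that CGS keeps only aggregate count tables in memory, so a privacy mechanism can target those released intermediate statistics rather than the per-token sampling loop.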

Keywords: Beta-Liouville prior; Comparative text mining; Cross-collection model; Differential privacy; Image classification; Topic correlation
DOI: 10.1007/s10489-022-04378-3
Indexed By: SCIE
Language: English
WOS Research Area: Computer Science
WOS Subject: Computer Science, Artificial Intelligence
WOS ID: WOS:000913573700003
Scopus ID: 2-s2.0-85146255428
Citation statistics
Cited Times (WOS): 9
Document Type: Journal article
Identifier: http://repository.uic.edu.cn/handle/39GCC9TT/10786
Collection: Beijing Normal-Hong Kong Baptist University
Corresponding Author: Luo, Zhiwen
Affiliation
1.The Concordia Institute for Information Systems Engineering (CIISE), Concordia University, Montréal, H3H 1M8, Canada
2.G-SCOP Lab, Grenoble Institute of Technology, Grenoble, 38031, France
3.Department of Computer Science, Beijing Normal University-Hong Kong Baptist University United International College (UIC), Zhuhai, Guangdong, 519088, China
Recommended Citation
GB/T 7714
Luo, Zhiwen, Amayri, Manar, Fan, Wentao, et al. Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications[J]. Applied Intelligence, 2023, 53(14): 17824-17848.
APA Luo, Zhiwen, Amayri, Manar, Fan, Wentao, & Bouguila, Nizar. (2023). Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications. Applied Intelligence, 53(14), 17824-17848.
MLA Luo, Zhiwen, et al. "Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications." Applied Intelligence 53.14 (2023): 17824-17848.
Files in This Item:
There are no files associated with this item.
Related Services
Usage statistics
Google Scholar
Similar articles in Google Scholar
[Luo, Zhiwen]'s Articles
[Amayri, Manar]'s Articles
[Fan, Wentao]'s Articles
Baidu Academic
Similar articles in Baidu Academic
[Luo, Zhiwen]'s Articles
[Amayri, Manar]'s Articles
[Fan, Wentao]'s Articles
Bing Scholar
Similar articles in Bing Scholar
[Luo, Zhiwen]'s Articles
[Amayri, Manar]'s Articles
[Fan, Wentao]'s Articles

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.