Computer Engineering and Applications ›› 2025, Vol. 61 ›› Issue (6): 244-253.DOI: 10.3778/j.issn.1002-8331.2406-0357

• Pattern Recognition and Artificial Intelligence •

Cross-View Contrastive Model for User Multi-Behavior Recommendation

WU Xia, WANG Shaoqing, ZHANG Yao   

  1. School of Computer Science and Technology, Shandong University of Technology, Zibo, Shandong 255000, China
  • Online: 2025-03-15   Published: 2025-03-14

Abstract: Traditional recommendation models often struggle to fully exploit the diversity and correlations within multi-type behavioral data, resulting in suboptimal recommendation performance. Multi-behavior recommendation currently faces two main challenges: (1) how to decouple behaviors at the item level so as to better separate consistency and difference signals among behaviors; (2) how to better strengthen both behavioral differences and consistent interests. This paper proposes a cross-view contrastive model for user multi-behavior recommendation (CVCM), which decomposes user interests across multiple behaviors and applies contrastive learning both between a user's different behavioral interest views and between different users' behavior-specific interests. Specifically, an interest decomposer is first employed to disentangle behavior-specific interests and behavior-independent interests from multi-behavior interaction data. A cross-view contrastive learning module is then designed to enhance behavioral diversity by contrasting a user's original view with its weighted transformed view. Finally, a multi-user contrastive learning module is used to extract consistent features across different behaviors. Evaluation results on three real-world datasets, Rec-Tmall, Taobao, and Beibei, show that, compared with the best baseline, CVCM improves NDCG@10 by 13.99%, 4.98%, and 17.23%, respectively.
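
The cross-view contrast described above can be illustrated with a generic InfoNCE-style objective. The following sketch is only an assumption inferred from the abstract (contrasting each user's original view with its weighted transformed view, using other users in the batch as negatives); the function name, temperature value, and in-batch negative sampling are illustrative choices and not taken from the paper.

import torch
import torch.nn.functional as F

def cross_view_infonce(z_orig: torch.Tensor, z_trans: torch.Tensor,
                       temperature: float = 0.2) -> torch.Tensor:
    # Illustrative InfoNCE-style loss: each user's original-view embedding and
    # weighted transformed-view embedding form a positive pair; embeddings of
    # the other users in the batch serve as negatives. (Assumed formulation.)
    z_orig = F.normalize(z_orig, dim=-1)    # (batch, dim), unit-normalized
    z_trans = F.normalize(z_trans, dim=-1)  # (batch, dim), unit-normalized
    logits = z_orig @ z_trans.t() / temperature        # pairwise similarities
    labels = torch.arange(z_orig.size(0), device=z_orig.device)  # diagonal = positives
    return F.cross_entropy(logits, labels)

# Usage sketch: given two views of the same batch of users,
# loss = cross_view_infonce(user_view_original, user_view_transformed)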

Key words: multi-behavior recommendation, multi-behavior interest, contrastive learning
