[1] WANG S, PASI G, HU L, et al. The era of intelligent recommendation: editorial on intelligent recommendation with advanced AI and learning[J]. IEEE Intelligent Systems, 2020, 35(5): 3-6.
[2] WANG S, CAO L, WANG Y, et al. A survey on session-based recommender systems[J]. ACM Computing Surveys, 2021, 54(7): 1-38.
[3] 于蒙, 何文涛, 周绪川, 等. 推荐系统综述[J]. 计算机应用, 2022, 42(6): 1898-1913.
YU M, HE W T, ZHOU X C, et al. Review of recommendation system[J]. Journal of Computer Applications, 2022, 42(6): 1898-1913.
[4] 黄震华, 林小龙, 孙圣力, 等. 会话场景下基于特征增强的图神经推荐方法[J]. 计算机学报, 2022, 45(4): 766-780.
HUANG Z H, LIN X L, SUN S L, et al. Feature augmentation based graph neural recommendation method in session scenarios[J]. Chinese Journal of Computers, 2022, 45(4): 766-780.
[5] RAFTERY A E, LEWIS S M. [Practical Markov chain Monte Carlo]: comment: one long run with diagnostics: implementation strategies for Markov chain Monte Carlo[J]. Statistical Science, 1992, 7(4): 493-497.
[6] LIN J C W, SHAO Y, DJENOURI Y, et al. ASRNN: a recurrent neural network with an attention model for sequence labeling[J]. Knowledge-Based Systems, 2021, 212: 106548.
[7] WU J. Introduction to convolutional neural networks[Z]. Nanjing: National Key Lab for Novel Software Technology, Nanjing University, 2017.
[8] ZHANG M, WU S, GAO M. Personalized graph neural networks with attention mechanism for session-aware recommendation[J]. IEEE Transactions on Knowledge and Data Engineering, 2020, 34(8): 3946-3957.
[9] ZHANG X, LIN H, XU B, et al. Dynamic intent-aware iterative denoising network for session-based recommendation[J]. Information Processing & Management, 2022, 59(3): 102936.
[10] PAN Z, CAI F, CHEN W, et al. Star graph neural networks for session-based recommendation[C]//Proceedings of the 29th ACM International Conference on Information & Knowledge Management. New York: ACM, 2020: 1195-1204.
[11] XU C, ZHAO P, LIU Y, et al. Graph contextualized self-attention network for session-based recommendation[C]//Proceedings of the 28th International Joint Conference on Artificial Intelligence. Palo Alto, CA: AAAI Press, 2019: 3940-3946.
[12] CHAUDHARI S, MITHAL V, POLATKAN G, et al. An attentive survey of attention models[J]. ACM Transactions on Intelligent Systems and Technology, 2021, 12(5): 1-32.
[13] ZAHEER M, GURUGANESH G, DUBEY K A, et al. Big bird: Transformers for longer sequences[C]//Advances in Neural Information Processing Systems, 2020: 17283-17297.
[14] LIU L, WANG L, LIAN T. CaSe4SR: using category sequence graph to augment session-based recommendation[J]. Knowledge-Based Systems, 2021, 212: 106558.
[15] FAN X, LIU Z, LIAN J, et al. Lighter and better: low-rank decomposed self-attention networks for next-item recommendation[C]//Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval. New York: ACM, 2021: 1733-1737.
[16] SHANI G, HECKERMAN D, BRAFMAN R I. An MDP-based recommender system[J]. Journal of Machine Learning Research, 2005, 6(9): 1265-1295.
[17] RENDLE S, FREUDENTHALER C, SCHMIDT-THIEME L. Factorizing personalized Markov chains for next-basket recommendation[C]//Proceedings of the 19th International Conference on World Wide Web. New York: ACM, 2010: 811-820.
[18] WANG C, NIEPERT M. State-regularized recurrent neural networks[C]//Proceedings of the 36th International Conference on Machine Learning. PMLR, 2019: 6596-6606.
[19] HUANG L, MA Y, WANG S, et al. An attention-based spatiotemporal LSTM network for next POI recommendation[J]. IEEE Transactions on Services Computing, 2019, 14(6): 1585-1597.
[20] DEVOOGHT R, BERSINI H. Long and short-term recommendations with recurrent neural networks[C]//Proceedings of the 25th Conference on User Modeling, Adaptation and Personalization. New York: ACM, 2017: 13-21.
[21] HIDASI B, QUADRANA M, KARATZOGLOU A, et al. Parallel recurrent neural network architectures for feature-rich session-based recommendations[C]//Proceedings of the 10th ACM Conference on Recommender Systems. New York: ACM, 2016: 241-248.
[22] HUANG J, ZHAO W X, DOU H, et al. Improving sequential recommendation with knowledge-enhanced memory networks[C]//Proceedings of the 41st International ACM SIGIR Conference on Research & Development in Information Retrieval. New York: ACM, 2018: 505-514.
[23] ZHOU J, CUI G, HU S, et al. Graph neural networks: a review of methods and applications[J]. AI Open, 2020, 1: 57-81.
[24] ZHANG Y, GAO H, PEI J, et al. Robust self-supervised structural graph neural network for social network prediction[C]//Proceedings of the ACM Web Conference 2022. New York: ACM, 2022: 1352-1361.
[25] WU S, TANG Y, ZHU Y, et al. Session-based recommendation with graph neural networks[C]//Proceedings of the AAAI Conference on Artificial Intelligence. Palo Alto, CA: AAAI Press, 2019: 346-353.
[26] YE R, ZHANG Q, LUO H. Cross-session aware temporal convolutional network for session-based recommendation[C]//Proceedings of the 2020 International Conference on Data Mining Workshops (ICDMW). Piscataway, NJ: IEEE, 2020: 220-226.
[27] WANG Z, WEI W, CONG G, et al. Global context enhanced graph neural networks for session-based recommendation[C]//Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval. New York: ACM, 2020: 169-178.
[28] 党伟超, 姚志宇, 白尚旺, 等. 基于图模型和注意力模型的会话推荐方法[J]. 计算机应用, 2022, 42(11): 3610-3616.
DANG W C, YAO Z Y, BAI S W, et al. Session recommendation method based on graph model and attention model[J]. Journal of Computer Applications, 2022, 42(11): 3610-3616.
[29] QIU R, YIN H, HUANG Z, et al. GAG: global attributed graph neural network for streaming session-based recommendation[C]//Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval. New York: ACM, 2020: 669-678.
[30] WEI L, ZHAO H, HE Z. Designing the topology of graph neural networks: a novel feature fusion perspective[C]//Proceedings of the ACM Web Conference 2022. New York: ACM, 2022: 1381-1391.
[31] CHORDIA S, PAWAR Y, KULKARNI S, et al. Attention is all you need to tell: transformer-based image captioning[M]//Advances in distributed computing and machine learning. Singapore: Springer, 2022: 607-617.
[32] KATHAROPOULOS A, VYAS A, PAPPAS N, et al. Transformers are RNNs: fast autoregressive transformers with linear attention[C]//Proceedings of the 37th International Conference on Machine Learning. PMLR, 2020: 5156-5165.
[33] WANG D, DENG S G, XU G. GEMRec: a graph-based emotion-aware music recommendation approach[C]//International Conference on Web Information Systems Engineering. Cham: Springer, 2016: 92-106.
[34] ZHU Z, HE Y, ZHAO X, et al. Popularity bias in dynamic recommendation[C]//Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining. New York: ACM, 2021: 2439-2449.
[35] DESHPANDE M, KARYPIS G. Item-based top-N recommendation algorithms[J]. ACM Transactions on Information Systems, 2004, 22(1): 143-177.
[36] TAN Y, XU X, LIU Y. Improved recurrent neural networks for session-based recommendations[C]//Proceedings of the 1st Workshop on Deep Learning for Recommender Systems. New York: ACM, 2016: 17-22.
[37] TANG J, WANG K. Personalized top-N sequential recommendation via convolutional sequence embedding[C]//Proceedings of the 11th ACM International Conference on Web Search and Data Mining. New York: ACM, 2018: 565-573.
[38] BARAKAT A, BIANCHI P. Convergence and dynamical behavior of the ADAM algorithm for nonconvex stochastic optimization[J]. SIAM Journal on Optimization, 2021, 31(1): 244-274.
[39] GHADIMI S, LAN G, ZHANG H. Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization[J]. Mathematical Programming, 2016, 155(1/2): 267-305.