Computer Engineering and Applications ›› 2025, Vol. 61 ›› Issue (13): 62-77. DOI: 10.3778/j.issn.1002-8331.2410-0453

• Hot Topics and Reviews •

Review of POI Recommendation Algorithms

REN Rui, LI Ying, YANG Yali, SONG Peihua

  1. School of Logistics Management and Engineering, Nanning Normal University, Nanning 530100, China
    2. Guangxi Colleges and Universities Key Laboratory of Intelligent Logistics Technology, Nanning Normal University, Nanning 530100, China
  • Online: 2025-07-01 Published: 2025-06-30

Review of POI Recommendation Algorithms

REN Rui, LI Ying, YANG Yali, SONG Peihua   

  1. School of Logistics Management and Engineering, Nanning Normal University, Nanning 530100, China
    2. Guangxi Colleges and Universities Key Laboratory of Intelligent Logistics Technology, Nanning Normal University, Nanning 530100, China
  • Online:2025-07-01 Published:2025-06-30

Abstract: Point-of-interest (POI) recommendation alleviates users' difficulty of choice and increases the revenue of location service providers and merchants, making it one of the research hotspots in location-based social networks. Existing surveys lack a systematic treatment of countermeasures for data problems, updates on state-of-the-art algorithms, and comparative experiments on algorithm performance. This paper therefore presents a systematic review of the field, summarizing it from three perspectives: data problems, algorithmic techniques, and comparative experiments. From the perspective of POI data problems, it identifies three major issues, namely data sparsity, data dependency, and data privacy, together with their corresponding solutions. From the perspective of the techniques employed, it classifies the important existing research into five categories, matrix factorization, encoders, graph neural networks, attention mechanisms, and generative models, and compares their strengths and weaknesses. From the perspective of algorithm performance, it selects the two most frequently used evaluation metrics, recall and precision, to experimentally evaluate five representative algorithms. Finally, it points out the challenges facing this field and directions for future research.
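Among the five technique families surveyed, matrix factorization is the simplest to illustrate. The following is a hedged, minimal sketch (function name, signature, and hyperparameters are our own illustrative assumptions, not code from any surveyed algorithm): a user–POI check-in matrix is factorized into latent user and POI vectors, and the inner product of two latent vectors predicts an unobserved check-in score.

```python
import random

def factorize(checkins, n_users, n_pois, k=8, lr=0.05, reg=0.01, epochs=200):
    """Basic matrix factorization of a user-POI check-in matrix via SGD.

    checkins: list of (user, poi, score) triples (observed entries only).
    Returns latent factor lists U (n_users x k) and V (n_pois x k) such
    that sum(U[u][f] * V[p][f]) approximates the observed score.
    """
    rng = random.Random(0)  # fixed seed for reproducibility
    U = [[rng.gauss(0, 0.1) for _ in range(k)] for _ in range(n_users)]
    V = [[rng.gauss(0, 0.1) for _ in range(k)] for _ in range(n_pois)]
    for _ in range(epochs):
        for u, p, r in checkins:
            pred = sum(U[u][f] * V[p][f] for f in range(k))
            err = r - pred
            # SGD step on the L2-regularized squared error for this entry
            for f in range(k):
                uf, vf = U[u][f], V[p][f]
                U[u][f] += lr * (err * vf - reg * uf)
                V[p][f] += lr * (err * uf - reg * vf)
    return U, V
```

After training, scores for unvisited POIs are predicted with the same inner product and the top-k scored POIs form the recommendation list; this is the basic mechanism that the matrix-factorization family extends with geographical, social, and temporal signals.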

Key words: point of interest (POI) recommendation, social networks, location recommendation, deep learning, data sparsity

Abstract: POI (point of interest) recommendation mitigates user choice overload and boosts revenue for location-based service providers and businesses, and is a key research focus in location-based social networks. Existing reviews lack comprehensive coverage of countermeasures for data problems, updates on state-of-the-art algorithms, and comparative experiments on algorithm performance. This paper offers a systematic review addressing data problems, algorithmic techniques, and algorithm performance. Firstly, POI data problems are analyzed and grouped into data sparsity, data dependency, and data privacy, each with corresponding solutions. Secondly, the paper categorizes existing research into matrix factorization, encoders, graph neural networks, attention mechanisms, and generative models, comparing their strengths and weaknesses. Thirdly, experiments evaluate five representative algorithms using recall and precision. Finally, the challenges and future research directions in this field are highlighted.
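The two evaluation metrics used in the comparative experiments, recall and precision, are conventionally computed over the top-k items of a ranked recommendation list. A minimal sketch (function name and signature are our own illustrative assumptions, not from the paper):

```python
def recall_precision_at_k(recommended, ground_truth, k):
    """Top-k recall and precision for a single user.

    recommended:  ranked list of candidate POI ids (best first)
    ground_truth: set of POI ids the user actually visited
    k:            cutoff for the recommendation list
    """
    top_k = recommended[:k]
    hits = sum(1 for poi in top_k if poi in ground_truth)
    recall = hits / len(ground_truth) if ground_truth else 0.0
    precision = hits / k
    return recall, precision
```

In evaluation, these per-user values are averaged over all test users; recall measures how many truly visited POIs the list recovers, while precision measures how many recommended POIs were truly visited.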

Key words: point of interest (POI) recommendation, social networks, location recommendation, deep learning, data sparsity