Computer Engineering and Applications ›› 2019, Vol. 55 ›› Issue (11): 257-264. DOI: 10.3778/j.issn.1002-8331.1803-0236

• Engineering and Applications •


Research on Logistics Path Frequent Patterns Based on Parallel Apriori

CAO Jingjing1, REN Xinxin2, XU Xianhao2   

  1. College of Logistics Engineering, Wuhan University of Technology, Wuhan 430063, China
    2.School of Management, Huazhong University of Science and Technology, Wuhan 430074, China
  • Online: 2019-06-01  Published: 2019-05-30


Abstract: Frequent path mining has traditionally been performed with association rule algorithms, but on large data sets these algorithms consume excessive memory and process data slowly. This paper proposes a parallel Apriori algorithm based on the Fuzzy c-means clustering algorithm. The model first clusters the original data set with Fuzzy c-means, grouping the logistics path data belonging to the same district into clusters with high internal similarity. It then applies the Apriori algorithm to mine the frequent paths within each cluster, thereby obtaining the frequent logistics paths of each area. Meanwhile, the algorithm is parallelized on the Hadoop platform, which effectively improves its efficiency and quality. By mining frequent logistics paths, managers can better understand the flow of goods, and decisions such as delivery route optimization can be supported.
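The per-cluster mining step described above is standard Apriori frequent-itemset mining. As a minimal, self-contained sketch (not the paper's implementation — the Fuzzy c-means clustering and Hadoop parallelization are omitted, and the route-leg names are illustrative), each logistics record can be treated as a transaction of path segments, and Apriori counts segment combinations that reach a minimum support:

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return every frequent itemset (frozenset) mapped to its support count."""
    transactions = [frozenset(t) for t in transactions]

    # Level 1: count single path segments and keep the frequent ones.
    counts = {}
    for t in transactions:
        for item in t:
            key = frozenset([item])
            counts[key] = counts.get(key, 0) + 1
    current = {s: c for s, c in counts.items() if c >= min_support}
    frequent = dict(current)

    k = 2
    while current:
        # Candidate generation: union pairs of frequent (k-1)-itemsets
        # and keep only unions of exactly size k.
        candidates = {a | b for a, b in combinations(current, 2) if len(a | b) == k}
        # Support counting and pruning against min_support.
        current = {}
        for c in candidates:
            support = sum(1 for t in transactions if c <= t)
            if support >= min_support:
                current[c] = support
        frequent.update(current)
        k += 1
    return frequent

# Toy route transactions for one cluster (segment names are made up).
routes = [
    {"A->B", "B->C", "C->D"},
    {"A->B", "B->C"},
    {"A->B", "C->D"},
    {"B->C", "C->D"},
]
freq = apriori(routes, min_support=2)
```

In the paper's pipeline this mining step would run once per Fuzzy c-means cluster, with each Hadoop worker handling one cluster's transactions.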

Key words: big data, frequent path, Hadoop, Fuzzy c-means clustering algorithm, Apriori algorithm