Computer Engineering and Applications, 2019, 55(10): 73-76. DOI: 10.3778/j.issn.1002-8331.1801-0217


Partitioned Extreme Learning Machine for Big Data

ZHAO Jiantang   

  1. College of Mathematics and Information Science, Xianyang Normal University, Xianyang, Shaanxi 712000, China
  Online: 2019-05-15    Published: 2019-05-13

Abstract: Against the background of massive data input, a partitioned Extreme Learning Machine (ELM) algorithm is proposed to improve the learning speed of the ELM and reduce the memory consumption of computers. The massive data are divided into K equal parts, and an ELM is trained on each part to obtain its individual outer weights. The comprehensive outer weight of the partitioned ELM is then determined with the arithmetic average operator. To prevent abnormal data from distorting the ELM output, Ordered Weighted Averaging (OWA) operators are used to fuse the outputs of the individual ELMs, so that the output of the partitioned ELM is more stable. Numerical simulations show that the partitioned ELM achieves higher accuracy and learning speed, and lower maximum memory consumption, than the traditional ELM, which verifies the feasibility and effectiveness of the proposed method.
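To make the procedure concrete, the following is a minimal sketch of the training and fusion steps described in the abstract, written in Python with NumPy. It assumes a shared random hidden layer (input weights and biases) across all K parts and a sigmoid activation; the function names, the activation choice, and the particular OWA weights are illustrative assumptions, not details taken from the paper.

import numpy as np

def train_partitioned_elm(X, T, K, n_hidden, seed=None):
    # Split (X, T) into K roughly equal parts, train one ELM per part, and
    # combine the individual outer weights with the arithmetic average.
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))   # shared random input weights (assumption)
    b = rng.standard_normal(n_hidden)                  # shared random hidden biases (assumption)
    betas = []
    for X_k, T_k in zip(np.array_split(X, K), np.array_split(T, K)):
        H_k = 1.0 / (1.0 + np.exp(-(X_k @ W + b)))     # hidden-layer output (sigmoid)
        betas.append(np.linalg.pinv(H_k) @ T_k)        # outer weight of the k-th single ELM
    beta_avg = np.mean(betas, axis=0)                  # comprehensive (arithmetic-average) outer weight
    return W, b, betas, beta_avg

def owa_fuse(outputs, owa_weights):
    # Fuse the K single-ELM outputs with an OWA operator: for every sample and
    # output dimension, sort the K values in descending order and take the
    # weighted sum with non-negative OWA weights that sum to one.
    stacked = np.stack(outputs)                        # shape (K, n_samples, n_outputs)
    ordered = -np.sort(-stacked, axis=0)               # descending order along the K axis
    w = np.asarray(owa_weights).reshape(-1, 1, 1)
    return np.sum(w * ordered, axis=0)

At prediction time, each single ELM output can be computed as 1/(1+exp(-(X_new @ W + b))) @ beta_k and the K outputs fused with owa_fuse, or the averaged weight beta_avg can be used directly; choosing OWA weights that emphasize the middle of the ordered values (small weights on the largest and smallest outputs) damps the influence of abnormal data in any single part.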

Key words: Extreme Learning Machine (ELM), big data, Ordered Weighted Averaging (OWA) operators
