Computer Engineering and Applications ›› 2019, Vol. 55 ›› Issue (16): 36-41.DOI: 10.3778/j.issn.1002-8331.1903-0355


Survey on Parallel LLL Algorithms

LIU Yang, CHEN Jingwei, FENG Yong, WU Wenyuan   

  1.School of Information Science and Engineering, Chongqing Jiaotong University, Chongqing 400074, China
    2.Chongqing Key Lab of Automated Reasoning & Cognition, Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences, Chongqing 400714, China
  • Online:2019-08-15 Published:2019-08-13


Abstract: Since its introduction in 1982, the Lenstra-Lenstra-Lovász (LLL) lattice basis reduction algorithm has been successfully applied in computer algebra, coding theory, cryptanalysis, algorithmic number theory, integer programming, and many other fields. Over more than thirty years, both the theoretical analysis and the practical efficiency of the sequential LLL algorithm have been significantly improved, yet it still cannot meet the demands of large-scale problems, especially in cryptanalysis. Parallel LLL algorithms are therefore of particular importance. This paper surveys the state of the art of parallel LLL algorithms, summarizes and analyzes the open problems and difficulties in the design and analysis of current parallel LLL algorithms, and discusses directions for future research.

Key words: lattice, lattice basis reduction, LLL algorithm, parallel computing
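As a point of reference for the parallel variants discussed in the survey, the sequential LLL algorithm can be sketched as follows. This is a minimal, unoptimized illustration (not code from the paper): it uses exact rational arithmetic via Python's `fractions` module and naively recomputes the Gram-Schmidt data after every basis update, whereas practical implementations rely on floating-point approximations and incremental updates.

```python
from fractions import Fraction

def dot(u, v):
    """Exact inner product of two rational vectors."""
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(B):
    """Gram-Schmidt orthogonalization; returns B* and the mu coefficients."""
    n = len(B)
    Bs = []
    mu = [[Fraction(0)] * n for _ in range(n)]
    for i in range(n):
        v = list(B[i])
        for j in range(i):
            mu[i][j] = dot(B[i], Bs[j]) / dot(Bs[j], Bs[j])
            v = [vi - mu[i][j] * bj for vi, bj in zip(v, Bs[j])]
        Bs.append(v)
    return Bs, mu

def lll(basis, delta=Fraction(3, 4)):
    """Sequential LLL reduction with parameter delta (1/4 < delta < 1)."""
    B = [[Fraction(x) for x in row] for row in basis]
    n = len(B)
    Bs, mu = gram_schmidt(B)
    k = 1
    while k < n:
        # Size-reduce b_k against b_{k-1}, ..., b_0.
        for j in range(k - 1, -1, -1):
            q = round(mu[k][j])
            if q != 0:
                B[k] = [bk - q * bj for bk, bj in zip(B[k], B[j])]
                Bs, mu = gram_schmidt(B)  # naive full recomputation
        # Lovász condition: ||b*_k||^2 >= (delta - mu_{k,k-1}^2) ||b*_{k-1}||^2
        if dot(Bs[k], Bs[k]) >= (delta - mu[k][k - 1] ** 2) * dot(Bs[k - 1], Bs[k - 1]):
            k += 1
        else:
            B[k - 1], B[k] = B[k], B[k - 1]
            Bs, mu = gram_schmidt(B)
            k = max(k - 1, 1)
    return B
```

For example, `lll([[1, 1, 1], [-1, 0, 2], [3, 5, 6]])` returns a basis of the same lattice whose vectors satisfy both the size-reduction condition (|mu_{i,j}| <= 1/2) and the Lovász condition. The data dependence between the size-reduction step at index k and the swap that may move k backwards is precisely what makes parallelizing this loop difficult, which is the central theme of the parallel algorithms surveyed here.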
