Computer Engineering and Applications ›› 2025, Vol. 61 ›› Issue (10): 299-307. DOI: 10.3778/j.issn.1002-8331.2405-0065

• Network, Communication and Security •


Intrusion Detection Research Combining Transformer and Bidirectional GRU

LI Daoquan, LIU Xuyin, LIU Jiayu, CHEN Sihui   

  1. School of Information and Control Engineering, Qingdao University of Technology, Qingdao, Shandong 266520, China
  • Online: 2025-05-15  Published: 2025-05-15



Abstract: In network intrusion detection systems, earlier approaches are prone to interference from noisy features during feature extraction, distinguish the boundaries of minority-class samples poorly on imbalanced data, and tend to miss information at important time points. These issues compromise training effectiveness and degrade detection performance. To address these challenges, this paper proposes a hybrid Transformer-bidirectional gated recurrent unit (BiGRU) model that combines the pigeon-inspired optimization (PIO) algorithm with the borderline synthetic minority over-sampling technique (Borderline-SMOTE). PIO performs automatic feature selection, improving the model's ability to handle complex datasets and reducing interference from noisy features. Borderline-SMOTE balances the data, with particular emphasis on minority-class samples, improving their representation and quality in the balanced dataset. A deep learning model integrating a Transformer and a BiGRU is then built for intrusion detection, exploiting the Transformer's ability to capture global dependencies and the BiGRU's strength in temporal sequence modeling to better understand the bidirectional context of sequence data. Experimental results on the NSL-KDD dataset show that the model achieves strong detection performance, with an accuracy of 83.64% and an F1 score of 78.41%, surpassing the traditional machine learning and deep learning models used for comparison.
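The PIO-based feature selection described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' code: the binary sigmoid-threshold encoding, the toy fitness function, and all hyperparameters (`n_pigeons`, `R`, iteration counts) are assumptions, though the two-phase map-and-compass/landmark structure follows the standard PIO algorithm.

```python
import numpy as np

def pio_feature_select(fitness, n_features, n_pigeons=12,
                       map_iters=25, landmark_iters=8, seed=0):
    """Binary pigeon-inspired optimization: return a boolean feature mask
    that (approximately) maximizes fitness(mask)."""
    rng = np.random.default_rng(seed)
    to_mask = lambda p: 1.0 / (1.0 + np.exp(-p)) > 0.5  # sigmoid threshold

    pos = rng.uniform(-1.0, 1.0, (n_pigeons, n_features))
    vel = np.zeros_like(pos)
    scores = np.array([fitness(to_mask(p)) for p in pos])
    best = pos[scores.argmax()].copy()
    best_score = scores.max()

    R = 0.3  # map-and-compass factor (assumed value)
    # Phase 1: map-and-compass operator -- drift toward the global best.
    for t in range(1, map_iters + 1):
        vel = vel * np.exp(-R * t) + rng.random(pos.shape) * (best - pos)
        pos = pos + vel
        scores = np.array([fitness(to_mask(p)) for p in pos])
        if scores.max() > best_score:
            best_score = scores.max()
            best = pos[scores.argmax()].copy()

    # Phase 2: landmark operator -- halve the flock each step and steer
    # the survivors toward their fitness-weighted center.
    for _ in range(landmark_iters):
        keep = np.argsort(-scores)[: max(2, len(scores) // 2)]
        pos, scores = pos[keep], scores[keep]
        center = (pos * scores[:, None]).sum(0) / (scores.sum() + 1e-12)
        pos = pos + rng.random(pos.shape) * (center - pos)
        scores = np.array([fitness(to_mask(p)) for p in pos])
        if scores.max() > best_score:
            best_score = scores.max()
            best = pos[scores.argmax()].copy()

    return to_mask(best)

# Toy usage: recover a known-good feature subset. In the paper the fitness
# would instead score a classifier trained on the selected features.
target = np.array([True, False, True, True, False])
mask = pio_feature_select(lambda m: int((m == target).sum()), n_features=5)
```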

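The borderline over-sampling step can likewise be sketched. This follows the published Borderline-SMOTE "danger" criterion (a minority sample whose k-neighborhood is at least half, but not entirely, majority), but the data and parameters below are illustrative, not from the paper:

```python
import numpy as np

def borderline_smote(X_min, X_maj, n_new, k=5, seed=0):
    """Oversample only the borderline ('danger') minority samples."""
    rng = np.random.default_rng(seed)
    X_all = np.vstack([X_min, X_maj])
    is_min = np.r_[np.ones(len(X_min), bool), np.zeros(len(X_maj), bool)]

    # Identify danger samples: at least half, but not all, of the k
    # nearest neighbors belong to the majority class.
    danger = []
    for i, x in enumerate(X_min):
        d = np.linalg.norm(X_all - x, axis=1)
        nn = np.argsort(d)[1 : k + 1]          # k nearest, self excluded
        n_maj = int((~is_min[nn]).sum())
        if k / 2 <= n_maj < k:                 # borderline but not noise
            danger.append(i)

    # Synthesize new points by interpolating each chosen danger sample
    # toward one of its nearest *minority* neighbors.
    synth = []
    for i in rng.choice(danger, size=n_new):
        x = X_min[i]
        d = np.linalg.norm(X_min - x, axis=1)
        nn = np.argsort(d)[1 : k + 1]
        j = rng.choice(nn)
        synth.append(x + rng.random() * (X_min[j] - x))
    return np.array(synth)

# Toy usage: three minority points embedded in a majority region.
X_min = np.array([[1.0], [1.2], [3.0]])
X_maj = np.array([[1.1], [1.3], [1.4], [1.5], [1.6], [5.0], [6.0], [7.0]])
new_samples = borderline_smote(X_min, X_maj, n_new=4)
```

Because each synthetic point is a convex combination of two minority samples, the new points stay within the minority class's region near the decision boundary.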
Key words: pigeon-inspired optimization algorithm, borderline SMOTE, multi-head attention, bidirectional gated recurrent unit, intrusion detection
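The BiGRU component of the model can be sketched with the standard GRU update equations (z: update gate, r: reset gate), running one pass forward and one over the reversed sequence and concatenating the hidden states. This is a minimal numpy sketch; the dimensions and initialization are illustrative, not the paper's configuration:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def init_gru(d_in, d_hid, rng):
    """Uniformly initialized GRU parameters (illustrative scheme)."""
    s = 1.0 / np.sqrt(d_hid)
    shapes = {"Wz": (d_hid, d_in), "Uz": (d_hid, d_hid), "bz": (d_hid,),
              "Wr": (d_hid, d_in), "Ur": (d_hid, d_hid), "br": (d_hid,),
              "Wh": (d_hid, d_in), "Uh": (d_hid, d_hid), "bh": (d_hid,)}
    return {k: rng.uniform(-s, s, shp) for k, shp in shapes.items()}

def gru_forward(xs, p):
    """Run one GRU direction over xs of shape (T, d_in); return (T, d_hid)."""
    h = np.zeros(p["bz"].shape[0])
    out = []
    for x in xs:
        z = sigmoid(p["Wz"] @ x + p["Uz"] @ h + p["bz"])       # update gate
        r = sigmoid(p["Wr"] @ x + p["Ur"] @ h + p["br"])       # reset gate
        h_cand = np.tanh(p["Wh"] @ x + p["Uh"] @ (r * h) + p["bh"])
        h = (1.0 - z) * h + z * h_cand  # blend previous and candidate state
        out.append(h)
    return np.stack(out)

def bigru_forward(xs, p_fwd, p_bwd):
    """Concatenate forward and time-reversed backward passes: (T, 2*d_hid)."""
    fwd = gru_forward(xs, p_fwd)
    bwd = gru_forward(xs[::-1], p_bwd)[::-1]
    return np.concatenate([fwd, bwd], axis=-1)

# Toy usage on a 7-step sequence of 16-dimensional feature vectors.
rng = np.random.default_rng(0)
p_f, p_b = init_gru(16, 16, rng), init_gru(16, 16, rng)
states = bigru_forward(rng.normal(size=(7, 16)), p_f, p_b)  # shape (7, 32)
```

At each time step the concatenated state sees both past (forward pass) and future (backward pass) context, which is the bidirectional property the abstract relies on; in the full model these states would follow the Transformer's attention layers.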