Computer Engineering and Applications ›› 2022, Vol. 58 ›› Issue (15): 133-140. DOI: 10.3778/j.issn.1002-8331.2101-0069

• Pattern Recognition and Artificial Intelligence •

Attention Grading Modeling Based on Schulte Grid and LSTM

WANG Pai, WU Fan, WANG Mei, QIN Xuebin

  1. School of Electrical and Control Engineering, Xi'an University of Science and Technology, Xi'an 710054, China
  • Online: 2022-08-01 Published: 2022-08-01

Attention Grading Modeling Based on Schulte Grid and LSTM

WANG Pai, WU Fan, WANG Mei, QIN Xuebin   

  1. School of Electrical and Control Engineering, Xi’an University of Science and Technology, Xi’an 710054, China
  • Online:2022-08-01 Published:2022-08-01

Abstract: Research on attention grading based on EEG signals faces two pressing technical difficulties. First, EEG data for different attention types are difficult to collect and label; second, conventional EEG feature-extraction algorithms ignore the temporal characteristics of the raw EEG signal. To address these problems, a Schulte grid paradigm based on visual search and reaction-time techniques is designed to collect and automatically label EEG data for different attention types, and a long short-term memory (LSTM) deep learning network is designed to grade attention while preserving the temporal characteristics of the raw EEG signal. Experimental results show that the attention grading model distinguishes high, medium, and low attention levels well. Compared with five existing EEG-based attention grading algorithms, namely the wavelet transform (DWT), approximate entropy, common spatial patterns (CSP), a brain network based on coherence coefficients, and a convolutional neural network (CNN), on the same EEG dataset the proposed model achieves the highest recognition accuracy: 21.49 percentage points higher than the DWT algorithm, 25.82 percentage points higher than the approximate entropy algorithm, 20.53 percentage points higher than the CSP algorithm, 13.32 percentage points higher than the coherence-based brain network algorithm, and 9.05 percentage points higher than the CNN.

Key words: attention grading model, raw EEG signal, long short-term memory (LSTM) deep learning network, attention monitoring

Abstract: In the study of attention grading based on EEG signals, there are two technical difficulties that need to be solved urgently: EEG data for different attention types are difficult to collect and label, and EEG feature-extraction algorithms ignore the temporal characteristics of the raw EEG signal. To address these problems, a Schulte grid paradigm based on visual search and reaction-time techniques is designed to collect and automatically label EEG data for different attention types, and a long short-term memory (LSTM) deep learning network is designed to grade attention while preserving the temporal characteristics of the raw EEG signal. The experimental results show that this attention grading model distinguishes high, medium, and low attention levels well. Compared with five existing attention grading algorithms based on EEG signals, wavelet transform (DWT), approximate entropy, common spatial patterns (CSP), a brain network based on coherence coefficients, and a convolutional neural network (CNN), on the same EEG dataset this attention grading model achieves the highest recognition accuracy: 21.49 percentage points higher than the DWT algorithm, 25.82 percentage points higher than the approximate entropy algorithm, 20.53 percentage points higher than the CSP algorithm, 13.32 percentage points higher than the coherence-based brain network algorithm, and 9.05 percentage points higher than the CNN.

Key words: attention grading model, raw EEG signal, long short-term memory (LSTM), attention monitoring
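The abstract describes feeding the raw multi-channel EEG time series directly into an LSTM so that its temporal structure is preserved up to a three-level (low/medium/high) attention decision. The sketch below is a minimal illustration of that idea only, not the authors' implementation: the framework (PyTorch), the class name AttentionLSTM, and the channel count, window length, hidden size, and layer depth are all assumptions chosen for the example, since the abstract does not give these hyperparameters.

```python
# Minimal sketch (NOT the paper's released code): an LSTM classifier that maps
# raw multi-channel EEG windows to three attention levels (low / medium / high).
# n_channels=8, a 256-sample window, hidden_size=64 and n_layers=2 are
# illustrative assumptions, not values reported in the paper.
import torch
import torch.nn as nn

class AttentionLSTM(nn.Module):
    def __init__(self, n_channels=8, hidden_size=64, n_layers=2, n_classes=3):
        super().__init__()
        # Each time step feeds the raw sample of every EEG channel to the LSTM,
        # so the temporal structure of the signal is kept end to end.
        self.lstm = nn.LSTM(input_size=n_channels, hidden_size=hidden_size,
                            num_layers=n_layers, batch_first=True)
        self.classifier = nn.Linear(hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, time_steps, n_channels) raw EEG window
        _, (h_n, _) = self.lstm(x)       # h_n: (n_layers, batch, hidden_size)
        return self.classifier(h_n[-1])  # logits for low / medium / high attention

# Usage with a dummy batch: 16 windows of 2-second EEG at an assumed 128 Hz, 8 channels.
if __name__ == "__main__":
    model = AttentionLSTM()
    dummy = torch.randn(16, 256, 8)
    print(model(dummy).shape)            # torch.Size([16, 3])
```

In this sketch the class labels would come from the Schulte grid paradigm described in the abstract (reaction-time-based automatic labeling); how those reaction times are thresholded into the three levels is not specified here, so the label pipeline is left out.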