Computer Engineering and Applications ›› 2022, Vol. 58 ›› Issue (10): 87-92. DOI: 10.3778/j.issn.1002-8331.2106-0506

• Theory, Research and Development •

SNN Training Algorithm Based on Relationship Between Pulse Frequency and Input Current
(基于脉冲频率与输入电流关系的SNN训练算法)

LAN Haoxin (兰浩鑫), CHEN Yunhua (陈云华)

  1. School of Computers, Guangdong University of Technology, Guangzhou 510006, China
  • Online: 2022-05-15  Published: 2022-05-15

Abstract: Spiking neural networks (SNNs) are driven by asynchronous events and support massively parallel computation, giving them great potential for improving the computational efficiency of synchronous analog neural networks. However, SNNs still cannot be trained directly. Inspired by neuroscience research on the response mechanism of the leaky integrate-and-fire (LIF) neuron, this paper proposes a new rate-coding-based SNN training algorithm. First, the firing rate of the LIF neuron is modeled through simulation experiments to obtain an explicit expression relating the firing rate to the input current, and the derivative of this expression is used as the gradient. This resolves the non-differentiability introduced by discrete spike events during SNN training and makes it possible to train the SNN with the backpropagation (BP) algorithm. Second, because existing rate-coding methods update parameters with a temporal credit assignment mechanism and therefore tend to learn inefficiently, the network parameters are instead updated according to the LIF neuron response mechanism, which improves learning efficiency. Experimental results on the MNIST and CIFAR10 datasets verify the effectiveness of the proposed method: the classification accuracy reaches 99.53% and 89.46%, respectively, the recognition accuracy on CIFAR10 is 4.22 percentage points higher than that of previous work, and the learning efficiency is roughly doubled compared with previous temporal credit assignment approaches.

Key words: spiking neural network (SNN), backpropagation algorithm, LIF neuron, spike rate, neuromorphic brain-like computing
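
The explicit rate-current expression is fitted from simulation data in the paper and is not reproduced on this page. As a point of reference only, the classical closed-form firing rate of a LIF neuron under a constant input current I, with assumed membrane time constant \tau_m, membrane resistance R, threshold V_{th} and refractory period t_{ref}, is

f(I) = \begin{cases} \left[\, t_{ref} + \tau_m \ln \dfrac{RI}{RI - V_{th}} \,\right]^{-1}, & RI > V_{th}, \\[4pt] 0, & \text{otherwise,} \end{cases}
\qquad
\frac{\mathrm{d}f}{\mathrm{d}I} = f(I)^2 \, \frac{\tau_m V_{th}}{I\,(RI - V_{th})}.

The derivative is smooth for supra-threshold currents, which is the property the method exploits: a smooth rate-current curve can supply a gradient in place of the non-differentiable spike events. The paper's own fitted expression may differ from this textbook formula.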
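
A minimal PyTorch-style sketch of how such a rate-current curve can drive backpropagation follows. PyTorch, the constants (TAU_M, R_M, V_TH, T_REF) and the LIFRate helper are illustrative assumptions, not the paper's implementation: the forward pass converts input currents to firing rates with the closed-form formula above, and the backward pass returns df/dI as the surrogate gradient.

import torch

# Illustrative constants; the paper fits its own rate-current expression,
# so these values are assumptions made only for this sketch.
TAU_M = 10e-3    # membrane time constant, s
R_M   = 1.0      # membrane resistance
V_TH  = 0.5      # firing threshold
T_REF = 2e-3     # absolute refractory period, s


class LIFRate(torch.autograd.Function):
    """Forward: input current -> firing rate.  Backward: df/dI as the gradient."""

    @staticmethod
    def forward(ctx, current):
        drive = R_M * current - V_TH          # supra-threshold drive
        active = drive > 0
        rate = torch.zeros_like(current)
        rate[active] = 1.0 / (T_REF + TAU_M * torch.log(R_M * current[active] / drive[active]))
        ctx.save_for_backward(current, rate, active)
        return rate

    @staticmethod
    def backward(ctx, grad_output):
        current, rate, active = ctx.saved_tensors
        grad = torch.zeros_like(current)
        drive = R_M * current[active] - V_TH
        # d f / d I = f(I)^2 * tau_m * V_th / (I * (R*I - V_th)), zero below threshold
        grad[active] = rate[active] ** 2 * TAU_M * V_TH / (current[active] * drive)
        return grad_output * grad


if __name__ == "__main__":
    # Toy rate-coded layer: a linear map produces input currents,
    # LIFRate turns them into firing rates, and BP flows through df/dI.
    x = torch.rand(8, 784)                              # e.g. MNIST-sized inputs
    w = (0.1 * torch.randn(784, 10)).requires_grad_()
    rates = LIFRate.apply(x @ w)
    loss = rates.sum()
    loss.backward()
    print(w.grad.shape)                                 # torch.Size([784, 10])

Replacing the closed-form rate with the expression fitted in the paper only changes the bodies of forward and backward; the surrounding BP machinery stays untouched, which is what makes the rate-based formulation convenient.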