Computer Engineering and Applications ›› 2018, Vol. 54 ›› Issue (23): 7-13.DOI: 10.3778/j.issn.1002-8331.1810-0180


Deep single-peaked trapezoid neural networks

SHAN Chuanhui   

  1. College of Computer Science, Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China
  • Online: 2018-12-01    Published: 2018-11-30

Abstract: In recent years, deep learning has achieved great success in many research areas, and the activation function is one of the key factors behind this success. In this paper, based on the characteristics of biological neurons, an improved activation function, the Single-Peaked Trapezoid Linear Unit (SPTLU), is proposed to address the unbounded right-hand response of ReLU. SPTLU is more consistent with the essential behavior of biological neurons and achieves performance equal to or exceeding that of ReLU. Experiments show that the proposed activation function performs well on different datasets, e.g., the MNIST, Fashion-MNIST, SVHN, CALTECH101, and CIFAR10 datasets.
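
The abstract describes SPTLU only qualitatively; its exact parameterization is defined in the paper. A minimal sketch of a generic single-peaked trapezoid activation, assuming illustrative breakpoints a < b < c < d (not taken from the paper), which is bounded on the right unlike ReLU:

    import numpy as np

    def sptlu(x, a=0.0, b=1.0, c=2.0, d=3.0):
        """Sketch of a single-peaked trapezoid linear unit.

        Rises linearly on [a, b], stays flat at 1 on [b, c],
        falls linearly on [c, d], and is 0 elsewhere, so the
        right-hand response is bounded (unlike ReLU).
        The breakpoints a < b < c < d are illustrative only,
        not the paper's exact definition.
        """
        x = np.asarray(x, dtype=float)
        rise = (x - a) / (b - a)   # increasing ramp
        fall = (d - x) / (d - c)   # decreasing ramp
        return np.clip(np.minimum(rise, fall), 0.0, 1.0)

    # Example: responses over a range of pre-activations
    print(sptlu(np.linspace(-1.0, 4.0, 11)))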

Key words: Single-Peaked Trapezoid Linear Unit(SPTLU), SPTLU neuron, SPTLU network, comparative experiment
