Abstract:
Recently, deep learning has achieved great success in processing images, audio, natural language, and so on. The activation function is one of the key factors in deep learning. In this paper, motivated by the characteristics of biological neurons, an improved activation function, the Leaky Single-Peaked Triangle Linear Unit (LSPTLU), is presented to address the unbounded right-hand response of the Rectified Linear Unit (ReLU) and Leaky ReLU (LReLU). LSPTLU is more consistent with the behavior of biological neurons and matches or exceeds the performance of ReLU and LReLU on a range of datasets, e.g., MNIST, Fashion-MNIST, SVHN, ImageNet, Caltech101, and CIFAR-10.
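The abstract itself does not give the LSPTLU formula, so the minimal NumPy sketch below only contrasts the standard, well-known ReLU and LReLU definitions (whose positive response grows without bound) with a hypothetical bounded, single-peaked triangular unit. The `peak` parameter and the piecewise triangular form in `lsptlu_sketch` are illustrative assumptions, not the paper's actual LSPTLU definition.

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x); the response for x > 0 is unbounded.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: small slope alpha for x < 0; still unbounded for x > 0.
    return np.where(x > 0, x, alpha * x)

def lsptlu_sketch(x, alpha=0.01, peak=2.0):
    # Hypothetical single-peaked triangular unit: rises linearly to a
    # peak at x = peak, then falls linearly back to zero, so the
    # right-hand response is bounded by `peak`. alpha keeps a small
    # "leaky" slope for x < 0, as in LReLU.
    # NOTE: this piecewise form and `peak` are assumptions for
    # illustration only; they are not the paper's LSPTLU formula.
    return np.where(
        x < 0, alpha * x,
        np.where(x < peak, x, np.maximum(0.0, 2.0 * peak - x))
    )

x = np.linspace(-3, 6, 10)
print(relu(x))         # grows without bound as x increases
print(leaky_relu(x))   # same unbounded growth for x > 0
print(lsptlu_sketch(x))  # capped at `peak`, then decays to zero
```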