Indexed by:
Abstract:
Recently, deep learning has achieved great success in processing images, audio, natural language, and other domains. The activation function is one of the key factors in deep learning. In this paper, motivated by the characteristics of biological neurons, an improved Leaky Single-Peaked Triangle Linear Unit (LSPTLU) activation function is presented to address the unbounded right-hand response of the Rectified Linear Unit (ReLU) and Leaky ReLU (LReLU). LSPTLU is more consistent with the essence of biological neurons and matches or exceeds the performance of ReLU and LReLU on various datasets, e.g., MNIST, Fashion-MNIST, SVHN, IMAGENET, CALTECH101, and CIFAR10.
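The abstract does not give the paper's exact LSPTLU formula, but the idea it describes can be sketched: keep LReLU's small leaky slope for negative inputs, let the positive response rise linearly, and then make it fall past a peak so the right-hand side is bounded (unlike ReLU/LReLU, which grow without bound). The function name, the `alpha` slope, and the `peak` location below are illustrative assumptions, not the authors' definition.

```python
import numpy as np

def lsptlu(x, alpha=0.01, peak=1.0):
    """Hypothetical sketch of a leaky, single-peaked triangular activation.

    NOTE: illustrative only -- the exact LSPTLU definition is not given in
    this abstract. `alpha` (leaky slope) and `peak` (triangle apex) are
    assumed parameters.
    """
    x = np.asarray(x, dtype=float)
    neg = alpha * x                      # leaky negative branch, as in LReLU
    rise = x                             # linear rise on [0, peak]
    fall = np.maximum(2 * peak - x, 0)   # mirrored descent past the peak
    return np.where(x < 0, neg, np.where(x <= peak, rise, fall))
```

For instance, with the defaults the response peaks at `lsptlu(1.0) == 1.0` and decays back to zero for larger inputs, whereas `relu(3.0)` would return `3.0`.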
Keyword:
Reprint Author's Address:
Email:
Source :
INTERNATIONAL JOURNAL OF CONTROL AUTOMATION AND SYSTEMS
ISSN: 1598-6446
Year: 2019
Issue: 10
Volume: 17
Page: 2693-2701
Impact Factor: 3.200 (JCR@2022)
ESI Discipline: ENGINEERING;
ESI HC Threshold:136
Cited Count:
WoS CC Cited Count: 5
SCOPUS Cited Count: 6
ESI Highly Cited Papers on the List: 0