Indexed in:
Abstract:
Activation functions play an important role in deep learning, and the choice of activation function has a significant effect on the training and performance of a model. In this study, a new variant of the Exponential Linear Unit (ELU) activation, called the Transformed Exponential Linear Unit (TELU), is proposed. An empirical evaluation is conducted to determine the effectiveness of the new activation function using state-of-the-art deep learning architectures. The experiments show that the TELU activation function tends to outperform conventional activation functions in deep models across a number of benchmark datasets. TELU achieves superior classification accuracy on the CIFAR-10, SVHN, and Caltech-101 datasets with state-of-the-art deep learning models, and superior AUROC, MCC, and F1-score on the STL-10 dataset. These results demonstrate that TELU can be successfully applied in deep learning for image classification.
Keywords:
Corresponding author information:
Email address:
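
For reference, below is a minimal NumPy sketch of the standard Exponential Linear Unit (ELU), the baseline activation that TELU builds on. The exact transformation defining TELU is not specified in this abstract, so only the well-known ELU form is shown; the function name and example values are illustrative only.

import numpy as np

# Standard ELU: ELU(x) = x for x > 0, alpha * (exp(x) - 1) for x <= 0.
# This is the baseline activation that TELU modifies; the TELU-specific
# transformation is not given in the abstract and is therefore omitted here.
def elu(x, alpha=1.0):
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

# Example: apply ELU to a small batch of pre-activations.
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(elu(x))  # negative inputs saturate toward -alpha, positive inputs pass through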