Abstract:
A pruning algorithm for neural networks based on information entropy is proposed in this paper. In the proposed algorithm, a new e-exponential information entropy of the neural network's connection weights is defined, building on the theory of Shannon's information entropy. Although both information entropies describe uncertainty in approximately the same way, the new e-exponential information entropy overcomes the inherent drawbacks of Shannon's entropy. By introducing the newly defined entropy as a penalty term into the normal objective function, minor weight connections are penalized and major weight connections are encouraged, owing to the unique properties of the entropy function. A simple neural network architecture can therefore be obtained by deleting the connection weights whose values are approximately equal to zero. Simulation results on a typical nonlinear function approximation task show that a simple network architecture can be achieved while the approximation performance is maintained.
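The abstract does not give the exact definition of the e-exponential entropy, the penalty coefficient, or the pruning threshold, so the sketch below is only illustrative rather than the paper's method: it assumes a Pal-and-Pal-style exponential entropy computed over normalized absolute weight magnitudes, a hand-picked penalty weight, and a hard threshold for deleting near-zero connections, all of which are assumptions. It shows, in PyTorch (also an assumption), how such an entropy term could be added to an ordinary regression objective for a nonlinear function approximation task and how small weights could be pruned afterward.

```python
# Hedged sketch: entropy-penalized training followed by pruning of
# near-zero connections. The entropy form, penalty coefficient, and
# threshold are assumptions, not the paper's exact definitions.
import torch
import torch.nn as nn

def weight_entropy_penalty(model, eps=1e-12):
    """Assumed exponential-entropy penalty over normalized absolute weights."""
    weights = torch.cat([p.flatten().abs() for p in model.parameters() if p.dim() > 1])
    probs = weights / (weights.sum() + eps)          # treat |w_i| as probability mass
    return (probs * torch.exp(1.0 - probs)).sum()    # sum p_i * e^(1 - p_i); minimizing it concentrates mass on few weights

def prune_small_weights(model, threshold=1e-3):
    """Delete (zero out) connections whose weights are approximately zero."""
    with torch.no_grad():
        for p in model.parameters():
            if p.dim() > 1:
                p.mul_((p.abs() > threshold).float())

# Toy usage on a 1-D nonlinear function approximation task (assumed setup).
model = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
x = torch.linspace(-3, 3, 200).unsqueeze(1)
y = torch.sin(x)                                     # target nonlinear function
for _ in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y) + 1e-3 * weight_entropy_penalty(model)
    loss.backward()
    opt.step()
prune_small_weights(model)                           # remove near-zero connections
```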