
Author:

Bao, Zhenshan | Zhou, Wanqing | Zhang, Wenbo

Indexed by:

EI; Scopus

Abstract:

Although deep learning has shown great power in solving complex tasks, neural network models are in practice large and redundant, which makes them difficult to deploy on embedded devices with limited memory and computing resources. To compress a neural network into a slimmer and smaller one, this paper proposes a multi-grained network pruning framework. The pruning process is divided into filter-level pruning and weight-level pruning. In filter-level pruning, the importance of each filter is measured by the entropy of its activation tensor. In weight-level pruning, a dynamic recoverable pruning method is adopted to prune weights further. Unlike popular pruning methods, weight-level pruning is applied on top of filter-level pruning to achieve more effective compression. The proposed approach is validated on two representative CNN models, AlexNet and VGG16, pre-trained on ILSVRC12. Experimental results show that the AlexNet and VGG16 models are compressed by 19.75× and 22.53× respectively, which is 2.05× and 5.89× higher than the classical approaches Dynamic Network Surgery and ThiNet. © Springer Nature Singapore Pte Ltd. 2019.
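
The two pruning grains described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the histogram-based entropy estimate, the function names, and the threshold-based recoverable mask are all assumptions made for the example.

```python
import numpy as np

def filter_entropy(activations, n_bins=32):
    """Estimate the entropy of one filter's activation tensor via a
    histogram (an assumed discretization; higher entropy = more informative)."""
    hist, _ = np.histogram(activations, bins=n_bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def select_filters(layer_activations, keep_ratio=0.5):
    """Filter-level pruning: rank filters by activation entropy and keep
    the top fraction. `layer_activations` has shape (batch, n_filters, H, W);
    returns the sorted indices of filters to keep."""
    n_filters = layer_activations.shape[1]
    scores = np.array([filter_entropy(layer_activations[:, f])
                       for f in range(n_filters)])
    n_keep = max(1, int(round(keep_ratio * n_filters)))
    return np.sort(np.argsort(scores)[::-1][:n_keep])

def dynamic_prune_mask(weights, threshold):
    """Weight-level pruning: mask weights whose magnitude is below the
    threshold. The underlying weights are kept in storage, so a later
    gradient update can push a pruned weight back above the threshold
    and recover the connection (the 'dynamic recoverable' idea)."""
    return (np.abs(weights) >= threshold).astype(weights.dtype)
```

In a training loop, `select_filters` would be applied once per layer to remove whole filters, after which `dynamic_prune_mask` would be recomputed every few iterations and multiplied into the weights on the forward pass, so that pruning decisions remain reversible during fine-tuning.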

Keyword:

Convolutional neural networks; Deep learning

Author Community:

  • [ 1 ] [Bao, Zhenshan] Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China
  • [ 2 ] [Zhou, Wanqing] Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China
  • [ 3 ] [Zhang, Wenbo] Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China

Reprint Author's Address:

  • [Zhang, Wenbo] Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China

Source:

ISSN: 1865-0929

Year: 2019

Volume: 1058

Page: 564-576

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 2

ESI Highly Cited Papers on the List: 0

30 Days PV: 2
