Lossless CNN Channel Pruning via Decoupling Remembering and Forgetting


Document pages: 11

Abstract: We propose ResRep, a novel method for lossless channel pruning (a.k.a. filter pruning), which aims to slim down a convolutional neural network (CNN) by reducing the width (number of output channels) of convolutional layers. Inspired by neurobiology research on the independence of remembering and forgetting, we propose to re-parameterize a CNN into remembering parts and forgetting parts, where the former learn to maintain the performance and the latter learn for efficiency. By training the re-parameterized model with regular SGD on the former but a novel update rule with penalty gradients on the latter, we realize structured sparsity, enabling us to equivalently convert the re-parameterized model into the original architecture with narrower layers. Such a methodology distinguishes ResRep from the traditional learning-based pruning paradigm, which applies a penalty directly on the parameters to produce structured sparsity and may thereby suppress the parameters essential for remembering. Our method slims down a standard ResNet-50 with 76.15% accuracy on ImageNet to a narrower one with only 45% of the FLOPs and no accuracy drop, which, to the best of our knowledge, is the first lossless pruning result at such a high compression ratio.
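The decoupling described in the abstract can be illustrated with a short sketch. The code below is not the authors' implementation: it assumes PyTorch, and the names (PrunableConv, resrep_step), the identity-initialized 1x1 "compactor" layer, and the penalty strength are illustrative assumptions; ResRep's actual penalty-gradient rule may differ in its details.

    # Minimal sketch of the remembering/forgetting decoupling, assuming PyTorch.
    # Names (PrunableConv, resrep_step) and the penalty value are hypothetical.
    import torch
    import torch.nn as nn

    class PrunableConv(nn.Module):
        """Original conv ("remembering" part) followed by an identity-initialized
        1x1 compactor conv ("forgetting" part) that can later be pruned and
        folded back into the preceding conv."""
        def __init__(self, in_ch, out_ch, k=3):
            super().__init__()
            self.conv = nn.Conv2d(in_ch, out_ch, k, padding=k // 2)
            self.compactor = nn.Conv2d(out_ch, out_ch, kernel_size=1, bias=False)
            with torch.no_grad():
                self.compactor.weight.copy_(torch.eye(out_ch).view(out_ch, out_ch, 1, 1))

        def forward(self, x):
            return self.compactor(self.conv(x))

    def resrep_step(model, loss, lr=0.1, penalty=1e-4):
        """One training step: ordinary SGD gradients everywhere, plus an extra
        channel-wise lasso gradient added only to the compactor (forgetting)
        weights, so structured sparsity emerges there rather than in the
        performance-critical (remembering) weights."""
        model.zero_grad()
        loss.backward()
        with torch.no_grad():
            for m in model.modules():
                if isinstance(m, PrunableConv):
                    w = m.compactor.weight                                  # (out, out, 1, 1)
                    row_norm = w.norm(dim=(1, 2, 3), keepdim=True).clamp_min(1e-12)
                    w.grad.add_(penalty * w / row_norm)                     # d||w_i||_2 / dw_i
            for p in model.parameters():
                p.add_(p.grad, alpha=-lr)                                   # plain SGD update

    # Toy usage: one decoupled update on a single layer.
    layer = PrunableConv(16, 32)
    x = torch.randn(2, 16, 8, 8)
    resrep_step(layer, layer(x).pow(2).mean())

After training, compactor rows whose norm falls below a threshold correspond to prunable output channels, and the remaining compactor can be merged into the preceding conv, so the final network keeps the original architecture with narrower layers, as the abstract describes.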
