Magnitude and Uncertainty Pruning Criterion for Neural Networks

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

4 Citations (Scopus)

Abstract

Neural networks have achieved dramatic improvements in recent years and now represent the state of the art for many real-world tasks. One drawback, however, is that many of these models are overparameterized, which makes them both computationally and memory intensive. Overparameterization can also lead to undesired overfitting. Inspired by recently proposed magnitude-based pruning schemes and the Wald test from statistics, we introduce a novel magnitude and uncertainty (MU) pruning criterion that helps to lessen these shortcomings. One important advantage of our MU pruning criterion is that it is scale-invariant, a property that purely magnitude-based pruning criteria lack. In addition, we present a 'pseudo bootstrap' scheme that efficiently estimates the uncertainty of the weights from their update information during training. Our experimental evaluation, based on various neural network architectures and datasets, shows that the new criterion yields more compressed models than purely magnitude-based pruning criteria, while at the same time losing less predictive power.
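To make the criterion concrete, the sketch below illustrates one way a Wald-style MU score could be computed: each weight's magnitude is divided by an uncertainty estimate, and the lowest-scoring weights are pruned. The function names, the use of weight snapshots recorded during training as the uncertainty estimate, and the quantile-based threshold are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def mu_scores(weights, weight_history, eps=1e-12):
    """Wald-style MU score: weight magnitude divided by an uncertainty estimate."""
    # Uncertainty is approximated by the standard deviation of each weight
    # across snapshots taken during training (a stand-in for the paper's
    # 'pseudo bootstrap'; the exact estimator is an assumption here).
    uncertainty = np.std(weight_history, axis=0)
    # Dividing magnitude by uncertainty makes the score scale-invariant:
    # rescaling all weights by a constant rescales both terms equally.
    return np.abs(weights) / (uncertainty + eps)

def prune_by_mu(weights, weight_history, sparsity=0.5):
    """Zero out the `sparsity` fraction of weights with the lowest MU scores."""
    scores = mu_scores(weights, weight_history)
    threshold = np.quantile(scores, sparsity)
    mask = scores > threshold
    return weights * mask, mask

# Toy usage: 10 snapshots of a 100-weight layer recorded during training.
rng = np.random.default_rng(0)
history = rng.normal(size=(10, 100))
pruned, mask = prune_by_mu(history[-1], history, sparsity=0.5)
```

Note how the ratio form captures the scale-invariance claimed in the abstract: multiplying a layer's weights by a constant leaves the scores, and hence the pruning decision, unchanged, whereas a pure magnitude criterion would rank rescaled layers differently.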

Original language: English
Title of host publication: 2019 IEEE International Conference on Big Data (Big Data)
Editors: Chaitanya Baru, Jun Huan, Latifur Khan, Xiaohua Tony Hu, Ronay Ak, Yuanyuan Tian, Roger Barga, Carlo Zaniolo, Kisung Lee, Yanfang Fanny Ye
Publisher: IEEE
Publication date: 2019
Pages: 2317-2326
Article number: 9005692
ISBN (Electronic): 9781728108582
DOIs:
Publication status: Published - 2019
Event: 2019 IEEE International Conference on Big Data (Big Data 2019) - Los Angeles, United States
Duration: 9 Dec 2019 – 12 Dec 2019

Conference

Conference: 2019 IEEE International Conference on Big Data (Big Data 2019)
Country/Territory: United States
City: Los Angeles
Period: 09/12/2019 – 12/12/2019
Sponsor: Ankura, Baidu, IEEE, IEEE Computer Society, Very
Series: Proceedings - 2019 IEEE International Conference on Big Data, Big Data 2019

Keywords

  • Neural network compression
  • overparameterization
  • pruning
  • Wald test
