Blending Pruning Criteria for Convolutional Neural Networks


Abstract

The advancement of convolutional neural networks (CNNs) on various vision applications has attracted a lot of attention. Yet most CNNs are unable to satisfy the strict requirements of real-world deployment. To overcome this, network pruning has recently become a popular and effective method for reducing redundancy in models. However, rankings of filters according to their "importance" under different criteria may be inconsistent: one filter could be important under a certain criterion while unnecessary under another, which indicates that each criterion provides only a partial view of the comprehensive "importance" of a filter. Motivated by this, we propose a novel framework that integrates existing criteria by exploring their diversity. The proposed framework contains two stages: Criteria Clustering and Filters Importance Calibration. First, we condense the criteria via layerwise clustering based on their rank scores. Second, within each cluster, we propose a calibration factor to adjust the significance of each selected criterion, and search for the optimal blending of candidates with an Evolutionary Algorithm. Quantitative results on the CIFAR-100 and ImageNet benchmarks show that our framework outperforms state-of-the-art baselines regarding the compact model's performance after pruning.
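As a rough illustration of the blending idea, the sketch below scores the filters of one convolutional layer under several common criteria (L1-norm, L2-norm, and a geometric-median-style distance; the paper's actual criterion pool may differ), rank-normalizes the scores, and blends them with per-criterion calibration factors. The fixed calibration vector is a stand-in for the weights the paper searches with an Evolutionary Algorithm.

```python
import numpy as np

def filter_criteria(weights):
    """Score each filter of a conv layer under several pruning criteria.

    weights: array of shape (out_channels, in_channels, k, k).
    Returns a (num_criteria, out_channels) matrix of importance scores.
    The three criteria here are illustrative examples only.
    """
    flat = weights.reshape(weights.shape[0], -1)   # one row per filter
    l1 = np.abs(flat).sum(axis=1)                  # L1-norm criterion
    l2 = np.sqrt((flat ** 2).sum(axis=1))          # L2-norm criterion
    # geometric-median style: total distance of each filter to all others
    dists = np.linalg.norm(flat[:, None, :] - flat[None, :, :], axis=2)
    gm = dists.sum(axis=1)
    return np.stack([l1, l2, gm])

def blend_criteria(scores, calibration):
    """Blend per-criterion rank scores with calibration factors.

    scores: (num_criteria, out_channels); calibration: (num_criteria,)
    (here a fixed vector, standing in for an evolutionary search).
    Returns one blended importance value per filter.
    """
    # rank-normalize each criterion so their scales are comparable
    ranks = scores.argsort(axis=1).argsort(axis=1).astype(float)
    ranks /= ranks.shape[1] - 1
    return (calibration[:, None] * ranks).sum(axis=0)

rng = np.random.default_rng(0)
w = rng.standard_normal((8, 3, 3, 3))        # toy conv layer: 8 filters
scores = filter_criteria(w)
blended = blend_criteria(scores, np.array([0.5, 0.3, 0.2]))
keep = blended.argsort()[2:]                 # prune the 2 least important filters
```

Rank-normalizing before blending matters because raw criterion values live on very different scales; blending the raw scores would let one criterion dominate regardless of its calibration factor.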


Related Articles

Pruning Convolutional Neural Networks for Resource Efficient Transfer Learning

We propose a new framework for pruning convolutional kernels in neural networks to enable efficient inference, focusing on transfer learning where large and potentially unwieldy pretrained networks are adapted to specialized tasks. We interleave greedy criteria-based pruning with fine-tuning by backpropagation—a computationally efficient procedure that maintains good generalization in the prune...


Pruning Convolutional Neural Networks for Image Instance Retrieval

In this work, we focus on the problem of image instance retrieval with deep descriptors extracted from pruned Convolutional Neural Networks (CNN). The objective is to heavily prune convolutional edges while maintaining retrieval performance. To this end, we introduce both data-independent and data-dependent heuristics to prune convolutional edges, and evaluate their performance across various c...


Pruning Convolutional Neural Networks for Resource Efficient Inference

We propose a new formulation for pruning convolutional kernels in neural networks to enable efficient inference. We interleave greedy criteria-based pruning with finetuning by backpropagation—a computationally efficient procedure that maintains good generalization in the pruned network. We propose a new criterion based on Taylor expansion that approximates the change in the cost function induce...
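A minimal sketch of a first-order Taylor criterion of this kind: the contribution of a channel is approximated by the absolute value of the summed product of its activations and the gradient of the loss with respect to those activations, averaged over a batch. The array shapes and the hook-based capture are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def taylor_importance(activation, gradient):
    """First-order Taylor criterion for channel pruning.

    The loss change from removing channel c is approximated by
    |sum over spatial positions of activation * dLoss/dactivation|,
    averaged over the batch.

    activation, gradient: arrays of shape (batch, channels, h, w),
    e.g. captured with forward/backward hooks in a framework.
    """
    contrib = (activation * gradient).sum(axis=(2, 3))  # per-sample, per-channel
    return np.abs(contrib).mean(axis=0)                 # average over the batch

# toy check with constant activations and gradients
a = np.ones((4, 8, 5, 5))
g = np.full((4, 8, 5, 5), 0.01)
imp = taylor_importance(a, g)   # one importance value per channel
```

Channels with the smallest values would be pruned first, since removing them is predicted to perturb the loss the least.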


Compact Deep Convolutional Neural Networks With Coarse Pruning

The learning capability of a neural network improves with increasing depth at higher computational costs. Wider layers with dense kernel connectivity patterns further increase this cost and may hinder real-time inference. We propose feature map and kernel level pruning for reducing the computational complexity of a deep convolutional neural network. Pruning feature maps reduces the width of a l...
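The compounding saving from feature-map pruning can be seen with a toy FLOP count: removing a feature map in one layer also removes the kernels in the next layer that read from it, so the reduction applies to both layers. The layer sizes below are made up for illustration.

```python
def conv_flops(in_ch, out_ch, k, h, w):
    """Multiply-accumulate count of one conv layer (stride 1, same padding)."""
    return in_ch * out_ch * k * k * h * w

# toy two-layer network: 32x32 feature maps, 3x3 kernels
before = conv_flops(64, 128, 3, 32, 32) + conv_flops(128, 256, 3, 32, 32)

# pruning half of layer 1's feature maps shrinks layer 1's output channels
# AND layer 2's input channels, so the saving compounds across layers
after = conv_flops(64, 64, 3, 32, 32) + conv_flops(64, 256, 3, 32, 32)
```

This is why coarse, map-level pruning tends to yield larger practical speedups than removing individual weights, which leaves the dense layer shapes intact.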


Coarse Pruning of Convolutional Neural Networks with Random Masks

The learning capability of a neural network improves with increasing depth at higher computational costs. Wider layers with dense kernel connectivity patterns further increase this cost and may hinder real-time inference. We propose feature map and kernel pruning for reducing the computational complexity of a deep convolutional neural network. Due to their coarse nature, these pruning granularities c...



Journal

Journal title: Lecture Notes in Computer Science

Year: 2021

ISSN: 1611-3349, 0302-9743

DOI: https://doi.org/10.1007/978-3-030-86380-7_1