Export utility for unconstrained channel pruned models
UPSCALE: Unconstrained Channel Pruning @ ICML 2023
Alvin Wan, Hanxiang Hao, Kaushik Patnaik, Yueyang Xu, Omer Hadad, David Güera, Zhile Ren, Qi Shan
By removing constraints from existing pruners, we improve ImageNet accuracy for post-training pruned models by 2.1 points on average, benefiting DenseNet (+16.9), EfficientNetV2 (+7.9), and ResNet (+6.2). For these unconstrained pruned models, UPSCALE also improves inference speeds by up to 2x over a baseline export.
Quick Start
Install our package.
pip install apple-upscale
Mask and prune channels, using the default magnitude pruner.
import torch, torchvision
from upscale import MaskingManager, PruningManager
x = torch.rand((1, 3, 224, 224), device='cuda')
model = torchvision.models.get_model('resnet18', weights='DEFAULT').cuda()  # load any pytorch model
MaskingManager(model).importance().mask()
PruningManager(model).compute([x]).prune()
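After the export, the pruned model is still an ordinary PyTorch module, so a quick forward pass is an easy sanity check. A minimal sketch; the expected output shape assumes the ImageNet resnet18 loaded above:

with torch.no_grad():   # no gradients needed for a sanity check
    out = model(x)      # run the pruned model on the same dummy input
print(out.shape)        # expected: torch.Size([1, 1000]) for the ImageNet resnet18 above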
Customize Pruning
We provide a number of pruning heuristics out of the box, including L1 and L2 magnitude, LAMP, FPGM, and HRank.
Pass the desired heuristic into the MaskingManager.importance
method call, and configure the pruning ratio via the amount argument of MaskingManager.mask
. A value of 0.25
means 25% of channels are set to zero.
from upscale.importance import LAMP
MaskingManager(model).importance(LAMP()).mask(amount=0.25)
You can also zero out channels using any method you see fit.
with torch.no_grad():
    model.layer1[0].conv1.weight[:, 24] = 0  # e.g., zero input channel 24 of one conv
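As a slightly fuller illustration, here is a minimal sketch of a hand-rolled criterion that zeroes the lowest-L2-norm output channels of one convolution. The helper name, the ratio, and the choice of layer are ours for illustration, not part of the library:

def zero_smallest_channels(conv, ratio=0.25):
    """Zero the output channels of `conv` with the smallest L2 norms (illustrative only)."""
    with torch.no_grad():
        norms = conv.weight.flatten(1).norm(dim=1)  # one L2 norm per output channel
        k = int(ratio * norms.numel())              # number of channels to zero
        idx = norms.argsort()[:k]                   # indices of the weakest channels
        conv.weight[idx] = 0                        # zero them; the export removes them

zero_smallest_channels(model.layer2[0].conv1, ratio=0.25)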
Then, run our export.
PruningManager(model).compute([x]).prune()
Advanced
You may want direct access to network segments to build a heavily-customized pruning algorithm.
for segment in MaskingManager(model).segments():
    # prune each segment in the network independently
    for layer in segment.layers:
        # layers in the segment
        ...
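For instance, a rough inspection loop, assuming segment.layers yields standard torch.nn.Module objects (an assumption; the README does not spell out the segment API):

for i, segment in enumerate(MaskingManager(model).segments()):
    convs = [layer for layer in segment.layers if isinstance(layer, torch.nn.Conv2d)]  # assumes nn.Module layers
    print(f'segment {i}: {len(convs)} conv layers')  # decide per segment how to mask these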
Development
NOTE: See src/upscale/pruning/README.md for more details on how the core export algorithm code is organized.
Clone and setup.
git clone git@github.com:apple/ml-upscale.git
cd ml-upscale
pip install -e .
Run tests.
py.test src tests --doctest-modules
Paper
Follow the development installation instructions to have the paper code available under paper/.
To run the baseline unconstrained export, pass baseline=True to PruningManager.prune:
PruningManager(model).compute([x]).prune(baseline=True)
To reproduce the paper results, run
python paper/main.py resnet18
Plug in any model from the torchvision.models namespace.
usage: main.py [-h] [--side {input,output} [{input,output} ...]]
               [--method {constrained,unconstrained} [{constrained,unconstrained} ...]]
               [--amount AMOUNT [AMOUNT ...]] [--epochs EPOCHS]
               [--heuristic {l1,l2,lamp,fpgm,hrank}] [--global] [--out OUT]
               [--force] [--latency] [--clean]
               model

positional arguments:
  model                 model to prune

options:
  -h, --help            show this help message and exit
  --side {input,output} [{input,output} ...]
                        prune which "side" -- producers, or consumers
  --method {constrained,unconstrained} [{constrained,unconstrained} ...]
                        how to handle multiple branches
  --amount AMOUNT [AMOUNT ...]
                        amounts to prune by. .6 means 60 percent pruned
  --epochs EPOCHS       number of epochs to train for
  --heuristic {l1,l2,lamp,fpgm,hrank}
                        pruning heuristic
  --global              apply heuristic globally
  --out OUT             directory to write results.csv to
  --force               force latency rerun
  --latency             measure latency locally
  --clean               clean the dataframe
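For example, a sweep over several pruning amounts with the LAMP heuristic and the unconstrained export might look like the following; the flag values here are illustrative, see the help above for the full set of options.

python paper/main.py resnet18 --heuristic lamp --amount 0.1 0.2 0.3 --method unconstrained --latency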
Citation
If you find this useful for your research, please consider citing:
@inproceedings{wan2023upscale,
title={UPSCALE: Unconstrained Channel Pruning},
author={Alvin Wan and Hanxiang Hao and Kaushik Patnaik and Yueyang Xu and Omer Hadad and David Guera and Zhile Ren and Qi Shan},
booktitle={ICML},
year={2023}
}