The torch_liberator Module

Torch Liberator - Deploy PyTorch Models

Main Page: https://gitlab.kitware.com/computer-vision/torch_liberator

Github Mirror: https://github.com/Kitware/torch_liberator

Pypi: https://pypi-hypernode.com/project/torch_liberator

Torch Hackathon 2021: Youtube Video and Google Slides

Torch Liberator is a Python module containing a set of tools for reading and writing relevant parts of deep networks.

Typically, when moving a deep network trained with torch, you have to keep track of the entire codebase that defined the model, in addition to the checkpoint file containing the learned weights. Torch Liberator contains tools to extract the relevant source code and bundle it with the weights, serializing everything into a single-file deployment.

Note: as of torch 1.9, torch comes with a torch.package submodule, which contains a method for saving model weights along with model structure. We recommend using torch.package over the single-file deployments provided in this package. Thus, the load_partial_state logic is the main code of interest provided in this module.
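
For reference, here is a minimal sketch of the torch.package workflow mentioned above (the file name and the intern/extern patterns are illustrative assumptions, not part of torch_liberator):

>>> import torchvision
>>> from torch.package import PackageExporter, PackageImporter
>>> model = torchvision.models.resnet18()
>>> # bundle the torchvision code that defines the model; keep torch itself
>>> # (and other heavyweight dependencies) outside the package
>>> with PackageExporter('resnet18.pt') as exporter:
>>>     exporter.extern(['torch.**', 'numpy.**', 'PIL.**'])
>>>     exporter.intern('torchvision.**')
>>>     exporter.save_pickle('model', 'model.pkl', model)
>>> # reload the model without needing the original codebase on sys.path
>>> importer = PackageImporter('resnet18.pt')
>>> reloaded = importer.load_pickle('model', 'model.pkl')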

Installation

pip install torch_liberator

# OR with a specific branch

pip install git+https://gitlab.kitware.com/computer-vision/torch_liberator.git@main

Partial State Loading

New in 0.1.0: torch_liberator now exposes a public load_partial_state function, which does its best to “shove” weights from one model into another. There are several methods for computing associations between layer names in one model and layer names in another, the most general being the “embedding” method, and the slightly more structured “isomorphism” option.

Have you ever had the scenario where you use one model as a sub-model in a bigger network, and then had to load pretrained subnetwork state into that bigger model?

The latest version of torch_liberator.load_partial_state can handle this by solving a maximum-common-subtree-isomorphism problem. This computes the largest possible mapping between the two state dictionaries that share consistent suffixes.

>>> import torchvision
>>> import torch
>>> import torch_liberator
>>> resnet50 = torchvision.models.resnet50()
>>> class CustomModel(torch.nn.Module):
>>>     def __init__(self):
>>>         super().__init__()
>>>         self.module = resnet50
>>>         self.extra = torch.nn.Linear(1, 1)
>>> # Directly load resnet50 state into a model that has it as an embedded subnetwork
>>> model = CustomModel()
>>> model_state_dict = resnet50.state_dict()
>>> # load partial state returns information about what it did
>>> info = torch_liberator.load_partial_state(model, model_state_dict, association='isomorphism', verbose=1)
>>> print(len(info['seen']['full_add']))
>>> print(len(info['self_unset']))
>>> print(len(info['other_unused']))
320
2
0

It can also handle loading common state between two models that share some underlying structure.

>>> import torchvision
>>> import torch
>>> import torch_liberator
>>> resnet50 = torchvision.models.resnet50()
>>> class CustomModel1(torch.nn.Module):
>>>     def __init__(self):
>>>         super().__init__()
>>>         self.module = resnet50
>>>         self.custom_model1_layer = torch.nn.Linear(1, 1)
>>> class CustomModel2(torch.nn.Module):
>>>     def __init__(self):
>>>         super().__init__()
>>>         self.backbone = resnet50
>>>         self.custom_model2_layer = torch.nn.Linear(1, 1)
>>> # Load as much of model1 state into model2 as possible
>>> model1 = CustomModel1()
>>> model2 = CustomModel2()
>>> model2_state_dict = model2.state_dict()
>>> # load partial state returns information about what it did
>>> info = torch_liberator.load_partial_state(model1, model2_state_dict, association='isomorphism', verbose=1)
>>> print(len(info['seen']['full_add']))
>>> print(len(info['seen']['skipped']))
>>> print(len(info['self_unset']))
>>> print(len(info['other_unused']))
320
2
2
2

>>> import torchvision
>>> import torch_liberator
>>> import ubelt as ub
>>> #
>>> faster_rcnn = torchvision.models.detection.faster_rcnn.fasterrcnn_resnet50_fpn()
>>> resnet50 = torchvision.models.resnet50(pretrained=True)
>>> state_dict = resnet50.state_dict()
>>> # Load partial state returns a dictionary that tells you how well it did
>>> info = torch_liberator.load_partial_state(faster_rcnn, state_dict, verbose=0, association='embedding')
>>> print(ub.map_vals(len, info['seen']))
>>> print(ub.map_vals(len, ub.dict_diff(info, ['seen'])))
{'full_add': 265, 'skipped': 55}
{'other_unused': 55, 'self_unset': 30}

>>> # Load partial state returns a dictionary that tells you how well it did
>>> info = torch_liberator.load_partial_state(faster_rcnn, state_dict, verbose=0, association='isomorphism')
>>> print(ub.map_vals(len, info['seen']))
>>> print(ub.map_vals(len, ub.dict_diff(info, ['seen'])))
{'full_add': 265, 'skipped': 55}
{'other_unused': 55, 'self_unset': 30}

Also, if the sizes of the tensors don’t quite fit, they will be mangled, i.e. “shoved in” as best as possible. See the docstring for more detail.
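
As a rough sketch of what that looks like (the num_classes mismatch below is just an illustrative assumption; the exact mangling rules are documented in the docstring):

>>> import torchvision
>>> import torch_liberator
>>> # a 1000-class resnet18 versus a 10-class resnet18: every layer matches
>>> # except the final fc layer, whose tensors have incompatible shapes
>>> src = torchvision.models.resnet18(num_classes=1000)
>>> dst = torchvision.models.resnet18(num_classes=10)
>>> # compatible tensors are copied directly; the mismatched fc tensors are
>>> # "shoved in" as best as possible rather than skipped outright
>>> info = torch_liberator.load_partial_state(dst, src.state_dict(), verbose=0)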

Stand-alone Single-File Model Deployments

The original purpose of torch_liberator was to build standalone torch packages that contained both the model code and the model weights. It still does that, but torch.package (new in torch 1.9) might be a better solution moving forward. See torch.package for details.

Torch Liberator builds on the liberator library to statically extract pytorch code that defines a model’s topology and bundle that with a pretrained weights file. This results in a single-file deployment package and can potentially remove dependencies on the codebase used to train the model.

Torch Liberator can also read these deployment files and create an instance of the model initialized with the correct pretrained weights.

The API is ok, but it does need improvement. However, the current version is in a working state. There aren’t any high level docs, but there are a lot of docstrings and doctests. The example here gives a good overview of the code by extracting the AlexNet model from torchvision.

>>> import torch_liberator
>>> from torch_liberator.deployer import DeployedModel
>>> from torchvision import models

>>> print('--- DEFINE A MODEL ---')
>>> model = models.alexnet(pretrained=False)  # false for test speed
>>> initkw = dict(num_classes=1000)  # not all models nicely supply this
>>> model._initkw = initkw
--- DEFINE A MODEL ---

>>> print('--- DEPLOY THE MODEL ---')
>>> zip_fpath = torch_liberator.deploy(model, 'test-deploy.zip')
--- DEPLOY THE MODEL ---
[DEPLOYER] Deployed zipfpath=/tmp/tmpeqd3y_rx/test-deploy.zip


>>> print('--- LOAD THE DEPLOYED MODEL ---')
>>> loader = DeployedModel(zip_fpath)
>>> model = loader.load_model()
--- LOAD THE DEPLOYED MODEL ---
Loading data onto None from <zopen(<_io.BufferedReader name='/tmp/tmpg1kln3kw/test-deploy/deploy_snapshot.pt'> mode=rb)>
Pretrained weights are a perfect fit

The major weirdness right now is that you either have to explicitly define “initkw” (the keyword arguments used to create an instance of your model) at deploy time, or set it as the _initkw attribute of your model (or, if your keyword arguments all exist as member variables of the class, torch_liberator tries to be smart and infer what initkw should be).
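
As a minimal sketch of the last case (this MyNet class is purely illustrative): because the constructor keyword argument is also stored as a member variable of the same name, torch_liberator can usually infer initkw on its own.

>>> import torch
>>> class MyNet(torch.nn.Module):
>>>     def __init__(self, num_classes=10):
>>>         super().__init__()
>>>         # storing the kwarg as an attribute of the same name lets
>>>         # torch_liberator infer initkw without an explicit _initkw
>>>         self.num_classes = num_classes
>>>         self.fc = torch.nn.Linear(8, num_classes)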

There is also a torch-liberator CLI that can be used to package a weight file, a python model file, and optional json metadata.

python -m torch_liberator \
    --model <path-to-the-liberated-py-file> \
    --weights <path-to-the-torch-pth-weight-file> \
    --info <path-to-train-info-json-file> \
    --dst my_custom_deployfile.zip
