A collection of tools for neural compression enthusiasts

Project description

NeuralCompression


What's New

About

NeuralCompression is a Python repository dedicated to research of neural networks that compress data. The repository includes tools such as JAX-based entropy coders, image compression models, video compression models, and metrics for image and video evaluation.

NeuralCompression is alpha software. The project is under active development. The API will change as we make releases, potentially breaking backwards compatibility.

Installation

NeuralCompression is under active development. You can install the latest release from PyPI or install the repository in development mode.

PyPI Installation

First, install PyTorch according to the directions from the PyTorch website. Then, you should be able to run

pip install neuralcompression

to get the latest version from PyPI.
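
To confirm the install, a quick check like the one below is enough (a minimal sketch; it only assumes the distribution was installed under the name neuralcompression):

import importlib.metadata

import neuralcompression  # fails here if the install is broken

# Report the installed version, e.g. "0.3.0".
print(importlib.metadata.version("neuralcompression"))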

Development Installation

First, clone the repository, navigate to the NeuralCompression root directory, and install the package in development mode by running:

pip install --editable ".[tests]"

If you are not interested in matching the test environment, you can instead run pip install -e .

Repository Structure

We use a 2-tier repository structure. The neuralcompression package contains a core set of tools for doing neural compression research. Code committed to the core package requires stricter linting, high code quality, and rigorous review. The projects folder contains code for reproducing papers and training baselines. Code in this folder is not linted as aggressively; we don't enforce type annotations, and it's okay to omit unit tests.

The 2-tier structure enables rapid iteration and reproduction via code in projects that is built on a backbone of high-quality code in neuralcompression.

neuralcompression

  • neuralcompression - base package
    • data - PyTorch data loaders for various data sets
    • distributions - extensions of probability models for compression
    • functional - methods for image warping, information cost, flop counting, etc.
    • layers - building blocks for compression models
    • metrics - torchmetrics classes for assessing model performance
    • models - complete compression models
    • optim - useful optimization utilities
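
As a quick way to see how these subpackages are organized, the sketch below simply imports each one and lists the public names it exposes (assuming the package is installed; no specific class or function names are assumed):

import neuralcompression.data
import neuralcompression.distributions
import neuralcompression.functional
import neuralcompression.layers
import neuralcompression.metrics
import neuralcompression.models
import neuralcompression.optim

# List the public names in each subpackage described above.
for module in (
    neuralcompression.data,
    neuralcompression.distributions,
    neuralcompression.functional,
    neuralcompression.layers,
    neuralcompression.metrics,
    neuralcompression.models,
    neuralcompression.optim,
):
    public = [name for name in dir(module) if not name.startswith("_")]
    print(module.__name__, public)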

projects

Tutorial Notebooks

This repository also features interactive notebooks detailing different parts of the package, which can be found in the tutorials directory. Existing tutorials are:

  • Walkthrough of the neuralcompression flop counter (view on Colab).
  • Using neuralcompression.metrics and torchmetrics to calculate rate-distortion curves (view on Colab).
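
The second tutorial combines neuralcompression.metrics with torchmetrics to trace rate-distortion curves. As a rough idea of what a single rate-distortion point involves, the hedged sketch below measures distortion with a stock torchmetrics image metric and rate as bits per pixel from a byte count; the tensors and the byte count are placeholders, and the metrics used in the actual tutorial may differ.

import torch
from torchmetrics.image import PeakSignalNoiseRatio

# Placeholder data: an original image, a noisy "reconstruction", and a
# stand-in for the length of a real compressed bitstream.
original = torch.rand(1, 3, 256, 256)
reconstruction = (original + 0.05 * torch.randn_like(original)).clamp(0.0, 1.0)
compressed_num_bytes = 12_000

# Distortion: PSNR between the reconstruction and the original.
psnr = PeakSignalNoiseRatio(data_range=1.0)
distortion = psnr(reconstruction, original)

# Rate: total bits divided by the number of pixels.
num_pixels = original.shape[-2] * original.shape[-1]
bits_per_pixel = compressed_num_bytes * 8 / num_pixels

print(f"rate = {bits_per_pixel:.3f} bpp, distortion = {distortion:.2f} dB PSNR")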

Contributions

Please read our CONTRIBUTING guide and our CODE_OF_CONDUCT prior to submitting a pull request.

We test all pull requests and rely on these tests during review, so please make sure any new code is tested. Tests for neuralcompression go in the tests folder in the root of the repository. Tests for individual projects go in each project's own tests folder.

We use black for formatting, isort for import sorting, flake8 for linting, and mypy for type checking.

License

NeuralCompression is MIT licensed, as found in the LICENSE file.

Model weights released with NeuralCompression are CC-BY-NC 4.0 licensed, as found in the WEIGHTS_LICENSE file.

Some of the code may come from other repositories and include other licenses. Please read all code files carefully for details.

Cite

If you use code for a paper reimplementation, please cite the original paper. If you would also like to cite the repository, you can use:

@misc{muckley2021neuralcompression,
    author={Matthew Muckley and Jordan Juravsky and Daniel Severo and Mannat Singh and Quentin Duval and Karen Ullrich},
    title={NeuralCompression},
    howpublished={\url{https://github.com/facebookresearch/NeuralCompression}},
    year={2021}
}



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

neuralcompression-0.3.0.tar.gz (1.6 MB)


Built Distribution

neuralcompression-0.3.0-py3-none-any.whl (107.8 kB)


File details

Details for the file neuralcompression-0.3.0.tar.gz.

File metadata

  • Download URL: neuralcompression-0.3.0.tar.gz
  • Upload date:
  • Size: 1.6 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.17

File hashes

Hashes for neuralcompression-0.3.0.tar.gz

  • SHA256: 0e2124a49ddf40acab8c51efcba6e3413d2ea6a2140cd3df72e9e32bc4a9a85d
  • MD5: 860558664121b513c51632f1eb8c7db4
  • BLAKE2b-256: 79ad3ed6fb862fbd24ffa4a19398bd12094f26d67d6c04744a084f7e54895d04

See more details on using hashes here.
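
To verify a downloaded archive against the digests above, a standard-library check along these lines works (a sketch; the path assumes the file sits in the current directory):

import hashlib
from pathlib import Path

# Path to the downloaded source distribution (assumed location).
archive = Path("neuralcompression-0.3.0.tar.gz")

# Compare the file's SHA256 digest with the published value above.
digest = hashlib.sha256(archive.read_bytes()).hexdigest()
expected = "0e2124a49ddf40acab8c51efcba6e3413d2ea6a2140cd3df72e9e32bc4a9a85d"
print("OK" if digest == expected else "MISMATCH")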

File details

Details for the file neuralcompression-0.3.0-py3-none-any.whl.

File metadata

File hashes

Hashes for neuralcompression-0.3.0-py3-none-any.whl

  • SHA256: 5b8bc67f5fe5f695ebe93ee100535a621d472ccc97721c23007b49216573b112
  • MD5: 283057cdd089f67687db6ab80ade3d4f
  • BLAKE2b-256: 6702a7d6227db4aff3bd4af1afebf81f352a02d7d43b2f5152653d817afb3e34

See more details on using hashes here.
