A lightweight library to help with training neural networks in PyTorch.

Project description

Ignite

[Badges: Travis CI build status, Codecov coverage, PyPI downloads (pepy.tech), documentation version]

Ignite is a high-level library to help with training neural networks in PyTorch.

  • ignite helps you write compact but full-featured training loops in a few lines of code

  • you get a training loop with metrics, early-stopping, model checkpointing and other features without the boilerplate

Below is a side-by-side comparison of using pure PyTorch and using Ignite to create a training loop that trains and validates your model, with occasional checkpointing:

[Image: assets/ignite_vs_bare_pytorch.png (training loop in bare PyTorch vs. Ignite)]

As you can see, the code is more concise and readable with Ignite. Furthermore, adding metrics or features such as early stopping is a breeze in Ignite, but can rapidly increase the complexity of your code when "rolling your own" training loop.
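
For reference, here is a minimal, self-contained sketch of what the Ignite side of that comparison might look like: a supervised training loop with validation metrics and periodic checkpointing. The tiny linear model and random tensors are placeholders so the snippet runs on its own, and the exact ModelCheckpoint arguments may differ slightly between Ignite versions.

    import torch
    from torch import nn
    from torch.utils.data import DataLoader, TensorDataset

    from ignite.engine import Events, create_supervised_trainer, create_supervised_evaluator
    from ignite.metrics import Accuracy, Loss
    from ignite.handlers import ModelCheckpoint

    # Placeholder model and data; substitute your own network and loaders.
    model = nn.Linear(784, 10)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = nn.CrossEntropyLoss()

    x, y = torch.randn(512, 784), torch.randint(0, 10, (512,))
    train_loader = DataLoader(TensorDataset(x, y), batch_size=64)
    val_loader = DataLoader(TensorDataset(x, y), batch_size=64)

    trainer = create_supervised_trainer(model, optimizer, loss_fn)
    evaluator = create_supervised_evaluator(
        model, metrics={"accuracy": Accuracy(), "loss": Loss(loss_fn)}
    )

    @trainer.on(Events.EPOCH_COMPLETED)
    def run_validation(engine):
        evaluator.run(val_loader)
        metrics = evaluator.state.metrics
        print(f"epoch {engine.state.epoch}: "
              f"accuracy={metrics['accuracy']:.3f}, loss={metrics['loss']:.3f}")

    # Keep the two most recent checkpoints (checkpoint-frequency arguments have
    # changed across Ignite versions, so treat these as an assumption to verify).
    checkpointer = ModelCheckpoint("/tmp/ignite_checkpoints", "demo",
                                   n_saved=2, create_dir=True, require_empty=False)
    trainer.add_event_handler(Events.EPOCH_COMPLETED, checkpointer, {"model": model})

    trainer.run(train_loader, max_epochs=5)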

Installation

From pip:

pip install pytorch-ignite

From conda:

conda install ignite -c pytorch

From source:

pip install git+https://github.com/pytorch/ignite

Nightly releases

From pip:

pip install --pre pytorch-ignite

From conda (note that this installs the PyTorch nightly release as a dependency instead of the stable version):

conda install ignite -c pytorch-nightly
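
Whichever channel you install from, a quick way to check which build ended up in your environment is to print the package version:

    import ignite
    print(ignite.__version__)   # pre-releases show a dev/nightly version string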

Why Ignite?

Ignite's high level of abstraction assumes less about the type of network (or networks) you are training; instead, it requires the user to define the closure that is run in the training and validation loops. This level of abstraction allows for a great deal more flexibility, such as co-training multiple models (e.g. GANs) and computing/tracking multiple losses and metrics in your training loop.

Ignite also allows for multiple handlers to be attached to events, and a finer granularity of events in the engine loop.
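
As a concrete (if toy) illustration of both points, the sketch below drives an Engine with a user-defined process function and attaches several independent handlers, two of them to the same event. The model and data are placeholders so the snippet runs on its own.

    import torch
    from torch import nn
    from torch.utils.data import DataLoader, TensorDataset

    from ignite.engine import Engine, Events

    # Placeholder model and data.
    model = nn.Linear(4, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = nn.CrossEntropyLoss()
    loader = DataLoader(TensorDataset(torch.randn(64, 4), torch.randint(0, 2, (64,))),
                        batch_size=16)

    def train_step(engine, batch):
        """User-defined closure: full control over what one iteration does."""
        model.train()
        optimizer.zero_grad()
        x, y = batch
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
        return loss.item()          # becomes engine.state.output for the handlers

    trainer = Engine(train_step)

    @trainer.on(Events.ITERATION_COMPLETED)
    def log_iteration(engine):
        print(f"iter {engine.state.iteration}: loss={engine.state.output:.4f}")

    @trainer.on(Events.EPOCH_COMPLETED)
    def on_epoch_end_a(engine):
        print(f"epoch {engine.state.epoch} finished")

    @trainer.on(Events.EPOCH_COMPLETED)          # a second handler on the same event
    def on_epoch_end_b(engine):
        print("e.g. run validation, update a scheduler, save a checkpoint, ...")

    trainer.run(loader, max_epochs=2)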

Documentation

API documentation and an overview of the library can be found at https://pytorch.org/ignite/.

Structure

  • ignite: Core of the library. It contains an engine for training and evaluation, all of the classic machine learning metrics, and a variety of handlers to ease the pain of training and validating neural networks.

  • ignite.contrib: The contrib directory contains additional modules contributed by Ignite users. These range from a TBPTT engine and various optimization parameter schedulers to logging handlers and a metrics module with many regression metrics (ignite.contrib.metrics.regression).

The code in ignite.contrib is not as fully maintained as the core part of the library. It may change or be removed at any time without notice.
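
To make the split concrete, here is a small sketch that combines a core handler (EarlyStopping from ignite.handlers) with a contrib one (a cyclical learning-rate scheduler from ignite.contrib.handlers), reusing the trainer, evaluator, and optimizer from the earlier sketch. As noted above, the contrib API is less stable, so treat the scheduler arguments as an assumption to verify against your installed version.

    from ignite.engine import Events
    from ignite.handlers import EarlyStopping                      # core
    from ignite.contrib.handlers import LinearCyclicalScheduler    # contrib

    def score_function(engine):
        # EarlyStopping treats higher scores as better, so negate the validation loss.
        return -engine.state.metrics["loss"]

    early_stopping = EarlyStopping(patience=5, score_function=score_function, trainer=trainer)
    evaluator.add_event_handler(Events.COMPLETED, early_stopping)

    # Cycle the learning rate between 1e-3 and 1e-1 every 1000 iterations.
    lr_scheduler = LinearCyclicalScheduler(optimizer, "lr",
                                           start_value=1e-3, end_value=1e-1, cycle_size=1000)
    trainer.add_event_handler(Events.ITERATION_STARTED, lr_scheduler)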

Examples

We provide several examples ported from pytorch/examples that use Ignite to show how it helps to write compact and full-featured training loops in a few lines of code:

MNIST example

Basic neural network training on the MNIST dataset, with and without the ignite.contrib module.

Distributed CIFAR10 example

Training a small variant of ResNet on CIFAR10 in various configurations: 1) single GPU, 2) single node, multiple GPUs, 3) multiple nodes and multiple GPUs.
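
The actual example scripts live in the repository; the fragment below is only a hedged sketch of how the single-node, multi-GPU configuration typically fits together with Ignite (one process per GPU started by torch.distributed.launch, with the model wrapped in DistributedDataParallel), using placeholder model and data names rather than the example's real code.

    # Launch with: python -m torch.distributed.launch --nproc_per_node=<num_gpus> train.py
    import argparse

    import torch
    import torch.distributed as dist
    from ignite.engine import create_supervised_trainer

    parser = argparse.ArgumentParser()
    parser.add_argument("--local_rank", type=int, default=0)   # supplied by the launcher
    args = parser.parse_args()

    dist.init_process_group(backend="nccl")
    torch.cuda.set_device(args.local_rank)

    # Placeholder model; the real example trains a small ResNet variant on CIFAR10.
    model = torch.nn.Linear(3 * 32 * 32, 10).cuda()
    model = torch.nn.parallel.DistributedDataParallel(model, device_ids=[args.local_rank])
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = torch.nn.CrossEntropyLoss()

    # From here on, the Ignite part looks the same as in the single-GPU case;
    # train_loader would be a DataLoader built on a DistributedSampler.
    trainer = create_supervised_trainer(model, optimizer, loss_fn, device="cuda")
    # trainer.run(train_loader, max_epochs=24)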

Other examples

Notebooks

Contributing

We appreciate all contributions. If you are planning to contribute back bug-fixes, please do so without any further discussion. If you plan to contribute new features, utility functions or extensions, please first open an issue and discuss the feature with us.

Please see the contribution guidelines for more information.

As always, PRs are welcome :)


Download files

Download the file for your platform.

Source Distribution

pytorch-ignite-0.3.0.dev20191029.tar.gz (56.0 kB, Source)

Built Distribution

pytorch_ignite-0.3.0.dev20191029-py2.py3-none-any.whl (90.3 kB, Python 2 / Python 3)

File details

Details for the file pytorch-ignite-0.3.0.dev20191029.tar.gz.

File metadata

  • Download URL: pytorch-ignite-0.3.0.dev20191029.tar.gz
  • Upload date:
  • Size: 56.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.6.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.6.9

File hashes

Hashes for pytorch-ignite-0.3.0.dev20191029.tar.gz
  • SHA256: 9684ae7177da2e8ca2e8f83e62e18f67fb8483ab4efb90f3849b6d8f1b7cb645
  • MD5: 77610ad9ee1a025f0d944ed4968cbb8e
  • BLAKE2b-256: 051649c8d5bbf6864e42f00374dd7aee111abde72d518b00b9eadebeaca4e186


File details

Details for the file pytorch_ignite-0.3.0.dev20191029-py2.py3-none-any.whl.

File metadata

  • Download URL: pytorch_ignite-0.3.0.dev20191029-py2.py3-none-any.whl
  • Upload date:
  • Size: 90.3 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.6.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.6.9

File hashes

Hashes for pytorch_ignite-0.3.0.dev20191029-py2.py3-none-any.whl
  • SHA256: 19bfa0952010add581db9b13e114691e404afc6ca1226cc6c132470d92a43cfb
  • MD5: e5c7702eea34455a5de5c902dd1c43a5
  • BLAKE2b-256: f5a8d912ffdadc34bb1c47bb9d18f46aaf2de7fff3179de9c4fc1e383c3303c2

