
Forecasting timeseries with PyTorch - dataloaders, normalizers, metrics and models

Project description

Our article on Towards Data Science introduces the package and provides background information.

Pytorch Forecasting aims to ease timeseries forecasting with neural networks for real-world cases and research alike. Specifically, the package provides

  • A timeseries dataset class which abstracts handling variable transformations, missing values, randomized subsampling, multiple history lengths, etc.
  • A base model class which provides basic training of timeseries models along with logging in tensorboard and generic visualizations such as actuals vs. predictions and dependency plots
  • Multiple neural network architectures for timeseries forecasting that have been enhanced for real-world deployment and come with in-built interpretation capabilities
  • Multi-horizon timeseries metrics (a minimal usage sketch follows this list)
  • Ranger optimizer for faster model training
  • Hyperparameter tuning with optuna
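
The metrics can also be used standalone on plain tensors. Below is a minimal sketch with dummy data; MAE and SMAPE are imported from pytorch_forecasting.metrics, and the metric(y_pred, target) call signature is assumed as in recent versions:

import torch
from pytorch_forecasting.metrics import MAE, SMAPE

# dummy point forecasts: 4 series with a 6-step prediction horizon
y_pred = torch.randn(4, 6)
y_true = torch.randn(4, 6)

print("SMAPE:", SMAPE()(y_pred, y_true))  # scale-independent percentage error
print("MAE:", MAE()(y_pred, y_true))      # mean absolute error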

The package is built on pytorch-lightning, so training on CPUs as well as on single and multiple GPUs works out of the box.

Installation

If you are working on Windows, you need to first install PyTorch with

pip install torch -f https://download.pytorch.org/whl/torch_stable.html

Otherwise, you can proceed with

pip install pytorch-forecasting

Alternatively, you can install the package via conda

conda install pytorch-forecasting -c conda-forge

If you do not have PyTorch installed, it is recommended to install it first from the pytorch channel:

conda install pytorch -c pytorch

If your installed PyTorch version is below 1.6, update it:

conda update pytorch -c pytorch

Documentation

Visit https://pytorch-forecasting.readthedocs.io to read the documentation with detailed tutorials.

Available models

The package ships with multiple architectures, among them the Temporal Fusion Transformer used in the example below; see the documentation for the full list.

Usage

import pytorch_lightning as pl
from pytorch_lightning.callbacks import EarlyStopping, LearningRateMonitor

from pytorch_forecasting import TimeSeriesDataSet, TemporalFusionTransformer
from pytorch_forecasting.metrics import QuantileLoss  # training loss used below

# load data: a pandas DataFrame with a time index, target, and group id columns
data = ...

# define dataset
max_encoder_length = 36
max_prediction_length = 6
training_cutoff = "YYYY-MM-DD"  # day for cutoff

training = TimeSeriesDataSet(
    data[lambda x: x.date <= training_cutoff],
    time_idx= ...,
    target= ...,
    group_ids=[ ... ],
    max_encoder_length=max_encoder_length,
    max_prediction_length=max_prediction_length,
    static_categoricals=[ ... ],
    static_reals=[ ... ],
    time_varying_known_categoricals=[ ... ],
    time_varying_known_reals=[ ... ],
    time_varying_unknown_categoricals=[ ... ],
    time_varying_unknown_reals=[ ... ],
)


# validation set: forecast the timesteps immediately after the training cutoff
validation = TimeSeriesDataSet.from_dataset(
    training, data, min_prediction_idx=training.index.time.max() + 1, stop_randomization=True
)
batch_size = 128
train_dataloader = training.to_dataloader(train=True, batch_size=batch_size, num_workers=2)
val_dataloader = validation.to_dataloader(train=False, batch_size=batch_size, num_workers=2)


# stop training when the validation loss no longer improves
early_stop_callback = EarlyStopping(monitor="val_loss", min_delta=1e-4, patience=1, verbose=False, mode="min")
lr_logger = LearningRateMonitor()  # log the learning rate during training
trainer = pl.Trainer(
    max_epochs=100,
    gpus=0,
    gradient_clip_val=0.1,
    limit_train_batches=30,
    callbacks=[lr_logger, early_stop_callback],
)


# initialise the Temporal Fusion Transformer from the dataset definition
tft = TemporalFusionTransformer.from_dataset(
    training,
    learning_rate=0.03,
    hidden_size=32,
    attention_head_size=1,
    dropout=0.1,
    hidden_continuous_size=16,
    output_size=7,  # number of quantiles predicted by QuantileLoss
    loss=QuantileLoss(),
    log_interval=2,
    reduce_on_plateau_patience=4
)
print(f"Number of parameters in network: {tft.size()/1e3:.1f}k")

# find the optimal learning rate (in recent pytorch-lightning versions,
# lr_find lives on trainer.tuner; on older versions use trainer.lr_find)
res = trainer.tuner.lr_find(
    tft, train_dataloader=train_dataloader, val_dataloaders=val_dataloader, early_stop_threshold=1000.0, max_lr=0.3,
)

print(f"suggested learning rate: {res.suggestion()}")
fig = res.plot(show=True, suggest=True)
fig.show()

# fit the network
trainer.fit(
    tft, train_dataloader=train_dataloader, val_dataloaders=val_dataloader,
)
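
After training, forecasts and the built-in interpretation can be obtained from the fitted model. A minimal sketch follows; predict with mode="raw" and return_x=True, interpret_output, and plot_interpretation are part of the model API, though defaults may differ between versions:

# predict on the validation set
predictions = tft.predict(val_dataloader)

# raw output retains the extra information needed for interpretation
raw_predictions, x = tft.predict(val_dataloader, mode="raw", return_x=True)

# aggregate variable importances and attention over the validation set
interpretation = tft.interpret_output(raw_predictions, reduction="sum")
tft.plot_interpretation(interpretation)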



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pytorch_forecasting-0.6.0.tar.gz (73.4 kB)

Uploaded Source

Built Distribution

pytorch_forecasting-0.6.0-py3-none-any.whl (80.1 kB)

Uploaded Python 3

File details

Details for the file pytorch_forecasting-0.6.0.tar.gz.

File metadata

  • Download URL: pytorch_forecasting-0.6.0.tar.gz
  • Upload date:
  • Size: 73.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.4 CPython/3.7.9 Linux/5.4.0-1031-azure

File hashes

Hashes for pytorch_forecasting-0.6.0.tar.gz
SHA256: 4b826a9dbb410fd527cd2d89ad46440a9e3be34dab86d1c574c26653dffef128
MD5: e81095f5e19f0f6ae077afec6d2ef0be
BLAKE2b-256: 20307b10cd3cc4dfe1e7a4e89281616fa66202b1649188f235157986d3e6ebaf

See more details on using hashes here.

File details

Details for the file pytorch_forecasting-0.6.0-py3-none-any.whl.

File metadata

File hashes

Hashes for pytorch_forecasting-0.6.0-py3-none-any.whl
SHA256: 8a8c700728189ca4c7687c0b1618846b861fd75c1833d6778c90a4511fdf52cb
MD5: 2231cc01087605c0b7ea9bbbae61d95d
BLAKE2b-256: ad0f3dfd15485fc5f6ae070d638e1aca85e4d7e4b65d1144996769ffd80cb19b

See more details on using hashes here.
