allennlp-optuna: AllenNLP integration for hyperparameter optimization

allennlp-optuna provides an AllenNLP subcommand for hyperparameter optimization powered by Optuna.

0. Documentation

You can read the documentation on Read the Docs.

1. Install

pip install allennlp_optuna
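If the tune subcommand is not recognized after installation, you may need to register the plugin via AllenNLP's standard plugin-discovery file. This step is an assumption based on how AllenNLP loads third-party subcommands, not something stated above:

# register the plugin so AllenNLP picks up the `tune` subcommand
echo 'allennlp_optuna' >> .allennlp_plugins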

2. Optimization

2.1 AllenNLP config

Model configurations are written in Jsonnet.

Replace each hyperparameter value with the Jsonnet function std.extVar. Remember to cast external variables to the desired types with std.parseInt or std.parseJson.

local lr = 0.1;  // before
↓↓↓
local lr = std.parseJson(std.extVar('lr'));  // after

For more information, please refer to the AllenNLP Guide.
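Putting it together, a config fragment that reads several tuned values might look like the sketch below. The hyperparameter names match the search space defined in the next section; the surrounding model and trainer fields are illustrative placeholders, not a complete AllenNLP config.

// Values are supplied by allennlp-optuna as external variables.
// Use std.parseInt for integers and std.parseJson for floats.
local embedding_dim = std.parseInt(std.extVar('embedding_dim'));
local dropout = std.parseJson(std.extVar('dropout'));
local lr = std.parseJson(std.extVar('lr'));

{
  // ... dataset_reader, train_data_path, etc. ...
  model: {
    // illustrative fields; use the fields your own model defines
    embedder: {
      token_embedders: {
        tokens: { type: 'embedding', embedding_dim: embedding_dim },
      },
    },
    dropout: dropout,
  },
  trainer: {
    optimizer: { type: 'adam', lr: lr },
  },
}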

2.2 Define hyperparameter search spaces

You can define the search space in JSON.

Each hyperparameter config must have a type and attributes (the keyword arguments passed to the corresponding Optuna suggest method). You can see which parameters are available for each hyperparameter type in the Optuna API reference.

[
  {
    "type": "int",
    "attributes": {
      "name": "embedding_dim",
      "low": 64,
      "high": 128
    }
  },
  {
    "type": "int",
    "attributes": {
      "name": "max_filter_size",
      "low": 2,
      "high": 5
    }
  },
  {
    "type": "int",
    "attributes": {
      "name": "num_filters",
      "low": 64,
      "high": 256
    }
  },
  {
    "type": "int",
    "attributes": {
      "name": "output_dim",
      "low": 64,
      "high": 256
    }
  },
  {
    "type": "float",
    "attributes": {
      "name": "dropout",
      "low": 0.0,
      "high": 0.5
    }
  },
  {
    "type": "float",
    "attributes": {
      "name": "lr",
      "low": 5e-3,
      "high": 5e-1,
      "log": true
    }
  }
]

For a config with type=#{type}, the parameters of Optuna's suggest_#{type} method are available (e.g., when type=float, you can see the available parameters in suggest_float).

See the example for more detail.
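Following that convention, a categorical hyperparameter (backed by Optuna's suggest_categorical) could plausibly be declared as follows. This is a sketch: the name and choices are purely illustrative, and the categorical type is inferred from the suggest_#{type} rule above rather than confirmed by the text.

{
  "type": "categorical",
  "attributes": {
    "name": "activation",
    "choices": ["relu", "tanh", "sigmoid"]
  }
}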

2.3 [Optional] Specify Optuna configurations

You can choose a pruner/sampler implemented in Optuna. To specify a pruner/sampler, create a JSON config file.

An example optuna.json looks like this (type names the Optuna pruner/sampler class, and attributes are passed to its constructor):

{
  "pruner": {
    "type": "HyperbandPruner",
    "attributes": {
      "min_resource": 1,
      "reduction_factor": 5
    }
  },
  "sampler": {
    "type": "TPESampler",
    "attributes": {
      "n_startup_trials": 5
    }
  }
}

2.4 Optimize hyperparameters with the allennlp CLI

poetry run allennlp tune \
    config/imdb_optuna.jsonnet \
    config/hparams.json \
    --optuna-param-path config/optuna.json \
    --serialization-dir result \
    --study-name test

3. Get best hyperparameters

poetry run allennlp best-params \
    --study-name test

4. Retrain a model with optimized hyperparameters

poetry run allennlp retrain \
    config/imdb_optuna.jsonnet \
    --serialization-dir retrain_result \
    --study-name test

5. Hyperparameter optimization at scale!

You can run optimization in parallel: simply add the --skip-if-exists option to the allennlp tune command to enable distributed optimization.

poetry run allennlp tune \
    config/imdb_optuna.jsonnet \
    config/hparams.json \
    --optuna-param-path config/optuna.json \
    --serialization-dir result \
    --study-name test \
    --skip-if-exists

allennlp-optuna uses SQLite as the default storage for results. To run distributed optimization across multiple machines, use MySQL or PostgreSQL as the storage backend.

For example, to use MySQL as the storage, the command looks like the following:

poetry run allennlp tune \
    config/imdb_optuna.jsonnet \
    config/hparams.json \
    --optuna-param-path config/optuna.json \
    --serialization-dir result \
    --study-name test \
    --storage mysql://<user_name>:<passwd>@<db_host>/<db_name> \
    --skip-if-exists

Run the above command on each machine to perform multi-node distributed optimization.
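PostgreSQL works the same way; only the --storage URL changes. The line below follows Optuna's standard SQLAlchemy URL scheme, with the placeholders purely illustrative:

--storage postgresql://<user_name>:<passwd>@<db_host>/<db_name>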

To learn more about how Optuna's distributed optimization works, see the official documentation: https://optuna.readthedocs.io/en/stable/tutorial/004_distributed.html


Download files

Download the file for your platform.

Source Distribution

allennlp_optuna-0.1.2.tar.gz (6.0 kB)

Uploaded Source

Built Distribution

allennlp_optuna-0.1.2-py3-none-any.whl (6.8 kB)

Uploaded Python 3

File details

Details for the file allennlp_optuna-0.1.2.tar.gz.

File metadata

  • Download URL: allennlp_optuna-0.1.2.tar.gz
  • Size: 6.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.0.10 CPython/3.8.5 Darwin/19.6.0

File hashes

Hashes for allennlp_optuna-0.1.2.tar.gz:

  • SHA256: 0132adf2053002f911d9b4b424a85809a4cfb3e721b8c4e13ee1f8f7e7bf9759
  • MD5: 7e67ec220f82c2e9a6e54f94922f2e6b
  • BLAKE2b-256: 21c788a85748e9aa04d072c346822ab93eb076e627e5a6814e0a488747c640e5


File details

Details for the file allennlp_optuna-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: allennlp_optuna-0.1.2-py3-none-any.whl
  • Size: 6.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.0.10 CPython/3.8.5 Darwin/19.6.0

File hashes

Hashes for allennlp_optuna-0.1.2-py3-none-any.whl:

  • SHA256: 9ee6b88f64372cb4f22c44b1d706495eb6eb16480716e05932ae269a1d103130
  • MD5: f8d3af7d0628d2b90903f857cad86ddb
  • BLAKE2b-256: ca470678f06c50588d333299fc7dd705e4cdc3d72e54a64586b2d7e1d6350098

