
AllenNLP integration for hyperparameter optimization

Project description

AllenNLP subcommand for hyperparameter optimization

0. Install

pip install allennlp_optuna

1. Optimization

1.1 AllenNLP config

The model configuration is written in Jsonnet.

You have to replace the values of hyperparameters with the Jsonnet function std.extVar. Remember to cast external variables to the desired types with std.parseInt or std.parseJson.

local lr = 0.1;  // before
↓↓↓
local lr = std.parseJson(std.extVar('lr'));  // after

For more information, please refer to the AllenNLP Guide.

1.2 Define hyperparameter search spaces

You can define the search space in JSON.

Each hyperparameter config must have a type and its keyword arguments (attributes). You can see which parameters are available for each hyperparameter type in the Optuna API reference.

[
  {
    "type": "int",
    "attributes": {
      "name": "embedding_dim",
      "low": 64,
      "high": 128
    }
  },
  {
    "type": "int",
    "attributes": {
      "name": "max_filter_size",
      "low": 2,
      "high": 5
    }
  },
  {
    "type": "int",
    "attributes": {
      "name": "num_filters",
      "low": 64,
      "high": 256
    }
  },
  {
    "type": "int",
    "attributes": {
      "name": "output_dim",
      "low": 64,
      "high": 256
    }
  },
  {
    "type": "float",
    "attributes": {
      "name": "dropout",
      "low": 0.0,
      "high": 0.5
    }
  },
  {
    "type": "float",
    "attributes": {
      "name": "lr",
      "low": 5e-3,
      "high": 5e-1,
      "log": true
    }
  }
]

The parameters of suggest_#{type} are available for a config with type=#{type} (e.g., when type=float, you can see the available parameters in suggest_float).

Please see the example for details.
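
For instance, the float entry for lr above corresponds, roughly, to the Optuna calls sketched below. This is only an illustration of the mapping, not the plugin's actual implementation:

import optuna


def objective(trial: optuna.Trial) -> float:
    # type="float" with its attributes maps to trial.suggest_float(...).
    lr = trial.suggest_float("lr", low=5e-3, high=5e-1, log=True)
    # type="int" maps to trial.suggest_int(...) in the same way.
    embedding_dim = trial.suggest_int("embedding_dim", low=64, high=128)
    # ... train a model with these values and return the metric being optimized.
    return 0.0  # placeholder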

1.3 [Optional] Specify Optuna configurations

You can choose a pruner/sampler implemented in Optuna. To specify a pruner/sampler, create a JSON config file.

An example optuna.json looks like this:

{
    "pruner": {
        "type": "HyperbandPruner",
        "attributes": {
            "min_resource": 1,
            "reduction_factor": 5
        }
    },
    "sampler": {
        "type": "TPESampler",
        "attributes": {
            "n_startup_trials": 5
        }
    }
}
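
The type/attributes pairs above map onto Optuna's own classes. A minimal sketch of the roughly equivalent Python objects (not the plugin's actual code):

import optuna

# The "pruner" and "sampler" entries correspond to these Optuna objects,
# with "attributes" passed as keyword arguments.
pruner = optuna.pruners.HyperbandPruner(min_resource=1, reduction_factor=5)
sampler = optuna.samplers.TPESampler(n_startup_trials=5)

# allennlp tune then runs its study with them, conceptually like this:
study = optuna.create_study(pruner=pruner, sampler=sampler)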

1.4 Optimize hyperparameters with the allennlp CLI

poetry run allennlp tune \
    config/imdb_optuna.jsonnet \
    config/hparams.json \
    --optuna-param-path config/optuna.json \
    --serialization-dir result \
    --study-name test

2. Get best hyperparameters

poetry run allennlp best-params \
    --study-name test
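
If you would rather inspect the study from Python, the same information is available through the Optuna API. A minimal sketch; the storage URL below is only an assumed example, so replace it with the one your study actually uses (e.g., whatever you passed via --storage):

import optuna

# Load the study from its storage backend; the URL here is an assumed example.
study = optuna.load_study(study_name="test", storage="sqlite:///example.db")

print(study.best_params)  # the hyperparameters reported by `allennlp best-params`
print(study.best_value)   # the best objective value achieved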

3. Retrain a model with optimized hyperparameters

poetry run allennlp retrain \
    config/imdb_optuna.jsonnet \
    --serialization-dir retrain_result \
    --study-name test

4. Hyperparameter optimization at scale!

You can run optimizations in parallel. To run distributed optimization, simply add the --skip-if-exists option to the allennlp tune command.

poetry run allennlp tune \
    config/imdb_optuna.jsonnet \
    config/hparams.json \
    --optuna-param-path config/optuna.json \
    --serialization-dir result \
    --study-name test \
    --skip-if-exists

allennlp-optuna uses SQLite as the default storage for results. You can run distributed optimization across multiple machines by using MySQL or PostgreSQL as the storage backend.

For example, if you want to use MySQL as the storage, the command looks like the following:

poetry run allennlp tune \
    config/imdb_optuna.jsonnet \
    config/hparams.json \
    --optuna-param-path config/optuna.json \
    --serialization-dir result \
    --study-name test \
    --storage mysql://<user_name>:<passwd>@<db_host>/<db_name> \
    --skip-if-exists

You can run the above command on each machine to run multi-node distributed optimization.

If you want to know more about the mechanism of Optuna's distributed optimization, please see the official documentation: https://optuna.readthedocs.io/en/stable/tutorial/004_distributed.html
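
Conceptually, every worker attaches to the same study in the shared storage; --skip-if-exists appears to correspond to Optuna's load_if_exists flag. A minimal sketch of that standard Optuna pattern, with a placeholder storage URL and a toy objective (not allennlp-optuna's actual code):

import optuna


def objective(trial: optuna.Trial) -> float:
    # Toy objective; `allennlp tune` trains and evaluates a model here instead.
    x = trial.suggest_float("x", -10, 10)
    return x ** 2


# load_if_exists=True lets many workers share one study under the same name,
# which is what --skip-if-exists presumably enables.
study = optuna.create_study(
    study_name="test",
    storage="mysql://user:passwd@db_host/db_name",
    load_if_exists=True,
)
study.optimize(objective, n_trials=20)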

