
A Python toolbox for performing gradient-free optimization

Project description


Nevergrad - A gradient-free optimization platform

nevergrad is a Python 3.6+ library. It can be installed with:

pip install nevergrad

More installation options, including Windows installation, and complete instructions are available in the "Getting started" section of the documentation.

You can join the Nevergrad users' Facebook group here.

Minimizing a function using an optimizer (here NGOpt) is straightforward:

import nevergrad as ng

def square(x):
    # x is a numpy array; the minimum is at x = [0.5, 0.5]
    return sum((x - .5)**2)

# parametrization=2 means the optimizer works on a 2-dimensional array;
# budget is the total number of allowed function evaluations
optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation.value)  # recommended value
>>> [0.49971112 0.5002944]
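
The same optimization can be written with the explicit ask-and-tell interface, which is useful when you control the evaluation loop yourself (for instance when evaluations happen in an external process). A minimal sketch, reusing the square function above:

import nevergrad as ng

def square(x):
    return sum((x - .5)**2)

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
for _ in range(optimizer.budget):
    candidate = optimizer.ask()      # draw a candidate point
    loss = square(*candidate.args)   # evaluate it any way you like
    optimizer.tell(candidate, loss)  # report the loss back

print(optimizer.provide_recommendation().value)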

nevergrad also supports bounded continuous variables, discrete variables, and mixtures of both. To use them, specify the input space:

import nevergrad as ng

def fake_training(learning_rate: float, batch_size: int, architecture: str) -> float:
    # optimal for learning_rate=0.2, batch_size=4, architecture="conv"
    return (learning_rate - 0.2)**2 + (batch_size - 4)**2 + (0 if architecture == "conv" else 10)

# Instrumentation class is used for functions with multiple inputs
# (positional and/or keywords)
parametrization = ng.p.Instrumentation(
    # a log-distributed scalar between 0.001 and 1.0
    learning_rate=ng.p.Log(lower=0.001, upper=1.0),
    # an integer from 1 to 12
    batch_size=ng.p.Scalar(lower=1, upper=12).set_integer_casting(),
    # either "conv" or "fc"
    architecture=ng.p.Choice(["conv", "fc"])
)

optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100)
recommendation = optimizer.minimize(fake_training)

# show the recommended keyword arguments of the function
print(recommendation.kwargs)
>>> {'learning_rate': 0.1998, 'batch_size': 4, 'architecture': 'conv'}

Learn more about parametrization in the documentation!
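
Beyond scalars, log-scalars and choices, the parametrization module also offers arrays and ordered choices. A small sketch of two more parameter types (the keyword names weights and depth are illustrative, not part of the API):

import nevergrad as ng

parametrization = ng.p.Instrumentation(
    # a 2x2 array of continuous values, each bounded in [-1, 1]
    weights=ng.p.Array(shape=(2, 2)).set_bounds(lower=-1.0, upper=1.0),
    # an ordered choice: neighboring values are treated as close
    depth=ng.p.TransitionChoice([2, 4, 8, 16]),
)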

Example of optimization

Convergence of a population of points to the minimum with two-points DE.
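
The optimizer behind this animation can also be selected explicitly instead of going through NGOpt. A minimal sketch using the TwoPointsDE optimizer on the square function from above:

import nevergrad as ng

def square(x):
    return sum((x - .5)**2)

# differential evolution with two-point crossover, chosen explicitly
optimizer = ng.optimizers.TwoPointsDE(parametrization=2, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation.value)

The full list of available optimizers can be inspected through ng.optimizers.registry.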

Documentation

Check out our documentation! It's still a work in progress; don't hesitate to submit issues and/or PRs to update it and make it clearer!

Citing

@misc{nevergrad,
    author = {J. Rapin and O. Teytaud},
    title = {{Nevergrad - A gradient-free optimization platform}},
    year = {2018},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://GitHub.com/FacebookResearch/Nevergrad}},
}

License

nevergrad is released under the MIT license. See LICENSE for additional details. See also our Terms of Use and Privacy Policy.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

nevergrad-0.8.0.tar.gz (345.3 kB)

Uploaded Source

Built Distribution

nevergrad-0.8.0-py3-none-any.whl (446.4 kB)

Uploaded Python 3

File details

Details for the file nevergrad-0.8.0.tar.gz.

File metadata

  • Download URL: nevergrad-0.8.0.tar.gz
  • Upload date:
  • Size: 345.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.7.17

File hashes

Hashes for nevergrad-0.8.0.tar.gz
Algorithm    Hash digest
SHA256       8bcd69a5a1a5f29831172f52edd8e49e38ff3837e0a4fa3dccc925b64c1f50b7
MD5          214bdc36648ed5e20fae2711c79901df
BLAKE2b-256  8a10750b244276409077b47bc9a733ad6190a52b27a1119a21c6a08935418998

See more details on using hashes here.
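
To check a downloaded archive against the hashes above before installing it, a minimal sketch using Python's standard hashlib (the file path is an assumption; point it at wherever you saved the archive):

import hashlib

path = "nevergrad-0.8.0.tar.gz"  # assumed download location
expected = "8bcd69a5a1a5f29831172f52edd8e49e38ff3837e0a4fa3dccc925b64c1f50b7"

with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "hash mismatch!")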


File details

Details for the file nevergrad-0.8.0-py3-none-any.whl.

File metadata

  • Download URL: nevergrad-0.8.0-py3-none-any.whl
  • Upload date:
  • Size: 446.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.7.17

File hashes

Hashes for nevergrad-0.8.0-py3-none-any.whl
Algorithm    Hash digest
SHA256       6bfbbd1616599c739a04a356219dc4a1ee1393a1a6715278956724d228b69aee
MD5          4a5b2b824ff8ef2347105b3d8184dc75
BLAKE2b-256  7d92ba9c378530ea5890fd83a270235874b276844315e090cf387ad5d0dbf330

See more details on using hashes here.

