
A Python toolbox for performing gradient-free optimization

Project description


Nevergrad - A gradient-free optimization platform

nevergrad is a Python 3.8+ library. It can be installed with:

pip install nevergrad

More installation options, including Windows installation, and complete instructions are available in the "Getting started" section of the documentation.

You can join the Nevergrad users Facebook group here.

Minimizing a function using an optimizer (here NGOpt) is straightforward:

import nevergrad as ng

def square(x):
    return sum((x - .5)**2)

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation.value)  # recommended value
>>> [0.49971112 0.5002944]
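
If you need finer control over the optimization loop (for instance to evaluate candidates in parallel or to log each evaluation), the same minimization can be written with the ask/tell interface. A minimal sketch, equivalent to the minimize call above:

import nevergrad as ng

def square(x):
    return sum((x - .5)**2)

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
for _ in range(optimizer.budget):
    candidate = optimizer.ask()      # fetch a candidate point to evaluate
    loss = square(*candidate.args)   # evaluate it yourself
    optimizer.tell(candidate, loss)  # report the loss back to the optimizer
recommendation = optimizer.provide_recommendation()
print(recommendation.value)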

nevergrad also supports bounded continuous variables, discrete variables, and mixtures of the two. To do so, one can specify the input space:

import nevergrad as ng

def fake_training(learning_rate: float, batch_size: int, architecture: str) -> float:
    # optimal for learning_rate=0.2, batch_size=4, architecture="conv"
    return (learning_rate - 0.2)**2 + (batch_size - 4)**2 + (0 if architecture == "conv" else 10)

# Instrumentation class is used for functions with multiple inputs
# (positional and/or keywords)
parametrization = ng.p.Instrumentation(
    # a log-distributed scalar between 0.001 and 1.0
    learning_rate=ng.p.Log(lower=0.001, upper=1.0),
    # an integer from 1 to 12
    batch_size=ng.p.Scalar(lower=1, upper=12).set_integer_casting(),
    # either "conv" or "fc"
    architecture=ng.p.Choice(["conv", "fc"])
)

optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100)
recommendation = optimizer.minimize(fake_training)

# show the recommended keyword arguments of the function
print(recommendation.kwargs)
>>> {'learning_rate': 0.1998, 'batch_size': 4, 'architecture': 'conv'}
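
Instrumentation also accepts positional parameters, and arrays can be bounded as well. A minimal sketch (the toy loss, bounds, and parameter names below are illustrative, not taken from the documentation):

import nevergrad as ng

def toy_loss(weights, scale=1.0):
    # hypothetical objective: scaled L1 norm of the weights
    return scale * sum(abs(w) for w in weights)

parametrization = ng.p.Instrumentation(
    # a positional 2D array, bounded in [-1, 1]
    ng.p.Array(shape=(2,)).set_bounds(lower=-1.0, upper=1.0),
    # a log-distributed keyword scalar
    scale=ng.p.Log(lower=0.01, upper=10.0),
)

optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100)
recommendation = optimizer.minimize(toy_loss)
print(recommendation.args, recommendation.kwargs)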

Learn more about parametrization in the documentation!

Example of optimization

Convergence of a population of points to the minimum with two-points DE.
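
The two-points crossover variant of differential evolution used in this animation is available directly from the optimizer registry. A minimal sketch, reusing the square function from above:

import nevergrad as ng

def square(x):
    return sum((x - .5)**2)

# differential evolution with two-points crossover
optimizer = ng.optimizers.TwoPointsDE(parametrization=2, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation.value)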

Documentation

Check out our documentation! It's still a work in progress; don't hesitate to submit issues and/or PRs to update it and make it clearer! The latest versions of our data and of our PDF report are also available.

Citing

@misc{nevergrad,
    author = {J. Rapin and O. Teytaud},
    title = {{Nevergrad - A gradient-free optimization platform}},
    year = {2018},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://GitHub.com/FacebookResearch/Nevergrad}},
}

License

nevergrad is released under the MIT license. See LICENSE for additional details about it. See also our Terms of Use and Privacy Policy.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

nevergrad-1.0.0.tar.gz (384.2 kB)


Built Distribution

nevergrad-1.0.0-py3-none-any.whl (477.1 kB)


File details

Details for the file nevergrad-1.0.0.tar.gz.

File metadata

  • Download URL: nevergrad-1.0.0.tar.gz
  • Size: 384.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.18

File hashes

Hashes for nevergrad-1.0.0.tar.gz:

  • SHA256: 1fb56c045ffa16c01dd40abd40e9031f2a903efdb8384b52a3c93eb8c9364af5
  • MD5: a98eda6b2f1863c1468e4c1c9783a8e2
  • BLAKE2b-256: 115f31f31a94a442579b403a95f13729ef574f2ad94526e24ea43d4185710017

See more details on using hashes here.
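
For instance, a downloaded archive can be checked against the published SHA256 digest with Python's standard hashlib module (the file path below assumes the archive was saved to the current directory):

import hashlib

expected = "1fb56c045ffa16c01dd40abd40e9031f2a903efdb8384b52a3c93eb8c9364af5"
with open("nevergrad-1.0.0.tar.gz", "rb") as f:  # hypothetical local path
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "MISMATCH")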

Provenance

File details

Details for the file nevergrad-1.0.0-py3-none-any.whl.

File metadata

  • Download URL: nevergrad-1.0.0-py3-none-any.whl
  • Size: 477.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.18

File hashes

Hashes for nevergrad-1.0.0-py3-none-any.whl:

  • SHA256: d1a8ba9d26b24ddcb7fcbd0e97b7616a757f3a75441c64f42c51812570b4541c
  • MD5: 0a0c67ca2a202899a837a5c9986b1a06
  • BLAKE2b-256: 5b3d3c26ca844fd73a832fa4d1a2e7ae868d6a6a71a552988115c1006d7f75db

See more details on using hashes here.

Provenance
