A Python toolbox for performing gradient-free optimization

Project description

Nevergrad - A gradient-free optimization platform

nevergrad is a Python 3.6+ library. It can be installed with:

pip install nevergrad

More installation options, including Windows installation, and complete instructions are available in the "Getting started" section of the documentation.

You can join the Nevergrad users' Facebook group.

Minimizing a function using an optimizer (here NGOpt) is straightforward:

import nevergrad as ng

def square(x):
    return sum((x - .5)**2)

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation.value)  # recommended value
>>> [0.49971112 0.5002944]
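
The same minimization can also be driven step by step through the optimizer's ask-and-tell interface, which is handy when evaluations happen outside the optimizer (for example in an external or parallel loop). A minimal sketch, assuming the ask/tell/provide_recommendation methods documented for nevergrad optimizers:

import nevergrad as ng

def square(x):
    return sum((x - .5)**2)

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
for _ in range(optimizer.budget):
    candidate = optimizer.ask()       # point suggested by the optimizer
    loss = square(candidate.value)    # evaluate it any way you like
    optimizer.tell(candidate, loss)   # report the result back

recommendation = optimizer.provide_recommendation()
print(recommendation.value)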

nevergrad also supports bounded continuous variables, discrete variables, and mixtures of both. To do so, specify the input space:

import nevergrad as ng

def fake_training(learning_rate: float, batch_size: int, architecture: str) -> float:
    # optimal for learning_rate=0.2, batch_size=4, architecture="conv"
    return (learning_rate - 0.2)**2 + (batch_size - 4)**2 + (0 if architecture == "conv" else 10)

# Instrumentation class is used for functions with multiple inputs
# (positional and/or keywords)
parametrization = ng.p.Instrumentation(
    # a log-distributed scalar between 0.001 and 1.0
    learning_rate=ng.p.Log(lower=0.001, upper=1.0),
    # an integer from 1 to 12
    batch_size=ng.p.Scalar(lower=1, upper=12).set_integer_casting(),
    # either "conv" or "fc"
    architecture=ng.p.Choice(["conv", "fc"])
)

optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100)
recommendation = optimizer.minimize(fake_training)

# show the recommended keyword arguments of the function
print(recommendation.kwargs)
>>> {'learning_rate': 0.1998, 'batch_size': 4, 'architecture': 'conv'}

Learn more about parametrization in the documentation!
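
For a single continuous input constrained to a range, the search space can also be bounded directly on an array parameter. A minimal sketch, assuming the ng.p.Array parameter and its set_bounds method described in the parametrization documentation:

import nevergrad as ng

def square(x):
    return sum((x - .5)**2)

# a 2-dimensional array with each coordinate bounded in [0, 1]
param = ng.p.Array(shape=(2,)).set_bounds(lower=0.0, upper=1.0)
optimizer = ng.optimizers.NGOpt(parametrization=param, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation.value)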

Example of optimization

Convergence of a population of points to the minima with two-points DE.

Documentation

Check out our documentation! It's still a work in progress, so don't hesitate to submit issues and/or PRs to update it and make it clearer!

Citing

@misc{nevergrad,
    author = {J. Rapin and O. Teytaud},
    title = {{Nevergrad - A gradient-free optimization platform}},
    year = {2018},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://GitHub.com/FacebookResearch/Nevergrad}},
}

License

nevergrad is released under the MIT license. See LICENSE for additional details about it. See also our Terms of Use and Privacy Policy.

Download files

Download the file for your platform.

Source Distribution

nevergrad-0.4.2.post1.tar.gz (239.0 kB)

Uploaded Source

Built Distribution

nevergrad-0.4.2.post1-py3-none-any.whl (312.0 kB)

Uploaded Python 3

File details

Details for the file nevergrad-0.4.2.post1.tar.gz.

File metadata

  • Download URL: nevergrad-0.4.2.post1.tar.gz
  • Upload date:
  • Size: 239.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.0 requests-toolbelt/0.9.1 tqdm/4.50.2 CPython/3.6.12

File hashes

Hashes for nevergrad-0.4.2.post1.tar.gz
  • SHA256: 0951f1f90afce098394adfc91fd7fd454a1922b8c9f4964885dd829fe75b1bd9
  • MD5: c88c9541b39fd295f7dea0d14a5e630d
  • BLAKE2b-256: 2e26749febf62801510945dfdb23c7d2a81699dad6c4d449112c9c9a8911ba52


File details

Details for the file nevergrad-0.4.2.post1-py3-none-any.whl.

File metadata

  • Download URL: nevergrad-0.4.2.post1-py3-none-any.whl
  • Upload date:
  • Size: 312.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.0 requests-toolbelt/0.9.1 tqdm/4.50.2 CPython/3.6.12

File hashes

Hashes for nevergrad-0.4.2.post1-py3-none-any.whl
  • SHA256: da65de173ffd50ec22b498f6b11ee29b6ea42803f447dbb6cb3992dfb61c5f6c
  • MD5: 3c671a759d6f12cf560abf770a046dab
  • BLAKE2b-256: 775b07abbdb5414dd34e5916bce82f13f79463f4534d5f66b2194b27214acbf1

