A Python toolbox for performing gradient-free optimization

Project description

Nevergrad - A gradient-free optimization platform

nevergrad is a Python 3.6+ library. It can be installed with:

pip install nevergrad

You can also install the master branch instead of the latest release with:

pip install git+https://github.com/facebookresearch/nevergrad@master#egg=nevergrad

Alternatively, you can clone the repository and run pip install -e . from inside the repository folder.

By default, this only installs requirements for the optimization and instrumentation subpackages. If you are also interested in the benchmarking part, install with the [benchmark] flag (example: pip install 'nevergrad[benchmark]'), and if you also want the test tools, use the [all] flag (example: pip install -e '.[all]').
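
To quickly check that the installation worked, you can print the installed version (a minimal sketch, assuming as usual that the package exposes a __version__ attribute):

import nevergrad as ng
print(ng.__version__)  # should print the installed version, e.g. 0.3.1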

You can join the Nevergrad users Facebook group here.

Goals and structure

The goals of this package are to provide:

  • gradient/derivative-free optimization algorithms, including algorithms able to handle noise.
  • tools to parametrize any code, making it painless to optimize your parameters/hyperparameters, whether they are continuous, discrete or a mixture of continuous and discrete parameters.
  • functions on which to test the optimization algorithms.
  • benchmark routines in order to compare algorithms easily.

The structure of the package follows its goals; you will therefore find the following subpackages:

  • optimization: implementing optimization algorithms
  • parametrization: specifying which parameters you want to optimize
  • functions: implementing both simple and complex benchmark functions
  • benchmark: for running experiments comparing the algorithms on benchmark functions
  • common: a set of tools used throughout the package

Example of optimization

[Figure: convergence of a population of points to the minimum with two-points DE.]

Documentation

This README is very general; here are links to more details on:

  • how to perform optimization using nevergrad, including parallelization and a few recommendations on which algorithm to use depending on the settings
  • how to parametrize your problem so that the optimizers are informed of the problem to solve. This also provides a tool to turn a script or non-Python code into a Python function, so that some of its parameters can be tuned.
  • how to benchmark all optimizers on various test functions.
  • benchmark results of some standard optimizers on simple test cases.
  • examples of optimization for machine learning.
  • how to contribute through issues and pull requests, and how to set up your dev environment.
  • guidelines on how to contribute by adding a new algorithm.

Basic optimization example

All optimizers assume a centered and reduced prior at the beginning of the optimization (i.e. zero mean and unit standard deviation). They are however able to find solutions far from this initial prior.
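
If the interesting range of a parameter lies far from this standard prior, a simple option is to rescale the input inside the objective itself. This is a generic sketch in plain Python (not a nevergrad-specific API); the target range around 100 is purely illustrative:

def rescaled_objective(x):
    # x follows the optimizer's standard prior (roughly zero mean, unit std);
    # map it to the region of interest before computing the loss
    y = 100 + 10 * x
    return sum((y - 105) ** 2)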

Optimizing (minimizing!) a function using an optimizer (here OnePlusOne) can be easily run with:

import nevergrad as ng

def square(x):
    return sum((x - .5)**2)

optimizer = ng.optimizers.OnePlusOne(instrumentation=2, budget=100)  # dimension 2, at most 100 evaluations
recommendation = optimizer.minimize(square)
print(recommendation)  # optimal args and kwargs
>>> Candidate(args=(array([0.500, 0.499]),), kwargs={})

recommendation holds the optimal attributes args and kwargs found by the optimizer for the provided function. In this example, the optimal value will be found in recommendation.args[0] and will be a np.ndarray of size 2.
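
For instance, continuing the example above, the recommended point can be retrieved and re-evaluated as follows (a minimal sketch based on the attributes described above):

best_x = recommendation.args[0]  # np.ndarray of size 2, close to [0.5, 0.5]
print(best_x, square(best_x))    # loss at the recommendation, close to 0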

instrumentation=n is a shortcut to state that the function has only one variable, of dimension n. See the instrumentation tutorial for more complex instrumentations.
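
As a rough sketch of what a richer instrumentation can look like: the function and variable names below are purely illustrative, and ng.Instrumentation, ng.var.SoftmaxCategorical and ng.var.Gaussian are assumed from the 0.3-era instrumentation API; refer to the tutorial for the exact interface.

import nevergrad as ng

def fake_training(architecture, learning_rate):
    # hypothetical objective mixing a discrete and a continuous parameter
    return learning_rate ** 2 if architecture == "conv" else (learning_rate - .1) ** 2

# one positional categorical variable and one continuous keyword variable
instrum = ng.Instrumentation(ng.var.SoftmaxCategorical(["conv", "fc"]),
                             learning_rate=ng.var.Gaussian(mean=0, std=1))
optimizer = ng.optimizers.OnePlusOne(instrumentation=instrum, budget=100)
recommendation = optimizer.minimize(fake_training)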

You can print the full list of optimizers with:

import nevergrad as ng
print(list(sorted(ng.optimizers.registry.keys())))
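
Each entry of this registry is an optimizer class, so an optimizer can also be looked up and instantiated by name. A small sketch, assuming "TwoPointsDE" is among the registered names, as the figure above suggests:

import nevergrad as ng

optimizer_cls = ng.optimizers.registry["TwoPointsDE"]     # look up an optimizer class by name
optimizer = optimizer_cls(instrumentation=2, budget=100)  # same constructor arguments as above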

The optimization documentation contains more information on how to use several workers, take full control of the optimization through the ask and tell interface, perform multiobjective optimization, as well as pieces of advice on how to choose the proper optimizer for your problem.
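
A minimal sketch of that ask and tell interface, equivalent to the minimize call in the earlier example (assuming the ask, tell and provide_recommendation methods described in the optimization documentation):

import nevergrad as ng

def square(x):
    return sum((x - .5) ** 2)

optimizer = ng.optimizers.OnePlusOne(instrumentation=2, budget=100)
for _ in range(optimizer.budget):
    candidate = optimizer.ask()                      # point suggested by the optimizer
    loss = square(*candidate.args, **candidate.kwargs)
    optimizer.tell(candidate, loss)                  # report the evaluation back
recommendation = optimizer.provide_recommendation()  # best point found so far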

Citing

@misc{nevergrad,
    author = {J. Rapin and O. Teytaud},
    title = {{Nevergrad - A gradient-free optimization platform}},
    year = {2018},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://GitHub.com/FacebookResearch/Nevergrad}},
}

License

nevergrad is released under the MIT license. See LICENSE for additional details. Note, however, that the multiobjective subpackage also includes LGPL code.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

nevergrad-0.3.1.tar.gz (174.9 kB)

Built Distribution

nevergrad-0.3.1-py3-none-any.whl (231.2 kB)

File details

Details for the file nevergrad-0.3.1.tar.gz.

File metadata

  • Download URL: nevergrad-0.3.1.tar.gz
  • Upload date:
  • Size: 174.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.21.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.5

File hashes

Hashes for nevergrad-0.3.1.tar.gz:

  • SHA256: ce0a2565dcaac76eb4591a31561d5cb718f7266757f7d1b2c6636598dbcf1ff9
  • MD5: cbacac6ce8d2fda94daddebbddbea63e
  • BLAKE2b-256: 3b4b301ede3768b109117b10ccf5a92888097c1c62c949900f0d1ebe676ec64a

File details

Details for the file nevergrad-0.3.1-py3-none-any.whl.

File metadata

  • Download URL: nevergrad-0.3.1-py3-none-any.whl
  • Upload date:
  • Size: 231.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.21.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.5

File hashes

Hashes for nevergrad-0.3.1-py3-none-any.whl:

  • SHA256: fb96f53c3a5dc586ca556e497b021e4937f48477c183d5317d452a7579a04b00
  • MD5: 9a672e5f890814930d5e2fa0eb1e4bc1
  • BLAKE2b-256: 3f76f821f24fea5367cd7777f3d6c9dc9455edc2c2a68e60608ed36005554844

