# autoptim: automatic differentiation + optimization
Do you have a new machine learning model that you want to optimize, but do not want to bother computing the gradients by hand? Autoptim is for you.
## Short presentation
Autoptim is a small Python package that blends `autograd`'s automatic differentiation into `scipy.optimize.minimize`.
The gradients are computed under the hood using automatic differentiation; the user only provides the objective function:
```python
import numpy as np
from autoptim import minimize


def rosenbrock(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2


x0 = np.zeros(2)
x_min, _ = minimize(rosenbrock, x0)
print(x_min)  # [0.99999913 0.99999825]
```
It comes with the following features:
- **Natural interfacing with Numpy**: The objective function is written in standard Numpy, and the input/output of `autoptim.minimize` are Numpy arrays.
- **Smart input processing**: `scipy.optimize.minimize` only accepts one-dimensional arrays as input. With `autoptim`, variables can be multi-dimensional arrays or lists of arrays (see the first sketch below).
- **Preconditioning**: Preconditioning is a simple way to accelerate minimization through a change of variables. `autoptim` makes preconditioning straightforward (see the second sketch below).
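For illustration, here is a minimal sketch of the multi-dimensional input handling. It assumes, based on the feature description above, that `autoptim.minimize` accepts a variable of any shape and returns the minimizer with that same shape; the loss function `frobenius_loss` is a hypothetical name introduced for this example.

```python
import numpy as np
from autoptim import minimize


# Hypothetical sketch: the optimization variable is a 2-D array, with no
# manual flattening to a 1-D vector as plain scipy.optimize.minimize
# would require.
def frobenius_loss(W):
    return np.sum((W - 3.0) ** 2)  # minimized when every entry of W is 3


W0 = np.zeros((4, 5))  # multi-dimensional starting point
W_min, _ = minimize(frobenius_loss, W0)
print(W_min.shape)  # expected: (4, 5)
```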
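And here is a sketch of what preconditioning means in this context. It performs the change of variables by hand to illustrate the idea; it does not use `autoptim`'s own preconditioning interface, which is not documented on this page (see the tutorials for that).

```python
import numpy as np
from autoptim import minimize

scales = np.array([1.0, 100.0])  # a badly scaled problem


def f(x):
    return np.sum((scales * x - 1.0) ** 2)


# Change of variables x = z / scales: g is perfectly conditioned, so the
# optimizer works on variables that all have the same scale.
def g(z):
    return f(z / scales)


z_min, _ = minimize(g, np.zeros(2))
x_min = z_min / scales  # map the solution back to the original variables
print(x_min)  # expected: close to [1.0, 0.01]
```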
### Disclaimer
This package is meant to be as easy to use as possible. As such, it makes some compromises on minimization speed.
## Installation
To install, use `pip`:
```
pip install autoptim
```
## Dependencies
- numpy >= 1.12
- scipy >= 0.18.0
- autograd >= 1.2
## Examples
Several examples can be found in `autoptim/tutorials`.