Preconditioned ICA for Real Data
Project description
This repository hosts the Python/Octave/Matlab code of the Preconditioned ICA for Real Data (Picard) and Picard-O algorithms.
See the documentation.
Algorithm
Picard is an algorithm for maximum likelihood independent component analysis. It exhibits state-of-the-art convergence speed, solving the same problems as the widely used FastICA, Infomax and extended-Infomax algorithms, but faster.
The parameter ortho chooses whether to work under the orthogonal constraint (i.e. to enforce decorrelation of the output) or not. Picard also comes with an extended version, like extended-Infomax, which makes separation of both sub- and super-Gaussian signals possible; it is selected with the parameter extended. The four combinations behave as follows (a usage sketch follows the list):
ortho=False, extended=False: same solution as Infomax
ortho=False, extended=True: same solution as extended-Infomax
ortho=True, extended=True: same solution as FastICA
ortho=True, extended=False: same solution as Infomax under the orthogonal constraint
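For illustration, a minimal sketch on a synthetic mixture (the data generation below is arbitrary; only the ortho and extended keyword arguments are the ones described above):

>>> import numpy as np
>>> from picard import picard
>>> rng = np.random.RandomState(0)
>>> S = rng.laplace(size=(3, 1000))  # independent super-Gaussian sources
>>> X = np.dot(rng.randn(3, 3), S)  # random linear mixture
>>> K, W, Y = picard(X, ortho=False, extended=False)  # Infomax-like solution
>>> K, W, Y = picard(X, ortho=True, extended=True)  # FastICA-like solution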
Installation
We recommend the Anaconda Python distribution.
conda
Picard can be installed from conda-forge. You need to add conda-forge to your conda channels (e.g. with conda config --add channels conda-forge), and then do:
$ conda install python-picard
pip
Otherwise, to install picard, you first need to install its dependencies:
$ pip install numpy matplotlib scipy
Then install Picard with pip:
$ pip install python-picard
or to get the latest version of the code:
$ pip install git+https://github.com/pierreablin/picard.git#egg=picard
If you do not have admin privileges on the computer, use the --user flag with pip. To upgrade, use the --upgrade flag provided by pip.
check
To check if everything worked fine, you can do:
$ python -c 'import picard'
and it should not give any error message.
matlab/octave
The Matlab/Octave version of Picard and Picard-O is available here.
Quickstart
To get started, you can build a synthetic matrix of mixed signals:
>>> import numpy as np
>>> N, T = 3, 1000
>>> S = np.random.laplace(size=(N, T))
>>> A = np.random.randn(N, N)
>>> X = np.dot(A, S)
And then use Picard to separate the signals:
>>> from picard import picard
>>> K, W, Y = picard(X)
Picard outputs the whitening matrix K, the estimated unmixing matrix W, and the estimated sources Y, which means that Y = W K X.
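As a quick sanity check (a sketch, assuming the arrays K, W, Y and X from the quickstart above are still in scope), you can verify this relation numerically:

>>> np.allclose(Y, np.dot(W, np.dot(K, X)))  # should evaluate to True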
NEW: scikit-learn compatible API
The class picard.Picard mimics the behavior of sklearn.decomposition.FastICA:
>>> from sklearn.datasets import load_digits
>>> from picard import Picard
>>> X, _ = load_digits(return_X_y=True)
>>> transformer = Picard(n_components=7)
>>> X_transformed = transformer.fit_transform(X)
>>> X_transformed.shape
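As a hedged follow-up, assuming the estimator exposes the same fitted attributes as sklearn.decomposition.FastICA (components_ for the unmixing matrix and mixing_ for its pseudo-inverse; treat these attribute names as assumptions), you can inspect the fitted model:

>>> transformer.components_.shape  # unmixing matrix; expected (7, 64) on the digits data
>>> transformer.mixing_.shape  # mixing matrix; expected (64, 7)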
Dependencies
These are the dependencies to use Picard:
numpy (>=1.8)
matplotlib (>=1.3)
scipy (>=0.19)
Optionally, to get faster computations, you can install:
numexpr (>= 2.0)
These are the dependencies to run the EEG example:
mne (>=0.14)
Cite
If you use this code in your project, please cite:
Pierre Ablin, Jean-François Cardoso, Alexandre Gramfort. Faster independent component analysis by preconditioning with Hessian approximations. IEEE Transactions on Signal Processing, 2018. https://arxiv.org/abs/1706.08171

Pierre Ablin, Jean-François Cardoso, Alexandre Gramfort. Faster ICA under orthogonal constraint. ICASSP, 2018. https://arxiv.org/abs/1711.10873
Changelog
New in 0.8: for the 'exp' density, the default parameter is now alpha = 0.1 instead of alpha = 1.
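As an illustration of where that default applies, here is a minimal sketch assuming the density is selected through the fun parameter of picard (alpha is left at its new default):

>>> import numpy as np
>>> from picard import picard
>>> X = np.random.laplace(size=(3, 1000))  # placeholder data
>>> K, W, Y = picard(X, fun='exp')  # 'exp' density; default alpha is 0.1 since 0.8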