Differentiable neuron simulations.

Project description

Differentiable neuron simulations on CPU, GPU, or TPU


Documentation | Getting Started | Install guide | Reference docs | FAQ

What is Jaxley?

Jaxley is a differentiable simulator for biophysical neuron models, built on the Python library JAX. Its key features are:

  • automatic differentiation, allowing gradient-based optimization of thousands of parameters
  • support for CPU, GPU, or TPU without any changes to the code
  • JIT compilation, making it as fast as other packages while being fully written in Python
  • support for multicompartment neurons (see the sketch after this list)
  • elegant mechanisms for parameter sharing
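
Multicompartment models are built by composing compartments into branches and branches into a cell. Below is a minimal sketch; the constructor arguments (nseg, parents) follow the Jaxley tutorials and may differ slightly across versions.

import jaxley as jx
from jaxley.channels import HH

comp = jx.Compartment()                     # A single compartment.
branch = jx.Branch(comp, nseg=4)            # A branch made of four compartments.
cell = jx.Cell(branch, parents=[-1, 0, 0])  # Three branches: one root, two children.
cell.insert(HH())                           # Insert Hodgkin-Huxley channels everywhere.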

Getting started

Jaxley allows you to simulate biophysical neuron models on CPU, GPU, or TPU:

import matplotlib.pyplot as plt
from jax import config

import jaxley as jx
from jaxley.channels import HH

config.update("jax_platform_name", "cpu")  # Or "gpu" / "tpu".

cell = jx.Cell()  # Define cell.
cell.insert(HH())  # Insert channels.

current = jx.step_current(i_delay=1.0, i_dur=1.0, i_amp=0.1, delta_t=0.025, t_max=10.0)
cell.stimulate(current)  # Stimulate with step current.
cell.record("v")  # Record voltage.

v = jx.integrate(cell)  # Run simulation.
plt.plot(v.T)  # Plot voltage trace.
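
Because the entire simulation is differentiable, you can take gradients of the simulated voltages with respect to model parameters using standard JAX transformations. The snippet below is a minimal sketch continuing the example above; the trainable parameter name "HH_gNa" and the toy loss are illustrative assumptions, so consult the reference docs for the full training workflow.

import jax.numpy as jnp
from jax import grad, jit

cell.make_trainable("HH_gNa")   # Mark the sodium conductance as trainable (name assumed).
params = cell.get_parameters()  # Collect all trainable parameters.

def loss_fn(params):
    v = jx.integrate(cell, params=params)  # Differentiate straight through the simulator.
    return jnp.mean((v - (-65.0)) ** 2)    # Toy loss: keep the trace near resting potential.

grads = jit(grad(loss_fn))(params)  # Gradients w.r.t. the trainable parameters.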

You can find an overview of what kinds of models can be implemented in Jaxley in the documentation. If you want to learn more, we recommend checking out our tutorials.

Installation

Jaxley is available on PyPI:

pip install jaxley

This will install Jaxley with CPU support. If you want GPU support, follow the instructions on the JAX GitHub repository to install JAX with GPU support (in addition to installing Jaxley). For example, for NVIDIA GPUs, run

pip install -U "jax[cuda12]"
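
To verify that JAX (and therefore Jaxley) can actually see the GPU, list the available devices; this is plain JAX and not Jaxley-specific:

import jax
print(jax.devices())  # Should list a CUDA/GPU device rather than only CpuDevice.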

Feedback and Contributions

We welcome any feedback on how Jaxley is working for your neuron models and are happy to receive bug reports, pull requests, and other contributions (see contribute). We wish to maintain a positive community; please read our Code of Conduct.

License

Apache License Version 2.0 (Apache-2.0)

Citation

If you use Jaxley, consider citing the corresponding paper:

@article{deistler2024differentiable,
  doi = {10.1101/2024.08.21.608979},
  year = {2024},
  publisher = {Cold Spring Harbor Laboratory},
  author = {Deistler, Michael and Kadhim, Kyra L. and Pals, Matthijs and Beck, Jonas and Huang, Ziwei and Gloeckler, Manuel and Lappalainen, Janne K. and Schr{\"o}der, Cornelius and Berens, Philipp and Gon{\c c}alves, Pedro J. and Macke, Jakob H.},
  title = {Differentiable simulation enables large-scale training of detailed biophysical models of neural dynamics},
  journal = {bioRxiv}
}

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

jaxley-0.3.0.tar.gz (93.2 kB)

Uploaded Source

Built Distribution

Jaxley-0.3.0-py3-none-any.whl (130.2 kB)

Uploaded Python 3

File details

Details for the file jaxley-0.3.0.tar.gz.

File metadata

  • Download URL: jaxley-0.3.0.tar.gz
  • Upload date:
  • Size: 93.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for jaxley-0.3.0.tar.gz

  • SHA256: 096839b89dd39ce774f48ea7788ae96ddc09c13f1c13d1e7e8eb18fa295e4618
  • MD5: 5001d8ec3e4575585ec893f33968a266
  • BLAKE2b-256: affa10cb6d07573b73d60c7cf901b5d9a511eac6f0df9432620f445b140cad87

See more details on using hashes here.

File details

Details for the file Jaxley-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: Jaxley-0.3.0-py3-none-any.whl
  • Upload date:
  • Size: 130.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for Jaxley-0.3.0-py3-none-any.whl

  • SHA256: 3d3b20a45dda223b77f605c89ebb2fa4957fd763f378e4ac6846b19b9c4baa99
  • MD5: 3b22c76eee3b92979a5c552f0e272eb8
  • BLAKE2b-256: ba7543403c8c60693dd454326977e130b32a222f3907927b4e79c5d1f8429a61

See more details on using hashes here.
