
Differentiable neuron simulations on CPU, GPU, or TPU


Documentation | Getting Started | Install guide | Reference docs | FAQ

What is Jaxley?

Jaxley is a differentiable simulator for biophysical neuron models, built on the Python library JAX. Its key features are:

  • automatic differentiation, allowing gradient-based optimization of thousands of parameters
  • support for CPU, GPU, or TPU without any changes to the code
  • jit-compilation, making it as fast as other packages while being fully written in Python
  • support for multicompartment neurons
  • elegant mechanisms for parameter sharing
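
The first feature is what sets Jaxley apart: because the entire simulation is differentiable, model parameters can be fit to data with gradient descent. The following library-agnostic Python sketch shows the idea on a toy passive membrane, with the gradient of a squared-error loss with respect to the membrane time constant derived by hand; in Jaxley, JAX's automatic differentiation computes such gradients for you across thousands of parameters. All names below (`simulate`, `loss_and_grad`, `tau`) are illustrative and not part of the Jaxley API.

```python
# Passive membrane: dv/dt = -(v - v_rest) / tau, integrated with forward Euler.
# A library-agnostic sketch of gradient-based fitting; none of these names
# come from the Jaxley API.

def simulate(tau, v0=-50.0, v_rest=-70.0, dt=0.025, n_steps=400):
    """Integrate the voltage and its sensitivity dv/dtau alongside it."""
    v, dv_dtau = v0, 0.0
    for _ in range(n_steps):
        # Forward-mode sensitivity: differentiate the Euler update w.r.t. tau
        # (must use the pre-update v, so this line comes first).
        dv_dtau += dt * ((v - v_rest) / tau**2 - dv_dtau / tau)
        v += dt * (-(v - v_rest) / tau)
    return v, dv_dtau

def loss_and_grad(tau, v_target):
    v, dv_dtau = simulate(tau)
    loss = (v - v_target) ** 2
    return loss, 2.0 * (v - v_target) * dv_dtau  # chain rule

# Fit tau to match a final voltage produced with tau_true = 5.0 ms.
v_target, _ = simulate(5.0)
tau = 2.0
for _ in range(200):
    loss, grad = loss_and_grad(tau, v_target)
    tau -= 0.5 * grad  # plain gradient descent
```

After a couple hundred gradient steps, `tau` recovers the value used to generate the target voltage. Jaxley applies the same principle to detailed multicompartment models, where hand-deriving gradients would be intractable.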

Getting started

Jaxley lets you simulate biophysical neuron models on CPU, GPU, or TPU:

import matplotlib.pyplot as plt
from jax import config

import jaxley as jx
from jaxley.channels import HH

config.update("jax_platform_name", "cpu")  # Or "gpu" / "tpu".

cell = jx.Cell()  # Define cell.
cell.insert(HH())  # Insert channels.

current = jx.step_current(i_delay=1.0, i_dur=1.0, i_amp=0.1, delta_t=0.025, t_max=10.0)
cell.stimulate(current)  # Stimulate with step current.
cell.record("v")  # Record voltage.

v = jx.integrate(cell)  # Run simulation.
plt.plot(v.T)  # Plot voltage trace.
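
The plot above uses sample indices on the x-axis. To plot against time in milliseconds instead, you can rebuild a time axis from `delta_t` and `t_max`; note that it is an assumption here that `jx.integrate` returns one sample per time step plus the initial state, so match the length against `v.shape[1]` in practice:

```python
# Rebuild the time axis for the simulation above (delta_t = 0.025 ms,
# t_max = 10.0 ms). The "+ 1" assumes the returned trace also includes
# the initial state; check v.shape[1] to confirm.
delta_t, t_max = 0.025, 10.0
n_samples = int(round(t_max / delta_t)) + 1
time = [i * delta_t for i in range(n_samples)]
```

With this, `plt.plot(time, v.T)` labels the x-axis in milliseconds.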

The documentation provides an overview of the kinds of models that can be implemented in Jaxley. If you want to learn more, we recommend checking out our tutorials.

Installation

Jaxley is available on PyPI:

pip install jaxley

This will install Jaxley with CPU support. If you want GPU support, follow the instructions in the JAX GitHub repository to install JAX with GPU support (in addition to installing Jaxley). For example, for NVIDIA GPUs, run

pip install -U "jax[cuda12]"

Feedback and Contributions

We welcome any feedback on how Jaxley is working for your neuron models and are happy to receive bug reports, pull requests, and other feedback (see contribute). We wish to maintain a positive community; please read our Code of Conduct.

License

Apache License Version 2.0 (Apache-2.0)

Citation

If you use Jaxley, consider citing the corresponding paper:

@article{deistler2024differentiable,
  doi = {10.1101/2024.08.21.608979},
  year = {2024},
  publisher = {Cold Spring Harbor Laboratory},
  author = {Deistler, Michael and Kadhim, Kyra L. and Pals, Matthijs and Beck, Jonas and Huang, Ziwei and Gloeckler, Manuel and Lappalainen, Janne K. and Schr{\"o}der, Cornelius and Berens, Philipp and Gon{\c c}alves, Pedro J. and Macke, Jakob H.},
  title = {Differentiable simulation enables large-scale training of detailed biophysical models of neural dynamics},
  journal = {bioRxiv}
}

Project details


Download files

Download the file for your platform.

Source Distribution

jaxley-0.4.0.tar.gz (105.2 kB)


Built Distribution

Jaxley-0.4.0-py3-none-any.whl (143.9 kB)


File details

Details for the file jaxley-0.4.0.tar.gz.

File metadata

  • Download URL: jaxley-0.4.0.tar.gz
  • Upload date:
  • Size: 105.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for jaxley-0.4.0.tar.gz
  • SHA256: ea8642a46e089ff5dedf2cf30403f3a3cffa058be18d28443dd60dce14401ace
  • MD5: 332aadb2d732113d3fb0a16e34ea608b
  • BLAKE2b-256: 7cba082fbeb11c3804d3b0529faef9214b3796119c547a65cc677e77f39588a7


File details

Details for the file Jaxley-0.4.0-py3-none-any.whl.

File metadata

  • Download URL: Jaxley-0.4.0-py3-none-any.whl
  • Upload date:
  • Size: 143.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for Jaxley-0.4.0-py3-none-any.whl
  • SHA256: 4b4a1634b89b6db9f895bf1cdfe877faf1d2817aef23fea1558563e710aece99
  • MD5: f57517ee6662ad603c026f860cd1a6d0
  • BLAKE2b-256: 92e316d74823ac0b2e6b9d28cca58ece1b1ddf69c275e181419bdfb5b719017f

