Differentiable neuron simulations on CPU, GPU, or TPU
Documentation | Getting Started | Install guide | Reference docs | FAQ
What is Jaxley?
Jaxley is a differentiable simulator for biophysical neuron models, built on the Python library JAX. Its key features are:
- automatic differentiation, allowing gradient-based optimization of thousands of parameters
- support for CPU, GPU, or TPU without any changes to the code
- jit-compilation, making it as fast as other packages while being fully written in Python
- support for multicompartment neurons
- elegant mechanisms for parameter sharing
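The automatic-differentiation feature above can be illustrated independently of Jaxley's own API. Below is a minimal sketch in plain JAX: an explicit-Euler loop for a toy leaky membrane, through which `jax.grad` differentiates the final voltage with respect to the leak conductance. All names and constants here are hypothetical illustrations, not Jaxley internals.

```python
import jax

# Toy leaky membrane: dV/dt = (-g_leak * (V - E_leak) + I) / C.
# Hypothetical constants for illustration only.
def simulate(g_leak, n_steps=400, dt=0.025, e_leak=-70.0, cm=1.0, i_ext=0.1):
    def step(v, _):
        v = v + dt * (-g_leak * (v - e_leak) + i_ext) / cm
        return v, v

    v_final, _ = jax.lax.scan(step, e_leak, None, length=n_steps)
    return v_final

# Gradient of the final voltage w.r.t. the leak conductance,
# obtained by differentiating through the whole simulation loop.
dv_dg = jax.grad(simulate)(0.3)
```

The same principle is what allows gradient-based optimization of thousands of biophysical parameters at once: the simulation is just a composition of differentiable operations.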
Getting started
Jaxley allows you to simulate biophysical neuron models on CPU, GPU, or TPU:
```python
import matplotlib.pyplot as plt
from jax import config

import jaxley as jx
from jaxley.channels import HH

config.update("jax_platform_name", "cpu")  # Or "gpu" / "tpu".

cell = jx.Cell()  # Define cell.
cell.insert(HH())  # Insert channels.

current = jx.step_current(i_delay=1.0, i_dur=1.0, i_amp=0.1, delta_t=0.025, t_max=10.0)
cell.stimulate(current)  # Stimulate with step current.
cell.record("v")  # Record voltage.

v = jx.integrate(cell)  # Run simulation.
plt.plot(v.T)  # Plot voltage trace.
```
Here you can find an overview of what kinds of models can be implemented in Jaxley. If you want to learn more, we recommend checking out our tutorials on how to:
- get started with Jaxley
- simulate networks of neurons
- speed up simulations with GPUs and jit
- define your own channels and synapses
- compute the gradient and train biophysical models
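As a taste of the jit speedups mentioned in the list above, here is a generic JAX sketch (not Jaxley code): `jax.jit` traces a Python function once and compiles it with XLA, after which repeated calls reuse the compiled version. The membrane right-hand side and its constants are made up for illustration.

```python
import jax
import jax.numpy as jnp

# Hypothetical membrane right-hand side; constants are illustrative only.
def rhs(v, g_leak=0.3, e_leak=-70.0, i_ext=0.1):
    return -g_leak * (v - e_leak) + i_ext

rhs_jit = jax.jit(rhs)  # first call traces and compiles; later calls reuse it

v = jnp.linspace(-80.0, -60.0, 5)
out = rhs_jit(v)  # same values as rhs(v), but executed as compiled XLA code
```

The compiled function is numerically identical to the plain-Python one; the benefit is speed, which grows with the size of the simulation being traced.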
Installation
Jaxley is available on PyPI:
pip install jaxley
This will install Jaxley with CPU support. If you want GPU support, follow the instructions in the JAX GitHub repository to install JAX with GPU support (in addition to installing Jaxley). For example, for NVIDIA GPUs, run
pip install -U "jax[cuda12]"
Feedback and Contributions
We welcome any feedback on how Jaxley is working for your neuron models and are happy to receive bug reports, pull requests, and other feedback (see contribute). We wish to maintain a positive community; please read our Code of Conduct.
License
Apache License Version 2.0 (Apache-2.0)
Citation
If you use Jaxley, consider citing the corresponding paper:
@article{deistler2024differentiable,
doi = {10.1101/2024.08.21.608979},
year = {2024},
publisher = {Cold Spring Harbor Laboratory},
author = {Deistler, Michael and Kadhim, Kyra L. and Pals, Matthijs and Beck, Jonas and Huang, Ziwei and Gloeckler, Manuel and Lappalainen, Janne K. and Schr{\"o}der, Cornelius and Berens, Philipp and Gon{\c c}alves, Pedro J. and Macke, Jakob H.},
title = {Differentiable simulation enables large-scale training of detailed biophysical models of neural dynamics},
journal = {bioRxiv}
}
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file jaxley-0.2.1.tar.gz.
File metadata
- Download URL: jaxley-0.2.1.tar.gz
- Upload date:
- Size: 89.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.4
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 900b4d2dd1a66b8f28e4ad0fbba3ddef1697ed37b76b3662d376464dba9ad931 |
| MD5 | 72720ccd6ca70aa4de43d3e48ca70921 |
| BLAKE2b-256 | 350c8c75ecd651c7f444f2095889325f77689feeef78c579fffa756c6e61f463 |
File details
Details for the file Jaxley-0.2.1-py3-none-any.whl.
File metadata
- Download URL: Jaxley-0.2.1-py3-none-any.whl
- Upload date:
- Size: 125.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.4
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 793c808572cf3cfa5924601da3c1eb8f54c50a564cc467292fefb3a75b0c182a |
| MD5 | 43be47506cb24510f45e3a32ce6fbea2 |
| BLAKE2b-256 | 69268d3b5d28391ee4ddedd9e2f57ad87718ea89fe94b9b3de932073d014f439 |
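The digests listed above can be checked locally after downloading. A minimal sketch using Python's standard hashlib; the filename is just the one listed above, and the helper name is our own:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file in chunks and return its hex SHA256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the SHA256 published above, e.g.:
# sha256_of("jaxley-0.2.1.tar.gz") == "900b4d2dd1a66b8f28e4ad0fbba3ddef1697ed37b76b3662d376464dba9ad931"
```

Streaming in chunks keeps memory usage constant regardless of archive size; the same pattern works for the MD5 and BLAKE2b digests via `hashlib.md5` and `hashlib.blake2b`.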