
NEURON Modeling Language Source-to-Source Compiler Framework

Project description

The NMODL Framework


The NMODL Framework is a code generation engine for NEURON MODeling Language (NMODL). It is designed with modern compiler and code generation techniques to:

  • Provide modular tools for parsing, analysing and transforming NMODL

  • Provide an easy-to-use, high-level Python API

  • Generate optimised code for modern compute architectures, including CPUs and GPUs

  • Provide the flexibility to implement new simulator backends

  • Support the full NMODL specification

About NMODL

Simulators like NEURON use NMODL as a domain specific language (DSL) to describe a wide range of membrane and intracellular submodels. Here is an example of an exponential synapse specified in NMODL:

NEURON {
    POINT_PROCESS ExpSyn
    RANGE tau, e, i
    NONSPECIFIC_CURRENT i
}
UNITS {
    (nA) = (nanoamp)
    (mV) = (millivolt)
    (uS) = (microsiemens)
}
PARAMETER {
    tau = 0.1 (ms) <1e-9,1e9>
    e = 0 (mV)
}
ASSIGNED {
    v (mV)
    i (nA)
}
STATE {
    g (uS)
}
INITIAL {
    g = 0
}
BREAKPOINT {
    SOLVE state METHOD cnexp
    i = g*(v - e)
}
DERIVATIVE state {
    g' = -g/tau
}
NET_RECEIVE(weight (uS)) {
    g = g + weight
}

Installation

See INSTALL.rst for detailed instructions on building NMODL from source.

Try NMODL with Docker

To quickly test the NMODL Framework’s analysis capabilities we provide a Docker image, which includes the NMODL Framework Python library and a fully functional Jupyter notebook environment. After installing Docker and docker-compose you can pull and run the NMODL image from your terminal.

To try the Python interface directly from the command line, you can run the Docker image as:

docker run -it --entrypoint=/bin/sh bluebrain/nmodl

and then try the NMODL Python API discussed later in this README:

$ python3
Python 3.6.8 (default, Apr  8 2019, 18:17:52)
>>> from nmodl import dsl
>>> import os
>>> examples = dsl.list_examples()
>>> nmodl_string = dsl.load_example(examples[-1])
...

To try the Jupyter notebooks, you can download the Docker Compose file and run it as:

wget "https://raw.githubusercontent.com/BlueBrain/nmodl/master/docker/docker-compose.yml"
DUID=$(id -u) DGID=$(id -g) HOSTNAME=$(hostname) docker-compose up

If all goes well, you should see status messages similar to these at the end:

[I 09:49:53.923 NotebookApp] The Jupyter Notebook is running at:
[I 09:49:53.923 NotebookApp] http://(4c8edabe52e1 or 127.0.0.1):8888/?token=a7902983bad430a11935
[I 09:49:53.923 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
    To access the notebook, open this file in a browser:
        file:///root/.local/share/jupyter/runtime/nbserver-1-open.html
    Or copy and paste one of these URLs:
        http://(4c8edabe52e1 or 127.0.0.1):8888/?token=a7902983bad430a11935

Based on the example above you should then open your browser and navigate to the URL http://127.0.0.1:8888/?token=a7902983bad430a11935.

You can open and run all example notebooks provided in the examples folder. You can also create new notebooks in my_notebooks, which will be stored in a notebooks subfolder of your current working directory.

Using the Python API

Once the NMODL Framework is installed, you can use the Python parsing API to load NMODL code as:

from nmodl import dsl

examples = dsl.list_examples()
nmodl_string = dsl.load_example(examples[-1])
driver = dsl.NmodlDriver()
modast = driver.parse_string(nmodl_string)
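
If your model lives in a .mod file on disk rather than a Python string, a similar sketch should also work; this assumes the parse_file method of NmodlDriver and uses a hypothetical expsyn.mod path purely for illustration:

from nmodl import dsl

# Assumption: NmodlDriver also exposes parse_file for reading a MOD
# file directly from disk; "expsyn.mod" is a placeholder path.
driver = dsl.NmodlDriver()
modast = driver.parse_file("expsyn.mod")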

The parser returns an Abstract Syntax Tree (AST) representation of the input NMODL. One can inspect the AST by converting it to JSON form:

>>> print (dsl.to_json(modast))
{
  "Program": [
    {
      "NeuronBlock": [
        {
          "StatementBlock": [
            {
              "Suffix": [
                {
                  "Name": [
                    {
                      "String": [
                        {
                          "name": "POINT_PROCESS"
                        }
                    ...

Every key in the JSON form represents a node in the AST. You can also use the visualization API to look at the details of the AST:

from nmodl import ast
ast.view(modast)

which will open the AST view in a web browser:

ast_viz

The central Program node represents the whole MOD file and each of its children represents a block in the input NMODL file. Note that this requires X-forwarding if you are using the Docker image.

Once the AST is created, one can use existing visitors to perform various analyses and optimisations. One can also easily write a custom visitor using the Python Visitor API; see the Python API tutorial for details.
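
As a minimal sketch of such an analysis, assuming the AstLookupVisitor and AstNodeType names exposed by the nmodl.visitor and nmodl.ast modules (as used in the tutorials), one could count the STATE blocks and differential equations of a model:

from nmodl import ast, dsl, visitor

# Parse one of the bundled example models.
driver = dsl.NmodlDriver()
modast = driver.parse_string(dsl.load_example(dsl.list_examples()[-1]))

# Assumption: AstLookupVisitor collects all AST nodes of a given type,
# addressed via the ast.AstNodeType enum.
lookup = visitor.AstLookupVisitor()
states = lookup.lookup(modast, ast.AstNodeType.STATE_BLOCK)
odes = lookup.lookup(modast, ast.AstNodeType.DIFF_EQ_EXPRESSION)

print("STATE blocks:", len(states))
print("differential equations:", len(odes))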

The NMODL Framework also allows transforming the AST representation back to NMODL form:

>>> print (dsl.to_nmodl(modast))
NEURON {
    POINT_PROCESS ExpSyn
    RANGE tau, e, i
    NONSPECIFIC_CURRENT i
}

UNITS {
    (nA) = (nanoamp)
    (mV) = (millivolt)
    (uS) = (microsiemens)
}

PARAMETER {
    tau = 0.1 (ms) <1e-09,1000000000>
    e = 0 (mV)
}
...

High Level Analysis and Code Generation

The NMODL Framework provides rich model introspection and analysis capabilities using various visitors. Here is an example of the theoretical performance characterisation of channels and synapses from the rat neocortical column microcircuit published in 2015:

nmodl-perf-stats

To understand how you can write your own introspection and analysis tool, see this tutorial.
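
As a small illustration of such a tool, assuming the SymtabVisitor exposed by the nmodl.symtab module (as used in the tutorials), one can build and print the symbol table of a model:

from nmodl import dsl, symtab

# Parse one of the bundled example models.
driver = dsl.NmodlDriver()
modast = driver.parse_string(dsl.load_example(dsl.list_examples()[-1]))

# Assumption: SymtabVisitor populates the symbol table, which can then
# be retrieved from the Program node and printed.
symtab.SymtabVisitor().visit_program(modast)
print(modast.get_symbol_table())

The printed table should list each variable together with its NMODL properties (range, state, parameter and so on).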

Once analysis and optimization passes are performed, the NMODL Framework can generate optimised code for modern compute architectures, including CPU (Intel, AMD, ARM) and GPU (NVIDIA, AMD) platforms. For example, C++, OpenACC and OpenMP backends are implemented, and one can select them on the command line as:

$ nmodl expsyn.mod sympy --analytic

To learn more about the code generation backends, see here. The NMODL Framework provides a number of options (for code generation, optimization passes and ODE solvers), which can be listed as:

$ nmodl -H
NMODL : Source-to-Source Code Generation Framework [version]
Usage: /path/<>/nmodl [OPTIONS] file... [SUBCOMMAND]

Positionals:
  file TEXT:FILE ... REQUIRED           One or more MOD files to process

Options:
  -h,--help                             Print this help message and exit
  -H,--help-all                         Print this help message including all sub-commands
  --verbose=info                        Verbose logger output (trace, debug, info, warning, error, critical, off)
  -o,--output TEXT=.                    Directory for backend code output
  --scratch TEXT=tmp                    Directory for intermediate code output
  --units TEXT=/path/<>/nrnunits.lib
                                        Directory of units lib file

Subcommands:
host
  HOST/CPU code backends
  Options:
    --c                                   C/C++ backend (true)

acc
  Accelerator code backends
  Options:
    --oacc                                C/C++ backend with OpenACC (false)

sympy
  SymPy based analysis and optimizations
  Options:
    --analytic                            Solve ODEs using SymPy analytic integration (false)
    --pade                                Pade approximation in SymPy analytic integration (false)
    --cse                                 CSE (Common Subexpression Elimination) in SymPy analytic integration (false)
    --conductance                         Add CONDUCTANCE keyword in BREAKPOINT (false)

passes
  Analyse/Optimization passes
  Options:
    --inline                              Perform inlining at NMODL level (false)
    --unroll                              Perform loop unroll at NMODL level (false)
    --const-folding                       Perform constant folding at NMODL level (false)
    --localize                            Convert RANGE variables to LOCAL (false)
    --global-to-range                     Convert GLOBAL variables to RANGE (false)
    --localize-verbatim                   Convert RANGE variables to LOCAL even if verbatim block exist (false)
    --local-rename                        Rename LOCAL variable if variable of same name exist in global scope (false)
    --verbatim-inline                     Inline even if verbatim block exist (false)
    --verbatim-rename                     Rename variables in verbatim block (true)
    --json-ast                            Write AST to JSON file (false)
    --nmodl-ast                           Write AST to NMODL file (false)
    --json-perf                           Write performance statistics to JSON file (false)
    --show-symtab                         Write symbol table to stdout (false)

codegen
  Code generation options
  Options:
    --layout TEXT:{aos,soa}=soa           Memory layout for code generation
    --datatype TEXT:{float,double}=double Data type for floating point variables
    --force                               Force code generation even if there is any incompatibility
    --only-check-compatibility            Check compatibility and return without generating code
    --opt-ionvar-copy                     Optimize copies of ion variables (false)

Documentation

We are working on user documentation; you can find the current drafts of:

Citation

If you would like to know more about the NMODL Framework, see the following paper:

  • Pramod Kumbhar, Omar Awile, Liam Keegan, Jorge Alonso, James King, Michael Hines and Felix Schürmann. 2019. An optimizing multi-platform source-to-source compiler framework for the NEURON MODeling Language. arXiv:1905.02241

Support / Contribution

If you see any issue, feel free to raise a ticket. If you would like to improve this framework, see open issues and contribution guidelines.

Examples / Benchmarks

The benchmarks used to test the performance and parsing capabilities of the NMODL Framework are currently being migrated to GitHub. These benchmarks will be published soon in the following repositories:

Funding & Acknowledgment

The development of this software was supported by funding to the Blue Brain Project, a research center of the École polytechnique fédérale de Lausanne (EPFL), from the Swiss government’s ETH Board of the Swiss Federal Institutes of Technology. In addition, the development was supported by funding from the National Institutes of Health (NIH) under the Grant Number R01NS11613 (Yale University) and the European Union’s Horizon 2020 Framework Programme for Research and Innovation under the Specific Grant Agreement No. 785907 (Human Brain Project SGA2).

Copyright © 2017-2023 Blue Brain Project, EPFL


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files available for this release. See tutorial on generating distribution archives.

Built Distributions

nmodl_nightly-0.6.166-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.1 MB): CPython 3.12, manylinux: glibc 2.17+, x86-64

nmodl_nightly-0.6.166-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.0 MB): CPython 3.11, manylinux: glibc 2.17+, x86-64

nmodl_nightly-0.6.166-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.0 MB): CPython 3.10, manylinux: glibc 2.17+, x86-64

nmodl_nightly-0.6.166-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.0 MB): CPython 3.9, manylinux: glibc 2.17+, x86-64

nmodl_nightly-0.6.166-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.0 MB): CPython 3.8, manylinux: glibc 2.17+, x86-64

File details

Hashes for nmodl_nightly-0.6.166-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

    SHA256       0c559222dd1595f48be12823af1ed11810ff3a8dba786ad5700a5a582031d1e4
    MD5          91490714774f91aaaad5b14f7a215c5a
    BLAKE2b-256  5e28f43d1092eedaadc3096763e20ca5c6271f8eb71df75e4b0c58f0f2e1115b

Hashes for nmodl_nightly-0.6.166-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

    SHA256       2b5ce449a234630b8b567e5799217b35fdab07c68dd67e5e8c26ce27989885de
    MD5          545543671a1737b4852f860a6a9a6175
    BLAKE2b-256  bedba5570013859197d947d89a33d746d2960900952155de0a9069abdc8318f6

Hashes for nmodl_nightly-0.6.166-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

    SHA256       d663767a5cb7605a704ac4d0953db5e86b7718cee10fbc9f135c3577e4653c19
    MD5          44b6f946046ebf34dff3ff8fda875f12
    BLAKE2b-256  b84e4449047652e35ef88bd4ae05c0b2797ef8759fe671eb1f9fba582416732b

Hashes for nmodl_nightly-0.6.166-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

    SHA256       07f0f81411e727acc0390c31038c9aef744a59814b89eed5d5c2b97c6b204afe
    MD5          5c1ae5c56152f67ad3458e0c314f5690
    BLAKE2b-256  42bcd97c0aee8da103481ca41683a8db859ad9cbd8fb6d9ec54d5a42900139c3

Hashes for nmodl_nightly-0.6.166-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

    SHA256       a1e14e39fe54e236f38a96edfca1b66aeb10f0ce4ff46e35584d3d38f5c7bb4c
    MD5          2e6142a28268e3de8f6d90633df33723
    BLAKE2b-256  f9dbcd70886c80210343b935a70e175a3f2d8cec31253b9decce783375076295
