
NWB conversion tools

NWB Conversion Tools is a package for creating NWB files by converting and combining neural data in proprietary formats and adding essential metadata.

Under heavy construction. API is changing rapidly.

Features:

  • Command line interface
  • Python API
  • Leverages SpikeExtractors to support conversion from a number of proprietary formats.
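As an illustration of the "essential metadata" the package adds, metadata is typically supplied as a plain nested dictionary. The sketch below shows a conventional layout (the exact key names a given converter accepts, and the `run_conversion` call mentioned in the comment, are assumptions here; consult the package documentation):

```python
from datetime import datetime

# Essential NWB metadata as an ordinary nested dictionary.
# The "NWBFile" and "Subject" sections below follow the common NWB
# metadata layout; adjust the fields to your own experiment.
metadata = {
    "NWBFile": {
        "session_description": "Extracellular recording in mouse V1",
        "identifier": "session-001",
        "session_start_time": datetime(2022, 3, 1, 12, 0).isoformat(),
    },
    "Subject": {
        "subject_id": "mouse-01",
        "species": "Mus musculus",
    },
}

# A converter built with this package would typically accept such a
# dictionary, e.g. converter.run_conversion(metadata=metadata,
# nwbfile_path="session-001.nwb") -- names assumed, see the docs.
print(metadata["NWBFile"]["session_start_time"])  # 2022-03-01T12:00:00
```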

Installation

To install the latest stable release of nwb-conversion-tools through PyPI, type:

pip install nwb-conversion-tools

For more flexibility we recommend installing the latest version directly from GitHub. The following commands create an environment with all the required dependencies and the latest updates:

git clone https://github.com/catalystneuro/nwb-conversion-tools
cd nwb-conversion-tools
conda env create -f make_env.yml
conda activate nwb_conversion_env

Note that this will install the package in editable mode.

Finally, if you prefer to avoid conda altogether, the following commands provide a clean installation within the current environment:

pip install git+https://github.com/catalystneuro/nwb-conversion-tools.git@master

Dependencies

NWB Conversion Tools relies heavily on SpikeExtractors for electrophysiology data and on ROIExtractors for optical physiology (ophys) data.

You can use a graphical interface for your converter with NWB Web GUI.

Catalogue

v0.9.3

Buzsáki Lab: buzsaki-lab-to-nwb

This project is an ongoing effort for the Ripple U19 conversion of extracellular electrophysiology data to NWB format, including final publishing of each dataset on DANDI. Currently spans 7 major publications and over 14 TB of data on the DANDI Archive. Most of the data consists of raw recordings, LFP, spike-sorted units, and behavior, which can consist of a mix of mental state tracking, position tracking through mazes, and trial stimulus events.

Shenoy lab: shenoy-lab-to-nwb

The Shenoy lab is one of the pioneers in developing BCIs for people with paralysis. They are part of the BrainGate team and were the winners of the 2019 BCI award. They use extracellular recordings from Utah arrays and Neuropixels in primates.

v0.9.2

Brody Lab: brody-lab-to-nwb

The Brody lab has a long history with extracellular electrophysiology experiments spanning multiple acquisition systems. This project served two purposes - to allow the conversion of older data from Neuralynx and SpikeGadgets to NWB, and also their newer, larger data using Neuropixels (SpikeGLX). These recordings, some of which exceeded 250 GB (several hours' worth!), were paired with rich trials tables containing categorical events and temporal stimuli.

v0.8.10

Feldman Lab: feldman-lab-to-nwb

The Feldman lab utilizes a Neuropixels (SpikeGLX) system along with multiple sophisticated behavior systems for manipulating whisker stimulation in mice. These give rise to very complex trials tables tracking multiple event times throughout the experiments, including multiple event trains within trials.

v0.8.1

Hussaini Lab: hussaini-lab-to-nwb

v0.7.2

Movshon lab: movshon-lab-to-nwb

v0.7.0

Tank Lab: tank-lab-to-nwb

Neuropixels (SpikeGLX) recordings of subjects navigating a virtual reality! Behavior contains a huge variety of NWB data types, including position and view angle over time, collision detection, and more! Paired with a specific extension for parsing experiment metadata.

Groh lab: mease-lab-to-nwb

Utilizing the CED recording interface, this project paired ecephys channels with optogenetic stimulation via laser pulses, and mechanical pressure stimulation over time - all of which are channels of data extracted from the common .smrx files!

Giocomo lab: giocomo-lab-to-nwb

Other labs that use the NWB standard

For Developers

Running GIN tests locally

nwb-conversion-tools verifies the integrity of all code changes by running a full test suite on short examples of real data from the formats we support. There are two classes of tests in this regard: tests/test_internals does not require any data to be present and represents the 'minimal' expected behavior of our package, whereas tests/test_on_data requires the user both to perform a full install of the dependencies (pip install -r requirements-full.txt) and to download the associated data for each modality.

Install testing dependencies

We provide two easy ways of installing all the dependencies required for testing:

  1. The first is a conda-based solution that creates an environment with all the dependencies already installed.
git clone https://github.com/catalystneuro/nwb-conversion-tools
cd nwb-conversion-tools
conda env create -f make_env_testing.yml
conda activate nwb_conversion_testing_env

Note that this will also install datalad, which is the endorsed way of downloading the testing data, plus pytest and pytest-cov, which are the tools we use in our continuous integration suite.

  2. The same can be accomplished using pip. In a clean environment, run:
git clone https://github.com/catalystneuro/nwb-conversion-tools
cd nwb-conversion-tools
pip install .[test_full]

Notice that this method does not install datalad.

Downloading the data

Datalad (conda install datalad) is the recommended way to download the data. To do this, simply call:

datalad install -rg https://gin.g-node.org/NeuralEnsemble/ephy_testing_data

to install the ecephys data, and

datalad install -rg https://gin.g-node.org/CatalystNeuro/ophys_testing_data

for ophys data.
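Once both datasets are installed, a quick sanity check that they landed under a common parent folder can be done with the standard library. Here `check_gin_data` is a hypothetical helper, not part of the package; the expected folder names come from the repository URLs above:

```python
from pathlib import Path


def check_gin_data(local_path):
    """Report which of the expected GIN dataset folders exist under local_path."""
    expected = ["ephy_testing_data", "ophys_testing_data"]
    return {name: (Path(local_path) / name).is_dir() for name in expected}


# Example: an empty directory contains neither dataset yet.
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    print(check_gin_data(tmp))
```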

Test configuration file

Once the data is downloaded to your system, you must manually modify the config file located at ./tests/test_on_data/gin_test_config.json so its LOCAL_PATH key points to the folder on your system that contains the dataset folder (e.g., ephy_testing_data for testing ecephys). The code will automatically detect that the tests are being run locally, so all you need to do is ensure the path is correct for your specific system.

The output of these tests is, by default, stored in a temporary directory that is cleaned up after the tests finish running. To examine these files for quality-assessment purposes, set the flag SAVE_OUTPUTS=true in the same gin_test_config.json file and, if necessary, modify the OUTPUT_PATH variable in the respective test.
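Putting the two settings together, a local gin_test_config.json might look like the following (the path value is a placeholder, and any additional keys present in the real file should be left untouched):

```json
{
  "LOCAL_PATH": "/home/user/gin_data",
  "SAVE_OUTPUTS": true
}
```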

Rebuilding on Read the Docs

As a maintainer, once the changes to the documentation are on the master branch, go to https://readthedocs.org/projects/nwb-conversion-tools/ and click "Build version". Check the console output and its log for any errors.
