NWB Conversion Tools
Convert data from proprietary formats to NWB format.
NWB Conversion Tools is a package for creating NWB files by converting and combining neural data in proprietary formats and adding essential metadata.
Under heavy construction. API is changing rapidly.
Features:
- Command line interface
- Python API
- Leverages SpikeExtractors to support conversion from a number of proprietary formats.
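As a rough illustration of the design, the package combines per-format "data interfaces" under a single converter that merges their metadata and writes one NWB file. The standalone sketch below only mimics that pattern with hypothetical class names; it is not the real nwb-conversion-tools API.

```python
# Illustrative sketch of the converter/interface pattern (hypothetical names):
# each interface wraps one source format and contributes its metadata,
# and a converter merges them into a single metadata dictionary.

class DataInterface:
    """Wraps one source format and contributes its metadata."""
    def __init__(self, source_path):
        self.source_path = source_path

    def get_metadata(self):
        return {}


class RecordingInterface(DataInterface):
    """Stands in for an extracellular recording format."""
    def get_metadata(self):
        return {"Ecephys": {"source": self.source_path}}


class BehaviorInterface(DataInterface):
    """Stands in for a behavioral data format."""
    def get_metadata(self):
        return {"Behavior": {"source": self.source_path}}


class Converter:
    """Combines interfaces and merges their metadata dictionaries."""
    def __init__(self, interfaces):
        self.interfaces = interfaces

    def get_metadata(self):
        metadata = {"NWBFile": {"session_description": ""}}
        for interface in self.interfaces:
            metadata.update(interface.get_metadata())
        return metadata


converter = Converter([
    RecordingInterface("recording.dat"),
    BehaviorInterface("behavior.csv"),
])
print(sorted(converter.get_metadata()))  # ['Behavior', 'Ecephys', 'NWBFile']
```

In the real package, the converter additionally validates the merged metadata and handles the write to disk; the sketch captures only the composition pattern.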
Installation
To install nwb-conversion-tools directly in an existing environment:
$ pip install nwb-conversion-tools
Alternatively, to clone the repository and set up a conda environment, do:
$ git clone https://github.com/catalystneuro/nwb-conversion-tools
$ cd nwb-conversion-tools
$ conda env create -f make_env.yml
$ conda activate nwb_conversion_env
$ pip install .
Dependencies
NWB Conversion Tools relies heavily on SpikeExtractors for electrophysiology data and on ROIExtractors for optical physiology data.
You can use a graphical interface for your converter with NWB Web GUI.
Catalogue
v0.9.3
Buzsáki Lab: buzsaki-lab-to-nwb
This project is an ongoing effort for the Ripple U19 conversion of extracellular electrophysiology data to NWB format, including final publishing of each dataset on DANDI. It currently spans 7 major publications and over 14 TB of data on the DANDI Archive. Most of the data consists of raw recordings, LFP, spike-sorted units, and behavior, which can consist of a mix of mental state tracking, position tracking through mazes, and trial stimulus events.
Shenoy Lab: shenoy-lab-to-nwb
v0.9.2
Brody Lab: brody-lab-to-nwb
The Brody lab has a long history with extracellular electrophysiology experiments spanning multiple acquisition systems. This project served two purposes: to allow the conversion of older data from Neuralynx and SpikeGadgets to NWB, and also their newer, larger data using Neuropixels (SpikeGLX). These recordings, some of which exceeded 250 GB (several hours' worth!), were paired with rich trials tables containing categorical events and temporal stimuli.
v0.8.10
Feldman Lab: feldman-lab-to-nwb
The Feldman lab utilizes a Neuropixels (SpikeGLX) system along with multiple sophisticated behavior systems for manipulating whisker stimulation in mice. These give rise to very complex trials tables tracking multiple event times throughout the experiments, including multiple event trains within trials.
v0.8.1
Hussaini Lab: hussaini-lab-to-nwb
v0.7.2
Movshon Lab: movshon-lab-to-nwb
v0.7.0
Tank Lab: tank-lab-to-nwb
Neuropixels (SpikeGLX) recordings of subjects navigating a virtual reality! Behavior contains a huge variety of NWB data types, including positional and view angle over time, collision detection, and more! Paired with a specific extension for parsing experiment metadata.
Groh Lab: mease-lab-to-nwb
Utilizing the CED recording interface, this project paired ecephys channels with optogenetic stimulation via laser pulses and mechanical pressure stimulation over time, all of which are channels of data extracted from the common .smrx files!
Giocomo Lab: giocomo-lab-to-nwb
Other labs that use NWB standard
- Axel lab: axel-lab-to-nwb
- Brunton lab: brunton-lab-to-nwb
- Buffalo lab: buffalo-lab-data-to-nwb
- Jaeger lab: jaeger-lab-to-nwb
- Tolias lab: tolias-lab-to-nwb
For Developers
Running GIN tests locally
nwb-conversion-tools verifies the integrity of all code changes by running a full test suite on short examples of real data from the formats we support. There are two classes of tests in this regard: tests/test_internals does not require any data to be present and represents the 'minimal' expected behavior for our package, whereas tests/test_on_data requires the user to both perform a full install of dependencies (pip install -r requirements-full.txt) and download the associated data for each modality. Datalad (conda install datalad) is recommended for this; simply call

$ datalad install -rg https://gin.g-node.org/NeuralEnsemble/ephy_testing_data

to install the ecephys data, and

$ datalad install -rg https://gin.g-node.org/CatalystNeuro/ophys_testing_data

for ophys data.
Once the data is downloaded to your system, you must manually modify the test_gin_{modality}.py files (line #43 for ecephys and line #34 for ophys) to point to the correct folder on your system that contains the dataset folder (e.g., ephy_testing_data for testing ecephys). The code will automatically detect that the tests are being run locally, so all you need to do is ensure the path is correct for your specific system.
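The local-data check amounts to a pattern like the following sketch; the variable names and the example path here are illustrative, not the actual ones in the test files.

```python
from pathlib import Path

# Hypothetical location of the cloned GIN dataset on your machine.
LOCAL_PATH = Path.home() / "gin_data"
DATA_PATH = LOCAL_PATH / "ephy_testing_data"

# Data-dependent tests should be skipped when the dataset is absent,
# so the suite still passes on machines without the GIN download.
HAVE_DATA = DATA_PATH.exists()
print("run data tests:", HAVE_DATA)
```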
The output of these tests is, by default, stored in a temporary directory that is then cleaned up after the tests finish running. To examine these files for quality assessment purposes, set the flag SAVE_OUTPUTS=True and modify the local OUTPUT_PATH if necessary.
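The temporary-output behavior described above follows the standard-library pattern sketched below. The SAVE_OUTPUTS and OUTPUT_PATH names mirror the flag and variable mentioned in this section; the rest of the snippet is an illustrative sketch, not the package's actual test code.

```python
import tempfile
from pathlib import Path

SAVE_OUTPUTS = False            # set True to keep files for inspection
OUTPUT_PATH = Path("test_outputs")  # used only when SAVE_OUTPUTS is True

if SAVE_OUTPUTS:
    OUTPUT_PATH.mkdir(exist_ok=True)
    out_dir = OUTPUT_PATH
else:
    # TemporaryDirectory is removed automatically on cleanup()/interpreter exit,
    # which is why the outputs vanish after the tests finish.
    tmp = tempfile.TemporaryDirectory()
    out_dir = Path(tmp.name)

nwbfile_path = out_dir / "converted.nwb"
print(nwbfile_path.name)  # converted.nwb
```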
Rebuilding on Read the Docs
As a maintainer, once the changes to the documentation are on the master branch, go to https://readthedocs.org/projects/nwb-conversion-tools/ and click "Build version". Check the console output and its log for any errors.