
Package that creates resource payloads from atlas datasets and pushes them, along with the corresponding dataset files, into Nexus.

Project description

Description

This module contains command-line executables (CLIs) that take atlas pipeline datasets as input and push them into Nexus along with a resource properties payload. It includes:

  • push_volumetric to create a VolumetricDataLayer resource payload and push it along with the corresponding volumetric input dataset files into Nexus.

  • push_meshes to create a Mesh resource payload and push it along with the corresponding brain region .obj mesh folder input datasets into Nexus.

  • push_cellrecords to create a CellRecordSeries resource payload and push it along with the corresponding Sonata hdf5 input dataset files into Nexus.

  • push_regionsummary to create a RegionSummary resource payload and push it along with the corresponding brain region metadata JSON input dataset files into Nexus.

If the Resource already exists in Nexus, it is updated instead. When applicable, the linked atlasRelease and ontology resources are also pushed or updated. All these resources are tagged with the input tag or, if no tag is provided, with a timestamp. Each CLI can process multiple files/directories at once.

The input datasets must be among the datasets listed in the configuration file given as the argument of the --config option. These datasets have to correspond to the ones generated by the atlas pipeline (they are referenced on the Pipeline Products page). The configuration YAML file should be structured like this example:

  HierarchyJson:
      hierarchy: hierarchy.json
      hierarchy_l23split: hierarchy_l23split.json
  GeneratedDatasetPath:
      VolumetricFile:
          annotation_hybrid: annotation_v2v3_hybrid.nrrd
          annotation_l23split: annotation_l23split.nrrd
          cell_densities: cell_densities
          neuron_densities: neuron_densities
      MeshFile:
          brain_region_meshes_hybrid: brain_region_meshes_v2v3_hybrid
          brain_region_meshes_l23split: brain_region_meshes_l23split
      CellRecordsFile:
          cell_records_sonata: cell_records_sonata.h5
  MetadataFile:
      metadata_parcellations_ccfv3_l23split: metadata_parcellations_ccfv3_l23split.json
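The example above can be saved to a file and passed to the pushers via the --config option. A minimal sketch (the file name example_config.yaml is arbitrary):

```shell
# Write the example dataset configuration to a file usable with --config
cat > example_config.yaml << 'EOF'
HierarchyJson:
    hierarchy: hierarchy.json
    hierarchy_l23split: hierarchy_l23split.json
GeneratedDatasetPath:
    VolumetricFile:
        annotation_hybrid: annotation_v2v3_hybrid.nrrd
        annotation_l23split: annotation_l23split.nrrd
        cell_densities: cell_densities
        neuron_densities: neuron_densities
    MeshFile:
        brain_region_meshes_hybrid: brain_region_meshes_v2v3_hybrid
        brain_region_meshes_l23split: brain_region_meshes_l23split
    CellRecordsFile:
        cell_records_sonata: cell_records_sonata.h5
MetadataFile:
    metadata_parcellations_ccfv3_l23split: metadata_parcellations_ccfv3_l23split.json
EOF
# Sanity-check that the dataset keys the pushers look up are present
grep -q "annotation_l23split" example_config.yaml && echo "config OK"
```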

The Resource property payload includes, in addition to various information on the pushed dataset and its content, some provenance information:

  • The contributor: the @id of the user and organisation associated with the input Nexus token.
  • The datasets it derives from.
  • Information on the run, stored in the linked Activity resource: the run duration, the software used, etc.

These four CLIs are grouped under the top-level CLI initialize-pusher-cli, which initialises the Nexus Forge Python framework used to communicate with Nexus. The Forge builds the metadata payload and pushes it into Nexus along with the input dataset. This means that initialize-pusher-cli must be called before any of the four CLIs.

Note: the --verbose argument prints in the console the last resource payload from the list of resource payloads constructed from the input datasets that will be pushed into Nexus. If only one dataset is given as input, its corresponding resource payload is printed.

Install

pip install "bba-data-push"

Examples

Arguments for initialize-pusher-cli

Run the dataset pusher CLI, starting with the initialisation of the Forge Python framework used to communicate with Nexus.

Inputs

--verbose, -v : Verbosity option. If set, the last resource payload from the list of resource payloads constructed from the input datasets is printed. (Optional: boolean).
--forge-config-file : Path to the configuration file used to instantiate the Forge. (Optional, default = "https://raw.githubusercontent.com/BlueBrain/nexus-forge/master/examples/notebooks/use-cases/prod-forge-nexus.yml").
--nexus-env : Nexus environment to use; can be 'dev', 'staging', 'prod' or the URL of a custom environment. (Optional, default='prod').
--nexus-org : The Nexus organisation to push into. (Optional, default='bbp').
--nexus-proj : The Nexus project to push into. (Optional, default='atlas').
--nexus-token-file : Path to the text file containing the Nexus token.
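Assuming the package is installed and a push sub-command is chained after the group command, an invocation could look like the sketch below. The file names are placeholders, and the exact spelling of the project flag and of the sub-command names (hyphenated vs. underscored) is an assumption; check initialize-pusher-cli --help for the authoritative names.

```shell
initialize-pusher-cli \
    --verbose \
    --forge-config-file ./forge-config.yml \
    --nexus-env prod \
    --nexus-org bbp \
    --nexus-proj atlas \
    --nexus-token-file ./token.txt \
    <push-sub-command> [sub-command options]
```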

Arguments for push_volumetric

Create a 'VolumetricDataLayer', an 'atlasRelease' and an 'ontology' resource payload to push into Nexus. If the resources already exist in Nexus, they will be fetched and updated instead. This script has been designed to function with volumetric files generated by the Atlas pipeline.

Inputs

--dataset-path : [multiple paths] The files or directories of files to push to Nexus.
--config : [path] Path to the generated dataset configuration file. This is a YAML file containing the paths to the datasets generated by the Atlas pipeline.
--hierarchy-path : [multiple paths] Path to the JSON hierarchy file containing an AIBS hierarchy structure.
--hierarchy-jsonld-path : [path] Path to the AIBS hierarchy structure as a JSON-LD file. It is mandatory when a new Ontology resource is created, as the file will be attached to the resource when it is integrated into the knowledge graph. A new Ontology resource is created whenever a new atlasRelease resource needs to be created.
--atlasrelease-config-path : [path] Path to the JSON file containing the atlasRelease @id as well as its ontology and parcellation volume @id. It must contain at least this information for the Allen Mouse CCFv2 and CCFv3 atlasReleases stored in the Nexus project bbp/atlas.
--provenance-metadata-path : [path] Path to the JSON file containing metadata for the derivation properties as well as for the Activity and SoftwareAgent resources.
--resource-tag : [string] Tag value with which to tag the resources. (Optional).
--link-regions-path : [path] JSON file containing the links between regions and resources (the @ids of the mask and mesh of each brain region), to be extracted by the CLI push-regionsummary. If the file already exists it will be updated, otherwise it will be created. (Optional).
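Putting the arguments above together, a push_volumetric call might look like the sketch below. All file names and the tag value are illustrative placeholders, and the hyphenated sub-command spelling is an assumption:

```shell
initialize-pusher-cli \
    --nexus-token-file ./token.txt \
    push-volumetric \
    --dataset-path ./annotation_l23split.nrrd \
    --dataset-path ./cell_densities \
    --config ./config.yaml \
    --hierarchy-path ./hierarchy_l23split.json \
    --atlasrelease-config-path ./atlasrelease_config.json \
    --provenance-metadata-path ./provenance_metadata.json \
    --resource-tag "v1.0.0"
```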

Arguments for push_meshes

Create a 'Mesh', an 'atlasRelease' and an 'ontology' resource payload to push into Nexus. If the resources already exist in Nexus, they will be fetched and updated instead. This script has been designed to function with brain region meshes generated by the Atlas pipeline.

Inputs

--dataset-path : [multiple paths] The files or directories of files to push to Nexus.
--config : [path] Path to the generated dataset configuration file. This is a YAML file containing the paths to the datasets generated by the Atlas pipeline.
--hierarchy-path : [multiple paths] Path to the JSON hierarchy file containing an AIBS hierarchy structure.
--hierarchy-jsonld-path : [path] Path to the AIBS hierarchy structure as a JSON-LD file. It is mandatory when a new Ontology resource is created, as the file will be attached to the resource when it is integrated into the knowledge graph. A new Ontology resource is created whenever a new atlasRelease resource needs to be created.
--atlasrelease-config-path : [path] Path to the JSON file containing the atlasRelease @id as well as its ontology and parcellation volume @id. It must contain at least this information for the Allen Mouse CCFv2 and CCFv3 atlasReleases stored in the Nexus project bbp/atlas.
--provenance-metadata-path : [path] Path to the JSON file containing metadata for the derivation properties as well as for the Activity and SoftwareAgent resources.
--resource-tag : [string] Tag value with which to tag the resources. (Optional).
--link-regions-path : [path] JSON file containing the links between regions and resources (the @ids of the mask and mesh of each brain region), to be extracted by the CLI push-regionsummary. If the file already exists it will be updated, otherwise it will be created. (Optional).
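A push_meshes call might look like the sketch below; passing --link-regions-path makes the pusher record the mesh @ids per brain region in that file so that push_regionsummary can consume them later. File names and the hyphenated sub-command spelling are illustrative assumptions:

```shell
initialize-pusher-cli \
    --nexus-token-file ./token.txt \
    push-meshes \
    --dataset-path ./brain_region_meshes_l23split \
    --config ./config.yaml \
    --hierarchy-path ./hierarchy_l23split.json \
    --atlasrelease-config-path ./atlasrelease_config.json \
    --link-regions-path ./link_regions.json
```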

Arguments for push_regionsummary

Create a 'RegionSummary', an 'atlasRelease' and an 'ontology' resource payload to push into Nexus. If the resources already exist in Nexus, they will be fetched and updated instead. This script has been designed to function with metadata json files generated by the Atlas pipeline.

Inputs

--dataset-path : [multiple paths] The files or directories of files to push to Nexus.
--config : [path] Path to the generated dataset configuration file. This is a YAML file containing the paths to the datasets generated by the Atlas pipeline.
--hierarchy-path : [multiple paths] Path to the JSON hierarchy file containing an AIBS hierarchy structure.
--hierarchy-jsonld-path : [path] Path to the AIBS hierarchy structure as a JSON-LD file. It is mandatory when a new Ontology resource is created, as the file will be attached to the resource when it is integrated into the knowledge graph. A new Ontology resource is created whenever a new atlasRelease resource needs to be created.
--atlasrelease-config-path : [path] Path to the JSON file containing the atlasRelease @id as well as its ontology and parcellation volume @id. It must contain at least this information for the Allen Mouse CCFv2 and CCFv3 atlasReleases stored in the Nexus project bbp/atlas.
--provenance-metadata-path : [path] Path to the JSON file containing metadata for the derivation properties as well as for the Activity and SoftwareAgent resources.
--resource-tag : [string] Tag value with which to tag the resources. (Optional).
--link-regions-path : [path] JSON file containing the links between regions and resources (the @ids of the mask and mesh of each brain region), to be extracted by the CLI push-regionsummary. If the file already exists it will be updated, otherwise it will be created. (Optional).
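A push_regionsummary call might look like the sketch below, reading back the region/resource links written by an earlier mesh or volumetric push. File names and the hyphenated sub-command spelling are illustrative assumptions:

```shell
initialize-pusher-cli \
    --nexus-token-file ./token.txt \
    push-regionsummary \
    --dataset-path ./metadata_parcellations_ccfv3_l23split.json \
    --config ./config.yaml \
    --hierarchy-path ./hierarchy_l23split.json \
    --link-regions-path ./link_regions.json
```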

Arguments for push_cellrecords

Create a 'CellRecordSeries', an 'atlasRelease' and an 'ontology' resource payload to push into Nexus. If the resources already exist in Nexus, they will be fetched and updated instead. This script has been designed to function with Sonata h5 files, generated by the Atlas pipeline, that store 3D brain cell positions and orientations.

Inputs

--dataset-path : [multiple paths] The files or directories of files to push to Nexus.
--config : [path] Path to the generated dataset configuration file. This is a YAML file containing the paths to the datasets generated by the Atlas pipeline.
--hierarchy-path : [multiple paths] Path to the JSON hierarchy file containing an AIBS hierarchy structure.
--hierarchy-jsonld-path : [path] Path to the AIBS hierarchy structure as a JSON-LD file. It is mandatory when a new Ontology resource is created, as the file will be attached to the resource when it is integrated into the knowledge graph. A new Ontology resource is created whenever a new atlasRelease resource needs to be created.
--atlasrelease-config-path : [path] Path to the JSON file containing the atlasRelease @id as well as its ontology and parcellation volume @id. It must contain at least this information for the Allen Mouse CCFv2 and CCFv3 atlasReleases stored in the Nexus project bbp/atlas.
--provenance-metadata-path : [path] Path to the JSON file containing metadata for the derivation properties as well as for the Activity and SoftwareAgent resources.
--resource-tag : [string] Tag value with which to tag the resources. (Optional).
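A push_cellrecords call might look like the sketch below. File names, the tag value and the hyphenated sub-command spelling are illustrative assumptions:

```shell
initialize-pusher-cli \
    --nexus-token-file ./token.txt \
    push-cellrecords \
    --dataset-path ./cell_records_sonata.h5 \
    --config ./config.yaml \
    --hierarchy-path ./hierarchy_l23split.json \
    --resource-tag "v1.0.0"
```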

Acknowledgements

The development of this software was supported by funding to the Blue Brain Project, a research center of the École polytechnique fédérale de Lausanne (EPFL), from the Swiss government’s ETH Board of the Swiss Federal Institutes of Technology.

For license and authors, see LICENSE.txt and AUTHORS.txt respectively.

Copyright (c) 2020-2024 Blue Brain Project/EPFL

Download files

Source distribution: blue_brain_data_push-5.0.0.tar.gz (35.7 MB), uploaded via twine/5.1.1 on CPython/3.12.7 (Trusted Publishing: no).

Hashes for blue_brain_data_push-5.0.0.tar.gz:
  • SHA256: 9790261108b7011c4d22aebb4863fbc0f5a5a4606d286c821495c923c0bba61d
  • MD5: 7e4fb51c647afd6a0f643df4e0dff750
  • BLAKE2b-256: ef4bd9bd79a226dae5875e9c0c0e452850d227f6fc2c1479b2ad16b4a5893ecb

Built distribution: blue_brain_data_push-5.0.0-py3-none-any.whl (35.9 MB), Python 3.

Hashes for blue_brain_data_push-5.0.0-py3-none-any.whl:
  • SHA256: e3387b0a495999ab9d2ba7aaa99ab27185d9a045deb43a3d05a92b4a14b27ad3
  • MD5: 9712dbcfa5d06fb238694adbc05e673f
  • BLAKE2b-256: 23d1ca74a1db59fe5ffcaf4448024019470ab38c0f0ce84c5b1f3ff8ffd1419c
