DataLad extension for semantic metadata handling
Overview
This software is a DataLad extension that equips DataLad with an alternative command suite for metadata handling (extraction, aggregation, filtering, and reporting).
Please note that the metadata storage format introduced in release 0.3.0 is incompatible with the metadata storage format of previous versions (0.2.x) and of DataLad proper. Both can happily coexist on storage, but this version of metalad will not be able to read metadata that was stored by the previous version, and vice versa. Eventually there will be an importer that pulls old-version metadata into the new metadata storage; it is planned for release 0.3.1.
Here is an overview of the changes in 0.3.0 (the new system is quite different from the previous release in a few ways):
- Leaner commands with Unix-style behavior, i.e. one command for one operation, and commands are chainable (the results of one command can be used as input for another, e.g. meta-extract | meta-add).
- Metadata modifications do not alter the state of the DataLad dataset. In previous releases, changes to metadata altered the version (commit hash) of the repository even though the primary data did not change. This is not the case in the new system. The new system does provide information about the primary data version, i.e. the commit hash, from which the individual metadata elements were created.
- The ability to support a wide range of metadata storage backends in the future. This is facilitated by the datalad-metadata-model package, which is developed alongside metalad and separates the logical metadata model used in metalad from the storage backends by abstracting the storage backend. Currently, git-repository storage is supported.
- The ability to transport metadata independently of the data in the dataset. The new system introduces the concept of a metadata store, which is usually the git repository of the DataLad dataset that the metadata describes. This is not a mandatory configuration, however; metadata can be stored in almost any git repository.
- The ability to report a subset of metadata from a remote metadata store without downloading the complete remote metadata. In fact, only the minimal necessary information is transferred from the remote metadata store. This ability is available to all metadata-based operations, for example, also to filtering.
- A new, simplified extractor model that distinguishes between two extractor types: dataset-level extractors and file-level extractors. The former are executed with a view of a dataset; the latter are executed with specific information about a single file path in the dataset. The previous extractors (datalad, and datalad-metalad <= 0.2.1) are still supported.
- A built-in pipeline mechanism that allows parallel execution of metadata operations such as metadata extraction and metadata filtering (still at an early stage).
- A new set of commands for operations that map metadata to metadata. Those operations are called filtering and are implemented by MetadataFilter classes. Filters are loaded dynamically, and custom filters are supported, much like extractors (still at an early stage).
- Backward compatibility via an import from previous metadata storage (planned for 0.3.1).
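Since commands exchange metadata as JSON records (one per line), it helps to picture what such a record might look like. The sketch below is illustrative only: the field names are assumptions for demonstration and not the authoritative metalad schema.

```python
import json

# Illustrative sketch of a dataset-level metadata record.
# Field names are ASSUMPTIONS, not metalad's authoritative schema.
record = {
    "type": "dataset",                       # dataset-level (vs. "file") record
    "extractor_name": "metalad_example_dataset",
    "extractor_version": "0.0.1",
    "extraction_parameter": {},
    "agent_name": "Jane Doe",
    "agent_email": "jane.doe@example.com",
    "dataset_id": "00000000-0000-0000-0000-000000000000",
    "dataset_version": "0123456789abcdef0123456789abcdef01234567",
    "extracted_metadata": {"description": "an example dataset"},
}

# One record per line ("JSON lines") is the shape that chained commands,
# e.g. meta-extract | meta-add, would pass between each other.
json_line = json.dumps(record)
print(json_line)
```

Note how the record carries the primary data version (`dataset_version`) with it, which is what lets metadata changes avoid touching the dataset's own commit history.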
Commands currently provided by this extension
- meta-extract -- run an extractor on a file or dataset and emit the resulting metadata (on stdout).
- meta-filter -- run a filter over existing metadata and emit the resulting metadata (on stdout).
- meta-add -- add a metadata record, or a list of metadata records (possibly received on stdin), to a metadata store, usually the git repository of the dataset.
- meta-aggregate -- aggregate metadata from multiple local or remote metadata stores into a local metadata store.
- meta-dump -- report metadata from local or remote metadata stores. Metadata can be selected by file- or dataset-path matching patterns, including dataset versions and dataset IDs.
- meta-conduct -- execute processing pipelines that consist of a provider, which emits objects that should be processed (e.g. files or metadata), and a pipeline of processors, which perform operations on the provided objects, such as metadata extraction and metadata adding. Processors are usually executed in parallel. A few pipeline definitions are provided with the release.
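The provider/processor idea behind meta-conduct can be sketched in a few lines. This is NOT the metalad API; the names and structure below are illustrative assumptions meant only to show the pattern of a provider feeding parallel processors.

```python
# Conceptual sketch of a provider/processor pipeline, as used by meta-conduct.
# NOT the metalad API: provider(), extract(), and run_pipeline() are invented
# here purely to illustrate the pattern.
from concurrent.futures import ThreadPoolExecutor

def provider(paths):
    """Emit the objects that should be processed, e.g. file paths."""
    yield from paths

def extract(path):
    """Stand-in for a processor such as metadata extraction."""
    return {"path": path, "metadata": {"size_hint": len(path)}}

def run_pipeline(paths, max_workers=4):
    # Processors run in parallel over whatever the provider emits.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(extract, provider(paths)))

results = run_pipeline(["a.txt", "sub/b.nii.gz"])
```

A real pipeline would chain several processors (extract, then add) per object; the parallelism lives at the level of independent objects, which is why extraction over many files scales well.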
Commands currently under development:
- meta-export -- write a flat representation of metadata to a file system. For now you can export your metadata to a JSON-lines file named metadata-dump.jsonl like this: datalad meta-dump -d <dataset-path> -r > metadata-dump.jsonl
- meta-import -- import a flat representation of metadata from a file system. For now you can import metadata from a JSON-lines file, e.g. metadata-dump.jsonl, like this: datalad meta-add -d <dataset-path> --json-lines -i metadata-dump.jsonl
- meta-ingest-previous -- ingest metadata from metalad <= 0.2.1.
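A flat JSON-lines dump is also convenient for ad-hoc post-processing outside of datalad. The sketch below round-trips a dump file and selects records for a single dataset; the `dataset_id` field name is an assumption for illustration, not a guaranteed part of the dump format.

```python
import json
import os
import tempfile

# Sketch of post-processing an exported JSON-lines metadata dump.
# The "dataset_id" field name is an ASSUMPTION for illustration.
records = [
    {"dataset_id": "id-1", "extracted_metadata": {"k": 1}},
    {"dataset_id": "id-2", "extracted_metadata": {"k": 2}},
]

path = os.path.join(tempfile.mkdtemp(), "metadata-dump.jsonl")
with open(path, "w") as f:
    for rec in records:
        f.write(json.dumps(rec) + "\n")   # one record per line

# Read the dump back and keep only the records of one dataset.
selected = []
with open(path) as f:
    for line in f:
        rec = json.loads(line)
        if rec["dataset_id"] == "id-1":
            selected.append(rec)
```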
A word of caution: documentation is still lacking and will be addressed with release 0.3.1.
Additional metadata extractor implementations
- Compatibility with the previous families of extractors provided by datalad and by metalad, i.e. metalad_core, metalad_annex, metalad_custom, and metalad_runprov.
- A new metadata-extractor paradigm that distinguishes between file- and dataset-level extractors. Included are two example extractors, metalad_example_dataset and metalad_example_file.
- metalad_external_dataset and metalad_external_file -- a dataset-level and a file-level extractor that execute external processes to generate metadata, allowing externally created metadata to be processed in datalad.
- metalad_studyminimeta -- a dataset-level extractor that reads studyminimeta YAML files and produces metadata that contains a JSON-LD-compatible description of the data in the input file.
Indexers
- Indexers for the new datalad indexer-plugin interface. These indexers convert metadata in proprietary formats into a set of key-value pairs that can be used by datalad search to search for content.
- indexer_studyminimeta -- converts a studyminimeta JSON-LD description into key-value pairs for datalad search.
- indexer_jsonld -- a generic JSON-LD indexer that aims at converting any JSON-LD description into a set of key-value pairs that reflect the content of the JSON-LD description.
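The core transformation an indexer performs is flattening a nested structure into search-friendly key-value pairs. The sketch below shows the idea with dotted keys; it is illustrative only and not metalad's actual indexer code, whose key-naming scheme may differ.

```python
# Sketch of the kind of transformation a generic indexer performs:
# flatten a nested (JSON-LD-like) structure into dotted key-value pairs
# that a search backend can index. Illustrative only; NOT metalad's
# actual indexer implementation or key-naming scheme.
def flatten(obj, prefix=""):
    pairs = {}
    if isinstance(obj, dict):
        for key, value in obj.items():
            pairs.update(flatten(value, f"{prefix}{key}."))
    elif isinstance(obj, list):
        for i, value in enumerate(obj):
            pairs.update(flatten(value, f"{prefix}{i}."))
    else:
        pairs[prefix.rstrip(".")] = obj   # leaf value: record the full path
    return pairs

doc = {"@type": "Dataset", "author": [{"name": "Jane"}]}
kv = flatten(doc)
# kv == {"@type": "Dataset", "author.0.name": "Jane"}
```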
Installation
Before you install this package, please make sure that you install a recent version of git-annex. Afterwards, install the latest version of datalad-metalad from PyPI. It is recommended to use a dedicated virtualenv:
# create and enter a new virtual environment (strongly recommended)
virtualenv --system-site-packages --python=python3 ~/env/datalad
. ~/env/datalad/bin/activate
# install from PyPI
pip install datalad-metalad
Support
For general information on how to use or contribute to DataLad (and this extension), please see the DataLad website or the main GitHub project page. The documentation is found here: http://docs.datalad.org/projects/metalad
All bugs, concerns and enhancement requests for this software can be submitted here: https://github.com/datalad/datalad-metalad/issues
If you have a problem or would like to ask a question about how to use DataLad, please submit a question to NeuroStars.org with a datalad tag. NeuroStars.org is a platform similar to StackOverflow but dedicated to neuroinformatics.
All previous DataLad questions are available here: http://neurostars.org/tags/datalad/
Acknowledgements
This DataLad extension was developed with support from the German Federal Ministry of Education and Research (BMBF 01GQ1905), and the US National Science Foundation (NSF 1912266).
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file datalad_metalad-0.3.1.tar.gz.
File metadata
- Download URL: datalad_metalad-0.3.1.tar.gz
- Upload date:
- Size: 124.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.0 CPython/3.9.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | 2cb9a4e1ec2bb707afdcb8b44b82d3811f2633fab70884ba17a2e3cd9ff11527
MD5 | 937000cb6afe2b9437ca45af9bf06437
BLAKE2b-256 | 5573e034ff8d50a97827710f55587768c9d3a3cda4ad2d848e1d2e87842e54af
File details
Details for the file datalad_metalad-0.3.1-py3-none-any.whl.
File metadata
- Download URL: datalad_metalad-0.3.1-py3-none-any.whl
- Upload date:
- Size: 139.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.0 CPython/3.9.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | abe7628862dea54b0424ed071e7ba7a4d471e01fb6762612dfb883090631231e
MD5 | 422a337805b5506817b016f505de5949
BLAKE2b-256 | 8fd0e0d4cb876dcee0d58e0eb16efa3bf4ed5f8f648fb3d6b59a4a3e3e6a2084