GA4GH Data Object Service Schemas
Project description
Schemas for the Data Object Service (DOS) API
View the schemas in Swagger UI
The Global Alliance for Genomics and Health is an international coalition formed to enable the sharing of genomic and clinical data. Collaboration takes place primarily via GitHub and public meetings. Join the issues today to help us build a cloud-agnostic Data Object Service!
Cloud Workstream
The Data Working Group concentrates on data representation, storage, and analysis, including working with platform-development partners and industry leaders to develop standards that facilitate interoperability. The Cloud Workstream is an informal, multi-vendor working group focused on standards for exchanging Docker-based tools and CWL/WDL workflows, executing those tools and workflows on clouds, and providing abstract access to cloud object stores.
What is DOS?
Currently, this is the home of the Data Object Service (DOS) API proposal. This repo has a CWL-based build process ready to go and a place for us to collectively work on USECASES.md.
This proposal for a DOS release is based on the schema work of Brian W. and others from OHSU, along with work by UCSC. It is also informed by existing object storage systems such as:
GNOS: http://annaisystems.com/ (as used by PCAWG, see https://pcawg.icgc.org)
ICGC Storage: as used to store data on S3, see https://github.com/icgc-dcc/dcc-storage and https://dcc.icgc.org/icgc-in-the-cloud/aws
HCA Storage: see https://dss.staging.data.humancellatlas.org/ and https://github.com/HumanCellAtlas/data-store
the GDC Storage: see https://gdc.cancer.gov
Keep by Curoverse: see https://arvados.org/ and https://github.com/curoverse/arvados
The goal of DOS is to create a generic API on top of these and other projects, so workflow systems can access data in the same way regardless of project. One section of the API focuses on how to read and write data objects to cloud environments and how to join them together as data bundles (Data object management). Another focuses on the ability to find data objects across cloud environments and implementations of DOS (Data object queries). The latter is likely to be worked on in conjunction with the GA4GH Discovery Workstream.
Key features of the current API proposal:
Data object management
This section of the API focuses on how to read and write data objects to cloud environments and how to join them together as data bundles. Data bundles are simply a flat collection of one or more files. This section of the API enables:
create/update/delete a file
create/update/delete a data bundle
register UUIDs with these entities (and optionally track versions of each)
generate signed URLs and/or cloud-specific object storage paths and temporary credentials
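To make the bullets above concrete, here is a minimal sketch of what a DOS-style data-object record might look like. The field names (`id`, `checksums`, `urls`, and so on) are illustrative assumptions, not the canonical schema; consult the actual DOS schemas in Swagger UI for the authoritative structure.

```python
import json
import uuid

def make_data_object(name, size, sha256, url):
    """Build a hypothetical DOS-style data-object record.

    Field names here are illustrative; the real schemas define
    the canonical structure.
    """
    return {
        "id": str(uuid.uuid4()),          # registered UUID
        "name": name,
        "size": size,
        "version": "1",                   # optional version tracking
        "checksums": [{"checksum": sha256, "type": "sha256"}],
        "urls": [{"url": url}],           # e.g. a signed URL or cloud path
    }

obj = make_data_object(
    "reads.bam",
    1024,
    "deadbeef" * 8,
    "s3://example-bucket/reads.bam",
)
print(json.dumps(obj, indent=2))
```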
Data object queries
A key feature of this API, beyond creating/modifying/deleting files, is the ability to find data objects across cloud environments and implementations of DOS. This section of the API allows users to query by data bundle or file UUIDs, returning information about where those data objects are available. The response will typically be used to find the same file or data bundle located across multiple cloud environments.
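The cross-environment lookup can be sketched as follows. The in-memory `records` list is a hypothetical stand-in for responses collected from several DOS implementations; a real deployment would query each service over HTTP.

```python
def find_replicas(records, data_object_id):
    """Return every location where the given data object is available.

    `records` is an illustrative in-memory stand-in for records
    gathered from multiple DOS implementations.
    """
    return [
        (rec["environment"], url["url"])
        for rec in records
        if rec["id"] == data_object_id
        for url in rec["urls"]
    ]

records = [
    {"id": "abc-123", "environment": "aws",
     "urls": [{"url": "s3://bucket/f.bam"}]},
    {"id": "abc-123", "environment": "gcp",
     "urls": [{"url": "gs://bucket/f.bam"}]},
    {"id": "zzz-999", "environment": "aws",
     "urls": [{"url": "s3://other/g.bam"}]},
]
# The same UUID resolves to copies in two clouds.
print(find_replicas(records, "abc-123"))
```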
Implementations
There are currently a few experimental implementations that use some version of these schemas.
DOS Connect observes cloud and local storage systems and broadcasts their changes to a service that presents DOS endpoints.
DOS Downloader is a mechanism for downloading Data Objects from DOS URLs.
dos-gdc-lambda presents data from the GDC public rest API using the Data Object Service.
dos-signpost-lambda presents data from a signpost instance using the Data Object Service.
Building the client and server
You can use pip to install a Python client and server that implement these schemas.
virtualenv env
source env/bin/activate
pip install git+git://github.com/ga4gh/data-object-schemas@master --process-dependency-links
This adds the Python modules ga4gh.dos.server and ga4gh.dos.client, which you can use in your projects.
There is also a CLI hook.
ga4gh_dos_server
# In another terminal
ga4gh_dos_demo
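Once a server is running, individual data objects are fetched by UUID over HTTP. The helper below only builds the request URL; the `/ga4gh/dos/v1/dataobjects/{id}` path is an assumption based on the schemas at the time of writing, so verify it against your server's Swagger UI before relying on it.

```python
from urllib.parse import quote

def data_object_url(base, data_object_id):
    """Build the URL for fetching a single data object by UUID.

    The /ga4gh/dos/v1 prefix is assumed; check it against the
    generated Swagger document for your deployment.
    """
    return "{}/ga4gh/dos/v1/dataobjects/{}".format(
        base.rstrip("/"), quote(data_object_id, safe="")
    )

print(data_object_url("http://localhost:8080", "abc-123"))
# http://localhost:8080/ga4gh/dos/v1/dataobjects/abc-123
```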
Building Documents
Make sure you have Docker installed for your platform, as well as cwltool.
virtualenv env
source env/bin/activate
pip install -r python/dev-requirements.txt
You can generate the Swagger YAML from the Protocol Buffers:
cwltool CWLFile
The output is written to data_objects_service.swagger.json, which can be loaded in the Swagger editor. Use the GitHub raw feature to generate a URL you can load.
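Before checking the generated file in, a quick sanity check can catch an empty or truncated build. This sketch only verifies generic Swagger 2.0 top-level keys, nothing DOS-specific; the inline `doc` literal stands in for the loaded file.

```python
import json

def check_swagger(doc):
    """Return the generic Swagger 2.0 top-level keys missing from doc."""
    return [k for k in ("swagger", "info", "paths") if k not in doc]

# Normally you would load the generated file:
# with open("data_objects_service.swagger.json") as f:
#     doc = json.load(f)
doc = {"swagger": "2.0", "info": {"title": "DOS"}, "paths": {}}
print(check_swagger(doc))  # [] means the basics are present
```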
When you're happy with the changes, check this file in:
mv data_objects_service.swagger.json swagger/proto/
And commit your changes, pushing to the appropriate branch.
How to contribute changes
Take cues for now from the ga4gh/schemas document.
License
See the LICENSE
More Information
Project details
Download files
File details
Details for the file ga4gh-dos-schemas-0.1.0.tar.gz.
File metadata
- Download URL: ga4gh-dos-schemas-0.1.0.tar.gz
- Upload date:
- Size: 47.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
File hashes
Algorithm | Hash digest
---|---
SHA256 | 11349b2941c32258497a5662e7fcf597d43e51ffe738e2045ef8e8a8499f771e
MD5 | 82245d35527e97328487708840e08491
BLAKE2b-256 | f46adb8ed758eecdbe8da2dedc93ceecb027fab186707f840705331b1138b1ce
File details
Details for the file ga4gh_dos_schemas-0.1.0-py2-none-any.whl.
File metadata
- Download URL: ga4gh_dos_schemas-0.1.0-py2-none-any.whl
- Upload date:
- Size: 16.4 kB
- Tags: Python 2
- Uploaded using Trusted Publishing? No
File hashes
Algorithm | Hash digest
---|---
SHA256 | b76bcf4b6feca92ad90b028d62eaeaf5c5d2c7ff5d1f8b3a1743f1ecb51fe423
MD5 | 213a70f508b2dc033eed16373f9330fc
BLAKE2b-256 | cf3685a25beee7e68a560a05a57e45af1ea37e13e45a3f1b309b2593b137976d