A search interface for cancer variant interpretations assembled by aggregating and harmonizing across multiple cancer variant interpretation knowledgebases.
metakb
The intent of the project is to leverage the collective knowledge of the disparate existing resources of the VICC to improve the comprehensiveness of clinical interpretation of genomic variation. An ongoing goal will be to provide and improve upon standards and guidelines by which other groups with clinical interpretation data may make it accessible and visible to the public. We have released a preprint discussing our initial harmonization effort and observed disparities in the structure and content of variant interpretations.
Getting Started
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes. See deployment for notes on how to deploy the project on a live system.
Prerequisites
- Python 3.8 or greater. To confirm the version on your system, run:
python3 --version
- Pipenv, for package management.
pip3 install --user pipenv
Installing
Once Pipenv is installed, clone the repo and install the package requirements into a Pipenv environment:
git clone https://github.com/cancervariants/metakb
cd metakb
pipenv lock && pipenv sync
If you intend to contribute to development, also install the development dependencies:
pipenv lock --dev && pipenv sync
Setting up Neo4j
The MetaKB uses Neo4j for its database backend. To run a local MetaKB instance, you'll need to run a Neo4j database instance as well. The easiest way to do this is from Neo4j Desktop.
First, follow the desktop setup instructions to download, install, and open Neo4j Desktop for the first time.
Once you have opened Neo4j Desktop, use the New button in the upper-left region of the window to create a new project. Within that project, click the Add button in the upper-right region of the window and select Local DBMS. The name of the DBMS doesn't matter, but the password will be used later to connect the database to MetaKB (we have been using password by default). Select version 5.14.0 (other versions have not been tested). Click Create. Then, click the row within the project screen corresponding to your newly-created DBMS, and click the green Start button to start the database service.
The graph will initially be empty, but once you have successfully loaded data, Neo4j Desktop provides an interface for exploring and visualizing relationships within the graph. To access it, click the blue "Open" button. The prompt at the top of this window processes Cypher queries; to start, try MATCH (n:Statement {id:"civic.eid:1409"}) RETURN n. Buttons on the left-hand edge of the results pane let you select graph, tabular, or textual output.
Setting up normalizers
The MetaKB calls a number of normalizer libraries to transform resource data and resolve incoming search queries. These will be installed as part of the package requirements, but require additional setup.
First, follow these instructions for deploying DynamoDB locally on your computer. Once it is set up, navigate to its source directory in a separate terminal instance and run the following to start the database instance:
java -Djava.library.path=./DynamoDBLocal_lib -jar DynamoDBLocal.jar -sharedDb
Next, navigate to the site-packages directory of your virtual environment. Assuming Pipenv is installed to your user directory, this should be something like:
cd ~/.local/share/virtualenvs/metakb-<various characters>/lib/python<python-version>/site-packages/ # replace <various characters> and <python-version>
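If you're not sure where your environment's site-packages directory lives, the interpreter can report it directly. A quick check (run it inside the Pipenv shell so it reflects the virtual environment, not your system Python):

```python
import sysconfig

# Locate the directory where pip-installed packages live for the
# currently active interpreter.
site_packages = sysconfig.get_paths()["purelib"]
print(site_packages)
```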
Next, initialize the Variation Normalizer by following the instructions in the README. When setting up the UTA database, these docs may be helpful.
The MetaKB can acquire all other needed normalizer data, except for that of OMIM, which must be manually placed:
cp ~/YOUR/PATH/TO/mimTitles.txt ~/.local/share/wags_tails/omim/omim_<date>.tsv # replace <date> with date of data acquisition formatted as YYYYMMDD
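The destination filename encodes the acquisition date. The copy step above can be sketched in Python as follows; the wags_tails destination directory matches the command above, but the source path is a placeholder you must substitute, and using today's date assumes you acquired the file today:

```python
import shutil
from datetime import date
from pathlib import Path

# Source path is a placeholder -- point it at your local OMIM file.
source = Path.home() / "YOUR" / "PATH" / "TO" / "mimTitles.txt"

# wags_tails expects the file under ~/.local/share/wags_tails/omim/
# with the acquisition date formatted as YYYYMMDD.
dest_dir = Path.home() / ".local" / "share" / "wags_tails" / "omim"
dest = dest_dir / f"omim_{date.today():%Y%m%d}.tsv"

dest_dir.mkdir(parents=True, exist_ok=True)
if source.exists():
    shutil.copy(source, dest)
print(dest.name)
```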
Environment Variables
MetaKB relies on several environment variables being set in order to work.

Always required:

- UTA_DB_URL
  - Used by the Variation Normalizer, which relies on UTA Tools
  - Format: driver://user:pass@host/database/schema
  - More info can be found here
  - Example:

    export UTA_DB_URL=postgresql://uta_admin:password@localhost:5432/uta/uta_20210129

Required when using the --load_normalizers_db or --force_load_normalizers_db arguments in CLI commands:

- UMLS_API_KEY
  - Used by the Therapy Normalizer to retrieve RxNorm data
  - RxNorm requires a UMLS license, which you can register for here. Set the UMLS_API_KEY environment variable to your API key, which can be found in the UTS 'My Profile' area after signing in.
  - Example:

    export UMLS_API_KEY={rxnorm_api_key}

- HARVARD_DATAVERSE_API_KEY
  - Used by the Therapy Normalizer to retrieve HemOnc data
  - HemOnc.org data requires a Harvard Dataverse API key. After creating a user account on the Harvard Dataverse website, you can follow these instructions to generate a key. Set the HARVARD_DATAVERSE_API_KEY environment variable to your API key.
  - Example:

    export HARVARD_DATAVERSE_API_KEY={dataverse_api_key}
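The UTA_DB_URL format above breaks down into standard URL components. A small sketch parsing the example value, just to illustrate how the pieces map onto the driver://user:pass@host/database/schema pattern:

```python
from urllib.parse import urlparse

# Example UTA_DB_URL value from above; the path holds database/schema.
url = "postgresql://uta_admin:password@localhost:5432/uta/uta_20210129"

parsed = urlparse(url)
driver = parsed.scheme      # e.g. postgresql
user = parsed.username      # e.g. uta_admin
host = parsed.hostname      # e.g. localhost
database, schema = parsed.path.lstrip("/").split("/")
print(driver, user, host, database, schema)
```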
Loading data
Once Neo4j and DynamoDB instances are both running, and the necessary normalizer data has been placed, run the MetaKB CLI with the --initialize_normalizers flag to acquire all other necessary normalizer source data, and execute harvest, transform, and load operations into the graph datastore.
In the MetaKB project root, run the following:
pipenv shell
python3 -m metakb.cli --db_url=bolt://localhost:7687 --db_username=neo4j --db_password=<neo4j-password-here> --load_normalizers_db
For more information on the different CLI arguments, see the CLI README.
Starting the server
Once data has been loaded successfully, use the following to start service on localhost port 8000:
uvicorn metakb.main:app --reload
Ensure that both the MetaKB Neo4j and Normalizers databases are running.
Navigate to http://localhost:8000/api/v2 in your browser to enter queries.
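Queries can also be issued programmatically once the service is up. The sketch below only builds a request URL; the endpoint path and parameter name are illustrative assumptions, so consult the interactive API docs at /api/v2 for the actual routes and parameters:

```python
from urllib.parse import urlencode

# Hypothetical search route -- verify against the /api/v2 docs page.
base = "http://localhost:8000/api/v2/search"
params = {"variation": "BRAF V600E"}

query_url = f"{base}?{urlencode(params)}"
print(query_url)
```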
Running tests
Unit tests
Run the automated tests for this system with:
python3 -m pytest
And coding style tests
Code style is managed by ruff and checked prior to commit.
python3 -m ruff check --fix . && python3 -m ruff format .
Contributing
Please read CONTRIBUTING.md for details on our code of conduct, and the process for submitting pull requests to us.
Committing
We use pre-commit to run conformance tests.
These checks:
- Enforce code style
- Check for added large files
- Detect AWS credentials
- Detect private keys
Before first commit run:
pre-commit install
Versioning
We use SemVer for versioning. For the versions available, see the tags on this repository.
License
This project is licensed under the MIT License - see the LICENSE file for details.