
Curated and predicted mappings between biomedical identifiers in different namespaces

Project description

Biomappings

[Badges: Check mappings workflow · PyPI version · supported Python versions · license · DOI]

Community curated and predicted equivalences and related mappings between named biological entities that are not available from primary sources.

💾 Data

The data are available through the following three files on the biomappings/biomappings GitHub repository.

Curated   Description                                                  Link
Yes       Human-curated true mappings                                  src/biomappings/resources/mappings.tsv
Yes       Human-curated non-trivial false (i.e., incorrect) mappings   src/biomappings/resources/incorrect.tsv
No        Automatically predicted mappings                             src/biomappings/resources/predictions.tsv

These data are available under the CC0 1.0 Universal License.

Equivalences and related mappings that are available from the OBO Foundry and other primary sources can be accessed through Inspector Javert's Xref Database on Zenodo, which was described in this blog post.

📊 Summary

A summary is automatically generated nightly with GitHub Actions and deployed to https://biomappings.github.io/biomappings/.

The equivalences are also available as a network through NDEx.

⬇️ Installation

The most recent release can be installed from PyPI with:

$ pip install biomappings

The most recent code and data can be installed directly from GitHub with:

$ pip install git+https://github.com/biomappings/biomappings.git

To install in development mode, use the following:

$ git clone https://github.com/biomappings/biomappings.git
$ cd biomappings
$ pip install -e .

Usage

Biomappings exposes three main functions. Each returns a list of dictionaries, one per mapping.

import biomappings

# Human-curated true mappings (mappings.tsv)
true_mappings = biomappings.load_mappings()

# Human-curated false, i.e., incorrect mappings (incorrect.tsv)
false_mappings = biomappings.load_false_mappings()

# Automatically predicted, not yet curated mappings (predictions.tsv)
predictions = biomappings.load_predictions()
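For a quick sense of the data, the returned records can be inspected without assuming any particular column names; a minimal sketch (each record is a plain dictionary whose keys correspond to the TSV column headers):

import biomappings

true_mappings = biomappings.load_mappings()
predictions = biomappings.load_predictions()

# Count curated vs. predicted mappings
print(f"{len(true_mappings)} curated and {len(predictions)} predicted mappings")

# The keys of each record mirror the columns of the corresponding TSV file
print(sorted(true_mappings[0]))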

Alternatively, you can use the links to the TSV files above directly, with the library or programming language of your choice.
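For example, a minimal sketch using pandas to read the curated mappings straight from GitHub; the raw-file URL and the master branch name are assumptions and may need adjusting:

import pandas as pd

# Raw GitHub URL for the curated mappings; the branch name ("master") is an assumption
url = (
    "https://raw.githubusercontent.com/biomappings/biomappings/"
    "master/src/biomappings/resources/mappings.tsv"
)

# The resource files are tab-separated
df = pd.read_csv(url, sep="\t")
print(df.head())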

🙏 Contributing Curations

GitHub Web Interface

GitHub has an interface for editing files directly in the browser. It takes care of creating a branch and opening a pull request for you. After logging into GitHub, click one of the following links to be brought to the editing interface:

The caveat is that you can only edit one file at a time this way. Afterwards, you can navigate to your fork of the repository, switch to the branch GitHub created (it will not be the default branch), and edit the other files in the web interface as well. If you want to edit several files, however, it is probably easier to follow the instructions below on contributing locally.

Locally

  1. Fork the repository at https://github.com/biomappings/biomappings, clone it locally, and make a new branch.
  2. Edit one or more of the resource files (mappings.tsv, incorrect.tsv, predictions.tsv).
  3. Commit to your branch, push, and create a pull request back to the upstream repository (a sketch of these commands follows).
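A sketch of that workflow; your GitHub username and the branch name curate-mappings are placeholders:

$ git clone https://github.com/<your-username>/biomappings.git
$ cd biomappings
$ git checkout -b curate-mappings
$ # edit src/biomappings/resources/mappings.tsv, incorrect.tsv, and/or predictions.tsv
$ git commit -am "Curate mappings"
$ git push --set-upstream origin curate-mappings

Then open a pull request from the pushed branch against biomappings/biomappings.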

🌐 Web Curation Interface

Rather than editing the TSV files by hand, you can also use the web-based curation interface that comes with this repository. Install the code in development mode with the web option (which installs flask and flask-bootstrap) using:

$ git clone https://github.com/biomappings/biomappings.git
$ cd biomappings
$ pip install -e .[web]

The web application can be run with:

$ biomappings web

It has a button for creating commits, but you will still have to push from the repository yourself after reviewing the changes.
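For example, after a curation session you might review and push like this (a sketch; the exact commits depend on your session):

$ git log --oneline -5   # inspect the commits created by the curation interface
$ git show --stat        # review the most recent commit
$ git push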

Note that if you've installed biomappings from PyPI, running the web curation interface doesn't make much sense: most users will find it non-trivial to locate the resource files inside their Python installation's site-packages folder, and curations made there can't easily be contributed back.

⚖️ License

Code is licensed under the MIT License.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

biomappings-0.0.3.tar.gz (859.9 kB)

Built Distribution

biomappings-0.0.3-py3-none-any.whl (682.4 kB)

File details

Details for the file biomappings-0.0.3.tar.gz.

File metadata

  • Download URL: biomappings-0.0.3.tar.gz
  • Upload date:
  • Size: 859.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.6.1 requests/2.25.1 setuptools/51.0.0 requests-toolbelt/0.9.1 tqdm/4.55.0 CPython/3.9.1

File hashes

Hashes for biomappings-0.0.3.tar.gz

Algorithm    Hash digest
SHA256       47ec1988fdea19e142283e4b3ec93f77a2ad21bb1bf6f31e67a32db9c64d0cb2
MD5          e0e66fb50e28f12c8e5cb11e651a0f62
BLAKE2b-256  152aa691489ba8b135ca0a9e735826915e5b6bdca1283a89be32eea3d9e23f20
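To check a downloaded file against one of these digests, a minimal sketch (assuming the archive is in the current directory):

import hashlib

# Expected SHA256 digest of the source distribution, copied from the table above
expected = "47ec1988fdea19e142283e4b3ec93f77a2ad21bb1bf6f31e67a32db9c64d0cb2"

with open("biomappings-0.0.3.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "hash mismatch")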


File details

Details for the file biomappings-0.0.3-py3-none-any.whl.

File metadata

  • Download URL: biomappings-0.0.3-py3-none-any.whl
  • Upload date:
  • Size: 682.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.6.1 requests/2.25.1 setuptools/51.0.0 requests-toolbelt/0.9.1 tqdm/4.55.0 CPython/3.9.1

File hashes

Hashes for biomappings-0.0.3-py3-none-any.whl

Algorithm    Hash digest
SHA256       21cec4bf46cf361dd81246f51571eed08edcafa1734ae4a6ac8a9096734f74e4
MD5          2df480cedc86dfa875b8d0c716ace621
BLAKE2b-256  7c244ce57585ed2496f774af276f6282998cd432e5e74d1edf53fa378e79e37b

