cryohub

IO hub for Cryo-EM, Cryo-ET and subtomogram averaging data.

cryohub is a library for reading and writing Cryo-ET data based on the cryotypes specification.

Installation

pip install cryohub

Usage

cryohub provides granular I/O functions such as read_star and read_mrc, which will all return objects following the cryotypes specification.

from cryohub.reading import read_star
poseset = read_star('/path/to/file.star')

A higher-level function called read adds some magic to the I/O procedure: it guesses file formats and returns a list of cryotypes.

from cryohub import read
data = read('/path/to/file.star', '/path/to/directory/', lazy=False, name_regex=r'tomo_\d+')

See the help for each function for more info.
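To make the "magic" concrete, here is a minimal sketch of extension-based format dispatch, similar in spirit to what a function like read might do internally. The READERS mapping and guess_reader helper are illustrative assumptions, not cryohub's actual implementation.

```python
from pathlib import Path

# Hypothetical mapping from file extension to reader name; cryohub's real
# dispatch logic may differ (e.g. it also inspects file contents).
READERS = {
    ".star": "read_star",
    ".mrc": "read_mrc",
    ".tbl": "read_tbl",
}

def guess_reader(path):
    """Return the reader name for a path, based only on its extension."""
    suffix = Path(path).suffix.lower()
    try:
        return READERS[suffix]
    except KeyError:
        raise ValueError(f"unsupported file format: {suffix}")

print(guess_reader("/path/to/file.star"))  # read_star
```

In practice, this is why read can accept a mixed list of paths and still hand each file to the right granular reader.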

Similar to the read_* functions, cryohub provides a series of write_* functions, as well as a magic higher-level write function.

from cryohub import write
write([poseset1, poseset2], 'particles.tbl')

From the command line

cryohub can be used as a conversion tool between all available formats:

cryohub convert input_file.star output_file.tbl

If instead you just need to quickly inspect your data but want something more powerful than reading text files or headers, this command will drop you into an IPython shell with the loaded data collected in a list called data:

cryohub view path/to/files/* /other/path/to/file.star
print(data[0])

Features

Currently cryohub is capable of reading images in the following formats:

  • .mrc (and the .mrcs, .st, .map, .rec variants)
  • .tif(f)
  • Dynamo .em
  • EMAN2 .hdf

and particle data in the following formats:

  • Relion .star
  • Dynamo .tbl
  • Cryolo .cbox and .box
  • EMAN2 .json[^1]

Writer functions currently exist for:

  • .mrc
  • EMAN2 .hdf
  • Dynamo .em
  • Relion .star
  • Dynamo .tbl

[^1]: EMAN2 uses the center of the tomogram as the origin for particle coordinates. This means that when opening a tomogram, you'll have to recenter the particles based on its dimensions. To do so automatically, you can use the center_on_tomo argument to provide the hdf file with the tomogram you want to use.
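To illustrate the recentering described above (with hypothetical coordinates and tomogram dimensions; this is not cryohub's internal code): since EMAN2 places the origin at the tomogram center, adding half of the tomogram dimensions converts the coordinates to the usual corner-origin convention.

```python
import numpy as np

# Assumed (x, y, z) tomogram dimensions, in pixels.
tomo_shape_xyz = np.array([512, 512, 256])

# EMAN2-style coordinates, origin at the tomogram center:
coords_centered = np.array([
    [0.0, 0.0, 0.0],          # a particle at the tomogram center
    [-256.0, -256.0, -128.0], # a particle at the tomogram corner
])

# Shift the origin from the center to the corner.
coords = coords_centered + tomo_shape_xyz / 2
print(coords)
# [[256. 256. 128.]
#  [  0.   0.   0.]]
```

The center_on_tomo argument performs this shift for you by reading the dimensions from the provided tomogram.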

Image data

When possible (and unless disabled), cryohub loads images lazily using dask. The resulting objects can be treated as normal numpy arrays, except that one needs to call array.compute() to apply any pending operations and return the result.
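The lazy semantics can be demonstrated with plain dask (independent of cryohub): operations only build a task graph, and nothing is read or computed until .compute() is called.

```python
import dask.array as da

# Build a lazy array; no data is materialized yet.
lazy = da.ones((1000, 1000), chunks=(100, 100))

total = (lazy * 2).sum()   # still lazy: this only extends the task graph
result = total.compute()   # now the work actually runs
print(result)              # 2000000.0
```

This is what makes it cheap to open many large image files at once and only pay for the chunks you actually use.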

Contributing

Contributions are more than welcome! If there is a file format that you wish were supported in reading or writing, simply open an issue about it pointing to the specification. Alternatively, feel free to open a PR with your proposed implementation; you can look at the existing functions for inspiration.
