IO hub for Cryo-EM, Cryo-ET and subtomogram averaging data.

Project description

cryohub

cryohub is a library for reading and writing Cryo-ET data based on the cryotypes specification.

Installation

pip install cryohub

Usage

cryohub provides granular I/O functions such as read_star and read_mrc, all of which return objects following the cryotypes specification.

from cryohub.reading import read_star
poseset = read_star('/path/to/file.star')
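
The image readers work the same way; as a small sketch, assuming read_mrc is exposed from cryohub.reading alongside read_star (the path below is just a placeholder):

from cryohub.reading import read_mrc  # assumed import path, mirroring read_star
image = read_mrc('/path/to/volume.mrc')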

A higher-level function called read adds some magic to the I/O procedure, guessing file formats and returning a list of cryotypes.

from cryohub import read
data = read('/path/to/file.star', '/path/to/directory/', lazy=False, name_regex=r'tomo_\d+')

See the help for each function for more info.

Similarly to the read_* functions, cryohub provides a series of write_* functions and a magic, higher-level write function.

from cryohub import write
write([poseset1, poseset2], 'particles.tbl')
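
As a sketch of the format-specific writers, assuming a write_star counterpart to read_star exposed from a cryohub.writing module that mirrors cryohub.reading (the exact signature may differ):

from cryohub.writing import write_star  # assumed naming and import path
write_star(poseset, 'particles.star')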

From the command line

cryohub can be used as a conversion tool between all available formats:

cryohub convert input_file.star output_file.tbl

If instead you just need to quickly inspect your data, but want something more powerful than reading text files or headers, this command will drop you into an IPython shell with the loaded data collected in a list called data:

cryohub view path/to/files/* /other/path/to/file.star
print(data[0])

Features

Currently cryohub is capable of reading images in the following formats:

  • .mrc (and the .mrcs, .st, .map, .rec variants)
  • .tif(f)
  • Dynamo .em
  • EMAN2 .hdf

and particle data in the following formats:

  • Relion .star
  • Dynamo .tbl
  • Cryolo .cbox and .box
  • EMAN2 .json[^1]

Writer functions currently exist for:

  • .mrc
  • EMAN2 .hdf
  • Dynamo .em
  • Relion .star
  • Dynamo .tbl

[^1]: EMAN2 uses the center of the tomogram as the origin for particle coordinates. This means that when opening such a file, you'll have to recenter the particles based on the tomogram's dimensions. To do so automatically, you can use the center_on_tomo argument to provide the hdf file containing the tomogram you want to use.
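
As a hedged sketch of what that might look like (which reader accepts center_on_tomo, and its exact signature, are assumptions here):

from cryohub import read
# assumption: the EMAN2 .json reader forwards center_on_tomo to recenter particles
particles = read('/path/to/particles.json', center_on_tomo='/path/to/tomogram.hdf')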

Image data

When possible (and unless disabled), cryohub loads images lazily using dask. The resulting objects can be treated as normal numpy arrays, except that one needs to call array.compute() to apply any pending operations and retrieve the result.
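
As a rough sketch (accessing the underlying dask array through a .data attribute is an assumption; the exact attribute depends on the cryotype returned):

from cryohub import read

volume = read('/path/to/volume.mrc')[0]  # lazy, dask-backed image by default
mean = volume.data.mean()                # builds a lazy computation graph
print(mean.compute())                    # only now is the data actually read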

Contributing

Contributions are more than welcome! If there is a file format that you wish were supported in reading or writing, simply open an issue about it pointing to the specification. Alternatively, feel free to open a PR with your proposed implementation; you can look at the existing functions for inspiration.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

cryohub-0.6.4.tar.gz (23.9 kB)


Built Distribution

cryohub-0.6.4-py2.py3-none-any.whl (30.0 kB)


File details

Details for the file cryohub-0.6.4.tar.gz.

File metadata

  • Download URL: cryohub-0.6.4.tar.gz
  • Upload date:
  • Size: 23.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.12.1

File hashes

Hashes for cryohub-0.6.4.tar.gz

  • SHA256: f3630fee37b493e55948b35d017d5a9d10ca80d198e63cf6688df285c058a358
  • MD5: d1ea462922fa321310147ebbc23e4d2d
  • BLAKE2b-256: f54289dfa4a24e89ef147d59b2d02b61dfda1c2ff02553f3b537d22a45a43c6c


File details

Details for the file cryohub-0.6.4-py2.py3-none-any.whl.

File metadata

  • Download URL: cryohub-0.6.4-py2.py3-none-any.whl
  • Upload date:
  • Size: 30.0 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.12.1

File hashes

Hashes for cryohub-0.6.4-py2.py3-none-any.whl

  • SHA256: 7130ed139ced86ca35d4fb00907e860aee22361f7d1e2a363da91bacea096c1d
  • MD5: c2d013bfe467008731ca1793b8fbd00a
  • BLAKE2b-256: bd2174cffec913b0f2064b8e194691f4c320e76d9e99b73968469cb50a0940f2

