computer vision for image-based phenotyping of single cells

Project description

VisCy

VisCy is a deep learning pipeline for training and deploying computer vision models for image-based phenotyping at single-cell resolution.

The following methods are being developed:

  • Image translation
    • Robust virtual staining of landmark organelles
  • Image classification
    • Supervised learning of cell state (e.g. state of infection)
  • Image representation learning
    • Self-supervised learning of the cell state and organelle phenotypes
Note:
VisCy is currently considered alpha software and is under active development. Frequent breaking changes are expected.

Virtual staining

Pipeline

A full illustration of the virtual staining pipeline can be found here.

Library of virtual staining (VS) models

The robust virtual staining models (i.e., VSCyto2D, VSCyto3D, VSNeuromast) and fine-tuned models can be found here.

Demos

Image-to-Image translation using VisCy

  • Guide for Virtual Staining Models: Instructions for how to train and run inference on VisCy's virtual staining models (VSCyto3D, VSCyto2D, and VSNeuromast)

  • Image translation Exercise: Example showing how to use VisCy to train, predict and evaluate the VSCyto2D model. This notebook was developed for the DL@MBL2024 course.

  • Virtual staining exercise: Exploring label-free to fluorescence virtual staining and fluorescence to label-free image translation using the VisCy UNeXt2 model. More usage examples and demos can be found here.

Gallery

Below are some examples of virtually stained images (click to play videos). See the full gallery here.

  • VSCyto3D: HEK293T
  • VSNeuromast: Neuromast
  • VSCyto2D: A549

Reference

The virtual staining models and training protocols are reported in our recent preprint on robust virtual staining:

@article {Liu2024.05.31.596901,
    author = {Liu, Ziwen and Hirata-Miyasaki, Eduardo and Pradeep, Soorya and Rahm, Johanna and Foley, Christian and Chandler, Talon and Ivanov, Ivan and Woosley, Hunter and Lao, Tiger and Balasubramanian, Akilandeswari and Liu, Chad and Leonetti, Manu and Arias, Carolina and Jacobo, Adrian and Mehta, Shalin B.},
    title = {Robust virtual staining of landmark organelles},
    elocation-id = {2024.05.31.596901},
    year = {2024},
    doi = {10.1101/2024.05.31.596901},
    publisher = {Cold Spring Harbor Laboratory},
    URL = {https://www.biorxiv.org/content/early/2024/06/03/2024.05.31.596901},
    eprint = {https://www.biorxiv.org/content/early/2024/06/03/2024.05.31.596901.full.pdf},
    journal = {bioRxiv}
}

This package evolved from the TensorFlow version of the virtual staining pipeline, which we reported in this paper in 2020:

@article {10.7554/eLife.55502,
article_type = {journal},
title = {Revealing architectural order with quantitative label-free imaging and deep learning},
author = {Guo, Syuan-Ming and Yeh, Li-Hao and Folkesson, Jenny and Ivanov, Ivan E and Krishnan, Anitha P and Keefe, Matthew G and Hashemi, Ezzat and Shin, David and Chhun, Bryant B and Cho, Nathan H and Leonetti, Manuel D and Han, May H and Nowakowski, Tomasz J and Mehta, Shalin B},
editor = {Forstmann, Birte and Malhotra, Vivek and Van Valen, David},
volume = 9,
year = 2020,
month = {jul},
pub_date = {2020-07-27},
pages = {e55502},
citation = {eLife 2020;9:e55502},
doi = {10.7554/eLife.55502},
url = {https://doi.org/10.7554/eLife.55502},
keywords = {label-free imaging, inverse algorithms, deep learning, human tissue, polarization, phase},
journal = {eLife},
issn = {2050-084X},
publisher = {eLife Sciences Publications, Ltd},
}

Installation

  1. We recommend using a new Conda/virtual environment.

    conda create --name viscy python=3.10
    # OR specify a custom path since the dependencies are large:
    # conda create --prefix /path/to/conda/envs/viscy python=3.10
    
  2. Install a released version of VisCy from PyPI:

    pip install viscy
    

    If evaluating virtually stained images for segmentation tasks, install additional dependencies:

    pip install "viscy[metrics]"
    

    Visualizing the model architecture requires visual dependencies:

    pip install "viscy[visual]"
    
  3. Verify installation by accessing the CLI help message:

    viscy --help
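
As a quick sanity check beyond the CLI, the package should also import cleanly in Python. The sketch below uses only the standard library and PyTorch (a GPU is optional):

    # Optional post-install check: the package should import without errors
    # and PyTorch should report whether a CUDA device is visible.
    from importlib.metadata import version

    import torch
    import viscy  # noqa: F401

    print(version("viscy"))           # installed VisCy version
    print(torch.cuda.is_available())  # True if a CUDA GPU is visible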
    

Contributing

For development installation, see the contributing guide.

Additional Notes

The pipeline is built using the PyTorch Lightning framework. The iohub library is used for reading and writing data in OME-Zarr format.
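
For example, an OME-Zarr store can be opened and inspected with iohub as sketched below. The store path and HCS plate layout are assumptions for illustration, and the exact iohub calls may differ between versions; consult the iohub documentation for your installed release.

    from iohub import open_ome_zarr

    # Hypothetical path to an HCS OME-Zarr store; replace with your own data.
    store_path = "plate.zarr"

    with open_ome_zarr(store_path, mode="r") as plate:
        # Iterate over imaging positions (fields of view) in the plate.
        for name, position in plate.positions():
            image = position["0"]     # default (full-resolution) image array
            print(name, image.shape)  # expected axes: (T, C, Z, Y, X)
            break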

The full functionality is tested on Linux x86_64 with NVIDIA Ampere GPUs (CUDA 12.4). Some features (e.g. mixed precision and distributed training) may not be available with other setups; see the PyTorch documentation for details.
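
Since training runs through the underlying Lightning Trainer, these features correspond to standard Trainer options. The snippet below is a generic Lightning sketch, not a VisCy-specific API; the model and datamodule are placeholders for your own classes.

    from lightning.pytorch import Trainer

    # Generic Lightning configuration sketch; mixed precision and DDP
    # require compatible GPUs and drivers (see the PyTorch docs).
    trainer = Trainer(
        accelerator="gpu",
        devices=2,             # distributed data-parallel across two GPUs
        strategy="ddp",
        precision="16-mixed",  # automatic mixed precision
    )
    # trainer.fit(model, datamodule=datamodule)  # placeholders, not VisCy classes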

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

viscy-0.2.0rc1.tar.gz (6.0 MB)

Uploaded Source

Built Distribution

viscy-0.2.0rc1-py3-none-any.whl (102.6 kB)

Uploaded Python 3

File details

Details for the file viscy-0.2.0rc1.tar.gz.

File metadata

  • Download URL: viscy-0.2.0rc1.tar.gz
  • Upload date:
  • Size: 6.0 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.14

File hashes

Hashes for viscy-0.2.0rc1.tar.gz
  • SHA256: df8d58b46c7eca9c6ce7703e499c1ecaffe4880c98c1a23988cb944c172c437d
  • MD5: 37e74f2ca10a2097e6d760230787fe21
  • BLAKE2b-256: 7f6fc29ee0b79a189bbec4d94659f45fd4ba1196a7ff492e8cde64925db8b428

See more details on using hashes here.
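
To verify a downloaded archive against the published SHA256 digest, a minimal check using the Python standard library looks like this (filename and digest taken from the table above):

    import hashlib

    expected = "df8d58b46c7eca9c6ce7703e499c1ecaffe4880c98c1a23988cb944c172c437d"

    # Hash the downloaded source distribution and compare to the published digest.
    with open("viscy-0.2.0rc1.tar.gz", "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    print("OK" if digest == expected else "hash mismatch")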

File details

Details for the file viscy-0.2.0rc1-py3-none-any.whl.

File metadata

  • Download URL: viscy-0.2.0rc1-py3-none-any.whl
  • Upload date:
  • Size: 102.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.14

File hashes

Hashes for viscy-0.2.0rc1-py3-none-any.whl
  • SHA256: 3d04098d7b22c8098fb63ac76a59529aac67bbfd37e2a8243b7c8ce5bf190ca9
  • MD5: 534ed3882317f50b64649a7a8348da44
  • BLAKE2b-256: 3d0680121fbac7a20528753ecf15ec38647e4cd6f515261e799a01179f8853b2

See more details on using hashes here.
