Organize, visualize, and analyze histology images.

Project description

HistomicsUI organizes and manages whole slide image (WSI) files using Girder. It provides a dedicated interface to select WSIs, add annotations manually, and run analyses and algorithms on whole images or parts of them.

Girder provides authentication, access control, and diverse storage options, including local file systems and Amazon S3. WSIs are read and displayed via the large_image module. Algorithms are containerized using Docker and run via the slicer_cli_web Girder plugin; they can be executed on multiple worker machines via Girder Worker and Celery.

A set of common algorithms are provided by HistomicsTK.

License

HistomicsUI is made available under the Apache License, Version 2.0. For more details, see LICENSE.

Community

Discussions | Issues | Contact Us

Installation

Linux

On Linux with Python 3.8 or newer:

Prerequisites:

  • MongoDB must be installed and running.

  • An appropriate version of Python must be installed.

HistomicsUI uses large_image sources to read different image file formats. You need to install appropriate sources for the files that will be used.

# install all sources from the main repo
pip install large-image[sources] --find-links https://girder.github.io/large_image_wheels

or

# install openslide and tiff sources
pip install large-image-source-tiff large-image-source-openslide --find-links https://girder.github.io/large_image_wheels

Now install the histomicsui package, have Girder build its UI, and start the Girder server. Note that Girder may still require an older version of Node (14.x) to build correctly; nvm can be used to manage multiple versions of Node.

pip install histomicsui[analysis]
girder build
girder serve

To use Girder Worker:

pip install girder_slicer_cli_web[worker]
GW_DIRECT_PATHS=true girder-worker -l info -Ofair --prefetch-multiplier=1

Girder Worker needs the RabbitMQ message service to be running in order to communicate with Girder. Both Girder and Girder Worker should be run as a user that is a member of the docker group.

The first time you start HistomicsUI, you’ll also need to configure Girder with at least one user and one assetstore (see the Girder documentation). Additionally, it is recommended that you install the HistomicsTK algorithms. This can be done by going to the Admin Console, Plugins, Slicer CLI Web settings: set a default task upload folder, then import the dsarchive/histomicstk:latest Docker image.
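
If you prefer to script this initial setup, the following is a minimal sketch using the girder_client package against the Girder REST API. The server URL, credentials, and assetstore path are placeholders; the same steps can be performed through the web interface.

# Sketch: create the first (admin) user and a filesystem assetstore via the
# Girder REST API using girder_client (pip install girder-client).
import girder_client

gc = girder_client.GirderClient(apiUrl='http://localhost:8080/api/v1')

# On a fresh Girder instance, the first registered user becomes the site admin.
gc.post('user', parameters={
    'login': 'admin',
    'email': 'admin@example.com',
    'firstName': 'Histomics',
    'lastName': 'Admin',
    'password': 'change-this-password',
})
gc.authenticate('admin', 'change-this-password')

# Create a filesystem assetstore (type 0) rooted at a local directory.
gc.post('assetstore', parameters={
    'name': 'local',
    'type': 0,
    'root': '/data/assetstore',
})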

Reference Deployment

The standard deployment of HistomicsUI is the Digital Slide Archive. The associated repository has tools for readily installing via Docker, VirtualBox, or shell scripts on Ubuntu.

Development

The most convenient way to develop on HistomicsUI is to use the devops scripts from the Digital Slide Archive.

If you are making changes to the HistomicsUI frontend, you can make Girder watch the source code and perform hot reloads on changes using the --watch-plugin argument to girder build. See the Girder docs for more information.
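
For example (assuming the plugin is registered with Girder under the name histomicsui):

girder build --watch-plugin histomicsui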

Annotations and Metadata from Jobs

This handles ingesting annotations and metadata that are uploaded and associating them with existing large image items in the Girder database. These annotations and metadata are commonly generated by jobs, such as HistomicsTK tasks, but can also be added manually.

If a file is uploaded to the Girder system with a reference record, and that record contains an identifier field and at least one of a fileId or an itemId field, specific identifier values are used to ingest the results as described below. If a userId is specified in the reference record, permissions for adding the annotation or metadata are associated with that user.
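
As a sketch of the upload mechanism, using the girder_client package: all ids and the identifier value below are placeholders, and the recognized identifier suffixes are described in the following subsections.

# Sketch: upload a result file with a reference record that HistomicsUI can
# use to ingest it (ids and identifier are placeholders).
import json
import girder_client

gc = girder_client.GirderClient(apiUrl='http://localhost:8080/api/v1')
gc.authenticate('admin', 'change-this-password')

reference = json.dumps({
    'identifier': 'exampleAnnotationFile',  # suffix selects the ingest behavior
    'itemId': '5f1c0123456789abcdef0123',   # existing large image item
    'userId': '5e2a0123456789abcdef0123',   # user whose permissions apply
})

# Assumes girder_client's upload methods accept a reference argument.
gc.uploadFileToItem('5f1c0123456789abcdef0123', 'results.json',
                    reference=reference)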

Metadata

Files whose identifier ends in ItemMetadata are loaded and their contents set as metadata on the associated item that contains the specified file. Conceptually, this is the same as calling the PUT item/{id}/metadata endpoint.
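
For instance, a hypothetical reference record for a metadata result might look like the following (field values are placeholders); the uploaded file's JSON contents are then stored as metadata on that item.

# Hypothetical reference record for a metadata result; the "ItemMetadata"
# suffix of the identifier selects the metadata ingest path.
reference = {
    'identifier': 'quantificationItemMetadata',
    'itemId': '5f1c0123456789abcdef0123',   # item that receives the metadata
    'userId': '5e2a0123456789abcdef0123',
}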

Annotations

Files whose identifier ends in AnnotationFile are loaded as annotations and associated with the item that contains the specified file. Conceptually, this is the same as uploading the file via the annotation endpoints for the item associated with the specified fileId or itemId.

If the annotation file contains any annotations with elements that contain girderId values, those girderId values can be the identifier values of files that were uploaded with a reference record containing a matching uuid field. The uuid field is required for this linkage, but is treated as an arbitrary string.
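
A sketch of that linkage, assuming both reference records carry the same uuid (all values are placeholders): the annotation's girderId element values name the identifier of the other uploaded file, and the shared uuid ties the two uploads together.

# Hypothetical reference records illustrating the uuid / girderId linkage.
# First upload: an auxiliary (e.g. overlay) image whose identifier will be
# referenced from the annotation.
overlay_reference = {
    'identifier': 'overlayImage',
    'itemId': '5f1c0123456789abcdef0123',
    'uuid': 'job-1234',                     # arbitrary string, shared below
}

# Second upload: the annotation file itself.
annotation_reference = {
    'identifier': 'resultAnnotationFile',
    'itemId': '5f1c0123456789abcdef0123',
    'userId': '5e2a0123456789abcdef0123',
    'uuid': 'job-1234',                     # must match the overlay's uuid
}

# Inside the uploaded annotation JSON, an element can then include
#   {"type": "image", "girderId": "overlayImage", ...}
# and the girderId is resolved to the item that received the overlay file.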

Funding

This work was funded in part by the NIH grant U24-CA194362-01.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

histomicsui-1.6.1.dev18.tar.gz (716.6 kB)

Built Distribution

histomicsui-1.6.1.dev18-py2.py3-none-any.whl (221.2 kB)

File details

Details for the file histomicsui-1.6.1.dev18.tar.gz.

File metadata

  • Download URL: histomicsui-1.6.1.dev18.tar.gz
  • Upload date:
  • Size: 716.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for histomicsui-1.6.1.dev18.tar.gz

  • SHA256: 1d9eb64f3c6ddb862df64a6ba4dd5296ac831f80a1ed2224f1dd25a84bec6058
  • MD5: 412cef33dbb88b04bfaa37ab6df66364
  • BLAKE2b-256: 9fea61a665359928e38ab4125035e961f9e30fe3990767dea8ff59e432154042

File details

Details for the file histomicsui-1.6.1.dev18-py2.py3-none-any.whl.

File metadata

File hashes

Hashes for histomicsui-1.6.1.dev18-py2.py3-none-any.whl

  • SHA256: 98f28c0dcbfa9e4d081576475fa9f17865dad28001e45ce8eb1e8fe6144e2fd6
  • MD5: ab0be3d06d412a6b3f118fb7c6053faa
  • BLAKE2b-256: e8cc560b45f0b2652813caf9f9838e3d32c3f79b7055627771779c882c446bd6
