
Project description

multiscale-spatial-image


Generate a multiscale, chunked, multi-dimensional spatial image data structure that can be serialized to OME-NGFF.

Each scale is a scientific Python Xarray spatial-image Dataset, organized into nodes of an Xarray DataTree.

Installation

pip install multiscale_spatial_image

Usage

import numpy as np
from spatial_image import to_spatial_image
from multiscale_spatial_image import to_multiscale
import zarr

# Image pixels
array = np.random.randint(0, 256, size=(128,128), dtype=np.uint8)

image = to_spatial_image(array)
print(image)

An Xarray spatial-image DataArray. Spatial metadata can also be passed during construction; see the sketch after the output below.

<xarray.SpatialImage 'image' (y: 128, x: 128)>
array([[114,  47, 215, ..., 245,  14, 175],
       [ 94, 186, 112, ...,  42,  96,  30],
       [133, 170, 193, ..., 176,  47,   8],
       ...,
       [202, 218, 237, ...,  19, 108, 135],
       [ 99,  94, 207, ..., 233,  83, 112],
       [157, 110, 186, ..., 142, 153,  42]], dtype=uint8)
Coordinates:
  * y        (y) float64 0.0 1.0 2.0 3.0 4.0 ... 123.0 124.0 125.0 126.0 127.0
  * x        (x) float64 0.0 1.0 2.0 3.0 4.0 ... 123.0 124.0 125.0 126.0 127.0
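
Spatial metadata such as the pixel spacing and origin can be attached when the image is constructed. A minimal sketch, assuming the dims, scale, and translation keyword arguments of the spatial-image API:

# Sketch: attach spatial metadata at construction time.
# The dims, scale, and translation keyword names are assumptions based on
# the spatial-image API; adjust to the installed version.
image_with_metadata = to_spatial_image(
    array,
    dims=("y", "x"),                     # explicit dimension names
    scale={"y": 2.0, "x": 2.0},          # physical spacing per pixel
    translation={"y": 10.0, "x": 10.0},  # physical origin offset
)
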
# Create multiscale pyramid, downscaling by a factor of 2, then 4
multiscale = to_multiscale(image, [2, 4])
print(multiscale)

A MultiscaleSpatialImage: an Xarray DataTree whose data variables are chunked Dask arrays.

DataTree('multiscales', parent=None)
├── DataTree('scale0')
│   Dimensions:  (y: 128, x: 128)
│   Coordinates:
│     * y        (y) float64 0.0 1.0 2.0 3.0 4.0 ... 123.0 124.0 125.0 126.0 127.0
│     * x        (x) float64 0.0 1.0 2.0 3.0 4.0 ... 123.0 124.0 125.0 126.0 127.0
│   Data variables:
│       image    (y, x) uint8 dask.array<chunksize=(128, 128), meta=np.ndarray>
├── DataTree('scale1')
│   Dimensions:  (y: 64, x: 64)
│   Coordinates:
│     * y        (y) float64 0.5 2.5 4.5 6.5 8.5 ... 118.5 120.5 122.5 124.5 126.5
│     * x        (x) float64 0.5 2.5 4.5 6.5 8.5 ... 118.5 120.5 122.5 124.5 126.5
│   Data variables:
│       image    (y, x) uint8 dask.array<chunksize=(64, 64), meta=np.ndarray>
└── DataTree('scale2')
    Dimensions:  (y: 16, x: 16)
    Coordinates:
      * y        (y) float64 3.5 11.5 19.5 27.5 35.5 ... 91.5 99.5 107.5 115.5 123.5
      * x        (x) float64 3.5 11.5 19.5 27.5 35.5 ... 91.5 99.5 107.5 115.5 123.5
    Data variables:
        image    (y, x) uint8 dask.array<chunksize=(16, 16), meta=np.ndarray>
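
Per-dimension scale factors and an explicit downscaling method can also be requested. A sketch, assuming the dict form of the scale_factors argument and the Methods enum that also appears in the Development section below:

from multiscale_spatial_image import Methods

# Sketch: anisotropic downscaling with an explicitly chosen method.
# The dict form of the scale factors and the scale_factors / method keyword
# names are assumptions based on the to_multiscale API.
anisotropic = to_multiscale(
    image,
    scale_factors=[{"x": 2, "y": 2}, {"x": 4, "y": 2}],
    method=Methods.XARRAY_COARSEN,
)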

Store as an Open Microscopy Environment Next-Generation File Format (OME-NGFF) / netCDF Zarr store.

It is highly recommended to use dimension_separator='/' in the construction of the Zarr stores.

store = zarr.storage.DirectoryStore('multiscale.zarr', dimension_separator='/')
multiscale.to_zarr(store)
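
To read the pyramid back, the store can be opened as a DataTree, for example with xarray-datatree's open_datatree and the zarr engine (a sketch; this reader is not part of this package):

from datatree import open_datatree

# Sketch: round-trip the multiscale image from the Zarr store.
multiscale = open_datatree('multiscale.zarr', engine='zarr')
print(multiscale)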

Note: The API is under development, and it may change until 1.0.0 is released. We mean it :-).

Examples

Development

Contributions are welcome and appreciated.

To run the test suite:

git clone https://github.com/spatial-image/multiscale-spatial-image
cd multiscale-spatial-image
pip install -e ".[test]"
pytest
# Notebook tests
pytest --nbmake --nbmake-timeout=3000 examples/*ipynb

To add new or updated testing data, such as a new baseline for this test block:

dataset_name = "cthead1"
image = input_images[dataset_name]
baseline_name = "2_4/XARRAY_COARSEN"
multiscale = to_multiscale(image, [2, 4], method=Methods.XARRAY_COARSEN)
verify_against_baseline(test_data_dir, dataset_name, baseline_name, multiscale)

Add a store_new_image call in your test block:

dataset_name = "cthead1"
image = input_images[dataset_name]
baseline_name = "2_4/XARRAY_COARSEN"
multiscale = to_multiscale(image, [2, 4], method=Methods.XARRAY_COARSEN)

store_new_image(dataset_name, baseline_name, multiscale)

verify_against_baseline(dataset_name, baseline_name, multiscale)

Run the tests to generate the new baseline output, then remove the store_new_image call.

Then, create a tarball of the current testing data:

cd test/data
tar cvf ../data.tar *
gzip -9 ../data.tar
python3 -c 'import pooch; print(pooch.file_hash("../data.tar.gz"))'

Update the test_data_sha256 variable in test/_data.py with the new hash. Upload the data to web3.storage, and update the test_data_ipfs_cid Content Identifier (CID) variable with the CID shown in the web3.storage web interface.
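
For reference, a sketch of the relevant variables in test/_data.py; the values below are placeholders, not real hashes:

# test/_data.py (sketch; placeholder values, not real hashes)
test_data_ipfs_cid = "bafybei..."  # CID shown in the web3.storage web interface
test_data_sha256 = "..."           # output of pooch.file_hash("../data.tar.gz")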


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

multiscale_spatial_image-0.11.2.tar.gz (1.1 MB)

Built Distribution

multiscale_spatial_image-0.11.2-py3-none-any.whl
File details

Details for the file multiscale_spatial_image-0.11.2.tar.gz.

File hashes

Hashes for multiscale_spatial_image-0.11.2.tar.gz
Algorithm Hash digest
SHA256 07db47c3358bdfad9688a5ee8064a219ec72a43deb65e30bd4e591adfe35d10f
MD5 e6f21c10c97488fb234340551dde5ba4
BLAKE2b-256 f4b4738ad9f0663f012c86d9895e1b4c8e8b0012572f0db3729db7a21fb46651


File details

Details for the file multiscale_spatial_image-0.11.2-py3-none-any.whl.

File hashes

Hashes for multiscale_spatial_image-0.11.2-py3-none-any.whl
Algorithm Hash digest
SHA256 a02f542a2f535c4229b64f4624a31d2b5ae9a098401da7743bb95e6a8759d23d
MD5 d949bdf48d1f104bac5e9ee5831e61d3
BLAKE2b-256 993b9e62694f6bd360a72d7f944cc025de6334ef0ba11a957a8f859b9b14e491

