
xarray Dataset from CASA Tables

Reason this release was yanked:

Version tag was incorrect on GitHub (v0.2.13)

Project description


Constructs xarray Datasets from CASA Tables via python-casacore. The Variables contained in the Dataset are dask arrays backed by deferred calls to pyrap.tables.table.getcol.

Supports writing Variables back to the respective column in the Table.

The intention behind this package is to support the Measurement Set as a data source and sink for the purposes of writing parallel, distributed Radio Astronomy algorithms.
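
As a quick illustration of the deferred getcol calls mentioned above (a minimal sketch reusing the "WSRT.MS" table from the Example Usage section below; no column data is read until the dask graph is computed):

  from daskms import xds_from_table

  # Opening the table only inspects metadata; column data stays on disk
  datasets = xds_from_table("WSRT.MS")
  flag = datasets[0].FLAG.data      # a lazy dask array
  print(flag.sum().compute())       # getcol calls only happen here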

Installation

To install with xarray support:

$ pip install dask-ms[xarray]

Without xarray, similar but reduced Dataset functionality is replicated in dask-ms itself. Expert users may wish to use this option to reduce Python package dependencies.

$ pip install dask-ms

Documentation

https://dask-ms.readthedocs.io

Gitter Page

https://gitter.im/dask-ms/community

Example Usage

  import dask.array as da
  from daskms import xds_from_table, xds_to_table

  # Create xarray datasets from Measurement Set "WSRT.MS"
  ds = xds_from_table("WSRT.MS")
  # Set the FLAG Variable on the first Dataset to its inverse
  ds[0]['FLAG'] = (ds[0].FLAG.dims, da.logical_not(ds[0].FLAG.data))
  # Write the flag column back to the Measurement Set
  xds_to_table(ds, "WSRT.MS", "FLAG").compute()

  print(ds)

[<xarray.Dataset>
 Dimensions:         (chan: 64, corr: 4, row: 6552, uvw: 3)
 Coordinates:
     ROWID           (row) int32 dask.array<shape=(6552,), chunksize=(6552,)>
 Dimensions without coordinates: chan, corr, row, uvw
 Data variables:
     IMAGING_WEIGHT  (row, chan) float32 dask.array<shape=(6552, 64), chunksize=(6552, 64)>
     ANTENNA1        (row) int32 dask.array<shape=(6552,), chunksize=(6552,)>
     STATE_ID        (row) int32 dask.array<shape=(6552,), chunksize=(6552,)>
     EXPOSURE        (row) float64 dask.array<shape=(6552,), chunksize=(6552,)>
     MODEL_DATA      (row, chan, corr) complex64 dask.array<shape=(6552, 64, 4), chunksize=(6552, 64, 4)>
     FLAG_ROW        (row) bool dask.array<shape=(6552,), chunksize=(6552,)>
     CORRECTED_DATA  (row, chan, corr) complex64 dask.array<shape=(6552, 64, 4), chunksize=(6552, 64, 4)>
     PROCESSOR_ID    (row) int32 dask.array<shape=(6552,), chunksize=(6552,)>
     WEIGHT          (row, corr) float32 dask.array<shape=(6552, 4), chunksize=(6552, 4)>
     FLAG            (row, chan, corr) bool dask.array<shape=(6552, 64, 4), chunksize=(6552, 64, 4)>
     TIME            (row) float64 dask.array<shape=(6552,), chunksize=(6552,)>
     SIGMA           (row, corr) float32 dask.array<shape=(6552, 4), chunksize=(6552, 4)>
     SCAN_NUMBER     (row) int32 dask.array<shape=(6552,), chunksize=(6552,)>
     INTERVAL        (row) float64 dask.array<shape=(6552,), chunksize=(6552,)>
     OBSERVATION_ID  (row) int32 dask.array<shape=(6552,), chunksize=(6552,)>
     TIME_CENTROID   (row) float64 dask.array<shape=(6552,), chunksize=(6552,)>
     ARRAY_ID        (row) int32 dask.array<shape=(6552,), chunksize=(6552,)>
     ANTENNA2        (row) int32 dask.array<shape=(6552,), chunksize=(6552,)>
     DATA            (row, chan, corr) complex64 dask.array<shape=(6552, 64, 4), chunksize=(6552, 64, 4)>
     FEED1           (row) int32 dask.array<shape=(6552,), chunksize=(6552,)>
     FEED2           (row) int32 dask.array<shape=(6552,), chunksize=(6552,)>
     UVW             (row, uvw) float64 dask.array<shape=(6552, 3), chunksize=(6552, 3)>
 Attributes:
     FIELD_ID:      0
     DATA_DESC_ID:  0]
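
The row chunking shown above (a single 6552-row chunk per Variable) can be controlled when the datasets are constructed. A minimal sketch, assuming the chunks argument described in the dask-ms documentation:

  from daskms import xds_from_table

  # Request row chunks of at most 1000 rows, so each Variable is backed
  # by several smaller dask chunks instead of one 6552-row chunk
  datasets = xds_from_table("WSRT.MS", chunks={"row": 1000})
  print(datasets[0].DATA.data.chunks)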

Limitations

  1. Many Measurement Set columns are defined as variably shaped, even though the actual data is fixed in shape. dask-ms infers the shape of the data from the first row, and the remaining rows must be consistent with that shape. For example, this can be an issue when multiple Spectral Windows with differing numbers of channels per SPW are present in the Measurement Set.

    dask-ms works around this by partitioning the Measurement Set into multiple datasets, with the first row's shape used to infer the shape of each partition. Thus, in the case of multiple Spectral Windows, the Measurement Set can be partitioned by DATA_DESC_ID to create a dataset for each Spectral Window, as sketched below.
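
A sketch of this partitioning (assuming the group_cols argument documented at https://dask-ms.readthedocs.io; grouping columns appear as Dataset attributes, as in the FIELD_ID/DATA_DESC_ID attributes shown above):

  from daskms import xds_from_table

  # Partition the Measurement Set by DATA_DESC_ID so that each
  # Spectral Window (and hence each channel count) gets its own dataset
  datasets = xds_from_table("WSRT.MS", group_cols=["DATA_DESC_ID"])

  for ds in datasets:
      print(ds.attrs["DATA_DESC_ID"], dict(ds.sizes))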

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

dask-ms-0.2.13.tar.gz (100.3 kB)

Uploaded Source

Built Distribution

dask_ms-0.2.13-py3-none-any.whl (127.2 kB)

Uploaded Python 3

File details

Details for the file dask-ms-0.2.13.tar.gz.

File metadata

  • Download URL: dask-ms-0.2.13.tar.gz
  • Upload date:
  • Size: 100.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.14

File hashes

Hashes for dask-ms-0.2.13.tar.gz

  Algorithm    Hash digest
  SHA256       87640cf681c0f97426ded0930204688ab8fde1144914e0fe43ac0c048b1ef57c
  MD5          2d16c4d8e2b97c460312b8609dc59924
  BLAKE2b-256  9a972a9c675319e4747501bd6b06cef3ee3436b9f117a79c9725ff681236dd0e

See more details on using hashes here.

File details

Details for the file dask_ms-0.2.13-py3-none-any.whl.

File metadata

  • Download URL: dask_ms-0.2.13-py3-none-any.whl
  • Upload date:
  • Size: 127.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.14

File hashes

Hashes for dask_ms-0.2.13-py3-none-any.whl

  Algorithm    Hash digest
  SHA256       bd9da2c2d68bb7c6eb74589ef5e7f0327629253fdda80597d5442715d17d498e
  MD5          301e0e70b621de3152a9f341e5b66e27
  BLAKE2b-256  f7d6ddf46f89191801dd8d4fc8379d0df43ee760bcc73463d54bd94add142ade

See more details on using hashes here.
