
Run code on a dask worker via a context manager


Afar


"One man's magic is another man's engineering."
— Robert A. Heinlein


afar allows you to run code on a remote Dask cluster using context managers. For example:

import afar

with afar.run, remotely:
    import dask_cudf
    df = dask_cudf.read_parquet("s3://...")
    result = df.sum().compute()

Outside the context, result is a Dask Future whose data resides on a worker. Call result.result() to copy the data to the local process.
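If Futures are unfamiliar, here is a minimal analogy using only the standard library (this sketch uses concurrent.futures, not afar or Dask itself): a Future is a handle to a value being computed elsewhere, and .result() blocks until that value is copied back to the caller. Dask Futures behave analogously, except the data lives on a cluster worker until you ask for it.

```python
from concurrent.futures import ThreadPoolExecutor

with ThreadPoolExecutor() as pool:
    # submit returns immediately with a Future; the work runs in another thread
    future = pool.submit(sum, [1, 2, 3])

# .result() blocks until the computation finishes, then returns the value
print(future.result())  # 6
```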

By default, only the last assignment is saved. One can specify which variables to save:

with afar.run("one", "two"), remotely:
    one = 1
    two = one + 1

one and two are now both Futures. They can be used directly in other afar.run contexts:

with afar.run as data, remotely:
    three = one + two

assert three.result() == 3
assert data["three"].result() == 3

data above is a dictionary mapping variable names to Futures. At times you may need to retrieve values from this dictionary. Alternatively, you may pass a mapping to afar.run to use as the data.

run = afar.run(data={"four": 4})
with run, remotely:
    seven = three + four
assert run.data["seven"].result() == 7

If you want to automatically gather the data locally (to avoid calling .result()), use afar.get instead of afar.run:

with afar.get, remotely:
    five = two + three
assert five == 5

If using IPython/Jupyter, the rich repr of the final expression will be displayed if it's not an assignment:

with afar.run, remotely:
    three + seven
# displays 10!

Is this a good idea?

I don't know!

For motivation, see https://github.com/dask/distributed/issues/4003

It's natural to be skeptical of unconventional syntax. Oftentimes it isn't obvious whether new syntax will be nice to use; you really just need to try it out and see.

We're still exploring the usability of afar. If you try it out, please share what you think, and ask yourself questions such as:

  • can we spell anything better?
  • does this offer opportunities?
  • what is surprising?
  • what is lacking?

Here's an example of an opportunity:

on_gpus = afar.remotely(resources={"GPU": 1})

with afar.run, on_gpus:
    ...

This now works! Keyword arguments to remotely will be passed to client.submit.

I don't know about you, but I think this is starting to look and feel kinda nice, and it could probably be even better :)

This code is highly experimental and magical!
