Generator-based operators for asynchronous iteration

Project description

Synopsis

aiostream provides a collection of stream operators that can be combined to create asynchronous pipelines of operations.

It can be seen as an asynchronous version of itertools, although some aspects are slightly different. Essentially, all the provided operators return a unified interface called a stream. A stream is an enhanced asynchronous iterable providing the following features:

  • Operator pipe-lining - using pipe symbol |

  • Repeatability - every iteration creates a different iterator

  • Safe iteration context - using async with and the stream method

  • Simplified execution - get the last element from a stream using await

  • Slicing and indexing - using square brackets []

  • Concatenation - using addition symbol +
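
As a quick sketch of the indexing, concatenation and repeatability features (the printed values are inferred from the descriptions above rather than taken from an official example):

import asyncio
from aiostream import stream


async def main():

    # Concatenate two finite streams using '+'
    xs = stream.range(1, 4) + stream.range(4, 7)

    # Square-bracket indexing returns a new stream;
    # awaiting it yields its last (here, only) element
    print(await xs[2])  # 3

    # Repeatability: the same stream can be consumed again
    print(await stream.list(xs))  # [1, 2, 3, 4, 5, 6]


asyncio.run(main())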

Requirements

The stream operators rely heavily on asynchronous generators (PEP 525):

  • python >= 3.8
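
In practice, this means that any asynchronous iterable, such as a plain async generator, can feed a pipeline through the iterate operator. A minimal sketch (the ticker generator below is a made-up example):

import asyncio
from aiostream import stream, pipe


async def ticker():
    # A plain PEP 525 async generator
    for i in range(3):
        yield i
        await asyncio.sleep(0.1)


async def main():
    # Wrap the async generator into a stream and pipe an operator onto it
    xs = stream.iterate(ticker()) | pipe.map(lambda x: x * 10)
    print(await stream.list(xs))  # [0, 10, 20]


asyncio.run(main())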

Stream operators

The stream operators are separated into 8 categories:

creation

iterate, preserve, just, call, empty, throw, never, repeat, count, range

transformation

map, enumerate, starmap, cycle, chunks

selection

take, takelast, skip, skiplast, getitem, filter, until, takewhile, dropwhile

combination

map, zip, merge, chain, ziplatest

aggregation

accumulate, reduce, list

advanced

concat, flatten, switch, concatmap, flatmap, switchmap

timing

spaceout, timeout, delay

miscellaneous

action, print
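
The short sketch below chains operators from several of these categories; the final value is an inference from the documented operator behavior rather than an official example:

import asyncio
from aiostream import stream, pipe


async def main():

    # creation: emit 0, 1, 2, ... every 0.1 seconds
    xs = stream.count(interval=0.1)

    ys = (
        xs
        | pipe.filter(lambda x: x % 2 == 0)  # selection: keep even numbers
        | pipe.map(lambda x: x ** 2)         # transformation: square them
        | pipe.take(5)                       # selection: stop after 5 items
        | pipe.accumulate()                  # aggregation: running sum
    )

    # Last accumulated value: 0 + 4 + 16 + 36 + 64 = 120
    print(await ys)


asyncio.run(main())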

Demonstration

The following example demonstrates most of the streams' capabilities:

import asyncio
from aiostream import stream, pipe


async def main():

    # Create a counting stream with a 0.2-second interval
    xs = stream.count(interval=0.2)

    # Operators can be piped using '|'
    ys = xs | pipe.map(lambda x: x**2)

    # Streams can be sliced
    zs = ys[1:10:2]

    # Use a stream context for proper resource management
    async with zs.stream() as streamer:

        # Asynchronous iteration
        async for z in streamer:

            # Print 1, 9, 25, 49 and 81
            print('->', z)

    # Streams can be awaited and return the last value
    print('9² = ', await zs)

    # Streams can be run several times
    print('9² = ', await zs)

    # Streams can be concatenated
    one_two_three = stream.just(1) + stream.range(2, 4)

    # Print [1, 2, 3]
    print(await stream.list(one_two_three))


# Run the main coroutine
asyncio.run(main())
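
The advanced and timing operators are not covered above; the following minimal sketch combines flatmap and timeout (the printed result is an assumption based on the documented operator behavior):

import asyncio
from aiostream import stream, pipe


async def main():

    # For each received value, spawn a sub-stream of two items,
    # then flatten all sub-streams concurrently with 'flatmap'
    xs = (
        stream.range(1, 4, interval=0.1)
        | pipe.flatmap(lambda x: stream.repeat(x, times=2))
        | pipe.timeout(5)  # fail if no new item shows up within 5 seconds
    )
    print(await stream.list(xs))  # [1, 1, 2, 2, 3, 3]


asyncio.run(main())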

More examples are available in the example section of the documentation.

Contact

Vincent Michel: vxgmichel@gmail.com

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

aiostream-0.5.0.tar.gz (35.2 kB)

Uploaded Source

Built Distribution

aiostream-0.5.0-py3-none-any.whl (39.4 kB)

Uploaded Python 3

File details

Details for the file aiostream-0.5.0.tar.gz.

File metadata

  • Download URL: aiostream-0.5.0.tar.gz
  • Upload date:
  • Size: 35.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.18

File hashes

Hashes for aiostream-0.5.0.tar.gz
Algorithm Hash digest
SHA256 d64aa63cb9b96b4dae74b09f260d2d0fa423fbbd17bce29ed39ff8e5caf95b57
MD5 8bba6eef29d68ee6c81d50a995396652
BLAKE2b-256 74c4778bdcc38ff8c9b0a335a9ab2c9c00e1907b3eec310a342ab244c963e753

File details

Details for the file aiostream-0.5.0-py3-none-any.whl.

File metadata

  • Download URL: aiostream-0.5.0-py3-none-any.whl
  • Upload date:
  • Size: 39.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.18

File hashes

Hashes for aiostream-0.5.0-py3-none-any.whl
Algorithm Hash digest
SHA256 daf1979a683452fbcd9b1a6933b3d7c9237d3c240b9737f176ab6e1aaf2d8bca
MD5 beb3d4ff87f8dbf0e74402ff21a03a52
BLAKE2b-256 6feaeba7ae76ce3c0fa178cee4edc87a3f0b2669ab5c055b4a2d56f2d6658fe5
