
Distributed Python job queue with asyncio and redis

Project description

SAQ

SAQ (Simple Async Queue) is a simple and performant job queueing framework built on top of asyncio and redis. It can be used for processing background jobs with workers. For example, you could use SAQ to schedule emails, execute long queries, or do expensive data analysis.

It uses aioredis >= 2.0.

It is similar to RQ and heavily inspired by ARQ. Unlike RQ, it is async and thus significantly faster if your jobs are async. Even if they are not, SAQ is still considerably faster due to lower overhead.

SAQ optionally comes with a simple UI for monitoring workers and jobs.

(Screenshot: SAQ web UI)

Install

# minimal install
pip install saq

# web + hiredis
pip install saq[web,hiredis]

Usage

usage: saq [-h] [--workers WORKERS] [--verbose] [--web] settings

Start Simple Async Queue Worker

positional arguments:
  settings           Namespaced variable containing worker settings, e.g. module_a.settings

options:
  -h, --help         show this help message and exit
  --workers WORKERS  Number of worker processes
  --verbose, -v      Logging level: 0: ERROR, 1: INFO, 2: DEBUG
  --web              Start web app
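
For example, a worker with four processes, verbose logging, and the web UI enabled could be started like this (module_a.settings is the placeholder settings path from the help text above):

saq module_a.settings --workers 4 --web -v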

Example

import asyncio

from saq import CronJob, Queue

# all functions take in context dict and kwargs
async def test(ctx, *, a):
    await asyncio.sleep(0.5)
    # result should be json serializable
    # custom serializers and deserializers can be used through Queue(dump=,load=)
    return {"x": a}

async def cron(ctx):
  print("i am a cron job")

async def startup(ctx):
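    # create_db() is illustrative here and stands in for your own connection setup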
    ctx["db"] = await create_db()

async def shutdown(ctx):
    await ctx["db"].disconnect()

async def before_process(ctx):
    print(ctx["job"], ctx["db"])

async def after_process(ctx):
    pass

queue = Queue.from_url("redis://localhost")

settings = {
    "queue": queue,
    "functions": [test],
    "concurrency": 10,
    "cron_jobs": [CronJob(cron, cron="* * * * * */5")], # run every 5 seconds
    "startup": startup,
    "shutdown": shutdown,
    "before_process": before_process,
    "after_process": after_process,
}
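
As the comment in the example above notes, custom serializers and deserializers can be supplied through Queue(dump=, load=). A minimal sketch, assuming Queue.from_url forwards extra keyword arguments to the Queue constructor:

import json

from saq import Queue

# swap in any callables with json.dumps/json.loads-style signatures
queue = Queue.from_url("redis://localhost", dump=json.dumps, load=json.loads)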

To start the worker, assuming the settings above are importable on the Python path:

saq module.file.settings

To enqueue jobs:

import time

# schedule a job normally
job = await queue.enqueue("test", a=1)

# wait 1 second for the job to complete
await job.refresh(1)
print(job.results)

# schedule a job in 10 seconds
await queue.enqueue("test", a=1, scheduled=time.time() + 10)

Demo

Start the worker

saq examples.simple.settings --web

Navigate to the web UI

Enqueue jobs

python examples/simple.py

Comparison to ARQ

SAQ is heavily inspired by ARQ but has several enhancements.

  1. Avoids polling by leveraging BLMOVE or RPOPLPUSH and NOTIFY
    1. SAQ has much lower latency than ARQ, with delays of < 5ms. ARQ's default polling frequency is 0.5 seconds
    2. SAQ is up to 8x faster than ARQ
  2. Web interface for monitoring queues and workers
  3. Heartbeat monitor for abandoned jobs
  4. More robust failure handling
    1. Storage of stack traces
    2. Sweeping stuck jobs
    3. Handling of cancelled jobs (e.g. from machine redeployments) differently from failed jobs
  5. Before and after job hooks
  6. Easily run multiple workers to leverage more cores

Development

python -m venv env
source env/bin/activate
pip install -e ".[dev,web]"
docker run -p 6379:6379 redis
./run_checks.sh

Download files

Download the file for your platform.

Source Distribution

saq-0.4.2.tar.gz (37.6 kB)

Built Distribution

saq-0.4.2-py3-none-any.whl (35.2 kB)

File details

Details for the file saq-0.4.2.tar.gz.

File metadata

  • Download URL: saq-0.4.2.tar.gz
  • Size: 37.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/34.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.9 tqdm/4.63.1 importlib-metadata/4.11.3 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.10.2

File hashes

Hashes for saq-0.4.2.tar.gz

  • SHA256: ca84614c67656b877d567b052e2568dea438ec3a408bcaf962b9da2b1f7756e5
  • MD5: 430c3b45ce4925928ad1ef55e6c28ffa
  • BLAKE2b-256: f40e0e6d1517862f4e47931f9808fca45d82b1af00997911f4846f81c137dfaa


File details

Details for the file saq-0.4.2-py3-none-any.whl.

File metadata

  • Download URL: saq-0.4.2-py3-none-any.whl
  • Size: 35.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/34.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.9 tqdm/4.63.1 importlib-metadata/4.11.3 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.10.2

File hashes

Hashes for saq-0.4.2-py3-none-any.whl

  • SHA256: 356f1d485a5600bab557bfddd96b80e8ddb2c7155916e7babb0ad8ece5a21e2e
  • MD5: f407951d94347be344dae6b8c8fd2668
  • BLAKE2b-256: a4d8e0d5017f131e6213cef7a78b594ea36df3080b758c3bec089ef2cef16ce3

