
RQ is a simple, lightweight library for creating and processing background jobs.

Project description

RQ (Redis Queue) is a simple Python library for queueing jobs and processing them in the background with workers. It is backed by Redis and designed to have a low barrier to entry, so it can be integrated into your web stack easily.

RQ requires Redis >= 3.0.0.


Full documentation can be found at https://python-rq.org/.

Support RQ

If you find RQ useful, please consider supporting this project via Tidelift.

Getting started

First, run a Redis server, of course:

$ redis-server

To put jobs on queues, you don't have to do anything special; just define your typically lengthy or blocking function:

import requests

def count_words_at_url(url):
    """Just an example function that's called async."""
    resp = requests.get(url)
    return len(resp.text.split())

Then, create an RQ queue:

from redis import Redis
from rq import Queue

queue = Queue(connection=Redis())

And enqueue the function call:

from my_module import count_words_at_url
job = queue.enqueue(count_words_at_url, 'http://nvie.com')
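
The enqueue call returns a Job handle right away; the function itself runs later in a worker process. As a minimal sketch (assuming a worker is running, using RQ's Job accessors), you can inspect the job afterwards:

import time

print(job.id)            # unique job identifier
print(job.get_status())  # e.g. 'queued', 'started', 'finished'

time.sleep(2)            # give a running worker a moment to process the job
print(job.result)        # the function's return value once finished, else None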

Scheduling jobs is similarly easy:

from datetime import datetime, timedelta

# Schedule job to run at 9:15, October 10th
job = queue.enqueue_at(datetime(2019, 10, 10, 9, 15), say_hello)

# Schedule job to run in 10 seconds
job = queue.enqueue_in(timedelta(seconds=10), say_hello)

Retrying failed jobs is also supported:

from rq import Retry

# Retry up to 3 times; failed jobs are requeued immediately
queue.enqueue(say_hello, retry=Retry(max=3))

# Retry up to 3 times, with configurable intervals between retries
queue.enqueue(say_hello, retry=Retry(max=3, interval=[10, 30, 60]))

For a more complete example, refer to the docs. But this is the essence.

The worker

To start executing enqueued function calls in the background, start a worker from your project's directory:

$ rq worker --with-scheduler
*** Listening for work on default
Got count_words_at_url('http://nvie.com') from default
Job result = 818
*** Listening for work on default
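
Workers can also be started from Python instead of the CLI. A minimal sketch (the queue name 'default' matches the example above):

from redis import Redis
from rq import Queue, Worker

redis_conn = Redis()
queue = Queue('default', connection=redis_conn)

# Blocks and processes jobs as they arrive; with_scheduler also runs scheduled jobs
worker = Worker([queue], connection=redis_conn)
worker.work(with_scheduler=True)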

That's about it.

Installation

Simply use the following command to install the latest released version:

pip install rq

If you want the cutting edge version (that may well be broken), use this:

pip install git+https://github.com/rq/rq.git@master#egg=rq

Docs

To build and run the docs, install Jekyll and run:

cd docs
jekyll serve

Related Projects

If you use RQ, check out these repositories, which may be useful in your RQ-based project.

Project history

This project has been inspired by the good parts of Celery, Resque and this snippet, and has been created as a lightweight alternative to the heaviness of Celery or other AMQP-based queueing implementations.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

rq-2.0.0.tar.gz (639.2 kB, Source)

Built Distribution

rq-2.0.0-py3-none-any.whl (95.5 kB, Python 3)

File details

Details for the file rq-2.0.0.tar.gz.

File metadata

  • Download URL: rq-2.0.0.tar.gz
  • Upload date:
  • Size: 639.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.10

File hashes

Hashes for rq-2.0.0.tar.gz

  • SHA256: 76d2a4a27f8fd5c4cfa200cd442efe3c1fd73525c676af06f07fcc0b81bdb70d
  • MD5: b9e104a495f06bf14fb94f50dde910f2
  • BLAKE2b-256: e329ce60c4571d51d2b5507961646088999dd0fa63f84c020d61fe6472aa480c

See more details on using hashes here.
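
If you want to check a downloaded file against the hashes listed above, here is a minimal sketch using Python's hashlib (it assumes the file sits in the current directory):

import hashlib

# Expected SHA256 value taken from the hash listing above
expected = "76d2a4a27f8fd5c4cfa200cd442efe3c1fd73525c676af06f07fcc0b81bdb70d"

with open("rq-2.0.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "hash mismatch")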

File details

Details for the file rq-2.0.0-py3-none-any.whl.

File metadata

  • Download URL: rq-2.0.0-py3-none-any.whl
  • Upload date:
  • Size: 95.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.10

File hashes

Hashes for rq-2.0.0-py3-none-any.whl

  • SHA256: a3a767876675dcc42683bac1869494c5020ba7fcf5c026d1f6d36a8ab98573a6
  • MD5: 930b1221f82e9debacf598a7a28dbe21
  • BLAKE2b-256: 770a09ce745d9639e883888ddae0f71b23a20c59d072eb17b80b8ea8638b16ac

See more details on using hashes here.
