Project description

This package provides an easy API for moving work out of the tornado process / event loop.

Currently implemented methods are:

  • execute the code via an http hook on another server (a django implementation is included);

  • execute the code in a separate thread (a thread pool is used);

  • dummy immediate execution.

API example:

from django.contrib.auth.models import User
from slacker import adisp
from slacker import Slacker
from slacker.workers import DjangoWorker

AsyncUser = Slacker(User, DjangoWorker())

@adisp.process
def process_data():
    # all the django ORM is supported; the query will be executed
    # on the remote end and will not block the IOLoop

    users = yield AsyncUser.objects.filter(is_staff=True)[:5]
    print users

(pep-342 syntax and the adisp library are optional; callback-style code is also supported)

Installation

pip install tornado-slacker

Slackers and workers

In order to execute some code in a non-blocking manner (a combined sketch follows these steps):

  1. Create a Slacker (configured with the desired worker) for some Python object:

    from slacker import Slacker
    from slacker.workers import ThreadWorker
    
    class Foo(object):
        # ...
    
    worker = ThreadWorker()
    AsyncFoo = Slacker(Foo, worker)
  2. Build a query (you can access attributes, make calls and use slicing):

    query = AsyncFoo('foo').do_blocking_operation(param1, param2)[0]
  3. Execute the query:

    def callback(result):
        # ...
    
    query.proceed(callback)

    or, using pep-342 style:

    from slacker import adisp
    
    @adisp.process
    def handler():
        result = yield query
        # ...
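
Putting these steps together, here is a minimal sketch of how this might look inside a Tornado request handler. It uses the pre-6.0 tornado.web.asynchronous decorator to match the era of this package; fetch_report and ReportHandler are hypothetical names introduced for illustration, not part of the library:

import tornado.web

from slacker import adisp
from slacker import Slacker
from slacker.workers import ThreadWorker

def fetch_report(name):
    # some slow, IO-bound work that would otherwise block the IOLoop
    return 'report for %s' % name

# step 1: wrap the callable into a Slacker configured with a worker
async_fetch_report = Slacker(fetch_report, ThreadWorker())

class ReportHandler(tornado.web.RequestHandler):
    @tornado.web.asynchronous
    @adisp.process
    def get(self):
        # steps 2 and 3: build the query and yield it; the blocking call
        # runs in the thread pool while the IOLoop keeps serving requests
        report = yield async_fetch_report(self.get_argument('name'))
        self.write(report)
        self.finish()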

Slackers

Slackers are special objects that collect operations (attribute access, calls, slicing) without actually executing them:

>>> from slacker import Slacker
>>> class Foo():
...     pass
...
>>> FooSlacker = Slacker(Foo)
>>> FooSlacker.hello.world()
__main__.Foo: [('hello',), ('world', (), {})]

>>> FooSlacker(name='me').hello.world(1, y=3)[:3]
__main__.Foo: [(None, (), {'name': 'me'}),
 ('hello',),
 ('world', (1,), {'y': 3}),
 (slice(None, 3, None), None)]

Arguments to callables must be picklable. Slackers also provide a method to apply the collected operations to a base object.
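
As a quick illustration of the picklability requirement (this uses the standard pickle module directly and is independent of the Slacker API):

import pickle

# plain data structures pickle fine and can be passed as arguments
pickle.dumps({'name': 'me', 'limit': 3})

# lambdas, open files, sockets and the like cannot be pickled,
# so they cannot be used as arguments
pickle.dumps(lambda x: x)  # raises pickle.PicklingError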

Any picklable object (including top-level functions and classes) can be wrapped into a Slacker, e.g.:

from slacker import adisp
from slacker import Slacker
from slacker.workers import ThreadWorker

def task(param1, param2):
    # do something blocking and io-bound
    return results

async_task = Slacker(task, ThreadWorker())

# pep-342-style
@adisp.process
def process_data():
    results = yield async_task('foo', 'bar')
    print results

# callback style
def process_data2():
    async_task('foo', 'bar').proceed(on_result)

def on_result(results):
    print results

Python modules can also be Slackers:

import shutil
from slacker import Slacker
from slacker.workers import ThreadWorker

shutil_async = Slacker(shutil, ThreadWorker())
op = shutil_async.copy('file1.txt', 'file2.txt')
op.proceed()

Workers

Workers are classes that decide how and where the work should be done:

  • slacker.workers.DummyWorker executes code in-place (this is blocking);

  • slacker.workers.ThreadWorker executes code in a thread from a thread pool;

  • slacker.workers.HttpWorker pickles the slacker, makes an async http request with this data to a given server hook and expects it to execute the code and return pickled results;

  • slacker.workers.DjangoWorker is just an HttpWorker with default values for use with the bundled django remote server hook implementation (slacker.django_backend).

    In order to enable the django hook, include ‘slacker.django_backend.urls’ in urls.py and add a SLACKER_SERVER option with the server address to settings.py (a configuration sketch follows this list).

    SLACKER_SERVER is ‘127.0.0.1:8000’ by default, so this should work out of the box with the development server.
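
For example, a minimal sketch of that configuration might look like this (Django 1.x style to match the era of this package; including the hook URLs at the project root is an assumption, adjust the prefix to your project):

# settings.py
SLACKER_SERVER = '127.0.0.1:8000'  # address of the server that runs the hook

# urls.py
from django.conf.urls.defaults import patterns, include, url

urlpatterns = patterns('',
    url(r'', include('slacker.django_backend.urls')),
)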

Parallel execution

Parallel task execution is supported via the adisp library:

def _task1(param1, param2):
    # do something blocking
    return results

def _task2():
    # do something blocking
    return results

# worker can be reused
worker = ThreadWorker()
task1 = Slacker(_task1, worker)
task2 = Slacker(_task2, worker)

@adisp.process
def process_data():
    # this will execute task1 and task2 in parallel
    # and return the result after all data is ready
    res1, res2 = yield task1('foo', 'bar'), task2()
    print res1, res2

Contributing

If you have any suggestions, bug reports or annoyances, please report them to the issue tracker:

Source code:

Both hg and git pull requests are welcome!

Credits

Inspiration:

Third-party software: adisp (the tornado adisp implementation is taken from brukva).

License

The license is MIT.

The bundled adisp library uses the Simplified BSD License.

Download files

Source distribution: tornado-slacker-0.0.3.tar.gz (10.3 kB)

File details

Hashes for tornado-slacker-0.0.3.tar.gz:

  • SHA256: 15fb3cc5f798ac2e60d62b977151955b8c53b4298f98445185d6d93143c27ca2

  • MD5: 9c60ce78c73234fe3719b53754300b1d

  • BLAKE2b-256: 3d72f710d35f780f9cfb667790f914691a2af8adb46d847a30315f34bb1bba33
