multi backend asyncio cache
An asyncio cache that supports multiple backends.
This library aims for simplicity over specialization. It provides a common interface for all caches, which allows you to store any Python object. The operations supported by all backends are:
add
get
set
multi_get
multi_set
delete
exists
expire
clear
raw: Sends a raw command to the underlying client
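To make that common interface concrete, here is a toy in-memory implementation of a few of these operations. This is illustrative only; MiniCache and its internals are invented for this sketch and are not part of aiocache:

```python
import asyncio
import time


class MiniCache:
    """Toy in-memory cache; aiocache's real backends expose the same verbs."""

    def __init__(self):
        self._data = {}  # key -> (value, expires_at or None)

    def _alive(self, key):
        item = self._data.get(key)
        if item is None:
            return False
        _, expires_at = item
        if expires_at is not None and time.monotonic() >= expires_at:
            del self._data[key]
            return False
        return True

    async def set(self, key, value, ttl=None):
        expires_at = time.monotonic() + ttl if ttl else None
        self._data[key] = (value, expires_at)
        return True

    async def add(self, key, value, ttl=None):
        # add only succeeds if the key is not already stored
        if self._alive(key):
            raise ValueError("Key {} already exists".format(key))
        return await self.set(key, value, ttl=ttl)

    async def get(self, key, default=None):
        return self._data[key][0] if self._alive(key) else default

    async def multi_set(self, pairs):
        for key, value in pairs:
            await self.set(key, value)
        return True

    async def multi_get(self, keys):
        return [await self.get(key) for key in keys]

    async def delete(self, key):
        return 1 if self._data.pop(key, None) is not None else 0

    async def exists(self, key):
        return self._alive(key)


async def demo():
    cache = MiniCache()
    await cache.set("key", "value")
    await cache.multi_set([("a", 1), ("b", 2)])
    return await cache.multi_get(["a", "b", "key"])


result = asyncio.run(demo())  # [1, 2, "value"]
```

Because every backend answers the same verbs, code written against this interface can switch backends without changing the call sites.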
How does it work
Aiocache provides 3 main entities:
backends: Allow you to specify which backend you want to use for your cache. Currently supported: SimpleMemoryCache, RedisCache using aioredis and MemcachedCache using aiomcache.
serializers: Serialize and deserialize the data between your code and the backends. This allows you to save any Python object into your cache. Currently supported: DefaultSerializer, PickleSerializer, JsonSerializer.
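Conceptually, a serializer is just a matched dumps/loads pair. The sketch below illustrates the trade-off between a pickle-based and a JSON-based serializer; the PickleLikeSerializer and JsonLikeSerializer names are invented for illustration and are not aiocache classes:

```python
import json
import pickle


class PickleLikeSerializer:
    """Sketch: pickle can round-trip almost any Python object, e.g. sets."""

    def dumps(self, value):
        return pickle.dumps(value)

    def loads(self, value):
        return pickle.loads(value)


class JsonLikeSerializer:
    """Sketch: JSON is human-readable but limited to JSON-compatible types."""

    def dumps(self, value):
        return json.dumps(value)

    def loads(self, value):
        return json.loads(value)


# A set survives the pickle round-trip; json.dumps would reject it.
stored = PickleLikeSerializer().dumps({"status": 200, "tags": {"a", "b"}})
restored = PickleLikeSerializer().loads(stored)
```

This is why the decorator example further down pairs a namedtuple return value with PickleSerializer: JSON could not represent it directly.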
plugins: Implement a hooks system that allows you to execute extra behavior before and after each command.
If a backend, serializer or plugin you think would be interesting for the package is missing, do not hesitate to open a new issue.
Those 3 entities combine during some of the cache operations to apply the desired command (backend), data transformation (serializer) and pre/post hooks (plugins). For a better picture of what happens, you can check how the set function works in aiocache.
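A rough sketch of how the three entities might combine in a set call: run the pre hooks, serialize the value, issue the command against the backend, then run the post hooks. This is illustrative only; SketchCache, JsonSer and LoggingPlugin are invented names, not aiocache internals:

```python
import asyncio
import json


class LoggingPlugin:
    """Illustrative plugin: hooks that run before and after a command."""

    calls = []

    async def pre_set(self, key, value):
        self.calls.append(("pre_set", key))

    async def post_set(self, key, value):
        self.calls.append(("post_set", key))


class JsonSer:
    """Illustrative serializer: a plain dumps/loads pair."""

    dumps = staticmethod(json.dumps)
    loads = staticmethod(json.loads)


class SketchCache:
    def __init__(self, serializer, plugins):
        self.serializer = serializer
        self.plugins = plugins
        self._backend = {}  # stands in for redis/memcached/memory

    async def set(self, key, value):
        for plugin in self.plugins:          # 1. pre hooks (plugins)
            await plugin.pre_set(key, value)
        raw = self.serializer.dumps(value)   # 2. data transformation (serializer)
        self._backend[key] = raw             # 3. the actual command (backend)
        for plugin in self.plugins:          # 4. post hooks (plugins)
            await plugin.post_set(key, value)
        return True

    async def get(self, key):
        return self.serializer.loads(self._backend[key])


plugin = LoggingPlugin()
cache = SketchCache(JsonSer(), [plugin])
asyncio.run(cache.set("key", {"n": 1}))
value = asyncio.run(cache.get("key"))
```

The backend never sees the Python object, only the serialized bytes/string, which is what lets any backend store any object.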
Usage
Install the package with pip install aiocache.
simple redis
```python
import asyncio

from aiocache import RedisCache

cache = RedisCache(endpoint="127.0.0.1", port=6379, namespace="main")


async def redis():
    await cache.set("key", "value")
    await cache.set("expire_me", "value", ttl=10)
    assert await cache.get("key") == "value"
    assert await cache.get("expire_me") == "value"
    assert await cache.raw("ttl", "main:expire_me") > 0


def test_redis():
    loop = asyncio.get_event_loop()
    loop.run_until_complete(redis())
    loop.run_until_complete(cache.delete("key"))
    loop.run_until_complete(cache.delete("expire_me"))


if __name__ == "__main__":
    test_redis()
```
cached decorator
```python
import asyncio
from collections import namedtuple

from aiocache import cached, RedisCache
from aiocache.serializers import PickleSerializer

Result = namedtuple('Result', "content, status")


@cached(ttl=10, cache=RedisCache, serializer=PickleSerializer())
async def async_main():
    print("First ASYNC non cached call...")
    await asyncio.sleep(1)
    return Result("content", 200)


if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    print(loop.run_until_complete(async_main()))
    print(loop.run_until_complete(async_main()))
    print(loop.run_until_complete(async_main()))
    print(loop.run_until_complete(async_main()))
```
By default the decorator uses the SimpleMemoryCache backend and the DefaultSerializer. If you want to use a different backend, call it with cached(ttl=10, cache=RedisCache). If you want a specific serializer, use cached(ttl=10, serializer=DefaultSerializer()).
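The idea behind the decorator can be sketched with a plain memoizing wrapper. This is illustrative only; cached_sketch is not aiocache's cached, which stores the result in a real cache backend rather than a local dict:

```python
import asyncio
import functools
import time


def cached_sketch(ttl=None):
    """Sketch: first call runs the coroutine, later calls within the
    ttl return the stored result without running the body again."""
    def decorator(fn):
        store = {}  # key -> (result, expires_at or None)

        @functools.wraps(fn)
        async def wrapper(*args):
            key = (fn.__name__, args)
            if key in store:
                result, expires_at = store[key]
                if expires_at is None or time.monotonic() < expires_at:
                    return result
            result = await fn(*args)
            expires_at = time.monotonic() + ttl if ttl else None
            store[key] = (result, expires_at)
            return result

        return wrapper
    return decorator


calls = []


@cached_sketch(ttl=10)
async def slow_square(n):
    calls.append(n)  # records how often the body actually runs
    return n * n


first = asyncio.run(slow_square(3))
second = asyncio.run(slow_square(3))  # served from the cache; body not re-run
```

In the decorator example above, this is why only the first call prints "First ASYNC non cached call..." and sleeps; the three following calls return the cached Result immediately.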