
Project description

Caches your Django ORM queries and automatically invalidates them.

Documentation: http://django-cachalot.readthedocs.io



Table of Contents:

  • Quickstart

  • Usage

  • Hacking

  • Benchmark

  • Third-Party Cache Comparison

  • Discussion

Quickstart

Cachalot officially supports Python 3.7-3.11 and Django 3.2, 4.1, 4.2, 5.0, 5.1 with the databases PostgreSQL, SQLite, and MySQL.

Note: an upper limit on Django version is set for your safety. Please do not ignore it.

Usage

  1. pip install django-cachalot

  2. Add 'cachalot', to your INSTALLED_APPS (see the settings sketch after this list)

  3. If you use multiple servers with a common cache server, double check their clock synchronisation

  4. If you modify data outside Django – typically after restoring a SQL database – use the manage.py invalidate_cachalot command

  5. Be aware of the few other limits

  6. If you use django-debug-toolbar, you can add 'cachalot.panels.CachalotPanel', to your DEBUG_TOOLBAR_PANELS

  7. Enjoy!
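
The settings changes from steps 2 and 6 look roughly like this. This is a minimal sketch of a Django settings.py, not cachalot-specific boilerplate: the Redis backend shown is only an example, since cachalot simply uses whichever cache backend your Django CACHES setting points to.

    # settings.py (sketch)

    INSTALLED_APPS = [
        # ... your other apps ...
        'cachalot',
    ]

    # cachalot stores cached ORM queries in your Django cache backend;
    # Redis here is only an example choice (Django 4.0+ ships RedisCache).
    CACHES = {
        'default': {
            'BACKEND': 'django.core.cache.backends.redis.RedisCache',
            'LOCATION': 'redis://127.0.0.1:6379',
        },
    }

    # Optional (step 6): expose cachalot's panel in django-debug-toolbar,
    # alongside the default panels you already list here.
    DEBUG_TOOLBAR_PANELS = [
        # ... the default panels ...
        'cachalot.panels.CachalotPanel',
    ]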

Hacking

To start developing, install the requirements and run the tests via tox.

Make sure you have the following services:

  • Memcached

  • Redis

  • PostgreSQL

  • MySQL

For setup:

  1. Install: pip install -r requirements/hacking.txt

  2. For PostgreSQL: CREATE ROLE cachalot LOGIN SUPERUSER;

  3. Run: tox --current-env to run the test suite on your current Python version.

  4. You can also run the suite against a specific Python version, Django version, database, and cache backend, e.g.: tox -e py38-django3.1-postgresql-redis

Benchmark

Currently, benchmarks are supported on Linux and Mac/Darwin. You will need a database called “cachalot” on both MySQL and PostgreSQL, and on PostgreSQL you will also need to create a role called “cachalot”. Alternatively, you can simply run the benchmark: it will raise errors with specific instructions on how to fix them.

  1. Install: pip install -r requirements/benchmark.txt

  2. Run: python benchmark.py

The output will be in benchmark/TODAY’S_DATE/

TODO: Create a docker-compose file to make it easier to set up the required services and run the benchmark.

Third-Party Cache Comparison

There are three main third-party caching libraries: cachalot, cache-machine, and cache-ops. Which should you use? We suggest a mix:

TL;DR: Use cachalot for cold caches or tables modified fewer than 50 times per minute (most people should stick with cachalot alone, since you most likely won’t need to scale to the point of adding cache-machine into the mix). If you’re an enterprise that already has detailed traffic statistics, then the best mix is cachalot for your cold caches and cache-machine for your hot caches. Note that when performing joins with select_related and prefetch_related, you can get a nearly 100x speed-up for your initial deployment.
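
To illustrate that last point, here is the kind of read-heavy join that benefits most (a sketch with hypothetical Book models; select_related and prefetch_related are standard Django ORM calls, and nothing cachalot-specific is needed in the query itself):

    # With cachalot installed, both the JOIN query and the prefetch query are
    # cached after the first evaluation and served from the cache on later
    # requests, until something writes to the underlying tables.
    books = list(
        Book.objects
        .select_related("publisher")      # JOIN, cached as a single query
        .prefetch_related("authors")      # second query, also cached
    )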

Recall that cachalot caches THE ENTIRE TABLE. That’s where its inefficiency stems from: if you keep updating the records, cachalot constantly invalidates the table and re-caches it. Luckily, caching itself is very efficient; it’s the cache invalidation part that kills all our systems. Look at Note 1 below to see how Reddit deals with it.
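
To make the table-level invalidation concrete, a minimal sketch (hypothetical Book model):

    Book.objects.count()    # first call: runs SQL, result cached by cachalot
    Book.objects.count()    # second call: served from the cache, no SQL
    Book.objects.create(title="Moby-Dick")   # any write to the table...
    Book.objects.count()    # ...invalidates every cached Book query, so SQL runs again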

Cachalot is more or less intended for cold caches or “just-right” conditions. If you find a partition library for Django (one is also being written, as a work in progress, by Andrew Chen Wang), then caching will work even better, since the sharded cold/least-accessed records aren’t invalidated as often.

Cachalot is good when there are fewer than 50 modifications per minute on a hot cached table. This is mostly due to cache invalidation, and it’s the same with any cache, which is why we suggest you use cache-machine for hot caches. Cache-machine caches individual objects, which takes up more room in the memory store, but it invalidates only those individual objects instead of the entire table like cachalot does.
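
If you want to apply that advice within a single project, cachalot lets you keep hot tables out of its cache entirely. A minimal sketch (the table names are hypothetical; CACHALOT_UNCACHABLE_TABLES and CACHALOT_ONLY_CACHABLE_TABLES are documented cachalot settings that list database table names):

    # settings.py (sketch)

    # Never cache the hot, frequently written tables...
    CACHALOT_UNCACHABLE_TABLES = frozenset((
        'shop_order',       # hypothetical table written on every purchase
        'shop_cartitem',
    ))

    # ...or, alternatively, whitelist only the cold tables:
    # CACHALOT_ONLY_CACHABLE_TABLES = frozenset((
    #     'shop_book',
    #     'shop_publisher',
    # ))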

Yes, the bane of our entire existence lies in cache invalidation and naming variables. Why does cachalot struggle with a huge table that’s modified rapidly? Because you’ve mixed your cold (90% of) records with your hot (10% of) records, you’re caching and invalidating the entire table. It’s like boiling one ton of noodles in ONE pot instead of spreading them across 100 pots: splitting them up is far more efficient.

Note 1: My personal experience with caches stems from Reddit’s: https://web.archive.org/web/20210803213621/https://redditblog.com/2017/01/17/caching-at-reddit/

Note 2: Technical comparison: https://django-cachalot.readthedocs.io/en/latest/introduction.html#comparison-with-similar-tools

Discussion

Help? Technical chat? It’s here on Discord.

Legacy chats:




Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

django_cachalot-2.7.0.tar.gz (74.1 kB)

Uploaded Source

Built Distribution

django_cachalot-2.7.0-py3-none-any.whl (55.5 kB)

Uploaded Python 3

File details

Details for the file django_cachalot-2.7.0.tar.gz.

File metadata

  • Download URL: django_cachalot-2.7.0.tar.gz
  • Upload date:
  • Size: 74.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for django_cachalot-2.7.0.tar.gz:

  • SHA256: 834b95d5cffdc8953018a33a4e16ef63c8cfe2226a4dc36573e8bf3d62130175

  • MD5: b0e45ecf16c6ad27ee66fabb62a56d5b

  • BLAKE2b-256: 473724b5344f682808d0f5488ef154e8ddb0d9c65a68e155ded7787d080667a3


File details

Details for the file django_cachalot-2.7.0-py3-none-any.whl.

File metadata

File hashes

Hashes for django_cachalot-2.7.0-py3-none-any.whl:

  • SHA256: a96fa16a25dc07110770d0bde0b01b444c065c7693d4c162c75da9e447f1750c

  • MD5: 13b9fdeb48d67cc1b214ac7b373b7e3c

  • BLAKE2b-256: 91136290ac3e7ba6643585aa38e3bd448a99927f61a9004229847d129e9f8137

