Caches your Django ORM queries and automatically invalidates them.


Documentation: http://django-cachalot.readthedocs.io



Table of Contents:

  • Quickstart

  • Usage

  • Hacking

  • Benchmark

  • Third-Party Cache Comparison

  • Discussion

Quickstart

Cachalot officially supports Python 3.7-3.11 and Django 3.2, 4.1, 4.2, 5.0, 5.1 with the databases PostgreSQL, SQLite, and MySQL.

Note: an upper limit on Django version is set for your safety. Please do not ignore it.

Usage

  1. pip install django-cachalot

  2. Add 'cachalot' to your INSTALLED_APPS

  3. If you use multiple servers with a common cache server, double-check their clock synchronisation

  4. If you modify data outside Django – typically after restoring a SQL database – use the manage.py invalidate_cachalot command

  5. Be aware of the few other limitations

  6. If you use django-debug-toolbar, you can add 'cachalot.panels.CachalotPanel' to your DEBUG_TOOLBAR_PANELS

  7. Enjoy!
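The steps above amount to a small settings change. A minimal sketch, assuming the local-memory cache backend (you would typically use Redis or Memcached in production) and the default cache alias:

```python
# settings.py sketch -- minimal configuration, assuming LocMemCache;
# swap in a Redis or Memcached backend for a real deployment.
INSTALLED_APPS = [
    'django.contrib.contenttypes',
    'django.contrib.auth',
    # ... your apps ...
    'cachalot',
]

CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
    },
}

# Optional: which cache alias cachalot uses (defaults to 'default').
CACHALOT_CACHE = 'default'
```

See the documentation for the full list of CACHALOT_* settings.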

Hacking

To start developing, install the requirements and run the tests via tox.

Make sure you have the following services:

  • Memcached

  • Redis

  • PostgreSQL

  • MySQL

For setup:

  1. Install: pip install -r requirements/hacking.txt

  2. For PostgreSQL: CREATE ROLE cachalot LOGIN SUPERUSER;

  3. Run: tox --current-env to run the test suite on your current Python version.

  4. You can also run specific databases and Django versions: tox -e py38-django3.1-postgresql-redis

Benchmark

Currently, benchmarks are supported on Linux and Mac/Darwin. You will need a database called “cachalot” on both MySQL and PostgreSQL, and on PostgreSQL you will also need to create a role called “cachalot”. You can also just run the benchmark, and it will raise errors with specific instructions on how to fix them.

  1. Install: pip install -r requirements/benchmark.txt

  2. Run: python benchmark.py

The output will be in benchmark/TODAY’S_DATE/

TODO: Create a docker-compose file to make running the benchmarks easier.

Third-Party Cache Comparison

There are three main third-party caches: cachalot, cache-machine, and cache-ops. Which should you use? We suggest a mix:

TL;DR: Use cachalot for cold caches, or for tables modified fewer than 50 times per minute. Most people should stick with cachalot alone, since you most likely won't need to scale to the point of adding cache-machine to the mix. If you're an enterprise that already has detailed usage statistics, then the best mix is cachalot for your cold caches and cache-machine for your hot caches. On top of that, when performing joins with select_related and prefetch_related, you can get a nearly 100x speed-up for your initial deployment.

Recall that cachalot caches THE ENTIRE TABLE. That's where its inefficiency stems from: if you keep updating the records, cachalot constantly invalidates the table and re-caches it. Luckily, caching itself is very efficient; it's the cache invalidation part that kills all our systems. See Note 1 below for how Reddit deals with this.
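To make that trade-off concrete, here is a toy table-granular cache in plain Python (a simplified model, NOT cachalot's actual implementation): every query result is keyed by its table, and any write to the table evicts every cached query on it.

```python
# Toy model of table-granular caching (simplified sketch, NOT cachalot's
# real code): query results are cached per table, and any write to the
# table drops all cached results for that table.
class TableCache:
    def __init__(self):
        self._store = {}  # (table, query) -> cached result

    def get_or_compute(self, table, query, compute):
        key = (table, query)
        if key not in self._store:
            self._store[key] = compute()  # cache miss: hit the database
        return self._store[key]

    def invalidate(self, table):
        # One INSERT/UPDATE/DELETE evicts *every* cached query on the table.
        self._store = {k: v for k, v in self._store.items() if k[0] != table}


cache = TableCache()
db_hits = []

def fetch_users():
    db_hits.append(1)  # stands in for a real SQL round-trip
    return ["alice", "bob"]

cache.get_or_compute("auth_user", "SELECT * FROM auth_user", fetch_users)  # miss
cache.get_or_compute("auth_user", "SELECT * FROM auth_user", fetch_users)  # hit
cache.invalidate("auth_user")  # any write to the table
cache.get_or_compute("auth_user", "SELECT * FROM auth_user", fetch_users)  # miss again
```

The third lookup has to go back to the database even though nothing about that particular query changed; that is the cost of table-level granularity under frequent writes.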

Cachalot is more-or-less intended for cold caches or “just-right” conditions. If you find a partition library for Django (there is a work-in-progress one, also authored by Andrew Chen Wang), caching will work better, since shards holding cold, rarely accessed records are invalidated far less often.

Cachalot is good when there are fewer than 50 modifications per minute on a hot cached table. This is mostly down to cache invalidation, and it's the same with any cache, which is why we suggest cache-machine for hot caches. Cache-machine caches individual objects, taking up more space in the memory store but invalidating only those individual objects instead of the entire table, as cachalot does.
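For contrast, here is a toy per-object cache in the style cache-machine uses (an illustration, NOT cache-machine's real API): a write evicts only the one row's entry, so the rest of the table's cached objects survive.

```python
# Toy per-object cache (illustration only, NOT cache-machine's API):
# each object is cached under (table, pk); a write to one row evicts
# only that row's entry, never the whole table.
class ObjectCache:
    def __init__(self):
        self._store = {}  # (table, pk) -> cached object

    def get_or_compute(self, table, pk, compute):
        key = (table, pk)
        if key not in self._store:
            self._store[key] = compute()  # cache miss: hit the database
        return self._store[key]

    def invalidate(self, table, pk):
        # A write to one row evicts only that row's cache entry.
        self._store.pop((table, pk), None)


cache = ObjectCache()
cache.get_or_compute("auth_user", 1, lambda: {"id": 1, "name": "alice"})
cache.get_or_compute("auth_user", 2, lambda: {"id": 2, "name": "bob"})
cache.invalidate("auth_user", 1)  # UPDATE on user 1 evicts only user 1
```

User 2 stays cached after the write to user 1, which is exactly what you want for a hot table; the price is one cache entry per object rather than one per query.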

Yes, the bane of our entire existence lies in cache invalidation and naming variables. Why does cachalot struggle with a huge table that's modified rapidly? Because your cold records (90% of the table) are mixed in with your hot records (10%), every write caches and invalidates the entire table. It's like boiling one ton of noodles in ONE pot instead of spreading them across 100 pots: splitting them up is far more efficient.

Note 1: My personal experience with caches stems from Reddit’s: https://web.archive.org/web/20210803213621/https://redditblog.com/2017/01/17/caching-at-reddit/

Note 2: Technical comparison: https://django-cachalot.readthedocs.io/en/latest/introduction.html#comparison-with-similar-tools

Discussion

Help? Technical chat? It’s here on Discord.


Project details


Download files

Download the file for your platform.

Source Distribution

django_cachalot-2.6.3.tar.gz (73.7 kB)

Uploaded Source

Built Distribution

django_cachalot-2.6.3-py3-none-any.whl (55.2 kB)

Uploaded Python 3

File details

Details for the file django_cachalot-2.6.3.tar.gz.

File metadata

  • Download URL: django_cachalot-2.6.3.tar.gz
  • Upload date:
  • Size: 73.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for django_cachalot-2.6.3.tar.gz:

  • SHA256: d1235111236ef2bfcc32337774eb620694bba8d49dd10a652d96aa8a1991a3be

  • MD5: c9b3195f649940891af31a8cef2ef90e

  • BLAKE2b-256: f8ef31d347ee35e1d81f3fdb362321370536c255144fee597ab9c4d54ad23426



File details

Details for the file django_cachalot-2.6.3-py3-none-any.whl.

File metadata

File hashes

Hashes for django_cachalot-2.6.3-py3-none-any.whl:

  • SHA256: a37d38e29e35c93945cd372888a21ab82115d6b0c334a6f8bc43a450abfdf0b5

  • MD5: 77e1c78f5cd80a060ecaf8b573cd6cb5

  • BLAKE2b-256: 8735264913d3f229416c7644c7e477223de159deb244147fde6489fd037c91e4


