The missing Python utility to read and write large compressed JSONs.

Project description

The library is loosely based on the compress_pickle library.

How do I install this package?

As usual, just install it using pip:

pip install compress_json

Available compression modes

The compression mode is detected automatically from the file name. The supported modes are gzip, bz2 and lzma; the notable exception is zip, which proved difficult to integrate into the JSON pipeline.
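
For illustration, extension-based detection of this kind can be sketched as follows; the mapping and helper name here are hypothetical and not part of the compress_json API:

```python
import os

# Hypothetical extension-to-mode table, illustrating the kind of
# detection compress_json performs on the file name.
EXTENSION_TO_COMPRESSION = {
    ".gz": "gzip",
    ".bz": "bz2",
    ".lzma": "lzma",
}

def infer_compression(path: str) -> str:
    """Return the compression mode implied by the file extension."""
    _, extension = os.path.splitext(path)
    return EXTENSION_TO_COMPRESSION.get(extension, "none")

print(infer_compression("filepath.json.gz"))  # gzip
```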

Usage example

The library is extremely easy to use:

import compress_json

D = {
    "A":{
        "B":"C"
    }
}
compress_json.dump(D, "filepath.json.gz") # for a gzip file
compress_json.dump(D, "filepath.json.bz") # for a bz2 file
compress_json.dump(D, "filepath.json.lzma") # for a lzma file

D1 = compress_json.load("filepath.json.gz") # for loading a gzip file
D2 = compress_json.load("filepath.json.bz") # for loading a bz2 file
D3 = compress_json.load("filepath.json.lzma") # for loading a lzma file
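
For context, here is an equivalent round-trip written against the standard library alone, roughly the pattern a gzip-mode dump and load wrap (a sketch, not the library's actual code):

```python
import gzip
import json
import os
import tempfile

D = {"A": {"B": "C"}}

# Write the document as gzip-compressed JSON, then read it back.
path = os.path.join(tempfile.mkdtemp(), "filepath.json.gz")
with gzip.open(path, "wt", encoding="utf-8") as f:
    json.dump(D, f)

with gzip.open(path, "rt", encoding="utf-8") as f:
    restored = json.load(f)

assert restored == D  # the round-trip preserves the document
```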

Some extra perks: local loading and dumping

In addition to the usual load and dump methods from the json library, this library provides local_load and local_dump, which load and dump files relative to the directory of the calling module, resolved through the call stack.

This is especially useful when loading data files bundled within packages.

import compress_json

D = {
    "A": {
        "B": "C"
    }
}
compress_json.local_dump(D, "filepath.json.gz") # for a gzip file
compress_json.local_dump(D, "filepath.json.bz") # for a bz2 file
compress_json.local_dump(D, "filepath.json.lzma") # for a lzma file

D1 = compress_json.local_load("filepath.json.gz") # for loading a gzip file
D2 = compress_json.local_load("filepath.json.bz") # for loading a bz2 file
D3 = compress_json.local_load("filepath.json.lzma") # for loading a lzma file
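
One way such call-stack resolution can be implemented is shown below, for illustration only; the helper name is hypothetical and this is not compress_json's internal code:

```python
import inspect
import os

def caller_directory() -> str:
    """Directory of the source file that called this function,
    resolved by inspecting the call stack."""
    caller_frame = inspect.stack()[1]
    return os.path.dirname(os.path.abspath(caller_frame.filename))

# A file name passed to a local_* style helper would then be joined
# onto this directory rather than the process working directory.
print(caller_directory())
```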

Loading with RAM cache

Sometimes you need to load the same compressed JSON file many times, and it pays to cache the parsed document in RAM rather than decompress and parse it on every call. Both load and local_load support this through the use_cache flag:

import compress_json

D1 = compress_json.load(
    "filepath.json.gz",
    use_cache=True
)

D1 = compress_json.local_load(
    "filepath.json.gz",
    use_cache=True
)
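
The effect of such a cache can be sketched with functools.lru_cache; this illustrates the idea only and is not compress_json's actual implementation:

```python
import functools
import gzip
import json
import os
import tempfile

@functools.lru_cache(maxsize=None)
def cached_load(path):
    """Parse a gzip-compressed JSON once per path; later calls for
    the same path return the cached object without touching disk."""
    with gzip.open(path, "rt", encoding="utf-8") as f:
        return json.load(f)

path = os.path.join(tempfile.mkdtemp(), "filepath.json.gz")
with gzip.open(path, "wt", encoding="utf-8") as f:
    json.dump({"A": {"B": "C"}}, f)

first = cached_load(path)
second = cached_load(path)
assert first is second  # the second call hit the cache
```

Note that a cache of this kind hands every caller the same mutable object, so mutating a cached document would be visible to later loads.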

Advanced usage

You can also pass parameters to either the chosen compression backend or the json library, as follows:

import compress_json

D = {
    "A": {
        "B": "C"
    }
}
compress_json.dump(
    D, "filepath.json.gz",
    compression_kwargs={"compresslevel": 9},  # e.g. maximum gzip compression
    json_kwargs={"indent": 4}                 # e.g. pretty-print the JSON
)

D4 = compress_json.load(
    "filepath.json.gz",
    compression_kwargs={},  # kwargs forwarded to the compression backend
    json_kwargs={}          # kwargs forwarded to json.load
)
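
To show what the two kwargs dictionaries control, the split is visible with the standard library directly: compresslevel reaches the gzip codec, while indent reaches the JSON serializer (illustrative values, not compress_json defaults):

```python
import gzip
import json

D = {"A": list(range(100))}

# json_kwargs-style options change the serialized text ...
compact = json.dumps(D)
pretty = json.dumps(D, indent=4)

# ... while compression_kwargs-style options change how that text
# is compressed.
fast = gzip.compress(compact.encode("utf-8"), compresslevel=1)
small = gzip.compress(compact.encode("utf-8"), compresslevel=9)

# Both payloads decompress back to the same document.
assert json.loads(gzip.decompress(fast)) == D
assert json.loads(gzip.decompress(small)) == D
```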

Project details

Source Distribution

compress_json-1.0.8.tar.gz (4.7 kB)

SHA256: 463e55fb8bb154f605130c7f6c2411fcb20f9ef4cb16020fd9880ba143002814
