Project description

[Badges: Travis CI build, SonarCloud Quality, SonarCloud Maintainability, Codacy Maintainability, PyPI version, PyPI downloads]

The missing Python utility to read and write large compressed JSONs.

The library is loosely based on the compress_pickle library.

How do I install this package?

As usual, just download it using pip:

pip install compress_json

Test coverage

Since different coverage tools sometimes report slightly different results, here are three of them:

[Badges: Coveralls Coverage, SonarCloud Coverage, Code Climate Coverage]

Available compression modes

The compression mode is detected automatically from the file name: gzip, bz2 and lzma are supported. The notable exception is zip, which is difficult to integrate into the JSON pipeline.
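
For illustration only, here is a minimal sketch of how extension-based detection can work; the mapping and helper below are hypothetical and are not compress_json's actual implementation:

import bz2
import gzip
import lzma

# Hypothetical mapping from file suffix to the matching standard-library opener.
_OPENERS = {
    ".gz": gzip.open,
    ".bz": bz2.open,
    ".lzma": lzma.open,
}

def infer_opener(path):
    # Pick the opener whose suffix matches the path; fall back to the plain built-in open.
    for suffix, opener in _OPENERS.items():
        if path.endswith(suffix):
            return opener
    return open

print(infer_opener("filepath.json.gz"))  # the open function from the gzip module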

Usage example

The library is extremely easy to use:

import compress_json

D = {
    "A":{
        "B":"C"
    }
}
compress_json.dump(D, "filepath.json.gz") # for a gzip file
compress_json.dump(D, "filepath.json.bz") # for a bz2 file
compress_json.dump(D, "filepath.json.lzma") # for a lzma file

D1 = compress_json.load("filepath.json.gz") # for loading a gzip file
D2 = compress_json.load("filepath.json.bz") # for loading a bz2 file
D3 = compress_json.load("filepath.json.lzma") # for loading a lzma file

Some extra perks: local loading and dumping

In addition to the usual load and dump from the json library, the library provides the methods local_load and local_dump, which let you load and dump files in the same directory as the code calling them, by using the call stack.

This can be useful, especially when loading data files bundled within packages; a sketch of the underlying mechanism follows the example below.

import compress_json

D = {
    "A": {
        "B": "C"
    }
}
compress_json.local_dump(D, "filepath.json.gz") # for a gzip file
compress_json.local_dump(D, "filepath.json.bz") # for a bz2 file
compress_json.local_dump(D, "filepath.json.lzma") # for a lzma file

D1 = compress_json.local_load("filepath.json.gz") # for loading a gzip file
D2 = compress_json.local_load("filepath.json.bz") # for loading a bz2 file
D3 = compress_json.local_load("filepath.json.lzma") # for loading a lzma file
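
As a minimal sketch of the caller-relative mechanism (assumed, not the library's actual code), the hypothetical function below resolves the caller's directory with the standard inspect module and writes plain, uncompressed JSON there, just to show the path logic:

import inspect
import json
import os

def hypothetical_local_dump(obj, filename):
    # Resolve the directory of the source file that called this function...
    caller_file = inspect.stack()[1].filename
    directory = os.path.dirname(os.path.abspath(caller_file))
    # ...and write the JSON next to it (compression omitted for brevity).
    with open(os.path.join(directory, filename), "w") as handle:
        json.dump(obj, handle)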

Advanced usage

You can also pass parameters to the chosen compression backend or to the json library, as follows:

import compress_json

D = {
    "A": {
        "B": "C"
    }
}
compress_json.dump(
    D, "filepath.json.gz",
    compression_kwargs={"compresslevel": 9},  # e.g. gzip compression level
    json_kwargs={"indent": 4}                 # e.g. json.dump indentation
)

D4 = compress_json.load(
    "filepath.json.gz",
    compression_kwargs={},                    # options for the compression backend
    json_kwargs={"parse_int": int}            # e.g. option forwarded to json.load
)
