dfIndexeddb

dfindexeddb is an experimental Python tool for performing digital forensic analysis of IndexedDB and LevelDB files.

It parses LevelDB, IndexedDB and JavaScript structures from these files without requiring native libraries. (Note: only a subset of IndexedDB key types and JavaScript types for Safari and Chromium-based browsers are currently supported; Firefox support is under development.)

The content of IndexedDB files is dependent on what a web application stores locally/offline using the web browser's IndexedDB API. Examples of content might include:

  • text from a text/source-code editor application,
  • emails and contact information from an e-mail application, and
  • images and metadata from a photo gallery application.

Installation

  1. [Linux] Install the snappy compression development package
    $ sudo apt install libsnappy-dev
  2. Create a virtual environment and install the package
    $ python3 -m venv .venv
    $ source .venv/bin/activate
    $ pip install dfindexeddb

To also install the dependencies for leveldb/indexeddb plugins, run

    $ pip install 'dfindexeddb[plugins]'
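
After installation, the dfindexeddb and dfleveldb command line tools should be available on your PATH; printing their help output (see Usage below) is a quick way to confirm the install:

    $ dfindexeddb -h
    $ dfleveldb -h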

Installation from source

  1. [Linux] Install the snappy compression development package
    $ sudo apt install libsnappy-dev
  2. Clone or download/unzip the repository to your local machine.

  3. Create a virtual environment and install the package

    $ python3 -m venv .venv
    $ source .venv/bin/activate
    $ pip install .

To also install the dependencies for leveldb/indexeddb plugins, run

    $ pip install '.[plugins]'

Usage

Two CLI tools for parsing IndexedDB/LevelDB files are available after installation:

IndexedDB

$ dfindexeddb -h
usage: dfindexeddb [-h] {db,ldb,log} ...

A cli tool for parsing indexeddb files

positional arguments:
  {db,ldb,log}
    db          Parse a directory as indexeddb.
    ldb         Parse a ldb file as indexeddb.
    log         Parse a log file as indexeddb.

options:
  -h, --help    show this help message and exit

Examples:

To parse IndexedDB records from an SQLite file for Safari and output the results as JSON-L, use the following command:

dfindexeddb db -s SOURCE --format safari -o jsonl
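
Because JSON-L output is one JSON object per line, it can be post-processed directly from Python. The following is a minimal sketch, assuming the command above is on the PATH and writes its records to stdout; the exact fields of each record depend on the browser and the stored data:

    import json
    import subprocess

    # Run the documented Safari command and capture its JSON-L output.
    # "SOURCE" is a placeholder for the path to the IndexedDB SQLite file.
    result = subprocess.run(
        ["dfindexeddb", "db", "-s", "SOURCE", "--format", "safari", "-o", "jsonl"],
        capture_output=True, text=True, check=True)

    # Each non-empty line of stdout is assumed to be one JSON record.
    records = [json.loads(line) for line in result.stdout.splitlines() if line.strip()]
    print(f"Parsed {len(records)} records")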

To parse IndexedDB records from a LevelDB folder for Chrome/Chromium, using the manifest file to determine recovered records, and output the results as JSON, use the following command:

dfindexeddb db -s SOURCE --format chrome --use_manifest

To parse IndexedDB records from a LevelDB ldb (.ldb) file and output the results as JSON-L, use the following command:

dfindexeddb ldb -s SOURCE -o jsonl

To parse IndexedDB records from a LevelDB log (.log) file and output the results as the Python printable representation, use the following command:

dfindexeddb log -s SOURCE -o repr

To parse a file as a Chrome/Chromium IndexedDB blink value and output the results as JSON:

dfindexeddb blink -s SOURCE

LevelDB

$ dfleveldb -h
usage: dfleveldb [-h] {db,log,ldb,descriptor} ...

A cli tool for parsing leveldb files

positional arguments:
  {db,log,ldb,descriptor}
    db                  Parse a directory as leveldb.
    log                 Parse a leveldb log file.
    ldb                 Parse a leveldb table (.ldb) file.
    descriptor          Parse a leveldb descriptor (MANIFEST) file.

options:
  -h, --help            show this help message and exit

Examples:

To parse records from a LevelDB folder, use the following command:

dfleveldb db -s SOURCE

To parse records from a LevelDB folder, using the sequence number to determine recovered records, and output the results as JSON, use the following command:

dfleveldb db -s SOURCE --use_sequence_number

To parse blocks, physical records, write batches, or internal key records from a LevelDB log (.log) file, use the following command, specifying the type (blocks, physical_records, etc.) via the -t option. By default, internal key records are parsed:

$ dfleveldb log -s SOURCE [-t {blocks,physical_records,write_batches,parsed_internal_key}]

To parse blocks or records from a LevelDB table (.ldb) file, use the following command, specifying the type (blocks or records) via the -t option. By default, records are parsed:

$ dfleveldb ldb -s SOURCE [-t {blocks,records}]

To parse version edit records from a Descriptor (MANIFEST) file, use the following command:

$ dfleveldb descriptor -s SOURCE [-o {json,jsonl,repr}] [-t {blocks,physical_records,versionedit} | -v]
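
The version edit records can also be saved as JSON-L and post-processed from Python. A minimal sketch, assuming the output is redirected to a file named version_edits.jsonl (a hypothetical filename) and that each line holds one JSON record:

$ dfleveldb descriptor -s SOURCE -o jsonl -t versionedit > version_edits.jsonl

    import json

    # Load the version edit records written by the command above.
    # One JSON object per line; the field names depend on the tool's output.
    with open("version_edits.jsonl", "r", encoding="utf-8") as f:
        version_edits = [json.loads(line) for line in f if line.strip()]

    print(f"Loaded {len(version_edits)} version edit records")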

Plugins

To apply a plugin parser to a LevelDB file/folder, add the --plugin [Plugin Name] argument (see the example after the table). The following artifacts are currently supported:

Plugin Name                  Artifact Name
ChromeNotificationRecord     Chrome/Chromium Notifications
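
For example, to apply the notification plugin while parsing a Chrome/Chromium LevelDB folder (SOURCE is a placeholder for the folder path):

$ dfleveldb db -s SOURCE --plugin ChromeNotificationRecord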
