
from hansel import Crumb to find your file path.

Project description

hansel

Flexible parametric file paths for making queries, building folder trees, and accessing structured folders conveniently.


Usage

Quick Intro

Imagine this folder tree:

data
└── raw
    ├── 0040000
    │   └── session_1
    │       ├── anat_1
    │       └── rest_1
    ├── 0040001
    │   └── session_1
    │       ├── anat_1
    │       └── rest_1
    ├── 0040002
    │   └── session_1
    │       ├── anat_1
    │       └── rest_1
    ├── 0040003
    │   └── session_1
    │       ├── anat_1
    │       └── rest_1
    ├── 0040004
    │   └── session_1
    │       ├── anat_1
    │       └── rest_1
from hansel import Crumb

# create the crumb
crumb = Crumb("{base_dir}/data/raw/{subject_id}/{session_id}/{image_type}/{image}")

# set the base_dir path
crumb = crumb.replace('base_dir', '/home/hansel')

assert str(crumb) == "/home/hansel/data/raw/{subject_id}/{session_id}/{image_type}/{image}"

# get the ids of the subjects
subj_ids = crumb['subject_id']

assert subj_ids == ['0040000', '0040001', '0040002', '0040003', '0040004', ....]

# get the paths to the subject folders; the output can be strings or
# crumbs, which you choose with the make_crumbs boolean argument
subj_paths = crumb.ls('subject_id', make_crumbs=True)

# set the image_type
anat_crumb = crumb.replace(image_type='anat_1')

# get the paths to the images inside the anat_1 folders
anat_paths = anat_crumb.ls('image')

Long Intro

I often find myself working with structured folder trees, such as the one shown above.

I have tried many ways of dealing with them: loops, dictionaries, configuration files, etc. I always end up solving the same problem in a different way, over and over again.

This week I grew tired of it and decided to represent a structured folder tree as a string and make it as easy as possible to access.

If you look at the folder structure above I have:

  • the root directory it hangs from: ...data/raw,

  • many identifiers (in this case, subject identifiers), e.g. 0040000,

  • a session identifier, session_1, and

  • a data type (in this case an image type), anat_1 and rest_1.

With hansel I can represent this folder structure like this:

from hansel import Crumb

crumb = Crumb("{base_dir}/data/raw/{subject_id}/{session_id}/{image_type}")

Let’s say we have the structure above hanging from a base directory like /home/hansel/.

I can use the replace function to set the base_dir parameter:

crumb = crumb.replace('base_dir', '/home/hansel')

assert str(crumb) == "/home/hansel/data/raw/{subject_id}/{session_id}/{image_type}/{image}"

If you don’t need a copy of the crumb, you can use the [] operator:

crumb['base_dir'] = '/home/hansel'

Now that the root path of my dataset is set, I can start querying my crumb path.

If I want to know the paths to the existing subject_id folders:

subject_paths = crumb.ls('subject_id')

The output of ls can be str, Crumb, or pathlib.Path. The results will be Path objects if there are no crumb arguments left in the crumb path; otherwise you can choose the type with the make_crumbs argument:

subject_paths = crumb.ls('subject_id', make_crumbs=True)
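
For instance, a minimal sketch of the two output flavours (the assumption here is that make_crumbs=False yields plain path strings instead of Crumb objects):

# choose the output type of ls with make_crumbs while crumb arguments remain
as_crumbs = crumb.ls('subject_id', make_crumbs=True)    # Crumb objects
as_strings = crumb.ls('subject_id', make_crumbs=False)  # assumption: plain path strings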

If I want to know what the existing subject_ids are:

subject_ids = crumb.ls('subject_id', fullpath=False)

or

subject_ids = crumb['subject_id']

Now, if I wanted to get the paths to all the anat_1 images, I could do this:

anat_crumb = crumb.replace(image_type='anat_1')

anat_paths = anat_crumb.ls('image')

or

crumb['image_type'] = 'anat_1'

anat_paths = crumb.ls('image')
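
Going by the behaviour described above, these results should already be plain paths, since no crumb arguments are left once image_type is fixed and image is listed. A small sanity check, assuming the listing is non-empty:

from pathlib import Path

# per the note above, entries with no crumb arguments left should be pathlib.Path objects
assert all(isinstance(path, Path) for path in anat_paths)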

There are more features, such as creating folder trees from lists of value maps for the crumb arguments, and checking the feasibility of a crumb path.
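
A minimal sketch of those two features (the mktree call signature and the exists() check are assumptions here, so verify them against the package's API):

from hansel import Crumb, mktree

tree_crumb = Crumb("/tmp/hansel_example/{subject_id}/{session_id}")

# build the folder tree from a list of value maps
# (assumption: mktree accepts the crumb plus a list of dicts mapping argument names to values)
values_maps = [
    {'subject_id': 'subj_001', 'session_id': 'session_1'},
    {'subject_id': 'subj_002', 'session_id': 'session_1'},
]
mktree(tree_crumb, values_maps)

# check the feasibility of a crumb path
# (assumption: exists() reports whether the partially replaced path can match anything on disk)
assert tree_crumb.replace(subject_id='subj_001').exists()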

More functionalities, ideas and comments are welcome.

Dependencies

Please see the requirements.txt file. Before installing this package, install its dependencies with:

pip install -r requirements.txt

Install

I am only testing this tool on Python 3.4 and 3.5. It may also work on Python 2.7 if six and pathlib2 are installed.

This package uses setuptools. You can install it by running:

python setup.py install

If you already have the dependencies listed in requirements.txt installed, you can install it in your home directory with:

python setup.py install --user

To install for all users on Unix/Linux:

python setup.py build
sudo python setup.py install

You can also install it in development mode with:

python setup.py develop

Development

Code

Github

You can check out the latest sources with this command:

git clone https://www.github.com/alexsavio/hansel.git

or if you have write privileges:

git clone git@github.com:alexsavio/hansel.git

If you are going to create patches for this project, create a branch for it from the master branch.

We tag stable releases in the repository with the version number.

Testing

We are using py.test for testing.

You can run the tests by executing:

python setup.py test

or

py.test



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

hansel-0.3.0.tar.gz (12.9 kB)

Uploaded Source

Built Distribution

hansel-0.3.0-py3-none-any.whl (17.0 kB)

Uploaded Python 3

File details

Details for the file hansel-0.3.0.tar.gz.

File metadata

  • Download URL: hansel-0.3.0.tar.gz
  • Size: 12.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No

File hashes

Hashes for hansel-0.3.0.tar.gz

  • SHA256: b92e60c21965530e22ef98ba2024392ac986a6bc737ecb9e5ce32059b2b27f6e
  • MD5: 27e8179c3ced43385d9173d8bfab7db2
  • BLAKE2b-256: e6495e0ee47b8ebf1cea8913437afadba622a0e3c8704925d64994ce3ae4fe3c


File details

Details for the file hansel-0.3.0-py3-none-any.whl.

File metadata

File hashes

Hashes for hansel-0.3.0-py3-none-any.whl

  • SHA256: f031aa5eff4d1942a42b1f893c3e273c2e32d7de46561b86d93665a04f3156ce
  • MD5: 670280a0f7285a73f0507ef659b58e28
  • BLAKE2b-256: 82193ae25e136e4496355c7cb387e9b61daf4313bcaf5bc484048d3b5c7e90bf

