
from hansel import Crumb to find your file path.

Project description

hansel

Flexible parametric file paths for making queries, building folder trees, and smart folder-structure access.


Usage

Quick Intro

Imagine this folder tree:

data
└── raw
    ├── 0040000
    │   └── session_1
    │       ├── anat_1
    │       └── rest_1
    ├── 0040001
    │   └── session_1
    │       ├── anat_1
    │       └── rest_1
    ├── 0040002
    │   └── session_1
    │       ├── anat_1
    │       └── rest_1
    ├── 0040003
    │   └── session_1
    │       ├── anat_1
    │       └── rest_1
    ├── 0040004
    │   └── session_1
    │       ├── anat_1
    │       └── rest_1
from hansel import Crumb

# create the crumb
crumb = Crumb("{base_dir}/data/raw/{subject_id}/{session_id}/{image_type}/{image}")

# set the base_dir path
crumb = crumb.replace(base_dir='/home/hansel')

assert str(crumb) == "/home/hansel/data/raw/{subject_id}/{session_id}/{image_type}/{image}"

# get the ids of the subjects
subj_ids = crumb['subject_id']

assert subj_ids == ['0040000', '0040001', '0040002', '0040003', '0040004', ...]

# get the paths to the subject folders; the output can be strings or crumbs,
# you choose with the make_crumbs boolean argument
subj_paths = crumb.ls('subject_id', make_crumbs=True)

# set the image_type
anat_crumb = crumb.replace(image_type='anat_1')

# get the paths to the anat_1 folders
anat_paths = anat_crumb.ls('image')

Long Intro

I often find myself working with structured folder paths, such as the one shown above.

I have tried many ways of dealing with this: loops, dictionaries, configuration files, etc. I always end up writing something different for the same problem, over and over again.

This week I grew tired of it and decided to represent a structured folder tree as a string and access it in the easiest possible way.

If you look at the folder structure above I have:

  • the root directory from which it hangs: ...data/raw,

  • many identifiers (in this case, subject identifiers), e.g., 0040000,

  • a session identifier, session_1, and

  • a data type (in this case an image type), anat_1 and rest_1.

With hansel I can represent this folder structure like this:

from hansel import Crumb

crumb = Crumb("{base_dir}/data/raw/{subject_id}/{session_id}/{image_type}")

Let’s say we have the structure above hanging from a base directory like /home/hansel/.

I can use the replace function to set the base_dir parameter:

crumb = crumb.replace(base_dir='/home/hansel')

assert str(crumb) == "/home/hansel/data/raw/{subject_id}/{session_id}/{image_type}"

If you don’t need a copy of crumb, you can use the [] operator instead:

crumb['base_dir'] = '/home/hansel'

Now that the root path of my dataset is set, I can start querying my crumb path.

If I want to know the paths to the existing subject_id folders:

subject_paths = crumb.ls('subject_id')

The output of ls can be str, Crumb or pathlib.Path. The results will be Path if there are no crumb arguments left in the listed paths. You can choose this with the make_crumbs argument:

subject_paths = crumb.ls('subject_id', make_crumbs=True)
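
For example (a minimal sketch, assuming the example tree above exists under /home/hansel): once base_dir, subject_id and session_id are all fixed, listing over image_type leaves no crumb arguments unresolved, so those results come back as pathlib.Path objects:

# fix the remaining arguments except image_type
# (the subject and session values are the ones from the example tree)
one_subject = crumb.replace(subject_id='0040000').replace(session_id='session_1')

# no crumb arguments are left in the listed paths, so these are pathlib.Path objects
image_type_paths = one_subject.ls('image_type')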

If I want to know what the existing subject_ids are:

subject_ids = crumb.ls('subject_id', fullpath=False)

or

subject_ids = crumb['subject_id']

Now, if I wanted to get the paths to all the anat_1 images, I could do this:

anat_crumb = crumb.replace(image_type='anat_1')

anat_paths = anat_crumb.ls('image')

or

crumb['image_type'] = 'anat_1'

anat_paths = crumb.ls('image')

There are more features, such as creating folder trees from maps of values for the crumb arguments, and checking the feasibility of a crumb path; a rough illustration of the folder-tree idea follows below.
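
hansel has its own helpers for this, but as a plain illustration of the idea, here is a minimal sketch that builds a tree by hand from a Crumb template, filling in every argument and creating each resulting path with pathlib (the /tmp/hansel_demo base directory and the id values are made up for the example):

from pathlib import Path

from hansel import Crumb

# a template for the tree we want to create (made-up base directory)
tree = Crumb("/tmp/hansel_demo/raw/{subject_id}/{session_id}/{image_type}")

for subject_id in ('0040000', '0040001'):
    for image_type in ('anat_1', 'rest_1'):
        # fill in every crumb argument to get a concrete path
        leaf = tree.replace(subject_id=subject_id).replace(session_id='session_1').replace(image_type=image_type)

        # create the folder; this is plain pathlib, not a hansel-specific call
        Path(str(leaf)).mkdir(parents=True, exist_ok=True)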

More functionalities, ideas and comments are welcome.

Dependencies

Please see the requirements.txt file. Before installing this package, install its dependencies with:

pip install -r requirements.txt

Install

I am only testing this tool on Python 3.4 and 3.5. It may also work on Python 2.7 if six and pathlib2 are installed.

This package uses setuptools. You can install it running:

python setup.py install

If you already have the dependencies listed in requirements.txt installed, you can install it in your home directory with:

python setup.py install --user

To install for all users on Unix/Linux:

python setup.py build
sudo python setup.py install

You can also install it in development mode with:

python setup.py develop

Development

Code

Github

You can check the latest sources with the command:

git clone https://www.github.com/alexsavio/hansel.git

or if you have write privileges:

git clone git@github.com:alexsavio/hansel.git

If you are going to create patches for this project, create a branch for it from the master branch.

We tag stable releases in the repository with the version number.

Testing

We are using py.test for the testing.

You can run the tests by executing:

python setup.py test

or

py.test

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

hansel-0.1.1.tar.gz (14.6 kB)

Uploaded Source

Built Distribution

hansel-0.1.1-py3-none-any.whl (15.8 kB)

Uploaded Python 3

File details

Details for the file hansel-0.1.1.tar.gz.

File metadata

  • Download URL: hansel-0.1.1.tar.gz
  • Upload date:
  • Size: 14.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No

File hashes

Hashes for hansel-0.1.1.tar.gz
  • SHA256: 991738ca5ab77b615f547b6272dea9f1a568b39b418bd86955a7c30644dcf195
  • MD5: a4f6878adfb1b4511fc4dc834a80f363
  • BLAKE2b-256: cd70c65ccd2be5e44dbb5c19aa47dbc029f1d7bc856b808f034a521146260c0e

See more details on using hashes here.

File details

Details for the file hansel-0.1.1-py3-none-any.whl.

File metadata

File hashes

Hashes for hansel-0.1.1-py3-none-any.whl
  • SHA256: fa2a9bed2f940729e0ece26a4c3242cf54f8063285d9a8ff4b2e4802110107f2
  • MD5: 9df54aeeef03cc5c6e1f75fa07a3de45
  • BLAKE2b-256: 926ff9f7968c27480023dadf1f9d15f654a666d902c1dbd9b1ae8598b92924fc

See more details on using hashes here.
