Project description
LivingPark utils
A collection of utility functions to write LivingPark notebooks.
Usage examples:

```python
import livingpark_utils
from livingpark_utils import download
from livingpark_utils.clinical import moca2mmse
from livingpark_utils.dataset import ppmi

utils = livingpark_utils.LivingParkUtils()
downloader = download.ppmi.Downloader(utils.study_files_dir)
utils.notebook_init()

utils.get_study_files(["Demographics.csv"], default=downloader)
utils.get_T1_nifti_files(
    cohort, default=downloader
)  # `cohort` is of type: pd.DataFrame

ppmi.find_nifti_file_in_cache(x["PATNO"], x["EVENT_ID"], x["Description"])
ppmi.disease_duration()

moca2mmse(2)
```
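In practice, `find_nifti_file_in_cache` is typically applied row by row to the cohort DataFrame. A minimal sketch, assuming a `cohort` DataFrame with the columns shown above; the `File name` column is an illustrative name, not part of the documented API:

```python
# Resolve the cached NIfTI path for every row of the cohort.
cohort["File name"] = cohort.apply(
    lambda x: ppmi.find_nifti_file_in_cache(
        x["PATNO"], x["EVENT_ID"], x["Description"]
    ),
    axis=1,
)
```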
Exclude subjects from a cohort without leaking patient information.

```python
from livingpark_utils.ignore import (
    insert_hash,
    remove_ignored,
)

# Assuming a cohort definition defined as `cohort`.
cohort = insert_hash(cohort, columns=["PATNO", "EVENT_ID", "Description"])
remove_ignored(cohort, ignore_file=".ppmiignore")
```
Usage to execute utility notebooks:

```python
from livingpark_utils.scripts import run

run.mri_metadata()
run.pd_status()
```

Note: Optionally use the `%%capture` cell magic to further hide notebook outputs, as shown in the sketch below.
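For example, a notebook cell combining the two could look like this. A minimal sketch; `%%capture` is the standard IPython cell magic and simply suppresses the cell's output:

```python
%%capture
from livingpark_utils.scripts import run

run.mri_metadata()  # output of the utility notebook is captured rather than displayed
```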
CLI commands

Download T1 nifti files using a cohort definition file.

```
$ get_T1_nifti_files <cohort_file> --downloader (ppmi) [--symlink=<bool>]
                     [--force=<bool>] [--timeout=<int>]
```
The `cohort_file` is a csv file created beforehand. Depending on the chosen downloader, it must have the following columns:

- PPMI: `PATNO`, `EVENT_ID`, and `Description` (see the sketch below for an example file).
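A minimal sketch of how such a cohort file could be created; the subject numbers, visit codes, and scan descriptions below are placeholders, not real PPMI identifiers:

```python
import pandas as pd

# Hypothetical cohort definition with the columns required by the PPMI downloader.
cohort = pd.DataFrame(
    {
        "PATNO": [1234, 5678],
        "EVENT_ID": ["BL", "V04"],
        "Description": ["MPRAGE GRAPPA", "MPRAGE GRAPPA"],
    }
)
cohort.to_csv("cohort.csv", index=False)
```

The resulting `cohort.csv` can then be passed to `get_T1_nifti_files` as in the command above.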
Troubleshooting
Permission issues on Windows
We use symbolic links when creating the folder for cached data. Unfortunately, Windows does not allow users to create symbolic links by default. To fix this issue on your machine, please follow the guide from this blog post.
Contributing guidelines
We welcome contributions of any kind in the form of pull requests to this repository. See also the LivingPark contributing guidelines.
Make sure to:

- Use Python type annotations.
- Include Python docstrings using the numpydoc format for all functions (see the sketch after this list).
- Format docstrings.
- Run psf/black on the files you modify.
- Run `pre-commit run --all` before committing; this will be checked in your PR.
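As an illustration of the type-annotation and numpydoc conventions above, here is a minimal sketch of a hypothetical helper; it is not part of the package API:

```python
from datetime import date


def years_between(start: str, end: str) -> float:
    """Compute the number of years between two ISO dates.

    Parameters
    ----------
    start : str
        Start date in ``YYYY-MM-DD`` format.
    end : str
        End date in ``YYYY-MM-DD`` format.

    Returns
    -------
    float
        Duration in years.
    """
    delta = date.fromisoformat(end) - date.fromisoformat(start)
    return delta.days / 365.25
```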
Download files
Source Distribution: livingpark_utils-0.9.tar.gz (49.0 kB)

Built Distribution: livingpark_utils-0.9-py3-none-any.whl (59.7 kB)
File details

Details for the file `livingpark_utils-0.9.tar.gz`.
File metadata
- Download URL: livingpark_utils-0.9.tar.gz
- Upload date:
- Size: 49.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.16
File hashes
Algorithm | Hash digest
---|---
SHA256 | e0aad8213081ee130f77b359c74d3ef90212f653b0aab495f0cb236f7fcef826
MD5 | f90c205a3758521ad124eaae80554fdf
BLAKE2b-256 | cd6c79c4b990b81713faac1f3d0f57bb7271f254d318abe6caf8e9721d6a09b6
File details

Details for the file `livingpark_utils-0.9-py3-none-any.whl`.
File metadata
- Download URL: livingpark_utils-0.9-py3-none-any.whl
- Upload date:
- Size: 59.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.16
File hashes
Algorithm | Hash digest
---|---
SHA256 | 8006ef606be7fc40e5b209bd16e1f73e91105b60d169c8d862853b8775ca3cf0
MD5 | 3da3e66ce1f2b6a3617c17802f6e1c81
BLAKE2b-256 | 2b4d6488ee14b0d78ee9b2230616b4c40c63d5b62172d44c5675e1825b1815e4