The ICEBERG Penguin colony use case package

Project description

Prerequisites - all available on Bridges via the commands below

  • Linux
  • Python 3
  • CPU and NVIDIA GPU + CUDA cuDNN

Software Dependencies - these will be installed automatically with the installation below.

  • scipy==1.2.1
  • Pillow==4.3.0
  • torch
  • scikit-learn==0.19.1
  • torchvision==0.2.0
  • opencv-python
  • rasterio
  • future

Installation

Preliminaries: These instructions are specific to XSEDE Bridges, but other resources can be used as long as CUDA, Python 3, and an NVIDIA P100 GPU are available; in that case the 'module load' commands, which are specific to Bridges, can be skipped.

For Unix or Mac Users:
Log in to Bridges via SSH using a Unix or Mac command-line terminal. Login is available directly to Bridges or through the XSEDE portal. Please see the Bridges User's Guide.

For Windows Users:
Many tools are available for SSH access to Bridges. Please see Ubuntu, MobaXterm, or PuTTY.
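
For example, a typical login from a terminal looks like the following (the hostname shown is an assumption; confirm the current login address in the Bridges User's Guide):

$ ssh username@bridges.psc.edu     # replace 'username' with your PSC/XSEDE username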

PSC Bridges

Once you have logged into Bridges, you can follow one of two methods for installing iceberg-penguins.

Method 1 (Recommended):

The lines below following a '$' are commands to enter (or copy and paste) into your terminal (note that all commands are case-sensitive, meaning capital and lowercase letters are differentiated). Everything following a '#' is a comment explaining the reason for the command and should not be included in what you enter. Lines that do not start with '$' or '[penguins_env] $' are output you should expect to see.

$ pwd
/home/username
$ cd $SCRATCH                      # switch to your working space.
$ mkdir Penguins                   # create a directory to work in.
$ cd Penguins                      # move into your working directory.
$ module load cuda                 # load parallel computing architecture.
$ module load python3              # load correct python version.
$ virtualenv penguins_env          # create a virtual environment to isolate your work from the default system.
$ source penguins_env/bin/activate # activate your environment. Notice the command line prompt changes to show your environment on the next line.
[penguins_env] $ pwd
/pylon5/group/username/Penguins
[penguins_env] $ export PYTHONPATH=<path>/penguins_env/lib/python3.5/site-packages # set a system variable to point python to your specific code. (Replace <path> with the result of the pwd command above.)
[penguins_env] $ pip install iceberg_penguins.search # pip is a python tool to install the requested software (iceberg_penguins.search in this case) from a package repository. (This may take several minutes.)
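
As an optional sanity check (assuming pip registered the distribution under its PyPI name), you can confirm the package is visible inside the environment:

[penguins_env] $ pip show iceberg_penguins.search  # prints the installed version and location if the install succeeded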

Method 2 (Installing from source; recommended for developers only):

$ git clone https://github.com/iceberg-project/Penguins.git
$ cd Penguins                      # move into the cloned repository so pip can find its setup files.
$ module load cuda
$ module load python3
$ virtualenv penguins_env
$ source penguins_env/bin/activate
[penguins_env] $ export PYTHONPATH=<path>/penguins_env/lib/python3.5/site-packages
[penguins_env] $ pip install . --upgrade
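
If you plan to modify the source, an editable install is a common alternative to the plain 'pip install .' above (a sketch; the rest of the workflow is unchanged):

[penguins_env] $ pip install -e . --upgrade  # editable install: changes to the cloned source take effect without reinstalling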

To test

[penguins_env] $ deactivate        # exit your virtual environment.
$ interact -p GPU-small            # request a compute node. This package has been tested on P100 GPUs on Bridges, but other resources offering the same GPUs should also work. (It may take a minute or more to receive an allocation.)
$ cd $SCRATCH/Penguins             # make sure you are in the same directory where everything was set up before.
$ module load cuda                 # load parallel computing architecture, as before.
$ module load python3              # load correct python version, as before.
$ source penguins_env/bin/activate # activate your environment, no need to create a new environment because the Penguins tools are installed and isolated here.
[penguins_env] $ iceberg_penguins.detect --help  # this will display a help screen of available usage and parameters.

Prediction

You can download the pre-trained model to your local machine and use scp, ftp, rsync, or Globus to transfer it to Bridges.

The model provided here is from epoch 300 of a model we will call "MY_MODEL".

Please put the model file here: <checkpoints_dir>/MY_MODEL/
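
For example, once the model file has been transferred to Bridges, it can be placed as follows (the checkpoints directory matches the example command below; the model file name is hypothetical, so substitute the name of the file you downloaded):

$ mkdir -p ../model_path/MY_MODEL              # create <checkpoints_dir>/MY_MODEL/
$ mv 300_net_G.pth ../model_path/MY_MODEL/     # hypothetical file name; use your downloaded model file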

Then, follow the environment setup commands under 'To test' above. Finally, the script to run the prediction for a single PNG image tile is:

iceberg_penguins.detect [--params ...]
iceberg_penguins.detect --gpu_ids 0 --name MY_MODEL --epoch 300 --checkpoints_dir '../model_path/' --output test --input_im ../data/MY_IMG_TILE.png

params:

  • --gpu_ids: the gpu used for testing
  • --name: name of the model used for testing
  • --epoch: which epoch we use to test the model
  • --checkpoints_dir: path to the folder containing the trained models
  • --output: directory to save the outputs
  • --input_im: path to the input image
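
To process several tiles in one session, a simple shell loop over a directory of tiles works (a sketch only; the parameters mirror the single-tile example above and the paths are illustrative):

$ for tile in ../data/*.png; do                # loop over every PNG tile in the data directory
    iceberg_penguins.detect --gpu_ids 0 --name MY_MODEL --epoch 300 \
      --checkpoints_dir '../model_path/' --output test --input_im "$tile"
  done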

Download files

Download the file for your platform.

Source Distribution

iceberg_penguins.search-0.3.tar.gz (41.3 kB)

Built Distribution

iceberg_penguins.search-0.3-py3-none-any.whl (63.0 kB)

File details

Details for the file iceberg_penguins.search-0.3.tar.gz.

File metadata

  • Download URL: iceberg_penguins.search-0.3.tar.gz
  • Upload date:
  • Size: 41.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.15.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/49.1.0 requests-toolbelt/0.9.1 tqdm/4.47.0 CPython/3.5.2

File hashes

Hashes for iceberg_penguins.search-0.3.tar.gz

  • SHA256: f0304973ad41e4ae091605e12bf839a5bde7043ae51938a14560320abd936ba8
  • MD5: 325c240f297c3677dbd09e2976c6e818
  • BLAKE2b-256: 642918da76adeec265c43755e69442f1e9976447b29aa8806140ec3a2e25d49c

See the pip documentation for more details on using hashes.
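
To check a downloaded file against the published SHA256 digest, a standard sha256sum comparison is enough (not specific to this package):

$ sha256sum iceberg_penguins.search-0.3.tar.gz
f0304973ad41e4ae091605e12bf839a5bde7043ae51938a14560320abd936ba8  iceberg_penguins.search-0.3.tar.gz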

File details

Details for the file iceberg_penguins.search-0.3-py3-none-any.whl.

File metadata

  • Download URL: iceberg_penguins.search-0.3-py3-none-any.whl
  • Upload date:
  • Size: 63.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.15.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/49.1.0 requests-toolbelt/0.9.1 tqdm/4.47.0 CPython/3.5.2

File hashes

Hashes for iceberg_penguins.search-0.3-py3-none-any.whl

  • SHA256: bc4b95e83a8b207e4f0cfb7cd1e73e40446bfd43ccffc58b0e470345d0e7bff4
  • MD5: 5aae0417da52fdf54e44df11db7740f5
  • BLAKE2b-256: 43f6630436bbbd663b146fd9bac654c65c9ca38f11ad56bdb5142424c7072089

See the pip documentation for more details on using hashes.
