Catalogs for known models

model_catalogs

Provides access through Intake catalogs to a set of ocean models, especially the NOAA OFS models.

Specific functionality includes:

  • Sets up an Intake catalog for known models to provide direct access to model output.
  • Provides access to model output as an xarray Dataset.
  • Models are known by their catalog files (see the set included in the package). They include:
    • NOAA OFS Models:
      • CBOFS
      • CIOFS
      • CREOFS
      • DBOFS
      • GOMOFS
      • LEOFS
      • LMHOFS
      • LOOFS
      • NGOFS2
      • NYOFS
      • SFBOFS
      • TBOFS
      • WCOFS
      • Full 3D fields, or regularly gridded or 2D versions when available
    • GFS models
    • Global GOFS HYCOM
  • Multiple time ranges and sources of model output are provided when known. For example, the NOAA OFS models all have both forecast and historical sources, and some have additional sources as well.
  • model_catalogs knows how to aggregate NOAA OFS model output between nowcast and forecast files.
  • Known models have cleaned-up and filled-in metadata so they are easy to work with in xarray and with cf-xarray.
    • cf-xarray will understand dimension and coordinate names, as well as a set of standard_names mapped to the variables.
  • Metadata about models is included in the Intake catalogs, such as:
    • polygon boundary of numerical domain
    • grid parameters
    • arguments for optimal read-in with xarray
  • The time availability of each model source can be queried.
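As a stdlib-only sketch (not the library's API), the idea behind these per-source availability queries — choosing between a forecast source and a historical source depending on the requested date — can be illustrated like this. The source names and date windows below are hypothetical:

```python
from datetime import datetime, timedelta

# Hypothetical availability windows for one model's sources, mirroring the
# forecast/historical split described above (all dates are illustrative only).
now = datetime(2022, 10, 1)
sources = {
    "forecast": (now - timedelta(days=30), now + timedelta(days=2)),
    "historical": (datetime(2019, 1, 1), now - timedelta(days=28)),
}

def pick_source(when, sources):
    """Return the name of the first source whose availability window covers `when`."""
    for name, (start, end) in sources.items():
        if start <= when <= end:
            return name
    raise ValueError(f"no source covers {when:%Y-%m-%d}")

# A recent date resolves to the forecast source...
recent = pick_source(now - timedelta(days=1), sources)
# ...while an older date falls back to the historical source.
old = pick_source(datetime(2020, 6, 15), sources)
```

In the real package, the availability windows come from the model servers themselves rather than being hard-coded.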

Installation

PyPI

To install from PyPI:

pip install model_catalogs

Then save the pip-requirements.txt file locally and install a few more packages with:

pip install -r pip-requirements.txt

These must be installed separately because some packages require updates available only on GitHub that have not yet been included in any published release.

Install Optional Dependencies

Install additional dependencies for full functionality and running the demonstration notebooks. Activate your Python environment, then:

$ mamba install -c conda-forge --file model_catalogs/conda-requirements-opt.txt

or use conda in place of mamba if you don't have mamba installed.

Develop Package

Choose environment approach

Use provided environment

Clone the repo:

$ git clone http://github.com/NOAA-ORR-ERD/model_catalogs.git

In the model_catalogs directory, install conda environment:

$ conda env create -f environment.yml

Install model_catalogs into new environment (still in model_catalogs directory):

$ conda activate model_catalogs
$ pip install -e .

Use other environment

Alternatively, if you have an existing environment you want to add to, clone the repo:

$ git clone http://github.com/NOAA-ORR-ERD/model_catalogs.git
$ cd model_catalogs

Make sure the desired environment is activated and then:

$ conda install -c conda-forge --file conda-requirements.txt
$ pip install -r pip-requirements.txt

Install model_catalogs into the environment (still in model_catalogs directory):

$ pip install -e .

Install development packages

To develop the code, follow instructions above for "Use provided environment" or "Use other environment" as appropriate. Then you can install additional dependencies for development and testing with

$ conda install -c conda-forge --file conda-requirements-dev.txt

Run tests

Run tests that haven't been marked as "slow" with

$ pytest

Run all tests, including slow tests, with:

$ pytest --runslow
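The "slow" marking relies on a custom `--runslow` option. model_catalogs' own conftest.py may differ in detail, but a flag like this is typically built on the standard pytest recipe:

```python
# conftest.py — the standard pytest recipe for skipping tests marked "slow"
# unless --runslow is passed (model_catalogs' actual conftest may differ).
import pytest

def pytest_addoption(parser):
    # Register the --runslow command-line flag.
    parser.addoption("--runslow", action="store_true", default=False,
                     help="run tests marked as slow")

def pytest_configure(config):
    # Register the "slow" marker so pytest doesn't warn about it.
    config.addinivalue_line("markers", "slow: mark test as slow to run")

def pytest_collection_modifyitems(config, items):
    # Without --runslow, attach a skip marker to every test marked "slow".
    if config.getoption("--runslow"):
        return
    skip_slow = pytest.mark.skip(reason="need --runslow option to run")
    for item in items:
        if "slow" in item.keywords:
            item.add_marker(skip_slow)
```

Individual tests are then marked with `@pytest.mark.slow`.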

Check precommits locally before pushing

To check the code locally before committing and pushing it to GitHub, run

$ pre-commit run --all-files

These checks can modify your files, so it is best to review the changes before pushing to GitHub.

Compile docs

Compile the docs locally after installing the developer packages (see "Install development packages"), or after creating and activating the docs environment with

$ conda env create -f docs/environment.yml

Navigate to the docs folder and build the html documentation with

$ make html

Finally, make sure the documentation looks right by opening "_build/html/index.html" in a browser.
