
Catalogs for known models


model_catalogs


Provides access through Intake catalogs to a set of ocean models, especially the NOAA OFS models.

Specific functionality includes:

  • Sets up an Intake catalog for known models to provide direct access to model output.
  • Provides access to model output as an xarray Dataset.
  • Models are known by their catalog files; see the set included in the repository. They include:
    • NOAA OFS Models:
      • CBOFS
      • CIOFS
      • CREOFS
      • DBOFS
      • GOMOFS
      • LEOFS
      • LMHOFS
      • LOOFS
      • NGOFS2
      • NYOFS
      • SFBOFS
      • TBOFS
      • WCOFS
      • Full 3D fields, or regularly gridded or 2D versions when available
    • GFS models
    • Global GOFS HYCOM
  • Multiple time ranges and sources of model output are provided when known. For example, the NOAA OFS models all have both forecast and historical sources, and some have additional sources as well.
  • model_catalogs knows how to aggregate NOAA OFS model output between nowcast and forecast files.
  • Known models have cleaned-up and filled-in metadata so they are easy to work with in xarray and with cf-xarray.
    • cf-xarray will understand dimension and coordinate names, as well as a set of standard_names mapped to the variables.
  • Metadata about models is included in the Intake catalogs, such as:
    • polygon boundary of numerical domain
    • grid parameters
    • arguments for optimal read-in with xarray
  • The availability of each model source can be queried.
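The nowcast/forecast aggregation idea can be pictured with a small pure-Python sketch. This is not the library's actual implementation, and the file names are hypothetical (loosely modeled on NOAA OFS naming); the point is simply that nowcast output is preferred wherever it exists and forecast files fill in the remaining times.

```python
from datetime import datetime

def aggregate(nowcast, forecast):
    """Combine nowcast and forecast file lists into one time-ordered record,
    preferring nowcast output wherever both cover the same model time.
    Each input maps a model time (datetime) to a file name."""
    combined = dict(forecast)  # start with forecast coverage
    combined.update(nowcast)   # nowcast wins where times overlap
    return [combined[t] for t in sorted(combined)]

# hypothetical file lists for a single model cycle
nowcast = {
    datetime(2022, 9, 1, 0): "nos.cbofs.n000.nc",
    datetime(2022, 9, 1, 6): "nos.cbofs.n006.nc",
}
forecast = {
    datetime(2022, 9, 1, 6): "nos.cbofs.f000.nc",
    datetime(2022, 9, 1, 12): "nos.cbofs.f006.nc",
}

print(aggregate(nowcast, forecast))
# nowcast file kept at the overlapping 06:00 time; forecast fills the rest
```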
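One use of the polygon boundary stored in the catalog metadata is checking whether a point of interest falls inside a model's domain. A minimal sketch using a ray-casting point-in-polygon test follows; the rectangular "domain" here is made up for illustration, not a real model boundary.

```python
def point_in_polygon(lon, lat, polygon):
    """Ray-casting point-in-polygon test.
    polygon is a list of (lon, lat) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # does this edge cross the horizontal ray at latitude `lat`?
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# hypothetical rectangular domain boundary, as (lon, lat) vertices
domain = [(-77.5, 36.0), (-75.5, 36.0), (-75.5, 39.5), (-77.5, 39.5)]

print(point_in_polygon(-76.5, 38.0, domain))  # inside the domain
print(point_in_polygon(-70.0, 38.0, domain))  # outside the domain
```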

Installation

To use the provided environment

Clone the repo:

$ git clone http://github.com/NOAA-ORR-ERD/model_catalogs.git

In the model_catalogs directory, create the conda environment:

$ conda env create -f environment.yml

Alternatively, if you have an existing environment you want to add to:

$ conda install --file conda-requirements.txt
$ pip install -r pip-requirements.txt

Install model_catalogs into the new environment (still in the model_catalogs directory):

$ conda activate model_catalogs
$ pip install -e .

To install alongside LibGOODS requirements

Clone the LibGOODS repo:

$ git clone http://github.com/NOAA-ORR-ERD/LibGOODS.git

Navigate to the LibGOODS directory and then:

conda create --name libgoods_env  # create new environment, if you want
conda activate libgoods_env  # activate whichever environment you want to use
conda install -c conda-forge mamba  # mamba installs packages fast
mamba install -c conda-forge --file libgoods/conda_requirements.txt  # install LibGOODS conda requirements

Clone the model_catalogs repo in a good location:

$ git clone http://github.com/NOAA-ORR-ERD/model_catalogs.git

Navigate to the model_catalogs directory, then:

mamba install -c conda-forge --file conda-requirements.txt  # install model_catalogs conda requirements
pip install -r pip-requirements.txt  # install model_catalogs pip requirements

Install model_catalogs locally into environment:

pip install -e .

Install Optional Dependencies

Install additional dependencies for full functionality and running the demonstration notebooks. Activate your Python environment, then:

$ mamba install -c conda-forge --file model_catalogs/conda-requirements-opt.txt

or use conda in place of mamba if you don't have mamba installed.

Run demo

You can then open Jupyter lab from a terminal window with:

$ jupyter lab

Then double-click the "demo.ipynb" notebook and run through the cells with Shift-Enter.

Develop Package

To develop the code, follow instructions above for "To use provided environment". Then you can install additional dependencies for development and testing with

$ conda install --file requirements-dev.txt

Run tests

Run tests that haven't been marked as "slow" with

$ pytest

Run all tests, including slow tests, with:

$ pytest --runslow

Note that the slow tests are not run during CI.
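The `--runslow` flag is not built into pytest; it is typically wired up in `conftest.py`. The sketch below is the standard pattern from the pytest documentation for skipping tests marked `@pytest.mark.slow` unless the option is given; the project's actual `conftest.py` may differ.

```python
# conftest.py -- enables `pytest --runslow` to include tests marked as slow
import pytest

def pytest_addoption(parser):
    parser.addoption(
        "--runslow", action="store_true", default=False, help="run slow tests"
    )

def pytest_configure(config):
    config.addinivalue_line("markers", "slow: mark test as slow to run")

def pytest_collection_modifyitems(config, items):
    if config.getoption("--runslow"):
        return  # --runslow given: do not skip slow tests
    skip_slow = pytest.mark.skip(reason="need --runslow option to run")
    for item in items:
        if "slow" in item.keywords:
            item.add_marker(skip_slow)
```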

Check precommits locally before pushing

To check code locally before committing and pushing it to GitHub, run

$ pre-commit run --all-files

These checks can change your files, so it is best to review the changes before pushing to GitHub.

Download files

Source Distribution

  • model_catalogs-0.1.1.tar.gz (437.7 kB)
    • Uploaded via: twine/4.0.1 CPython/3.9.13
    • SHA256: dfea6154daf24fd482f6263d502e4cb304125a52887122d3d8543bfe392ba0b4
    • MD5: 09835edcde7824341e6d7f21990727b9
    • BLAKE2b-256: fed5da3f4099baca049e98fafeb2e413733a31f4aa64bd572a10b02667f6ccee

Built Distribution

  • model_catalogs-0.1.1-py3-none-any.whl (184.1 kB)
    • SHA256: cd0c2838ee74e233fa929236034cdffe9346682b9bfd1785d6d9fdcd6423a544
    • MD5: 3fe5c127d8d922cd74164b1146681f33
    • BLAKE2b-256: 9a1a2a5e3cd2901c37ccf53080a3c7cc3a7cc8514a9a4e58334662a230166632
