
Run your BMI implementation in a separate process and expose it as BMI-python with GRPC

Project description


grpc4bmi

Purpose

This software allows you to wrap your BMI implementation (https://github.com/csdms/bmi) in a server process and communicate with it via the included Python client. The communication is serialized to protocol buffers with gRPC (https://grpc.io/) and runs over network ports.

Installation

Optionally, create a virtual environment and activate it. Then run

pip install grpc4bmi

on the client (Python) side. If your server model is implemented in Python, do the same in the server environment (e.g. a Docker container). If the model is implemented in R, instead run

pip install grpc4bmi[R]

in the server environment. For the bleeding-edge version from GitHub, use

pip install git+https://github.com/eWaterCycle/grpc4bmi.git#egg=grpc4bmi

Finally, if the model is implemented in C or C++, clone this Git repository and run

make ; make install

in the cpp folder.

Usage

Model written in Python

For inspiration, look at the example in the test directory. To start a server process that allows calls to your BMI implementation, type

run-bmi-server --name <PACKAGE>.<MODULE>.<CLASS> --port <PORT> --path <PATH>

where <PACKAGE> and <MODULE> are the Python package and module containing your implementation, <CLASS> is the name of your BMI model class, <PORT> is any available port on the host system, and the optional <PATH> denotes an additional directory that should be added to the Python path to make your implementation importable. The --name option above is optional; if it is not provided, the script looks at the environment variables BMI_PACKAGE, BMI_MODULE and BMI_CLASS. Similarly, the port can be defined by the environment variable BMI_PORT. This software assumes that your implementation's constructor takes no parameters.
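As an illustration, here is a minimal sketch of such an implementation; the package, module and class names (mymodel, bmi_model, MyBmi) and the port used below are hypothetical placeholders, and only a few BMI methods are shown.

# mymodel/bmi_model.py  (hypothetical package and module names)
class MyBmi:
    """Sketch of a BMI-style model class; a real model implements the full BMI."""

    def __init__(self):
        # run-bmi-server instantiates the class without arguments,
        # so the constructor must take no parameters
        self._name = "Hypothetical model"

    def initialize(self, config_file):
        pass  # read the configuration and set up the model state here

    def get_component_name(self):
        return self._name

    # ...update(), finalize(), get_value() and the other BMI methods go here...

This class could then be served with run-bmi-server --name mymodel.bmi_model.MyBmi --port 55555, or equivalently by exporting BMI_PACKAGE=mymodel, BMI_MODULE=bmi_model, BMI_CLASS=MyBmi and BMI_PORT=55555 before calling run-bmi-server.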

Model written in C/C++ (beta)

Create an executable along the lines of cpp/run-bmi-server.cc. You can copy the file and replace the function

Bmi* create_model_instance()
{
    /* Return your new BMI instance pointer here... */
}

with the instantiation of your model's BMI. The model needs to implement the CSDMS BMI for C, but you may also implement our more object-oriented C++ interface, BmiCppExtension.

Model written in R

The grpc4bmi Python package can also run BMI models written in R, provided the model is a subclass of AbstractBmi. See https://github.com/eWaterCycle/bmi-r for instructions on R and Docker.

Run the R model as a server with

run-bmi-server --lang R [--path <R file with BMI model>] --name [<PACKAGE>::]<CLASS> --port <PORT>

For example, with WALRUS use

run-bmi-server --lang R --path ~/git/eWaterCycle/grpc4bmi-examples/walrus/walrus-bmi.r --name WalrusBmi --port 50051

The client side

The client side has only a Python implementation. The default BMI client assumes a running server process on a given port.

from grpc4bmi.bmi_grpc_client import BmiClient
import grpc
mymodel = BmiClient(grpc.insecure_channel("localhost:<PORT>"))
print(mymodel.get_component_name())
mymodel.initialize("<FILEPATH>")
# ...further BMI calls...
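After connecting, the client drives the model with ordinary BMI calls. A minimal sketch of a run loop, with a hypothetical port and configuration file path:

from grpc4bmi.bmi_grpc_client import BmiClient
import grpc

# the port and configuration path below are hypothetical placeholders
mymodel = BmiClient(grpc.insecure_channel("localhost:55555"))
mymodel.initialize("/data/config.yaml")
while mymodel.get_current_time() < mymodel.get_end_time():
    mymodel.update()  # advance the model one time step
mymodel.finalize()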

The package also contains client implementations that own the server process, running the run-bmi-server script as a Python subprocess, in a Docker container, or in a Singularity container. For instance,

from grpc4bmi.bmi_client_subproc import BmiClientSubProcess
mymodel = BmiClientSubProcess("<PACKAGE>.<MODULE>.<CLASS>")

will automatically launch the server in a sub-process and

from grpc4bmi.bmi_client_docker import BmiClientDocker
mymodel = BmiClientDocker(<IMAGE>, <PORT>)

will launch a Docker container, assuming that a gRPC BMI server starts inside it and exposes port <PORT>. Likewise,

from grpc4bmi.bmi_client_singularity import BmiClientSingularity
mymodel = BmiClientSingularity(<IMAGE>, <PORT>)

will launch a Singularity container, assuming that a gRPC BMI server starts inside it and exposes port <PORT>.
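For example, a sketch of the Docker variant with a hypothetical image name and port; the image is assumed to start run-bmi-server and expose that port:

from grpc4bmi.bmi_client_docker import BmiClientDocker

# the image name and port are hypothetical placeholders
mymodel = BmiClientDocker("ewatercycle/mymodel-grpc4bmi:latest", 55555)
print(mymodel.get_component_name())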

Development: generating the gRPC code

When developers change the proto file, it is necessary to install the gRPC tools Python packages in your Python environment:

pip install -r requirements.txt
pip install -e .
# For R integration also install the R extras with
pip install -e .[R]

and install the C++ runtime and protoc command as described in https://github.com/google/protobuf/blob/master/src/README.md. After this, simply executing the proto_gen.sh script should do the job.
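As a rough sketch of what that regeneration amounts to, the Python stubs can also be produced with grpc_tools.protoc directly; the proto file path below is an assumption, and proto_gen.sh in the repository remains the authoritative recipe.

from grpc_tools import protoc

# the proto path is a hypothetical example; check the repository layout
protoc.main([
    "grpc_tools.protoc",
    "-I.",
    "--python_out=.",
    "--grpc_python_out=.",
    "grpc4bmi/bmi.proto",
])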

Future work

More language bindings are underway.

