
Front-end for the ServiceX Data Server


ServiceX_frontend

Client access library for ServiceX


Introduction

Given a selection string, this library manages submitting it to a ServiceX instance and retrieving the data locally for you. The selection string is often generated by another front-end library (for example, a func_adl-based query library that produces qastle) rather than written by hand.

Prerequisites

Before you can use this library you'll need:

  • An environment based on Python 3.6 or later
  • A ServiceX end-point. For example, http://localhost:5000/servicex, if ServiceX is running on a local Kubernetes (k8s) cluster and the proper ports are open, or the public ServiceX instance (contact IRIS-HEP at xxx if you are part of the LHC to request an account, or for help setting up an instance).

How to access your endpoint

The servicex library searches for configuration information in several locations to determine what end-point it should connect to, in the following order:

  1. A .servicex file in the current working directory
  2. A .servicex file in the user's home directory ($HOME on Linux and Mac, and your profile directory on Windows).
  3. The config_defaults.yaml file distributed with the servicex package.

If no endpoint is specified, then the library defaults to the developer endpoint, which is http://localhost:5000 for the web-service API, and localhost:9000 for the minio endpoint. No passwords are required.

Create a .servicex file, in YAML format, in the appropriate location for your work, containing the following:

api_endpoint:
  endpoint: <your-endpoint>
  username: <api-username>
  password: <api-password>

  minio_endpoint: <minio-endpoint>
  minio_username: <minio-accesskey>
  minio_password: <minio-secretkey>

Finally, you can construct the ServiceXAdaptor and MinioAdaptor objects by hand in your code and pass them as arguments to ServiceXDataset, injecting custom endpoints, usernames, and passwords and bypassing the configuration system entirely. This is probably only useful for advanced users.
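
For illustration, a minimal sketch of that advanced path might look like the following. The adaptor import paths, constructor arguments, and the ServiceXDataset keyword names (servicex_adaptor, minio_adaptor) are assumptions here; check them against the signatures in your installed version.

    from servicex import ServiceXDataset
    from servicex.servicex_adaptor import ServiceXAdaptor  # import path is an assumption
    from servicex.minio_adaptor import MinioAdaptor        # import path is an assumption

    # Hypothetical endpoints and credentials, injected directly rather than
    # read from a .servicex configuration file.
    sx = ServiceXAdaptor('http://localhost:5000/servicex',
                         username='<api-username>',
                         password='<api-password>')
    minio = MinioAdaptor('localhost:9000',
                         access_key='<minio-accesskey>',
                         secretkey='<minio-secretkey>')

    ds = ServiceXDataset('<dataset-name>',
                         servicex_adaptor=sx,
                         minio_adaptor=minio)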

Usage

The following lines will return a pandas.DataFrame containing all the jet pT values from an ATLAS xAOD Z->ee Monte Carlo sample:

    from servicex import ServiceXDataset
    query = "(call ResultTTree (call Select (call SelectMany (call EventDataset (list 'localds:bogus')) (lambda (list e) (call (attr e 'Jets') 'AntiKt4EMTopoJets'))) (lambda (list j) (/ (call (attr j 'pt')) 1000.0))) (list 'JetPt') 'analysis' 'junk.root')"
    dataset = "mc15_13TeV:mc15_13TeV.361106.PowhegPythia8EvtGen_AZNLOCTEQ6L1_Zee.merge.DAOD_STDM3.e3601_s2576_s2132_r6630_r6264_p2363_tid05630052_00"
    ds = ServiceXDataset(dataset)
    r = ds.get_data_pandas_df(query)
    print(r)

And the output in a terminal window from running the above script (takes about 1-2 minutes to complete):

python scripts/run_test.py http://localhost:5000/servicex
            JetPt
entry
0       38.065707
1       31.967096
2        7.881337
3        6.669581
4        5.624053
...           ...
710183  42.926141
710184  30.815709
710185   6.348002
710186   5.472711
710187   5.212714

[11355980 rows x 1 columns]

If your query is badly formed, or there is another problem with the backend, an exception will be thrown with information about the error.

If you'd like to submit multiple queries and have them run on the ServiceX backend in parallel, it is best to use the asyncio interface, which has an identical signature but is called get_data_pandas_df_async.
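
For instance, a short sketch of running two queries concurrently (dataset, query_1, and query_2 are placeholders for your own dataset name and qastle selections):

    import asyncio
    from servicex import ServiceXDataset

    async def fetch_both(dataset: str, query_1: str, query_2: str):
        ds = ServiceXDataset(dataset)
        # Both requests are submitted to ServiceX at once and awaited together.
        return await asyncio.gather(
            ds.get_data_pandas_df_async(query_1),
            ds.get_data_pandas_df_async(query_2),
        )

    # asyncio.run requires Python 3.7+; on 3.6 use an event loop directly.
    df1, df2 = asyncio.run(fetch_both(dataset, query_1, query_2))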

For documentation of get_data and get_data_async see the servicex.py source file.

Features

Implemented:

  • Accepts a qastle formatted query
  • Exceptions are used to report back errors of all sorts from the service to the user's code.
  • Data is returned in the following forms:
    • pandas.DataFrame - an in-process DataFrame of all the data requested
    • awkward - an in-process JaggedArray or dictionary of JaggedArrays
    • A list of ROOT files that can be opened with uproot and used as desired.
    • Not all output formats are compatible with all transformations.
  • Complete returned data must fit in the process's memory.
  • Runs in async or non-async environments; the non-async methods accommodate this automatically (including in Jupyter notebooks).
  • Supports up to 100 simultaneous queries from a laptop-like front end without overwhelming the local machine (hopefully ServiceX will be overwhelmed!).
  • Starts downloading files as soon as they are ready (before ServiceX is done with the complete transform).
  • It has been tested to run against 100 datasets with multiple simultaneous queries.
  • It supports local caching of query data.
  • It will provide feedback on progress.
  • Configuration files are supported so that user identification information does not have to be checked into repositories.

Testing

This code has been tested in several environments:

  • Windows, Linux, macOS
  • Python 3.6, 3.7, 3.8
  • Jupyter notebooks (not automated) and regular Python source files invoked from the command line

API

Everything is based around the ServiceXDataset object. Below is the documentation for the most common parameters.

|  ServiceXDataset(dataset: str,
            image: str = 'sslhep/servicex_func_adl_xaod_transformer:v0.4',
            storage_directory: Union[str, NoneType] = None,
            file_name_func: Union[Callable[[str, str], pathlib.Path], NoneType] = None,
            status_callback_factory: Callable[[str], Callable[[Union[int, NoneType], int, int, int], NoneType]] = _run_default_wrapper)
 |      Create and configure a ServiceX object for a dataset.
 |
 |      Arguments
 |
 |          dataset                     Name of a dataset from which queries will be selected.
 |          image                       Name of transformer image to use to transform the data
 |          storage_directory           Location to cache data that comes back from ServiceX. Data
 |                                      can be reused between invocations.
 |          file_name_func              Allows for unique naming of the files that come back.
 |                                      Rarely used.
 |          status_callback_factory     Factory to create a status notification callback for each
 |                                      query. One is created per query.
 |
 |
 |      Notes:
 |
 |          -  The `status_callback` argument, by default, uses the `tqdm` library to render
 |             progress bars in a terminal window or a graphic in a Jupyter notebook (with proper
 |             jupyter extensions installed). If `status_callback` is specified as None, no
 |             updates will be rendered. A custom callback function can also be specified which
 |             takes `(total_files, transformed, downloaded, skipped)` as an argument. The
 |             `total_files` parameter may be `None` until the system knows how many files need to
 |             be processed (and some files can even be completed before that is known).
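
As an illustration, here is a minimal custom factory matching the signature above; the string passed to the factory is assumed to identify the dataset for the query:

    from typing import Optional
    from servicex import ServiceXDataset

    def my_status_factory(dataset_name: str):
        # Called once per query; returns the callback that receives progress updates.
        def on_update(total_files: Optional[int], transformed: int,
                      downloaded: int, skipped: int):
            total = '?' if total_files is None else total_files
            print(f'{dataset_name}: {transformed}/{total} transformed, '
                  f'{downloaded} downloaded, {skipped} skipped')
        return on_update

    ds = ServiceXDataset('<dataset-name>', status_callback_factory=my_status_factory)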

To get the data, use one of the get_data methods. They all have the same API, differing only in what they return.

 |  get_data_awkward_async(self, selection_query: str) -> Dict[bytes, Union[awkward.array.jagged.JaggedArray, numpy.ndarray]]
 |      Fetch query data from ServiceX matching `selection_query` and return it as
 |      dictionary of awkward arrays, an entry for each column. The data is uniquely
 |      ordered (the same query will always return the same order).
 |
 |  get_data_awkward(self, selection_query: str) -> Dict[bytes, Union[awkward.array.jagged.JaggedArray, numpy.ndarray]]
 |      Fetch query data from ServiceX matching `selection_query` and return it as
 |      dictionary of awkward arrays, an entry for each column. The data is uniquely
 |      ordered (the same query will always return the same order).
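
For example, with the Z->ee query from the Usage section above, the returned dictionary is keyed by column name as bytes (per the signature above):

    # ds and query are the ServiceXDataset and qastle string from the Usage section.
    arrays = ds.get_data_awkward(query)
    jet_pts = arrays[b'JetPt']   # JaggedArray (or numpy array) for the 'JetPt' column
    print(jet_pts)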

Each data type comes in a pair - an async version and a synchronous version.

  • get_data_awkward_async, get_data_awkward - Returns a dictionary of the requested data as numpy or JaggedArray objects.
  • get_data_rootfiles, get_data_rootfiles_async - Returns a list of locally downloaded files (as pathlib.Path objects) containing the requested data. Suitable for opening with ROOT::TFile or uproot (see the sketch after this list).
  • get_data_pandas_df, get_data_pandas_df_async - Returns the data as a pandas DataFrame. This will fail if the data you've requested has any structure (e.g. it is hierarchical, with an entry for each event where each event may have some number of jets).
  • get_data_parquet, get_data_parquet_async - Returns a list of files locally downloaded that can be read by any parquet tools.
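
As a quick sketch, here is one way to fetch the ROOT files for the Usage query and open the first one with uproot (the tree name 'analysis' comes from that query):

    import uproot

    # ds and query are the ServiceXDataset and qastle string from the Usage section.
    files = ds.get_data_rootfiles(query)   # list of pathlib.Path objects
    f = uproot.open(str(files[0]))
    tree = f['analysis']
    print(tree.keys())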

Development

For any changes please feel free to submit pull requests!

To do development, please set up your environment with the following steps:

  1. A python 3.7 development environment
  2. Fork/Pull down this package, XX
  3. python -m pip install -e .[test]
  4. Run the tests to make sure everything is good: pytest.

Then add tests as you develop. When you are done, submit a pull request with any required changes to the documentation and the online tests will run.

