Tool to run local tests for AnalysisProductions

LbAPLocal

LbAPLocal is the Python library for running offline tests for the LHCb AnalysisProductions framework.

Usage

LbAPLocal is installed by default with the LHCb environment on lxplus. Users on external clusters can source the LHCb environment from CVMFS to get set up:

source /cvmfs/lhcb.cern.ch/lib/LbEnv

Once the environment is available, LbAPLocal can be run from the command line with the following options:

Usage: lb-ap [OPTIONS] COMMAND [ARGS]...

  Command line tool for the LHCb AnalysisProductions

Options:
  --version
  --help     Show this message and exit.

Commands:
  list       List the available production folders by running lb-ap list...
  render     Render the info.yaml for a given production
  validate   Validate the configuration for a given production
  test       Execute a job locally
  debug      Start an interactive session inside the job's environment
  reproduce  Reproduce an existing online test locally
  parse-log  Read a Gaudi log file and extract information

To see which productions are available:

$ lb-ap list
The available productions are:
* MyAnalysis

To see which jobs are available for a given production:

$ lb-ap list MyAnalysis
The available jobs for MyAnalysis are:
* My2016MagDownJob
* My2016MagUpJob

To render the templating in info.yaml for a given production:

$ lb-ap render MyAnalysis

To validate the configuration of a given production:

$ lb-ap validate MyAnalysis
Rendering info.yaml for MyAnalysis
YAML parsed successfully
YAML validated successfully

To run a test of a job interactively:

$ lb-ap debug MyAnalysis My2016MagDownJob

Welcome to analysis productions debug mode:

The production can be tested by running:

gaudirun.py -T '$ANALYSIS_PRODUCTIONS_DYNAMIC/Lb2Lll/MC_2017_MagDown_Lb2PsiL_mm_strip_autoconf.py' '$ANALYSIS_PRODUCTIONS_BASE/Lb2Lll/stripping_seq.py' prodConf_DaVinci_00012345_00006789_1.py

[DaVinci v45r5] output $

If the debug job uses the output of another job as input, and that output is not provided with -i <output_file_location>, then the upstream job will be tested first (non-interactively) and its output file location passed to the debug job.
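For example, a previously produced upstream output can be supplied explicitly; the file path below is only a placeholder:

$ lb-ap debug MyAnalysis My2016MagDownJob -i path/to/upstream_job_output.dst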

To test a job non-interactively:

$ lb-ap test MyAnalysis My2016MagDownJob
Success! Output can be found in xxxxxxxxxxxx

If the test job uses the output of another job as input, that job can be tested first (interactively or not) and its output file location passed to the test command with -i <output_file_location>. If the file location is not specified, the dependent job will be run first and its output passed to the test job automatically.
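For instance, reusing the output of a previously tested upstream job (again, the path is only a placeholder):

$ lb-ap test MyAnalysis My2016MagDownJob -i path/to/upstream_job_output.dst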

To read a Gaudi log file and extract information:

$ lb-ap parse-log Job.log
Summary of log messages in: Job.log
    Found 2659 ERROR messages
        * 2649 instances of "*** Flag container MC/TrackInfo not found."
        * 9 instances of "HltSelReportsDecoder::   Failed to add Hlt selection name Hlt2RecSummary to its container "
        * 1 instances of "HltSelReportsDecoder:: The   ERROR message is suppressed : '  Failed to add Hlt selection name Hlt2RecSummary to its container '"
    Found 61 WARNING messages
        * 7 instances of "TupleToolBremInfo:: TupleToolBremInfo requires fullDST -  BremP and BremOrigin might not be reliable (Multiplicity is OK)"
        and 54 others (50 unique), pass "--suppress=0" to show all messages

Errors have been detected!
  * Lines: 3275, 3277, 3279, 3281, 3283 and 17 others
    This message indicates the location specified for the information being accessed by
    RelatedInfo does not exist. It is likely that either:

    * The location specified is incorrect, try looking for it with dst-dump.
    * The given information was never stored for that candidate, in which case the use of
    RelatedInfo should be removed.

General explanations
  * Line: 6318
    Histograms are not being saved as no filename has been specified for storing them. This
    message is harmless and normally ignored.
Error: Found issues in log
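By default, repeated messages are truncated in the summary. As the output above notes, passing "--suppress=0" shows all of them (assuming the option is given directly to parse-log):

$ lb-ap parse-log Job.log --suppress=0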

