
Doing Jobs Dynamically with Azkaban



Auror Core

Simple Flow creation for Azkaban


Install

pip install auror_core

Supported Job Types

V1

  • Command
  • Flow

V2

  • Command

Usage

Creating a simple Azkaban flow with one command

You just need to import the job type and the project:

from auror_core.v1.job import Job, Command
from auror_core import Project

com1 = Job()\
.as_type(Command)\
.with_name("commands job")\
.with_command("bash echo 1")\
.with_another_command("bash echo 2")

Project("folder_to_generate_files", com1).write()
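For reference, Azkaban V1 describes each job in a .job properties file, so the project above should generate something along these lines (the exact file name and keys depend on auror_core's output; this is a sketch of Azkaban's V1 job format):

```properties
# commands job.job (sketch of the expected output)
type=command
command=bash echo 1
command.1=bash echo 2
```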

Creating a simple V2 Azkaban flow with one command

The V2 flow is implemented under the v2 subfolder, with the same job types:

from auror_core.v2.job import Job, Command
from auror_core import Project

com1 = Job()\
.as_type(Command)\
.with_name("commands job")\
.with_command("bash echo 1")\
.with_another_command("bash echo 2")

Project("folder_to_generate_files", com1).is_v2().write()

Creating Flows with dependencies

from auror_core.v2.job import Job, Command
from auror_core import Project

com1 = Job()\
.as_type(Command)\
.with_name("commands job")\
.with_command("bash echo 1")\
.with_another_command("bash echo 2")

com2 = Command()\
.with_name("sub command job")\
.with_command("bash echo 1")\
.with_dependencies(com1)

Project("folder_to_generate_files", com1, com2).is_v2().write()
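In Flow 2.0, Azkaban reads the whole flow from a single YAML file, so the generated output for the project above should look roughly like the sketch below (field names follow Azkaban's Flow 2.0 format; auror_core's exact output may differ):

```yaml
nodes:
  - name: commands job
    type: command
    config:
      command: bash echo 1
      command.1: bash echo 2
  - name: sub command job
    type: command
    config:
      command: bash echo 1
    dependsOn:
      - commands job
```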

Sharing job attributes

Organize jobs that share the same configuration:

from auror_core.v2.job import Command
from auror_core import Project

com = Command()\
.with_command("bash echo 1")

com1 = com.with_name("commands job")\
.with_another_command("bash echo 2")

com2 = com.with_name("sub command job")\
.with_dependencies(com1)

Project("folder_to_generate_files", com1, com2).is_v2().write()
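The reuse above works because each with_* call is assumed to return a copy rather than mutate the job in place, so `com` stays a clean template. A minimal sketch of that copy-on-write pattern in plain Python (the class and method names here are illustrative, not auror_core's actual implementation):

```python
import copy

class FluentJob:
    """Illustrative copy-on-write builder: each call returns a new object."""

    def __init__(self):
        self.extra = {}

    def _clone(self, **changes):
        # Deep-copy so derived jobs never share mutable state with the template.
        new = copy.deepcopy(self)
        new.extra.update(changes)
        return new

    def with_name(self, name):
        return self._clone(name=name)

    def with_command(self, command):
        return self._clone(command=command)

base = FluentJob().with_command("bash echo 1")
job_a = base.with_name("commands job")
job_b = base.with_name("sub command job")

# The shared template is untouched; each derived job carries its own name.
assert "name" not in base.extra
assert job_a.extra["name"] == "commands job"
assert job_b.extra["name"] == "sub command job"
```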

Job with extra customization and configuration

Simulating a Command with base Job (NOT RECOMMENDED)

from auror_core.v1.job import Job
from auror_core import Project

com1 = Job()\
.with_name("commands job")\
.with_(command="bash echo 1")

com1._type = "command"

Project("folder_to_generate_files", com1).write()

Integrating with Flow (just for V1)

V2 already has flows built in.

from auror_core.v1.job import Command, Flow, Job
from auror_core import Project

com1 = Command()\
.with_name("commands job")\
.with_command("bash echo 1")

flow = Job()\
.as_type(Flow)\
.with_name("flow")\
.with_dependencies(com1)

Project("folder_to_generate_files", com1, flow).write()

Using Flow Params

from auror_core.v2.job import Command
from auror_core.v2.params import Params
from auror_core import Project

params = Params(
    teste1="my test",
    variable="my variable"
)

com = Command()\
.with_command("bash echo ${variable}")

Project("folder_to_generate_files", com)\
.is_v2()\
.with_params(params)\
.write()
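Flow parameters end up as flow-level config in the Flow 2.0 YAML, roughly like this (a hedged sketch; auror_core's actual output may differ):

```yaml
config:
  teste1: my test
  variable: my variable

nodes:
  - type: command
    config:
      command: bash echo ${variable}
```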

Using Flow Environment Variables

from auror_core.v2.job import Command
from auror_core.v2.params import Env
from auror_core import Project

env = Env(
    TESTE="my test",
    VARIABLE="my variable"
)

com = Command()\
.with_command("bash echo $VARIABLE")

Project("folder_to_generate_files", com)\
.is_v2()\
.with_params(env)\
.write()

Using Flow Environment Variables and Params

from auror_core.v2.job import Command
from auror_core.v2.params import Env, Params
from auror_core import Project

env = Env(
    TESTE="my test",
    VARIABLE="my variable"
)

params = Params(
    teste1="my test",
    variable="my variable"
)

com = Command()\
.with_command("bash echo $VARIABLE ${variable}")

Project("folder_to_generate_files", com)\
.is_v2()\
.with_params(params, env)\
.write()

Join multiple variables in one

from auror_core.v2.job import Command
from auror_core.v2.params import Env, Params, ParamsJoin
from auror_core import Project

env = Env(
    TESTE="env test",
    VARIABLE="env variable"
)

params = Params(
    teste1="my test",
    variable="my variable"
)

one_param = ParamsJoin("params_strange_name", ",") ## param name and separator

com = Command()\
.with_command("bash echo ${params_strange_name}") 
## it will print: my test,my variable,env test,env variable
## THERE IS NO ORDER GUARANTEE, JUST Python 3.6 >

Project("folder_to_generate_files", com)\
.is_v2()\
.with_params(one_param(params, env))\
.write()
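Conceptually, ParamsJoin collapses several parameter groups into a single flow parameter by joining their values with the chosen separator. A rough stand-in in plain Python (the function name and dict-based groups are hypothetical, not the library's code):

```python
def join_params(param_name, separator, *groups):
    """Merge the values of several param dicts into one joined string."""
    values = [value for group in groups for value in group.values()]
    return {param_name: separator.join(values)}

params = {"teste1": "my test", "variable": "my variable"}
env = {"TESTE": "env test", "VARIABLE": "env variable"}

joined = join_params("params_strange_name", ",", params, env)
# On Python 3.7+, dict insertion order makes the result deterministic.
assert joined == {"params_strange_name": "my test,my variable,env test,env variable"}
```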

Load jobs from a YAML file (just for V2)

You can find some YAML file examples in the Azkaban Flow documentation.

from auror_core.v2.loader import Loader

loader = Loader('/path/to/file/flow.yaml')
jobs = loader.as_job_objects()

Or you can export these jobs as a Python file:

from auror_core.v2.loader import Loader

loader = Loader('/path/to/file/flow.yaml')
jobs = loader.as_python_file('/path/to/desired/directory') # will be dumped with 'flow.py' name

Dump memory flows to a Python File (just for V2)

from auror_core.v2.dumper import Dumper
from auror_core.v2.job import Job

com1 = Job() \
.with_name("commands job 1") \
.with_(command="bash echo 1")

com2 = Job()\
.with_name("commands job 2")\
.with_(command="bash echo 2")

dumper = Dumper('/path/to/desired/directory') # will be dumped with 'flow.py' name
dumper.dump_jobs(com1, com2)

Plugins

Plugins are just extensions of auror_core.

There is also a cookiecutter for new Azkaban jobtypes with an Auror template: https://github.com/globocom/azkaban-jobtype-cookiecutter

There is already an email plugin: https://github.com/globocom/azkaban-jobtype-email

Contribute

For development and contributing, please follow the Contributing Guide and ALWAYS respect the Code of Conduct.
