
An AWS Aurora Serverless Data API dialect for SQLAlchemy


This package provides a SQLAlchemy dialect for accessing PostgreSQL and MySQL databases via the AWS Aurora Data API.

Installation

pip install sqlalchemy-aurora-data-api

Prerequisites

  • Set up an AWS Aurora Serverless cluster and enable Data API access for it. If you have previously set up an Aurora Serverless cluster, you can enable Data API with the following AWS CLI command:

    aws rds modify-db-cluster --db-cluster-identifier DB_CLUSTER_NAME --enable-http-endpoint --apply-immediately
  • Save the database credentials in AWS Secrets Manager using a format expected by the Data API (a JSON object with the keys username and password):

    aws secretsmanager create-secret --name rds-db-credentials/MY_DB
    aws secretsmanager put-secret-value --secret-id rds-db-credentials/MY_DB --secret-string "$(jq -n '.username=env.PGUSER | .password=env.PGPASSWORD')"
  • Configure your AWS command line credentials using standard AWS conventions. You can verify that everything works correctly by running a test query via the AWS CLI:

    aws rds-data execute-statement --resource-arn RESOURCE_ARN --secret-arn SECRET_ARN --sql "select * from pg_catalog.pg_tables"
    • Here, RESOURCE_ARN refers to the Aurora RDS database ARN, which can be found in the AWS RDS Console (click on your database, then “Configuration”) or in the CLI by running aws rds describe-db-clusters. SECRET_ARN refers to the AWS Secrets Manager secret created above.

    • When running deployed code (on an EC2 instance, ECS/EKS container, or Lambda), you can use the managed IAM policy AmazonRDSDataFullAccess to grant your IAM role permissions to access the RDS Data API (while this policy is convenient for testing, we recommend that you create your own scoped down least-privilege policy for production applications).
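The CLI smoke test above has a programmatic equivalent. As a minimal sketch, the hypothetical helper below (the function name and placeholder ARNs are illustrative, not part of this package) assembles the keyword arguments for the Data API's ExecuteStatement call:

```python
# Hypothetical helper mirroring the CLI smoke test above: it assembles the
# keyword arguments for the Data API's ExecuteStatement call.
def build_test_query(resource_arn, secret_arn,
                     sql="select * from pg_catalog.pg_tables"):
    return {"resourceArn": resource_arn, "secretArn": secret_arn, "sql": sql}

# With boto3 installed and AWS credentials configured, the equivalent of the
# CLI call would be:
#   import boto3
#   client = boto3.client("rds-data")
#   response = client.execute_statement(**build_test_query(RESOURCE_ARN, SECRET_ARN))
```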

Usage

The package registers two SQLAlchemy dialects, mysql+auroradataapi:// and postgresql+auroradataapi://. Two sqlalchemy.create_engine() connect_args keyword arguments are required to connect to the database:

  • aurora_cluster_arn (also referred to as resourceArn in the Data API documentation)

    • If not given as a keyword argument, this can also be specified using the AURORA_CLUSTER_ARN environment variable

  • secret_arn (the database credentials secret)

    • If not given as a keyword argument, this can also be specified using the AURORA_SECRET_ARN environment variable

All connection string contents other than the protocol (dialect) and the database name (path component, my_db_name in the example below) are ignored.

from sqlalchemy import create_engine, text

cluster_arn = "arn:aws:rds:us-east-1:123456789012:cluster:my-aurora-serverless-cluster"
secret_arn = "arn:aws:secretsmanager:us-east-1:123456789012:secret:rds-db-credentials/MY_DB"

engine = create_engine('postgresql+auroradataapi://:@/my_db_name',
                       echo=True,
                       connect_args=dict(aurora_cluster_arn=cluster_arn, secret_arn=secret_arn))

with engine.connect() as conn:
    # On SQLAlchemy 1.4 and later, raw SQL strings must be wrapped in text()
    for result in conn.execute(text("select * from pg_catalog.pg_tables")):
        print(result)
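The precedence between the keyword arguments and the environment variables described above can be sketched as follows. This is a hypothetical helper for illustration only (the dialect performs an equivalent lookup internally); the function name is not part of this package:

```python
import os

# Illustrative sketch of the lookup order: an explicit connect_args value
# wins; otherwise the corresponding environment variable is consulted.
def resolve_connect_args(aurora_cluster_arn=None, secret_arn=None):
    return {
        "aurora_cluster_arn": aurora_cluster_arn or os.environ.get("AURORA_CLUSTER_ARN"),
        "secret_arn": secret_arn or os.environ.get("AURORA_SECRET_ARN"),
    }
```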

Motivation

The RDS Data API is the link between the AWS Lambda serverless environment and the sophisticated features provided by PostgreSQL and MySQL. The Data API tunnels SQL over HTTP, which has advantages in the context of AWS Lambda:

  • It eliminates the need to open database ports to the AWS Lambda public IP address pool

  • It uses stateless HTTP connections instead of stateful internal TCP connection pools used by most database drivers (the stateful pools become invalid after going through AWS Lambda freeze-thaw cycles, causing connection errors and burdening the database server with abandoned invalid connections)

  • It uses AWS role-based authentication, eliminating the need for the Lambda to handle database credentials directly

Debugging

This package uses standard Python logging conventions. To enable debug output, set the package log level to DEBUG:

import logging

logging.basicConfig()
logging.getLogger("aurora_data_api").setLevel(logging.DEBUG)
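A fuller sketch that also surfaces SQLAlchemy's own engine logging alongside the driver's debug output (the logger names are standard: sqlalchemy.engine for SQLAlchemy, aurora_data_api for the underlying driver):

```python
import logging

logging.basicConfig(level=logging.INFO)
# Driver-level debug logging (requests to and responses from the Data API):
logging.getLogger("aurora_data_api").setLevel(logging.DEBUG)
# SQLAlchemy's SQL statement logging, equivalent to create_engine(..., echo=True):
logging.getLogger("sqlalchemy.engine").setLevel(logging.INFO)
```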

License

Licensed under the terms of the Apache License, Version 2.0.
