
An AWS Aurora Serverless Data API dialect for SQLAlchemy. This is a temporary package that allows credentials to be passed via the SQLAlchemy URI.

Project description

This package provides a SQLAlchemy dialect for accessing PostgreSQL and MySQL databases via the AWS Aurora Data API.

Installation

pip install sqlalchemy-aurora-data-api

Prerequisites

  • Set up an AWS Aurora Serverless cluster and enable Data API access for it. If you have previously set up an Aurora Serverless cluster, you can enable Data API with the following AWS CLI command:

    aws rds modify-db-cluster --db-cluster-identifier DB_CLUSTER_NAME --enable-http-endpoint --apply-immediately
  • Save the database credentials in AWS Secrets Manager using a format expected by the Data API (a JSON object with the keys username and password):

    aws secretsmanager create-secret --name rds-db-credentials/MY_DB
    aws secretsmanager put-secret-value --secret-id rds-db-credentials/MY_DB --secret-string "$(jq -n '.username=env.PGUSER | .password=env.PGPASSWORD')"
  • Configure your AWS command line credentials using standard AWS conventions. You can verify that everything works correctly by running a test query via the AWS CLI (an equivalent boto3 sketch follows this list):

    aws rds-data execute-statement --resource-arn RESOURCE_ARN --secret-arn SECRET_ARN --sql "select * from pg_catalog.pg_tables"
    • Here, RESOURCE_ARN refers to the ARN of your Aurora DB cluster, which can be found in the AWS RDS Console (click on your database, then “Configuration”) or via the CLI by running aws rds describe-db-clusters. SECRET_ARN refers to the ARN of the AWS Secrets Manager secret created above.

    • When running deployed code (on an EC2 instance, ECS/EKS container, or Lambda), you can use the managed IAM policy AmazonRDSDataFullAccess to grant your IAM role permissions to access the RDS Data API (while this policy is convenient for testing, we recommend that you create your own scoped-down, least-privilege policy for production applications).
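
For reference, the secret creation and the test query above can also be performed from Python with boto3. This is a minimal sketch, not part of this package; the secret name, cluster ARN, and environment variables are placeholders matching the CLI examples above:

import json
import os

import boto3

secretsmanager = boto3.client("secretsmanager")
rds_data = boto3.client("rds-data")

# Store the database credentials in the JSON format the Data API expects.
secret = secretsmanager.create_secret(
    Name="rds-db-credentials/MY_DB",
    SecretString=json.dumps({"username": os.environ["PGUSER"],
                             "password": os.environ["PGPASSWORD"]}),
)

# Run a test query against the cluster to confirm Data API access works.
response = rds_data.execute_statement(
    resourceArn="arn:aws:rds:us-east-1:123456789012:cluster:my-aurora-serverless-cluster",  # placeholder cluster ARN
    secretArn=secret["ARN"],
    sql="select * from pg_catalog.pg_tables",
)
print(response.get("records", []))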

Usage

The package registers two SQLAlchemy dialects, mysql+auroradataapi:// and postgresql+auroradataapi://. Two connection parameters are required; each can be supplied either as a sqlalchemy.create_engine() connect_args keyword argument or via an environment variable:

  • aurora_cluster_arn (also referred to as resourceArn in the Data API documentation)

    • If not given as a keyword argument, this can also be specified using the AURORA_CLUSTER_ARN environment variable

  • secret_arn (the database credentials secret)

    • If not given as a keyword argument, this can also be specified using the AURORA_SECRET_ARN environment variable

All connection string contents other than the protocol (dialect) and the database name (path component, my_db_name in the example below) are ignored.

from sqlalchemy import create_engine, text

cluster_arn = "arn:aws:rds:us-east-1:123456789012:cluster:my-aurora-serverless-cluster"
secret_arn = "arn:aws:secretsmanager:us-east-1:123456789012:secret:rds-db-credentials/MY_DB"

engine = create_engine('postgresql+auroradataapi://:@/my_db_name',
                       echo=True,
                       connect_args=dict(aurora_cluster_arn=cluster_arn, secret_arn=secret_arn))

with engine.connect() as conn:
    for result in conn.execute(text("select * from pg_catalog.pg_tables")):
        print(result)
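
Alternatively, since the ARNs can come from the AURORA_CLUSTER_ARN and AURORA_SECRET_ARN environment variables described above, connect_args can be omitted entirely. A minimal sketch, reusing the cluster_arn and secret_arn values defined above (in practice you would set the variables in your deployment environment rather than in code):

import os

from sqlalchemy import create_engine, text

# Normally set outside your code, e.g. in the Lambda or container configuration.
os.environ["AURORA_CLUSTER_ARN"] = cluster_arn
os.environ["AURORA_SECRET_ARN"] = secret_arn

engine = create_engine("postgresql+auroradataapi://:@/my_db_name")

with engine.connect() as conn:
    print(conn.execute(text("select count(*) from pg_catalog.pg_tables")).scalar())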

Motivation

The RDS Data API is the link between the AWS Lambda serverless environment and the sophisticated features provided by PostgreSQL and MySQL. The Data API tunnels SQL over HTTP, which has advantages in the context of AWS Lambda:

  • It eliminates the need to open database ports to the AWS Lambda public IP address pool

  • It uses stateless HTTP connections instead of stateful internal TCP connection pools used by most database drivers (the stateful pools become invalid after going through AWS Lambda freeze-thaw cycles, causing connection errors and burdening the database server with abandoned invalid connections)

  • It uses AWS role-based authentication, eliminating the need for the Lambda to handle database credentials directly

Debugging

This package uses standard Python logging conventions. To enable debug output, set the package log level to DEBUG:

import logging

logging.basicConfig()
logging.getLogger("aurora_data_api").setLevel(logging.DEBUG)

License

Licensed under the terms of the Apache License, Version 2.0.

