
Run event-driven stored procedures with AWS Lambda for Amazon Aurora PostgreSQL and Amazon RDS for PostgreSQL

In today’s cloud-centered world, event-driven architectures have become increasingly popular for building scalable and responsive applications. When it comes to integrating with databases like Amazon Aurora PostgreSQL-Compatible Edition or Amazon Relational Database Service (Amazon RDS) for PostgreSQL, traditional approaches to invoking stored procedures may not always provide the desired flexibility, security, and scalability.

In this post, we demonstrate how to set up an event-driven workflow that uses AWS Lambda to run stored procedures on Amazon RDS for PostgreSQL and Aurora PostgreSQL. The Lambda function bridges this gap by securely connecting to the database with credentials retrieved from AWS Secrets Manager, so stored procedures can be managed entirely in the cloud. We explore the step-by-step process, discuss the advantages of this approach, and address the limitations of invoking stored procedures from Lambda functions.

Solution overview

The following diagram provides an overview of our solution architecture.

This solution automates and secures the management of stored procedures in Aurora PostgreSQL-Compatible. It uses a Lambda function within a virtual private cloud (VPC) to connect to the PostgreSQL database, retrieve credentials from Secrets Manager, and run stored procedures. The process is initiated through the AWS Command Line Interface (AWS CLI), which invokes the Lambda function; the function performs the necessary operations and notifies DBAs through Amazon Simple Notification Service (Amazon SNS). Amazon VPC endpoints provide secure communication between services, maintaining a secure and efficient workflow for managing database tasks in the cloud.

Prerequisites

For this walkthrough, the following prerequisites are necessary:

  • An AWS account with the necessary permissions to create and manage the required AWS services
  • An Aurora PostgreSQL DB instance
  • An Amazon Linux 2023 Amazon Elastic Compute Cloud (Amazon EC2) instance to create a Lambda layer for psycopg3
  • An Amazon Simple Storage Service (Amazon S3) bucket to upload the Lambda layer
  • Basic understanding of Python, Lambda, Secrets Manager, and Aurora PostgreSQL-Compatible

With these prerequisites in place, you’re now ready to implement a secure and automated solution for managing PostgreSQL stored procedures using a Lambda function.

Create a Secrets Manager secret

You can store your PostgreSQL database user credentials securely using Secrets Manager. Complete the following steps:

  1. On the Secrets Manager console, choose Secrets in the navigation pane.
  2. Choose Store a new secret to create a new secret.
  3. Select Other type of secret as the secret type.
  4. In the Secret key/value section, enter the following key-value pairs:
    1. username – The user name for your PostgreSQL database.
    2. password – The password for your PostgreSQL database.
    3. host – The endpoint or hostname of your PostgreSQL database.
    4. dbname – The name of your PostgreSQL database.
    5. port – The port for your PostgreSQL database.
  5. Optionally, you can include more metadata, such as a description or tags, to help organize and manage the secret.
  6. Choose a secret name that is meaningful and simple to remember, such as my-postgresql-credentials.
  7. Select the appropriate AWS Key Management Service (AWS KMS) key for encryption or use the default AWS managed key.
  8. Review the secret details and choose Store to create the secret.
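
If you prefer the AWS CLI, the same secret can be created with a single command. The following is a sketch using placeholder values that you should replace with your own:

    aws secretsmanager create-secret \
        --name my-postgresql-credentials \
        --description "Credentials for the PostgreSQL stored procedure workflow" \
        --secret-string '{"username":"<db-user>","password":"<db-password>","host":"<db-endpoint>","dbname":"<db-name>","port":"5432"}'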

Create an SNS topic and subscription

To configure Amazon SNS to send notifications to designated DBA users upon successful completion of stored procedures, complete the following steps:

  1. On the Amazon SNS console, choose Topics in the navigation pane.
  2. Choose Create topic to create a new SNS topic.
  3. Select Standard as the type and enter a meaningful name for your topic, such as DBA-Notifications.
  4. Choose Create topic to complete the process.

Next, you create a subscription to your topic.

  1. Choose the topic you just created.
  2. Choose Create subscription.
  3. Select the protocol (for example, email) and provide the endpoint (an email address) where notifications will be sent.
  4. Choose Create subscription to complete the setup.
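
Alternatively, both steps can be performed from the AWS CLI. This sketch assumes the topic name used earlier and a placeholder email address:

    aws sns create-topic --name DBA-Notifications
    aws sns subscribe \
        --topic-arn arn:aws:sns:<region>:<account-id>:DBA-Notifications \
        --protocol email \
        --notification-endpoint dba@example.com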

Create the Lambda function

Complete the following steps to create a Lambda function to run the stored procedures and interact with other AWS services:

  1. On the Lambda console, choose Functions in the navigation pane.
  2. Choose Create function to create a new Lambda function.
  3. Select Author from scratch as the function creation method.
  4. Enter a meaningful name for your function, such as PostgreSQL_Stored_Procedure_Blog.
  5. Select the runtime environment that matches your preferred programming language (for example, Python, Node.js, or Java). The Lambda function used for this post runs on the Python 3.12 runtime.
  6. Under Execution role, select Create a new role with basic Lambda permissions to allow the Lambda function to access other AWS resources.
  7. Choose Create function to create the Lambda function.
    Later in this post, we share sample Python code for this Lambda function.
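
For reference, once the function code is packaged, an equivalent function could be created from the AWS CLI. This is a sketch that assumes a deployment package named function.zip and an existing execution role:

    aws lambda create-function \
        --function-name PostgreSQL_Stored_Procedure_Blog \
        --runtime python3.12 \
        --handler lambda_function.lambda_handler \
        --role arn:aws:iam::<account-id>:role/<lambda-execution-role> \
        --zip-file fileb://function.zip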

Create a Lambda layer for psycopg3

In this post, we use a Lambda layer because it allows you to include external libraries like psycopg3 and manage dependencies separately from your function code, maintaining a clean and modular deployment.

Complete the following steps to create the Lambda layer on an Amazon Linux 2023 instance:

  1. Create a directory structure for a Lambda layer and navigate into the newly created lambda-layer folder:
    mkdir -p lambda-layer/python
    cd lambda-layer


  2. Update the system packages and install the Python 3 pip package manager using the dnf package management tool, so you can install and manage Python packages:
    sudo dnf update && sudo dnf install python3-pip


  3. Download the psycopg binary package and its dependencies for Python 3.12 from PyPI (the Python Package Index) and save them in the current directory:
    sudo pip3 download 'psycopg[binary]' --platform manylinux2014_x86_64 --only-binary=:all: --python-version 3.12 -d .


    Wheel files are a built package format for Python that can be directly installed, and the command downloads .whl files for psycopg and its dependencies.

  4. Extract the contents of the psycopg binary wheel (.whl) files to the python directory:
    unzip -o psycopg_binary-*.whl -d python/


  5. Extract and overwrite the PostgreSQL Python adapter wheel files to the python/ directory:
    unzip -o psycopg-*.whl -d python/


  6. Extract and overwrite the typing extensions wheel (.whl) files to the python/ directory:
    unzip -o typing_extensions-*.whl -d python/


  7. Create a compressed ZIP archive named psycopg3-layer.zip containing the python directory and its contents:
    zip -r9 psycopg3-layer.zip python


  8. Move the local file psycopg3-layer.zip to the specified S3 bucket using the AWS CLI:
    aws s3 mv psycopg3-layer.zip s3://<mybucket>/


  9. Create a new Lambda layer on the Lambda console by choosing the ZIP file from Amazon S3. Now you can use the Lambda layer in your function.
  10. On the Lambda console, add the created layer to your Lambda function to use psycopg3.
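
If you prefer the AWS CLI for these last two steps, the following sketch publishes the layer from the S3 bucket and attaches it to the function, assuming the names used earlier; replace <layer-version-arn> with the ARN returned by the first command:

    aws lambda publish-layer-version \
        --layer-name psycopg3-layer \
        --content S3Bucket=<mybucket>,S3Key=psycopg3-layer.zip \
        --compatible-runtimes python3.12
    aws lambda update-function-configuration \
        --function-name PostgreSQL_Stored_Procedure_Blog \
        --layers <layer-version-arn>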

Now you can use the sample Python code for the Lambda function, which connects to a PostgreSQL database, runs a stored procedure named in the DB_QUERY environment variable, and sends a success message to an SNS topic. The function retrieves the database credentials from Secrets Manager and other connection parameters from the environment variables.
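
The following is a minimal sketch of such a handler, assuming the psycopg3 layer is attached and the secret keys and environment variables described in this post; treat it as a starting point rather than the exact sample:

    import json
    import os

    import boto3
    import psycopg

    def get_credentials(secret_name, region_name):
        # Retrieve and parse the database credentials stored in Secrets Manager.
        client = boto3.client("secretsmanager", region_name=region_name)
        response = client.get_secret_value(SecretId=secret_name)
        return json.loads(response["SecretString"])

    def lambda_handler(event, context):
        region_name = os.environ["REGION_NAME"]
        secret = get_credentials(os.environ["SECRET_NAME"], region_name)
        db_query = os.environ["DB_QUERY"]  # for example, CALL my_procedure();

        # Host and database name come from environment variables;
        # user name, password, and port come from the secret.
        conninfo = (
            f"host={os.environ['DB_HOST']} "
            f"dbname={os.environ['DB_NAME']} "
            f"user={secret['username']} "
            f"password={secret['password']} "
            f"port={secret.get('port', '5432')}"
        )

        # autocommit lets CALL manage its own transactions inside the procedure.
        with psycopg.connect(conninfo, autocommit=True) as conn:
            with conn.cursor() as cur:
                cur.execute(db_query)

        # Notify the DBA distribution list on success.
        sns = boto3.client("sns", region_name=region_name)
        sns.publish(
            TopicArn=os.environ["SNS_TOPIC_ARN"],
            Subject="Stored procedure completed",
            Message=f"Successfully ran: {db_query}",
        )

        return {"statusCode": 200, "body": f"Executed: {db_query}"}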

Configure the Lambda function

To customize the Lambda function by setting environment variables and run the desired stored procedure, complete the following steps:

  1. In the Lambda function’s configuration, set the following environment variables:
    1. SECRET_NAME – The name of the secret you created in the Secrets Manager.
    2. DB_HOST – The endpoint or hostname of your PostgreSQL database.
    3. DB_NAME – The name of your PostgreSQL database.
    4. DB_QUERY – The stored procedure query—for example, CALL procedure_list_tables_in_information_schema();.
    5. REGION_NAME – The AWS Region where your Secrets Manager secret and other resources are located.
    6. SNS_TOPIC_ARN – The Amazon Resource Name (ARN) of the SNS topic to which notifications will be sent.
  2. Make sure that the Lambda function’s execution role has the necessary permissions to access Secrets Manager and the PostgreSQL database.
    1. For Secrets Manager, the role should have the secretsmanager:GetSecretValue permission.
    2. For Amazon SNS, the role should have the sns:Publish permission on the notification topic.
    3. For the PostgreSQL database, the database user stored in the secret should have the appropriate permissions to run the desired SQL queries or stored procedures.
  3. Optionally, configure the remaining settings, such as memory allocation, timeout, or VPC configuration, based on your specific requirements.
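
As a sketch, the same environment variables can also be set from the AWS CLI; the JSON form avoids quoting issues with the DB_QUERY value:

    aws lambda update-function-configuration \
        --function-name PostgreSQL_Stored_Procedure_Blog \
        --environment '{"Variables":{"SECRET_NAME":"my-postgresql-credentials","DB_HOST":"<db-endpoint>","DB_NAME":"<db-name>","DB_QUERY":"CALL procedure_list_tables_in_information_schema();","REGION_NAME":"<region>","SNS_TOPIC_ARN":"<topic-arn>"}}'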

Test and deploy the Lambda function

Validate that the Lambda function can retrieve credentials, connect to the database, and run stored procedures before deploying it to the target environment:

  1. Test the Lambda function by invoking it manually or through an event source, such as an Amazon CloudWatch event or an Amazon API Gateway endpoint.
  2. Verify that the Lambda function can successfully retrieve the database credentials from Secrets Manager and establish a connection to the PostgreSQL database.
  3. If the testing is successful, deploy the Lambda function to your desired environment (such as development, staging, or production).
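
For example, you can invoke the function manually from the AWS CLI and inspect the response (a sketch using the function name from earlier):

    aws lambda invoke \
        --function-name PostgreSQL_Stored_Procedure_Blog \
        response.json
    cat response.json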

When the Lambda function runs successfully, an SNS email notification is sent to the subscribed DBA email address.


Security, logging, and monitoring

Verify that you’re handling database credentials securely by using Secrets Manager, and restrict access to your Lambda function as needed. Additionally, make sure your Lambda function has the necessary network access by placing it in the same VPC as your RDS instance and configuring the security groups accordingly.

Make sure to implement proper logging and error handling in your Lambda function to monitor and troubleshoot any issues that might arise during database connectivity.
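
As a minimal sketch, the standard Python logging setup below writes to Amazon CloudWatch Logs automatically in the Lambda runtime; the handler body is elided and would contain the database logic shown earlier:

    import logging

    # The Lambda Python runtime forwards log records to CloudWatch Logs.
    logger = logging.getLogger()
    logger.setLevel(logging.INFO)

    def lambda_handler(event, context):
        try:
            logger.info("Starting stored procedure execution")
            # ... connect to the database and run the stored procedure ...
        except Exception:
            logger.exception("Stored procedure execution failed")
            raise
        return {"statusCode": 200}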

Handling long-running stored procedures

Lambda functions have a default timeout of 3 seconds, but you can configure the timeout up to a maximum of 15 minutes to accommodate long-running stored procedures. If your PostgreSQL stored procedure exceeds this 15-minute limit, you have a couple of options.

To mitigate this, you can break down the stored procedure into smaller, more manageable tasks that can be run sequentially within the Lambda function’s time constraints.

Alternatively, you can implement an asynchronous processing pattern where the Lambda function runs the stored procedure and returns immediately; the actual processing of the long-running stored procedure is offloaded to a separate service or worker, allowing you to use the event-driven architecture of Lambda while adhering to serverless best practices.
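
As one possible sketch of that pattern (an illustration, not part of the original solution), the Lambda function could enqueue the job on Amazon SQS and return immediately, while a worker outside Lambda, for example on Amazon ECS or Amazon EC2, polls the queue and runs the procedure without the 15-minute constraint. The QUEUE_URL environment variable here is hypothetical:

    import json
    import os

    import boto3

    def lambda_handler(event, context):
        # QUEUE_URL is a hypothetical environment variable pointing to the
        # SQS queue that the long-running worker polls.
        sqs = boto3.client("sqs")
        sqs.send_message(
            QueueUrl=os.environ["QUEUE_URL"],
            MessageBody=json.dumps({"query": os.environ["DB_QUERY"]}),
        )
        return {"statusCode": 202, "body": "Stored procedure queued"}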

Clean up

To avoid incurring future charges, delete the resources you created:

  1. Delete your RDS or Aurora DB instance if not in use.
  2. Delete the Lambda function you created.
  3. Delete the SNS topic.
  4. Delete the Secrets Manager secret.
  5. Delete any IAM roles and policies created for the post.
  6. Delete the Amazon Linux 2023 EC2 instance used to build the Lambda layer.
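
A sketch of the equivalent cleanup from the AWS CLI (substitute your own names, ARNs, and IDs):

    aws lambda delete-function --function-name PostgreSQL_Stored_Procedure_Blog
    aws sns delete-topic --topic-arn <topic-arn>
    aws secretsmanager delete-secret --secret-id my-postgresql-credentials
    aws ec2 terminate-instances --instance-ids <instance-id>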

Conclusion

In this post, we explored the implementation of a Lambda function that securely connects to an Aurora PostgreSQL database using Secrets Manager. We provided step-by-step instructions for setting up Secrets Manager and Amazon SNS, creating the Lambda function, and implementing the necessary logic. Additionally, we discussed the limitations of invoking stored procedures from Lambda functions and highlighted the key advantages of this approach, including improved security, scalability, reduced infrastructure overhead, seamless integration, and cost optimization.

We welcome your questions and suggestions. Please leave a comment.


About the Authors

Ramesh Pathuri is a Senior Database Consultant with AWS Worldwide Public Sector Professional Services. His extensive experience in databases, along with a passion for guiding customers through their innovative transition to the AWS Cloud, enables them to unlock new possibilities through database migration and modernization, and optimize their data and analytics solutions for increased innovation, efficiency, and impact.

Gautam Bhaghavatula is a Senior Partner Solutions Architect at AWS, leveraging years of hands-on experience as a Senior Cloud Infrastructure Architect in AWS Professional Services. He specializes in scalable cloud architectures, robust security, and performance optimization. With expertise across systems, networking, microservices, and DevOps, Gautam collaborates with partners to enable seamless migrations, advanced modernization, and strategic cloud adoption.
