Many customers run Oracle multi-tenant databases in their on-premises environments and may want to use Amazon Relational Database Service (Amazon RDS) for Oracle features while continuing to use multi-tenant databases. As of August 2022, Amazon RDS for Oracle supports a subset of the multi-tenant architecture called the single-tenant architecture. The multi-tenant architecture enables an Oracle database to function as a multi-tenant container database (CDB), which can include customer-created pluggable databases (PDBs). In RDS for Oracle, the CDB contains only one PDB. The single-tenant architecture uses the same Amazon RDS APIs as the non-CDB architecture, and working with a PDB in RDS for Oracle is mostly identical to working with a non-CDB.
In this post, we discuss a solution for migrating your on-premises Oracle multi-tenant pluggable databases to Amazon RDS for Oracle.
A pluggable database is a portable collection of schemas, schema objects, and nonschema objects that appears to an Oracle Net client as a non-CDB. PDBs can be plugged into CDBs, and a CDB can contain multiple PDBs. Each PDB appears on the network as a separate database.
When you create a CDB, you specify the DB instance identifier just as you would for a non-CDB single-tenant RDS for Oracle instance. The instance identifier forms the first part of your endpoint. The system identifier (SID) is the name of the CDB; in RDS for Oracle, the SID of every CDB is RDSCDB, and you can't choose a different value.
This section covers the lift-and-shift procedure to migrate a pluggable database to Amazon RDS for Oracle. The procedure involves downtime that depends on the size of the database, the network bandwidth between on premises and AWS, and the time consumed by the database export and import. To minimize downtime, refer to Migrating Oracle databases with near-zero downtime using AWS DMS.
For this post, we use Amazon Elastic Compute Cloud (Amazon EC2) as our source Oracle database server. The following diagram illustrates our architecture.
Before you get started, complete the following prerequisite steps:
Create an RDS for Oracle multitenant container database using the following AWS CloudFormation template.
Note: The CloudFormation template creates a Bring Your Own License (BYOL) RDS for Oracle instance.
Make sure the source database is an Oracle pluggable database located on premises or on an EC2 instance that can connect to AWS resources hosted in the respective account.
Confirm that you can connect to both the on-premises Oracle database and RDS for Oracle instance.
Add an Oracle RDS TNS entry on your source database server (Amazon EC2 in our case) to connect to the target.
Install and configure the AWS Command Line Interface (AWS CLI) on a machine that has access to Amazon Simple Storage Service (Amazon S3) and the RDS instance. For instructions, refer to Installing or updating the latest version of the AWS CLI.
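The TNS entry mentioned in the prerequisites can look like the following sketch in tnsnames.ora on the source server. The alias name, the RDS endpoint, and the PDB name (used as the service name) are placeholders; substitute your own values:

```
ORCLRDS =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = myrdsinstance.abc123xyz.us-east-1.rds.amazonaws.com)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = ORCLPDB)
    )
  )
```

Note that you connect to the PDB's service name, not to the RDSCDB SID.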
After you create the RDS for Oracle database (the target database) as a single-tenant architecture, you can connect to the PDB database using the endpoint identifier and PDB database name. The following example shows the format for the connection string in SQL*Plus:
Export data from the source Oracle database
Connect to your source database and fetch the schemas and their respective tables associated with the PDB:
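A query such as the following sketch can list the application schemas and their table counts; the container name and the list of excluded Oracle-maintained schemas are illustrative:

```sql
-- If connected to the CDB root, switch into the PDB first
ALTER SESSION SET CONTAINER = ORCLPDB;

SELECT owner, COUNT(*) AS table_count
FROM   dba_tables
WHERE  owner NOT IN ('SYS','SYSTEM','XDB','MDSYS','CTXSYS','ORDSYS','OUTLN','DBSNMP')
GROUP  BY owner
ORDER  BY owner;
```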
Connect to the source PDB database using TNS and export the schemas that are in scope of the migration using Oracle Data Pump:
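The export can be sketched as follows. The TNS alias ORCLPDB, the schema names, and the parallelism setting are assumptions; adjust them to your environment:

```
expdp system@ORCLPDB \
  schemas=APPUSER1,APPUSER2 \
  directory=DATA_PUMP_DIR \
  dumpfile=pdb_export_%U.dmp \
  logfile=pdb_export.log \
  parallel=4
```

The %U substitution variable lets Data Pump write multiple dump files when running in parallel.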
Transfer exported dump files to Amazon S3 and integrate an Amazon S3 role with the RDS instance
The RDS for Oracle instance must have access to an S3 bucket to work with the Amazon RDS for Oracle integration with Amazon S3. After you create an Amazon RDS for Oracle instance and S3 bucket, create an IAM policy and an IAM role as outlined in this section, then attach the policy to the role.
To use the Amazon RDS for Oracle integration with Amazon S3, your Amazon RDS for Oracle instance must be associated with an option group that includes the S3_INTEGRATION option. An option group is already attached to our RDS instance; to add the S3_INTEGRATION option, complete the following steps:
On the Amazon RDS console, choose Option groups.
Choose the group attached to the RDS instance.
Choose Add option.
For Option, choose S3_INTEGRATION.
For Version, choose 1.0.
For Apply Immediately, select Yes.
Choose Add option.
After you add the S3_INTEGRATION option to the option group, create an IAM role to integrate with the Oracle RDS instance.
In the navigation pane of the IAM console, choose Roles, then choose Create role.
Under Select trusted entity, choose AWS service and choose RDS.
Under Add permissions, choose AmazonS3FullAccess.
Under Role Details, enter RDS_S3_Integration_Role as the role name and choose Create role.
After the IAM role is created and the S3_INTEGRATION option is added, associate them with your RDS DB instance.
On the Amazon RDS console, choose your DB instance.
On the Connectivity & Security tab, choose Manage IAM roles.
For Add IAM role to this instance, choose RDS_S3_Integration_Role (the role that you created).
For Features, choose S3_INTEGRATION.
Choose Add role.
After the IAM role and the Amazon S3 integration feature are associated with your RDS for Oracle database, you can upload the data dump files from your on-premises Oracle database server to Amazon S3 and download them from Amazon S3 to the RDS instance. This step requires the AWS Command Line Interface (AWS CLI) to be available on the host; follow the instructions in Installing or updating the latest version of the AWS CLI.
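The upload from the source server can be sketched as follows. The local dump file path, the bucket name, and the prefix are placeholders:

```
# Upload all Data Pump dump files to a staging bucket
aws s3 cp /u01/app/oracle/admin/ORCL/dpdump/ s3://my-migration-bucket/dumpfiles/ \
  --recursive --exclude "*" --include "pdb_export_*.dmp"
```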
Download the dump files from Amazon S3 to the RDS instance
After you upload the data dump files to the S3 bucket, connect to your target database instance and download the data pump files from Amazon S3 to the DATA_PUMP_DIR of your target instance. See the following code:
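The download uses the rdsadmin.rdsadmin_s3_tasks package that comes with the S3 integration option. The bucket name and prefix below are placeholders:

```sql
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
         p_bucket_name    => 'my-migration-bucket',
         p_s3_prefix      => 'dumpfiles/',
         p_directory_name => 'DATA_PUMP_DIR')
       AS task_id
FROM   dual;
```

The function returns a task ID that you can use to check the progress of the transfer.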
Note: Make sure your RDS instance has enough storage space to accommodate the dump files. You can monitor storage space with Amazon CloudWatch metrics and prevent Amazon RDS from running out of space by creating a CloudWatch alarm on the RDS FreeStorageSpace metric.
This gives you the task ID 1654638896204-1266. Verify the status of the download task on the RDS for Oracle instance with the following SQL query:
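You can read the task's log file from the BDUMP directory; the file name embeds the task ID returned earlier:

```sql
SELECT text
FROM   table(rdsadmin.rds_file_util.read_text_file(
         p_directory => 'BDUMP',
         p_filename  => 'dbtask-1654638896204-1266.log'));
```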
After the SQL query output shows that the files downloaded successfully, you can list the Data Pump files in the RDS for Oracle database with the following query:
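Listing the directory contents can be done with the rdsadmin.rds_file_util.listdir table function:

```sql
SELECT filename, type, filesize, mtime
FROM   table(rdsadmin.rds_file_util.listdir(p_directory => 'DATA_PUMP_DIR'))
ORDER  BY mtime;
```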
Import the data
After the data dump file is available, you can create tablespaces, grants, and schemas on the target RDS for Oracle database before you initiate the import.
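Pre-creating the tablespace and schema might look like the following sketch. The tablespace name, user name, and password are placeholders and should match the objects expected by your dump file; RDS for Oracle uses Oracle Managed Files, so no datafile clause is needed:

```sql
CREATE TABLESPACE app_data;

CREATE USER appuser1 IDENTIFIED BY "ChangeMe123#"
  DEFAULT TABLESPACE app_data;

GRANT CREATE SESSION, RESOURCE TO appuser1;
ALTER USER appuser1 QUOTA UNLIMITED ON app_data;
```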
Grant privileges to the directories before the import with the following code:
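For example, assuming the placeholder schema appuser1 from the previous step:

```sql
GRANT READ, WRITE ON DIRECTORY DATA_PUMP_DIR TO appuser1;
```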
Connect to the source Oracle server or another server that can make a connection to the target RDS instance and run the following command:
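The import can be sketched as follows, assuming a TNS alias (here ORCLRDS) pointing at the target RDS instance and the same placeholder schema names used during export:

```
impdp admin@ORCLRDS \
  schemas=APPUSER1,APPUSER2 \
  directory=DATA_PUMP_DIR \
  dumpfile=pdb_export_%U.dmp \
  logfile=pdb_import.log
```

Review pdb_import.log in DATA_PUMP_DIR for any objects that failed to import.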
Verify tables in the PDB on Amazon RDS
Connect to the RDS instance and run the following command to verify the tables:
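A quick verification query, using the placeholder schema names from earlier steps, can compare table counts against the source:

```sql
SELECT owner, table_name
FROM   dba_tables
WHERE  owner IN ('APPUSER1','APPUSER2')
ORDER  BY owner, table_name;
```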
After data migration and validation are complete, perform the cutover using the following steps:
Take a snapshot of the RDS for Oracle DB instance, because it is about to serve traffic for the first time.
Put your source database in read-only mode.
Re-validate the data manually.
Enable any triggers that are not enabled; you can use the following query to get the list of disabled triggers:
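The check and the fix can be sketched as follows; the trigger name in the ALTER statement is a placeholder:

```sql
SELECT owner, trigger_name
FROM   dba_triggers
WHERE  status = 'DISABLED';

-- Then enable each one, for example:
ALTER TRIGGER appuser1.trg_orders_audit ENABLE;
```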
Get the state of the sequences and set the sequences to the appropriate values:
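This step can be sketched as follows. The schema and sequence names are placeholders, and the RESTART clause assumes Oracle Database 18c or later; on earlier versions, adjust the sequence by temporarily changing its INCREMENT BY value instead:

```sql
-- Compare each sequence's high-water mark with the source database
SELECT sequence_owner, sequence_name, last_number
FROM   dba_sequences
WHERE  sequence_owner IN ('APPUSER1','APPUSER2');

-- Set a sequence to the appropriate value (18c+)
ALTER SEQUENCE appuser1.orders_seq RESTART START WITH 1000001;
```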
Verify that there is no traffic to the source Oracle database and redirect all application traffic to the RDS instance.
To remove all the components created by this solution, complete the following steps:
Sign in to the AWS Management Console.
Choose the Region where your RDS for Oracle instance resides.
On the AWS CloudFormation console, choose your stack, choose Delete, and confirm by choosing Delete stack.
In this post, we demonstrated how to migrate your on-premises pluggable or non-pluggable database to an Amazon RDS for Oracle database (a single-tenant architecture). You can perform these steps to migrate your databases to Amazon RDS for Oracle and take advantage of Amazon RDS managed services and features.
We encourage you to try this solution and take advantage of all the benefits of using AWS DMS with Oracle databases. For more information, see Getting started with AWS Database Migration Service and Best Practices for AWS Database Migration Service. For more information on Oracle Database Migration, refer to the guide Migrating Oracle Databases to the AWS Cloud.
About the authors
Jeevith Anumalla is a Senior Data Architect with the Professional Services team at Amazon Web Services. He works as a data migration specialist and helps customers build data lake and analytics platforms.
Sagar Patel is a Senior Database Specialty Architect with the Professional Services team at Amazon Web Services. He works as a database migration specialist to provide technical guidance and help AWS customers migrate their on-premises databases to AWS.