Schema Migration with REMAP_SCHEMA

The document outlines a schema-level data migration process using Oracle Data Pump (expdp/impdp) for a logistics project, detailing the steps taken to migrate data from a staging schema to a production schema. Key features of Data Pump include high performance, schema remapping, and automation capabilities, with a focus on ensuring data consistency through FLASHBACK_TIME. Post-migration validation and a rollback strategy were implemented to ensure data integrity and accuracy.


Schema-Level Data Migration Using Data Pump (expdp/impdp)

Real-Time Scenario

In my logistics project, I had to migrate schema data (CONTAINER, TERMINAL, DOCUMENT) from a staging schema to a production schema. We used Oracle Data Pump (expdp/impdp) because of the high data volume.

Why Data Pump?

- High performance

- Supports metadata + data

- Schema remapping

- FLASHBACK_TIME for consistency

- Automation-friendly

Interview Summary

I migrated the schema using expdp/impdp with REMAP_SCHEMA, ensured data consistency using FLASHBACK_TIME, and validated the migration with queries for row counts, index status, and object status.

Step-by-Step Approach

1. Pre-Requisites:

- Create directory object

- Check tablespaces/privileges

- Capture row counts
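
The pre-requisites above can be sketched in SQL; the directory path and grantees here are placeholders, not values from the original project:

```sql
-- Placeholder path; point it at a filesystem location the database server can write to.
CREATE OR REPLACE DIRECTORY dp_dir AS '/u01/app/exports';
GRANT READ, WRITE ON DIRECTORY dp_dir TO stg_user;
GRANT READ, WRITE ON DIRECTORY dp_dir TO prod_user;

-- Capture baseline row counts for the tables being migrated.
SELECT COUNT(*) FROM stg_user.container;
SELECT COUNT(*) FROM stg_user.terminal;
SELECT COUNT(*) FROM stg_user.document;
```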

2. Export (expdp):

expdp stg_user/password SCHEMAS=stg_user ... FLASHBACK_TIME=SYSTIMESTAMP
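
A fuller version of the export command might look like the following; the directory object, dump-file naming, and parallelism are assumptions, since the original command elides those parameters:

```shell
expdp stg_user/password \
  SCHEMAS=stg_user \
  DIRECTORY=dp_dir \
  DUMPFILE=stg_user_%U.dmp \
  LOGFILE=stg_user_exp.log \
  PARALLEL=4 \
  FLASHBACK_TIME=SYSTIMESTAMP
```

FLASHBACK_TIME=SYSTIMESTAMP makes the export a consistent snapshot as of the moment the job starts, so tables exported at different points in the run still agree with each other.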

3. Import (impdp):

impdp prod_user/password REMAP_SCHEMA=stg_user:prod_user ... TABLE_EXISTS_ACTION=REPLACE
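
Sketched in full, with placeholder file names, the import might look like:

```shell
impdp prod_user/password \
  DIRECTORY=dp_dir \
  DUMPFILE=stg_user_%U.dmp \
  LOGFILE=prod_user_imp.log \
  REMAP_SCHEMA=stg_user:prod_user \
  TABLE_EXISTS_ACTION=REPLACE
```

REMAP_SCHEMA rewrites object ownership during the load, and TABLE_EXISTS_ACTION=REPLACE drops and recreates any table that already exists in the target schema.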

4. Post-Migration Validation:

- Validate object counts


- Row count check

- Recompile invalid objects

- Check indexes & constraints
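
The row count check can be written as a single query to compare against the counts captured before the export; only the table names from earlier in the document are assumed:

```sql
-- Row counts in the target schema, to compare with the pre-export baseline.
SELECT 'CONTAINER' AS table_name, COUNT(*) AS row_count FROM prod_user.container
UNION ALL
SELECT 'TERMINAL', COUNT(*) FROM prod_user.terminal
UNION ALL
SELECT 'DOCUMENT', COUNT(*) FROM prod_user.document;
```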

5. Rollback Strategy:

Take an expdp backup of the target schema before the import; if a rollback is needed, restore it with impdp.
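
The rollback pieces, sketched with placeholder file names:

```shell
# Safety export of production, taken before the import.
expdp prod_user/password SCHEMAS=prod_user DIRECTORY=dp_dir \
  DUMPFILE=prod_backup_%U.dmp LOGFILE=prod_backup_exp.log

# Only if rollback is needed: restore production from the safety export.
impdp prod_user/password DIRECTORY=dp_dir DUMPFILE=prod_backup_%U.dmp \
  LOGFILE=prod_backup_imp.log TABLE_EXISTS_ACTION=REPLACE
```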

6. Automation:

A shell script runs the export and import end to end.
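
A minimal sketch of such a wrapper script; the credentials, directory object, and file names are placeholders, and the DRY_RUN guard is an addition for safe testing, not part of the original process:

```shell
#!/bin/sh
# Placeholder connection strings and file names -- adjust before use.
DP_DIR=dp_dir
DUMPFILE='stg_user_%U.dmp'
DRY_RUN=${DRY_RUN:-1}   # set DRY_RUN=0 to actually invoke expdp/impdp

# Echo the command in dry-run mode; otherwise run it and stop on failure.
run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "WOULD RUN: $*"
  else
    "$@" || { echo "FAILED: $*" >&2; exit 1; }
  fi
}

run expdp stg_user/password SCHEMAS=stg_user DIRECTORY="$DP_DIR" \
    DUMPFILE="$DUMPFILE" LOGFILE=stg_user_exp.log FLASHBACK_TIME=SYSTIMESTAMP
run impdp prod_user/password DIRECTORY="$DP_DIR" DUMPFILE="$DUMPFILE" \
    LOGFILE=prod_user_imp.log REMAP_SCHEMA=stg_user:prod_user \
    TABLE_EXISTS_ACTION=REPLACE
```

Running it once with DRY_RUN=1 prints the exact commands for review before executing them for real.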

7. Logistics-Specific Checks:

- Validate GATE_IN/GATE_OUT data in the CONTAINER table

- Check terminal appointment statuses
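
The logistics checks can be expressed as simple counts; the column names (GATE_IN, GATE_OUT, STATUS) are assumptions about the table layout, not taken from the original note:

```sql
-- Containers that have gated in but not yet gated out (assumed columns).
SELECT COUNT(*) FROM prod_user.container
WHERE gate_in IS NOT NULL AND gate_out IS NULL;

-- Distribution of terminal appointment statuses (assumed STATUS column).
SELECT status, COUNT(*) FROM prod_user.terminal GROUP BY status;
```

Comparing these counts against the same queries run on the staging schema confirms the business data survived the remap intact.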

Validation Queries

SELECT COUNT(*) FROM dba_objects WHERE owner='PROD_USER';

SELECT * FROM user_indexes WHERE status='UNUSABLE';

SELECT constraint_name, status FROM dba_constraints WHERE owner='PROD_USER';
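
For the "recompile invalid objects" step, Oracle's standard UTL_RECOMP package can be used, then the invalid-object count re-checked:

```sql
-- Recompile everything in the target schema serially (SQL*Plus syntax).
EXEC UTL_RECOMP.RECOMP_SERIAL('PROD_USER');

-- Should return 0 once recompilation succeeds.
SELECT COUNT(*) FROM dba_objects
WHERE owner = 'PROD_USER' AND status = 'INVALID';
```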

Interview Wrap-Up Answer

In my logistics project, I migrated a schema from staging to production using expdp/impdp with REMAP_SCHEMA. I ensured data consistency using FLASHBACK_TIME and performed post-migration validations for object counts, index status, and data accuracy. The entire process was automated using shell scripts, with a rollback backup in place.
