Export/Import with Data Pump


The regular scripts export.js and import.js are very convenient; their disadvantage is that they run much slower than native database tools like Oracle Data Pump. Many admins therefore prefer Data Pump, which is fine, especially for 1:1 backups.

Traditional Methods with Data Pump

Backing up a BPS schema for safety can easily be done with Oracle Data Pump, by exporting either the full database or only the schema. Restoring into a database where neither the schema nor the BPS roles and users are present yet is straightforward. With a schema-only export, however, some more actions are required: create the tablespaces and the roles schema_usr and schema_gst before the import, and recreate the database users after the import in the users app of the BPS GUI.
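For a schema named LU_AGRAR, the pre-import preparation could look like the sketch below. Everything in it is an assumption for illustration: the tablespace name, the datafile clause, and the reading of schema_usr / schema_gst as a per-schema naming pattern must be verified against your installation and the BPS manual. The statements are only composed and printed, not executed.

```shell
# Hypothetical pre-import preparation for a schema named LU_AGRAR.
# Tablespace name, datafile clause and role names are placeholder
# assumptions; verify them against your installation and the BPS manual.
# The statements are only composed and printed, not executed.
SQL="CREATE TABLESPACE lu_agrar_data
  DATAFILE 'lu_agrar_data01.dbf' SIZE 100M AUTOEXTEND ON;
CREATE ROLE lu_agrar_usr;
CREATE ROLE lu_agrar_gst;"

echo "$SQL"
```

The printed statements would then be run as a DBA user in sqlplus before starting the import.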

However, there are special cases that need many more actions, for example copying to another schema name, or reorganizing because something in the schema is broken (indexes, constraints, triggers, functions…). See the BPS manual for the details to take care of.

This is where the scripts bpsoraexp.cmd and bpsoraimp.cmd come into play, because they handle almost everything required for a clean import or reorganization.

BPS Scripts for Data Pump

bpsoraexp.cmd exports only the vital table data with expdp into a DMP file, and the statements necessary to recreate the sequences into a SQL file. The best time to export a production schema is when nobody else is working on it. Nevertheless, the export uses FLASHBACK to pull a consistent copy as of the time the export was started.
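The script itself is not reproduced here, but conceptually the export boils down to a Data Pump call along these lines. This is a sketch under assumptions: the exact parameters of bpsoraexp.cmd may differ, and the connect string is a placeholder. The command is only composed and printed for inspection.

```shell
# Sketch of the kind of expdp call bpsoraexp.cmd presumably wraps
# (an assumption, not the actual script contents). The command is
# only printed, not executed.
connect='system/********@vmora12'   # placeholder connect string
schema=lu_agrar
dump=$(echo "$schema" | tr '[:lower:]' '[:upper:]')   # dump file base name

CMD="expdp $connect DIRECTORY=BPS_DATA_PUMP DUMPFILE=$dump.DMP \
LOGFILE=${dump}_exp.log SCHEMAS=$schema CONTENT=DATA_ONLY \
FLASHBACK_TIME=SYSTIMESTAMP"

echo "$CMD"
```

CONTENT=DATA_ONLY corresponds to the fact that only table data are exported, and FLASHBACK_TIME=SYSTIMESTAMP is how expdp is told to take a consistent snapshot as of the start time.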

For the import with bpsoraimp.cmd, a clean BPS schema needs to be present on the target Oracle server. This may be an existing schema, or one that was just created with the «BPS Datastore Setup» assistant. Only the table data are then imported into that clean schema, and the sequences are recreated. This way, all other objects, grants etc. are left in a clean state and you get a perfect BPS datastore.
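Again as a sketch only (the real bpsoraimp.cmd may use different parameters; connect string and file names are placeholders), the import conceptually consists of a data-only impdp run followed by replaying the sequence statements:

```shell
# Sketch of what bpsoraimp.cmd presumably does (an assumption, not the
# actual script contents): load only the table data into the prepared
# schema, then recreate the sequences from the SQL file written at
# export time. Commands are only printed, not executed.
connect='system/********@vmora18'   # placeholder connect string
schema=lu_agrar
dump=$(echo "$schema" | tr '[:lower:]' '[:upper:]')

IMP_CMD="impdp $connect DIRECTORY=BPS_DATA_PUMP DUMPFILE=$dump.DMP \
LOGFILE=${dump}_imp.log SCHEMAS=$schema CONTENT=DATA_ONLY \
TABLE_EXISTS_ACTION=TRUNCATE"
SEQ_CMD="sqlplus $connect @$dump.SQL"

echo "$IMP_CMD"
echo "$SEQ_CMD"
```

TABLE_EXISTS_ACTION=TRUNCATE empties the freshly created tables before loading, so the clean schema's own objects and grants stay untouched.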

After importing with bpsoraimp.cmd, the only step left is to recreate the BPS database users in the users app of the BPS GUI.

The scripts must run with the credentials of SYSTEM or another user with Oracle DBA privileges.


Create the Data Pump Dir

The scripts require an Oracle directory object named BPS_DATA_PUMP.

Make the directory in the file system on the Oracle server

In Windows

mkdir C:\oracle\bps_data_pump

In Linux as user oracle

mkdir -p /home/oracle/bps_data_pump
All following tasks may be executed either on a Windows client or on the Windows server itself.
When the Oracle server runs Linux, the steps must be executed on a Windows client.

Starting with 12.2, the Data Pump utilities are also available for the Instant Client as a Tools Package. For older versions you need the full-blown Oracle client.

Unlike the other client components, which are more flexible, the Data Pump utilities strictly require matching client and server versions.

Make the directory in the database

Connect to the service with sqlplus

sqlplus system/******@vmora18

Create the directory for a Windows server

CREATE OR REPLACE DIRECTORY BPS_DATA_PUMP AS 'C:\oracle\bps_data_pump';

or for a Linux server

CREATE OR REPLACE DIRECTORY BPS_DATA_PUMP AS '/home/oracle/bps_data_pump';
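To verify that the directory object exists, you can query the DBA_DIRECTORIES view. This is an optional check, not part of the original procedure; the query is only composed and printed below, and the connect string in the comment is a placeholder.

```shell
# Optional check: list the BPS_DATA_PUMP directory object from the
# DBA_DIRECTORIES view. The query is only composed and printed here.
SQL="SELECT directory_name, directory_path
FROM dba_directories
WHERE directory_name = 'BPS_DATA_PUMP';"

echo "$SQL"
# To actually run it:  echo "$SQL" | sqlplus -s system/******@vmora18
```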



Execute Export

Open a terminal window, create and change to a working directory

mkdir D:\oracle\bpsimpexp
cd /d D:\oracle\bpsimpexp 

Check that sqlplus, expdp and bpsoraexp are found in the path

sqlplus -v
expdp help=y

Start script

bpsoraexp system/********@vmora12 lu_agrar

Sample output

In the working directory you will find LU_AGRAR.SQL created by the script

And in the data pump directory on the server you will find LU_AGRAR.DMP and the log file from expdp

For importing you will need LU_AGRAR.DMP with the table data and LU_AGRAR.SQL with the sequences.

Execute Import

In our example we have the files LU_AGRAR.DMP and LU_AGRAR.SQL with the export of schema LU_AGRAR from another database server. We want to import the schema on our target server vmora18, again into a schema named LU_AGRAR.

  • Copy the file LU_AGRAR.DMP into the data pump directory.
  • Create a working directory from where you will run bpsoraimp.cmd.
  • Copy the file LU_AGRAR.SQL into the working directory.
  • Check/prepare the schema of LU_AGRAR on server vmora18:
    • In case it does not exist, create it with the «BPS Datastore Setup» assistant. (For details see BPS manual)
    • In case it already exists but is an old version, update it with the «BPS Datastore Update» assistant.
    • In case it already exists but may be broken, delete it with the «BPS Datastore Drop» assistant, and then re-create it with the «BPS Datastore Setup» assistant.
  • In any case, make sure the target schema has the same version as the source schema, otherwise the import may fail. (Check in Central System Settings/Install/DbLevel)
  • Open a terminal window and do as follows

Change to the working directory

cd /d d:\oracle\bpsimpexp 

Check that sqlplus, impdp and bpsoraimp are found

sqlplus -v
impdp help=y

Start script

bpsoraimp system/********@vmora18 lu_agrar

Sample script output

In the data pump directory on the server you will find the log file of the table data import with impdp

Importing to a schema with other name

The basic steps are the same as before, when we imported into LU_AGRAR.

This time we want to import from LU_AGRAR.DMP and LU_AGRAR.SQL into schema LU_FGB. So instead of checking/preparing schema LU_AGRAR on the target server, we do it for schema LU_FGB this time.

There is no need for a schema LU_AGRAR to exist on the target server, but it does not hurt either. So this method also allows importing back into the source server, without touching the original schema. You might want to do this to copy your production schema to a test or quality schema.

Start script, but this time append the new target schema name as well

bpsoraimp system/********@vmora18 lu_agrar lu_fgb
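Under the hood this presumably translates into Data Pump's REMAP_SCHEMA parameter. The following is a sketch under that assumption; the actual parameters of bpsoraimp.cmd may differ, and the connect string is a placeholder. The command is only printed for inspection.

```shell
# Sketch of the impdp call with a different target schema (an assumption
# about bpsoraimp.cmd's internals). REMAP_SCHEMA is the standard Data
# Pump way to load one schema's data into another. Only printed here.
connect='system/********@vmora18'   # placeholder connect string
src=lu_agrar
dst=lu_fgb

CMD="impdp $connect DIRECTORY=BPS_DATA_PUMP DUMPFILE=LU_AGRAR.DMP \
SCHEMAS=$src REMAP_SCHEMA=$src:$dst CONTENT=DATA_ONLY \
TABLE_EXISTS_ACTION=TRUNCATE"

echo "$CMD"
```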

Easy peasy, isn't it?