Export/Import with Data Pump

The regular scripts export.js and import.js are convenient, but they run much slower than native database tools such as Oracle's Data Pump. Many admins therefore prefer Data Pump, which is fine, especially for 1:1 backups.

Backing up a BPS schema for safekeeping can easily be done with Oracle Data Pump by exporting either the full database or only the schema. Restoring into a database where neither the schema nor the BPS roles and users are present yet is straightforward. For a schema-only export, however, some additional steps are required: create the tablespaces and the roles schema_usr and schema_gst before the import, and recreate the database users after the import in the users app of the BPS GUI.

However, there are special cases that need many more steps, for example copying to another schema name, or reorganizing because something in the schema is broken (indexes, constraints, triggers, functions…). See the BPS manual for the details to take care of.

This is where the scripts bpsoraexp.cmd and bpsoraimp.cmd come into play: they handle almost everything required for a clean import or reorganization.

bpsoraexp.cmd exports only the vital table data with expdp into a DMP file, and the statements necessary to recreate the sequences into a SQL file. The best time to export a production schema is when nobody else is working on it. Nevertheless, the export uses FLASHBACK to pull a consistent copy as of the time the export was started.
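The exact call is internal to bpsoraexp.cmd, but a data-only, flashback-consistent schema export with expdp would look roughly like the sketch below. The schema name, version and connect string are assumed example values taken from the naming used later in this page, not script output:

```shell
# Rough sketch only -- bpsoraexp composes the real expdp call internally.
# SCHEMA and VERSION are assumed example values, not script parameters.
SCHEMA=lu_buv
VERSION=2024001
BASE="${SCHEMA}-${VERSION}-$(date +%Y%m%d)"

# Data-only export of one schema, consistent as of the start time,
# written through the BPS_DATA_PUMP directory object on the server.
# Printed with echo rather than executed, since this is a sketch.
echo expdp system@vmora12 \
  schemas="${SCHEMA}" \
  content=DATA_ONLY \
  flashback_time=systimestamp \
  directory=BPS_DATA_PUMP \
  dumpfile="${BASE}.dmp" \
  logfile="${BASE}-exp.log"
```

Note how the file names follow the schema-version-date pattern that the script's real output files use.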

For importing with bpsoraimp.cmd, a clean BPS schema must be present on the target Oracle server. This may be an existing schema, or one that was just created with the «BPS Datastore Setup» assistant. Only the table data is then imported into that clean schema, and the sequences are recreated. This way all other objects, grants etc. are left in a clean state and you end up with a perfect BPS datastore.
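The generated SQL file is plain DDL. Its contents are along these lines, where the sequence name and start value are purely illustrative and not taken from a real BPS schema:

```sql
-- Illustrative only: real names and start values come from the exported schema.
DROP SEQUENCE lu_buv.some_seq;
CREATE SEQUENCE lu_buv.some_seq START WITH 4711 INCREMENT BY 1;
```

Recreating the sequences this way ensures they continue above the highest values used by the exported table data.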

After importing with bpsoraimp.cmd, the only step left is to recreate the BPS database users in the users app of the BPS GUI.

The scripts must run with the credentials of system or of another user with Oracle DBA privileges.

The scripts require an Oracle directory named BPS_DATA_PUMP.

Make directory in file system on the oracle server

In Windows

mkdir C:\oracle\bps_data_pump

In Linux as user oracle

mkdir -p /home/oracle/bps_data_pump
All following tasks may be executed either on a Windows client or on the Windows server.
When the Oracle server runs Linux, the steps must be executed on a Windows client.

The Data Pump utilities are also available for the Instant Client as the Tools package. Unlike the other client components, which are more tolerant of version differences, the Data Pump utilities strictly require matching client and server versions.

Make directory in the database

Connect to the PDB service with sqlplus

sqlplus system/******@vmora18

In case you are on the server, make sure your session ends up in the PDB, not in the CDB root. A plain "sqlplus / as sysdba" connects to the CDB root, so switch to the PDB afterwards (replace your_pdb with the name of your PDB):

sqlplus / as sysdba
ALTER SESSION SET CONTAINER = your_pdb;

Create directory for a Windows server

CREATE OR REPLACE DIRECTORY BPS_DATA_PUMP AS 'C:\oracle\bps_data_pump';

or for a linux server

CREATE OR REPLACE DIRECTORY BPS_DATA_PUMP AS '/home/oracle/bps_data_pump';
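Either way, you can verify the result afterwards with the standard DBA_DIRECTORIES dictionary view:

```sql
-- Confirm the directory object exists and check the path it points to
SELECT directory_name, directory_path
  FROM dba_directories
 WHERE directory_name = 'BPS_DATA_PUMP';
```

Note that CREATE DIRECTORY only registers the path in the database; the file system directory itself must exist on the server as created above.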



Open a BPS Commandline, create and change to a working directory

mkdir C:\oracle\bpsimpexp
cd /d C:\oracle\bpsimpexp 

Check that sqlplus, expdp and bpsoraexp are found in the path

sqlplus -v
expdp help=y

Start script

bpsoraexp system/********@vmora12 lu_buv

Results

  • In the current working directory you will find lu_buv-<version>-<date>.sql with the sequence statements created by the script.
  • In the data pump directory on the server you will find lu_buv-<version>-<date>.dmp and the log file lu_buv-<version>-<date>-exp.log from expdp.

For importing you will need the DMP file with the table data and the SQL file with the sequences.

In our example we have the files lu_buv-2024001-20200723.dmp and lu_buv-2024001-20200723.sql with the export of schema LU_BUV from another database server. We want to import the schema on our target server vmora12, again into a schema named LU_BUV.

  • Copy the file lu_buv-2024001-20200723.dmp into the data pump directory.
  • Create a working directory from where you will run bpsoraimp.
  • Copy the file lu_buv-2024001-20200723.sql into the working directory.
  • Check/prepare the schema LU_BUV on server vmora12:
    • In case it does not exist, create it with the «BPS Datastore Setup» assistant. (For details see BPS manual)
    • In case it already exists but is an old version, update it with the «BPS Datastore Update» assistant.
    • In case it already exists but may be broken, delete it with the «BPS Datastore Drop» assistant, and then re-create it with the «BPS Datastore Setup» assistant.
  • In any case, make sure the target schema has the same version as the source schema, otherwise the import may fail. (Check in Central System Settings/Install/DbLevel)
  • Open a terminal window and do as follows

Change to the working directory

cd /d c:\oracle\bpsimpexp 

Check that sqlplus, impdp and bpsoraimp are found

sqlplus -v
impdp help=y

Start script

bpsoraimp system/********@vmora12 lu_buv-2024001-20200723

Results

  • In the data pump directory on the server you will find the log file lu_buv-2024001-20200723_imp.log of the table data import with impdp.

The basic steps are the same as before, where we imported into LU_BUV.

This time we want to import from lu_buv-2024001-20200723.dmp and lu_buv-2024001-20200723.sql into schema LU_FGB. So instead of checking/preparing schema LU_BUV on the target server, we do it for schema LU_FGB this time.

There is no need for a schema LU_BUV to exist on the target server, but it does not hurt either. This method therefore also allows importing back into the source server without touching the original schema. You might want to do this to copy your production schema to a test or quality schema.

Start script, but this time append the new target schema name as well

bpsoraimp system/********@vmora12 lu_buv-2024001-20200723 lu_fgb
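Under the hood, importing into a different schema name boils down to Data Pump's schema remapping. An equivalent impdp call might look roughly like this sketch; the option values are assumptions based on the example, and bpsoraimp's actual options may differ:

```shell
# Rough sketch only -- bpsoraimp builds the real impdp call itself.
BASE=lu_buv-2024001-20200723

# Data-only import into the prepared LU_FGB schema. REMAP_SCHEMA redirects
# the rows, and TABLE_EXISTS_ACTION=TRUNCATE empties the clean target
# tables before loading. Printed with echo rather than executed.
echo impdp system@vmora12 \
  directory=BPS_DATA_PUMP \
  dumpfile="${BASE}.dmp" \
  logfile="${BASE}_imp.log" \
  content=DATA_ONLY \
  table_exists_action=TRUNCATE \
  remap_schema=LU_BUV:LU_FGB
```

Because only table data is imported, all objects, grants and other metadata of the prepared LU_FGB schema stay untouched.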

Easy peasy, isn't it?

  • howto/orapump.txt
  • Last modified: 2020/07/26 05:31
  • by ibk