Saturday, February 15, 2014

Script to Export Database | Schema | Table using Data Pump (expdp) or Legacy Export (exp)

Update on 05-Feb-2020:
This post is not valid anymore. Sorry for the inconvenience!

A new script has been released, please follow this link to get the final script:
http://dba-tips.blogspot.com/2020/02/new-script-for-exporting-data.html



Creating a script to export the data is not a big deal, but importing the dump file is a time-consuming job for the DBA. The export_data.sh script will not only do the export job, it will also help you with the import steps when you decide to import the dump file: it lists, in a separate script, the DDL statements and grants that you need to run before and after the import process. I consider this function the most useful feature in this script.
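
To give a feel for what those generated scripts contain: the DDL and grants can be pulled from the data dictionary with the standard DBMS_METADATA package. Below is only a minimal sketch of that idea, not the exact code of export_data.sh (the schema name SCOTT is a placeholder):

sqlplus -s "/ as sysdba" <<'EOF'
SET LONG 2000000 PAGESIZE 0 LINESIZE 200 FEEDBACK OFF
-- The user creation statement:
SELECT DBMS_METADATA.GET_DDL('USER', 'SCOTT') FROM DUAL;
-- The roles and system privileges granted to the user:
SELECT DBMS_METADATA.GET_GRANTED_DDL('ROLE_GRANT', 'SCOTT') FROM DUAL;
SELECT DBMS_METADATA.GET_GRANTED_DDL('SYSTEM_GRANT', 'SCOTT') FROM DUAL;
-- The object privileges the user holds on other schemas' objects:
SELECT DBMS_METADATA.GET_GRANTED_DDL('OBJECT_GRANT', 'SCOTT') FROM DUAL;
EOF

(Note that GET_GRANTED_DDL raises an error when the user has no grants of a given type, so a real script has to guard against that.)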

The script gives you the option to export a Full Database, a Schema, or a Table using either the legacy export utility exp or the modern Data Pump utility expdp.
This script was tested on 10g, 11g, and 12c on Linux & SUN environments.
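
For orientation, here are hand-typed equivalents of the three modes; the script builds similar commands for you (all the directory, file, schema, and table names below are placeholders):

# Data Pump examples (the directory object is created by the script, see below):
expdp system FULL=Y DIRECTORY=EXPORT_FILES_DBA_BUNDLE DUMPFILE=full.dmp LOGFILE=full.log
expdp system SCHEMAS=scott DIRECTORY=EXPORT_FILES_DBA_BUNDLE DUMPFILE=scott.dmp
expdp system TABLES=scott.emp DIRECTORY=EXPORT_FILES_DBA_BUNDLE DUMPFILE=emp.dmp
# Legacy export equivalents (no directory object; FILE takes an OS path):
exp system FULL=Y FILE=/backup/full.dmp CONSISTENT=Y
exp system OWNER=scott FILE=/backup/scott.dmp CONSISTENT=Y
exp system TABLES=scott.emp FILE=/backup/emp.dmp CONSISTENT=Y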

How it works:

First, download the script from here:
https://www.dropbox.com/s/hk0pfo2tanop35r/export_data.sh?dl=0

Second, run the script. If you have multiple running databases on the server, it will give you the option to select the database you want to export data from.

> It will ask you for the location where you want to save the dump file:
WHERE TO SAVE THE EXPORT FILE [DUMPFILE]? [ENTER THE FULL PATH]
> It will ask you if you want to export the FULL DATABASE.
Do you want to EXPORT FULL DATABASE? [Y|N] [Y] [N TO EXPORT SCHEMA|TABLE]
    [If your answer is "no"]
      > It will ask you if you want to export a SCHEMA.
Do you want to EXPORT a SCHEMA? [Y|N] [Y] [N If you want to EXPORT TABLE]
          [If your answer is "no"]
             > It will go into the export table mode:
                [Enter the owner:]
Please Enter the TABLE OWNER:
                [Then enter the table name:]
Please Enter the TABLE NAME:
> It will ask you to select the utility you want to use for the export [expdp or exp]:
WHICH EXPORT UTILITY YOU WANT TO USE: [1) DATAPUMP [EXPDP]]
====================================
1) DATAPUMP [EXPDP]
2) LEGACY EXPORT [EXP]
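
If you are curious how such a dialog is driven, it is plain shell prompting. Here is a minimal sketch of the flow above (not the exact code of export_data.sh):

# List the running instances (each shows up as an ora_pmon_<SID> process):
ps -ef | grep ora_pmon | grep -v grep
read -p "Do you want to EXPORT FULL DATABASE? [Y|N] " ANS
if [ "$ANS" != "Y" ] && [ "$ANS" != "y" ]; then
    read -p "Do you want to EXPORT a SCHEMA? [Y|N] " ANS
    if [ "$ANS" != "Y" ] && [ "$ANS" != "y" ]; then
        read -p "Please Enter the TABLE OWNER: " TAB_OWNER
        read -p "Please Enter the TABLE NAME: " TAB_NAME
    fi
fi
echo "WHICH EXPORT UTILITY YOU WANT TO USE:"
select UTILITY in "DATAPUMP [EXPDP]" "LEGACY EXPORT [EXP]"; do
    [ -n "$UTILITY" ] && break
done
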
> Then let the script do the remaining steps:
    - If EXPDP was selected to export the data, the script will automatically calculate the degree of parallelism based on the number of CPU cores on the server (see the parallel-degree sketch after this list). [By the way, parallelism is a buggy feature in EXPDP.]
    - It will create a user called DBA_BUNDLEEXP7, granting it the DBA privilege to use it for the export
       (I'm creating this user because using a SYSDBA user would disable features like CONSISTENT=Y during the export).
    - It will create a directory called EXPORT_FILES_DBA_BUNDLE pointing to the dump location
       you entered earlier [in case you selected the expdp utility to perform the export job]. See the setup sketch after this list.
    - It will create a "BEFORE IMPORT SCRIPT" [In case you selected export Schema mode earlier]
       this script will include the [creation statement of  roles assigned to the user, user creation statement,
       grant privileges/roles, grant privileges the user has on other schemas objects.
     - It will create an "AFTER IMPORT SCRIPT" [In case you selected export Database mode earlier]
        this script will hint you all triggers owned by SYS user  [if exist]. These triggers will not be created
        during the import process.
     - It will create an "AFTER IMPORT SCRIPT" [In case you selected export Schema mode earlier]
        this script will include the [creation statements of public synonyms for user;s table [if exist],
        grant privileges on schema objects to other users [if exist], giving you a hint for the triggers owned
        by other users that pointing to the exported schema tables [if exist], recompile invalid objects.
     - It will create an "AFTER IMPORT SCRIPT" [In case you selected export Table mode earlier]
        this script will create the public synonyms for exported table [if exist].
     - The script will start the export job using the DBA_BUNDLEEXP7 user.
     - Once the export job finishes, it will drop the DBA_BUNDLEEXP7 user.
     - At the end, the script will list the "Import Guidelines", including the
        BEFORE/AFTER import script locations. Finally, it will print the full path of the dump file.
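
About the parallel-degree calculation mentioned above: on Linux the core count can be read from /proc/cpuinfo (on SUN/Solaris you would need psrinfo instead), and a degree is then derived from it. The halving policy below is only an illustrative assumption, not necessarily the exact formula the script uses:

# Sketch: derive the Data Pump PARALLEL degree from the CPU core count (Linux only).
CORES=$(grep -c ^processor /proc/cpuinfo)
PARALLEL_DEGREE=$(( CORES > 1 ? CORES / 2 : 1 ))   # illustrative policy: half the cores
# A PARALLEL degree above 1 needs a %U wildcard so expdp can write multiple dump files:
echo "expdp ... PARALLEL=${PARALLEL_DEGREE} DUMPFILE=expfull_%U.dmp"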
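
The user and directory setup, and the cleanup at the end, boil down to a handful of SQL statements. A minimal sketch of that idea (the password and the dump path are placeholders):

sqlplus -s "/ as sysdba" <<'EOF'
-- Temporary export user (a SYSDBA session would disable features like CONSISTENT=Y):
CREATE USER dba_bundleexp7 IDENTIFIED BY "CHANGE_ME_placeholder";
GRANT dba TO dba_bundleexp7;
-- Directory object for expdp, pointing to the dump location you entered:
CREATE OR REPLACE DIRECTORY export_files_dba_bundle AS '/backup/exports';
EOF
# ... the export job runs here as DBA_BUNDLEEXP7 ...
sqlplus -s "/ as sysdba" <<'EOF'
DROP USER dba_bundleexp7 CASCADE;
EOF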
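
As for the "after import" public synonyms, they can be generated straight from the data dictionary. A sketch for a single schema (SCOTT is a placeholder, and this is not the script's exact query):

sqlplus -s "/ as sysdba" <<'EOF'
SET PAGESIZE 0 LINESIZE 200 FEEDBACK OFF
-- Emit one CREATE PUBLIC SYNONYM statement per synonym on the schema's objects:
SELECT 'CREATE PUBLIC SYNONYM ' || synonym_name || ' FOR ' ||
       table_owner || '.' || table_name || ';'
FROM   dba_synonyms
WHERE  owner = 'PUBLIC' AND table_owner = 'SCOTT';
EOF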

If you are still not comfortable with that introduction, don't worry, the script is self-explanatory :-)
Please note that the COMPRESSION option is used by default during the export process.

At any stage, you can terminate the script by pressing [Ctrl+C].

This script is part of DBA BUNDLE, to read more about it please visit this link:
http://dba-tips.blogspot.ae/2014/02/oracle-database-administration-scripts.html

DISCLAIMER: THIS SCRIPT IS DISTRIBUTED IN THE HOPE THAT IT WILL BE USEFUL, BUT WITHOUT ANY WARRANTY. IT IS PROVIDED "AS IS".

The following is the code [in case the download link is not working]:

6 comments:

  1. It's great, in fact. But there is a challenge when exporting more than one table or user. Can you update your script with these two options?

     Reply: Thanks for the suggestion; a new script has been published with this feature, along with other cool features. Here is the link:
      https://dba-tips.blogspot.com/2020/02/new-script-for-exporting-data.html

  2. I have been using your script for a long time; this time I'm getting this error:

    ORA-31693: Table data object "RS"."PRODUCTS" failed to load/unload and is being skipped due to error:
    ORA-02354: error in exporting/importing data
    ORA-01466: unable to read data - table definition has changed

     Reply: Please use the new script in this link:
      https://dba-tips.blogspot.com/2020/02/new-script-for-exporting-data.html

  3. How can I add a rows parameter to it? I want to export a whole schema with only 50000 rows per table.

  4. Actually, this feature is not available in the script yet.
