Data Pump in Oracle

Data Pump is a new feature in Oracle 10g for doing export/import, also called unloading/loading data. Compared with the normal export/import utilities it has a couple of drawbacks:

1) Data Pump export/import jobs need more startup time than the old export and import utilities, so for small jobs it may be better to use the normal export/import utilities.
2) Data Pump exports data only to disk, not to tape.

Now you will have the question: why use Data Pump export/import at all?

1) Oracle claims that Data Pump enables you to decrease total export time, and that import is substantially faster than the normal import method.

2) You can estimate the job beforehand, i.e. how much work the export or import will involve.

3) You can directly load one database from a remote instance.

4) You can select exactly which objects get exported or imported (fine-grained object selection).

5) All in all, it is a superset of the old export/import utility.
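Points 2), 3) and 4) correspond to specific Data Pump parameters. A rough sketch (the user names, database link and table names below are placeholders, not from this post):

```
$ expdp scott/tiger ESTIMATE_ONLY=y FULL=y                 # estimate the size of the job without writing a dump
$ impdp scott/tiger NETWORK_LINK=remote_db SCHEMAS=scott   # load directly from a remote instance over a DB link
$ expdp scott/tiger INCLUDE=TABLE:\"IN ('EMP','DEPT')\" DUMPFILE=emp.dmp   # fine-grained object selection
```

Note the backslash escaping of the INCLUDE clause, which is needed when the quoted filter is typed on the command line rather than in a parameter file.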

The command to use Data Pump export is:
$ expdp username/passwd (various parameters)
$ expdp help=y (lists all available parameters)
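As a minimal end-to-end sketch (the directory path, schema and file names here are placeholders): Data Pump reads and writes dump files only through a database DIRECTORY object, so one must exist and the exporting user must have access to it:

```
SQL> CREATE DIRECTORY dp_dir AS '/u01/app/oracle/dpdump';
SQL> GRANT READ, WRITE ON DIRECTORY dp_dir TO scott;

$ expdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott.dmp LOGFILE=scott_exp.log SCHEMAS=scott
$ impdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott.dmp LOGFILE=scott_imp.log SCHEMAS=scott
```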

The Data Pump export utility unloads the data into operating system files known as dump files. It writes these files in a proprietary format that only the Data Pump import utility can understand when loading the dump files into the same or another database.

==> You can take Data Pump export dump files from one operating system and import them into a database running on a different type of platform (e.g. from Linux to Solaris).

So when you export using Data Pump, you cannot import it using the normal old import utility.

Mechanics of the DataPump:

To perform a complete Data Pump job it uses several processes which are new background processes in 10g: the master process, worker processes, and shadow processes.

Master Process (MCP): that is the Master Control Process, recognized as DMnn; at the OS level it appears as <instance>_DMnn_<pid>.

1) Creates jobs and controls them
2) Creates and manages the worker processes
3) Monitors the jobs and logs the progress
4) Maintains the job state and restart information in the master table
5) Manages the necessary files, including the dump file set

So here comes the question: what is the master table?

The MCP uses a special table called the master table (MT) to log the location of the various database objects in the export dump file. The MCP maintains information such as the job state and restart information in the master table.

Job state?? Restart??
– Yes, you can temporarily stop a running export or import job and then, after making some alterations, restart it. We will see in detail how to do the same.
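A sketch of how stop/restart works in practice (the job name below is an assumed auto-generated name): press Ctrl-C in a running expdp session to enter interactive mode, or attach to the job from another session with the ATTACH parameter:

```
$ expdp scott/tiger ATTACH=SCOTT_EXPORT_SCHEMA_01    # reattach to the running/stopped job
Export> STOP_JOB=IMMEDIATE                           # stop the job now; the master table is kept
$ expdp scott/tiger ATTACH=SCOTT_EXPORT_SCHEMA_01
Export> START_JOB                                    # resume from where it left off
Export> STATUS                                       # monitor progress
```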

Where does this master table reside? Oracle creates the master table in the schema of the user who is running the Data Pump job, at the beginning of every export job.

So the master table will have information such as the parameters of the export job, the status of all worker processes, and the location of the objects inside the dump file set.
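You can see the running jobs (and hence their master tables) from the data dictionary; a small sketch:

```
SQL> SELECT owner_name, job_name, operation, job_mode, state
  2  FROM dba_datapump_jobs;
```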

Here the dump file is the same idea as the old export/import dump file, and we can also have a log file which records the complete operation. One more file used in Data Pump is the SQLFILE.
SQLFILE???? — Yup, Data Pump import has a special parameter called SQLFILE, which writes all the DDL statements it would execute during the import job to a file instead of running them.
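A sketch of SQLFILE in action (the directory and file names are placeholders): with this parameter impdp does not actually load anything; it only extracts the DDL from the dump into the named script, which you can then review:

```
$ impdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott.dmp SQLFILE=scott_ddl.sql
```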

Here the job name and the master table name will both be the same.
So what could be the job name? You can use the JOBNAME parameter to provide your own job name (JOBNAME=priyaexp). This parameter is optional; if you don't specify it, Data Pump generates a unique system name like <user>_<operation>_<mode>_%N.

Suppose the username is SYSTEM and you are doing a full database export; then the name will be SYSTEM_EXPORT_FULL_01.
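Putting the naming together, a sketch (the directory and file names are placeholders):

```
$ expdp system/passwd FULL=y DIRECTORY=dp_dir DUMPFILE=full.dmp JOBNAME=priyaexp
```

Here both the job and its master table would be named PRIYAEXP, created in the SYSTEM schema; without JOBNAME the job would be called SYSTEM_EXPORT_FULL_01 as above.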

More on Data Pump coming soon. If you like this, don't forget to leave your comments.


About the Author Atul Kumar

Leave a Comment:

Saurabh Sood says June 9, 2008

As you mentioned that datapump takes more time to start, could you please explain why this is so?

SubbaRao says June 11, 2008

Good work priyanka.

keep it up my friend..


lakshmikanthan says June 16, 2008

hi every one one,
If i want release any newly and important dump means how can i posted in

lakshmikanthan says June 16, 2008

Hi priyanga,

nice work, it is useful for me.
keep it up.

abhijeet says June 17, 2009


I am working as an Oracle DBA on 9i and 10g.
I am planning to move to the Apps DBA role, and I am weak in PL/SQL development. How can I develop myself in this?
Can you please suggest anything for the same.

Please guide me regarding the same.
Abhijeet K

Atul Kumar says June 17, 2009

PLSQL Development is mainly for apps developers (forms & reports)

For apps dba , please go for patching , cloning

other things to learn are architecture of apps and how to install apps

raju says March 9, 2010

In the case of imp/exp, dump files are created on the client machine, whereas in the case of Data Pump, dump files are created on the server.

pervaish says April 7, 2010

Very nicely written (simple and concise)

Thanks for sharing your knowledge.

ishakfc71 says November 25, 2011


I have a RAC system which is using ASM.
I was wondering if there is any way to
take an export of system files to the disk space which is controlled by ASM.

