Datapump-2

This post continues my previous post, Oracle Datapump Part I.
 

Directory:

In Data Pump syntax we come across DIRECTORY. What does this directory mean? Does it refer to an OS-level directory?

Well, you can say that here directory means a database object.
We need to create a database-level directory object as well as the OS-level directory it points to.

How and why? ==>> Data Pump is server based, not client based. Server based means that Data Pump creates all its dump files on the database server, not on the client machine.

Oracle background (BG) processes on the server are responsible for all the I/O operations on the dump files.

With the old export/import utilities (exp/imp) we could put the dump file wherever we found space, but in 10g Data Pump writes to a specific location, pointed to by this directory object.

Now you can't let just any user specify an absolute file path on the server; if everyone had the power to write a dump file anywhere, server files could be overwritten. The directory object closes that hole by mapping Data Pump to one specific OS directory.

Example

SQL> desc dba_directories;
 Name                                      Null?    Type
 ----------------------------------------- -------- ----------------------------
 OWNER                                     NOT NULL VARCHAR2(30)
 DIRECTORY_NAME                            NOT NULL VARCHAR2(30)
 DIRECTORY_PATH                                     VARCHAR2(4000)

SQL> select * from dba_directories where DIRECTORY_NAME like '%DATA_PUMP%';

OWNER      DIRECTORY_NAME   DIRECTORY_PATH
---------- ---------------- --------------------------------------------
SYS        DATA_PUMP_DIR    /u01/oracle/product/TEST/10.2.0/rdbms/log/

===>> Want to create a directory right now and then run a Data Pump export?

SQL> create directory test_dir1 as '/u01/oracle/product/TEST/10.2.0/export';

Directory created.

Then where is the security, if every user with a database privilege can create directories? ===> To create a directory you need the DBA role OR the "CREATE ANY DIRECTORY" privilege.
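As a quick sketch (run as a DBA; priya is the example user used throughout this post), granting that privilege looks like:

```sql
-- Run as a DBA user (e.g. SYS or SYSTEM).
GRANT CREATE ANY DIRECTORY TO priya;

-- priya can now create her own directory object
-- (the path here is illustrative, not from this post):
CREATE DIRECTORY my_dir AS '/u01/oracle/export';
```

Note that CREATE ANY DIRECTORY is itself a powerful privilege, since a directory object can point anywhere the oracle OS user can write, so grant it sparingly.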

Now if you want to grant user priya privileges on the newly created directory test_dir1,

SQL> GRANT READ,WRITE ON DIRECTORY test_dir1 to priya;
Grant succeeded.

To do a Data Pump export you should have "WRITE" privilege on the directory (and thus on the files in it), and for import you need READ access to the export dump files.

Is READ privilege alone sufficient for import? Nope: the import also writes a log file, so you need WRITE privilege on the directory as well. 😉

Once the directory is created, all Data Pump export and import jobs can use the DIRECTORY parameter to specify the name of the directory object
(DIRECTORY=test_dir1); the directory object in turn points to the OS-level directory and files.

$ expdp priya/passwd directory=test_dir1 dumpfile=test_dir_export.dmp

If you have already created the default directory with the name DATA_PUMP_DIR (or set the DATA_PUMP_DIR environment variable on the client), there is no need to specify the DIRECTORY parameter in export/import commands.

Oracle will automatically look for the directory object named by DATA_PUMP_DIR.

SQL> CREATE DIRECTORY TEST_DIR2 AS '/u01/oracle/product/10.2.0/export';
Directory created.

$ export DATA_PUMP_DIR=TEST_DIR2
$ expdp priya/passwd tables=emp dumpfile=emp.dmp
$ expdp priya/passwd tables=emp logfile=test_dir2:priya.log

Here test_dir2 is the directory object and priya.log is the Data Pump file name; the colon (:) separates the two values. (A Data Pump file can be a dump file, log file, or SQL file.)
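This per-file directory:file syntax can be sketched in a parameter file as well (the dump file name here is illustrative, not from this post):

```text
# Invoked as: expdp priya/passwd parfile=exp_emp.par
# exp_emp.par -- each file entry may name its own directory object
directory=test_dir1
# send the dump file and the log file to test_dir2 instead
dumpfile=test_dir2:emp.dmp
logfile=test_dir2:priya.log
tables=emp
```

A directory named in a specific file parameter overrides the job-level DIRECTORY value for that file only.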

Some practical examples:

1) Import of tables from scott's account to priya's account:

If you do the same using the old import utility:

$ imp username/password FILE=scott.dmp FROMUSER=scott TOUSER=priya TABLES=(*)

Data Pump Import
$ impdp username/password DIRECTORY=test_dir1 DUMPFILE=scott.dmp
TABLES=scott.emp REMAP_SCHEMA=scott:priya

Here FROMUSER/TOUSER is replaced with REMAP_SCHEMA, so the EMP table owned by scott is imported into the priya schema.

2) Export of an entire database to a dump file with all GRANTS, INDEXES, and data

Old export:
$ exp username/password FULL=y FILE=dba.dmp GRANTS=y INDEXES=y ROWS=y

Data Pump Export:
$ expdp username/password FULL=y INCLUDE=GRANT INCLUDE=INDEX
DIRECTORY=test_dir1 DUMPFILE=dba.dmp CONTENT=ALL

The INCLUDE parameter allows you to specify which objects you want to keep in the export job. The EXCLUDE parameter allows you to specify which objects you want to keep out of the export job.
           You cannot mix the two parameters in one job. Both parameters work with Data Pump Import as well.
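As an illustration, a sketch of an export parameter file using EXCLUDE (the schema, file, and trigger names here are illustrative, not from this post):

```text
# Invoked as: expdp priya/passwd parfile=exp_no_stats.par
directory=test_dir1
dumpfile=schema.dmp
schemas=priya
# keep optimizer statistics out of the dump
exclude=statistics
# keep out triggers whose names match a pattern (name clause syntax)
exclude=trigger:"LIKE 'AUDIT%'"
```

Because the name clause needs quotation marks, EXCLUDE/INCLUDE filters with name clauses are easiest to manage in a parameter file rather than on the command line.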

REMAP 
REMAP_TABLESPACE – This allows you to easily import a table into a different
tablespace from which it was originally exported. The databases have to be 10.1
or later.
Example:
$ impdp username/password REMAP_TABLESPACE=tbs_1:tbs_6
DIRECTORY=dpumpdir1 DUMPFILE=employees.dmp

REMAP_DATAFILE – This is a very useful feature when you move databases between platforms that have different file naming conventions. This parameter changes the source datafile name to the target datafile name in all SQL statements where the source datafile is referenced. Because the REMAP_DATAFILE value uses quotation marks, it's best to specify the parameter within a parameter file.
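A sketch of such a parameter file, for an assumed Windows-to-Unix move (the file names and paths here are illustrative, not from this post):

```text
# Invoked as: impdp username/password parfile=remap_df.par
directory=dpumpdir1
dumpfile=db_full.dmp
full=y
# the nested quotation marks are exactly why this belongs in a parfile
remap_datafile="'C:\ORADATA\users01.dbf':'/u01/oradata/users01.dbf'"
```

On the command line the shell would mangle the quoting; inside the parfile it passes through untouched.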


About the Author: AppsDBA


14 comments
nishit patel says July 23, 2008

Thanks priyanka.

It clarifies the concepts rather than just the practical approach. Well written.

shreenidhi says October 4, 2008

The article is very useful and informative.
regards
shreenidhimakesh

Jegan says April 8, 2009

Hi…Nice Work….

Nishit Patel says April 8, 2009

Again a very good one & very informative! But this time I request, and will appreciate it, if you can come up with all possible Datapump practical examples in all aspects!

My Query :
Is it possible to extract and import only all USERS (no individual listing), their SYS privileges & TBS structure, using Datapump or any other simple way? (I do not want objects, with or without data, at all.)

Thanks.

Naoual says April 17, 2009

ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at “SYS.UTL_FILE”, line 488
ORA-29283: invalid file operation

I get these errors when I run expdp, even though the directory has been created; I have a parfile to export the tables in the list.
Thanks

Raghu says April 25, 2009

Hi all can we use exp/imp for R12 cloning process or expdp/impdp.

expdp/impdp process very very slow …it is not moving further from REF_CONSTRAINT.

Please help…

Thanks

Atul Kumar says April 25, 2009

Raghu,
Use rapid clone to clone R12 , exp/imp is useful for migration (from one O.S. to other)

Raghu says April 25, 2009

Thanks Atul,

Fine. Whether R12 supports exp/imp along with expdp/impdp. we are exporting only metadata not the data.

expdp taking long … and stuck at REF_CONSTRAINT …

whether Analyzing the whole DB improves the performance of DB. we analyzed only 70% tables.

Atul Kumar says April 25, 2009

Yes expdp/impdp supported with R12 typically, the export runs for several hours.
Check
454616.1 Export/Import Process for Oracle E-Business Suite Release 12 using 10gR2

For performance issues with expdb/impdp check
453895.1 Checklist for Slow Performance of Export Data Pump (expdp) and Import DataPum…

Raghu says April 25, 2009

Thanks Atul

raju says March 9, 2010

While doing export/import earlier (in old export/import) We were able to give dump file location where ever we find space but in 10g we have specific location which is pointed by this Directory Object.

This is not the actual difference between exp/imp and datapump.

pervaish says April 7, 2010

Again Very nicely written
You should cover some real life scenarios
and performance and maintenance of datapump real issues while using datapump pls,
coparison between dataloader and datapump,
Which one we should prefer under which circumstances?

Thanks
pervaish

hofans says May 8, 2012

Hi, well written valuable information. I was struggling with tablespaces, that are with different names. The hint remap_tablespaces is good for that! Thanks!

mazeem22 says December 7, 2014

Thank you. it was very helpful for us.
