Oracle Data Pump technology enables very high-speed movement of data and metadata from one database to another. Data Pump is a server-based utility, whereas the traditional Export/Import tools are client-based utilities, and Data Pump dump files are not compatible with files produced by the original Export/Import. Data Pump can also use the NETWORK_LINK functionality to move data and metadata directly from a remote database over a database link, without writing a dump file.
Data Pump jobs use a master table, a master process, and worker processes to perform the work and keep track of the progress. For every Data Pump Export job and Data Pump Import job, a master process is created. The master process controls the entire job, including communicating with the clients, creating and controlling a pool of worker processes, and performing logging operations.
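The same master-process architecture is exposed through the DBMS_DATAPUMP PL/SQL API, which the expdp and impdp clients use under the covers. A minimal sketch of creating a schema-mode export job follows; the directory object MY_DIR, the file names, and the job name are placeholder assumptions, not values from this article:

```sql
-- Sketch only: OPEN creates the master process and master table for the job.
DECLARE
  h NUMBER;
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT',
                          job_mode  => 'SCHEMA',
                          job_name  => 'SCOTT_EXP_JOB');   -- placeholder job name
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'scott.dmp',
                         directory => 'MY_DIR');
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'scott.log',
                         directory => 'MY_DIR',
                         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'SCHEMA_EXPR',
                                value => 'IN (''SCOTT'')');
  DBMS_DATAPUMP.START_JOB(h);   -- master process now spawns its worker processes
  DBMS_DATAPUMP.DETACH(h);      -- the job keeps running on the server
END;
/
```

After DETACH the job runs entirely server-side, which is why it survives a client disconnect.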
Create a Directory Object
To create a directory object, you must have the DBA role or have been granted the CREATE ANY DIRECTORY privilege.
Example (a DBA creates directories on the Windows platform and grants access to user scott):
CONNECT system/manager
CREATE OR REPLACE DIRECTORY my_dir as 'D:\DataPump';
CREATE OR REPLACE DIRECTORY my_logdir as 'E:\logs';
GRANT read, write ON DIRECTORY my_dir TO scott;
GRANT read, write ON DIRECTORY my_logdir TO scott;
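You can verify the registered directory objects and the OS paths they point to afterwards, for example with the standard dictionary views:

```sql
-- As a DBA: list all directory objects and their paths.
SELECT directory_name, directory_path FROM dba_directories;

-- As a non-DBA user: list the directory objects you can access.
SELECT * FROM all_directories;
```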
Example (a normal user with the CREATE ANY DIRECTORY privilege creates directories on the Unix platform; the user who creates a directory object automatically has READ and WRITE privileges on it):
CONNECT system/manager
GRANT CREATE ANY DIRECTORY TO scott;
CONNECT scott/tiger
CREATE OR REPLACE DIRECTORY my_dir as '/usr/DataPump';
CREATE OR REPLACE DIRECTORY my_logdir as '/usr/logs';
Note that the CREATE DIRECTORY statement does not actually create the directory for you on disk. If the directory is invalid, a Data Pump job will fail with:
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 475
ORA-29283: invalid file operation
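The usual fix is to create the operating-system directory yourself and make sure the database software owner can read and write it. A minimal sketch, using a placeholder path (substitute the path from your own CREATE DIRECTORY statement, e.g. /usr/DataPump):

```shell
# Placeholder path; replace with the path registered via CREATE DIRECTORY.
DP_DIR=/tmp/DataPump
mkdir -p "$DP_DIR"    # CREATE DIRECTORY does not create this for you
chmod 750 "$DP_DIR"   # the database OS user needs read/write access here
ls -ld "$DP_DIR"      # verify the directory exists and check its permissions
```

On a real system also check ownership (e.g. chown oracle:oinstall; the user and group names vary by installation).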
Example (dpexport.bat):
@echo off
cls
set day=%date:~0,3%
set mm=%date:~4,2%
set dd=%date:~7,2%
set yy=%date:~-4%
expdp 'sys/oracle10g@dupdb as sysdba' dumpfile=dupdb_full_export_%yy%%mm%%dd%.dmp logfile=dupdb_full_export_%yy%%mm%%dd%.log DIRECTORY=DUPDB_FULL_EXPORT_DIR full=Y
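On Unix the same date-stamped file naming can be done with date(1). This sketch only builds and prints the command (the connect string and directory object are taken from the batch example above; in practice you would run expdp directly instead of echoing it):

```shell
# Build a dated dump/log file name, as the batch script above does on Windows.
stamp=$(date +%Y%m%d)                         # e.g. 20251231
dumpfile="dupdb_full_export_${stamp}.dmp"
logfile="dupdb_full_export_${stamp}.log"
# Sketch: print the command rather than run it against a live database.
echo "expdp 'sys/oracle10g@dupdb as sysdba' dumpfile=${dumpfile} logfile=${logfile} directory=DUPDB_FULL_EXPORT_DIR full=y"
```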
expdp 'monitor/monitor' dumpfile=dpdir:exp%U.dmp logfile=prod_full_export20120606.log DIRECTORY=dpdir full=Y cluster=N exclude=statistics parallel=5
Schema Exports/Imports
expdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log
impdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=impdpSCOTT.log
impdp 'sys/syspw as sysdba' directory=dpdir dumpfile=exp%U logfile=IMPDPProd.log full=y cluster=N parallel=5
Include/Exclude
expdp scott/tiger@db10g schemas=SCOTT include=TABLE:"IN ('EMP', 'DEPT')" directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log
expdp scott/tiger@db10g schemas=SCOTT exclude=TABLE:"= 'BONUS'" directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log
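The quotes in INCLUDE and EXCLUDE clauses often need shell-specific escaping on the command line; putting the parameters in a parameter file avoids that entirely. A sketch (the file name is arbitrary, and the values simply mirror the example above):

```
# exp_scott.par -- run with: expdp scott/tiger@db10g parfile=exp_scott.par
schemas=SCOTT
include=TABLE:"IN ('EMP', 'DEPT')"
directory=TEST_DIR
dumpfile=SCOTT.dmp
logfile=expdpSCOTT.log
```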
A single export or import can include multiple references to these parameters, so to export tables, views, and some packages we could use either of the following approaches.
INCLUDE=TABLE,VIEW,PACKAGE:"LIKE '%API'"
or
INCLUDE=TABLE INCLUDE=VIEW INCLUDE=PACKAGE:"LIKE '%API'"
Network Exports/Imports (NETWORK_LINK)
The NETWORK_LINK parameter identifies a database link to be used as the source for a network export/import. The following database link will be used to demonstrate its use.
CONN / AS SYSDBA
GRANT CREATE DATABASE LINK TO test;
CONN test/test
CREATE DATABASE LINK remote_scott CONNECT TO scott IDENTIFIED BY tiger USING 'DEV';
In the case of exports, the NETWORK_LINK parameter identifies the database link pointing to the source server. The objects are exported from the source server in the normal manner, but written to a directory object on the local server, rather than one on the source server. Both the local and remote users require the EXP_FULL_DATABASE role granted to them.
expdp test/test@db10g tables=SCOTT.EMP network_link=REMOTE_SCOTT directory=TEST_DIR dumpfile=EMP.dmp logfile=expdpEMP.log
For imports, the NETWORK_LINK parameter also identifies the database link pointing to the source server. The difference here is the objects are imported directly from the source into the local server without being written to a dump file. Although there is no need for a DUMPFILE parameter, a directory object is still required for the logs associated with the operation. Both the local and remote users require the IMP_FULL_DATABASE role granted to them.
impdp test/test@db10g tables=SCOTT.EMP network_link=REMOTE_SCOTT directory=TEST_DIR logfile=impdpSCOTT.log remap_schema=SCOTT:TEST
Monitoring Data Pump
set lines 150 pages 100 numwidth 7
col program for a38
col username for a10
col spid for a7
select to_char(sysdate,'YYYY-MM-DD HH24:MI:SS') "DATE", s.program, s.sid,
s.status, s.username, d.job_name, p.spid, s.serial#, p.pid
from v$session s, v$process p, dba_datapump_sessions d
where p.addr=s.paddr and s.saddr=d.saddr;
select sid, username, event, status, state, module, last_call_et
from v$session
where status = 'ACTIVE'
and username like 'SYS%';  -- user who is importing/exporting
select * from dba_datapump_jobs where operation = 'EXPORT';

select * from dba_datapump_jobs where operation = 'IMPORT';

select * from dba_datapump_sessions;
select username, opname, target_desc, sofar, totalwork, message
from v$session_longops
where message like 'SYS%';
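A running job can also be controlled from a client by attaching to it with the job name reported in dba_datapump_jobs. For example (the job name here is illustrative):

```
expdp system/manager attach=SYS_EXPORT_FULL_01

Export> STATUS              (report progress of the attached job)
Export> STOP_JOB=IMMEDIATE  (stop now; can be resumed later with START_JOB)
Export> KILL_JOB            (abort the job and drop its master table)
```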
Wait Events Associated with Data Pump
When using Data Pump you will have a calling session, which is the session you log into the database with when executing the impdp or expdp utility. Within the database there will be one or more worker sessions spawned by this main Data Pump session. Typical wait events for these sessions include:
direct path read
wait for unread message on broadcast channel
Datapump dump file I/O
Note:
You can also include the undocumented parameter METRICS=Y to record additional information in the log file about the number of objects processed and the time it took to process them.