Data Migration: Using expdp/impdp for Data Transfer between Oracle Databases.
When the source and target databases are at the same version/release this is not an issue, but if, for example, the source is Oracle 11.1.0.7 and you want to import into 10.2.0.3.0, you must use the VERSION parameter with expdp/impdp.
STEP 1: Connect with SQL*Plus to the source and target databases as SYSDBA:
# sqlplus / as sysdba
STEP 2: Specifying export/import dump directory:
# CREATE DIRECTORY expdpump_dir AS '/home/oracle[version]g/datapump';
Then exit SQL*Plus.
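The path must already exist on the server and be writable by the oracle OS user. If the export will run as a user without the DBA role (SYSTEM used below has it), that user also needs explicit access to the directory object; a sketch, with SCOTT as a placeholder user:

```sql
-- grant access if the exporting/importing user lacks the DBA role
SQL> GRANT READ, WRITE ON DIRECTORY expdpump_dir TO scott;
```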
NOTE: Exporting and importing can be done using expdp/impdp in the following modes.
Full Mode : FULL=y
Schema Mode : SCHEMAS=name1,name2,name3
TABLES Mode : TABLES=Table1,Table2,Table3
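For example, the same expdp command line changes only in this one parameter (schema and table names below are placeholders):

```
# expdp system/[PASSWORD]@[SID] DIRECTORY=expdpump_dir DUMPFILE=full_%u.dmp FULL=y
# expdp system/[PASSWORD]@[SID] DIRECTORY=expdpump_dir DUMPFILE=hr_%u.dmp SCHEMAS=HR
# expdp system/[PASSWORD]@[SID] DIRECTORY=expdpump_dir DUMPFILE=emp_%u.dmp TABLES=HR.EMPLOYEES
```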
STEP 3: Run the export at the source:
# expdp system/[PASSWORD]@[SID] schemas=[SCHEMA] DIRECTORY=expdpump_dir JOB_NAME=test DUMPFILE=[SCHEMA]_[SID]_[date]_%u.dmp logfile=EXP_[SCHEMA]_[SID]_date.log PARALLEL=4 VERSION=10.2.0.3.0
Output of a successful export, from the log file /home/oracle11g/datapump/exp_[SCHEMA]_[SID]_date.log:
Dump file set for "SYS"."SYS_EXPORT_SCHEMA_01" is:
/home/oracle11g/datapump/[SCHEMA]_[SID]_[date]_1.dmp
/home/oracle11g/datapump/[SCHEMA]_[SID]_[date]_2.dmp
/home/oracle11g/datapump/[SCHEMA]_[SID]_[date]_3.dmp
/home/oracle11g/datapump/[SCHEMA]_[SID]_[date]_4.dmp
Job "SYS"."SYS_EXPORT_SCHEMA_01" successfully completed at 01:35:45
NOTE: Set the PARALLEL parameter according to the number of CPUs on your server (some suggest twice the number of CPU cores; don't overdo it). The %u in DUMPFILE means multiple files may be generated.
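If the files produced by %u should stay below a certain size, the FILESIZE parameter (not used in the command above) caps each piece of the dump file set; for example:

```
# expdp system/[PASSWORD]@[SID] schemas=[SCHEMA] DIRECTORY=expdpump_dir DUMPFILE=[SCHEMA]_[SID]_[date]_%u.dmp FILESIZE=4G PARALLEL=4
```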
NOTE: The VERSION parameter should be set to the COMPATIBLE parameter value of the target database (wherever we import). In my case: 10.2.0.3.0.
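To find the right value, check the COMPATIBLE setting on the target database first (run in SQL*Plus on the target):

```sql
-- run on the TARGET database; the result is the value to pass as VERSION
SQL> SHOW PARAMETER compatible
SQL> SELECT value FROM v$parameter WHERE name = 'compatible';
```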
STEP 4: Transfer the dump files to the destination dump directory "/home/oracle[version]g/datapump" using ftp or scp.
STEP 5: Run the import at the destination:
# impdp system/[PASSWORD]@[SID] schemas=[SCHEMA] DIRECTORY=expdpump_dir JOB_NAME=test DUMPFILE=[SCHEMA]_[SID]_[date]_%u.dmp logfile=IMP_[SCHEMA]_[SID]_date.log PARALLEL=4 VERSION=10.2.0.3.0
Output of a successful import, from the log file /home/oracle10g/datapump/IMP_[SCHEMA]_[SID]_date.log:
imported "SYS"."SYS_OBJECTS" 9.956 MB 101933 rows
Job "SYS"."SYS_IMPORT_SCHEMA_01" completed without any error(s) at 01:54:57
STEP 6: Last, but something I always find very helpful: MONITORING the export/import job STATUS. I understand the pain of waiting for the log file to be written, but the log file is only written periodically, and you may often suspect the job is NOT RUNNING. All I say is KEEP THE FAITH in expdp/impdp.
> SELECT owner_name, job_name, operation, state FROM dba_datapump_jobs;
OWNER_NAME JOB_NAME OPERATION STATE
---------- -------------------- ---------- ------------
SYSTEM TEST EXPORT NOT RUNNING
Oh! Yes TEST EXPORT is NOT RUNNING.
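NOT RUNNING also appears for a job that is merely stopped or idle between operations, so two further checks can help. First, v$session_longops gives a rough progress estimate for running jobs (a sketch; the OPNAME filter assumes the system-generated job names shown above):

```sql
SELECT sid, opname, sofar, totalwork,
       ROUND(sofar / totalwork * 100, 2) AS pct_done
  FROM v$session_longops
 WHERE opname LIKE 'SYS_EXPORT%'
   AND totalwork > 0
   AND sofar <> totalwork;
```

Second, you can re-attach to the job (here "test", the JOB_NAME from step 3) and ask for STATUS at the interactive prompt:

```
# expdp system/[PASSWORD]@[SID] ATTACH=test
Export> STATUS
```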
STEP 7: KILL / STOP / RESUME / RESTART Data Pump jobs.
While exporting (or importing), press Ctrl-C to bring up the Data Pump interactive prompt.
KILL_JOB: The job cannot be retrieved later; usually used when a job is in a hung state.
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
[Ctrl-c]
Export> KILL_JOB
STOP_JOB: The job can be resumed later; use STOP_JOB=IMMEDIATE when you want to pause the Data Pump job for some reason.
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
[Ctrl-c]
Export> STOP_JOB=IMMEDIATE
Are you sure you wish to stop this job ([yes]/no): yes
ATTACH: Resume the stopped job.
# expdp system/[PASSWORD]@[SID] ATTACH=[JOB_NAME]
START_JOB: Restart the attached job. Use START_JOB=SKIP_CURRENT to skip the action that was in progress when the job stopped, then CONTINUE_CLIENT to resume logging in the client.
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
[Ctrl-c]
Export> START_JOB=SKIP_CURRENT
Export> CONTINUE_CLIENT
Job EXP_FULL has been reopened at Thursday, 09 June, 2011 10:26
Restarting "SYSTEM"."TEST": expdp system*******@[SID] schemas=[SCHEMA] DIRECTORY=expdpump_dir JOB_NAME=test DUMPFILE=[SCHEMA]_[SID]_[date]_%u.dmp logfile=EXP_[SCHEMA]_[SID]_date.log PARALLEL=4 VERSION=10.2.0.3.0
Processing object type DATABASE_EXPORT/TABLESPACE