Which three statements are true about Oracle Data Pump export and import operations?


A.
You can detach from a data pump export job and reattach later.

B.
Data pump uses parallel execution server processes to implement parallel import.

C.
Data pump import requires the import file to be in a directory owned by the oracle owner.

D.
The master table is the last object to be exported by the data pump.

E.
You can detach from a data pump import job and reattach later.

Explanation:
B: Data Pump can employ multiple worker processes, running in parallel, to increase job performance.
D: For export jobs, the master table records the location of database objects within a dump file set. Export builds and maintains the master table for the duration of the job. At the end of an export job, the content of the master table is written to a file in the dump file set. For import jobs, the master table is loaded from the dump file set and is used to control the sequence of operations for locating objects that need to be imported into the target database.
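As a rough sketch of how this looks in practice (the directory object, credentials, and job name below are made-up examples, not from the original question):

```shell
# Start a schema-level export as a named Data Pump job.
# The master table (named after JOB_NAME) is created in the exporting
# user's schema and maintained for the duration of the job; its contents
# are written into the dump file set at the very end of the export.
expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp SCHEMAS=hr JOB_NAME=hr_exp

# On import, the master table is loaded from the dump file set first,
# and then drives the order in which objects are located and imported.
impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp
```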





Ledeboer, Jeroen

Please don’t blame me if I’m wrong, but I think A, B, and E are the correct answers.
I can’t find documentation saying that the master table is finally exported together with the data, and from what I’ve read about the contents of this table it doesn’t seem necessary.
Attach and detach are possible for both impdp and expdp.

Bruno

Hi Jeroen,
I agree with you!

But why are these answers so wrong???

Jimwong

I think D is correct.
http://docs.oracle.com/cd/E16655_01/server.121/e17639/dp_overview.htm

Tracking Progress Within a Job
While the data and metadata are being transferred, a master table is used to track the progress within a job. The master table is implemented as a user table within the database. The specific function of the master table for export and import jobs is as follows:

•For export jobs, the master table records the location of database objects within a dump file set. Export builds and maintains the master table for the duration of the job. At the end of an export job, the content of the master table is written to a file in the dump file set.

•For import jobs, the master table is loaded from the dump file set and is used to control the sequence of operations for locating objects that need to be imported into the target database.

JanK

A, E – TRUE
When a Data Pump export or import session is launched, a Data Pump job is automatically started. This way, we can

–> detach from and reattach to long-running jobs without affecting the job itself.

D – TRUE
At the end of an export job, the content of the master table is written to a file in the dump file set.
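A quick sketch of the detach/reattach behaviour JanK describes (credentials, directory object, and job name are assumptions for illustration):

```shell
# Start a long-running export as a named job
expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=full.dmp FULL=YES JOB_NAME=long_exp

# At the interactive prompt, detach without stopping the job:
#   Export> EXIT_CLIENT

# Later, from any client session, reattach to the still-running job:
expdp hr/hr ATTACH=long_exp
#   Export> STATUS      -- check progress
#   Export> STOP_JOB    -- or stop it cleanly
```

The same ATTACH mechanism works for impdp, which is why both A and E are true.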

Peter

It’s written, not exported, so D is false.

Mohammad Rafiq

In my opinion: A, D, and E.

Eugene

B is FALSE:
http://docs.oracle.com/database/121/SUTIL/dp_import.htm#SUTIL921

PARALLEL=integer

The value you specify for integer specifies the maximum number of processes of active execution operating on behalf of the import job. This execution set consists of a combination of worker processes and parallel I/O server processes. The master control process, idle workers, and worker processes acting as parallel execution coordinators in parallel I/O operations do not count toward this total.
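The parameter Eugene quotes can be sketched like this (credentials and directory object are assumptions; `%U` is the Data Pump substitution variable for multi-file dump sets):

```shell
# Request up to 4 active execution processes for the import job.
# PARALLEL defaults to 1, so no parallelism is used unless you ask for it.
impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=hr%U.dmp PARALLEL=4
```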

tagarista

Why is answer C false?

Juan

The user performing the import only needs read access to the directory object. I don’t know why B is false; does anyone know?
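On why C is false: the dump file is accessed through a database DIRECTORY object, not by OS-level ownership, so the file does not need to sit in a directory owned by the oracle owner; it only needs to be readable by the database. A minimal sketch (paths, names, and grantee are assumptions):

```shell
# Create a directory object pointing at wherever the dump file lives,
# and grant the importing user access to it.
sqlplus / as sysdba <<'EOF'
CREATE DIRECTORY dpump_dir1 AS '/u01/app/dumps';
GRANT READ, WRITE ON DIRECTORY dpump_dir1 TO hr;
EOF
```

WRITE is included here because impdp also writes its log file through the directory object.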

JanK

B – false
Data Pump can employ multiple worker processes, running in parallel, to increase job performance.
It uses worker processes, not parallel execution server processes.

nohup

Agreed: A, D, E.

praveen

I believe all should be true. The directory on the OS should be owned by oracle.

Muhammed

I took the exam today.

The question changed a bit: select two, not three, and the options were changed as well.

Correct answers: A, E.

Antonio

Not B
By default impdp doesn’t use parallelism. You have to explicitly write the clause PARALLEL=n.