You are about to plug a multi-terabyte non-CDB into an existing multitenant container database
(CDB) as a pluggable database (PDB).
The characteristics of the non-CDB are as follows:
Version: Oracle Database 12c Release 1 64-bit
Character set: WE8ISO8859P15
National character set: AL16UTF16
O/S: Oracle Linux 6 64-bit
The characteristics of the CDB are as follows:
Version: Oracle Database 12c Release 1 64-bit
Character set: AL32UTF8
O/S: Oracle Linux 6 64-bit
Which technique should you use to minimize downtime while plugging this non-CDB into the
CDB?
A.
Transportable database
B.
Transportable tablespace
C.
Data Pump full export / import
D.
The DBMS_PDB package
E.
RMAN
Explanation:
Note:
* Generating a Pluggable Database Manifest File for the Non-CDB
Execute the dbms_pdb.describe procedure to generate the manifest file.
exec dbms_pdb.describe(pdb_descr_file=>'/u01/app/oracle/oradata/noncdb/noncdb.xml');
Shut down the noncdb instance to prepare to copy the data files in the next section.
shutdown immediate
exit
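Once the data files are available to the CDB host, the plug-in step itself might look like the following. This is a minimal sketch run from the CDB root as SYSDBA, assuming the manifest path above; the PDB name ncdb_pdb is illustrative, and NOCOPY assumes the data files are reused in place rather than copied again.
-- plug in the non-CDB using the manifest generated earlier (PDB name is a placeholder)
CREATE PLUGGABLE DATABASE ncdb_pdb USING '/u01/app/oracle/oradata/noncdb/noncdb.xml'
  NOCOPY TEMPFILE REUSE;
ALTER SESSION SET CONTAINER = ncdb_pdb;
-- convert the former non-CDB data dictionary so it can run as a PDB (? expands to ORACLE_HOME)
@?/rdbms/admin/noncdb_to_pdb.sql
-- open the new PDB (we are still connected to its container)
ALTER PLUGGABLE DATABASE OPEN;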
What about the difference in character sets? In the 12c upgrade course they said the CDB and PDB must have the same character set. I think you get this error: Database CHARACTER SET ERROR, character set mismatch: ...
Maybe C is the right answer? I am not 100% sure.
I think D is the best option. The question asks for the technique that minimizes downtime.
Using the DBMS_PDB package is the easiest option, and all export/import techniques, whether full or partial (transportable database or transportable tablespace), might require more downtime...
I agree with D.
1. Both are 12c. 2. Minimize downtime with a multi-terabyte DB.
B is the correct answer (transportable tablespaces will be faster than the other methods). Also, the character sets should not mismatch between the CDB and the new PDB, and the question emphasizes plugging "this" non-CDB into the CDB, so we have to focus not on the general technique but on the specific situation.
Because the non-CDB version is Oracle Database 12c, the correct option should be D, the DBMS_PDB package.
D
I would choose C because of the different character sets.
D
Character Set conversion from WE8ISO8859P15 to AL32UTF8 using the Database Migration Assistant for Unicode (DMU).
D is the easiest option.
C – OK
The character sets are not compatible.
But we must have an empty PDB in the CDB ;/
If the CDB is AL32UTF8, then use the DMU tool (MOS Note 1272374.1, The Database Migration Assistant for Unicode (DMU) Tool).
Examples of common NOT plug-in compatible situations:
CDB is AL32UTF8 or UTF8, PDB is US7ASCII or any xx8xxxxx character set.
CDB is WE8MSWIN1252, PDB is WE8ISO8859P15.
CDB is AR8MSWIN1256, PDB is AR8ISO8859P6.
https://support.oracle.com/epmos/faces/DocumentDisplay?_afrLoop=453075343557014&id=1968706.1&displayIndex=1&_afrWindowMode=0&_adf.ctrl-state=106hd7zz48_126#FIX
The export/import migration methods could be used to overcome these limitations.
You can use Data Pump to migrate all, or portions of, a database from a non-CDB into a PDB, between PDBs within the same or different CDBs, and from a PDB into a non-CDB. In general, using Data Pump with PDBs is identical to using Data Pump with a non-CDB.
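As a hedged illustration of that last point: the only PDB-specific preparation is creating a directory object inside the target PDB and then connecting to the PDB through its service name. The container name, directory path, grantee, and dump file name below are all placeholders.
-- connected to the CDB as a common user with sufficient privileges
ALTER SESSION SET CONTAINER = pdb1;
CREATE DIRECTORY dp_dir AS '/u01/app/oracle/dpdump';
GRANT READ, WRITE ON DIRECTORY dp_dir TO system;
-- the import itself is then run from the OS against the PDB service, for example:
-- impdp system@//cdbhost:1521/pdb1 full=y directory=dp_dir dumpfile=noncdb_full.dmp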
A, B – false
General Limitations on Transporting Data
Be aware of the following general limitations as you plan to transport data:
The source and the target databases must use compatible database character sets. Specifically, one of the following must be true:
The database character sets of the source and the target databases are the same.
D – false
The source and target CDBs must have the same character set and national character set.
https://gumpx.wordpress.com/2013/07/20/database-12c-convert-non-cdb-with-different-character-set-to-pdb/
D
I think C because of the different character set.
There are three possible methods to plug a non-CDB database into a CDB.
Whichever method is used, you have to get the non-CDB into a transactionally-consistent state and open it in restricted mode.
• Either use transportable tablespace (TTS), full conventional export/import, or transportable database (TDB), provided that in the last case any user-defined object resides in a single user-defined tablespace.
• Or use the DBMS_PDB package to construct an XML file describing the non-CDB data files and plug the non-CDB into the CDB as a PDB. This method presupposes that the non-CDB is an Oracle 12c database.
• Or use replication with GoldenGate.
Using the DBMS_PDB package is the easiest option.
If the DBMS_PDB package is not used, then using export/import is usually simpler than using
GoldenGate replication, but export/import might require more downtime during the switch from the non-CDB to the PDB.
If you choose to use export/import, and you are moving a whole non-CDB into the CDB, then
transportable databases (TDB) is usually the best option. If you choose to export and import
part of a non-CDB into a CDB, then transportable tablespaces (TTS) is the best option.
B, C are ruled out. E will take a long time.
I think the DBMS_PDB package will have less downtime than choice A.
D
D correct
Use the DBMS_PDB package, and afterwards the DMU to convert, as per the note (Doc ID 1968706.1).
Have you noticed that questions 54 and 128 are the same except for the character set difference?
If in 54 "transportable tablespace" is the correct answer, how come 128 has DBMS_PDB as the correct answer?
I go with C.
There the non-CDB version was 11.2.0.2.0 and here the non-CDB version is 12c R1. That makes it possible with DBMS_PDB. Makes sense?
B & D: source and target must have compatible character sets
https://docs.oracle.com/cd/B28359_01/server.111/b28310/tspaces013.htm
Limitations on Transportable Tablespace Use
Be aware of the following limitations as you plan to transport tablespaces:
– The source and target database must use the same character set and national character set.
I would go for "C" according to the link below:
https://gumpx.wordpress.com/2013/07/20/database-12c-convert-non-cdb-with-different-character-set-to-pdb/
https://docs.oracle.com/database/121/ADMIN/cdb_plug.htm#ADMIN13551
The source and target must have compatible character sets and national character sets. To be compatible, the character sets and national character sets must meet all of the requirements specified in Oracle Database Globalization Support Guide.
C
As per MOS note: (Doc ID 1968706.1)
In Oracle Database 12c, all pluggable databases (PDBs) in a container database (CDB) must have
* the same Database character set (NLS_CHARACTERSET) or the NLS_CHARACTERSET need to be a (Plug-in compatible) binary subset of the CDB NLS_CHARACTERSET
* the same National character set (NLS_NCHAR_CHARACTERSET) as the CDB's root container
in order to be able to plug in.
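A quick way to verify this before attempting the plug-in is to compare the settings on the non-CDB and on the CDB root. A minimal sketch using the standard NLS dictionary view, run on each database:
SELECT parameter, value
  FROM nls_database_parameters
 WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');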
why not A?
http://www.oracle.com/technetwork/database/enterprise-edition/full-transportable-wp-12c-1973971.pdf
The character set WE8ISO8859P15 is a subset of AL32UTF8.
You can create the new PDB and migrate the non-CDB database to the new PDB.
A is the right answer.
Can you provide a link that states that WE8ISO8859P15 is a subset of AL32UTF8?
thanks
https://docs.oracle.com/cd/E41633_01/pt853pbh1/eng/pt/tgbl/task_SelectingCharacterSets-0769cc.html
Unfortunately this says otherwise: https://community.oracle.com/thread/1112765
You're right
examples of common NOT plug-in compatible situations:
CDB is AL32UTF8 or UTF8 NLS_CHARACTERSET, PDB is US7ASCII or any xx8xxxxx NLS_CHARACTERSET.
CDB is WE8MSWIN1252 NLS_CHARACTERSET, PDB is WE8ISO8859P15 NLS_CHARACTERSET.
CDB is AR8MSWIN1256 NLS_CHARACTERSET, PDB is AR8ISO8859P6 NLS_CHARACTERSET.
——————————————————-
If the question asked to choose 2 answers, I'd go for: C, E
——————————————————-
https://docs.oracle.com/database/121/ADMIN/transport.htm#ADMIN11403
Transporting data is much faster than performing either an export/import or unload/load of the same data. It is faster because, for user-defined tablespaces, the data files containing all of the actual data are copied to the target location, and you use Data Pump to transfer only the metadata of the database objects to the new database
Transportable tablespaces and transportable tables only transports data that resides in user-defined tablespaces. However, full transportable export/import transports data that resides in both user-defined and administrative tablespaces, such as SYSTEM and SYSAUX. Full transportable export/import transports metadata for objects contained within the user-defined tablespaces and both the metadata and data for user-defined objects contained within the administrative tablespaces. Specifically, with full transportable export/import, the export dump file includes only the metadata for objects contained within the user-defined tablespaces, but it includes both the metadata and the data for user-defined objects contained within the administrative tablespaces
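As a rough sketch of the source-side preparation for full transportable export/import: the user-defined tablespaces are made read only and the rest is driven by Data Pump. The tablespace name, file paths, service name, and dump file below are placeholders, and the expdp/impdp lines are indicative only.
-- on the source non-CDB, user-defined tablespaces must be made read only
ALTER TABLESPACE users READ ONLY;
-- the export and import are then driven from the OS, for example:
-- expdp system full=y transportable=always directory=dp_dir dumpfile=full_tts.dmp
-- impdp system@//cdbhost:1521/pdb1 full=y directory=dp_dir dumpfile=full_tts.dmp
--   transport_datafiles='/u01/app/oracle/oradata/pdb1/users01.dbf'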
=============
Demonstration
=============
https://martincarstenbach.wordpress.com/2015/10/08/example-of-full-transportable-export-to-create-a-12c-pdb/
A:
B:
Limitations on Transportable Tablespace Use
The source and target database must use the same character set and national character set.
https://docs.oracle.com/cd/B28359_01/server.111/b28310/tspaces013.htm#i1007233
C:
Moving a Non-CDB Into a CDB
https://docs.oracle.com/database/121/ADMIN/transport.htm#BEHDGGAI
Transporting a Database Using an Export Dump File
https://docs.oracle.com/database/121/ADMIN/transport.htm#ADMIN13726
If the source platform’s endian format is different from the target platform’s endian format, then use one of the following methods to convert the data files:
Use the GET_FILE or PUT_FILE procedure in the DBMS_FILE_TRANSFER package to transfer the data files. These procedures convert the data files to the target platform’s endian format automatically.
Use the RMAN CONVERT command to convert the data files to the target platform’s endian format.
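A minimal PUT_FILE sketch for the first method listed above; the directory objects, file name, and database link name are placeholders, and both endpoints need matching directory objects and a database link to the target.
BEGIN
  DBMS_FILE_TRANSFER.PUT_FILE(
    source_directory_object      => 'SRC_DIR',
    source_file_name             => 'users01.dbf',
    destination_directory_object => 'DST_DIR',
    destination_file_name        => 'users01.dbf',
    destination_database         => 'target_db_link');
END;
/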
Appendix: Limitations on Full Transportable Export/Import
http://www.oracle.com/technetwork/database/enterprise-edition/full-transportable-wp-12c-1973971.pdf
D:
The DBMS_PDB package provides an interface to examine and manipulate data about pluggable databases.
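For example, the package's CHECK_PLUG_COMPATIBILITY function can validate the manifest against the CDB before the actual plug-in. A sketch reusing the manifest path from the explanation above; the PDB name NONCDB is illustrative.
SET SERVEROUTPUT ON
DECLARE
  compatible BOOLEAN;
BEGIN
  compatible := DBMS_PDB.CHECK_PLUG_COMPATIBILITY(
                  pdb_descr_file => '/u01/app/oracle/oradata/noncdb/noncdb.xml',
                  pdb_name       => 'NONCDB');
  DBMS_OUTPUT.PUT_LINE(CASE WHEN compatible THEN 'Compatible' ELSE 'Not compatible' END);
END;
/
-- any failed checks are reported in PDB_PLUG_IN_VIOLATIONS
SELECT cause, message FROM pdb_plug_in_violations WHERE name = 'NONCDB';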
E:
Converting Data Between Platforms Using RMAN
When you use the RMAN CONVERT command to convert data, you can either convert the data on the source platform after running Data Pump export, or you can convert it on the target platform before running Data Pump import. In either case, you must transfer the data files from the source system to the target system.
You can convert data with the following RMAN CONVERT commands:
CONVERT DATAFILE
CONVERT TABLESPACE
CONVERT DATABASE
My answer is: C
I tested and D worked fine.