DBMentors is a solution-oriented group, started by a team of qualified and committed professionals with vast experience in the IT industry. The team has in-depth technical and design expertise and maintains the highest standards of programming quality.
Please see my other blog for Oracle E-Business Suite posts - EBMentors
Note: All the posts are based on a practical approach, avoiding lengthy theory. All have been tested on development servers. Please don't test any post on a production server until you are sure.
If you want to delete a report server, perform the following steps:
1- Stop opmn [example ORACLEMT_HOME\opmn\bin> opmnctl stopall ]
2- Stop EM console [ D:\> emctl stop iasconsole ]
3- Take the backup of opmn.xml. [ located at ORACLEMT_HOME\opmn\conf ]
4- Open opmn.xml; there will be an entry related to your report server. Carefully delete the whole tag (an illustrative fragment of this entry is shown after these steps)
5- Validate opmn.xml [ ORACLEMT_HOME\opmn\bin> opmnctl validate path\opmn.xml ]
6- Take the backup of targets.xml [ located at ORACLEMT_HOME\sysman\emd ]
7- Open targets.xml; there will be an entry related to your report server. Carefully delete the whole tag
8- Start opmn [example ORACLEMT_HOME\opmn\bin>opmnctl startall ]
9- Start EM console [ D:\> emctl start iasconsole ]
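For illustration, the opmn.xml entry for a reports server is an <ias-component> block similar to the following (rep_myserver is a hypothetical name; the exact attributes vary by installation). Delete the whole block from the opening to the closing tag:

<ias-component id="rep_myserver" status="enabled">
   <process-type id="ReportsServer" module-id="ReportsServices">
      <process-set id="rep_myserver" numprocs="1"/>
   </process-type>
</ias-component>

The corresponding entry in targets.xml is likewise a single <Target> element that names the reports server.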
Go to your Application Server Enterprise Manager and you should no longer see the deleted report server.
The following solution is for Oracle Application Server 10g Release 2 on Microsoft Windows 2000/2003.
If you want to create a standalone reports server and you do not want to administer it via OPMN and EM, please use the following syntax: $ORACLE_HOME\BIN>rwserver server=reports_server_name batch=yes
If you want to create a standalone reports server and you DO want to administer it via OPMN and EM, then please use the following steps:
1- MIDTIERHOME\BIN> addNewServerTarget.bat reports_server_name
2- MIDTIERHOME\opmn\BIN> opmnctl reload
3- MIDTIERHOME\opmn\BIN> opmnctl startproc ias-component=reports_server_name
4- MIDTIERHOME\opmn\BIN> opmnctl status
(addNewServerTarget.bat registers the new server in opmn.xml and targets.xml, the same two files edited in the deletion procedure above.)
Streams 10gR2
This write-up assumes you have a good understanding of database administration and at least a theoretical understanding of Streams. I suggest you read this write-up carefully first and then start working. You will need to make changes in your tnsnames.ora to create services for the participating databases, and you may need to adapt the scripts here to your environment. If you face a problem, you can write to me via LinkedIn.
Overview: Streams can be used for such diverse tasks as data replication, data warehousing, database migration between platforms, application upgrades or migrations, and queuing applications.
Streams can be configured in a number of different ways depending on the business requirements. For example, Streams can be configured to:
Replicate data from a database to one or more databases, even if those databases have different structures or naming conventions.
Replicate data between hardware platforms and/or database releases and/or character sets.
Consolidate data from multiple sources with varying structures into a single database.
Provide high availability while performing database or application upgrades or while migrating between hardware platforms.
We used Streams for one-way replication at the Ministry of Higher Education KSA, for their Finance application's database only. It was schema-level replication because the application uses only one schema. For the purpose of this write-up I'll use the HR schema, since we performed our first tests on it; discussing the Ministry's application schema is irrelevant here and also complex, because we also developed a visual interface to manage Streams easily and dynamically (tables can be included in or excluded from the capture process via this interface).
Replication using Oracle Streams is implemented in the following way:
1.- A background capture process is configured to capture changes made to tables, schemas, or the entire database. The capture process captures changes from the redo log and formats each captured change into a logical change record (LCR). The capture process uses LogMiner to mine the redo/archive logs to format LCRs.
2.- The capture process enqueues the LCR events into a specified queue.
3.- This queue is scheduled to propagate events to another queue in a different database.
4.- A background apply process dequeues the events and applies them at the destination database.
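Once such a configuration is running, each of these stages can be observed from the data dictionary. A quick sanity check using standard 10g views (capture and propagation on the source, apply on the target):

SQL> SELECT capture_name, state FROM V$STREAMS_CAPTURE;
SQL> SELECT propagation_name, status FROM DBA_PROPAGATION;
SQL> SELECT apply_name, status FROM DBA_APPLY;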
Moving to Configuration
First of all, you need to know the following:
1- Source/Capture Site – Global Database name of the Source
(For this write up it is TEST.REGRESS.RDBMS.DEV.US.ORACLE.COM)
2- Target/Apply Site - Global Database name of the Target
(For this write up it is APPLYTEST.REGRESS.RDBMS.DEV.US.ORACLE.COM)
3- STRMADMIN - Streams Administrator with password strmadmin (to be created)
4- HR - Source schema to be replicated (already installed if you installed the examples; if not, install it with the help of the scripts provided in the RDBMS folder of your installation)
Note: We will replicate both DML and DDL.
Configuring Streams-related Parameters:
To ensure a successful Streams implementation, use the following recommendations when constructing a Streams environment in the 10.2 release of the Oracle database.
Software Version
Database Settings: Parameters, Storage, and Privileges
Source Site Configuration
Target Site Configuration
Software Version
Oracle recommends running Streams with the latest available patchset (currently 10.2.0.4), although I first tested this on 10.2.0.1.0 and it worked successfully.
Database Parameters
For best results in a Streams environment, there are a number of parameters (about 11) that can be set, but I set only the following at each participating instance (TEST and APPLYTEST for this write-up).
For each parameter below: the recommended setting, a description, and the consideration behind it.

_job_queue_interval = 1
Description: Scan rate interval (in seconds) of the job queue; the default is 5. This is a hidden parameter.
Consideration: This improves the scan rate for propagation jobs to every second, rather than every 5 seconds.

log_archive_dest_1
Description: Where the archive logs will be stored.
Consideration: You should have enough space at this destination.

sga_max_size = 1024M
Description: The size I used while testing.

sga_target = 800M
Description: The size I used while testing.
Doing the work now:
I think we have enough information to work now. I'll go step by step, and where some explanation is required, it will be provided briefly.
1- First we need to create the Streams administrator user on the source/capture site (TEST). Do the following (the STREAMS_TBS tablespace is assumed to exist already; create it first if it does not). The same user is also needed on the target/apply site (APPLYTEST), so repeat this step there:
--SOURCE SITE
SQL> CONN SYS/TEST@TEST AS SYSDBA
Connected.
SQL> CREATE USER STRMADMIN IDENTIFIED BY STRMADMIN DEFAULT
2 TABLESPACE STREAMS_TBS
3 TEMPORARY TABLESPACE TEMP
4 QUOTA UNLIMITED ON STREAMS_TBS;
User created.
SQL> GRANT CONNECT, RESOURCE, AQ_ADMINISTRATOR_ROLE,DBA to STRMADMIN;
Grant succeeded.
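Alternatively, Oracle provides a dedicated procedure that grants a Streams administrator the privileges it needs. A sketch of that call (optional here, since DBA was already granted above):

SQL> BEGIN
  2  DBMS_STREAMS_AUTH.GRANT_ADMIN_PRIVILEGE(
  3  grantee => 'STRMADMIN',
  4  grant_privileges => TRUE);
  5  END;
  6  /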
2- Check the parameters and change their values if required:
SQL> show parameter _job_queue_interval
SQL> ALTER SYSTEM SET "_job_queue_interval" = 1 SCOPE=SPFILE;
System altered.
SQL> show parameter COMPATIBLE
Set the log_archive_dest_1 parameter to the location where you want to store the archive logs for the source. I've set mine to a location other than the Oracle home.
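For example (D:\oracle\archive is a hypothetical path; substitute your own):

SQL> ALTER SYSTEM SET log_archive_dest_1='LOCATION=D:\oracle\archive' SCOPE=SPFILE;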
SQL> alter system set sga_max_size=1024M scope=spfile;
System altered.
SQL> alter system set sga_target=800M scope=spfile;
System altered.
3- Now you should verify that supplemental logging is enabled at the source, either at the database level or for the individual replicated tables. As already said, the capture process uses LogMiner to mine the redo logs to format LCRs; with supplemental logging enabled, the database stores additional data for log mining.
You can check the database-level logging with the help of the following SQL statement:
SQL> SELECT supplemental_log_data_pk, supplemental_log_data_ui FROM V$DATABASE;
SUP SUP
--- ---
YES YES
You can check the table-level logging with the following SQL statement:
SQL> SELECT supplemental_log_data_pk, supplemental_log_data_ui, supplemental_log_data_fk FROM dba_capture_prepared_tables UNION
SELECT supplemental_log_data_pk, supplemental_log_data_ui, supplemental_log_data_fk FROM dba_capture_prepared_schemas UNION
SELECT supplemental_log_data_pk, supplemental_log_data_ui, supplemental_log_data_fk FROM dba_capture_prepared_database;
You can check the supplemental log groups with the following SQL statement:
SQL> Select log_group_name, column_name, position
  2  from dba_log_group_columns where table_name = 'DEPARTMENTS' and owner='HR';
It's quite possible that supplemental logging is not enabled in your environment; in my case it is already enabled, as I'm using LogMiner. If it is not enabled in your environment, enable it with the following statements.
SQL> Alter database add supplemental log data;
Database altered.
SQL> ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (ALL,PRIMARY KEY,UNIQUE,FOREIGN KEY) COLUMNS;
NOTE: The global_names parameter must be TRUE on the participating databases (i.e. TEST and APPLYTEST for this write-up). If it is not, issue the following statement on each participating database:
SQL> ALTER SYSTEM SET global_names=TRUE;
STEP 1.- Create database links between the participating databases. As SYS on the source (TEST), create a public database link to the target:
SQL> create public database link "APPLYTEST.REGRESS.RDBMS.DEV.US.ORACLE.COM" using 'APPLYTEST';
Database link created.
NOTE: If your global_names is TRUE, then the name of the database link must be exactly the same as the global_name of your remote database. Furthermore, 'APPLYTEST' here is the service name in tnsnames.ora pointing to the TARGET site, so change your tnsnames.ora accordingly.
SQL> connect STRMADMIN/STRMADMIN@test
Connected.
SQL> create database link "APPLYTEST.REGRESS.RDBMS.DEV.US.ORACLE.COM" connect to strmadmin identified by strmadmin;
Database link created.
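To verify that the link resolves correctly, you can query the remote database's global name (a quick sanity check):

SQL> SELECT * FROM global_name@"APPLYTEST.REGRESS.RDBMS.DEV.US.ORACLE.COM";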
STEP 2.- Connect as the Streams Administrator in the target site and create the streams queue
SQL> connect STRMADMIN/STRMADMIN@APPLYTEST
Connected.
SQL> BEGIN
2 DBMS_STREAMS_ADM.SET_UP_QUEUE(
3 queue_table => 'STREAMS_QUEUE_TABLE',
4 queue_name => 'STREAMS_QUEUE',
5 queue_user => 'STRMADMIN');
6 END;
7 /
PL/SQL procedure successfully completed.
STEP 3.- Add apply rules for the schema at the destination database, as sketched below.
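The DBMS_STREAMS_ADM.ADD_SCHEMA_RULES procedure adds the apply rules. A sketch, using the STREAM_APPLY and STREAMS_QUEUE names that appear in the later steps (adjust to your environment):

SQL> conn strmadmin/strmadmin@APPLYTEST
SQL> BEGIN
  2  DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
  3  schema_name => 'HR',
  4  streams_type => 'APPLY',
  5  streams_name => 'STREAM_APPLY',
  6  queue_name => 'STRMADMIN.STREAMS_QUEUE',
  7  include_dml => TRUE,
  8  include_ddl => TRUE,
  9  source_database => 'TEST.REGRESS.RDBMS.DEV.US.ORACLE.COM');
 10  END;
 11  /

STEP 4.- Create the streams queue on the source site and add capture rules for the schema. STEP 5.- Add propagation rules from the source queue to the destination queue. A sketch under the same assumptions (STREAM_CAPTURE and STREAM_PROPAGATE are illustrative names):

SQL> conn strmadmin/strmadmin@TEST
SQL> BEGIN
  2  DBMS_STREAMS_ADM.SET_UP_QUEUE(
  3  queue_table => 'STREAMS_QUEUE_TABLE',
  4  queue_name => 'STREAMS_QUEUE',
  5  queue_user => 'STRMADMIN');
  6  END;
  7  /
SQL> BEGIN
  2  DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
  3  schema_name => 'HR',
  4  streams_type => 'CAPTURE',
  5  streams_name => 'STREAM_CAPTURE',
  6  queue_name => 'STRMADMIN.STREAMS_QUEUE',
  7  include_dml => TRUE,
  8  include_ddl => TRUE,
  9  source_database => 'TEST.REGRESS.RDBMS.DEV.US.ORACLE.COM');
 10  END;
 11  /
SQL> BEGIN
  2  DBMS_STREAMS_ADM.ADD_SCHEMA_PROPAGATION_RULES(
  3  schema_name => 'HR',
  4  streams_name => 'STREAM_PROPAGATE',
  5  source_queue_name => 'STRMADMIN.STREAMS_QUEUE',
  6  destination_queue_name => 'STRMADMIN.STREAMS_QUEUE@APPLYTEST.REGRESS.RDBMS.DEV.US.ORACLE.COM',
  7  include_dml => TRUE,
  8  include_ddl => TRUE,
  9  source_database => 'TEST.REGRESS.RDBMS.DEV.US.ORACLE.COM');
 10  END;
 11  /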
STEP 6.- Export, import, and instantiate the objects from the source to the destination database. If the objects are not present in the destination database, perform an export of the objects from the source database and import them into the destination database.
Export from the Source Database:
Specify the OBJECT_CONSISTENT=Y clause on the export command.
By doing this, an export is performed that is consistent for each individual object at a particular system change number (SCN).
Import into the Destination Database:
Specify the STREAMS_INSTANTIATION=Y clause on the import command.
By doing this, the Streams metadata in the destination database is updated with the appropriate information corresponding to the SCN that is recorded in the export file.
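For illustration, with the exp/imp utilities (the hr.dmp file name and the system credentials are hypothetical; adjust to your environment):

D:\> exp system/manager@TEST OWNER=HR FILE=hr.dmp OBJECT_CONSISTENT=Y
D:\> imp system/manager@APPLYTEST FROMUSER=HR TOUSER=HR FILE=hr.dmp STREAMS_INSTANTIATION=Y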
If the objects already exist at the destination database, obtain the instantiation SCN at the source database manually:
SQL> conn strmadmin/strmadmin@TEST
SQL> SET SERVEROUTPUT ON
SQL> DECLARE
  2  iscn NUMBER; -- Variable to hold the instantiation SCN value
  3  BEGIN
  4  iscn := DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER();
  5  DBMS_OUTPUT.PUT_LINE ('Instantiation SCN is: ' || iscn);
  6  END;
  7  /
Instantiation SCN is: 218492643
PL/SQL procedure successfully completed.
Instantiate the objects at the destination database with this SCN value.
The SET_TABLE_INSTANTIATION_SCN procedure controls which LCRs for a table are to be applied by the apply process. If the commit SCN of an LCR from the source database is less than or equal to this instantiation SCN, the apply process discards the LCR; otherwise, the apply process applies it.
Note: In 10g, the recursive=>true parameter of DBMS_APPLY_ADM.SET_SCHEMA_INSTANTIATION_SCN is used for instantiation. If you use recursive=>true with SET_SCHEMA_INSTANTIATION_SCN, you need a database link on the destination database to the source database with the same name as the source database.
SQL> conn sys/APPLYTEST@APPLYTEST as sysdba
Connected.
SQL> create public database link "TEST.REGRESS.RDBMS.DEV.US.ORACLE.COM" using 'TEST'; -- Here TEST is the service name in your tnsnames.ora, change your tnsnames accordingly.
Database link created.
SQL> connect STRMADMIN/STRMADMIN@APPLYTEST
Connected.
SQL> create database link "TEST.REGRESS.RDBMS.DEV.US.ORACLE.COM" connect to strmadmin identified by strmadmin;
Database link created.
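With the links in place, set the instantiation SCN at the destination. A sketch using the SCN obtained above (218492643; substitute the value printed in your environment):

SQL> BEGIN
  2  DBMS_APPLY_ADM.SET_SCHEMA_INSTANTIATION_SCN(
  3  source_schema_name => 'HR',
  4  source_database_name => 'TEST.REGRESS.RDBMS.DEV.US.ORACLE.COM',
  5  instantiation_scn => 218492643,
  6  recursive => TRUE);
  7  END;
  8  /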
STEP 7.- Specify an 'APPLY USER' at the destination database. This is the user who applies all DML and DDL statements. The user specified in the APPLY_USER parameter must have the necessary privileges to perform DML and DDL changes on the apply objects.
SQL> conn strmadmin/strmadmin@APPLYTEST
Connected.
SQL> BEGIN
2 DBMS_APPLY_ADM.ALTER_APPLY(
3 apply_name => 'STREAM_APPLY',
4 apply_user => 'HR');
5 END;
6 /
PL/SQL procedure successfully completed.
STEP 8.- Set the disable_on_error parameter to 'n' so the apply process does not abort on every error; then start the apply process on the destination.
SQL> conn strmadmin/strmadmin@APPLYTEST
Connected.
SQL> BEGIN
2 DBMS_APPLY_ADM.SET_PARAMETER(
3 apply_name => 'STREAM_APPLY',
4 parameter => 'disable_on_error',
5 value => 'n');
6 END;
7 /
PL/SQL procedure successfully completed.
SQL> DECLARE
  2  v_started number;
  3  BEGIN
  4  SELECT decode(status, 'ENABLED', 1, 0) INTO v_started
  5  FROM DBA_APPLY WHERE APPLY_NAME = 'STREAM_APPLY';
  6  if (v_started = 0) then
  7  DBMS_APPLY_ADM.START_APPLY(apply_name => 'STREAM_APPLY');
  8  end if;
  9  END;
 10  /
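Finally, the capture process must be started at the source site. A sketch, assuming the illustrative capture name STREAM_CAPTURE from the earlier step:

SQL> conn strmadmin/strmadmin@TEST
SQL> EXEC DBMS_CAPTURE_ADM.START_CAPTURE(capture_name => 'STREAM_CAPTURE');

Once capture, propagation, and apply are all enabled, a DML or DDL change on the HR schema at TEST should appear at APPLYTEST shortly afterwards.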