Friday 22 October 2010

Oracle 11g Streams Synchronous Capture

Have you ever wanted to maintain the same table in two different databases, or in two different schemas? Something like replication: keeping a copy of a table in sync in another database, not in real time, but in near real time? There is an easy way: Oracle Streams Synchronous Capture!

Many times in development I get this request, where a developer on a team will come and ask, "Can we get a copy of that table in our schema in some sort of sync please?". Alternative ways of providing the developer with this table would be backup/restore or Data Pump, but neither is really "in sync", and both are overkill. There is a better way: Oracle Streams Synchronous Capture.

Oracle Streams is a tool which propagates information between databases using Advanced Queuing in almost real time. It has three main components: Capture, Propagate and Apply. Capture is the process which captures the data changes in the source database. Propagate is the process which transfers the data from the source database to the target database. The Apply process copies (inserts) the data into the target database once it arrives. Oracle Streams is easy to set up and use and, most important of all, it is FREE. Yes, Oracle Streams would be the perfect candidate for keeping data available and in sync across two or more database systems for whatever reason, be it development tests, production replication between sites, or any other legitimate reason you want a copy of your data to exist somewhere else as well.
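
As a quick orientation, each of the three components has its own data dictionary view, so you can see what is configured on a database with something like this (a sketch; run as a privileged user):


-- capture processes (synchronous captures appear in DBA_SYNC_CAPTURE instead)
select capture_name, status from dba_capture;

-- propagations between queues
select propagation_name, status from dba_propagation;

-- apply processes
select apply_name, status from dba_apply;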

You might argue that there is something fundamentally wrong with asking for a copy of the data: the same data twice, three times or even more, in different places. Holding copies of data in different places, not in real time but with delay, might dilute the truth and might even version the truth, when all we want is just "one" truth, one real and correct answer. An Oracle Streams setup usually has two ends, the source database and the target database, and there is nothing stopping us from changing the data in the target database after it is streamed down from the source. If you are thinking like this, I will probably agree with you. Nevertheless, Oracle Streams still proves useful.

Reading the Oracle documentation on Oracle Streams, you can see that although there are multiple ways to configure it, there are really only two kinds of Oracle Streams: the asynchronous, redo-log-based capture and the synchronous, "internal-mechanism/trigger"-based capture. Both have their uses. The former is the heavyweight type, where all changes to your source database objects are mined from the redo logs of the source database and then propagated to the target database, where the Apply process picks them up and applies them. It is suitable if you want to stream everything, including DML and DDL, between the source and target databases, or more than just a few tables. On the other hand, if you want to stream only the DML of a few tables between databases, you can use Oracle Streams Synchronous Capture, where an internal trigger-like mechanism captures the changes as they happen in the source database and then propagates them to the target database; no redo log mining is involved. This is exactly what this post is all about: using Oracle Streams Synchronous Capture, a new feature of Oracle 11g, to replicate a few tables between databases without relying on archive logs or redo logs to propagate DML changes.

Below is a step-by-step implementation of Oracle Synchronous Capture in Oracle 11g. I use two database instances, SOURDB for the source database and TARGDB for the target database. First I create the application users and database links, then the Oracle Streams administrator schemas on both databases, and then the queues and the capture, propagation and apply processes. The PL/SQL packages DBMS_STREAMS_ADM and DBMS_APPLY_ADM are used, although if you read the documentation you will see that there are other packages you can use as well.

Step-by-step implementation


1. Create the U_SOURCE users to be used for the test on both databases. Yes, on both systems the user is called U_SOURCE, as the schema names have to be identical for the following example to work. You can have different usernames, but then you have to create a transformation; look in the documentation and you will understand. As I didn't want this blog entry to take forever, I took a shortcut and named the schema U_SOURCE in both SOURDB and TARGDB.


-- on SOURDB create user
connect / as sysdba

create user u_source
identified by u_source
default tablespace users
temporary tablespace temp;

grant connect,resource to u_source;

-- db link to connect to target
-- note: this link is used by the propagation in step 5, so it should be owned
-- by STRMADMIN; create it connected as strmadmin (after step 2), not as SYSDBA
CREATE DATABASE LINK TARGDB CONNECT TO strmadmin
IDENTIFIED BY strmadmin
USING 'TARGDB';

-- on TARGDB create user

connect / as sysdba

create user u_source
identified by u_source
default tablespace users
temporary tablespace temp;
grant connect,resource to u_source;

-- db link to connect to source
-- note: this link is used by U_SOURCE for the instantiation query in step 7,
-- so create it connected as u_source (or create it as a public link)
CREATE DATABASE LINK SOURDB CONNECT TO strmadmin
IDENTIFIED BY strmadmin
USING 'SOURDB';
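
The steps below assume the table T1 already exists in the U_SOURCE schema on both databases. Its definition is not shown in this post; a minimal, hypothetical example would be:


-- hypothetical example table; run as U_SOURCE on both SOURDB and TARGDB
create table t1 (
id number primary key, -- a key helps the apply process identify rows for updates/deletes
name varchar2(100),
amount number
);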


2. Now create the Streams Administrator users on both the SOURDB and TARGDB databases, each with its own STREAMS tablespace. Run the script below as SYSDBA on SOURDB as well as on TARGDB.


--note: DATAFILE without a file name assumes OMF (db_create_file_dest is set);
--otherwise specify a datafile name explicitly
CREATE TABLESPACE streams_tbs
DATAFILE SIZE 25M
AUTOEXTEND ON MAXSIZE 256M;

CREATE USER strmadmin IDENTIFIED BY strmadmin
DEFAULT TABLESPACE streams_tbs
QUOTA UNLIMITED ON streams_tbs;

grant dba to strmadmin;
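
Granting DBA is the quick and easy way; a more contained alternative, the one the Streams documentation describes, is to grant the administrator privileges through DBMS_STREAMS_AUTH:


--instead of "grant dba", privilege the Streams administrator directly
BEGIN
DBMS_STREAMS_AUTH.GRANT_ADMIN_PRIVILEGE(grantee => 'strmadmin');
END;
/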


3. Now it is time to create the Capture process in SOURDB. We use Oracle AQ as the structure which holds the captured DML changes. That is why we first need to create the queue and then add the table from which we want to capture the changes. Also, with synchronous capture you don't have to start the capture process; it starts automatically once you create the table rule.


--in SOURDB as user STRMADMIN set up the queue
--if you want to remove the queue, use:
--exec dbms_streams_adm.remove_queue('strmadmin.SOURDB_queue',true,true);
BEGIN
DBMS_STREAMS_ADM.SET_UP_QUEUE(
queue_table => 'strmadmin.SOURDB_queue_table',
queue_name => 'strmadmin.SOURDB_queue',
queue_user => 'strmadmin');
END;
/
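
To sanity-check the queue just created, you can query DBA_QUEUES (a sketch):


select name, queue_table, enqueue_enabled, dequeue_enabled
from dba_queues
where owner = 'STRMADMIN';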


4. Now, still as the STRMADMIN user in SOURDB which owns the above queue, you can add the table from which you want to capture the changes, like this:


BEGIN
DBMS_STREAMS_ADM.ADD_TABLE_RULES(
table_name => 'U_SOURCE.T1',
streams_type => 'SYNC_CAPTURE',
streams_name => 'SYNC_CAPTURE',
queue_name => 'strmadmin.SOURDB_queue',
include_dml => true,
include_ddl => false,
include_tagged_lcr => false,
inclusion_rule => true,
source_database => 'SOURDB');
END;
/
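
You can confirm that the synchronous capture was created by querying DBA_SYNC_CAPTURE; note that synchronous captures do not show up in DBA_CAPTURE:


select capture_name, queue_name, capture_user
from dba_sync_capture;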


5. Still logged in as the STRMADMIN user in SOURDB, create the propagation process between SOURDB and TARGDB.


--To drop existing propagation
--BEGIN
--DBMS_PROPAGATION_ADM.DROP_PROPAGATION('strmadmin.SOURDB_propagation');
--END;

--To create a propagation between SOURDB to TARGDB
BEGIN
DBMS_STREAMS_ADM.ADD_TABLE_PROPAGATION_RULES(
table_name => 'u_source.t1',
streams_name => 'SOURDB_propagation',
source_queue_name => 'strmadmin.SOURDB_queue',
destination_queue_name => 'strmadmin.TARGDB_queue@TARGDB',
include_dml => TRUE,
include_ddl => FALSE,
include_tagged_lcr => FALSE,
source_database => 'SOURDB',
inclusion_rule => TRUE,
queue_to_queue => TRUE);
END;
/
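
Once created, the propagation and its status can be checked like this (a sketch):


select propagation_name, status, destination_dblink
from dba_propagation;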


6. On the target database TARGDB, create the Apply process; again, you first have to create the queue and then the Apply process.


--exec dbms_streams_adm.remove_queue('strmadmin.TARGDB_queue',true,true);
BEGIN
DBMS_STREAMS_ADM.SET_UP_QUEUE(
queue_table => 'strmadmin.TARGDB_queue_table',
queue_name => 'strmadmin.TARGDB_queue',
queue_user => 'strmadmin');
END;
/

--now create the apply process
begin
dbms_apply_adm.create_apply(
queue_name => 'strmadmin.TARGDB_queue',
apply_name => 'sync_apply',
apply_captured => false); -- very important: must be FALSE, synchronous capture enqueues persistent (non-captured) LCRs
end;
/

--set Apply process rule, which table etc.
begin
dbms_streams_adm.add_table_rules(
table_name => 'U_SOURCE.T1',
streams_type => 'APPLY',
streams_name => 'SYNC_APPLY',
queue_name => 'strmadmin.TARGDB_queue',
include_dml => true,
include_ddl => false,
include_tagged_lcr => false,
source_database => 'SOURDB');
end;
/
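
At this point the apply process exists but is not yet started. You can check it, and confirm the apply_captured setting, with a query like this (a sketch):


--APPLY_CAPTURED should show NO for a synchronous capture setup
select apply_name, status, apply_captured
from dba_apply;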


7. Once you have created the Apply process on the target database, you have to instantiate the target 'replica' table with an SCN. This records a point in time from which the target table starts accepting changes from the source table. So, in the target database TARGDB, as the user U_SOURCE, do the following:


--note: the instantiation SCN must come from the SOURCE database,
--so run this query on SOURDB, not on TARGDB
select dbms_flashback.get_system_change_number apply_scn from dual;

insert into u_source.t1
(select *
from u_source.t1@SOURDB as of scn 13983123089);

commit;

--Instantiate, point of no turning back!
begin
dbms_apply_adm.set_table_instantiation_scn(
source_object_name => 'u_source.t1',
source_database_name => 'SOURDB',
instantiation_scn => 13983123089);
end;
/

--Confirm the instantiation is complete
select source_database,
source_object_owner||'.'||source_object_name object,
instantiation_scn
from dba_apply_instantiated_objects;

--Start the Apply process, from this point onwards all changes in source table T1 are
--propagated to the target table T1.
begin
dbms_apply_adm.start_apply('SYNC_APPLY');
end;
/
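
If changes later fail to apply, the apply error queue is the first place to look (a sketch):


--any LCRs which could not be applied end up here
select apply_name, local_transaction_id, error_message
from dba_apply_error;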



8. Now you can test the setup by going to the SOURDB U_SOURCE schema, inserting some rows into table T1, then going back to the TARGDB U_SOURCE schema and watching the changes being replicated. If everything went all right, your DML will be propagated across from source to target as a logical change record (LCR), as Oracle Streams parlance has it.
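
For example, using the hypothetical T1 definition from step 1 (adjust the columns to your own table):


-- on SOURDB as U_SOURCE
insert into t1 (id, name, amount) values (1, 'streams test', 100);
commit;

-- then on TARGDB as U_SOURCE, after a few seconds
select * from t1 where id = 1;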

For more reading, and for monitoring the Oracle Streams operation, see Monitoring a Capture Process in the documentation. You can also use Oracle Enterprise Manager to monitor Oracle Streams. Unfortunately, although you CAN set up the first type of Oracle Streams, the one using redo logs, with Enterprise Manager, you CAN NOT set up Oracle Streams Synchronous Capture with it.

3 comments:

sonu said...

Hi,

I followed the sequence that you posted here but it does not work for me.

giving error "ORA-02019: connection description for remote database not found"

I have created the DB link in the stream user. I hope this is correct.

I have one more question: I am using Sync capture because I have Standard Edition, but the issue is that my schemas have 400 tables, and I think there should be some good alternatives for schema replication in SE. Could you please advise?

Thanks,
Suchi

Kubilay said...

Sonu, sorry for the late reply. If you have lots of tables, then sync capture is not for you; try the other kind of Oracle Streams, the one which uses the redo logs. Sync capture is a quick and dirty way to stream a few tables with internal triggers and no use of redo logs. It will be very fiddly for you with that many tables. There is always GoldenGate, if your company can afford it; just drag and drop stuff.

Streams involves lots of PL/SQL and script configuration. With regards to DB links in sync capture, you have to create them from the schemas which you want to stream. Also check the table privileges, SELECT...etc., on the objects you want to stream from.

It takes a while to orchestrate all this information, but you will get there.

Anonymous said...

Hi,

As far as I know, the apply process has to be created before the capture process. If this order is not followed, you will get error messages. But your error seems to be due to a different reason, most probably something to do with the connection string used in the DB link not being entered in the tnsnames.ora file. Test your DB links before you follow the later steps.

Good Luck,
Tansel