Friday 16 December 2011

Salesforce to Oracle Real Time Integration

In this blog post I will show you how to make Web Service-like calls from Salesforce to Oracle in real time, using an Informatica Cloud Endpoint URL.

In the Winter '12 release of Informatica Cloud we will have the ability to make Salesforce outbound message calls to an Informatica Cloud task and so enable real-time integration.

The idea is simple

Use a Salesforce Workflow Rule (essentially a database trigger) to make an outbound message call whenever an Account record is created or edited.

Step by step instructions

1) Create a vanilla Data Synchronisation task (I called mine 'test ws') that reads the Account fields from Salesforce (left) and UPSERTs them to Oracle (right), into a table called FROMSFDC, like this:
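On the Oracle side, the FROMSFDC target table has to exist before the task runs. The exact columns depend on which Account fields you map, so this minimal sketch uses assumed column names:

-- Hypothetical target table; adjust the columns to match the Account fields you map
CREATE TABLE fromsfdc (
  sfdc_id       VARCHAR2(18) PRIMARY KEY,   -- Salesforce record Id, used as the UPSERT key
  name          VARCHAR2(255),
  account_type  VARCHAR2(40),
  billing_city  VARCHAR2(40),
  last_modified DATE
);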


2) Then, at the '6. Schedule' step of the Informatica Cloud task, you will see a new option called:

Run this task in real-time upon receiving an outbound message from Salesforce






When you tick this option, as seen above, an Endpoint URL is generated for the 'test ws' Informatica Cloud task. You can call this task whenever you save a record in Salesforce and force it to execute the UPSERT and update the Oracle table FROMSFDC. The next step is to put the above Endpoint URL into a Salesforce Workflow Action (basically an AFTER INSERT/UPDATE trigger).

3) The next thing to do is to open up Salesforce, go to Workflow & Approvals > Outbound Messages and paste the above Endpoint URL, like this:






4) The last thing to do is to create a Workflow Rule that fires the Outbound Message you have just created, which in turn calls the Informatica Cloud task. I created something like this:





 
That's it! Now, don't forget to Activate the rule. 

Whenever a new Salesforce Account is created, it is added to Oracle as well. Whenever a Salesforce Account is edited, the change is UPSERTed to Oracle too.

Below you can see the test records inserted in Salesforce and then UPSERTed to Oracle (Oracle APEX in this case).

Here is the record being created in Salesforce; note the timestamp.

















That is the record created in Salesforce. Check the timestamp.

And here you can see the same record pushed in real time to my Oracle APEX account via the Informatica Cloud Endpoint URL option in my 'test ws' task.
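If you prefer SQL to the APEX UI for checking the result, a quick query against the target table (column names as assumed in the earlier sketch) shows the freshly upserted rows:

-- Show the latest rows pushed over from Salesforce
SELECT sfdc_id, name, last_modified
FROM   fromsfdc
ORDER  BY last_modified DESC;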


















Conclusion:

It is easy to connect Salesforce in real time to Oracle, or to any other database. The benefits are tremendous: to do something similar with plain Web Services you would have to write code to send SOAP messages to the database, and do a lot of configuration work in the database to receive those messages. The beauty of the Informatica Cloud Endpoint URL schedule option is that you can call Informatica Cloud tasks in real time from Salesforce with zero coding, just configuration.


Note: The Data Synchronisation Endpoint URL functionality of Informatica Cloud will be available in January 2012. I used a beta test org for this blog post. You can find out more about the Informatica Cloud Winter 2012 release here.

Thursday 1 September 2011

Update data in the same Salesforce Object with a 'Self-join' Informatica Cloud Task


When you use Informatica Cloud you usually use it to do Migration, Integration and Synchronisation. That's about what it is used for, right?

It is all about moving data from point A to point B, or syncing data between A and B. That is what Informatica Cloud is built for: to move/sync data from A to B and, while moving it, maybe transform it as well. A is the 'source' and B is the 'target'. Correct! A can be an Oracle or ERP database and B can be a Salesforce org. A classic use case of Informatica Cloud.

Well, not exactly. How about thinking a little bit differently?

What if A is the source and A is the target as well?

Yes, that's it: read from A and write to A via an Informatica Cloud Data Synchronisation task, a kind of task that 'self-joins' an object or table. It can be used to 'self-cleanse' or 'self-update' data in a Salesforce custom object or an Oracle database table. This is possible: we can use Informatica Cloud like a self-join SQL statement to edit our data in place, in one single object/table.
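In plain SQL terms, the task emulates an in-place UPDATE on a single table; something like this, purely as an illustration of the idea (using the Type change from the example below):

-- What the 'self-join' task effectively does, expressed as SQL against a hypothetical table
UPDATE account
SET    type = 'Employer'
WHERE  type = 'Direct Employer';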

I will try to explain this concept with an example. Suppose you have data in your Salesforce standard object 'Account' which you want to change. A standard approach would be to export the data to a CSV file via a report, change it manually in Excel or something, and then use the Apex Data Loader to put the changed data back into the Account object in your Salesforce org, right?

The screenshots below show you how to do this type of operation without actually exporting the data, doing it in place with an Informatica Cloud synchronisation task instead.

The synchronisation task uses the same Salesforce org connection and Account object as both 'Source' and 'Target'. That is it: for this task, the Account standard object is the source connection and the target connection at the same time.

This way the Informatica Cloud task reads the data from the Account object, transforms it, and writes it back into the same Account object.


Create a Source on the Account object




Create a Target for the same Account object in the same org; just choose the same object as the target.




Field Mapping, a kind of 'self-join'; see Id = Id




Use DECODE function in the same Field Mapping to manipulate data



The Informatica Cloud string function DECODE does the trick. Observe how it changes occurrences of the string 'Direct Employer' to 'Employer' in the Type field of the standard Salesforce Account object, which is the same object the task reads from. Note the trailing Type argument: it is the default, so any value other than 'Direct Employer' is written back unchanged rather than being set to NULL.
decode(Type,'Direct Employer','Employer',Type)
When you save and run this task it reads from the Account object and writes the changed data back to the same Account object, all within one Informatica Cloud Data Synchronisation 'self-join' task. No file downloads, no Excel, just one expression. Given the plethora of Informatica Cloud string, arithmetic, date and other functions available, just imagine what you can do to your data!
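Informatica's DECODE behaves much like Oracle's DECODE, so you can sanity-check the expression in SQL first. This purely illustrative query shows how the trailing default argument lets non-matching values pass through untouched:

-- The last argument is the default: anything other than 'Direct Employer' is left as it is
SELECT t.type,
       DECODE(t.type, 'Direct Employer', 'Employer', t.type) AS new_type
FROM  (SELECT 'Direct Employer' AS type FROM dual
       UNION ALL
       SELECT 'Customer' FROM dual) t;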

This is a wonderful and simple example of how Informatica Cloud can be used to cleanse data, I think.


Thursday 18 August 2011

Copy your Salesforce org into Oracle with CopyForce


We found this really cool piece of code called CopyForce on Google Code the other day, which lets you:

Copy your Salesforce org objects (tables, indexes) into Oracle or any other RDBMS, yes you can!

The code, I believe, is freely available, but to save time I used the GUI, which comes with a 30-day trial license from CAPSTORM. In this post I want to quickly show you how you can copy your whole Salesforce org (dev or prod) with its data into Oracle, and then do whatever you like with it.

Installing CopyForce with the CAPSTORM GUI

1. Download the CopyForceOracle.zip file from the above site.
2. Extract the zip file to a directory of your choice

/home/crazyman/copystorm

3. Make the shell script executable: chmod +x CopyStorm.sh
4. Start the GUI by running

./CopyStorm.sh

5. When the CAPSTORM GUI starts, accept the EULA.
6. Remember, this is a 30-day trial only.

Once the GUI is up, you just have to provide your Salesforce and Oracle connection credentials. Before that, I logged in to Oracle and created a dummy schema called ORA_SFDC, into which the CAPSTORM program will copy all the Salesforce developer org objects.
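If you are on a plain Oracle database (a hosted APEX workspace does this for you), the dummy schema is just a user with a few grants; a minimal sketch, with a placeholder password, would be:

-- Hypothetical schema setup for the CopyForce target
CREATE USER ora_sfdc IDENTIFIED BY change_me
  DEFAULT TABLESPACE users
  QUOTA UNLIMITED ON users;
GRANT CREATE SESSION, CREATE TABLE, CREATE VIEW TO ora_sfdc;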



Once you have configured and tested both connections, then you can start the copy process.



Once the process finishes, let's log in to Oracle and see how many objects were created in the previously blank ORA_SFDC schema. Running a query against an Oracle dictionary view shows that the schema contains only tables, indexes and LOBs.




SQL>
select object_type, count(*)
from user_objects
group by object_type
order by 2 desc;

OBJECT_TYPE         COUNT(*)
------------------- --------
TABLE                    161
INDEX                    161
LOB                       89


That's it; that's what you get in Oracle from a vanilla Salesforce developer org plus the few custom objects I had in this particular developer org. Only 161 tables per org, not bad, huh!
Now just imagine what you can do with this! 
The other good thing you get with CopyForce is that, now that you have copied your Salesforce org schema and data into Oracle, you can use SQL to your heart's content and massage the data as you wish. Gone are the days when you had to dump Salesforce data to Excel and change it manually before putting it back into Salesforce. Now you can change your Salesforce data in an Oracle database with all the sophistication of SQL and put it back into Salesforce.
I think CopyForce has a lot of future and would be a good choice for one-off data integration and data beautification projects.
I wonder what is in those LOBs? I bet it is the Apex code...
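If you are curious which tables those LOB segments belong to, the USER_LOBS dictionary view will tell you (the output will obviously vary from org to org):

-- List the tables and columns that own the LOB segments
SELECT table_name, column_name
FROM   user_lobs
ORDER  BY table_name;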




Friday 15 July 2011

Salesforce Dev 401 Exam study material

Just a quick post to share my Salesforce Dev 401 exam study materials. I assembled this table of 87 questions while studying for the exam back in January 2011.

These are not actual Exam questions, don't be fooled!

This is study material I compiled while reading the study guides and the training materials I found online. It did help me pass the exam; the quiz-like questions helped me a lot to understand concepts and limits.

I have also tried to categorise the questions & quizzes by the Salesforce Dev401 study guide syllabus.

Here is the link to the questions: http://apex-outsource.com/pls/apex/f?p=309:Dev401


Hope it helps!


Saturday 2 April 2011

Salesforce API call from Oracle APEX

This post is about integrating Salesforce force.com with Oracle using the Salesforce API.

I will try to show you how to write a PL/SQL procedure that sends an SQL INSERT you make into an Oracle table on to a Salesforce custom object as well; that is, how you can INSERT into a Salesforce object at the same time as you insert into an Oracle table. I will use Oracle APEX, Oracle's RAD (Rapid Application Development) tool, on the database side.

You can achieve this by using the Oracle APEX function APEX_WEB_SERVICE.MAKE_REQUEST to send SOAP envelopes to the Salesforce SOAP endpoint described by the WSDL. I used plain HTTP rather than HTTPS, as I couldn't get the HTTPS part working with my free Salesforce Developer account and my hosted Oracle APEX instance.

The idea is simple: you create the PL/SQL procedure in Oracle, run it, and insert data into an Oracle table and a Salesforce custom object simultaneously via SOAP envelopes. The SOAP elements are sent to Salesforce from the PL/SQL block using the APEX_WEB_SERVICE.MAKE_REQUEST function. The data flows from Oracle APEX to Salesforce. To do this, you will create two SOAP envelopes.

1. One to authenticate and log in to Salesforce; once you are logged in you can get the Salesforce sessionId.

2. And a second one to send the INSERT ('create', in Salesforce parlance) DML.

Here is the code you run in Oracle; you don't have to do anything in Salesforce. All you need on the Salesforce side is a custom object to insert the data into, your credentials, and SOAP envelopes written according to the enterprise WSDL you generate. I have the following code as a PL/SQL anonymous block in an Oracle APEX page process, and it does the job for me. To save you time: you need an Oracle APEX instance with external network calls enabled. Unfortunately apex.oracle.com will not let you do that (at least it didn't when I last checked), but you can always get a low-cost, fully network-enabled Oracle APEX instance from hosting companies; the one I use is Enciva.com.


Enough said, here is the code. Make sure you pass in your Salesforce credentials, with the security token appended to the password.

DECLARE
  l_envelope        CLOB;       -- SOAP envelope for the Salesforce login call
  l_envelope_create CLOB;       -- SOAP envelope for the Salesforce create (insert) call
  l_xml             XMLTYPE;    -- login response
  l_xml_create      XMLTYPE;    -- create response
  v_session_id      VARCHAR2(1000);
BEGIN
  -- set the APEX workspace (security group) context
  wwv_flow_api.set_security_group_id;
  --
  -- LOGIN ENVELOPE
  l_envelope := '<?xml version="1.0" encoding="utf-8"?>
   <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xmlns:xsd="http://www.w3.org/2001/XMLSchema">
      <soap:Body>
         <login xmlns="urn:enterprise.soap.sforce.com">
            <username>YOUR@SALESFORCEUSERNAME</username>
            <password>YOURPASSWORDANDSECURITYTOKEN</password>
         </login>
      </soap:Body>
   </soap:Envelope>';

  -- make the login request
  l_xml := apex_web_service.make_request(
              p_url      => 'http://na7.salesforce.com/services/Soap/c/20',
              p_action   => 'http://na7.salesforce.com/services/Soap/c/20/login',
              p_envelope => l_envelope
           );
  --
  -- Pull the session id out of the login response.
  -- I know this is an ugly way of getting it (it simply chops out the text between
  -- <sessionId> and </sessionId>); write a better one then, I just wanted it working.
  v_session_id := replace(replace(
     substr(substr(l_xml.getClobVal(), 1, instr(l_xml.getClobVal(), '</sessionId>')),
            instr(substr(l_xml.getClobVal(), 1, instr(l_xml.getClobVal(), '</sessionId>')), '<sessionId>'))
     , '<sessionId>', ''), '<', '');
  -- debug
  dbms_output.put_line('session_id is: ' || v_session_id);

  -- CREATE ENVELOPE: inserts one record into the linkstream__c custom object,
  -- using the values of the APEX page items P3_WBID, P3_LINK, P3_DESCRIPTION, P3_DISPLAY_LINK
  l_envelope_create := '<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <soapenv:Header>
         <ns1:SessionHeader soapenv:mustUnderstand="0" xmlns:ns1="urn:enterprise.soap.sforce.com">
            <ns2:sessionId xmlns:ns2="urn:enterprise.soap.sforce.com">' || v_session_id || '</ns2:sessionId>
         </ns1:SessionHeader>
      </soapenv:Header>
      <soapenv:Body>
         <create xmlns="urn:enterprise.soap.sforce.com">
            <sObjects xsi:type="ns3:linkstream__c" xmlns:ns3="urn:sobject.enterprise.soap.sforce.com">
               <ns3:wbid__c>' || :P3_WBID || '</ns3:wbid__c>
               <ns3:Link__c>' || :P3_LINK || '</ns3:Link__c>
               <ns3:Description__c>' || :P3_DESCRIPTION || '</ns3:Description__c>
               <ns3:Display_link__c>' || UPPER(:P3_DISPLAY_LINK) || '</ns3:Display_link__c>
            </sObjects>
         </create>
      </soapenv:Body>
   </soapenv:Envelope>';

  dbms_output.put_line(l_envelope_create);

  -- make the create request
  l_xml_create := apex_web_service.make_request(
                     p_url      => 'http://na7.salesforce.com/services/Soap/c/20',
                     p_action   => 'http://na7.salesforce.com/services/Soap/c/20/create',
                     p_envelope => l_envelope_create
                  );
  dbms_output.put_line(substr(l_xml_create.getClobVal(), 1, 25500));

END;


As you can see, the code executes in one PL/SQL block. You can call it from an INSERT trigger on the Oracle table, or as a 'Page Processing' process in Oracle APEX. It adds a link to my Force.com Sites custom object from my Oracle APEX app; I display this mock Force.com site in the left column of this blog, under the 'My force.com apps' section.
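By the way, the session-id string chopping in the block above is admittedly crude. If you prefer, the XMLTYPE extract function with a local-name() XPath (so you don't have to fight the SOAP namespaces) should do the same job more cleanly; a sketch:

-- Alternative way to pull the sessionId out of the login response
v_session_id := l_xml.extract(
                   '//*[local-name()=''sessionId'']/text()'
                ).getStringVal();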

Needless to say, there are probably dozens of easier ways of doing the same thing between Oracle and Salesforce with tools such as CastIron, Talend, Informatica or Apatar. Some are even free. But hey, this uses no tool at all, just SOAP over HTTP!




Saturday 29 January 2011

Salesforce workflow

What is Salesforce Workflow?

Workflow in Force.com is business logic, and in this post I will try to explain it.

In Force.com you can create custom objects and then create relationships between them according to the application data model you have designed. Very good, but if you don't have workflow, your Force.com app will be nothing more than a lousy database, a mere database to enter and search records. What is Salesforce anyway? Isn't it a data-centric application-building platform built around a database? Another database front end?

No, not just that! Force.com workflow brings this database to life. Instead of being a stagnant repository of passive, historical data, it becomes a forward-thinking and forward-acting database. Maybe the 'thinking' bit is a bit of an exaggeration, but nevertheless, deductive databases are what we want, aren't they? Maybe one day...

What the Force.com workflow engine does is things like automatically sending email alerts, assigning tasks, or updating field values based on rules that we define. That is its job, and that is exactly what it does. Based on the rules we define, the Force.com workflow engine can:

  • Assign a 'Task' to a force.com user
  • Send an email alert to 'anyone' (a valid email will suffice).
  • Update the field of a particular record on an object
  • Send an outbound message (for example a SOAP web service call) to an external system

When does the workflow action fire?

So a workflow can do all these things. That is great! But when? Based on what? It is nice to have the ability to send emails and update fields, but what determines or triggers these actions? Well, the answer is: the Workflow Rule.

The workflow rule is the main container in which all these actions are defined and scheduled to execute. They execute only when certain workflow 'evaluation criteria' and workflow 'rule criteria' are met.

That is, the workflow rule basically watches an object (it can only watch one object at a time), and when the evaluation criteria on the watched object and the rule criteria (a condition) are met, bang!... it activates (fires) the workflow actions and does all the things a workflow can do, like sending emails, updating fields and so on. It sounds a lot like a database trigger, doesn't it?

The 'evaluation criteria' tells the workflow rule when to watch the custom object.

The 'rule criteria' tells the workflow rule when to fire the events and based on what condition.

The 'evaluation criteria' and 'rule criteria' together must be met successfully for the workflow rule to fire.

Now, for the workflow rule's 'evaluation criteria' we have predetermined choices, and these are:

1. When a record is created, or when a record is edited and did not previously meet the rule criteria (the rule fires when a new record meets the criteria, and again whenever an edit changes a record from not meeting the criteria to meeting them; edits to records that already met the criteria are ignored)
2. Only when a record is created (this ignores updates to existing records)
3. Every time a record is created or edited (this triggers the rule on every save, as long as the rule criteria are met)

These are the only times we can choose for a workflow to be evaluated. Remember, evaluation is not enough; the 'rule criteria' have to be met as well for the workflow actions to fire!

For setting the 'rule criteria' we have two options: a filter-like setup, i.e. a condition on the watched/associated object (for example, fire the workflow when the Status field of the object equals Open, Status = Open), or a formula expression that evaluates to true.

So to summarise: we can send emails, update fields and make outbound web service calls from Force.com using workflow. It is easy to set up, it is based on an object, and both the evaluation criteria and the rule criteria must be satisfied for the workflow to fire. A quite handy piece of work from Salesforce, so you don't need to write your own workflow engine any more! The 'No Software' mantra used in Salesforce's adverts suits this well.

What are you waiting for? Log in to your free developer.force.com account, go to Setup > Create > Workflow & Approvals > Workflow Rules and start practising.