Global Currencies in Oracle BIA

If you read the Oracle documentation for Oracle BIA (7.9.6.3), you will find the following: “Currency conversions are required because your business might have transactions involving multiple currencies. To create a meaningful report, you have to use a common currency.”

Oracle BIA stores amounts in the following currencies:

– Document Currency –> Currency of the actual Transaction (Can vary in a multinational organization)
– Local Currency –> Currency defined in the Ledger
– Global Currency –> Defined in the DAC

You can use a Global Currency if you want to report all the different Currencies in one common Currency (e.g. ‘EUR’ or ‘USD’). Oracle BIA for Oracle eBS is able to report in three different Global Currencies. If you use Oracle BIA for CRM, you are able to store an additional two Global Currencies. You define the Global Currencies in the DAC.

– $$GLOBAL1_CURR_CODE
– $$GLOBAL2_CURR_CODE
– $$GLOBAL3_CURR_CODE

You will have to define a Global Rate Type for each Global Currency as well.

– $$GLOBAL1_RATE_TYPE (for the first global currency)
– $$GLOBAL2_RATE_TYPE (for the second global currency)
– $$GLOBAL3_RATE_TYPE (for the third global currency)
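
For example, a possible setup in the DAC could look like this (the values are purely illustrative; pick the Currencies and Rate Types that fit your organization):

    $$GLOBAL1_CURR_CODE = 'USD'    $$GLOBAL1_RATE_TYPE = 'Corporate'
    $$GLOBAL2_CURR_CODE = 'EUR'    $$GLOBAL2_RATE_TYPE = 'Corporate'
    $$GLOBAL3_CURR_CODE = 'GBP'    $$GLOBAL3_RATE_TYPE = 'Spot'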

DAC - Currency - Rate - Parameters

You use the above DAC Parameters ($$GLOBALn_CURR_CODE and $$GLOBALn_RATE_TYPE) for the Document Currency to Global Currency Conversion. The $$DEFAULT_LOC_RATE_TYPE-Parameter is used for the Document Currency to Local Currency Conversion.

To get a better understanding of how these Conversions work, take a look at the ‘MPLT_CURCY_CONVERSION_RATES’-Mapplet. The purpose of this Mapplet is to “find the Conversion Rate for a given Currency Code, Rate Type and Exchange Date”. You will find this Mapplet in the Out-of-the-Box SILOS-Folder.

Informatica - Mapplet - MPLT_CURCY_CONVERSION_RATES - Diagram

In the ‘MPLT_CURCY_CONVERSION_RATES’-Mapplet you will find an Expression: ‘EXPT_CALC_EXCH_RATES’. This Expression is responsible for the actual Conversion (Calculation).

Informatica - Mapplet - MPLT_CURCY_CONVERSION_RATES - EXPT_CALC_EXCH_RATES

There are a few other Mapplets with a similar name and functionality to the ‘MPLT_CURCY_CONVERSION_RATES’-Mapplet.

The Conversion for e.g. the GLOBAL1_EXCHANGE_RATE basically consists of three components (see the SQL sketch below):

– Input from a Mapping (e.g. SIL_APInvoiceDistributionFact – W_AP_INV_DIST_F)
– Lookup to the W_GLOBAL_CURR_G-Table
– Lookup to the W_EXCH_RATE_G-Table

Informatica - Mapplet - MPLT_CURCY_CONVERSION_RATES - EXPT_CALC_EXCH_RATES - Calculate Rate
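
In SQL terms, the Lookup logic for e.g. the first Global Currency boils down to something like the sketch below. Keep in mind that this is a simplification (the Mapplet uses Lookup Transformations, not a join) and that the column names are based on a typical OBIA schema, so verify them in your own environment:

    SELECT r.exch_rate                           -- becomes GLOBAL1_EXCHANGE_RATE
    FROM   w_global_curr_g g,                    -- holds $$GLOBAL1_CURR_CODE / $$GLOBAL1_RATE_TYPE
           w_exch_rate_g   r                     -- holds the conversion rates
    WHERE  r.from_curcy_cd = :doc_curr_code      -- Document Currency (input from the Mapping)
    AND    r.to_curcy_cd   = g.global1_curr_code
    AND    r.rate_type     = g.global1_rate_type
    AND    :exchange_date BETWEEN r.start_dt AND r.end_dt;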

Input from a Mapping

The ‘MPLT_CURCY_CONVERSION_RATES’-Mapplet is part of a Mapping. Let’s take the ‘SIL_APInvoiceDistributionFact’-Mapping which populates the ‘W_AP_INV_DIST_F’-Table as an example.

Informatica Mapping - SIL_APInvoiceDistributionFact

Lookup to the W_GLOBAL_CURR_G-Table

The ‘SILGlobalCurrencyGeneral_Update’-Mapping in the ‘SILOS’-Folder stores the values of the $$GLOBALn_CURR_CODE- and $$GLOBALn_RATE_TYPE-DAC-Parameters in the W_GLOBAL_CURR_G-Table.

W_GLOBAL_CURR_G - Currency - Rate

As you can see, the values in the W_GLOBAL_CURR_G-Table match the values of the DAC-Parameters. Of course you can set up these Parameters any way you like.
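
A quick sanity check is to query the Table directly (a sketch; the column names are assumed from a typical OBIA schema):

    SELECT global1_curr_code, global1_rate_type,
           global2_curr_code, global2_rate_type,
           global3_curr_code, global3_rate_type
    FROM   w_global_curr_g;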

Lookup to the W_EXCH_RATE_G-Table

The W_EXCH_RATE_G-Table resembles the GL_DAILY_RATES-Table in Oracle eBS. If the contents of this Table are incomplete or incorrect, the whole setup of the different Currencies is useless. The ‘LKP_W_EXCH_RATE’-Lookup is used to find the Conversion Rate for a given Currency Code (‘From’ and ‘To’) and Exchange Date.

Informatica - Mapplet - MPLT_CURCY_CONVERSION_RATES - LKP_W_EXCH_RATE - Complete
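
If you want to check whether a usable rate exists for a given combination, a query along these lines should do (the Currencies, Rate Type and date are just examples; column names as assumed earlier):

    SELECT exch_rate
    FROM   w_exch_rate_g
    WHERE  from_curcy_cd = 'EUR'                 -- Document Currency
    AND    to_curcy_cd   = 'USD'                 -- Global Currency
    AND    rate_type     = 'Corporate'           -- Global Rate Type
    AND    TO_DATE('15-01-2011', 'DD-MM-YYYY') BETWEEN start_dt AND end_dt;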

If you query the ‘W_AP_INV_DIST_F’-Table and you find the ‘GLOBAL1_EXCHANGE_RATE’-Column empty for a certain Transaction, the above should help you find out the reason.
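
A possible starting point is to list the Document Currencies for which the rate is missing, and check each of them against the W_EXCH_RATE_G-Table with the query shown earlier (again a sketch, with the same assumptions about column names):

    SELECT doc_curr_code, COUNT(*) AS missing_rates
    FROM   w_ap_inv_dist_f
    WHERE  global1_exchange_rate IS NULL
    GROUP  BY doc_curr_code;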

Good Luck.

Oracle BIA – Making the Full / Incremental Load work

It’s possible to configure either a Full- or an Incremental Load in Oracle BIA. If you look at the Informatica version of Oracle BIA, there are a few areas you will have to configure.

First you start with the Informatica Mapping. This is one and the same Mapping; it does not matter whether you run it Full or Incremental.

Let’s take the ‘SDE_ORA_GLJournals’-Mapping as an example. In the Source Qualifier of the Mapping (or Mapplet), you will see a reference to the $$LAST_EXTRACT_DATE-Parameter. If you run the Mapping with these settings, you run an Incremental Load. This means that you only select the data which was created or updated since the last ETL-run.

Informatica - Source Qualifier - $$LAST_EXTRACT_DATE
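
A simplified version of such an incremental extraction query could look like this. The real ‘SDE_ORA_GLJournals’ query joins several GL-Tables; the point here is the filter on the LAST_UPDATE_DATE. The date format is the one the DAC commonly uses, but check your own parameter file:

    SELECT je.je_header_id,
           je.je_line_num,
           je.last_update_date
    FROM   gl_je_lines je
    WHERE  je.last_update_date > TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS')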

The $$LAST_EXTRACT_DATE is a Parameter which you configure in the Data Warehouse Administration Console (DAC) and reference in Informatica.

DAC - Configure $$LAST_EXTRACT_DATE

The value of this Parameter comes from the DAC variable @DAC_SOURCE_PRUNED_REFRESH_TIMESTAMP, which, according to the Oracle documentation, “returns the minimum of the task’s primary or auxiliary source tables last refresh timestamp, minus the prune minutes.”

Make sure this Parameter is available in both the DAC (see above) as well as in the Mapping (or Mapplet).

Informatica - Variables and Parameters - $$LAST_EXTRACT_DATE

This way the Parameter can be used in the Extraction Mapping. If you reference a Parameter in the Extraction Mapping Query which isn’t declared, the Workflow will return an error and won’t complete.

So the steps are easy:

1. Declare the $$LAST_EXTRACT_DATE-Parameter in the DAC
2. Declare the $$LAST_EXTRACT_DATE-Parameter in Informatica
3. Reference the $$LAST_EXTRACT_DATE-Parameter in the Source Qualifier
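
At run time the DAC writes the Parameter values to an Informatica parameter file. Conceptually the relevant entry looks like this (the Folder-, Workflow- and Session names below are illustrative):

    [SDE_ORAR12_Adaptor.WF:SDE_ORA_GLJournals.ST:SDE_ORA_GLJournals]
    $$LAST_EXTRACT_DATE=01/15/2011 00:00:00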

As I said before, the same Mapping is used for the Incremental- as well as the Full-Load. If you want to run the two different loads, make sure there are two different Workflows which run the same Mapping. The difference is in the Session of each Workflow: the Full-Workflow uses the $$INITIAL_EXTRACT_DATE whereas the Incremental-Workflow uses the $$LAST_EXTRACT_DATE.

Informatica - Workflow - SDE_ORA_GLJournals

If you edit the task which belongs to the Incremental-Workflow (‘SDE_ORA_GLJournals’), you will find the Source Qualifier with the extraction query and a reference to the $$LAST_EXTRACT_DATE-Parameter.

As you can see, the LAST_UPDATE_DATE is compared to the $$LAST_EXTRACT_DATE-Parameter.

After each ETL-run, the LAST_EXTRACT_DATE (Refresh Date) per Table is stored. You can check, update or delete these values as required (see picture below). If you decide to delete the Refresh Date, a Full Load will be performed the next time.

DAC - Refresh Dates

As stated earlier, the Full-Workflow is almost identical. The only difference is the reference to the $$INITIAL_EXTRACT_DATE-Parameter, which is also defined in the DAC. You define a date in the past; just make sure that this date captures all the data you need.

DAC - Configure $$INITIAL_EXTRACT_DATE

Make sure this Parameter is available in both the DAC (see above) as well as in the Mapping (or Mapplet).

Informatica - Variables and Parameters - $$INITIAL_EXTRACT_DATE

This way the Parameter can be used in the Extraction Mapping. If you reference a parameter in the Extraction Mapping Query which isn’t declared, the Workflow will return an error and won’t complete.

How do you make sure that the $$INITIAL_EXTRACT_DATE-Parameter will be used when running a Full-Load?

Informatica - Workflow - SDE_ORA_GLJournals_Full

If you edit the task which belongs to the Full-Workflow (‘SDE_ORA_GLJournals_Full’), you will find the Source Qualifier with the extraction query and a reference to the $$INITIAL_EXTRACT_DATE-Parameter.

As you can see, the LAST_UPDATE_DATE is compared to the $$INITIAL_EXTRACT_DATE-Parameter.
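
Compared to the incremental extraction query shown earlier, the only change is the Parameter in the filter (same assumptions apply):

    WHERE  je.last_update_date > TO_DATE('$$INITIAL_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS')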

At this point everything is in place to run either a Full- or an Incremental Load.

Informatica - Workflows

You just have to tell the DAC to run either the ‘SDE_ORA_GLJournals_Full’-Workflow (Full) or the ‘SDE_ORA_GLJournals’-Workflow (Incremental).

DAC - Task - SDE_ORA_GL_Journals

Check the Informatica Session Log when the ETL produces a different result than expected. It could be that the Workflows are incorrectly defined. The Session Log shows which Parameter is used and what the value of that Parameter is.

Good Luck.

Solving the Informatica TM_6795 ERROR

When running an Execution Plan in the DAC, I ran into an error. Investigating this error led me to the Informatica Workflow Monitor. The logfile returned the following error:

“TM_6795 ERROR: Session or its instance is invalidated and the Integration Service is configured not to run impacted sessions.”

The Informatica Message Reference says the following:

***

TM_6795 The Repository Service marked the session as impacted, and the Integration Service is not configured to run impacted sessions.
User Response: Validate the session or configure the Integration Service to run impacted sessions.

***

If this is the case, check whether all the related Informatica Mappings, Mapplets, Workflows, etc. are validated and checked in. The latter did it for me: not all objects were properly checked in.