Pages

Friday, January 31, 2014

Dear colleagues, Infratects Top Gun 2014 is coming! I will be there with all my colleagues and other speakers to share an amazing experience with you. As last year, our keynote speaker will be Kash Mohammadi, VP Product Development EPM/BI at Oracle, who will be covering the latest Oracle EPM developments. This year we will be presenting FDMEE, including a live demo workshop.

Friday, January 24, 2014

I recently got a question regarding the following scenario: our file has periods in rows, so how can we load it through a single POV?

In Classic FDM everything was a bit more flexible, as you had control of almost everything. You could easily add a custom dimension for your periods and then adjust the Export Action script to export the custom period column instead of the POV period.

My aim is to import the source column with periods (the first column) into FDMEE and then replace the POV Period with the mapped values for the imported periods in the .DAT file exported to HFM.

Adding a Lookup dimension for Source Period
Lookup dimensions can be added to target applications and assigned to data columns. They exist for FDMEE purposes only, therefore they won't be exported. They can be used in many different ways. Today we will use a Lookup dimension to import the source period, map it, and export it. How?

I have added the Lookup dimension Period_Custom and assigned it to data column UD10.

Importing Source Period into Period_Custom dimension
Lookup dimensions can be configured in Import Formats like any other dimension:

When I run my Data Load Rule for POV March-2006, I can see a column for Period_Custom dimension:

In this example, my mappings are * to * although you can create any kind of mapping for the lookup dimension.

If we export this data into HFM we can see how the POV Period March-2006 is exported... we don't want this!

Obviously HFM will show 6000 for March:

Replacing POV Period by Source Period in the DAT file
If you take a look at the export file, you will not see the source period anywhere, so there is nothing to find and replace when going through its lines.
The first thing to figure out is how to export the source period without modifying the HFM_EXPORT.vbs action script :-)
One approach would be:
- Before the file is exported...concatenate Source Period to Target Account
- After the file is exported...go through the file and replace POV Period by Source Period
- Don't forget to remove Source Period from Target Account in the DAT file...
I'm sure you can find other approaches but this one lets me show more :-)

How do we concatenate Source Period to Target Account?
At database level, this means concatenating two columns of table TDATASEG: UD10X and ACCOUNTX. Why UD10X? We said we can also have mappings for the source period, so let's take the target values. The update statement would be something like:

UPDATE TDATASEG
SET ACCOUNTX = UD10X || '_' || ACCOUNTX
WHERE LOADID = ? AND ACCOUNTX IS NOT NULL AND UD10X IS NOT NULL

How do we perform this operation before the .DAT file is exported? We can use the Event Script BefExportToDat.
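To see what this UPDATE actually does, here is a minimal, self-contained sketch using an in-memory SQLite table as a stand-in for TDATASEG (only the column names come from the post; the LOADID value and sample rows are made up):

```python
import sqlite3

# In-memory stand-in for TDATASEG with only the relevant columns
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TDATASEG (LOADID INTEGER, UD10X TEXT, ACCOUNTX TEXT)")
conn.executemany(
    "INSERT INTO TDATASEG VALUES (?, ?, ?)",
    [(99, "March", "Sales"), (99, "April", "Sales"), (99, None, "Sales")],
)

# Prefix the target account with the mapped source period (UD10X)
conn.execute(
    """UPDATE TDATASEG
       SET ACCOUNTX = UD10X || '_' || ACCOUNTX
       WHERE LOADID = ? AND ACCOUNTX IS NOT NULL AND UD10X IS NOT NULL""",
    (99,),
)

rows = [r[0] for r in conn.execute("SELECT ACCOUNTX FROM TDATASEG ORDER BY ROWID")]
print(rows)  # ['March_Sales', 'April_Sales', 'Sales']
```

In the real BefExportToDat event script this statement would be executed through the fdmAPI object against the FDMEE repository rather than SQLite; note that the row where UD10X is NULL is deliberately left untouched by the WHERE clause.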

You can also see how I have added some debug information to my process log (Outbox\Logs). This is highly recommended, as it lets you debug your code easily.

Note: I found an issue when running the UPDATE statement using LOADID as a parameter. I'm still figuring out if this is a bug. You can use PARTITIONKEY instead. I will keep you posted.

Update: the issue is not related to the SQL query itself but to the fact that if we run the EXPORT step separately from the IMPORT step, the LOADID column in table TDATASEG will only be updated during the EXPORT and is therefore not available in the BefExportToDat event script. Basically, if we run the steps separately (from the Workbench), LOADID will be different for each step. If we run the DLR from the Data Load Rule page and select Import + Export, then both steps will have the same LOADID and the SQL query with LOADID in the WHERE clause will work as expected. An easier solution is to modify the SQL UPDATE and use POV information in the WHERE clause (RULE_ID, PARTITIONKEY, CATKY, PERIODKEY):

How do we replace POV Period and remove prefix added to Target Account?
We can go through all lines in the .DAT file and use split + Python Slice Notation to get the prefix and build the final line with custom period and target account in the correct format.
Actually we will be creating a new .DAT file (with the same name), so we first back up the original one.

Once the file is secured we will go through all lines and build the new line:
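Putting the backup and line-rewriting steps together, a minimal sketch might look like this (pure Python; the semicolon-delimited field layout below is a simplified, hypothetical example, not the exact HFM .DAT format):

```python
import os
import shutil

def rewrite_dat(path, pov_period):
    """Replace the POV period with the source period carried in the
    'SourcePeriod_Account' prefix, and restore the clean account name.
    Assumed (hypothetical) layout: Scenario;Year;Period;View;Entity;Value;Account;Amount
    """
    backup = path + ".bak"
    shutil.copyfile(path, backup)                    # secure the original file first
    new_lines = []
    with open(backup) as f:
        for line in f:
            fields = line.rstrip("\n").split(";")
            if len(fields) > 6 and fields[2] == pov_period:
                pos = fields[6].find("_")
                if pos > 0:
                    fields[2] = fields[6][:pos]      # slice: source period prefix
                    fields[6] = fields[6][pos + 1:]  # slice: original account
            new_lines.append(";".join(fields))
    with open(path, "w") as f:
        f.write("\n".join(new_lines) + "\n")
    os.unlink(backup)                                # remove backup on success

# Example usage with a one-line file
with open("export.dat", "w") as f:
    f.write("Actual;2006;March;Periodic;Texas;<Entity Currency>;Jan_Sales;6000\n")
rewrite_dat("export.dat", "March")
print(open("export.dat").read().strip())
# Actual;2006;Jan;Periodic;Texas;<Entity Currency>;Sales;6000
```

In FDMEE this logic would live in an event script run after the export, with the file path taken from the context rather than hard-coded.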

And finally, after the file is produced we have to remove the prefix from ACCOUNTX:
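The cleanup on ACCOUNTX could be another parameterized UPDATE; here is a hedged sketch (SQLite again standing in for the FDMEE repository, and note that SUBSTR/INSTR syntax varies across databases):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TDATASEG (LOADID INTEGER, ACCOUNTX TEXT)")
conn.execute("INSERT INTO TDATASEG VALUES (99, 'March_Sales')")

# Strip everything up to and including the first '_' from ACCOUNTX
conn.execute(
    """UPDATE TDATASEG
       SET ACCOUNTX = SUBSTR(ACCOUNTX, INSTR(ACCOUNTX, '_') + 1)
       WHERE LOADID = ? AND INSTR(ACCOUNTX, '_') > 0""",
    (99,),
)
account = conn.execute("SELECT ACCOUNTX FROM TDATASEG").fetchone()[0]
print(account)  # Sales
```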

Now it's time to place our scripts in FDMEE:

Running the FDMEE workflow... exporting the new file

All the operations shown above are transparent to the end user, as they happen during the Export step. Therefore the Workbench grid will look the same as before, but we can see the differences in the .DAT file being exported and in the final data in HFM:

Our new .DAT file now has source periods instead of the POV Period, as we were expecting, and we can check the data in HFM:

Well, as I said, this is just one of multiple solutions, so take your time to think. Also take into consideration any other integration requirements you may have.

Tuesday, January 21, 2014

Hi all,
in the last few months I noticed that there are lots of people who have no clue about Account Reconciliation Manager, aka ARM. In this post series I will try to give you a brief introduction to the product... but always from the FDMEE side :-)

We will try to answer questions like:
- What is ARM? What is it used for?
- What is the role of FDMEE when loading balances into ARM?
- ...

Financial Monthly Cycle
To understand what ARM is, I should start by talking about the Financial Monthly Cycle. I'm pretty sure the concept varies slightly from one company to another, but the essence should be the same.
So what does the financial monthly cycle of my company AKAFDMEE look like?

Our daily business activity transactions are grouped by activity/time (GL Debit/Credit). At the end of the day, we have a huge volume of transactional data. Where is this data stored? My company is big, so we have many financial systems to facilitate the management of all that transactional data. All my transaction systems feed into common General Ledger systems... you may know some of them: JD Edwards, E-Business Suite, PeopleSoft, SAP, etc.

We need to consolidate our business activity with other business activities and business units. For that, we have a tool called Hyperion Financial Management (HFM).

Mmm... all my data is now consolidated, and we have to deliver all the results, dashboards, financial statements, etc. to all stakeholders. Luckily, we have Hyperion Financial Reporting (FR).

And now what? We would need to reconcile all the reports, the data, and the balances from the different GL systems... manual process required :-(

Going back to Hyperion, there was no reconciliation tool until 11.1.2.2, when the new Account Reconciliation Manager was released as part of the Hyperion Financial Close Management suite (FCM).

Account Reconciliation Manager - ARM

Part of Hyperion FCM Suite

Reconciles the final data, with the source data, down to the transactional level

Facilitates the use of multiple currencies

Facilitates the management of balances in larger accounts

ARM follows a reconciliation workflow that can be depicted as follows:

The objective of ARM is to reconcile balances from a Source System by comparing them with balances from Sub-System(s).

To summarize, we want to verify that a specific balance is correct by reconciling accounts back to their respective source systems. It sounds easy, doesn't it?

ARM - System and Sub-System

The following two examples show how ARM can be used in the business.

Example 1: reconcile EBS GL Balances with EBS Account Receivables Balances

- What do we want to verify? EBS GL Balances

- Source System = EBS GL

- Sub-System = EBS AR

Example 2: reconcile HFM consolidated data with EBS GL Balances

- What do we want to verify? HFM Data

- Source System = Consolidated data extracted from HFM (EAL?, EA?...)

- Sub-System = EBS GL Balances

FDMEE-ARM Integration

Why do we need FDMEE? Easy one... FDMEE will be used to load balances from the Source System and Sub-System(s) into ARM so they can be reconciled.

Friday, January 10, 2014

When loading data into HFM there is an option to validate HFM data intersections before the load happens. This validation takes place during the Validate step, just after FDMEE validates that all source dimension members have been mapped. The main objective is to capture errors before loading data into HFM, and therefore avoid partial loads.

Checks performed
As a result of the validation we can get:

Invalid Member: the mapped member (target member) does not exist in HFM, or the user does not have access to the related security class.

Cannot Write: the data cell (intersection of dimensions) contains a "no input" rule, is on an aggregated level (parent), or its data has been promoted to a process level with the read-only attribute.

Locked Intersection: the period has not been started through process management, or the period has been locked.

Differences between FDM and FDMEE

Main differences between Classic FDM and FDMEE are:

Name: in FDM the report was named "Intersection Validation Report", while in FDMEE it has been renamed to "Intersection Check Report".

Format: in FDM the report was generated in XML format and used XSL to format the report. The user could hover the mouse over errors in order to get details and instructions/tips for fixing them, and could also click on errors to navigate to valid members. In FDMEE, although the report is stored as an XML file, it is generated in PDF format, which has no hover capability. The XSL transformation has been removed in FDMEE.

Report: in FDM there was only one type of report, while in FDMEE we have two types: one report for the typical HFM application with 4 custom dimensions, and another for HFM applications which have more than 4 dimensions (Dynamic Report).

Generation: in FDM the report was generated when the user clicked on the Validate action, while in FDMEE the user can now also generate the report from the "Generate Intersection Check Report" button.

Analyzing Intersection Errors
I really like meetings/workshops where I'm sitting with both HFM administrators and those responsible for the source systems... when data is being loaded, everybody is waiting to see who is going to be blamed when errors happen.
- The HFM administrator will say to the source system administrator: Hey! You did not tell me this data was going to be extracted...
- The source system administrator will say to the HFM administrator: Hey! You did not specify which data you wanted...
- The FDMEE administrator will blame both!!! :-)

Jokes aside, the Intersection Check Report is a really good starting point for fixing issues related to HFM application design (top dimension members, intercompany accounts, etc.) and to source system extracts (data to be extracted, granularity, etc.). It's at this stage that the FDMEE administrator should be a bridge between both parties. At the end, we want everyone's life to be easier, and of course, that includes ours.

My suggestions to work on errors shown in the report are:

Check if HFM metadata is correctly configured

Check if mapping/import format is correct

Check if source data is correct

You can see that I prefer this reverse error-finding order, but you are free to follow your own approach as long as you find the issue and fix it.

If you take a look at the two RTF files you will notice that the POV members (Scenario, Year, and Period) are displayed in a different way in each report. In the "Fixed" report they are displayed on top of all columns (as in Classic FDM). On the contrary, in the "Dynamic" report they are displayed with all the other dimension columns, which are filled dynamically based on the dimensions in the HFM application.

These reports can be adjusted by using the BI Publisher Word Desktop plug-in. Once you review the XML files generated by FDMEE to produce the Intersection Check Report, you will easily see the link between the XML tags and the RTF report tags.

Query for Intersection Check Report
Don't bother taking a look at the query associated with this report. I say that because this report is generated from the API ($Oracle_Middleware\EPMSystem11R1\products\FinancialDataQuality\bin\HFM_VALIDATE.vbs) and cannot be launched from the Report Execution link. Therefore it has a dummy query associated with it.

If you try to execute the report from Report Execution link it will be empty:

Enabling Intersection Check Report
To enable the Intersection Check Report you have to set the target application option "Check Intersection" to Yes.

The default Intersection Check Report can be then specified at System or Application level:

Running Intersection Check Report
As we stated before, the report is automatically generated at the Validate step when we run either the Import step or the Validate step.

The following example shows a report where ICP member is invalid:

If we scroll down we will see that [ICP None] is the only valid member for my current HFM configuration:

We will have to review the issue and fix it before the validation succeeds. It could be a mapping issue, wrong source data, wrong HFM metadata... who knows?

If we want to re-generate the report, we don't have to click on the Validate link again (as we did in Classic FDM). We can generate it by clicking on the "Generate Intersection Check Report" button:

XML file for the Intersection Check Report
The action script HFM_VALIDATE.vbs generates an XML file which is then converted to PDF and presented to the user. This XML file is very similar to the one generated in Classic FDM.
They have slightly different XML structures with different XML tags, surely to make the RTF template easier (the RTF report is filled from the XML and then converted to PDF).
So here you have another difference between Classic FDM and FDMEE: in FDMEE the Intersection Check Report has a report template (RTF) associated with it, while in Classic FDM there was no Crystal report for it. The XML was just generated, transformed using XSL, and displayed to the user in Internet Explorer.
The XML file can be found in the outbox folder of our target application:

If you take a look at it and compare it to an XML file generated by Classic FDM, you will notice differences such as the XML tag <xmlrow>, which identifies each line in the report, or the Classic FDM XML having a reference to the XSL which transforms it:

Creating a custom XSL file to format the XML report in FDMEE (Dynamic Report)
So... why not try to create a custom XSL to format the XML generated in FDMEE and display it the same way it was done in Classic FDM? That way I could easily navigate the XML file and see error details and tips.
The idea is to adjust the existing XSL file for the new XML tags and also make it dynamic, as the number of dimensions can vary. I had to recall my old times with XML, but in the end I got it working, so I'm sure you can do it as well.

Let's say that we have the following XML generated in FDMEE:

You can see how this is opened in IE as a normal XML.
After creating our custom XSL we have to reference it from the XML file as follows:

Then if we try to open it with IE we will see the XML formatted as Classic FDM (click on links, hover on errors, etc.):

How did I create the XSL file? I adjusted the XSL from Classic FDM, including code like:
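A hedged sketch of the dynamic part, built around the <xmlrow> tag mentioned earlier (element names other than xmlrow are guesses, not the actual template):

```xml
<!-- One HTML table row per report line -->
<xsl:for-each select="//xmlrow">
  <tr>
    <!-- Emit one cell per child element, so any number of dimensions works -->
    <xsl:for-each select="*">
      <td><xsl:value-of select="."/></td>
    </xsl:for-each>
  </tr>
</xsl:for-each>
```

The inner loop over `*` is what makes the template dynamic: it does not hard-code the dimension columns, so extra Custom dimensions simply produce extra cells.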

I suggest you review the XSL file from Classic FDM to see how it transforms XML files and then adjust it to the new XML tags. Also, bear in mind that the number of dimensions may vary, so some of your code should perform transformations dynamically as well.

Where did I place the XSL file? The XSL file needs to be accessible from anywhere, not only locally, so I chose the htdocs folder of OHS to make it available whenever the OHS service is up and running:

The XSL reference can then be added right after the <?xml version="1.0"?> declaration as follows:
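For illustration, the first two lines of the modified XML report might then look like this (the href value reuses the example path from the script shown later in the post; your folder name may differ):

```xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href=".\IVR_XLS\EPMA7DIM.xsl"?>
```

Note that the stylesheet instruction sits after the XML declaration, since the declaration must remain the first line of the file.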

Creating a Custom Script to display the XML transformed by the custom XSL file
We cannot alter FDMEE to display the transformed XML file instead of the PDF, but we can create a custom script that:
1. Prepends the XSL reference to the XML file
2. Displays the XML file
You can think of many different ways of implementing this solution. I'm lazy, so I have created a custom script with two parameters:
- Parameter 1: Location
- Parameter 2: Process Id

The first parameter is not really needed, but it is good for showing how parameters work. The second parameter has an associated query that returns all process Ids for my target HFM application (Id = 17).

The following code prepends the XSL reference to the XML file and shows the formatted XML file:

#---------------------------------------------------------------------------
# Adds to the Intersection Validation Report (XML) the first line with
# the XSL reference
# XSL File = TARGETAPPLICATION.xsl
# XML File = TARGETAPPLICATION_ProcessId_VALIDATE.xml
# The process ID and Application Name are passed as input parameters
# For example, for application EPMA7DIM, the line to be added is:
# <?xml-stylesheet type="text/xsl" href=".\IVR_XLS\EPMA7DIM.xsl"?>
# Parameters
# Param 1: Location
# Param 2: Process ID
#---------------------------------------------------------------------------

# Get value for Parameter CUSTOM_PROCESSID (Process Id)
sProcessId = fdmAPI.getCustomScriptParameterValue("CUSTOM_PROCESSID")

# Custom Scripts are registered at Target Application level so we can get the
# target HFM application from the FDMEE context
sHFMApp = fdmContext["TARGETAPPNAME"]
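The prepend-and-backup step in the middle of the script could be sketched like this (plain Python that also runs under FDMEE's Jython; apart from sbackupname, all names here are hypothetical, not the actual script's):

```python
import shutil

def prepend_xsl_reference(xml_path, xsl_href):
    """Insert an xml-stylesheet reference right after the XML declaration,
    keeping a backup copy of the original report."""
    backup = xml_path + ".bak"
    shutil.copyfile(xml_path, backup)        # backup before touching the file
    with open(backup) as f:
        lines = f.readlines()
    pi = '<?xml-stylesheet type="text/xsl" href="%s"?>\n' % xsl_href
    # The XML declaration must stay on the first line; the PI goes after it
    if lines and lines[0].startswith("<?xml "):
        lines.insert(1, pi)
    else:
        lines.insert(0, pi)
    with open(xml_path, "w") as f:
        f.writelines(lines)
    return backup

# Example usage with a dummy report file
with open("EPMA7DIM_123_VALIDATE.xml", "w") as f:
    f.write('<?xml version="1.0"?>\n<report><xmlrow/></report>\n')
sbackupname = prepend_xsl_reference("EPMA7DIM_123_VALIDATE.xml", r".\IVR_XLS\EPMA7DIM.xsl")
print(open("EPMA7DIM_123_VALIDATE.xml").readline().strip())  # <?xml version="1.0"?>
```

The backup returned here is the file that gets removed with os.unlink once the script succeeds, as shown in the snippet below.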

# Remove backup on success
try:
    os.unlink(sbackupname)
except OSError:
    pass

Running the Custom Script
We can now run the custom script from "Script Execution" link:

We have to select the Process Id from the list of processes returned by the query (entering it manually won't work) and execute the script in "Online Mode" in order for the report to be displayed. If we execute the script in "Offline Mode", the report will not be displayed.

After clicking OK we will be asked to open the XML file:

I found an issue here, but I did not have time to research it. If you open the XML file directly from the prompt window, it will open in the same window as Workspace... so save it locally and open it from there instead.

That's all, folks. With this post I tried to show you what the Intersection Check Report is, how it is generated in FDMEE, how useful it is for error solving, and finally how you can create a custom solution to display the report as it was done in Classic FDM.
I have also shown some Jython coding: opening/deleting a file, writing to a file, working with exceptions (try - except), getting input parameters, etc.