Quite naturally, many organizations over-rate the quality of their enterprise and corporate performance management (EPM/CPM) practices and systems. In reality, those practices fall short in how comprehensive and how integrated they are. For example, when you ask executives how well they measure and report costs or non-financial performance measures, most proudly boast that they are very good. Yet this conflicts with surveys in which anonymous replies from mid-level managers candidly score the same practices as "needs much improvement."

Not every organization can be above average!

What makes exceptionally good EPM systems exceptional?

Let's not attempt to play sociologist or psychologist and explain the incongruity between executives boasting superiority while anonymously answered surveys reveal inferiority. Instead, let's simply describe the full vision of an effective EPM system that organizations should aspire to.

First, we need to clarify some terminology and related confusion. EPM is neither solely a system nor solely a process. It is instead the integration of multiple managerial methods – most of which have been around for decades, arguably since before there were computers. EPM is also not just a CFO initiative with a bunch of scorecard and dashboard dials; it is much broader. Its purpose is not to monitor the dials but to move the dials.

What makes for exceptionally good EPM is when the multiple managerial methods are not only individually effective but also seamlessly integrated and enhanced through embedded analytics of all flavors. Examples of using analytics to enhance EPM include data segmentation, clustering, regression, and correlation analysis.
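To make those analytics flavours a little more concrete, here is a minimal sketch in Python (pandas and scikit-learn), using invented cost-centre figures purely for illustration, that runs a simple clustering, regression and correlation over the same small data set:

```python
# Hypothetical example: clustering, regression and correlation on invented cost-centre data.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

# Invented sample data: spend and headcount per cost centre
df = pd.DataFrame({
    "cost_centre":  ["CC10", "CC20", "CC30", "CC40", "CC50", "CC60"],
    "headcount":    [12, 45, 8, 60, 15, 50],
    "annual_spend": [1.1e6, 3.9e6, 0.7e6, 5.2e6, 1.4e6, 4.4e6],
})

# Segmentation / clustering: group cost centres into two spend profiles
df["segment"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    df[["headcount", "annual_spend"]]
)

# Regression: how strongly does headcount drive spend?
model = LinearRegression().fit(df[["headcount"]], df["annual_spend"])
print("spend per head (slope):", model.coef_[0])

# Correlation analysis between the two measures
print("correlation:", df["headcount"].corr(df["annual_spend"]))
```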

EPM is like musical instruments in an orchestra

I like to think of the various EPM methods as analogous to the musical instruments in an orchestra. An orchestra's conductor does not raise their baton to the strings, woodwinds, percussion, and brass and say, "Now everyone play loud." They seek balance and guide the composer's fluctuations in harmony, rhythm and tone.

Here are my six main groupings of the EPM methods – its musical instrument sections:

SAP BPC's EPM Add-In is what delivers the BPC user interface to the Microsoft suite of products. This means that users of BPC can open MS Excel and connect to SAP BPC via the EPM Add-In.

This is just brilliant, because it allows our target audience – the finance community – to leverage their Excel skills when working in BPC. A harmonious relationship indeed! Not to mention a key attribute of the product that accountants shower with praise.

The problem is that we are now introducing two software vendors into the mix: SAP, representing BPC, and of course Microsoft, representing Excel.

Several customers have called, panic-stricken, describing an issue I see all too often. It goes something like this:

“Yesterday, my EPM Add-In was working just fine, today, certain buttons and features are not working or ‘greyed’ out”

This can usually be attributed to Microsoft releasing updates for its software. These updates run in the background, unbeknownst to most users. The Basis or IT team sometimes even pushes batch updates to the group of computers on the same network.

This is what breaks your SAP BPC EPM Add-In. There is hope, however, as SAP are quick to react. Here is the resolution to this latest occurrence:

SAP Note 2107965 best describes the symptoms – caveat: it does require an S-number to access.

Upon easing into my morning routine (Americano; cold milk; one sugar), I was slapped in the face by the date. We are in the last quarter of 2013, fast approaching year end and the holiday season – almost time for the 'year in review' series of articles that will receive front-page glory.

Another task that never goes amiss and has become somewhat habitual for us all is my loyal affair with a particular piece of software: starting up, logging in and, after some work behind the wheel, logging off. That software is called BPC.

This got me thinking: we spend so much time with the product, but how many of us actually know its origins and the history behind the technology? To better understand the application software, let's start with BPC's story in commerce:

"OutlookSoft Corporation was a Stamford, Connecticut based software company. Along with OutlookSoft, other vendors in the performance management space include SAS, Cognos and Business Objects. In 2006, OutlookSoft was named in the top 25% in Deloitte's Technology Fast 500 and in the same year, was sued by competitor, Hyperion, over software patents, winning a dismissal in October 2006 as the jury found no patent infringement and subsequently ruled the patents ineligible for claim. In June 2007, SAP AG announced its proposed acquisition of OutlookSoft as a part of its strategy to challenge Oracle Corporation. OutlookSoft's largest product was revamped as SAP BPC (Business Planning and Consolidation) after acquisition by SAP AG."

So then, BPC was OutlookSoft, which really was an application developed to leverage Microsoft's BI stack. This is where the history lesson begins. Business Intelligence in the Microsoft world can attribute its success to the concept of dimensional modeling. This is where we find ideas such as cubes, measures, dimensions and tables modeled to present end users with optimal performance when querying data. An acronym that embodies most of these elements, and that is often used in tech jargon, is OLAP (online analytical processing).
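As a loose illustration of those ideas (not how BPC itself stores data), the following sketch uses invented figures to show a measure ("amount") keyed by dimensions and rolled up the way an OLAP query would slice it:

```python
# Illustrative only: a tiny "cube" as rows of a measure keyed by dimensions,
# aggregated the way an OLAP query would slice and roll them up.
import pandas as pd

facts = pd.DataFrame({
    "entity":  ["US01", "US01", "DE01", "DE01"],            # dimension
    "account": ["Revenue", "COGS", "Revenue", "COGS"],       # dimension
    "period":  ["2013.10", "2013.10", "2013.10", "2013.10"], # dimension
    "amount":  [500.0, -300.0, 420.0, -250.0],               # measure
})

# Roll the measure up across the entity dimension, by account and period
summary = facts.pivot_table(index="account", columns="period",
                            values="amount", aggfunc="sum")
print(summary)
```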

A pioneer in this field is Ralph Kimball, often referred to as the architect of data warehousing (http://en.wikipedia.org/wiki/Ralph_Kimball). It is the Kimball approach to modeling that the broader audience uses in solution design for BPC. Below, we find the 10 Essential Rules of Dimensional Modeling according to Kimball University (http://www.kimballgroup.com/):

Rule #1: Load detailed atomic data into dimensional structures.

Dimensional models should be populated with bedrock atomic details to support the unpredictable filtering and grouping required by business user queries. Users typically don't need to see a single record at a time, but you can't predict the somewhat arbitrary ways they'll want to screen and roll up the details. If only summarized data is available, then you've already made assumptions about data usage patterns that will cause users to run into a brick wall when they want to dig deeper into the details. Of course, atomic details can be complemented by summary dimensional models that provide performance advantages for common queries of aggregated data, but business users cannot live on summary data alone; they need the gory details to answer their ever-changing questions.
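A toy illustration of the point, with invented order rows: any grouping can be answered from the atomic grain, whereas a pre-summarised table closes those doors.

```python
# Sketch (invented data): atomic fact rows can be rolled up any way users ask;
# a pre-summarised table cannot be drilled back down.
import pandas as pd

atomic = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "store":    ["North", "North", "South", "South"],
    "product":  ["A", "B", "A", "B"],
    "quantity": [10, 5, 7, 3],
})

# Any grouping the business asks for can be answered from the atomic grain...
print(atomic.groupby("store")["quantity"].sum())
print(atomic.groupby("product")["quantity"].sum())

# ...but if only this summary had been loaded, the by-product question is unanswerable.
summary_only = atomic.groupby("store", as_index=False)["quantity"].sum()
```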

Rule #2: Structure dimensional models around business processes.

Business processes are the activities performed by your organization; they represent measurement events, like taking an order or billing a customer. Business processes typically capture or generate unique performance metrics associated with each event. These metrics translate into facts, with each business process represented by a single atomic fact table. In addition to single process fact tables, consolidated fact tables are sometimes created that combine metrics from multiple processes into one fact table at a common level of detail. Again, consolidated fact tables are a complement to the detailed single-process fact tables, not a substitute for them.

Rule #3: Ensure that every fact table has an associated date dimension table.

The measurement events described in Rule #2 always have a date stamp of some variety associated with them, whether it's a monthly balance snapshot or a monetary transfer captured to the hundredth of a second. Every fact table should have at least one foreign key to an associated date dimension table, whose grain is a single day, with calendar attributes and nonstandard characteristics about the measurement event date, such as the fiscal month and corporate holiday indicator. Sometimes multiple date foreign keys are represented in a fact table.
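As a rough sketch of the idea, with an invented schema: the fact rows carry a foreign key into a day-grain date dimension whose calendar attributes drive the filtering and grouping.

```python
# Sketch (invented schema): every fact row carries a foreign key into a
# day-grain date dimension that holds calendar attributes such as fiscal month.
import pandas as pd

date_dim = pd.DataFrame({
    "date_key":      [20131001, 20131002],
    "calendar_date": ["2013-10-01", "2013-10-02"],
    "fiscal_month":  ["2014-P01", "2014-P01"],   # invented fiscal calendar
    "holiday_flag":  ["N", "N"],
})

sales_fact = pd.DataFrame({
    "date_key": [20131001, 20131001, 20131002],  # FK to date_dim
    "amount":   [100.0, 250.0, 80.0],
})

# Joining through the date key lets users filter and group by any calendar attribute
print(sales_fact.merge(date_dim, on="date_key").groupby("fiscal_month")["amount"].sum())
```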

Rule #4: Ensure that all facts in a single fact table are at the same grain or level of detail.

There are three fundamental grains to categorize all fact tables: transactional, periodic snapshot, or accumulating snapshot. Regardless of its grain type, every measurement within a fact table must be at the exact same level of detail. When you mix facts representing multiple levels of granularity in the same fact table, you are setting yourself up for business user confusion and making the BI applications vulnerable to overstated or otherwise erroneous results.

Rule #5: Resolve many-to-many relationships in fact tables.

Since a fact table stores the results of a business process event, there's inherently a many-to-many (M:M) relationship between its foreign keys, such as multiple products being sold in multiple stores on multiple days. These foreign key fields should never be null. Sometimes dimensions can take on multiple values for a single measurement event, such as the multiple diagnoses associated with a health care encounter or multiple customers with a bank account. In these cases, it's unreasonable to resolve the many-valued dimensions directly in the fact table, as this would violate the natural grain of the measurement event. Thus, we use a many-to-many, dual-keyed bridge table in conjunction with the fact table.
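Here is a small, hypothetical sketch of the bridge-table pattern described above, using invented health-care rows; a production design would typically also carry a weighting factor on the bridge to avoid double counting when summing facts across the multi-valued dimension.

```python
# Sketch (invented data): a dual-keyed bridge table resolves a multi-valued
# dimension (several diagnoses per encounter) without changing the fact grain.
import pandas as pd

encounter_fact = pd.DataFrame({
    "encounter_key":       [1, 2],
    "diagnosis_group_key": [10, 20],   # FK to the bridge, not to a single diagnosis
    "charge_amount":       [900.0, 150.0],
})

diagnosis_bridge = pd.DataFrame({      # many-to-many, dual-keyed bridge
    "diagnosis_group_key": [10, 10, 20],
    "diagnosis_key":       [101, 102, 103],
})

diagnosis_dim = pd.DataFrame({
    "diagnosis_key": [101, 102, 103],
    "diagnosis":     ["Hypertension", "Diabetes", "Fracture"],
})

# Reach the individual diagnoses through the bridge without altering the fact grain
joined = (encounter_fact
          .merge(diagnosis_bridge, on="diagnosis_group_key")
          .merge(diagnosis_dim, on="diagnosis_key"))
print(joined[["encounter_key", "diagnosis", "charge_amount"]])
```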

Rule #6: Resolve many-to-one relationships in dimension tables.

Hierarchical, fixed-depth many-to-one (M:1) relationships between attributes are typically denormalized or collapsed into a flattened dimension table. If you've spent most of your career designing entity-relationship models for transaction processing systems, you'll need to resist your instinctive tendency to normalize or snowflake a M:1 relationship into smaller subdimensions; dimension denormalization is the name of the game in dimensional modeling.

It is relatively common to have multiple M:1 relationships represented in a single dimension table. One-to-one relationships, like a unique product description associated with a product code, are also handled in a dimension table. Occasionally many-to-one relationships are resolved in the fact table, such as the case when the detailed dimension table has millions of rows and its roll-up attributes are frequently changing. However, using the fact table to resolve M:1 relationships should be done sparingly.
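A minimal sketch of such a flattened dimension, with an invented product hierarchy: brand, category and department all live as columns on one denormalised row.

```python
# Sketch (invented schema): the brand -> category -> department hierarchy is
# collapsed (denormalised) into one flat product dimension, not snowflaked.
import pandas as pd

product_dim = pd.DataFrame({
    "product_key":  [1, 2, 3],
    "product_name": ["Cola 330ml", "Cola 1L", "Crisps 50g"],
    "brand":        ["FizzCo", "FizzCo", "CrunchCo"],          # M:1 to product
    "category":     ["Soft Drinks", "Soft Drinks", "Snacks"],  # M:1 to brand
    "department":   ["Beverages", "Beverages", "Food"],        # M:1 to category
})

# Users can roll up at any level of the hierarchy without extra joins
print(product_dim.groupby("department")["product_key"].count())
```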

Rule #7: Store report labels and filter domain values in dimension tables.

The codes and, more importantly, associated decodes and descriptors used for labeling and query filtering should be captured in dimension tables. Avoid storing cryptic code fields or bulky descriptive fields in the fact table itself; likewise, don't just store the code in the dimension table and assume that users don't need descriptive decodes or that they'll be handled in the BI application. If it's a row/column label or pull-down menu filter, then it should be handled as a dimension attribute.

Though we stated in Rule #5 that fact table foreign keys should never be null, it's also advisable to avoid nulls in the dimension tables' attribute fields by replacing the null value with "NA" (not applicable) or another default value, determined by the data steward, to reduce user confusion if possible.

Rule #8: Make certain that dimension tables use a surrogate key.

Meaningless, sequentially assigned surrogate keys (except for the date dimension, where chronologically assigned and even more meaningful keys are acceptable) deliver a number of operational benefits, including smaller keys which mean smaller fact tables, smaller indexes, and improved performance. Surrogate keys are absolutely required if you're tracking dimension attribute changes with a new dimension record for each profile change. Even if your business users don't initially visualize the value of tracking attribute changes, using surrogates will make a downstream policy change less onerous. The surrogates also allow you to map multiple operational keys to a common profile, plus buffer you from unexpected operational activities, like the recycling of an obsolete product number or acquisition of another company with its own coding schemes.
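To illustrate with invented rows: the same operational product code carries two history rows, each with its own meaningless surrogate key, so attribute changes can be tracked over time.

```python
# Sketch (invented data): surrogate keys let one operational product code hold
# two history rows when an attribute (category) changes.
import pandas as pd

product_dim = pd.DataFrame({
    "product_surrogate_key": [1, 2],                 # assigned sequentially, no business meaning
    "operational_code":      ["P-1001", "P-1001"],   # same natural key across both rows
    "category":              ["Snacks", "Health Foods"],
    "row_effective_date":    ["2012-01-01", "2013-07-01"],
    "row_current_flag":      ["N", "Y"],
})

# Facts reference the surrogate key in force at the time of the event,
# so history is preserved even though the operational code never changed.
print(product_dim)
```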

Rule #9: Create conformed dimensions to integrate data across the enterprise.

Conformed dimensions (otherwise known as common, master, standard or reference dimensions) are essential for enterprise data warehousing. Managed once in the ETL system and then reused across multiple fact tables, conformed dimensions deliver consistent descriptive attributes across dimensional models and support the ability to drill across and integrate data from multiple business processes. The Enterprise Data Warehouse Bus Matrix is the key architecture blueprint for representing the organization's core business processes and associated dimensionality. Reusing conformed dimensions ultimately shortens the time-to-market by eliminating redundant design and development efforts; however, conformed dimensions require a commitment and investment in data stewardship and governance, even if you don't need everyone to agree on every dimension attribute to leverage conformity.
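A toy sketch of drilling across with a conformed dimension, using invented customers: because both fact tables share the same customer key, their results line up side by side.

```python
# Sketch (invented data): one conformed customer dimension reused by two fact
# tables supports "drill across" -- comparing orders and payments per customer.
import pandas as pd

customer_dim  = pd.DataFrame({"customer_key": [1, 2], "customer": ["Acme", "Globex"]})
orders_fact   = pd.DataFrame({"customer_key": [1, 1, 2], "order_amount": [100.0, 50.0, 70.0]})
payments_fact = pd.DataFrame({"customer_key": [1, 2], "paid_amount": [120.0, 40.0]})

orders   = orders_fact.groupby("customer_key")["order_amount"].sum()
payments = payments_fact.groupby("customer_key")["paid_amount"].sum()

# Because both facts share the same conformed customer_key, the results align
print(customer_dim.set_index("customer_key").join([orders, payments]))
```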

Rule #10: Continuously balance requirements and realities to deliver a DW/BI solution that's accepted by business users and that supports their decision-making.

Dimensional modelers must constantly straddle business user requirements along with the underlying realities of the associated source data to deliver a design that can be implemented and that, more importantly, stands a reasonable chance of business adoption. The requirements-versus-realities balancing act is a fact of life for DW/BI practitioners, whether you're focused on the dimensional model, project strategy, technical/ETL/BI architectures or deployment/maintenance plan.

Development in a model (previously called an Application) with poor architecture can be a frustrating task for any BPC professional. Complex architecture impedes performance, maintenance and consultant desirability. Sticking to the basics in terms of what is needed is most certainly the quickest route to delivering a solution.

In this short blog I'd like to outline the system prerequisites for the architecture of a bare-bones legal consolidation:

Environment

Models:

Legal Consolidation - This can be classified as the framework for the consolidation, where the bulk of the work is done

Rate - This model houses data pertinent to exchange rates and currency translation

Ownership - This model is where we store the data required for the consolidating roll-up or structure, necessary for the Ownership Manager (previously the Dynamic Hierarchy Editor, DHE)

(Note: An intercompany model is seen more often than not; however, it is not a system requirement for the setup of a legal consolidation. This model is used to match and pair intercompany transactions within the organizational structure.)

Dimensions

Legal Consolidation: Account, Category, Entity, Time, Flow/Movement, RptCurrency, Intercompany, AuditID, Consolidation Group

Rate: Account, Category, Entity, Time, Input Currency

Ownership: Account, Category, Entity, Time, Intercompany, Consolidation Group

You may notice that each model shares the same first four dimensions. These dimensions are the basic system requirement for any model type and can be easily remembered using the acronym A(ccount).C(ategory).E(ntity).T(ime).
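Expressed as plain Python sets (a sketch built directly from the lists above), the intersection of the three models' dimensions is exactly that A.C.E.T. core:

```python
# The dimension sets from the lists above; their intersection is the A.C.E.T. core.
legal_consolidation = {"Account", "Category", "Entity", "Time", "Flow/Movement",
                       "RptCurrency", "Intercompany", "AuditID", "Consolidation Group"}
rate = {"Account", "Category", "Entity", "Time", "Input Currency"}
ownership = {"Account", "Category", "Entity", "Time", "Intercompany", "Consolidation Group"}

print(legal_consolidation & rate & ownership)  # {'Account', 'Category', 'Entity', 'Time'}
```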

Hope this helps, and I'd invite correction if I have misinterpreted the above in any way.

EPM projects more often than not have numerous stakeholders who contribute to a successful implementation.

The following learnings, I believe, should be highlighted as integral and generically applicable to most, if not all, EPM project implementations. These findings are explored more deeply in the SAP Insider hosted webinar, "Enterprise Performance Management: The Key to Predicting the Unpredictable". I will attempt to attach the presentation in a bid to credit authorship; this is, however, my debut blog on the new and exciting SCN, so please forgive any functionality oversights.

Think big but start small:

Rather than implementing point solutions in the EPM space, an organization should map out a desired end-state architecture and roadmap that will support the required business processes

Design with the full Enterprise Performance Management process in mind, not just Planning, Forecasting or Consolidation alone

Remember, the objective is to provide better insight and business decision-making capability, in addition to making the process more efficient

Consider Timing

Consider the timing of the BPC implementation – particularly if BPC is being implemented as part of a broader SAP implementation

The timing of end-user training and deployment for planning and budgeting is likely to be different. Deploying BPC ahead of core financial systems to align with the budget cycle is feasible but brings its own particular risks and challenges, so it should be evaluated carefully

Ensure dependencies on other departments and projects are identified and well managed e.g. CoA re-design, ECC implementation, or BI (NW or SQL technical upgrade)

Buy In

Ensure your project has achieved the appropriate stakeholder buy-in and agreement with regards to the business case, scope and roll-out

This approach helps prevent unnecessary delays to mobilization and ensures pro-active resolution of potential issues, e.g. speed of delivery vs. scope, or potential blockers such as landscape freezes

Quick Wins

Identify achievable quick-wins to get under-the-belt before embarking on a more ambitious program. This approach has the following advantages:

The business doesn’t have to wait a prolonged period to see returns on investment

Builds confidence that EPM solutions deliver tangible benefits

Allows time for the solution to bed-in and for the business to plan extensions and improvements at a manageable pace

Interim process improvements can bring big benefits while the project is in progress

Scope Control

Flexibility is one of BPC's greatest strengths but a project's worst enemy. Enforce a good change control process. In integrated systems, even small changes in BPC can have far-reaching impacts, upstream and downstream

Whilst BPC consolidation solutions can be scoped with relative ease, planning, budgeting and forecasting projects present more of a challenge.

The project should account for a contingency budget to deal with the “unknown unknowns”

In conclusion, I think these generic project practices are well worth applying in order to improve the odds of EPM project success.