Category: Regulation

MDM is all about connecting up and managing versions of what should be shared data. Generally people worry about their internal systems, since numerous distinct systems in an enterprise typically each believe that they “own” a particular piece of master data such as customer, product or asset. However, it is also important to consider external data, i.e. data that you do not control within the enterprise but which you still need to relate to your own data. There is more of this than you might imagine, and MDM vendors would be well advised to take account of it. One example is Siperian’s recent link to Standard & Poor’s counter-party data, but there are many other cases. Purisma did such a good job of linking to D&B data that D&B bought the company.

Other types of common external data are consumer data from Acxiom or Experian, patient data from IMS, financial and news data from Reuters, and doubtless many more, some of which will be industry-specific. Mapping such data back to internal systems is an important and non-trivial job: D&B company data, for example, changes regularly, so this is more than just a one-off exercise.
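To make the mapping concrete, here is a minimal sketch in Python of how an external company feed (D&B-style, keyed by DUNS number) might be matched to internal customer records on normalised name similarity. All record structures, field names and the threshold here are illustrative assumptions, not any vendor's actual method; real matching would also use addresses, registration numbers and survivorship rules.

```python
# Hypothetical sketch: linking external company records to internal
# customer master records by normalised name similarity.
from difflib import SequenceMatcher

def normalise(name: str) -> str:
    """Crude normalisation: lower-case and strip common legal suffixes."""
    name = name.lower().strip()
    for junk in (" inc", " ltd", " plc", " corp", ".", ","):
        name = name.replace(junk, "")
    return name

def match_external(internal: list, external: list, threshold: float = 0.85) -> list:
    """Return (internal_id, duns) pairs whose names are similar enough."""
    links = []
    for ext in external:
        for rec in internal:
            score = SequenceMatcher(
                None, normalise(rec["name"]), normalise(ext["name"])
            ).ratio()
            if score >= threshold:
                links.append((rec["id"], ext["duns"]))
    return links

internal = [{"id": 1, "name": "Acme Ltd"}, {"id": 2, "name": "Globex Corp"}]
external = [{"duns": "123456789", "name": "ACME LTD."}]
print(match_external(internal, external))  # [(1, '123456789')]
```

Because the external feed changes regularly, a link table like this would need to be re-run and reconciled on every refresh, which is exactly why this is not a one-off exercise.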

MDM platform vendors would be well advised to consider building more such pre-built links to their platforms, as at present there are fairly few such links, yet customers surely will want these more and more as they deploy MDM more widely.

Another attempt at reforming the US patent system is rumbling its way through the US legislature. One sensible reform is to make patent claims valid only from the date the application is filed, unlike the current lunacy (unique to the US) which makes them valid from whenever the inventor claims he or she invented it (good luck verifying that).

More controversially, the draft senate bill looks at restricting the value of compensation that can be claimed in cases of infringement to the “novel and non-obvious” value of a product, rather than its overall value. This will irritate genuine patent holders and ambulance-chasing patent attorneys alike, while cheering up technology firms that violate patents. A good overview of the legislation can be found on Wikipedia.

It seems to me that tinkering around at the edges of the system is rather like rearranging the deckchairs on the Titanic. The US patent office needs more root-and-branch reform to restore its reputation following a series of fiascos in recent years. In my limited personal experience of the US patent office the system seemed very slow (eight years to get our patent) and we dealt with patent assessors who seemed barely able to communicate at all, never mind bring relevant knowledge or perspective. We got there in the end, but it was an agonising process, and a lot more painful than the UK equivalent (which was not exactly a racy process either, at five years before the patent was granted).

Given how important intellectual property is, it would seem wise to raise the standard of patent reviewers and improve the process itself, to avoid some of the crazy patents that have been granted in recent years. The new legislation does nothing to address this core issue.

Recent management change at RadioShack reported on CFO.com shows just how important it is for CFOs to be able to produce accounts that they can confidently sign off in today’s stricter regulatory environment. An incoming CFO needs to feel absolutely certain that the books are in pristine shape, and may have to produce historical financial information from systems that he or she did not implement and is unfamiliar with. This is particularly true when, as at RadioShack, the CFO has to deliver reports having been in the job for less than half the fiscal year. Often, a new CFO is faced with a Pandora’s box when they peek inside the finance systems they have just inherited.

How confident can they be, given the serious consequences if something turns out to be awry? What if there are acquisitions, which need to be speedily assimilated into the corporate structure, yet in reality take years to convert or replace the acquired company’s incompatible IT systems? When a new CFO opens up the lid on the financial systems on which they rely, what do they uncover? We are all familiar with those scenes in horror films where the victim opens the forbidden door, or the lid on the long-shut chest, and as the audience we think “oh no, don’t open that”. How confident can an incoming CFO be that they are not about to re-enact such a scene?

The unpleasant reality in most companies is that financial data resides in multiple systems, e.g. across a series of subsidiaries, or is in transition in the case of acquired companies. As I have written elsewhere, getting a single and reliable view of corporate performance information can be a thorny problem. If you have to go back over time, as when changes occur to the structure of a chart of accounts or when major reorganisations happen, it is difficult to compare like with like. Moreover, CFOs need to understand the origin of the data on which they rely, with a full audit trail. This means that finance teams need to be active in defining and checking the business rules and processes from which data is derived in their company, and how these processes are updated when changes occur. Relying on the ERP system to do all this is insufficient, since many of the business rules reside outside of these systems.

This is why modern data warehousing and master data management software can help deliver clearer visibility. Ideally a CFO should be able to gather financial data together and view it from any perspective and at any level of detail – without having to standardise operational systems and business processes. The most intelligent software uses a model of the business, not the data or the supporting IT architecture, as its blueprint. Such a business model-driven approach insulates the company from change, since the reporting formats change immediately in response to changes in the business model. Using such intelligent software means that business change – such as re-organisations, new products, consolidation programs, and de-mergers – should no longer be feared.
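As a toy illustration of the like-for-like problem, the sketch below restates historical ledger postings under a new chart of accounts via a mapping table. The account codes and mapping are entirely hypothetical; note that unmapped accounts are flagged rather than silently dropped, in the spirit of the audit trail discussed above.

```python
# Illustrative sketch (assumed structures): restating historical ledger
# lines under a new chart of accounts so periods compare like-for-like.
from collections import defaultdict

# Hypothetical mapping: old account codes -> new chart of accounts.
OLD_TO_NEW = {"4000": "REV-PROD", "4100": "REV-PROD", "4200": "REV-SVC"}

def restate(ledger_lines):
    """Roll old-account postings up to the new chart of accounts."""
    totals = defaultdict(float)
    for account, amount in ledger_lines:
        # Flag anything the mapping misses, so the gap is auditable.
        new_account = OLD_TO_NEW.get(account, "UNMAPPED")
        totals[new_account] += amount
    return dict(totals)

history = [("4000", 120.0), ("4100", 80.0), ("4200", 50.0), ("9999", 5.0)]
print(restate(history))
# {'REV-PROD': 200.0, 'REV-SVC': 50.0, 'UNMAPPED': 5.0}
```

In a model-driven tool the mapping table itself is part of the business model, so a reorganisation means updating the mapping once rather than rewriting every report.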

Leading the organisation through change provides a real opportunity for the data-driven CFO to make his or her mark. By using the most modern technology available, they can do this safely and without becoming a compliance victim. Good luck to all newly appointed CFOs!

Cognos shares have slid nearly 20% in recent weeks as an SEC probe into their accounting continues. The questions raised are in the notoriously tricky area of US GAAP rules, specifically on “VSOE” (or vendor specific objective evidence) which determine how much revenue can be credited for a deal in current figures, and what amount should be deferred. The post-Enron climate has ushered in a much harsher review of software industry practices than was normal in the past, and such esoteric sounding accounting rules can seriously impact a company, as Cognos is now seeing.
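For readers unfamiliar with the mechanics, the following is a deliberately simplified sketch of the residual-style split that the VSOE rules are getting at: defer the fair value of the undelivered elements of a bundled deal, and recognise the remainder now. The figures are invented and this is an illustration of the concept, not a statement of actual US GAAP guidance.

```python
# Simplified residual-method sketch of the VSOE idea: figures invented,
# not actual US GAAP guidance.
def split_revenue(deal_total, vsoe_of_undelivered):
    """Defer the fair value of undelivered elements; recognise the rest."""
    deferred = sum(vsoe_of_undelivered.values())
    recognised = deal_total - deferred
    return recognised, deferred

# A 100,000 licence deal bundled with a year of support (15,000 fair
# value) and training (5,000 fair value), both still to be delivered:
recognised, deferred = split_revenue(100_000, {"support": 15_000, "training": 5_000})
print(recognised, deferred)  # 80000 20000
```

The regulatory argument is over what counts as objective evidence for those fair values; get that wrong and previously recognised revenue has to be restated, which is precisely the exposure Cognos now faces.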

Word in the market is that the underlying business is actually quite robust at present, so hopefully this will prove a blip for the company rather than anything more serious. Cognos 8 means that there is quite a lot of potential for Cognos to gain revenue as customers upgrade to the new software, which features much better integration between ReportNet and PowerPlay, and a complete revamp of Metrics Manager, retitled Metrics Studio. These improvements should see Cognos customers steadily upgrading, which should have a positive impact on the company’s already pretty healthy finances. However, perhaps some more conservative interpretation of US GAAP on their part would be wise.

It would seem self-evident that accountants enjoy dealing with numbers, but those who have not worked in large corporations would be shocked by the number of meetings that occur arguing over whose version of data is correct. Just answering a simple question like “how profitable is customer x to us?” is a nightmarish task in large companies, because business-to-business enterprises have data about customers scattered throughout a raft of different computer systems, and may not account for costs uniformly across the enterprise. This is not confined to a few backward companies: a friend of mine who used to work for Goldman Sachs confirmed that if they wanted to know how much business they did with a major corporate client, it was a significant project to gather this information from around the corporation. This is especially a problem with global companies, whose operating subsidiaries and structure may be opaque. For example, Calvin Klein is owned by Unilever, but how many people know that? Consequently it is easy to see how a company supplying Calvin Klein may not necessarily record that business as being linked to Unilever. There are many, many examples like this, which would cause issues even without the problem of uniformity of cost allocation across an enterprise.
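The Calvin Klein/Unilever point can be shown in a small sketch: resolve each customer to its ultimate parent before totalling revenue, so business booked against a subsidiary still counts at group level. The ownership table and figures below are hypothetical.

```python
# Hypothetical corporate hierarchy: subsidiary -> parent. A company
# that is its own parent (or is absent) is an ultimate parent.
PARENT = {"Calvin Klein": "Unilever", "Unilever": "Unilever"}

def ultimate_parent(name):
    """Walk the ownership chain until a company is its own parent."""
    while PARENT.get(name, name) != name:
        name = PARENT[name]
    return name

def revenue_by_group(invoices):
    """Total invoice amounts by ultimate parent rather than billed entity."""
    totals = {}
    for customer, amount in invoices:
        group = ultimate_parent(customer)
        totals[group] = totals.get(group, 0) + amount
    return totals

invoices = [("Calvin Klein", 10_000), ("Unilever", 5_000), ("Acme", 2_000)]
print(revenue_by_group(invoices))  # {'Unilever': 15000, 'Acme': 2000}
```

The hard part in practice is not the rollup itself but keeping the hierarchy current as companies are bought, sold and renamed, which is master data maintenance again.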

An article in Accounting and Finance mentions the notion of the “data-driven CFO”, alluding to the idea that CFOs these days really need to get to grips with this kind of issue. Not only are they under pressure to answer questions from their business colleagues about corporate performance (such as “how profitable is customer x to us?”) but they also have to contend with increasing regulation such as Sarbanes-Oxley and various industry-specific rules such as Basel II in banking, or FDA rules within life sciences. People often gripe about such regulations, but there are some sound reasons for them. An amusing example of the need for the latter was a discussion my wife (who worked at SmithKline Beecham at the time) shared with me. SKB had an anti-worming drug which they decided was not worth marketing to humans for technical reasons, but were pondering whether it could usefully be put into pet food, since dogs in particular have a lot of problems with worms. After much ferreting around, it turned out that this could not actually be done, because a scary percentage of people actually eat dog food, and so it would have to go through the same regulatory regime as if it were for humans. (As an aside, I found this quite surreal, but when I mentioned it at lunch that day at Shell, two of the five people at the table admitted to having eaten dog food; if you think I am joking, see the internet for recipes involving pet food.)

The point is that the regulations are a fact of life and are increasingly specific about the auditability of reported data, much of which may involve reconstructing past history from some point a few years back. Now, if a company can’t tell how profitable a customer is without a significant effort, just how easy will it be for it to answer a question about profitability four years ago, since when the business has probably restructured a couple of times? I suspect that there are a lot of increasingly nervous CFOs out there, who may be sorely tested if they actually have to go back and reconstruct historical data. Yet even apart from regulatory compliance, departments such as marketing really need to understand trends over time in order to be effective. How many of these are getting the trend data that they need from their finance groups? As pressures increase for performance improvement, more and more CFO departments are going to have to become more data-driven in years to come, like it or not. Let’s hope their supporting systems can deal with such requests, or they will spend a long time barking up the wrong tree.

Andy Hayler

Andy Hayler is a passionate and outspoken commentator on the enterprise software market. A 20-year veteran of data modelling, warehousing and integration projects, he was named a Red Herring Top 10 Innovator in 2002 for founding Kalido – an innovative information management company that provides customers with the ability to dynamically view the impact of business changes. The views expressed on this blog are Andy’s own, and do not necessarily reflect the views of The Information Difference.