SAP has been slapped with another public sector lawsuit in California, and it is attracting plenty of attention, all of it the wrong kind. On November 21st, the Los Angeles Times reported that California State Controller John Chiang is suing SAP Public Services for the return of the $50 million the state paid for an upgraded payroll software system.

This news follows the announcement earlier this year of a settlement by SAP and implementer Deloitte Consulting with Marin County, California, over an allegedly botched SAP installation.

Problems of this nature are not unique to the public sector, however. Bridgestone recently sued IBM for $600 million over another botched SAP implementation.

Those of us who work daily with ERP and other enterprise software systems that do work properly may find this surprising, but we shouldn't. In fact, IT professionals can learn a lot from these disasters, because the parties being sued allege that there is plenty of blame to go around.

A closer examination of the finger-pointing among software companies, vendors and end users may actually help us develop a good list of what not to do when pursuing IT projects.

In the State of California payroll system case, the lawsuit’s complaint is blistering: “After three years, and paying SAP approximately $50 million to integrate its own software into a new payroll and benefits system for the state of California, all [California] has to show for its investment is a system that could not get the payroll right even once over an eight-month period for a pilot group of only 1,500 employees.”

The failure prompted a State Senate report to observe: “The expensive misadventure has once again left many wondering why – in a state that has given the world Google, Apple, Facebook and Twitter – California consistently struggles to modernize its own public computer systems.”

At the center of the blame game were a failure to heed similar botched projects elsewhere, inadequate testing, excessive customization requirements and a game of data-conversion hot potato. Eclipse Solutions, hired by the California Technology Agency to analyze the problems, reported a number of critical discrepancies in data conversion, including: “inaccurate mappings, incomplete conversion cycles … a lack of focus and priority … lack of formal review and signoff for critical artifacts, lack of clear communications, a lack of collaboration, overloaded resources (project staff), and a lack of adequate management involvement.”
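The kinds of discrepancies Eclipse Solutions describes are exactly what an automated reconciliation pass between the legacy and converted record sets can surface before a pilot run. As a minimal sketch only, with record layouts and field names I have invented for illustration rather than anything from the state's actual systems, such a check might look like this in Python:

def reconcile(source_records, converted_records, key="emp_id"):
    """Compare source and converted record sets and report the gaps."""
    src = {r[key]: r for r in source_records}
    dst = {r[key]: r for r in converted_records}

    missing = sorted(set(src) - set(dst))      # an incomplete conversion cycle
    unexpected = sorted(set(dst) - set(src))   # records that appeared from nowhere
    mismatched = [k for k in set(src) & set(dst)
                  if src[k].get("gross_pay") != dst[k].get("gross_pay")]  # inaccurate mapping

    return {"missing": missing, "unexpected": unexpected, "mismatched": mismatched}

if __name__ == "__main__":
    # Hypothetical sample data: E2 never made it across the conversion.
    source = [{"emp_id": "E1", "gross_pay": 4200}, {"emp_id": "E2", "gross_pay": 3900}]
    converted = [{"emp_id": "E1", "gross_pay": 4200}]
    print(reconcile(source, converted))

A check this simple obviously does not replace formal review and signoff, but run after every conversion cycle it would at least make the gaps visible instead of leaving them to be discovered on payday.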

After an admittedly brief examination of the facts, I would offer the following additional observations:

The project was simply too large in scope. It began with the premise that a single system had to pay all of the state's employees. Why? Halfway through, most of the state's higher education employees were removed from the project anyway. In my opinion, the sheer number of data-integration and system-integration points created such enormous complexity that the project itself was unmanageable.

State bidding requirements reduce competition and increase costs. Ironically, the state is its own worst enemy on IT contracts because it limits responses to giant entities like BearingPoint (now bankrupt) by setting unreasonable requirements on the number of employees a bidder must have. Google, Apple, Facebook and Twitter all started small and scaled up successfully through innovation, yet innovation is a concept foreign to government IT projects. The state even hires multiple consulting vendors to routinely review and criticize the system integrators it engages. While this seems like a good idea at first, it actually inhibits cooperation by creating an environment of suspicion and blame rather than innovation and problem solving. Even with lucrative revenues in the tens of millions of dollars looming, only two vendors responded to the RFP. Many were shut out by unfavorable requirements, while others were likely unwilling to take on the risk.

Too much focus is placed on data conversion and not enough on system integration. Creating a single massive new system forces all legacy data to be converted. No one wants to deal with that challenge, because at the end of the day data disambiguation often becomes a manual task, and with 250,000 employees in the state's payroll system, that is a lot of manual data conversion work. So why not run the systems in parallel for a while? Let the old systems think they are paying the employees, when in fact they are really feeding information to the new system. Then gradually replace each departmental system, one at a time, with new interfaces. Real-time data integration can be an effective substitute for data migration in some cases, at least while you're sorting out the data conversion challenges in manageable sets rather than all at once, as sketched below.
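To make the parallel-run idea concrete, here is a minimal Python sketch of a legacy feed being mapped into a new system's schema department by department, with unmappable records parked for manual review instead of blocking the run. Every class, field and mapping here is a hypothetical assumption of mine, not a description of California's actual payroll design.

from dataclasses import dataclass
from typing import Optional

@dataclass
class LegacyPayRecord:
    emp_id: str
    dept_code: str
    gross_pay: float

@dataclass
class NewPayRecord:
    employee_id: str
    cost_center: str
    gross_pay_cents: int

# Mapping table built and verified one department at a time,
# rather than converting all 250,000 employees in a single cutover.
DEPT_TO_COST_CENTER = {"HR01": "CC-1100", "DOT02": "CC-2200"}

def convert(rec: LegacyPayRecord) -> Optional[NewPayRecord]:
    """Map a legacy record to the new schema; return None if it needs manual review."""
    cost_center = DEPT_TO_COST_CENTER.get(rec.dept_code)
    if cost_center is None:
        return None  # unmapped department: park it for manual disambiguation
    return NewPayRecord(rec.emp_id, cost_center, round(rec.gross_pay * 100))

def parallel_run(legacy_feed):
    """The legacy system keeps 'paying' employees while feeding the new system."""
    converted, needs_review = [], []
    for rec in legacy_feed:
        new_rec = convert(rec)
        if new_rec is None:
            needs_review.append(rec)
        else:
            converted.append(new_rec)
    return converted, needs_review

if __name__ == "__main__":
    feed = [LegacyPayRecord("E1001", "HR01", 4200.00),
            LegacyPayRecord("E2002", "XX99", 3900.00)]  # no mapping yet for XX99
    ok, review = parallel_run(feed)
    print(f"converted: {len(ok)}, flagged for manual review: {len(review)}")

The point of the sketch is the shape of the approach, not the code: employees keep getting paid by the systems that already work, while the conversion backlog shrinks one department at a time.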

These are naïve observations, I admit. Of course, so were the ideas that you could put a personal computer on every desk, search the globe for information from a screen with eight words on it, or connect millions of people socially. Come on, California, we can do this.

