Most often, when the watchdogs at the Government Accountability Office are called in to check out an agency, process, or project, they are looking for something that has gone wrong. This week, however, the group took a look at some government IT projects that have gone right and came up with some best practices that other government agencies or public corporations could emulate to achieve success in their own IT projects.

In the report, the GAO noted that planned federal information technology spending has now risen to at least $81 billion for fiscal year 2012.

"As we have previously reported, federal IT projects too frequently incur cost overruns and schedule slippages while contributing little to mission-related outcomes. Given the size of these investments and the criticality of many of these systems to the health, economy, and security of the nation, it is important that federal agencies successfully acquire these systems -- that is, ensure that the systems are acquired on time and within budget and that they deliver the expected benefits and functionality," the GAO stated.

After interviewing the CIOs and other department officials in the seven projects it identified as successful -- which included programs from the Census Bureau, Defense Information Systems Agency, National Nuclear Security Administration, Customs and Border Protection, Federal Aviation Administration, Internal Revenue Service and Veterans Health Administration -- the GAO came up with nine common factors that were critical to the success of three or more of the seven investments.

Those common critical success factors and a small example of each were:

1. Program officials were actively engaged with stakeholders. For example, Census officials stated that the workers on its Decennial Response Integration System (DRIS) were members of the integrated project team. Their responsibilities as members of the team included involvement in requirements development, participation in peer reviews of contractual deliverables, and review of contractor proposals, the GAO said.

2. Program staff had the necessary knowledge and skills. For example, VA officials told the GAO that the Occupational Health Record-keeping System (OHRS) program relied extensively on subject matter experts' occupational health experience -- treating them as part of the development team and including them in decision making. Two investments in the GAO's sample went one step further by selecting the program manager from the end user organization rather than an individual with an IT background.

3. Senior department and agency executives supported the programs. For example, IRS officials explained that endorsement for the project called Customer Account Data Engine (CADE 2) came from the highest levels of the organization. In particular, those officials told the GAO that the IRS Commissioner has made CADE 2 one of his top priorities and, through his keynote speech at a town hall meeting for IRS employees, for example, has provided a clear and unwavering message about CADE 2. That speech and other activities have unified IRS employees, driven change, and removed barriers that often impede programs of this magnitude.

4. End users and stakeholders were involved in the development of requirements. Here, Census officials told the GAO that the DRIS program management office collaborated extensively with stakeholders and the contractor to develop requirements. For example, program management office personnel, contractor staff, and stakeholders all worked together to analyze the requirements to ensure they were understood, unique, and verifiable.

5. End users participated in testing of system functionality prior to formal end user acceptance testing. For example, DISA officials told the GAO that throughout development of the Global Combat Support System (GCSS), they repeatedly used a virtual site to connect developers and end users for online testing of the evolving software. Using the tool, developers were able to record the sessions, which helped in addressing defects identified during testing.

6. Government and contractor staff were stable and consistent. Here the GAO said DISA officials indicated that the longevity of the program management office and contractor staffs has been a contributing factor to the success of the Global Combat Support System. For example, that longevity allowed staff members to become subject matter experts in their areas of responsibility.

7. Program staff prioritized requirements. FAA officials told the GAO that end users on its Integrated Terminal Weather System (ITWS) presented the development team with a "wish list" of requirements that would help them significantly. Those officials said end users and developers prioritized the requirements by balancing their importance to end users against the maturity of the technology. FAA officials stated that prototypes of these new requirements were developed, evaluated by end users in the field, and ultimately implemented in the initial operating capability for ITWS.

8. Program officials maintained regular communication with the prime contractor. Census officials stated that the DRIS program management office took a proactive, "no surprises" approach to communicating with the contractor. For example, on a monthly basis, the program management office formally documented the technical performance of the contractor based on the relevant elements of the work breakdown structure and the award fee plan. These reports were provided to the contractor, who in turn used the feedback to improve its technical performance.

9. Programs received sufficient funding. The GAO said projects that receive funding commensurate with their requirements are better positioned to ensure the availability of needed resources, and therefore, deliver the investment within established goals.