This is the accessible text file for GAO report number GAO-02-50
entitled 'Information Technology: Defense Information Systems Agency
Can Improve Investment Planning and Management Controls' which was
released on March 15, 2002.
This text file was formatted by the U.S. General Accounting Office
(GAO) to be accessible to users with visual impairments, as part of a
longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the
printed version. The portable document format (PDF) file is an exact
electronic replica of the printed version. We welcome your feedback.
Please E-mail your comments regarding the contents or accessibility
features of this document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
United States General Accounting Office:
GAO:
Report to Congressional Committees:
March 2002:
Information Technology:
Defense Information Systems Agency Can Improve Investment Planning and
Management Controls:
GAO-02-50:
GAO Highlights:
Highlights of GAO-02-50, a report to the Senate and House Committees
on Armed Services.
Why GAO Did This Study:
The Defense Information Systems Agency (DISA) spends about $3.5
billion annually providing critical information technology (IT)
support to the military services, military commands, and Defense
agencies, as well as operating and maintaining crucial command,
control, and communications systems. In response to a mandate in the
fiscal year 2001 Defense Authorization Act, GAO studied the agency's
management of its 500 Day Action Plan, as well as its efforts to
establish important institutional management controls.
What GAO Found:
In March 2001, DISA issued A 500 Day Action Plan for Supporting DoD
Decision Superiority, which described 140 actions requiring the
investment of resources to improve its customer satisfaction and its
performance. A strength of this plan was its focus on satisfying
customer needs. However, the plan did not adequately address other
important elements, such as providing reasonable assurance that
planned actions or investments were cost-effective. In particular,
DISA did not adequately define the scope and content of the actions or
develop associated high-level cost, schedule, benefit, and risk
estimates for each. When decisionmakers are faced with time and
resource constraints, such estimates are essential, providing the
basis for evaluating and selecting among competing investment options,
and establishing baselines against which to measure progress.
To further improve its performance, DISA is also strengthening key
institutional management controls. In reviewing selected controls
associated with high-performing organizations (see below), GAO found
DISA to be taking actions to establish aspects of each control area,
but found some to be still in their formative stages, while others had
progressed much further. In IT human capital management, for example,
DISA has begun to identify requirements by establishing an inventory
of its workforce knowledge and skills; forecasting its strategic
workforce needs; and filling the gap between the two. In contrast, in
enterprise architecture, DISA has only begun to establish a management
foundation and has yet to develop an architecture. Such variability in
the maturity of control areas is due to the level of executive
attention, priority, and commitment associated with each. Until each
control area is fully functioning, DISA will be challenged in
maximizing its performance and accountability.
Table: Selected management controls associated with high-performing
organizations and whether each is largely under way at DISA:
Management control: Strategic planning;
Definition: Establishing mission and vision, including core values and
goals;
Largely under way: Yes.
Management control: IT human capital management;
Definition: Attracting, retaining, and motivating people having the
skills needed by the organization;
Largely under way: Yes.
Management control: Organizational structure management;
Definition: Aligning operational responsibilities with business and
mission goals, and maintaining accountability;
Largely under way: Yes.
Management control: Enterprise architecture management;
Definition: Developing, maintaining, and using an explicit blueprint
for operational and technical change;
Largely under way: No.
Management control: IT investment management;
Definition: Selecting and controlling investments to maximize benefit
and minimize risk;
Largely under way: No.
Management control: Customer relations management;
Definition: Focusing on satisfying customer needs;
Largely under way: Yes.
Management control: Knowledge management;
Definition: Capturing, understanding, and using the information and
intellect within an organization to achieve objectives;
Largely under way: No.
[End of table]
What GAO Recommends:
To strengthen DISA's operational efficiency and effectiveness, GAO is
making specific recommendations aimed at ensuring that DISA makes
informed decisions about the many investments described in its Action
Plan, as well as ensuring that DISA fully establishes the
institutional management controls addressed in GAO's study. These
recommendations include making establishment of each of these controls
an agency imperative. DOD concurred or partially concurred with all of
GAO's recommendations and stated that it is in the process of
implementing corrective actions.
This is a test for developing highlights for a GAO report. The full
report, including GAO's objectives, scope, methodology, and analysis,
is available at [hyperlink,
http://www.gao.gov/products/GAO-02-50]. For additional information
about the report, contact Randolph C. Hite (202-512-3439). To provide
comments on these test highlights, contact Keith Fultz (202-512-3200)
or E-mail HighlightsTest@gao.gov.
[End of section]
Contents:
Letter:
Results in Brief:
Background:
Action Plan Development Was Appropriately Focused on Satisfying
Customers, but Not on Other Tenets of Effective Planning:
DISA Has Taken Steps to Improve Management of Action Plan
Implementation, but More Can Be Done:
DISA Is in the Process of Establishing Important Institutional
Management Controls:
Conclusions:
Recommendations:
Agency Comments and Our Evaluation:
Appendixes:
Appendix I: Objectives, Scope, and Methodology:
Appendix II: Status of DISA's Efforts to Benchmark Performance:
Appendix III: Further Details Regarding DISA's Enterprise Architecture
Management and Information Technology Investment Management:
Appendix IV: Comments from the Department of Defense:
Tables:
Table 1: Summary of Extent to Which 57 Actions Have No Established
Baselines:
Table 2: Status of DISA's Enterprise Architecture (EA) Management
Process as of November 30, 2001:
Table 3: Status of DISA's IT Investment Management as of November 30,
2001:
Figures:
Figure 1: DISA's Reporting Structure and Field Units:
Figure 2: Relationships Among Management Controls, People, Processes,
and Technology:
Figure 3: The Five Stages of Maturity Within IT Investment Management:
Abbreviations:
CIO: chief information officer:
CRM: customer relations management:
DISA: Defense Information Systems Agency:
DOD: Department of Defense:
EA: enterprise architecture:
GPRA: Government Performance and Results Act:
IT: information technology:
ITIM: information technology investment management:
OMB: Office of Management and Budget:
[End of section]
United States General Accounting Office:
Washington, D.C. 20548:
March 15, 2002:
The Honorable Carl Levin:
Chairman:
The Honorable John Warner:
Ranking Minority Member:
Committee on Armed Services:
United States Senate:
The Honorable Bob Stump:
Chairman:
The Honorable Ike Skelton:
Ranking Minority Member:
Committee on Armed Services:
House of Representatives:
The Defense Information Systems Agency (DISA) performs a critical
information technology (IT) support mission for the Department of
Defense (DOD) and others. On a cost reimbursable basis, DISA provides
computing services, telecommunications services, and acquisition
services; in fiscal year 2001, DISA's service reimbursements were
about $2.5 billion. DISA also operates and maintains joint warfighting
and related mission support command, control, and communications
systems funded by direct appropriations, which in fiscal year 2001
were about $1 billion. In light of the significance and cost
implications of DISA's mission, it is important that the agency cost-
effectively invest and manage its limited resources. In March 2001,
DISA issued a plan, entitled A 500 Day Action Plan for Supporting DoD
Decision Superiority, that contains 140 ongoing or planned actions
involving the investment of resources. DISA has also recently begun a
number of other institutional management improvements.
The fiscal year 2001 Defense Authorization Act directed us to review
DISA operational efficiency and effectiveness and to identify
opportunities for improvement.[Footnote 1] As agreed with your
offices, our objectives were to determine whether DISA (1) had
effectively managed development of its 500 Day Action Plan, (2) is
effectively managing implementation of the plan, and (3) has
established certain institutional management controls needed to
effectively adjust to shifts in strategic direction. The control areas
that we agreed to address are (a) strategic planning, (b) IT human
capital management,[Footnote 2] (c) organizational structure management,
(d) enterprise architecture management,[Footnote 3] (e) IT investment
management,[Footnote 4] (f) customer relations management,[Footnote 5]
and (g) knowledge management.[Footnote 6] Each of these areas is
agencywide in scope and strategically focused; to work effectively,
each depends on the proper application of organizational resources—
people, processes, and technology.[Footnote 7] As further agreed, our
review of these management controls focused on whether DISA had either
established or was in the process of establishing them; it did not
include evaluating the effectiveness of established controls. We
briefed your offices on the results of our review in January 2002.
[Footnote 8] Details on our objectives, scope, and methodology are in
appendix I.
Results in Brief:
In developing its 500 Day Action Plan, DISA appropriately focused on
understanding and satisfying customer concerns and needs. However,
DISA did not adequately address other important elements of effective
plan development, such as having reasonable assurance that planned
actions (investments) were cost-effective. In particular, DISA did not
adequately define the scope and content of the actions or develop
associated high-level cost, schedule, benefit, and risk estimates for
each. When decisionmakers are faced with time and resource
constraints, such estimates provide the requisite basis for evaluating
and selecting among competing investment options. Such estimates also
provide the baselines against which to measure progress and determine
whether the investments improve efficiency and effectiveness and
advance strategic goals. According to DISA officials, developing
baseline data needed to assess cost-effectiveness and measuring
progress and results were not considered during plan development,
because at that time they did not view the actions as individual
projects to be planned and controlled. DISA has since begun to develop
scope, schedule, and cost baselines for some planned actions. However,
it has yet to begin developing benefit and risk baselines, and it has
not analyzed the cost-effectiveness of its planned actions. As a
result, DISA has not adequately ensured that its action plan contains
the best mix of investments for improving mission performance and
achieving strategic goals.
During our review, DISA took steps intended to better manage
implementation of the 500 Day Action Plan. Specifically, although the
agency did not establish baseline commitments[Footnote 9] in
developing its action plan, DISA has since established some, but not
all, baselines and is beginning to monitor progress against these
commitments. In addition, DISA has established a process to notify
customers of changes to baselines, but the process did not include
justification of the costs, benefits, and risks of the investment,
which would be needed for senior management approval of the changes.
Until DISA adequately measures progress in implementing planned
actions and manages changes to those actions, DISA cannot determine
which, if any, of its planned investments are producing performance
improvements and thus warrant further investment.
DISA's 500 Day Action Plan is part of a larger set of management
actions that the agency has initiated to improve mission performance.
These actions address some, but not all, of the institutional
management controls that can help an agency effectively adjust to
shifts in strategic direction. These controls include (1) strategic
planning, (2) IT human capital management, (3) organizational
structure management, (4) enterprise architecture management, (5) IT
investment management, (6) customer relations management, and (7)
knowledge management.[Footnote 10] DISA has activities under way
associated with each of these institutional management controls;
although some are in their formative stages, others have progressed
much further. For its IT human capital management effort, for example,
DISA has completed, ongoing, and planned steps to identify its IT
human capital requirements; establish an inventory of its workforce
knowledge, skills, and abilities; forecast its strategic workforce
needs; and fill the void between the two through evaluating its
progress in training, retention, and hiring initiatives. In contrast,
for its enterprise architecture, DISA has only begun to establish
elements of the architecture management foundation, and it has yet to
develop an architecture; for its knowledge management effort, it does
not yet have a defined management approach and structure. Such
variability in the maturity of these controls can be attributed to the
level of executive attention, priority, and commitment associated with
each. Until each control area is fully functioning, DISA will be
challenged in responding effectively to changes in its strategic
direction and maximizing its performance and accountability.
To strengthen DISA's operational efficiency and effectiveness, we are
making recommendations aimed at ensuring that DISA makes informed
decisions about investing in its 500 Day Action Plan initiatives. We
are also making recommendations to facilitate DISA's ongoing
institutional management efforts by ensuring that DISA fully
establishes certain controls.
In written comments on a draft of this report, DOD stated that it
concurred or partially concurred with all of our recommendations. DOD
also stated that by working closely with us during this review, DISA
is either in the process of implementing, or has plans to implement,
our recommendations and that doing so will improve support to DISA's
customers.
Background:
DISA is a DOD component agency reporting to the assistant secretary of
defense for command, control, communications, and intelligence.
[Footnote 11] DISA centrally manages major portions of DOD's common
global IT resources, providing services and operating and maintaining
systems that support the computing, networking, and information needs
of the national command authority, military services, joint military
commands, and Defense agencies.
DISA's services include:
* providing computing capabilities critical to DOD's global combat
support operations;
* providing voice, data, and video telecommunications services to DOD
and other customers;
* purchasing telecommunications services on behalf of its customers
from commercial vendors and other sources, such as voice services from
the General Services Administration's Federal Technology Service
contract; and
* purchasing customized IT products and services.
In addition to these services, DISA also operates and maintains a
number of systems that perform mission-critical functions. These
systems include the following:
* The Defense Information Systems Network, which is used to provide
telecommunication services.
* The Global Combat Support System, which integrates joint combat
support information from various databases and presents battlefield
status information during an engagement.
* The Defense Message System, which interfaces with other U.S.
government agencies, allies, and contractors to provide multimedia
messaging and directory services for DOD users worldwide.
* The Global Command and Control System, which provides a range of
information needed to conduct joint U.S. and allied military
operations, including battlefield information, imagery, planning
support, and other intelligence information. The system operates at
over 625 networked sites worldwide. Using the Defense Information
Systems Network, the Global Command and Control System delivers system
applications, such as the Global Combat Support System and messaging
systems, used by battlefield commanders to synchronize and coordinate
widely dispersed air, land, sea, space, and special operations forces
during military operations.
In addition, DISA manages the Information System Security Program,
which is to protect DOD telecommunications and IT systems from damage,
unauthorized access, or threats to their availability. The agency also
provides guidance and support on IT operational and technical issues
to DOD components and coordinates DOD planning and policy for
integration of systems within the DOD infrastructure, including
management of the Joint Technical Architecture.
To accomplish its mission, DISA employs about 8,300 staff, located in
its headquarters Command and 10 directorate offices and at 20 field
and line organizations worldwide. Figure 1 depicts DISA's reporting
structure within DOD and shows its field units.
Figure 1: DISA's Reporting Structure and Field Units:
[Refer to PDF for image: organizational chart]
Top level:
Secretary of Defense.
Second level, reporting to Secretary of Defense:
Assistant Secretary of Defense for Command, Control, Communications,
and Intelligence (C3I).
Third level, reporting to Assistant Secretary of Defense for C3I:
Defense Information Systems Agency; Office of the Agency and DISA
Headquarters Command Staff Offices.
Fourth level, reporting to Defense Information Systems Agency; Office
of the Agency and DISA Headquarters Command Staff Offices:
Defense Information System Network Service Center;
Defense Technical Information Center;
Defense Information Technology Contracting Organization;
DISA Directorates:
* Acquisition, Logistics, and Facilities;
* Application Engineering;
* Computing Services;
* Customer Advocacy;
* Interoperability:
- Joint Interoperability Test Command;
* Manpower, Personnel, and Security;
* Network Services;
* Strategic Plans, Programming, and Policy;
* Technical Integration Services;
* Operations:
- Joint Staff Support Center;
- Joint Spectrum Center;
- DISA Field Offices:
DISA Central/Special Operations Command;
DISA Continental U.S.;
DISA European Command;
DISA Fort Gordon;
DISA U.S. Joint Forces Command;
DISA Pacific Command;
DISA Southern Command;
DISA Space Command;
DISA Strategic Command;
DISA Transportation Command.
[End of figure]
DISA's operations generally fall into four key areas: (1) computing,
(2) telecommunications, (3) acquisition services, and (4) joint combat
support and DOD enterprise capabilities. Each of the directorate,
field, and line units supports aspects of these areas. For example,
the Computing Services directorate is responsible for operating
assigned DISA information processing, communications, and network
systems, including management, operations, and maintenance of six
regional mainframe processing data centers within the United States.
The Network Services directorate is responsible for developing network
solutions for voice, data, and video transmission services and
monitoring the effectiveness of network performance in meeting
customer requirements. The responsibilities of DISA's Defense
Information Technology Contracting Organization include procuring,
accounting for, and paying for IT supplies and services required by DISA
and other DOD components. The Joint Interoperability Test Command is
responsible for performing operational test and evaluation of DISA and
other DOD IT acquisitions. DISA also has 10 field offices located at
major customer locations, such as the U.S. Space Command, that are
responsible for handling on-site customer issues and inquiries with
products and services offered.
Prior Reports Have Cited Weaknesses in Measuring Cost-Effectiveness:
Recent reports by us and others have pointed out weaknesses in DISA's
ability to know whether it is cost-effectively providing services and
operating and maintaining systems. For example, in 1998, we reported
that in providing IT services, DISA had difficulty setting prices that
recovered the full cost of doing business; this difficulty impaired
the agency's ability to focus management attention on the full costs
of carrying out operations and managing those costs effectively.
[Footnote 12] Specifically, in setting prices for telecommunications
services, DISA did not incorporate about $137 million of costs
incurred, so that all costs were not reflected in prices charged to
customers and thus not recovered. Also, the agency used at least $231
million of its appropriated funding, reserved for use on joint
warfighting capabilities, to support IT business activities that
should have been fully funded by customer reimbursements for services.
As a result, DISA did not have reliable information upon which to
measure the cost-effectiveness of its services. We recommended that
DISA improve its operations, price-setting, and financial management
practices by setting prices that included all costs incurred and
promptly collecting amounts owed by customers.
Inspector general reports have also found performance weaknesses. In
1999, the DOD inspector general reported that DISA's management of
DOD's long-haul telecommunications requirements was fragmented and in
need of improvement.[Footnote 13] In 2000, the DISA inspector general
reported that the process for collecting and reporting performance
data was also fragmented, procedures were not established, and
practices did not ensure results as intended by DISA's performance
contract, which was established in fiscal year 2000 between DISA and
the deputy secretary of defense.[Footnote 14] Under this contract, the
agency committed to measuring quality, cost-effectiveness, and
timeliness of its goods and services, as well as customer satisfaction
with these, and to performing benchmarking studies gauging the
reasonableness of service cost and quality.[Footnote 15]
Director Has Initiated a 500 Day Action Plan to Improve Service:
Shortly after the current director assumed command of DISA in June
2000, agency customers reported problems with slow service,
unanswered telephone calls, and inadequate network capacity. A former
customer himself, the director responded by launching an initiative to
solicit customer input on three core questions: what DISA was doing
right, what it could do better, and what future requirements it needed
to address. The goal of the initiative was to improve customer
satisfaction with the agency's services; the initiative resulted in a
500 Day Action Plan for service improvement. The plan is divided into
five main sections:
main sections:
1. Strategic goals. DISA's strategic goals, as stated in the action
plan, are
* "Goal 1: Provide a flexible, reliable information infrastructure,
capable of supporting the evolving Global Information Grid, required
by the warfighter and others to achieve the highest level of
effectiveness in joint and combined operations.
* "Goal 2: Easy sharing of high quality information supporting
interoperability among U.S. Forces and Allies.
* "Goal 3: Defense information resources are secure.
* "Goal 4: DISA is a sought after employer. Personnel are available,
well qualified, and able to improve their professional skills and
advancement potential.
* "Goal 5: Information technology in support of business evolution
will be used to the maximum advantage to satisfy customers."
This section of the plan also includes statements of mission and
vision and descriptions of nine key initiatives that are designated as
critical to achieving the above goals: (1) the Defense Information
System Network, (2) the Global Command and Control System, (3) the
Global Combat Support System, (4) information assurance, (5) the
Defense Message System, (6) assured computing, (7) customer account
management, (8) electronic commerce/electronic business, and
(9) interoperability activities.
2. Customer-requested activities. The plan includes 109 customer-
requested actions, grouped by customer. Each action includes a brief
statement of need and importance, designation of the office of primary
responsibility, the start date, the completion date, and key terms and
conditions related to the action.[Footnote 16]
3. Global network actions. The plan describes 32 actions that assist
DISA in providing a flexible, reliable, affordable, integrated
information network infrastructure. (Of these 32, 17 are also included
among the customer-requested actions.)
4. Operational improvements. The plan proposes 16 actions to improve
DISA's internal organizational and workforce operations.
5. Master schedule. The plan includes a summary schedule for all 140
actions (109 customer actions, 15 global network actions not included
in the 109 customer-requested actions, and 16 actions internal to DISA
management), spanning a time frame from before January 2001 to about
August 2002.
Each of these 140 actions involves, to varying degrees, the investment
of IT resources to achieve a specific end result. DISA officials
grouped the actions into three types: projects, mission-based
services, and processes, as follows.
1. Projects were defined as actions to enhance "a capability to meet a
customer need" and "subject to intensive oversight and supported by
formal documentation and/or a formal oversight process."
2. Mission-based services were defined as "human capital being applied
to a key, critical problem, [such as establishing] standards,
engineering, test and evaluation, or [military command] support."
3. Processes were described as "[starting] with a determination about
what needs to be improved to reach a goal or end-state, [for which]
solutions may be material, nonmaterial, or both [and involve]
significant investment amounts."
Of the 140 actions in the 500 Day Action Plan, DISA categorized 44 as
projects, 44 as mission-based services, and 52 as processes.
Effective IT Investment Planning Is Critical to Informed Investment
Selection and Decisionmaking:
Federal law and guidance[Footnote 17] and industry best practices
recognize IT investment planning as critically important, as it
results in an IT investment plan that should be used to implement
budget priorities for the year in accordance with strategic goals and
the enterprise architecture. Our IT investment management framework,
which is based on industry best practices, establishes a systematic
process for investment planning and management, including processes
for selecting, controlling, and evaluating investment options to
maximize the value of the investments and to minimize their risks.
[Footnote 18] This process requires the development of life-cycle
cost, schedule, benefit, and risk estimates and the use of these
estimates in comparing the relative merits of competing investment
options. Such a process allows decisionmakers to select those
initiatives that best meet the agency's strategic goals and prioritize
the selected initiatives for allocation of IT resources. The results
of these informed decisions can then be captured in an IT investment
plan. This plan, like DISA's 500 Day Action Plan, is intended to
identify those initiatives in which the agency intends to invest time,
money, and effort to produce a result with value commensurate with
cost.
Action Plan Development Was Appropriately Focused on Satisfying
Customers, but Not on Other Tenets of Effective Planning:
As described in our IT investment management framework, effective IT
investment planning requires, among other things, that organizations
provide for satisfaction of customer needs and evaluate competing
investment choices in light of each investment's estimated life-cycle
costs, schedule, benefits, and risks. The 500 Day Action Plan
appropriately recognized that satisfying customer needs is important
to a service provider like DISA. To develop the plan, DISA first
solicited extensive customer input. Next, with the direct involvement
of its executive leadership, the agency identified and selected near-
term initiatives (or actions) in which it would invest IT resources to
address customer concerns and increase customer satisfaction with
DISA's services. However, DISA did not treat the actions that it
selected for inclusion in the plan as investments by defining high-
level work scope and establishing high-level cost, schedule, benefit,
and risk estimates for each action based on that work scope, so that
it could understand the actions' cost-effectiveness and thus make
informed investment decisions. DISA has since taken steps to address
these planning issues. However, it has not addressed them all. For
example, it has not established life-cycle cost, benefit, and risk
baselines for all actions. Thus, it cannot be adequately assured that
its planned actions are the best mix of investment options to meet
strategic performance goals.
Action Plan Was Focused on Customer Satisfaction:
At its most basic level, DISA's mission requires the agency to cost-
effectively meet the requirements of its customers—the national
command authority and supporting military commands, military services,
and Defense agencies. Customer satisfaction is therefore a critical
factor for DISA's mission success, and effective development of its
action plan required DISA to solicit and use customer input.
DISA's development of its action plan was based on extensive input
from its customers, beginning in July 2000, when the director formally
solicited customer input on the three core questions (what DISA was
doing right, what it could do better, and what future requirements it
needed to address). By September 2000, this solicitation had produced
479 requirements from DISA customers, and the agency began a process
to translate these requirements into its 500 Day Action Plan.
According to the DISA director, the goal of the action plan was to
capture the high-priority customer requirements that the agency would
commit to deliver. To achieve this goal, DISA worked through the 479
requirements by soliciting the views of the DISA organizational
component responsible for each requirement, eliminating overlap among
requirements, and assessing the feasibility of delivering on the
requirement. Out of this process emerged a draft plan containing 111
actions.
The agency's next step was to validate the plan by sharing it with its
customers and soliciting their comments, which it did in December
2000. Based on customer comments, DISA deleted 5 actions and added 34,
resulting in a total of 140 actions. According to DISA officials, the
plan's evolution (from 479 requirements to 111 actions and finally to
140 actions) was achieved through customer interaction and discussion
among DISA leadership. DISA issued its final 500 Day Action Plan in
March 2001; it plans to update the plan during fiscal year 2002 by
once again soliciting customer input.
Action Plan Is an IT Investment Plan, but Its Development Did Not
Consider Cost-Effectiveness:
OMB Circular A-130 outlines a disciplined process for selecting,
controlling, and evaluating IT investments.[Footnote 19] DOD
directives also emphasize the need to consider the cost-effectiveness
of competing IT investment options, such as DISA's planned actions, to
assist in investment management (prioritizing investments and
allocating IT resources). Such an investment management process is
embedded in our IT investment management framework and is considered a
best practice, followed by leading government and industry
organizations.[Footnote 20]
A key element of this investment management process is the agency's IT
investment plan. The investment plan implements the agency's IT budget
priorities for the year, reflecting the agency's strategic goals and
its enterprise architecture. It also demonstrates to the agency's
investment decisionmaking authority the merits of a project, making
the case that the project meets cost-effectiveness criteria and
deserves funding. For effective investment planning, agencies need at
least preliminary information for each investment option in the
following areas: scope of the work to be performed, scheduled
milestones, and estimated life-cycle costs, expected benefits, and
anticipated risks. Also, for an organization to determine how well its
implementation activities achieve the results established by these
baseline estimates, it needs results-based performance measures for
each investment.
DISA did not evaluate the cost-effectiveness of the 140 actions
selected and included in the plan. Specifically, in developing the
action plan, DISA did not define in at least general terms the work
scope for the planned actions, nor did it establish general
milestones, generally estimate the life-cycle cost to complete
actions, project the benefits of completing the actions, or assess the
risks facing the actions.
In reviewing supporting documentation for 57 of the 140 actions (18
projects, 18 mission-based services, and 21 processes), we found that
performance measures, cost/benefit and risk analysis, and cost,
schedule, benefit, and risk baselines were largely missing for all
types of actions.[Footnote 21] DISA did not define performance
measures for 30 percent (17 of 57) of the actions, and benefit
baselines were not established or cost/benefit or risk analyses
performed for any of the 57 actions. The agency did not define work
scope for 14 percent (8 of 57) of the actions, schedule baselines were
not established for 19 percent (11 of 57), and life-cycle cost
estimates were missing for 89 percent (51 of 57) of the actions.
Table 1 summarizes the results of our assessment of the 57 actions.
Table 1: Summary of Extent to Which 57 Actions Have No Established
Baselines:
Attribute reviewed: Life-cycle cost baseline not established;
18 projects: 17 (94%);
18 mission services: 17 (94%);
21 processes: 17 (81%);
57 total: 51 (89%).
Attribute reviewed: Work scope not defined;
18 projects: 3 (17%);
18 mission services: 3 (17%);
21 processes: 2 (10%);
57 total: 8 (14%).
Attribute reviewed: Schedule baseline not established;
18 projects: 4 (22%);
18 mission services: 3 (17%);
21 processes: 4 (19%);
57 total: 11 (19%).
Attribute reviewed: Benefit baseline not established;
18 projects: 18 (100%);
18 mission services: 18 (100%);
21 processes: 21 (100%);
57 total: 57 (100%).
Attribute reviewed: Cost/benefit and risk analysis not performed;
18 projects: 18 (100%);
18 mission services: 18 (100%);
21 processes: 21[A] (100%);
57 total: 57 (100%).
Attribute reviewed: Performance measures not defined;
18 projects: 7 (39%);
18 mission services: 4 (22%);
21 processes: 6 (29%);
57 total: 17 (30%).
[A] According to a DISA official, the actions categorized as processes
had not progressed to the point where baselines supported cost/benefit
and risk analysis.
Source: GAO analysis of DISA action implementation and management data.
[End of table]
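The summary percentages in table 1 follow directly from the action
counts. As a quick check of the totals column (the counts below are
taken from the table; the code itself is only an illustration):

```python
# Counts of actions (out of 57 reviewed) missing each attribute,
# taken from the totals column of table 1.
missing = {
    "life-cycle cost baseline": 51,
    "work scope": 8,
    "schedule baseline": 11,
    "benefit baseline": 57,
    "cost/benefit and risk analysis": 57,
    "performance measures": 17,
}

TOTAL_ACTIONS = 57

# Percentage of the 57 reviewed actions missing each attribute.
percentages = {attr: round(count / TOTAL_ACTIONS * 100)
               for attr, count in missing.items()}
# e.g., 51 of 57 yields 89 percent; 17 of 57 yields 30 percent
```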
According to DISA officials, they did not define this information for
each action or assess its cost-effectiveness during plan development
because the actions were viewed as goals to achieve, rather than
individual investment projects to be defined, planned, and controlled.
Further, DISA officials stated that because the action plan was driven
by customer concerns, measuring return on investment was not the real
focus of the plan, which was customer satisfaction. In addition,
agency officials stated that the extent of baseline information and
analysis for each action was a function of the size and complexity of
the investment. While we agree with this principle, effective
investment planning, as previously discussed, nevertheless requires at
least a minimal level of information about the investments (such as
life-cycle costs, benefits, and risks), so that management can make
informed selection decisions and develop an effective investment plan.
Moreover, in view of the total 1-year cost ($171.7 million) of the 21
actions for which fiscal year 2002 estimates were made, the
investments in the 500 Day Action Plan are substantial and accordingly
warrant the development of baseline information to permit informed
decisionmaking.
DISA Has Taken Steps to Improve Management of Action Plan
Implementation, but More Can Be Done:
Effectively implementing an investment plan such as DISA's 500 Day
Action Plan requires, at a minimum, (1) measuring progress in meeting
planned commitments for each investment and (2) controlling changes to
these baseline commitments and reporting on such changes. Although
DISA has recently begun measuring progress against some baselines for
its planned actions and reporting baseline changes to affected
customers, it is still not measuring progress against all relevant
baselines (such as expected benefits) because it has yet to establish
these. Also, it is not controlling changes to baselines to ensure that
these changes are justified. Further, although DISA officials told us
that the agency is measuring action plan implementation success
through its annual benchmarking of agency performance against industry
standards, this benchmarking does not compensate for the absence of
performance measurements for plan actions, because most actions do not
map to benchmarked performance measures. As a result, DISA does not
know if its continued investment in actions is economically justified,
and it does not know whether changes to actions are warranted.
DISA's Ability to Measure Plan Implementation, While Improved, Is
Still Limited:
To determine whether an IT investment plan like the 500 Day Action
Plan is being implemented effectively, an organization needs to
measure whether investment baselines are being achieved (such as a
commitment to deliver defined capabilities and business value by a
certain date for a certain cost), so that it can promptly take
appropriate corrective actions to address any variances. The Clinger-
Cohen Act[Footnote 22] and OMB guidance[Footnote 23] require measuring
the achievement of such investment commitments. OMB Circular A-130
states that agencies are to implement performance measures that
monitor progress toward expected results of IT investments. These
expected results are represented by the cost, schedule, risk, and
benefit baselines established in selecting an IT investment.
Initially, DISA did not measure the progress of plan implementation by
comparing actual results to baseline commitments because these were
not established. According to DISA, it was instead measuring
implementation of its 500 Day Action Plan through the annual
benchmarking process set up under its performance contract. However,
DISA's benchmarking efforts are not an effective or adequate measure
of action plan implementation because most of the actions were not
covered by the benchmarking reviews. Specifically, a mapping of
actions to the performance contract showed that 100 of 140 actions (71
percent) were not aligned. (Additional information on DISA's
benchmarking efforts is provided in appendix II.)
DISA has begun taking steps to better manage implementation of its
action plan. For example, during the course of our review, DISA
drafted a process whereby the responsible DISA action officer is to
obtain agreement from the customer that the "exit criteria/performance
metrics" (that is, close-out criteria and deliverables) for a given
action are acceptable. When the action is completed, the action
officer is to obtain written concurrence from the customer confirming
that the action is completed. Also under this process, the DISA
director is to request customer confirmation of completed actions.
However, DISA has yet to begin measuring benefits realized or risks
mitigated, because it has not established the benefit or risk
baselines against which to measure such progress.
In another step to strengthen plan implementation, begun during the
course of our review, DISA's action officers began briefing the status
of the actions to the DISA director (and other executives) at monthly
Corporate Board meetings,[Footnote 24] using a "stoplight" approach,
with rankings of red, yellow, or green.
DISA also developed criteria for classifying the status of the
action's (1) schedule, (2) funding and staffing, and (3) customer
feedback and issues. However, these criteria do not measure progress.
Specifically, the funding and staffing criteria do not compare actual
costs of work performed (what was actually spent to date) to the
budgeted cost of work performed (what should have been spent based on
the scope of work completed to date). Instead, they merely indicate
whether the action was unfunded (red), partially funded (yellow), or
fully funded (green).
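The cost comparison described above is the standard earned value
calculation. A minimal sketch, using hypothetical dollar figures for a
single action (none of these amounts come from DISA data):

```python
# Hypothetical earned value figures for one action (illustrative only).
bcwp = 400_000  # budgeted cost of work performed
                # (what should have been spent for the work completed)
acwp = 500_000  # actual cost of work performed
                # (what was actually spent to date)

cost_variance = bcwp - acwp           # negative means over cost
cost_performance_index = bcwp / acwp  # below 1.0 means over cost

# A funding-status check (unfunded/partially/fully funded) considers
# neither figure, which is why it cannot measure progress.
```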
Despite recent steps to begin measuring progress in implementing
actions, DISA officials acknowledge that improvements are needed.
According to officials, they will revisit their approach to measuring
progress on actions and ensure that performance measures are
meaningful. Without adequate performance measures that continuously
compare status against expectations, DISA cannot adequately assess its
progress toward expected results and detect implementation problems so
that prompt corrective action can be taken.
Mechanisms to Control Changes to Baselines Are Under Development:
Changes to project baselines can affect the delivery of promised
capabilities and benefits on time and within budgets. Accordingly,
changes to baselines must be controlled so that only those that are
justified on the basis of costs, benefits, and risks are approved and
made. At a minimum, such change control involves having an explicit
definition of project baselines as a starting point, submitting
proposed changes to those baselines (exceeding a specified threshold
level) to a designated decisionmaking authority, understanding the
impacts of the proposed changes on other project baselines and the
customer's needs, and documenting and reporting approved changes.
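The threshold tenet above can be sketched as a simple routing rule.
The 10 percent threshold and the function name are illustrative
assumptions, not DISA figures:

```python
def needs_authority_approval(baseline, proposed, threshold_pct=10.0):
    """Return True if a proposed baseline change exceeds the specified
    threshold and so must go to the designated decisionmaking
    authority rather than being made at the action officer's
    discretion."""
    change_pct = abs(proposed - baseline) / baseline * 100
    return change_pct > threshold_pct

# A 15-month schedule against a 12-month baseline is a 25 percent
# change, so it would be escalated; a 1-month slip on the same
# baseline (about 8 percent) would not.
```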
DISA has begun to introduce elements of effective change control into
its management of action plan implementation. Initially, DISA
generally tracked (in monthly reports) only schedule baseline changes
made by action officers. According to agency officials, these officers
were supposed to check with customers to ensure that changes still met
customer needs; however, since this process and its implementation
were not documented, we could not confirm that it was actually
practiced. We did confirm, however, that schedule baselines (the
primary baselines that existed at that time) were at times changed
significantly. For example, an action plan report for April 2001 (1
month after the action plan was issued) showed that the target
completion dates changed for seven actions—one from June 2001 to
September 2002 (a 15-month change). Also, of 12 actions briefed to
DISA's Corporate Board in August 2001, the target completion dates for
all 12 had changed (changes ranged from 1 to 18 months). For these
changes, however, decisionmaking was left to the discretion of the
action officer, and the ramifications of these changes on action
costs, benefits, and risks were not addressed. As a result, whether
action changes were prudent investment decisions was not known.
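The schedule slips cited above are month counts between target
completion dates; for example, June 2001 to September 2002:

```python
def months_between(start_year, start_month, end_year, end_month):
    """Number of months between two target completion dates."""
    return (end_year - start_year) * 12 + (end_month - start_month)

slip = months_between(2001, 6, 2002, 9)  # the 15-month change noted above
```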
During our review, DISA refined its change control approach to require
the responsible action officer to obtain customer agreement with
proposed completion date changes. Also, officials told us that the
DISA director is beginning to hold status meetings with the action
officers; to notify customers of significant deviations from recently
established cost, scope, and schedule baselines; and to obtain
customer concurrence with such changes. However, this refined approach
still does not satisfy all tenets of effective change control.
Specifically, because DISA does not view the actions as investments to
be controlled, it cannot adequately ensure that the implications of
changes are understood by decisionmakers so that the changes (1) do
not adversely impact other actions, (2) are approved by an authority
level commensurate with the significance and risk of the change, and
(3) are a cost-effective use of resources.
DISA Is in the Process of Establishing Important Institutional
Management Controls:
As we have previously reported, an organization's effectiveness in
responding to changes in its strategic direction is largely a function
of how well the organization is managed.[Footnote 25] An important
measure of an organization's management effectiveness is how certain
institutional management functions or controls have been established:
that is, the degree to which explicitly defined and rigorously
followed organizational rules, policies, procedures, and tools are in
place to enable management to best apply and measure the use of
resources (people, processes, and technology) to accomplish mission
goals and objectives. While the absence of one or more of these
controls does not mean that an organization will fail, it does
unnecessarily limit the organization's ability to perform its mission
and respond to change, increasing the risk that mission performance
and accountability will suffer.
Based on our experience in examining a wide range of government
programs, we have previously reported on a set of eight institutional
management functions that are needed to ensure effective organization
management.[Footnote 26] In this report on DISA, we address five of
these eight functions: strategic planning, human capital
(specifically, IT human capital), organizational alignment,
information management (focusing here on enterprise architecture
management and IT investment management), and performance measurement
(this function is included as an element of all management areas).
[Footnote 27] We also address two additional management controls—
customer relations management and knowledge management—because both
are important and DISA identified them as central to its
organizational management capability, citing efforts under way to
establish both. Specifically, the management controls for DISA
addressed in this report are the following:
* strategic planning: establishing the agency's mission and vision,
including core values, goals, and approaches/strategies for achieving
the goals;
* IT human capital management: attracting, retaining, and motivating
the people who possess the knowledge, skills, and abilities that
enable an IT organization to accomplish its mission;
* organizational structure management: aligning operational
responsibilities with business and mission goals and objectives, and
maintaining an accountability framework;
* enterprise architecture management: developing, maintaining, and
using an explicit blueprint for operational and technological change;
* IT investment management: selecting and controlling investments in
IT so as to maximize benefits and minimize risk;
* customer relations management: focusing an organization's operations
on how to best satisfy customer needs; and
* knowledge management: capturing, understanding, and using the
collective body of information and intellect within an organization to
achieve organizational goals and objectives.
All these institutional controls are interrelated and interdependent,
collectively providing an organization with a comprehensive
understanding both of current business approaches and of efforts
(under way or planned) to change these approaches. These controls help
an organization determine how it is applying its resources, analyze
how to redirect these resources in the face of change, implement such
redirection, and measure success. With this decisionmaking capability,
the organization is better positioned to (among other things) direct
appropriate responses to unexpected changes in its environment.
Figure 2 is one way to represent how these key management controls are
related to an organization's basic resources: people, processes, and
technology.
Figure 2: Relationships Among Management Controls, People, Processes,
and Technology:
[Refer to PDF for image: illustration]
This illustration depicts 3 overlapping circles indicating the
following relationships:
People and Technology:
Management controls:
* Knowledge management;
* Customer relations management.
People and Process:
Management controls:
* IT human capital management;
* Organizational structure management.
Process and Technology:
Management controls:
* Enterprise architecture management;
* IT investment management.
People, Process and Technology:
Management control:
* Strategic planning.
[End of figure]
DISA has performed varying levels of activity in all of these
management areas. Much work remains to be accomplished, however,
before all can be viewed as mature and institutionalized. Generally,
DISA has progressed farthest in the areas that have been given
priority and received management focus. Until all the control areas
receive appropriate focus and are fully operative, DISA will be
challenged both in responding effectively to shifts in its strategic
direction and in improving its mission performance and accountability.
DISA Is Performing Important Strategic Planning Activities:
Effective strategic planning can be viewed as providing the foundation
for each of the other management control areas. Through strategic
planning, an organization describes a general vision of what it wants
to accomplish—and how it wants to accomplish that vision—by spelling
out its mission, core values, goals, and strategies. According to the
Government Performance and Results Act[Footnote 28] (GPRA) and related
OMB implementing guidance,[Footnote 29] effective strategic planning
includes the following elements, the first two of which are
fundamental to the establishment of the remaining four:
* defining a comprehensive, but brief, agency mission statement that
states the basic purpose of the agency and covers its major functions
and operations;
* defining general agency goals and objectives for all major functions
and operations within the agency's span of influence;
* describing how the goals and objectives are to be achieved,
including (1) operational processes, skills and technology, and the
human, capital, information, and other resources (such as reasonable
funding and staff projections) required to meet those goals and
objectives; (2) steps taken to resolve mission-critical management
problems; (3) efforts to provide high quality and efficient training
opportunities for staff; and (4) processes for communicating goals and
objectives throughout the agency;
* describing how the agency's performance goals are related to the
general goals and objectives, including a brief outline of the type,
nature, and scope of the performance goals, and the relevance and use
of performance goals in determining the achievement of general goals
and objectives;
* identifying key factors, external to the agency and beyond its
control, that could significantly affect achievement of the general
goals and objectives, including indicating their links to a particular
goal(s) and describing how achievement of the goal could be directly
and significantly affected by these factors; and
* describing the program evaluation(s) used in establishing or
revising the general goals and objectives of the strategic plan, and
including a schedule for future program evaluations.
DISA is performing important strategic planning activities as
described below. However, strategic planning can be strengthened with
respect to describing how strategic goals and objectives will be
achieved and how program evaluations will be used to establish and
revise goals and objectives, as is also described below.
* DISA's strategic plan[Footnote 30] includes a mission statement that
defines the agency's purpose and its primary business areas.
* Its strategic plan and the 500 Day Action Plan describe general goals
and objectives (see background section of this report for examples).
* Its strategic plan does not describe the approaches or strategies to
achieve goals and objectives. For example, while DISA addressed its IT
resource needs (such as staffing, training, and funding) in its annual
Program Operating Memorandum, it did not address the steps to be taken
to resolve mission-critical management problems and processes for
communicating goals and objectives throughout the agency. Furthermore,
although DISA's Director's Planning Guidance addresses "critical
initiatives" supporting the mission (such as the Global Command and
Control System and the Defense Message System), it did not explicitly
link these initiatives to DISA's strategic goals and objectives. If it
has not adequately defined the resources and strategies for achieving
goals and objectives, an agency reduces its ability to align its
activities, core processes, and resources to support achievement of
its strategic goals and mission, putting their achievement at risk.
* DISA's strategic planning has addressed the relationship between the
general goals and the annual performance goals. Specifically, DISA's
annual performance plan is referenced in its strategic plan, and the
performance plan links each performance goal/objective with the
specific agency strategic goals. Such a linkage is important in
ensuring that agency efforts are properly aligned with goals (and thus
contribute to their accomplishment), and in assessing progress toward
achieving these goals.
* DISA's strategic plan describes key external factors that could
affect DISA's strategic direction as defined in its goals and
objectives. For example, it describes how customer cooperation in
alerting DISA to operational changes (strategic and tactical) is
important to DISA's ability to carry out its mission and achieve its
goals and objectives.
* DISA's strategic planning does not adequately provide for using
program evaluations to establish/revise strategic goals. Although DISA
was performing and documenting evaluations of its programs, it could
not demonstrate that the findings of these evaluations were used in
developing strategic goals. Similarly, evaluation plans did not
consistently outline scope, key issues, and schedule: of six program
plans that DISA provided, only one outlined the scope and schedule for
evaluations. Also, DISA could not demonstrate that results of
evaluations were used to improve performance, although officials
stated that evaluation results were used in this way.
Program evaluations are an objective and formal assessment of the
results, impact, or effects of a program or policy. If an agency does
not establish a process for performing and using such evaluations in
considering strategic goals, it loses a critical source of information
to help ensure the validity and reasonableness of goals and
strategies, as well as to help identify factors likely to affect
performance. This information is also helpful in explaining results in
the agency's annual GPRA performance reports, especially if goals are
not met.
DISA Has Performed Important IT Human Capital Activities:
Modern human capital management values people and is aligned with an
organization's mission, vision, and strategic goals. Further, it
recognizes and invests in employees as critical assets for achieving
an organization's strategic business/mission goals and objectives. As
we have previously reported,[Footnote 31] strategic IT human capital
centers on viewing people as assets whose value to an organization can
be enhanced through investment. As the value of people increases, so
does the performance capacity of the organization. To maintain and
enhance the capabilities of IT staff, organizations should, among
other things,
* assess knowledge and skills needed to effectively perform IT
operations to support agency mission and goals;
* inventory the knowledge and skills of current IT staff;
* identify gaps between requirements and current staffing; and
* develop and implement plans to fill the gaps.
This management control has received considerable focus from DISA.
Thus far, the agency has performed activities supporting all four
elements of effective IT human capital management, as described below.
* DISA has begun to identify its IT human capital requirements, having
issued requests for its offices to identify workforce requirements.
However, how these requirements and the plans for meeting them are
aligned with DISA's strategic plan has yet to be documented. According
to DISA, a comprehensive 5-year workforce plan will be issued in March
2002, which will link to the agency's strategic plan. Until the agency
has this plan, it will be challenged in identifying its current and
future IT human capital needs (such as the size of the workforce and
the appropriate knowledge, skills, and abilities) to pursue its
mission.
* DISA has implemented an automated support system to assist it in
capturing, assessing, and managing the knowledge and skill set of its
workforce. The system is also designed to identify staff training
needs by comparing an individual's skills against the requirements for
a particular position. This system is a searchable database of the
skills possessed by all DISA staff, and it is intended to permit quick
identification of staff with special skills needed to accomplish
mission tasks.
* Also, DISA is using this automated support system to identify gaps
in staff strengths and developmental needs. DISA plans to use this
information to develop workforce plans addressing vacancies, to
understand gains and losses of staff by position, and to strengthen
staff competencies/skills in specific mission areas. DISA plans to
establish a workforce workgroup in January 2002 to develop the
workforce plans.
* DISA is taking steps to invest in training and development of its
staff to fill identified skills gaps. For example, it plans to
introduce individual development planning for all staff. In addition,
its course catalog (October 2000) provides for central management of
training and development of staff. According to DISA officials, the
agency is in the process of evaluating effective solutions for
requirements-driven training and training metrics. Once training and
development needs are identified, DISA plans to implement enhancements
to its training program, beginning in fiscal year 2002. Such
investments in training and development are necessary for an agency to
ensure that it is building the competencies needed to achieve its
shared vision.
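The gap identification that the automated support system performs
amounts to comparing required and possessed skill sets. The skill
names below are hypothetical examples, not entries from DISA's
database:

```python
# Hypothetical skill requirements for a position and one staff
# member's recorded skills (illustrative only).
required_skills = {"network engineering", "information assurance",
                   "database administration"}
staff_skills = {"network engineering", "database administration"}

# Skills required for the position but not yet possessed; these
# would feed the individual's training and development plan.
training_needs = required_skills - staff_skills
```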
DISA Has Recently Realigned Its Organizational Structure:
To be responsive to the needs of customers and apply resources to
respond to a rapidly changing environment, an organization needs to
structure itself in a way that minimizes bureaucracy. In doing so, as
we have reported,[Footnote 32] an agency needs to accomplish, among
other things, the following:
* Reduce multiple management layers (team-based matrix management is
used to streamline processes; senior executives are empowered).
* Reduce organizational subdivisions (number of divisions is reduced;
local, regional, and worldwide offices are consolidated).
* Improve coordination, productivity, and team-building throughout the
organization (employee feedback is encouraged, and employee suggestion
programs are in place; organization encourages enhanced customer
communication and feedback).
DISA implemented a new organizational structure on October 1, 2001,
and established the Office of the Chief Transformation Executive to
guide the integration of changes in people, processes, structure,
policy, and tools to achieve organizational transformation goals.
According to DISA officials, this new structure was designed to
position the agency to manage change and is aligned with DISA's global
support business areas, such as network services, computing services,
field operations, and application engineering.
DISA's new organizational structure reduced and consolidated
management layers and subdivisions. The new structure reduces the
number of field and line organizations from 27 to 20. In the national
capital region, which includes DISA headquarters, staff are being
consolidated from 15 locations down to 3. In addition, as part of the
reorganization, the agency implemented a Corporate Board (composed of
senior executives and the DISA director) to facilitate integrated
entitywide decisionmaking.
However, fully establishing this management control area still
requires improvements in coordination, productivity, and
team-building, through methods that encourage enhanced customer
communication and feedback. While DISA has introduced internal
communications and
feedback channels, such as directorate-specific all-hands meetings,
external communications and feedback channels are still evolving (see
the discussions of customer relations management and knowledge
management control areas, later in this section). Without these
channels, an organization's ability to get needed information to
appropriate decisionmakers can be impaired.
DISA Had Not Focused Efforts on Enterprise Architecture Management:
Enterprise architectures (EA) are essential tools for effectively and
efficiently engineering business processes and for implementing and
evolving supporting systems. These architectures are systematically
derived and captured descriptions—in useful models, diagrams, and
narrative—of the mode of operation for a given enterprise (e.g., an
agency). They describe the agency in both (1) logical terms, such as
interrelated business processes and business rules, information needs
and flows, and work locations and users; and (2) technical terms, such
as hardware, software, data, communications, and security attributes
and standards. These architectures provide these perspectives both for
the current or "as is" environment and for the target or "to be"
environment, as well as a transition plan for sequencing from the "as
is" to the "to be" environment. Managed properly, an EA can clarify
and help optimize the interdependencies and interrelationships among
an agency's business operations and the underlying IT infrastructure
and applications that support these operations.
The federal Chief Information Officers (CIO) Council, in collaboration
with us, issued guidance on architecture management.[Footnote 33] This
guidance specifies six primary areas of effective EA management:
* initiating the EA program by obtaining executive support,
establishing management structure and control, and developing program
activities and products;
* defining an architecture process and approach, including defining
the intended use and scope of the EA, determining the depth of the EA,
and selecting the EA products, framework, and toolset;
* developing the EA, including collecting information used in
developing the baseline EA of the organization's current or "as is"
state against which future progress can be measured, developing the
target EA of the organization's vision of future business operations
and supporting technology, developing a sequencing plan that defines
the incremental steps for making the transition from the baseline to
the target architecture, and approving the EA for use;
* using the EA to facilitate systematic agency change by continuously
aligning technology investments and projects with agency needs;
* maintaining the EA through periodic reassessments to ensure its
continued alignment with the organization's business practices,
funding profiles, technologies, and projects; and;
* continuously controlling and overseeing the EA program, including
ensuring that controls are in place and functioning and that
weaknesses are identified and addressed.
DISA's EA management capability is less established than that of any
other control area. Thus far, the agency's efforts have been limited to
deciding to
base its EA on the DOD architecture framework[Footnote 34] and stating
its intention to use the EA to support the management of its IT
investments. As a result, much remains to be accomplished. According
to DISA officials, EA management has not been an area of DISA
leadership focus and attention. Without this architecture, DISA lacks
the operational and technical blueprint for guiding and constraining
its investments, such as those in its 500 Day Action Plan, in a way
that optimizes agencywide performance and accountability.
Thus far, the DISA CIO has proposed high-level EA program targets, but
has not yet obtained buy-in for these targets from the DISA director
and senior business executives. Such executive commitment provides the CIO with
necessary sponsorship to fund development and maintenance of the EA.
Also, DISA has taken some steps to establish an EA management
structure. For example, a DISA chief architect has been appointed, and
a working group responsible for developing an EA has been established.
However, dates have not been approved for establishing a program
management office or for appointing key personnel necessary for
developing and maintaining an EA. Because the EA is a corporate asset
requiring investment of agency resources, a formal program management
structure is necessary to ensure successful execution of the process.
DISA issued a policy letter on November 21, 2001, governing the
implementation of its EA, which states that systems will adhere to
DOD's established architecture framework. However, the policy letter
did not address other activities associated with this process, such as
defining the intended use and scope of the EA, determining its depth,
and selecting products and tools. Until the agency fully defines its
EA process and approach, it will not have an adequate basis for
ensuring that its architecture is properly developed and tailored to
the scope and nature of the agency's needs.
Without a defined architectural process and approach, DISA cannot
accomplish the other areas of effective EA management and thus will
continue to lack an EA to guide and direct its investment in new and
existing IT assets in a way that promotes effective operational and
technological change. As we have reported at other agencies,[Footnote
35] investing in systems without an EA increases the risk that systems
will not meet business needs, will be incompatible, will perform
poorly, and will cost more to develop, integrate, and maintain than is
warranted.
Appendix III includes a table that provides more details on the state of
DISA's EA management control area.
IT investment management is a structured, disciplined approach to
selecting, controlling, and evaluating a portfolio of competing
investment options. This approach to managing IT investments permits
informed and deliberative organizational decisionmaking about how to
best expend resources on IT-related initiatives in a manner that
maximizes return on investment and minimizes risk. We have issued an
information technology investment management (ITIM)
framework,[Footnote 36] which identifies critical processes for
successful IT investment and organizes these processes into a
framework of increasingly mature stages. The framework supports the
fundamental investment management requirements of the Clinger-Cohen
Act[Footnote 37] and provides a tool for implementing those
requirements. ITIM has been favorably reviewed by federal CIOs and
OMB. A summary of the framework is provided in figure 3, and each of
its five stages is described further below.
Figure 3: The Five Stages of Maturity Within IT Investment Management:
[Refer to PDF for image: illustration]
From Project-centric to Enterprise and strategic focus:
Maturity: Stage 1: Creating investment awareness;
Description: There is little awareness of investment management
techniques. IT management processes are ad hoc and project-centric,
and they have widely variable outcomes.
Maturity: Stage 2: Building the investment foundation;
Description: Repeatable investment control techniques are in place,
and the key foundation capabilities have been implemented focusing on
cost and schedule activities.
Maturity: Stage 3: Developing a complete investment portfolio;
Description: Comprehensive IT investment portfolio selection and
control techniques are in place that incorporate benefit and risk
criteria linked to mission goals and strategies.
Maturity: Stage 4: Improving the investment process;
Description: Process evaluation techniques focus on improving the
performance and management of the organization's IT investment
portfolio.
Maturity: Stage 5: Leveraging IT for strategic outcomes;
Description: Investment benchmarking and IT-enabled change techniques
are deployed to strategically shape business outcomes.
Source: U.S. General Accounting Office, Information Technology
Investment Management: A Framework for Assessing and Improving Process
Maturity, Exposure Draft, GAO/AIMD-10.1.23, version 1 (Washington,
D.C.: May 2000).
[End of figure]
Stage 1: Creating investment awareness. In the first stage of IT
investment management, the starting point for all organizations, the
organization is becoming aware of the need to manage investments. This
stage is marked by the existence of ad hoc, unstructured, and
unpredictable investment decisions, with little or no relationship
between the success or failure of one investment and that of another.
Stage 2: Building the investment foundation. In the second stage of
maturity, repeatable investment techniques are in place, and key
capabilities have been implemented. To achieve this stage of maturity,
an organization must establish five critical processes:
* establishing and operating an IT investment board (or more than one)
to make investment decisions;
* performing project oversight, including monitoring projects relative
to cost and schedule expectations;
* tracking IT assets, including creating and maintaining an IT
inventory and providing tracking data to executive decisionmakers;
* identifying business needs for IT projects, which requires
identifying key customers or end users and the near-term business
needs that each project will support; and;
* selecting proposals systematically by applying defined investment
criteria.
Stage 3: Developing a complete investment portfolio. To have effective
IT investment management, an organization must be at this stage of the
framework or higher. This stage requires the establishment of five
critical processes:
* aligning authority of IT investment boards, so that their
responsibilities and activities are coordinated (if an organization
has more than one such board);
* defining portfolio selection criteria so that decisionmakers can
communicate to the organization the criteria used to select and fund
investments;
* analyzing investments, including their fundamental cost, benefit,
schedule, and risk characteristics, before they are funded and
combined with other investments into a portfolio;
* developing an investment portfolio by comparing, selecting, and
funding worthwhile investments; and;
* overseeing portfolio performance by adding the elements of
investment benefit and risk management to the control process
activities begun in stage two.
Stage 4: Improving the investment process. When IT investment
management is sufficiently mature, organizations are at the stage
where they can begin improving the process. At stage four,
organizations are focused on using evaluation techniques to improve
their IT investment processes and portfolios along with maintaining
mature control and selection processes. The three critical processes
are:
* performing postimplementation reviews and providing feedback,
* evaluating and improving portfolio performance, and,
* managing systems and technology succession.
Stage 5: Leveraging IT for strategic outcomes. When its IT investment
management is at the highest level of maturity, an organization shapes
its strategic outcomes by learning from other organizations and
continuously improving the manner in which it uses IT to support and
improve business results. The critical processes of stage five are:
* performing investment process benchmarking and,
* managing IT-driven strategic business change.
Our analysis of DISA against the ITIM framework showed that the agency
has fulfilled some elements of both stages 2 and 3 but none in stage 4
or 5. According to a DISA official, the agency sees itself as between
stages 1 and 2. Further, DISA plans to first develop a consistent,
repeatable process as the foundation for building a portfolio-based
approach to IT investment management. This plan is consistent with our
staged framework. The status of DISA's efforts in each of the ITIM
stages follows.
Stage 2 processes: Of the five elements for maturity stage 2, DISA has
focused its activities on two: establishing an IT investment
board and tracking IT assets. In addition, it is performing some
activities in the other three elements. Each of these elements is
discussed below.
* DISA has established an IT investment board, which was chartered on
November 28, 2001. The board operates according to DISA's IT Capital
Investment Process Implementation Plan (version 2.0), issued in
October 2001.
* DISA is working to perform IT project oversight, including
formalizing the review process for the IT investment board and
refining a project data collection instrument currently in use.
Because these activities are not yet established, however, DISA is not
able to routinely provide each project's up-to-date cost and schedule
data to the IT investment board.
* Through issuance of the 500 Day Action Plan, DISA has begun to track
its portfolio of IT systems. In addition, DISA uses the Defense IT
Management System as a central repository for information on IT
assets, such as management, reutilization, and accounting data.
* DISA officials stated that to identify business needs for IT
projects, the agency identifies specific users for each IT project
throughout its life cycle and includes this information in the
project's program plan. However, DISA could not provide any evidence
to substantiate these statements.
* DISA officials have drafted guidance for use in systematic selection
of proposals. However, until the process is in place and functioning,
DISA is not able to develop, analyze, and prioritize proposals in
support of funding decisions.
Until these basic, repeatable processes for selecting and managing
individual project investments are established, IT projects are less
likely to deliver promised capabilities on time and within budget.
Stage 3 processes: DISA has not established any critical processes
associated with stage 3, but it has begun work on those stage 3
processes that lay the groundwork for establishing the others. Examples
of partially established and not established critical processes are as
follows.
* DISA has drafted portfolio selection criteria. However, the IT
investment board has not approved the selection criteria and the
criteria have not been distributed throughout the organization.
Currently, DISA's investment board is testing the draft IT portfolio
selection criteria.
* DISA is not yet analyzing investments using its selection criteria;
it is currently testing the draft criteria via analysis of a single
project.
* DISA has not yet established critical processes for developing and
overseeing an investment portfolio.
Without a portfolio-based approach to investment management, an agency
will be challenged in its ability to invest in the right mix of
projects to best meet mission goals.
Appendix III provides a table summarizing the state of DISA's IT
investment management control area. The table also includes
descriptions of the elements associated with each stage of maturity
within the ITIM framework.
DISA Is Performing Important Customer Relations Management Activities:
Private industry leaders have promulgated guidance for establishing an
effective customer relations management (CRM) capability.[Footnote 38]
This guidance states that in order to meet customers' needs and
expectations, an organization should become externally focused and
establish partnerships with its customers. Such a customer-focused
organization also aligns its business strategy with technologies,
applications, processes, and organizational changes to optimize both
the cost-effectiveness of operations and customer satisfaction. As
with the other management process areas discussed in this report,
establishing a CRM capability begins with the adoption of a strategic
vision, supported by senior management, that:
* fosters a culture of client focus,
* is committed to CRM strategy,
* establishes CRM goals, and,
* defines a strategy to reach CRM goals.
With this commitment, the supporting business process, organizational,
and technology infrastructure is then established to collect, analyze,
and maintain customer information. More specifically, this means that:
* CRM processes are integrated throughout the organization,
* customer information is collected,
* customer needs and expectations are identified,
* flexible solutions and enabling technologies are evaluated and
implemented to warehouse customer information and maximize client
satisfaction, and,
* CRM staff is trained and developed.
Once this infrastructure is established, the CRM operational
capability is to be sustained through continuous measurement and
improvement, including:
* using customer feedback surveys and focus groups and,
* using results to improve CRM processes.
Customer relations management has been a priority area for DISA, as
evidenced by the focus of its 500 Day Action Plan. Thus, DISA has
performed many CRM activities, including developing a CRM strategy,
measuring progress, and using the results of these measurements for
continuous improvement. It has also taken steps to build and maintain
the necessary supporting infrastructure. Specifically, DISA has
established the means to collect customer information and identify
customer needs, as demonstrated through development of its 500 Day
Action Plan. However, it is still pilot testing an electronic commerce
CRM Web portal as part of its evaluation of solutions and enabling
technologies, and this pilot has not been extended and integrated
throughout DISA. Moreover, according to DISA's CRM strategy briefing,
the pilot depends on DISA's enterprise architecture and knowledge
management activities; however, as discussed in this report, neither
of these management control areas has yet been established. Further,
DISA's CRM training program is planned for fiscal year 2002. Until it
has the infrastructure to support and implement its CRM strategy, DISA
will be challenged in its ability to effectively manage customer
relations.
DISA's Knowledge Management Area Is Under Development:
Effective knowledge management captures the collective body of
information and intellect within an organization, treats the resultant
knowledge base as a valued asset, and makes relevant parts of the
knowledge base available to decisionmakers at all levels of the
organization. Knowledge management is closely aligned with enterprise
architecture management, because both focus on systematically
identifying the information needs of the organization and describing
the means for sharing this information among those who need it.
Guidance issued by the federal CIO Council[Footnote 39] provides a
framework for establishing a knowledge management capability. Elements
involved in institutionalizing this function include:
* deciding with whom (both internally and externally) to share
organizational knowledge;
* deciding what knowledge is to be shared, through performing a
knowledge audit and creating a knowledge map;
* deciding how the knowledge is to be shared, through creating
apprenticeships/mentoring programs and communities of practice for
transferring tacit knowledge, identifying best practices and lessons
learned, managing knowledge content, and evaluating methods for
sharing knowledge; and;
* sharing and using organizational knowledge, through obtaining
sustained executive commitment, integrating the knowledge management
function across the enterprise and embedding it in business models,
communicating strategies, and measuring performance and value.
DISA has performed limited activities to establish effective knowledge
management. The agency has designated a knowledge management
organization that is to report to the DISA Corporate Board and has
appointed a knowledge management chief. Also, the DISA vice director
signed the knowledge management council charter on August 28, 2001.
However, until DISA institutionalizes the knowledge management
function throughout its organization, it cannot ensure the
availability and continued value of knowledge assets to support
strategic goals and objectives.
Described below are areas in which DISA's efforts to develop effective
knowledge management are limited.
* DISA has not yet defined with whom to share organizational
knowledge. DISA has begun drafting a review and approval process for
sharing organizational knowledge, but this draft does not address
establishing internal and external parties with whom DISA would share
information.
* Similarly, DISA has not determined what knowledge to share. Although
DISA has begun drafting a DISA knowledge implementation plan for
establishing the activities associated with this process, there are
no finalized, approved plans to define the implementation. Further (as
discussed in the section on the agency's enterprise architecture
management), DISA has not yet begun to develop its architecture, which
would include a related determination of what information (i.e.,
knowledge) is needed by whom, where and when it is needed, and in what
form it is needed to perform mission operations.
* DISA has not yet determined how to share its organizational
knowledge: that is, how to make knowledge available. DISA's knowledge
management chief and knowledge management council have not yet begun
to address how DISA will share knowledge. Again, this determination is
closely aligned with developing the enterprise architecture, which
DISA has yet to do.
The three elements above lay the foundation for implementing an
effective knowledge management function throughout the agency. Thus,
DISA has not yet progressed to the point of performing the activities
associated with implementation, the fourth element of this management
control area.
Conclusions:
Through development and implementation of its 500 Day Action Plan,
DISA has demonstrated a commitment to improving its customer
orientation. However, DISA's action plan development efforts were
focused solely on customer satisfaction and did not effectively
address whether planned actions would be cost-effective and thus worth
pursuing. As a result, DISA cannot be assured that it is pursuing
initiatives under the plan that are the most prudent strategic
investment choices among competing options. DISA has taken steps to
address this planning limitation as part of its efforts to manage
implementation of the plan; however, these steps stop short of
adequately addressing how to determine the most cost-effective
portfolio of action plan initiatives. Unless DISA expands the focus of
its planning and performance measurement to include cost-effectiveness
considerations, it runs the risk of investing in areas and assets
that, while satisfying customer-defined needs, do not produce mission
value commensurate with costs. DISA's commitment to improving customer
satisfaction is appropriate and laudable, but it must be equally
committed to opportunities to reduce its costs of operations and
improve its mission performance.
Through its ongoing efforts to implement important institutional
management controls, DISA is building the institutional capacity
needed to implement its strategic goals and objectives and to respond
effectively to changes in its environment. However, this suite of
management controls is largely a work in progress. The key for DISA
will be to remain vigilant in completing these controls and in doing
so expeditiously. Fortunately, DISA leadership has already taken at
least the first steps in developing and implementing all these
controls, and its progress thus far indicates an understanding and
appreciation of the value and urgency of completing them.
Nevertheless, until these controls are in place and functioning, DISA
will not have the organizational means to accommodate change and to
realize its vision of being the preferred provider of information
services across DOD.
Recommendations:
To improve DISA's development and execution of its current and future
IT investment action plans, we recommend that the secretary of defense
direct the DISA director, through the assistant secretary of defense
for command, control, communications, and intelligence, to follow a
structured and disciplined IT investment management process for
selection, control, and evaluation of the initiatives in current and
future action plans.
For plan development, we recommend that the DISA director:
* define the general scope of actions and establish preliminary life-
cycle cost, schedule, benefit, and risk baselines for actions; and;
* perform a preliminary, high-level assessment of return on investment
for proposed actions to gauge their cost-effectiveness.
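The preliminary, high-level return-on-investment assessment recommended
above can be as simple as comparing estimated life-cycle benefits with
estimated life-cycle costs. The sketch below illustrates this
arithmetic; the dollar figures are hypothetical, and a real assessment
would draw on the cost and benefit baselines established for each
action.

```python
def simple_roi(lifecycle_benefits, lifecycle_costs):
    """Return ROI as a fraction: (benefits - costs) / costs."""
    if lifecycle_costs <= 0:
        raise ValueError("life-cycle cost must be positive")
    return (lifecycle_benefits - lifecycle_costs) / lifecycle_costs

# Hypothetical action: $2.0M estimated life-cycle cost,
# $2.6M estimated life-cycle benefits.
roi = simple_roi(2_600_000, 2_000_000)
print(f"{roi:.0%}")  # 30%
```

A positive preliminary ROI suggests an action may be a cost-effective
investment choice worth carrying forward into more rigorous analysis; a
negative one flags an action for reconsideration before funding.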
For plan implementation, we recommend that the DISA director:
* use approved baselines to develop meaningful results-oriented
performance metrics;
* implement a formal process (1) to control significant changes to
action baselines and closure of actions and (2) to inform stakeholders
of significant deviations in the action baselines;
* in monitoring implementation of the planned actions, update scope of
work, cost, schedule, benefit, and risk baselines for all actions, as
appropriate, to ensure that actions remain cost-effective investment
choices; and;
* establish a mechanism to track customer feedback to ensure that the
customer concerns that led to the actions are resolved.
To improve institutional management controls needed to respond to
changes in strategic direction, we recommend that the secretary of
defense direct the DISA director, through the assistant secretary of
defense for command, control, communications, and intelligence, to
make it an agency priority to establish the elements described in this
report for each of the following management controls: (1) strategic
planning, (2) organizational structure management, (3) enterprise
architecture management, (4) IT investment management, (5) customer
relations management, and (6) knowledge management. For IT human
capital management, we are not making recommendations in light of the
fact that DISA has either completed or is close to completing each of
the important elements of effective IT human capital management
discussed in the report. For the other management controls, we
specifically recommend that the agency do the following:
To strengthen the agency's strategic planning, we recommend that the
DISA director:
* fully define approaches or strategies to achieve goals and objectives,
* completely explain the relationship between the general goals and
the annual performance goals, and;
* fully describe how program evaluations are used to establish and
revise strategic goals.
As part of its ongoing organizational structure management, we
recommend that the DISA director evaluate and implement solutions for
advancing coordination, productivity, and team-building.
To strengthen management of DISA's effort to develop, implement, and
maintain an enterprise architecture, we recommend that the DISA
director follow the steps defined in the CIO Council's guide on
architecture management,[Footnote 40] as appropriate, including
* initiating a program;
* defining the architecture process and approach;
* developing the architecture, including the baseline and target
architectures, and the plan for sequencing from the baseline to the
target;
* using the architecture in making IT investment decisions;
* maintaining the architecture; and;
* continuously controlling and overseeing the program.
To establish effective IT investment management, we recommend that the
DISA director follow the steps detailed in our IT investment
management guide,[Footnote 41] including (1) building a foundation for
IT investments, including:
* establishing and operating an IT investment board,
* performing IT project oversight,
* tracking IT assets,
* identifying business needs for IT projects, and,
* selecting proposals systematically,
and (2) establishing the capability to manage investments as a
complete investment portfolio, including:
* defining portfolio selection criteria,
* analyzing investments,
* developing an investment portfolio, and,
* overseeing portfolio performance.
To strengthen customer relations management, we recommend that the
DISA director build and maintain a supporting customer relations
infrastructure that permeates the entire organization.
Finally, to define and implement an organizationally integrated
knowledge management function, we recommend that the DISA director
follow the steps outlined in the CIO Council guide on this subject,
[Footnote 42] including:
* deciding with whom to share organizational knowledge,
* deciding what organizational knowledge to share,
* deciding how to share organizational knowledge, and,
* institutionalizing and using the knowledge management process.
Agency Comments and Our Evaluation:
In written comments on a draft of this report, the assistant secretary
of defense for command, control, communications, and intelligence, who
is the DOD CIO, stated that our review highlighted many improvements to
DISA's management of IT investments (see app. IV) and that DOD
concurred or partially concurred with all our recommendations. DOD
also stated that, through its close work with us during this review,
DISA is either in the process of implementing, or has plans to
implement, our recommendations and that doing so will improve support
customers. Additionally, DOD described DISA's ongoing and planned
efforts for each recommendation. We acknowledge DISA's responsiveness
and plan to follow up periodically on DISA's progress in fully
addressing each recommendation.
For one area of our recommendations, DOD qualified its agreement,
stating that it partially concurred. Specifically, regarding our
recommendations to improve plan development, DOD agreed that defining
the scope of actions and establishing cost, schedule, benefit, and
risk baselines and related assessments of cost-effectiveness were
required for project actions. However, DOD did not agree that all
actions require this level of definition and assessment. We recognize
that while all actions involve an investment of resources, the nature
of projects differs, and thus the level of investment management rigor
should be commensurate with the needs of the project. In our opinion,
DOD's development of a guideline for defining the scope and
establishing baselines for actions is a positive step toward
ultimately controlling DISA's 500 Day Action Plan investments.
We are sending copies of this report to the chairmen and ranking
minority members of the Subcommittee on Defense, Senate Committee on
Appropriations; the Subcommittee on Readiness and Management Support,
Senate Committee on Armed Services; the Subcommittee on Defense, House
Committee on Appropriations; and the Subcommittee on Military
Readiness, House Committee on Armed Services. We are also sending
copies to the secretary of defense; the director, Office of Management
and Budget; and the director, Defense Information Systems Agency.
Copies will be made available to others upon request.
If you or your staff have any questions on matters discussed in this
report, please contact me at (202) 512-3439 or Nancy A. DeFrancesco,
Assistant Director, at (202) 512-3225. We can also be reached by E-
mail at hiter@gao.gov and defrancescon@gao.gov. Other key contributors
to this report were Bernard Anderson, Barbara Collier, M. Saad Khan,
and B. Scott Pettis.
Signed by:
Randolph C. Hite:
Director, Information Technology Architecture and Systems Issues:
[End of section]
Appendix I: Objectives, Scope, and Methodology:
Our objectives were to determine whether DISA (1) had effectively
managed development of the 500 Day Action Plan, (2) is effectively
managing execution of the action plan, and (3) has established the
institutional management controls needed to effectively adjust to
shifts in strategic direction. These controls include (a) strategic
planning, (b) IT human capital management, (c) organizational
structure management, (d) enterprise architecture management, (e) IT
investment management, (f) customer relations management, and (g)
knowledge management. As further agreed, our review of these
management controls focused on whether DISA had either established
them or was in the process of doing so; it did not include evaluating
their effectiveness.
To assess DISA's development and execution of the 500 Day Action Plan,
we reviewed documentation of 479 original customer inputs to the plan
in September 2000, and customer comments on the draft plan received by
DISA in January and February 2001; we compared comments received by
DISA to the resulting plan, issued in March 2001. In addition, we
interviewed officials of the Office of the Deputy Director for
Strategic Plans, Programming, and Policy and compared DISA's practices
(both in place and planned) to federal criteria and industry best
practices for internal controls, planning, and management of
information technology (IT) investments. Specific criteria are
contained in the following:
* Office of Management and Budget (OMB) Circular A-11, Preparing and
Submitting Budget Estimates (July 19, 2000).
* OMB Circular A-130, Management of Federal Information Resources
(November 28, 2000).
* DOD Directive 5010.38, Management Control (MC) Program (August 26,
1996).
* DOD Directive 5105.19, Defense Information Systems Agency (DISA)
(June 25, 1991).
* DOD Directive 8000.1, Defense Information Management (IM) Program
(October 27, 1992).
* DISA Circular 400-120-1, Management and Engineering Plan Guide (July
1, 1996).
* DOD Chief Information Officer (CIO) Guidance and Policy Memorandum
(G&PM) No. 11-8450, Department of Defense (DOD) Global Information
Grid (GIG) Computing (April 6, 2001).
* Department of Defense ADP [Automated Data Processing] Internal
Control Guideline (July 1988).
* A Practical Guide to Federal Enterprise Architecture, Chief
Information Officers Council, version 1.0 (February 2001).
* Information Technology Investment Management: A Framework for
Assessing and Improving Process Maturity, Exposure Draft, GAO/AIMD-
10.1.23, version 1 (May 2000).
Using an agency-developed listing that identified the 140 actions as
44 project actions, 44 mission-based service actions, and 52 process
actions, we selected a statistical sample of 57 actions (18 project
actions, 18 mission service actions, and 21 process actions). This
sample size was determined to provide precision (with 95 percent
confidence) of ±10 percentage points or better. We examined
documentation supporting the development, planning, management, and
monitoring of these actions.
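The report does not show the computation behind this sample size, but a conventional formula for estimating a proportion at 95 percent confidence with a ±10 percentage point margin, adjusted for the finite population of 140 actions, reproduces the figure of 57 (this is an illustrative check, not necessarily GAO's actual method):

```python
import math

# Illustrative check of the sample size (not necessarily GAO's method).
Z = 1.96   # z-score for 95 percent confidence
E = 0.10   # margin of error: 10 percentage points
P = 0.5    # most conservative assumed proportion
N = 140    # total actions in the 500 Day Action Plan

n0 = (Z ** 2) * P * (1 - P) / E ** 2   # infinite-population sample size (96.04)
n = math.ceil(n0 / (1 + n0 / N))       # finite population correction

print(n)  # 57, matching the sample described above
```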
We reviewed documentation supporting DISA's efforts to monitor the
status of the action plan, including the meeting minutes from the DISA
Corporate Board meetings held on August 20 and September 7, 2001. We
also interviewed officials in the Office of the Deputy Director for
Strategic Plans, Programming, and Policy and examined documentation
supporting the closure of seven actions that were completed during our
review.
To determine the extent to which DISA measures and monitors its
performance, we reviewed documentation on studies of DISA's efficiency
and effectiveness. Of 159 such studies identified to us by DISA
(including about 130 manpower or budget studies) dating from fiscal
years 1995 to 2001, we reviewed documentation supporting 34 of 103
studies conducted or in process for fiscal years 1998 to 2001. We also
reviewed DISA's finalized performance contracts for the fiscal years
2000 and 2001, as well as documentation supporting contract status and
accomplishment of performance measures for these years. This
documentation included reports on the results of customer satisfaction
surveys and on methodology used, as well as benchmarking studies that
compared the efficiency and effectiveness of DISA's computing and
telecommunications services to industry averages. We also reviewed
DISA's draft performance contracts and related guidance for fiscal
years 2002 and 2003. To assess alignment of DISA's strategic goals to
these performance measures, we reviewed a correlation of the 500 Day
Action Plan with DISA's strategic plan, fiscal year 2002 performance
contract, and fiscal year 2002 GPRA performance plan.
To determine whether DISA has the management controls in place to
facilitate operational change in response to shifts in DOD strategy,
we researched federal criteria and best practices to identify key
institutional management controls that enable an organization to
accommodate change and transition to a results orientation and
increased accountability. These include the following:
* OMB Circular A-11, Preparing and Submitting Budget Estimates (July
19, 2000).
* Determining Performance and Accountability Challenges and High
Risks, GAO-01-159SP (November 2000).
* Human Capital: Attracting and Retaining a High-Quality Information
Technology Workforce, GAO-02-113T, (October 4, 2001).
* A Practical Guide to Federal Enterprise Architecture, Chief
Information Officers Council, version 1.0 (February 2001).
* Information Technology Investment Management: A Framework for
Assessing and Improving Process Maturity, Exposure Draft, GAO/AIMD-
10.1.23, version 1 (May 2000).
* resources of the CRM-Forum, an independent forum for CRM research
conducted by private industry experts and consulting firms, including
Deloitte Research and Gartner Group.
* Managing Knowledge @ Work: An Overview of Knowledge Management,
Chief Information Officers Council (August 2001).
To determine DISA's progress in establishing the seven management
control areas identified above, we reviewed documentation pertaining
to DISA's transformation and compared DISA's management environment
(both planned and in place) to these areas. We also developed tables
providing our assessments of DISA's status in performing EA management
and IT investment management control activities, analyzed in terms of
critical processes and key practice activities. A critical process is
a structured set of key practice activities that, when performed
collectively, contributes to attaining intended results. A key
practice activity is a process element that occurs over time, has
recognizable results, and is necessary to implement a critical process
(such as establishing procedures, performing and tracking work, and
taking corrective actions). We rated each key practice activity as
established, partially established, or not established. An established
activity was one that was supported by documentation showing that the
activity was systematically defined and reflected in DISA policies and
procedures. A partially established activity was in a proposed or
draft state, was not formally documented, or had documentation showing
that it did not meet requirements of federal criteria or best
practices. A not established activity was one that was not addressed
in formal or proposed documentation.
DISA's progress for each critical process was determined by the status
of the key practice activities associated with that process. For a
critical process to be assessed as either established or not
established, all the associated activities had to be assessed
correspondingly. For a critical process to be rated as partially
established, at least one activity had to be either established or
partially established.
We also interviewed officials from the following DISA offices assigned
organizational responsibility for these areas:
* Office of the Director for Strategic Plans, Programming, and Policy;
* Office of the Director for Manpower, Personnel, and Security;
* Office of the Deputy Director for Joint Requirements Analysis and
Integration;
* Office of the Deputy Director for C4I Modeling, Simulation, and
Assessment; and;
* Office of the Chief Information Officer.
We conducted our work at the DISA offices in Arlington, VA. We
performed our work from June through December 2001, in accordance with
generally accepted government auditing standards.
[End of section]
Appendix II:
Status of DISA's Efforts to Benchmark Performance:
As discussed in our IT investment management guide,[Footnote 43]
benchmarking of customer satisfaction provides valuable feedback for
improving an organization's products and services. Benchmarking
enables an organization to identify and compare its own practices and
performance levels to those of peers in industry and government, so
that performance and accountability can be improved. Recognizing this,
DISA began performing (in fiscal year 2000) benchmarking comparisons
for the telecommunications (voice and data) and mainframe computing
services that it offers, focusing on (1) customer satisfaction, (2)
quality, and (3) cost.
According to DISA, it is measuring implementation of its 500 Day
Action Plan through the annual benchmarking process set up under its
fiscal year 2002 performance contract. However, as discussed in the
body of this report, few of the measurement activities in DISA's
performance contract are aligned with action plan baselines. Thus,
DISA's benchmarking efforts are not a useful and meaningful measure of
action plan implementation. Specifically, a mapping of the performance
contract to the action plan shows that 100 actions (71 percent of the
total 140 actions) do not correlate to the two benchmarking categories
covered by the performance contract (telecommunications and mainframe
computing). Although we cannot provide specific examples of these 100
actions because they are not for public disclosure, the actions that
fall outside the scope of DISA's benchmarking pertain to joint
warfighting capabilities, including the levels of support provided to
specific customers and the use of emerging technologies.
Even if benchmarking efforts were aligned with planned actions, DISA
has not benchmarked all the services it provides (such as mid-tier
computing[Footnote 44] services), and the results for those services
that have been assessed show mixed levels of performance.
Specifically, before fiscal year 2000, DISA assessed performance using
customer surveys, which focused on customer satisfaction and did not
address cost-effectiveness. The survey conducted in 1999,
[Footnote 45] for example, reported acceptable[Footnote 46] customer
satisfaction ratings for computing and telecommunications services.
However, aggregating the overall ratings as acceptable for each
element masked dissatisfaction at the sub-element level. For example,
aggregate customer satisfaction with voice, video,
and data telecommunications products and services was rated high
(slightly above 75 percent), even though less than 75 percent of
respondents were satisfied with video and data services, and almost 25
percent were in fact dissatisfied with data telecommunications
services. In this assessment, DISA did not measure either the rates it
charged customers or the quality of service, and it did not benchmark
performance against commercial peers.
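A worked example with purely hypothetical figures (these are not DISA's survey data) illustrates how an acceptable aggregate rating can mask dissatisfaction at the sub-element level:

```python
# Hypothetical sub-element satisfaction rates (not DISA's actual survey data).
sub_elements = {"voice": 0.88, "video": 0.72, "data": 0.68}

aggregate = sum(sub_elements.values()) / len(sub_elements)
below_threshold = [name for name, rate in sub_elements.items() if rate < 0.75]

print(round(aggregate, 2))   # 0.76 -- aggregate looks "slightly above 75 percent"
print(below_threshold)       # ['video', 'data'] -- masked dissatisfaction
```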
To DISA's credit, more recent assessments of customer satisfaction
with DISA's mainframe computing services show improvement, with
average customer satisfaction ratings for fiscal years 2000 and 2001
that are higher than the average for industry peers. However, DISA's
benchmarking of the cost-effectiveness of its mainframe computing
services had not been completed for fiscal year 2000, according to a
DISA official, because of difficulty in identifying commercial
industry rates for comparison. Officials told us that the 2000 results
have been combined with the 2001 results. DISA issued a summary report
of these results on November 28, 2001. In its summary report, DISA
stated that it performed better than commercial providers in the areas
of central processing unit and direct access storage device
acquisition and management; however, it realized higher costs than
commercial providers in the areas of staffing and software. The report
stated that the proprietary nature of commercial rates impaired DISA's
ability to perform an exact rate comparison; however, DISA derived
target rates from information available and will use these targets to
improve its computing operations. The benchmarking report also stated
that DISA had not yet completed its mainframe consolidation, which is
intended to reduce costs, and had not yet initiated other cost
reducing initiatives planned for 2002 and 2003. The report concluded
that these initiatives would enable DISA to become fully competitive
with commercial provider prices by 2004.
In the telecommunications area, a 1998 study[Footnote 47] showed that
DISA rates for telecommunications services were competitive with those
of commercial industry; however, the study also stated that not all
DISA's cost of operations had been accounted for in the rate
comparison. Accordingly, the study report concluded that "DISA's unit
prices are understated because they do not reflect the true costs of
running the business." In December 2000, DISA issued a summary report
of the benchmarking (performed by two contractors) of voice, data, and
video telecommunications services; the summary report covers 1999 and
2000. On December 10, 2001, DISA issued a similar summary report on
benchmarking of voice and data services for 2001 (video services were
not included). According to the summary report, in 2001, the average
global voice rate was 38 percent lower than the average global
commercial voice rate. From 1999 to 2001, voice rates between Japan
and the continental United States improved, decreasing by more than
$0.40 per minute (from $0.5873 to $0.1826 per minute).
However, 2001 rates for voice and data services among certain European
sites and between these sites and the continental United States were
about 25 percent and 10 percent higher, respectively, than the average
commercial rate, because of a rate freeze in this sector until 2005.
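The reported Japan-to-continental-United-States voice rate change can be verified with simple arithmetic (the per-minute rates are from the summary report; the percentage reduction is our derivation):

```python
# Japan-to-continental-United-States voice rates, dollars per minute,
# as reported in DISA's benchmarking summary report.
rate_1999 = 0.5873
rate_2001 = 0.1826

decrease = rate_1999 - rate_2001
print(round(decrease, 4))                  # 0.4047 -- "over $0.40 per minute"
print(round(100 * decrease / rate_1999))   # 69 -- about a 69 percent reduction
```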
[End of section]
Appendix III: Further Details Regarding DISA's Enterprise Architecture
Management and Information Technology Investment Management:
We analyzed DISA's progress in maturing its enterprise architecture
(EA) management and information technology (IT) investment management
(ITIM) areas in terms of the critical processes and key practice
activities that constitute each area (as defined in our guidance and
published products, federal guidance, or industry best practices
[Footnote 48]). A critical process is a structured set of key practice
activities that, when performed collectively, contributes to attaining
the management control area. A key practice activity is a process
element that occurs over time, has recognizable results, and is
necessary to implement a critical process (such as establishing
procedures, performing and tracking work, and taking corrective
actions).
We rated each key practice activity as established, partially
established, or not established. An established activity was one that
was supported by documentation showing that the activity was
systematically defined and reflected in DISA policies and procedures.
A partially established activity was in a proposed or draft state, was
not formally documented, or had documentation showing that it did not
meet requirements of federal criteria or best practices. A not
established activity did not meet the criteria for either established
or partially established.
DISA's status for each critical process was determined by the status
of the key practice activities associated with that process. For a
critical process to be assessed as either established or not
established, all the associated activities for that critical process
had to be rated in the same way (that is, either all established or
none established). For a critical process to be rated as partially
established, at least one activity had to be either established or
partially established.
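This rollup rule can be expressed as a short function (an illustrative sketch; the rating labels follow the report, but the code itself is not part of GAO's methodology):

```python
def rate_critical_process(activity_ratings):
    """Roll up key practice activity ratings into a critical process rating:
    all established -> established; all not established -> not established;
    any other mix -> partially established."""
    distinct = set(activity_ratings)
    if distinct == {"established"}:
        return "established"
    if distinct == {"not established"}:
        return "not established"
    return "partially established"

# Critical process 2 in table 2: one established activity (2e) among five
# not established activities yields a partially established process.
print(rate_critical_process(["not established"] * 5 + ["established"]))
```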
Table 2 is a summary of the state of DISA's EA management control
area; for each critical process, it provides the associated key
practice activities and presents our evaluation of their establishment
at DISA.
Table 2: Status of DISA's Enterprise Architecture (EA) Management
Process as of November 30, 2001:
Management control[A] critical processes and key practice activities:
1. Initiate EA program;
Partially established.
1a. EA function obtains executive buy-in and support;
Partially established (Note 1).
1b. EA function establishes management structure and control;
Partially established (Note 1).
1c. EA program activities and products are developed;
Partially established (Note 1).
2. Define an architecture process and approach;
Partially established.
2a. Intended use of the EA is defined;
Not established (Note 2).
2b. Scope of EA is defined;
Not established (Note 2).
2c. Depth of EA is determined;
Not established (Note 2).
2d. Appropriate EA products are selected;
Not established (Note 2).
2e. A framework is evaluated and selected;
Established.
2f. An EA tool set is selected;
Not established (Note 2).
3. Develop the EA;
Not established.
3a. Information is collected;
Not established.
3b. Products are generated, and the EA repository populated;
Not established.
3c. Sequencing plan is developed;
Not established.
3d. The EA products are approved, published, and disseminated;
Not established.
4. Use the EA;
Not established.
4a. EA is integrated with capital planning and investment control and
system life-cycle processes;
Not established.
4b. The integrated process is executed;
Not established.
4c. Other uses of the EA are developed;
Not established.
5. Maintain the EA;
Not established.
5a. The EA is maintained as it evolves;
Not established.
5b. Proposals for EA modifications continue;
Not established.
6. Continuously control and oversee the EA program;
Not established.
6a. Necessary EA program management controls are in place and
functioning;
Not established.
6b. Unmet EA expectations are identified;
Not established.
6c. Appropriate action is taken to address deviations;
Not established.
6d. Continuous improvement is ensured;
Not established.
Note 1: DISA had not completed implementation of proposed activity.
Note 2: DISA-provided documentation did not address all aspects of
this activity.
[A] Critical processes for this management control area are derived
from A Practical Guide to Federal Enterprise Architecture, Chief
Information Officers Council, version 1.0 (February 2001).
Source: GAO analysis of data obtained from DISA officials.
[End of table]
Table 3 is a summary of the state of DISA's IT investment management
control area. It provides the critical processes associated with each
stage of maturity within the ITIM framework. For each critical
process, it provides the associated key practice activities and
presents our evaluation of their establishment at DISA.
Table 3: Status of DISA's IT Investment Management as of November 30,
2001:
Stage 1: Creating Investment Awareness:
Management control[A] critical processes and key practice activities:
1.1. IT spending occurs without a disciplined investment process (This
is the starting point for all organizations).
Stage 2: Building the Investment Foundation:
Management control[A] critical processes and key practice activities:
2.1. Establish and operate an IT investment board;
Established.
2.1a. IT investment board is created and defined with board membership
integrating both IT and business knowledge;
Established.
2.1b. IT investment board operates according to written policies and
procedures in the organization-specific IT investment process guide;
Established.
2.2. Perform IT project oversight;
Partially Established.
2.2a. Each project's up-to-date cost and schedule data are provided to
the IT investment board;
Partially Established (Note 1).
2.2b. Using established criteria, the IT investment board oversees
individual IT project performance regularly by comparing actual cost
and schedule data to expectations;
Partially Established.
2.2c. The IT investment board performs special reviews of projects
that have not met predetermined performance standards;
Partially Established.
2.2d. Appropriate corrective actions for each underperforming project
are defined, documented, and agreed to by the IT investment board and
the project manager;
Partially Established.
2.2e. Corrective actions are implemented and tracked until the desired
outcome is achieved;
Partially Established.
2.3. Track IT assets;
Established.
2.3a. The organization's IT asset inventory is developed and maintained
according to a written procedure;
Established.
2.3b. IT asset inventory changes are maintained according to a written
procedure;
Established.
2.3c. Investment information is available on demand to decisionmakers
and other affected parties;
Established.
2.3d. Historical IT asset inventory records are maintained for future
selections and assessments;
Established.
2.4. Identify business needs for IT projects;
Partially Established.
2.4a. The business needs for each IT project are clearly identified
and defined;
Partially Established (Note 2).
2.4b. Specific users are identified for each IT project;
Partially Established (Note 2).
2.4c. Identified users participate in project management throughout a
project's life cycle;
Partially Established (Note 2).
2.5. Select proposals systematically;
Partially Established.
2.5a. The organization uses a structured process to develop new IT
proposals;
Partially Established (Note 1).
2.5b. Executives analyze and prioritize new IT proposals according to
established selection criteria;
Partially Established (Note 1).
2.5c. Executives make funding decisions for new IT proposals according
to an established process;
Partially Established (Note 1).
Stage 3: Developing a Complete Investment Portfolio:
Management control[A] critical processes and key practice activities:
3.1. Align authority of IT investment boards (Not applicable—DISA is
using a single enterprisewide IT investment board).
3.2. Define portfolio selection criteria;
Partially Established.
3.2a. The enterprisewide IT investment board approves the core IT
portfolio selection criteria, including cost, benefit, schedule, and
risk (CBSR) criteria, based on the organization's mission, goals,
strategies, and priorities;
Partially Established (Note 1).
3.2b. The IT portfolio selection criteria are distributed throughout
the organization;
Partially Established (Note 1).
3.2c. The IT portfolio selection process is reviewed on the basis of
cumulative experience and event-driven data and modified, as
appropriate;
Partially Established (Note 1).
3.3. Analyze investments;
Partially Established.
3.3a. The IT investment board ensures that the CBSR and other required
data are validated for each investment within its span of control;
Partially Established (Note 1).
3.3b. The IT investment board assesses each of its IT investments with
respect to the IT portfolio selection criteria;
Partially Established (Note 1).
3.3c. The IT investment board prioritizes its full portfolio of IT
investments using the portfolio selection criteria;
Partially Established (Note 1).
3.4. Develop an investment portfolio;
Not established.
3.4a. The IT investment board assigns investment proposals to a
portfolio category;
Not established.
3.4b. The IT investment board examines the mix of proposals and
investments across the common portfolio categories and makes
selections for funding;
Not established.
3.4c. The IT investment board approves or modifies the annual CBSR
expectations for each of its selected IT investments;
Not established (Note 1).
3.4d. A repository of portfolio development information is
established, updated, and maintained;
Not established.
3.5. Oversee portfolio performance;
Not established.
3.5a. The IT investment board monitors the performance of each
investment in its portfolio by comparing actual CBSR data to
expectations;
Not established.
3.5b. Using established criteria, the IT investment board identifies IT
investments that have not met predetermined CBSR performance
expectations;
Not established.
3.5c. The IT investment board and the project manager determine the
root cause of the poor performance;
Not established.
3.5d. The IT investment board and the project manager develop an
action plan designed to remedy the identified cause(s) of poor
performance;
Not established.
3.5e. Corrective actions are initiated and outcomes are tracked;
Not established.
Stage 4: Improving the Investment Process:
Management control[A] critical processes and key practice activities:
4.1. Perform postimplementation reviews (PIRs) and provide feedback;
Not established.
4.1a. The IT investment board identifies projects for which a PIR will
be conducted, and a PIR is initiated for each investment so identified;
Not established.
4.1b. Quantitative and qualitative investment data are collected,
evaluated for reliability, and analyzed during the PIRs;
Not established.
4.1c. Lessons learned and improvement recommendations about the
investment process and individual investments are developed, captured
in a written product or knowledge base, and distributed to
decisionmakers;
Not established.
4.2. Evaluate and improve portfolio performance;
Not established.
4.2a. Comprehensive IT portfolio performance measurement data are
defined and collected through agreed upon methods;
Not established.
4.2b. Aggregate performance data and trends are analyzed;
Not established.
4.2c. Investment process and portfolio improvement recommendations are
developed and implemented;
Not established.
4.3. Manage systems and technology succession;
Not established.
4.3a. The IT investment board develops criteria for identifying IT
investments that may meet succession status;
Not established.
4.3b. IT investments are periodically analyzed for succession, and
appropriate investments are identified as succession candidates;
Not established.
4.3c. The interdependency of each investment with other investments in
the IT portfolio is analyzed;
Not established.
4.3d. The IT investment board makes a succession decision for each
candidate IT investment;
Not established.
Stage 5: Leveraging IT for Strategic Outcomes:
Management control[A] critical processes and key practice activities:
5.1. Perform investment process benchmarking;
Not established.
5.1a. Baseline data are collected for the organization's current IT
investment management processes;
Not established.
5.1b. Comparable external best-in-class IT investment management
processes are identified and benchmarked;
Not established.
5.1c. Improvements are made to the organization's investment
management processes;
Not established.
5.2. Manage IT-driven strategic business change;
Not established.
5.2a. The organization creates and maintains a knowledge base of state-
of-the-technology IT products and processes;
Not established (Note 3).
5.2b. Information technologies with strategic business-changing
capabilities are identified and evaluated;
Not established (Note 3).
5.2c. Strategic changes to the business processes are planned and
implemented based on the capabilities of identified information
technologies;
Not established.
Note 1: DISA had not completed implementation of proposed activity.
Note 2: DISA-provided documentation did not address all aspects of
this activity.
Note 3: This activity is dependent upon DISA's implementation of
customer relations management and knowledge management functions
across DISA.
[A] Critical processes for this management control area are derived
from Information Technology Investment Management: A Framework for
Assessing and Improving Process Maturity, Exposure Draft, GAO/AIMD-
10.1.23, version 1 (May 2000).
Source: GAO analysis of data obtained from DISA officials.
[End of table]
[End of section]
Appendix IV: Comments from the Department of Defense:
Assistant Secretary Of Defense:
Command, Control, Communications, And Intelligence:
6000 Defense Pentagon:
Washington, DC 20301-6000:
February 22, 2002:
Mr. Joel C. Willemssen:
Managing Director, Information Technology Issues:
U.S. General Accounting Office:
Washington, D.C. 20548:
Dear Mr. Willemssen:
This is the Department of Defense (DoD) response to the GAO Draft
Report GAO-02-50, "Information Technology: Defense Information Systems
Agency Can Improve Investment Planning and Management Controls," dated
January 10, 2002 (GAO Code 310211).
The Department has reviewed the subject draft report. The audit that
your staff and the Defense Information Systems Agency (DISA) worked
closely together on highlighted many improvements to DISA's management
of information technology (IT) investments. DISA has either
implemented or has plans to implement your recommendations. These
recommendations and actions will improve support to DISA's customers.
We appreciate the opportunity to comment on the draft report.
Sincerely,
Signed by:
John P. Stenbit:
Enclosure:
[End of letter]
DoD Response to:
GAO Draft Report Dated January 10, 2002 (GAO Code 310211):
"Information Technology: Defense Information Systems Agency Can Improve
Investment Planning and Management Controls"
Recommendation 1: To improve DISA's development and execution of its
current and future information technology (IT) investment action
plans, the GAO recommended that the Secretary of Defense direct the
DISA Director, through the Assistant Secretary of Defense for Command,
Control, Communications, and Intelligence, to follow a structured and
disciplined IT investment management process for selection, control,
and evaluation of the initiatives in current and future action plans.
DOD Response: Concur. DISA has acknowledged its concurrence with the GAO
recommendation to follow a structured and disciplined IT investment
management process for selection, control, and evaluation of action
plan items that involve IT investment. DISA's responses to the
remaining GAO recommendations in the draft report reflect this
concurrence. This action is considered complete.
Recommendation 2: For plan development, the GAO recommended that the
DISA Director:
* define the general scope of actions and establish preliminary life-
cycle cost, schedule, benefit, and risk baselines for actions; and;
* perform a preliminary, high-level assessment of return on investment
for proposed actions to gauge their cost-effectiveness.
DOD Response: Partially concur. DISA concurs that project actions
require a baseline definition of scope, identification of costs,
schedule, and risks. We do not agree, however, that all actions
require the formal process required for projects.
DISA concurs with the GAO recommendation to follow a structured and
disciplined IT investment management process for selection, control,
and evaluation of action plan items that involve IT investment.
However, GAO's assumption that all the actions in DISA's existing 500
Day Action Plan or future requirements can be characterized as IT
investments oversimplifies what is actually a much more complex
situation.
By DOD policy, DISA is the designated provider for specified computing,
communications, and joint combat support services across all of DOD. As a
service provider, DISA exists to support the information processing
requirements of the President, Secretary of Defense, military
services, joint military commands, Defense agencies and the
warfighter. The majority of these services are provided on a cost
reimbursable basis under the Defense Working Capital Fund. DISA's
customers identify the type of service required, the performance
levels required for each service, and budget the necessary funds to
pay for the service. DISA is responsible for satisfying its
customer's requirements with the best possible service at the lowest
possible cost. While DISA does receive appropriated funds for its
joint combat support mission, most of the requirements in this area
are defined by external bodies such as the Joint Staff, the Military
Communications Electronics Board, joint military commands, the Office
of the Secretary of Defense, and others.
DISA's role as a service provider has significant implications for the
application of the formal IT investment management framework espoused
by GAO. First, since DISA does not specify requirements, it frequently
lacks both the necessary information and the functional expertise
needed to develop the life-cycle costs, benefits, and risk estimates
required by the GAO IT investment management framework. Similar to a
business, DISA makes investments based on the aggregate of customer
requirements, trend information, and products and services requested
or emerging in its marketplace. Second, since DISA develops only a few
information processing applications but operates many, it frequently
lacks the knowledge of timing and functional interdependencies needed
to develop implementation schedules for a whole system. Third, since
DISA does not establish functional priorities but responds to
priorities established by its customers, DISA is not in a good
position to select the initiatives that best meet its customer's
strategic goals and prioritize the selected initiatives for allocation
of IT resources. Finally, and perhaps most importantly, DISA's role as
a service provider makes it inappropriate for DISA to assume the role
of decision maker in allocating its customer's IT resources.
Mindful of its role as a service provider, DISA grouped the actions
selected for inclusion in the 500 Day Action Plan into three
categories: projects, mission-based services, and processes. We concur
with the need to follow a structured IT investment management process
for selection, control and evaluation of projects. We do not agree
that all of the actions identified in the 500 Day Action Plan
constitute projects. Many of the actions are requests to evaluate the
feasibility of initiating a project. Many of the actions reflect
customer prioritization of services that DISA had already budgeted to
perform. Other actions were already formally evaluated through the
customer's IT investment processes with DISA selected to deliver the
service. When the action requested clearly qualifies as a project, or
our preliminary analysis indicates that a project is the best method
of satisfying the requested action, then the appropriate approval
process for IT investment will be followed.
DISA will continue to document the cost, schedule, benefit, and risk
baselines for existing 500 Day Action Plan actions in the quad chart
format that was developed with GAO assistance during the audit. Since
the IT investment management process recommended by GAO is directed
towards capturing the results of informed decisions in an IT
investment plan, DISA will focus its attention on applying the GAO IT
investment management framework to the fiscal year 2003 revision of
the 500 Day Action Plan. By July 2002, DISA will develop guidelines
for defining the general scope of actions; establishing preliminary
life-cycle cost, schedule, benefit, and risk baselines for actions;
and performing a preliminary, high-level assessment of return on
investment for proposed actions to gauge their cost-effectiveness.
These guidelines will be used in developing the fiscal year 2003 plan.
Recommendation 3: For plan implementation, the GAO recommended that
the DISA Director:
* use approved baselines to develop meaningful results-oriented
performance metrics;
* implement a formal process (1) to control significant changes to
action baselines and closure of actions and (2) to inform stakeholders
of significant deviations in the action baselines;
* in monitoring implementation of the planned actions, update scope of
work, cost, schedule, benefit, and risk baselines for all actions, as
appropriate, to ensure that actions remain cost-effective investment
choices; and
* establish a mechanism to track customer feedback to ensure that the
customer concerns that led to the actions are resolved.
DOD Response: Concur. DISA had begun the process of documenting the
exit criteria, performance metrics, risks, schedule, and cost during
the GAO study. This action is now complete and the information is used
in monthly status reports. As indicated in the GAO report, we began
introducing elements of change control into the management and
implementation of the action plan. This work continues as we are
developing our customer feedback letters addressing the status of
actions to date and requesting customer concurrence in changes (if
necessary) to scope and schedule. This action is considered complete.
Recommendation 4: To improve institutional management controls needed
to respond to changes in strategic direction, the GAO recommended that
the Secretary of Defense direct the DISA Director, through the
Assistant Secretary of Defense for Command, Control, Communications,
and Intelligence, to make it an agency priority to establish the
elements described in this report for each of the following management
controls: (1) strategic planning, (2) organizational structure
management, (3) enterprise architecture management, (4) IT investment
management, (5) customer relations management, and (6) knowledge
management.
DOD Response: Concur. DISA has acknowledged its intention to make it
an agency priority to establish the elements described in this report
for each of the following management controls: (1) strategic planning,
(2) organizational structure management, (3) enterprise
architecture management, (4) IT investment management, (5) customer
relations management, and (6) knowledge management. In its responses
to Recommendations 5 through 10, DISA describes the actions it has
already taken to implement these management controls and those that
are planned for the future. This action is considered complete.
Recommendation 5: To strengthen the agency's strategic planning, the
GAO recommended that the DISA Director:
* fully define approaches or strategies to achieve goals and
objectives,
* completely explain the relationship between the general goals and
the annual performance goals, and
* fully describe how program evaluations are used to establish and
revise strategic goals.
DOD Response: Concur. The management processes that vet programmatic
issues in the context of DISA's strategic goals include stand-ups, the
Corporate Board process, Senior Leadership Offsites, wall-to-wall
(WTW) reviews, in-process reviews (IPRs), and the 500 Day Planning
process. Any and all of these forums can be and are used to test
whether programs are helping the agency meet its goals. For example, the
Corporate Board uses the New Work Opportunities Process to vet whether
new programs can assist us in meeting corporate goals. As a result of
the IPR, WTW, and other processes, we can determine whether new or
revised goals and objectives are needed. The goals stated in the
500 Day Plan were validated based on feedback from the other
processes. The Annual Performance Plan requires us to revisit the goals
at least annually. Following is a synopsis of the documents developed as
part of DISA's strategic planning process:
DISA Strategic Plan - is GPRA-compliant in that it directly relates to
DOD and Joint strategic planning. It is the capstone document for all
DISA organizations to look to for guidance to ensure resources
directly support one or more of the goals and objectives described in
the plan. It is a five-year plan that is reviewed annually and updated
every three years. DISA's Strategic Plan is structured to address two
distinct types of IT assets: first, DOD IT assets managed by DISA, and
second, DISA IT assets used to accomplish the DISA mission. It
provides the primary framework for the development of implementation
plans within mission areas. The body of the plan identifies the DISA
mission, vision, and goals. The goals include objectives for
performance assessments. The strategic plan is the foundation for the
Information Technology Management (ITM) Strategic Plan, POM,
Performance Contract, Annual Performance Plan, and Annual Program
Plans. Although not directly tasked by the GPRA, DISA developed its
Strategic Plan in accordance with a Secretary of Defense, Comptroller
Memo, Subject: Government Performance and Results Act Implementation,
16 October 1997.
Director's Planning Guidance - The Director's annual guidance provides
the Agency with programming guidance for development of
programs/projects that identify manpower and funding resources to
satisfy the agency goals and objectives for the future. The Director's
Planning Guidance is developed from the Defense Planning Guidance.
DISA 500 Day Action Plan - is the DISA Director's near-term action
plan that speaks directly to our customers as well as to the people of
DISA. It focuses management efforts, sets specific goals that
describe the way ahead, is action oriented, and is squarely focused
on customer needs and expectations. It captures high-priority customer
requirements DISA has committed to deliver, manifests our intent, and
fosters accountability. Finally, it provides the baseline against
which progress will be reviewed to provide feedback to DISA's
customers.
DISA Information Technology Management (ITM) Strategic Plan - is the
strategic direction of IT management within DISA focusing on the IT
products and services provided to DISA staff to accomplish their
missions and functions. The DISA ITM Strategic Plan is subordinate to
the DISA Strategic Plan, provides more detail on the role of IT
management and information technology, addresses strategic goals and
objectives for DISA's intended IT investments (e.g., DISANET), and links
these goals back to the DISA Strategic Plan. Follow-on implementation
plans for accomplishing the goals and objectives of the ITM Strategic
Plan will provide cost, schedule, and performance details.
Program Objective Memoranda (POM) - The Secretary of Defense uses the
Planning, Programming, and Budgeting System (PPBS) to set programming
priorities for DOD and track those programs through budget execution.
It is a systematic structure for developing a defense strategy that is
translated into specific defense programs and for accurately
determining what those programs will cost. The PPBS is a cyclic process
containing three distinct, integrated, and overlapping phases:
Planning, Programming, and Budgeting. The planning phase of the PPBS
begins with the goals and priorities defined in the Strategic Plan,
Future Years Defense Plan (FYDP), and Director's Planning Guidance.
Preparing and producing the POM falls into the programming phase. The
purpose of this phase is to translate goals and objectives for the
next two to seven years into a definitive structure expressed in terms
of time-phased financial resource and manpower requirements. During
the budgeting phase the POM and OSD directed Program Decision
Memorandums (PDM) are used to generate the Budget Estimate Submission
(BES), which covers the prior, current, and out-year budgets. OSD
reviews the BES and issues Program Budget Decisions (PBDs) that serve
as a basis for the President's Budget submission. POM and BES data are
incorporated into the Performance Contract and Annual Program Plan.
Resources required to accomplish the metrics in the Performance
Contract and Performance Plan would be identified in the POM.
Performance Contract - Critical initiatives specified in the
Director's Planning Guidance are addressed in the Performance Contract
in terms of Business Area performance standards. Both the Performance
Contract and Strategic Plan contain a description of the four DISA
Business Areas. The Performance Contract is submitted to OSD with the
POM, covering the POM years with a focus on the first year. The
performance measures used in
this contract directly support the goals and objectives in the
strategic plan and help ensure that DISA uses its resources
effectively. It articulates expectations for the POM periods,
enumerates deliverables for DISA Business Areas, and identifies
quantitative and qualitative measures. These measures are incorporated
into the ITM Strategic Plan, the Annual Performance Plan, and Annual
Program Plans.
DISA Annual Performance Plan - The Performance Plan articulates the
short-term course DISA will use to accomplish multi-year goals and
objectives and identifies performance targets for the plan year.
Performance management goals and objectives are based on the DISA
Strategic Plan, ITM Strategic Plan, and Performance Contract. It
integrates, as one process, the reporting requirements of GPRA, the
Clinger-Cohen Act, and the DISA Performance Contract. It is the key document
that consolidates the reporting of DISA strategic goals and objectives
and identifies performance measures from the performance contract for
a consolidated view of DISA's performance including: a description of
the measure, the method of measurement, current baseline, end-of-year
target, completion year, and expected outcome.
DISA Performance Report - DISA initiated an annual performance report
as part of a comprehensive performance process developed by the agency
in recognition of GPRA and guidance in OMB Circular A-11. The report
includes an assessment of the agency's performance against the
performance goals established for that year; an analysis of progress
toward the overall strategic goals; an explanation of deviations or
impediments encountered in achieving the goals; and a discussion of
how the impediments will be overcome in future years.
Annual Program Plans - DISA program managers document their execution
plans for the upcoming fiscal year in Annual Program Plans. The Annual
Program Plan serves as a record of the program manager's plans. In the
aggregate, these plans serve as a roadmap for the Agency's fiscal year
planned accomplishments. Key elements of this roadmap are an audit
trail for the planned accomplishments in the POM and President's
Budget and the linkage to the strategic plan goals and performance
contract measures. The plan also fulfills an external reporting
requirement for DISA's Performance Contract, GPRA, and the Clinger-
Cohen Act. After the Annual Program Plans have been presented to the
Budget Review Council, the Director or Vice Director will approve the
Plan and authorize execution.
DISA must on occasion adjust its plans and planning cycles due to
changes in external processes. For example, the 2001 Quadrennial Defense
Review contains requirements for a Defense Agency Review and a
Transformation Roadmap that may influence or substitute for one or
more of the documents or processes described above.
This action is considered complete.
Recommendation 6: As part of its ongoing organizational structure
management, the GAO recommended that the DISA Director evaluate and
implement solutions for advancing coordination, productivity, and team
building.
DOD Response: Concur. DISA concurs wholeheartedly with the
recommendation to evaluate and implement solutions for advancing
coordination, productivity and team building.
As noted in the GAO report, DISA established a new organizational
element specifically to address transformation and management of
change — the Chief Transformation Executive (CTE). CTE is developing a
transformation management plan that identifies specific steps and
actions toward transforming the agency, including (but not limited to)
coordination and communication processes, process re-engineering to
improve productivity, and workshops to facilitate more open
communication and teaming behaviors. The transformation management
plan will be published in conjunction with the DISA Transformation
Roadmap in June 2002.
In addition, CTE is taking a leadership role in establishing knowledge
communities and conducting facilitated leadership planning sessions to
help foster improved coordination and teamwork. CTE is working in
conjunction with the DISA Chief Information Officer (CIO) and the DISA
Chief of Staff (COS) to gather knowledge sharing requirements and is
building the plan for developing and institutionalizing knowledge-
enabling processes, structures and systems across DISA (see response
to Recommendation 10 on Knowledge Management).
Recommendation 7: To strengthen management of DISA's effort to
develop, implement, and maintain an enterprise architecture, GAO
recommended that the DISA Director follow the steps defined in the
Chief Information Officers (CIO) Council's guide on architecture
management, as appropriate, including:
a. initiating a program;
b. defining the architecture process and approach;
c. developing the architecture, including the baseline and target
architectures, and the plan for sequencing from the baseline to the
target;
d. using the architecture in making IT investment decisions;
e. maintaining the architecture; and
f. continuously controlling and overseeing the program.
DOD Response: Concur. DISA concurs with the recommendation to develop,
implement, and maintain an enterprise architecture (EA). DISA has
developed an action plan that describes the intended use of the EA,
outlines its scope and depth, evaluates and selects an EA framework,
and selects an EA toolset. An EA working group has been established to
begin the development of EA program activities and products. The
development of the EA is scheduled to be completed by December 2002.
The following paragraphs provide additional details in regard to
DISA's plans for implementing enterprise architecture.
(Recommendation 7a): The Office of the Chief Information Officer (CIO)
has met individually with all DISA senior leaders to obtain executive
buy-in and support. In January 2002, the CIO briefed the Director and
Vice Director to outline the scope of the DISA Enterprise
Architecture (EA) program, goals of the EA program, EA implementation
strategy, and milestones. In January 2002, an EA working group was
established to begin the development of EA program activities and
products.
(Recommendation 7b): The CIO has developed an action plan for
establishing an EA program. The action plan describes the intended use
of the EA, outlines the scope and depth of the EA, evaluates and
selects an EA framework, and selects an EA toolset. The action plan
will be finalized in February 2002.
(Recommendations 7c-7f): The As-Is architecture and To-Be
architecture will be completed by December 2002. A transition plan
that will provide a roadmap for migrating from the baseline to the
target architecture will be developed by March 2003. When the
architecture is finalized and the transition plan is complete, the
architecture can be used in making IT investment decisions.
Maintenance and oversight of the architecture will be carried out
annually. By fully implementing the EA program, the DISA CIO will be
better able to support DISA's information technology, capital planning
process, its strategic planning process, and its customer service.
Recommendation 8: To establish effective IT investment management, the
GAO recommended that the DISA Director follow the steps detailed in
GAO's IT investment management guide, including (a) building a
foundation for IT investments, including:
* establishing and operating an IT investment board,
* performing IT project oversight,
* tracking IT assets,
* identifying business needs for IT projects, and
* selecting proposals systematically,
and (b) establishing the capability to manage investments as a
complete investment portfolio, including:
* defining portfolio selection criteria,
* analyzing investments,
* developing an investment portfolio, and
* overseeing portfolio performance.
DOD Response: Concur. DISA concurs with GAO's recommendation to build
an effective IT investment process. However, it is important to
understand that DISA deals with two different types of IT investments:
first, external DOD IT assets managed by DISA, and second, internal
DISA IT assets used to accomplish the DISA mission. DISA's external IT
investments are vetted through a host of external processes as well as
IT project oversight. These external requirements will be aggregated
by program within our POM submission. The improved IT investment
management process called for by GAO will be integrated with other
processes throughout the Agency and within DOD. For each investment,
DISA managers will determine return on investment, assess the
availability of metrics, and show how the investment supports the
strategic plan and meets the other requirements of the Clinger-Cohen
Act. Initial answers
will be improved as tools provide better information to the managers.
(Recommendation 8a): Regarding building a foundation for IT
investments, DISA has been working to build the foundation for an IT
Investment Board, which was officially chartered in November 2001 and
will have its first meeting in February 2002. This board will be
involved with the development, coordination, evaluation, and
implementation of DISA's Enterprise Architecture and Capital
Investment Plans for IT investments supporting Agency business
processes.
We are in the process of ensuring that cost data, criteria measured
against performance standards, and related information are
established. An IT Investment
Scoring Model is being developed and will be used to support IT
project oversight, the identification of business needs, and the
systematic selection of proposals. This part of the process is
scheduled to be
substantially complete by 30 September 2002.
(Recommendation 8b): Establishing a capability to manage investments
as a complete investment portfolio will be the next step. The starting
point for DISA's portfolio will be the current investments that today
are managed by business areas. The enterprise architecture will
describe these business areas and help in building these criteria,
indicating the "as is" and "to be" views and our transition plan (see
response to Recommendation 7 indicating completion of the Enterprise
Architecture by December 2002). The combination of the consistent
investment process and criteria for the portfolio will fully enable
DISA to manage investments as a portfolio. Building this portfolio
process will be completed by 30 September 2004.
It should be noted that in the 2001 Quadrennial Defense Review (QDR)
DOD has recognized the need to "transform its business processes and
infrastructure to both enhance the capabilities and creativity of its
employees and free up resources to support warfighting and the
transformation of military capabilities." This transformation will
depend heavily on leveraging IT capabilities to enhance the accurate,
timely flow of information so as to streamline the overhead structure
and flatten the DOD organization. It is the elimination of overhead
and redundancy that will produce a significant percentage of the
resources necessary to carry out the transformation of military
capabilities.
Today, the single greatest threat to timely implementation of new IT-
based military capabilities is the excessive amount of time it takes
to negotiate the complex budgeting, approval and oversight processes.
Given the rapid pace of technology, innovative new IT capabilities
routinely become obsolete and are replaced in the marketplace before
DOD can secure funding and acquire the product. Such delays cannot be
tolerated when IT support for new capabilities such as unmanned aerial
vehicles and near real-time targeting have a life or death impact on
the warfighter on the battlefield.
The QDR notes further that the Planning, Programming and Budgeting
System (PPBS) and the acquisition process create a significant amount
of the self-imposed institutional work in the Department. Changes have
already been instituted in both areas to reduce the complexity of the
process with the goal of measurably increasing the tooth-to-tail ratio
over the next few years. Some adjustments may be necessary in our
current plans for IT investment management as DOD continues to take
action based on the 2001 QDR.
Recommendation 9: To strengthen customer relations management, the GAO
recommended that the DISA Director build and maintain a supporting
customer relations infrastructure that permeates the entire
organization.
DOD Response: Concur. Effective 1 October 2001, the DISA Director
realigned the organization and created the Customer Advocacy
Directorate with the goal of fostering and sustaining strong customer
relations throughout DISA. This reorganization highlights the
importance of the customer and assigns the responsibility for this
transformation to a single element. DISA's customer relations
management (CRM) program is a multifaceted program that addresses CRM
as a process, a culture and a primary objective that can be measured
and tracked. A training program has been established for all Customer
Advocates that includes self-paced programs, technical training on
DISA services and products, and professional CRM training from
certified institutions. During fiscal year 2001, DISA's Network
Services Directorate conducted professional CRM training designed to
reach individuals in their organization. In fiscal year 2002, the
program is being expanded to include all members of DISA. Also during
fiscal year 2002, a series of processes will be developed in support
of the ISO 9001 program that will provide the baseline and framework
on how DISA implements CRM. In conjunction with this effort, DISA also
created Customer Advocates and Senior Executive Account Managers
(SEAM). This group of handpicked leaders within DISA is tasked to
ensure the development of close cooperation, support and understanding
between DISA and its customer base. Periodic customer-focused meetings
have been and are being scheduled to capture requirements, issues, and
concerns and to bring them to a mutually satisfactory conclusion. The
scope of conferences, working sessions, technical meetings and
partnership meetings continues to grow. Since October 2001, DISA has
had very successful customer meetings with the Air Force, Marine
Corps, Defense Logistics Agency, OSD (C3I), Joint Staff, and several
DoD organizations.
The Customer Advocacy Directorate (CA) has the lead in developing the
CRM infrastructure to support internal change. CA has developed two
new Customer Focused Reports and started a Senior Visitor's Program
that tailors presentations to the customer's needs and desires. CA
participates with the DISA Knowledge Management Council and other DISA
Directorates to create/fine-tune DISA processes and systems to better
share customer information. An essential objective for fiscal year
2002 is the implementation of a CRM Web portal that fully integrates
DISA customer tracking systems and provides customizable outputs via a
digital dashboard. CA expects to field a prototype Customer Score Card
by March 2002 that will identify status, issues, concerns, and
actions to be taken in a recognizable structure designed to present
the customer's perspective to senior DISA leadership.
During fiscal year 2002, DISA will revamp its customer conference to
focus on the things customers need done to ensure integrated support
to DOD and the warfighter. In conjunction with
DISA's Chief Transformation Officer, organizational and process
changes will be implemented to improve CRM as a process,
infrastructure, technology and way of life. The customer is the focus
of DISA, and CA has the responsibility to introduce techniques and
facilitate change to make customer focus the center point of how DISA
does business.
Recommendation 10: To define and implement an organizationally
integrated knowledge management function, the GAO recommended that the
DISA Director follow the steps outlined in the CIO Council guide on
this subject, including:
* deciding with whom to share organizational knowledge,
* deciding what organizational knowledge to share,
* deciding how to share organizational knowledge, and
* institutionalizing and using the knowledge management process.
DOD Response: Concur. DISA concurs with GAO's recommendation to
implement an organizationally integrated knowledge management (KM)
function. Since our initial discussions with GAO auditors, DISA has
made considerable progress in this management control area, completing
the following actions in support of institutionalizing knowledge
management at DISA:
* Defined management structure (Jun 01);
* Established KM Council (Mar 01, formal charter — Aug 01);
* Developed implementation plan framework (Jun 01);
* Developed Speakers Program (Began Sep 01);
* Completed KM Questionnaire (audit) to baseline organizational KM
initiatives/knowledge base requirements (Sep 01);
* Compiled enterprise database inventory (Oct 01);
* Started KM Requirements Identification process (July 01);
* Drafted KM Instruction (Oct 01);
* Developed initial technical criteria (to assess technical
feasibility of initiatives proposed for knowledge base) (Dec 01);
* Staffed KM team (Nov 01).
We are addressing the first two foundation elements ("Whom do we share
with?" and "What do we share?") as part of an on-going KM Requirements
Identification process. This process, which began in July 2001, will
collect, analyze and prioritize knowledge requirements, and document
the process results in a KM Capstone Requirements Document (CRD) by
February 2002. DISA senior managers were interviewed to determine what
knowledge they and their staffs need to better perform their mission.
Authoritative source databases are also being identified. Initially,
access questions are focused internally; however, our Customer
Advocacy organization is assessing what information should be shared
with our external customers.
Regarding the remaining KM foundation element ("How do we share?"), in
February 2002, we will begin piloting two Knowledge Communities (i.e.,
Communities of Interest/Practice), one in the Resource Management area
and one in the Contract Management area, to facilitate the
exchange of tacit knowledge and to help identify effective
collaboration methods and support tools. Additionally, we are planning
to undertake a Portal Technology Technical Assessment by June 2002,
which we expect to lead to an enterprise portal pilot effort in fiscal
year 2003. The Agency Technical Criteria Evaluation and the KM
Architecture efforts will be developed in concert with the Enterprise
Architecture. Current plans call for fully institutionalizing and
using the knowledge management process throughout the agency by fiscal
year 2005.
It still, however, must be recognized that knowledge management is not
a well-defined science and that as experience grows, strategies and
levels of investment will change. We expect this and therefore view
our plans as exploratory and evolutionary.
[End of section]
Footnotes:
[1] P.L. 106-398, Floyd D. Spence National Defense Authorization Act
for Fiscal Year 2001, app. section 918.
[2] IT human capital management is an approach to attracting,
retaining, and motivating the people who possess the knowledge,
skills, and abilities that enable an organization to accomplish its IT
mission.
[3] Enterprise architecture management is an approach to developing,
maintaining, and using an explicit blueprint for operational and
technological change.
[4] IT investment management is an approach to selecting and
controlling IT spending so as to maximize return on investment and
minimize risk.
[5] Customer relations management is an approach to focusing an
organization's operations on how to best satisfy customer needs.
[6] Knowledge management is an approach to capturing, understanding,
and using the collective body of information and intellect within an
organization to accomplish its mission.
[7] Other institutional controls not addressed in this report (but
equally important) are budget formulation and execution, financial
management, acquisition, and security management.
[8] Briefing to the Senate Armed Services Committee on January 31,
2002; briefing to the House Armed Services Committee on January 23,
2002.
[9] The baseline commitments would define what an action is intended
to provide (in terms of capability and value), by when, at what cost,
and with what associated elements of risk. These commitments are the
expectations for the action that allow informed decisionmaking on
whether to invest in the action and permit measurement of action
progress and performance.
[10] Other institutional controls not included in the scope of our
review (but equally important) are budget formulation and execution,
financial management, acquisition, and security management.
[11] The assistant secretary of defense for command, control,
communications, and intelligence also serves as the DOD chief
information officer.
[12] U.S. General Accounting Office, DOD Information Services:
Improved Pricing and Financial Management Practices Needed for
Business Area, [hyperlink,
http://www.gao.gov/products/GAO/AIMD-98-182] (Washington, D.C.: Sept.
15, 1998).
[13] Management of DoD Long-Haul Telecommunications Requirements,
Report Number 99140 (Apr. 1999).
[14] Audit of DISA's Performance Contract, Final Report 2001-01 (Oct.
2000).
[15] Annual performance contracts were instituted by the November 1997
Defense Reform Initiative as a means to improve the cost-effectiveness
and efficiency of DOD's business processes and support infrastructure.
Similar to the performance plan required by the Government Performance
and Results Act of 1993, the performance contract facilitates efforts
to manage resources better and link program results to budget.
[16] We give no specific examples here because DISA's position is that
the military sensitivity of the actions makes them unsuitable for
public disclosure.
[17] 40 U.S.C. § 1422; Management of Federal Information Resources,
Office of Management and Budget (OMB) Circular A-130 (Nov. 28, 2000).
[18] U.S. General Accounting Office, Information Technology Investment
Management: A Framework for Assessing and Improving Process Maturity,
Exposure Draft, GAO/AIMD-10.1.23, version 1 (Washington, D.C.: May
2000).
[19] Management of Federal Information Resources, OMB Circular A-130
(Nov. 28, 2000).
[20] U.S. General Accounting Office, Information Technology Investment
Management: A Framework for Assessing and Improving Process Maturity,
Exposure Draft, GAO/AIMD-10.1.23, version 1 (Washington, D.C.: May
2000).
[21] DISA did establish cost baselines for 21 of the 57 actions
reviewed, but these were only estimates of costs to be incurred in
fiscal year 2002, not life-cycle cost estimates. For the 21 actions
with cost estimates, the total estimated fiscal year 2002 cost was
$171.7 million.
[22] 40 U.S.C. § 1422.
[23] Management of Federal Information Resources, OMB Circular A-130
(Nov. 28, 2000).
[24] The board includes high-level personnel from each DISA national
capital region organization, empowered to act for their organizations.
[25] U.S. General Accounting Office, Managing in the New Millennium:
Shaping a More Efficient and Effective Government for the 21st
Century, [hyperlink, http://www.gao.gov/products/GAO/T-OCG-00-9]
(Washington, D.C.: Mar. 29, 2000); GAO: Supporting Congress for the
21st Century, [hyperlink, http://www.gao.gov/products/GAO/T-OCG-00-10]
(Washington, D.C.: July 18, 2000); and Determining Performance and
Accountability Challenges and High Risks, [hyperlink,
http://www.gao.gov/products/GAO-01-159SP] (Washington, D.C.: Nov.
2000).
[26] U.S. General Accounting Office, Determining Performance and
Accountability Challenges and High Risks, [hyperlink,
http://www.gao.gov/products/GAO-01-159SP] (Washington, D.C.: Nov.
2000).
[27] The other three institutional management controls (not addressed
in this report, but equally important) are budget formulation and
execution, financial management, and acquisition.
[28] P.L. 103-62, Government Performance and Results Act of 1993.
[29] Preparation and Submission of Strategic Plans, Annual Performance
Plans, and Annual Program Performance Reports, OMB Circular A-11, Part
2.
[30] Defense Information Systems Agency Strategic Plan, version 2.0
(May 2000).
[31] U.S. General Accounting Office, Human Capital: Attracting and
Retaining a High-Quality Information Technology Workforce, [hyperlink,
http://www.gao.gov/products/GAO-02-113T] (Washington, D.C.: Oct. 4,
2001).
[32] U.S. General Accounting Office, GAO: Supporting Congress for the
21st Century, [hyperlink, http://www.gao.gov/cgi-
bin/getrpt?GAO/T-OCG-00-10] (Washington, D.C.: July 18, 2000).
[33] CIO Council, A Practical Guide to Federal Enterprise
Architecture, version 1.0 (Feb. 2001).
[34] The DOD framework (the Command, Control, Communications,
Computers, Intelligence, Surveillance, and Reconnaissance Architecture
Framework) promotes the use of three views in an organization's
architecture: systems, operational, and technical. Further, some
requirements for the technical view are set forth in the Joint
Technical Architecture, which sets minimum technical architecture
standards for interoperability that apply to all DOD components.
[35] See, for example, U.S. General Accounting Office, Customs Service
Modernization: Architecture Must Be Complete and Enforced to
Effectively Build and Maintain Systems, [hyperlink,
http://www.gao.gov/products/GAO/AIMD-98-70] (Washington, D.C.: May 5,
1998); Information Technology: Architecture Needed to Guide
Modernization of DOD's Financial Operations, [hyperlink,
http://www.gao.gov/products/GAO-01-525] (Washington, D.C.: May 17,
2001).
[36] U.S. General Accounting Office, Information Technology Investment
Management: A Framework for Assessing and Improving Process Maturity,
Exposure Draft, GAO/AIMD-10.1.23, version 1 (Washington, D.C.: May
2000).
[37] 40 U.S.C. § 1422.
[38] Best practices have been compiled by the CRM-Forum, an
independent resource for CRM research conducted by private industry
experts and consulting firms, including Deloitte Research and Gartner
Group.
[39] CIO Council, Managing Knowledge @ Work: An Overview of Knowledge
Management (Aug. 2001).
[40] CIO Council, A Practical Guide to Federal Enterprise
Architecture, version 1.0 (Feb. 2001).
[41] U.S. General Accounting Office, Information Technology Investment
Management: A Framework for Assessing and Improving Process Maturity,
Exposure Draft, GAO/AIMD-10.1.23, version 1 (Washington, D.C.: May
2000).
[42] CIO Council, Managing Knowledge @ Work: An Overview of Knowledge
Management (Aug. 2001).
[43] U.S. General Accounting Office, Information Technology Investment
Management: A Framework for Assessing and Improving Process Maturity,
Exposure Draft, GAO/AIMD-10.1.23, version 1 (Washington, D.C.: May
2000).
[44] Mid-tier computers are computers other than mainframes, such as
microcomputers and centralized servers for distributed applications.
Of the total users at one DISA data center, 6 percent (11,200 out of
196,200) are users of mid-tier services.
[45] The 1999 DOD survey was part of the biennial review of DOD
components' satisfaction with DISA's major business areas. For DISA,
this review covered joint warfighting capabilities, computing
services, telecommunications services, and acquisition services.
Elements rated by customers included satisfaction with the
effectiveness, efficiency, and economy aspects of DISA's products and
services; DISA's responsiveness to customers; DISA's coordination with
customers; and satisfaction with the quality of DISA's products and
services.
[46] For the 1999 survey, survey elements were measured by a
satisfied, neutral, or dissatisfied response from customers; an
element was acceptable if 50 percent or more of survey respondents
rated the element as satisfied.
[47] The fiscal year 1998 telecommunications study was a contracted
examination of the business process, cost, and methodology of DISA's
electronic commerce operations (these included telecommunications, as
e-commerce uses telecommunication capabilities for transmission of
electronic transactions).
[48] Chief Information Officers Council, A Practical Guide to Federal
Enterprise Architecture, version 1.0 (Feb. 2001), and U.S. General
Accounting Office, Information Technology Investment Management: A
Framework for Assessing and Improving Process Maturity, Exposure
Draft, GAO/AIMD-10-1.23, version 1 (Washington, D.C.: May 2000).
[End of section]
GAO’s Mission:
The General Accounting Office, the investigative arm of Congress,
exists to support Congress in meeting its constitutional
responsibilities and to help improve the performance and accountability
of the federal government for the American people. GAO examines the use
of public funds; evaluates federal programs and policies; and provides
analyses, recommendations, and other assistance to help Congress make
informed oversight, policy, and funding decisions. GAO’s commitment to
good government is reflected in its core values of accountability,
integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through the Internet. GAO’s Web site [hyperlink,
http://www.gao.gov] contains abstracts and full-text files of current
reports and testimony and an expanding archive of older products. The
Web site features a search engine to help you locate documents using
key words and phrases. You can print these documents in their entirety,
including charts and other graphics.
Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as “Today’s Reports,” on its
Web site daily. The list contains links to the full-text document
files. To have GAO e-mail this list to you every afternoon, go to
[hyperlink, http://www.gao.gov] and select “Subscribe to daily E-mail
alert for newly released products” under the GAO Reports heading.
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent
of Documents. GAO also accepts VISA and MasterCard. Orders for 100 or
more copies mailed to a single address are discounted 25 percent.
Orders should be sent to:
U.S. General Accounting Office:
441 G Street NW, Room LM:
Washington, D.C. 20548:
To order by Phone:
Voice: (202) 512-6000:
TDD: (202) 512-2537:
Fax: (202) 512-6061:
To Report Fraud, Waste, and Abuse in Federal Programs Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]:
E-mail: fraudnet@gao.gov:
Automated answering system: (800) 424-5454 or (202) 512-7470:
Public Affairs:
Jeff Nelligan, managing director, NelliganJ@gao.gov:
(202) 512-4800:
U.S. General Accounting Office:
441 G Street NW, Room 7149:
Washington, D.C. 20548: