This is the accessible text file for GAO report number GAO-13-597
entitled 'Geostationary Weather Satellites: Progress Made, but
Weaknesses in Scheduling, Contingency Planning, and Communicating with
Users Need to Be Addressed' which was released on September 19, 2013.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as
part of a longer term project to improve GAO products' accessibility.
Every attempt has been made to maintain the structural and data
integrity of the original printed product. Accessibility features,
such as text descriptions of tables, consecutively numbered footnotes
placed at the end of the file, and the text of agency comment letters,
are provided but may not exactly duplicate the presentation or format
of the printed version. The portable document format (PDF) file is an
exact electronic replica of the printed version. We welcome your
feedback. Please E-mail your comments regarding the contents or
accessibility features of this document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
United States Government Accountability Office:
GAO:
Report to the Committee on Science, Space, and Technology, House of
Representatives:
September 2013:
Geostationary Weather Satellites:
Progress Made, but Weaknesses in Scheduling, Contingency Planning, and
Communicating with Users Need to Be Addressed:
GAO-13-597:
GAO Highlights:
Highlights of GAO-13-597, a report to the Committee on Science, Space,
and Technology, House of Representatives.
Why GAO Did This Study:
NOAA, with the aid of the National Aeronautics and Space
Administration (NASA), is procuring the next generation of
geostationary weather satellites. The GOES-R series is to replace the
current series of satellites (called GOES-13, -14, and -15), which
will likely begin to reach the end of their useful lives in 2015. This
new series is considered critical to the United States’ ability to
maintain the continuity of satellite data required for weather
forecasting through 2036.
GAO was asked to evaluate GOES-R. GAO’s objectives were to (1) assess
GOES-R progress and efforts to address key cost and schedule risks;
(2) evaluate efforts to manage changes in requirements and whether any
significant changes have recently occurred; and (3) evaluate the
adequacy of GOES-R contingency plans. To do so, GAO analyzed program
and contractor data; compared GOES-R schedules, requirements changes,
and contingency plans to best practices identified by leading
organizations; and interviewed officials at NOAA, NASA, and other
federal agencies that rely on GOES.
What GAO Found:
The National Oceanic and Atmospheric Administration (NOAA) has
completed the design of its Geostationary Operational Environmental
Satellite-R (GOES-R) series and made progress in building flight and
ground components. While the program reports that it is on track to
stay within its $10.9 billion life cycle cost estimate, it has not
reported key information on reserve funds to senior management. Also,
the program has delayed interim milestones, is experiencing technical
issues, and continues to demonstrate weaknesses in the development of
component schedules. These factors have the potential to affect the
expected October 2015 launch date of the first GOES-R satellite, and
program officials now acknowledge that the launch date may be delayed
by 6 months. A launch delay would increase the time that NOAA is
without an on-orbit backup satellite. It would also increase the
potential for a gap in GOES satellite coverage should one of the two
operational satellites (GOES-14 or -15) fail prematurely (see
graphic)--a scenario that an independent review team estimated has a 36
percent likelihood of occurring.
Figure: Potential Gap in GOES Coverage:
[Refer to PDF for image: illustration]
GOES-13:
Available as backup: Calendar year 2009-2010;
Operational period: Calendar year 2010-2015.
GOES-14:
Launch date: Calendar year 2009;
Post launch test period: Calendar year 2009;
Available as backup: Calendar year 2010-2015;
Operational period: Calendar year 2015-2020.
GOES-15:
Launch date: Calendar year 2010;
Post launch test period: Calendar year 2010;
Available as backup: Calendar year 2010-2011;
Operational period: Calendar year 2012-2017.
GOES-R:
Launch date: Calendar year 2015;
Post launch test period: Calendar year 2015-2016;
Available as backup: Calendar year 2016;
Operational period: Calendar year 2017-2023.
GOES-S:
Launch date: Calendar year 2017;
Post launch test period: Calendar year 2017;
Available as backup: Calendar year 2017-2020;
Operational period: Calendar year 2020-2023.
Projected gap in backup coverage: mid-2015 to mid-2016.
Source: GAO analysis of NOAA data.
[End of figure]
While the GOES-R program has established a process for managing
requirements changes, it has not effectively involved key satellite
data users. Since 2007, the GOES-R program has decided not to develop
31 of the original set of GOES products and has modified specifications
on 20 remaining products. For example, NOAA decreased the accuracy
requirement for the hurricane intensity product and relaxed the
timeliness requirement for the lightning detection product. However,
key satellite
data users were not fully informed about changes and did not have a
chance to communicate their concerns about the impact of these changes
on their operations. Until NOAA improves its communication with
external satellite data users, obtains input from the users, and
addresses user concerns when considering product changes, its changes
could cause an unexpected impact on critical user operations.
NOAA has established contingency plans for the loss of its GOES
satellites and ground systems that are generally in accordance with
best practices; however, these plans are missing key elements. For
example, NOAA did not work with the user community to address
potential reductions in capability under contingency scenarios or
identify alternative solutions for preventing a delay in the GOES-R
launch date. Until NOAA addresses the shortfalls in its contingency
plans and procedures, the plans may not work as intended in an
emergency and satellite data users may not obtain the information they
need to perform their missions
What GAO Recommends:
GAO is recommending that NOAA address weaknesses in managing reserves
and scheduling, improve communications with satellite data users, and
address shortfalls in contingency planning. NOAA concurred with GAO’s
recommendations and identified steps it is taking to implement them.
View [hyperlink, http://www.gao.gov/products/GAO-13-597]. For more
information, contact David Powner at (202) 512-9286 or pownerd@gao.gov.
[End of section]
Contents:
Letter:
Background:
NOAA Has Made Progress in Developing GOES-R, but Continues to Face
Challenges that Could Increase the Risk of a Satellite Data Gap:
NOAA Has a Process for Managing Changes in GOES-R Requirements, but
Changes Could Affect Some Users:
NOAA Developed GOES-R Contingency Plans, but Weaknesses Increase the
Impact of a Potential Coverage Gap:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Objectives, Scope, and Methodology:
Appendix II: Comments from the Department of Commerce:
Appendix III: GAO Contact and Staff Acknowledgments:
Tables:
Table 1: Summary of the Procurement History of the Geostationary
Operational Environmental Satellites:
Table 2: Key Changes to the Geostationary Operational Environmental
Satellite-R Series Program over Time:
Table 3: Geostationary Operational Environmental Satellite-R Series
Instruments:
Table 4: Key Components of the Geostationary Operational Environmental
Satellite-R Series Ground Project:
Table 5: Major Development Reviews for the Geostationary Operational
Environmental Satellite-R Series:
Table 6: Development Status of Flight Project Components for the
Geostationary Operational Environmental Satellite-R Satellite, as of
August 2013:
Table 7: Development Status of the Geostationary Operational
Environmental Satellite-R Series Ground Project Components, as of
August 2013:
Table 8: Reserve Levels for the Geostationary Operational
Environmental Satellite-R Series Program, as of March 2013:
Table 9: Delays in Milestones for the Geostationary Operational
Environmental Satellite-R Series Program:
Table 10: Description of Scheduling Best Practices:
Table 11: Assessment of Selected Schedules' Use of Best Practices over
Time:
Table 12: Best Practices in Managing Requirements Changes:
Table 13: Assessment of Geostationary Operational Environmental
Satellite-R Series Program Practices in Managing Changes in
Requirements:
Table 14: Summary of Key Changes in Product and Program Requirements
between 2007 and 2012:
Table 15: User Concerns about Key Changes or Deviations in
Requirements:
Table 16: Guidelines for Developing a Sound Contingency Plan:
Table 17: Implementation of Key Contingency Planning Elements for
Geostationary Operational Environmental Satellites:
Figures:
Figure 1: Approximate Geographic Coverage of the Geostationary
Operational Environmental Satellites:
Figure 2: Generic Data Relay Pattern for the Geostationary Operational
Environmental Satellites:
Figure 3: Organizational Structure and Staffing of the Geostationary
Operational Environmental Satellite-R Series Program:
Figure 4: NASA's Life Cycle for Flight Systems:
Figure 5: Potential Gap in Geostationary Operational Environmental
Satellite Coverage:
Abbreviations:
CDR: critical design review:
FOR: flight operations review:
GOES: Geostationary Operational Environmental Satellite:
GOES-R: Geostationary Operational Environmental Satellite-R series:
KDP: key decision point:
MDR: mission definition review:
MOR: mission operations review:
NASA: National Aeronautics and Space Administration:
NESDIS: National Environmental Satellite, Data, and Information
Service:
NOAA: National Oceanic and Atmospheric Administration:
NSOF: NOAA Satellite Operations Facility:
ORR: operational readiness review:
PDR: preliminary design review:
SDR: system definition review:
SIR: system integration review:
[End of section]
GAO:
United States Government Accountability Office:
441 G St. N.W.
Washington, DC 20548:
September 9, 2013:
The Honorable Lamar S. Smith:
Chairman:
The Honorable Ralph Hall:
Chairman Emeritus:
The Honorable Eddie Bernice Johnson:
Ranking Member:
Committee on Science, Space, and Technology:
House of Representatives:
Geostationary environmental satellites play a critical role in our
nation's weather forecasting. These satellites--which are managed by
the Department of Commerce's National Oceanic and Atmospheric
Administration (NOAA)--provide information on atmospheric, oceanic,
climatic, and solar conditions that help meteorologists observe and
predict regional and local weather events. They also provide a means
of identifying the large-scale evolution of severe storms, such as
forecasting a hurricane's path and intensity.
NOAA, through collaboration with the National Aeronautics and Space
Administration (NASA), is procuring the next generation of
geostationary weather satellites, called the Geostationary Operational
Environmental Satellite-R (GOES-R) series. The GOES-R series consists
of four satellites and is to replace the current series of
geostationary environmental satellites as they reach the end of their
useful lives. This new series is expected to provide the first major
improvement in the technology of GOES instruments since 1994 and, as
such, is considered critical to the United States' ability to maintain
the continuity of data required for weather forecasting through the
year 2036.
This report responds to your request that we review NOAA's GOES-R
series program (GOES-R program). Specifically, our objectives were to
(1) assess GOES-R progress and efforts to address key cost and
schedule risks that we identified in our prior report, (2) evaluate
efforts to manage changes in requirements and whether any significant
changes have recently occurred, and (3) evaluate the adequacy of GOES-
R contingency plans. To assess NOAA's progress in developing GOES-R
and addressing key risks, we compared estimated and actual program
deliverables and analyzed monthly program status briefings to identify
current status and recent development challenges. We also followed up
on our prior concerns regarding reserve funds and scheduling practices
by comparing the program's current level of reserve funding and two
component schedules to best practices.[Footnote 1] By recalculating
reserve percentages based on supporting data and examining schedule
anomalies through use of a standard template, we determined data in
both areas to be reliable for the purposes of this audit. To assess
NOAA's efforts to manage changes in requirements, we compared the
agency's policies and practices to best practices identified by
leading organizations[Footnote 2] and identified major changes to the
program over time. To evaluate the adequacy of the GOES-R contingency
plan, we compared the GOES-R contingency plan to best practices in
contingency planning identified by leading organizations.[Footnote 3]
We also interviewed program officials as well as key internal and
external satellite data users.
We conducted this performance audit from October 2012 to September
2013 in accordance with generally accepted government auditing
standards. Those standards require that we plan and perform the audit
to obtain sufficient, appropriate evidence to provide a reasonable
basis for our findings and conclusions based on our audit objectives.
We believe that the evidence obtained provides a reasonable basis for
our findings and conclusions based on our audit objectives. See
appendix I for a complete description of our objectives, scope, and
methodology.
Background:
Since the 1970s, geostationary satellites have been used by the United
States to provide meteorological data for weather observation,
research, and forecasting. NOAA's National Environmental Satellite,
Data, and Information Service is responsible for managing the civilian
operational geostationary satellite system, called GOES. Geostationary
satellites can maintain a constant view of the earth from a high orbit
of about 22,300 miles in space.
NOAA operates GOES as a two-satellite system that is primarily focused
on the United States (see fig. 1). These satellites provide timely
environmental data about the earth's atmosphere, surface, cloud cover,
and the space environment to meteorologists and their audiences. They
also observe the development of hazardous weather, such as hurricanes
and severe thunderstorms, and track their movement and intensity to
reduce or avoid major losses of property and life. The ability of the
satellites to provide broad, continuously updated coverage of
atmospheric conditions over land and oceans is important to NOAA's
weather forecasting operations.
Figure 1: Approximate Geographic Coverage of the Geostationary
Operational Environmental Satellites:
[Refer to PDF for image: illustrated world map]
Map depicts coverage by:
GOES-West;
GOES-East;
Overlap of the two.
Sources: NOAA (data), Mapart (map).
[End of figure]
To provide continuous satellite coverage, NOAA acquires several
satellites at a time as part of a series and launches new satellites
every few years (see table 1). NOAA's policy is to have two
operational satellites and one backup satellite in orbit at all times.
Table 1: Summary of the Procurement History of the Geostationary
Operational Environmental Satellites:
Series name: Original GOES[C];
Procurement duration[A]: 1970-1987;
Satellites[B]: 1, 2, 3, 4, 5, 6, 7.
Series name: GOES I-M;
Procurement duration[A]: 1985-2001;
Satellites[B]: 8, 9, 10, 11, 12.
Series name: GOES N;
Procurement duration[A]: 1998-2010;
Satellites[B]: 13, 14, 15, Q[D].
Series name: GOES-R;
Procurement duration[A]: 2008-2024;
Satellites[B]: R, S, T, U.
Source: GAO analysis of NOAA data.
[A] Duration includes time from contract award to final satellite
launch.
[B] Satellites in a series are identified by letters of the alphabet
when they are on the ground (before launch) and by numbers once they
are in orbit.
[C] The procurement of these satellites consisted of four separate
contracts for (1) two early prototype satellites and GOES-1, (2) GOES-
2 and -3, (3) GOES-4 through -6, and (4) GOES-G (failed on launch) and
GOES-7.
[D] NOAA decided not to exercise the option for this satellite.
[End of table]
Four GOES satellites--GOES-12, GOES-13, GOES-14, and GOES-15--are
currently in orbit. Both GOES-13 and GOES-15 are operational
satellites, with GOES-13 covering the eastern United States and GOES-15
covering the western United States (see fig. 1). GOES-14 is currently in an
on-orbit storage mode and available as a backup for the other two
satellites should they experience any degradation in service. GOES-12
is at the end of its service life, but it is being used to provide
limited coverage of South America. The GOES-R series is the next
generation of satellites that NOAA is planning. The first two
satellites in the series (called GOES-R and GOES-S) are planned for
launch in October 2015 and February 2017, respectively.[Footnote 4]
Each of the operational geostationary satellites continuously
transmits raw environmental data to NOAA ground stations. The data are
processed at these ground stations and transmitted back to the
satellite for broadcast to primary weather services and the global
research community in the United States and abroad. Raw and processed
data are also distributed to users via ground stations through other
communication channels, such as dedicated private communication lines
and the Internet. Figure 2 depicts a generic data relay pattern from a
geostationary satellite to the ground stations and commercial
terminals.
Figure 2: Generic Data Relay Pattern for the Geostationary Operational
Environmental Satellites:
[Refer to PDF for image: illustration]
Ground station:
Communications link to users.
GOES satellite:
Raw environmental data sent to ground station;
Processed environmental data sent back to GOES;
Processed environmental data broadcast to users.
Source: GAO analysis of NOAA data.
[End of figure]
Overview of the GOES-R Program:
NOAA established the GOES-R program to develop and launch the next
series of geostationary satellites and to ensure the continuity of
geostationary satellite observations. Since its inception, the GOES-R
program has undergone several changes in cost and scope. As originally
envisioned, GOES-R was to encompass four satellites hosting a variety
of advanced technology instruments and providing 81 environmental
products. The first two satellites in the series were expected to
launch in September 2012 and April 2014. However, in September 2006,
NOAA decided to reduce the scope and technical complexity of the GOES-
R program because of expectations that total costs, which were
originally estimated to be $6.2 billion, could reach $11.4 billion.
Specifically, NOAA reduced the minimum number of satellites from four
to two, canceled plans for developing an advanced instrument (which
reduced the number of planned satellite products from 81 to 68), and
divided another instrument into two separate acquisitions. The agency
estimated that the revised program would cost $7 billion and kept the
planned launch dates unchanged.
Subsequently, NOAA made several other important decisions about the
cost and scope of the GOES-R program. In May 2007, NOAA had an
independent cost estimate completed for the GOES-R program. After
reconciling the program office's cost estimate of $7 billion with the
independent cost estimate of about $9 billion, the agency established
a new program cost estimate of $7.67 billion. This was an increase of
$670 million from the previous estimate. The program also moved the
launch dates for the first two satellites to December 2014 and April
2016. Further, in November 2007, to mitigate the risk that costs would
rise, program officials decided to remove selected program
requirements from the baseline program and treat them as contract
options that could be exercised if funds allowed. These requirements
included the number of products to be distributed, the time to deliver
the remaining products (product latency), and how often these products
would be updated with new satellite data (refresh rate). For example,
program officials eliminated the requirement to develop and distribute
34 of the 68 envisioned products, including low cloud and fog, sulfur
dioxide detection, and cloud liquid water. Program officials included
the restoration of the requirements for the products, latency times,
and refresh rates as options in the ground system contract that could
be acquired at a later time. Program officials later reduced the
number of products that could be restored as a contract option (called
option 2) from 34 to 31 because they determined that two products were
no longer feasible and two others could be combined into a single
product.
In late 2009, NOAA changed the launch dates for the first two
satellites to October 2015 and February 2017, in part due to a bid
protest related to award of the spacecraft contract. More recently,
NOAA restored two satellites to the program's baseline, making GOES-R
a four-satellite program once again. In February 2011, as part of its
fiscal year 2012 budget request, NOAA requested funding to begin
development for two additional satellites in the GOES-R series--GOES-T
and GOES-U. The program estimates that the development for all four
satellites in the GOES-R series--GOES-R, GOES-S, GOES-T, and GOES-U--
is to cost $10.9 billion through 2036, an increase of $3.2 billion
over its prior life cycle cost estimate of $7.67 billion for the two-
satellite program. See table 2 for an overview of key changes to the
GOES-R program.
Table 2: Key Changes to the Geostationary Operational Environmental
Satellite-R Series Program over Time:
Number of satellites:
August 2006 (baseline program): 4;
September 2006: 2;
November 2007: 2;
February 2011: 4.
Instruments or instrument changes:
August 2006 (baseline program):
* Advanced Baseline Imager;
* Geostationary Lightning Mapper;
* Magnetometer;
* Space Environmental In-Situ Suite;
* Solar Imaging Suite (which included the Solar Ultraviolet Imager,
and Extreme Ultraviolet/X-Ray Irradiance Sensor);
* Hyperspectral Environmental Suite;
September 2006:
* Advanced Baseline Imager;
* Geostationary Lightning Mapper;
* Magnetometer;
* Space Environmental In-Situ Suite;
* Solar Ultraviolet Imager;
* Extreme Ultraviolet/X-Ray Irradiance Sensor;
November 2007: No change;
February 2011: No change.
Number of satellite products:
August 2006 (baseline program): 81;
September 2006: 68;
November 2007: 34 baseline; 34 optional;
February 2011: 34 baseline; 31 optional.
Life cycle cost estimate (in then-year dollars):
August 2006 (baseline program): $6.2 billion--$11.4 billion (through
2034);
September 2006: $7 billion (through 2028);
November 2007: $7.67 billion (through 2028);
February 2011: $10.9 billion (through 2036)[A].
Estimated launch dates for GOES-R and S:
August 2006 (baseline program): GOES-R: September 2012;
GOES-S: April 2014;
September 2006: GOES-R: September 2012; GOES-S: April 2014;
November 2007: GOES-R: December 2014; GOES-S: April 2016;
February 2011: GOES-R: October 2015; GOES-S: February 2017.
Source: GAO analysis of NOAA data.
[A] Based on NOAA's fiscal year 2012 budget baseline, $7.64 billion of
this cost estimate was for the first two satellites in the series,
GOES-R and GOES-S. The cost for the remaining two satellites--GOES-T
and GOES-U--was estimated at $3.22 billion.
[End of table]
Program and Program Office Structure:
The GOES-R program is divided into flight and ground projects that
have separate areas of responsibility and oversee different sets of
contracts. The flight project, which is managed by NASA, includes
instruments, spacecraft, launch services, satellite integration, and
on-orbit satellite initialization. Table 3 summarizes the GOES-R
instruments and their planned capabilities.
Table 3: Geostationary Operational Environmental Satellite-R Series
Instruments:
Planned instrument: Advanced Baseline Imager;
Description: Expected to provide variable area imagery and radiometric
information of the earth's surface, atmosphere, and cloud cover. Key
features include:
* monitoring and tracking severe weather;
* providing images of clouds to support forecasts; and
* providing higher resolution, faster coverage, and broader coverage
simultaneously.
Planned instrument: Geostationary Lightning Mapper;
Description: Expected to continuously monitor total lightning (in-
cloud and cloud-to-ground) activity over the United States and
adjacent oceans and to provide a more complete dataset than previously
possible. Key features include:
* detecting lightning activity as an indicator of severe storms and
convective weather hazard impacts to aviation; and
* providing a new capability to GOES for long-term mapping of total
lightning that only previously existed on NASA low-earth-orbiting
research satellites.
Planned instrument: Magnetometer;
Description: Expected to provide information on the general level of
geomagnetic activity, monitor current systems in space, and permit
detection of magnetopause crossings, sudden storm commencements, and
substorms.
Planned instrument: Space Environmental In-Situ Suite;
Description: Expected to provide information on space weather to aid
in the prediction of particle precipitation, which causes disturbance
and disruption of radio communications and navigation systems. Key
features include:
* measuring magnetic fields and charged particles;
* providing improved heavy ion detection, adding low-energy electrons
and protons; and
* enabling early warnings for satellite and power grid operation,
telecom services, astronauts, and airlines.
Planned instrument: Solar Ultraviolet Imager;
Description: Expected to provide coverage of the entire dynamic range
of solar X-ray features, from coronal holes to X-class flares, and to
provide quantitative estimates of the physical conditions in the Sun's
atmosphere. Key features include:
* providing information used for geomagnetic storm forecasts and power
grid performance; and
* providing observations of solar energetic particle events related to
flares.
Planned instrument: Extreme Ultraviolet/X-Ray Irradiance Sensor;
Description: Expected to detect solar soft X-ray irradiance and solar
extreme ultraviolet spectral irradiance. Key features include:
* monitoring solar flares that can disrupt communications and degrade
navigational accuracy, affecting satellites, astronauts, and
high-latitude airline passengers; and
* monitoring solar variations that directly affect satellite
drag/tracking and ionospheric changes, which impact communications and
navigation operations.
Source: GAO analysis of NOAA data.
[End of table]
The ground project is directed by NOAA and is made up of three main
components: the core ground system, an infrastructure of antennas, and
a product access subsystem. In turn, the core ground system comprises
four functional modules supporting operations, product generation,
product distribution, and configuration control. Key components of the
ground project are described in table 4.
Table 4: Key Components of the Geostationary Operational Environmental
Satellite-R Series Ground Project:
Component: Core Ground System;
Description: Expected to (1) provide command of operational functions
of the spacecraft and instruments, (2) receive and process information
from the instruments and spacecraft, (3) distribute satellite data
products to users, and (4) provide configuration control and a common
infrastructure and set of services for the satellite and instruments.
Component: Antennas;
Description: Expected to provide six new antenna stations and modify
four existing antennas to receive GOES-R data. The antenna contract is
also expected to include the construction of related infrastructure,
software development for control systems, and maintenance.
Component: Product Distribution and Access System;
Description: Expected to provide ingestion of data and distribution
for GOES-R products and data to authorized users. When completed, this
system will be integrated into the core ground system.
Source: GAO analysis of NOAA data.
[End of table]
NOAA is responsible for GOES-R program funding and overall mission
success. The NOAA Program Management Council, which is chaired by
NOAA's Deputy Undersecretary, is the oversight body for the GOES-R
program. However, since it relies on NASA's acquisition experience and
technical expertise to help ensure the success of its programs, NOAA
implemented an integrated program management structure with NASA for
the GOES-R program (see fig. 3). NOAA also located the program office
at NASA's Goddard Space Flight Center.
Figure 3: Organizational Structure and Staffing of the Geostationary
Operational Environmental Satellite-R Series Program:
[Refer to PDF for image: organizational structure]
Top level:
* Commerce;
* NASA.
Next level:
* NOAA, reports to Commerce;
Next level:
* NOAA/NASA Program Management Council, communicates with NOAA;
Next level:
* National Environmental Satellite, Data, and Information Service
(NESDIS), reports to NOAA;
* Goddard Space Flight Center Management Council (with NOAA
representation), reports to NASA; communicates with National
Environmental Satellite, Data, and Information Service (NESDIS).
Next level:
* NESDIS/Science Mission Directorate Program Management Council;
communicates with NOAA/NASA Program Management Council and National
Environmental Satellite, Data, and Information Service (NESDIS).
Next level:
* GOES-R Program: System Program Director: NOAA; Deputy System Program
Director: NASA; Assistant System Program Director: NOAA; reports to
National Environmental Satellite, Data, and Information Service
(NESDIS); communicates with Goddard Space Flight Center Management
Council (with NOAA representation);
* Program Scientist: NOAA; reports to National Environmental
Satellite, Data, and Information Service (NESDIS); communicates with
GOES-R Program.
Next level:
* Flight Project: Project Manager: NASA; reports to GOES-R Program;
* Ground Project: Project Manager: NOAA; reports to GOES-R Program.
Source: NOAA.
[End of figure]
Prior Reports Made Recommendations to Address Program Weaknesses:
In recent years, we issued a series of reports aimed at addressing
weaknesses in the GOES-R program.[Footnote 5] Key areas of focus
included (1) improving communications with external data users, (2)
developing contingency plans, and (3) addressing key cost and schedule
risks.
* Improving communications with external users. In September 2010, we
reported that while NOAA had identified GOES data users and involved
internal NOAA users in developing and prioritizing GOES-R
requirements, it had not adequately involved other federal users who
rely on GOES data by documenting their input and communicating major
changes to the program.[Footnote 6] We recommended that the program
establish processes for satellite data requirements definition and
prioritization that include documented input from external users, as
well as processes to notify these non-NOAA agencies of GOES-R program
status and changes. In February 2012, the GOES-R program developed a
communications plan that described how external stakeholders would be
notified of GOES-R progress, status changes, and other relevant
activities. However, NOAA has not yet fully implemented the plan, as
demonstrated by the communication shortfalls discussed later in this
report.
* Developing contingency plans. In September 2010, we reported that
while there was a potential gap in backup coverage due to satellite
launch delays, NOAA had not established adequate continuity plans for
its geostationary satellites.[Footnote 7] We recommended that the
program's plan include implementation procedures, resources, staff
roles, and timetables needed to transition to a single satellite, a
foreign satellite, or other solution. In December 2012, NOAA finalized
a contingency plan that generally included these elements. However,
more work remains to ensure that the plan is viable.
More recently, in February 2013, we added the potential gaps in
weather satellite data to our biennial High-Risk list.[Footnote 8] In
that report, we noted that NOAA had established a contingency plan for
a potential gap in the GOES program, but it needed to demonstrate its
progress in coordinating with the user community to determine their
most critical requirements, conducting training and simulations for
contingency operations scenarios, evaluating the status of viable
foreign satellites, and working with the user community to account for
differences in product coverage under contingency operations
scenarios. We also stated that NOAA should update its contingency plan
to provide more details on its contingency scenarios, associated time
frames, and any preventative actions it is taking to minimize the
possibility of a gap.
* Addressing key cost and schedule risks. In June 2012, we reported
that the GOES-R program might not be able to ensure that it had
adequate resources to cover unexpected problems in remaining
development, and that unresolved schedule deficiencies existed in its
integrated master schedule and contractor schedules. We also reported
that the program estimated a 48 percent chance that the planned GOES-R
launch date of October 2015 would be met.[Footnote 9] We
recommended that the program assess and report on the reserves needed
for completing remaining development for each satellite in the series,
and address shortfalls in the schedule management practices we
identified such as eliminating unnecessary constraints and creating a
realistic allocation of resources, in order to minimize the likelihood
of a potential gap. The agency agreed with these recommendations and
took steps to address them by identifying needed reserve levels and
refining program schedules.
NOAA Has Made Progress in Developing GOES-R, but Continues to Face
Challenges that Could Increase the Risk of a Satellite Data Gap:
NOAA has completed its design of the GOES-R program, and has made
progress in building components of the flight and ground segments.
Program officials also report that the program is operating within its
estimated budget of $10.9 billion. However, key information on
reserves has not been reported to management. Further, both the flight
and ground segments have experienced delays in achieving major
milestones due to technical challenges, and weaknesses in the
development of master schedules could cause further delays. Program
officials stated that they have made improvements in how they manage
cost reserves and schedules, but acknowledged that opportunities for
improvement will remain because reserves and schedules are highly
dynamic on a program as large as GOES-R. These challenges have the
potential to impact the expected launch date of the first GOES-R
satellite, which would delay the availability of an on-orbit backup
and increase the potential for a gap in GOES satellite coverage should
either of the two operational satellites fail prematurely.
Program Has Completed Design and Begun Building Components of the
First Satellite:
NASA and NOAA are following NASA's standard space system life cycle on
the GOES-R program. This life cycle includes distinct phases,
including concept and technology development; preliminary design and
technology completion; final design and fabrication; system assembly,
integration and testing, launch and checkout; and operations and
sustainment. There are key program reviews throughout each of the
phases, including preliminary design review, critical design review,
and system integration review. NOAA and NASA jointly conduct key
reviews on the flight and ground segments individually as well as for
the program as a whole, and then make decisions on whether to proceed
to the next phase. Figure 4 provides an overview of the life cycle
phases, key program reviews, and associated decision milestones. In
addition, the key reviews are described in table 5.
Figure 4: NASA's Life Cycle for Flight Systems:
[Refer to PDF for image: life cycle illustration]
Formulation:
Pre-Phase A: Concept Studies;
KDP A.
Phase A: Concept and Technology Development;
KDP B;
SDR/MDR: system definition review/mission definition review.
Phase B: Preliminary Design and Technology Completion;
KDP C (confirmation review);
Program Start;
PDR: preliminary design review.
Implementation:
Phase C: Final Design and Fabrication;
CDR: critical design review;
MOR: mission operations review;
SIR: system integration review;
KDP D.
Phase D: System Assembly, Integration and Test, Launch;
FOR: flight operations review;
ORR: operational readiness review;
KDP E.
Phase E: Operations and Sustainment;
KDP F.
Phase F: Closeout.
Source: NASA data and GAO analysis.
Note: According to a NASA official, the MOR and FOR are considered
lower-level reviews and are not required by NASA's primary procedural
requirements. They are, however, key mission reviews required by NASA's
Goddard Space Flight Center.
[End of figure]
Table 5: Major Development Reviews for the Geostationary Operational
Environmental Satellite-R Series:
Review: System Definition Review;
Description: Performed on the flight
and ground segments individually, and then on the program as a whole,
this review is to examine the proposed system architecture/design and
demonstrate that a system that fulfills the mission objectives can be
built within existing constraints.
Review: Preliminary Design Review;
Description: Performed on the flight
and ground segments individually, and then on the program as a whole,
this review is to demonstrate that the preliminary design meets all
system requirements with acceptable risk and within the cost and
schedule constraints and to establish the basis for proceeding with
detailed design.
Review: Critical Design Review;
Description: Performed on the flight
and ground segments individually, and then on the program as a whole,
this review is to evaluate the completed detailed design of the element
and subsystem products in sufficient detail to provide approval for a
production stage.
Review: Mission Operations Review;
Description: Performed programwide,
this review is to establish the adequacy of plans and schedules for
ground systems and flight operations preparation, and to justify
readiness to proceed with implementation of the remaining required
activities. It is typically held subsequent to completion of detail
design and fabrication activity, but prior to initiation of major
integration activities of flight or ground-system elements.
Review: System Integration Review;
Description: Performed programwide,
this review is to evaluate the readiness of the project to start system
assembly, test, and launch operations. The objectives of the review
include ensuring that planning is adequate for all remaining system
activities and that available cost and schedule resources support
completion of all necessary remaining activities with adequate margin.
Review: Flight Operations Review;
Description: This review is to
present the results of mission operations activities and show that the
program has verified compliance with all requirements and demonstrated
the ability to execute all phases and modes of mission operations, data
processing, and analysis.
Review: Operational Readiness Review;
Description: This review is to
examine characteristics and procedures used in the system's operation
and to ensure that all system and support hardware, software, personnel,
and procedures are ready for operations and that user documentation
accurately reflects the deployed state of the system. It is typically
held near the completion of pre-launch testing between the flight
segment and the ground system.
Source: GAO analysis of NOAA documentation.
[End of table]
The GOES-R program has completed final design and begun building
components of the flight and ground systems. Specifically, the program
completed critical design reviews for the flight and ground projects
and for the overall program between April and November 2012. In its
evaluation of the program as part of the critical design review, an
independent review board complimented the program on several recent
achievements, stating that the program was beyond the level of
maturity expected at that phase, and that the program's planning was a
major factor in the launch date of the first satellite remaining on
track for October 2015.
As the spacecraft and instruments are developed, NASA conducts several
interim reviews and tests before proceeding to the next major program-
level review, the system integration review. These include a pre-
environmental review, which represents the conclusion of an initial
round of testing before exposing the instrument to testing under
adverse environmental conditions; environmental testing of key
functions under adverse conditions; and a pre-shipment review, which
is conducted on each instrument to ensure it is ready to be shipped
for integration and testing on the spacecraft.
The GOES-R flight components are in various stages leading up to the
system integration review. Of the six GOES-R instruments, one has
completed environmental testing and its pre-shipment review; four
instruments are in the midst of these reviews and tests; and one
instrument has not yet passed its pre-environmental review. In
addition, the program began building the spacecraft in February 2013.
Table 6 provides more information on progress made on the key flight
project components.
Table 6: Development Status of Flight Project Components for the
Geostationary Operational Environmental Satellite-R Satellite, as of
August 2013:
Key components: Advanced Baseline Imager;
Recent progress:
* Pre-environmental review completed in November 2012;
* Environmental testing under way.
Key components: Extreme Ultraviolet/X-Ray Irradiance Sensor;
Recent progress:
* Instrument fully assembled and tested;
* Pre-environmental review conducted in July 2012;
* Pre-ship review conducted in April 2013.
Key components: Geostationary Lightning Mapper;
Recent progress:
* Assembly of some subcomponents completed, others continuing;
* Subcomponent testing is under way.
Key components: Magnetometer;
Recent progress:
* Selected components have completed readiness reviews and tests;
* Environmental testing is under way.
Key components: Solar Ultraviolet Imager;
Recent progress:
* Pre-environmental review completed in November 2012;
* Environmental testing is under way;
* Pre-ship review scheduled for July 2013.
Key components: Space Environmental In-Situ Suite;
Recent progress:
* Individual component testing completed;
* Pre-environmental review conducted in May 2013.
Key components: Spacecraft;
Recent progress:
* Core structure testing completed; multiple components delivered;
* Integration of subsystems under way;
* Construction of the system module that will host instruments is
under way.
Source: GAO analysis of NOAA documentation.
[End of table]
Similar to the flight project, major ground system milestones are
focused on building and testing components, and the program has made
progress in this area. Specifically, on the core ground system, a
prototype for the operations module was delivered in late 2012 and
used for initial testing and training.[Footnote 10] In July 2013, the
ground project delivered the iteration of the operations module that
will be used to support the first satellite. In addition, the program
has installed antenna dishes at NOAA's primary satellite
communications site, and completed two key reviews of antennas at the
GOES remote backup site. The Product Distribution and Access System
recently completed a review that will allow testing to begin on its
first release. An integration review for ground components is also
expected to take place in January 2014. More detail on the progress of
the ground project can be seen in table 7.
Table 7: Development Status of the Geostationary Operational
Environmental Satellite-R Series Ground Project Components, as of
August 2013:
Key components: Core Ground Segment;
Recent progress:
* Critical design review completed in November 2012;
* Completed readiness review for receipt of GOES-R antennas;
* Prototype delivery and installation for the mission operations
function completed; first release scheduled for June 2013.
Key components: Antenna System;
Recent progress:
* Contractor demonstrated ability to produce 8 of 13 components;
remainder due in May 2013;
* Installation of the first two antenna structures has been completed;
the third antenna structure is scheduled to be completed in fiscal
year 2014;
* Supporting infrastructure built, and two key reviews completed, for
remote back-up antenna site.
Key components: Product Distribution and Access System;
Recent progress:
* Testing begun on first increment/release.
Source: GAO analysis of NOAA documentation.
[End of table]
The program's next major milestone is a programwide system integration
review, which is scheduled for March 2014. Based on the results of
that review, NOAA and NASA will decide whether to move the program to
the next phase: the system assembly, integration and test, and launch
and checkout phase.
Contingency Reserves Are Generally in Line with Goals for Overall
Program Development; Reporting on Reserve Values Remains Limited:
The GOES-R program is estimated to cost $10.9 billion. As of February
2013, the program estimated that this amount was divided into four
categories, with $6.0 billion for the flight project, $1.7 billion for
the ground project, $2.0 billion for other program costs (including,
among other things, program/project support and formulation) and $1.2
billion for operations and support. Program officials reported that
the program is currently operating without cost overruns on any of its
main components, but noted that the program life cycle costs may
increase by $150 to $300 million if full funding in the current fiscal
year is not received.
A portion of the amounts planned for the flight project and ground
project is allocated to contingency reserves (also called management
reserve). The program also keeps a programwide contingency allocation
separate from those of the flight and ground projects. A contingency
reserve provides program managers ready access to funding in order to
resolve problems as they occur and may be necessary to cover increased
costs resulting from unexpected design complexity, incomplete
requirements, or other uncertainties.[Footnote 11] NASA's Goddard
Space Flight Center requires its flight projects, including GOES-R, to
maintain contingency reserves during system development, at a level of
25 percent of development costs.[Footnote 12] The GOES-R program
requires its flight and ground projects to maintain 20 percent of
planned remaining development costs as reserve funding.[Footnote 13]
The program office also maintains contingency reserves equal to 10
percent of planned remaining development costs to cover program
support costs and to supplement the flight and ground projects'
reserves if necessary. According to a NOAA program official, the GOES-
R program is able to meet NASA's requirement through the combination
of the 20 percent flight and ground project requirements and the
supplemental 10 percent program-level reserve. An official also stated
that the method of keeping separate reserves at the program and
project levels was chosen because it had been successful on past
projects.
The GOES-R flight project, ground project, and program office are at
or above the amount of reserves they are required to carry.
Specifically, as of March 2013, the overall contingency reserve
percentages for the flight and ground projects were at 20 and 28
percent, respectively, which are at or above the required level of 20
percent. The program reserves were at 11 percent, slightly above the
required level of 10 percent. Reserve values and percentages are
provided in table 8.
Table 8: Reserve Levels for the Geostationary Operational
Environmental Satellite-R Series Program, as of March 2013:
Component: Flight;
Required reserve: 20%;
Remaining reserve[B]: 20%.
Component: Ground;
Required reserve: 20%;
Remaining reserve[B]: 28%.
Component: Program;
Required reserve: 10%;
Remaining reserve[B]: 11%.
Component: Total;
Required reserve: 25%[A];
Remaining reserve[B]: 29%.
Source: NOAA data and GAO analysis of NOAA data.
Notes:
[A] NASA and NOAA officials stated that the allocation of reserves
among flight, ground, and program components meets NASA's requirement
of 25 percent because the program category includes reserve funds that
can be used to supplement the flight and ground components.
[B] A series of adjustments are made to the flight, ground, and
program budget amounts before the reserve percentage is calculated;
thus the reserve percentage cannot be derived simply by dividing
contingency reserves by total budget authority.
[End of table]
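To illustrate the reserve check described above, the short sketch below computes reserve percentages and compares them with the required levels. The dollar figures are hypothetical, chosen only to reproduce the percentages in table 8; the program's actual calculation applies a series of adjustments (see note B) that are not modeled here.

```python
# Hypothetical illustration of the reserve-level check. These dollar
# figures are NOT actual GOES-R budget data; the program's real
# calculation applies further adjustments not modeled here.

def reserve_percentage(reserve, remaining_dev_cost):
    """Reserve held as a percentage of planned remaining development cost."""
    return 100.0 * reserve / remaining_dev_cost

def meets_requirement(reserve, remaining_dev_cost, required_pct):
    """True if the held reserve meets or exceeds the required percentage."""
    return reserve_percentage(reserve, remaining_dev_cost) >= required_pct

# component: (reserve held, remaining development cost, required %),
# amounts in millions of dollars (illustrative only).
components = {
    "flight":  (400.0, 2000.0, 20.0),
    "ground":  (280.0, 1000.0, 20.0),
    "program": (55.0,   500.0, 10.0),
}

for name, (reserve, cost, required) in components.items():
    pct = reserve_percentage(reserve, cost)
    status = "meets" if meets_requirement(reserve, cost, required) else "below"
    print(f"{name}: {pct:.0f}% held, {required:.0f}% required ({status})")
```

With these illustrative inputs, the sketch reproduces the 20, 28, and 11 percent figures reported in table 8.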
Reporting on Reserves Is Not Sufficiently Detailed or Transparent:
We previously reported that, in order to oversee GOES-R contingency
funding, senior managers should have insight into the amount of
reserves set aside for each satellite in the program and detailed
information on how reserves are being used on both the flight and
ground components. While the GOES-R program continues to regularly
report on contingency funds, it does not report key information on the
status of its reserve funding to senior level management.[Footnote 14]
At monthly program management council meetings, the program reports
summary information, such as the total value of contingency reserves
and reserve percentage held for each fiscal year. Reserve totals are
given for the flight and ground projects as well as for the program
overall. However, the program does not report on the reserves needed
for completing remaining development for each satellite in the series
or provide detailed information on how reserves are being used. Thus,
for example, a high level of reserves for the later satellites in the
series could mask low reserve values for the earlier satellites.
Further, in its monthly presentations to senior
managers and NOAA executives, the program does not include information
on the cause of any changes in reserve values from the prior month or
the assumptions it makes when calculating reserves. For example, the
flight reserve value recently went up by 2 percentage points because
the program decided to include reserve funding for the GOES-T
satellite in 2018, and the ground reserve values went down by 10
percentage points because the program shifted reserve funding from the
ground to the flight projects. Neither of these changes was identified
or explained in the monthly presentations. The lack of insight into how
the reserves are calculated and modified could lead executives to
misinterpret the size of the remaining reserves. Program officials
noted that they took steps after our previous report to clarify what
they report about reserves, but stated that the amount of information
needed to fully explain reserve calculations and changes could be too
much information for an executive-level briefing. Without regularly
providing sufficiently detailed budget information, it may be more
difficult for program management to have the information they need to
make the best decisions possible regarding the program's future
funding.
Recent and Potential Milestone Delays and Continued Weaknesses in
Scheduling Practices Increase the Potential for a Delayed Launch:
The GOES-R program established programwide milestones, including the
mission operations review and flight operations review, to determine
the program's ability to proceed to system integration and to complete
mission operations, respectively. It also established five end-to-end
system tests to validate compatibility between the space and ground
segments before the launch of the first satellite.
However, over the past year, the program delayed many of these key
milestones and tests. The delay in the mission operations review means
that the large-scale integration of flight and ground components will
not occur until 21 months prior to launch. Similarly, delaying end-to-
end tests until 17 months prior to launch will allow the program less
time to respond to any problems that occur. Table 9 highlights key
milestones and the extent of recent delays.
Table 9: Delays in Milestones for the Geostationary Operational
Environmental Satellite-R Series Program:
Program milestone: Mission operations review;
Date planned (as of April 2012): January 2013;
Date completed or planned (as of March 2013): January 2014;
Delay: 12 months[A].
Program milestone: End-to-end test #1;
Date planned (as of April 2012): February 2014;
Date completed or planned (as of March 2013): May 2014;
Delay: 3 months.
Program milestone: End-to-end test #2;
Date planned (as of April 2012): May 2014;
Date completed or planned (as of March 2013): August 2014;
Delay: 3 months.
Program milestone: End-to-end test #3;
Date planned (as of April 2012): August 2014;
Date completed or planned (as of March 2013): December 2014;
Delay: 4 months.
Program milestone: Flight operations review;
Date planned (as of April 2012): September 2014;
Date completed or planned (as of March 2013): January 2015;
Delay: 4 months.
Program milestone: End-to-end test #4;
Date planned (as of April 2012): December 2014;
Date completed or planned (as of March 2013): March 2015;
Delay: 3 months.
Program milestone: End-to-end test #5;
Date planned (as of April 2012): July 2015;
Date completed or planned (as of March 2013): July 2015;
Delay: No change.
Source: GAO analysis of NOAA data.
[A] Program officials stated that they had erroneously scheduled the
mission operations review too soon, and moved the date by 9 months to
better reflect when the review was needed. Therefore, only 3 of the 12
months were attributable to a delay.
[End of table]
Continued Technical Issues Could Cause Further Delays:
The GOES-R program is also experiencing technical issues on the flight
and ground projects that could cause further schedule delays.
* The original supplier for a key component on the spacecraft moved to
a different facility, introducing risk due to the loss of experienced
personnel and the impact on schedule. This led the program to find an
alternative supplier. While a design review was performed to confirm
resolution of the issue in April 2013, this change may lead to a delay
of up to 6 months in integrating the component on the spacecraft.
Program officials noted that this delay is not expected to impact the
program's critical path or major milestones.
* The Geostationary Lightning Mapper's electronics unit experienced
problems during testing, which led the program office to delay the
tests.[Footnote 15] The program is considering several options to
address this issue, including using the electronics unit being
developed for a later GOES satellite to allow key components to
proceed with testing. If the issue cannot be resolved, it would affect
the instrument's performance. As a result, the program is also
considering excluding the Geostationary Lightning Mapper from the
first GOES satellite. The program plans to make its decision on
whether or not to include the instrument in late 2013. The removal of
this instrument would cause a significant reduction in the satellite's
functionality. Key GOES users have stated that they would prefer that
NOAA delay launching the GOES-R satellite rather than launch it
without the Geostationary Lightning Mapper.
* The program delayed the start of work on the ground system at the
NOAA satellite operations facility by 3 months, from a planned
date of October 2012 to January 2013, following a bid protest of the
award of a contract to upgrade the facility. This delay compressed an
already tight schedule for testing the ground system.
* Testing for a number of ground system requirements has been
postponed until future releases and builds, potentially requiring
changes to the schedule for these future products.
* Power amplifiers for the antenna systems experienced higher than
expected failure rates, which could lead to schedule delays and
decreases in operational availability.
Given that fewer than 3 years remain before GOES-R's expected launch
in October 2015, continued delays in key milestones and reviews
decrease the likelihood that the launch date will be met. Program
officials recently acknowledged that the GOES-R launch date may be
delayed by about 6 months, and attributed the cause of the delay to a
shortfall of $54 million in anticipated funding in fiscal year 2013.
[Footnote 16]
Scheduling Practices Improved, but Weaknesses Remain:
The program's remaining schedule is also at risk of further delays
due to weaknesses in the program's scheduling methods. Program
schedules not only provide a road map for systematic program
execution, but also provide the means by which to gauge progress,
identify and address potential problems, and promote accountability.
Achieving success in managing large-scale programs depends in part on
having an integrated and reliable schedule that defines, among other
things, when work activities and milestone events will occur, how long
they will take, and how they are related to one another. Without such
a reliable schedule, program milestones may slip.
In June 2012, we reported on weaknesses in program schedules that
made up portions of the program's Integrated Master Schedule,
including subordinate schedules for the spacecraft and core ground
system. At that time, our work identified nine best practices
associated with developing and maintaining a reliable schedule.
[Footnote 17] These are (1) capturing all activities, (2) sequencing
all activities, (3) assigning resources to all activities, (4)
establishing the duration of all activities, (5) integrating schedule
activities horizontally and vertically, (6) establishing the critical
path for all activities, (7) identifying reasonable float time between
activities, (8) conducting a schedule risk analysis, and (9) updating
the schedule using logic and durations. See table 10 for a description
of each of these best practices.
Table 10: Description of Scheduling Best Practices:
Practice: Capturing all activities;
Description: The schedule should reflect all activities (steps,
events, outcomes, etc.) as defined in the program's work breakdown
structure to include activities to be performed by both the government
and its contractors.
Practice: Sequencing all activities;
Description: The schedule should sequence activities in the order that
they are to be implemented. In particular, activities that must finish
prior to the start of other activities (i.e., predecessor activities),
as well as activities that cannot begin until other activities have
been completed (i.e., successor activities) should be identified.
Practice: Assigning resources to all activities;
Description: The schedule should reflect who will do the work
activities, whether all required resources will be available when they
are needed, and whether there are any funding or time constraints.
Practice: Establishing the duration of all activities;
Description: The schedule should reflect the duration of each
activity. These durations should be as short as possible and have
specific start and end dates.
Practice: Integrating schedule activities horizontally and vertically;
Description: The schedule should be horizontally integrated, meaning
that it should link the products and outcomes associated with sequenced
activities. The schedule should also be vertically integrated, meaning
that there is traceability among varying levels of activities and
supporting tasks and subtasks.
Practice: Establishing the critical path for all activities;
Description: The critical path represents the chain of dependent
activities with the longest total duration in the schedule.
Practice: Identifying reasonable float time between activities;
Description: The schedule should identify a reasonable amount of
float--the time that an activity can slip before the delay affects the
finish milestone--so that schedule flexibility can be determined. As a
general rule, activities along the critical path typically have the
least amount of float.
Practice: Conducting a schedule risk analysis;
Description: A schedule risk analysis is used to predict the level of
confidence in the schedule, determine the amount of time contingency
needed, and identify high-priority schedule risks.
Practice: Updating the schedule using logic and durations to determine
the dates;
Description: The schedule should use logic and durations in order to
reflect realistic start and completion dates, be continually monitored
to determine differences between forecasted completion dates and
planned dates, and avoid logic overrides and artificial constraint
dates.
Source: GAO analysis of government and industry practices in GAO-09-3SP.
[End of table]
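The critical-path and float concepts in table 10 (best practices 6 and 7) can be sketched with a standard forward and backward pass over an activity network. The four-activity network below is hypothetical and is not the GOES-R schedule; it shows only the mechanics of the calculation.

```python
# Minimal sketch of critical-path and float calculation (best
# practices 6 and 7). The network below is hypothetical, not the
# GOES-R schedule.

durations = {"A": 3, "B": 2, "C": 4, "D": 2}
predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

# Forward pass: earliest finish of each activity.
early_finish = {}
for act in ["A", "B", "C", "D"]:  # topological order
    early_start = max((early_finish[p] for p in predecessors[act]), default=0)
    early_finish[act] = early_start + durations[act]

project_finish = max(early_finish.values())

# Backward pass: latest start of each activity.
successors = {a: [b for b in durations if a in predecessors[b]]
              for a in durations}
late_start = {}
for act in ["D", "C", "B", "A"]:  # reverse topological order
    late_finish = min((late_start[s] for s in successors[act]),
                      default=project_finish)
    late_start[act] = late_finish - durations[act]

# Float = latest start minus earliest start; zero float means critical.
floats = {a: late_start[a] - (early_finish[a] - durations[a])
          for a in durations}
critical_path = [a for a in durations if floats[a] == 0]

print("project duration:", project_finish)  # 9
print("float:", floats)                     # B has 2 units of float
print("critical path:", critical_path)      # ['A', 'C', 'D']
```

As the table notes, activities on the critical path carry the least float: here activity B can slip 2 time units without delaying the finish milestone, while any slip on A, C, or D delays the whole project.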
In a previous report, we observed that important schedule components
in GOES-R related schedules had not been included or completed, and
recommended that these shortfalls be addressed.[Footnote 18] NOAA has
since improved selected practices on its spacecraft and core ground
schedules, but other practices stayed the same or worsened.
Specifically, for the spacecraft, 2 practices were improved, 5 stayed
the same, and 2 became weaker. For the core ground system, 4 practices
were improved, 3 stayed the same, and 2 became weaker. Table 11
compares our assessments of the spacecraft and core ground system
schedules in July 2011 and November 2012.
Table 11: Assessment of Selected Schedules' Use of Best Practices over
Time:
Scheduling best practice: Best practice 1: Capturing all activities;
Spacecraft schedules:
July 2011: The agency/contractor has substantially met the criteria
for this best practice;
November 2012: The agency/contractor has fully met the criteria for
this best practice;
Core ground schedules:
July 2011: The agency/contractor has partially met the criteria for
this best practice;
November 2012: The agency/contractor has substantially met the
criteria for this best practice.
Scheduling best practice: Best practice 2: Sequencing all activities;
Spacecraft schedules:
July 2011: The agency/contractor has partially met the criteria for
this best practice;
November 2012: The agency/contractor has partially met the criteria
for this best practice;
Core ground schedules:
July 2011: The agency/contractor has partially met the criteria for
this best practice;
November 2012: The agency/contractor has partially met the criteria
for this best practice.
Scheduling best practice: Best practice 3: Assigning resources to all
activities;
Spacecraft schedules:
July 2011: The agency/contractor has minimally met the criteria for
this best practice;
November 2012: The agency/contractor has partially met the criteria
for this best practice;
Core ground schedules:
July 2011: The agency/contractor has partially met the criteria for
this best practice;
November 2012: The agency/contractor has minimally met the criteria
for this best practice.
Scheduling best practice: Best practice 4: Establishing the duration of
all activities;
Spacecraft schedules:
July 2011: The agency/contractor has substantially met the criteria
for this best practice;
November 2012: The agency/contractor has substantially met the
criteria for this best practice;
Core ground schedules:
July 2011: The agency/contractor has substantially met the criteria
for this best practice;
November 2012: The agency/contractor has substantially met the
criteria for this best practice.
Scheduling best practice: Best practice 5: Integrating schedule
activities horizontally and vertically;
Spacecraft schedules:
July 2011: The agency/contractor has substantially met the criteria
for this best practice;
November 2012: The agency/contractor has partially met the criteria
for this best practice;
Core ground schedules:
July 2011: The agency/contractor has partially met the criteria for
this best practice;
November 2012: The agency/contractor has minimally met the criteria
for this best practice.
Scheduling best practice: Best practice 6: Establishing the critical
path for all activities;
Spacecraft schedules:
July 2011: The agency/contractor has minimally met the criteria for
this best practice;
November 2012: The agency/contractor has substantially met the
criteria for this best practice;
Core ground schedules:
July 2011: The agency/contractor has minimally met the criteria for
this best practice;
November 2012: The agency/contractor has partially met the criteria
for this best practice.
Scheduling best practice: Best practice 7: Identifying float on
activities and paths;
Spacecraft schedules:
July 2011: The agency/contractor has minimally met the criteria for
this best practice;
November 2012: The agency/contractor has minimally met the criteria
for this best practice;
Core ground schedules:
July 2011: The agency/contractor has minimally met the criteria for
this best practice;
November 2012: The agency/contractor has minimally met the criteria
for this best practice.
Scheduling best practice: Best practice 8: Conducting a schedule risk
analysis;
Spacecraft schedules:
July 2011: The agency/contractor has minimally met the criteria for
this best practice;
November 2012: The agency/contractor has minimally met the criteria
for this best practice;
Core ground schedules:
July 2011: The agency/contractor has minimally met the criteria for
this best practice;
November 2012: The agency/contractor has partially met the criteria
for this best practice.
Scheduling best practice: Best practice 9: Updating the schedule using
logic and durations to determine the dates;
Spacecraft schedules:
July 2011: The agency/contractor has fully met the criteria for this
best practice;
November 2012: The agency/contractor has substantially met the
criteria for this best practice;
Core ground schedules:
July 2011: The agency/contractor has substantially met the criteria
for this best practice;
November 2012: The agency/contractor has fully met the criteria for
this best practice.
Source: GAO analysis of schedules provided by the GOES-R program, as
well as documents and information received from GOES-R officials.
[End of table]
NOAA has improved elements of the schedules for both components.
Specifically, the spacecraft schedule has eliminated level of effort
activities[Footnote 19] and has assigned resources for a greater
percentage of activities. The core ground schedule now has an
automated process by which all subcontractor records are combined to
create an integrated schedule. It has a series of connected activities
that lead to what contractor officials consider its main milestone
delivery, and has implemented a detailed schedule risk analysis for a
key upcoming release.
However, scheduling issues remain in the schedules for both
components. For example, both schedules have problems with the
sequencing of remaining activities and with integration between
activities. Regarding the
spacecraft schedule, there is a small subset of activities with
incomplete links between activities, and more than 20 percent of
remaining detail activities have lags, or a set number of days between
an activity and its successor. In the core ground schedule, a number
of activities are missing either predecessor or successor links, and
several activities that represent the end of the project finish on or
about the same date. Without the right linkages, activities that
slip early in the schedule do not transmit delays to activities that
should depend on them. When this happens, the schedule will not
provide a sufficient basis for understanding the program as a whole,
and users of the schedule will lack confidence in the dates and the
critical path.
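To illustrate why the missing links matter, the following is a minimal sketch (the activities, durations, and link structure are hypothetical, not drawn from the GOES-R schedules). A forward pass computes each activity's start date from its linked predecessors; a slip on activity A reaches its linked successor B, but never reaches C, whose predecessor link is missing:

```python
# Hypothetical three-activity schedule (durations in workdays).
# B is linked after A; C should also follow A, but its predecessor
# link is missing, mirroring the incomplete links described above.
durations = {"A": 10, "B": 5, "C": 7}
successors = {"A": ["B"]}          # the A -> C link is absent

def forward_pass(slip_on_a=0):
    """Early start dates: an activity starts when its latest linked
    predecessor finishes; a slip on A reaches only linked successors."""
    start = {"A": slip_on_a, "C": 0}   # nothing links C, so it starts at day 0
    finish = {}
    for act in ["A", "C", "B"]:        # simple dependency order
        if act not in start:
            start[act] = max(finish[p] for p, succ in successors.items()
                             if act in succ)
        finish[act] = start[act] + durations[act]
    return start

print(forward_pass(0))           # {'A': 0, 'C': 0, 'B': 10}
print(forward_pass(slip_on_a=4)) # A's slip moves B to day 14; C still starts at 0
```

Because C's start date never changes, the schedule gives a falsely stable picture of downstream work, which is the effect the missing links described above produce.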
Both schedules also show very high average total float times for
detail activities.[Footnote 20] Specifically, total float time is
greater than two months for nearly two-thirds of remaining detail
activities in the spacecraft schedule, and at least a year for more
than 10 percent of remaining detail activities in the core ground
schedule. In the case of spacecraft, officials stated that high levels
of float time were often due to activities that had been completed at
one time for several satellites, only one of which was immediately
needed. Officials also provided detailed information on the activities
with the highest amount of float. In the case of the core ground
schedule, officials stated that many activities occurring after the
main milestone date, which occurs nearly five years prior to the end
of the schedule, do not have a true successor, and therefore are
calculated only to the end of the contract. Officials also stated that
values and trends in float time are monitored regularly for both
schedules. Such high total float values can misrepresent true
project status, making it difficult to determine which activities
drive key milestone dates. Without reasonable values, total float
cannot be used to identify activities that could be permitted to
slip, freeing resources for reallocation to activities that require
more resources to be completed on time.
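The relationship between total float and the activities that drive milestone dates can be sketched with a minimal critical-path-method pass over a hypothetical four-activity network (the names and durations are illustrative only, not GOES-R data). Float is the gap between an activity's late and early start; zero-float activities form the critical path:

```python
# Minimal critical-path-method sketch on a hypothetical network,
# showing how total float is derived and why inflated float values
# obscure which activities actually drive the end date.
durations = {"design": 20, "build": 30, "test": 10, "docs": 5}
preds = {"build": ["design"], "test": ["build"], "docs": ["design"]}
order = ["design", "build", "docs", "test"]   # topological order

es, ef = {}, {}                               # forward pass: early start/finish
for a in order:
    es[a] = max((ef[p] for p in preds.get(a, [])), default=0)
    ef[a] = es[a] + durations[a]
project_end = max(ef.values())

succs = {p: [a for a in preds if p in preds[a]] for p in durations}
lf, ls = {}, {}                               # backward pass: late finish/start
for a in reversed(order):
    lf[a] = min((ls[s] for s in succs[a]), default=project_end)
    ls[a] = lf[a] - durations[a]

total_float = {a: ls[a] - es[a] for a in durations}
critical = [a for a in order if total_float[a] == 0]
print(total_float)  # docs carries 35 days of float; the rest carry none
print(critical)     # ['design', 'build', 'test']
```

In this sketch, "docs" could slip by 35 days without moving the end date, so its resources could be reallocated; when most activities carry months or years of float, as described above, that signal disappears.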
In addition, the project's critical path does not align with the
activities that make up the driving path[Footnote 21] on the core
ground schedule. Contractors monitor driving paths to both major and
minor milestone deliveries on a monthly basis. However, until the
schedule can
produce a true critical path, it will be more difficult for the
program office to provide reliable time line estimates or identify
when problems or changes may occur and their effect on downstream
work. Also, without a valid critical path to the end of the schedule,
management cannot focus on activities that will have a detrimental
effect on the key project milestones and deliveries if they slip.
Further, neither schedule file has fully integrated resources with
schedule activities. As of November 2012, contractor officials stated
that the ground system schedule was not feasible given available
resources and that they were in the process of revising their
immediate schedules to make them feasible. The spacecraft schedule
contains major resource categories that correspond to contractor sites
and work phases. However, thresholds for overruns of resource
allocations are functionally disabled within the schedules through the
setting of an arbitrarily high value for maximum resources per
category. In response, contractor officials stated that account
managers are responsible for monitoring resource levels and that
weekly meetings are held to ensure that resource issues are discussed.
Information on resource needs and availability in each work period
assists the program office in forecasting the likelihood that
activities will be completed as scheduled. If the current schedule
does not allow insight into current or projected allocation of
resources, then the risk of delays in the program's schedule is
significantly increased.
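The effect of the disabled resource thresholds can be shown with a small sketch (the staffing figures are hypothetical): a realistic capacity flags an overrun, while an arbitrarily high capacity, like the one described above, flags nothing:

```python
# Hypothetical per-month staffing demand for one resource category,
# checked against a capacity threshold. Setting the threshold to an
# arbitrarily high value silences every flag.
demand = {"2013-01": 12, "2013-02": 18, "2013-03": 25, "2013-04": 14}

def overruns(demand, capacity):
    """Return the work periods whose demand exceeds the capacity threshold."""
    return [period for period, staff in demand.items() if staff > capacity]

print(overruns(demand, capacity=20))     # ['2013-03']: a real overrun is flagged
print(overruns(demand, capacity=10**6))  # []: the check is effectively disabled
```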
Deficiencies in scheduling practices such as the ones outlined here
could increase the likelihood of launch date delays, because decision
making would be based on data that do not accurately depict current
status, impeding management's ability to conduct meaningful oversight
of the program's schedules. Program officials noted that
they have made improvements in scheduling practices, but explained
that because the schedules are so dynamic there are always areas for
improvement. A lack of proper understanding of current program
status, caused by schedules that are not fully reliable, undercuts
the program office's ability to manage a high-risk program like
GOES-R.
Delays in the GOES-R Launch Date Could Increase the Risk of a
Satellite Data Gap:
Potential delays in the launch date of the first GOES-R satellite
would increase the risk of a gap in GOES satellite coverage. NOAA's
policy is to have two operational satellites and one backup satellite
in orbit at all times. This policy proved useful in December 2008 and
again in September 2012, when the agency experienced problems with one
of its operational satellites, but was able to move its backup
satellite into place until the problems had been resolved.
NOAA is facing a period of at least a year when it will not have a
backup satellite in orbit. Specifically, in April 2015, NOAA expects
to retire one of its operational satellites (GOES-13) and move its
backup satellite (GOES-14) into operation. Thus, the agency will have
only two operational satellites in orbit--and no backup satellite--
until GOES-R is launched and completes an estimated 6-month post-
launch test period. If GOES-R is launched in October 2015, the soonest
it could be available for operational use would be April 2016. Any
delay to the GOES-R launch would extend the time without a backup to
more than one year. Figure 5 shows anticipated operational and test
periods for the two most recent series of GOES satellites.
Figure 5: Potential Gap in Geostationary Operational Environmental
Satellite Coverage:
[Refer to PDF for image: illustration]
GOES-13:
Available as backup: Calendar year 2009-2010;
Operational period: Calendar year 2010-2015.
GOES-14:
Launch date: Calendar year 2009;
Post launch test period: Calendar year 2009;
Available as backup: Calendar year 2010-2015;
Operational period: Calendar year 2015-2020.
GOES-15:
Launch date: Calendar year 2010;
Post launch test period: Calendar year 2010;
Available as backup: Calendar year 2010-2011;
Operational period: Calendar year 2012-2017.
GOES-R:
Launch date: Calendar year 2015;
Post launch test period: Calendar year 2015-2016;
Available as backup: Calendar year 2016;
Operational period: Calendar year 2017-2023.
GOES-S:
Launch date: Calendar year 2017;
Post launch test period: Calendar year 2017;
Available as backup: Calendar year 2017-2020;
Operational period: Calendar year 2020-2023.
Projected gap in backup coverage: mid-2015 to mid-2016.
Source: GAO analysis of NOAA data.
[End of figure]
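The gap arithmetic described above can be checked directly with the dates given in the report (backup lost when GOES-13 retires in April 2015, GOES-R launch in October 2015, followed by a roughly 6-month post-launch test):

```python
from datetime import date

def add_months(d, n):
    """Shift a date forward by n calendar months (day pinned to the 1st)."""
    m = d.month - 1 + n
    return date(d.year + m // 12, m % 12 + 1, 1)

backup_lost = date(2015, 4, 1)        # GOES-13 retires; GOES-14 becomes operational
launch = date(2015, 10, 1)            # planned GOES-R launch
operational = add_months(launch, 6)   # after the estimated 6-month post-launch test

gap_months = ((operational.year - backup_lost.year) * 12
              + operational.month - backup_lost.month)
print(operational, gap_months)        # 2016-04-01 12: a year with no backup
```

Each month of launch delay adds a month to `gap_months`, which is why any slip beyond October 2015 stretches the no-backup period past one year.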
In addition to the year or more during which no backup satellite
would be available, there is a chance that NOAA would have to operate
with a single operational satellite. In December 2012, an independent
review board estimated that there is a 36 percent chance that the GOES
constellation would have only one operational satellite at the
expected date of GOES-R's launch. Thus, if NOAA were to experience a
problem with either of its operational satellites before GOES-R is in
orbit and operational, it would need to rely on older satellites that
are beyond their expected operational lives and may not be fully
functional. Without a full complement of operational GOES satellites,
the nation's ability to maintain the continuity of data required for
effective weather forecasting could be compromised. This, in turn,
could put the public, property, and the economy at risk.
NOAA Has a Process for Managing Changes in GOES-R Requirements, but
Changes Could Affect Some Users:
System requirements describe the functionality needed to meet user
needs and perform as intended in an operational environment. According
to leading industry, academic, and government entities, a disciplined
process for developing and managing requirements can help reduce the
risks of developing or acquiring a system.[Footnote 22] One key aspect
of effective requirements management involves managing changes to
requirements through a standardized process. Table 12 outlines best
practices of a sound change management process and key questions for
evaluating the process.
Table 12: Best Practices in Managing Requirements Changes:
Practice: Manage changes to requirements throughout the life cycle
using a standard process;
Key questions:
Does the program (or project) have a requirements management plan?
Does the program maintain a current and approved set of requirements?
Does the program have an approved set of baseline requirements?
Does the program's change management process provide guidance for the
identification, review, and management of all requirements changes?
Do change management processes apply throughout the program's life
cycle?
Does change management documentation, such as meeting notes or change
records, indicate that the organization is following its change
management policies and procedures?
Practice: Document changes to requirements;
Key questions:
Does the organization maintain records for all changes?
Are all approved requirements changes documented according to a
standard process?
Are other work products in consistent alignment with requirements
changes?
Practice: Document rationale for change and analyze impact;
Key questions:
Does the program document rationales for proposed changes?
Does the program maintain a history of these rationales?
Does the program analyze the impact of a proposed change to the
project and to users in impact assessments?
Do these assessments address impacts to cost, schedule, risk, and
project capabilities?
Practice: Have an approval body with appropriate representation review
and approve all requirements changes;
Key questions:
Has the program established an approval body for requirements changes
and defined its responsibilities?
Do change management policies require appropriate representation on
the approval body?
Do change management policies require that the approval body review
and approve all changes?
Does documentation show that the approval body reviewed and approved
program requirements changes?
Practice: Ensure that requirements changes are aligned with user needs;
Key questions:
Are requirements analyzed according to a standard process to determine
if they continue to meet user needs?
Do impact assessments show that the requirements remain in alignment
with user needs?
Has the program traced the changed requirements back to user needs?
Has the program verified and validated that changed requirements align
with user needs?
Practice: Communicate requirements changes to users;
Key questions:
When requirements changes occur, are they communicated to end users?
Is change information disseminated as part of a standard process?
Source: GAO analysis of government and industry practices.
[End of table]
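As a rough sketch only (the field names are hypothetical and are not taken from the GOES-R program's actual tooling), the practices in table 12 can be read as a completeness check on a single requirements-change record:

```python
from dataclasses import dataclass, field

@dataclass
class ChangeRecord:
    """Hypothetical requirements-change record with fields mirroring
    the best practices in table 12; names are illustrative only."""
    requirement_id: str
    description: str
    rationale: str = ""
    impact_assessment: dict = field(default_factory=dict)  # cost/schedule/risk
    approved_by_board: bool = False
    traced_to_user_needs: bool = False
    users_notified: list = field(default_factory=list)

def gaps(change: ChangeRecord) -> list:
    """Return the best practices the record fails to evidence."""
    missing = []
    if not change.rationale:
        missing.append("document rationale")
    if not {"cost", "schedule"} <= change.impact_assessment.keys():
        missing.append("analyze cost/schedule impact")
    if not change.approved_by_board:
        missing.append("approval-body review")
    if not change.traced_to_user_needs:
        missing.append("align with user needs")
    if not change.users_notified:
        missing.append("communicate to users")
    return missing

rec = ChangeRecord("REQ-123", "Relax accuracy spec",
                   rationale="instrument limits",
                   impact_assessment={"cost": "none", "schedule": "none"},
                   approved_by_board=True)
print(gaps(rec))  # ['align with user needs', 'communicate to users']
```

The example record passes the documentation, impact, and approval checks but fails the two user-facing practices, which is the same pattern of shortfall the assessment below describes.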
The GOES-R program has a change management process that satisfied
three practices, partially satisfied two practices, and did not
satisfy one practice. Specifically, GOES-R has established a change
management process that tracks and documents changes in requirements,
documents the rationale for the changes as well as the potential
impact of the change on cost and schedule, and ensures that changes
are reviewed and approved by a change control board. In addition, the
program has evaluated the impact of key changes on selected users and
communicated with those users. However, as we first reported in 2010,
the program is still weak in evaluating the impact of changes on
external users who rely on GOES data products and in effectively
communicating changes to those satellite data users.[Footnote 23]
Specifically, table 13 outlines how the GOES-R program performed on
each of the best practices for managing changes in requirements, and
is followed by a more detailed discussion of key shortfalls.
Table 13: Assessment of Geostationary Operational Environmental
Satellite-R Series Program Practices in Managing Changes in
Requirements:
Practice: Manage changes to requirements throughout the life cycle
using a standard process;
Assessment: Satisfied;
Discussion: The GOES-R program has a requirements management plan and
has established a change management process to apply throughout the
project's life cycle. In order to change a high-level requirement, the
program must follow a detailed process that begins with proposing a
change request, evaluating it, and obtaining approval or rejection of
the request. The program also maintains an approved set of high-level
baseline requirements and updates them regularly in response to
requirements changes.
Practice: Document changes to requirements;
Assessment: Satisfied;
Discussion: The GOES-R program documents requirements changes in a
public change log associated with its high-level requirements document.
More detailed information on the changes is tracked in an internal
database. The changes documented in the change log align with those
documented in the internal tracking database.
Practice: Document rationale for change and analyze impact;
Assessment: Partially satisfied;
Discussion: The GOES-R program documented the rationale for individual
requirements changes as well as the cost and schedule impact of
selected changes. In addition, the program has assessed the impact of
key changes on selected users within NOAA. However, the program has
not assessed the cost and schedule impact of all changes, and has not
assessed the impact of key changes on external users who rely on GOES
satellite data. Program officials noted that they assessed the cost
and schedule impact of changes that were expected to negatively impact
the program's cost or schedule, and that they focus their impact
assessments on users within NOAA because they are considered the
primary users.
Practice: Have an approval body with appropriate representation review
and approve all changes;
Assessment: Satisfied;
Discussion: The GOES-R program has a configuration change board with
representation from key NOAA and NASA officials. The board's
responsibilities are formalized in program documentation. Further, the
board members review and approve requirements changes.
Practice: Ensure that requirements changes are aligned with user needs;
Assessment: Not satisfied;
Discussion: The program's change management process does not require
taking steps to ensure that changes in requirements are aligned with
user needs. Specifically, the process does not require officials to
trace applicable changes to user needs or to test or simulate whether
the change still meets user needs. Moreover, for seven selected
changes we reviewed, the program did not demonstrate the steps it took
to test or validate the changes to ensure they were aligned with user
needs. Program officials noted that they utilize a user working group
to communicate changes to users and elevate concerns raised by users.
Practice: Communicate requirements changes to users;
Assessment: Partially satisfied;
Discussion: The GOES-R program generally communicates requirements
changes to key users within NOAA and NOAA's National Weather Service
through mechanisms such as email correspondence and working groups,
while it communicates changes to external users through periodic
conferences, such as the GOES users' conferences. However, it does not
alert external users who rely on GOES data to perform their missions
about specific changes in requirements that will likely affect their
operations. These external users include the Federal Aviation
Administration, the U.S. Department of Agriculture, and the Department
of the Navy. Officials at all three agencies reported that they were
not informed about key changes in requirements that could affect their
operations. Program officials stated that they work through a
variety of working groups to communicate changes to those who use
the satellite data.
Source: GAO analysis of NOAA documentation.
[End of table]
While the program generally communicates requirements changes to key
users within NOAA's National Weather Service community, it does not
communicate as well with satellite data users external to NOAA. Many
such users are dependent on GOES satellite data for their respective
missions. Officials responsible for working with satellite weather
observations at three agencies were unaware of selected changes in
GOES-R requirements. For example, the Federal Aviation Administration
uses the satellites' data and images to manage air traffic across the
country, and the Navy uses the data for oceanic weather forecasting,
as well as tactical ocean analysis of regions of interest. They stated
that NOAA had not reached out to them to alert them to these changes
or ask if the changes would impact them. Similarly, Forest Service
officials were concerned that potential changes in spectrum
allocations could affect their ability to obtain data from their own
ground-based weather observation systems because they currently rely
on GOES-R communication channels to obtain this data.
GOES-R program officials noted that they provide regular briefings to
the Office of the Federal Coordinator for Meteorology, an interagency
council with membership from 15 federal departments and agencies
involved in meteorological activities (including the Departments of
Agriculture, Defense, and Transportation) and that the Air Force
represents the Department of Defense community on the GOES-R Series
Independent Advisory Committee. However, they acknowledged that they
cannot ensure that the information they provide is disseminated within
the agencies. Further, GOES officials explained that one reason for
the distinction between the internal and external users is that the
internal users belong to formal working groups and receive regular
updates from the GOES-R program, while the other users generally have
more informal or indirect connections with the program. Instead of
direct communications such as e-mails, the other users may receive
information about GOES-R requirements changes from publicly available
information or through other meteorological partnerships with NOAA.
Without consistent and direct communication, users may be unaware of
changes to program requirements that are critical to their respective
missions. Because GOES-R users across the country have missions that
preserve and protect property and life, it is critical that these
organizations are made aware of any changes as soon as they are made,
so that they can assess and mitigate any significant impacts.
GOES-R Program Has Undergone Multiple Changes in Requirements;
Selected Changes Could Affect User Operations:
Since 2007, NOAA has changed multiple system requirements on the GOES-
R program. These changes involved strengthening or relaxing
specifications on selected products, finalizing a decision not to
develop 31 products, and modifying programmatic requirements not tied
to any individual product. For example, NOAA strengthened
specifications for the geographic coverage, image resolution, and
refresh rate on a product depicting total precipitable water, and
strengthened accuracy specifications for a product depicting cloud
layers and heights. NOAA also relaxed specifications to provide less
measurement accuracy on a product depicting hurricane intensity, less
geographic coverage on a product depicting sea surface temperatures,
less resolution on a product that tracks the motion of clouds through
the atmosphere, and less timely updates on a product depicting
lightning detection. The GOES-R program also documented NOAA's earlier
decision not to develop or provide 31 products that it labeled as
optional, noting that the products will only be developed if funding
becomes available. In addition, programmatic changes include the
elimination of 97 percent mission availability as a measure of minimum
performance success and the decision not to transmit raw satellite
data to users. Table 14 provides an overview of key changes in product
and program requirements since 2007.
Table 14: Summary of Key Changes in Product and Program Requirements
between 2007 and 2012:
Type of Change: Product;
November 2007: 34 products, each with specifications for accuracy,
geographic coverage, resolution, and timeliness; October 2012: 34
products, of which: 20 (59%) were modified;
* 14 had changes in accuracy measurement;
* 7 had changes in geographic coverage;
* 3 had changes in horizontal resolution;
* 8 had changes in refresh rate/latency.
Type of Change: Product;
November 2007: 34 optional products;
October 2012: 2 optional products were eliminated; 1 optional product
was combined with another optional product;
31 optional products are not being developed.
Type of Change: Program;
November 2007: The satellites shall be capable of being configured to
accommodate additional instrumentation with minimal redesign of the
spacecraft;
October 2012: Requirement removed.
Type of Change: Program;
November 2007: The GOES-R series is required to meet or exceed the
level of capability of the prior series of satellites (GOES-N,O,P) for
system continuity;
October 2012: Requirement removed.
Type of Change: Program;
November 2007: The GOES-R satellites are required to acquire and
transmit the raw environmental data to ground stations to allow for the
timely and accurate processing of data;
October 2012: While the program is still required to relay GOES-R
sensor data, the requirement to acquire and transmit raw data has been
removed.
Type of Change: Program;
November 2007: GOES-R is required to meet or exceed the prior series of
satellites' capabilities for storage of environmental data;
October 2012: The program is required to make products available to
NOAA Archival Data Centers, but capabilities for storing the data are
not specified.
Type of Change: Program;
November 2007: The GOES-R system need date is specified as December
2014;
October 2012: Requirement removed.
Type of Change: Program;
November 2007: GOES-R is required to achieve "full operational
capability," which is defined, in part, as full coverage of the east
and west positions;
October 2012: The requirement for full operational capability was
strengthened to include the production and availability of the full
product set of satellite data to users.
Type of Change: Program;
November 2007: Minimum performance success is defined as 97 percent
mission availability for collecting, generating, and distributing key
products over a defined central coverage zone;
October 2012: Minimum performance success is redefined as the
successful generation and availability of key functions to users. The
availability percentage has been removed.
Type of Change: Program;
November 2007: The operational lifetime of the GOES-R series shall
extend through 2028;
October 2012: The individual GOES-R satellites' lifetimes shall be 5
years in on-orbit storage plus 10 years in operation.
Type of Change: Program;
November 2007: Requirements for a remote backup facility not specified;
October 2012: Addition of requirements for a remote backup facility.
Type of Change: Program;
November 2007: Failover time to backup satellite or backup ground
facility not mentioned;
October 2012: This information is now included.
Type of Change: Program;
November 2007: Requirements do not specify the locations of the
satellites in on-orbit storage;
October 2012: Added requirements that specify the satellites' checkout
location and the location of on-orbit satellite storage.
Source: GAO analysis of NOAA documentation.
[End of table]
While NOAA officials stated that they believe that only one of the
changes that were made since 2007 was significant,[Footnote 24]
internal and external users noted that they found many of the changes
to be significant. In addition, selected satellite data users
expressed concern about the loss of 17 of the optional products that
are no longer
being developed. The changes that users found significant, along
with their reasons, are listed in table 15. GOES-R program officials
acknowledged that the National Weather Service and other users will
be affected by the loss or degradation of products, but noted that
it is not always accurate to assume that GOES-R could have met the
original requirements. In 2011, an algorithm development executive
board reported that several original requirements could not have
been met for reasons including the following: they relied on a
hyperspectral instrument that was removed from the program; the
requirements were poorly stated, and it only became evident later
that GOES could not support them; and there were scientific
limitations on developing the products that only became evident
after development had started. Program officials stated
that they have identified alternative methods for obtaining certain
products (some outside the scope of GOES-R) and that they are
proactively trying to develop alternative products in coordination
with users and other development organizations.
Table 15: User Concerns about Key Changes or Deviations in
Requirements:
Product: Cloud top height;
Change: Relaxation of accuracy requirements;
User concerns: Navy officials reported that this change will likely
cause significant errors, which will reduce the utility of the cloud
top height measurements.
Product: Downward shortwave radiation;
Change: Relaxation of accuracy requirements;
User concerns: Navy officials reported that the larger accuracy ranges
might make this product difficult to use in a statistically
significant way.
Product: Reflected shortwave radiation;
Change: Relaxation of accuracy requirements;
User concerns: Navy officials reported that the larger accuracy ranges
might make this product difficult to use in a statistically
significant way.
Product: Derived stability indices;
Change: Relaxation of resolution requirements;
User concerns: Officials from both the Navy and the Federal Aviation
Administration expressed concern about this change. The Federal
Aviation Administration reported that the reduction in horizontal
resolution might result in reduced forecast accuracy and a reduced
ability to verify convection, which is useful for predicting severe
storms.
Product: Lightning detection;
Change: Reduction in product timeliness;
User concerns: Officials from the Federal Aviation Administration
expressed concern about this change. They reported that a delay in
refresh times could be significant for aviation operations, especially
over water areas that rely on satellite data for coverage. In these
areas, lightning will be used as an indicator of storm formation and
delays in detection and transmission could impact situational
awareness.
Product: Magnetometer (geomagnetic field);
Change: Reduction of magnetic field accuracy requirements;
User concerns: The National Weather Service's Space Weather Prediction
Center found this change acceptable for the purposes of GOES-R data,
but determined that the reduction of the accuracy requirements would
noticeably increase error in the instrument's readings of solar energy
and the geomagnetic field.
Product: Aerosol particle size;
Change: An optional product; not planned to be developed or provided;
User concerns: Officials from the Department of Agriculture's Forest
Service and the Navy expressed concern about not receiving this
product. The Forest Service reported that this product would help them
monitor and manage air quality.
Product: Aircraft icing threat;
Change: An optional product; not planned to be developed or provided;
User concerns: Officials from the National Weather Service's Aviation
Weather Center, the Navy, and the Federal Aviation Administration
expressed concern about not receiving this product. The Aviation
Weather Center reported that this product would be useful because
icing is a major hazard for safe air travel.
Product: Cloud layers/heights;
Change: An optional product; not planned to be developed or provided;
User concerns: Officials from the National Weather Service's Aviation
Weather Center and the Navy expressed concern about not receiving this
product. The Aviation Weather Center reported that this product would
help prevent aviation accidents caused by low visibility. Low cloud
ceiling and visibility was associated with about 20 percent of all
aviation accidents from 1994 to 2003.
Product: Cloud liquid water;
Change: An optional product; not planned to be developed or provided;
User concerns: Officials from the National Weather Service's Aviation
Weather Center and the Navy expressed concern about not receiving this
product. The Aviation Weather Center reported that this product might
help identify regions with low visibility. Because low visibility is
associated with airline accidents, this product would help prevent
aviation accidents.
Product: Cloud type;
Change: An optional product; not planned to be developed or provided;
User concerns: Officials from the National Weather Service's Aviation
Weather Center, the Navy, and the Federal Aviation Administration
expressed concern about not receiving this product. The Aviation
Weather Center reported that, as it is related to the icing threat,
this product would help air traffic controllers know if a cloud was
made of ice, water, or a mixture of the two.
Product: Convective initiation;
Change: An optional product; not planned to be developed or provided;
User concerns: Officials from the National Weather Service's Aviation
Weather Center expressed concern about not receiving this product
because new convection is critical to air traffic flow management, and
convection is a major hazard for safe and efficient flight. In
addition, officials from the Navy, the Federal Aviation
Administration, and the Department of Agriculture's Forest Service
were concerned by the loss of this product. The Forest Service
officials reported that the loss of this product could impact its
ability to locate potential ignition areas for wildland fires. The
National Weather Service's Storm Prediction Center also stated that
this product was likely to have had a positive impact on its mission,
which is to predict and monitor high impact weather events such as
tornadoes.
Product: Enhanced "V"/overshooting top detection;
Change: An optional product; not planned to be developed or provided;
User concerns: Officials from the National Weather Service's Aviation
Weather Center, the Navy, and the Federal Aviation Administration
expressed concern about not receiving this product. The Aviation
Weather Center reported that this product would indicate the location
of turbulence and convection, thereby helping to improve the safety
and efficiency of air travel. The National Weather Service's Storm
Prediction Center also stated that this product was likely to have had
a positive impact on its mission.
Product: Flood/standing water;
Change: An optional product; not planned to be developed or provided;
User concerns: Officials from the Department of Agriculture's Forest
Service and the Navy expressed concern about not receiving this
product. Forest Service officials are concerned that the removal of
this product would impact their management of and response to hazards
and disasters.
Product: Ice cover;
Change: An optional product; not planned to be developed or provided;
User concerns: Officials from the National Weather Service's
Environmental Modeling Center and the Navy expressed concern about not
receiving this product. The Environmental Modeling Center reported
that ice cover data would help assimilate data received from the
sounding sensors.
Product: Low cloud and fog;
Change: An optional product; not planned to be developed or provided;
User concerns: Officials from the National Weather Service's Aviation
Weather Center, the Navy, and the Federal Aviation Administration
expressed concern about not receiving this product. The Aviation
Weather Center reported that this product would help prevent aviation
accidents caused by low visibility. Low ceiling and visibility
accounted for about 20 percent of all aviation accidents from 1994 to
2003.
Product: Ozone total;
Change: An optional product; not planned to be developed or provided;
User concerns: Officials from the Department of Agriculture's Forest
Service are concerned that the removal of this product would impact
its ability to monitor and manage air quality.
Product: Probability of rainfall;
Change: An optional product; not planned to be developed or provided;
User concerns: Officials from the National Weather Service's Aviation
Weather Center, the Navy, and the Federal Aviation Administration
expressed concern about not receiving this product. The Aviation
Weather Center reported that the loss of this product is significant
because heavy rainfall relates to air traffic planning and the
efficiency of airport operations, and heavy rainfall is correlated
with low ceiling and low visibility and/or convection. Officials from
the Department of Agriculture were also concerned that the loss of
this product would impact their predictive services.
Product: Rainfall potential;
Change: An optional product; not planned to be developed or provided;
User concerns: Officials from the National Weather Service's Aviation
Weather Center, the Navy, and the Federal Aviation Administration
expressed concern about not receiving this product. The Aviation
Weather Center reported that the loss of this product is significant
because heavy rainfall relates to air traffic planning and the
efficiency of airport operations, and heavy rainfall is correlated
with low ceiling and low visibility and/or convection. Officials from
the Department of Agriculture were also concerned that the loss of
this product would impact their predictive services.
Product: Tropopause folding turbulence prediction;
Change: An optional product; not planned to be developed or provided;
User concerns: Officials from the National Weather Service's Aviation
Weather Center, the Navy, and the Federal Aviation Administration
expressed concern about not receiving this product. The Aviation
Weather Center reported that the loss of this product is significant
because turbulence is a major hazard for safe air travel.
Product: Vegetation fraction (green vegetation);
Change: An optional product; not planned to be developed or provided;
User concerns: Officials from the Department of Agriculture's Forest
Service are concerned by the loss of this product because it would
help with forest health monitoring and fire danger assessments.
Officials from the National Weather Service's Environmental Modeling
Center also expressed concern about not receiving this product because
it would help them analyze and predict temperature differences and
precipitation.
Product: Vegetation index;
Change: An optional product; not planned to be developed or provided;
User concerns: Officials from the Department of Agriculture's Forest
Service are concerned by the loss of this product because it would
help with forest health monitoring and fire danger assessments.
Product: Visibility;
Change: An optional product; not planned to be developed or provided;
User concerns: Officials from the National Weather Service's Aviation
Weather Center, the Navy, the Federal Aviation Administration, and the
Department of Agriculture's Forest Service expressed concern about the
loss of this product. The Aviation Weather Center reported that this
product would help prevent aviation accidents caused by low
visibility, and the Forest Service reported that it would have helped
with air quality monitoring and management.
Source: GAO analysis of federal agency responses.
[End of table]
In addition to the changes that have already been implemented on the
GOES-R program, there are other potential changes that could occur.
For example, by the end of 2013, the program plans to decide whether
or not to include the Geostationary Lightning Mapper on the GOES-R
satellite. Also, there could be changes in the spectrum allocated to
weather satellite data. Officials from the National Weather Service
and Forest Service raised concerns that these potential changes could
also affect their operations. Because these changes have the potential
to impact satellite data user operations, it is critical that the GOES-
R program communicate program changes to the extended user community.
By doing so, satellite data users can establish plans to mitigate any
shortfalls in data and minimize the impact of the changes on their
operations.
NOAA Developed GOES-R Contingency Plans, but Weaknesses Increase the
Impact of a Potential Coverage Gap:
GOES satellite data are considered a mission-essential function
because of their criticality to weather observations and forecasts.
These forecasts--such as those for severe storms, hurricanes, and
tornadoes--can have a substantial impact on our nation's people, infrastructure,
and economy. Consequently, NOAA policy requires two operational GOES
satellites and one on-orbit spare at all times. If one of the
operational satellites were to fail, the on-orbit
spare could be moved into position to take the place of the failed
satellite. However, if there are delays in the launch of the GOES-R
satellite or if either of the two satellites currently in operation
were to fail, NOAA would not have an on-orbit spare to fill the gap.
Government and industry best practices call for the development of
contingency plans to maintain an organization's essential functions in
the case of an adverse event.[Footnote 25] These practices include key
elements such as defining failure scenarios, identifying and selecting
strategies to address failure scenarios, developing procedures to
implement the selected strategies, identifying any actions needed to
implement the strategies, testing the plans, and involving affected
stakeholders. These elements can be grouped into categories, including
(1) identifying failure scenarios and impacts, (2) developing
contingency plans, and (3) validating and implementing contingency
plans (see table 16).
Table 16: Guidelines for Developing a Sound Contingency Plan:
Category: Identifying failure scenarios and impacts;
Key elements:
* Define likely failure scenarios;
* Conduct impact analyses showing impact of failure scenarios on
business processes and user requirements;
* Define minimum acceptable level of outputs and recovery time
objectives, and establish resumption priorities.
Category: Developing contingency plans;
Key elements:
* Define roles and responsibilities for implementing contingency plans;
* Identify alternative solutions to address failure scenarios;
* Select contingency strategies from among alternatives based on
costs, benefits, and impacts;
* Develop "zero-day" procedures;
* Define actions needed to implement contingency strategies;
* Define and document triggers and time lines for enacting the actions
needed to implement contingency plans;
* Ensure that steps reflect priorities for resumption of products and
recovery objectives;
* Designated officials review and approve contingency plan.
Category: Validating and implementing contingency plans;
Key elements:
* Identify steps for testing contingency plans and conducting training
exercises;
* Prepare for and execute tests;
* Execute applicable actions for implementation of contingency
strategies;
* Validate test results for consistency against minimum performance
levels;
* Communicate and coordinate with stakeholders to ensure that
contingency strategies remain optimal for reducing potential impacts;
* Update and maintain contingency plans as warranted.
Source: GAO analysis of guidance documents from the National Institute
of Standards and Technology, Software Engineering Institute, GAO,
NOAA, and the GOES-R program.
[End of table]
NOAA has established contingency plans for both its GOES satellites
and its associated ground systems. In September 2010, we recommended
that NOAA develop and document continuity plans for the operation of
geostationary satellites that include the implementation procedures,
resources, staff roles, and timetables needed to transition to a
single satellite, a foreign satellite, or other solution. In September
2011, the GOES-R program provided a draft plan documenting a strategy
for conducting operations if there were only a single operational
satellite. In December 2012, the program provided us with a final
version of this plan. It included scenarios for three, two, and one
operational satellites. In addition to this satellite contingency
plan, NOAA has another contingency-related plan with activation
procedures for its satellites.
Furthermore, the NOAA office responsible for ground-based satellite
operations and products has created plans for contingency operations
at the GOES ground system facility, the Satellite Operations Control
Center. Specifically, NOAA's plans describe the transfer of critical
functions to a backup facility during an emergency. The continuity
plan contains, among other things, descriptions of the alternate
locations for resources, the performance of key functions, and
implementation procedures.
When compared to best practices, NOAA's satellite and ground system
contingency plans had many strengths and a few weaknesses.
Specifically, the satellite contingency plan fully implemented seven
elements, partially implemented nine elements, and did not implement
one element. The ground system contingency plan fully implemented ten
elements, partially implemented six elements, and one element was not
applicable. Table 17 shows the extent to which the satellite and
ground system contingency plans fully implemented, partially
implemented, or did not implement key contingency planning elements.
Table 17: Implementation of Key Contingency Planning Elements for
Geostationary Operational Environmental Satellites:
Category: Identifying failure scenarios and impacts;
Key element: Define likely failure scenarios;
Satellite system: Fully implemented;
Ground system: Fully implemented;
Description: NOAA has defined three likely failure scenarios for its
satellite system--the loss of one, two, or all three satellites in the
GOES constellation. The agency also defines the conditions that would
constitute a satellite failure. NOAA's scenarios are broad enough that
they cover a wide range of situations, including a gap caused by a
delay in the GOES-R launch. NOAA has defined likely ground system
failure scenarios.
Key element: Conduct impact analyses showing impact of failure
scenarios on business processes and user requirements;
Satellite system: Not implemented;
Ground system: Partially implemented;
Description: NOAA did not conduct impact analyses showing the impact of
satellite failure scenarios on business processes or user
requirements. NOAA conducted impact analyses of ground system outages
and disruptions on business processes and user requirements; however,
these analyses do not reflect each failure scenario.
Key element: Define minimum acceptable level of outputs and recovery
time objectives, and establish resumption priorities;
Satellite system: Fully implemented;
Ground system: Fully implemented;
Description: NOAA defined minimum acceptable output criteria for
satellites, instruments and products in its satellite plans as well as
for business processes and subsystems in the ground system plans.
Category: Developing contingency plans;
Key element: Define roles and responsibilities for implementing
contingency plans;
Satellite system: Partially implemented;
Ground system: Partially implemented;
Description: NOAA has defined roles and responsibilities for some, but
not all, contingency operations in both the satellite and ground system
plans. For example, the satellite contingency plan identifies roles and
responsibilities for briefing management in the event of losing an
operational satellite, but does not define responsibility for notifying
users. The ground system contingency plans describe roles and
responsibilities of three contingency teams, but do not clearly define
the roles and responsibilities for the contingency coordinator.
Key element: Identify alternative solutions to address failure
scenarios;
Satellite system: Partially implemented;
Ground system: Fully implemented;
Description: In its satellite contingency plan, NOAA identified
alternative solutions to address satellite failure scenarios,
including relocating and using older GOES satellites and requesting
coverage by foreign satellites. However, NOAA did not identify
alternative solutions for preventing delays in the GOES-R launch,
which could cause a reduction in the number of satellites. For its
ground systems, NOAA identified a solution for its failure scenarios:
to switch operations to one of several backup locations.
Key element: Select contingency strategies from among alternatives
based on costs, benefits, and impacts;
Satellite system: Partially implemented;
Ground system: Partially implemented;
Description: In both sets of plans, NOAA has selected contingency
strategies to address failure scenarios; however, it did not provide
evidence that it had selected these strategies from alternatives based
on costs, benefits, and impacts. Moreover, NOAA did not select
strategies to prevent one of the most likely situations that would
trigger a failure scenario: a delay in the launch of the GOES-R
satellite.
Key element: Develop "zero-day" procedures;
Satellite system: Partially implemented;
Ground system: Fully implemented;
Description: NOAA identified strategies and procedures for addressing
GOES satellite failure scenarios, but did not establish associated
time frames. NOAA developed zero-day strategies and procedures for the
GOES ground system.
Key element: Define actions needed to implement contingency strategies;
Satellite system: Partially implemented;
Ground system: Fully implemented;
Description: NOAA has defined high-level activities to implement
satellite contingency strategies, such as relocation of a satellite to
a central location and user notification of a switch to a single
satellite--however, it has not provided detailed procedures for
performing these activities. NOAA has defined the steps to
implement GOES ground system contingency strategies.
Key element: Define and document triggers and time lines for enacting
the actions needed to implement contingency plans;
Satellite system: Partially implemented;
Ground system: Partially implemented;
Description: NOAA has identified triggers and specific time lines for
implementing satellite contingency plans. However, it has not
established triggers or time lines for any actions it might take to
prevent a delay in the GOES-R launch. NOAA has identified two
different triggers for enacting the ground system plan, but the plan
does not describe which trigger is to be used.
Key element: Ensure that steps reflect priorities for resumption of
products and recovery objectives;
Satellite system: Partially implemented;
Ground system: Partially implemented;
Description: NOAA's satellite contingency plan describes its recovery
objectives and prioritizes GOES instruments and products; however, the
steps for implementing contingency strategies do not reflect these
priorities and objectives. Ground system contingency strategies
establish priorities for resuming operations, but do not define
recovery time objectives.
Key element: Designated officials review and approve contingency plan;
Satellite system: Fully implemented;
Ground system: Fully implemented;
Description: A designated official has reviewed and approved both sets
of contingency plans.
Category: Validating and implementing contingency plans;
Key element: Identify steps for testing contingency plans and
conducting training exercises;
Satellite system: Fully implemented;
Ground system: Fully implemented;
Description: NOAA has identified steps for testing GOES satellite
contingency plans and has conducted exercises and simulations. NOAA
has also identified steps for testing and conducting exercises and
simulations on its ground system contingency plans. NOAA provides
training to its operations staff on contingency operations for both
the satellite and ground systems.
Key element: Prepare for and execute tests;
Satellite system: Fully implemented;
Ground system: Fully implemented;
Description: NOAA officials provided documentation showing preparation
for and execution of regular maneuvers of on-orbit satellites.
According to officials, these maneuvers are similar to the maneuvers
identified as an action in the contingency plans. NOAA also prepared
for and executed tests of its ground system contingency plans.
Key element: Execute applicable actions for implementation of
contingency strategies;
Satellite system: Fully implemented;
Ground system: Not applicable;
Description: NOAA has performed actions to implement contingency
strategies, including activities to monitor the health and safety of
the satellites, and to provide status information to management.
Executing actions is not applicable for the ground system contingency
plan, because that plan does not identify actions to be taken.
Key element: Validate test results for consistency against minimum
performance levels;
Satellite system: Partially implemented;
Ground system: Fully implemented;
Description: NOAA tested a series of satellite maneuvers similar to
those that would be used in the event of a failure, but did not
demonstrate how these or other scenario tests would meet minimum
performance levels. On the ground systems, NOAA performed tests to
validate the contingency operations, and demonstrated that the
transfer of responsibility meets minimum recovery performance levels.
Key element: Communicate and coordinate with stakeholders to ensure
that contingency strategies remain optimal for reducing potential
impacts;
Satellite system: Partially implemented;
Ground system: Partially implemented;
Description: According to users, NOAA is proactive in communicating
potential changes and impacts when issues develop, and responded
quickly to a recent outage in a GOES satellite. However, the
contingency strategies currently in place--(1) switching to single
satellite operations and (2) using a foreign satellite as a temporary
replacement--would have a major effect on user operations; NOAA has not
provided key external users with information on meeting data needs
under these scenarios. For example, the Forest Service relies on GOES
satellites to obtain data from its distributed ground-based
observation network, but NOAA has not discussed potential mitigation
options specific to this scenario.
Key element: Update and maintain contingency plans as warranted;
Satellite system: Fully implemented;
Ground system: Fully implemented;
Description: NOAA has updated and maintained contingency plans for both
the GOES satellite and ground systems.
Source: GAO analysis of NOAA documentation.
[End of table]
NOAA has implemented most of the best practices on both the GOES
satellite and ground contingency plans. Specifically, NOAA identified
failure scenarios, recovery priorities, and minimum levels of
acceptable performance. NOAA also established contingency plans that
identify solutions and high-level activities and triggers to implement
the solutions. Further, the agency has tested its contingency plans,
trained staff on how to implement the contingency plans, and updated
the plans when warranted. The agency also successfully implemented its
contingency plans when it experienced problems with one of its
operational satellites. Specifically, when GOES-13 experienced
problems in September and October 2012, NOAA activated its contingency
plans to move its back-up satellite into position to provide
observations until GOES-13 was once again operational. While the
agency has not needed to address the loss of a back-up satellite in
recent years, contingency plans cover this situation by determining if
older GOES satellites could provide coverage, moving the single
satellite into a central position over the country, and seeking data
from foreign satellites.
However, both satellite and ground contingency plans contain areas
that fall short of best practices. For example, NOAA has not
demonstrated that the contingency strategies for both its satellite
and ground system are based on an assessment of costs, benefits, and
impact on users. Further, the satellite plan does not specify
procedures for working with the user community to account for
potential reductions in capability under contingency operations. For
example, officials from the Federal Aviation Administration noted that
NOAA's contingency plans do not define the compatibility, security,
and standard protocol language they should use if a foreign satellite
were to be utilized. Also, while selected users reported that, in the
past, they have been well informed by NOAA when changes in service
occur, including the problems with GOES-13, others were either not
informed or received information on outages through a third party.
Moreover, selected users stated that certain contingency operations
could have a significant impact on their operations. For example,
Federal Aviation Administration officials stated that flight
approaches in Alaska that were enabled using the Global Positioning
System were affected by the GOES-13 outage in late 2012. As another
example, Forest Service officials explained that if GOES were to
experience an outage and not have a backup satellite available, it was
their understanding that NOAA would either move a single satellite
into a central position over the country or obtain observations from a
foreign satellite. Under both of these scenarios, they could lose
views of wildland fires and their ability to obtain data from ground-
based observation networks. Nearly all users stated that the effects
of a switch to a single satellite or foreign satellite configuration
would be significant.
In addition, while NOAA's failure scenarios for its satellite system
are based on the number of available satellites--and the loss of a
backup satellite caused by a delayed GOES-R launch would fit into
these scenarios--the agency did not identify alternative solutions or
time lines for preventing a GOES-R launch delay. According to NOAA
officials, a gap caused by a delayed launch would trigger the same
contingency actions as a failure on launch or the loss of a currently
on-orbit satellite. However, this does not take into account potential
actions that NOAA could undertake to prevent a delayed launch, such as
removing selected functionality or compressing test schedules.
NOAA officials stated that their focus on primary users and on the
number of available satellites is appropriate for their contingency
plans. Given the potential for a delay in the launch of the GOES-R
satellite and the expectation that there will be at least a year with
no backup satellite in orbit, it is important that NOAA consider ways
to prevent a delay in the GOES-R launch, and ensure its contingency
plans are fully documented, tested, and communicated to affected
stakeholders. Further, it is critical that NOAA and users are aware of
how contingency scenarios will affect user operations. Until
comprehensive plans are developed, it is less certain that NOAA can
provide a consistent level of service and capabilities in the event of
an early failure or late launch. This in turn could have a devastating
effect on the ability of meteorologists to observe and report on
severe weather conditions.
Conclusions:
The GOES-R program is well on its way toward developing the first
satellite in the series, but it continues to face risks that could
delay the first satellite's launch. Among these risks are issues we
have previously raised on how the program manages reserve funds and
implements sound scheduling practices. Specifically, the agency does
not provide important details on its contingency reserve funds to
senior executives, including the reserves allocated for each of the
four satellites or key assumptions made in calculating reserves.
Without this information, program officials could misinterpret the
size of the remaining reserves and make poor decisions regarding the
program's future funding. The agency has improved selected scheduling
practices, but others remain weak--in part, according to agency
officials, due to the dynamic nature of scheduling a program as
complex as the GOES-R satellite program. As the agency closes in on
its expected launch date, technical issues in developing the space and
ground segments and scheduling problems could make it more difficult
to launch on schedule, and program officials now acknowledge that the
launch date may be delayed by 6 months. Any delay in the anticipated
launch date would expand a potential one-year gap in the availability
of an on-orbit backup GOES satellite, and raise the risk of a gap in
geostationary satellite data should one of the two operational
satellites experience a problem.
While the agency has made multiple changes to GOES-R requirements in
recent years, it has not effectively involved satellite data users in
those changes. Specifically, internal NOAA and external satellite data
users were not fully informed about changes in GOES-R requirements and
did not have a chance to communicate their concerns about the impact
these changes could have on their ability to perform their missions.
Many of these users expressed concerns about the effect these changes
could have on their ability to fulfill their missions, including
facilitating air traffic, conducting military operations, and fighting
wildland fires. Until NOAA improves its outreach and communication
with external satellite data users, its changes in requirements could
cause unexpected impacts on critical user operations.
Given the possibility of a gap in geostationary satellite coverage,
NOAA has established contingency plans for both its GOES satellites
and ground systems; these plans include the likely scenario in which
there will not be an on-orbit backup. While these plans include many
elements called for in industry best practices, the satellite
contingency plan did not assess the potential impacts of a failure on
users, or specify actions for working with the user community to
address these potential reductions in capability under contingency
operations. They also did not identify alternative solutions or time
lines for preventing a delay in the GOES-R launch date. The absence of
a fully tested and complete set of GOES-R-related contingency plans
and procedures could have a major impact on levels of service provided
in the event of a satellite or ground system failure.
Recommendations for Executive Action:
To address risks in the GOES-R program development and to help ensure
that the satellite is launched on time, we are making the following
four recommendations to the Secretary of Commerce. Specifically, we
recommend that the Secretary of Commerce direct the NOAA Administrator
to:
* Direct program officials to include information on the amount of
reserve funding for each of the four satellites in the program as well
as information on the calculation and use of reserves in regular
briefings to NOAA senior executives, so that executives are fully
informed about changes in reserve levels.
* Given the likely gap in availability of an on-orbit GOES backup
satellite in 2015 and 2016, address the weaknesses identified in this
report on the core ground system and the spacecraft schedules. These
weaknesses include, but are not limited to, sequencing all activities,
ensuring there are adequate resources for the activities, and
conducting a schedule risk analysis.
* Improve communications with internal and external satellite data
users on changes in GOES-R requirements by (a) assessing the impact of
changes on users' critical operations; (b) seeking information from
users on any concerns they might have about past or potential changes;
and (c) disseminating information on past and potential changes in
requirements to satellite data users.
* Revise the satellite and ground system contingency plans to address
weaknesses identified in this report, including providing more
information on the potential impact of a satellite failure,
identifying alternative solutions for preventing a delay in GOES-R
launch as well as time lines for implementing those solutions, and
coordinating with key external stakeholders on contingency strategies.
Agency Comments and Our Evaluation:
We sought comments on a draft of our report from the Department of
Commerce and NASA. We received written comments on a draft of this
report from Commerce transmitting NOAA's comments. NOAA concurred with
all four of our recommendations and identified steps that it is taking
to implement them. It also provided technical comments, which we have
incorporated into our report, as appropriate. NOAA's comments are
reprinted in appendix II.
While NOAA concurred with our recommendation to include information on
reserve funding for each of the four satellites in the program and
information on the calculation and use of reserves in regular
briefings to senior executives, and suggested that its current
processes fulfill this recommendation, we do not believe they do.
Specifically, NOAA stated that the GOES-R program currently reports on
reserve funding at two major monthly management meetings, which alerts
management if reserves fall below designated thresholds for the
remaining work on all four satellites. The agency also stated that its
reporting of the percent of "unliened" contingency funding--the amount
of contingency funding not allocated to a potential risk or issue--for
the remaining work addresses our concern regarding whether there are
sufficient reserves to complete the GOES-R series.
However, the GOES-R program does not currently identify the reserve
funding needed for each individual satellite or provide details on how
reserves are being calculated and used at the monthly management
meetings. By not providing reserve information on the individual
satellites, the program is not alerting management about potential
near-term funding shortfalls. For example, maintaining a high level of
reserves on the later satellites could mask a low level of reserves in
the near term for GOES-R and S. Such a scenario could affect the
satellites' development schedules and launch dates. Further, by not
obtaining details on the assumptions made when calculating reserves
and the causes of changes in reserve values, management is unable to
determine if changes in reserve levels are due to the addition,
subtraction, or use of funds, or to changes in the assumptions used in
the calculations. Given the importance of reserve funds in ensuring
the satellite development remains on track, management should be aware
of reserve funding levels for each individual satellite and of the
underlying reasons for changes in reserve levels. Therefore, we
continue to believe that additional action is needed by NOAA to
respond to our recommendation.
After we received agency comments and while our report was in final
processing, NOAA notified us that the launch dates of the first and
second GOES-R series satellites would be delayed. Given the late
receipt of this information, our report reflects the previous launch
date.
NASA did not provide comments on the report's findings or
recommendations, but noted that it would provide any input it might
have to NOAA for inclusion in that agency's comments.
As agreed with your offices, unless you publicly announce the contents
of this report earlier, we plan no further distribution until 30 days
from the report date. At that time, we will send copies to interested
congressional committees, the Secretary of Commerce, the Administrator
of NASA, the Director of the Office of Management and Budget, and
other interested parties. The report also will be available at no
charge on the GAO website at [hyperlink, http://www.gao.gov].
If you or your staff have any questions on the matters discussed in
this report, please contact me at (202) 512-9286 or at
pownerd@gao.gov. Contact points for our Offices of Congressional
Relations and Public Affairs may be found on the last page of this
report. GAO staff who made major contributions to this report are
listed in appendix III.
Signed by:
David A. Powner
Director, Information Technology Management Issues:
[End of section]
Appendix I: Objectives, Scope, and Methodology:
Our objectives were to (1) assess the National Oceanic and Atmospheric
Administration's (NOAA) progress in developing the Geostationary
Operational Environmental Satellite-R series (GOES-R) program and in
addressing key cost and schedule risks that we identified in a prior
report, (2) evaluate the program's efforts to manage changes in
requirements and whether any significant changes have recently
occurred, and (3) evaluate the adequacy of GOES-R contingency plans.
To assess NOAA's progress in developing the GOES-R satellite program,
we compared the program's planned completion dates for key milestones
identified in its management control plan and system review plan
against actual and currently estimated completion dates. We analyzed
monthly program status briefings to identify the current status and
recent development challenges of flight and ground project components
and instruments. To assess NOAA's efforts to address key cost risks,
we compared program-reported data on development costs and reserves to
best practices in reserve funding as identified by the program's
management control plan, which, in turn, reflects National Aeronautics
and Space Administration requirements. We calculated reserve
percentages using program office data on development costs and
reserves, and compared these calculations to the reserve percentages
reported by the program to management. To assess NOAA's efforts to
address key schedule risks, we compared schedules for two key GOES-R
components to best practices in schedule development as identified in
our Cost Estimating and Assessment Guide.[Footnote 26] As in our
previous report, we used a five-part rating system. We then compared
our previous assessment to our current assessment to identify
practices that had improved, stayed the same, or weakened over time.
We conducted interviews with GOES-R program staff to better
understand milestone time frames, to discuss current status and recent
development challenges for work currently being performed on GOES-R,
and to understand how the program reports costs and reserve totals. We
also examined the reliability of data on cost reserves and program
schedules. Regarding cost reserves, we examined reliability by
recalculating reserve percentages based on supporting data over a
period of one year, and compared the results to those presented by the
program to management. Regarding schedules, we created a template that
examined each schedule in areas such as missing logic, tasks completed
out of sequence, and completed tasks with start or finish dates in the
future. Based on these steps, we found both the reserve information
and the schedules to be sufficiently reliable for the purposes of our
analyses.
To evaluate the program's efforts to manage changes in requirements,
we compared GOES-R practices for managing requirements changes against
best practices, which we drew from several leading industry sources
including the Software Engineering Institute's Capability Maturity
Model®-Integration, the Project Management Institute's Project
Management Body of Knowledge, the Federal Information System Controls
Audit Manual, and the Information Technology Governance Institute's
Control Objectives for Information and related Technology governance
framework. We assessed GOES-R practices as having satisfied, partially
satisfied, or not satisfied each best practice. We analyzed changes
from 2007 to the present in the program's Level I Requirements
Document to determine the extent of the changes. We also identified
concerns about these changes from a subset of satellite data users. We
selected users from both inside and outside NOAA's National Weather
Service, the main GOES satellite user, based on several factors: the
importance of GOES data to the organization's core mission, the user's
reliance on GOES products that have changed or may change, and--for
agencies outside of NOAA--the percentage of spending devoted to
meteorological operations. The user organizations outside of NOAA
included in our review were the U.S. Department of Agriculture, the
Department of Transportation's Federal Aviation Administration, and
the Department of Defense's Navy and Air Force. User organizations
inside of NOAA's National Weather Service included the Aviation
Weather Center, Space Weather Prediction Center, Storm Prediction
Center, Environmental Modeling Center, and a Weather Forecast Office.
To evaluate the adequacy of GOES-R contingency plans, we compared
contingency plans and procedures for both GOES satellites and the GOES
ground system against best practices developed from leading industry
sources such as the National Institute of Standards and Technology,
the Software Engineering Institute's Capability Maturity Model®-
Integration, and our prior work. We analyzed the contingency plans to
identify strategies for various failure scenarios and determined
whether the satellite and ground system contingency plans fully
implemented, partially implemented, or did not implement each of the
practices. We also interviewed selected satellite data users to better
determine the impact of a GOES failure scenario on their operations,
and the level of communication they have had with NOAA satellite
offices on current contingency plans.
We performed our work at NOAA, National Aeronautics and Space
Administration, and U.S. Department of Agriculture offices in the
Washington, D.C., area and at National Weather Service offices in
Kansas City, Missouri; Norman, Oklahoma; and Sterling, Virginia. We
conducted this performance audit from October 2012 to September 2013,
in accordance with generally accepted government auditing standards.
Those standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our
findings and conclusions based on our audit objectives. We believe
that the evidence obtained provides a reasonable basis for our
findings and conclusions based on our audit objectives.
[End of section]
Appendix II: Comments from the Department of Commerce:
The Deputy Secretary of Commerce:
Washington, D.C. 20230:
July 5, 2013:
Mr. David A. Powner:
Director, Information Technology Management Issues:
U.S. Government Accountability Office:
441 G Street, NW:
Washington, DC 20548:
Dear Mr. Powner:
Thank you for the opportunity to review and comment on the U.S.
Government Accountability Office's draft report entitled,
"Geostationary Weather Satellites: Progress Made, but Weaknesses in
Scheduling, Contingency Planning, and Communicating with Users Need to
be Addressed" (GAO-13-597). On behalf of the Department of Commerce, I
have enclosed the National Oceanic and Atmospheric Administration's
programmatic comments to the draft report.
If you have any questions, please contact me or Jim Stowers, Deputy
Assistant Secretary for Legislative and Intergovernmental Affairs, at
(202) 482-3663.
Sincerely,
Signed by:
Patrick Gallagher:
Acting Deputy Secretary of Commerce:
Enclosure:
U.S. Department of Commerce:
National Oceanic and Atmospheric Administration:
Comments to the Draft GAO Report Entitled, "Geostationary Weather
Satellites: Progress Made, but Weaknesses in Scheduling, Contingency
Planning, and Communicating with Users Need to be Addressed"
(GAO-13-597, July 2013):
General Comments:
The Department of Commerce appreciates the opportunity to review the
U.S. Government Accountability Office's (GAO) draft report. Throughout
the report, when referring to the program in whole, use "the GOES-R
Series Program." This is to ensure a clear distinction between the
overall program and the first satellite in this series (i.e., GOES-R).
NOAA Response to GAO Recommendations:
Recommendation 1: "Direct program officials to include information on
the amount of reserve funding for each of the four satellites in the
program as well as information on the calculation and use of reserves
in regular briefings to NOAA senior executives, so that executives are
fully informed about changes in reserve levels."
NOAA Response: Concur. The GOES-R Series Program currently reports
contingency amounts (reserves) to the Goddard Space Flight Center
(GSFC) monthly Management Status Review (MSR) and the National Oceanic
and Atmospheric Administration (NOAA) monthly Program Management
Council (PMC) meeting. The unliened contingency amount is reported as
a dollar amount and as a percentage of unexecuted work-to-go. This
approach alerts management to a contingency falling below the required
levels for work-to-go, including work on GOES-R, S, T, and U. NOAA
Leadership will continue the ongoing process of working with GOES-R
Series Program to ensure contingency reporting meets its requirements
for detailed information, and ensure reporting is revised accordingly.
The percent of unliened contingency on work-to-go, which is reported
monthly to management by the GOES-R Program, specifically addresses
GAO's concern regarding sufficient reserves to complete the GOES-R
Series.
Recommendation 2: "Given the likely gap in availability of an on-orbit
GOES backup satellite in 2015 and 2016, address the weaknesses
identified in this report on the core ground systems and spacecraft
schedules. These weaknesses include, but are not limited to,
sequencing all activities, ensuring there are adequate resources for
the activities, and conducting a schedule risk analysis."
NOAA Response: Concur. The GOES-R Series Program conducts monthly
health checks of the spacecraft, instrument, and ground segment
schedules and works with the contractors to resolve the issues
identified. The Program Integrated Master Schedule (IMS) is built from
the contractor schedule submissions, which are summarized into flight
and ground segment schedules that are then integrated to form the
Program IMS. The contractor schedules do reflect the appropriate
subcontractor activities. The integration and summarization process
provides an end-to-end critical path and the amount of schedule slack
for that critical path. Schedule performance information is also
monitored and reported in a number of non-schedule ways. For example,
milestones executed versus planned, engineering products executed
versus planned, reviews (requirements, design, manufacturing, etc.)
executed versus planned, and progress on subcontract and procurement
activities. Flight, Ground, and Program schedule information is
reported to the GSFC monthly Management Status Reviews and the NOAA
monthly PMC meetings. The GOES-R Series Program will continue to bring
down the number of errors in the schedules and improve the fidelity of
the Program IMS.
Recommendation 3: "Improve communications with internal and external
satellite data users on changes in GOES-R requirements by (a)
assessing the impact of changes on users' critical operations; (b)
seeking information from users on any concerns they might have about
past or potential changes; and (c) disseminating information on past
and potential changes in requirements to satellite data users."
NOAA Response: Concur. The GOES-R Series Program has an active process
for communicating Program status to stakeholders and soliciting their
input. GOES-R Series Program has extensive interaction with users
which takes place at meetings such as the recent NOAA Satellite
Conference, the NOAA Satellite Science Week, GOES-R Risk Reduction and
Algorithm Development Executive Board (ADEB) meetings. Additional
interaction takes place at the GOES User's Conference, the Annual
American Meteorological Society (AMS) Meeting, sponsor meetings
(including DoD and Canada) at the Cooperative Program for Meteorology
Education and Training (COMET), and the annual GOES-R status briefing
to the Office of the Federal Coordinator for Meteorology (OFCM).
The GOES-R Series Program will endeavor to further improve its
communications with its internal and external satellite data users and
will consider new opportunities to disseminate information about
forthcoming changes in the GOES-R era. These may include: National
Weather Service (NWS) Customer Forums which are held twice per year to
discuss "Family of Service" and other NWS data flows to the private
sector, increased engagement with broadcast meteorologists, and
improved dissemination of information with the direct read-out
community.
Recommendation 4: "Revise the satellite and ground system contingency
plans to address weaknesses identified in the report, including
providing more information on the potential impact of a satellite
failure, identifying alternative solutions for preventing a delay in
GOES-R launch as well as time lines for implementing those solutions,
and coordinating with key external stakeholders on contingency
strategies."
NOAA Response: Concur. NOAA will update its satellite and ground
system contingency plans to address weaknesses identified in the
report. In addition, the GOES-R Series Program will provide regular
updates to NOAA, the Department, the Office of Management and Budget,
and Congress on efforts underway to protect the launch schedule, as
part of the regular monthly and quarterly program reviews and will
continue to use existing mechanisms to communicate program status to
key stakeholders.
[End of section]
Appendix III: GAO Contact and Staff Acknowledgments:
GAO Contact:
David A. Powner, (202) 512-9286 or pownerd@gao.gov:
Staff Acknowledgments:
In addition to the contact named above, individuals making
contributions to this report included Colleen Phillips (assistant
director), Paula Moore (assistant director), Shaun Byrnes, Kathleen
Feild, Nancy Glover, Franklin Jackson, Kaelin Kuhn, Jason Lee, Scott
Pettis, Meredith Raymond, Maria Stattel, and Jessica Waselkow.
[End of section]
Footnotes:
[1] GAO, GAO Cost Estimating and Assessment Guide: Best Practices for
Developing and Managing Capital Program Costs, [hyperlink,
http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: Mar. 2009);
NOAA, Geostationary Operational Environmental Satellites--R Series
Management Control Plan (Silver Spring, Md.: January 2013).
[2] NASA, NASA Systems Engineering Handbook (Washington, D.C.:
December 2007); Software Engineering Institute, CMMI® for Development,
Version 1.3 (Pittsburgh, Pa.: November 2010); Project Management
Institute, A Guide to the Project Management Body of Knowledge
(Newtown Square, Pa.: 2004); GAO, Federal Information System Controls
Audit Manual, [hyperlink, http://www.gao.gov/products/GAO-09-232G]
(Washington, D.C.: February 2009); IT Governance Institute, Control
Objectives for Information and related Technology 4.1 (Rolling
Meadows, Ill.: 2007).
[3] GAO, Year 2000 Computing Crisis: Business Continuity and
Contingency Planning, [hyperlink,
http://www.gao.gov/products/GAO/AIMD-10.1.19] (Washington, D.C.:
August 1998); National Institute of Standards and Technology,
Contingency Planning Guide for Federal Information Systems, NIST 800-
34 (Gaithersburg, Md.: May 2010); Software Engineering Institute,
CMMI® for Acquisition, Version 1.3 (Pittsburgh, Pa.: November 2010).
[4] While our report was in final processing, NOAA announced that it
would delay the launch dates for its GOES-R and GOES-S satellites to
the second quarter of fiscal year 2016 and the third quarter of fiscal
year 2017, respectively.
[5] GAO, Environmental Satellites: Focused Attention Needed to
Mitigate Program Risks, [hyperlink,
http://www.gao.gov/products/GAO-12-841T] (Washington, D.C.: June 27,
2012); GAO, Geostationary Weather Satellites: Design Progress Made,
but Schedule Uncertainty Needs to be Addressed, [hyperlink,
http://www.gao.gov/products/GAO-12-576] (Washington, D.C.: June 26,
2012); GAO, Geostationary Operational Environmental Satellites:
Improvements Needed in Continuity Planning and Involvement of Key
Users, [hyperlink, http://www.gao.gov/products/GAO-10-799]
(Washington, D.C.: Sept. 1, 2010); GAO, Geostationary Operational
Environmental Satellites: Acquisition Has Increased Costs, Reduced
Capabilities, and Delayed Schedules, [hyperlink,
http://www.gao.gov/products/GAO-09-596T] (Washington, D.C.: Apr. 23,
2009); GAO, Geostationary Operational Environmental Satellites:
Acquisition Is Under Way, but Improvements Needed in Management and
Oversight, [hyperlink, http://www.gao.gov/products/GAO-09-323]
(Washington, D.C.: Apr. 2, 2009); GAO, Geostationary Operational
Environmental Satellites: Further Actions Needed to Effectively Manage
Risks, [hyperlink, http://www.gao.gov/products/GAO-08-183T]
(Washington, D.C.: Oct. 23, 2007); GAO, Geostationary Operational
Environmental Satellites: Progress Has Been Made, but Improvements Are
Needed to Effectively Manage Risks, [hyperlink,
http://www.gao.gov/products/GAO-08-18] (Washington, D.C.: Oct. 23,
2007); GAO, Geostationary Operational Environmental Satellites:
Additional Action Needed to Incorporate Lessons Learned from Other
Satellite Programs, [hyperlink,
http://www.gao.gov/products/GAO-06-1129T] (Washington, D.C.: Sept. 29,
2006); and GAO, Geostationary Operational Environmental Satellites:
Steps Remain in Incorporating Lessons Learned from Other Satellite
Programs, [hyperlink, http://www.gao.gov/products/GAO-06-993]
(Washington, D.C.: Sept. 6, 2006).
[6] [hyperlink, http://www.gao.gov/products/GAO-10-799].
[7] [hyperlink, http://www.gao.gov/products/GAO-10-799].
[8] GAO, 2013 High-Risk Series: An Update, [hyperlink,
http://www.gao.gov/products/GAO-13-359T] (Washington, D.C.: February
14, 2013).
[9] [hyperlink, http://www.gao.gov/products/GAO-12-756].
[10] This module is called the Mission Management function.
[11] GAO, GAO Cost Estimating and Assessment Guide: Best Practices for
Developing and Managing Capital Program Costs, [hyperlink,
http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: Mar. 2009).
[12] While NOAA has ultimate responsibility for GOES-R, NOAA shares
program management responsibilities with NASA, and the program office
is located at NASA's Goddard Space Flight Center.
[13] Until late 2012, NOAA required the ground project to maintain 30
percent of its development cost as a reserve. However, program
officials recently revised the requirement down to 20 percent to
reflect the shorter amount of development time before launch and the
retirement of some risks.
[14] [hyperlink, http://www.gao.gov/products/GAO-12-576].
[15] Under testing, the electronics board emitted unexpectedly high
levels of radiation, which would cause a high number of false alarms
and hinder the program's ability to assess the instrument's
observations.
[16] While our report was in final processing, NOAA announced that it
would delay the launch date for its GOES-R satellite from October 2015
to the second quarter of fiscal year 2016.
[17] See [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. In May
2012, we published updated guidance on scheduling best practices. See
GAO, Schedule Assessment Guide: Best Practices for Project Schedules--
Exposure Draft, [hyperlink, http://www.gao.gov/products/GAO-12-120G]
(Washington, D.C.: May 30, 2012). The updated guidance identifies 10
best practices.
[18] [hyperlink, http://www.gao.gov/products/GAO-12-576].
[19] Level-of-effort activities represent work that has no measurable
output and cannot be associated with a physical product or defined
deliverable. These activities are typically related to management and
other oversight that continues until the detailed activities they
support have been completed.
[20] Total float time is the amount of time an activity can be delayed
or extended before the delay affects its successors or the program's
finish date.
[21] A driving path is the longest path of successive activities that
drives the finish date for a key milestone. The driving path often
corresponds to a schedule's critical path.
[22] Leading industry and government sources--including the Software
Engineering Institute's Capability Maturity Model®-Integration, the
Project Management Institute's Project Management Body of Knowledge,
the Federal Information System Controls Audit Manual, the IT
Governance Institute's Control Objectives for Information and related
Technology governance framework, and NASA system development policies-
-provide extensive guidance on managing requirements.
[23] [hyperlink, http://www.gao.gov/products/GAO-10-799].
[24] NOAA officials stated that the sole significant change was a
reduction in the accuracy requirement for the magnetometer, and
demonstrated that they had obtained approval from the most pertinent
user community, the National Weather Service's Space Weather
Prediction Center, before making the change.
[25] See GAO, Year 2000 Computing Crisis: Business Continuity and
Contingency Planning, [hyperlink,
http://www.gao.gov/products/GAO/AIMD-10.1.19] (Washington, D.C.:
August 1998); National Institute of Standards and Technology,
Contingency Planning Guide for Federal Information Systems, NIST 800-
34 (Gaithersburg, Md.: May 2010); Software Engineering Institute,
CMMI® for Acquisition, Version 1.3 (Pittsburgh, Pa.: November 2010).
[26] See GAO, GAO Cost Estimating and Assessment Guide: Best Practices
for Developing and Managing Capital Program Costs, [hyperlink,
http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: Mar.
2009). In May 2012, GAO published updated guidance on scheduling best
practices. See GAO, Schedule Assessment Guide: Best Practices for
Project Schedules--Exposure Draft, [hyperlink,
http://www.gao.gov/products/GAO-12-120G] (Washington, D.C.: May 30,
2012). The updated guidance identifies 10 best practices. In order to
compare past and current results, we conducted our current assessment
using the original 9 practices.
[End of section]
GAO’s Mission:
The Government Accountability Office, the audit, evaluation, and
investigative arm of Congress, exists to support Congress in meeting
its constitutional responsibilities and to help improve the
performance and accountability of the federal government for the
American people. GAO examines the use of public funds; evaluates
federal programs and policies; and provides analyses, recommendations,
and other assistance to help Congress make informed oversight, policy,
and funding decisions. GAO’s commitment to good government is
reflected in its core values of accountability, integrity, and
reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through GAO’s website [hyperlink, http://www.gao.gov]. Each
weekday afternoon, GAO posts on its website newly released reports,
testimony, and correspondence. To have GAO e-mail you a list of newly
posted products, go to [hyperlink, http://www.gao.gov] and select
“E-mail Updates.”
Order by Phone:
The price of each GAO publication reflects GAO’s actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black
and white. Pricing and ordering information is posted on GAO’s
website, [hyperlink, http://www.gao.gov/ordering.htm].
Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537.
Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional
information.
Connect with GAO:
Connect with GAO on Facebook, Flickr, Twitter, and YouTube.
Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts.
Visit GAO on the web at [hyperlink, http://www.gao.gov].
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm];
E-mail: fraudnet@gao.gov;
Automated answering system: (800) 424-5454 or (202) 512-7470.
Congressional Relations:
Katherine Siggerud, Managing Director, siggerudk@gao.gov:
(202) 512-4400:
U.S. Government Accountability Office:
441 G Street NW, Room 7125:
Washington, DC 20548.
Public Affairs:
Chuck Young, Managing Director, youngc1@gao.gov:
(202) 512-4800:
U.S. Government Accountability Office:
441 G Street NW, Room 7149:
Washington, DC 20548.
[End of document]