This is the accessible text file for GAO report number GAO-08-1012
entitled '2010 Census: Census Bureau Needs Procedures for Estimating
the Response Rate and Selecting for Testing Methods to Increase
Response Rate' which was released on October 30, 2008.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as part
of a longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to Congressional Requesters:
United States Government Accountability Office:
GAO:
September 2008:
2010 Census:
Census Bureau Needs Procedures for Estimating the Response Rate and
Selecting for Testing Methods to Increase Response Rate:
2010 Census:
GAO-08-1012:
GAO Highlights:
Highlights of GAO-08-1012, a report to congressional requesters.
Why GAO Did This Study:
The U.S. Census Bureau (Bureau) estimates that it will spend at least
$2 billion to enumerate households that did not return census forms
during the 2010 Census. Increasing the response rate would reduce the
number of households that Bureau field staff must visit. To address
concerns about reducing the cost of enumerating these households, GAO
(1) analyzed how the Bureau develops, supports, and updates the
response rate estimate, and the extent to which the Bureau uses the
estimate to inform its 2010 planning efforts; (2) described the methods
the Bureau considered for increasing response in 2010 and how it tested
these methods; and (3) assessed how the Bureau identifies and selects
for testing methods to increase response rate, including considering
other surveys’ methods.
To meet these objectives, GAO analyzed the Bureau’s documentation for
estimating the response rate and selecting for testing methods to
increase response, and interviewed experts from other survey
organizations.
What GAO Found:
The 2010 Census response rate estimate is not fully supported,
systematically reevaluated, or clearly incorporated into the life cycle
cost estimate and planning efforts for nonresponse follow-up—where
census workers visit households that do not return their census forms.
Specifically, the Bureau could not demonstrate support for one
component underpinning the estimate—a general decline due to decreasing
public participation in surveys—because it did not document its
decisions or data sources when developing the estimate. The two other
estimate components that affect responses are the short-form-only
census and the replacement questionnaire. In 2001, the Bureau estimated
the 2010 Census response rate to be 69 percent. However, from 2001
through 2008, the Bureau did not systematically reevaluate the estimate
or consider test results from this decade to determine if the estimate
should be updated. Although the Bureau revised the estimate to 64
percent after a major redesign of its nonresponse follow-up operation
in 2008, the Bureau still lacks procedures for establishing when and
how to reevaluate and, if necessary, update the estimate. To estimate
costs and plan for nonresponse follow-up, the Bureau relies on response
rate estimates for local census office types because these estimates
reflect geographic differences. Officials said that the local estimates
reflect components of the national estimate. However, only one of the
three components from the national estimate—the replacement
questionnaire—was clearly reflected in the local census office type
estimates.
Through various national and field tests and experiments, the Bureau
tested nine methods to increase 2010 Census response and currently
plans to implement two of these methods—the replacement questionnaire
and two-column bilingual form. The Bureau also plans to use a
communications campaign to increase response and plans to test campaign
messages in 2009. In July 2006, the Bureau decided not to include an
Internet response option in the 2010 Census. However, the Bureau
recently announced that it is again considering including the Internet
option in 2010, although it has not developed further plans for testing
it.
For 2010, the Bureau established test objectives and research questions
to identify methods to test for increasing response. However, Bureau
officials did not document the methods that they considered but decided
not to test or the rationale behind those decisions. Although officials
said that they considered cost, complexity of the test, and
compatibility of experiments in their decisions, they did not specify
how they weighed these factors to select and prioritize the nine
methods they chose to test. Officials said that they consider the
experiences of other survey organizations to identify potential methods
to increase response, but they, along with some experts, noted that
such methods may only be indirectly applicable to the decennial census.
Nonetheless, testing modifications to methods the Bureau has previously
considered or tested, such as testing a variety of telephone reminder
messages, may yield additional opportunities for increasing response.
What GAO Recommends:
GAO recommends that the Secretary of Commerce direct the Bureau to
establish procedures for developing, documenting, and reevaluating the
response rate estimate and for selecting for testing methods to
increase the response rate. In commenting on a draft of this report,
Commerce generally agreed with GAO’s recommendations and committed to
take action for the 2020 Census.
To view the full product, including the scope and methodology, click on
[hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-08-1012]. For more
information, contact Mathew J. Scirè at (202) 512-6806 or
sciremj@gao.gov.
[End of section]
Contents:
Letter:
Results in Brief:
Background:
The Response Rate Estimate Lacks Support, Is Not Systematically
Reevaluated, and Is Not Clearly Incorporated into Planning Efforts:
The Bureau Plans to Incorporate Additional Methods to Increase Mail
Response in 2010:
Bureau Lacks Procedures for Selecting for Testing Methods to Increase
Response Rate:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Objectives, Scope, and Methodology:
Appendix II: Testing of Methods to Increase Response to 2010 Census:
Appendix III: Comments from the Department of Commerce:
Appendix IV: GAO Contacts and Staff Acknowledgments:
Table:
Table 1: Self-Response Methods Tested in 2010 Testing Cycle and Effect
on Response Rate in Tests:
Figures:
Figure 1: Decennial Census Costs from 1970 through 2010 (Projected) in
Constant 2010 Dollars:
Figure 2: Excerpt from Advance Letter from Census 2000:
Figure 3: Reminder Postcard from 2003 Census Test:
Figure 4: Components of the Census Bureau's 2001 National Response Rate
Estimate for the 2010 Census:
Figure 5: Excerpt from Two-Column Bilingual Form from 2008 Dress
Rehearsal:
Figure 6: Text for Telephone Reminder Call from 2003 Census Test:
Figure 7: Message on Flap to Distinguish Replacement Questionnaire from
Initial Questionnaire from 2005 Census Test:
Figure 8: Due Date on Envelope from 2006 Test:
Figure 9: Example of a Census 2000 Communications Campaign Poster:
United States Government Accountability Office:
Washington, DC 20548:
September 30, 2008:
The Honorable Tom Davis:
Ranking Member:
Committee on Oversight and Government Reform:
House of Representatives:
The Honorable Michael R. Turner:
Ranking Member:
Subcommittee on Information Policy, Census and National Archives:
Committee on Oversight and Government Reform:
House of Representatives:
The U.S. Census Bureau (Bureau) estimates that even after adjusting for
inflation, the 2010 decennial census will be the most expensive census
in our nation's history, costing from $13.7 billion to $14.5
billion.[Footnote 1] The Bureau estimates that more than $2 billion
will be used to employ temporary field staff for nonresponse follow-up-
-its largest field operation where enumerators interview households
that did not return census forms. Increasing the response rate would
reduce the number of households that Bureau field staff must visit
during this nationwide operation.[Footnote 2] According to Bureau
officials, a 1 percent increase in the response rate can save $75
million.
The Bureau expects to hire over 700,000 temporary workers to conduct
nonresponse follow-up with about 47 million households over the course
of 10 weeks in 2010. The Bureau initially based the schedule, staffing,
and funding it needed for nonresponse follow-up on an estimated
national response rate of 69 percent. However, in February 2008, the
Director of the Bureau initiated a replanning of the Field Data
Collection Automation program--a major acquisition that includes
systems; equipment, including handheld computers; and infrastructure
for field staff to use in collecting data for the 2010 Census. After
analyzing several options to revise the design of the 2010 Decennial
Census, on April 3, 2008, the Secretary of Commerce announced that the
Bureau would no longer use handheld computers in nonresponse follow-up
and revised the estimated national response rate from 69 percent to 64
percent. The Bureau estimated that this option would result in a cost
increase of $2.2 billion to $3 billion over the previously reported
cost estimate of $11.5 billion.
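As a quick arithmetic check (illustrative only; the figures are those reported above), the revised life cycle cost range follows from adding the announced increase to the prior estimate:

```python
# Figures reported in this section, in billions of dollars.
previous_estimate = 11.5                 # life cycle cost estimate before the replan
increase_low, increase_high = 2.2, 3.0   # announced cost increase from the replan

revised_low = previous_estimate + increase_low    # lower bound of revised range
revised_high = previous_estimate + increase_high  # upper bound of revised range

print(f"${revised_low:.1f} billion to ${revised_high:.1f} billion")
# prints: $13.7 billion to $14.5 billion
```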
To address your concerns about reducing the cost of nonresponse follow-
up operations, we reviewed the Bureau's estimated response rate and
plans for increasing response. Specifically, we (1) analyzed how the
Bureau develops, supports, and updates the response rate estimate, and
the extent to which the Bureau uses the estimate to inform its 2010
planning efforts; (2) described the methods the Bureau considered for
increasing response rates in 2010 and what it did to test these
methods; and (3) assessed how the Bureau identifies and selects for
testing methods to increase response rate, including considering other
surveys' methods for increasing response.
To meet these objectives, we reviewed documentation to support the
components of the response rate estimate and research literature on
methodologies to increase response to mail surveys and efforts to
estimate survey response rate. We analyzed Bureau documents related to
2000 Census evaluations, the 2010 research and testing program, the
2010 Census life cycle cost estimate, and the communications campaign.
We also interviewed Bureau officials in the Decennial Management
Division and other divisions about methods to increase self-response.
Further, we interviewed experts on survey methodology from Statistics
Canada, the U.S. Department of Labor's Bureau of Labor Statistics, and
various academic institutions, as well as former Census Bureau
officials and researchers. We asked the experts a common set of
questions in order to compare responses and identify recurring themes.
Appendix I includes a list of experts we interviewed and additional
information on our scope and methodology.
We conducted our review from July 2007 through September 2008 in
accordance with generally accepted government auditing standards. Those
standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our
findings and conclusions based on our audit objectives. We believe that
the evidence obtained provides a reasonable basis for our findings and
conclusions based on our audit objectives.
Results in Brief:
The 2010 Census response rate estimate is not fully supported,
systematically reevaluated, or clearly incorporated into cost and
planning efforts. The Bureau developed a national response rate
estimate of 69 percent for the 2010 Census in 2001, but could not
demonstrate support for one of the three components underpinning it--
the general decline due to decreasing public participation in surveys.
(The other two components are the use of a short-form-only census and
the introduction of a replacement mailing, both of which the Bureau
expects to increase response.)
The Bureau does not have procedures for developing the estimate, nor
could the Bureau provide detailed documentation on how the response
rate estimate was developed. Clear, detailed documentation would allow
the Bureau and others to better assess the reliability of the estimate,
such as determining whether the estimate's assumptions are supported
and realistic. Although the Bureau reevaluated and developed a revised
estimate of 64 percent after the Bureau announced a major redesign of
the nonresponse follow-up operation in April 2008, the Bureau still
does not have procedures for systematically reevaluating the estimate
throughout the decade based on test results or other triggering events.
For example, the Bureau did not reevaluate the estimate prior to April
2008 or use test results conducted during the decade to determine if
the estimate should be updated. Having procedures for when and how to
reevaluate the estimate would help the Bureau to ensure that the
estimate is current and reliable.
Finally, for estimating cost and planning for nonresponse follow-up,
the Bureau relies on response rate estimates from four primary types of
local census offices (LCO), which help to account for expected
differences in response patterns across geographic settings. According
to the Bureau, the LCO-type response rate estimates are, in part, based
on components of the national estimate. However, the Bureau could not
explain why it applied the replacement mailing component equally across
all LCO-type estimates or how it incorporated the short-form-only
component and the general decline component into any of the LCO-type
estimates. Establishing a quantitative basis for applying the
components to the LCO-type estimates would better inform the Bureau's
cost estimate and planning efforts for operations that are based on
response, such as hiring enumerators for nonresponse follow-up.
Overall, the Bureau lacks procedures for developing, documenting, and
reevaluating the response rate estimate. Nonetheless, Bureau officials
stated that they tried to conservatively estimate the response rate to
ensure that they are adequately prepared during nonresponse follow-up
and to avoid repeating what happened in 1990, when the Bureau
overestimated the response rate, requiring a supplemental appropriation
of $110 million and forcing it to extend nonresponse follow-up by up to
8 weeks for some areas. Conservatively estimating the response rate may
be a reasonable approach. However, having clear documentation to
support the estimate and establishing procedures for when or how to
reevaluate the estimate to ensure that it reflects current information
from testing would enable the Bureau and others to assess whether the
estimate is reliable. An unreliable response rate estimate can produce
an inaccurate cost estimate and can increase risk and uncertainty to
operational plans that are based on the response rate estimate.
Through various national and field tests and experiments, the Bureau
tested nine methods to increase self-response to the 2010 Census and
plans to implement two of these methods--the replacement questionnaire
and two-column bilingual form. An additional method the Bureau plans to
include in the 2010 Census to increase response, the Integrated
Communications Campaign, has not been tested, although a similar
communications campaign was part of the 2000 Census. The Bureau does
not plan to include three methods it tested to increase mail response-
-a telephone reminder call, messaging to distinguish initial and
replacement questionnaires, and a due date on the initial mailing
package--because they did not significantly increase response or
because they required further testing. Bureau officials said that they
will be
conducting additional testing on the use of a due date as part of the
2010 Census Program for Evaluations and Experiments. The National
Academy of Sciences also recommended such analysis. At the time of this
report, the Bureau had not yet completed its plans for 2010 evaluations
and experiments.
The Bureau also does not plan to include four methods it tested to
increase self-response through electronic systems--interactive voice
response; a letter encouraging responding via the Internet, instead of
sending a replacement questionnaire; computer-assisted telephone
interviewing; and the Internet--because they did not increase response
or were too costly. In July 2006, the Bureau decided not to include the
Internet option in the design of the 2010 Census largely because it had
underestimated contract costs for developing the system that included
the Internet option, but also because test results indicated that the
Internet did not always increase the response rate and that security
over respondent data was a concern. However, the Bureau recently
announced that it is again considering including the Internet option in
2010, even though it has not developed further plans for testing it.
Finally, the Bureau has not yet developed detailed testing or
evaluation plans for the communications campaign, though it plans to
test campaign messages in 2009.
To identify methods for increasing response in 2010 that it planned to
test, the Bureau developed a research strategy that included
establishing test objectives and research questions. However, the
Bureau did not document the methods it considered but decided not to
test during the 2010 testing cycle or the rationale behind those
decisions. Further, for methods that the Bureau decided to test,
officials could not provide support for how they selected or
prioritized these methods. Although officials said that they considered
cost, complexity of the test, and compatibility of experiments in their
decisions, they did not specify how they defined or weighed these
factors to select and prioritize the nine methods they chose to test.
Documenting decisions about methods that were not selected for testing
would help the Bureau more effectively build capacity and institutional
knowledge about changes in methods to consider in the future. Bureau
officials also said that they consider the experiences of other survey
organizations to identify potential methods to increase response rate.
However, both Bureau officials and some survey experts acknowledged
that methods used to increase response in other surveys may only be
indirectly applicable to the decennial census. Nonetheless, testing
modifications to methods the Bureau has previously considered or
tested, such as testing a variety of telephone reminder messages, may
yield additional opportunities for increasing response.
To enhance credibility of the response rate for determining cost and
planning for future decennial censuses and to inform assumptions
underlying the 2020 response rate estimate, we recommend that the
Secretary of Commerce direct the Bureau to establish procedures for
documenting the process for developing the response rate estimate,
including analyzing 2010 data to assess the reasonableness of
assumptions used in applying the national estimate's components to the
LCO-type estimates, as well as establishing when and how to reevaluate
the response rate estimate. Further, to improve the planning and
transparency of the Bureau's research and testing for future censuses,
we recommend that the Bureau develop procedures for selecting methods
to test for increasing response.
On September 22, 2008, the Secretary of Commerce provided written
comments on a draft of this report (see app. III). Commerce generally
agreed with our conclusions and recommendations and provided
suggestions where additional context or clarification was needed. Where
appropriate, we incorporated these changes.
Background:
In 1970, the Bureau moved away from conducting the census door-to-door
and began mailing census questionnaires to households to be filled out
and returned by mail. Since the 1970 Census, the Bureau has used mail
as the primary method for collecting census data. For Census 2000, the
mailout/mailback method was used for more than 80 percent of the
population.[Footnote 3] Households that fail to mail back the census
questionnaires are included in nonresponse follow-up workload, where
enumerators follow up with door-to-door visits and telephone calls or
solicit census data from knowledgeable people, such as neighbors.
The census response rate declined dramatically from 1970 through 1990.
The 1970 response rate was 78 percent. The rate decreased to 75 percent
for the 1980 Census and then decreased again to 65 percent for the 1990
Census.[Footnote 4] Although the Bureau estimated that the 2000 Census
response rate would continue to decline to 61 percent, actual response
exceeded the estimate, reaching 65 percent for the mailout/mailback
universe prior to nonresponse follow-up.[Footnote 5] The response rate
is defined as the percentage of census forms completed and returned for
all housing units on the Bureau's address file that were eligible to
receive a census questionnaire delivered by mail or by a census
enumerator. The denominator used in calculating the response rate
includes vacant housing units as well as addresses that were determined
to be undeliverable or that were deleted through other census
operations.[Footnote 6]
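The definition above can be expressed as a simple calculation. The sketch below is illustrative only (the counts are hypothetical, not Bureau data), but it shows how keeping vacant and undeliverable addresses in the denominator lowers the rate:

```python
def response_rate(forms_returned, occupied_units, vacant_units, undeliverable_units):
    """Percentage of eligible addresses that returned a completed form.

    Per the definition in this report, the denominator keeps vacant and
    undeliverable addresses, which pushes the computed rate down.
    """
    eligible = occupied_units + vacant_units + undeliverable_units
    return 100.0 * forms_returned / eligible

# Hypothetical illustration: 65 of 100 eligible addresses mail back a form.
rate = response_rate(forms_returned=65, occupied_units=90,
                     vacant_units=7, undeliverable_units=3)
# rate == 65.0
```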
Since 1970, per-household census costs have increased (in constant 2010
dollars) from about $14 in 1970 to an estimated $88 in 2010--a figure
that does not include the recent, major redesign of the Field Data
Collection Automation program. According to the Bureau, factors
contributing to the increased costs include an effort to accommodate
more complex households, busier lifestyles, more languages and greater
cultural diversity, and increased privacy concerns. In addition, the
number of housing units--and hence, the Bureau's workload--has
continued to increase. The Bureau estimated that the number of housing
units for the 2010 Census will increase by almost 14 percent over 2000
Census levels (from 117.5 million to 133.8 million housing units). As a
result of multiple factors, the total inflation-adjusted life cycle
cost for the decennial census has increased more than tenfold since
1970, as shown in figure 1.
Figure 1: Decennial Census Costs from 1970 through 2010 (Projected) in
Constant 2010 Dollars:
This figure is a bar graph showing decennial census costs from 1970
through 2010 (projected) in constant 2010 dollars. The X axis
represents the census year, and the Y axis represents the cost in
billions of dollars.
Census year: 1970;
Cost in billions of dollars: 1.0.
Census year: 1980;
Cost in billions of dollars: 2.6.
Census year: 1990;
Cost in billions of dollars: 4.1.
Census year: 2000;
Cost in billions of dollars: 8.2.
Census year: 2010 (projected);
Cost in billions of dollars: 11.8.
[See PDF for image]
Source: GAO analysis of U.S. Census Bureau figures.
Note: This figure does not reflect the Bureau's estimated increase in
the 2010 Census life cycle cost estimate ranging from $13.7 billion to
$14.5 billion, which the Bureau announced on April 3, 2008, with the
replan of the Field Data Collection Automation program.
[End of figure]
The Bureau conducts its testing, evaluation, and experimentation
program for the decennial census primarily through the Decennial
Statistical Studies Division. The division develops and coordinates the
application of statistical techniques in the design and conduct of
decennial census programs. The Bureau conducts various tests throughout
the decade to determine the effect and feasibility of form or
operational changes. For example, in the 1990s, the Bureau tested
whether adding a mandatory response message and implementing a multiple
mailing strategy, such as sending an advance letter and reminder
postcard, as shown in figures 2 and 3, would increase the response
rate. Based on positive test results, these methods were added to the
Census 2000 design.
Figure 2: Excerpt from Advance Letter from Census 2000:
This figure is an excerpt from the advance letter from Census 2000.
United States Department Of Commerce:
Bureau of the Census:
Washington, DC 20233-2000:
Office Of The Director:
March 6, 2000:
About one week from now, you will receive a U.S. Census 2000 form in
the mail.
When you receive your form, please fill it out and mail it in promptly.
Your response is very important. The United States Constitution
requires a census of the United States every 10 years. Everyone living
in the United States on April 1, 2000, must be counted. By completing
your census form, you will make sure that you and members of your
household are included in the official census count.
Official census counts are used to distribute government funds to
communities and states for highways, schools, health facilities, and
many other programs you and your neighbors need. Without a complete,
accurate census, your community may not receive its fair share.
You can help in another way too. We are now hiring temporary workers
throughout the United States to help complete the census. Call the
Local Census Office near you for more information. The phone number is
available from directory assistance or the Internet at [hyperlink,
http://www.census.gov/jobs2000]. With your help, the census can count
everyone. Please do your part. Thank you.
Sincerely,
Signed by:
Kenneth Prewitt
Director:
Bureau of the Census:
Enclosure:
United States Census 2000:
[See PDF for image]
Source: U.S. Census Bureau.
[End of figure]
Figure 3: Reminder Postcard from 2003 Census Test:
United States Department Of Commerce:
Economics and Statistics Administration:
U.S. Census Bureau:
Washington, DC 20233-0001:
Office Of The Director:
February 7, 2003:
A few days ago, you should have received a request for your
participation in the 2003 National Census Test. The test is being
conducted by the U.S. Census Bureau to help develop new methods for the
next census in 2010.
If you have already mailed back your completed census form, please
accept our sincere thanks. If you have not responded, please mail it
back as soon as possible.
Taking an accurate census is important for all communities throughout
the United States to get their fair share of federal funding. Your
participation in this test is important to the success of the next
census.
Sincerely,
Signed by:
Charles Louis Kincannon:
Director, U.S. Census Bureau:
[See PDF for image]
Source: U.S. Census Bureau.
[End of figure]
The Decennial Statistical Studies Division also conducts evaluations of
census operations, typically using data collected as part of the
decennial process, to determine whether individual steps in the census
operated as expected. It is also responsible for conducting experiments
during the census that may be instructive for future censuses. These
experiments usually involve using alternative processes for a subset of
the population. The Bureau is working with internal and external
stakeholders on defining and developing its program for experiments and
evaluations for the 2010 Census, known as Census Program for
Evaluations and Experiments, with design work and implementation
starting in 2008 and continuing through 2011. The final set of
activities would include analysis, documentation, and presentation of
the research, and these activities would start in 2009 and be completed
by the end of fiscal year 2013.
The Response Rate Estimate Lacks Support, Is Not Systematically
Reevaluated, and Is Not Clearly Incorporated into Planning Efforts:
2010 Census Response Rate Estimate Is Not Fully Supported:
The Bureau developed the response rate estimate for the 2010 Census in
2001 but could not demonstrate support for one of the three components
underpinning the estimate. To establish the 2010 estimate, Bureau
officials told us that they used Census 2000 mail response data as the
baseline and then incorporated three other components to arrive at a
response rate estimate of 69 percent: an increase in response related
to eliminating the long form and moving to a short-form-only census, an
increase in response related to sending a replacement questionnaire to
nonresponding households, and a general decline in mail response due to
decreasing public participation in surveys. The components of the
estimate are outlined in figure 4, and a detailed explanation of the
baseline and each component follows.
Figure 4: Components of the Census Bureau's 2001 National Response Rate
Estimate for the 2010 Census:
This figure is a chart showing the components of the Census Bureau's
2001 national response rate estimate for the 2010 Census.
[See PDF for image]
Source: GAO analysis of U.S. Census Bureau information.
[End of figure]
Baseline Rate:
Included in the baseline national response rate of 65 percent[Footnote
7] are the effects of methods, such as the multiple mailing strategy
and the communications campaign (called the Partnership and Marketing
Program), that were implemented in Census 2000 and are planned to be
implemented in the 2010 Census as well. Bureau officials stated that
the communications campaign, which included paid advertising, had an
impact on the response rate achieved in 2000, but they were unable to
quantify that effect and did not project the campaign's effect for
2010.
Short-Form-Only Census:
The Bureau estimated an increase of 1 percentage point from eliminating
the long form and moving to a short-form-only census. The Bureau
conducted an analysis comparing Census 2000 response rates for the
short form and the long form. Even though the difference in response
rates for the two form types was rather large--the short form had a
response rate that was more than 10 percentage points higher than the
long form--the overall effect on the response rate estimate was small
because the long form was sent to only approximately 17 percent of
housing units in 2000.
Replacement Mailing:
In 2001, the Bureau expected that sending a replacement questionnaire
to households that had not responded by a certain date would increase
the response rate by 7 percentage points.[Footnote 8] The magnitude of
this component was based on test results from the 1990s, which showed
an increase in the response rate related to the replacement mailing of
at least 7 percentage points, and possibly 10 percentage points or
more. Bureau officials stated that they estimated this effect
conservatively because they assumed that a higher response rate from a
replacement questionnaire is more likely to occur in tests than in the
decennial census. Also, testing showed that, because of the timing of
enumerators' visits, not all respondents who return replacement
questionnaires would be removed from the nonresponse follow-up workload
before being enumerated.
General Decline:
The Bureau estimated a decrease of 4 percentage points from the
baseline level due to what it believes is a decline over time in public
survey participation. However, this assumption is not supported by
quantitative analysis or research studies but rather is based on the
opinion of subject matter experts and senior officials. Further, the
Bureau could not demonstrate who was consulted, when they were
consulted, or how they decided on the amount of the general decline.
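Putting the baseline and the three components together reproduces the 2001 national estimate of 69 percent. The sketch below simply restates the arithmetic of figure 4 (it is not Bureau source material):

```python
# Components of the 2001 national response rate estimate, in percentage
# points, as described in this section.
baseline = 65             # Census 2000 mail response rate
short_form_only = +1      # eliminating the long form
replacement_mailing = +7  # second questionnaire sent to nonrespondents
general_decline = -4      # assumed drop in public survey participation

estimate_2001 = baseline + short_form_only + replacement_mailing + general_decline
# estimate_2001 == 69
```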
Best practices for cost estimation from our Cost Assessment
Guide[Footnote 9] call for documenting assumptions and data sources
used to develop cost estimates, such as the general decline component
of the response rate estimate. In addition, several experts we
interviewed agreed that the Bureau should have used quantitative
analysis of mail response data from other surveys to support the
general decline component when it developed the estimate in 2001. To
help support the general decline component, we suggested that, for
comparison purposes, the Bureau use changes in participation in the
American Community Survey, which uses a mailing strategy similar to the
decennial census's and for which response is also required by law. In May
2008, the Bureau completed an analysis of American Community Survey
response rate data from 2000 to 2007, which demonstrated a decline of
6.6 percentage points in cooperation rates[Footnote 10] to the initial
questionnaire. Although the decline in response to the American
Community Survey may not be directly comparable to response behavior
for the decennial census, it provides some support for the Bureau's
assumption that mail survey participation may be declining.
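The component structure described above is additive and can be illustrated with a short calculation. The component sizes (+1 short-form-only, +7 replacement mailing, -4 general decline) come from this section; the baseline value below is purely hypothetical, since the actual baseline figure is not given here:

```python
# Additive sketch of the national response rate estimate (component sizes
# from the report; the baseline is a hypothetical placeholder).
baseline = 65.0  # hypothetical Census 2000-derived baseline, in percent

components = {
    "short_form_only": +1.0,      # eliminating the long form
    "replacement_mailing": +7.0,  # second questionnaire to nonrespondents
    "general_decline": -4.0,      # assumed drop in public survey participation
}

estimate = baseline + sum(components.values())
print(f"Estimated response rate: {estimate:.1f} percent")
```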
The Bureau does not have procedures for developing the response rate
estimate. Specifically, the Bureau has no established policies for
documenting deliberations related to or data sources used in developing
the estimate, including no 2010 Census decision memorandum to document
the original estimate prepared in 2001. Several experts also suggested
that the Bureau should develop a model for developing the estimate and
incorporate response rate data and demographic data from other
surveys.[Footnote 11]
Bureau officials, however, noted that they have not determined how to
apply data from other surveys to estimate the census response rate
because the census differs from other surveys, nor do they plan to
develop a model of the response rate estimate. Instead, Bureau
officials stated that they tried to conservatively estimate the
response rate to ensure that they are adequately prepared during
nonresponse follow-up and to avoid repeating what happened in 1990,
when the Bureau overestimated the response rate, requiring a
supplemental appropriation of $110 million and forcing it to extend
nonresponse follow-up by up to 8 weeks for some areas. In contrast, in
2000 the response rate exceeded the Bureau's estimate. Conservatively
estimating the response rate may be a reasonable approach. However,
having clear, detailed documentation about decisions related to or data
sources used in developing the estimate would enable the Bureau and
others to better assess whether the estimate is reliable, such as
whether its assumptions are supported and realistic. An unreliable
response rate estimate can produce an inaccurate cost estimate and can
increase risk and uncertainty to operational plans that are based on
the response rate estimate.
The Bureau Does Not Systematically Reevaluate the Response Rate
Estimate:
Although the Bureau updated the response rate estimate as a result of a
major redesign of census operations undertaken in April 2008, the
Bureau still lacks procedures for establishing when and how to
reevaluate the response rate estimate. From 2001 through April 2008,
the Bureau did not reevaluate the estimate to determine whether it
should be updated, even though, after establishing the initial estimate
in 2001, the Bureau completed several evaluations from Census 2000 and
conducted several census tests that could have informed the estimate.
During the first few years of the decade, the Bureau completed 12
evaluations that discussed aspects of the Census 2000 response rate. In
addition, the Bureau conducted five tests from 2003 through 2007 that
were designed, in part, to test methods to increase response. For
example, in 2005 and 2007, the Bureau tested a two-column bilingual
form, which includes identical questions and response options in
English and Spanish. The 2005 test demonstrated a significant increase
in mail response rates of 2.2 percentage points overall and 3.2
percentage points in areas with high concentrations of Hispanic and
non-White populations, and the 2007 test revealed a similar impact.
However, Bureau officials stated that they were
concerned that the two-column bilingual form has not been tested in a
census environment. In addition, they noted that the bilingual form
will be sent to only a portion of the country--approximately 10 percent
of census tracts--with high concentrations of Spanish-speaking
populations, based on American Community Survey data.
Best practices state that assumptions, such as the response rate
estimate, should be revised as new information becomes available. Our
Cost Assessment Guide recommends that preliminary information and
assumptions be monitored to determine relevance and accuracy and be
updated as changes occur to reflect the best information available.
Further, according to several experts, the Bureau could have updated
the estimate based on response rate data and demographic data from
other surveys, and another expert suggested that the Bureau use
triggering events, such as after tests, for reviewing the response rate
estimate. Based on this expert's experience, the Bureau could establish
a change control board chaired by senior Bureau officials to determine
whether the response rate estimate should be revised. However, Bureau
officials explained that they do not update the estimate based on
results from tests because the tests cannot replicate the decennial
census environment. Overall, the Bureau has not specified when or how
it is to reevaluate its response rate estimate. Establishing these
procedures would help the Bureau ensure that the estimate is current
and reliable in order to better inform planning efforts.
The Bureau revised the estimate down to 64 percent after announcing
that nonresponse follow-up would be changed from an automated to a
paper-based operation. Prior to the redesign, the Bureau had planned to
reduce the nonresponse follow-up workload by using handheld computers
to remove households that returned their forms late--including many
replacement mailing returns--from the enumerator assignments on a daily
basis. According to Bureau officials, they revised the response rate
estimate based on the timing of mail returns in 2000 and replacement
questionnaire returns in the 2003 test. However, the revised estimate
does not fully reflect recently designed procedures to remove late mail
returns in 2010. Specifically, for 2010 the Bureau now plans to mail
the replacement questionnaire earlier, send a blanket replacement
questionnaire to some areas and a targeted replacement questionnaire to
others,[Footnote 12] and conduct three clerical removals of late mail
returns immediately prior to and during the nonresponse follow-up
operation. These operational plans are still being finalized pending
further analysis. Bureau officials said that although they hope that
these revised operations will increase response, they did not update
the response rate estimate to reflect these current operational plans
because they have not tested this approach under decennial census
conditions and therefore have no basis for estimating the potential
effect of these operational changes on response.
Local Response Rate Estimates Used in Planning Efforts Lack Support:
To estimate costs and to plan for activities such as nonresponse follow-
up, questionnaire printing, and postage needs, the Bureau uses a life
cycle cost model that relies on response rate estimates for four
primary LCO types[Footnote 13] and not the national estimate. Using LCO-
type estimates helps to account for expected differences in response
patterns across geographic settings. For example, response rates for
inner-city areas are estimated to be lower than those for suburban areas.
These estimates range from 55 percent to 72 percent, but lack support
for how they were developed. In determining the 2010 response rate
estimates for the four LCO types, the Bureau said that it equally
applied two components from the national estimate--the replacement
mailing component and the short-form-only component--to the LCO types
in the cost model. However, the Bureau did not conduct quantitative
analysis to determine whether the replacement mailing component in the
national estimate should be applied equally for each LCO type. By
applying an equal percentage point increase across all LCO types for
the replacement mailing, the Bureau has, in effect, assumed that the
LCO type with the lowest baseline--48 percent--would experience a
higher relative increase in response--15 percent--due to the
replacement mailing than the LCO type with the highest baseline--65
percent--which would be expected to experience an 11 percent increase
in response. The Bureau has not demonstrated support for assuming that
the LCO types will experience these different relative increases in
response. Further, the Bureau could not demonstrate whether or how the
short-form-only component and the general decline component of the
national estimate were reflected in the life cycle cost model.
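The relative-increase arithmetic described in this paragraph can be checked with a short calculation (a sketch of the division implied above, not Bureau code); the baselines of 48 and 65 percent and the 7-point replacement-mailing component come from the report:

```python
# Applying the same 7-percentage-point replacement-mailing component to
# every LCO type implies different *relative* gains, as the report notes.
replacement_points = 7.0  # national replacement-mailing component

for label, baseline in [("lowest-baseline LCO type", 48.0),
                        ("highest-baseline LCO type", 65.0)]:
    relative_gain = replacement_points / baseline * 100  # as percent of baseline
    print(f"{label}: +{replacement_points} points on {baseline} percent "
          f"is a {relative_gain:.0f} percent relative increase")
```

This reproduces the 15 percent and 11 percent relative increases cited above.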
Best practices for cost estimation from our Cost Assessment Guide call
for assumptions to be thoroughly documented, data to be normalized, and
each cost element to be supported by auditable and traceable data
sources. Following these best practices could allow the Bureau to
enhance its use of the response rate estimate in planning for the
decennial census and better inform stakeholders about the reliability
of the estimate. It is unclear why officials could only explain how one
component of the national response rate estimate was applied to LCO-
type estimates in the life cycle cost model. However, according to the
Bureau, it is currently documenting how these estimates are used to
calculate the costs for operations and how the estimates have changed
over time. Having support for the LCO-type estimates would better
inform the Bureau's planning efforts for operations that directly rely
on response, such as determining workload and hiring enumerators for
nonresponse follow-up.
The Bureau Plans to Incorporate Additional Methods to Increase Mail
Response in 2010:
Two of Nine Methods Tested to Increase Response Rate Are Planned to Be
Included in the 2010 Census:
We determined from reviewing Bureau testing documents that through
various national and field tests and experiments, the Bureau tested
nine methods to increase self-response to the 2010 Census, as shown in
table 1. The Bureau currently plans to implement two of these methods--
the replacement questionnaire and two-column bilingual form (see fig.
5).[Footnote 14] Three other methods for increasing mail response and
four methods for increasing response through an electronic data
collection system were tested but are not planned for implementation in
2010. An additional method the Bureau plans to include in the 2010
Census to increase response, the Integrated Communications Campaign,
has not been tested, although the Bureau conducted a similar campaign
in 2000 and plans to test campaign messages with audiences in early
2009. Additional details regarding the testing of these methods can be
found in appendix II.
Table 1: Self-Response Methods Tested in 2010 Testing Cycle and Effect
on Response Rate in Tests:
Methods planned for implementation in 2010: 1. Replacement
questionnaire;
Year(s) tested: 2003, 2006;
Effect on response rate in tests (percentages): 8.8 to 10.3.
Methods planned for implementation in 2010: 2. Two-column bilingual
form;
Year(s) tested: 2005, 2007;
Effect on response rate in tests (percentages): 1.4 to 2.2.
Methods not planned for implementation in 2010: Methods not planned for
implementation in 2010;
Year(s) tested: [Empty];
Effect on response rate in tests (percentages): [Empty].
Methods not planned for implementation in 2010: 3. Telephone reminder
call;
Year(s) tested: 2003;
Effect on response rate in tests (percentages): 0.
Methods not planned for implementation in 2010: 4. Due date on initial
questionnaire mailing package;
Year(s) tested: 2003, 2006;
Effect on response rate in tests (percentages): 0 to 2.0[A].
Methods not planned for implementation in 2010: 5. Messaging to
distinguish the replacement from the initial questionnaire;
Year(s) tested: 2005;
Effect on response rate in tests (percentages): -1.2.
Electronic response options: 6. Interactive voice response;
Year(s) tested: 2000, 2003;
Effect on response rate in tests (percentages): -4.9 to 0[B].
Electronic response options: 7. Letter encouraging internet response
instead of sending replacement questionnaire;
Year(s) tested: 2005;
Effect on response rate in tests (percentages): -3.7.
Electronic response options: 8. Computer-assisted telephone
interviewing;
Year(s) tested: 2000;
Effect on response rate in tests (percentages): 2.1.
Electronic response options: 9. Internet;
Year(s) tested: 2000, 2003;
Effect on response rate in tests (percentages): 0 to 2.5.
Source: GAO analysis of U.S. Census Bureau information.
Notes: We included only those years in which the tests focused
primarily on determining the method's effect on response, rather than
on data quality, operational feasibility, or other aspects of survey
design. A zero effect indicates that the method tested had no
statistically significant effect on response rate compared to the
control group. Test response rates may not be fully comparable to
response rates or return rates calculated for decennial censuses, in
part because ineligible housing units, such as those that are vacant or
undeliverable as addressed, are treated differently in the
denominators.
[A] The 2 percentage point increase in response rate was obtained when
the due date was tested in conjunction with a compressed mailing
schedule, which shortened the time period between the mailing of the
questionnaire and Census Day.
[B] The Bureau notes that the 2000 test result showing no statistically
significant effect on response is difficult to interpret because a
portion of the sample in the interactive voice response panel either
received the census form late or did not receive it at all.
[End of table]
Figure 5: Excerpt from Two-Column Bilingual Form from 2008 Dress
Rehearsal:
This figure is a copy of an excerpt from the two-column bilingual form
from the 2008 Dress Rehearsal.
[See PDF for image]
Source: U.S. Census Bureau.
[End of figure]
For the 2010 Census, the Bureau does not plan to include three methods
it tested to increase mail response, as shown in figures 6, 7, and 8.
Two of these methods--a telephone reminder call and messaging to
distinguish the initial and replacement questionnaires--were not found
to significantly improve response and therefore will not be implemented
as part of the 2010 Census. The third method--including a due date on
the initial mailing package--was found to generate faster responses
when tested in 2003 and to increase overall response by 2 percentage
points when tested in conjunction with a compressed mailing schedule in
2006.[Footnote 15] However, the Bureau also believes that the use of a
due date alone could cause a lower response rate because some people
may not send back the census form after the due date. According to
Bureau officials, they are not including this method in the decennial
census design because they would like to further test it--both with and
without a compressed mailing schedule--in a census environment. It will
be important for the Bureau to optimize this testing opportunity by
designing the test to determine the extent to which the faster and
higher response is due to a compressed schedule versus a due date, as
well as exploring other test treatments the Bureau has recommended in
the past, such as including the due date on multiple mailing pieces.
Bureau officials said that additional testing on the use of a due date
will be conducted as part of the 2010 Census Program for Evaluations
and Experiments, which the National Academy of Sciences recommended in
its December 2007 interim report.[Footnote 16] The Bureau did not
provide further details on plans for the Census Program for Evaluations
and Experiments, but officials have said that they will consider costs
and staffing needs in deciding what to evaluate. The academy's final
report is due in September 2009.[Footnote 17]
Figure 6: Text for Telephone Reminder Call from 2003 Census Test:
Hello. A few days ago the U.S. Census Bureau sent a request to your
address, asking for your participation in the 2003 National Census
Test. If you’ve already responded to this request, please accept our
sincere thanks. If you haven’t, please take a few minutes to complete
your census form and return it by mail. Your response will help us
develop the best procedures for counting the U.S. population. Thank you.
[See PDF for image]
Source: U.S. Census Bureau.
[End of figure]
Figure 7: Message on Flap to Distinguish Replacement Questionnaire from
Initial Questionnaire from 2005 Census Test:
This figure is a copy of the message on the flap distinguishing the
replacement questionnaire from the initial questionnaire in the 2005
census test.
[See PDF for image]
Source: U.S. Census Bureau.
[End of figure]
Figure 8: Due Date on Envelope from 2006 Test:
[See PDF for image]
Source: U.S. Census Bureau.
[End of figure]
In addition to testing methods to increase mail response, the Bureau
also tested four methods to increase self-response through electronic
data collection systems: (1) providing for response via an interactive
voice response system; (2) sending a letter to encourage either
responding via Internet or returning the initial questionnaire, instead
of sending a replacement questionnaire; (3) using computer-assisted
telephone interviewing; and (4) providing capability for responding via
the Internet. The Bureau found that the first two methods did not
increase response in tests and does not plan to include them in the
2010 Census. Computer-assisted telephone interviewing allows
respondents to use the telephone to connect with operators who record
interview responses electronically. The Bureau found that this method
increased the overall response rate in a Census 2000 experiment.
However, in a March 2004 report, the Bureau also stated that the method
would likely be too costly in terms of hardware, software, and staffing
resources, compared to the increase in response it might generate.
Computer-assisted telephone interviewing was not tested after the 2000
Census.
Although the Internet option was found to increase overall response
during the Census 2000 experiment, it did not increase overall response
when tested again in 2003, when a replacement mailing was also tested.
According to a July 2006 Bureau decision memo titled, "Rationale for
the Decision to Eliminate the Internet Option from the DRIS Contract,"
the Bureau decided not to include the Internet option in the design of
the 2010 Census largely because it had underestimated the costs of the
contract that included developing the Internet response option. In
responding to a draft of this report, the Bureau stated that it made
the decision because test results showed that offering the Internet
option did not increase overall response and would not offer any cost
savings. According to the 2006 memo, the Bureau also determined that
the operational risks and costs to develop this option outweighed the
potential benefits. In terms of benefits, the Bureau found improvements
in data completeness and data timeliness from Internet responses in
tests conducted in 2000/2001 for the American Community Survey and in
2003 and 2005 for the National Census Tests. The Bureau noted that
these benefits could translate into reduced costs since less follow-up
is required to improve data accuracy, and earlier receipt of responses
could result in fewer replacement questionnaires that need to be mailed
and fewer households that need to be enumerated during nonresponse
follow-up. However, with only 6 to 7 percent of the test population
using the Internet option, the Bureau concluded that no cost savings
could be realized from reducing the number or size of data capture
centers (facilities that process returned questionnaires) planned for
2010. Finally, the Bureau stated that the inability to fully test the
Internet option and growing concerns about Internet security made it
unfeasible for the Bureau to implement the Internet as a response
option for 2010. Despite its July 2006 decision, the Bureau recently
announced that it is considering including the Internet option as part
of the 2010 Census design; however, the Bureau has not developed plans
for further testing this option.
Testing and Evaluation Plans for the 2010 Communications Campaign Are
Not Yet Fully Developed:
Through its contractor and subcontractors, the Bureau has taken a
number of steps to inform the planning of the communications campaign
but has not yet fully developed the campaign's testing and evaluation
plans.[Footnote 18] From late 2006 through early 2008, focus groups
were conducted with various ethnic groups and hard-to-count
populations, such as unattached singles, to identify potential barriers
and motivators to participation, to better understand methods of
communication that work for different groups, and to develop the
campaign's overall creative expression. The Bureau also developed an
audience segmentation model using Census 2000 response data, updated
through 2006 using American Community Survey data, to provide a more
detailed understanding of the characteristics, such as home ownership
and unemployment level, of those more or less likely to respond, as
well as where they live, in order to better target communications to
encourage census participation. In addition, in 2008, a national phone
and in-person survey is being conducted to further explore barriers and
motivators to response, particularly in hard-to-count populations,
which would inform campaign message development.
According to agency officials, beginning in early 2009, campaign
messages are to be tested by showing storyboards to audiences that will
use electronic devices to vote on the messages. These messages will be
tested in 14 languages and for other populations, such as Spanish
speakers in Puerto Rico. According to the draft plan, testing of
events, partnership toolkits, promotional items, and public relations
ideas will also be conducted. However, the Bureau has not yet developed
detailed plans for this testing because, according to one official, the
Bureau intends to further develop testing plans as future fiscal year
funding amounts become available. An example of a Census 2000
communications campaign poster is shown in figure 9.
Figure 9: Example of a Census 2000 Communications Campaign Poster:
This figure is a copy of a Census 2000 communications campaign poster.
[See PDF for image]
Source: U.S. Census Bureau.
[End of figure]
In addition, although the Bureau expects to award a contract by the end
of fiscal year 2008 for an independent evaluation measuring the
campaign's performance against its goals of increasing mail response,
improving accuracy, reducing the differential undercount, and
improving cooperation with enumerators, it has not yet done so. In the
past, the Bureau has said that although evaluations have shown that the
Census 2000 communications campaign increased awareness for that
census, it was difficult to link increased awareness to changes in
respondent behavior. Bureau officials said that they have attempted to
analyze Census 2000 data to identify factors that influence behavior,
but their research results were inconclusive. Going forward, it will be
important for the Bureau to determine its plans for evaluating the 2010
communications campaign so that it does not miss opportunities to
collect data in the census environment to inform future campaigns.
Bureau Lacks Procedures for Selecting for Testing Methods to Increase
Response Rate:
For 2010, the Bureau developed a strategy to identify various methods
to test for increasing response. Specifically, the Bureau established
test objectives and research questions, such as identifying the optimal
mix of response options for the public to respond to the 2010 Census
and determining security and confidentiality issues surrounding
technology. The Bureau also developed the 2010 Census Integrated
Program Plan and 2010 Census Operations and Systems Plan to better
document its planning. However, Bureau officials did not document the
methods that they considered but decided not to test in the 2010
testing cycle or the rationale behind those decisions. Further, for
methods that the Bureau decided to test, officials could not provide
support for how they selected or prioritized these methods. Although
officials said that they considered cost, complexity of the test, and
compatibility of experiments in their decisions, they did not specify
how they defined or weighed these factors to select and prioritize the
nine methods they chose to test.
To ensure thorough and comprehensive planning of the decennial census,
our past work on lessons learned from Census 2000 highlighted the
importance of documentation to support research, testing, and
evaluation, as well as a comprehensive and prioritized plan of goals,
objectives, and projects.[Footnote 19] While the Bureau has developed
the 2010 Census Integrated Program Plan and 2010 Census Operations and
Systems Plan to better document its planning, these and other planning
documents we reviewed did not provide support for how the Bureau
selected and prioritized methods to test. It is unclear why the Bureau
lacks procedures for documenting decisions concerning how it selected
for testing methods to increase response. Documenting decisions about
methods that were not selected for testing would help the Bureau more
effectively build capacity and institutional knowledge about changes in
methods to consider in the future.
According to Bureau officials, they consider the experience of other
survey organizations when identifying methods for increasing response
rate. For example, they said that they attend research conferences to
learn about experiences of other organizations that conduct national
surveys to identify potential methods to increase response rate. Both
Bureau officials and some survey experts noted that methods used to
increase response in other surveys may only be indirectly applicable to
the decennial census. For example, for the Economic Census, officials
said that the Bureau sends nonresponding businesses multiple reminder
letters; the final letter to large businesses informs them that the
Department of Justice will be pursuing nonrespondents.[Footnote 20]
Bureau officials said that these methods are less feasible for the
decennial census to implement because of the shorter time frame for
obtaining responses and concerns about being respondent-friendly to
households. In addition, one survey expert noted that methods
applicable to small-scale surveys, such as personalizing a cover
letter, may be less feasible to implement for the decennial census.
Further, survey experts we interviewed generally said that they were
unaware of additional methods from censuses undertaken by other
countries or private sector surveys that the Bureau could consider to
increase mail response. Some experts noted differences between how the
United States and other countries conduct their censuses, which may
make it difficult to directly transfer other countries' practices to
the U.S. census. For example, some European censuses use a population
register to collect names, unlike the United States, which builds its
survey frame from phone and address lists. In addition, past research
has provided evidence that government-sponsored self-administered
surveys, such as the decennial census, tend to achieve higher response
rates than nongovernmental surveys.
Nonetheless, testing modifications to methods that the Bureau has
previously considered or tested in earlier studies may yield additional
opportunities for increasing response. For example, the Bureau could
test various telephone reminder messages stating that response is
mandatory by law or providing instructions for obtaining a new census
form. Although Bureau officials said that they have previously used the
American Community Survey to inform the census design, analyzing
respondent behavior in the American Community Survey, which uses a
mailing strategy similar to the decennial census's, could help the
Bureau regularly refine its survey methodology for increasing census
response.
Although the survey forms are different, many concepts, such as
targeting the second mailing, modifying the appearance of the mailing,
and varying telephone and Internet messages to prompt nonrespondents,
could be tested with reasonable inference to the census.
Conclusions:
Nonresponse follow-up is the largest field operation, and the Bureau
estimates that it will cost more than $2 billion. To control the cost of
nonresponse follow-up, it will be important for the Bureau to devise a
strategy for getting people to return their census forms. A reliable
response rate estimate is a critical element necessary for determining
the resources needed to carry out nonresponse follow-up. The Bureau did
not have support for one of the components of the 2010 response rate
estimate--a general decline in responsiveness--and the Bureau does not
have procedures for reviewing the estimate after testing to determine
whether it should be revised. Establishing procedures for developing
the response rate estimate, including documenting data sources and
decisions, would enable the Bureau and others to better assess the
estimate's reliability. Also, establishing procedures for when, such as
after tests or other triggering events, and how to reevaluate the
estimate would help the Bureau ensure that it is providing the most
current response rate estimate for planning nonresponse follow-up and
other activities, such as questionnaire printing. The Bureau's strategy
of estimating the response rate conservatively may be prudent given
past difficulties with conducting nonresponse follow-up after
overestimating the response rate in 1990. Nonetheless, establishing and
following procedures for developing, documenting, and reevaluating the
estimate are important steps for understanding differences between the
estimate and the actual response rate for 2010 and for evaluating the
components and underlying assumptions when developing the estimate for
the next census. Successful enumeration depends on early research,
testing, and evaluation of census methods to increase response.
Establishing procedures for selecting and prioritizing the testing of
methods--such as the Internet or reminder telephone call--including
documenting methods considered but not tested, would help the Bureau
demonstrate that it has chosen an optimal research strategy for the
decennial census, more effectively build capacity and institutional
knowledge about changes in methods to consider in the future, and
enable it to more efficiently begin testing for the next census.
Recommendations for Executive Action:
To enhance the credibility of the response rate estimate for determining cost and
planning for future census activities, to inform assumptions underlying
the 2020 response rate estimate, and to improve the planning and
transparency of the Bureau's research and testing, we are recommending
that the Secretary of Commerce direct the Bureau to take the following
three actions:
* Establish procedures for developing the 2020 response rate estimate,
including documenting the data sources supporting the estimate's
components and decisions that are made in establishing the components
and analyzing 2010 data to assess the reasonableness of assumptions
used in applying the national estimate's components to the LCO-type
estimates.
* Establish procedures for reevaluating and updating the 2020 estimate,
including identifying events or changes in related operations that
should trigger a review and documenting the results of such reviews.
* Establish procedures for selecting methods for increasing response
rate that will be the subject of research and testing, including
requirements for documenting how the Bureau defines and weighs factors
used to select methods and documentation on methods considered but not
tested.
Agency Comments and Our Evaluation:
The Secretary of Commerce provided written comments on a draft of this
report on September 22, 2008. The comments are reprinted in appendix
III. Commerce generally agreed with our conclusions and recommendations
and stated that the Bureau is committed to developing and implementing
a documented, systematic methodology for establishing response rate
estimates for future censuses and reevaluating the estimates throughout
the decade. Because the recommendations in our draft report focused on
costs and planning for the 2010 Census, we revised our recommendations
to reflect actions to be taken to support future census planning,
including analyzing 2010 data to assess the reasonableness of
assumptions used in applying the national estimate's components to the
LCO-type estimates.
Commerce also provided technical corrections, which we incorporated as
appropriate. In its comments, Commerce disagreed with our statement
that the Internet response option increased response in the Census 2000
experiment. Commerce cited a summary statement from a Bureau report
that concluded that the use of multiple response modes tested in 2000
does not increase response. However, Bureau analyses from October 25,
2002, and March 2004 on the Census 2000 experiment stated that the
overall response rate increased when the Internet was offered as an
alternative response mode. We therefore made no changes in the report.
Commerce strongly disagreed with our statement about the reason why the
Bureau decided not to include the Internet option in the design of the
2010 Census. In the report, we state that underestimating contract
costs was the primary reason the Bureau eliminated the Internet option,
which we attribute to a July 19, 2006, memo documenting the Bureau's
rationale for eliminating the Internet response option from the
Decennial Response Integration System (DRIS) contract. This memo states
that the decision "was due largely to the fact that the Census Bureau
underestimated the FY 2006-2008 contractor costs proposed to develop
DRIS." We therefore left this statement unchanged. However, we added
the Bureau's explanation provided in its agency comment letter that the
decision was based on test results, which showed that offering this
option did not increase overall response and would not offer any cost
savings.
As agreed with your offices, unless you publicly announce the contents
of this report earlier, we plan no further distribution until 30 days
after its issue date. At that time, we will send copies of this report
to the Secretary of Commerce, the Department of Commerce's Inspector
General, the Director of the U.S. Census Bureau, and interested
congressional committees. We will make copies available to others upon
request. This report will also be available at no charge on GAO's Web
site at [hyperlink, http://www.gao.gov].
If you or your staff have any questions concerning this report, please
contact Mathew J. Scirè at (202) 512-6806 or sciremj@gao.gov or Ronald
S. Fecso at (202) 512-2700 or fecsor@gao.gov. Contact points for our
Offices of Congressional Relations and Public Affairs may be found on
the last page of this report. GAO staff who made major contributions to
this report are listed in appendix IV.
Signed by:
Mathew J. Scirè:
Director, Strategic Issues:
Signed by:
Ronald S. Fecso:
Chief Statistician:
[End of section]
Appendix I: Objectives, Scope, and Methodology:
The objectives of this report were to (1) assess how the U.S. Census
Bureau (Bureau) develops, supports, and updates the response rate
estimate, and the extent to which the Bureau uses the response rate
estimate to inform its 2010 planning efforts; (2) describe the methods
the Bureau considered for increasing response rates in 2010 and what it
did to test these methods; and (3) assess how the Bureau identifies and
selects for testing methods to increase response rates, including
considering other surveys' methods for increasing response. The scope
of our review was limited to the census mailout/mailback universe,
which covered more than 80 percent of households in Census 2000. The
majority of households in this category have standard city-style
addresses; they both receive their census questionnaires in the mail
and are expected to return them by mail. We excluded from our review
operations and methods aimed at
enumerating those not included in the mailout/mailback universe, such
as the Be Counted initiative and Enumeration at Transitory Locations;
those initiatives related to, but not primarily focused on, increasing
self-response, such as improving the quality of the Master Address File
and providing assistance through Questionnaire Assistance Centers;
methods tested prior to 2000 and already implemented in previous
censuses; and those primarily intended to improve operational
efficiency, such as postal tracking.
To determine how the Bureau develops and uses the response rate
estimate, we reviewed documentation to support the components of the
response rate estimate and research literature on efforts to estimate
survey response rate. We interviewed Bureau officials in the Decennial
Management Division about the process used to develop and update the
estimate, the assumptions on which the estimate is based, and how the
estimate is used. We also interviewed experts on survey methodology
from Statistics Canada, the U.S. Department of Labor's Bureau of Labor
Statistics, and various academic institutions, as well as former Census
Bureau officials and survey methodologists to obtain their views on the
strengths and weaknesses of the Bureau's process for developing and
updating the estimate. We compared responses to a common set of
questions we asked the experts in order to identify themes. We also
reviewed Bureau strategic planning documents--in particular, the 2010
Census life cycle cost model with supporting documentation, as well as
the 2010 Census Integrated Program Plan and the 2010 Census Operations
and Systems Plan--to understand the Bureau's use of the estimate, and
evaluated the Bureau's practices for using the response rate estimate
for generating the life cycle cost estimate against best practices
criteria in our Cost Assessment Guide and other relevant GAO products.
To describe the methods the Bureau considered for increasing response
rates in 2010 and how the Bureau tested these methods, we reviewed
Bureau analyses and evaluations of tests on methods to increase mail
response and various Bureau planning documents, such as the 2010 Census
Operations and Systems Plan. In addition, we reviewed documentation on
the Bureau's communications campaign, such as evaluations of the 2000
Partnership and Marketing Program and the 2010 Census Integrated
Communications Campaign Plan. We interviewed Bureau officials in the
Decennial Management Division to obtain additional context about the
methods they considered and tests they conducted.
To assess how the Bureau identifies and selects methods to test for
increasing response rates, we reviewed various Bureau planning
documents, such as research and development planning group action
plans, for factors that the Bureau considers in selecting methods to
test. We interviewed Bureau officials in the Decennial Management
Division and other divisions about the process for identifying and
selecting methods for increasing response. We also reviewed our past
work on lessons learned from Census 2000 on the importance of
documentation and planning in order to evaluate the Bureau's process
for selecting and prioritizing methods to test. We interviewed experts
on survey methodology from Statistics Canada, the U.S. Department of
Labor's Bureau of Labor Statistics, and various academic institutions,
as well as former Census Bureau officials and researchers, about
additional methods for increasing response that the Bureau could
consider and to obtain their perspectives on the Bureau's process for
identifying, testing, and implementing methods. We compared responses
to a common set of questions we asked the experts in order to identify
recurring themes. We also reviewed research literature on methodologies
to increase response to mail surveys to identify additional methods
that the Bureau could consider.
We interviewed the following experts in survey methodology:
* Anil Arora, Director General, Census Program Branch, Statistics
Canada:
* Paul Biemer, Distinguished Fellow in statistics, Research Triangle
Institute:
* Don A. Dillman, Regents Professor, Thomas S. Foley Distinguished
Professor of Government and Public Policy, and Deputy Director for
Research and Development in the Social and Economic Sciences Research
Center, Washington State University, and former Senior Survey
Methodologist, U.S. Census Bureau:
* Elizabeth Martin, former Senior Survey Methodologist, U.S. Census
Bureau:
* Colm O'Muircheartaigh, Vice President for Statistics and Methodology,
National Opinion Research Center and Professor, Irving B. Harris
Graduate School of Public Policy Studies, University of Chicago:
* Kenneth Prewitt, Carnegie Professor of Public Affairs, School of
International and Public Affairs, Columbia University and former
Director, U.S. Census Bureau:
* Clyde Tucker, Senior Survey Methodologist, Bureau of Labor
Statistics:
We conducted our review from July 2007 through September 2008 in
accordance with generally accepted government auditing standards. Those
standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our
findings and conclusions based on our audit objectives. We believe that
the evidence obtained provides a reasonable basis for our findings and
conclusions based on our audit objectives.
[End of section]
Appendix II: Testing of Methods to Increase Response to 2010 Census:
Replacement mailing. The Bureau plans to include the replacement
mailing in the 2010 Census design. The replacement mailing involves
sending a new questionnaire to households after households have
received the initial census questionnaire in order to increase the
likelihood that households will respond. The Bureau tested a targeted
replacement mailing--where households that have not returned their
initial questionnaires by a cutoff date receive the replacement form--
in the early 1990s and found that this method resulted in a 10 to 12
percent increase in the mail response rate. A blanket replacement
mailing--where all households received a replacement questionnaire,
including those that had already responded--was planned for Census
2000, but the Bureau dropped it from the design because of operational
concerns that became apparent in the 1998 dress rehearsal. A targeted
replacement mailing was incorporated into the 2010 design, based on the
test results from the 1990s and the Bureau's plans to move to a short-
form-only census. In 2003, the Bureau tested the targeted replacement
mailing and was able to confirm the results of the tests from the early
1990s. Subsequent testing of the replacement mailing focused on
ensuring that the Bureau could implement it successfully, including
processing and removing the returns from the nonresponse follow-up
workload using the handheld computers.[Footnote 21]
The decision to eliminate the handheld computers from nonresponse
follow-up and make nonresponse follow-up a paper-based operation
diminished the Bureau's ability to remove late mail returns (including
replacement questionnaires that are returned) from the nonresponse
follow-up workload. The Bureau then examined options for obtaining a
better effect from the replacement mailing and decided to send the
replacement mailings out more quickly. To do this, the Bureau
decided that it would send a blanket replacement mailing to census
tracts identified as low response areas, based on Census 2000 response
data and updated demographic data from the American Community Survey.
The replacement mailing packets for these tracts can be printed and
labeled in advance--approximately 25 million total. Households not in
these areas will receive a replacement questionnaire on a targeted
basis--that is, only if they are in census tracts identified as medium
response areas and if their initial questionnaires have not been
received by the cutoff date. The Bureau estimates that it can print and
label 15 million of these targeted replacement mailing packages in 4
days. Under this schedule, the Bureau will finish mailing out the
replacement questionnaires on April 8, whereas under the original plan
the replacement questionnaire mail out would not be completed until
April 19. As a result, more households should be able to return the
replacement questionnaires in time to be removed from the nonresponse
follow-up workload.
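The blanket-versus-targeted rule described above can be sketched as a simple decision function. This is a hypothetical illustration only: the tract categories ("low," "medium," "high") and the cutoff date are assumptions for clarity, while the Bureau's actual criteria rest on Census 2000 response data and American Community Survey demographics.

```python
# Hypothetical sketch of the 2010 replacement-mailing rule; categories
# and the cutoff date are illustrative assumptions, not Bureau values.
from datetime import date

CUTOFF = date(2010, 4, 1)  # illustrative cutoff, not the Bureau's actual date

def gets_replacement(tract_response_level, returned_on):
    """Decide whether a household receives a replacement questionnaire.

    tract_response_level: "low", "medium", or "high" expected response.
    returned_on: date the initial form was received, or None.
    """
    if tract_response_level == "low":
        # Blanket mailing: every household in a low-response tract gets
        # one, even if it has already responded (packets are pre-printed).
        return True
    if tract_response_level == "medium":
        # Targeted mailing: only households whose initial questionnaire
        # has not been received by the cutoff date.
        return returned_on is None or returned_on > CUTOFF
    # High-response tracts receive no replacement questionnaire.
    return False
```

Pre-printing the blanket packets is what lets the Bureau finish mailing earlier; only the targeted packets must wait for cutoff-date response data.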
Two-column bilingual form. The Bureau plans to use a two-column
bilingual form in certain locations in 2010. The two-column bilingual
form provides two response columns, one in English and one in Spanish.
Each column contains the same questions and response options, and
respondents are instructed to choose the language that is most
comfortable for them. The Bureau tested the bilingual form in 2005 to
determine whether it would (1) increase overall response to the census
and (2) lower item nonresponse (when a household returns the form but
does not respond to a particular question) when compared to the
standard English form. In the 2005 test, the two-column bilingual form
panel demonstrated a significant positive effect on mail response rates
of 2.2 percentage points overall, and a 3.2 percentage point increase
in areas with high concentrations of Hispanic and non-White
populations, but did not achieve lower rates of item nonresponse. The
Bureau conducted another test of the two-column bilingual form in 2007,
which produced response rate results similar to the 2005 results. In
2010, the Bureau plans to mail the two-column bilingual form to
households in communities with heavy concentrations of Spanish speakers
and areas with low English proficiency.
Telephone reminder call. The Bureau does not plan to use a telephone
reminder call in 2010. The multiple contact strategy that the Bureau
used in 2000 included a reminder postcard mailed 1 week after the
initial questionnaire packets were mailed. As part of the 2003 National
Census Test, the Bureau tested the effect of an automated telephone
reminder call in place of a reminder postcard on increasing the mail
response rate. Out of a sample of 20,000 households, the Bureau was
able to obtain phone numbers for 6,208 (31 percent), and these
households received telephone calls to remind them to return their
questionnaires if they had not already done so. The initial results
indicated a significantly higher cooperation rate for the telephone
reminder call panel when compared to the control panel, which received
reminder postcards.
The Bureau later conducted a supplementary analysis to determine if the
higher cooperation rate was related to an underlying higher propensity
to cooperate among households with listed telephone numbers. This
analysis compared cooperation rates of two groups of households: those
for which telephone numbers were available and those for which
telephone numbers were not available. Among households for which a
telephone number was available, the Bureau observed no significant
difference in cooperation rates for housing units that received a
reminder postcard compared to those that received a reminder telephone
call. The Bureau does not plan to use a telephone reminder call in
place of a reminder postcard in the 2010 Census.
Due date on initial questionnaire mailing package. The Bureau will not
place a due date on the initial questionnaire mailing package as part
of the 2010 Census design. The value of placing a due date on the
mailing package with the initial questionnaire is that it might evoke a
sense of urgency or importance in respondents, leading to increased
response to the census. In 2003, when the Bureau tested this method, it
found that the inclusion of a due date did not significantly affect the
overall response rate. However, it did result in a significantly higher
response rate at an earlier point in the test, which can decrease the
size of the replacement questionnaire mailing.
The Bureau tested a due date again in 2006, this time combined with a
"compressed schedule," in which questionnaires were mailed 14 days (as
opposed to 21 days, as in the control group) before Census Day. A due
date for mailing the questionnaire back was given in the advance
letter, on the outgoing envelope and cover letter for the questionnaire
mailing, and on the reminder postcard. The final report of the test
concluded that the inclusion of the due date and compressed schedule
resulted in a significantly higher mail response rate, by 2.0
percentage points. In addition, several measures of data completeness
and coverage showed significant improvement. Bureau officials decided
not to include the use of a due date in the 2010 Census, noting that
they would like to conduct further testing to better understand the
effects of a due date versus a compressed schedule. The Bureau is
considering whether to test both the use of a due date and a compressed
schedule in its 2010 Census Program for Evaluations and Experiments.
Messaging to distinguish the replacement from the initial
questionnaire. The Bureau does not plan to use a message to distinguish
the replacement questionnaire from the initial questionnaire in 2010.
The inclusion of a message on the replacement questionnaire informing
respondents that they did not need to return the replacement
questionnaire if they had already provided a response from the previous
mailing was thought to be a way to reduce the number of multiple
returns, and possibly improve overall response. The Bureau tested this
method in 2005, and while the results showed a significant decline in
the submission of multiple returns, they also showed a significant
decrease in the response rate of 1.2 percentage points. Including a
message to distinguish the replacement from the initial questionnaire
was not recommended for, and will not be included in, the 2010 Census.
Methods to Increase Self-Response Using Electronic Data Collection
Systems:
Interactive voice response. The Bureau does not plan to use interactive
voice response in 2010. Interactive voice response allows respondents
to use the telephone to respond verbally to digitized voice files that
contain census questions and instructions. Speech recognition software
is used to determine and record responses. Interactive voice response
was tested in 2000, with households given the choice of providing their
census data via interactive voice response or a paper
questionnaire.[Footnote 22] The response rate for the interactive voice
response panel was not statistically different from that of the control
(mail only) group. However, the results are difficult to interpret
because a portion of the sample in the interactive voice response panel
either received the census form late or did not receive it at all.
In 2003, the Bureau tested two different strategies for getting
respondents to provide their census data using interactive voice
response. One strategy, known as a push strategy, did not include a
paper questionnaire in the initial mailing but rather pushed
respondents to reply using interactive voice response. The other
strategy, known as a choice strategy, included an initial paper
questionnaire with the interactive voice response information on it,
allowing respondents to choose their response mode. All nonrespondents
received a paper replacement questionnaire. Households that had a
choice of responding via paper or interactive voice response had
overall response rates similar to households that only received paper,
with about 7 percent of respondents given the choice using interactive
voice response. Households in the push strategy had a significantly
lower response rate--4.9 percentage points lower than households that
only received a paper questionnaire. The Bureau does not plan to give
respondents the option of providing their census data using interactive
voice response in 2010.
Letter encouraging Internet response or returning initial questionnaire
instead of sending replacement questionnaire. The Bureau will not send
a letter encouraging Internet response or returning the initial
questionnaire instead of sending a replacement questionnaire in 2010.
Instead of mailing a replacement paper questionnaire to nonrespondents,
a letter could be sent to households encouraging them either to
complete the paper questionnaire that they previously received or to
use the Internet to submit their responses. The Bureau tested this
method in 2005, sending a replacement mailing that contained a letter
with an Internet address and an identification number needed to access
the Internet questionnaire in place of a paper replacement
questionnaire. Compared to sending out a paper replacement
questionnaire, this method resulted in significantly fewer responses
overall, decreasing the response rate by 3.7 percentage points. The
Bureau will not include a letter encouraging the use of the Internet
instead of a replacement questionnaire in 2010, in part because the
Internet option was dropped from the 2010 Census design.
Computer-assisted telephone interviewing. The Bureau will not use
computer-assisted telephone interviewing for increasing response in
2010. Computer-assisted telephone interviewing allows respondents to
use the telephone to connect with operators, who conduct interviews and
record responses electronically. In a 2000 experiment, households were
given the option of returning a paper form or providing their census
data via computer-assisted telephone interviewing.[Footnote 23] When
compared to households that were only given paper questionnaires,
computer-assisted telephone interviewing brought about a 2.06
percentage point improvement in the overall response rate and also had
a low item nonresponse rate. However, it entailed substantial costs for
hardware, software, and programmer and interviewer time. The Bureau has
not tested computer-assisted telephone interviewing since 2000 and does
not plan to use this option in 2010.
Internet. The Internet response option allows respondents to use an
Internet-based questionnaire--with screens designed to resemble the
paper questionnaire--to respond to the census. Respondents answer
multiple-choice questions by clicking the appropriate buttons and
checkboxes and text-entry questions by typing their answers into
response fields. The Bureau gave respondents the option to respond by
Internet in a 2000 experiment. Some households in the test were given
the choice of providing their census data via Internet or a paper
questionnaire.[Footnote 24] The experiment found that the Internet
response option resulted in a 2.46 percentage point increase in
response.
In 2003, the Bureau again tested allowing respondents the option of
providing their answers via the Internet by sending a paper
questionnaire along with instructions for responding via the Internet
in the initial mailing.[Footnote 25] Households that had a choice to
respond by paper or the Internet had a similar overall response rate to
households that were provided only paper, with about 10 percent of
respondents choosing to respond by Internet.
The Bureau decided not to include the Internet in the 2010 Census,
despite including it in the scope of the contract awarded in 2005 for
the Decennial Response Integration System. The Bureau noted that this
decision was based on a number of factors, including the Bureau's
underestimation of the contractor costs for the first 3 years of the
contract, as well as test results that indicated that the Internet
would not increase the response rate and concerns about the security of
respondents' data prior to the Bureau receiving it.
Communications campaign. Census 2000 included a greatly expanded
outreach and promotion campaign--including, for the first time, paid
advertising--in an attempt to increase public awareness of and promote
positive attitudes about the census. This program, called the
Partnership and Marketing Program, was considered a success after the
Bureau reversed the trend of declining mail response rates, and the
Bureau made plans to continue the program for 2010. Bureau officials
stated that the 2010 campaign consolidates all census communications
under a single communications contract. This Integrated Communications
Campaign aims to motivate the entire populations of the 50 states, the
District of Columbia, Puerto Rico, and other U.S. territories to
participate in the census through partnerships (with community groups,
businesses, colleges, faith-based organizations, and other targeted
groups); public relations; events; Census in Schools; and paid
advertisements in broadcast, print, and online media.
Bureau officials noted that the advertising campaign would not be
included in the dress rehearsal because the Bureau's experience
including the advertising campaign in the 1998 Dress Rehearsal did not
provide the feedback needed to revise the creative aspects of the
campaign. To refine the communications campaign, the Bureau conducted
an audience segmentation analysis to identify how to best reach people
with the paid advertising campaign. In addition, the Bureau conducted
focus groups in 2006 and 2007 to provide information on what motivates
individuals to respond.
The communications campaign is scheduled to run from mid-2008 through
June 2010. In 2008 and 2009, most of the activities are focused on
preparing and mobilizing partnerships. Further development of specific
messages for various audiences and communication channels will take
place from November 2008 through April 2009. Starting in mid-2009, the
partnership program will begin outreach to certain hard-to-count
populations. Activities and events targeting all audiences will begin
in January 2010 and peak in March 2010.
[End of section]
Appendix III: Comments from the Department of Commerce:
The Secretary Of Commerce:
Washington, D.C. 20230:
September 19, 2008:
Mr. Mathew J. Scire:
Director:
Strategic Issues:
United States Government Accountability Office:
Washington, DC 20548:
Dear Mr. Scire:
I enclose the U.S. Department of Commerce's comments in response to
recommendations contained in the United States Government
Accountability Office's Draft Report, 2010 Census: Census Bureau Needs
Procedures for Estimating the Response Rate and Selecting for Testing
Methods to Increase Response Rate (GAO-08-1012).
Sincerely,
Signed by:
Carlos M. Gutierrez:
Enclosure:
Comments in Response to the United States Government Accountability
Office Draft Report Entitled:
2010 Census: Census Bureau Needs Procedures for Estimating the Response
Rate and Selecting for Testing Methods to Increase Response Rate:
September 30, 2008:
General Comments:
The Census Bureau appreciates the opportunity to review the GAO's draft
report on this subject. We are in agreement with GAO's central
finding—that the 2010 Census response rate estimate is not fully
supported by a documented, systematic methodology, and that it has not
been reevaluated since it was first established in 2001. While this is
unfortunate, we believe that at this late date we cannot perform
additional tests or substantive research that would allow us to
establish a revised estimate. While it is possible that another review
of available information might lead us to revise the projected response
rate, such a revision would involve interpretation; it would not be
based on significant test data. Moreover, we feel strongly that despite
declining response rates for censuses and surveys throughout the
decade, the decennial census environment is unique. For example, all
people, households, and governments are involved, there is extensive
outreach, partnership, and promotion, and it is a mandatory activity
required by the U.S. Constitution. Accordingly, we remain persuaded
that the response rate of 64% arrived at through examination of Census
2000 data is viable for planning purposes.
We recognize that the actual response rate in 2010 may be different
than we project. From our perspective, the major risk comes from
overestimating the rate. If the actual rate is significantly lower than
we project (as happened in 1990), we will be faced with some major
challenges that will be difficult to overcome in the middle of census
operations. We have very little leeway in our schedule at that point;
the dates for producing apportionment and redistricting data are fixed
by law. Thus, we would need to utilize contingency funding—or if
necessary, request additional funding from Congress—in order to
quickly hire and train more field staff so as to stay on schedule.
The Census Bureau is committed to developing and implementing a
documented, systematic methodology for establishing response rate
estimates for future decennial censuses, beginning with an analysis of
2010 data in preparation for the 2020 Census. We also commit to
reevaluating our estimates throughout the decade.
Our specific comments on the report follow.
Detailed Comments:
On page 3, toward the middle of the page, the report states that we
have not reevaluated the response rate change to 64% (which we made
after the decision to drop use of handheld computers for nonresponse
follow-up). Last month we briefed auditors on some of the changes made
to the response rate estimate, and our preliminary revisions to the use
of replacement mailings. At that time, we also shared with them the
Census 2000 data and our analysis of response patterns that led to our
decision to lower the estimated rate to 64%.
At the top of page 5, the report states that our test results of
Internet response "did not always increase the response rate." In our
tests in Census 2000 and during this decade, the option never increased
the overall response rate, so we suggest the sentence be reworded to
accurately reflect our test results.
At the bottom of page 5, the GAO recommends "that the Secretary of
Commerce direct the Bureau to establish and implement procedures for
documenting the process for developing the response rate estimate, as
well as establishing when and how to reevaluate the response rate
estimate." As we state in our summary comments, the Census Bureau is
committed to doing this in preparation for the 2020 Census.
On page 6, the report states: "Households that fail to mail back the
census questionnaires are included in nonresponse follow-up workload,
where enumerators follow up with telephone calls and door-to-door
visits, or solicit census data from knowledgeable people, such as
neighbors." This sentence should be adjusted to accurately capture the
sequence of events that occurs during the census. First, one or more
door-to-door visits is made. If no one is home, or if the respondent is
busy, contact information is left, after which it may be possible to
conduct the enumeration interview over the phone. As a last resort,
when no data has been collected and no time is left to make additional
visits, census data may be solicited from knowledgeable people, such as
neighbors.
On pages 7-8, footnote 6 states: "The Bureau also measures the
percentage of surveys completed and returned for occupied housing units
with deliverable addresses, called the return rate. Although the return
rate is considered the more precise measure of public cooperation with
the census, it is less useful than response rate for measuring
nonresponse follow-up workload." We believe this footnote should be
clarified to note that the mail return rate cannot be calculated until
after nonresponse follow-up (NRFU) has been completed. The mailout
universe includes all the addresses that will turn out to be vacant,
and some that will be deleted as nonexistent or non-residential, during
the NRFU operation.
At the bottom of page 12, the discussion about replacement mailing
effects states that our estimate of the effect was based on research
during the 1990s. Replacement questionnaires were also tested in the
2003 National Census Survey, the 2005 National Census Test, the 2006
Census Test, and in the 2008 Census Dress Rehearsal. Our estimates are
based on these more recent results. Also, we believe we get a larger
impact from mailing replacement questionnaires in a test than in a
census because during a census, we get a higher response rate in the
first place. That is, a higher proportion of households will reply to
the initial mailing during the actual census (as a result of paid
advertising and partnership efforts), so this reduces the potential
number of households that need to respond to the second mailing. Some
of the testing that has occurred this decade deals with the timing of
printing and sending replacement questionnaires compared to the
enumeration activities of NRFU. We were largely successful in removing
very late mail returns from the NRFU workload in an automated
environment. Due to the return to a paper-based NRFU operation, our
revised plans for 2010 involve printing and distributing the
replacement questionnaires earlier in hopes of gaining some of the same
benefit. However, given the late decision to move to a paper-based NRFU
operation, these plans have not been tested.
In the first paragraph on page 16, the last portion needs to be revised
to clarify that the operational plans to implement our revised
replacement mailing approach are still being finalized. In that regard,
the additional analysis that we are conducting does not pertain to fine-
tuning the estimated effects; it pertains to the specifics of how to
operationalize the approach by Type of Enumeration Area. Finally,
because we have never tested these revisions under true decennial
census conditions, we have no basis for precisely estimating how much
response rate improvement they will yield.
At the bottom of page 22, the report states that the Internet response
option increased overall response during the Census 2000 experiment. We
disagree with such a broad conclusion. During Census 2000, we conducted
an experiment to examine both the use of incentives and the use of
other response modes (Internet, interactive voice response, and an
operator telephone interview). While there were variations in response
rates for these different modes depending on whether incentives also
were offered, the overall conclusion was: "The use of alternative
response modes does not increase overall response rates to the census.
Rather, it shifts households who would respond via the paper census to
the other mode." (See Synthesis of Results from the Response Mode and
Incentive Experiment for Census 2000 (RMIE), March 2003.)
On page 23, we strongly disagree with the report's summary of why we
decided not to pursue an Internet response option for the 2010 Census.
On multiple occasions, the Census Bureau has stated that the decision
was made because our tests had shown that offering this option did not
increase overall response rate, and thus would not offer any cost
savings, much less offset the actual costs that would be incurred to
offer the option. While it is true we also have expressed—and continue
to have—concerns about Internet security, that was not the primary
reason we decided not to pursue this for the 2010 Census.
On page 35, with regard to the discussion of the bilingual form, we
wish to clarify that the results for item non-response for the 2005
National Census Test were thought to be attributed to a form design
problem. The form was tested again in a survey in 2007 with changes to
the form design, which corrected to some extent the problems with item
non-response. Also, it should be noted that the determination of when
to use the bilingual form is based not just on whether an area has a
heavy concentration of Spanish speakers, but also on whether an area
has low English proficiency.
On page 36, the second paragraph regarding telephone reminder calls
should say that the initial results indicated a significantly higher
cooperation rate. It is also worth mentioning in this paragraph that
there are costs to obtaining the phone numbers and making the calls,
that there was a low percentage of phone numbers found in 2003, and
that there is increasing use of individual cell phones over household
landlines. In deciding whether to pursue this option, all of these
considerations have to be weighed.
[End of section]
Appendix IV: GAO Contacts and Staff Acknowledgments:
GAO Contacts:
Mathew J. Scirè, (202) 512-6806 or sciremj@gao.gov:
Ronald S. Fecso, (202) 512-2700 or fecsor@gao.gov:
Acknowledgments:
In addition to contacts named above, Lisa Pearson, Assistant Director;
David Bobruff; Don Brown; and Elizabeth Fan made key contributions to
this report. Tom Beall, Susan Etzel, Andrea Levine, Donna Miller, Ellen
Rominger, and Elizabeth Wood provided significant technical support.
[End of section]
Footnotes:
[1] The $13.7 billion to $14.5 billion estimate is expressed by the
Bureau in nominal year dollars.
[2] In this report, we use the term response rate to refer to the
national mail response rate the Bureau uses to estimate nonresponse
follow-up workload, unless otherwise noted.
[3] The Bureau uses the U.S. Postal Service to deliver questionnaires
to housing units with city-style addresses (house number and street
name). However, in areas where housing units do not receive mail at a
city-style address, the Bureau delivers questionnaires through an
update/leave process, in which enumerators deliver a census
questionnaire to each housing unit. The household is asked to complete
and return the questionnaire by mail.
[4] According to the Bureau, these rates reflect mail response received
as of the cutoff dates for determining nonresponse follow-up workload,
and because these rates were computed differently, caution should be
used in comparing the rates.
[5] The Census 2000 response rate includes responses received by
Internet, which other years do not include.
[6] The Bureau also measures the percentage of surveys completed and
returned for occupied housing units with deliverable addresses, called
the return rate, which is calculated after nonresponse follow-up has
been completed. Although the return rate is considered the more precise
measure of public cooperation with the census, it is less useful than
response rate for measuring nonresponse follow-up workload.
[7] The 65 percent rate used as the baseline for the 2010 estimate is
the rate that was achieved for the mailout/mailback universe on April
18, 2000, when the nonresponse follow-up universe was identified.
[8] The Bureau later described the ability to remove all late mail
returns from the nonresponse follow-up workload--both initial
questionnaires and replacement questionnaires--as included in the
replacement mailing effect.
[9] GAO, Cost Assessment Guide: Best Practices for Estimating and
Managing Program Costs: Exposure Draft, GAO-07-1134SP (Washington,
D.C.: July 2007).
[10] The response rates in the American Community Survey analysis are
comparable to the Bureau's definition of mail return rates for the
decennial census in that vacant and nonexistent housing units are
excluded from the denominator in the calculation. The Bureau's mail
response rate, as defined for the decennial census, includes vacant and
nonexistent housing units in the denominator.
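The denominator difference this footnote describes can be shown with a small worked sketch. The counts below are invented solely for illustration and are not Bureau figures:

```python
# Hypothetical mailout universe, for illustration only.
forms_returned = 700         # completed questionnaires mailed back
occupied_deliverable = 900   # occupied housing units with deliverable addresses
vacant_or_nonexistent = 100  # units later found vacant or nonexistent during
                             # nonresponse follow-up (NRFU)

# Mail response rate: vacant and nonexistent units remain in the denominator.
response_rate = forms_returned / (occupied_deliverable + vacant_or_nonexistent)

# Mail return rate: computed after NRFU, once vacant and nonexistent units
# can be excluded from the denominator.
return_rate = forms_returned / occupied_deliverable

print(f"response rate: {response_rate:.1%}")  # 70.0%
print(f"return rate:   {return_rate:.1%}")    # 77.8%
```

Because its denominator is smaller, the return rate is always at least as high as the response rate for the same set of returns, which is why it is considered the more precise measure of public cooperation.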
[11] The Bureau has used some of these data for identifying hard-to-
count populations as part of the planning for the 2010 communications
campaign, but these data are not incorporated into the mail response
rate estimate.
[12] In a blanket replacement mailing, all households in a certain area
would receive a replacement questionnaire, regardless of whether they
had already responded. In a targeted replacement mailing, only
households that have not returned their initial questionnaire by a
cutoff date would receive a replacement questionnaire. The original
plan for the 2010 Census included using a targeted replacement mailing
for nonresponding households in mailout/mailback areas. The revised
plan uses a blanket replacement mailing for some areas, targeted
replacement mailing for other areas, and no replacement mailing for the
remaining areas.
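The distinction between the two mailing strategies in this footnote amounts to a simple selection rule, sketched below with hypothetical addresses and dates (illustrative only, not Bureau data):

```python
from datetime import date

# Hypothetical household records: (address, date the initial questionnaire
# was returned, or None if it has not come back).
households = [
    ("10 Main St", date(2010, 3, 20)),
    ("12 Main St", None),
    ("14 Main St", date(2010, 4, 2)),
]
cutoff = date(2010, 4, 1)

# Blanket replacement mailing: every household receives a second
# questionnaire, regardless of whether it has already responded.
blanket = [addr for addr, _ in households]

# Targeted replacement mailing: only households with no return by the
# cutoff date receive a second questionnaire.
targeted = [addr for addr, returned in households
            if returned is None or returned > cutoff]

print(blanket)   # all three addresses
print(targeted)  # ['12 Main St', '14 Main St']
```

The trade-off implicit in the rule is that targeting requires waiting for check-in data through the cutoff date, while a blanket mailing can be printed and addressed in advance.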
[13] In 2000, Type A LCOs were located in inner-city and urban areas.
Type B offices were located in urban and metropolitan areas. Type C
offices were located in suburban areas, small and medium-size cities,
towns, and some rural areas. Type D offices were located in more rural
areas.
[14] The replacement questionnaire and two-column bilingual
English/Spanish census form are planned to be implemented for the first
time in
the 2010 Census.
[15] Under the compressed mailing schedule, questionnaires were mailed
14 days as opposed to 21 days before Census Day.
[16] National Academy of Sciences, Experimentation and Evaluation Plans
for the 2010 Census: Interim Report (Washington, D.C.: Dec. 7, 2007).
[17] The National Academy of Sciences has convened a panel of experts
to review the Bureau's program of research, evaluation, and
experimentation for the 2008 Dress Rehearsal and the 2010 Census. The
panel will consider priorities for evaluation in the 2010 Census. The
panel will conduct its work over a 3-year period, from September 2006
to September 2009.
[18] In September 2007, the Bureau awarded a contract to DraftFCB,
Inc., a communications agency, to create, produce, and implement an
integrated marketing and communications campaign in support of the 2010
Census. The contract, with an estimated value of around $200 million to
$300 million, was structured as a series of task orders for specific
pieces of the campaign, to be issued over 4 years.
[19] GAO, 2000 Census: Lessons Learned for Planning a More Cost-
Effective 2010 Census, GAO-03-40 (Washington, D.C.: Oct. 31, 2002).
[20] Conducted every 5 years, the Economic Census surveys U.S.
businesses and provides official measures of output for industries and
geographic areas and key source data for the gross domestic product and
other indicators of economic performance.
[21] In addition to using the handheld computers to enumerate
households during nonresponse follow-up, the Bureau planned to use them
for managing enumerators' workloads during nonresponse follow-up. Late
mail returns, including replacement questionnaires, would be removed
automatically from the nonresponse follow-up workload as soon as they
were checked in, so long as the households had not yet been visited by
an enumerator. In this way, the response rate estimate would include
responses received even after the beginning of the nonresponse follow-
up operation.
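The removal rule this footnote describes, under which a checked-in late return drops a case from the workload only if no enumerator has visited yet, can be sketched as a simple filter. The addresses and sets below are hypothetical:

```python
# Hypothetical NRFU workload management, for illustration only.
nrfu_workload = {"101 Elm St", "202 Oak Ave", "303 Pine Rd"}
already_visited = {"303 Pine Rd"}    # an enumerator visit has occurred
late_returns_checked_in = {"101 Elm St", "303 Pine Rd"}

# A checked-in late mail return removes a case from the workload only if
# the household has not yet been visited by an enumerator.
for address in late_returns_checked_in:
    if address not in already_visited:
        nrfu_workload.discard(address)

print(sorted(nrfu_workload))  # ['202 Oak Ave', '303 Pine Rd']
```

In this sketch, the return from 101 Elm St removes that case automatically, while 303 Pine Rd stays in the workload because an enumerator has already visited.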
[22] Another interactive voice response test panel was offered an
incentive--a telephone calling card good for 30 minutes of domestic
calls--for using interactive voice response instead of the paper
questionnaire. The incentive did not increase the overall response rate
compared to that of the mail-only control panel.
[23] Another computer-assisted telephone interviewing test panel was
offered an incentive--a telephone calling card good for 30 minutes of
domestic calls--for using computer-assisted telephone interviewing
instead of the paper questionnaire. The overall response rate for this
incentive panel (71.81 percent) was similar to the overall response
rate for the paper-only control panel (71.44 percent), though more
respondents from the incentive panel chose to use computer-assisted
telephone interviewing than the panel where no incentive was offered.
[24] Another Internet test panel was offered an incentive--a telephone
calling card good for 30 minutes of domestic calls--for using the
Internet instead of the paper questionnaire. The overall response rate
for this incentive panel (71.50 percent) was similar to the overall
response rate for the paper-only control panel (71.44 percent), though
more respondents from the incentive panel chose to use the Internet
than the panel where no incentive was offered.
[25] Another panel received a paper questionnaire along with
instructions for responding via the Internet or interactive voice
response. This panel had a response rate similar to that of the paper-
only panel, with about 7 percent of respondents using the Internet. A
third
panel received instructions for responding via the Internet or
interactive voice response but no initial paper questionnaire. This
panel had a significantly lower response rate--by 5.7 percentage
points--than households provided only the paper questionnaire.
GAO's Mission:
The Government Accountability Office, the audit, evaluation and
investigative arm of Congress, exists to support Congress in meeting
its constitutional responsibilities and to help improve the performance
and accountability of the federal government for the American people.
GAO examines the use of public funds; evaluates federal programs and
policies; and provides analyses, recommendations, and other assistance
to help Congress make informed oversight, policy, and funding
decisions. GAO's commitment to good government is reflected in its core
values of accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each
weekday, GAO posts newly released reports, testimony, and
correspondence on its Web site. To have GAO e-mail you a list of newly
posted products every afternoon, go to [hyperlink, http://www.gao.gov]
and select "E-mail Updates."
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or
more copies mailed to a single address are discounted 25 percent.
Orders should be sent to:
U.S. Government Accountability Office:
441 G Street NW, Room LM:
Washington, D.C. 20548:
To order by Phone:
Voice: (202) 512-6000:
TDD: (202) 512-2537:
Fax: (202) 512-6061:
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]:
E-mail: fraudnet@gao.gov:
Automated answering system: (800) 424-5454 or (202) 512-7470:
Congressional Relations:
Ralph Dawn, Managing Director, dawnr@gao.gov:
(202) 512-4400:
U.S. Government Accountability Office:
441 G Street NW, Room 7125:
Washington, D.C. 20548:
Public Affairs:
Chuck Young, Managing Director, youngc1@gao.gov:
(202) 512-4800:
U.S. Government Accountability Office:
441 G Street NW, Room 7149:
Washington, D.C. 20548: