Will adding a citizenship question reduce census response rates?

I wrote this morning about the Department of Commerce’s decision to ask about people’s citizenship when it takes the 2020 census. One of the main arguments against doing so is the claim that it will cause non-citizens not to participate in the census.

In my view, even if this is true, it’s an insufficient reason to abstain from finding out how many citizens live in the U.S. But will asking the question substantially decrease non-citizen participation in the census?

Steven Camarota of the Center for Immigration Studies argues that it probably won’t. He notes:

[The Census] Bureau has been asking more detailed immigration-related questions for many years on a number of its largest and most important surveys. For example, in addition to asking about citizenship, country of birth, and year of arrival, the CPS also asks each respondent for his or her mother’s and father’s birthplaces.

Surveys like the CPS are the basis — and, in many cases, considered the gold-standard source — for official government estimates on everything from the nation’s unemployment and poverty rates to wages and health insurance coverage. If asking about citizenship significantly reduced data quality by lowering response rates, then a good deal of information published monthly and annually by the federal government, based on these surveys, would already be compromised.

But maybe the election of Donald Trump has changed the game such that non-citizens who have participated in past Census Bureau surveys will abstain in 2020. There is evidence of a “Trump effect,” Camarota acknowledges. It comes from interviews the Census Bureau conducted with focus groups of field representatives and respondents. During these interviews, participants expressed concerns about the confidentiality of information provided to census takers, given “the current political climate.”

However, Camarota questions whether this evidence is persuasive:

First, the bureau is very clear that the interviews and focus groups were conducted only on a small number of people. Second, the bureau states that the focus groups and interviews were “qualitative studies and as such, unrepresentative of the population as a whole, and none of them were specifically designed to examine confidentiality concerns.” So it is not clear what generalizations can be made based on these unrepresentative small samples.

Third, even if respondents have become more reluctant to answer Census Bureau surveys or to fill in a particular question, it is not clear if adding one citizenship question on the Census would make any difference to response rates.

Looking at response-rate data, which seems a more probative way of getting at the question of non-participation, Camarota finds no significant Trump effect. Respondents were about as likely to participate in 2015 as they were in 2016, when Trump burst onto the scene and won the GOP nomination. Simply put, there has been no significant decline in the size of the foreign-born population in the monthly Current Population Survey associated with the rise of Donald Trump.

The burden of proof here should be on those who claim adding the citizenship question will have a significant effect on non-citizen participation in the census. It looks like that burden hasn’t been met, at least not yet. As Camarota concludes:

[R]efusal rates for the ACS, the survey most similar to the decennial Census, have been rising for a long time, but remain very low. Furthermore, while there has been a rise in allocations on the citizenship question, it, too, started long before the Trump presidency.

Of course, more detailed analysis would be necessary to confirm these tentative findings. But I cannot find a sudden decrease in the public’s willingness to take part in Census Bureau surveys that already include a citizenship question.