Much has been written about how media fragmentation has made possible — or perhaps made necessary, it’s not clear which — the niche marketing approach to policy formation. The basic idea is that with the demise of mass media, there’s no longer much point in making the case for a policy based on the general public good. The path to power consists of using modern public opinion techniques to identify groups that could be persuaded to vote for you, crafting policy proposals designed to benefit that specific clientele and using targeted media to communicate those proposals to your audience. The winning party is the one that succeeds in putting together the most pieces of the socio-economic mosaic. (Susan Delacourt’s Shopping for Votes is probably the best recent exposition of this thesis.)


I don’t know whether or not this is actually the case, but there’s reason to think that politicians believe it is. The Conservatives have certainly used their time in power to hone their proficiency in the art of policy micro-targeting. There’s no longer any meaningful distinction between CPC campaign announcements and government budget documents: both consist largely of an endless number of small measures, each addressed to a tightly-focused interest group. The Conservatives’ penchant for boutique tax credits is the mirror image of their predilection for niche spending.


Not all of this is new; modern marketing tools have simply made it easier to identify voter groups along socio-economic dimensions. Targeting groups based on geography has a long and dishonourable history in Canada: few can claim to be genuinely surprised to learn that federal infrastructure spending under the Conservatives has been concentrated in ridings represented by Conservative MPs. (I live in a riding that was represented by the cabinet minister responsible for the Quebec City region during the recession. Even though this part of the country was spared the worst effects of the recession, federal largesse rained down on my neighbourhood like so much confetti. So many roads were dug up during those months that figuring out how to get to work was a daily challenge.)

THE CANADIAN PRESS/Sean Kilpatrick — Prime Minister Stephen Harper and Finance Minister Joe Oliver on their way to the Commons last Tuesday to deliver the 2015 federal budget.

Electoral punditry has also largely bought into this approach to policy analysis. The first question that is asked of any policy is invariably “who benefits?”; the discussion then moves on to whether or not the measure will be effective in its presumed goal of attracting votes. Attempts to figure out whether or not the proposal makes sense in policy terms are generally perfunctory; the real question is whether or not the policy is a vote-winner.

This setup suits interest groups very nicely. Interest groups are by their very nature focused on promoting the interests of a subset of voters with certain narrowly-defined characteristics and/or viewpoints. In the brokerage model of politics, interest group leaders can play a powerful role, using their perceived ability to redirect votes from one party to the other for maximum gain. But again, this isn’t exactly new; political power brokers have been around ever since there was such a thing as political power.

So we have an entire industry based on the idea that pandering works: that people will vote for the party that will do the most to advance their individual private interests. The only real sense of community might be with the small number of people who share your characteristics and benefit from the same niche policy measure.

Happily, things aren’t as bleak as all that; counter-examples abound. For example, support for policies that redistribute income from the well-off to the not-quite-so-well-off is not entirely determined by one’s position in the income distribution. Many with high incomes will cheerfully support parties that advocate higher taxes for the rich, and low-income regions cannot be counted on to support redistributive political agendas. Apparently not all voters are driven by the question “What’s in it for me?” Different people will have different characteristics, and it will always be difficult to distinguish between the private and the public interest. Difficult, but perhaps not impossible.

The philosopher John Rawls once suggested the following thought experiment: imagine that you are about to enter society, but you don’t (yet) know your place in it. You’ll have a roughly equal chance of being a man or a woman, a 1 per cent chance of being in the top 1 per cent of the income distribution, a certain probability of being in a given ethnic group, and so on. Behind this “veil of ignorance,” the question “What’s in it for me?” won’t help you decide if a policy is good or not: you don’t yet know which characteristics you’ll have that would give the question meaning.

Behind the veil of ignorance, the question of offering special treatment — or mistreatment — to different groups is not one of personal gain. Instead, the question is more fundamental: do we want to take our chances in a society where some groups happen to be able to extract favours from the government and others cannot?

Economist Stephen Gordon offers an easily understood analysis of Canada’s federal budget situation and how we went from persistent surpluses to persistent deficits.

Read the whole thing here and don’t be scared off by the charts — you don’t need an economics degree to follow the bouncing revenue.

A few of the intriguing points he makes:

• The seeds of the federal deficit were planted before the events of 2008 and the ensuing recession.

• The government enjoyed a sharp rise in revenue from 2005 to 2007, led by a surge in personal income taxes.

• Although the cut in the GST produced a sharp drop in revenue, it was largely offset by a coincidental rise in income from corporate taxes.

• The surge in spending after 2005 came largely from increased transfer payments (including health care and the universal child benefit), initially led by transfers to persons, more recently by transfers to governments. Gordon emphasizes that these resulted from policy decisions rather than the business cycle or other factors.

• Revenue, which had risen so sharply after 2005, began falling in late 2007, before the meltdown that led to the recession.

I won’t spoil the conclusion. It’s enough to note that growth in spending combined with falling tax revenue makes for a dangerous mix.

This year’s federal budget is going to be an interesting one. A two-year stimulus package was introduced in 2009, which meant that the 2010 budget didn’t have much to say. But now some decisions will have to be made. Should the government extend the fiscal stimulus? Follow the United Kingdom’s example and embark on an austerity program to get rid of the deficit? Do nothing?

Clearly, the direction we want to take depends very much on where we are. GDP growth rates for the last two quarters were disappointingly low, generating alarming stories to the effect that Canada is lagging behind the rest of the G7. But let’s put this in perspective a bit. Here’s how GDP has evolved:

And here’s the graph for employment (both graphs are based on OECD data):

Canada is the only G7 country to have recovered pre-recession levels of output and employment. The story in 2010Q2-2010Q3 isn’t one of Canada lagging behind so much as the other G7 countries catching up.

In the early days of 2009, the case for fiscal stimulus was fairly compelling: output and employment were falling sharply, and the Bank of Canada was about to hit the interest rate lower bound. Things are very much different now, and it’s hard to see how a convincing case for extended fiscal stimulus could be made using available data.

So should we embark on a round of austerity to deal with the deficit? Well, no: the recovery isn’t really complete. Yes, total employment has recovered its pre-recession peak, but important series such as full-time employment, private-sector employment and hours worked have not. (The hours worked series is choppy, so I’ve plotted the 3-month moving average):

Stephen Gordon is a professor of economics at l’Université Laval in Quebec City, Canada and a fellow of the Centre interuniversitaire sur le risque, les politiques économiques et l’emploi. He is co-author of the blog site, Worthwhile Canadian Initiative.

A couple of weeks ago, the NDP suggested removing the GST from heating bills, and I bemoaned the idea as just another example of a policy designed to fit a communications strategy instead of the other way around. I was hoping that the proposal would do the decent thing and go away quietly, but the NDP has apparently given it a budget generous enough to air some ads. If you judge a policy by the effectiveness of its communications strategy, the initial reviews are positive.

The rest of this post is for the dwindling band of citizens who adhere to the quaint notion that politics should be about policies.

Much of what follows is based on points raised and developed by UBC’s Kevin Milligan here on WCI, on Twitter and elsewhere on the internet. He characterizes the proposal as a misdiagnosis: the NDP is prescribing a price solution for an income problem. The notion of affordability only makes sense as an income problem: no one seems to be worried about the ability of high earners to pay their heating bills.

When you see the problem as one of incomes, the remedy is clear: give money to low-income households. Happily, there is already a mechanism in place for alleviating income problems: the GST/HST rebates. If, in the view of the NDP, these payments are too small, then the solution is to make them more generous.

If heating costs were incurred mainly by low-income households, then the gains would be concentrated mainly among this group. But that’s not the case. As Kevin notes, expenditures on heating increase with income:

Since expenditures on home heating increase with income, most of the revenues sacrificed by the NDP tax cut will go to those with higher incomes.
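To see the arithmetic behind this point, here is a minimal sketch. The heating-expenditure figures by income quintile are entirely hypothetical placeholders, not survey data; only the upward slope matters:

```python
# Hypothetical annual heating expenditures by income quintile, in dollars.
# These figures are illustrative only; the point is that spending rises
# with income, so a flat tax exemption returns more dollars to the rich.
GST_RATE = 0.05

heating_by_quintile = {
    "Q1 (lowest)": 900,
    "Q2": 1100,
    "Q3": 1300,
    "Q4": 1500,
    "Q5 (highest)": 1900,
}

# Dollar value of removing the GST, for each quintile.
savings = {q: GST_RATE * s for q, s in heating_by_quintile.items()}
for quintile, saving in savings.items():
    print(f"{quintile}: saves ${saving:.2f}")

# Share of the forgone revenue captured by the top two quintiles.
total = sum(savings.values())
top_share = (savings["Q4"] + savings["Q5 (highest)"]) / total
print(f"Top two quintiles capture {top_share:.0%} of the tax cut")
```

Even with these made-up numbers, more than half the forgone revenue flows to the upper half of the distribution, which is the regressivity Milligan is pointing at.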


A shot in the dark. A missing person. A dead body. A fingerprint. An unsolved mystery.
The 2010 International Festival of Authors takes place from October 20-30 and, this year, the festival is celebrating all things noir: crime fiction, thrillers, mysteries. Throughout the festival, the National Post will introduce readers to these writers and their work.
Meet: <a href="http://www.readings.org/?q=biographies/r_j_ellory" target="_blank">R.J. Ellory</a>, author of <em>Saints of New York</em>.
<strong>Have you been to IFOA before? If so, what's your favourite memory? If not, what are you most looking forward to?</strong>
Have never been before, and the thing I am most looking forward to is the attitude I have found amongst the Canadians that is so different from the UK, and so very different from the US! In my brief experience, the Canadians I have met and talked with about fiction in general have far less preconceptions and fixed ideas. A good book is a good book, and the rigidity of genre that you find in the UK just doesn't seem to be there. Perhaps this has a little to do with the French historical connection, as in France you find the same refreshing lack of complexity about fiction as a whole. Also, I have a great interest in seeing as much of Canada as possible. I have seen some of Montreal and Quebec, and I want to see Toronto, Ottawa, Winnipeg and anywhere else I can get to in the time that I have!
<strong>What are the essential ingredients in any good thriller/mystery novel or short-story?</strong>
Honestly, I believe the real secret of a great story is emotional engagement. Someone once said to me that there were two types of novels. There were those that you read simply because some mystery was created and you had to find out what happened. The second kind of novel was one where you read the book simply for the language itself, the way the author used words, the atmosphere and description. The truly great books are the ones that accomplish both. I think a classic is the kind of book that presents you with a narrative so compelling you can’t read it fast enough, and yet is written so beautifully you can’t read it slowly enough. For me, the all-important thing about a book is the emotion it evokes and the degree of engagement I feel with the characters. When I finish a book I want to care enough about the people that I feel I'm leaving friends behind. And as far as thrillers are concerned, I think tension is based wholly on how much you want to find out what happens to the characters, and if you don’t care about them, well you won’t care what happens to them.
<strong>Agatha Christie is said to have said "The best time to plan a book is while you're doing the dishes." When and where do you plan your books?</strong>
All the while. Everywhere I go. Always thinking. Always plotting! I get a lot of ideas for books, and they can come out of nowhere. For me, the real test is whether I am still thinking about that idea a month later. If I am, then that idea is probably strong enough to carry a book.
<strong>What inspires your writing?</strong>
People. Living life. Human relationships. Emotions. Music. Other peoples' books. Great movies. I think just living life, doing as much as I can, getting involved in as many things as I can.
<strong>Who is your favourite fictional detective?</strong>
Without question, it has to be Sherlock Holmes. I know that’s a cliché, but I'm English, and he's so much a part of our historical and literary heritage. I read Conan Doyle voraciously as a kid, and I still go back and read those amazing stories every once in a while.
<strong>Sidekicks: Yay or nay?</strong>
Nay.
<strong>Suppose your dog or cat goes missing. Which fictional detective do you want leading the search?</strong>
I don’t want to say Holmes again (though he would no doubt find it), so perhaps Auguste Dupin, from <em>Murders in the Rue Morgue</em>, <em>The Mystery of Marie Roget</em> and <em>The Purloined Letter</em>. That has a great deal to do with my huge affection for the work of Poe, by the way! Dupin was also a forerunner of both Holmes and Poirot, and the style of the storytelling is very much the foundation of those Conan Doyle/Christie mysteries where all is revealed at the end.
<strong>What makes a good villain? </strong>
Humanity. Having enough recognisable humanity that the reader can still identify with the villain, even though they are evil.
<strong>What's the best way to kill-off a character? </strong>
High drama. Like Holmes and Moriarty at the Reichenbach Falls! You know, there is even a memorial plaque at those falls to commemorate the fact that Holmes died there...and Holmes was a fictional character!
<strong>Who is your favourite crime/mystery/thriller writer of all-time. Why?</strong>
Truman Capote. I know <em>In Cold Blood</em> is 'non-fiction', but it was written in the style of a novel, and I think it is a work of exceptional genius and extraordinary brilliance. <em>In Cold Blood</em> made Capote one of the most respected and influential authors in American literary history, and yet he spent the subsequent twenty years drinking himself to death and never really published another word. I am of the belief that non-fiction possesses as its primary purpose the conveyance of information, whereas fiction is there not to entertain (as we are so often told), but to evoke an emotion. The books that continue to stay with me, regardless of how long ago I read them, are those that somehow connected and impinged on an emotional level. This is most definitely one of them, and one that stands head and shoulders above the rest.
<strong>If you could be one of your characters for one day, who would it be? </strong>
Oh, what a question! The people I wrote about are always in such a mess! Okay, so one character...perhaps Ray Hartmann from <em>A Quiet Vendetta</em> as he listens to Ernesto Perez recount his five decades of Mafia experience.
<strong>Is there such a thing as the perfect crime?</strong>
Yes, there is. It is the crime that is never discovered. My most recent book is called <em>Saints of New York</em>. The genesis of that book came out of a conversation I had with a Virginia State Homicide detective. She told me that there were 850,000 missing persons reports in the US every year, and that about 96% were solved. What happened to the rest? Those, surely, are perfect crimes, wouldn’t you say? The ones that are never even investigated.

I did a round of interviews for CBC radio on Tuesday after the federal government’s fiscal update was released, and it was often remarked that the 2009-10 deficit of $55.6b was the largest ever. I suppose I should have said that it sounds even larger if you say it was 5.56 trillion cents.

These numbers aren’t really informative without context, so here’s some. The data are taken from the Department of Finance’s Fiscal Reference Tables, also updated on Tuesday:

First up is the federal budget balance, scaled as a share of GDP and in per-capita 2010 dollars:

The 2009-10 number is bad, but we’ve seen worse. Indeed, we had a string of 18 years, starting in 1977, in which the deficit was a larger share of GDP than it was last year.
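For readers who want to reproduce this kind of scaling themselves, here is a minimal sketch of the two transformations. The input figures below are hypothetical placeholders, not values from the Fiscal Reference Tables:

```python
def share_of_gdp(balance, gdp):
    """Budget balance as a percentage of nominal GDP."""
    return 100.0 * balance / gdp

def real_per_capita(balance, population, price_index, base_index):
    """Budget balance per person, deflated to base-year (2010) dollars."""
    return (balance / population) * (base_index / price_index)

# Hypothetical inputs: a $55.6b deficit, $1.6t nominal GDP, 34 million
# people, and a price index already at its 2010 base.
deficit = -55.6e9
print(f"{share_of_gdp(deficit, 1.6e12):.1f}% of GDP")
print(f"${real_per_capita(deficit, 34e6, 100.0, 100.0):,.0f} per person")
```

Scaling by GDP answers "how big is this relative to the economy's capacity to carry it?", while real per-capita dollars answer "how much is this per person, holding prices constant?" — the two views can tell quite different stories.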

Here is the federal debt:

That upward tick will be something to watch in the next couple of years. A temporary blip is nothing to worry about. A sustained trend is.

I’ve found it instructive to look at spending and revenues using both methods of scaling the data.

After almost 30 years of drifting sideways, real per-capita spending jumped sharply. The stimulus program is set to expire in 2011, so the spending should stay in that region before falling back again (how far?) in 2011-12. Unsurprisingly, revenues fell during the recession, but they’ve already started coming back.


A disconcerting trend is establishing itself in Canadian politics. Political parties are showing essentially no interest in the merits of a policy proposal beyond its potential as an element of some shrewd communications strategy.

From the Conservatives, we have the never-to-be-sufficiently-denounced cuts to the GST. They will never admit this was a mistake, because the policy was a winner according to the only criterion that matters to them:

“Despite economic evidence to the contrary, in my view the GST cut worked,” Brodie said in Montreal at the annual conference of the McGill Institute for the Study of Canada. “It worked in the sense that by the end of the ’05-’06 campaign, voters identified the Conservative party as the party of lower taxes. It worked in the sense that it helped us to win.”

From the Liberals, we learn that they’re thinking about canceling corporate tax cuts. Not because it’s a good idea, but because it’ll fit into the narrative they want to sell:

Some senior Liberals contend that vowing to roll back the corporate tax cut might actually help Ignatieff, who’s been trying to position himself as the leader most in tune with the priorities of hardpressed middle-class families.

“Guess what? Every time some big moneybags kicks up a fuss, it’s going to prove to John and Jane Q. Public that we’re actually on their side,” says one senior Grit of the potential backlash among business leaders.

And the NDP wants to cut the GST on heating costs. This is dumb for at least two reasons:


MOSCOW — An unmanned Russian cargo vessel experienced problems during a docking with the International Space Station on Friday, the Interfax news agency reported, citing the commander of the orbital station.
The Progress cargo ship “is moving away from us”, Interfax quoted cosmonaut Alexander Skvortsov as saying in a communication with Russian mission control outside Moscow. He was quoted as saying the cargo ship was “spinning uncontrollably” and later that it had disappeared from view.

By Stephen Gordon

The federal government won’t be bringing its 2011-12 budget down until at least four months from now, but there’s already some discussion about just when its fiscal stimulus should be phased out. The government’s current line is that the infrastructure program will end on March 31 as planned, with perhaps a certain amount of wiggle room for projects that may not be fully completed by then. There are suggestions that the recovery is still too weak to sustain such a reduction in aggregate demand.

The budget decision doesn’t have to be taken right away, so there’s no reason to make a definitive pronouncement on the subject just yet. So I’m going to content myself with some context here.

It should first be noted that the process of removing stimulus is already well under way. We all know that the Bank of Canada has already removed its extraordinary measures, and its overnight target has increased by 75 basis points over the past few months. But current policy is still very expansionary:

Inflation expectations are stable at 2% (at least I think so – anyone know if market-based estimates contradict this assertion?), so the real target rate is still negative.

What is less well-known is that the federal government’s fiscal stimulus has also started to be scaled back. Regular readers will recall that I’ve been tracking the Department of Finance’s monthly Fiscal Monitor of revenues and expenditures. These data are noisy and have a significant seasonal pattern, so I’ve been smoothing them by taking 12-month weighted sums. Here is how these series have evolved since the current government came to power:

A couple of preliminary points:

You can see why we were worrying about a structural deficit before the recession started: a series of tax cuts – notably to the GST – had slowed the growth of tax revenues. Even if the recession hadn’t occurred, the trends were pointing to a deficit in 2009.

For reasons I don’t understand, federal spending actually decreased during the first five months of the recession.

Revenues bottomed out in December 2009, and the main sources of revenues – personal income taxes, corporate income taxes and the GST – are all growing.


One of the least edifying aspects of the census debacle is the government’s spin to the effect that the only people who oppose its decision to make the long form voluntary are ‘left-wingers’, so their concerns can therefore be dismissed out of hand. One version of this meme takes the form of the argument that sabotaging the census is part of a broader strategy to diminish the importance of government in the lives of Canadians. If there are no data to guide would-be social engineers, so the reasoning goes, then they will be prevented from expanding the reach of the State into new spheres.

This is a puzzling argument, and not only because it is based on a non sequitur. It betrays a fundamental misreading of the history of the Canadian welfare state and of how evidence-based policy analysis has evolved over the past two generations. Before the census became an issue, the Left, not the Right, was the more determined opponent of evidence-based policy analysis.

The census as we know it is a relatively recent phenomenon. Before 1971, governments had access to only fragmentary data sets, and the available resources for analyzing them were, by modern standards, rudimentary. But this lack of information was not an obstacle in constructing the basic infrastructure of the Canadian welfare state: its major features – publicly-funded health care, pensions and unemployment insurance – were all established before the modern census. It’s not clear why anyone would believe that depriving the government of data would prevent it from introducing new programs.

Of course, the lack of data did have important implications for the development of the welfare state: it was clumsily-designed and wasteful. As a result, much of the evidence-based policy analysis that has taken place over the past 40 years consists of documenting cases where existing policies were inefficient, ineffective, or even counter-productive.

But here’s the odd thing. Instead of being pleased at being offered the opportunity to make the welfare state more effective and efficient, the Canadian Left’s reaction to these studies consisted of pretending they didn’t exist, or of attacking the authors with such pithy epithets as ‘corporate apologist’. There are any number of areas where the same debate played out; here are four.

Free Trade: Those of us of a certain age can remember how confidently the opponents of the FTA asserted that its passage would lead to disaster. Of course, disaster didn’t happen, and the available data – as analysed in Dan Trefler’s famous AER article – suggest that it had exactly the effect its proponents predicted: small and positive.

Pay equity laws: These measures are designed to improve the prospects for women in the labour market, and to close the wage gap with men. It turns out that the experiment in Ontario in the early 1990’s was largely unsuccessful in attaining these goals.

Employment Insurance. A poorly-designed employment insurance program can be abused, and as I noted in this post, that’s exactly what happened before the reforms of the 1990s. To those studies, we can add this one by my former grad school office-mate Jane Friesen, which documents how those reforms produced the predicted behavioural responses.

Social assistance. Economic theory predicts that benefits that are too generous provide a disincentive to find employment, and available data are consistent with the theory.

It would be a stretch to characterise these research programs as being part of a big-government plot, and they were invariably denounced by conventional leftists. Moreover, these denunciations were almost never evidence-based. Evidence-based criticism takes the form of “The study doesn’t take X into account. I should incorporate X into the analysis and see if the results are materially affected.” Instead, critics would content themselves with “The study doesn’t take X into account, so I’m going to squat stubbornly on my original position.” In retrospect, trashing the census might be something that you’d expect from an NDP government: as far as they would be concerned, evidence-based policy analysis has been a source of embarrassment.


The government modified its census position this week. Unfortunately for the cause of responsible government, these changes don’t alter the situation much — but they do illustrate the vapidity of the government’s stance.

Firstly, in the face of a court challenge by minority language groups, the questions on language use are now going to be part of the mandatory short form. The reason for this is clear: there was no way that the government could convince the courts that a voluntary survey would provide information that would be credible enough to fulfill its obligations to minority language groups to provide services in the official language of their choice.

This is of course an implicit recognition of something that critics have been saying explicitly for the past six weeks: that the voluntary survey will not produce reliable data. What’s even more astounding is that on the very same day, Tony Clement saw fit to repeat the claim that increasing the sample size would fix the self-selection problems associated with the voluntary survey. If the government really believed that assertion, there would be no need to move the language questions to the mandatory short form.

The other thing the government did was to announce legislation to remove the threat of imprisonment for those who do not respond to mandatory surveys such as the census and the Labour Force Survey. The point has been made before that if the government did not want to threaten people with jail for not answering certain questions, the remedy was to remove the questions and/or remove the threat of jail; making the entire long form voluntary was a non sequitur. Now that the threat of imprisonment has been removed, there really isn’t anything left of the government’s original case for making the long form voluntary.

Yet the decision still stands, and we need to look forward. It’s an ugly, ugly picture.

Before the decision went public, Statistics Canada estimated that the response rate for the long form would fall from about 95% to 50%, but that it could be brought back up to 70% if a lot of money were spent pestering non-respondents. This result was of course deemed unacceptably low a few months ago, but even that level of failure seems like an unattainable dream now. Now that the census has been made a playground for partisan politics, many CPC partisans will decide that their party’s cause will be best served by not filling out the voluntary forms, and many opposition supporters will boycott the long form in protest. The long form data will be a dog’s breakfast.


July 23, 2010 — Economists Niels Veldhuis, of the Fraser Institute, and Stephen Gordon, of Laval University, join host Chris Selley for a surprisingly spirited debate on the elimination of the mandatory long-form census.

One of the surprising things about the census fiasco is that of all the publicly-provided services that small-government advocates could target, the census is very near the bottom of the list of priorities. Many of the services provided by governments could and perhaps should be produced by the private sector. But the economics of databases such as the census aren’t the same as the goods that inhabit most economic models, and so the standard market-is-best results of the First and Second Welfare Theorems don’t apply. When it comes to things like the census, markets fail.

Standard public finance theory offers two dimensions along which goods can be classified: rivalry and excludability. If a good is purely private – that is, rival and excludable – then the welfare theorems apply, and there’s no obvious case for government intervention. But a database is different. Although intellectual property laws can provide a mechanism for owners to control access, a database is non-rival: many users can benefit from it just as easily as one can.

You can see why this means that markets would generate inefficient outcomes here: firms that require certain information would be obliged to produce multiple data sets, all serving pretty much the same purpose. The inefficiency takes the form of this duplication of efforts. It’s more efficient to simply have one database and allow access to multiple users. In other words, the census is a natural monopoly.
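A stylized calculation makes the point. The fixed cost and the number of firms below are arbitrary illustrative figures, not estimates of anything:

```python
# Stylized duplication cost: N firms each building their own database
# versus one shared database. Both numbers are purely illustrative.
F = 500_000_000   # assumed fixed cost of building one such database
N = 40            # assumed number of firms needing the same data

cost_duplicated = N * F   # every firm builds its own
cost_shared = F           # one database, access for all
print(f"waste from duplication: ${cost_duplicated - cost_shared:,}")
```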

But what pricing strategy should the census monopolist adopt? Usually, we’d try to set price equal to marginal cost. In this case, the marginal cost of providing access to a database is essentially zero: once it is set up, maintenance costs are trivial. So in an ideal situation, the census would become a pure public good: universal access at zero cost.
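The pricing logic can be sketched in a few lines. The fixed cost is again an arbitrary illustrative figure; the point is only that average cost per user falls toward zero while marginal cost is already there:

```python
# Natural-monopoly pricing sketch: with a fixed setup cost and a
# (near-)zero marginal cost per user, average cost per user shrinks
# as users are added, so pricing at marginal cost means free access.
# The fixed cost is an assumed illustrative number.
fixed_cost = 500_000_000
marginal_cost = 0.0

for users in (1_000, 1_000_000, 30_000_000):
    avg_cost = fixed_cost / users + marginal_cost
    print(f"{users:>10,} users: average cost ${avg_cost:,.2f}, "
          f"marginal cost ${marginal_cost:.2f}")
```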

This is pretty much the model adopted by the United States. A trip to the websites of the Bureau of Economic Analysis or the Bureau of Labor Statistics will get you a wealth of data at the price of a couple of mouse clicks. And over at the Census Bureau, fees only seem to be incurred by those whose projects require working at their offices in Washington; I can’t see any mention of fees for access to the data per se. (Readers more familiar with these matters will correct me if I’ve read things incorrectly.)

Maxime Bernier was Minister of Industry during the 2006 census, and he’s now saying that he received many complaints:

As industry minister during the 2006 census, Bernier said he was inundated with privacy complaints over a five- to six-week period.

“I received an average of 1,000 e-mails a day during the census to my MP office complaining about all that, so I know that Canadians who were obliged to answer that long-form census — very intrusive in their personal lives — I know they were upset,” he said.

Some of the complaints related to U.S. arms manufacturer Lockheed Martin providing computer services to Statistics Canada, he said, but the majority objected to the census questions themselves.

In the months following the census, Statistics Canada conducted its usual review in preparation for the 2011 census. This review included a consultation period between April and November 2007, which was the basis for this report.

Maxime Bernier was the Minister of Industry until August 2007, so he was responsible for setting the terms of the review. But as far as I can tell, the report makes essentially no mention of complaints about intrusiveness.

Apparently I’m not the only one who thinks that sending out a voluntary long form in next year’s census is a bad idea. Since this seems to be a file to which I will be returning from time to time, here are a couple of points to consider:

1) Yes, it is a big deal. If response rates vary with income and education levels, then you won’t have a representative sample across income and education levels. There is a large body of evidence in the sampling design literature documenting that people with lower levels of education and income have lower response rates, so these groups will be systematically under-sampled.

Sample selection bias is a recurring problem for virtually all surveys in which participation is not mandatory, and it is possible to correct for it – but only if you know the true distribution of the underlying population. For example, if you know that those in poverty are (say) 10% of the population but only 5% of your sample, you can re-weight your observations to make the sample representative. But in order to do this, you need to know that the ‘true’ proportion is in fact 10%. Usually, the source for this sort of information is the census. But if the census itself suffers from selection bias, there’s no easy way to fix it.
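The re-weighting step can be sketched in a few lines. The 10%/5% shares match the example above; the incomes and sample are invented for illustration:

```python
# Post-stratification re-weighting: people in poverty are 10% of the
# population but only 5% of the voluntary sample, so their responses
# get weight 0.10/0.05 = 2.0 and everyone else's gets 0.90/0.95.
# The incomes are made-up illustrative numbers.
true_share = {"poverty": 0.10, "other": 0.90}
sample_share = {"poverty": 0.05, "other": 0.95}

# toy biased sample: (group, income)
sample = [("poverty", 15_000)] * 5 + [("other", 50_000)] * 95

weights = {g: true_share[g] / sample_share[g] for g in true_share}

naive_mean = sum(inc for _, inc in sample) / len(sample)
weighted_mean = (sum(weights[g] * inc for g, inc in sample)
                 / sum(weights[g] for g, _ in sample))
print(f"naive mean:    {naive_mean:,.0f}")    # too high: poor under-sampled
print(f"weighted mean: {weighted_mean:,.0f}") # matches the population mean
```

The naive mean overstates average income because the under-sampled group is poorer; the re-weighted mean recovers the true population figure – but only because the 10% share was known from an unbiased source.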

2) Concerns about privacy are overstated. Anyone who has had dealings with Statistics Canada will tell you that they are ferociously – and at times irritatingly – determined to protect the privacy of those whose information is stored in their databases. Researchers never see the data. They are obliged to send their estimation code to StatsCan professionals, who run the programs and return the output to the researcher. That goes for all other non-StatsCan government employees as well.

And if the question requires looking at a subsample that is so small that there’s a chance that individual respondents could be identified, then the request is refused.
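That kind of confidentiality screen amounts to a simple check before any output is released. A minimal sketch – the threshold below is a hypothetical illustration, not StatsCan’s actual rule:

```python
# Hypothetical confidentiality screen: refuse to release any tabulation
# in which some cell contains fewer respondents than a threshold, since
# small cells risk identifying individuals. Threshold is illustrative.
MIN_CELL_SIZE = 5  # assumed value, not an official StatsCan figure

def release_allowed(cell_counts):
    """Return True only if every cell is large enough to publish."""
    return all(count >= MIN_CELL_SIZE for count in cell_counts.values())

tab = {"renters, under 25": 3, "renters, 25-54": 812, "owners": 4_410}
print("released" if release_allowed(tab) else "request refused")
```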

It would appear that there is a significant constituency in both the U.S. and Europe agitating for immediate efforts to reduce their respective governments’ deficits, and some are pointing to the Canadian experience of the 1990s. If Canada could make the swift transition from decades of large and chronic deficits to being the poster child of fiscal rectitude with no apparent ill effects, then why can’t everyone else?

The answer is that Europe and the U.S. in 2010 are not Canada in 1995, in pretty much every way that matters.

1) Canada waited until the recession was over before embarking on an austerity program. Here is a graph of public and private sector employment in the 1990s:

Climbing out of the recession of the early 1990s was a long, brutal grind – much like what the U.S. and Europe have ahead of them. It took five years for private-sector employment to return to its previous peak. This was no time to start reducing public-sector employment, and Canadian governments didn’t try.

But by 1995, private-sector employment had returned to its pre-recession peak and the expansion was well under way. It was only then that Canadian governments started their austerity programs.

2) The austerity program was not painless. Much of the federal austerity program took the form of cuts to transfer payments to the provincial governments, who in turn were obliged to close hospitals and schools.
