Fixating on short-term volatile data is folly

Many economists spend their time devising more and more intricate ways to estimate, forecast, decompose and otherwise analyse relations between various data series.

Much of what passes for economic commentary is reportage and discussion of new releases of economic data or revisions to previously released statistics. Think of all the times we have heard and seen headlines on unemployment, stock market indices, PMI indices, leading and lagging indicators etc.

And yet, this is neither economics, nor even good reportage for the most part. Economics is not about data. Data are inputs into economic thinking. And data are more than figures from the CSO; they should include personal experience, anecdotes and qualitative evidence.

That data are not quantitative does not make them any less useful, nor does it imply that they cannot be analysed and interrogated in a rigorous fashion. Economics is (for the most part) a human, societal phenomenon, and thus the entire gamut of data we process should be used in analysing it.

Even allowing for the fact that quantitative data are easier to analyse, we need to be careful about the quality of the analysis. Modern econometric techniques are mind-bogglingly complicated. But we don’t need complex techniques to deal sensibly with data. We need common sense and some basic statistics.

Consider three series that get a lot of airplay — unemployment, quarterly national accounts and purchasing managers indices. These are important data series, widely and correctly followed.

They are released regularly and while subject to some revision, the reality is that the initial release is the one on which the press and the politicians pounce to prognosticate. They are very volatile, but this is usually mentioned only in passing.

Basic statistics tells us that when a series is volatile there is a chance it will show a rise, a chance it will show a fall and a chance it will show no change. We can and should take the volatility of the data into account when we analyse them, and, more to the point, so should the reportage of the data.

This doesn’t happen. We will not soon see a headline “GDP rises (but it’s statistically equal to zero and might actually be a fall) in quarter 2”. A rise in GDP will be greeted as corner-turning stuff no matter how statistically meaningless this is. If we look at the Irish data, we see significant volatility. This is best indicated by the standard ‘measure’ of recession, two successive quarters of declining GDP.

By that measure, we have been in recession four times (and out three times) since 2007. The reality is that it is one long recession. The GDP and GNP data are extremely volatile.
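The two-quarters rule is mechanical enough to write down as a short check. A sketch is below; the quarterly percentage changes are purely illustrative, not actual CSO figures.

```python
# Flag 'technical recessions': runs of two or more successive quarters
# of declining GDP. The changes below are illustrative, not CSO data.
gdp_changes = [0.8, -1.2, -0.5, 0.3, -0.9, -1.1, 0.2, -0.4, -0.6, 1.0]

def recession_quarters(changes):
    """Return indices of quarters inside a run of 2+ declines."""
    flagged = set()
    run = []
    for i, change in enumerate(changes):
        if change < 0:
            run.append(i)
        else:
            if len(run) >= 2:
                flagged.update(run)
            run = []
    if len(run) >= 2:  # a run of declines may end the series
        flagged.update(run)
    return sorted(flagged)

print(recession_quarters(gdp_changes))  # → [1, 2, 4, 5, 7, 8]
```

In a volatile series like this, the rule fires repeatedly: three separate "recessions" in ten quarters, which is exactly the point being made about the Irish data.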

GDP quarter-to-quarter changes are slightly less volatile than GNP, contrary to what many think and despite the GDP data being subject to distortion from the multinational sector. Quarter-to-quarter changes average about 1.4%.

If we look at the data knowing how volatile they are and how much the average change is, we can consider some ‘bounds’ within which, statistically, the data should lie. This is called the ‘standard error’. There are other ways to construct such confidence intervals, but all are, in essence, variations on a theme. Using this gives us a sense of whether or not we can be confident that a small rise or fall is in fact one, given the way the data jump about.
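The calculation takes a few lines. A minimal sketch, using made-up quarterly changes rather than the actual CSO series:

```python
import statistics

# Illustrative quarterly GDP changes (%); not actual CSO data.
changes = [1.4, -2.1, 0.9, -0.3, 2.2, -1.8, 0.6, -1.1]

mean = statistics.mean(changes)
# Standard error of the mean: sample standard deviation / sqrt(n).
se = statistics.stdev(changes) / len(changes) ** 0.5

# A rough 95% band: mean +/- 2 standard errors. A reported quarterly
# change inside this band cannot be distinguished from the series'
# ordinary noise.
low, high = mean - 2 * se, mean + 2 * se
print(f"mean change {mean:+.2f}%, 95% band roughly ({low:+.2f}%, {high:+.2f}%)")
```

With numbers like these the band straddles zero comfortably, so a headline "rise" of half a per cent tells us nothing at all.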

Similarly, for PMI data, we can only despair at times at the commentary. A PMI reading below 50 indicates contraction. This does not stop people from commenting along the following lines — PMI rose from 47 to 48, indicating a rebound in managers’ expectations. That is not the case. What is the case is that first, the mean monthly change in composite PMI in Ireland is almost indistinguishable from zero, and second, the volatility of the data suggests that any month-to-month interpretation is an exercise in futility.

That doesn’t stop commentators breathlessly proclaiming doom or salvation, depending on the direction of this essentially random series. For unemployment, the average change in the rate (not the numbers) over the last 13 years has been a very small decline. Since 2010 the average month-to-month change has also been very close to zero. Again, this does nothing to stop minor blips up and down being hailed as proof of austerity failing or working.

It doesn’t have to be this way. There is nothing stopping the CSO or newspapers taking half an hour to do some simple statistics.

Instead of reporting on monthly or quarterly changes that are volatile and inherently meaningless, it would be nice to see reportage looking at three- or six-month averages, at confidence intervals, at statistically meaningful measures of some kind. This would make for more boring but much more accurate reporting.