Embrace uncertainty
Michael Blastland
is a journalist, author and creator of the BBC Radio 4 programme More or Less
Uncertainty doesn't come easily to journalists. We tell people things. So telling them we don't know what to tell them feels like the proverbial chocolate teapot: useless.
It's not. It's part of the story. Do we need better ways of telling people what we don't know, as well as what we do?
Take Gross Domestic Product (GDP) numbers. Whether they are reliable is a big question, the subject of recent comment and controversy, and something I've fretted about before. The row has political consequences: for judgments of the success of government economic policy; for the livelihoods of the population; for future policy decisions.
And it boils down to a question of uncertainty. How sure are we that these numbers tell us the truth, or something close to it?
The argument usually goes:
Critic: 'The GDP first estimate is surprising - and must be wrong.'
Office for National Statistics (ONS): 'Possibly, but on average not by much.'
The average revision, says the ONS, tends to be only around 0.2% of GDP, either up or down.
Is that a good point at which to let the argument - or our coverage - rest?
What matters about error is not its average size, but the range of errors that might reasonably occur; for it is the range, not the average of the range, that shapes our uncertainty. So even in conceding uncertainty, an average error understates it. There's often more uncertainty around numbers than appears.
The average flattens out large errors. But the possibility of large errors is important, particularly if it turns out that the data is more wobbly in times of uncertainty or economic volatility, of which there is plenty at the moment.
A quick look at the latest GDP revisions from the ONS shows that revisions to quarterly GDP from the first estimate to the latest of 0.3 or 0.4 percentage points have been common recently. (The further back you go, the more likely it is that some of the revisions simply reflect changes in methodology.)
Revisions of 0.5 or more are not exactly common, but hardly unprecedented either. So to say that the average revision is 0.2 is a little like saying that a river is on average only four foot deep and therefore safe to wade across.
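The river analogy can be made concrete with a few lines of arithmetic. The figures below are invented purely for illustration (they are not actual ONS revision data), but they show how a set of revisions can average out at roughly 0.2 while individual revisions reach 0.5:

```python
# Hypothetical quarterly GDP revisions in percentage points - invented
# numbers for illustration only, not actual ONS data.
revisions = [0.1, -0.2, 0.3, -0.1, 0.5, -0.4, 0.1, -0.2, 0.2, -0.3]

# The average absolute revision: modest, because ups and downs of all
# sizes are flattened together.
mean_abs = sum(abs(r) for r in revisions) / len(revisions)

# The largest single revision: the "deep part of the river".
worst = max(abs(r) for r in revisions)

print(f"average absolute revision: {mean_abs:.2f}")
print(f"largest single revision:   {worst:.2f}")
```

Here the average works out at about 0.24, yet one quarter was revised by 0.5 - enough, on the article's own terms, to turn a reported recovery into a deepening recession.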
ONS figures are the best - often the only - figures available, collected with care and professionalism, and make a tricky trade-off between speed and accuracy. But even the best figures of this kind are usually wrong.
For journalists, the problem is how to say - about GDP growth and much else - briefly and routinely, that official numbers are probably wrong, but still worth reporting, and then to tell us what matters most: how wrong they might be.
Would they fail to hit a barn door, or probably be not far off the mark? Since 0.4% would be the difference between, in one direction, a roaring recovery and, in the other, a worrying deepening of the recession, how should we treat that first estimate? Should we put margins of error around GDP as we do around opinion polls? And how do we treat each subsequent revision (they go on for years)?
On graphs, statisticians sometimes use what they call 'box and whisker' plots to show the most probable number, but with tails top and bottom reaching to the points within which they think the true number might be. Too complicated for telly, impossible on radio. Business reporters and others are familiar with the Bank of England's fan charts showing uncertainty around inflation forecasts and GDP.
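The five numbers behind a box-and-whisker plot are easy to compute. A minimal sketch, reusing the same invented revision data as above and the simplest convention (whiskers running to the minimum and maximum; statisticians often cap whiskers at 1.5 times the interquartile range instead):

```python
# Hypothetical revision data, invented for illustration - sorted so we
# can read quartiles off directly.
data = sorted([0.1, -0.2, 0.3, -0.1, 0.5, -0.4, 0.1, -0.2, 0.2, -0.3])

def median(xs):
    """Middle value of a sorted list (mean of the two middle values if even)."""
    n = len(xs)
    mid = n // 2
    return xs[mid] if n % 2 else (xs[mid - 1] + xs[mid]) / 2

low, high = data[0], data[-1]           # whisker ends: the full range
mid = median(data)                       # the line inside the box
q1 = median(data[:len(data) // 2])       # bottom of the box (lower quartile)
q3 = median(data[-(len(data) // 2):])    # top of the box (upper quartile)

print(low, q1, mid, q3, high)
```

The box (q1 to q3) shows where the middle half of the revisions fell; the whiskers show how far out the extremes reached - which is exactly the distinction between typical error and possible error that the average alone hides.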
At the very least we can say: 'That's the data. All data can be flawed. How sure of it are we this time? Not very.'
That would have been handy at every step of the swine flu epidemic, where the appropriate phrase would have been something like: 'we haven't the foggiest' - and in many other stories where the temptation is to treat wool as though it were concrete, or, where there is uncertainty, to report the scariest extreme rather than the full range of possibilities. I'd argue that climate change coverage could do with more, too, and school league tables. I've seen occasional brilliant examples on the BBC, but still feel the uncertainty gets lost too often.
However we do uncertainty, we should do it. We do our audiences no favours if we pretend to know more than we do. Probably.
