
Sexy surveys: Five questions to ask before you buy their top line

Anthony Reuben is head of statistics for BBC News

It’s three o’clock in the morning and you’re doing the night shift for a breakfast programme. Suddenly a survey with a beguiling top line lands in your inbox. Can you run it? Here’s a handy list of questions to ask yourself:

Where has it come from?

Specifically, who paid for the research and who conducted it? Are you comfortable with letting this organisation influence your news agenda?

It is not necessarily unacceptable to have research commissioned by a travel agent suggesting that people think going on holiday is the most important thing in the world, but it should set alarm bells ringing and make you all the more careful about the remaining questions. Similarly, if the research has been conducted by an organisation that you’ve heard of - perhaps one that you recognise from pre-election opinion polls - that should increase your confidence but does not mean the research can be automatically trusted.

What questions does it ask?

Think about what answers you would give to the questions in the survey. Are you sure about how you would answer or does it seem ambiguous? Does the question feel as if it is pushing you towards a particular answer?

There was a case, shortly before the Leveson report was published, in which YouGov was commissioned to conduct surveys by the Sun, Hacked Off and the Media Standards Trust. The findings were contradictory, which may have had a lot to do with how the questions were phrased.

How many people were asked?

As a general rule of thumb, if the population you want to say something about is more than 20,000 then you need to ask at least 1,000 people. If you’re trying to say something about a smaller population then it gets much more difficult to get a representative sample, but there are ways around it.
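That figure of 1,000 is not plucked from the air: for a simple random sample, the margin of error depends almost entirely on how many people you ask, not on how big the population is. Here is a minimal sketch of the arithmetic, using the standard textbook margin-of-error formula (not any particular pollster's method) with illustrative population sizes:

```python
import math

def margin_of_error(n, population, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion from a
    simple random sample, with the finite population correction."""
    se = math.sqrt(p * (1 - p) / n)                        # standard error
    fpc = math.sqrt((population - n) / (population - 1))   # finite population correction
    return z * se * fpc

# 1,000 respondents give roughly the same precision whether the
# population is 20,000 people or the whole of the UK:
for pop in (20_000, 1_000_000, 67_000_000):
    print(f"population {pop:>11,}: about ±{100 * margin_of_error(1000, pop):.1f} points")
```

A sample of 1,000 buys you a margin of error of about three percentage points either way, which is why reputable national polls tend to cluster around that size.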

Say, for example, someone has managed to get the views of the managers of 10 of the 20 Premier League football teams. Now, that’s not enough to allow you to say what the managers of Premier League teams think. But you can say: 'Responses from 10 Premier League managers suggest…'

Many press releases make excessive claims about their findings, but sometimes changing the wording can make the findings usable. As the size of the population grows, the number of people you have to ask doesn’t grow, but what gets more difficult is answering the next question:

Were they the right people?

This is the tricky one. To get some insight into it, think about why people conduct surveys. If you’re trying to say something about everyone in the UK, the most accurate thing to do would be to ask them all. That’s what we do once a decade in the census, but it’s jolly expensive. Instead we ask a smaller group and extrapolate from those findings to say something about the UK as a whole.

There are two schools of thought about how you do this. One is to make sure you ask a properly random group of people - by dialling randomly selected phone numbers, for example. The other is what pollsters such as YouGov do, which is to maintain a database of volunteers they know a lot about. When such a pollster is commissioned to do a poll, it can make sure that the people it asks match the whole population in, for example, the proportion of men and women, the age profile or the income distribution.
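In practice the panel rarely matches the population exactly, so responses are weighted. This is a simplified illustration with made-up numbers - real pollsters weight on many variables at once - but it shows the idea:

```python
# Made-up example: a panel of 1,000 that over-represents women.
population_share = {"women": 0.51, "men": 0.49}   # assumed population mix
panel_counts = {"women": 620, "men": 380}

total = sum(panel_counts.values())
weights = {group: population_share[group] / (panel_counts[group] / total)
           for group in panel_counts}

for group, w in weights.items():
    print(f"{group}: each response weighted by {w:.2f}")
# women ~0.82, men ~1.29 - the weighted sample now mirrors the population mix
```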

If you’ve got neither a random sample nor a representative sample, your results are meaningless. Imagine, for example, that you were trying to find out what proportion of people in the UK were planning to buy a car in the next year. If you’re not sure who was asked, you could just have responses from the 1,000 wealthiest people in the country, or the 1,000 poorest. If you just spoke to people at bus stops or those logging on to the WhatCar website, you also wouldn’t learn anything.
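To make that concrete, here is a toy simulation - the numbers are entirely invented - in which the intention to buy a car rises with wealth. A random sample of 1,000 lands near the truth; the 1,000 wealthiest miss it badly:

```python
import random

random.seed(1)

# Toy population of 100,000: the chance of planning to buy a car
# rises with wealth, from 5% for the poorest to 35% for the richest.
population = []
for rank in range(100_000):               # rank 0 = poorest
    p_buy = 0.05 + 0.30 * rank / 100_000
    population.append(random.random() < p_buy)

def share(people):
    return 100 * sum(people) / len(people)

print(f"Whole population: {share(population):.1f}%")
print(f"Random 1,000:     {share(random.sample(population, 1000)):.1f}%")
print(f"Wealthiest 1,000: {share(population[-1000:]):.1f}%")
```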

At 3am, the best thing to do is look at the survey’s methodology and see what steps the researchers have taken to make sure they asked the right people. If they seem to have gone to a reasonable amount of effort then there’s a decent chance they’ve got a reasonable sample.

Can we get across the levels of uncertainty?

The language used with surveys is very important. Surveys do not ‘prove’ or ‘show’ anything - they may ‘suggest’ things. There is always a margin of error on a survey, so say the figure is ‘about 20%’ and don’t go to more decimal places than the research justifies.
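For example - and these figures are invented - if 203 of 1,000 respondents said yes, the honest summary is ‘about 20%’:

```python
# Invented figures: 203 of 1,000 respondents said yes.
n, yes = 1000, 203
estimate = yes / n
moe = 1.96 * (estimate * (1 - estimate) / n) ** 0.5   # ~2.5 points either way

# 'About 20%' is all the data supports; quoting 20.3% overstates the precision.
print(f"about {round(100 * estimate)}% (margin of error roughly ±{100 * moe:.0f} points)")
```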

The results of a survey should not normally be the headline or top line on a story. If you are sent a survey that would massively change your view of the world if it were true, your first assumption should be that it is not true, unless you have had the methodology checked carefully.
