Election polls: Which – if any – to trust?
Pundits and public alike may well ask for help when it comes to making sense of the polls during this election campaign.
Surveys of public opinion in recent weeks have been a far cry from 2015, when pollsters lined up behind one another to show a dead heat and – seemingly – a certain hung parliament. I have nearly ten years’ experience of working in polling and research, and while I still field questions about how my profession survives after some ‘catastrophic results’, one only has to look at the multitude of polls released over the course of this election campaign to know that political polling is very much alive and kicking.
If we look at Sebastian Payne’s useful summary tweet, we see that there are various polls showing a lead for the Conservative Party of anything between 3% and 15%. They can’t all be right. So, which one do we trust? And should we trust polling at all?
The first thing to bear in mind is that a poll, or a survey, is not a census. We are talking to a proportion (often small) of the entire population, so a lot rides on making sure that we talk to the “right” people. A good sample is one that reflects the population we are trying to understand, such that we can extrapolate from the sample results to the total population. Next we need to ensure we ask the right questions. If you ask people how they are voting and preface it with ‘you’re not voting for the Tories, are you?’, respondents might shy away from telling you the truth, calling the data collected into question. The phenomenon of the ‘shy Tory’ was blamed in both 1992 and 2015 for some pollsters overestimating support for Labour and underestimating backing for the Conservative Party.
Beyond these two fundamental pillars of a representative sample and carefully worded questions there are a multitude of variables that help to explain the differences in vote share that we are seeing between the polling houses. For example, one pollster might collect the data online (used most widely today) and another via telephone. Each method has pros and cons and will be more or less effective at reaching different socio-economic groups. Pollsters will also analyse their data in different ways. For example, one may attribute greater weight to likelihood to vote. Another may use more advanced statistical techniques, such as a form of regression analysis, to model voting behaviour. The different assumptions and techniques employed will lead to varying estimates of vote share.
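To illustrate the kind of adjustment pollsters make, here is a minimal sketch of weighting raw voting-intention responses by each respondent’s stated likelihood to vote. The data, the 0–10 likelihood scale, and the `weighted_vote_share` function are all hypothetical; real polling houses use far more elaborate weighting schemes.

```python
def weighted_vote_share(responses):
    """responses: list of (party, likelihood_to_vote) pairs,
    where likelihood_to_vote is a self-reported 0-10 score."""
    totals = {}
    total_weight = 0.0
    for party, likelihood in responses:
        weight = likelihood / 10.0  # turn the 0-10 score into a 0-1 weight
        totals[party] = totals.get(party, 0.0) + weight
        total_weight += weight
    # Each party's share of the total weighted sample
    return {party: w / total_weight for party, w in totals.items()}

# Hypothetical mini-sample: certain voters count more than doubtful ones
sample = [("Con", 10), ("Lab", 5), ("Con", 8), ("Lab", 10), ("LD", 2)]
shares = weighted_vote_share(sample)
```

Two pollsters holding identical raw data but applying different turnout weights like these would publish different headline vote shares, which is one reason published leads diverge.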
At Dods, we specialise in polling across the public sector and political space. Irrespective of the audience, however, the foundation for getting good survey results goes back to talking to the right people and asking them the right questions. The third, and final, element of the process is how the data is interpreted. Polling and research can be valuable when executed effectively, but it is not an exact science. Political polling often generates big headlines and pages of commentary on what it all means for the election result, but error and uncertainty usually get short shrift, with disclaimers buried on a website or in the small print.
A client recently asked me: “How confident are we in these figures?” I explained the margin of error. A typical poll of 1,000 people has a margin of error of plus or minus around 3%. This means that if a poll shows a party on 30% support, support for the party in the total population might be anywhere between roughly 27% and 33%. This will be the case 95% of the time; the other 5% of the time the “true” figure may fall outside that range.
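The “plus or minus 3%” figure comes from the standard formula for the sampling error of a proportion. A short sketch, assuming a simple random sample and the conventional 95% confidence level (z ≈ 1.96):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p estimated
    from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# The worst case is p = 0.5; for n = 1,000 this gives roughly 0.031,
# i.e. the familiar "plus or minus 3%".
moe = margin_of_error(0.5, 1000)
```

Note this only captures sampling error; weighting, question wording and non-response add further uncertainty that the headline margin does not cover.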
Survey research and polls can be a useful guide, particularly where we have data over time, but it is important to ensure that research is conducted effectively and the results analysed responsibly. If your organisation is thinking about commissioning polling or bespoke research be sure to seek good advice from sample and questionnaire design through to analysis and interpretation of the data.
Dr Jansev Jemal is the Research Manager for Dods. She has a PhD in voting behaviour and has previously worked for YouGov and Ipsos MORI.