Analysis: In defence of the polling

Much of the commentary during and since the election night coverage has focused on the perceived inaccuracy of the published polling. “We were misled!” was the common refrain.

As someone who has managed research projects involving tens of thousands of respondents over many years, my view is that people were misleading themselves, rather than being misled.

Instead of carefully considering the information at hand, cognitive biases led almost everyone to the wrong conclusions.

Yes, the result seems at odds with three years of Newspolls that put Labor ahead of the Coalition. But what do we know about the polling during the campaign proper?

We know that Bill Shorten remained unpopular. We know there was a strong and consistent trend back towards the Coalition. We know that many of the final polls put Labor and the Coalition at 51/49 respectively, with the final 49/51 result within margins of error.

There was no reason to think that the trend to the Coalition would end when the last survey respondent put their phone down. And, with voters shifting towards the Coalition throughout the campaign, there was every reason to expect that undecided voters would break this way on election day, too.

Where there was an error was in the assumptions about preference flows used to calculate two-party-preferred (2PP) figures for the major parties. Many of the major polls underestimated the degree to which Palmer and One Nation preferences would flow to the Coalition, particularly in Queensland. This trend was decisive in that state, and hence in the outcome.
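To see why the flow assumption matters so much, here is a minimal sketch of the arithmetic. The primary votes and flow rates below are purely illustrative, not the published figures: the point is that shifting the assumed minor-party flow by ten points can flip a 2PP estimate from one side of 50 to the other.

```python
def two_party_preferred(coalition_primary, labor_primary, flow_to_coalition):
    """Estimate the Coalition's two-party-preferred (2PP) share.

    coalition_primary, labor_primary: first-preference shares (percent).
    flow_to_coalition: assumed fraction of minor-party preferences
    flowing to the Coalition (0.0 to 1.0).
    """
    minor_party_share = 100.0 - coalition_primary - labor_primary
    return coalition_primary + flow_to_coalition * minor_party_share

# Hypothetical primaries: Coalition 41%, Labor 34%, minor parties 25%.
# Assuming 35% of minor preferences flow to the Coalition vs 45%:
low = two_party_preferred(41.0, 34.0, 0.35)   # 41 + 0.35 * 25 = 49.75
high = two_party_preferred(41.0, 34.0, 0.45)  # 41 + 0.45 * 25 = 52.25
```

With identical primary votes, the two flow assumptions put the Coalition on opposite sides of 50 per cent — which is exactly the kind of error that can make a "49/51" poll and a "51/49" result describe the same underlying vote.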

Another error was in a reliance on “scoreboard” polling that reported on voting intentions at a particular point in time, without any qualitative insights. On Friday night, the Eagles were behind for most of the game, before running over the top of Melbourne at Optus Stadium. If you’d checked in at quarter time, half time and three-quarter time, you would know the Eagles were behind at each break. If you then came back at full time to discover the Eagles had won, you wouldn’t say the scoreboard had misled you earlier in the match.

There are plenty of reasons why the Eagles got back into the game and eventually rolled over the top of Melbourne. There are plenty of reasons why the Coalition won as well, and they should be obvious to anyone who has asked the right questions in their research. In CGM Research’s poll of voters in WA’s five most marginal electorates at the start of May, we found that the economy, jobs and taxes were becoming increasingly important issues, that the Liberals were seen as better on these issues and that this was driving some voters from Labor to the Liberals.

Even without qualitative information like this, there was plenty of evidence in the “scoreboard” polling published to suggest that the Liberals were on their way back and in with a chance.

So, why is everyone pointing the finger at the polling? In my view, people didn’t see what happened during the campaign because of confirmation bias. Many campaigners, analysts and commentators had long expressed strongly held views about the likely outcome and were resistant to evidence to the contrary.

Post-election, it is far easier to blame the process than to accept responsibility for asking the wrong questions or misinterpreting the results. The inaccuracy of polling has become a political meme ever since Donald Trump beat Hillary Clinton in 2016. For the record, much of the polling accurately predicted Clinton winning the popular vote, with analysts giving Trump a one-in-five chance of winning the Electoral College, which is far from zero.

Instead of blaming the polling, my advice for campaigners, analysts and commentators is to do research better. Less of the cheap quantitative voter ID that simply tells us the score. More of the qualitative deep dives that tell us how people are feeling and what they are thinking about, to give us insights into where their votes might go on election day and how to influence this.

Despite having all the information above, I also got the result wrong. I considered the possibility of a hidden Coalition vote based on economic issues. However, I was assured Labor would pick up more seats in Victoria. I then assumed Labor would get across the line in enough of the many well-resourced 50/50 marginal seat campaigns across the country to secure a narrow victory.

If CGM Research had run our marginal seats poll across the country, I probably would have concluded differently. We would likely have discovered that concerns about the economy, jobs and taxes were being felt nationwide and that this wave was likely to wash over all of the marginals.

I was wrong, not because my polling was wrong, but because I didn’t have enough of it.

Daniel Smith is executive director and founder of CGM Communications and Director of CGM Research. He directed research and advertising for WA Labor’s successful 2017 state election campaign.

Download our exclusive research report here

CGM Research Report May 2019