
"Over a third of students ‘no longer wish to study medicine’ amid the junior doctors’ contract row, says poll.” This recent headline from The Independent was just one of many about a widely reported survey that appeared to suggest that thousands of medical students are giving up on their vocation. But is there really a looming medical recruitment crisis?

Fortunately for Jeremy Hunt and the NHS – and despite making headlines in many other media outlets – the figures are nonsense. The 37% figure is not drawn from a representative sample of current or potential medical students. Instead it comes from a sample of students who allegedly “had once hoped to study medicine”, approached via a student website: The Student Room.

We cannot conclude from this poll that 37% of medical students are being put off becoming doctors by the junior doctors’ strike, as the headline suggests. Instead, all we know is that 37% of an unknown group of students with a claimed former interest in medicine, recruited from an unrepresentative website using an unknown method, say they have been put off studying a subject which many (perhaps most?) of them were not studying anyway. A rather different story!

As Scottish psephologist John Curtice pointed out recently, polling people properly is hard. He was reporting on the large discrepancy between NatCen’s annual British Social Attitudes survey, which polled a large number of people after the 2015 election and got the result spot on, and the myriad polls in the lead-up to the ballot which, as we know, got it badly wrong.

To conduct a poll properly, we need to gather a sample of people that is representative of the group we are interested in understanding. In this case, we would need a comprehensive list of all medical students in the country, some way of contacting them all and some means of correcting for the differences between those who agreed to take part and those who didn’t. No easy task. Doing a “voodoo poll”, by contrast, is much easier: just gather the opinions of an arbitrarily big group of people by some convenient means – in this case, a student website – and report those as giving a representative picture of views on the matter at hand.
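For readers who want to see the mechanics, here is a minimal sketch in Python of one standard correction technique, post-stratification weighting, which re-balances a lopsided sample so that each subgroup counts in proportion to its true size. Every figure below is invented for illustration; none of it comes from the Student Room poll or any real survey.

```python
# A minimal sketch of post-stratification weighting, one common way of
# correcting for differences between respondents and non-respondents.
# Every number below is invented purely for illustration.

# Share of each subgroup in the population we want to describe
# (e.g. taken from official student enrolment figures)
population_shares = {"medical": 0.08, "other": 0.92}

# Share of each subgroup among the people who actually answered the poll
sample_shares = {"medical": 0.40, "other": 0.60}

# Proportion of each subgroup answering "yes" to the survey question
yes_rate = {"medical": 0.15, "other": 0.50}

# Unweighted estimate: simply mirrors the (skewed) make-up of the sample
raw = sum(sample_shares[g] * yes_rate[g] for g in sample_shares)

# Weight each subgroup by how under- or over-represented it is, then
# recompute the estimate as if the sample matched the population
weights = {g: population_shares[g] / sample_shares[g] for g in sample_shares}
weighted = sum(sample_shares[g] * weights[g] * yes_rate[g]
               for g in sample_shares)

print(f"Unweighted estimate: {raw:.1%}")       # 36.0% - driven by the sample mix
print(f"Weighted estimate:   {weighted:.1%}")  # 47.2% - matched to the population
```

The point of the sketch is not the particular numbers but the logic: without some such correction, a poll simply reports the make-up of whoever happened to answer.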

There are all sorts of good reasons to doubt polls conducted using such voodoo methods. Students who volunteer to answer a poll about the doctors’ strike may be more angry about the strike, more interested in politics generally, or more negative about the government’s approach to reforming the NHS. Students who spend their spare time hanging around on student websites may likewise differ from their peers who prefer to spend their spare time away from laptops. Actual medical students may have very different views on the matter from people who once considered medicine but ended up doing something else.
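To see how badly self-selection alone can distort a result, consider a toy simulation – the population size and response rates below are arbitrary assumptions, chosen only to illustrate the mechanism:

```python
import random

random.seed(1)  # make the illustration reproducible

# Hypothetical population of 10,000 students: 10% genuinely "put off"
population = [True] * 1_000 + [False] * 9_000

# Assumed self-selection: aggrieved students are ten times more likely
# to answer a web poll than everyone else (50% vs 5% response rates)
def responds(put_off):
    return random.random() < (0.50 if put_off else 0.05)

respondents = [p for p in population if responds(p)]

true_rate = sum(population) / len(population)
poll_rate = sum(respondents) / len(respondents)

print(f"True share put off:   {true_rate:.0%}")  # 10%
print(f"Voodoo-poll estimate: {poll_rate:.0%}")  # roughly 50%
```

Only 10% of this hypothetical population is put off, yet more than half of the self-selected respondents are – the kind of gap that turns a non-story into a headline.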

Many of the stories pegged to the poll highlighted above ignored all of these problems and reported The Student Room’s voodoo poll much as they would any other poll, thereby suggesting to readers that this is a reliable source of information on the junior doctors’ row.

This matters for two reasons. First, journalists who treat voodoo polls as valid measures of public opinion are like doctors (or, indeed, health secretaries) who treat homeopathy as valid medical treatment: they give a sheen of legitimacy to bunkum. This in turn pollutes public debate with misinformation, often gathered by organisations with an agenda to push, and reduces the impact of genuine polling, which gets reported (and hence consumed by non-specialists) on an equal basis with voodoo polling.

Second, journalists who take voodoo polling seriously encourage more people to do it. Anyone with access to a large pool of people – social media companies, campaigning organisations, TV channels, even newspapers themselves – can put out a poll to their consumers and then tout the results to reporters in order to push their views, or get some cheap PR. As proper polling is expensive and difficult – and voodoo polling is cheap and easy – the likely consequence is a whole lot of really poor voodoo polling confusing voters and policymakers on issues where getting a good sense of public opinion is often very important.

We shouldn’t be too hard on the journalists taken in by voodoo polls, however. Most have no training in survey methods or statistics and work to very demanding deadlines. A press release with a compelling story based on “new polling” will often be tempting: polls make for good headlines – and numbers add authority to an argument. Yet I’m also sure that journalists want to get to the truth of a matter and would bin useless voodoo polling if they could spot it. So here’s a quick and easy checklist to help journalists (and their readers) sort the authentic polls from the suspect.

Spotting a voodoo poll – five questions to ask

1. Who did the poll?

Most reputable polling in Britain is done by polling companies belonging to the British Polling Council (BPC), which has a code of conduct to ensure basic standards on the collection and release of data are met. If the poll’s done by someone else, that’s a red flag.

2. How was the fieldwork done?

For a poll to be reliable, the people who took part should be representative of the larger group being studied – the vast majority of whom, of course, did not take part. The pollster should be able to detail how they identified the group they surveyed, how they recruited respondents within it and what steps they took to ensure those respondents are representative of the broader group.

3. Is the data public? If not, why not?

“Sunlight is the best disinfectant” – those collecting reliable data should be willing to share it. BPC pollsters are required to provide lots of details on their data, which is freely available for download on their websites. Academics given funding to conduct surveys are held to an even higher standard – they have to make their raw datasets freely available to everyone. Does your pollster provide a link to a detailed report on their results? Are they able to deliver details or data when asked? If they are reluctant to do so, you have to wonder what they are hiding.

4. Do the pollsters have a stake in the outcome?

Many voodoo polls are conducted by organisations with a pretty obvious agenda, and their results tend to fit rather conveniently with that agenda. For example, watch out for a campaign against NHS reforms that finds overwhelming opposition to the reforms, or a stridently anti-immigration newspaper which regularly finds 99% of readers share its views. Some campaigns will indeed have the wind of public opinion in their sails, but as a general rule such agenda-pushing polls should be treated like any sales pitch: with a large dose of scepticism.

5. Are the results surprising?

There is an old dictum in the polling industry known as Twyman’s Law: “If a poll result is surprising, it is usually wrong”. Does the poll show a sudden, inexplicable surge in support for a formerly unpopular position? Overwhelming and passionate backing for an obscure cause? Then assume it is suspect and do some more digging.

If the answers to these questions raise doubts, then get in touch with a pollster or an academic who works with survey data. Both groups will be able to offer an informed take on the polling, and both have a stake in ensuring readers are not misled by dubious polling. A quick phone call will reveal whether a poll is kosher. If it is voodoo polling, you’ll avoid putting your name and your employer’s authority behind junk data. If it’s a kosher survey, you’ll get some useful expert reaction to the findings to help flesh out the story.

Good data on public opinion is vital to inform political debates, but bad data can do nothing but harm. A little extra care can help get the right numbers in front of people, and keep the cowboys from hijacking the discussion.

Robert Ford is senior lecturer in politics at the University of Manchester.

This article was originally published on The Conversation. Read the original article.

 
