Added: 29 December 2008
2. Can we count on them?
3. How polls are conducted
4. The Key Pollsters
5. The British Polling Council
6. When it goes wrong
The results of the latest poll are rarely out of the news and if you’re a researcher, you should do your best to keep up. An opinion poll is simply a scientific survey designed to measure the views of a specific group. By using methods which ensure a sample representative of the population, pollsters can predict an election result by polling perhaps only one or two thousand people – a bit like a chef taste-testing a whole pan of soup with only a teaspoonful. The best established polling organisations (see below) have ironed out the chinks in their methodology and can usually produce polls accurate to within 3%.
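That figure of around 3% from a sample of one or two thousand people follows from basic sampling theory: for a simple random sample, the 95% margin of error depends mainly on the sample size, not the size of the population. A quick sketch using the standard textbook formula (not any particular pollster's method, which will also involve weighting):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample.

    p=0.5 is the worst case (maximum variance); z=1.96 is the
    critical value for 95% confidence.
    """
    return z * math.sqrt(p * (1 - p) / n)

for n in (1000, 2000):
    print(f"n={n}: within {margin_of_error(n):.1%}")
# n=1000 gives roughly 3.1%, n=2000 roughly 2.2%
```

Note the diminishing returns: doubling the sample from 1,000 to 2,000 only trims about a point off the margin of error, which is why pollsters rarely bother with much bigger samples.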
Opinion polls showing the relative state of support for each party are commissioned by newspapers and published monthly. These are known as ‘peacetime polls’ and ask respondents the hypothetical question of how they would vote if there were an election tomorrow. During an election campaign, polls are published daily, feeding the 24-hour news machine and giving the pundits something to speculate about. Parties tend to use their own private polling to guide their strategy at such crucial times, but will certainly take an interest in the results of the bigger, well-established newspaper polls. For example, despite his insistence to the contrary, the six-point Tory lead in the polls in autumn 2007 was seen by the media as the reason Gordon Brown eventually decided against a snap election.
2. Can we count on them?
The weight given to polling predictions is compromised by the simple fact that polls can sometimes be rather hit and miss. In 1992, for example, the polls wrongly predicted both a hung parliament and a slight victory for Labour. In 1997 the scale of the Labour landslide was not revealed in polling, adding to the surprise and drama when the results came in. Since the 1990s, the polling industry has undergone some significant soul searching to avoid repeating those mistakes and has made some changes in methodology to improve accuracy – notably by no longer assuming that a sample which is demographically representative of the population is going to be politically representative.
In the last election (2005) more polls were conducted than ever before – a total of 51 during the five-week campaign period – by MORI, YouGov, ICM, NOP and Populus. The polls got it just about right this time and correctly estimated the percentage of votes (and number of seats) held by each party. Though they slightly overestimated the Labour/Lib Dem votes and underestimated the Conservatives, the margin of error was considered acceptable. One poll by NOP got the result exactly right in its final prediction before the election (Labour 36%, Conservative 33%, Liberal Democrat 23%). Polls may not always be perfect, but they’re the best measure we have of how the public thinks.
3. How polls are conducted
Organisations vary in the key questions they ask their respondents and in how they calculate results, often correcting or ‘weighting’ the sample one way or another. The following methodology, used by Populus, illustrates how a well established and experienced organisation obtains its results.
Populus derives voting intention from four basic questions. Respondents are asked:
- How likely it is that they will actually vote at the next general election
- Which party they would support if there were a general election tomorrow
- Whether they voted at the last general election
- If yes, for which party
Populus then excludes those who say they will not vote, who refuse to answer or who don’t know which party they would vote for. The figures are then adjusted for turnout on the basis of respondents’ declared likelihood of voting (each person interviewed places themselves on a ten-point scale of likelihood to vote).
They then weight the whole sample on the basis of its ‘past vote’ – pooling the most recent poll data with the 10 previous polls (so as to avoid the random volatility that can appear when comparing any two individual samples) and calculating the past-vote weighting from the average recalled past vote in this pooled data.
An additional step is then taken to address the tendency for ‘spirals of silence’ among supporters of unpopular parties: voters tend not to reveal an intended vote for an unpopular or unfashionable party, which introduces an inadvertent bias into polls. This is what happened in the 1992 election polling – simply put, more people voted Conservative than were prepared to admit it. Large numbers of former voters for one or other party will say that they don’t know how they are going to vote next time. The methodological response is therefore to put some of these ‘don’t knows’ back into the column of the party they voted for at the previous election.
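The steps above – exclude non-voters, weight by declared likelihood of voting, and reallocate a share of ‘don’t knows’ to their previous party – can be sketched in code. This is a minimal illustration, not Populus’s actual algorithm: the field names, the likelihood-divided-by-ten turnout weight and the 0.5 reallocation fraction are all invented for the example, and the past-vote weighting of the whole sample is omitted.

```python
# Illustrative sketch of poll weighting, not any pollster's real method.
# All respondent data and tuning constants below are invented.

def headline_figures(respondents, spiral_fraction=0.5):
    """respondents: list of dicts with keys
    'intention'  - party name, 'dont_know' or 'refused'
    'likelihood' - 1-10 scale of how likely they are to vote
    'past_vote'  - party voted for at the last election, or None
    Returns each party's share of the weighted vote, as percentages."""
    weighted = {}
    for r in respondents:
        weight = r["likelihood"] / 10  # turnout adjustment (assumed form)
        if r["intention"] in ("dont_know", "refused"):
            # 'Spiral of silence' step: put a fraction of don't-knows
            # back into the column of the party they voted for last time.
            if r["past_vote"]:
                weighted[r["past_vote"]] = (
                    weighted.get(r["past_vote"], 0) + spiral_fraction * weight
                )
            continue  # otherwise excluded entirely
        weighted[r["intention"]] = weighted.get(r["intention"], 0) + weight
    total = sum(weighted.values())
    return {party: 100 * w / total for party, w in weighted.items()}

sample = [
    {"intention": "Labour", "likelihood": 9, "past_vote": "Labour"},
    {"intention": "Conservative", "likelihood": 10, "past_vote": "Conservative"},
    {"intention": "dont_know", "likelihood": 8, "past_vote": "Conservative"},
    {"intention": "Lib Dem", "likelihood": 5, "past_vote": None},
]
print(headline_figures(sample))
```

Note how the don’t-know respondent nudges the Conservative share upwards, exactly the correction the 1992 experience prompted.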
4. The Key Pollsters
YouGov – published in the Daily Telegraph, Sunday Times, Daily Mail – www.yougov.com
YouGov uses online panels to provide research for public policy, market research and stakeholder consultation. Operating a panel of over 200,000 UK members representing all ages, socio-economic groups and other demographic types, it considers that online polling elicits a higher quality of response, as respondents are able to answer questions at a time and in a format convenient to them, allowing a more considered view.
You can participate in their polls yourself by joining the YouGov panel.
Ipsos MORI – various – www.ipsos-mori.com
Ipsos MORI has been conducting political polling since 1979 and runs a monthly monitor which asks respondents their opinions on voting intention, likelihood to vote, satisfaction with government, the most important issues facing Britain and economic optimism. Their website is particularly useful for finding the results of ‘the best party for…’ polls, where you can track public opinion of which is the best party for dealing with issues such as crime, taxation and the economy.
ICM – published in The Guardian – www.icmresearch.co.uk
ICM use telephone, internet and face-to-face methods of research to compile their polls, the Guardian Voting Series being done by phone. In the ‘Media Centre’ section of their website you can access the results of all their polls published in the Guardian since 1982, handy if you’re interested to know who was riding high in the polls on the day you were born.
NOP – various – www.gfknop.com
GfK NOP have been involved in opinion polling since the company was founded in 1957. As the leading research partner to public sector bodies, their website is useful for finding research they’ve carried out for various government departments, local authorities and government agencies.
In the 2005 general election, NOP’s final poll prediction was exactly correct for each of the three main parties, only the second time this has happened in the history of British election polling. In addition, the exit poll it conducted jointly with MORI for the BBC and ITN exactly predicted the size of the Labour majority in the House. The organisation originally conducted its polling by face-to-face interviewing; it now works mostly by phone and online.
Populus – published in The Times – www.populus.co.uk
Among other research methods, Populus run the Populus Telephone Omnibus of 1,000 people twice a week and the Populus Online Omnibus of 1,000 people via the internet every weekend.
Their poll archive is useful for finding results on a particular area of public opinion, for example the environment or economy, as well as general political party opinion polling. Their website is your best bet for more detailed information on research methods, covering focus groups and stakeholder interviews as well as polling.
5. British Polling Council – www.britishpollingcouncil.org
Established after the wide-of-the-mark polling fiasco in 1992, the BPC was set up by the leading polling organisations to:
- Ensure standards of disclosure designed to provide that consumers of survey results entering the public domain have an adequate basis for judging the reliability and validity of the results (i.e. soon after a result is published, the public should have access to detailed statistical tables and information on how the poll was conducted).
- Through full disclosure encourage the highest professional standards in public opinion polling.
- Advance the understanding, among politicians, the media and the general public, of how polls are conducted and how to interpret poll results.
- Provide interested parties with advice on best practice in the conduct and reporting of polls.
If you’re interested in more detail on polling, their website is well worth a visit. There is an excellent series of seminar papers written by BPC members after the 2005 election, in which the performance of the polls is evaluated.
6. When it goes wrong
Finally, a look at when it does go wrong and, in some cases, why. Pollsters, it seems, have been erring since well before 1992… (the author notes without comment that the best examples of polling error seem to come from the US…).
1951 General Election
Polls predicted a comfortable win for the Conservatives in opposition led by Churchill, but early indications showed a Labour lead. The Conservatives finally won with a much smaller margin than the polls had predicted, Labour actually polling more votes but the first-past-the-post system ensuring the Tories ended up with more seats.
1970 General Election
An easy Labour win for incumbent Harold Wilson was indicated by the polls, putting the party 12.4 points ahead of the Conservatives. However, on the day a late swing gave the Tories a 3.4% lead.
1936 US Presidential Election
Literary Digest magazine used 10 million postcards to ask people how they would vote. Unfortunately, they sent the postcards to people who owned telephones, magazine subscribers and car owners, a sample biased toward Republican voters who were more likely to be able to afford phones, cars and magazines in the middle of the Great Depression. The second error was a low response rate, as only 2.3 million postcards were returned. The Digest also did not gather information to monitor the quality of its sample and weight it to compensate for the fact that some groups may be over-represented and others under-represented.
Their results therefore showed a lead for Alfred Landon of 57% to Roosevelt’s 43%. In the event, Roosevelt won 60%. A new-kid-on-the-block pollster – who was laughed off by the Digest for predicting they would get it wrong – used a much smaller but representative sample and correctly showed that Roosevelt was on course for a landslide win. His name? George Gallup.
1948 US Presidential Election
Aided by erroneous polling, newspapers incorrectly called the election for Thomas Dewey over incumbent Harry Truman. The polls were problematic because they used ‘quota sampling’ techniques, which were popular at the time. In this method, interviewers are assigned quotas of people to interview, but decide for themselves exactly whom to approach. With a non-random selection, there is a greater possibility of bias. In this case, the interviewers were apparently most likely to choose middle class respondents (perceived as easier to interview), which introduced a Republican bias into the survey and an over-estimate of Republican support in the results.
1996 Republican Primaries
ABC, CBS and CNN were forced to admit that they had relied on incorrect exit polls when they predicted that Senator Bob Dole would finish third in the Arizona primary; in fact Dole finished second. The mistake was widespread because the networks all relied on data from the Voter News Service. Pollsters said the error crept in because followers of his opponent Pat Buchanan actively sought out the exit pollsters, which exaggerated Buchanan’s apparent share of the vote.
2008 Democratic Primaries
In the New Hampshire primary held on January 8th, polls showed Obama ahead of rival Hillary Clinton by an average of 8 percentage points. In a USA Today/Gallup poll taken the day before, Obama had a 13-point lead over Clinton. However, Clinton won – marginally, by 2.5 points, but surprising the pollsters nevertheless. Gary Langer, director of polling for ABC News, claimed that “it is simply unprecedented for so many polls to have been so wrong.”