|
Post by archaeologist on Jun 21, 2017 7:01:57 GMT
Just thought I would ask, since I've not seen much discussion about the performance of the polls on 8th June, and Anthony Wells has gone very quiet over at UK Polling Report.
A. Do we know why Survation got the result more accurately than other pollsters? What was different about their methodology?
B. Will there be a polling organisation inquest and report into how they got it so wrong?
C. Will we (and the media) stop believing in polls now? - Sadly, I doubt it.
Is it all down to wrongly modelling youth turnout? I somehow doubt it. There were big swings from UKIP and Green voters, and very big regional differences in relative swings between Con and Lab. I can't see the youth vote adding more than 2% at most to Labour and that would still leave most of the polls way off the result.
|
|
clyde1998
SNP
Green (E&W) member; SNP supporter
Posts: 1,765
|
Post by clyde1998 on Jun 21, 2017 19:24:43 GMT
Just some numbers from the pollsters who polled in the final two days of the campaign:

Data tables available:
- BMG - 2.4% average party error; 11% lead error
- ComRes - 1.9% average party error; 8% lead error
- ICM - 2.2% average party error; 10% lead error
- Ipsos Mori - 1.6% average party error; 6% lead error
- Opinium - 1.6% average party error; 5% lead error
- Panelbase - 1.6% average party error; 6% lead error
- Survation - 0.9% average party error; 2% lead error
- YouGov - 2.1% average party error; 5% lead error

No data tables available:
- Kantar - 1.2% average party error; 3% lead error
- Qriously - 1.8% average party error; 5% lead error
- SurveyMonkey - 1.8% average party error; 2% lead error

The Conservatives were, on average, underestimated by 0.4%; Labour by 4.5%; the Lib Dems by 0.1%. UKIP were overestimated by 2.5%; the SNP by 1.0%; the Greens by 0.4%; and Others by 1.0%.

Pollsters seem to have underestimated the movement from UKIP to Labour, and possibly from the SNP to Labour. It would be interesting to see the split between telephone and online polls - something I will probably look at.

In terms of a pollster order of accuracy:
- Survation
- Kantar
- SurveyMonkey
- Opinium
- Qriously
- YouGov
- Ipsos Mori
- Panelbase
- ComRes
- ICM
- BMG
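The two error metrics quoted above can be sketched in a few lines. This is only an illustration: the `poll` shares below are invented, not any real pollster's final numbers (the `actual` shares are the GE2017 UK-wide result).

```python
# "Average party error" = mean absolute difference between polled and
# actual vote shares; "lead error" = error on the Con-Lab gap.

actual = {"Con": 42.4, "Lab": 40.0, "LD": 7.4, "UKIP": 1.8}  # GE2017 UK shares (%)
poll = {"Con": 44.0, "Lab": 36.0, "LD": 8.0, "UKIP": 4.0}    # hypothetical final poll

def average_party_error(poll, actual):
    """Mean absolute error across the listed party shares."""
    return sum(abs(poll[p] - actual[p]) for p in actual) / len(actual)

def lead_error(poll, actual, a="Con", b="Lab"):
    """Absolute error on the gap between the two largest parties."""
    return abs((poll[a] - poll[b]) - (actual[a] - actual[b]))

print(round(average_party_error(poll, actual), 2))  # 2.1
print(round(lead_error(poll, actual), 1))           # 5.6
```

Note the two metrics can disagree: a poll can have every party a little wrong but the gap nearly right (as SurveyMonkey's figures above suggest), or vice versa.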
|
|
|
Post by archaeologist on Jun 22, 2017 7:05:22 GMT
In the final polls there was a serious underestimation of the Labour vote share, ranging as high as 8%. The two telephone polls did better on the whole than the online ones (a reversion to the usual pattern, after the referendum had suggested otherwise).
- BMG - 8%
- ComRes - 7%
- ICM - 7%
- MORI (Tel) - 4.5%
- Opinium - 4.7%
- ORB - 5%
- Panelbase - 5%
- Survation (Tel) - 0.6%
- Kantar/TNS - 3%
- YouGov - 6%
Conservative error ranged from -1.9% to +2.6%. For the Liberal Democrats it was -0.6% to +2.4%. For UKIP - whose error was the second largest after Labour's - it ranged from +0.5% to +3.1%.
It does seem the UKIP v Lab relationship was the major error factor. I'm not sure about the error due to different turnout weightings though.
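Those "ranged from ... to ..." figures can be reproduced mechanically; a minimal sketch, with invented poll shares rather than the real data tables:

```python
# Signed error = poll share minus actual share, per pollster; the quoted
# range is then the min/max across pollsters. Negative = underestimated.

actual_lab = 40.0  # GE2017 Labour share (%)
final_polls = {"PollsterA": 34.0, "PollsterB": 35.0, "PollsterC": 39.4}  # hypothetical

signed_errors = {name: share - actual_lab for name, share in final_polls.items()}
low, high = min(signed_errors.values()), max(signed_errors.values())
print(round(low, 1), round(high, 1))  # -6.0 -0.6
```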
|
|
The Bishop
Labour
Down With Factionalism!
Posts: 39,071
Member is Online
|
Post by The Bishop on Jun 22, 2017 9:58:27 GMT
In terms of a pollster order of accuracy:
- Survation
- Kantar
- SurveyMonkey
- Opinium
- Qriously
- YouGov
- Ipsos Mori
- Panelbase
- ComRes
- ICM
- BMG
YouGov and MORI would both have placed higher on that table had they not messed around with their final polls in somewhat dubious fashion.
|
|
YL
Non-Aligned
Either Labour leaning or Lib Dem leaning but not sure which
Posts: 4,917
Member is Online
|
Post by YL on Jun 22, 2017 17:13:47 GMT
Panelbase also changed their methodology during the campaign, didn't they?
Overall it seems that the pollsters who were furthest away this time tended to be those who made the biggest adjustments designed to retrospectively get the 2015 result right. Part of the problem was that one post-2015 adjustment was to use turnout models based entirely on previous elections, so if turnout patterns changed they were always going to be wrong. Estimating turnout is hard, but that approach seemed like a cop-out, and a risky one too. (I think there was more to those turnout models than just age.)
IMO the big story was that YouGov model. Vaguely similar things had appeared before, but I think this was the first time something like that actually worked. The difference may have been that this one was done by data science professionals who knew what they were doing...
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Jun 22, 2017 23:56:26 GMT
Yes, the ICM and ComRes adjustments in particular depended on downgrading actual current polling information in favour of historical rules of thumb which turned out not to apply to this election. A fairly sucky way to poll IMO.
|
|
|
Post by froome on Jun 23, 2017 7:20:29 GMT
Another factor must be that many voters are making their voting decision much later than they used to, and in many cases are willing to change it right up to the moment they actually vote. My anecdotal sense of this election was that the likely outcome was shifting dramatically day by day, and that if the election had been a couple of days before or after June 8th, the result could have been very different again.
|
|
clyde1998
SNP
Green (E&W) member; SNP supporter
Posts: 1,765
|
Post by clyde1998 on Jun 23, 2017 7:33:58 GMT
Something that's come to mind is that there will probably be people who went to the polling station thinking they'd vote for one party, only to find out that there wasn't a candidate for that party. This might explain the slight overestimation of UKIP and Green support and the slight underestimation of Labour and Conservative support.
I seem to recall someone was polling by asking about the specific candidates standing in the respondent's constituency. I don't know if it was Survation. If it was, that could be a major factor in why they were the most accurate.
Also, the SNP were overestimated by 1% in UK-wide polls. There appeared to be very late movement from the SNP to Labour, so it's possible this wasn't picked up in those polls. That late movement would account for more of the underestimation of the Labour vote.
|
|
The Bishop
Labour
Down With Factionalism!
Posts: 39,071
Member is Online
|
Post by The Bishop on Jun 23, 2017 10:09:59 GMT
Something that's come to mind is that there will probably be people that went to the polling station thinking that they'd vote for one party, only to find out that there wasn't a candidate for that party. This might explain the overestimation of the UKIP and Green (by a slight amount) support and the underestimation of the Labour and Conservative (by a slight amount) support. I seem to recall someone was polling asking about the specific candidates available in the respondent's constituency. I don't know if it was Survation. If it was it could be a major factor as to why they were the most accurate. Also the SNP were overestimated by 1% in UK-wide polls. There appeared to be very late movement from the SNP to Labour - so it's possible that this wasn't picked up in these polls. This late movement would account for more of underestimation of the Labour vote.

A few late Scottish surveys did - including one by Survation which was (again) widely dismissed.
|
|
|
Post by kvasir on Jun 23, 2017 14:49:12 GMT
I feel like the establishment and the elite need to give a huge apology to Survation. They stuck to their guns and didn't herd and were treated pretty poorly by the media and other pollsters. Anyone remember that Daily Politics show where they all just laughed at the Survation guy?
I'm just impressed they didn't herd. A brave decision.
|
|
The Bishop
Labour
Down With Factionalism!
Posts: 39,071
Member is Online
|
Post by The Bishop on Jun 23, 2017 15:00:33 GMT
YouGov also deserve praise for sticking with their seat by seat model despite similar derision ("Labour gain Canterbury, you say??!!?? ROFLMAO")
Just a shame that they fiddled their final "regular" poll - unmistakable whiff of bet-hedging there.
|
|
|
Post by kvasir on Jun 23, 2017 17:15:41 GMT
Indeed. The strong incentive to herd is one of the biggest problems with pollsters at the moment, which is why, even though YouGov did well, Survation deserve much more praise. They were either going to be very right or very wrong, and I'm glad the pollster that didn't herd was rewarded.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Jun 23, 2017 19:05:42 GMT
YouGov also deserve praise for sticking with their seat by seat model despite similar derision ("Labour gain Canterbury, you say??!!?? ROFLMAO") Just a shame that they fiddled their final "regular" poll - unmistakable whiff of bet-hedging there.

Any proof it was fiddled? They could have just been unlucky?
|
|
|
Post by archaeologist on Jun 23, 2017 22:17:25 GMT
I agree Survation deserve praise for not herding. But what methodology did they adopt that made them so accurate compared to the others?
|
|
The Bishop
Labour
Down With Factionalism!
Posts: 39,071
Member is Online
|
Post by The Bishop on Jun 24, 2017 9:19:20 GMT
YouGov also deserve praise for sticking with their seat by seat model despite similar derision ("Labour gain Canterbury, you say??!!?? ROFLMAO") Just a shame that they fiddled their final "regular" poll - unmistakable whiff of bet-hedging there. Any proof it was fiddled? They could have just been unlucky?

Both they and MORI made methodology changes for their final pre-election surveys. In both cases, they just happened to lower the Labour score significantly. Wrongly, as it turned out.
|
|
clyde1998
SNP
Green (E&W) member; SNP supporter
Posts: 1,765
|
Post by clyde1998 on Jun 24, 2017 9:32:33 GMT
I agree Survation deserve praise for not herding. But what methodology did they adopt that made them so accurate compared to the others?

I've just had a look at the question Survation asked: "Thinking about your [X] constituency, I'm now going to read you the names of the candidates standing. Which of these candidates are you most likely to vote for in your constituency?"

I knew that someone was asking specifically about constituencies. This backs up my earlier comment: people who would vote Green or UKIP wouldn't realise, with a standard question, that there wasn't a candidate for that party in their constituency, whereas with the question styled like this they would. Additionally, it may pick up some incumbency bonuses (particularly for the Tories and Labour), as people may like their local candidate despite preferring another party. I think this was a major factor in Survation being the most accurate pollster in this election.
|
|
msc
Non-Aligned
Posts: 910
|
Post by msc on Jun 24, 2017 10:12:59 GMT
I agree Survation deserve praise for not herding. But what methodology did they adopt that made them so accurate compared to the others? I've just had a look at the question Survation asked: "Thinking about your [X] constituency, I'm now going to read you the names of the candidates standing. Which of these candidates are you most likely to vote for in your constituency?" I knew that someone was asking specificity about constituencies. This backs up my earlier comment about how people who would vote Green or UKIP wouldn't realise with a standard question that there wasn't a candidate for that party in their constituency, whereas with the question styled like this they would. Additionally, it may pick up some incumbency bonuses (particularly for the Tories and Labour) as people may like their local candidate, despite preferring another party. I think this would be a major factor in Survation being the most accurate pollster in this election.

YouGov also asked constituency specific voting intent questions, complete with named candidates, and with the Greens removed.
|
|
|
Post by manchesterman on Jun 25, 2017 17:12:29 GMT
In the final polls there was a serious underestimation of Labour vote share, ranging up as high as 8%. The two telephone polls did better on the whole than online (a reversion to the usual pattern after the referendum said otherwise). BMG - 8% ComRes - 7% ICM - 7% MORI (Tel) - 4.5% Opinium - 4.7% ORB - 5% Panelbase - 5% Survation (Tel) - 0.6% Kantar/TNS - 3% YouGov - 6% Conservative error ranged from -1.9 to +2.6% error. For the Liberal Democrats it was -0.6 to + 2.4% error. For UKIP - the second largest after Labour - from +0.5% to +3.1%. It does seem the UKIP v Lab relationship was the major error factor. I'm not sure about the error due to different turnout weightings though.

Could it be as rudimentary as the polling companies - having had their fingers burned in 2015 (and in previous elections too) by underestimating the so-called "shy Tory" factor - putting in some sort of compensatory adjustment and just ending up over-compensating?
|
|
The Bishop
Labour
Down With Factionalism!
Posts: 39,071
Member is Online
|
Post by The Bishop on Jun 25, 2017 17:22:59 GMT
Well, if they built in an allowance for "shy Tories" then that helps explain how they got it wrong.
"Shy Tories" were last a significant factor for polling error in 1992 - other reasons explain the collective failure two years ago.
|
|
clyde1998
SNP
Green (E&W) member; SNP supporter
Posts: 1,765
|
Post by clyde1998 on Jul 6, 2017 5:33:27 GMT
I've graded opinion pollsters on their accuracy over the previous four major elections in the UK.

Methodology:
- My research covered the Scottish independence referendum 2014, General Election 2015, EU Membership referendum 2016 and General Election 2017, plus any Scotland and Wales polls for the two general elections.
- My research looked at the difference between the two largest parties (or both referendum options), using the difference between the final poll for that company and the actual result.
- Only polls completed three days before the election are included.
- Pollsters must have completed a poll in at least two different elections to be included in the main list, other pollsters listed at the bottom.
- Any result with less than a 3% error is shown in bold green font.
Pollster | IndyRef | GE2015 | GE2015 (S/W) | EURef | GE2017 | GE2017 (S/W) | Average | Grade
SurveyMonkey | N/A | 0.6% | N/A | N/A | 1.5% | N/A | 1.1% | A*
TNS/Kantar | N/A | N/A | N/A | 1.4% | 2.5% | N/A | 2.0% | A
Survation | 5.1% | 0.6% | 5.7% | 4.9% | 1.5% | 1.7% | 3.3% | B
YouGov | 2.6% | 6.6% | 5.0% | 5.8% | 4.5% | 3.3% | 4.6% | C
Opinium | 6.2% | 5.6% | N/A | 2.7% | 4.5% | N/A | 4.7% | C
Panelbase | 5.3% | 8.6% | 3.7% | N/A | 5.5% | 2.7% | 5.2% | D
Ipsos Mori | 5.3% | 5.6% | N/A | 7.0% | 5.5% | N/A | 5.8% | D
ICM | 5.9% | 7.6% | N/A | N/A | 9.5% | N/A | 7.7% | F
ComRes | N/A | 5.6% | N/A | 10.5% | 7.5% | N/A | 7.9% | F
BMG | N/A | 6.6% | N/A | N/A | 10.5% | N/A | 8.6% | F
Populus | N/A | 6.6% | N/A | 13.8% | N/A | N/A | 10.2% | U

Pollsters with only one qualifying poll:
Qriously | N/A | N/A | N/A | N/A | 4.5% | N/A | 4.5% | C
Lord Ashcroft | N/A | 6.6% | N/A | N/A | N/A | N/A | 6.6% | E
Obviously polls are snapshots and not predictions, but this gives a rough guide to the accuracy of each pollster in recent elections, at both a national and a regional level. Other methods will produce different results; I based this on the gap between the two biggest parties in the final poll compared to the gap in the final results.
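The grading scheme above can be sketched as below. The grade boundaries are my own guesses inferred from the table (the post doesn't state them), so treat them as illustrative:

```python
# Average a pollster's lead errors (gap between the two largest
# parties/options, final poll vs result) across elections, skipping
# N/A entries, then map the average to a letter grade.

def grade(avg_lead_error):
    """Assumed grade bands, reverse-engineered from the table."""
    bands = [(2.0, "A*"), (3.0, "A"), (4.0, "B"), (5.0, "C"),
             (6.0, "D"), (7.0, "E"), (9.0, "F")]
    for ceiling, letter in bands:
        if avg_lead_error < ceiling:
            return letter
    return "U"

# Survation's row from the table (percentage-point lead errors).
survation = [5.1, 0.6, 5.7, 4.9, 1.5, 1.7]
avg = sum(survation) / len(survation)  # ~3.3
print(grade(avg))  # B
```

These bands reproduce every grade in the table (e.g. 1.1 gives A*, 6.6 gives E, 10.2 gives U), but other boundaries would fit the same data.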
|
|