Election 2020: What Happened

How the Polls Missed in 2020

In the aftermath of a closer-than-anticipated general election, pollsters have received no shortage of criticism for “missing” the election’s outcome. This criticism is well warranted. The day before the election, our Lunchtime Politics polling average had Joe Biden leading by 7 points, well above the roughly 3-point margin of victory he appears likely to finalize as the remaining vote counts conclude.

On the Biden vote…

Most national polls were decently accurate on Joe Biden’s share of the vote. Below are the percentages from the last national presidential polls reported in the Lunchtime Politics newsletter prior to the election:

Pollster           Biden Vote %   Difference (Poll minus Actual Vote)
Quinnipiac         50%            -1
USC                54%            +3
YouGov             53%            +2
WSJ/NBC            52%            +1
CNBC               52%            +1
Morning Consult    52%            +1
SurveyUSA          52%            +1
RMG                51%            Same
Reuters            52%            +1
IBD/TIPP           50%            -1
Rasmussen          48%            -3
Fox News           52%            +1
Emerson            50%            -1
Hill/HarrisX       49%            -2

Most major pollsters were off by amounts within the margin of error, many by just a single point. The Trump vote is another story altogether: many pollsters greatly underestimated Trump’s share of the vote.

Pollster           Trump Vote %   Difference (Poll minus Actual Vote)
Quinnipiac         39%            -9
USC                43%            -5
YouGov             43%            -5
WSJ/NBC            42%            -6
CNBC               42%            -6
Morning Consult    44%            -4
SurveyUSA          44%            -4
RMG                44%            -4
Reuters            45%            -3
IBD/TIPP           46%            -2
Rasmussen          47%            -1
Fox News           44%            -4
Emerson            45%            -3
Hill/HarrisX       45%            -3

The likely culprit for underestimating the Trump vote: inflated shares of third-party voters and of voters who said they did not yet know whom they would vote for.

Pollster           Don’t Know %   Third-Party Vote %
Quinnipiac         9%             2%
USC                4%             n/a
YouGov             2%             2%
WSJ/NBC            3%             3%
CNBC               3%             3%
Morning Consult    3%             2%
SurveyUSA          2%             3%
RMG                4%             1%
Reuters            1%             3%
IBD/TIPP           3%             2%
Rasmussen          2%             3%
Fox News           2%             2%
Emerson            2%             2%
Hill/HarrisX       3%             2%

Third-party candidates received about 1.5% of the total vote nationally. In a close contest, the misallocation of a few points like this can make a difference. Many pollsters also had high shares of likely voters who had not yet made up their minds about the election.

For most of these polls, reallocating a point of the third-party vote and all of the “Don’t Know” vote to Trump brings his total to just about where it should be (a rough sketch of that arithmetic follows the list below). But why didn’t this happen? There are a few possibilities:

  1. Late Breakers: There could have been a late break of undecided voters toward Trump in the final days of the election. Most polls showed a tightening race, and late-deciding Trump voters may account for some of the discrepancy. However, a near-unanimous break toward Trump among late deciders strains credibility.
  2. Shy Trumpers: A small percentage of voters may have felt uncomfortable expressing a preference for Trump, even in online polls. These voters selected “don’t know” or a third-party candidate in the surveys, inflating those responses. Many of the “don’t know” voters described themselves as political independents, which further reinforces this hypothesis. Even if this group is only 2% of the electorate, that would be enough to account for much of the discrepancy.
  3. Flimsy Likely Voter Screens: All pre-election polls were fielded among “likely voters”: those who self-report that they are going to vote (or already have voted) in the election. The high percentage of “don’t know” respondents may mask people who never really intended to vote or ultimately did not. This is hard for pollsters to solve or even to prove, but it may account for some of the inconsistency.
  4. Bad Sampling: Modern polling techniques make it difficult to ensure that samples are representative of what the electorate will actually look like. Most rely on guesswork on the part of the pollster. These guesses are often educated, but they are assumptions that shape the data collected rather than conclusions drawn from the data itself. Here, pollsters were off about what the electorate would look like, undercounting particular demographic groups that lean toward Trump.
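
As a rough back-of-the-envelope check, the short Python sketch below applies exactly that reallocation to the figures in the tables above: each poll’s entire “don’t know” share, plus any third-party allocation above the roughly 1.5% those candidates actually received, is handed to Trump, and the result is compared against a benchmark of about 48% for Trump’s actual national share (the figure implied by the differences in the tables). USC is left out because one of its figures is missing, and handing Trump every undecided voter is a deliberate simplification, not a claim about how those voters actually broke.

    # Rough reallocation sketch using the poll figures from the tables above.
    # Benchmark assumptions (simplifications, not official figures):
    #   - Trump's actual national share: ~48% (implied by the table differences)
    #   - Actual third-party share: ~1.5%
    ACTUAL_TRUMP = 48.0
    ACTUAL_THIRD_PARTY = 1.5

    # pollster: (Trump %, "don't know" %, third-party %); USC omitted (incomplete row)
    polls = {
        "Quinnipiac":      (39, 9, 2),
        "YouGov":          (43, 2, 2),
        "WSJ/NBC":         (42, 3, 3),
        "CNBC":            (42, 3, 3),
        "Morning Consult": (44, 3, 2),
        "SurveyUSA":       (44, 2, 3),
        "RMG":             (44, 4, 1),
        "Reuters":         (45, 1, 3),
        "IBD/TIPP":        (46, 3, 2),
        "Rasmussen":       (47, 2, 3),
        "Fox News":        (44, 2, 2),
        "Emerson":         (45, 2, 2),
        "Hill/HarrisX":    (45, 3, 2),
    }

    print(f"{'Pollster':<16} {'Trump poll':>10} {'Adjusted':>8} {'Gap':>6}")
    for name, (trump, dont_know, third_party) in polls.items():
        # Hand Trump every undecided voter plus the excess third-party allocation.
        excess_third_party = max(third_party - ACTUAL_THIRD_PARTY, 0)
        adjusted = trump + dont_know + excess_third_party
        gap = adjusted - ACTUAL_TRUMP
        print(f"{name:<16} {trump:>9}% {adjusted:>7.1f}% {gap:>+6.1f}")

Most of the adjusted figures land within a couple of points of the 48% benchmark, consistent with the pattern described above; the few that do not (Rasmussen overshoots, YouGov still falls short) are a reminder that a blanket all-to-Trump reallocation is only a crude approximation.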

It’s likely that all four of these issues had some effect. To be fair, several pollsters did accurately predict a close race with a 3- or 4-point advantage for Biden. But something needs to change in light of how badly most polls underestimated Trump’s vote share. At the very least, pollsters need to deeply reconsider how they can effectively reach right-wing voters.

Andrew Rugg
andrew.rugg@certusinsights.com

Andrew Rugg is an expert in survey research, media analytics, and qualitative research projects. Before leading the team at Certus, he led a fully integrated research department at a public relations agency.