
Is this the New Way to Take the Political Pulse of the Nation?

In this recent article on OZY, BSG’s Amy Levin and other research leaders sound off on the future of political polling and the impact of online tools and social platforms.

When pollsters at the Democratic polling firm Benenson Strategy Group sought the opinions of voters older than 50 on the Republican plan to replace Obamacare, they started dialing phone numbers. When they wanted to hear what millennials thought, they formed a sample online. The dial-versus-surf split is a sign of an industry in flux. Reaching a representative sample of Americans is getting harder, and a few high-profile polling misses during the 2016 election underscored the urgency. Pollsters are now pushing online and into social media, a revolution a half-century in the making that could extinguish phone-call sampling within the next few years.

In the 1960s, the dominant method for finding a random sample of people shifted from door-to-door surveys to the less expensive and less time-consuming telephone. Like today, the newer technology risked excluding older and poorer people. But phone calls became the norm, and lower costs helped broaden the polling universe beyond Gallup and Harris. Decades later, the rise of the cellphone brought a new challenge: the numbers are harder to obtain, and federal law prevents mass autodialing of cellphones, a holdover from the days when an unwanted call would sap your minutes. Nowadays, roughly half of the country’s households are cellphone-only. “There are so many people who you cannot reach unless you call them on a cellphone that I have zero confidence in numbers that are produced solely from landline Interactive Voice Response surveys,” says veteran Republican pollster Whit Ayres.


Cell or landline, fewer people pick up nowadays, making polls costlier and less representative. Pollsters use sophisticated models to weight the results according to the population, or according to who they expect to turn out to vote, but that gets tougher with fewer respondents. This was one of the problems with 2016 polling, which was more or less on the money nationally but missed badly in states that Donald Trump won, like Wisconsin and Michigan.
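The weighting idea mentioned above can be sketched in a few lines. This is a hypothetical illustration of simple post-stratification, not any firm’s actual model: the `poststratify` function, the demographic cells, and all of the numbers are invented for the example.

```python
def poststratify(sample_counts, population_shares):
    """Compute a weight for each demographic cell so the weighted
    sample matches the known population distribution."""
    total = sum(sample_counts.values())
    weights = {}
    for cell, count in sample_counts.items():
        sample_share = count / total          # share of respondents in this cell
        weights[cell] = population_shares[cell] / sample_share
    return weights

# Made-up example: suppose college graduates answered the phone far more
# often than their share of the electorate would predict.
sample_counts = {"college": 600, "no_college": 400}        # who responded
population_shares = {"college": 0.35, "no_college": 0.65}  # e.g. census figures

weights = poststratify(sample_counts, population_shares)
# Each non-graduate respondent now counts for more than one person,
# each graduate for less than one.
print(weights)
```

The fragility the article describes is visible here: the weights depend entirely on the analyst choosing the right cells (education, rurality, and so on) and on the assumed population shares being correct for the people who actually turn out.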

Nick Gourevitch, the research practice leader for Global Strategy Group, a Democratic firm, says the electorate was less educated and more rural than his polls anticipated. But that doesn’t mean a tweak on those factors will work next time. “It’s very possible we always had education wrong in our polls, but it only mattered this time because it was the key cleavage in the electorate,” he says. “How do we make sure in the future to know what’s driving it each time and adjust accordingly?”

Part of the solution is bringing in more online tools without abandoning the phone. Google searches, Facebook chatter and Twitter mentions can help pollsters identify which candidates are getting a surge of enthusiasm or which issues voters care most about, but Gourevitch says, “I have a deep reluctance in relying on that stuff as gold.” Even Facebook does not think its data makes a useful poll. Measuring sentiment is tricky: The social media giant can’t tell whether your “Make America Great Again” comment was sincere or sarcastic.

Twitter, which is more open with its data, offers a better opportunity. Northeastern University professor Nick Beauchamp examined troves of geolocated tweets in 2012, matching them up with polls, and found that Twitter traffic predicted in-state polling movement, and the issues causing the shifts. In essence, Twitter could become a cheap and fast way for campaigns to supplement their state-by-state polling. For example, Beauchamp tells OZY, if the Clinton campaign had known a couple of days sooner how much damage would be done by FBI Director James B. Comey’s letter alerting Congress that the bureau was reopening the Clinton email case, it might have redirected resources more quickly. But Beauchamp hasn’t noticed anyone jumping on the idea. “This stuff will take a while to percolate into actual campaigns,” he says, “given how risk-averse they tend to be.”

And the new tools deserve a grain of salt. In 2008, Google came up with a way to measure flu outbreaks based on what people were searching for, claiming that its methodology would spot outbreaks faster than the Centers for Disease Control and Prevention. But Google Flu Trends vastly overestimated actual flu cases (its warning signs were mostly seasonal), and the initiative was scrapped, a victim of what researchers called “big data hubris.”

Still, this is an era of experimentation in the polling game, from text-message polling to carefully collected online panels to email blasts. Digital messaging tends to be easier and cheaper than hiring people to dial a ton of numbers and read from a script, lowering the barriers to participation. That’s not always a good thing. “It is really easy to do bad research cheaply online,” notes Amy Levin, a partner with Benenson Strategy Group. Levin says the web should be a supplement “to really correct the shortcomings of the phone.”

One of the most controversial projects of 2016 was the University of Southern California/Los Angeles Times tracking poll, which used an online panel of 3,200 people. Participants were asked periodically to give their likelihood of voting for a candidate on a scale of 0 to 100, rather than via a simple yes or no, to better identify shifts in sentiment. The poll was an outlier, consistently showing Trump ahead and drawing plenty of scorn. “It was tough,” says Jill Darling, survey director for the Center for Economic and Social Research at USC, whose friends sent her Starbucks cards and encouragement. Although the poll was innovative and costly (it gave tablets to those without internet access in order to capture low-income voters), it missed the mark: it had Trump winning the national popular vote by 3 percentage points. Darling says they had too many rural voters, the same kind of weighting mistake that bedeviled the old-school pollsters on the phone.

