Project Wonderful

Sunday, August 23, 2020

How to read a poll and what to ask your pollster!



Polling can be one of the most daunting adventures for a new campaign manager. Unlike mail or digital, depending on the department you came up through, you may not have come in direct contact with a pollster’s work before (although of course you have seen it at play!). I was lucky to have very patient consultants to guide me on my first couple of races, but in case you don’t, or you just want to seem prepared, I’ve asked a friend and former pollster to help you out! It is a shame that he has requested to remain anonymous, since he now works in another, “non-political” (I mean it’s not, is anything? but you feel me) industry, because the advice below is so good. Enjoy!

1) What does a pollster do?

Pollsters conduct surveys, focus groups, and other forms of market research on behalf of campaigns and other interested parties in order to help their clients figure out the best way to allocate resources both across and within campaigns. The work of pollsters is a key input into any campaign’s plan for success.


As a former pollster, I often get asked “Oh, so you’re the one that makes the calls?” In reality, almost all pollsters outsource the actual data collection to specialized firms. Pollsters are in charge of designing the survey instrument (writing the questions), selecting the sample structure (and back-end weighting), and producing data-based analysis to help the campaign and its other consultants decide on strategy and tactics.


2) When and why should candidates be polling or not be polling? What are polls good for and not good for?


Polls can be used to help campaigns in a range of different ways. The right nature (benchmark, tracker, brushfire, etc.), amount, timing, and frequency of polls for any given campaign can vary significantly based on strategy (and budget, of course!). Polls are all about informing what you do, so that you do it better. Polls don’t win votes. They inform actions that win votes.


Things polls are good for:

  • Assessing the viability of a candidate
  • Determining the right way to introduce a new candidate to the electorate
  • High level messaging decisions (what to talk about)
  • Nuanced messaging decisions (how to talk about it)
  • Higher level targeting / understanding of which messages resonate with which voters and who is persuadable


Things polls are NOT good for:

  • Detailed targeting (what age range of female voters in Region 3 is most persuadable?)
    • Sample size generally doesn’t support this level of analysis
  • Determining whether “positive” or “negative” messaging is more effective
    • Idiosyncrasies of surveys can drive implications here, but best to avoid strong conclusions
  • Identifying GOTV targets
    • The sample is itself a likely electorate; asking people whether things make them want to vote / how excited they are to vote has some merit, but it is not the best way to identify GOTVable demographics
  • Getting to the “why” of public opinion
    • “Why” is complicated. Pre-written response options with no time to reflect aren’t a great way to understand it.
  • Deciding 2 weeks before an election that you have a big lead, will win anyway, and everyone can go home
    • “Leading” in polls is not “leading” an election. You have no “lead” (nor deficit) until ballots are cast, and there is time left on the scoreboard clock until polls close on election night… get to work!

3) What should I consider when bringing on a polling firm or consultant?


A few things to consider when choosing a pollster:

  • Go with someone with experience in your state/district. Different geographies have polling quirks. Pick someone who has polled your state/district (ideally many times) and gotten it right.
  • Ask about the methodology. As with many things, to do it right, polling has its commensurate costs (cell phone sampling, multi-lingual interviewers in certain districts, etc.). Understand the pricing offered by different firms and be willing to pay, but only for quality.
  • Pick someone you’re willing to listen to. When interviewing, ask a pollster how they would handle a hypothetical situation or how they have handled past races. Make sure their approach is one that works for you. The best advisors are people who might bring different ideas to the table, while doing so in a way that you can understand and engage with in dialogue.
  • Consider access and attention. The ideal consultant has the ear of the higher-up powers-that-be (to help bring focus to your race) while also giving you their ear anytime you call. Be mindful of any tradeoffs along that spectrum, depending on the profile of your race.
  • Be mindful of the sales vs. execution handoff. If you pick a pollster based on someone whose name is on the shop door, be sure that person is involved! Reference checks can be helpful on this sort of thing. Chemistry is an important component of a working team relationship, so before you hire a pollster, be sure you know who will actually be on the calls explaining the results to you!


4) What are the different types of polls? What circumstances are they useful in?


There are many reasons you could use different types of polls at different points in a campaign. Generally speaking, your first poll is the longest (a benchmark) and your last poll is the shortest (a tracker), but different campaign circumstances and budgets can inform any number of decisions along the way. Trust your pollster on what’s right here.


  • Benchmark
    • Determine viability
    • Assess the best way to introduce your candidate
    • Plan out your most effective messages
    • Make a plan on geographies for media spend / ground resources
  • Brushfire
    • Assess impacts and nuances of new developments
  • Tracker
    • Inform tactical adjustments of spending based on where things have traction
    • Refine views of persuadable universe


5) What are the main components of a poll?

Let’s take a classic benchmark as an example (other poll types may skip much of the middle of this structure):


  • Introduction: The warm-up to get respondents into a political mindset
  • Initial Ballot Test: The first time the candidates get named
  • Candidate Introductions and Informed Vote: Simulating what things will be like after exposure to each side’s first positive ads
  • Messaging and Re-Ballots: Testing various lines of support/attack/defense
  • Demographics


6) How do you interpret them?

Leave it to an ex-pollster to protect the industry, but…this is what you pay your pollster the big bucks for!


No question should be interpreted in a vacuum (nor should any poll). The best way to interpret a poll is to take it in its entirety, add it to your prior beliefs/knowledge of your situation, and then act on the combination of the two. The best person to refine your view of how that poll should play into your overall understanding is someone who has seen other recent, similar data and can contextualize it appropriately in the current environment, rather than relying on years and years of prior experience (your pollster!). But enough of my Bayesian soapbox; here are a few things I’m willing to make more general statements about:


  • Significant movement between the Initial Ballot Test, the Informed Vote, and the Post-Message Votes can be meaningful and helpful in assessing viability and strategy; however, don’t expect to see as much movement in reality unless you truly plaster the electorate with your ads!
  • There are certain positive/negative messages that almost always get high “Very convincing” or “Major doubts” responses, but are not necessarily the best messages. These include any message where a respondent might feel like the “right answer” is to say that it’s a compelling message - be mindful of these (and…theme here…Trust your pollster’s experience!)
  • Don’t make too much of the counter-intuitive results in the crosstabs - sample sizes can cause significant noise! (A quick illustration of why follows below.)
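
To make that last point concrete, here is a minimal sketch (Python, with hypothetical sample sizes) of why a poll that looks precise overall can still be noisy inside the crosstabs:

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        # 95% margin of error for a proportion, assuming a simple random sample
        return z * math.sqrt(p * (1 - p) / n)

    # A respectable district poll overall...
    print(f"Full sample (n=600): +/- {margin_of_error(600):.1%}")   # ~ +/- 4.0%

    # ...but a single crosstab cell (say, women under 35 in one region) is tiny
    print(f"Crosstab cell (n=60): +/- {margin_of_error(60):.1%}")   # ~ +/- 12.7%

A 10-point “swing” inside a 60-person cell is well within the noise, which is why counter-intuitive crosstab results deserve skepticism.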


7) What are some basic terms someone should be familiar with (toplines, sample size, etc.) when trying to read and talk about a poll?


  • Sample Size: The number of people who answered the poll and the sole ingredient in calculating “Margin of Error”
  • Margin of Error: A statistical calculation of precision based solely on sample size (the illustration under question 6 above shows the arithmetic). Actual error also includes systemic and execution components, so treat the published margin of error as a floor on the true uncertainty, not a ceiling.
  • Weighting: A statistical adjustment to raw survey data in which each respondent’s answers are assigned greater or lesser influence on the totals, correcting for groups that are over- or under-represented in the raw sample (a minimal sketch follows this list)
  • Fielding Dates: The days that respondents were contacted to complete the questionnaire. Be wary of surveys with short field times (or that field over holidays), as calling people back is an important part of ensuring you get a good sample.
  • Toplines: A report showing each question, potential response, and the percentage of the total that chose each response
  • Crosstabs: A report showing how responses to questions vary by category (such as for voters of different demographics)
  • Party ID: How a respondent characterizes her/himself when asked
  • Party Registration: A respondent’s party registration on the voter file (doesn’t exist in all states), which may determine eligibility to vote in a primary election
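
To make “Weighting” a bit more tangible, here is a minimal sketch (Python, all numbers hypothetical) of its simplest form, weighting to Party ID targets; real pollsters typically weight on several dimensions at once:

    # The raw sample skews more Democratic than our (assumed) likely electorate,
    # so each group's respondents get a weight that restores the target mix.
    raw_sample   = {"Dem": 450, "Rep": 330, "Ind": 220}        # respondents (n = 1,000)
    target_share = {"Dem": 0.40, "Rep": 0.36, "Ind": 0.24}     # assumed electorate mix

    n = sum(raw_sample.values())
    weights = {p: target_share[p] / (raw_sample[p] / n) for p in raw_sample}
    # Dem ~0.89 (downweighted), Rep ~1.09 and Ind ~1.09 (upweighted)

    # A weighted topline is then a weight-adjusted average of (hypothetical)
    # candidate support within each group:
    support = {"Dem": 0.85, "Rep": 0.10, "Ind": 0.45}
    topline = sum(raw_sample[p] * weights[p] * support[p] for p in raw_sample) / n
    print(f"Weighted topline support: {topline:.1%}")          # 48.4%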


8) What questions should I be asking of my pollster/consulting team about our poll before it goes into the field?


As you approach finalizing a survey and fielding it, one of the best questions to ask your pollster is what they are seeing play well elsewhere. Your pollster is seeing races all across the country and your state, and their insights can help some of the best messages of the cycle cross-pollinate across districts.


Along the same lines, as you are drafting, ask them what’s in your draft questionnaire that they’ve already seen a thousand times and know works only OK or not well. Cut those questions/messages, and then add others. It’s important not to let a survey get too long (it makes it hard to keep a representative sample’s attention and can impact the results). You may as well leverage your pollster’s existing knowledge of those messages, and learn something new with your poll!


9) What questions should I be asking when I get a poll back for my candidate?


As a favor to all of my pollster friends, before the survey is done fielding, please don’t ask for partial results. They’re just not meaningful. There’s a reason that it takes time to field a survey, and a single night’s results are not informative and only lead to confusion when compared with the final results.


As less of a favor to all of my pollster friends, don’t let your pollster just send you a data dump (even if they format it nicely!). When you get results back, ask for an executive summary of what is most meaningful from the pollster’s perspective. Too many pollster memos are full of statements like “Among Independents, Message X played better with Men (34% Very Convincing) than with Women (28% Very Convincing)”. That could be a junior team member simply putting crosstabs into prose (which may or may not be meaningful), or it could be a helpful insight into how to unlock the persuadable independent male vote - get the key takeaways from your pollster.


10) What should I be wary of, and what questions should I be asking, when looking at a poll that I didn’t commission? (Who paid for this? Is this firm reputable? What are the sample and methodology?)


Even pollsters with the best intentions can create misleading polls. A few things in particular to keep an eye out for:

  • Sample/methodology: Look out for interactive voice response (IVR), opt-in online panels, and any methodology which doesn’t give everyone some chance of participating. 
  • Demographic composition: Cross-check demographics against historical data / other surveys. Demographic variance from one pollster to another may give the false appearance of a “change in opinion”
  • Question wording: Questions should be balanced and not lead respondents more to one answer than the other
  • Question response options: Often the hottest media headlines on polls come from re-characterizing the options as presented to voters. For example, there are many ways to ask “Job Approval” (Excellent/good/fair/poor vs. Approve/disapprove)
  • Question ordering: Respondents can easily be primed by prior questions. For example, a “most important issue” open-ended question should be toward the very beginning of a survey (as otherwise people will be more inclined to say whatever they’ve just been asked about is most important)


11) I feel like a lot of campaigns under-utilize their pollster. Other than crafting and executing polls what else should I be asking of my polling team?


Of all of the consultants you retain, pollsters arguably have some of the best perspective beyond your district/state because they tend to work across a ton of campaigns at once. Too few campaigns pick pollsters’ brains on what messages are playing well and which demographics seem particularly persuadable/GOTVable in other races (you’re not alone!).


Additionally, consultants love to show off in each other’s territory. Fully looping your pollster in on media/mail drafting, etc. can make for added creativity (and also put the heat on your media consultants to bring their best)!


While pollsters only charge for polls, the good firms view themselves as full-service consultants who are with you all the way through Election Day - no matter how you may need them. Don’t hesitate to ask!


12) Anything else you want us to know?


Congrats on making it this far down the interview. Given that you’re here, I’ll assume you’re into the weeds enough to hear me out on a couple of things I’d clear up on polling likely voters:

  • Every election, there are many voters who show up who are “unlikely voters”. For every four voters with a 25% propensity to vote, one of them does! It’s important to reflect this fact in a poll, since these are some of the most persuadable voters out there (and GOTVable too!). A “likely voters” poll’s respondents should be a sample of the likely electorate - a less homogeneous and more persuadable universe than you’d get if you called only high-propensity “likely voters” (see the quick simulation below)
  • Voters are terrible at predicting whether they will vote. Don’t rely on the difference between “Registered Voters” and “Likely Voters” in a public poll where the only difference between the two groups is how respondents answer questions about how likely they are to vote.
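
A quick simulation (Python, with made-up propensity scores) of that first bullet - why a sample of the likely electorate has to include its share of “unlikely” voters:

    import random
    random.seed(3)

    # Hypothetical electorate: 1,000 registered voters and their turnout propensities
    electorate = [0.9] * 500 + [0.6] * 300 + [0.25] * 200

    # Simulate one election: each voter shows up with probability equal to their propensity
    turned_out = [p for p in electorate if random.random() < p]

    print(f"Total turnout: {len(turned_out)}")                                # ~680
    low = sum(1 for p in turned_out if p == 0.25)
    print(f"25%-propensity voters who actually voted: {low}")                 # ~50

Roughly 50 of the 200 “unlikely” voters cast ballots - a poll that sampled only the high-propensity voters would miss that persuadable slice of the actual electorate entirely.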
