Project Wonderful

Sunday, August 23, 2020

How to read a poll and what to ask your pollster!



Polling can be one of the most daunting adventures for a new campaign manager. Unlike with mail or digital, depending on the department you came up through, you may never have come in direct contact with a pollster's work before (although of course you have seen it at play!). I was lucky to have very patient consultants to guide me on my first couple of races, but in case you don't, or you just want to seem prepared, I've asked a friend and former pollster to help you out! It is a shame that he has requested to remain anonymous since he now works in another, "non-political" (I mean, it's not, is anything? But you feel me) industry, because the below advice is so good. Enjoy!

1) What does a pollster do?

Pollsters conduct surveys, focus groups, and alternative forms of market research on behalf of campaigns and other interested parties in order to help their clients figure out the best way to allocate resources both across and within campaigns. The work of pollsters is a key input into any campaign’s plan for success.


As a former pollster, I often get asked “Oh, so you’re the one that makes the calls?” In reality, almost all pollsters outsource the actual data collection to specialized firms. Pollsters are in charge of designing the survey instrument (writing the questions), selecting the sample structure (and back-end weighting), and producing data-based analysis to help the campaign and its other consultants decide on strategy and tactics.


2) When and why should candidates be polling or not be polling? What are polls good for and not good for?


Polls can be used to help campaigns in a range of different ways. The right nature (benchmark, tracker, brushfire, etc.), amount, timing, and frequency of polls for any given campaign can vary significantly based on strategy (and budget, of course!). Polls are all about informing what you do, so that you do it better. Polls don’t win votes. They inform actions that win votes.


Things polls are good for:

  • Assessing the viability of a candidate
  • Determining the right way to introduce a new candidate to the electorate
  • High level messaging decisions (what to talk about)
  • Nuanced messaging decisions (how to talk about it)
  • Higher level targeting / understanding of which messages resonate with which voters and who is persuadable


Things polls are NOT good for:

  • Detailed targeting (what age range of female voters in Region 3 is most persuadable?)
    • Sample size generally doesn’t support this level of analysis
  • Determining whether “positive” or “negative” messaging is more effective
    • Idiosyncrasies of surveys can drive implications here, but best to avoid strong conclusions
  • Identifying GOTV targets
    • The sample is a likely electorate itself, and asking people if things make them want to vote / how excited they are to vote has some merit, but is not the best way to identify GOTVable demographics
  • Getting to the “why” of public opinion
    • Why is complicated. Pre-written response options with no time to reflect aren’t a great way to understand it.
  • Deciding 2 weeks before an election that you have a big lead, will win anyway, and everyone can go home
    • “Leading” in polls is not “leading” an election. You have no “lead” (nor vote deficit) until ballots are cast, and there is time left on the scoreboard clock until polls close on election night… get to work!

3) What should I consider when bringing on a polling firm or consultant?


A few things to consider when choosing a pollster:

  • Go with someone with experience in your state/district. Different geographies have polling quirks. Pick someone who has polled your state/district (ideally many times) and gotten it right.
  • Ask about the methodology. As with many things, doing polling right comes with commensurate costs (cell phone sampling, multi-lingual interviewers in certain districts, etc.). Understand the pricing offered by different firms and be willing to pay, but only for quality.
  • Pick someone you’re willing to listen to. When interviewing, ask a pollster how they would handle a hypothetical situation or how they have handled past races. Make sure their approach is one that works for you. The best advisors are people who might bring different ideas to the table, while doing so in a way that you can understand and engage with in dialogue.
  • Consider access and attention. The ideal consultant has the ear of the higher-up powers-that-be (to help bring focus to your race) while you also have their ear anytime you call. Be mindful of any tradeoffs along that spectrum, depending on the profile of your race.
  • Be mindful of the sales vs. execution handoff. If you pick a pollster based on someone whose name is on the shop door, be sure that person is involved! Reference checks can be helpful on this sort of thing. Chemistry is an important component of a working team relationship, so be sure before you hire a pollster, that you know who will actually be on the calls explaining the results to you!


4) What are the different types of polls? What circumstance are they useful in?


There are many reasons you could use different types of polls at different points in a campaign. Generally speaking, your first poll is the longest (a benchmark) and your last poll is the shortest (a tracker), but different campaign circumstances and budgets can inform any number of decisions along the way. Trust your pollster on what’s right here.


  • Benchmark
    • Determine viability
    • Assess the best way to introduce your candidate
    • Plan out your most effective messages
    • Make a plan on geographies for media spend / ground resources
  • Brushfire
    • Assess impacts and nuances of new developments
  • Tracker
    • Inform tactical adjustments of spending based on where things have traction
    • Refine views of persuadable universe


5) What are the main components of a poll?

Let’s take a classic benchmark as an example (other polls may skip much of the middle part of these).


  • Introduction: The warm up to get respondents into a political mindset
  • Initial Ballot Test: The first time the candidates get named
  • Candidate Introductions and Informed Vote: Simulating what things will be like after exposure to each side’s first positive ads
  • Messaging and Re-Ballots: Testing various lines of support/attack/defense
  • Demographics


6) How do you interpret them?

Leave it to an ex-pollster to protect the industry, but…this is what you pay your pollster the big bucks for!


No question should be interpreted in a vacuum (nor should any poll). The best way to interpret the poll is to take it in its entirety, add it to your prior beliefs/knowledge of your situation, and then act on the combination of the two. The best person to refine your views of how that poll should play into your overall understanding is someone who has seen other recent similar data and can contextualize it appropriately in the current environment vs. years and years of prior experience (your pollster!). But enough of my Bayesian soapbox; here are a few things that I’m willing to make more general statements about:


  • Significant movement between the Initial Ballot Test, the Informed Vote, and the Post-Message Votes can be meaningful and helpful in assessing viability and strategy; however, don’t expect to see as much movement in reality unless you truly plaster the electorate with your ads!
  • There are certain positive/negative messages that almost always get high “Very convincing” or “Major doubts” responses, but are not necessarily the best messages. These include any message where a respondent might feel like the “right answer” is to say that it’s a compelling message - be mindful of these (and…theme here…Trust your pollster’s experience!)
  • Don’t make too much of the counter-intuitive results in the crosstabs - sample sizes can cause significant noise!
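To put a number on that last bullet, here’s a quick back-of-the-envelope sketch in Python. The sample sizes are invented for illustration; the formula is the standard worst-case (p = 0.5) margin of error for a simple random sample:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case 95% margin of error, in percentage points, for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n) * 100

# A typical full sample vs. a single crosstab cell (sizes are illustrative)
print(f"Full sample, n=800:   +/- {margin_of_error(800):.1f} points")
print(f"Crosstab cell, n=100: +/- {margin_of_error(100):.1f} points")
```

A crosstab cell an eighth the size of the full sample carries a margin of error nearly three times as wide, which is why counter-intuitive subgroup results deserve a healthy dose of skepticism.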


7) What are some basic terms someone should be familiar with (toplines, sample size etc) when trying to read and talk about a poll?


  • Sample Size: The number of people who answered the poll and the sole ingredient in calculating “Margin of Error”
  • Margin of Error: A statistical calculation of precision, based solely on sample size. Actual error also includes systematic and execution components, so the true error is typically larger than the stated margin
  • Weighting: A statistical adjustment to raw survey data by which each respondent’s data is assigned greater/lesser influence on the total to appropriately correct for sampling variation
  • Fielding Dates: The days that respondents were contacted to complete the questionnaire. Be wary of surveys with short field times (or that field over holidays), as calling people back is an important part of ensuring you get a good sample.
  • Toplines: A report showing each question, potential response, and the percentage of the total that chose each response
  • Crosstabs: A report showing how responses to questions vary by category (such as among voters of different demographics)
  • Party ID: How a respondent characterizes her/himself when asked
  • Party Registration: A respondent’s party registration on the voter file (doesn’t exist in all states), which may determine eligibility to vote in a primary election


8) What questions should I be asking of my pollster/consulting team about our poll before it goes into the field?


As you approach finalizing a survey and fielding it, one of the best questions to ask your pollster is what they are seeing play well elsewhere. Your pollster is seeing races all across the country and your state, and their insights can help some of the best messages of the cycle pollinate across districts.


Along the same lines, as you are drafting, ask them what’s in your draft questionnaire that they’ve already seen a thousand times and works OK or not well. Cut those questions/messages, and then add others. It’s important not to let a survey get too long (it makes it hard to keep a representative sample’s attention and can impact the results). You may as well leverage your pollsters’ existing knowledge on those messages, and learn something new with your poll!


9) What questions should I be asking when I get a poll back for my candidate?


As a favor to all of my pollster friends, before the survey is done fielding, please don’t ask for partial results. They’re just not meaningful. There’s a reason that it takes time to field a survey, and a single night’s results are not informative, only leading to confusion when compared with final results.


As less of a favor to all of my pollster friends, don’t let your pollster just send you a data dump (even if they format it nicely!). When you get results back, ask for an executive summary of what is most meaningful from the pollster’s perspective. Too many pollster memos are full of statements like “Among Independents, Message X played better with Men (34% Very Convincing) than with Women (28% Very Convincing)”. That could be a junior team member simply putting crosstabs into prose (which may or may not be meaningful), or it could be a helpful insight into how to unlock the persuadable independent male vote - get the key takeaways from your pollster.


10) What should I be wary of, what questions should I be asking when looking at a poll that I didn't commission (who paid for this? Is this firm reputable? What is the sample, methodology etc)


Even pollsters with the best intentions can create misleading polls. A few things in particular to keep an eye out for:

  • Sample/methodology: Look out for interactive voice response (IVR), opt-in online panels, and any methodology which doesn’t give everyone some chance of participating. 
  • Demographic composition: Cross-check demographics against historical data / other surveys. Demographic variance from one pollster to another may give the false appearance of a “change in opinion”
  • Question wording: Questions should be balanced and not lead respondents more to one answer than the other
  • Question response options: Often the hottest media headlines on polls come from re-characterizing the options as presented to voters. For example, there are many ways to ask “Job Approval” (Excellent/good/fair/poor vs. Approve/disapprove)
  • Question ordering: Respondents can easily be primed by prior questions. For example, a “most important issue” open ended question should be toward the very beginning of a survey (as otherwise people will be more inclined to say whatever they’ve just been asked about is most important)


11) I feel like a lot of campaigns under-utilize their pollster. Other than crafting and executing polls what else should I be asking of my polling team?


Of all of the consultants you retain, pollsters arguably have some of the best perspective beyond your district/state because they tend to work across a ton of campaigns at once. Too few campaigns pick pollsters’ brains on what messages are playing well and which demographics seem particularly persuadable/GOTVable in other races (you’re not alone!).


Additionally, consultants love to show off in each other's territory. Fully looping your pollster in on media/mail drafting, etc. can make for added creativity (and also put the heat on your media consultants to bring their best)!


While pollsters only charge for polls, the good firms view themselves as full service consultants who are with you all the way through election day - no matter how you may need them. Don’t hesitate to ask!


12) Anything else you want us to know?


Congrats on making it this far down the interview. Given that you’re here, I’ll assume you’re into the weeds enough to hear me out on a couple of things I’d clear up on polling likely voters:

  • Every election, there are many voters who show up who are “unlikely voters”. For every four voters with a 25% propensity to vote, one of them does! It’s important to reflect this fact in a poll, since these are some of the most persuadable voters out there (and GOTVable too!). A “likely voters” poll’s respondents should be a sample of the likely electorate, which is a less homogeneous and more persuadable universe than you’d get if you called only “likely voters”.
  • Voters are terrible at predicting whether they will vote. Don’t rely on the difference between “Registered Voters” and “Likely Voters” in a public poll where the only difference between the two groups is how respondents answer questions about how likely they are to vote.
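The arithmetic in the first bullet above is worth seeing on paper. Here’s a tiny sketch in Python with an entirely hypothetical universe of registered voters:

```python
# Hypothetical registered-voter universe: (turnout propensity, # of voters)
universe = [(0.95, 40_000), (0.60, 30_000), (0.25, 30_000)]

# Expected ballots from each tier = propensity x tier size
expected_votes = {p: p * n for p, n in universe}
total_votes = sum(expected_votes.values())

# Share of the actual electorate each tier contributes
shares = {p: v / total_votes for p, v in expected_votes.items()}
print(f"Low-propensity (25%) voters: {shares[0.25]:.0%} of the expected electorate")
```

In this made-up universe, the 25%-propensity tier still casts roughly one in eight of the expected ballots, so screening them out of a “likely voters” sample would distort the picture of the electorate.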

Sunday, August 16, 2020

The First Ever CampaignSick Book Club: Poll Dancer

 

A couple weeks ago I put out a call to read along with some campaign ladies and me on the first ever CampaignSick book club as we review Laura Heffernan's Poll Dancer. According to Google Books, the description is as follows: 

"When politics meets pole fitness, Mel's life flips upside-down. After Mel's disastrous promotional video goes viral, a "family values" group launches a protest against her dance studio. Their leader isn't just trying to stop her from teaching--he's using Mel as a moral scapegoat for his own senate campaign. If he wins, he threatens to change the laws to keep all pole dancing out of their community. Mel's not going down without a fight. Because running for office beats unemployment, she decides to face off against him. She hires a campaign manager and tosses her hat in the ring. There's just one problem: voters don't get pole. Now Mel needs to change her image, fast. If she can't get the people on her side, she won't have a business to save. To make matters worse, Mel's campaign manager Daniel is giving her some very UN-professional fluttery feelings. Who knew the hardest part about running for office would be not losing her heart? Fans of My Fair Lady will love this fun, witty twist on a classic." 

Note: Major spoilers in the video above, but if you've read along or you'd just like to see what we had to say, please press play and enjoy the first ever edition of CampaignSick book club. Special thanks to Ilana Kaplan and Candy Emmons for joining me! 



 

Sunday, August 9, 2020

Ask Nancy: DC vs the Campaign

Hey Nancy,

I used to work as a finance assistant at a committee in DC, then I got sent to work on a campaign for the last couple weeks of an election. I noticed that the local campaign staff seemed really suspicious of me. Now I'm working as staff on a targeted race and even though my boss on the ground is really smart and my national desk is awesome they seem like they are always frustrated with one another. Is this type of thing common? Why can't we all just get along?

__________________________________________________________________________________

Great question! This is indeed a common phenomenon and one I have often experienced myself. I also want to say I'm sorry you're getting caught in the middle. First let me address the why and then what I think we can do about it. 

The frustration between committees or endorsing organizations and staff on the ground is based on a couple of vicious cycles. 

First off, DC and the committees have a map. By this I mean the DCCC, for example, has a certain number of seats it needs to flip or hold but doesn't really care which ones they are. When you're doing well--raising the money, polling competitively--you're generally going to get more support from them, both in terms of the direction you're going and in terms of resources. At the same time, if/when the map changes and you're less competitive than other races, resources are redirected elsewhere. This can feel really frustrating to campaign staff and candidates since it's largely out of your control. In addition, campaign managers and candidates frequently make the point that early investment from a committee would allow the campaign to meet the polling and fundraising thresholds that DC has set for them to prove their campaign is competitive, and it's a lot easier to say "raise $250,000 this quarter" than it is to do it. So there can be a little tension around this chicken-and-egg phenomenon.

Second, not everyone is as awesome at their job as we are. Your DC contact might have 5 different campaigns telling them that the goals are too high, the suggested messaging doesn't fit their race, or that honk and waves really are important in their district. Even if you are the exception and these things (or others) really are true for you, it's hard for someone not on the ground to distinguish between that and other campaigns that are just unwilling to cooperate. Similarly, local activists and staffers are sometimes wary of DC politicos who have been known to come in with a one-size-fits-all approach and negate the value of local opinions and talent. 

The solution, in my mind, as it is for almost every kind of intra-campaign tension, is for us to cut each other a little bit of slack and have some empathy. We all want to win. We all think we have something to contribute, and so it smarts on either end of the equation when it feels like your talent and experience is being negated. (This is easier said than done, and I am reminded of a recent incident in my own life when I did a B- job at following exactly the advice I'm now giving you.)

It also helps to come with evidence, or at least test a hypothesis. If you don't agree with advice you're getting, "I feel like this will work better" is a lot less convincing than running an A/B test or presenting Analyst Institute tested best practices. Especially this year, no one really knows, well, anything. So unless an idea is so far out of left field that it could hurt the campaign, almost anything is worth trying. Then if it doesn't work you've made a good faith effort to work with your supporting partners, and if it does, then great! I've often settled disputes with candidates by saying "just do me a favor and try making the ask that way and we'll reevaluate based on what happens."

And if you're caught in the middle, as it sounds like is the case, my best advice is to keep your head down and not get pulled into the drama. As I said, everyone in this situation is ultimately on the same team and when you win no one will remember who was frustrated with whom.

That's all I got!

Campaign Love and Mine,


Nancy 



Sunday, August 2, 2020

CampaignSick Merch Is Here!

Good Morning CampaignSickles! I am so excited to share with you that CampaignSick merch is finally here! This week also marked my 14-year anniversary of working on campaigns, so I am all up in my feels. 

I write this blog because it's really important to me that we have a collective team culture across our industry, and that is even harder these days, so I thought Big Dialing Energy perfectly encapsulates what 2020 campaigns are all about. Get yours here!










These are custom made and union printed by the same company that prints many large campaigns' merch, so please expect similar turnaround times. If you like these designs and they sell, we'll make more! Let me know what you think and order yours here.