Here is the CNN polling director's advice for reading polls

This story was published in CNN's What Matters newsletter. Sign up here for free to get it straight in your inbox.


Polling is a topic of interest to anyone who follows American politics.

If done correctly, it's a valuable way to find out what motivates voters and which candidates resonate. If done incorrectly, it can be misleading and counterproductive.

This is why I spend a lot of time talking to Jennifer Agiesta, CNN's director of polling and election analytics, about which surveys are up to CNN's standards and how I should use them.

It seemed like the perfect time to ask her for some tips on what to watch out for as the industry adapts to the new ways Americans live and communicate. Below is our email conversation.

Has recent polling been a failure?

WOLF: My impression is that polling missed Donald Trump's rise in 2016, and then the strength of Democrats nationally in 2022. What is the truth?

AGIESTA: I'd say that in both 2016 and 2022, polling taken all together had mixed results. Considered on its own, methodologically sound polling performed better.

Many polls in 2022 were excellent. High-quality pollsters' national generic ballot polling for the House of Representatives found a tight race with a slight Republican advantage, which is exactly what occurred. State polls also had a good track record, especially in competitive races.

In five key Senate battlegrounds, for example, our CNN state polls had an average error of less than one point when comparing the candidate estimates with the final vote count. And across five contested governor's races, the average error was less than half a point.

A number of polls with partisan bias skewed some poll averages, however, and may have distorted the picture of those races.

You probably remember that in 2016, the biggest takeaway was that national polls were actually quite accurate; the larger problems were in state polling.

That was partly because more methodologically rigorous research was being done at the national level, and because many state polls did not adjust - 'weighting,' in survey terms - for respondents' level of education.

Polls that did not adjust for education tended to over-represent college graduates, who were less likely to support Donald Trump.

Add to that evidence of late shifts in the race and very close contests, and a lot of polling in important states did not paint a true picture. (The polling industry has published an assessment of the 2016 issues here.) Most state polling now adjusts for education.
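The mechanics of that kind of education weighting can be sketched with hypothetical numbers (none of these figures come from an actual poll):

```python
# Hypothetical illustration of education weighting: if college graduates make
# up 36% of the electorate but 50% of a poll's respondents, unweighted results
# over-represent them. Weighting scales each group to its true population share.

sample_share = {"college": 0.50, "non_college": 0.50}      # share of respondents
population_share = {"college": 0.36, "non_college": 0.64}  # known electorate share

# Hypothetical candidate support within each education group
support = {"college": 0.42, "non_college": 0.58}

# Unweighted estimate averages respondents as collected
unweighted = sum(sample_share[g] * support[g] for g in support)

# Weighted estimate re-scales each group to its population share
weighted = sum(population_share[g] * support[g] for g in support)

print(f"unweighted: {unweighted:.3f}")  # 0.500
print(f"weighted:   {weighted:.3f}")    # 0.522
```

With these made-up numbers, skipping the weighting step understates the candidate's support by more than two points - the kind of gap that mattered in close 2016 state races.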

What does CNN do in order to make sure its polls are accurate?

AGIESTA: We have recently changed the way we poll to better reflect how people communicate today. We use different methods depending on the type of work we do.

We conduct surveys at least once a year with 1,500-2,000 respondents drawn from a sample of US residential addresses. We first contact respondents with a mailing inviting them to complete the survey online or by telephone, whichever they prefer. Then we follow up with a reminder mailing and phone outreach to people who tend to be harder to reach.

These polls remain in the field for nearly a month. That allows us to increase response rates and obtain a methodologically solid estimate of baseline political measures, such as partisanship and ideology, that have no independent national benchmarks.

We also poll using samples drawn from panels of people who signed up to take surveys but were initially recruited through scientific sampling methods. This helps protect against the biases that can exist in panels anyone can join.

The panel-based polls can be conducted online, over the phone, or via text message, depending on our time constraints and the complexity of the topic.

When to believe a poll

The difference between a poll that is worth your attention and one that is not can be difficult to discern for those who don't have a good understanding of survey methodology.

There are many ways to conduct a poll.

Transparency is the most important indicator. If you can't find out the basics of a poll - who paid for it, what questions were asked (not just a short description someone put in a graphic), how the surveys were collected, how many people were surveyed, and so on - chances are it's not a good poll.

It's pretty common practice for reputable pollsters to share this kind of information.

Consider the source of any information you receive.

Gallup and Pew are well-known for their expertise in methodological research and for having a long history of independent and thoughtful research. You can be pretty sure that anything they publish is based on solid science.

Likewise, many academic research centers and independent media pollsters take the right steps to ensure their methods are sound.

A pollster without a track record or with ambiguous details about their methodology would be a pass for me.

I'd also be wary of campaign polls. Campaigns tend to release polls only when it suits their interests, so treat those numbers with caution.

If a company releases a survey that claims Americans don't get enough sleep, perhaps you shouldn't take it too seriously.

Primary elections are difficult to poll

WOLF: The coming primary campaign presents its own challenges, with polls focused on early-contest states such as Iowa, New Hampshire, and South Carolina. Do you have any advice for reading these polls?

AGIESTA: Primary electorates are notoriously hard to poll. Primaries are often low-turnout events with different rules for who can participate in each state, and the quality of the voter lists pollsters use varies from state to state.

The candidate field and the shape of the race can also change as the election draws nearer. In 2020, for example, the Democratic field shrank drastically in the days between the South Carolina primary and Super Tuesday.

When you look at the primary polling results, remember that they are snapshots and not always good predictors of what will happen in the future.

Horse race numbers: Be careful

WOLF : The most important thing for consumers like myself is to know which candidate is leading. You've warned me against focusing solely on polling results that show a horse race. Why?

AGIESTA: This caution is based on several factors.

The first is that all polling carries a margin of sampling error. Even the most accurate poll has some noise, because no sample is a perfect representation of the entire population.

Because of this, any race with a margin of less than 5 points will look close in the polls.
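For a simple random sample, the standard 95% margin of sampling error can be approximated from the sample size alone. A quick sketch with a hypothetical 1,000-person poll:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of sampling error for a simple random sample proportion."""
    return z * math.sqrt(p * (1 - p) / n)

n = 1000  # hypothetical sample size
moe = margin_of_error(n)
print(f"+/- {moe * 100:.1f} points")  # +/- 3.1 points
```

So in a hypothetical 52-48 race, each candidate's estimate sits within roughly three points of the other's - the race "looks close" even in a well-conducted poll, which is why a single horse race number deserves caution.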

In that situation, polling serves two purposes: it can help you understand why a race may be close, or why one candidate is favored. And once you have several polls using similar methods, you can get a sense of the direction of the race.

Polling is also a great way to measure which issues matter most to voters, the enthusiasm of different segments of the population, and how people feel about the candidates' personal characteristics or job performance. These measures often tell you more about a race than the horse race number does.

How to determine who is ahead in a race

WOLF: What is the best way to determine who is ahead or behind during an election?

AGIESTA: There are some tactics you can use to make sense of disparate data when looking at trends over time.

The best approach is to follow the trend within a single survey. If a pollster uses the same methodology over time, the trend line of its poll can reveal a lot.

It can be difficult to find this information, because not all pollsters conduct multiple surveys on the same race.

Averaging polls is another way to gauge change over time. However, as we found out in 2022, these averages can differ quite a bit depending on which pollsters they include and whether they include polls with a partisan bias.
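A toy illustration of how including partisan polls can move an average (all pollster names and numbers here are invented):

```python
# Hypothetical polls of the same race: (pollster, candidate_A_share, is_partisan)
polls = [
    ("Pollster 1", 48.0, False),
    ("Pollster 2", 47.0, False),
    ("Pollster 3", 49.0, False),
    ("Partisan A", 53.0, True),
    ("Partisan B", 52.0, True),
]

def average(values):
    return sum(values) / len(values)

# A naive average mixes everything together
all_polls = average([share for _, share, _ in polls])

# Filtering out partisan releases changes the picture
nonpartisan = average([share for _, share, partisan in polls if not partisan])

print(f"all polls:         {all_polls:.1f}")    # 49.8
print(f"non-partisan only: {nonpartisan:.1f}")  # 48.0
```

With these invented numbers, a couple of partisan releases shift the average by nearly two points - enough to flip the apparent leader in a close race, which is the distortion described above.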

Polling isn't conducted only by phone

WOLF : I do not have a landline, and I never answer strange phone numbers. What makes you think that polling reaches a large enough audience?

Today, many polls are conducted by methods other than the phone.

Of the thirteen pollsters that released surveys in May or June on Joe Biden's approval rating and met CNN's reporting standards, six conducted their surveys exclusively by phone. And those phone pollsters call far more mobile phones than landlines.

What matters is that a poll reaches a representative sample of the population it aims to measure. So far, multiple methods have proven successful at achieving this.

The people pollsters have difficulty reaching

WOLF : Do pollsters admit that they have difficulty reaching certain groups? What are the solutions?

AGIESTA: Some demographic groups are harder for pollsters to reach than others. These include younger people, those with less formal education, and Black and Hispanic Americans. And the prevailing theory for why 2020 election polling was off is that Trump's strongest supporters were less likely than other Republicans to take part in surveys.

The pollsters use several methods to counter this.

Some pollsters that draw from online panels - where the demographics and political characteristics of potential participants are known in advance - build this into their sampling plans.

When sampling from a list of voters, some of that information can be linked to a voter's contact information and taken into account.

If a pollster wants a deeper look at a hard-to-reach population, they can conduct an oversample, interviewing more people in that group to improve the statistical power of analyses about it.
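Here's a sketch of the arithmetic behind an oversample, with hypothetical numbers: the extra respondents are weighted back down for topline results but counted in full for subgroup analysis.

```python
# Hypothetical oversample: a group makes up 10% of the population, but the
# pollster interviews 300 of them in a 1,000-person survey to get a larger
# subgroup to analyze. For topline results, those respondents are weighted
# back down to the group's true share.

n_total = 1000          # total interviews
n_group = 300           # oversampled interviews with the group
group_pop_share = 0.10  # group's true share of the population

# Each oversampled respondent's weight for topline estimates:
weight = (group_pop_share * n_total) / n_group
print(f"topline weight: {weight:.3f}")  # 0.333

# Subgroup analyses use all 300 interviews at full strength, shrinking the
# subgroup's margin of error compared with the ~100 interviews a strictly
# proportional sample would have produced.
```

The design choice here is the trade-off Agiesta describes: the oversample buys statistical power for the subgroup, and the down-weighting keeps the overall numbers representative.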

What's next?

The next election is always a challenge for pollsters.

The biggest challenge is to find the best ways to engage people in research and reach them. Leaders in the industry are working out how to make use of tools like text messaging, AI, and social media while producing representative work.

The attention-grabbing aspect of survey research is elections, but pollsters also measure attitudes and behavior in so many other areas of daily life that we would lose a lot of understanding of society if our survey methods did not keep up with how people communicate. I am excited to watch it evolve.