Politics and Polling: Digging into How Data Influences the Political Process
As part of Tech and Society Week 2023, the Massive Data Institute and the Georgetown University Institute of Politics and Public Service, both part of the McCourt School of Public Policy, teamed up to host a politics and polling panel that welcomed an energetic crowd eager to get the scoop on polling ahead of the upcoming presidential election year.
Kicking off the discussion, moderator Chris Stirewalt (Spring 2023 GU Politics Fellow; Senior Fellow, American Enterprise Institute (AEI); Contributing Editor and Weekly Columnist, The Dispatch; and Political Editor, NewsNation) asked the polling experts on the panel, “Why do we bother [with polling]? What is the point of upsetting people?”
The panelists included Jennifer Agiesta, CNN Director of Polling and Election Analytics; Fred Yang, Chief Executive Officer, Hart Research Associates; Whit Ayres, Founder and President of North Star Opinion Research; and Jonathan M. Ladd, Ph.D., Associate Professor in the McCourt School of Public Policy and Department of Government, Georgetown University, and Nonresident Senior Fellow, The Brookings Institution.
Each panelist brought a unique perspective on why polling is important, spanning educational, Republican, Democratic, and news-oriented angles.
Agiesta made a point early on that news organizations such as CNN rely heavily on data and polling, calling it a “measure of where the nation’s political temperature is at any time.”
However, Agiesta noted that response rates are one of the biggest issues pollsters face. She explained that as people become less engaged with the political process, it becomes harder for pollsters to reach them.
“We’re depending more on online surveys, texts — and we’re trying to move away from phone calls,” says Agiesta, emphasizing there must be more innovation and flexibility in the industry if pollsters want to be able to collect representative data.
Yang echoed Agiesta’s sentiments about declining engagement in the political process, saying most young people have no idea what is even being polled. When pollsters fail to tackle these issues head on, the polling process becomes more burdensome, and it also chips away at trust in their work.
“Part of the problem with polls is you’re giving people information in a short span of time,” says Yang.
When the panel was asked by a student how much attention should be paid to other polls, Ayres said it was important to look at other polls conducted by both objective and partisan organizations.
“We will always pay attention to other things out there — it doesn’t mean it shades your view or changes your poll,” he says.
In terms of reading and understanding polls, the panelists acknowledged it can be difficult if you aren’t data savvy.
Stirewalt spoke of his own experience learning to better understand polls, saying he had to research which polling organizations held themselves to a high standard. He also prioritized looking at polls that test against colleagues on the other side of the aisle.
However, according to Agiesta, even good polling has room for error and shouldn’t be taken as gospel. She also told the audience to assume there’s a bubble around polls and to always look into what pollsters mean in their definition of “likely voters.”
When asked whether polls could be a potential contributor to voter suppression (for example, a candidate polling low might make voters less likely to turn out if they believe their preferred candidate would lose regardless), all panelists agreed there was not really evidence to support this idea.
“There’s logic that if your candidate is behind you’re less likely to participate but it’s hard to find evidence to prove that,” says Ladd.
In general, polls are both an art and a science that need to be analyzed with care.
This article was written by Lacy Nelson (MPS ’24), a graduate journalism student at Georgetown University’s School of Continuing Studies.
For more information about the event, please visit the event page.