Sunday night was an unexpectedly exciting time in the Anderson household: Just after dinnertime, my husband got a call asking him to participate in an election poll. I was in the middle of getting my toddler ready for bed when he rushed into the room, holding our baby in one arm and his phone in the other hand, now on speaker so I could hear the questions.

There’s a real thrill, as a pollster, in getting to peek behind the curtain and see how someone else in your field is asking voters what they think. But it was also a reminder of how messy social science can be. Here was my husband, telling this very nice interviewer what he thought about, among other things, immigration and Tim Walz, all while pacing the hallway bouncing a bedtime-bound infant. I can only imagine what others are up to while answering survey questions.

As we near Election Day, Americans’ interest in polls will only increase. Election forecasting models are being heavily scrutinized and debated by commentators and reporters. Small shifts in polling results can trigger big headlines. As we start this post-Democratic convention week, when analysts will be combing the data for any hint of a bump for Kamala Harris, it’s worth keeping in mind that there are lots of things that make polling as much art as science.

Pollsters try to get it right, but we don’t know what we don’t know, and sometimes in our industry it can feel like you’re fighting the last war. In 2012, some pollsters missed the mark and overestimated Mitt Romney’s vote share, in part because they excluded less-likely voters or didn’t call enough voters on cellphones. By 2016, those issues had largely been addressed, but new problems reared their heads: some pollsters overrepresented college graduates in their surveys, yielding poll results that were overly favorable toward Hillary Clinton, particularly in key Upper Midwest states.
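To see how overrepresenting college graduates skews a topline, here is a toy calculation (the numbers are hypothetical illustrations, not figures from any actual poll): if graduates make up more of the sample than of the electorate, and they favor one candidate, an unweighted average tilts toward that candidate; reweighting each group to its electorate share corrects it.

```python
# Toy illustration of education weighting in a poll.
# All shares below are hypothetical, chosen only to show the mechanism.

sample = {
    # group: (share_of_sample, share_of_electorate, support_for_candidate_A)
    "college":    (0.60, 0.40, 0.55),
    "no_college": (0.40, 0.60, 0.40),
}

# Unweighted estimate: each group counted at its (skewed) sample share.
unweighted = sum(s * a for s, _, a in sample.values())

# Weighted estimate: each group reweighted to its share of the electorate.
weighted = sum(e * a for _, e, a in sample.values())

print(f"unweighted support for A: {unweighted:.2f}")  # 0.49
print(f"weighted support for A:   {weighted:.2f}")    # 0.46
```

In this sketch, the raw sample overstates Candidate A's support by three points purely because the sample contained too many college graduates; real pollsters apply the same idea with many more variables via post-stratification or raking.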

Four years after that, pollsters had mostly fixed the issues that plagued them in 2016, yet in 2020 many polls “featured polling error of an unusual magnitude,” according to the American Association for Public Opinion Research’s task force on the matter. The task force was circumspect in its report: it couldn’t pinpoint a definitive cause, and, troublingly, the industry came away without a clear conclusion about what went wrong, and thus without an obvious remedy.

As a pollster, I’ll be the first person to tell you that I have a measure of apprehension about the polls this year. I mentioned last week that a couple of warning lights are starting to blink on the 2024 polls. And so, here’s an accounting of a few things that keep me up at night: