These surveys are sampling 3 to 10 or more percentage points more Democrats than Republicans. Especially at the higher end of that range, does anyone believe that is really going to happen this year? So how do pollsters justify this? CBS News reports:
"We do not weight for party ID," said Peter Brown, assistant director of the Quinnipiac University Polling Institute, which is conducting battleground state polls along with CBS News and the New York Times. "We do not predetermine how many Democrats, Republicans and independents will be in our sample."
The reason, he says, is that "party ID is a changing statistic. People will over time change back and forth in terms of how they view themselves politically."
Instead, Brown says the polls are weighted by "immutable characteristics - race, gender, age." Respondents are then asked their party identification during the interviews.
"We do that because there is a set standard that we can compare ourselves with to make sure we're getting an accurate demographic representative. And that standard is the United States Census Bureau data," said Brown. "What we get is what we get," he added. . . .
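The cell weighting Brown describes can be sketched in a few lines. This is a toy illustration, not Quinnipiac's actual procedure; the demographic cells, counts, and Census shares below are all hypothetical:

```python
# Hypothetical sketch of weighting a sample to Census demographic targets.
# Here the only cell variable is gender; real polls cross several
# demographics (race, gender, age), but the arithmetic is the same.

sample = {"male": 360, "female": 640}    # raw respondent counts (made up)
census = {"male": 0.49, "female": 0.51}  # Census population shares (made up)

n = sum(sample.values())
# Weight for each cell = target share / observed share.
weights = {cell: census[cell] / (sample[cell] / n) for cell in sample}

# Every respondent in a cell carries that cell's weight, so the weighted
# sample matches the Census on gender.
weighted_counts = {cell: sample[cell] * weights[cell] for cell in sample}
total = sum(weighted_counts.values())
shares = {cell: c / total for cell, c in weighted_counts.items()}
print(shares)  # {'male': 0.49, 'female': 0.51}
```

Note what this does not touch: party ID is simply recorded after the demographic weights are set, which is exactly Brown's "what we get is what we get."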
We know that there are many biases in the rates at which people respond to pollsters that may not be captured by simple race, gender, and age adjustments. During the 2004 general election, for example, the exit polls were biased toward Democrats because Republicans tended to be less willing to answer pollsters' questions. Best and Krueger's book "Exit Polls: Surveying the American Electorate, 1972-2010" describes the same phenomenon in the 1992 election (Chapter 1):
“VRS claimed the Democratic overstatement in the raw exit poll data was due to partisan differences in the willingness of voters to complete the exit poll, not to a poor selection of precincts or differential response rates by age, race, or gender. Republicans simply refused to participate at the same rates as Democrats, resulting in there being fewer Republicans in the raw exit poll results than there should have been. Mitofsky speculated that the disparity was due to different intensities of support for the candidates—Democratic voters were just more excited about voting for Clinton than Republican voters were about voting for Bush and, as a result, were more motivated to communicate this message by filling out the exit poll questionnaire; others thought it was due to Republicans in general having less confidence in the mass media.”
UPDATE: Michael Barone has a very useful column here.
It's getting much harder for pollsters to get people to respond to interviews. The Pew Research Center reports that it's getting only 9 percent of the people it contacts to respond to its questions. That's compared with 36 percent in 1997. . . .
Pollster Scott Rasmussen, who weights his robocall results by party identification, adjusted monthly, has shown a much closer race than most pollsters who leave party identification numbers unweighted. So has the Susquehanna poll in Pennsylvania.
It may be that we're seeing the phenomenon we've seen for years in exit polls, which have consistently skewed Democratic (and toward Barack Obama in the 2008 primaries). Part of that is interviewer error: Exit poll pioneer Warren Mitofsky found the biggest discrepancies between exit polls and actual results were in precincts where the interviewers were female graduate students. . . .
Labels: mediabias, poll