Hi. The evidence seems pretty damaging. But still, in the discussion of the results, why do the authors place all the emphasis on pollsters' biases rather than respondents' biases? I'm not sure about the American context, but in several European countries interviewers need to inform respondents of the company they work for and where the poll results will be published. Is it inconceivable that at least some respondents, when faced with the identity of a pollster and the media outlet (and that outlet's perceived biases), refuse to respond or conceal their preferences? Will, say, a liberal person respond to a Fox poll in the same way they would to a NYT poll? A field experiment on this would be nice.
Pedro, in the polls I've been called for during this election, they identify the research company, but NOT the entity that contracted for the research.
Even in few-close-races Illinois I've been called at least 4 times by political pollsters since the primary. I'm lucky I don't live in Ohio.
Over at 538, there's a good case study of how polling methods that seem fairly reasonable or justifiable can lead to badly biased results.
[poll results show McCain leading 72% to 24% among young adults]
Well, if that's the case, that's important to know. For most voters, "Opinion Dynamics" will mean very little. But CBS/NYT will mean something. I still wonder whether some sort of field experiment, with the exact same questionnaire and interviewers, just announcing themselves as working for different organizations, would allow us to detect some effect…
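The field experiment described above is straightforward to analyze: randomize which sponsor interviewers announce, hold everything else constant, and compare response rates across arms. Here is a minimal sketch in Python, with entirely hypothetical response probabilities (the 40% and 34% figures are illustrative assumptions, not data from any actual study):

```python
import random
import math

def simulate_sponsor_experiment(n_per_arm=5000, p_respond_a=0.40,
                                p_respond_b=0.34, seed=1):
    """Hypothetical field experiment: identical questionnaire and
    interviewers in both arms; only the announced sponsor differs.
    The response probabilities are made-up illustration values."""
    rng = random.Random(seed)
    responded_a = sum(rng.random() < p_respond_a for _ in range(n_per_arm))
    responded_b = sum(rng.random() < p_respond_b for _ in range(n_per_arm))
    rate_a = responded_a / n_per_arm
    rate_b = responded_b / n_per_arm
    # Pooled two-proportion z-test for a difference in response rates.
    pooled = (responded_a + responded_b) / (2 * n_per_arm)
    se = math.sqrt(pooled * (1 - pooled) * (2 / n_per_arm))
    z = (rate_a - rate_b) / se
    return rate_a, rate_b, z

rate_a, rate_b, z = simulate_sponsor_experiment()
print(f"arm A: {rate_a:.3f}, arm B: {rate_b:.3f}, z = {z:.2f}")
```

If announcing a sponsor really shifts who agrees to participate, a sufficiently large z-statistic here would flag it; in practice one would also want to compare the *composition* (not just the rate) of respondents in each arm, since selective nonresponse is the mechanism being hypothesized.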