Immigration Polls and Dog Whistle Effects

Mark Blumenthal examines the vastly different results of major polls on the immigration issue conducted by NBC/WSJ and CBS/NYT. “Both ask about essentially the same provision in the bill, and include many of the same elements of that proposal, yet show diametrically opposite results.”

The NBC/WSJ poll is headed up by Democrat Peter Hart and Republican Neil Newhouse, two of the top pollsters in the business. (I should note here that Newhouse is my wife’s boss and a friend. Still, there’s no questioning either man’s credentials.) The CBS/NYT poll is unsigned but I presume they’re using top-notch folks as well.

Why, then, do polls conducted by seasoned professionals with no incentive to slant questions to please a client differ so radically?

The challenge in interpreting any of these results is to remember that most Americans have, at best, only a vague sense of what the immigration bill does, much less which provisions they favor. As I wrote previously (especially here and here), survey questions about specific proposals largely measure reactions, not preexisting opinions about the proposals. Respondents tend to listen to the text of the question and form opinions on the spot. As such, the results can vary greatly depending on the way the pollster asks the question.

Blumenthal breaks down the nuances of the questions and explains why they would lead respondents with little previous knowledge of the issue (which is to say, respondents) to give different answers. My instinctual preference is for the NBC/WSJ question, which is cleaner and has fewer potentially confusing qualifiers, but the CBS/NYT version provides more information and may well produce more accurate results.

The point, though, is that we really don’t know. Polling on issues where public opinion is unfixed (again, most issues) is simply problematic. Blumenthal includes himself in the assertion that,

Unfortunately, media and campaign pollsters know little about how respondents hear these sorts of questions or what pieces of information they weigh most heavily in their answers. We often refer to these sorts of differences as “dog whistle effects” — the respondents seem to hear things we miss. There is a way to “debug” this sort of question (hint: Google “cognitive pretesting”), but it requires far too much lead time and costs far, far too much to be practical for media and campaign pollsters. So we are left to speculate about how to interpret the results.

[…]

The more results tend to diverge, the greater the odds that respondents are confronting the proposal for the first time and are simply reacting, drawing upon real (and sometimes conflicting) attitudes triggered by the information provided in each question.

Polling is a useful tool for gauging public opinion. We too often presume, though, that the public has an opinion. On an emotional issue like illegal immigration, people have very strong attitudes that are often in conflict. That doesn’t mean they have a firm grasp on the details of complicated public policy proposals.

Hell, I’m not sure most Members of Congress understand this bill.

FILED UNDER: Blogosphere, Borders and Immigration, Congress, Public Opinion Polls
About James Joyner
James Joyner is Professor and Department Head of Security Studies at Marine Corps University's Command and Staff College. He's a former Army officer and Desert Storm veteran. Views expressed here are his own. Follow James on Twitter @DrJJoyner.

Comments

  1. Patrick T McGuire says:

    Yeah, most of us hicks out here in fly-over country can’t understand hundreds of pages of nuanced legalese. Our minds are so simple, all we can understand is simple concepts, like right and wrong.

  2. rpkinmd says:

    I must be one of those with uninformed strong opinions. I have not read the bill, which puts me in the company of probably all of the Senators; I doubt any of them has sat down and actually read the whole thing.

    I do know I have absolutely no basis for trusting the federal government to close the borders down. In this I have the facts on my side. I am old enough that I was very opposed to the 1986 amnesty bill. I was right, and all those who accused me back then were either disingenuous or more uninformed than I was.

    Don’t feed me comprehensive (amnesty) reform; IT’S THE BORDERS, STUPID!

  3. Michael says:

    Yeah, I don’t think that’s exactly what he meant, Patrick. Have you read the immigration bill? No? That’s what he meant.

    How can anyone form an opinion on the contents of something they haven’t read? The answer, as James pointed out, is that they form their opinion on the spot based on something less than complete exposure to the bill, and something more than what is actually in it.

    Having worked in the field service industry (the part that actually conducts the surveys), I understand how much of an impact the interviewer can have on which way those spontaneous opinions are formed. If you don’t have a strong opinion, you might be less willing to tell someone with a Hispanic accent that you oppose immigration. Emphasis can also have an impact: if the interviewer puts a stronger emphasis on “illegal,” respondents will form a different opinion than if the emphasis is on “immigration.” Some firms are turning to pre-recorded audio to ask the questions, so they know that every respondent gets asked the same way. Even then, you never know if your particular combination of questions and wording will influence the opinion of the respondent.