Should We Believe Public Opinion Surveys?
When a new survey tells you that one in six Americans believes that President Obama is a Muslim, do you believe it? Most of us do, especially if the source is one of a handful of widely trusted and respected public opinion research centers like Gallup or the Pew Research Center. These polls are important barometers of public opinion and often (as we’ve seen this past week with same-sex marriage) shape the debate around a particular issue.
Should we really trust public opinion surveys, though, given that fewer than 1-in-10 Americans participate in them when contacted? A fascinating new report from the Pew Research Center sheds light on one of the biggest problems its own researchers face: a falling response rate. At Pew, the response rate of a typical telephone survey in 1997 was 36%, meaning that 36% of the people called at random to participate actually stayed on the line and answered the questions. Today, the response rate is just 9%, which means that fewer than 1-in-10 Americans stand in for the entire population on the most trusted public opinion surveys. The question then becomes: how representative of everyone else is this tiny minority?
This change is in large part due to the rise of cell phones. Back in 1997, people still relied on landlines, which were then the primary method of reaching respondents for public opinion surveys. Now, large numbers of households have cell phones but no landlines, which has forced polling organizations to change their methodology to include calls to cell phones — calls that are generally more expensive for polling field houses to make. Cell phones bring their own problems, too: caller ID, for example, lets potential respondents recognize an unknown number and ignore the call. Many cell phones also belong to people under the age of 18, who cannot participate in most public opinion surveys. Back when landlines were the norm, that wasn’t a problem — a child or teen could simply hand the phone to an adult in the household. Not so with cell phones.
The result is that some populations are more likely to participate in public opinion surveys than others. Senior citizens are more likely to participate than younger Americans, and minorities are more difficult to reach than white Americans. Women also tend to be overrepresented.
Does this skew the survey’s results? Interestingly, Pew found that the people who respond to their surveys are, for the most part, representative of the general population — except for their level of civic engagement. Respondents to Pew surveys are markedly more likely to have contacted a public official in the past year, volunteered for an organization in the past year, or talked to a neighbor in the past week.
There is, of course, no shortage of problems with public opinion surveys. Take, for example, the famous “Bradley effect,” in which white respondents are reluctant to express discomfort with a black candidate to a person on the phone. Nonresponse bias is just one of these issues, and it doesn’t mean that polls are completely unreliable. The fact that Pew took the time to conduct this research indicates that it cares about reducing bias in its surveys as much as possible. But at the same time, reports like these are important reminders that the people who respond to surveys are not always representative of the average American — and that every survey should be taken with a grain of salt.
Photo Credit: aflcio2008