While, as we saw in Topic I, public opinion data may be skewed because of inappropriate sampling methodologies, polls sometimes produce inaccurate results because the people who participate in them -- even if they comprise a maximally representative sample of the whole population -- aren't quite truthful about their opinions. There are two primary facets to the "respondents lie" problem: nonattitudes and insincerity.
Nonattitudes (sometimes also known as "pseudo-opinions")
"Nonattitudes" is a term coined by researchers in the early days of public opinion polling to describe the phenomenon of people offering an opinion even when they don't actually have one. Various studies have established that nonattitudes are a real phenomenon. One of the more prominent examples is a study conducted by University of Cincinnati researchers during the late 1970s, in which they asked respondents their opinions on the "Public Affairs Act of 1975." More than 30% of the respondents offered an opinion, even though no such legislation actually exists.
As you might imagine, when it comes to the prevalence of nonattitudes, not all polling subjects are created equal. Most people do have opinions about high-profile subjects (e.g., abortion, the war in Iraq, Barack Obama). When it comes to relatively obscure policies, personalities, and events, however, nonattitudes are increasingly likely to surface. Similarly, while respondents may have opinions about an issue in general, they may not have opinions about its finer points and technicalities -- but might nevertheless offer "pseudo-opinions" about them when responding to a public opinion poll. As for why respondents report nonattitudes rather than simply decline to respond at all, there are various possibilities, ranging from wanting to appear more knowledgeable than they actually are to feeling pressured to "just answer the question" that's asked of them.
There are a few steps pollsters can take to minimize the problem of nonattitudes creeping into their poll results:
- Screening questions at the start of the poll can weed out likely nonattitudes by asking respondents from the get-go whether they are familiar with the poll's subject and/or have an opinion about it.
- Follow-up questions can also minimize the nonattitudes problem by asking respondents to elaborate on their simple "yes/no," "agree/disagree," or "favor/oppose" responses with explanations as to why they hold those opinions.
- A "mushiness index" (i.e. an index of how "mushy" -- that is, unfixed -- an individual's opinions on some topic are) can be integrated into the poll. This would consist of a series of questions (usually in the neighborhood of 3-5) that ask how much respondents know about the topic, how often they think about or discuss it, and how passionate they are about it.
- Finally, simply providing respondents with the explicit option of responding "no opinion" or "I don't know" can limit the number of people who offer opinions they don't actually hold.
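To make the mushiness idea concrete, here is a minimal sketch of how answers to a handful of gauge questions might be rolled up into a single score. The question wording, the 1-4 response coding, and the cutoff are all hypothetical illustrations chosen for this example, not a standard polling instrument.

```python
# Hypothetical "mushiness index": average a respondent's answers to a few
# engagement questions and flag opinions that look too soft to treat as firm.
# Question wording, the 1-4 coding, and the 2.5 cutoff are illustrative only.

MUSHINESS_QUESTIONS = [
    "How much have you read or heard about this issue?",
    "How often do you discuss this issue with family or friends?",
    "How likely are you to change your mind about this issue?",
    "How important is this issue to you personally?",
]

def mushiness_score(answers):
    """Average answers coded 1 (engaged/firm) to 4 (unengaged/changeable)."""
    if set(answers) != set(MUSHINESS_QUESTIONS):
        raise ValueError("expected exactly one answer per gauge question")
    return sum(answers.values()) / len(answers)

def is_mushy(answers, cutoff=2.5):
    """Flag respondents whose opinion looks too unfixed to count as a firm attitude."""
    return mushiness_score(answers) >= cutoff

# Example: a respondent who rarely follows or discusses the issue.
respondent = {
    MUSHINESS_QUESTIONS[0]: 3,
    MUSHINESS_QUESTIONS[1]: 4,
    MUSHINESS_QUESTIONS[2]: 3,
    MUSHINESS_QUESTIONS[3]: 2,
}
print(mushiness_score(respondent), is_mushy(respondent))  # 3.0 True
```

In practice, pollsters word and weight these gauge questions far more carefully; the point is simply that a few engagement questions can be combined into a single indicator of how firmly an opinion is held.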
Insincerity
Sometimes respondents do have an opinion about a poll's subject but choose not to divulge it, as described in this unit's first required text, "When Voters Lie," an article published in the Wall Street Journal last summer.
As with nonattitudes, insincerity affects some polling subjects more than others. In contrast to nonattitudes, though, the problem surfaces less from respondents' lack of knowledge about the poll's subject than from their psychological urge to come off as "socially desirable" to the poll administrator. So insincerity is often a problem when people are polled about their behaviors, whether pertaining to their hygiene, their participation in illicit activities, or even whether they vote in presidential elections. People are also more likely to provide insincere responses when asked about attitudes they hold that might be perceived by others as "socially unacceptable" or "politically incorrect." As quite a few of the required texts in this unit suggest, polling subjects with an explicit racial angle are especially susceptible to bias stemming from voter insincerity.
The simplest way to minimize biased results stemming from insincerity is to have respondents self-administer the poll -- that is, to read the questions and record their responses themselves rather than have an interviewer do it. This isn't always possible, though, especially since telephone polling is often the most effective way to conduct a timely, large-scale poll. Coming up with new ways to get around the problem of respondent insincerity is therefore a perennially hot topic in public opinion research. You can hear about one example of an alternative approach in another of this unit's required texts, a YouTube video clip of an interview with an award-winning graduate student presenter at last summer's American Association for Public Opinion Research (AAPOR) conference.