Most of our work here at Pure Poison deals with taking on specific instances of intellectual dishonesty in the media. We’re aiming to dismantle the flawed arguments and promote a genuine exchange of ideas. But it’s also important to step back and look at the processes of thinking and communication that help intellectual dishonesty to flourish. That’s where Pure Science comes in.

A study by researchers at the University of California, Berkeley, investigated people’s perceptions of bias in scientific research. UC Berkeley has issued a media release about the study, but the paper itself was published in the February 2009 issue of the journal Political Psychology and is currently available for free online. Let’s take a look at the study and what it can tell us about how columnists and their readers might misinterpret research findings.

What did they study?
The researchers were interested in whether research on controversial yet important topics, such as public policy decisions about how to approach justice, education and health care, might have limited impact because of people’s existing beliefs on those issues. In many cases, those beliefs are likely to be related to political ideology. The researchers wanted to investigate whether people tend to discount research findings that don’t fit with their existing beliefs as being a product of the researcher’s own beliefs.

How did they study it?
The research was part of a larger telephone survey completed by respondents from 1,050 randomly selected California phone numbers. Each participant was asked to respond to a description of a hypothetical research finding about the effectiveness of a public policy intervention, but the descriptions were experimentally manipulated so that each participant heard a randomly selected combination of topic and finding.

The five research topics involved interventions that typically receive different levels of support across the political spectrum:

  • Gun control (favoured by liberals)
  • Medical marijuana (favoured by liberals)
  • Death penalty (favoured by conservatives)
  • School vouchers (favoured by conservatives)
  • Using doctors rather than actors in nutrition ads (politically neutral; included as a baseline measure)

Whichever topic was described to a given participant, he or she was then told either that the research had found the intervention to be effective (e.g., the death penalty deters crime) or ineffective (e.g., the death penalty fails to deter crime).
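
To make the design concrete, here is a rough Python sketch of how this kind of randomised topic-by-outcome assignment might be set up. It is purely illustrative: the labels, vignette wording and seed are mine, not the authors’.

```python
import random

# Purely illustrative sketch: the topic labels and vignette wording are
# paraphrased from the description above, not the authors' survey script.
TOPICS = [
    "gun control",                  # favoured by liberals
    "medical marijuana",            # favoured by liberals
    "the death penalty",            # favoured by conservatives
    "school vouchers",              # favoured by conservatives
    "using doctors rather than actors in nutrition ads",  # neutral baseline
]
OUTCOMES = ["effective", "ineffective"]

def assign_vignette(rng: random.Random) -> dict:
    """Randomly pair one topic with one outcome for a single respondent."""
    topic = rng.choice(TOPICS)
    outcome = rng.choice(OUTCOMES)
    # The real items described concrete outcomes (e.g. "deters crime");
    # "is (not) effective" stands in for that here.
    statement = (f"A recent study found that {topic} is "
                 f"{'effective' if outcome == 'effective' else 'not effective'}.")
    return {"topic": topic, "outcome": outcome, "vignette": statement}

rng = random.Random(42)
respondents = [assign_vignette(rng) for _ in range(1050)]  # one per respondent
print(respondents[0])
```

Because topic and outcome are assigned independently at random, any systematic differences in how participants react can be attributed to the manipulation rather than to who happened to hear which description.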

After hearing about the study, participants were asked how sceptical they were of the finding, a measure made up of questions about how surprising and how believable they found it. Participants were also asked what they thought the researcher’s political views and orientation might be. Another part of the phone survey asked participants about their own attitudes toward the five specific issues above, as well as their general political orientation.

What did they find?
The results showed statistically significant but weak correlations between attitudes toward the issues and political ideology, with the death penalty showing the strongest relationship (capital punishment was opposed by 20% of conservatives, compared to 55% of liberals). Most participants said that the description of the study’s findings would not change their attitude at all – hardly surprising, given that it was an imaginary finding.

But the findings that get to the heart of the study’s aims came from two sets of analyses: the first examined which factors predicted scepticism about the research findings, and the second looked at which factors predicted inferences (attributions) about the researchers’ own attitudes.
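
For those curious what such analyses might look like in practice, here is a simplified Python sketch using ordinary least squares for the scepticism analysis and a logistic regression for the attribution analysis (via the statsmodels library). The variable names, simulated data and models are entirely my own stand-ins; the authors’ actual analyses were more detailed.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated, purely illustrative data: one row per respondent. The variable
# names, effect sizes and models below are my own stand-ins, not the authors'.
rng = np.random.default_rng(0)
n = 1050
df = pd.DataFrame({
    "contradicts_attitude": rng.integers(0, 2, n),  # finding opposed own view?
    "conservative": rng.integers(0, 2, n),          # respondent's ideology
    "effective": rng.integers(0, 2, n),             # vignette said policy worked?
})
# Fake scepticism score (higher = more sceptical), loosely echoing the
# pattern of results described below.
df["scepticism"] = (3 + 0.8 * df["contradicts_attitude"]
                    + 0.3 * df["conservative"] + rng.normal(0, 1, n))
# Fake binary attribution: did the respondent call the researcher a liberal?
df["saw_liberal_researcher"] = rng.integers(0, 2, n)

# First set of analyses (simplified): which factors predict scepticism?
print(smf.ols("scepticism ~ contradicts_attitude + conservative + effective",
              data=df).fit().summary())

# Second set (simplified): which factors predict inferring that the
# researcher is a liberal?
print(smf.logit("saw_liberal_researcher ~ contradicts_attitude"
                " + conservative + effective", data=df).fit().summary())
```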

Scepticism
Participants tended to be more sceptical of research findings:

  • that contradicted their own attitude about that issue;
  • when the research finding went against their political ideology (even after controlling for attitudes on the specific issues);
  • when the study showed an intervention was effective (rather than ineffective), or when the outcome favoured liberals (e.g., the death penalty does not deter crime, or gun control does reduce crime); and
  • if the participants were conservatives rather than liberals, although this did not translate into a significant difference between Democrats and Republicans.

Attributions
More than half of the participants were willing to offer an inference about the researcher’s political ideology. There was an overall tendency to view researchers as more likely to be liberals (which is consistent with actual research on political views among academics). However, the pattern of attributions was affected by the research topic and outcomes. Participants were more likely to:

  • offer an inference about the researcher’s ideology for the four ideologically relevant topics than for the neutral one; and
  • infer that the researcher was a liberal when the research finding showed an intervention to be effective (rather than ineffective) or when the outcome favoured liberals. However, there was not a corresponding pattern of inferring the researcher was conservative for findings that favoured conservatives.

But the participants’ own political attitudes also had an effect on attributions about the researcher. In particular, conservative respondents were more likely to make inferences that the researcher was a liberal when the study’s findings favoured liberals. On the other hand, liberals were not more likely to infer that the researcher was a conservative when the findings favoured conservatives.

What does this tell us?
The results suggest that research findings which support liberal approaches to public policy are more likely to be regarded with scepticism, and that this scepticism seems to be associated with concerns about the ideological bias of the researchers. These perceptions of bias are more likely to come from those who are conservative in general, or who hold conservatively aligned attitudes on the specific issue the research looked at. These findings seem consistent with a lot of the reactions to research that we see in conservative columns and blogs, and in responses from the commenters on those sites. And although they were explicitly artificial, the descriptions of research findings are similar to what we typically see presented in the mainstream media – brief, superficial and lacking the detail needed for critical evaluation. Under those conditions, there appears to be a tendency to see Leftist influence on the research endeavour – and the source of the research becomes the focus, rather than the integrity and quality of the research itself.

In Australia, we have seen a Senate Committee investigate conservative claims of academic bias in universities and schools. We regularly see scientific research and academic institutions criticised as having philosophical and/or ideological motivations to conduct research that supports certain outcomes (e.g., anthropogenic global warming). This study provides evidence for one type of bias in judgment that may contribute to these types of claims.

But that doesn’t mean those of us who lean to the left can sit back with a smug sense of self-satisfaction. Liberals still appear to be more suspicious of findings that contradict their existing beliefs. It’s good to be sceptical, but that scepticism needs to be applied equally, without being influenced by the nature of the findings. And as the authors of this study note, the tendency to see liberal but not conservative bias may simply reflect the fact that researchers are more likely to be liberals.

Rigorous, objective research should be able to serve as evidence in the debate over public policy. Rather than dismissing any research on ad hominem grounds, everyone involved in that debate needs to focus on the research itself. If the findings are genuinely affected by ideological bias, point to the evidence of ideological contamination in the study. We need to avoid this natural tendency to point to the researcher just because the findings don’t fit with what we believe.

Reference
MacCoun, R. J., & Paletz, S. (2009). Citizens’ perceptions of ideological bias in research on public policy controversies. Political Psychology, 30(1), 43-65.
