Saturday, October 23, 2010

People Are Allergic To The Facts

New research finds we trust experts who agree with our own opinions, suggesting that subjective feelings override scientific information.

By Tom Jacobs
Source: "Miller-McCune.com"
October 8, 2010
Courtesy Of "Alter Net"



A clear consensus of opinion emerges within the scientific community on an important issue, such as climate change. But the public, and its elected leaders, remains unconvinced and unreceptive to well-founded warnings.
With this phenomenon growing frustratingly familiar, researchers can be forgiven if they begin to feel like Rodney Dangerfields in lab coats. From their perspective, they don’t get no respect.
Newly published research suggests that’s not entirely true: Americans do believe and trust researchers. But we focus our attention on those experts whose ideas conform with our preconceived notions. The others tend to get discounted or ignored.
“Scientific opinion fails to quiet societal disputes on such issues (as climate change) not because members of the public are unwilling to defer to experts, but because culturally diverse persons tend to form opposing perceptions of what experts believe,” a team of scholars writes in the Journal of Risk Research. “Individuals systematically overestimate the degree of scientific support for positions they are culturally predisposed to accept.”
The research team, led by Dan Kahan of Yale Law School, studied “a broadly representative sample” of 1,500 Americans in 2009. Through a series of questions, the researchers measured participants’ cultural beliefs on what can be called a left-right scale (although the researchers do not use the terms “liberal” and “conservative” in their paper). Those strongly holding egalitarian and communitarian outlooks were on one end of the spectrum, while those with hierarchical and individualistic views were on the other.
Participants were then presented with a series of statements and asked whether in their view most experts concurred with them. Three of the statements represented the consensus of scientific opinion: “Global temperatures are increasing,” “Human activity is causing global warming,” and “Radioactive wastes from nuclear power can be safely disposed of in deep underground storage facilities.”
Finally, the participants were introduced to a fictional expert on one of those subjects, who either agreed or disagreed with their position. After reviewing the expert’s credentials and reading a bit of his writing, each participant rated the degree to which they found the expert knowledgeable and trustworthy.
On the first question, participants were predisposed to believe that a majority of experts agreed with their personal point of view. Among those with an egalitarian/communitarian mindset (i.e., liberals), 78 percent reported (correctly) that there is a scientific consensus that climate change is occurring.
But among those with a hierarchical/individualistic attitude (i.e., conservatives), only 19 percent said there was a scientific consensus that climate change is real. Fifty-six percent reported the scientific community is divided on the issue, and another 25 percent insisted that most scientists agree with them that climate change is not real.
Before you start cursing closed-minded conservatives, consider this: When it came to the issue of safely storing nuclear waste, the opposite effect was found (although the differences between the two groups were not as large). Thirty-seven percent of those on the right reported, correctly, that the scientific consensus supported their view. In contrast, 35 percent of those on the left inaccurately believed the scientific consensus reflected their opinion.
When it came to assessing the imaginary expert, the attitude he purportedly held “dramatically affected the responses” of the participants, Kahan and his colleagues report. If his writing sample reflected the belief the planet is at high risk from global warming, 87 percent of those on the left agreed he was trustworthy and knowledgeable, compared to 23 percent of those on the right. The numbers reversed when the expert, with the exact same credentials, stated that climate-change risks are low: 86 percent of conservatives called him knowledgeable and trustworthy, compared to 47 percent of liberals.
The bottom line: We seek out, and find comforting confirmation from, experts who agree with our pre-existing beliefs. The researchers call this a matter of “cultural cognition.” As Kahan put it in a recent editorial in the journal Nature:
“People find it disconcerting to believe that behavior that they find noble is nevertheless detrimental to society, and behavior that they find base is beneficial to it. Because accepting such a claim could drive a wedge between them and their peers, they have a strong emotional predisposition to reject it.”
Whether that intense inclination is exclusively peer-based or due in part to internal factors, such as a genetic predisposition to a particular ideological outlook, remains an open question. But either way, scientists and policymakers who base their decisions on science have a problem. Kahan and his colleagues offer some possible ways around this dilemma, but they’re admittedly sketchy and tentative.
“To overcome this effect,” they write, “communicators must attend to the cultural meaning as well as the scientific content of information.” The researchers recommend “crafting messages to evoke narrative templates that are culturally congenial to target audiences.”
Kahan gives a couple of examples in his Nature editorial, suggesting that people with individualistic values might be more receptive to climate change arguments “if made aware that the possible responses to climate change include nuclear power and geoengineering, enterprises that to them symbolize human resourcefulness.
“Similarly, people with an egalitarian outlook are less likely to reflexively dismiss evidence of the safety of nanotechnology if they are made aware of the part that nanotechnology might play in environmental protection, and not just its usefulness in the manufacture of consumer goods,” he adds.
In other words, scientists need to do some radical reframing if they hope to get through to people whose world view is threatened by their results. While there’s no guarantee that effort will succeed, Kahan and his colleagues convincingly argue that the effort is “critical to enlightened democratic policymaking.” Otherwise, our gut instincts will remain the experts we rely upon the most.
Tom Jacobs is a veteran journalist with more than 20 years’ experience at daily newspapers. He has served as a staff writer for the Los Angeles Daily News and the Santa Barbara News-Press. His work has also appeared in the Los Angeles Times, Chicago Tribune and Ventura County Star.
