Impact of reading about climate science goes away almost instantly
For decades, the scientific community has been nearly unanimous: Climate change is real, it’s our doing, and its consequences are likely to be severe. Yet even as its effects become harder to avoid, poll after poll shows that the public hasn’t gotten the message. There’s very little recognition of how strong the scientific consensus is and a lot of uncertainty about whether climate change is our doing—and none of those polling numbers seem to shift very quickly.
Over these same decades, there have been plenty of studies looking at why this might be. Many of them have found ways to shift the opinions of study subjects—methods that have undoubtedly been adopted by communications professionals. Yet the poll numbers have remained stubborn. Misinformation campaigns and political polarization have both been blamed, but the evidence for these factors making a difference is far from clear.
A new study offers an additional hint as to why. While polarization and misinformation both play roles in how the public interprets climate science, the biggest problem may be that the public has a very short memory, and anything people learn about climate science tends to be forgotten by a week later.
Time after time
To test people’s responses to climate information, researchers gathered a set of materials that had appeared in major publications. Some weren’t climate-related and served as a “placebo.” Others were coverage of an earlier report from the Intergovernmental Panel on Climate Change. Finally, there was a set of articles that focused on partisan disagreements regarding climate change (as opposed to the scientific content) and a set of opinion pieces that argued against accepting the scientific evidence.
The study focused on creating a variety of paths through this material, with different readings in consecutive weeks. For example, one group of participants might receive science all the way through, while another might get science one week and then have it contradicted by an opinion piece the week after. The goal was to detect whether exposure to science had a lasting effect or whether it could be undercut by either time or misinformation.
The risk here was that having so many potential paths through the information would mean that only a few people went down each particular path, making any results statistically suspect. The researchers overcame this by recruiting a lot of participants—nearly 3,000 people completed the entire multi-week process. To do so, they had to rely on Mechanical Turk, a service where some users have been known to script their responses. But a number of studies have shown that Mechanical Turk results can be replicated in in-person studies, so the researchers considered it sufficiently reliable.
The experiment ran over four weeks. In the first, the researchers established a baseline of the participants’ existing beliefs about climate change. That was followed by two weeks of reading articles and additional polling. Week four consisted solely of a final poll to determine whether the previous weeks’ reading had changed any opinions.
Ups and downs
The study showed that facts make a difference. When exposed to science news stories about the climate, people were more likely to accept that climate change is happening and is caused by humans. This also had consequences for policy views: People who read science news favored government action and an expansion of renewable energy. The two alternative presentations—partisan conflict and anti-science opinion pieces—had no statistically significant effect on the acceptance of climate change. (The anti-science opinion pieces did reduce support for renewable energy, though.)
The problem is that facts don’t make a difference for very long. Even the participants who got two weeks of science news in a row saw their acceptance of the science drop a week later. A week of science followed by either unrelated news stories or coverage of partisan arguing produced an even faster drop. And a week of science followed by a week of anti-science opinion pieces did exactly what the writers of those opinion pieces might hope for: The two essentially canceled each other out.
Unsurprisingly, support for policy action generally followed the drop in acceptance of the science that should drive the policy.
Throughout all of this, partisan biases seemed to play a relatively small role. They were most apparent when participants were exposed to the anti-science opinion pieces, which had a stronger impact on Republicans. Those articles also had a stronger effect on people who rejected climate science at the start of the study, although that population likely overlaps heavily with Republicans.
If you’ve had the feeling that we need to reconvince the US public of the dangers of climate change every few months, that idea seems to have a basis in reality. Exposing people to scientific information seems to work, but only briefly—and very briefly if they subsequently get exposed to misinformation in the form of opinion pieces. The researchers involved refer to holding accurate information as “fragile” and conclude that “the persistence of misperceptions reflects both the limits of human memory and the constraints imposed by the political information environment.” In that regard, these results are likely to apply well beyond climate change.