Motivated Reasoning and Science in Politics

What we feel actually changes what we see.

There’s a whole branch of psychology that investigates this phenomenon, and it has serious implications for science and for how science is used in policy and politics. It’s called “motivated reasoning,” and you’re probably already familiar with one version of it.

Back in the 1950s, a few psychologists decided to use football as a mechanism for measuring motivated reasoning. They gathered students from two rival Ivy League schools to watch film of a game between their teams. The film included several controversial calls, and the psychologists were watching for how the students interpreted those calls.

Surprising basically no one, they found that the students were more likely to agree with a call when it favored their own team or burned their rival. It wasn’t just willful ignorance: loyalty to one team over the other actually changed what the students believed they saw on the tape. The emotional stake the students had in the game affected not just how they felt about it, but what they actually saw happening in front of their eyes. The tape, the neutral arbiter of fact in this case, wasn’t sufficient to resolve the disagreement. Studies since have observed a similar phenomenon over and over: cognitive bias toward our own team, or our own way of viewing our team, still reigns today.

Motivated reasoning, or motivated cognition, is the psychological model explaining how we fit information to conclusions that match our value systems. (Tara Haelle wrote a fantastic article on this topic at NPR, and I highly recommend reading it.) As Ms. Haelle puts it, “Motivated reasoning is the psychology concept that explains why people move the goalposts in an argument.”

We see motivated reasoning popping up in major political debates as well. The practical outcome is that education doesn’t always fix ignorance on science-related issues. Two useful examples are climate change and vaccination. There is mounting evidence that teaching people about climate science or the safety of vaccines does little to actually change the views of opponents.

***

Several recent polls have looked into people’s attitudes toward climate science, cross-referenced with political leanings. They found a fundamental split in how factual science affects people’s views. For Democrats and liberals, formal education increased the likelihood of accepting the scientific consensus on global warming. For Republicans and conservatives, increased formal education actually decreased that likelihood. Put another way, the more conservatives thought they knew about climate change, the less likely they were to accept the evidence for it.

Multiple polls since, including a recent one by the Pew Research Center, seem to indicate that a substantial portion of the population doesn’t really understand the scientific consensus around climate change. The Pew poll found that roughly 26% of the population thinks there is no solid evidence that the average temperature on Earth has been getting warmer over the past few decades.

Polls like this have convinced advocates that public education is the solution to breaking the logjam. You’ve probably seen some variation of the statement that “97% of climate scientists” believe that human activities are causing climate change. Surveys like the one from Pew are the reason why: showing extremely high scientific consensus can help communicate a very high level of certainty in the science itself. As a scientist concerned about climate, I understand why other scientists who agree with me find that convincing. And while it can be a powerful argument, it rests on a fundamental assumption: that education and facts will change people’s minds, if only they knew some critical piece of information.

***

In the summer of 2013, The View named Jenny McCarthy as a new co-host, and my Facebook feed promptly blew up. Many of my scientist friends were losing their collective minds over the elevation of a woman who has been a leading voice in the anti-vaccination movement. I wasn’t surprised, but I was a little shocked at how viscerally angry people were.

In 1998 Andrew Wakefield, a British former surgeon and medical researcher, published a paper in the medical journal The Lancet claiming to find evidence that the measles, mumps, and rubella (MMR) vaccine could induce autism. Subsequent reviews and studies have shown the paper to be totally false, and The Lancet eventually retracted the thoroughly discredited paper in full, though only years later. However, the damage was already done. Today, vaccination rates are dropping, driven by parents who honestly fear for their children and question the medical consensus that vaccines are safe. The recent outbreak of measles in southern California has been widely attributed to one unimmunized child whose parents didn’t realize their child was sick when spending a day in the park. Since that outbreak, there has been growing discussion and outrage at how these anti-vaxxers are breaking the social contract, endangering others for their own irrational reasons.

I was a little thrown at how angry my Facebook feed got, not just at McCarthy, but at these parents. It’s happened again since the California measles outbreak began. People kept posting study after study and article after article about how the anti-vaccination movement is dangerous and stupid and flies in the face of social progress and modern medicine.

On many levels, they’re absolutely right to be highly critical of these parents. There’s little-to-no evidence that vaccinations carry much risk, and refusing vaccinations for their children risks not only their own kids’ lives, but the lives of the people around them. (Note that there are Disneyland employees who also contracted the measles.) But focusing on that also misses the point. These parents aren’t making decisions based on science; they’re basing them on fear. And, just like in the climate debate, debunking false facts doesn’t necessarily change anyone’s opinion.

A new study published in Vaccine in November 2014 investigated whether correcting misconceptions about the flu vaccine actually changes anyone’s behavior. Just as in other controversial areas of science, education reduced belief in erroneous facts, but it actually made people who already feared vaccines even less likely to get them. (Again, Tara Haelle wrote a fantastic article about this study at NPR, and I highly recommend reading it.)

***

According to Alan Leshner, head of the American Association for the Advancement of Science (AAAS): “When scientists see that science is being distorted, they feel compelled to stand up and say, ‘No, that’s not true.’” But it can be equally true that, even when the public understands the science, they may simply dislike the answer.

A basic tenet of scientific inquiry is that everyone with the same factual information – regardless of culture, race, creed, or anything else – can come to the same conclusion. So it’s not necessarily surprising that I’ve heard scientists making the argument that “If only you knew what I knew, you’d believe what I believe.”

The problem with that statement is that it equates factual knowledge with our value systems. Facts and values are connected – they inform each other, and each colors how we perceive the other – but they are not the same. Stated plainly, the argument amounts to “If you had all of my knowledge, you’d share all of my values.”

Science is the mechanism by which our society mediates factual disputes. It can tell us how much of a chemical is present in water, the impacts of obesity on disease, the changing temperature of the earth over millennia. But science doesn’t make value judgments. It’s intended to provide knowledge and truth, as best as we can understand it.

Politics, on the other hand, is the mechanism by which our society mediates values disputes. Deciding to regulate chemical production and usage, putting policies in place to encourage good health, or mitigating climate change impacts are political (and values) choices. Whatever personal or moral obligation we feel toward one policy choice over another stems from our own value systems, whatever their source or direction.

Despite this difference, science and politics will always impact each other. The line between the two is not clear-cut; there are feedback loops that run through every individual and community. We may think of them as fundamentally different – and they are – but they are not isolated from each other. The facts we trust and the values we hold will always inform, impact, and change each other. Just because they are different does not mean they are separable.

As I said at the beginning, how we feel about things impacts how we think about them. A lot of the time, when we think we are reasoning, we’re actually rationalizing. It’s how we’re wired, but it has a serious impact on the policies and politics of our society. It’s also a big part of why scientists get frustrated with the political process. Those of us who want to engage in that process need to keep in mind that the values and beliefs of the public can be just as important as the accuracy of our information.


Connect with Chris on Twitter @cwavery!

