
Sunday, July 5, 2015

Ideology, Motivated Reasoning, and Cognitive Reflection

Despite what many social psychologists claim, humans are mostly rational.  When we're thirsty, we drink water.  When we're hungry, we eat food.  When we're on the edge of a cliff about to fall off, we look for ways to save ourselves.  For the large majority of the decisions we make every day, we take actions that satisfy our goals.  In certain circumstances, however, we are systematically biased.  Take, for instance, the domain of politics.  It's understandable for there to be disputes over values: privacy vs. security, economic freedom vs. equality, and so on.  What's less understandable, however, is that people get just as heated over matters of fact.  Why do liberals believe that humans are the cause of climate change while conservatives don't?  Why do conservatives believe that gun control would increase crime while liberals believe the opposite?  Liberals will say it's because conservatives are biased, and conservatives will say it's because liberals are biased.  Who's correct?

To begin, let's catalogue the factors that may lead to ignorance of the empirical facts.  There are at least three.  The first is that humans aren't always thoughtful or deliberate.  Instead, we use heuristics, mental shortcuts for arriving at a desired outcome.  For instance, we rely on scientific experts to tell us the truth rather than seeking it out for ourselves.  Our choice of experts, however, is often itself the result of nondeliberative thinking.  This unfortunate quirk of human decision making may leave people uninformed or misinformed.  The second factor is motivated reasoning.  That is, even if people engage their more reflective cognitive processes, they may do so in a way that steers them away from the truth.  When a person is motivated to maintain a relationship or preserve their identity, they may selectively interpret the evidence to suit goals other than truth-attainment.  Finally, a person may have a reasoning style that interferes with the attainment of truth.  There is a considerable body of research suggesting, for instance, that conservatism is associated with dogmatism, an aversion to complexity, and a craving for closure in an argument.  These cognitive traits may hinder a person in their pursuit of truth.

It's unclear, however, how these three factors interact to generate belief polarization.  In his article, Dan Kahan outlines three possibilities.  First is what he calls the "Bounded Rationality Position" (BRP).  According to BRP, our heuristic-driven reasoning is the most decisive factor in generating public discord over empirical matters.  On this view, laypeople inadequately engage in effortful information processing.  As a heuristic, then, these nondeliberative folk will tend to trust the received wisdom of their particular in-group, which in turn leads to greater belief polarization.  A second alternative is what Kahan calls the "Ideological Asymmetry Position" (IAP).  IAP posits that right-wing ideology matters most in distorting empirical judgments.  Like BRP, IAP takes reasoning to be heuristic-driven and inadequately engaged.  This is said to be especially true of conservatives in light of previous correlational research on their cognitive traits.  Because liberalism is associated with, among other things, open-mindedness, it might be thought that liberals would be less vulnerable to the siren song of political bias.  The final account Kahan considers is what he calls the "Expressive Utility Position" (EUP).  This position, unlike both BRP and IAP, posits that motivated reasoning is the most important factor in belief polarization.  On this view, a person's primary motivation when looking at data is to protect their identity, and they will do so by selectively searching for and interpreting the evidence to fit the views of their particular in-group.  Reasoning, then, is not inadequately engaged.  Far from it.  Reasoning will tend to magnify ideological differences, not mitigate them, and this will be true across the political spectrum.

So there's the theory; now where's the test?  In the first part of his study, Kahan presented participants with the Cognitive Reflection Test (CRT), a quick three-item test developed by Frederick (2005).  The CRT is generally used to measure a person's disposition to engage in conscious, effortful information processing as opposed to heuristic-driven processing.
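To give a flavor of the test, consider its best-known item, the bat-and-ball problem from Frederick's original paper (the walkthrough here is my own, not part of Kahan's study): a bat and a ball cost $1.10 in total, and the bat costs $1.00 more than the ball.  How much does the ball cost?  The answer that springs to mind is 10 cents, but a moment of reflection shows it's wrong.  Letting x be the price of the ball:

\[
x + (x + 1.00) = 1.10 \;\Longrightarrow\; 2x = 0.10 \;\Longrightarrow\; x = 0.05
\]

The ball costs 5 cents.  Someone who blurts out "10 cents" has let a heuristic do the work; someone who checks the arithmetic has engaged in exactly the kind of reflection the CRT is designed to detect.

Following the CRT, participants were split into three conditions.  In one condition, participants were told the following: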
Psychologists believe the questions you have just answered measure how reflective and open-minded someone is.
The second condition tacked on an additional bit of information:
In one recent study, a researcher found that people who accept evidence of climate change tend to get more answers correct than those who reject evidence of climate change […] a finding that would imply that those who believe climate change is happening are more open-minded than those who are skeptical that climate change is happening.
The third condition replaced the above paragraph with this one:
In one recent study, a researcher found that people who reject evidence of climate change tend to get more answers correct than those who accept evidence of climate change […] a finding that would imply that those who are skeptical that climate change is happening are more open-minded than those who believe climate change is happening.
Participants were then asked how valid they personally believed the CRT was in assessing how reflective and open-minded they were.  Because open-mindedness is almost universally considered a positive trait, participants have an emotional stake in believing their group to be more (or at least not less) open-minded than their ideological opponents.  Hence, if a liberal were biased in favor of liberalism, they would discount the CRT's validity in the third condition while accepting the CRT's validity in the second.  The opposite would be true for conservatives; they would discount its validity in the second condition and accept it in the third.  All the theories above predict some motivated reasoning, but they each generate different hypotheses about the form that such reasoning would take.


  • IAP predicts that motivated reasoning should be especially pronounced among conservatives.  Liberals, on the other hand, should have roughly similar validity ratings regardless of condition.  Furthermore, since IAP ties right-wing ideology to a less reflective cognitive style, conservatives should score worse than liberals on the CRT itself (given that the CRT actually does measure cognitive reflection).
  • BRP predicts that people who score poorly on the CRT will be more inclined to express polarized sentiments.  That is, low cognitive reflection will lead to more bias.  This result, however, should hold regardless of one's political affiliation.  Political affiliation should also be unrelated to actual CRT scores.
  • EUP is the reverse of BRP.  It predicts that polarization will increase as CRT score increases.  That is, contrary to BRP, greater cognitive reflection will lead to more bias.  Similar to BRP, however, and unlike IAP, EUP is neutral on political affiliation.  (A toy sketch contrasting all three predictions appears just after this list.)
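To make the contrast concrete, here is a minimal sketch of the three predicted patterns.  To be clear, this is my own toy model, not Kahan's analysis: the functional forms and coefficients are invented purely for illustration.  "Polarization" here stands for the gap in perceived CRT validity between the condition that flatters one's group and the condition that threatens it.

```python
import numpy as np

# Toy model of the three positions' predictions (illustrative only; the
# functional forms and coefficients below are invented, not Kahan's).
# "Polarization" = gap in perceived CRT validity between the condition
# that flatters one's group and the condition that threatens it.

crt = np.linspace(0, 3, 7)  # CRT score: 0 to 3 items correct

# BRP: reflection dampens bias for everyone, regardless of ideology.
brp = {"liberal": 1.0 - 0.3 * crt, "conservative": 1.0 - 0.3 * crt}

# IAP: bias is concentrated on the right; liberals stay roughly flat.
iap = {"liberal": np.full_like(crt, 0.2), "conservative": 0.8 + 0.2 * crt}

# EUP: reflection amplifies bias symmetrically across the spectrum.
eup = {"liberal": 0.3 + 0.4 * crt, "conservative": 0.3 + 0.4 * crt}

for name, model in (("BRP", brp), ("IAP", iap), ("EUP", eup)):
    for group, gap in model.items():
        print(f"{name:>3} {group:>12}: {np.round(gap, 2)}")
```

The shapes, not the numbers, are what distinguish the theories: a downward slope for everyone (BRP), a slope on only one side of the spectrum (IAP), or an upward slope for everyone (EUP).  Kahan's actual test is statistical, of course, but these are the qualitative signatures his data could adjudicate between.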

Before moving on to the results, take a moment to guess which theory you think was best supported.

The first step of data analysis was to compare liberals and conservatives on CRT scores.  Contrary to IAP, conservatives actually scored significantly better than liberals on the CRT, a surprising result given the literature on conservatives' other cognitive traits.  The results of the second step of data analysis were also contrary to IAP's predictions.  Both liberals and conservatives split in their perceptions of CRT validity depending on which condition they were in.  Conservatives believed the CRT was valid when it favored their own side but not so when it didn't.  The same pattern was evident among liberals as well.  So much for IAP then…

Next up on the chopping block is… *drum roll* … BRP!  As you'll recall, BRP predicts that as people's CRT scores increase, they should exhibit less ideological bias.  Cognitive reflection, on this view, should have a salutary effect on polarization.  This was not borne out by the data.  Indeed, as people scored higher on the CRT, they exhibited more and more partisanship.  In particular, liberals who scored high on the CRT were markedly more averse to accepting the CRT's validity when in the third condition (in which climate change skeptics appeared more open-minded).  Conservatives who scored high on the CRT, meanwhile, welcomed it with open arms in that same condition.  Taken together, these findings jibe with the predictions of EUP: greater cognitive reflection leads to greater political bias for both liberals and conservatives.

In my previous post, I discussed Jonathan Haidt's Social Intuitionist Model, a model of moral judgment that describes humans in a less than favorable light.  Humans, it is said, arrive at their moral judgments through automatic intuitive processes and later use their powers of reason, not to correct potential errors, but rather to rationalize those errors away.  Like a magician waving his wand, the motivated reasoner can make any problem disappear.  Of course, the problems are still there.  They're just hiding in a hat now instead of out in the open.  Kahan's research supports this interpretation of moral judgment: those who were most reflective were also the best magicians, better able to twist their judgments to fit the narratives of their in-groups.

This picture, though grim and pessimistic, is the one we must look to if we hope to paint a brighter future for humanity.  Kahan's research suggests that it is not enough to nudge people's intuitive judgments towards the truth, as many others have suggested; that will only do so much.  Given that a large portion of our bias comes from our more reflective moments, we must also remove the incentives people face for forming beliefs on grounds unconnected to the truth of those beliefs.  Indeed, in a sense, it's rational for, say, a conservative to believe that gun deregulation will decrease crime: not because this belief is particularly supported by the evidence, but because by holding it he expresses his commitment to his group, and his group will in turn support him.  It's rational, too, for a liberal to believe genetically modified foods are devil spawn, again, not because it's supported by evidence, but because believing such things boosts their status within their group.  Removing or circumventing these cultural associations will be the work of future researchers.

Of course, this polarization is not true of every topic.  In fact, most topics don't exhibit such polarization (e.g., that the Moon revolves around the Earth, that earthquakes result from shifting tectonic plates, that height is heritable).  As I said at the outset of this post, people are mostly rational.  But for those domains where we have some trouble, like politics, more research is needed to figure out exactly how we get things wrong and how we can fix our mistakes.


Frederick, S. (2005). Cognitive reflection and decision making. Journal of Economic Perspectives, 19(4), 25-42.

