Summary:
Opinions about moral
and political issues are like iPhones and Facebook profiles: everybody has one,
but not everyone keeps theirs up to date.
Oftentimes, people try to preserve their favored opinion by rationalizing
away any new evidence. Indeed, according
to Jonathan Haidt's Social Intuitionist Model (SIM), the majority of our moral
judgments are arrived at through non-conscious, automatic, intuitive
processing and are only later justified by post hoc, biased reasoning.
In support of this
model, Haidt draws on a large body of research that details distortions in
human cognition. For example, it's been found that when people expect to discuss an issue with a partner (especially a
friend) whose attitudes are known, they tend to shift their attitudes to
align with their partner's. When
the partner's attitudes are not known, people tend to moderate their opinions
to minimize disagreement (Chen, Shechter, & Chaiken, 1996). This type of
attitude revision is attributed to what's called the "relatedness
motive": when people want to get
along with others, they selectively shift their opinions.
A second motive said to distort our moral judgments is the
"coherence motive." People
moved by this motive want to preserve their identity, and consequently,
they eschew evidence that contradicts their core attitudes and beliefs and
uncritically accept evidence that confirms them. In one study, for instance, people were given
mixed evidence about the efficacy of capital punishment in deterring
crime (Lord, Ross, & Lepper, 1979). Those who went into the study
supporting capital punishment left with greater confidence. Those who went into the study against
capital punishment also left with
greater confidence. This flies in the
face of rational deliberation: when
given evidence inconsistent with one's beliefs, one should lower one's confidence
in those beliefs. Hence, the coherence
motive may distort the accuracy of our beliefs.
S. Matthew Liao (2011),
however, disagrees with Haidt's account of our moral judgments. He doesn't dispute that we are
influenced by our friends or that we seek to preserve our core beliefs and
attitudes. Instead, he disputes that
these tendencies should properly be considered biases.
To say that a person is biased is to say that they are not epistemically
justified in believing certain propositions.
A person may be epistemically unjustified if they lack sufficient
evidence for a proposition, or alternatively, if their belief is not
grounded in that evidence.
Liao argues that
people are typically justified in
shifting their beliefs to become consistent with those of their friends. To see why, consider what it means to be a
friend. A friend is someone whose judgment
you typically trust. When they express
a belief, you have reason to believe that their belief is not arbitrarily
arrived at. Further, suppose you and
your friend are about equal in intelligence.
It would be positively irrational not to take your friend's opinion into
account, and it would be arrogant to suppose that you could not be
mistaken. This reasoning applies nearly as well to strangers. Suppose
you disagree with a person who you have no reason to believe is exceptionally
irrational. Again, given that there's a
chance the stranger is correct and you are incorrect, you ought rationally to shift your own confidence, even if just a little bit. Thus, having the relatedness motive need not
entail that a person is biased.
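To make the arithmetic concrete, here is a minimal sketch in Python of the "linear pooling" picture this argument suggests; the credences and weights are hypothetical, chosen only to illustrate the direction and size of the rational shift.

```python
# Minimal sketch of "linear pooling" as one way to model the rational
# shift Liao describes. The credences and weights are hypothetical.

def pooled_credence(mine: float, theirs: float, my_weight: float = 0.5) -> float:
    """Weighted average of two credences in the same proposition."""
    return my_weight * mine + (1 - my_weight) * theirs

# You hold some moral belief with credence 0.8; your equally intelligent
# friend holds its negation, i.e. credence 0.3 in your belief.
print(pooled_credence(0.8, 0.3))                 # 0.55: an epistemic peer gets equal weight
print(pooled_credence(0.8, 0.3, my_weight=0.9))  # 0.75: a stranger shifts you just a little
```

The point survives any reasonable choice of weights: so long as the other person gets some nonzero weight, learning of their dissent rationally moves you toward them.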
What about the
coherence motive? Liao argues that the
coherence motive need not always lead to biased reasoning. Consider a hypothetical example. Suppose you believe that gun control will
lead to fewer violent deaths, and someone else believes the opposite. Now both of you are given the following two
mixed pieces of evidence:

(1) In one state with strict gun control, there is a greater than average number of gang wars, which has led to more violent deaths.

(2) In one state with lenient gun control, there have been more school shootings.
Here's how you and
the other person can both rationally
walk away with greater confidence in your initial beliefs. Suppose you believe, independently of the
debate about gun control's relation to violent deaths, that the presence of
school shootings decreases the outbreak of future gang wars (somehow). Suppose the other person believes, again
independently of the debate in question, that gang wars lead to fewer school
shootings. You would accept proposition
(2) as confirming your belief while explaining away proposition (1), and the
other person would do the opposite.
Thus, you would both rationally leave with greater confidence in your
beliefs.
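To see that this story is probabilistically coherent, here is a minimal Bayesian sketch; all priors and likelihoods are invented for illustration. Each agent updates on the same two pieces of evidence, but their auxiliary beliefs assign different likelihoods to each piece, so both rationally end up more confident.

```python
# Minimal Bayesian sketch of rational belief polarization.
# H = "gun control leads to fewer violent deaths". All numbers are
# hypothetical; only the direction of the updates matters.

def update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Bayes' rule: posterior probability of H after observing evidence E."""
    num = prior * p_e_given_h
    return num / (num + (1 - prior) * p_e_given_not_h)

# Agent A (pro gun control) thinks school shootings suppress future gang
# wars, so A explains away evidence (1) and treats (2) as strong confirmation.
p_a = 0.70
p_a = update(p_a, 0.4, 0.5)  # evidence (1): discounted, only weakly against H
p_a = update(p_a, 0.8, 0.3)  # evidence (2): strong confirmation of H

# Agent B (anti gun control) thinks gang wars suppress school shootings,
# so B treats (1) as strong disconfirmation of H and explains away (2).
p_b = 0.30
p_b = update(p_b, 0.3, 0.8)  # evidence (1): strongly against H
p_b = update(p_b, 0.5, 0.4)  # evidence (2): discounted, only weakly for H

print(f"A's confidence in H:     0.70 -> {p_a:.2f}")      # rises to ~0.83
print(f"B's confidence in not-H: 0.70 -> {1 - p_b:.2f}")  # rises to ~0.83
```

Nothing in this computation violates Bayes' rule; the polarization comes entirely from the agents' differing background beliefs about how the evidence bears on H.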
Critique:
Liao's defense of
the relatedness motive seems weak. It's
certainly irrational to believe in one's own infallibility. And it's also irrational to completely
discount an epistemic peer's opinion.
But it's also irrational to continue
to have one's beliefs shifted after having learned of the reasons behind the
disagreement. Once you know that your
friend has a belief because of x, y, and z, the fact that he is your friend
becomes irrelevant. Believing simply because your friend says so is
irrational. Yet it is this kind of
shifting of beliefs that (I think) is more common.
It is not that people shift their beliefs out of epistemic humility;
rather, they do so to maintain social relations.
And that is irrational.
Liao's defense of
the coherence motive also seems weak. He
concedes that people may be biased towards favoring their initial beliefs. His argument is simply that belief polarization
need not entail that people are biased.
Whether or not polarization is, in fact, a
result of bias is an empirical question, one which he claims Haidt does not substantiate. Though all this is true, it obscures where
the burden of proof lies. It is on Liao
to show why people systematically gain
confidence in their beliefs given mixed evidence. One would think that if people were assessing
the evidence independently of their
other beliefs, the resulting shifts in confidence would be roughly normally distributed around zero, with some people becoming more confident and others less. Instead, people invariably become more
confident in their initial beliefs. It is Liao who has to
explain how this could be so, not the other way around.
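The statistical point can be made vivid with a toy simulation, with invented effect sizes: if subjects weighed both halves of the mixed evidence independently of their prior, confidence shifts would scatter around zero; if they systematically discount the disconfirming half, shifts toward one's prior come out uniformly positive, which matches the polarization pattern.

```python
import random

# Toy simulation of confidence shifts after reading mixed evidence.
# Units and effect sizes are invented: the confirming half pushes a
# subject's confidence up by ~+1 unit, the disconfirming half down by ~-1.

def shift_unbiased() -> float:
    # Both halves weighed fully and independently of prior belief.
    return random.gauss(1.0, 0.5) + random.gauss(-1.0, 0.5)

def shift_motivated(discount: float = 0.4) -> float:
    # The disconfirming half is only partially weighed (coherence motive).
    return random.gauss(1.0, 0.5) + discount * random.gauss(-1.0, 0.5)

n = 10_000
mean_unbiased = sum(shift_unbiased() for _ in range(n)) / n
mean_motivated = sum(shift_motivated() for _ in range(n)) / n
print(f"unbiased  mean shift: {mean_unbiased:+.2f}")   # ~ +0.00: no systematic polarization
print(f"motivated mean shift: {mean_motivated:+.2f}")  # ~ +0.60: everyone drifts toward their prior
```

On this toy model, the near-universal gain in confidence is exactly what motivated discounting predicts and what prior-independent assessment does not, which is why the burden sits with Liao.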
The general
structure of Liao's arguments is like a wedge.
He tries to show that it is technically
possible to account for these results while preserving the rationality
of moral judgments, and from this he
hints that people's moral judgments are
in fact rational. This latter claim, however, is
exceptionally lacking in support, and he would do well to acknowledge this more
explicitly in his paper.
References:
Chen, S., Shechter, D., & Chaiken, S. (1996). Getting at the truth or getting along: Accuracy- versus impression-motivated heuristic and systematic processing. Journal of Personality and Social Psychology, 71(2), 262.
Liao, S. M. (2011). Bias and reasoning: Haidt's theory of moral judgment. In New Waves in Ethics (pp. 108-128).
Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098.