The charts shown above suggest that some of the assessments
that political partisans make of the views of their opponents are wildly inaccurate.
The probability that a Democrat will agree that men should be protected from
false accusations of sexual assault is higher than Republicans believe it to
be, and the probability that a Republican will accept that racism still exists is
higher than Democrats believe it to be. The organization that published the
data makes the point that Americans have much more similar views on many
controversial issues than is commonly thought, and that misperceptions are
greatest among the most politically active. My focus here is on why partisans
make such large errors in assessing the views of their opponents.
Probability assessment is not always easy.
Steven Pinker included “a sense of probability” in his list
of 10 cognitive faculties and intuitions that have evolved to enable humans to
keep in touch with aspects of reality (Blank
Slate, 220). Individuals obtain obvious benefits from an ability to
keep track of the relative frequency of events affecting their lives. A
capacity to reason about the likelihood of different events helps them to take
advantage of favorable circumstances and to avoid harm.
Pinker points out that our perceptions of probability are
prone to error, but Daniel Kahneman has a much more comprehensive discussion of
this in Thinking,
Fast and Slow. Kahneman points out that even people who have studied
probability can be fooled into making errors in assessing probability when they
are led to focus unduly on information that appears particularly pertinent and
to ignore other relevant information. He gives the example of a cab involved in
a hit-and-run accident in a city in which 85% of the cabs are Green and 15% are
Blue. A witness identifies the cab responsible as Blue, and the court
establishes that he would be able to identify colors correctly 80% of the time
under circumstances that existed on the night of the accident. What is the
probability that the cab is Blue? Most people say 80%, but the correct answer,
provided by Bayes’ rule, is about half that (Loc 3005-3020). People tend to
make a large error because they overlook the fact that a high proportion of
Green cabs means that there is a good chance that the witness has mistakenly
identified a Green cab as Blue, even though his observations are accurate
80% of the time.
Kahneman notes that people are more likely to make errors in
assessing probability when they “think fast” rather than analytically. However,
it is not necessary to understand and apply Bayes’ rule to solve problems such as
the one presented above. A little simple arithmetic suffices. If there were
1,000 cabs in the city, there would be 850 Green cabs and 150 Blue cabs. If we
had no more information, the probability of a Blue cab being responsible for
the accident would be 15%. We are told the witness identified the cab as Blue.
Such a witness would correctly identify 80% of the 150 Blue cabs as Blue (i.e.
120 cabs) and would mistakenly identify 20% of the 850 Green cabs as Blue (i.e.
170 cabs). The total number of
cabs that he would identify as Blue is 290 (120+170). The probability that the
witness has correctly identified a Blue cab is 0.414 (120/290) or 41.4%.
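For readers who prefer to see the calculation in compact form, here is a minimal sketch in Python of the same Bayes' rule computation (the function name and parameter names are mine, for illustration only):

```python
def posterior_blue(base_rate_blue: float, witness_accuracy: float) -> float:
    """Probability the cab is Blue, given that the witness identified it as Blue."""
    # P(says Blue | cab is Blue) * P(cab is Blue): correct identifications
    true_positive = witness_accuracy * base_rate_blue
    # P(says Blue | cab is Green) * P(cab is Green): Green cabs misread as Blue
    false_positive = (1.0 - witness_accuracy) * (1.0 - base_rate_blue)
    # Bayes' rule: the share of "Blue" identifications that really are Blue
    return true_positive / (true_positive + false_positive)

# 15% of cabs are Blue; the witness is right 80% of the time.
print(posterior_blue(0.15, 0.80))  # ~0.414, matching the 120/290 count above
```

This is the same arithmetic as the cab count: the numerator corresponds to the 120 correctly identified Blue cabs, and the denominator to all 290 cabs the witness would call Blue.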
Kahneman also makes a point about causal stereotypes. He
does this by altering the example, substituting the information that Green cabs
are responsible for 85% of accidents for the information that 85% of the cabs
are Green. Other information is unchanged. The two versions of the problem are
mathematically indistinguishable. If the only information we had was that Green
cabs are responsible for 85% of accidents, we would assess the probability of a
Blue cab being responsible at 15%. As before, if we evaluate the witness information
correctly, it raises the probability of a Blue cab being responsible to 41.4%.
However, when people are presented with the second version,
the answers they give tend to be much closer to the correct one. They apparently
interpret the information that the Green drivers are responsible for 85% of the
accidents to mean that the Green drivers are reckless. That causal stereotype
is less readily disregarded in the face of witness evidence, so the two pieces
of evidence pull in opposite directions.
Political partisans don’t have much incentive to make
accurate assessments of the views of their opponents.
The potential for errors in fast thinking and the impact of
cultural stereotypes may account for much of the error partisans make in
assessing the views of their opponents, as shown in the charts above. People do
not have a strong personal incentive to ensure that they accurately assess the
views of their political opponents. Potential errors do not affect their income
and lifestyle to the same extent as, say, errors in the probability assessments
they make relating to personal occupational and investment choices.
In addition, political partisans may not even see any
particular reason to be concerned that they may be misrepresenting the views of
their opponents.
Reasoning along those lines seems to me to provide a straightforward
explanation for the prevalence of partisan conspiracy theories. Research by
Steven Smallpage et al. (in an article entitled ‘The
partisan contours of conspiracy theory beliefs’) suggests that partisans
know which conspiracy theory is owned by which party, and that belief in
partisan conspiracy theories is highly correlated with partisanship. The authors
conclude:
“Many conspiracy theories function more like associative
partisan attitudes than markers of an alienated psychology”.
Extreme partisans tend to promote theories that discredit
their opponents. Perhaps that is the way we should expect partisans to play politics
in a society where many people think it is OK to “bear false witness” because
they believe everyone has “their own truths” and objective reality does not
exist.
We do not have to speculate that partisans are deluded or
crazy when they hold firmly to improbable theories about their opponents in the
face of contrary evidence. They are more likely to be ignoring the evidence to demonstrate
loyalty to their party and its leaders.
However, that doesn’t offer us much solace. Some of the
conspiracy theories currently circulating seem similar to the false rumors that
governments circulate about their enemies during wartime. Extremists among political
partisans may be circulating those rumors with the intention of promoting
greater political polarization and a breakdown of the values that have hitherto
made it possible for people with divergent views to coexist peacefully.
Is increasing polarization inevitable?
Much depends on the attitudes of the majority of people who are currently
disinclined to spread rumors that they believe to be false and likely to promote
social conflict. If people with moderate views make it known that they expect political
leaders to disavow false rumors about their opponents, they can encourage that
to happen. Leaders of the major parties have an incentive to try to attract
voters with moderate views away from opposing parties. If leaders disavow false
rumors, partisans will tend to echo their views.