
More information is not enough to change you

By Seth Barnes
From an article in The Boston Globe. Although it targets the issue of political opinions, it also challenges our information-based concepts of discipleship. If you want to help a person change and grow closer to Jesus, then a better curriculum, better sermons, or more information is probably not the answer.
 
It’s one of the great assumptions underlying modern democracy that an informed citizenry is preferable to an uninformed one. “Whenever the people are well-informed, they can be trusted with their own government,” Thomas Jefferson wrote in 1789. This notion, carried down through the years, underlies everything from humble political pamphlets to presidential debates to the very notion of a free press. Mankind may be crooked timber, as Kant put it, uniquely susceptible to ignorance and misinformation, but it’s an article of faith that knowledge is the best remedy. If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight.

In the end, truth will out. Won’t it?

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon – known as “backfire” – is “a natural defense mechanism to avoid that cognitive dissonance.”

Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.

This effect is only heightened by the information glut, which offers – alongside an unprecedented amount of good information – endless rumors, misinformation, and questionable variations on the truth. In other words, it’s never been easier for people to be wrong, and at the same time feel more certain that they’re right.

What’s going on? How can we have things so wrong, and be so sure that we’re right? Part of the answer lies in the way our brains are wired. Generally, people tend to seek consistency. There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn’t. This is known as “motivated reasoning.” Whether or not the consistent information is accurate, we might accept it as fact, as confirmation of our beliefs. This makes us more confident in said beliefs, and even less likely to entertain facts that contradict them.

New research, published in the journal Political Behavior last month, suggests that once those facts – or “facts” – are internalized, they are very difficult to budge.

There are also some cases where directness works. Kuklinski’s welfare study suggested that people will actually update their beliefs if you hit them “between the eyes” with bluntly presented, objective facts that contradict their preconceived ideas. He asked one group of participants what percentage of its budget they believed the federal government spent on welfare, and what percentage they believed the government should spend. Another group was given the same questions, but the second group was immediately told the correct percentage the government spends on welfare (1 percent). They were then asked, with that in mind, what the government should spend. Regardless of how wrong they had been before receiving the information, the second group indeed adjusted their answers to reflect the correct fact.

Kuklinski’s study, however, involved people getting information directly from researchers in a highly interactive way. When Nyhan attempted to deliver the correction in a more real-world fashion, via a news article, it backfired. Even if people do accept the new information, it might not stick over the long term, or it may just have no effect on their opinions.
