Facts only get in the way of a good story
Posted Tue, 13 Jul 2010 11:26:00 -0700 at http://kahlila.posterous.com/facts-only-get-in-the-way-of-a-good-story


Scientific research reveals how easily we can ignore facts that don't fit our agenda


In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds.
In fact, they often became even more strongly set in their beliefs.
Facts, they found, were not curing misinformation.
Like an underpowered antibiotic, facts could actually make misinformation even stronger.

 

These findings rang a bell with me: I work in newspapers myself and have always been curious about how facts are presented and how they're interpreted. There is and always will be spin and vested interests where newspapers are concerned, but you can still learn a lot as details emerge in an investigation, war or court case, even if you have to read between the lines.

But I have noticed among staff (usually well educated) as well as readers and correspondents that minds have often been made up at the beginning, on a hunch or because of skewed information, when a case is far from proven.
And those who have made up their minds so firmly are only really looking for details that reinforce that view; they're not actually looking for answers.
For example, they've decided at the outset that so-and-so committed the crime or fired the first shots, and no new evidence will budge them.

You have to wonder sometimes if the story is being placed in their memory alongside something similar they "already know" (possibly only subconsciously), because no thought processes are evident and their gut reactions can be unpredictable.
I am convinced some people buy into a conspiracy theory simply on the basis that it has been stated as fact by someone they trust or the first thing they read or heard about it made a deep impression.
This sort of wrongheadedness is indeed disturbing, because it is quite perverse and -- according to the article -- widespread. So people in the most responsible positions (judges, managers, politicians, journalists and the like) are likely just as susceptible as the rest of us, deep down.

 

The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

 

Our need to believe and to belong makes a mockery of our rational thinking at times. How else to explain why we fell for X (whether X be a lover, a product or a lifestyle), or continued with Y even when it was clearly not working out?

And how else to explain why otherwise intelligent, articulate people join dubious cults and movements, swallow communion wafers, attend church or persist in delusions about themselves or someone they loathe or worship?
There is nothing rational about anorexia, binge eating or compulsive exercise; equally, advocacy and activism often require all-or-nothing states of mind and an ability to block out facts that don't help a cause.

In all these cases there is a potential enemy, real or invented, that can annihilate or diminish us if we don't get support from the group/party/tribe we belong to.
If we looked beyond the tribe we might realise we don't actually need it. But more likely we're so uncomfortable about going it alone that we will simply conform; anything for love and an easy life.

 


Judith