"A man with conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.
We have all experienced the futility of trying to change a strong conviction, especially if the convinced person has some investment in his belief. We are familiar with the variety of ingenious defenses with which people protect their convictions, managing to keep them unscathed through the most devastating attacks.
But man's resourcefulness goes beyond simply protecting a belief. Suppose an individual believes something with his whole heart; suppose further that he has a commitment to this belief, that he has taken irrevocable actions because of it; finally, suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong; what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before. Indeed, he may even show a new fervor about convincing and converting other people to his view." - Leon Festinger, When Prophecy Fails (1956)
According to cognitive dissonance theory (Festinger, Riecken, & Schachter, 1964; Aronson, 1992; Tavris & Aronson, 2008),
when people are presented with new evidence that conflicts with their
previously held beliefs, they experience a form of cognitive tension
called “dissonance”. Importantly, the strength of this uncomfortable
tension depends on the degree to which people have invested in their
beliefs, for example by way of public commitment, or by the time and
effort spent acting in accordance with these beliefs (Batson, 1975).
If the psychological investment in a belief is high, people are more
motivated to reduce dissonance by rationalizing away disconfirming data.
In the refined version of dissonance theory, dissonance arises not so
much because of two conflicting cognitions, but because adverse evidence
conflicts with one’s self-esteem as a competent and reasonable person[1].
This accords with our earlier observation that, when people explain
away unwelcome evidence, they do so in a way that allows them to uphold
an illusion of objectivity. For example, if a psychic has publicly
professed his powers and risks losing his credibility, he is unlikely to
be thrown off balance by blatant failure. Or if a believer has spent a
substantial amount of time and money on astrology consultations,
typically
no amount of rational argumentation and debunking efforts will make him
renounce his beliefs. As Nicholas Humphrey noted: “psychic phenomena
can, it seems, survive almost any amount of subsequent disgrace” (Humphrey, 1996, p. 150).
By contrast, if the psychological stakes are low, as in the everyday
situations we mentioned above, the motivation for belief perseverance
will be greatly reduced. Consider another example related to paranormal
beliefs: suppose that Anna and Paul both start to suspect that they have
psychic powers, but their level of confidence is not very high. While
Paul hastens to tell his friends that he may be psychic and even
performs some psychic readings, Anna decides to conduct an experiment on
herself at an early point, when her beliefs are still privately held.
All other things being equal, it is much more likely that Anna will
abandon her beliefs silently when she discovers that they do not pan
out (Humphrey, 1996, p. 105),
while Paul will rationalize his failures because he has already made a
public commitment. Thus, we would predict that people with an
inquisitive and cautious mindset are more likely to put their hunches to
the test early on, and are less likely to become locked into mistaken
beliefs of this kind. By contrast, people who rush to conclusions
and start spreading the news right away will more often find themselves
in a situation where they obstinately refuse to abandon a false belief.[2]
A classic illustration of cognitive dissonance can be found in the
landmark study by Leon Festinger and his colleagues, who infiltrated a
doomsday cult and observed the behavior of the followers when the
prophesied end of the world failed to come true (Festinger et al., 1964).
The followers who had resigned from their jobs, given away their
material belongings, and turned up at the arranged place and time with
full conviction in their imminent salvation became even more ardent
believers after the prophecy failed, and started to proselytize even
more actively for the cult. However, those for whom the psychological
stakes were lower (e.g. those who kept their belongings and stayed home
in fearful expectation of what was supposedly to come) were more likely
to abandon their beliefs afterwards.