Friday, May 27, 2011

People Believe What They Want To Believe

I'm pretty excited about Michael Shermer's new book that I just started reading: The Believing Brain: From Ghosts and Gods to Politics and Conspiracies - How We Construct Beliefs and Reinforce Them as Truths. The book summarizes 30 years of Shermer's research into how we formulate and reinforce our beliefs - and not always in a rational way.

I found the 1973 study conducted by David Rosenhan, which Shermer discusses in the book, absolutely fascinating and instructive. Rosenhan, a psychologist, wanted to find out what would happen when regular (and completely sane) people went to mental hospitals and pretended to hear voices in their heads.

So he, along with 7 other pseudopatients, went to 12 different hospitals, where they pretended to hear voices that said the words "empty," "hollow," and "thud." If asked to interpret the voices in their heads, they said they thought it meant "My life is empty and hollow." All 8 pseudopatients were admitted as psychiatric patients to the different hospitals - 7 with a diagnosis of schizophrenia, and 1 with a diagnosis of manic-depressive disorder. After they were admitted, they were instructed to act completely normal, to claim the hallucinations had stopped and that they now felt perfectly fine, and to try to get out as soon as they could. However, they were on their own, and had to get out of the hospital by their own means.

The pseudopatients - including 3 psychologists, 1 psychiatrist, 1 psychology grad student, 1 pediatrician, 1 housewife, and 1 painter, all completely sane - took an average of 19 days to get out of the different mental hospitals. The quickest escape came after 7 days, while the longest took an amazing 52 days. Dr. Rosenhan later said that the only way to get out of the mental hospitals was to admit that you were insane, take the medication prescribed by the hospital staff, and say that you were feeling better. To tell the staff who you really were (a fake patient who had pretended to hear voices in order to be admitted) would only make them think you were even more crazy, and would therefore submarine any chance you had of getting out. Despite acting completely normal after initially complaining of hearing voices, all of them were discharged with a diagnosis of "schizophrenia in remission."

The diagnosis of schizophrenia was never questioned by the hospital staff during the admissions. Ironically, the only people to question the diagnoses of the fake patients were the other patients in the mental hospital, who (because the pseudopatients were taking copious notes about their experience in the hospital) thought they were journalists or professors "checking up on the hospital." The hospital staff, however, took this "writing behavior" as a sign of pathology consistent with the diagnosis of schizophrenia. Likewise, the painter, to help pass the time, produced high-quality paintings that were hung on the drab walls of the mental hospital - yet the staff thought they detected clear manifestations of psychosis in the paintings, such as "real disturbances in her sensorium" and "libidinous pushes" that were "erupting into consciousness." The staff were seeing what they expected to see: perfectly normal behavior was read as abnormal because abnormality was what they expected to observe.

Rosenhan's study, entitled "On Being Sane in Insane Places," caused an uproar in the world of psychiatry at the time of publication. A mental hospital not involved in the study was incredulous: its staff told Dr. Rosenhan that they would never fall for such a hoax. So Dr. Rosenhan told them that over the next 3 months he would send 1 or more pseudopatients to their hospital to see if they could determine which ones were fake. The hospital admitted a total of 193 patients during this time - of which the staff judged 41 to be fakes and suspected 42 more of being fakes. However, Dr. Rosenhan had actually sent NO fake patients to the hospital!

The important point of this study (and this blog post) is to note how people - even highly trained professionals - will see what they want to see. Because these totally sane patients initially complained of hearing voices, and were therefore diagnosed as "schizophrenics," everything they did thereafter was filtered through the belief that they were crazy. No amount of sane behavior dissuaded the hospital staff from their fixed belief that they were nuts. In short, beliefs come first, and then evidence is selectively chosen to bolster those beliefs. Disconfirming evidence is omitted and forgotten; only evidence that supports our preconceived beliefs is noticed. It's like we walk through life wearing lenses that allow us to see only what we want to see - disconfirming evidence is filtered out by these selective "lenses" we all wear. As Shermer says, "What you believe is what you see."

And while I readily acknowledge that everybody (including myself) formulates beliefs without evidence, I want to highlight how orthodox Mormons do this to people who leave the Mormon church. Like psychiatrists who have been trained to believe that people who hear voices are schizophrenic, Mormons have been taught that people who leave the church are sinful, wrong, deluded, or over-thinking intellectuals who "miss the mark" and are "ever learning, and never able to come to the knowledge of the truth" (2 Timothy 3:7).

And because they have this belief, no amount of persuasion can convince them that people leave the church because they have good reasons to. In a sense, I am like the pseudopatients in Rosenhan's experiment who, despite their pleading that they are sane, are judged all the more insane. For people like myself, family and friends have fixed beliefs about why we left the church, and no amount of persuasion will change their minds. Despite my insistence that I am happy in life, and justified in my decision to leave the church, there appears to be little I can do to affect their perceptions of me, other than "agreeing to disagree."

Evidence should underlie the beliefs that we have. But it doesn't seem that we use evidence to form our beliefs. According to Shermer,

We form our beliefs for a variety of subjective, personal, emotional, and psychological reasons in the context of environments created by family, friends, colleagues, culture, and society at large; after forming our beliefs we then defend, justify, and rationalize them with a host of intellectual reasons, cogent arguments, and rational explanations. Beliefs come first, explanations for beliefs follow.
And since we sometimes form our beliefs without a lot of strong evidence to back them up, we should be a little more willing to consider that we may be wrong. What we believe may not always be absolutely, metaphysically certain Knowledge and Truth. Being an agnostic can be a good thing - even for religious people. And with that "agnosticity" we can be a little more humble, a little less dogmatic, a little more understanding of another's opinion, a little more willing to hear others out and try to understand them. They may not be as crazy as we think.


Sunday, May 22, 2011

I Know It's True Because I Feel It's True!

According to Harold Camping - a charismatic engineer turned prophet and leader of Family Radio - the world was supposed to end on May 21, 2011. That was yesterday. It didn't happen.

I don't think that the people who left their families and jobs, or gave away their homes, or donated money by the millions to Camping were just pretending to believe the world would end. I think they really believed it would happen, and were genuinely surprised - even disappointed - when it did not.

Unreasonable beliefs are commonly found in religion, alternative medicine (homeopathy), and psychic phenomena (ESP, astrology, and psychic readings). Belief in God, despite the absence of any evidence for God, is nearly universal in the United States. However, belief in psi (or psychic phenomena) is also very common. According to a recent Pew Forum poll, 29% of Americans believe they have been in touch with the dead, 26% believe in spiritual energy, 25% believe in astrology, 24% believe in reincarnation, 30% believe in UFOs, and 46% believe in ESP.

To put all this wackiness in perspective, consider that only 26% of Americans agree with Darwin that life evolved through natural selection - a claim supported by mountains of accumulating and converging objective data gathered by scientists over the last 150 years - while 60% of Americans "believe that humans and other animals have either always existed in their present form or have evolved over time under the guidance of a Supreme Being" - a claim supported by no evidence at all. Clearly, Americans (like most people) believe a lot of bullshit.

There are many reasons people believe in bullshit. However, one of the most common, persistent, and (to believers) convincing is this: I know X is true, because I feel X is true. It doesn't matter that there is no scientific evidence of X, it doesn't matter that they may never have experienced X with their five physical senses, it doesn't matter that there are serious scientific or philosophical arguments against X. None of it matters. If someone feels that X is real or true or right, then they will find a way to believe.

It is sometimes very frustrating to have a rational conversation with people who are utterly convinced. No amount of evidence will convince them. They just know it. How do they know it? Because they feel it. They have knowledge because of their feelings. This is precisely what we are taught in the Church - that spiritual confirmations come through feelings. Read the Book of Mormon; if you feel good while doing it - IT'S TRUE! You KNOW it's true because of how you feel when you are doing it. (By the same logic, alcohol would be "true" too, because alcoholics feel good while drinking.)

Inigo Montoya talked about the problem of semantics when he said "You keep using that word. I do not think it means what you think it means." It's just "inconceivable" for many people to consider that their feelings are not the best guides to knowledge. Something just doesn't feel right about that - and that's the problem.

So what is knowledge then? The Greek philosopher Plato taught that before we can claim knowledge about something, three conditions must be met:

First, the person in question must believe it.  Second, the belief in question must be true. And third, the belief needs to be justified.

Plato's first two conditions seem pretty straightforward. What is usually meant by "justified" is that there needs to be some evidence for the belief. W.K. Clifford said, "It is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence."

This idea - that a true belief requires evidence in order to count as knowledge - is called evidentialism. It turns out that there are a few philosophical problems with evidentialism that I won't get into here. But the point remains that "extraordinary claims require extraordinary evidence." Our confidence in something should be proportional to the evidence for it.

The unspecified problem of evidentialism (google it) is addressed by reliabilism - which states that for a true belief to be justified (and therefore count as knowledge), it must be brought about through "reliable" mechanisms. In other words, a reliable mechanism is one that tends to produce true beliefs. A basic example would be our senses of sight and hearing. These senses evolved because they were pretty reliable in helping us avoid being eaten by predators and in catching prey ourselves (among other things). Nowadays, vision and hearing are pretty reliable in helping us avoid getting hit by cars when we pull out of our neighborhoods. Even though our senses are prone to hallucinations, they are fairly reliable.

The scientific method would be another reliable method of acquiring knowledge because it relies on observation of facts and data, makes testable hypotheses, and draws conclusions that can be falsified (proven wrong) and reproduced by anybody else. And like people driving in rush-hour traffic without constantly bumping into each other, the scientific method relies on many different people all making the same fairly reliable observations with the same fairly reliable tools: our senses.

So what about religious feelings? Do they count as a reliable mechanism for acquiring knowledge? Alvin Plantinga, one of the most influential philosophers of religion of modern times, said that God has given every one of us a God-sensing faculty (or sensus divinitatis) that enables us to "know" God in a reliable way. According to Plantinga, God would obviously want everybody to know him, so he gave everybody this sensus divinitatis (also sometimes called "the light of Christ" or the "Holy Ghost" by Mormons). So how does Plantinga explain why not everybody has such unambiguous and reliable experiences of God in their life? Sin, of course:

Were it not for sin and its effects, God's presence and glory would be as obvious and uncontroversial to us all as the presence of other minds, physical objects and the past. Like any cognitive process, however, the sensus divinitatis can malfunction; as a result of sin, it has been damaged.
Even without this poppycock notion that anybody who doesn't know God is sinful, Plantinga's reliable God-sensor has other problems. One problem with a sensus divinitatis (or Holy Ghost, or light of Christ - whatever you call it) being a reliable source of knowledge is that billions of people have spiritual feelings that lead them to mutually incompatible conclusions. It seems, then, that relying on our religious feelings, despite what Plantinga says, is extremely unreliable.

Also problematic is the fact that spiritual feelings can be fabricated by hallucinogenic drugs, by electrical stimulation or seizure activity in the brain's temporal lobes, and by generalized brain hypoxia - common in pilots pulling high-G turns, and in patients during "near-death experiences," when the heart does not pump enough blood to the brain.

Religious feelings can also be reproduced through meditation, rituals, prayer, isolation, architecture, collective chanting or singing, and fasting - which is precisely the stuff of religion. It's no wonder that religions all have these elements in their worship.

Religious feelings also tend to be culturally specific. For example, people living in predominantly Christian countries experience Jesus or Mary, while people living just across the border in predominantly Muslim countries do not. And within these countries, religious experiences are influenced by the different cultural expectations of the various branches of Christianity or Islam. For example, Mormons will have religious experiences that reinforce their brand of religion, while Pentecostals living down the street will have religious experiences that reinforce theirs. This should make one wonder whether the religious feelings and experiences people have are more related to geography and cultural expectations than to whether those experiences actually track the truth.

And finally, these religious experiences are mutually exclusive and contradictory. Analyzing all the different religious revelations and personal experiences throughout the course of history produces a hodge-podge of contradictory claims. They can't all be right, and it's unlikely that any of them are.

So now, let me say something nice.  I don't think that people who have religious feelings are making them up. I think religious claimants (even the nuttiest of them like Harold Camping and his followers) are sincere people who genuinely believe what they preach. I just wish that they would be a little more humble and private in their views. I wish they were willing to consider that believing something really strongly, because they feel it really strongly, does not mean they know anything.  Maybe they should just say "I believe" instead of "I know!" Doesn't religious faith imply a lack of knowledge - as well as admit a certain level of doubt?

Sunday, May 1, 2011

Why We Believe: Part 3 (Cognitive Dissonance)

In my last few posts, I have discussed the psychology of religious belief.  Specifically, I have written about why people believe certain religious ideas when there is little evidence for them, and why we hold onto these beliefs so tenaciously in the face of contradictory evidence.  In Part 1 and Part 2 I talked about how we are socialized into our religion's beliefs, and how these beliefs are very practical.

A third reason why many people hold tenaciously to religious beliefs is that religious people make sacrifices to join, and to maintain their membership in, the religious group. This seems a little counterintuitive at first. However, social scientists have shown again and again that if people make a personal sacrifice to join a group, or to remain within it, they view the group more favorably than people who don't make equivalent sacrifices.

In their classic study demonstrating this fascinating side of human behavior, Elliot Aronson and Judson Mills invited Stanford college students to join a group that would be discussing the psychology of sex. However, before the students could join the sex-discussion group, they had to undergo an entrance test. The researchers divided the students into two groups: one had to undergo a "severely embarrassing" initiation, reading aloud lurid, sexually explicit passages from racy novels (and in the 1950s, this was extremely embarrassing), while the other only had to undergo a "mildly embarrassing" initiation, reading aloud sexual words from a dictionary.

After the initiation, the students listened to an identical audio recording of a sham "group discussion," purportedly of the group they had just been initiated into. The recording was made to be as boring and uninformative as possible. The group wasn't even talking about sex - only about the secondary sex characteristics of birds. The speakers in the recorded "group discussion" stammered, hemmed and hawed, made rambling off-topic comments, or said they hadn't done the required reading on bird courtship practices.

Finally, the students rated how much they liked the discussion and the members of the group they had just joined. The students who had undergone only the "mild initiation" saw the discussion for what it was: boring and worthless. They correctly thought the student who came unprepared was irresponsible and had let the group down. However, the students who underwent the "severe initiation" thought the group was interesting and exciting, and rated the group members as attractive and sharp. They even forgave the unprepared student who hadn't done his assigned reading. They thought his honesty was refreshing!

This experiment has been replicated many times using a variety of initiation techniques (like painful electric shocks or physical exertion) and always shows the same result: people like groups more when they undergo severe initiations. The reason is explained by cognitive dissonance theory, first elucidated by Leon Festinger in 1957. Cognitive dissonance theory says that "there is a tendency for individuals to seek consistency among their cognitions (i.e., beliefs, opinions)." When there is an inconsistency among beliefs, attitudes, or actions, individuals feel uncomfortable (dissonant), and will usually change their beliefs in order to harmonize them with their actions. This congruence of beliefs and actions minimizes the mental discomfort called, in fancy-pants psychology lingo, cognitive dissonance.

So, how do Aronson's and Festinger's ideas about cognitive dissonance apply to religious beliefs? Well, people make tremendous sacrifices of their time, energy, and money to religious organizations. They do this to build and support the church they belong to, to earn favor with God both now and in the afterlife, to convert other people, to pass on the tradition and beliefs to their children, and to advance their social status in the group. Members devote their entire lives to the teachings and beliefs of the church for these reasons. According to cognitive dissonance theory, these sacrifices cause members to view the church more favorably than they would if they did not sacrifice for the group. The sacrifices serve to lock in members and protect them from disconfirming information.

Joseph Smith, a religious genius and master of understanding human psychology, was way ahead of Festinger when he said:

"A religion that doesn't require the sacrifice of all things never has power sufficient to produce the faith necessary unto life and salvation. For it is by the medium of this sacrifice, and this alone, that a person knows that a course of life he or she is pursuing is according to the will of God."

When LDS members bump into information that contradicts or disconfirms their beliefs, it creates a tremendous amount of cognitive dissonance. This causes them to push back and engage in mental defenses that minimize the threat of the information. Festinger describes how difficult it is to change someone's belief when they have already invested so much in it:

"We are familiar with the variety of ingenious defenses with which people protect their convictions, managing to keep them unscathed through the most devastating attacks. But man's resourcefulness goes beyond simply protecting a belief. Suppose an individual believes something with his whole heart; suppose further that he has a commitment to this belief, that he has taken irrevocable actions because of it; finally, suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before. Indeed, he may even show a new fervor about convincing and converting other people to his view . . . ."

We should remember this when we try to change someone's opinion when they have a lifetime of sunk costs invested in it. It's going to be nearly impossible unless they are asking questions themselves. Perhaps this is exactly why the church demands so much sacrifice from its members: why most LDS males go on missions, why women are encouraged to choose family over graduate school and careers, why we pay 10% of our income to the church, why we are never paid for church service, why we promise in temples to consecrate all our time and possessions to the church, and why LDS worship services are so long and so boring (seemingly by design). All these factors serve to make LDS members all the more committed, and all the more resistant to changing their minds in the face of contradictory evidence.

One final point: I don't think the LDS leaders designed a church based on the theories of Leon Festinger on purpose. However, organizations that demand personal sacrifice are going to be more successful than organizations that don't. This is true for our church, just as it's true for many other churches and secular organizations: military service starts with boot camp, you are hazed upon entering military academies and elite special-forces groups, you start at the bottom of any pyramid scheme, medical residencies start with a grueling year as a semi-abused intern, college fraternities have initiation rites, etc. Groups that demand sacrifices from members will have members who are all the more committed to the cause - whether it's good or bad or indifferent - and will therefore be more successful than groups that don't. And having highly devoted members greatly benefits the institution of the church.