I'm pretty excited about Michael Shermer's new book that I just started reading: The Believing Brain: From Ghosts and Gods to Politics and Conspiracies - How We Construct Beliefs and Reinforce Them as Truths. The book summarizes 30 years of Shermer's research into how we formulate and reinforce our beliefs - and not always in a rational way.
I found the 1973 study conducted by David Rosenhan, which Shermer discusses in his book, absolutely fascinating and instructive. Rosenhan, a psychologist, wanted to find out what would happen when regular (and completely sane) people went to a mental hospital and pretended to hear voices in their heads.
So he, along with 7 other pseudopatients, went to 12 different hospitals, where they pretended to hear voices that said: "empty," "hollow," and "thud." If asked to interpret the voices in their heads, they said they thought it meant "My life is empty and hollow." All 8 pseudopatients were admitted as psychiatric patients to the different hospitals - 7 with a diagnosis of schizophrenia, and 1 with a diagnosis of manic-depressive disorder. After they were admitted, they were instructed to act completely normal, to claim the hallucinations had stopped, to say that they now felt perfectly fine, and to try to get out as soon as they could. However, they were on their own, and had to get out of the hospital by their own means.
The pseudopatients - including 3 psychologists, 1 psychiatrist, 1 psychology grad student, 1 pediatrician, 1 housewife, and 1 painter, all completely sane - took an average of 19 days to get out of the various mental hospitals. The quickest escape came after 7 days, while the longest took an amazing 52 days. Dr. Rosenhan later said that the only way to get out of the mental hospitals was to admit that you were insane, take the medication prescribed by the hospital staff, and say that you were feeling better. To tell the staff who you really were (a fake patient who had only pretended to hear voices in order to be admitted) would make them think you were even crazier, and would therefore sink any chance you had of getting out. Despite acting completely normal after initially complaining of hearing voices, all of them were discharged with a diagnosis of "schizophrenia in remission."
The diagnosis of schizophrenia was never questioned by the hospital staff during their stays. Ironically, the only people to question the diagnoses of the fake patients were the other patients in the mental hospitals, who (because the pseudopatients were taking copious notes about their experience in the hospital) thought they were journalists or professors "checking up on the hospital." The hospital staff, however, saw this "writing behavior" as a sign of pathology consistent with the diagnosis of schizophrenia. Likewise, the painter, to help pass the time, produced high-quality paintings that were hung on the drab walls of the mental hospital - yet the staff thought they detected clear manifestations of psychosis in the paintings, such as "real disturbances in her sensorium" and "libidinous pushes" that were "erupting into consciousness." The staff were seeing what they expected to see; all observations (though normal) were viewed as abnormal, because that is what they thought they should observe.
Rosenhan's study, entitled "On Being Sane in Insane Places," caused an uproar in the world of psychiatry when it was published. A mental hospital not involved in the study was incredulous: its staff told Dr. Rosenhan that they would never fall for such a hoax. So Dr. Rosenhan told them that over the next 3 months, he would send 1 or more pseudopatients to their hospital to see if they could determine which ones were fake. The hospital admitted a total of 193 patients during this time - of which the staff judged 41 to be fakes and suspected 42 more. However, Dr. Rosenhan had actually sent NO fake patients to the hospital!
The important point of this study (and this blog post) is to note how people - even highly trained professionals - will see what they want to see. Because these totally sane patients initially complained of hearing voices, and were therefore diagnosed as "schizophrenics," everything they did thereafter was filtered through the belief that they were crazy. No amount of sane behavior dissuaded the hospital staff from their fixed belief that the patients were nuts. In short, beliefs come first, and evidence is then selectively chosen to bolster those beliefs. Only evidence that supports our preconceived beliefs is noticed; disconfirming evidence is omitted and forgotten. It's as if we walk through life wearing selective "lenses" that allow us to see only what we want to see, while everything that contradicts our beliefs is filtered out. As Shermer says, "What you believe is what you see."
And while I readily acknowledge that everybody (including myself) formulates beliefs without evidence, I want to highlight how orthodox Mormons do this to people who leave the Mormon church. Like psychiatrists who have been trained to believe that people who hear voices are schizophrenic, Mormons have been taught that people who leave the church are sinful, wrong, deluded, or over-thinking intellectuals who "miss the mark" and are "Ever learning, and never able to come to the knowledge of the truth." (2 Timothy 3:7)
And because they hold this belief, no amount of persuasion can convince them that people leave the church because they have good reasons to do so. In a sense, I am like the pseudopatients in Rosenhan's experiment who, despite their pleading that they are sane, are judged all the more insane for it. Our family and friends have fixed beliefs about why people like myself left the church, and no amount of persuasion will change their minds. Despite my insistence that I am happy in life and justified in my decision to leave the church, there appears to be little I can do to affect their perceptions of me, other than "agreeing to disagree."
Evidence should underlie the beliefs that we have. But it doesn't seem that we use evidence to form our beliefs. According to Shermer,
We form our beliefs for a variety of subjective, personal, emotional, and psychological reasons in the context of environments created by family, friends, colleagues, culture, and society at large; after forming our beliefs we then defend, justify, and rationalize them with a host of intellectual reasons, cogent arguments, and rational explanations. Beliefs come first, explanations for beliefs follow.

And since we sometimes form our beliefs without a lot of strong evidence to back them up, we should be a little more willing to consider the possibility that we may be wrong. What we believe may not always be absolutely metaphysically certain Knowledge and Truth. Being an agnostic can be a good thing - even for religious people. And with that "agnosticity" we can be a little more humble, a little less dogmatic, a little more understanding of another's opinion, a little more willing to hear them out and try to understand them. They may not be as crazy as we think.