Why Can’t Insiders Accurately Perceive Their Own Culture?

The following is a summary of some of the principles of social behaviour that make it difficult for insiders to perceive their own culture and behaviour accurately.

Cognitive Dissonance

Cognitive dissonance is at the root of denial, and fear is at the root of cognitive dissonance. The extent of our fear is determined by our general tendencies in that regard and by our beliefs. The nature of our beliefs determines our vulnerability to the issue in question. For example, I used to fear not being with my family in the Celestial Kingdom and wanted to be there with them. Fear and desire walk down this path hand in hand. Hence I obeyed the rules designed to get me what I wanted and avoid what I feared. As soon as I no longer believed that the Celestial Kingdom existed, my motivation to do many things evaporated, including some that I did not even know were related to that belief. I discovered the link while wondering why my motivation toward certain activities or attitudes had changed.

Cognitive dissonance theory is concerned with the relationships among cognitions. A cognition is a piece of knowledge about an attitude, an emotion, a behaviour, a value, etc. People hold a multitude of cognitions simultaneously, and these cognitions form irrelevant, consonant or dissonant relationships with one another. (See http://www.ithaca.edu/faculty/stephens/cdback.html) As William Safire put it in a New York Times op-ed piece (December 29, 2003):

A cognition is a bit of knowledge or belief. When it disagrees with another cognition in our head … a nasty jangling occurs. To end this cognitive dissonance … we change the weak cognition to conform to the stronger one. Take Aesop’s fox, who could not reach a lofty bunch of grapes no matter how high he jumped. One foxy cognition was that grapes were delicious; the other was that he couldn’t get them. To resolve that cognitive dissonance, the fox persuaded himself that the grapes were sour – and trotted off, his mind at ease.

Cog dis usually functions in a manner no more complicated than that. But while Aesop neatly illustrated cog dis, he did not adequately reveal the primary force that lies beneath it – fear.

One of Buddhism’s central and enlightening notions is that most of mankind’s ills are caused by the manner in which fear or desire causes us to make unwise decisions. As the following summary of recent research will show, this ancient insight is remarkably accurate. Buddha’s “middle way” was the path that lay between fear and desire and so was out of both their reaches. And since a good portion of desire is fear that we will not obtain that which we most desire, fear is the most primal and effective of emotions. The well-known case of denial in marriages where infidelity is a problem illustrates this. The faithful spouse is usually unable to see the evidence of cheating until well after most others can see it. This denial of reality is primarily a function of the spouse’s fear of losing the relationship if the information in question is processed and dealt with. The greater the fear, the greater the cognitive dissonance it will produce, and the deeper will be the consequent denial and suppression of threatening information.

The psychology of personality profiles indicates that not all people are influenced by fear and desire in the same way. In one study that focussed on why some people are more religiously inclined than others, it was determined that the personality trait called “openness” correlates strongly with religious tendencies. Openness is the inclination toward new experience; the opposite of dogmatism. The more “open” a person is, the less likely she is to be influenced by fear in any particular situation, and the less likely she is to be religious in the traditional sense of that word. That is, the less likely she will be to accept traditional religious authority and the literalistic interpretation of scripture it posits. And of course the opposite is also true.

So, the picture that comes into focus is that in any particular case, denial is a function of two things: first, how open to new experience the individual in question is; and second, how significant the fear created by the denied information is perceived to be.

A faithful Mormon should be expected to experience massive amounts of fear upon contemplating the possibility that the religious experience on which much of her life, family and social relationships are based is false. This fear produces a powerful form of cognitive dissonance, and hence an extensive denial or suppression of the information. We should expect that the more faithful the Mormon, the less able she will be to see the reality of the institution that sponsors her religious faith and the effect that faith has upon her.

Rational v. “Automatic” Decision-Making

Humans perceive themselves to be rational decision makers. However, a great deal of psychological and other research indicates that many of our decisions are automatic, likely the result of decision-making routines that evolution programmed into us to help us survive in a harsh environment where decisions had to be made quickly and on the basis of limited information. At the same time, we have a primal need to justify our actions, and in a modern world dominated by a “rational” paradigm, that means we twist our knee-jerk reactions into a rational framework in order to feel comfortable with them. For example, why do Mormons believe that tithing brings forth God’s blessings? Because of stories told to illustrate the cause-and-effect relationship between paying tithing and receiving blessings. Why are Mormon Priesthood blessings perceived to “work”? The same kind of reasoning. Michael Shermer wrote a book that persuasively sets out how coincidence, mankind’s tendency to look for patterns where they don’t exist, and a misunderstanding of cause-and-effect relationships nicely account for beliefs of this nature, and how the more intelligent a person is, the better she is at defending the beliefs that she accepted as “true” at some point (usually early) in her development (see “Why People Believe Weird Things”).

One of the evolutionary rules of thumb (sometimes called “heuristics”) noted in the research is that when powerful emotions are encountered, reason shuts down. Fear is one of those emotions; this is adequately explained by what I indicated above respecting cog dis. Powerful desires for money, prestige, sex, etc. can also overcome reason. One of my clients was on the verge of falling for a fraudulent financial scheme that offered him $20,000,000, and came to me for tax planning advice. He had purchased tickets to fly to Nigeria the following week to sign a few papers and collect his money. After I asked some questions and then provided him with news service articles indicating how others had lost their money, been kidnapped for ransom, and in one case been killed as a result of participating in similar schemes, he reacted like someone coming out of a trance. This experienced, successful businessman’s considerable ability to reason had been overcome by greed, which is of course a variant of desire.

Other research indicates that the most powerful emotional forces are often connected to “value structures” such as religion (my religion is “true” and yours is not, for example), morality (the abortion issue and the homosexuality issue, for example), political issues (democracy v. communism, for example), etc. Another powerful emotion that affects our beliefs is love. I recently watched in amusement (and with some concern) as one of my young friends, who I did not think had a religious bone in his body, fell in love with a faithful Mormon girl and began to think seriously about serving a mission after years of resisting the pressure of his parents and others to do so.

Love and fear combine to produce potent emotional distortions of reason. This is responsible for the advice provided to medical doctors and other professionals that they not attempt to diagnose or treat themselves or family members. A doctor’s love for her child, and her fear of the consequences that a serious illness would bring to that child, for example, have been demonstrated to impair her ability to see symptoms that clearly indicate serious illnesses such as cancer.

Yet another area of study focuses on our inherent risk aversion. We tend to overestimate risk and underestimate the potential gain from risk taking, and we tend to overvalue what we already possess when it is compared to what we don’t possess. One fascinating study in this regard provided university students with one item each that had the same value (say $5) in their school book store. They were also given some money with which to bid on the items other students had been given, and were required to put their own item up for auction with a minimum sale price. On average, each student was prepared to pay much less (say $3.50) for items similar to her own than the amount for which she was prepared to sell her own item (say $7). The tendencies to value what we have more than similar items we don’t have, and to overestimate risk while underestimating the rewards to be gained by taking it, would promote societal stability and hence make evolutionary sense. They also make us unlikely to change our minds respecting something like religious beliefs we have already accepted.
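The gap in that study can be illustrated with a minimal simulation. The numbers below simply echo the hypothetical figures given above (a $5 store value, bids around $3.50, asking prices around $7); the distributions and sample size are invented purely for illustration and are not drawn from any actual study.

```python
import random

# Toy illustration of the endowment-effect gap described above.
# The $5 store value, ~$3.50 bids and ~$7 asking prices follow the
# hypothetical figures in the text; the spread and sample size are
# invented for illustration only.

random.seed(42)

STORE_VALUE = 5.00   # book store price of every item
NUM_STUDENTS = 100

# Willingness to pay for another student's item tends to fall below the
# store value; the minimum asking price for one's own item tends to rise
# above it.
bids = [random.gauss(3.50, 0.75) for _ in range(NUM_STUDENTS)]
asks = [random.gauss(7.00, 1.00) for _ in range(NUM_STUDENTS)]

avg_bid = sum(bids) / NUM_STUDENTS
avg_ask = sum(asks) / NUM_STUDENTS

print(f"Book store value of each item:           ${STORE_VALUE:.2f}")
print(f"Average bid for another student's item:  ${avg_bid:.2f}")
print(f"Average asking price for one's own item: ${avg_ask:.2f}")
print(f"Endowment gap (ask / bid): {avg_ask / avg_bid:.1f}x")
```

Even this crude sketch makes the asymmetry plain: the same $5 item is worth roughly twice as much to the person who already owns it.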

Another line of research deals with decision-making under conditions of great uncertainty, and it indicates that the greater the uncertainty and perceived risk, the more likely we are to go with the crowd and accept what authority figures have to say about what we should do. This is one manifestation of something called the “conformist bias” or “authority bias”. The conformist bias explains the stock market buying that leads to “bubbles” and the panic selling that leads to irrational market collapses. It also applies to things like the global warming issue. There is a strong tendency in this regard to agree with the people who are dominant in our group. And what is more uncertain than religious belief? Even in cases where the phenomena are not terribly complex, the conformist bias exerts a powerful influence.
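A toy model hints at how this sort of herding can run away from any underlying facts. Nothing below is drawn from the research referred to here; the number of agents, the conformity probability and the round structure are invented solely to illustrate the mechanism.

```python
import random

# Toy herding model: each round, most agents simply copy whatever the
# majority did in the previous round, while a few decide independently.
# All parameters are invented for illustration; this is not a model of
# any study cited in the text.

random.seed(1)

NUM_AGENTS = 1000
ROUNDS = 20
CONFORMITY = 0.9   # probability an agent copies last round's majority

# Begin with a roughly even split between "buy" (True) and "sell" (False).
choices = [random.random() < 0.5 for _ in range(NUM_AGENTS)]

for round_number in range(1, ROUNDS + 1):
    majority_buys = sum(choices) > NUM_AGENTS / 2
    choices = [
        majority_buys if random.random() < CONFORMITY
        else random.random() < 0.5   # independent coin-flip decision
        for _ in range(NUM_AGENTS)
    ]
    print(f"Round {round_number:2d}: {sum(choices):4d} of {NUM_AGENTS} buying")
```

A small random surplus of buyers in the first round is enough to lock almost the whole population onto the same side within a round or two, which is the essence of a bubble or a panic.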

Some researchers have suggested that the conformist bias is just one of many aspects of the authority bias. A strong, perceived source of authority is often found at the root of group behaviour that sets in motion the conformist bias. It should be clear how this plays into the religious mindset, and particularly with regard to the authoritarian, hierarchical Mormon social structure.

In general, the more uncertain a matter, the more influential the authority and conformist biases will be. And authority, of course, is a subjective matter. My beliefs confer authority on certain people and institutions. Hence, those who want to influence me should be expected to attempt to control what I believe. These biases are aided and abetted by the nature of human memory. Elizabeth Loftus, the world-renowned memory expert and U. of Washington psychology professor, has noted:

Memories don’t fade… they … grow. What fades is the initial perception, the actual experience of the events. But every time we recall an event, we must reconstruct the memory, and with each recollection the memory may be changed – colored by succeeding events, other people’s recollections or suggestions … truth and reality, when seen through the filter of our memories, are not objective factors but subjective, interpretative realities. (Shermer, Why People Believe Weird Things, p. 182)

Loftus provides numerous examples of how easy it is to suggest to people that they have had an experience and cause them to believe that they really had it (see “Memory, Faults and Fixes”, Issues in Science and Technology, Summer 2002, reprinted in “The Best American Science and Nature Writing” (2003 Edition) at p. 127). Of particular note are experiments that illustrate the way our memories and current perceptions are shaped by how we think others have perceived the same event we did. For example, subjects might be shown a series of slides depicting an event, or might witness a staged event such as a theft or a traffic accident. The subjects would then be given additional information concerning the event. The post-event information given to one group would contain material that contradicted some details of the actual event, such as a stop sign being described as a yield sign. The post-event information provided to a second group of subjects (the control group) would contain no such conflicting information. After reviewing the supplemental information, all subjects would be tested on what they witnessed. In all of these experiments, the subjects who were given the misleading supplemental information performed more poorly than control subjects on the items about which they had been misled.
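A small simulation can make the structure of these experiments concrete. The accuracy figures below are invented purely for illustration; only the design (a misled group and a control group tested on the same critical details) follows the description above.

```python
import random

# Toy simulation of the misinformation-effect design described above.
# The probabilities are invented for illustration; only the structure
# (misled group vs. control group, tested on the same critical details)
# follows the text.

random.seed(7)

NUM_SUBJECTS = 50
NUM_CRITICAL_DETAILS = 10   # e.g. "was it a stop sign or a yield sign?"

# Assumed chances of recalling a critical detail correctly.
P_CORRECT_CONTROL = 0.85    # saw the event, no conflicting post-event info
P_CORRECT_MISLED = 0.55     # saw the event, then read contradictory details

def average_score(p_correct):
    """Average proportion of critical details recalled correctly by a group."""
    scores = []
    for _ in range(NUM_SUBJECTS):
        correct = sum(random.random() < p_correct
                      for _ in range(NUM_CRITICAL_DETAILS))
        scores.append(correct / NUM_CRITICAL_DETAILS)
    return sum(scores) / NUM_SUBJECTS

print(f"Control group accuracy on critical details: {average_score(P_CORRECT_CONTROL):.2f}")
print(f"Misled group accuracy on critical details:  {average_score(P_CORRECT_MISLED):.2f}")
```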

This research sheds light on how Mormon testimonies are created. Once we have heard enough other people say, for example, that they felt something particular when they read the Book of Mormon, we are capable of manufacturing similar memories. And the more authoritative, credible, loving, etc. the people who suggest these things to us, the more effective they are likely to be. I believe, in addition, that there are other and much more real influences behind the LDS testimony phenomenon. See http://www3.telus.net/public/rcmccue/bob/documents/out%20of%20my%20faith.pdf at p. 77 and following for a summary.

It has also been shown that certain experiences that cause the emotion of “elevation” to occur are highly influential with respect to our behaviour. When people see unexpected acts of goodness, they commonly describe themselves as being surprised, stunned and emotionally moved. When asked “Did the feeling give you any inclination toward doing something?”, the most common response is to describe generalized desires to help others and to become a better person, along with feelings of joy. These feelings bind human groups together, and so create strong, reliable communities. Members of Mormon communities exhibit this kind of behaviour. However, the behaviours in question often also bind the participants to the Church itself. For example, by leaving on a mission for two years, a young man in the Mormon community inspires precisely the kind of emotion described above. At the same time, he is subjecting himself to a powerful conditioning force that will make it much more difficult for him to “question” when he returns, and he is keeping himself very busy during precisely the period of life during which most young men question. Hence, the community is strengthened by an act that inspires the emotion of elevation, and a number of other things are done at the same time that will also strengthen the community. Many Mormon conventions have this kind of effect.

As noted above, the prize religion offers is huge – relief from the anguish caused by our greatest existential fears. The LDS Church ups the stakes significantly in this regard by positing the possibility of eternal family life, and it has created a society in which an admission of disbelief often costs dearly in terms of marriage and other family relationships, social status, etc. In the face of this kind of prize/penalty structure, we should not be surprised that apparently rational people are easily persuaded to believe in irrational, extremely low-probability versions of future reality such as the Celestial Kingdom. And when you add to this the psychological pressure created by being surrounded by believing Mormons for most of my life, bearing public testimony on countless occasions as to the certainty of my belief, and then being placed in leadership positions within the Mormon community, it is not surprising to me that for almost three adult decades I was unable to see what is now so clear to me respecting the Church and the manner in which it treated me and continues to treat others.

Even Scientific Thinking Is Influenced by These Principles

As noted above, the principles just described were developed with respect to human mental processes in general. They have not yet been broadly applied to religious phenomena. One of my friends, an LDS professor of religious psychology who has been helping me with this project, recently indicated to me that he thinks this neglect is due to the greater credit given within the academic community for empirically oriented research. Since the application of psychological principles to religious behaviour does not easily fit into that mould, it is not an attractive research subject. He agrees with my assessment that it is reasonable to apply these principles to the formation of religious beliefs and cultural practises, and that given the dominant nature of emotional forces relative to religious issues, it is also reasonable to conclude that cognitive dissonance, denial, etc. will be powerful forces in the determination of religious beliefs. For an excellent overview respecting the application of cognitive dissonance principles to religious issues in general, see “Speculations on a Privileged State of Cognitive Dissonance” by Conrad Montell at http://cogprints.ecs.soton.ac.uk/archive/00002388/01/temp.pdf.

I note in particular something that Thomas Kuhn pointed out in his landmark book on the philosophy of science, “The Structure of Scientific Revolutions”, in which he coined the term “paradigm shift” to describe how science changes. Until his time, it was believed that science progressed in a more or less linear fashion. He pointed out that science seems, rather, to lurch forward. His explanation, which has been widely accepted in the scientific community, is that the majority of each generation of scientists becomes captive to the dominant “paradigm” of its day. However, a minority of each generation will see things the majority cannot see, and will pursue those interests, sometimes to the derision of their colleagues. A future generation of scientists, less encumbered by the paradigm of their forebears, will often recognize in the fringe work something of importance that will be adopted, amplified and made the basis of a new paradigm that rapidly transforms the scientific community’s views respecting the issues in question. And then the process repeats itself. A classic example is found in the history of genetics. Gregor Mendel did the groundwork for modern genetics, published his work, and was ignored by the scientists of his generation. He is now revered as the founder of genetic science.

The scientific community is the pinnacle of rational thought in our society. If scientists are subject to the forces described above in the manner Kuhn indicates, how much more so are the rest of us likely to be? And since the correlation between emotion and irrational belief is so strong, and the connection of religion to emotion so pervasive, should we not expect great difficulty as we attempt to be “rational” about religion? But, given modern man’s need to explain everything he does in rational terms, should we not also expect him to do that, and believe with all his heart that he is being rational with respect to his religious beliefs?

When we add all of the above factors together, we should not be surprised that it is excruciatingly difficult for the typical faithful Mormon to look in the eye any information that questions the legitimacy of the beliefs on which his life is based.
