How Denial Works: Denial in General and Mormon Denial in Particular

November 29, 2005

version 2

Word formatted version of this document: how_denial_works.doc

Whether we like it or not, each of us is constrained by limits on what we can do and feel. To ignore these limits leads to denial and eventually to failure. To achieve excellence, we must first understand the reality of the everyday, with all its demands and potential frustrations.  Mihaly Csikszentmihalyi

All forms of tampering with human beings, getting at them, shaping them against their will to your own pattern, all thought control and conditioning is, therefore, a denial of that in men which makes them men and their values ultimate.  Isaiah Berlin

Introduction

This essay’s purpose is to examine denial’s roots and its implications for those of us who wish to understand as much as possible about where our blind spots are likely to be.

After I found the historical and other data that clearly indicated Mormonism was not what I thought it to be, I was still deeply troubled by various aspects of my Mormon experience.  In particular, where did the “spiritual experiences” I had while Mormon come from?  How was it possible that I could have felt so certain that Mormonism was “true”?  And why did so many intelligent people also profess to have had experiences similar to mine while continuing to be (apparently at least) devout orthodox Mormons?  As I came to understand how denial works, I saw my Mormon spiritual and social experience in a different light.  This resulted mostly from finding parallel experiences in many other religious cultures, including extremely intelligent people who held literalist beliefs that I had no trouble dismissing as irrational.  There was so little difference in substance between those beliefs and Mormon beliefs as I had come to understand them that I came to feel comfortable trusting my rational faculties as far as Mormonism was concerned.  I also recognized that it was largely limited perspective and my connection to a Mormon family and community that were responsible for my beliefs, just as similar forces are responsible for the beliefs of other “obviously” irrational people (young earth creationists; Muslim suicide bombers; holocaust deniers; Jehovah’s Witnesses; etc.).

This essay started out as part of a much longer piece of analysis I was doing on the process of recovering from the kind of conditioning Mormonism administers to its faithful.  That project bogged down once I felt I had understood enough of what had attracted my attention to it, and other concepts began to seem more important.  However, the material I had collected with regard to how denial works seemed like it might be helpful enough to other people that I have carved it out and will make it publicly available as it is.  If it seems rough around the edges, I have just explained why.

As is usual for me, I present this on an “as is, where is” basis, without apology.  I don’t have the time or desire at this point to polish my notes into something worthy of real publication, but since I continue to have people write to me with thanks for making what has been helpful to me available to others, I will continue to do so.

I should note for the record a couple of other areas of research related to denial that fascinate me, and are the subject of my ongoing (if sporadic) study as time and energy permit, but are outside the scope of what I have written here.

First, in some ways humans have an amazing ability to process small amounts of information rapidly and accurately.  See Malcolm Gladwell’s “Blink”, Gerd Gigerenzer’s “The Adaptive Toolbox” (see http://www.edge.org/3rd_culture/gigerenzer03/gigerenzer_index.html) and John Gottman’s research regarding marriage (http://www.gottman.com/ and http://www.artsci.washington.edu/newsletter/Autumn00/Gottman.htm).  On the other hand, there are vast areas of perception that have been shown to be faulty.  This essay outlines many of those.  It is possible to create a map showing the areas of personal and social interaction in which we are likely to use our best perceptive capacities, and those where we are more likely to be faulty.  I intend at some point to do that, or to find someone else’s analysis that performs the same function.  To my knowledge, nothing like this yet exists.

Second, the question of how to deal with our propensity for denial fascinates me.  That is, once we have a pretty good idea what causes denial, and hence when and where we are likely to be subject to it, what can we do about that?  The short answer is that we should rely upon the judgment of others when we know we are at risk, just as after having more than two drinks I no longer trust my instincts as to whether I am fit to drive.  In that circumstance I follow mechanistic rules that have to do with how much I have had to drink, when I drank it, etc. before I will under any circumstance let myself get behind the wheel.  And I do this because I know my judgment is compromised by more than a certain small amount of alcohol.  The same is true with regard to the psychological and social forces that cause denial.  However, the simple formula I have just indicated is the tip of a huge, interesting iceberg about which a massive amount of worthwhile research has been conducted.  See http://www.findarticles.com/p/articles/mi_m0IBS/is_1_28/ai_82351480 for some useful additional ideas in this regard.

 

The Basics of Denial

Denial is an invisible enemy - a virus that infects the patient in a way the patient cannot detect even though its symptoms are plain for others to see.  This creates a great deal of conflict between those in denial and others who would help them “see the light”.  The analogy to death or divorce does not work well here because in those cases something has happened that cannot be denied for very long.  Someone is dead, or a spouse has moved out and a judge has pronounced the relationship legally dead, for example.  In Mormonism’s case, the corpse appears lively and there is no guarantee that any particular person will ever be able to grasp the reality of their belief system.  The reason a belief system like Mormonism is so easy for outsiders to understand while being so opaque to insiders is at the core of this issue.

Many books have been written on the topics of denial, cognitive dissonance and the various other mechanisms that relate to them.   What follows summarizes a number of these concepts.

For purposes of this essay, denial can be defined as a mental defense mechanism that suppresses awareness of troubling realities by causing the mind to refuse to consciously acknowledge them. Denial may or may not be adaptive, depending on the information being denied.   What follows is an attempt to explain some of the many mental and social mechanisms that contribute to the Mormon denial of certain aspects of religious reality.

Denial is a human universal.  Thomas Kuhn pointed out in his landmark book on the philosophy of science, “The Structure of Scientific Revolutions”, that even the scientific enterprise is subject to denial’s perception-distorting forces.  In that book he coined the term “paradigm shift” to describe how science changes.  Until his time, it was believed that science progressed in a more or less linear fashion.  He pointed out that science seems, rather, to lurch forward.  His widely accepted explanation for this is that the majority of each generation of scientists becomes captive to the dominant “paradigm” of their day.  However, a minority of each generation will see things the majority cannot see, and will pursue those interests while ignored or derided by their colleagues.  A future generation of scientists, less encumbered by the paradigm of their forbears, will often recognize in the fringe work something of importance that will be adopted, amplified and made the basis for a new paradigm that will rapidly transform the scientific community’s views respecting the issues in question.  And then the process repeats itself.  A classic example of this is found in the history of genetics.  Gregor Mendel did the groundwork for modern genetic theory, published his work, and was ignored by the scientists of his generation.  He is now revered as the founder of genetic science.

The scientific community is the pinnacle of rational thought in our society.   If scientists are subject to the forces described above in the manner Kuhn indicates, how much more so are the rest of us likely to be?   And since the correlation between value systems, emotion and irrational belief is so strong, should we not expect great difficulty as we attempt to be “rational” about religion?   And given the human need to explain everything we do in rational terms, should we not also expect that we will believe with all our heart that we are rational with respect to our religious beliefs?     In short, we should expect the forces of denial to be rampant in the religious world, and indeed they are.

To understand how denial works in that regard, and hence how we can protect ourselves against it, we need to understand some of the basics of how belief systems function in the lives of individual believers.  There are many ways to think about this process.  Since we are talking about denial in a Mormon context, the place to start is with the Mormon system of “knowing”, usually referred to as its “epistemology”.  Hence, I approach denial in this context by attempting to answer the following questions:

  • How does Mormon epistemology work?   That is, how do Mormons come to   “know” things?
  • How do foundational belief systems establish and stabilize their epistemic systems in particular human beings?  That is, how are the forces that create denial of the type relevant to Mormonism created?  And how does Mormonism in particular establish itself in the heads of its followers in this regard?
  • What makes it more likely that a particular belief system will collapse in the life of a particular individual?   That is, what are the mechanisms that are likely to allow a particular individual to overcome the forces of denial with regard to Mormonism?

I also note that each of the mechanisms I am using to describe Mormon denial can of course affect post-Mormons.  Getting out of Mormonism does not make us immune to the group and emotional forces that cause humans to misperceive reality.  However, having been taught a painful lesson as to how these perception warping forces work, and when we are most likely prone to them, post-Mormons should be better equipped than they were as Mormons to navigate the tricky waters around these forces.

 

Mormon Epistemology

An often unarticulated theme in discussions related to religious belief is epistemology - the study of how humans come to think they “know”.  Without trying to summarize more than a small fraction of the wonderful debates that have occurred over the centuries respecting this topic, I will attempt to outline and analyze Mormon epistemology.

 

Frequently denial is a direct result of a conflict between a basic principle of a belief system and an aspect of reality.  For example, the Catholics of Galileo’s day believed that the Bible and the Pope were inerrant.  That was a basic feature of their epistemology.  When Galileo came along with his telescope, he produced information that was perceived to conflict with both the Bible and the Pope.  This caused widespread denial with regard to both the evidence and the scientific tools Galileo produced.

 

Mormons use a certainty-based epistemology that is very similar to that of the Catholics of Galileo’s day, and it frequently produces denial for much the same reason denial was produced with regard to Galileo.

 

Within Mormon culture, the words “I know” are among those most commonly heard.   Mormons constantly tell each other and outsiders that they “know” that Joseph Smith is God’s prophet; that The Book of Mormon is the word of God; that the Mormon Church is the only true Church on Earth; etc.

 

So, how do we come to “know”, and what can we “know”?   And what are we to make of claims like those made by faithful, well intentioned Mormons who claim certain knowledge of things such as those mentioned above?

 

Certain v. Uncertain Knowledge

Most of what we think we know is based on various scientific or other theories that enable us to interpret our perceptions.  While we often think of this knowledge as certain, it is not.  Even things we take for granted, like the shape of the Earth, cannot be proven with 100% certainty.  Science does not produce certain knowledge, but rather knowledge with varying degrees of certainty, or in which belief is justified to a greater or lesser extent.  Uncertainty, in fact, has been shown to play a critical role in both the scientific and creative processes.  Erich Fromm observed that “The quest for certainty blocks the search for meaning. Uncertainty is the very condition to impel man to unfold his powers.”

 

Knowledge about our physical world is inexhaustible and hence never complete or certain.   Alfred Korzybski developed a lesson especially for children to teach this idea.   He would explain:

 

“Children, today we want to learn all about the apple.”

 

He would place an apple in view of the children and ask, “Do you children know about the apple?”

 

“I do!”,   “I do!”,   “Yes, I know about apples!”

 

“Good” Korzybski moved to the blackboard. ,   “Come, tell me about the apple?”

 

“The Apple is a fruit.”,   “The apple is red.”,   “The apple grows on a tree.”

 

Korzybski would begin to list the characteristics described by the children on the blackboard.

 

The children continued,   “An apple a day keeps the Doctor away.”

 

Korzybski continued listing the children’s answers until they ran out of ideas, then he would ask, “Is that all we can say about the apple?”

 

When the children answered in the affirmative, Korzybski would remove his pocket-knife and cut the apple in half, passing the parts among the children.

 

“Now, children, can we say more about the apple?”

 

“The apple smells good.”  “The juices are sweet.”  “The apple has seeds.”  “Its pulp is white.”  “Mother makes apple pie.”

 

Finally, when the children had again run out of answers, Korzybski would ask, “Now, is that all we can say about the apple?”  When the children agreed that it was all that could be said, he would again go into his pocket, only this time he removed a ten-power magnifying lens and passed it to the children.  The children would examine the apple, and again respond:

 

“The apple pulp has a pattern and a structure.”   “The skin of the apple has pores.”   “The leaves have fuzz on them.”   “The seeds have coats.”

 

Thus Korzybski would teach the children the lesson of Non-ALLness.

 

Now we could continue to examine the apple - with a light microscope, x-ray crystallography, and eventually the electron microscope.  We would continue to discover more to say about the apple.  However, we can never know ALL there is to know about anything in Nature.  We humans have the power to know about Nature, but not to know ALL.  Knowing is without limit, but knowing is not total.  Universe is our human model of Nature.  Our “knowing” can grow evermore complete.  It can grow closer and closer to the ‘Truth’, but it cannot equal the ‘Truth’.  It must always be incomplete.  (See http://futurepositive.synearth.net/2004/04/09).

 

The humility inherent in this approach to life is what keeps our minds open, keeps questions before us, and so keeps us learning.   As Dr. Timothy Wilken notes:

 

When we are certain, we are surprised and disheartened by our mistakes. This attitude toward human error is the most damaging of human ignorances. We humans make mistakes because we make all our decisions without ALL the information. This is a major point that all humans must understand. The only cause of mistakes is ignorance.

 

We humans must become aware of our ignorance. When we humans have knowledge of our ignorance, we can learn from our mistakes and protect ourselves in the future. When an individual knows he doesn’t know, he is wise. Wisdom is the opposite of certainty. The knowledge of our ignorance is wisdom. (http://futurepositive.synearth.net/2002/10/06)

 

That is, an epistemology based on certain knowledge will perpetuate ignorance and the cycle of error and guilt that goes along with it.   An epistemology of uncertainty, and hence humility, opens the mind as well as the soul.

 

The Human Aversion to Uncertainty

There is a near universal human tendency to resist the acknowledgement of uncertainty.   There are many explanations for this feature of our individual and collective psychology.   I will summarize only a few here, and will return to the theme of certainty v. uncertainty throughout this essay as we explore the mechanisms used by belief systems to create perceptions of reality.   The human inclination toward certainty is key to many of these.

 

Nietzsche blamed Socrates and Plato for humanity’s deadening desire for certainty.  Socrates taught of an absolute good - a transcendent good that was above humanity.  Plato used this idea in his ideal “forms”, which in turn supported (or perhaps even caused) the Christian idea that the Universe is governed by a single divine power.  Nietzsche believed that this absolutist way of thinking satisfied the human desire for certainty, but at the cost of a “slave morality”.  Nietzsche’s philosophy was largely a reaction to this perspective, which he regarded as deficient.  He rejected all transcendent powers and grounded his philosophy in the uncertainty of human experience, thus freeing the slaves (see http://webpages.ursinus.edu/rrichter/talksixteen.html).

 

It seems to me that while Nietzsche was right to identify a bad idea deep in Socrates’ foundational philosophy, he was wrong to blame Socrates for the manner in which this idea came to dominate Western thought.  I doubt that Socrates, great though he was, could have exercised such influence over millennia of human beings.  Furthermore, the idea of a supreme being or other forces that provide certainty springs up in so many cultures (see Pascal Boyer, “Religion Explained”) that the explanation for both the popularity of Socrates’ teaching and its many other manifestations seems likely to lie in the structure of the human psyche.

 

For example, we have certain brain mechanisms that are designed to alert us to danger and keep us safe.  When these kick in, they tend to override all other functioning in our brains and hence cause us to overlook and misinterpret evidence so that we err on the side of safety.  That is, when threatened we tend to overestimate risk (perceive more uncertainty than in fact exists), since that is most likely to keep us safe by either freezing us in place or causing us to flee.  In the Western world, we are rarely put in such a position from a physical point of view, but our fear instinct still flares in response to psychological and social threats.

 

When our overriding safety mechanisms are not functioning, we spend a lot of our time deciding what to do, and then persuading others to cooperate with us.  In this environment, decisiveness and confidence are of critical importance.  If we are successful in our persuasive efforts, we often do not need to be precise in our analysis.  In fact, if enough people agree with us, we will have created the equivalent of a self-fulfilling prophecy.

 

Hence, we tend to overestimate our own abilities, and those of the people allied with us.  And we tend to underestimate the risks we face.  This helps us both to project and to feel the confidence we need to make our way through life.  It is well known that mildly depressed people, who tend not to be as successful as their peers, are also much more accurate in their perceptions (see John Ratey, “Shadow Syndromes”; Michael Shermer, “Why People Believe Weird Things”).  These people are “downers”.  They “rain on the parade”, and are often not much fun to be around.  And yet their perceptions are much more accurate than those of “normal”, cheery people.

 

It has also been shown that our ability to perceive and interpret the information that comes to us is dramatically impaired when our emotions are inflamed.  Hence, for example, medical doctors are advised not to treat their own families.  Some of the many perception distorting forces to which we are subject are described below.  Often these tend toward helping us avoid the perception of uncertainty.

 

There are many lessons in the history of humanity’s tendency toward certainty. An important lesson for present purposes is that if we want to accurately perceive what is going on around us, we need to recognize that we likely have an unwarranted tendency to feel certain of our perceptions, and take steps to check and double check our intuition using the best objective sources of information available to us.

 

Five Mormon Epistemic Principles

In light of the background just summarized, we should not be surprised that a relatively traditional belief system such as Mormonism would use an epistemology heavily oriented toward certainty.   Here are the foundational principles of Mormon epistemology:

 

1. Truth can be discovered, and we believe all truth, from whatever source it may come. This sounds pretty good.  But, it implies an unwarranted certainty that “truth” can and will be found, and so tends to exacerbate the problem of assuming things to be certain that are not.

 

2. Mormons already “know” some things because God has “revealed” them through his prophets, and so any information purporting to question those things should be discarded because regardless of how the probabilities appear, that information is misleading. That is, Mormon minds are closed as to information that purports to disconfirm “revealed truth”.  The Catholic Church’s response to Galileo resulted from a similar epistemology.

 

3. It is wrong to read or talk about anything that questions revealed truth.

 

4. Mormon prophets sometimes make mistakes and so at times unquestionable truths will be discovered to be erroneous. Mormons will know such mistakes by waiting for a Mormon prophet to acknowledge that an earlier prophet was wrong.  Mormon prophets are the ultimate source of authority.

 

5. Since some Mormons will disobey rule #3, an alternative approach is designed for them that is not disclosed to the average member.  That is, nothing is certain; hence the criticisms of Mormonism and the information used to support them are also uncertain; and since no one can know what is “true”, Mormons are justified in choosing a set of beliefs that works for them; Mormonism is at least a justifiable choice on that basis, and is arguably the best choice because it works so well.  This rule has only become important to Mormonism recently as a result of the manner in which the Internet and other media have publicized many of the difficulties with Joseph Smith’s credibility and other problematic aspects of Mormon history and social practice.  Those who become aware of this rule are told that they should remain quiet about it - that they are part of an elite who should not harm the average Mormon, who is unprepared for this much “meat”, by talking about it.

 

Problems with the Mormon Epistemic System

The story of human history discloses, in a sense, a continual series of awakenings.   These involve the usually painful determination that what seemed to be an utterly reliable source of wisdom and authority is unreliable, and should be questioned.   Generation after generation of our ancestors have fought through the fear this process entails, and found more freedom on the other side.   Thus, for example, was the grip of the Catholic Church over Europe broken and the divine right of kings to rule replaced by democracy.

 

But within even the most democratic and scientifically advanced countries on Earth, large pockets of superstition still exist, based on emotions that overpower reason.   And as seems to have always been the case, this superstition is exploited by the few who have power at the expense of the many who do not.   Mormonism is an example of this continuing, and thankfully shrinking in most respects, problem.

 

The essential problem with Mormon epistemology is that it is driven by blind faith in authoritarian leaders who go out of their way to encourage such faith and the obedience it causes. Those leaders justify continuing to misrepresent Mormon history in order to hold onto their power base.  While I am critical of that, I recognize that these leaders are well intentioned.  This makes their deception hard to spot.  They believe God’s will requires that they mislead their followers. This is not new. Nietzsche said that the “pious lie” - the lie told with the intent of doing good - is the foundation of all priesthoods.  Plato taught that the “philosopher kings” - the wise few who lead - had a moral obligation to lie to the ignorant masses when necessary to “help” them make “good” choices.

 

I have concluded that the attitudes respecting certainty and deception just described are at the core of the Mormon epistemic system, and are responsible for most of its problems.   Therefore, Mormonism is dangerous on three levels.

 

First, Mormonism creates a relationship of trust between leaders and members that is abused constantly by Mormon leaders to extract resources from average Mormons and retard individual growth.  This trust could be abused further.  For example, what if a future Mormon prophet decided that it was time for all faithful Mormons to sell their homes and move to Independence, Missouri, as Joseph Smith prophesied would be the case?  And what might that prophet do if the things he predicted did not come to pass, and his group was about to be broken up by government authorities, etc.?  That is the situation that led to Jonestown in another religious movement in which similar certainties and deception were at the core of an epistemic system.  And more to the point, the possibility for emotional and sexual abuse within Mormon congregations is much higher than in other human groups where the average member is more skeptical of authority than are most Mormons.

 

Second, a belief in an emotion-based epistemology makes Mormons vulnerable to other radical groups and fraud artists of all kinds.  This is likely what caused Elizabeth Smart to go along with her captor.  She had been prepared by her Mormon upbringing to fall into that kind of line when the right kind of pressure was exerted on her.  This is likely what is responsible for the continual stream of people moving from mainstream Mormonism to the fundamentalist polygamist groups.  This is likely why Mormons are such easy prey for the right kind of fraudulent investment salesmen, and are near the top of the US charts in terms of multilevel marketing participation rates.

 

Third, Mormonism encourages the idea that deception is justified in order to solve short-term problems, or to support things about which its members are “certain”.  Psychologists have shown that this approach to life blurs moral lines and tends to cause increased dishonesty and other forms of immorality.  It also promotes the false idea that lying to avoid what appear to be relatively small problems is a viable life strategy.  Often, this approach to life produces much greater problems and immense pain for many people down the road.  The recent movie “Goodbye, Lenin!” nicely illustrates how this happens (see http://mccue.cc/bob/documents/rs.goodbye%20lenin.pdf).

 

Most Mormons, when presented with a system such as the one I just described as used by another group (the Jehovah’s Witnesses, for example), would immediately pick it apart in a sensible fashion (“Not have a blood transfusion when you need one, on the basis of a religious belief!? That is nuts!”  “How many times have they prophesied the second coming of Christ and it did not happen?  Come on!!”  Etc.). But the same Mormons would likely be incapable of recognizing the nuttiness of their own system. Some of the reasons for this blindness - this denial - are explored below.

 

Mormon Anti-Wisdom

The Mormon epistemic system does not, generally speaking, harness the wisdom of the Mormon crowd by allowing the best ideas generated by Mormons to percolate up from the bottom and be sorted out and refined along the way.  Rather, Mormon wisdom tends to be received by Mormon leaders by “revelation” and passed down to the masses.  Most of this happened long ago, when Mormon leaders still presumed to speak for God.  For quite some time, Mormonism’s prophets have not done much more than re-emphasize platitudes credited to their predecessors, while avoiding some of the more colourful teachings from Mormonism’s past, such as that man can become God (see Morin, “Suddenly Strangers”, http://www.suddenlystrangers.com/Chap10.htm).

 

Since most of Mormonism’s teachings date from close to two centuries ago, they often do not benefit from recent useful scientific and social innovations.  Think, for example, of Mormon racial attitudes; attitudes respecting male-female roles; attitudes respecting homosexuality; attitudes respecting how history should be taught; attitudes respecting the theory of evolution in particular, and how conflicts between science and the teachings of religious leaders should be sorted out in general; the divine origin of polygamy and the teaching that Mormons would never be required to stop practicing it; etc.

 

This traditional “wisdom” is given a privileged place within the Mormon epistemic system. As noted above, it is the starting point that cannot be questioned unless it is stated to be wrong by a Mormon prophet.  So Mormons are forced either to justify their continued belief in things that have a high probability of being false, or to change a foundational belief.  Few are able to change their foundational beliefs, and so the forces of cognitive dissonance (see below) keep them in denial.

 

Since Mormon prophets do not act alone, this means that 15 very old, conservative men have to agree before the teaching of a prior prophet can be discarded. Hence, the nature of the process required within Mormonism to replace defective information with better information guarantees that few changes will occur, and that even those most badly needed (like that respecting blacks and the priesthood) will take a very long time.   Mormons are consigned by their epistemic system to be at the tail end of conservative North American society when it comes to the acceptance of any knowledge that happens to conflict with their foundational beliefs.   And ironically, Mormons are near the forefront of society in accepting knowledge that happens not to conflict with their beliefs.

 

It is fascinating to watch well educated Mormons struggle with this conflict between the rational and irrational sides of their inherited epistemic system.  A few minutes before typing this I had a conversation with one of my Mormon lawyer friends.  This man is successful as a lawyer and businessman, received excellent grades through university, and is a respected member of both the Mormon and business communities.  He told me with a straight face that of all the stories of how various religions started, he believes the Mormon story is the most likely to be true.  He is aware of how many times Joseph Smith lied about various important things while trying to persuade people to believe him.  When I asked him how he justified believing such a liar about the other things Smith said for the purpose of getting people to obey him, my friend said that he often believes people he knows are liars while doing business because his instincts tell him he should.  He believes Smith’s story on the same basis.  And, on that basis, he feels justified in believing that the case for Mormonism is stronger than the case that can be made for any other religion.  This man has demonstrated many times that his ability to assess probabilities and cause and effect relationships is superior.  The only way I can explain him, and many others of his type with whom I am acquainted, is by the cognitive dissonance and emotion related theories referred to below.  My friend is like a medical doctor trying to diagnose his own child.

 

As noted above, many of the pieces of Mormon anti-wisdom that comprise the body of Mormon tradition can’t be proven wrong, and will endure.   There is a strong tendency carefully cultivated within the Mormon population to hold onto these bad ideas, since to let go of them too easily would threaten the authority, and power, of Mormon leaders.

 

And even those Mormon ideas that are acknowledged in various ways to have been wrong live on in harmful ways. For example, Spencer Kimball gave the blacks the priesthood, but he could not get rid of the passages in the Book of Mormon that equate skin color and sin. The Book of Mormon was quietly changed in some ways to eliminate the most egregious of these problems (native Indians used to be referred to as becoming “white and delightsome” when they obeyed God; this was changed to “pure and delightsome”), but this is a bandage on a broken leg. And we have the legacy of statements from various influential Mormon leaders and prophets respecting the blacks being “less valiant” in the pre-existence, Lamanites becoming more white as time passes after having accepted the gospel, etc. Hence, racial attitudes within Mormonism continue to be problematic. It is not permitted within the Mormon epistemic system to just say, “That was wrong”, and clean house.

 

How about polygamy? It was introduced (in my view) as a result of Smith’s sexual appetite. In this he follows an inexhaustibly long line, and is followed by a similarly long line, of male religious and societal leaders. The alpha male in the group gets unusual sexual access - that is how human groups tend to function. Evolutionary theory supports this idea; as does the history of every people to which I have had access; as does psychological, sociological and anthropological theory and the evidence collected in support of it. To justify his actions, Smith told his followers that God commanded his sexual behaviour (after lying about what he had done for over a decade) and said that his lying was mandated by God too. Were we considering the history of any group other than our own, would this story get more than an eye roll and a laugh?

 

Then, polygamy was done away with by fiat of the U.S. Feds, while the Mormon prophets fought and lied throughout the entire process. This lying created the foundation of today’s Mormon fundamentalist polygamy, and is directly related to Mormon epistemology. That is, Mormon leaders during the 1890-1905 period said in private that God had spoken through Smith (as noted above) and revealed a reality - that polygamy must be practised on Earth in order to qualify for the Celestial Kingdom in heaven, where polygamy would be practised by all who entered that blessed state. So, it did not matter what was said in public (Smith himself provided ample precedent for this practise); God’s word was His word, and polygamy would always be the way of life for His people on earth. Contrast this with the fact that most of what today’s Mormons would regard as the important part of their religion (family values, etc.) is a direct result of the mainstreaming strategy that Mormon leaders adopted because of what the U.S. Feds forced them to do - abandon polygamy. So, civil authority exercised in accordance with democratic principles is responsible for the Mormon abandonment of polygamy and the modern mainstream Mormon way of life (including Mormonism’s strong patriotic tendency), while Mormon prophetic authority is responsible for resisting the process that took Mormons mainstream and created what they value today, and is also responsible for today’s Mormon fundamentalist polygamist groups. What a nice picture that line of reasoning paints. Again, the Mormon epistemic system does not permit Mormons to simply acknowledge a leadership error and correct it.

 

And what about homosexuality? Mormon prophets have interpreted the Bible to condemn homosexuality, and hence homosexuality is still considered to be sin within the Mormon community in spite of the wealth of evidence coming from science to indicate that in many cases at least, homosexuality is rooted in the biology of particular human beings. The last I heard, the Mormon Church continued its policy of using what I think it is fair to call physically and emotionally abusive “therapies” to try to “cure” homosexuals of their supposed ill. And it is no surprise to find that suicide is much more common among gay Mormons than it is among gay people elsewhere. See Terry Hiscox, “In God’s Name: The Treatment of Homosexuals by the Christian Church” at http://www.affirmation.org/learning/in_gods_name.asp#_edn55

 

I could go on at some length in chronicling the examples of the anti-wisdom Mormons have inherited from their prophets, but will stop here since I believe the point has been made well enough.

  

The Purpose of The Mormon Epistemic System

Given the foregoing, I think it fair to conclude that the purpose of the Mormon epistemic system is not primarily to produce knowledge that is as accurate and useful as possible.  Rather, I suggest, the primary purpose of the Mormon epistemic system is to protect the authority of Mormon leaders.  A secondary goal is to produce as much useful knowledge as possible.  See the discussion of Pierre Bourdieu’s thought under the heading “Misrecognition” below for a summary of why these purposes are not recognized by Mormons or Mormon leaders.

 

From the Mormon point of view, the damage perceived to result from impairing the willingness of Mormons to obey their leaders overrides the good that important information could do.   Such is the case, for example, with information related to the biological causes of homosexuality.   It has the potential to help many people and even save human life by reducing suicide rates in the gay Mormon population, but it calls Mormon authority into question.   So, information is rejected out of hand, and those who insist on bringing it forward are generally removed from the Mormon community and treated as pariahs.   However, once sufficient time has passed or other circumstances have transpired to make it possible to accept useful knowledge without unduly threatening Mormon authority, then this is often done.     Again, we see that Mormons are not necessarily anti-knowledge; they are pro-authority to the point that authority trumps knowledge.   Only when knowledge threatens authority does this become clear.

 

Postmodern Mormon Epistemology

That authority maintenance is the goal of Mormon epistemology is evidenced perhaps best of all by the shift that occurs when a member questions Mormonism thoroughly enough to show that the story the Mormon Church tells about itself is probably not accurate, and is then met with an almost complete reversal of field, as indicated by Mormon epistemic rule #5 above.  This concept is worth further development since it shows the boundaries of the current tension between Mormon epistemology and its competitors, and the odd innovations this is producing within the Mormon community.

 

The “everything is certain and don’t question authority” approach can be thought of as a kind of “blue pill” in the sense that term is used in the movie “The Matrix”.  That is what is offered to those awakening humans who find reality too hard to bear and wish to remain unconscious of what is really going on.  When a Mormon is so awake that this pill no longer works, some other form of blue pill is required.  And it must be something that acknowledges the uncertainty of the Mormon position.  It no longer works for those who defend Mormonism to argue that the historical and scientific case against it is weak. It is not weak. The very essence of history and science has to be challenged to defend Mormonism. And certain parts of postmodern theory provide the means to do this in, regrettably, a manner that is respected in at least some academic circles.  It is from that academic genre, loosely speaking, that the approach outlined in rule #5 above is drawn.  Other religions are going down the same road as they defend their shaky foundations against secular forces.  That is, nothing is certain, so you can believe what you want on the basis of what “works” for you.  This is an extreme form of relativism that intellectual Mormons would likely only be comfortable using with regard to religious beliefs.

 

From my point of view, this throws the baby out with the bath water. We are invited to jettison critical thinking in terms of science, history, etc. in order to save Mormonism. Susan Haack has written a couple of books that show how ridiculous this approach is overall. A taste of her work can be found at http://www.csicop.org/si/9711/preposterism.html

 

Ironically, postmodernism is designed to critique paradigms (called “metanarratives”) so as to better understand them.   Science, in this sense, is a metanarrative.   Postmodernism questions what we can know as a result of science.   History is another metanarrative.   Western democracy is a third.   Mormonism is, of course, a metanarrative.

 

Postmodern theory quite rightly points out that those who live within any particular metanarrative often make misleading assumptions about it, and so do not understand the reality of their experience.  That is the point of much of the psychological and sociological research outlined below.  Pierre Bourdieu is, in fact, a leading postmodern sociologist.

 

Postmodernism indicates that the reality of any metanarrative is often better seen by outsiders with the help of large data collection and analysis projects.  Enlightening patterns can often be seen from this perspective that are otherwise invisible.

 

For an example of how postmodernism has been applied with Mormonism, see David E. Bohn, “Unfounded Claims and Impossible Expectations: A Critique of New Mormon History”, and Malcolm R. Thorp, “Some Reflections on New Mormon History and the Possibilities of a “New” Traditional History”.   Both are found in “Faithful History: Essays on Writing Mormon History”, edited by George D. Smith.   Thorp is replying to Bohn who attempts to use postmodern theory to defend Mormonism.

 

Among other things, Bohn invokes Heidegger, Derrida, Gadamer and others from the phenomenology and postmodern streams of thought to defend his thesis that Mormon history should not be read in an “objectivist” fashion.  That is, we can’t really know what Mormon history means.  To this Thorp replies, among other things, that if we take what the thinkers just referenced have said seriously, we end up questioning Mormonism from the ground up, using tools and seeking answers to questions that go far beyond anything the so-called New Mormon Historians have sought to do.  That is, Bohn and other similarly inclined Mormon scholars unsheathe, and put into use for all, a set of sharp, multiple-edged swords in defence of Mormonism without seeming to realize what these might be used to do against them.

 

It is my view that Derrida, Gadamer et al had social constructs precisely such as Mormonism in mind when they developed their analytical “deconstructive” tools.   And if anything is antithetical to a postmodern theory of history, it would be the use by an elite leadership group of something like Mormonism’s “faithful history” to maintain control of the group they lead by suppressing information.

 

So, the metanarrative that it would be most useful for Mormons to critique using postmodern theory is Mormonism.  This essay is, in fact, a weak example of a postmodern critique of Mormonism.  I have not met a single faithful Mormon who wants to use postmodern theory for that purpose.  Rather, they attempt to use it against science and history to suggest that the evidence these disciplines provide as to Mormonism’s problems should be dismissed. By so doing they discredit the entire scientific enterprise. This is further evidence, in my view, of the manner in which Mormonism produces bifurcated brains and mental pathology. Mormons are usually practical people. And yet Mormonism’s adherents are desperate enough at this point in its evolution that they have resorted to the most impractical of philosophies - extreme postmodernism - in an attempt to stem the tide of criticism against Mormonism.  In this, Mormonism follows the path described by Karen Armstrong in “The Battle for God” as she attempts to explain why fundamentalism has recently become such a vibrant force in the Jewish, Muslim and Christian worlds.

 

It is important to note that the postmodern approach to a defence of Mormonism makes the defender of the faith sound intelligent. He can ground his defence in the work of people like Jacques Derrida, Richard Rorty, Heidegger and Gadamer, respected, if eccentric and hard to understand, postmodern philosophers.  Their views are mostly rejected by the more practical, science oriented philosophers, who are the natural companions of Mormons to the extent that science does not conflict with Mormon belief.  Ideological war indeed makes strange bedfellows.

 

However, despite the seemingly polar opposite natures of the “certainty” and postmodern (or “uncertainty”) approaches to defending Mormon belief, they have something fundamentally telling in common - they both take the decision making tools out of the hands of the average Mormon.  The “certainty” approach makes it impossible for him to make up his own mind because he is to accept authority, not question it.  The “uncertainty” approach says that it is somewhere between difficult and impossible for him to make up his mind in a rational fashion to do anything other than remain Mormon, because that is where his life is based.  And it is strongly suggested that if you “don’t get” the postmodern approach, you are simply not bright enough, or not creative enough, or too inflexible, etc.  The really bright, creative, flexible people get it, and are remaining within Mormonism.  This is, of course, another falsehood.

 

There are two things I find most offensive about the approach to defending Mormonism that I just described.  First, it justifies the deception of the leaders on the basis that the members are incapable of understanding these things - of thinking for themselves. I was one of those faithful members who poured his guts into the Mormon Church for many years. I would have made many important decisions in vastly different ways had I been given the information necessary to do so. It is immoral, without any question in my view, for Mormonism’s leaders to continue to deceive as they do. And it pours salt in the wound to listen to some academic, in cognitive dissonance up to his eyelids, drone on about a twisted application of postmodern theory that guts science and history in order to plug the holes in the good ship Mormon.

 

Second, note the resort to emotionalism.  “Everything is uncertain; so you can’t be sure what is right; and you have a lot to lose; so don’t change what you are doing.”  How do you think that would go over as a missionary tool while attempting to convert Catholics, let us say, to Mormonism?  And for that reason, this approach is not made known to the faithful members, and those who do know about it preach a code of silence - this is dangerous knowledge only to be shared with those who are ready for it.  As pointed out by the essay at http://mccue.cc/bob/documents/rs.religious%20faith%20-%20enlightening%20or%20blinding.pdf, this use of secrecy to control information and people runs back to near the beginning of Mormonism.

 

I have heard this approach advocated by one General Authority during a personal interview with him, and by several academically oriented Mormons who choose for various reasons to remain affiliated with Mormonism.  This is the “advanced course”.  One had the gall to tell me that most Mormon bishops and stake presidents don’t really understand Mormonism or spirituality because they are too literal in their interpretation of both.  According to this fellow, the “enlightened” Mormon intellectuals who see the deception, uncertainty, etc. and decide to continue to believe and obey (at least when they feel like it - they are not generally as obedient as the true believers) are the only really spiritual ones.  This is a vanishingly small group at the moment.

 

So, here we have the blue pill for educated Mormons. You get a second chance to go numb after having broken through certainty’s illusion.  And when you put the two blue pills together, what you have is a program that seems designed to control the masses, and then to retain to a degree - and more importantly silence - those who figure out the scam so that the masses will remain undisturbed.  I do not suggest that anyone at the top of Mormonism is responsible for this fraud.  This is the kind of thing that evolves within organizations below the conscious radar of almost everyone, as indicated by people like David Sloan Wilson (“Darwin’s Cathedral”), Steven Johnson (“Emergence” and “Mind Wide Open”) and Pierre Bourdieu (see David Swartz, “Culture and Power: The Sociology of Pierre Bourdieu”).  Aspects of their thought in this regard are explored below.

 

How Do Foundational Belief Systems Establish Themselves?

Having looked at how the Mormon epistemic system works, we will now examine how belief systems in general, and the Mormon system in particular, engrain their beliefs in individual believers.  This topic is sufficiently complex that there is no one way to approach it and fully understand it.  This is like the old blind-men-feeling-different-parts-of-the-elephant story (see http://www.noogenesis.com/pineapple/blind_men_elephant.html).  I prefer the analogy of a sporting event or large scale dramatic production.  No one camera angle will adequately capture the event.  Luckily, we have many cameras at our disposal.  They will each show us something different, and hopefully useful, about how denial works relevant to Mormonism and other belief systems.

 

Because of its complexity, it is my view that religion will eventually be explained better by the science of complexity than in any other way (see http://www.calresco.org/info.htm, http://www.calresco.org/sos/sosfaq.htm#1.1 and http://scholar.google.com/scholar?q=sociology&domains=santafe.edu&q=inurl%3Awww.santafe.edu%2Fresearch%2Fpublications&sa=Search).  Complexity theory tells us, for example, that there are some phenomena that, at least for the present, cannot be captured by a single paradigm.  This is consistent with the conclusion I reached, as set out above, in a more intuitive fashion.

 

A Conceptual Framework

Since the analysis that follows is both lengthy and complex, I will first provide a brief conceptual framework to make that material easier to digest.

 

Humans share a common biology, and an evolutionary history.   Our capacity to perceive evolved to make it more likely that we would survive and propagate in our physical and social environment at the time we evolved.

 

Each human being is born into a social group.  Yeats’ “social mask”, as described above, memorably captures what happens next.  We each need a set of beliefs to connect us to our group, and to help us make sense of life’s day to day issues as well as the “terrible questions”: where did we come from, why do humans suffer, what happens after death.  Our primary social group, whether African Pygmy, Canadian Mormon, or New York Atheist Jew, provides these basic concepts for us.  I refer to these below as “foundational beliefs” and as the “premises”, in the logical sense of that term, upon which the rest of the group’s belief structure and social order depends.

 

If a society is closed, its members’ worldview may simply develop within a single framework.  In a pluralistic society such as the one in which most North Americans live, foundational belief systems must defend themselves against the conflicting world views with which we become familiar.  Taboos and the suppression of information that challenges foundational beliefs are used in this regard.

 

In the environment in which our ancestors evolved, the well-being of the dominant, small social group and an individual’s security within it were far more important to survival and reproductive opportunities than is now generally the case.  Therefore, both in the evolutionary environment and now, when confronted with information that might threaten one of the group’s foundational values and hence threaten the group, humans tend to misperceive that information so that it is not threatening.  The same is true with regard to information that might threaten our place within the group.  Thus, reality is denied.  The study of the following individual and group phenomena helps to explain this:

 

  • “sacred premises”: The most important and hence unquestioned (or even invisible) values on which a social group is based.  For example, God exists; He communicates with prophets; when the prophets speak God’s will, we must obey or forfeit many wonderful (and real) blessings.  And by the way, the men are in charge and the women will obey (or at least make the men think that the men are in charge and the women are obeying).

 

  •   “social context”:   The framework of society within which values, sacred premises, certain forms of power and reasoning, etc. all   “make sense”.

 

  •   “taboos”:   Cultural rules, many of which seem to be designed to prevent the rules that hold society together from being broken.   Taboos protect sacred premises.   For example, sexual intercourse is only permitted within marriage, unless the prophet says otherwise.

 

  • “value structures”: Ideas or values like democracy, the importance of the nuclear family, the importance of obeying religious or other forms of authority, etc.  Sacred premises are the most important value structures within a group.  These are what make possible Bourdieu’s misrecognition, and they provide order to society, thus restraining chaos.  It is the fear of chaos, which has mightily afflicted humankind throughout most of its history, that gives these structures such strength.

 

  • “bounded rationality”: While we like to tell ourselves that we are rational, our behaviour and hence our reasoning are affected by constraints of various types.  For example, what is “rational” to someone who is sitting in the safety of her home may seem irrational to another person whose child is likely to die if she does not take some form of risky action.  The study of “bounded rationality” is hence an attempt to make the classical models of human rationality more realistic, while continuing to use models that can be systematized and hence rigorously studied.  “Heuristics” are used for decision making purposes in bounded rationality theory.

 

  • “heuristics”: Much of the behavior we think of as “rational” is in fact governed by heuristics - simple, efficient rules of thumb that we use automatically whether or not they make sense in a particular case.  These explain how people make decisions and solve problems with limited time and information. Heuristics work well under most circumstances, but in certain cases lead to systematic cognitive biases of the type described below under the heading “Cognitive Biases and Cognitive Dissonance”.  Heuristics are linked to emotion and instinct. Once a heuristic switch is tripped, we act in accordance with it, and if we notice what we have done or it is pointed out to us, we tend to explain our actions to ourselves on a rational basis that may have nothing to do with our real “reason” for acting.

 

  • “ecological rationality”: A form of reasoning that is often used within a given limited information context and is often dominated by emotion. This is another way of speaking about behaviour that is boundedly rational. For example, consenting to allow your wife to have sex with God’s prophet makes sense for people who believe that a certain type of heaven is real and that obedience to God’s prophet will guarantee a place there for them, and perhaps their families as well.

 

•   "deliberative rationality": A form of reasoning that is more like what is used in the scientific method and is as divorced from emotion as possible.

 

•   "the collective mind": Think of Adam Smith's "invisible hand" or the forces by which ant colonies seem to "self organize".

 

•   "misrecognition": As humans are conditioned to accept their place in the group, there is a process of misunderstanding or "misrecognition" of what is really going on. For example, working class children accept that it is legitimate - right - that their middle-class peers have more success in the educational system than is justified based on objective performance. A key part of this process is the transformation of people's cultural or economic positions into "symbolic capital" that allows them to do more or less within the group and so to benefit more or less from group resources. So, the king lives in a castle while the peasants live in hovels. The prophet gives the orders and the people obey, and if the prophet says that God has told him to start having sex with your wife, both you and she submit to God's will. Members of groups thus tend to be unaware of who benefits from various actions taken by group members, and to be unable to question the justification for benefits of which they are aware. This lack of awareness and uncritical acceptance is critical to maintaining any social order. Greater awareness of how behaviour is linked to personal interest, or greater ability to question the justification of benefits, tends to cause relationships to change. This change generally works against the traditional powers within the group.

 

•   "inference systems": Aspects of our perceptive behavior that seem to have evolved for certain functional purposes. For example, we have many inference systems that seem designed to help us deal with other conscious agents. Ideas related to religion and god tend to fit patterns that can be predicted by an understanding of how our inference systems are formatted. We are more likely to misperceive when under the influence of our emotions. Our emotions tend to flare when our group's foundational values or our place in the group are threatened. However, we tend to be rational when examining the foundational values of other groups, and so can spot their irrationality. The obvious irrationality of other groups, coupled with our inability to perceive our own irrationality, strengthens our group.

 

•   "cognitive dissonance": Humans seem to need certainty, and so when confronted with two or more pieces of conflicting information, they tend to eliminate the perceived uncertainty by suppressing or modifying their perceptions to bring them into harmony. For example, if I believe that God inspired Joseph Smith to do many great things that are of critical importance to me and my family, and I learn that many people say he was a deceiver and so was untrustworthy, I will tend to find reasons to suppress or dismiss the information that suggests he was a deceiver.

 

•   "cognitive biases": These are often the result of cognitive dissonance, and are behavior traits that in certain circumstances result from misperceived information. For example, people who make statements to a certain effect (such as, "I know that Joseph Smith was God's prophet") tend to strengthen (or move toward) their belief that whatever they have said is true.

 

•   "attachment": Attachment theory applies evolutionary principles to the study of child-parent relations, and has been extended by some researchers to adult romantic relationships and by others to the relationship between adults and religious groups or ideologies. It posits that experience with caregivers shapes subsequent behaviour and social relations, and that the relationship of the individual to beliefs in god (as a caregiver of a particular kind, for example) can profoundly influence many kinds of individual and social behaviours. In this regard, it can be regarded as akin to the authority bias, or is perhaps a way to explain the origins of the authority bias.

 

•   "interpretative ambiguity": Many phenomena are susceptible to interpretation in more than one way, and experiments have been designed that highlight how the mind flickers between possible interpretations until it is rewarded. Once we are rewarded by an interpretation, it becomes the only one we can see.

 

•   "spiritual experiences": Particularly powerful emotional experiences, often characterized as "spiritual experiences", are human universals. These are used in most human groups to support their foundational beliefs.

 

•   "the emotion of elevation": The emotion we feel when we see someone do something good. It motivates us to also do good, and provides social glue.

 

•   "information suppression": This is used to slow down the circulation of information within social groups that would otherwise bring sacred premises and other values into question.

 

•   "misdirection": Misdirection is used to perform magic: the magician exploits tendencies in human perception to make us look at his left hand while his right hand (or foot, or assistant, etc.) does something that we do not notice, giving the impression that magic has occurred. Religious groups and others do something similar by defining for us the things to which we should pay attention, and structuring our lives so that it is difficult to attend to other aspects of life.

 

•   "inherited beliefs": We all start somewhere in terms of what we think we "know". This starting place is more scientific in some cases and less in others, but in all cases this position is handed to us by our social group. We "inherit" this knowledge, regardless of how inaccurate it may be. And we are taught to believe it. These beliefs are hard to overcome because of things like the confirmation bias, authority bias, etc. as noted above. And the insiders of one group tend to be unable to identify their own blind spots while having no trouble pointing out those of others.

 

•   "cult control techniques": Mind control techniques are not necessarily good or bad, but can be used for good (to help people break addictive patterns) or bad (for cult control) purposes. Mind control techniques can be understood to operate on four levels: behaviour, information, thought, and emotion. Religious and other organizations can be assessed in terms of how they use these various aspects of control to increase individual autonomy, or to control the individual.
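To make the idea of a heuristic more concrete before moving on, here is a minimal code sketch (in Python) of one "fast and frugal" rule of thumb, loosely modeled on the "take the best" heuristic studied by Gigerenzer and colleagues. The cue names, their ordering and the example data below are all hypothetical, invented purely for illustration; this is a sketch of the general idea, not an implementation drawn from any of the sources cited in this essay.

from typing import Dict, List

# Cues ordered from most to least predictive, as the decision maker believes
# them to be. These names and their ordering are hypothetical examples.
CUES: List[str] = ["is_capital", "has_university", "has_airport"]

def take_the_best(option_a: Dict[str, bool], option_b: Dict[str, bool]) -> str:
    """Choose between two options by checking cues one at a time, in order,
    and deciding on the first cue that discriminates between them. All later
    cues are ignored - that is what makes the rule 'frugal'."""
    for cue in CUES:
        a, b = option_a.get(cue, False), option_b.get(cue, False)
        if a and not b:
            return "A"
        if b and not a:
            return "B"
    return "no decision"  # guess, or fall back to some other rule

# Usage: which of two (hypothetical) cities is larger?
city_a = {"is_capital": False, "has_university": True, "has_airport": True}
city_b = {"is_capital": False, "has_university": False, "has_airport": True}
print(take_the_best(city_a, city_b))  # prints "A": decided on the second cue alone

The point of the sketch is simply that such a rule ignores most of the available information and still usually works, which is what makes it efficient, and also what makes it a source of systematic bias when the ignored information happens to matter.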

 

After reviewing each of these topics, we will return to the big picture and attempt to synthesize what we have learned.

 

The Power of Social Context and "Premises" to Shape Human Brains

Various neurologists have recently pointed out (see, for example, Quartz and Sejnowski, "Liars, Lovers, and Heroes: What the New Brain Science Reveals About How We Become Who We Are") that our brains are formatted to a significant extent by the physical and social environment in which we exist during our developmental years (up to early adulthood). For example, cats raised from birth in a room without vertical lines walked into table legs when released into the "real" world, and their patterns of brain activity were consistent with those legs not being visible. Their brains, conditioned by their environment, could not perceive the vertical plane. Similar data of a less formal nature has been collected regarding pygmies led out of the forest for the first time who were incapable of grasping the significance of animals grazing on a plain hundreds of yards away. Such distances were not part of their world. Their brains had no context within which to make sense out of their perceptions. They thought they were seeing miniature animals.

 

The most authoritative storytellers within a particular social group provide much of the material around which human brains are formatted. The story about the manner in which God did (or did not) confer supernatural powers on Joseph Smith in order to bring the Book of Mormon into being is a good example of a powerful story of this type. In the contemporary Mormon community, this story ironically coexists with many others that are mostly scientific in orientation. The Biblical texts that within the conservative Christian community are confidently interpreted to define our relationship to God, and incidentally, to define sexual roles and other basic aspects of our behavior that we take for granted, are similarly powerful stories.

 

This brings us to the concept of "premises". A "premise" in logic is a foundational statement that, if true, supports a conclusion. For example, we might say: if all men are humans (premise no. 1), and all humans are mammals (premise no. 2), then all men are mammals (conclusion). The rules of logic dictate that IF both premises are true, THEN the conclusion must be true.
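For readers who like to see the schematic form, the same syllogism can be written out as a standard piece of elementary logic (this is just the textbook rendering; nothing here is specific to this essay):

\forall x\,\bigl(\mathrm{Man}(x) \rightarrow \mathrm{Human}(x)\bigr) \qquad \text{(premise 1)}
\forall x\,\bigl(\mathrm{Human}(x) \rightarrow \mathrm{Mammal}(x)\bigr) \qquad \text{(premise 2)}
\therefore\ \forall x\,\bigl(\mathrm{Man}(x) \rightarrow \mathrm{Mammal}(x)\bigr) \qquad \text{(conclusion)}

Note that the validity of the inference depends only on its form; whether the conclusion is actually true depends entirely on whether the premises are true, which is precisely why a group's unexamined premises matter so much.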

 

Premises in the sense I am using the term are a group's "givens"; its wallpaper; and most importantly, the ideas that, if accepted, render the rest of its belief system "sensible" and part of what Joseph Campbell would call a mythology - a set of beliefs used to answer life's most important questions.

 

For example, IF God did confer on Joseph Smith the power to “translate” the Book of Mormon, then it is reasonable to believe that the many stories Smith told of angelic visitations and other special authorities he received from God were also true, including his stories about being told by God both to have sex with multiple women (some of whom were married to other men) and to deceive both the public and members of his church about what he was doing. Again, there are many mainstream Christian analogues that are easy to find, and the same can be said with regard to each clearly defined belief system that has attracted a significant number of adherents.

 

So, premises relate to what a group deems "sacred" - that is, beliefs so important that they should not be questioned, in order to preserve the group's social order. These ideas are therefore protected by all of the taboos the group can muster. Rituals are important in that they help both to ingrain the sense of the "sacred" just noted and to ingrain obedience, from a Skinnerian or Pavlovian point of view. They do this largely by causing key elements of the stories that engrain the sacred premises to be reenacted and hence remembered. The Mormon sacrament and temple rituals are good examples of this. Rituals also cause the behaviors mandated by the sacred premises to be repeated, and hence engrained. In the Mormon community, any ritualized behavior that entails obedience to the current dictates of Mormon leaders would fit this bill. For example, holding family home evening on Monday nights to the exclusion of all other competing activities; Mormon meeting attendance on Sunday and at various times during the week; many daily personal, spousal and family prayers; daily personal, spousal and family scripture study; etc.

 

In the contemporary western world, democracy, capitalism, free trade, "equal" (in some impossible-to-define sense) human rights, and a few other values are sacred in the same sense that foundational religious beliefs are sacred. Belief systems as diverse as those of primitive peoples, Catholicism, Scientology, Marxist Communism, representative democracy, etc. can be boiled down to a few "premises" which, if accepted, render most of the rest of the belief system "logical".

 

The nature of premises is nicely illustrated by an account the biologist Paul Ehrlich gave of a visit he made to the Arctic to study the Inuit.

 

Some may consider today's Western religions to be an evolutionary advance from hunter-gatherer religions, but as with language, there is little basis for placing the religions of modern people on a scale from primitive to advanced. People tend to think of their own religion as the one true religion, of course, and adherents of the religions that have become "organized" as societies have become more complex tend to look down on those of the hunter-gatherers. This was brought home to me during my Inuit summer. Father Rio was a Belgian Catholic Oblate missionary stationed at Coral Harbour in the Canadian Arctic, part of "the largest diocese on Earth." He was also battling with a native Anglican priest for the souls of the local Inuit. Father Rio made great raisin wine, and I spent many a pleasant evening helping him consume it. It was 1952, and in his view the Inuit, with their "simple" religion, were "just like children." The "simple" Inuit religion was actually a form of animism based on a complex of spirits, ghosts, human and animal souls, and several major gods, employing shamans, numerous taboos, magic words, and the like. It was anything but simple. ...

 

Interestingly, Father Rio's contempt for the religion of the people whom he sought to serve and convert was reciprocated. Once, when I was taking a language lesson with Tommy Bruce and several other Inuit, the talk turned to the feud between the Anglican priest and Father Rio, which had become intense. Why, Tommy wanted to know, if their religion was based on loving one's neighbors, did the priests shoot at each other's dogs with shotguns? Then he said, "Do you know what Father Rio believes?" and regaled me with the story of the virgin birth. By the time he had finished, all the Inuit were laughing so hard that tears were running down their cheeks. Tommy was an Inuit of the old breed, a former shaman who had spent his life hating another Inuit because of a wife-trading deal gone bad in the distant past. (See Paul Ehrlich, "Human Natures", pages 219-220)

 

Such is the nature of our most important premises - they are pounded so deep into our cultural and psychological background that they are immune from our own critique, while seeming bizarre to anyone who considers them with the rational detachment that comes naturally to a cultural outsider.

 

So, the function of the stories told by a group's storytellers is, for the most part, to so thoroughly engrain the basic premises of a belief system that they pass into the realm of the "sacred", and so beyond questioning, in the fashion just described. The function of ritual is to remind us of these things, and to encourage behavior that is consistent with the reality - the premises - posited by a group's foundational myths. And if this is done for long enough, the brain formatting noted above occurs, making it extraordinarily difficult to shake what are, from an outsider's point of view, irrational beliefs.

Different Conceptions of Freedom - An Illustration of Sacred Premises

There are few ideas more central to human social life than "freedom". In the democratic West, we are particularly proud of and grateful for our freedom, and it is safeguarded in many ways by our social structure. It is one of our sacred premises. Hence, it is instructive to see how this foundational concept has been manipulated by various human groups, and how some of the most twisted (from a Western democratic point of view) ideas related to freedom have become foundational to Mormonism. There are few of Mormonism's foundational ideas about which Mormon denial is deeper than this one. Mormons consider themselves to be free as birds.

 

Once the knotted ideological thread related to the Mormon conception of freedom is laid straight, it is interesting to note that the ideas regarding freedom currently used by Mormon leaders were in vogue politically at about the time Mormonism came into being. They have since been abandoned by Western democracies. I am indebted to Isaiah Berlin (see "Freedom and Its Betrayal", for example) for most of what follows.

 

Berlin starts by asking the question "Why should anyone obey anyone else?" He then provides a classic definition of freedom or liberty that involves balancing my right to do things against how my actions may impact the similar rights of other individuals. Underlying this analysis is the notion that all individuals have the same rights. Oddly enough, while I was a faithful Mormon that is how I thought of freedom, and I saw nothing in the Mormon system that was inconsistent with it. Such is the invisible nature of sacred premises. However, even a casual inspection of the Mormon system of governance discloses that both the thinkers Berlin profiles in his book "Freedom and Its Betrayal" and Mormon leaders believe that there are certain individuals (the "leaders") who, by virtue of their wisdom, ability or authority, have more rights than the rest of humanity, and as a result a double moral standard exists that justifies their doing things to control the "lesser" beings by whom they are surrounded. In particular, those who are not leaders are intended to surrender their freedom. That is the definition of Mormon freedom: the individual is free to choose to obey or not to obey. Those who obey are pleasing to God and will live with Him and their families after death in a state of unimaginable joy, while those who do not obey will be cast out and will live in relative misery. So in reality, we are only free to surrender our freedom and obey.

 

As this concept came into focus for me a couple of years ago, I was staggered. How, I wondered, could something so obvious have escaped my notice into middle age? After all, the idea of freedom is central to Mormon theology. Agency, or as Mormons call it, "free agency", is central to the Mormon system of thought. Mormons believe that we all lived with God in a life before this life called the "Pre-existence", and that there we were given the choice to follow God, which meant coming to this earth under the conditions in which we find ourselves, or to follow Satan, who had his own idea as to how things would work. The fact that we are here is prima facie evidence that we chose to follow God instead of Satan, and in this life we still have our agency to decide whether to obey, or not, and how to live our lives. Or so I thought.

 

In the summary that follows, I do not attempt to deal with the many interesting issues related to the concept of freedom that involve determining when the rights of individuals should take precedence over those of the group and vice versa; how to balance the need for individual autonomy with the need for rules; and a host of other issues. The problems with the Mormon conception of freedom are more basic than this. They relate more to who should have the right to address the type of question just indicated - the group as a whole, or an elite that for one reason or another has ended up in control of the group. That is, who should be trusted to exercise ultimate power: each member of the group itself, and hence the wisdom of the group as a whole, or an elite that derives its power from something other than appointment by the group?

 

Another way to think of this issue is to ask: what is the ultimate source of power within the group? Is it the group itself, acting through each individual member, or is it some source of power that is alleged to be outside the group, such as God, or the force that the elite can bring to bear, etc.? These questions underlie the analysis that follows.

 

Submission to Authority

Berlin summarizes part of Fichte's thought as follows:

 

As for freedom, individual freedom and individual conscience, and right and wrong, whether discovered or invented, what has become of those now? What of that individual freedom of which we spoke earlier, which the British and French writers defended, the freedom of each man to be allowed, within certain limits at least, to live as he likes, to waste his time as he likes, to go to the bad in his own way, to do that which he wants simply because freedom as such is a sacred value? Individual freedom, which in Kant has a sacred value, has for Fichte become a choice made by something super-personal [Fichte believed that the nation was a force that should dominate the individual]. It chooses me, I do not choose it, and acquiescence is a privilege, a duty, a self-lifting, a kind of self-transcendent rising to a higher level. Freedom, and morality generally, is self-submission to the super-self - the dynamic cosmos. We are back with the view that freedom is submission. (Freedom and Its Betrayal, page 71)

 

Compare Fichte's ideas to those of Mormon Apostle Neal Maxwell (see "Swallowed Up in the Will of the Father," Ensign, Nov. 1995, at page 22):

 

Seventy years ago, Lord Moulton coined a perceptive phrase, "obedience to the unenforceable," describing "the obedience of a man to that which he cannot be forced to obey" ("Law And Manners," Atlantic Monthly, July 1924, p. 1). God's blessings, including those associated with consecration, come by unforced obedience to the laws upon which they are predicated (see D&C 130:20-21). Thus our deepest desires determine our degree of "obedience to the unenforceable." God seeks to have us become more consecrated by giving everything. Then, when we come home to Him, He will generously give us "all that [He] hath" (D&C 84:38).

 

In conclusion, the submission of one's will is really the only uniquely personal thing we have to place on God's altar. The many other things we "give," brothers and sisters, are actually the things He has already given or loaned to us. However, when you and I finally submit ourselves, by letting our individual wills be swallowed up in God's will, then we are really giving something to Him! It is the only possession which is truly ours to give!

 

Consecration thus constitutes the only unconditional surrender which is also a total victory!

 

May we deeply desire that victory, I pray in the name of Jesus Christ, amen.

 

The Voluntary Assumption of Chains

In his summary of the work of the French thinker Rousseau, Berlin says:

 

In short, the problem goes somewhat as follows: You want to give people unlimited liberty because otherwise they cease to be men; and yet at the same time you want them to live according to the rules. If they can be made to love the rules, then they will want the rules.... If your problem is how a man shall be at once free and yet in chains, you say: "What if the chains are not imposed upon him? What if the chains are not something with which he is bound as by some external force? What if the chains are something he chooses himself because such a choice is an expression of his nature, something he generates from within him as an inner ideal? If this is what he above all wants in the world, then the chains are no longer chains." A man who is self-chained is not a prisoner. ... if the chains are of your own making, if the chains are simply the rules which you forge, with your own inner reason, or because of the grace which pours in while you lead the simple life, or because of the voice of conscience or the voice of God or the voice of nature, which are all referred to by Rousseau as if they were almost the same thing; if the chains are simply rules the very obedience to which is the most free, the strongest, most spontaneous expression of your own inner nature, then the chains no longer bind you - since self-control is not control. Self-control is freedom. In this way, Rousseau gradually progresses toward the peculiar idea that what is wanted is men who want to be connected with each other in the way in which the State forcibly connects them. (Freedom and Its Betrayal, pages 43-44)

 

Compare Rousseau's statement to that of Mormon Apostle Boyd K. Packer (see "Agency and Control," Ensign, May 1983, at page 66), which echoes an often-repeated Mormon refrain.

 

Several weeks ago I had in my office a four-star general and his wife; they were very impressive people. They admire the Church because of the conduct of our youth. The general's wife mentioned her children, of whom she is justly proud. But she expressed a deep concern. "Tell me," she said, "how you are able to control your youth and build such character as we have seen in your young men?"

 

I was interested in her use of the word 'control'. The answer, I told them, centered in the doctrines of the gospel. They were interested; so I spoke briefly of the doctrine of agency. I said we develop control by teaching freedom. Perhaps at first they thought we started at the wrong end of the subject. A four-star general is nothing if not a disciplinarian. But when one understands the gospel, it becomes very clear that the best control is self-control.

 

It may seem unusual at first to foster self-control by centering on freedom of choice, but it is a very sound doctrinal approach.

 

While either subject may be taught separately, and though they may appear at first to be opposites, they are in fact parts of the same subject.

 

Some who do not understand the doctrinal part do not readily see the relationship between obedience and agency. And they miss one vital connection and see obedience only as restraint. They then resist the very thing that will give them true freedom. There is no true freedom without responsibility, and there is no enduring freedom without a knowledge of the truth. The Lord said, "If ye continue in my word, then are ye my disciples indeed; and ye shall know the truth, and the truth shall make you free." (John 8:31-32.)

 

The general quickly understood a truth that is missed even by some in the Church. Latter-day Saints are not obedient because they are compelled to be obedient. They are obedient because they know certain spiritual truths and have decided, as an expression of their own individual agency, to obey the commandments of God.

 

We are the sons and daughters of God, willing followers, disciples of the Lord Jesus Christ, and "under this head are [we] made free." (Mosiah 5:8.)

 

Those who talk of blind obedience may appear to know many things, but they do not understand the doctrines of the gospel. There is an obedience that comes from a knowledge of the truth that transcends any external form of control. We are not obedient because we are blind, we are obedient because we can see. The best control, I repeat, is self-control.

 

Mormon Apostle James Faust summed things up as follows (see “The Abundant Life”, The Ensign, November, 1985, p. 7):

 

President Gordon B. Hinckley reminded us,

 

As a Church, we encourage gospel scholarship and the search to understand all truth. Fundamental to our theology is belief in individual freedom of inquiry, thought, and expression. Constructive discussion is a privilege of every Latter-day Saint. (Ensign, Sept. 1985, p. 5.)

 

The Saviour said, "I am come that they might have life, and that they might have it more abundantly." (John 10:10.) How is the abundant life to be obtained? The abundant life involves an endless search for knowledge, light, and truth. President Hugh B. Brown said:

 

God desires that we learn and continue to learn, but this involves some unlearning. As Uncle Zeke said: 'It ain't my ignorance that done me up but what I know'd that wasn't so.' The ultimate evil is the closing of the mind or steeling it against truth, resulting in the hardening of intellectual arteries. (Baccalaureate address, Utah State University, Logan, Utah, 4 June 1965.)

 

No stone wall separates the members of the Church from all of the seductions of the world. Members of the Church, like everyone else, are being surfeited with deceptions, challenges, and temptations. However, to those of enduring faith, judgment, and discernment, there is an invisible wall which they choose never to breach. Those on the safe side of this invisible wall are filled with humility, not servitude. They willingly accept the supremacy of God and rely upon the scriptures and counsel of His servants, the leaders of the Church. These leaders of the Church are men with human frailties, and are imperfect in their wisdom and judgment. Perfection in men is not found on the earth. But almost without exception these leaders sincerely, humbly, and prayerfully render great and dedicated Christian service to the best of their ability. More important, they hold a divine warrant and commission through which great and eternal blessings come to those who sustain and follow them. They are God's servants.

 

I believe that few things in life deserve one's complete confidence. I testify that the Church is worthy of our full trust. There is no inconsistency between truth and faith. I know that everyone who sincerely and righteously seeks to know this can have it spiritually confirmed. May we open up our minds, hearts, and spirits to the divine source of truth. May we reach above ourselves and beyond our mundane concerns and become heirs to the knowledge of all truth and to the abundant life promised by our Lord and Savior, Jesus Christ. I pray that this may be so, in His holy name, amen.

 

That is, according to Faust we should seek the truth ourselves. This is a promising start. But then he tells us to remember that the truth is whatever the Mormon Church's leaders say it is. We should trust them, and obey them. We should not question them. We should use, in essence, blind faith instead of the best of our rational abilities.

 

Joseph Fielding McConkie, a BYU professor, tells this story about his father, Mormon Apostle Bruce McConkie:

 

It was an unusual Sunday that found Dad without a conference assignment and hence at home. It was on one of those Sundays when I was in my early teens that I took occasion to see how the doctrine of agency worked. When it came time to leave for church, I announced that I had decided to exercise my agency and not attend church that day. Dad assured me that I had agency, which in our family meant that I could go to church willingly or I could go unwillingly. The choice, he said, was mine. Then he added, “Now get your coat on. You don’t want to be late.”

 

Some years passed before I understood the principle involved. When I was baptized, I chose to be an agent for Christ. As His agent, that is, as one committed to represent Him, I had already made the decision of whether I would attend my meetings or not. The covenant I had made assumed the responsibility to attend meetings and fill assignments. Properly understood, agency is the right to act, the right to do our duty. It is not and cannot be the source of excuse for refusing to do the same. (The Bruce R. McConkie Story: Reflections of a Son)

 

That is, the fact that he as an eight-year-old child had been baptized meant that he had surrendered his agency with respect to countless decisions that could not then have been contemplated by his childish mind. And he came to accept that this was the case, and apparently still accepts it as an acceptable exercise of choice. In my view, this is just another way to justify the absence of real choice.

 

Deception by the Leaders is Sometimes Necessary

Next, Berlin summarizes Saint-Simon's ideas related to the elite whom he believed should govern, as follows:

 

About the elite he sounds a very modern note, when he says that they must practice two moralities. What was so wonderful about the priests of Egypt, for example, who were a very early and original elite, was that they believed one thing and fed the population with another. That is good, that is exactly how things should be conducted, because the people cannot be expected to face the truth at once, but must be gradually educated. Consequently we must have a small body of industrialists and bankers and artists who gradually wean mankind, who gradually condition them to take their proper part in the industrial order. That is a familiar kind of neo-feudalism. The great phrase, indeed, on which Communism is built - "From everyone according to his capacity ..." - comes from Saint-Simon and the Saint-Simonians. Again, when Stalin said that the artists - novelists, for example - are "engineers of human souls", that their business is applied, not pure, that the end of art is not itself, but the moulding and the conditioning of human beings - that is a Saint-Simonian idea. (Freedom and Its Betrayal, page 125)

 

Compare this to the following quote from Boyd Packer:

 

Church history can be so interesting and so inspiring as to be a powerful tool indeed for building faith. If not properly written or properly taught, it may be a faith destroyer...

 

There is a temptation for the writer or the teacher of Church history to want to tell everything, whether it is worthy or faith promoting or not. Some things that are true are not very useful...

 

The writer or teacher who has an exaggerated loyalty to the theory that everything must be told is laying a foundation for his own judgment... The Lord made it clear that some things are to be taught selectively and some things are to be given only to those who are worthy...

 

That historian or scholar who delights in pointing out the weaknesses and frailties of present or past leaders destroys faith. A destroyer of faith - particularly one within the Church, and more particularly one who is employed specifically to build faith - places himself in great spiritual jeopardy. He is serving the wrong master, and unless he repents, he will not be among the faithful in the eternities... Do not spread disease germs! (Boyd K. Packer, 1981, BYU Studies, Vol. 21, No. 3, pp. 259-271)

 

Or this from Mormon Apostle Dallin Oaks:

 

My duty as a member of the Council of the Twelve is to protect what is most unique about the LDS church, namely the authority of priesthood, testimony regarding the restoration of the gospel, and the divine mission of the Savior. Everything may be sacrificed in order to maintain the integrity of those essential facts. Thus, if Mormon Enigma [a book respecting which his opinion was sought] reveals information that is detrimental to the reputation of Joseph Smith, then it is necessary to try to limit its influence and that of its authors. (Inside the Mind of Joseph Smith: Psychobiography and the Book of Mormon, page xliii, footnote 28)

 

As already noted, the root of these ideas is found with Plato and his “philosopher kings”. They were the wise few who Plato felt were justified in deceiving the masses when it was necessary to do so, since the masses were incapable of understanding what was in their best interest. This concept is closely related to Nietzsche’s notion of the “pious” or “holy” lie.   He said:

 

That the lie is permitted as a means to pious ends is part of the theory of every priesthood - to what extent it is part of their practice is the object of this enquiry. (The Will to Power, p. 89)

 

Nietzsche condemned the pious lie, as do I, as did Joseph Smith. In the so-called “Plan of Salvation” (the Mormon idea of how we lived prior to this life with God; were sent to this earth to be tested; and if we pass the test, will return to live with God and will become like Him), Satan beautifully articulated the philosopher king and pious lie approach, and was vilified for it.

 

It is clear, in my view, that Joseph Smith behaved in classic philosopher king fashion, and that the Mormon “faithful history” policy (the policy of only teaching versions of Mormon history, particularly with respect to Mormonism’s foundational events, that will encourage the members to be more obedient to current Mormon leaders) discloses a group of modern philosopher kings who feel justified in telling pious lies.

 

Fear is Necessary to Restrain Chaos

Berlin summarizes Maistre as follows:

 

Maistre stresses tradition, the past, the unconscious, dark forces, not the amiable imaginary attributes of the folk soul, as did its enthusiastic champions - the "German romantics" - or the champions of the simple life (which he too always praised). On the contrary, he stresses the stability, the permanence and the impregnability of the authority that belongs to the dark mass of half-conscious memories and traditions and loyalties, and the power of institutions in exacting obedience, especially in regard to the supernatural. He lays great emphasis on the fact that absolute rule succeeds only when it is terrifying, and he fears and detests science, precisely because it pours too much light, and so dissolves the mystery, the darkness, which alone resists skeptical enquiry. (Freedom and Its Betrayal, page 153)

 

As noted above, a large part of the Mormon system is built on a kind of fear that is for the most part not consciously acknowledged. This is a fear that we cannot cope with the wicked world outside of Mormonism; fear that if we are not obedient we will not be able to live with our families after death; fear that we will lose blessings if we are not obedient to various Mormon rules; fear that God will punish us for the same reason; etc. It is mostly experienced by Mormons as a desire to have the things that will avert what they fear - a desire for God's approval; a desire to live in the Celestial Kingdom with their families; etc.

 

The “Higher” or “Real” Self Needs Help to Come Forth

Berlin summarizes Rousseau on this point as follows:

 

I know what any man's true self seeks, for it must seek what my own self seeks, whenever I know that what I am now is my own true self, and not my other, illusory, self. It is this notion of the two selves which really does the work in Rousseau's thought. When I stop a man from pursuing evil ends, even when I put him in jail in order to prevent him from causing damage to other good men, even if I execute him as an abandoned criminal, I do this not for utilitarian reasons, in order to give happiness to others; not even for retributive reasons, in order to punish him for the evil that he does. I do it because that is what his own inner, better, more real self would have done if only he had allowed it to speak. I set myself up as the authority not merely over my actions, but over his. This is what is meant by Rousseau's famous phrase about the right of society to force men to be free.

 

To force a man to be free is to force him to behave in a rational manner. Any man is free who gets what he wants; what he truly wants is a rational end. If he does not want a rational end, he does not truly want; if he does not want a rational end, what he wants is not true freedom but false freedom. I force him to do certain things which will make him happy. He will be grateful to me for it if he ever discovers what his own true self is: that is the heart of this famous doctrine, and there is not a dictator in the West who in the years after Rousseau did not use this monstrous paradox in order to justify his behavior. ... This is Rousseau's central doctrine, and it is a doctrine which leads to genuine servitude, and by this route, from this deification of the notion of absolute liberty, we gradually reach the notion of absolute despotism. There is no reason why human beings should be offered choices, alternatives, when only one alternative is the right alternative. Certainly they must choose, because if they do not choose then they are not spontaneous, they are not free, they are not human beings; but if they do not choose the right alternative, if they choose the wrong alternative, it is because their true self is not at work. They do not know what their true self is, whereas I, who am wise - I know this. (Freedom and Its Betrayal, pages 46, 47)

 

Berlin says that Rousseau's worst perversion is the idea of the "real" or "true" or "higher" self. Once we accept that such a self exists, and that the "leaders" know what that self would do were it only in control of the individual in question, the high road to despotism is wide open. In the final essay of his illustrious career, Berlin develops this idea further (see The Power of Ideas, pages 17, 18), and in that regard indicates the following:

 

The notion of positive freedom has led, historically, to even more frightful perversions. Who orders my life? I do. I? Ignorant, confused, driven hither and thither by uncontrolled passions and drives - is that all there is to me? Is there not within me a higher, more rational, freer self, able to understand and dominate passions, ignorance and other defects, which I can attain to only by a process of education or understanding, a process which can be managed only by those who are wiser than myself, who make me aware of my true, "real", deepest self, of what I am at my best? This is a well known metaphysical view, according to which I can be truly free and self-controlled only if I am truly rational - a belief which goes back to Plato - and since I am not perhaps sufficiently rational myself, I must obey those who are indeed rational, and who therefore know what is best not only for themselves but also for me, and who can guide me along lines which will ultimately awaken my true rational self and put it in charge, where it truly belongs. I may feel hemmed in - indeed, crushed - by these authorities, but that is an illusion: when I have grown up and have attained to a fully mature, "real" self, I shall understand that I would have done for myself what has been done for me if I had been as wise, when I was in an inferior condition, as they are now.

 

In short, they are acting on my behalf, in the interest of my higher self, in controlling my lower self; so that true liberty for the lower self consists in total obedience to them, the wise, those who know the truth, the elite of sages; or perhaps my obedience must be to those who understand how human destiny is made - for if Marx is right, then it is the Party which alone grasps the demands of the rational goals of history, which must shape and guide me, whichever way my poor empirical self may wish to go; and the Party itself must be guided by its far-seeing leaders, and in the end by the greatest and wisest leader of all.

 

There is no despot in the world who cannot use this method of argument for the vilest oppression, in the name of an ideal self which he is seeking to bring to fruition by his own, perhaps somewhat brutal and prima facie morally odious, means (prima facie only for the lower empirical self). The "engineer of human souls", to use Stalin's phrase, knows best; he does what he does not simply to do his best for his nation, but in the name of the nation itself, in the name of what the nation would be doing itself if only it had attained to this level of historical understanding. That is the great perversion which the positive notion of liberty has been liable to: whether the tyranny issues from a Marxist leader, a king, a Fascist dictator, the masters of an authoritarian Church or class or State, it seeks for the imprisoned, "real" self within men, and "liberates" it, so that this self can attain to the level of those who give the orders.

 

This goes back to the naïve notion that there is only one true answer to every question: if I know the true answer and you do not, and you disagree with me, it is because you are ignorant; if you seek to disobey me, this can be so only because you are wrong, because the truth has not been revealed to you as it has been to me. This justifies some of the most frightful forms of oppression and enslavement in human history, and it is truly the most dangerous, and in our century in particular, the most violent, interpretation of the notion of positive liberty. (The Power of Ideas, pages 17-18)

 

Berlin indicates that this attitude leads to idol worship and a form of sacrifice.   I quote:

 

Someone once remarked that in the old days men and women were brought as sacrifices to a variety of gods; for these, the modern age has substituted new idols: isms. To cause pain, to kill, to torture are in general rightly condemned; but if these things are done not for my personal benefit but for an -ism - socialism, nationalism, Fascism, Communism, fanatically held religious belief, or progress, or the fulfillment of the laws of history - then they are in order. Most revolutionaries believe, covertly or overtly, that in order to create the ideal world eggs must be broken, or otherwise one cannot obtain the omelette. Eggs are certainly broken - never more violently or ubiquitously than in our times - but the omelette is far to seek, it recedes into the infinite distance. That is one of the corollaries of unbridled monism, as I call it - some call it fanaticism, but monism is at the root of every extremism. (The Power of Ideas, page 14)

 

There are innumerable quotes from Mormon leaders that could be marshalled to show the influence in Mormon thought and culture of the “higher self” idea.   In fact, much of what is above regarding deception, fear, submission etc. ties into the idea of the “higher” self.   But one more quote is useful to show how far this doctrine can be taken in the name of freedom in the hands of leaders who “know” what is best for those who follow them.

 

Blood Atonement is the doctrine that if a person has committed certain sins, salvation is only possible if that person is killed. Brigham Young, in reference to this doctrine, said:

 

I could refer you to plenty of instances where men have been righteously slain, in order to atone for their sins. ... This is loving our neighbor as ourselves; if he needs help, help him; and if he wants salvation and it is necessary to spill his blood on the earth in order that he may be saved, spill it. (Journal of Discourses, vol. 4, p. 220)

 

So, we are doing a person a favour when we kill him? This, in my view, is arguably worse than even the noxious doctrines of Hitler and Stalin that required the extermination of large numbers of people in order to establish the kind of society they felt was required for the progress of man. At least they thought, it appears, that the carnage for which they were responsible was an evil, albeit a necessary one. Brigham Young took this a large step further, in my view. He taught that the death of some people is an unmitigated good that would be welcomed even by those who are murdered, if they had the spiritual maturity, sensitivity, or whatever, to appreciate "reality". This is what their higher selves would wish, were that aspect of themselves in control. This, in my view, is the high water mark of evil that can be justified by Plato's philosopher king idea, or Rousseau's "higher" self.

 

When I recently mentioned this quote to an intelligent, relatively open-minded young returned missionary of my acquaintance, he laughed and said something like, "Oh that Brigham Young! He was such a nut case!" And that seemed to be the end of that as far as he was concerned. I note two things in this regard. First, Brigham Young was for decades, if not well over a hundred years, accepted as God's prophet and was obeyed, revered, etc. as such by the Mormon people. It is only in relatively recent times that many of his ideas have been rejected as belonging to the lunatic fringe. So, are we to simply laugh off the fact that a lunatic was the de facto dictator of the Mormon people for decades? This is no laughing matter for those whose lives were sacrificed in various ways to his vision. And if the light of later generations showed Young to be a lunatic, are faithful Mormons justified in assuming that something similar will not occur with regard to current, or future, Mormon prophets who are and will be obeyed and venerated in their time as was Young? I suggest not.

 

Freedom = Surrender of Freedom?

The analysis above makes it clear that the voluntary surrender of freedom is the modern Mormon definition of freedom, and is the objective of the Mormon enterprise. This is one of Mormonism's key and, amazingly, invisible premises. The vast majority of faithful Mormons would argue vigorously against this idea. The vast majority of non-Mormons who understand how Mormonism operates would simply nod in agreement, after perhaps noting the tracing of ideological roots contained in the analysis above.

 

I was stunned when this sacred Mormon premise came into view for me. The analysis that follows details various psychological and social mechanisms that are used to achieve that premise's implicit goals, as well as the goals related to other sacred Mormon premises.

 

Denial's Building Blocks

As noted above, denial in general is a psychological or social phenomenon, and as such it cannot be captured fully by any single form of analysis. I have put the concept of "logical premises" forward as the closest thing I am aware of to a basic approach that can be taken in this regard, while admitting that it does not fully get there. However, I think I can defend the position that this approach is more basic than the rest I am about to describe. If "logical premises" is the foundation, these are the building blocks. In some cases one combination of foundation and building blocks may be more explanatory than others. And it may be possible to find cases where it appears that the building blocks simply rest on the ground without a foundation.

 

In any event, here are a number of additional conceptual approaches to understanding the phenomenon of human denial of reality that I have found helpful.   For a useful summary that comes at this topic from a different perspective, see http://en.wikipedia.org/wiki/List_of_cognitive_biases.

 

Reason v. Emotion

The philosopher Max Payne has noted that:

 

...religion is a social construct, but this construct is merely the outer envelope for inner personal experience. Gallup polls show that over 40% of modern Western people will admit to some contact with a power greater than themselves. For most this is a fleeting "blick" which nevertheless may transform their lives, though many do not connect it with any formal religious organization. Seers, shamans, yogis and prophets have this "blick" more intensely. The outward personal and cultural expressions of this experience vary widely. There are many paths up the mountain, and it is a matter of metaphysical argument and faith [as to whether] all paths converge to a single summit. But the mountain is certainly there, and religion is about our stumbling contact with it. (See http://www.datadiwan.de/SciMedNet/library/reviewsN81+/N82Shermer_believe.htm)

 

The sense of stumbling over - or perhaps groping toward - something essential that defies our coherent perception nicely frames one of religion's central problems. Religious belief combines enduring allure with stubborn resistance to understanding. It seems rooted in a realm of mist and of inarticulate, but deeply significant, meaning.

 

Payne's statement above was made as part of a critical review of Michael Shermer's book "How We Believe: Science, Skepticism and the Search for God". His main criticism was that while purporting to use science to explain religious phenomena, Shermer failed in two senses. First, he did not use science. And second, he did not explain belief. This will be particularly stinging criticism for Shermer, who as the president of the Skeptics Society and publisher of Skeptic magazine is justifiably proud of his ability to provide rational analysis. While his failure to satisfy his critics is not surprising (I once read that writers feel about critics as fire hydrants do about dogs ...), the terrain between religious belief and emotion on the one hand and science and its rational processes on the other is notoriously tricky. Those who venture into this area are easy targets for critics.

 

We make up our minds as to what to do and how to do it partly on the basis of emotion, and partly on the basis of reason. Some of us are more emotional and less rational than others. Our dispositions in this, and in most other important respects of life, are shaped roughly 50% by our genetic heritage and 50% by the manner in which we have been conditioned by prior experience (see "The Blank Slate", by Steven Pinker).

 

However, it is well established that the emotional range of human experience often dominates the rational when the two go head to head (bad pun). This is thought by some scientists to be due to the fact that there are many more neural pathways leading from the brain's more primitive, emotional equipment (the hippocampus, amygdala, etc.), sometimes referred to as the brain's "reptilian core", to its more recently developed, rational equipment (the cerebral cortex, etc.) than the other way around. Hence, when the brain is subjected to stimuli that ignite its emotional structures, reason succumbs to emotion. (See "Fear Not" by Rudiger Vaas, in Scientific American Mind, vol. 14, no. 1, 2004, p. 69.) One way to understand this is to remember what happens when the human adrenalin system kicks into operation. That is not a time for rational thought. It is a time for quick (and usually conservative) action.

 

The tendency of the emotional to overcome the rational is particularly strong when we deal with phenomena that are not well understood, such as our religious beliefs. However, the feedback system from the rational to the emotional structures in the brain can calm us if we become confident that we "know" what is going on. Hence, the more difficult a phenomenon is to understand, the less likely it is that our rational faculties will have their calming effect on us. Think, for example, of the terror an eclipse of the Sun at one time caused. Now that we think we understand what is going on, the effect is quite different. Many other similar phenomena have gone a similar route. When our emotions are excited and we don't "know" why, we generally resort to what seems safest. This is part of our preservation instinct. Staying with the group and doing what it does is one of those things that is assumed to be safe.

 

The findings of science are in this case consistent with various sources of traditional wisdom.   Based on their study of mythology, for example, Joseph Campbell, C.G. Jung and others have emphasized the role that fear, and its close cousin desire, play in our decision making. Campbell noted the teaching central to Buddhist theory that fear and desire are the two primary forces in life that cause us trouble. And, desire is really for the most part an aspect of fear since when we want something, what motivates us largely is fear that our desire will not be satisfied. The idea that we fear some things and want others lies beneath much of what is summarized in this essay.   Those who influence our fears and desires (television advertisers, for example) exercise a measure of control over our behavior.

 

Most of the mechanisms described below enable emotional forces to suppress or submerge rational forces, thus producing what looks like a denial of reality to those who are not similarly affected by emotional forces, or who have a perspective broad enough that reason has calmed emotion to the point that it does not cloud perception.

 

Taboos

Sacred beliefs, as noted above, are protected by taboos, and again the role of the storytellers is of critical importance in creating the perception of reality that gives strength to a taboo. Taboos are designed to make us fear, and so to shut down the rational examination of our behavior, such as whether it makes sense to go on a mission instead of to university; or for a young couple to marry at ages 21 and 19 and immediately start having children; or for a woman to abandon her university education or career to become a full time mother; etc. Mormons believe that terrible things happen to a Mormon who stops obeying the rules set out by Mormon authority. This is the overriding Mormon taboo - disobedience to current Mormon authority. The bad things (classic taboo-related curses) include not being able to be with one's family after death, often going through divorce and estrangement from family members and close friends during this life, and in general, losing the "joy" that comes only from being a faithful Mormon. Similar beliefs are found in countless other communities, from the Jehovah's Witnesses to certain Muslim, Jewish, Christian and other groups.

 

Another common taboo relates to "dangerous" information. The way in which humans have used the information available to them over time seems to indicate that evolution often trades a certain amount of denial of reality for a reduction in our fear of chaos. That is, there seems to be a strong belief at both the conscious and subconscious levels that too much knowledge is dangerous; that if we know too much and hence have too many choices, society (or we) may crumble, and as a result we may lose control of our fragile existence. Humans in particular tend to stay away from information that could threaten the legitimacy of their social group, since the disintegration of the group threatens all with chaos.

 

As noted above, taboos and the fear they engender are reinforced through rituals of various kinds. Mormon testimony bearing; worthiness interviews; self inventory of sin during the sacrament ritual each week; self inventory of sin during personal prayer each day; the repetition of temple covenants through “proxy” temple attendance; etc. are all rituals that remind Mormons of various taboos, hence enhancing the power of the Mormon belief system in their lives.

 

Mormons are taught that the first big step toward breaking the “obedience” taboo is often to question Mormon authority. So, when a Mormon is confronted with evidence that questions the validity of an important belief or the validity of Mormon authority, her adrenalin system is likely to fire up. The stronger the evidence, the stronger this response is likely to be. In primitive societies, the breach of many such taboos meant being pushed out of the group, and so death. Our instincts still appear to be wired on this basis because of the lengthy period of time during which our ancestors lived this way. And it is well documented that a firing adrenalin system interferes with our ability to engage in certain types of reasoning. Rather, we tend toward conservative behaviour. Sources of perceived danger, in particular, are avoided instead of examined. And so any source of evidence that questions a fundamentally important belief tends to be avoided.

 

Powerful desires for money, prestige, sex etc. can also overcome reason.   One of my clients was on the verge of falling for a fraudulent financial scheme that offered him $20,000,000, and came to me for tax planning advice.   He had tickets purchased to fly to Nigeria the following week to sign a few papers and collect his money.   After I asked some questions, and then provided him with news service articles that indicated how others had lost their money, been kidnapped for ransom, and in one case had been killed, as a result of participating in similar schemes, he reacted like someone coming out of a trance.   This experienced, successful businessman’s considerable ability to reason had been overcome by the emotion of greed, which is of course a variant of desire.

 

Other research indicates that the most powerful of emotional forces are often connected to “value structures” such as religion (my religion is “true” and yours is not, for example), morality (the abortion issue; the homosexuality issue; etc.), political issues (democracy v. communism, for example), etc. That is, we have been taught by the stories we believe and the premises they engrain to feel fear or similarly powerful emotions related to our group’s foundational values. This is tied into our ancient need for group stability in order to survive, and the inference that to threaten the group’s foundational ideology is to threaten group stability. We hence have a difficult time discussing such things “rationally”. Anyone who has been involved in discussions related to abortion, climate change, religion or politics with other people who have strongly held and opposing opinions with regard to those issues will likely have noted behavioral similarities in both themselves and other participants in those different discussions.

 

Another powerful emotion that affects our beliefs is love.   I recently watched in amusement (and with some concern) as one of my young friends who I did not think had a religious bone in his body fell in love with a faithful Mormon girl and began to think seriously about serving a Mormon mission after years of resisting the pressure of his parents and others to do so.

 

And it is well known that medical doctors and other professionals are taught not to attempt to diagnose or treat themselves or family members.     It has been shown, for example, that a doctor’s love for her child, and fear of the consequence that a serious illness would bring to that child, is likely to impair her ability to see symptoms that clearly indicate serious illnesses.

 

Bounded Rationality and “Heuristics”

Many models of human behavior in the social sciences assume that human behaviour is “rational”. Scholars like Amos Tversky, Daniel Kahneman and Gerd Gigerenzer have shown that while we like to tell ourselves that we are rational, our behaviour and hence reason are affected by constraints of various types. For example, what is “rational” to someone who is sitting in the safety of her home may seem irrational to another person whose child is likely to die if she does not take some form of risky action. The study of “bounded rationality” is hence an attempt to make the classical models of human rationality more realistic, while continuing to use models that can be systematized and hence rigorously studied. “Heuristics” are used for decision making purposes in bounded rationality theory.

 

Heuristics are simple, efficient rules of thumb that explain how people make decisions and solve problems with limited time and information. These rules work well under most circumstances, but in certain cases lead to systematic cognitive biases of the type described below under the heading “Cognitive Biases and Cognitive Dissonance”. For example, the differential calculus that would be required to calculate when and how fast to run to catch a fly ball is terrifically complex. But we do this easily and without thought by using an efficient rule of thumb – run just fast enough to keep the angle between you and the ball constant. Humans also make amazingly accurate character judgments based on only a few seconds of observing other humans.
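
For readers who like to see the mechanics, here is a toy sketch of that fly ball rule of thumb in Python. All of the numbers and the simplified physics are my own assumptions, invented only to make the rule visible – this is not a model of real baseball. The point is that the fielder never solves the trajectory; she simply watches the ball and moves so that her gaze angle stays roughly constant.

import math

# A rough, illustrative sketch of the "gaze heuristic" described above.
# The speeds, time step and simple physics are assumptions chosen only to
# make the rule of thumb visible; nothing here is calibrated to real play.
def simulate_catch(ball_x=0.0, ball_vx=18.0, ball_vy=22.0,
                   fielder_x=65.0, fielder_speed=7.0, dt=0.02, g=9.8):
    ball_y, prev_angle = 0.0, 0.0
    while True:
        ball_x += ball_vx * dt                 # advance the ball
        ball_vy -= g * dt
        ball_y += ball_vy * dt
        if ball_y <= 0.0:                      # the ball has landed
            return abs(ball_x - fielder_x)     # how far away the fielder is
        angle = math.atan2(ball_y, abs(ball_x - fielder_x) + 1e-9)
        # The rule of thumb: if the gaze angle is rising, the ball will carry
        # past you, so back up; if it is falling, it will land short, so move
        # in. No calculus is required at any point.
        if angle > prev_angle:
            fielder_x += fielder_speed * dt    # retreat (ball travels toward larger x)
        else:
            fielder_x -= fielder_speed * dt    # move in toward the ball
        prev_angle = angle

print("fielder ends up %.1f metres from the landing spot" % simulate_catch())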

 

A wide variety of heuristics have been studied. Malcolm Gladwell’s “Blink” presents an accessible summary of many of these, as well as a review of the “bias” research that shows heuristics’ downside – human mental functions that are in certain circumstances predictably off the mark. Some of these are reviewed under various headings below related to “cognitive bias”. For instance, people may tend to perceive more expensive beers as tasting better than inexpensive ones. This finding holds true even when prices and brands are switched; putting the high price on the normally relatively inexpensive brand is enough to lead experimental participants to perceive that beer as tasting better than the beer that is normally relatively expensive. One might call this a “price implies quality” bias.

 

Heuristics are linked to emotion and instinct. Once a heuristic switch is tripped, we act in accordance with it and explain our actions to ourselves on a rational basis after the fact. These heuristics enable us to make pretty good decisions based on amazingly small amounts of relevant information. Decisions made on this basis – quickly, without much information – are said to be “boundedly rational”. That is, they make sense on the basis of the limited information available. At times, people operating on this basis appear to be in denial of reality to those who have a broader perspective.

 

The sacred premises noted above are rules that affect the operation of reason, and so cause a kind of bounded rationality and the development of particular heuristics.

 

As noted above, fear is one of our most powerful emotional switches. Once it is triggered by a religious belief (I won’t be in the Celestial Kingdom with my family if I disobey/disbelieve), the heuristics we use change because our perception of our circumstances has changed. We no longer feel safe, and so rational thought stops and we act to preserve ourselves. We don’t rationally count the cost of many actions we feel compelled to engage in while we are fearful until that cost reaches such a painfully high level that it cannot be ignored. And this is rational behaviour in the same sense as is sprinting away from a bush in which you heard a “big animal” sound at night if you had learned earlier that a tiger had escaped that day from a nearby zoo.

 

Our emotions are designed first and foremost to preserve our lives and propagate our genes. As just noted, our emotional “switches” make the price we are prepared to pay to obtain “goods”, like being in the Celestial Kingdom, very high. By measuring the price people are prepared to pay in terms of time, money, opportunity cost etc. to remain part of a community, we have a way to gauge the strength of particular beliefs and fears in that community relative to others in other communities. On that basis, Taliban beliefs are likely stronger than Mormon beliefs, which are stronger than Evangelical beliefs, which are stronger than Anglican beliefs, etc. There is a correlation between the strength of religious beliefs and how protected from competing beliefs they are by various means. The more protected a belief is, the less likely it is to be affected by disconfirming information.

 

A group’s sacred premises harness heuristics within the information-scarce, rule-bound worlds created by the religions themselves. Our heuristics related to fear and designed to deal with uncertainty are prominently used in this regard. The apparent purpose of this exercise is to foster the kind of beliefs that confer power on the religious institutions in question.

 

Given the information most faithful Mormons have, and the premises they have been trained to accept, their behavior relative to Mormonism is rational. Likewise, the fear and anger transitioning Mormons feel as they break these bounds are rational.

 

Ecological v. Deliberative Rationality

Another line of research identifies two kinds of rationality. One is adaptive or practical (called “ecological rationality”) and deals with things like social and moral behaviour and what is “rational” given a particular social and emotional reality. This is a species of the “bounded rationality” discussed above. The other is more reason-driven and less emotional in orientation, and is called “deliberative rationality”.

 

The laws that govern ecological rationality are not absolute in the sense that the law of gravity is absolute. Rather, they are relative to particular social structures and circumstances. For example, while I served a mission in Peru many years ago our Mission President authorized us to drink both tea and Coca Cola (both thought by most Mormons to be contrary to the Word of Wisdom) since they were safer than the water we might otherwise drink, and because he believed (wrongly as it turned out) that both had curative properties relative to stomach parasites. Since he was our presiding religious authority, his instructions to us in that regard changed our “ecology”, and hence our behaviour. What was not socially acceptable in Mormon missionary society generally speaking became so simply because he said it should be.

 

On the other hand, while visiting Peru with my family a couple of years ago I took a great deal of abuse from some of them for drinking tea made from the leaf of the coca plant, from which cocaine is derived. This is the local remedy for altitude sickness (kind of like a mild case of the flu) in the areas between 11,000 and 12,000 feet above sea level that we visited. A glass of this tea has roughly the same effect as an extra strength Tylenol pill. But, its association with cocaine was off-putting for my family because of their 21st century/North American/Mormon ecology. At the time, I was a faithful Mormon as well, but my experience years earlier in Peru had accustomed me to the use of herbs (including the coca plant) for various legitimate purposes. From my point of view, drinking that tea was as legitimate as taking Tylenol, and much easier. And just as Tylenol 3 is regularly abused, so is the drug that can be obtained by processing and refining coca leaves in a particular way to produce cocaine. And if we went back a couple of generations in time in North America, we would find a completely different ecology respecting cocaine itself. We forget that not long ago cocaine made Coca Cola the cultural fixture it is, and that during the same period of time cocaine was sold over the counter in North American drug stores as a cure all. Ecologies change, and as they do so does the ecological reasoning they produce.

 

A more jarring example of bounded or ecological rationality is the behavior of a battered spouse who chooses to remain with her husband in circumstances where she may not survive without his breadwinning assistance. That is, tolerating physical or emotional abuse is rational if probable homelessness, and all that goes with it for herself and her children, is the alternative. Other features of human psychology such as denial and cognitive dissonance (described below) often strengthen this process by suppressing information that, if consciously acknowledged, might compel the abused spouse to action that her unconscious mind fears. The same sort of mental processes may well apply to a male whose mate is having an affair with the most powerful individual in a violent social group, such as a primitive tribe, a Mafioso community or a group of chimpanzees.

 

Deliberative rationality, on the other hand, includes the kind of reasoning required by the scientific method.

 

As Matteo Mameli notes:

 

Evolutionary considerations (and neurological data) indicate that emotions are very important (and in many more ways than people usually think) … for [ecological rationality, including] social and moral rationality. But things are different with deliberative rationality. Emotions do not help with deliberative rationality. Deliberative rationality is the ability that a person has when (i) she is able to form beliefs about which mental state she ought to be in, (ii) she is able to form the intention to be in this mental state, and (iii) this intention is successful (i.e. the intention causes the person to be in the mental state she thinks she ought to be in). A paradigmatic case of deliberative rationality is scientific rationality. The scientist examines the data at her disposal and (i) she forms the belief that she ought to believe in the truth of theory T, (ii) as a result of this belief she forms the intention to believe in T’s truth, and (iii) as a result of this intention she believes in T’s truth.

 

Evolutionary considerations (and neurological data) suggest that emotions limit the extent to which humans can be deliberatively rational. Intentions to have certain emotional reactions and to avoid other emotional reactions are often unsuccessful, and for good evolutionary reasons. The different roles that emotions play in (two different kinds of) rational behaviour explain why the debate about the rationality of emotions has been so long and so messy.   (Matteo Mameli, Department of Philosophy, Logic and Scientific Method, London School of Economics, “The Rationality of Emotions from an Evolutionary Point of View”, to be published in “Emotion, Evolution & Rationality”, Oxford University Press, March 2004)

 

For example, it is “rational” for me to wish to get along within my family and community, and the thought of being ostracized produces fear – a strong, negative emotional reaction. This is likely due to the historic connection between being shut out of society and non-survival. Hence, behaviour that prevents my expulsion from the safety of society is supremely rational in an ecological sense. And yet, that behaviour may require that I deny reality. To use a crude example, imagine the primitive male who has seen plenty of evidence that his mate may be having sexual relations with the group’s powerful male leader. Further assume that a confrontation with the leader would not likely bode well for the first male’s survival. It appears that our brains have developed mechanisms that will screen an amazing amount of dangerous information such as this so that we do not need to deal with it. Sometimes this is a good thing, and other times it is not.

 

While George Orwell did not use the terms bounded or ecological rationality, he recognized these concepts at work in his day. His lovely little book “Why I Write” was written in England during World War II. While providing fascinating insight into why Orwell wrote what he did (“Animal Farm”, “1984”, etc.), it is mostly a viciously insightful critique of the British social ills that he believed led to the national predicament of that time – the British appeared to be on their way to losing a life and death struggle.

 

While I recommend the book for a variety of reasons, its utility for present purposes is to point out an interesting parallel between the forces that, according to Orwell, were at the root of Britain’s perspective problems leading up to World War II, and those that currently plague Mormonism. For example, read the following passages, written by Orwell in the context above, as if they had been written by a Mormon intellectual who was fully conversant with the strengths and weaknesses of the institution that sponsors his faith, changing references:

 

  • from   “democracy” to   “literalist Mormonism”;

 

  • from “totalitarianism” to a religious tradition other than Mormonism that has a cultish character;

 

  • from England to   “the Mormon Church”;

 

  • from particular British leaders to particular Mormon leaders, etc.

 

And when Orwell speaks of stupidity, think instead of denial. Here we see a classic example of bounded rationality at work. All page references are to Orwell’s “Why I Write”.

 

“An illusion can become a half-truth, a mask can alter the expression of a face. The familiar arguments to the effect that democracy is “just the same as” or “just as bad as” totalitarianism never take account of this fact. All such arguments boil down to saying that half a loaf is the same as no bread. In England such concepts as justice, liberty and objective truth are still believed in. They may be illusions, but they are powerful illusions. … Even hypocrisy is a powerful safeguard. The hanging judge, that evil old man in scarlet robe and horse-hair wig, whom nothing short of dynamite will ever teach what century he is living in, but who will at any rate interpret the law according to the books and will in no circumstance take a money bribe, is one of the symbolic figures of England. He is a symbol of the strange mixture of reality and illusion, democracy and privilege, humbug and decency, the subtle network of compromises, by which the nation keeps itself in its familiar shape.” (pages 21, 22)

 

“In spite of the campaigns of a few thousand left-wingers [who are the intelligentsia of whom Orwell was part], it is fairly certain that the bulk of the English people were behind Chamberlain’s foreign policy [that played into Hitler’s hands, setting up what looked like a war headed for disaster]. More, it is fairly certain that the same struggle was going on in Chamberlain’s mind as in the minds of ordinary people. His opponents professed to see in him a dark and wily schemer, plotting to sell England to Hitler, but it is far likelier that he was merely a stupid old man doing his best according to his very dim lights. It is difficult otherwise to explain the contradictions of his policy, his failure to grasp any of the courses that were open to him. …” (page 28)

 

“England is not the jewelled isle of Shakespeare’s much-quoted message, nor is it the inferno depicted by Dr. Goebbels. More than either it resembles a family, a rather stuffy Victorian family, with not many black sheep in it but with all its cupboards bursting with skeletons. It has rich relations who have to be kow-towed to and poor relations who are horribly sat upon, and there is a deep conspiracy of silence about the source of the family income. It is a family in which the young are generally thwarted and most of the power is in the hands of irresponsible uncles and bedridden aunts. Still, it is a family. It has its private language and its common memories, and at the approach of an enemy it closes its ranks. A family with the wrong members in control – that, perhaps, is as near as one can come to describing England in a phrase.” (page 30)

 

“One of the dominant facts in English life during the past three quarters of a century has been the decay of ability in the ruling class. … The existence of these people was by any standard unjustifiable. They were simply parasites, less useful to society than his fleas are to a dog.

 

By 1920 there were many people who were aware of all this. By 1930 millions were aware of it. But the British ruling class obviously could not admit to themselves that their usefulness was at an end. Had they done that they would have had to abdicate. For it was not possible for them to turn themselves into mere bandits … After all, they belonged to a class with a certain tradition, they had been to public schools where the duty of dying for your country, if necessary, is laid down as the first and greatest of the Commandments. They had to feel themselves true patriots, even while they plundered their countrymen. Clearly there was only one escape for them – into stupidity. They could keep society in its existing shape only by being unable to grasp that any improvement was possible. Difficult though this was, they achieved it, largely by fixing their eyes on the past and refusing to notice the changes that were going on round them.” (pages 31 – 33)

 

“It is important not to misunderstand [the leaders’] motives, or one cannot predict their actions. What is to be expected of them is not treachery, or physical cowardice, but stupidity, unconscious sabotage, an infallible instinct for doing the wrong thing. They are not wicked, or not altogether wicked; they are merely unteachable. Only when their money and power are gone will the younger among them begin to grasp what century they are living in.” (page 37)

 

“England is perhaps the only great country whose intellectuals are ashamed of their own nationality.”   (page 40)

 

“It is clear that the special position of the English intellectuals during the past ten years, as purely negative creatures, mere anti-Blimps [the uneducated masses], was a by-product of the ruling-class stupidity. Society could not use [the intellectuals], and they had not got it in them to see that devotion to one’s country implies “for better, for worse”. Both Blimps and high-brows took for granted, as though it were a law of nature, the divorce between patriotism and intelligence. If you were a patriot you read Blackwood’s Magazine [a low-brow publication] and publicly thanked God that you were “not brainy”. … Patriotism and intelligence will have to come together again. It is the fact that we are fighting a war, and a very peculiar war, that may make this possible.” (page 41)

 

That is, England’s ruling class was making decisions that made sense to them in the context of their historic dominance, understandable reluctance to give up power and influence, etc. And these decisions put the entire country at risk. Intellectuals were scorned because they called the established order into question. Among the “faithful”, ignorance became a badge of honour. And if shown this situation in any other culture, the British of Orwell’s day and Mormons today would immediately recognize it as a recipe for disaster. Then, if confronted by the proposition that they were headed down precisely the same perilous path, they would distinguish their case from the other on grounds that would leave most knowledgeable outsiders shaking their heads in amazement at the depth of denial these mental gymnastics show.

 

The parallels between the British leaders in Orwell’s time and Mormonism’s leadership today are particularly striking. Their circumstances blind them to the reality of both their position and the effects of their actions. Time will tell how far Mormonism’s fortunes will have to decline before fundamental leadership change will occur.

 

In conclusion regarding bounded and ecological rationality, I note that it may well have been Christ’s observation of this universal human trait that prompted him to note that only those who had ears for his teachings would hear them.

 

Value Structures and Emotion

Other researchers have noted the relationship, and conflict, between reason and the emotions that are induced by membership in social groups and the values imposed by such membership:

 

Emotions are … intrinsically linked to a mentally represented set of goals, values and standards which we’ll call a value structure. Moreover, there are good reasons, indeed good and systematic evolutionary reasons, to think that the contents of value structures will often be maladaptive. When they are, the emotions and the behaviour they lead to will typically be irrational. (Chandra Sripada & Stephen Stich, Department of Philosophy, Rutgers University, “Evolution, Culture and The Irrationality of the Emotions”, to be published in “Emotion, Evolution & Rationality”, Oxford University Press, March 2004)

 

I like the terms “deliberative” and “ecological” rationality and the concepts they convey.   The circumstances created by values and emotions related to them are part of the sacred premises described above that cause denial.

 

The analysis provided by Sripada and Stich helps to flesh out why deliberative rationality is restricted by certain types of ecologies and the role value structures play within ecologies.   Value structures are a big part of what ties social groups together.   Hence, they are an important part of the ecology within which our reason must be exercised.   In order to understand how our ecology influences our deliberative reason, we need to understand how culture is created and then modified over time.

 

Sripada and Stich’s paper is set in the context of other research that indicates emotion to be importantly connected to reason, in the sense that sound reasoning depends upon a sound emotional, or value, base. This is consistent with what I indicated above – that emotion will often overcome reason when the two are in conflict. Hence, it is important to clear as many emotional roadblocks to good reasoning as we can. Many of these roadblocks will be found within value structures, and can only be dealt with by changing those values. One way to do this is to get enough information on the table so that all group participants can understand the reality of their social interactions and the hidden interests that are being served by group behavior. This is what ended the divine right of kings, for example, and created both American and French democracy. This process, in effect, changes our values. I experienced this as I learned more about my Mormon heritage, and came to the view that the answers Mormonism provided to a host of important questions were not reliable. I then began to look for answers to those questions elsewhere. That process continues. And as I learn and substitute conclusions reached through deliberative rationality for those reached through Mormon ecological rationality, I can still feel my value structures changing.

 

I also note that Sripada and Stich pay particular attention to the intense emotions that are elicited by membership in large groups, and indicate that these are particularly suspect in terms of their potential to interfere with our deliberatively rational processes.

 

Sripada and Stich speak of “cultural inertia” and provide a fascinating example of how deep its effects can be. They summarize studies respecting “cultures of honour” in which males are conditioned to be more prepared than is usual to protect themselves, family or property by resorting to violence. These cultures often arise in places where resources are scarce and the force of legal protection weak – “law of the jungle” places.

 

In various studies it has been demonstrated that men from cultures of honour react differently than other men to threatening insults. The nature of their reaction can be measured in terms of the greater degree of anger they report feeling as a result of the insult, as well as physiological signs, such as higher levels of cortisol and testosterone measured in their saliva just after the insult is delivered. Cortisol and testosterone levels indicate stress and an aggressive behavioural response (Sripada & Stich, pp. 13 – 16).

 

When men from cultures of honour are placed in or interface with societies in which resources are abundant and the force of law adequate to protect average citizens, their behaviour is perceived to be overly aggressive, and hence is maladaptive, but slow to change since it is engrained biologically at least to an extent. The force of biology is largely what creates cultural inertia. I have observed this type of behaviour in a variety of recent immigrant communities, as well as in the Canadian native community. Each of these communities fits Sripada and Stich’s description of the kind of place in which violent personality types are bred. In longstanding communities of this type, no doubt both genetic and conditioning factors contribute to the resultant dominant personality type.

 

The flip side of the “culture of honour” findings is the recent research into the role that oxytocin plays in the emotion of trust (see http://www.sciencenews.org/articles/20050604/fob4.asp). Trusting persons have higher levels of oxytocin than persons who are not inclined to trust. And entire populations of persons who are not trusting, and for good reason, have on average lower levels of oxytocin than populations where trusting behavior is adaptive. For example, South Americans have lower oxytocin levels in general than do North Americans. South Americans need to be more careful about whom they trust. I suppose that it should not be surprising that science is slowly unravelling the biology that lies behind what we feel.

 

Let’s consider the practice of polygamy and related values as a cultural phenomenon. It appears that polygamy developed largely in societies where resources were scarce, and where those males who proved capable of marshalling resources were much more likely to be able to support a woman and her children than other males. When women faced the choice of a polygamous relationship and survival, or a monogamous one and likely non-survival, it is not surprising that polygamous relationships became the order of the day. However, in an era of plenty the values and emotions underpinning polygamy are likely to be seen as maladaptive, and could be expected to give rise to what would be perceived as irrational behaviour. Modern fringe Mormon fundamentalist groups exhibit behaviour that fits this description. Efforts to protect polygamous communities against the encroachment of the mainstream culture are often the worst of the many influences to which these people are subject. These efforts cause them to be poorly educated, to not have developed the social skills necessary to cope with the complexity of modern life, to be fearful, probably to have lower than adaptive oxytocin levels for a North American, etc. And an adult polygamist will have these traits hardwired by their brain development as a result of the processes that Quartz and Sejnowski (see above) and others have outlined.

 

A similar analysis could be developed respecting authoritarian religious leadership structures, which also tend to originate in places and times of scarcity where group cohesion is essential.   In such environments survival of the group often dictates the sacrifice of individual liberties to the greater good, while in environments rich in resources such a stifling of individual initiative and energy would likely cost much more than it would produce, and hence would be   considered maladaptive.   For a current example of this behaviour, consider the Old Order Amish living in contemporary US society, or the Hutterites in Canada (See http://www3.telus.net/public/rcmccue/bob/documents/brainwashing.pdf for a summary of some aspects of how Hutterite society works).   While modern Mormonism is not that extreme, it tends in the same direction.   This in my view explains, among other things, the relative dearth of Mormons who have received Nobel or Pulitzer Prizes (See   “What is the Challenge for LDS Scholars and Artists”, John and Kirsten Rector, Dialogue, Vol. 36, No. 2, Summer 2003, http://www.dialoguejournal.com/excerpts/36-2a.shtml).

 

Another important point Sripada and Stich make relates to two types of social learning that are important in creating culture and the value structures it entails. The first they call “directly biased social learning”. This occurs when we learn by our own experience, such as when we touch something hot and are burned, or eat something and get sick. However, many things are not susceptible to being learned by experience, and respecting those things in particular we tend to rely upon “indirectly biased social learning”. In general, the harder information is to come by respecting a phenomenon and the more important the phenomenon is perceived to be, the more indirectly biased social learning will occur respecting it. For example, it is hard to tell from personal experience whether the Muslim heaven is better than the Christian, or if masturbation will make you eventually go blind, become homosexual or molest children. So, indirectly biased social learning has dominated the spread of opinion respecting how issues such as these should be dealt with.

 

It is not surprising that more errors are made respecting things learned indirectly about phenomena that are poorly understood than about what we can experience ourselves, and understand. And so it is not surprising that recent research shows that indirectly biased social learning is the source of many value structures that give rise to maladaptive, emotionally distorted perceptions and behaviours. At the root of this problem are two mechanisms that are responsible for much of our indirectly biased social learning – the prestige or authority bias and the conformist bias – which are described in some detail below. For the moment, it is sufficient to note that their effect is distinguished from cultural inertia in the sense that many of the behaviours they cause were never adaptive, whereas cultural inertia is usually the result of a trait that developed because it was adaptive in one environment being maladaptive elsewhere. The cultural effects of the prestige and conformist biases, on the other hand, are often mistakes made along the evolutionary path that were preserved at least for a time because their negative effects were not understood (like the effects of poor hygiene in hospitals), or because they benefited certain powerful parties sufficiently to gain their support and so continued to be forced upon subservient parties (such as the divine right of an aristocracy to govern the commoners) (Sripada and Stich, pp. 18 – 20). And because these behaviours are tied into value structures, powerful emotional influences support them.
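
To give a feel for these two mechanisms, here is a toy simulation in Python. The population size, copying probabilities and the single prestigious model are my own assumptions, invented for illustration rather than taken from Sripada and Stich. The belief being copied confers no practical benefit and cannot be tested directly, yet it spreads through the whole group via the prestige and conformist channels.

import random

# A toy sketch of prestige-biased and conformist-biased social learning.
# All numbers are invented for illustration only.
random.seed(2)

def run(pop_size=200, generations=50, p_prestige=0.2, p_conform=0.7):
    beliefs = ["B"] * pop_size          # everyone starts with belief "B"
    prestige_model = "A"                # one high-status figure holds "A"
    for _ in range(generations):
        new = []
        for _ in range(pop_size):
            r = random.random()
            if r < p_prestige:
                new.append(prestige_model)            # copy the prestigious figure
            elif r < p_prestige + p_conform:
                sample = random.sample(beliefs, 5)    # copy the majority of a small sample
                new.append(max(set(sample), key=sample.count))
            else:
                new.append(random.choice(beliefs))    # unbiased copying
        beliefs = new
    return beliefs.count("A") / pop_size

print("share holding the untestable belief after 50 generations: %.2f" % run())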

 

The Collective Mind

Many human groups seem to have an unconscious decision making mechanism that is far more powerful than any individual member of the group, or even than any subgroup, including the leadership subgroup. This so-called “collective mind” appears to be the result of the individual heuristics described above, which in turn are related to the group’s sacred premises and so support denial in some cases.

 

One way to quickly capture this concept is to think of an ant colony. Each ant is aware of only a few signals that the ants give each other as they walk to and fro. These convey basic information about what they have noticed during their travels. “Not enough food is stored”. “Too much food is stored”. “Dead bodies need moving”. Etc. Based on these few simple signals, ant colonies “self organize” in consistent patterns as if under the control of a master planner. This is the ant colony’s collective mind in operation. The collective mind operates within human groups in analogous, but more complex, fashion. Think of Adam Smith’s “invisible hand” that allocates resources in seemingly magical fashion within markets.

 

The best explanation I have found so far of the collective mind is in David Sloan Wilson’s “Darwin’s Cathedral”. He relies on biological research to form his social theories. Gerd Gigerenzer (see “The Adaptive Toolbox”), to an extent, does the same. These are both examples of the theory of evolution being applied to human decision making and social behavior.

 

The collective mind concept is similar to Adam Smith’s “invisible hand”. The recent Nobel Prize winner in economics Vernon Smith has done some work along these lines, showing how the theory of bounded rationality produces various kinds of collective minds or invisible hands that function in many different types of markets and games. He points out over and over again how, in computer simulations and other contexts, agents with limited intelligence and information, acting in accordance with a series of simple rules, quickly reach market equilibrium or a clever solution to a problem that highly intelligent agents with access to much more complete information had not been able to find. While not all of the unintelligent agents are right all of the time, as they take cues from each other and modify their behaviour based on those cues their collective behaviour trends quickly toward a rational equilibrium or efficient solution to the problem they collectively face. As noted, this occurs even when they have access to very limited information. And this occurs without any of them having the information necessary to make a “rational” decision in the conscious sense of that term. They appear to, as a group, be using the kind of limited information heuristics of which Gigerenzer writes. This is the collective mind in operation.
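
The flavour of those simulations can be captured in a few lines of Python. What follows is only a toy sketch under my own assumptions – the values, costs and trading rule are invented for illustration and are not Vernon Smith’s actual experimental design. Each trader knows only its own limit and bids or asks at random within it, yet the average transaction price lands near the competitive equilibrium that none of them could calculate.

import random

# A toy sketch of budget-constrained "zero intelligence" traders.
# The value and cost schedules below are invented; with them the competitive
# equilibrium price is around 60, which no individual agent knows.
random.seed(1)

buyer_values = [100, 90, 80, 70, 60, 50, 40, 30]   # the most each buyer will pay
seller_costs = [20, 30, 40, 50, 60, 70, 80, 90]    # the least each seller will accept

def run_market(rounds=2000):
    prices = []
    for _ in range(rounds):
        b = random.choice(buyer_values)
        s = random.choice(seller_costs)
        bid = random.uniform(0, b)        # never bid above your own value
        ask = random.uniform(s, 100)      # never ask below your own cost
        if bid >= ask:                    # a deal is struck at the midpoint
            prices.append((bid + ask) / 2)
    return sum(prices) / len(prices) if prices else float("nan")

print("average transaction price: %.1f" % run_market())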

 

The important point with regard to the invisible hand or collective mind is that its nature is determined by the most basic beliefs or values of the group in question – the rules by which each individual agent makes decisions. He who controls those beliefs and values, whether he knows it or not, controls the hand. As those beliefs and values change, so does the collective mind.

 

In many groups, there is no entity to which belief control can be ascribed. In most cases, however, there are entities that have a measure of influence. Religious groups can be placed on a spectrum in this regard. At one end we find those that are more democratic in nature, in which it might be impossible to find a belief influencing agent other than the voice of the people as it changes from time to time, and the occasional persuasive leader who catches the imagination of many within the group. At the other end of the spectrum you find tight knit, small groups that are under the control of a single charismatic leader who wields a strong influence over the group’s beliefs. David Koresh and Jim Jones come to mind in the modern context, as does Joseph Smith in his day.

 

The collective mind is the result of all of the decisions individual members of a society make. The quality of the decisions the collective mind will make, and the amount of time it will take to make them (which is often relevant to their quality), is determined by the information to which each individual has access, the quality of each individual’s decision making process and most importantly (as noted above) the basic values held by the individuals within that group.

 

Think of a simple auction. The basic value is buying a given quality of goods at the lowest possible price. Knowledge of this value will help to predict most of the behavior of those participating in the auction, and is the guiding force that produces the collective mind that Vernon Smith and others have observed. This is one of the reasons North Americans who had the chance to participate in the establishment of capitalist values in the former Soviet Bloc countries found the experience so interesting. One of my LDS friends who is a professor of entrepreneurship studies spent a year over there on a Fulbright Scholarship. He and others have described to me the difficulty of predicting how their students would react to different issues that arose for discussion because their values were not focused on maximizing profit, but rather concerned things like maximizing production at any cost because that would maximize the number of jobs and resources that remained under their control. Product quality, cost control, marketing and the other basic building blocks of a successful business initially made no sense to these folks. And as long as that is the case, their collective mind will not produce Adam Smith’s invisible hand, as the history of the Soviet Union shows.

 

The more oriented toward the authoritarian end of the spectrum a group is, the more inclined its collective mind will be to accept things like Mormonism’s faithful history that distort the perception of reality and hence stymie individual growth. This, in my view, works hard against the interest of individual members of the Mormon group, in favour of the Mormon institution, and is responsible for much of the seemingly wilful blindness in Mormon behaviour that has long baffled me. The collective mind produced by such value systems is, quite simply, retarded. Rather than harnessing the collective decision making strength of many minds, those minds are purposefully turned off by the way in which they value deference to authority, and so Mormon leaders are left to steer the ship alone. They are the religious equivalent of the Soviet command economy, and in terms of what Mormonism has to offer its individual members as opposed to its leaders, the comparison to communist states is apt.

 

“Misrecognition”

Religious leaders whose conscious or unconscious objective is personal control over the largest possible follower group and asset base seem to intuit the importance of keeping their belief systems well within the kind of uncertain environment in which fear-based heuristics operate at full strength. The manner in which religious and other societal leaders, in a seemingly unconscious fashion, do this kind of thing is relevant to how denial functions in the lives of the religious faithful, and has long caused wonder in those who have observed it. Pierre Bourdieu, the respected French social theorist, developed the concept of “misrecognition” to explain the kind of denial that often occurs within Mormonism. See http://en.wikipedia.org/wiki/Pierre_Bourdieu for an overview of his thought.

 

Bourdieu speaks of “fields” – social arenas in which people struggle over desirable resources. A field is a system of positions based on power relationships. Complex societies have many fields – families, clubs, businesses, churches, political parties, etc. – and members of complex societies tend to move from field to field on a weekly, daily or even hourly basis.

 

There are four basic types of capital or sources of power within each field:

 

  • Economic capital: command over economic resources (cash, assets).

 

  • Social capital: resources based on group membership, relationships, and networks of influence and support that people can tap into by virtue of their social position.

 

  • Cultural capital: forms of knowledge, skill and education. Parents provide children with cultural capital, the attitudes and knowledge that make the educational system a comfortable, familiar place in which they can succeed easily.

 

  • Symbolic capital: accumulated prestige; honour.

 

Bourdieu’s theory tries to explain how relative positions of power perpetuate themselves. For example, he would be interested in how Mormon leaders maintain their position and pass their influence on to their successors.

 

Bourdieu sees the legitimation of the various types of capital as crucial to their effectiveness as sources of power. He says that this process involves “symbolic violence”. That is, as people are taught to accept their place in the social order there is a systematic and hence predictable misunderstanding or “misrecognition” of what is really going on. So working class children see it as legitimate that their middle-class peers have more success in the educational system than is justified by their objective performance. This is the use of symbols to do social violence, ironically with the full cooperation of those harmed by it. This highlights the role of engraining acceptance of ideas related to status, and hence to the power some people are thought to be justified in exercising over others.

 

Bourdieu wrote, for example, about a gift exchange that operated within a primitive group, where the regular giving of “gifts” performed the goods exchange function that the market economy performs in ours. He said that, to continue to be effective, this exchange requires individual and collective misrecognition of its objective reality – of who was benefiting or losing on a net basis (David Swartz, Culture and Power – The Sociology of Pierre Bourdieu, p. 91).

 

To this the philosopher John Searle adds the following:

 

Human institutions are structures of constitutive rules. People who participate in the institutions are typically not conscious of these rules, often they even have false beliefs respecting the nature of the institution, and even the very people who created the institution may be unaware of its structure. Further, the very people who created or participated in the evolution of the institution may themselves have been totally ignorant of the system of rules. (quoted in Adam Gifford Jr., On the Nature and the Evolution of Institutions, Journal of Bioeconomics, 1:127 – 149, at p. 141 (1999), http://buslab5.csun.edu/agifford/Research/B&TBioecon.pdf)

 

Both Bourdieu and Searle are describing mechanisms that are very similar to the collective mind to which I referred above, as well as to a special type of the sacred premises I mentioned above.

 

Bourdieu’s writings with respect to misrecognition predate by many years the behavioural and neurological research on which most of the ideas in this essay are based. Without being able to explain his observations as thoroughly as later theorists could, he observed an inconsistency over long periods of time between what members of certain groups said they were doing, and what the data he collected indicated they were doing. He concluded from this that many groups of people are not aware of the objective effects of their actions – that they “misrecognized” the nature of their relationships with each other. If ants could talk, for example, imagine what they might say about their activities. In a real sense, the behaviour within human groups should be considered on the same basis.

 

Bourdieu’s idea of misrecognition is rooted in the reality that in order for many social groups to continue to exist, one part of the group must control the others. And this control must be exercised on a basis that is accepted by most of the parties involved, or the group will break down. For example, once the commoners realized that there was no reasonable justification for their exploitation by the aristocracy, the movement toward democracy commenced and human society was fundamentally changed as the aristocracy grudgingly surrendered its power. The same process occurred in the battle between labour and capital.

 

One of Bourdieu’s important insights is that an understanding of the reality of the relationships involved within many groups seems to be perceived to be so dangerous (and hence is so feared) that it is unconsciously suppressed by both the dominated and dominating parties, though often in different ways and to different extents. The analogy of the abusive or alcoholic spouse again is enlightening. In such cases, both spouses are usually in denial as to the reality of what is going on between them. Bourdieu would tell us that most of us, most of the time, to one extent or another, are in denial as to the nature of our relationships. Not a comforting thought. But the more closely I look at what goes on around me, the more convinced I am that in my case at least, Bourdieu was far more right than wrong.

 

The emotional and other mental forces responsible for misrecognition are discussed elsewhere in this essay. Misrecognition is overcome by the distribution of more accurate information that discloses the hidden interests at work in the social transactions which make up the society. This disclosure almost always works to the advantage of the classes who are dominated in the existing social structure.

 

The kind of misrecognition to which religious and other leaders are subject has been described by many before Bourdieu, with Plato and Nietzsche among them.

 

Accordingly, religious and other leaders have long justified their use of deception, whether by way of outright lie or more subtle persuasion of the type this essay primarily treats, on the basis of the duty of leadership.   This is neither more nor less than another illustration of what Bourdieu calls misrecognition.   The leaders will tend to find all available means to justify their attempts to keep all of the power they can get until it is pried from their cold, dead hands.

 

It is also important to note respecting Bourdieu that while later theorists provide support for his theories based on types of research to which he did not have access, many of them seem to have missed his point respecting the interested, power-based underpinnings of most forms of social behaviour. It is almost invariably enlightening to consider the question of whose interests are furthered and whose are weakened as we walk through the various behaviour predicting factors found in the recent lines of research summarized below. And this, incidentally, is how evolutionary theorists work – they first ask “who benefits” when trying to predict what will happen in any environment as a result of changing circumstances.

 

In light of what Bourdieu and Searle have to say, consider what several evangelical pastors told me while discussing the difference between literalist and “liberal” Christian congregations. They said that the literalists are growing and the liberals are shrinking because the literalists have an urgency about them that attracts human energy. That is, if those good people in Africa, China etc. really will go to hell if they are not baptized, and if there are literal riches in heavenly store for those who enable their baptism, then your typical congregant in Calgary, Canada is far more inclined to donate time and money to the cause than if all of that stuff is mere metaphor. This fact – that members were easier to attract and were more generous with their time and money – was not described as a reason to deceive people. Rather, it was described as evidence that this must be God’s way. The unstated, sacred premise behind this reasoning is that a God exists who is working toward accomplishing his literal will as set out in the Bible – that all the Earth must come unto him, or at least have been warned of the consequence of not doing so, before the end of time will come. Therefore, since the literalist approach harnesses more energy to this end, it must be more true than the liberal approach.

 

This approach to religion has a frightening upshot. The same reasoning could be applied to justify Hitler’s approach, for example. He was certainly adept at harnessing human energy. This idea has a long history – that power is self-justifying; that if we can do something, the fact that we can do it justifies doing it. And through the use of circular reasoning and the help of some sacred and invisible premises, almost anything can be justified on this basis.

 

For example, suppose I have certain literalist beliefs (sacred premises) that cannot be questioned. They could just as easily relate to the purification of the human race through some form of eugenics, the idea that all humankind must be taught about a particular religious belief so that God can bring about the end of the wicked Earth and usher in a reign of peace, or any one of countless other ideologies. Then, I find that by harnessing the universally effective powers of fear and desire I can motivate more human allegiance to my ideas than if I do not use fear and desire. This is then put forward as evidence in favor of the truth of both my sacred premises and the appropriateness of the means by which I have sought to achieve the ends I believe they dictate.

 

According to Searle and others, human social institutions (including religious institutions) are useful friction reduction devices. They are facilitators and can be used for ends we are likely to deem “good” as well as “bad”. And it is crucial to note that they can perform functions we are likely to deem good while disseminating a measure of falsehood. Governments, for example, are generally considered to be justified in telling lies about certain aspects of national security programs.

 

Another way of looking at this is to remember that if everyone believes the same lie, at least chaos does not reign. In this sense, the worst of religions can in some cases be considered the lesser of evils. Societal as well as biological evolution deals in relative merit (“comparative advantage”) instead of truth. Hence, changing just enough to maintain a lesser of evils status over a long period of time will create huge cultural institutions. The history of many religions, including Mormonism, can be thus instructively read.

 

Inference Systems as Sources of Emotion and Maladaptive Behaviour

The noted anthropologist Pascal Boyer, while not using the terms “ecological rationality” or “biases”, provides insight into some of their underlying mechanisms in his recent book “Religion Explained: The Human Instincts that Fashion Gods, Spirits and Ancestors”. At the heart of Boyer’s thesis is the notion that religious beliefs arise from the normal functioning of the human mind. To explain this he uses research from anthropology as well as cognitive and developmental psychology to support the notion of “inference systems” – mental explanatory devices or templates that we have developed to order our perceptions of reality. These devices developed to do the things required for survival and propagation; they are based on presumed, and often erroneous, cause and effect relationships, and they are used to take the relatively small amounts of information we ingest and create a meaningful world. For example, for most of human history humans have perceived the world to be flat, at the center of the universe, and inhabited by witches, spirits, demons, etc. This made sense from an ecologically rational point of view given the information our ancestors had at their disposal. Now, in the democratic West at least, we are much more influenced by deliberative rationality than were our ancestors, and so the world appears quite different to us.

 

Boyer’s sensible claim is that our inference systems have developed to help us understand the physical and social reality around us.   He does not speak in terms of deliberative or ecological rationality, but it is possible to analyze his inference systems using these concepts.   What we find is that inference systems are almost all hybrids of ecological and deliberative rationality, and that most are much more ecological than deliberative.   We also find that the formation of value structures, such as religious belief systems, is heavily influenced by our inference systems.

 

Boyer’s theory is that understanding how our inference systems work should help us see why certain religious beliefs are commonly found, whereas others that are theoretically no less sensible are uncommon. For example, throughout life we deal with conscious agents that make decisions of various kinds that are important to us. These include other human beings and animals that might either prey upon us or be our prey. As religious beliefs developed, it is hence not surprising that the supernatural realm was assumed to be peopled by similar agents that behaved in ways that were consistent with the agents humans understood. This is a classic example of ecological rationality.

 

Boyer also claims that the mental processes underlying our inference systems are not subject to introspection and therefore are naturally resistant to deliberative rationality. These processes are largely intuition and emotion driven and so produce beliefs that seem unchallengeable. The stories told that create our sacred premises are shaped so as to be consistent with what our inference systems find sensible. This is why these beliefs hold such sway over us. Again, Sripada and Stich’s ideas respecting poorly understood phenomena are helpful here in combination with Boyer’s insights. Boyer can be understood to provide an evolutionary framework to explain the biases Sripada and Stich describe. And Sripada and Stich’s concepts of ecological rationality and indirectly biased learning explain why Boyer’s inference systems are influential in some aspects of our lives, and not in others.

 

Boyer applies his theory to various kinds of religious phenomena. These include gods and spirits, rituals and doctrinal beliefs.   For example, Boyer says that we believe in spirits because they activate our inference systems for how humans behave both individually and in groups.   I also note that some beliefs related to spirits make us feel better about what happens after death, while others may make us more inclined to obey the rules of our society.   These ideas, hence, appeal to both our psychological and social needs and instincts.

 

Boyer would say that our instincts arise from inference systems.   And ideas that compel one human being often compel others.   Boyer uses Richard Dawkins’ meme theory to explain how this happens.   A meme, essentially, is an idea that replicates through a society as it passes through person-to-person interaction, both direct and indirect.   Memes are sometimes compared to viruses in that regard.   The word   “meme” is a play on the word “gene” and its related concepts.   Religious ideas are memes, as is the scientific method, as is the catchy tune to the latest pop hit.

 

Boyer says that an idea's "aggregate relevance" - that is, the degree to which it activates our various inference systems and related emotional responses - will determine the degree to which it will be remembered and passed on to others, as well as how often it will be "discovered" from time to time and place to place. The idea that we have an essence or spirit that will continue to live after death is perhaps at the top of the aggregate relevance list. Wearing baseball hats sideways does not likely have the same shelf life.

 

Another of the inference systems Boyer describes is related to coalition building, forming dominance hierarchies, and categorizing people. He notes in this regard that inferences we normally use to tell species apart (Sesame Street's old "one of these things is not like the other" game) are also used respecting behaviour within groups of human beings. This is part of the idea complex noted above respecting the importance of noticing cultural signifiers.

 

Boyer sees religious fundamentalism as a result of our instincts respecting the importance of preventing defections from coalitions (groups that suffer defections are weakened, threatening the survival of their members) and as a reaction to the secular message that disobedience to rules that bind religious coalitions together is not costly. Religious fundamentalism significantly ups the ante in this regard, as 9/11 terrifyingly illustrated. Boyer sees religious ritual as a way of exhibiting and testing social cooperation and coalition solidarity. See http://www3.telus.net/public/rcmccue/bob/documents/temple%20marriage.pdf for a summary of my views in this regard respecting Mormon temple rituals, which support Boyer's thesis.

 

Boyer's work has been criticized by many on the basis that it is too speculative, that its title is too ambitious (it does not "explain" religion), and for a variety of other reasons. However, most reviewers have conceded that the book - while flawed as all books are - is thought provoking at least, and perhaps ground breaking. The book became more useful to me after I understood, with the help of the material summarized above, the interface between value structures, emotions and reason. I now see inference systems as one of the mechanisms that influence the formation of value structures.

 

While I agree with many of Boyer's main points, I see no reason to conclude (as he suggests) that evolutionary forces would not be aligned with social utility, or that they would not set us up to experience psychological gratification when a belief, for example, purports to answer our existential questions and makes us feel secure within a group. Religious theories and leaders who are effective in allaying fear are likely important group and coalition building agents. The formation and maintenance of groups is no doubt a key to survival, and it seems fair to assume that evolution works along these lines as well as those to which Boyer points. So, rather than reading Boyer to contradict the theorists summarized above, I see his research and theories as confirmatory. He uses a different approach to reach many of the same conclusions.

 

Cognitive Biases and Cognitive Dissonance

“Biases” and   “cognitive dissonance” play important roles in the psychological matrix that reduces the likelihood that sacred beliefs will be questioned once they have been created by the combination of social and personal experience described above.   This strengthens the forces of denial.     For an excellent overview respecting the application of cognitive dissonance principles to religious issues in general, see “Speculations on a Privileged State of Cognitive Dissonance”, by Conrad Montell at http://cogprints.ecs.soton.ac.uk/archive/00002388/01/temp.pdf

 

While not all cognitive biases are caused by cognitive dissonance, many are, and so I have decided to discuss them together. Cognitive dissonance has been one of the most carefully and successfully studied aspects of psychology. Thousands of different repeatable experiments have been performed, many of them numerous times, to reach the conclusions summarized below. I have almost at random pulled examples from the literature to illustrate how cognitive dissonance and related biases affect Mormon behaviour. Much of the material I will cite comes from one of the leading texts used in American universities today, Elliot Aronson, "The Social Animal" (9th ed., Worth Publishers, 2004). Robert Cialdini, Robert Levine, Philip Zimbardo and Daniel Kahneman are among the other leading researchers in this field.

 

A cognition is a piece of knowledge about an attitude, an emotion, a behaviour, a value, etc. Two cognitions are said to be dissonant (thus producing "cognitive dissonance") if one cognition conflicts with another. For example, I like my friend, and trust him. Various cognitions relate to this. If I find out that my friend has lied to me, other cognitions form that are dissonant with those I already hold. Cognitive dissonance is the term used to describe the resulting unpleasant mental state, which most humans immediately attempt to relieve, much as they look for water when thirsty.

 

What happens to people when they discover dissonant cognitions? Cognitive dissonance is experienced as a state of unpleasant psychological tension. This tension state has drive-like properties that are similar to those of hunger and thirst. That is, when a person has been deprived of food for several hours, she experiences unpleasant tension and is driven to reduce it. Cognitive dissonance similarly drives behaviour aimed at restoring consonance. However, finding the means to reduce this dissonance is not as simple as eating or drinking.

 

How does dissonance work? First, dissonance increases as the degree of discrepancy among cognitions increases. That is, how serious was my friend's lie, and how often has he lied? Second, dissonance increases as the number of discrepant cognitions increases. That is, how strong is the evidence of the lying behaviour? How many different cognitions support the dissonant conclusion that I can no longer trust my friend? Third, dissonance is inversely proportional to the number of consonant cognitions held by an individual. That is, if he only lied once and on a multitude of occasions I could be certain that he had been trustworthy, I would be less concerned. Fourth, dissonance is affected by the relative importance of the various consonant and dissonant cognitions in play. Perhaps in the case of this friend, lying is not that important because I do not depend on him in a significant fashion. In such a case my dissonance would be lower than it would be if the person in question were my wife or the mother of my children.
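
For readers who, like me, find a rough calculation easier to hold onto than prose, the four factors just listed can be sketched as a toy model. To be clear, this is only an illustration: the scoring function, the weights and the 0-to-1 scales below are my own invented assumptions, not anything drawn from the cognitive dissonance literature, which describes these relationships qualitatively rather than with an equation.

    # Toy illustration of the four factors described above. The formula and
    # the 0-to-1 scales for "magnitude" and "importance" are invented
    # assumptions, used only to make the qualitative relationships concrete.
    def dissonance_score(discrepant, consonant):
        """Each argument is a list of (magnitude, importance) pairs, one pair
        per cognition, with both values between 0 and 1."""
        discord = sum(m * i for m, i in discrepant)  # factors one, two and four
        accord = sum(m * i for m, i in consonant)    # factors three and four
        return discord / (1.0 + accord)              # consonant cognitions offset dissonance

    # A serious lie from someone I depend on heavily, versus the same lie from
    # a casual acquaintance, each weighed against ten small trust-building acts:
    print(dissonance_score([(0.9, 0.9)], [(0.3, 0.5)] * 10))  # roughly 0.32
    print(dissonance_score([(0.9, 0.2)], [(0.3, 0.5)] * 10))  # roughly 0.07

The only point of the sketch is that the first, second and fourth factors push the score up while the third pulls it down, which is all the prose above claims.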

 

How can dissonance be reduced? If two cognitions are dissonant, we can change one to make it consistent with the other, change each cognition in the direction of the other, find more offsetting consonant cognitions, or re-evaluate the importance of either the dissonant or consonant cognitions. These strategies often result in what is sometimes called denial - the suppression or unrealistic appraisal of evidence in an effort to reduce dissonance. As William Safire put it in a New York Times op-ed piece (December 29, 2003):

 

To end ... cognitive dissonance ... we [often] change the weak cognition to conform to the stronger one. Take Aesop's fox, who could not reach a lofty bunch of grapes no matter how high he jumped. One foxy cognition was that grapes were delicious; the other was that he couldn't get them. To resolve that cognitive dissonance, the fox persuaded himself that the grapes were sour - and trotted off, his mind at ease.

 

So, if two cognitions are dissonant, we tend to change one or both to make them consistent with each other. This often results in "denial", which in the cognitive dissonance context is considered to be the suppression or unrealistic appraisal of evidence in an effort to reduce cognitive dissonance. Denial is, by definition, invisible to the person or group that is subject to it, but often easily visible to outsiders. A well known example of cognitive dissonance induced denial is that of the wife whose husband is "cheating" on her: while many friends and family have seen enough evidence to feel fairly confident that they understand what is going on, the faithful wife refuses to acknowledge the possibility even when the evidence is placed before her by well meaning friends. In this case, the dissonance is between her cognitions about the man who expresses his love for her, and on whom she depends in various ways as a result of the life they have built together, and the evidence suggesting that that same man is being sexually unfaithful to her. The more she fears the consequences of the second cognition, the blinder she is likely to be to evidence supporting it.

 

Cognitive Dissonance's Religious Roots

Leon Festinger is the father of cognitive dissonance theory, which was based on his observations of a Chicago-area flying saucer cult of the 1950s whose prophecy of universal destruction failed to come true. The cult prophesied that a vast flood would kill everyone on Earth except for the members of the cult, who would be carried away by flying saucers. The flood, of course, did not materialize. But the faith of the cult members, while stressed, was not broken.

 

As Festinger put it:

 

A man with a conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.

 

We have all experienced the futility of trying to change a strong conviction, especially if the convinced person has some investment in his belief. We are familiar with the variety of ingenious defences with which people protect their convictions, managing to keep them unscathed through the most devastating attacks.

 

But man’s resourcefulness goes beyond simply protecting a belief. Suppose an individual believes something with his whole heart; suppose further that he has a commitment to this belief, that he has taken irrevocable actions because of it; finally, suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before. Indeed, he may even show a new fervour about convincing and converting other people to his view. (Leon Festinger, Henry W. Riecken, and Stanley Schachter, “When Prophecy Fails”, (New York: Harper and Row, 1956), p. 3)

 

He later continued respecting the reaction of believers to a significant failed prophecy:

 

... dissonance would be reduced or eliminated if the members of a movement effectively blind themselves to the fact that the prediction has not been fulfilled. But most people, including members of such movements, are in touch with reality and cannot simply blot out of their cognition such an unequivocal and undeniable fact. They can try to ignore it, however, and they usually do try. They may convince themselves that the date was wrong but that the prediction will, after all, be shortly confirmed; or they may even set another date as the Millerites did.... Rationalization can reduce dissonance somewhat. For rationalization to be fully effective, support from others is needed to make the explanation or the revision seem correct. Fortunately, the disappointed believer can usually turn to the others in the same movement, who have the same dissonance and the same pressures to reduce it. Support for the new explanation is, hence, forthcoming and the members of the movement can recover somewhat from the shock of the disconfirmation. (Festinger et al., p. 28)

 

The cult in question explained the failed prophecy by the fact that on the critical night their prayers "had spread so much light that God saved the world from destruction". And the cult became fervently evangelistic. Festinger suggests that the only way for them to reverse their humiliation was to convert other people to their beliefs. If everyone believed, no one would laugh. However, this did not work: after such a spectacular failure, the cult predictably failed to convert anyone. The platitude that if you are going to lie, you should tell a big one seems as true here as elsewhere.

 

Other researchers have questioned Festinger’s emphasis on increased levels of proselytizing as a cognitive dissonance coping mechanism, and have proposed broader behavioural models that are in my view consistent with the basics of cognitive dissonance theory.   For example, in the context of a failed Lubavitch Jewish messianic expectation, Simon Dein noted the following:

 

A popular model for looking at failed prophecy is that of Zygmunt (1972) who suggests three modes of adaptation to prophetic failure: adaptation, reaffirmation and reappraisal. First, believers may acknowledge an error of dating such as occurred among the Millerites. Second, the blame may be shifted to some force inside or outside the group which interferes with the cosmic plan. Lastly, believers may postulate that the event in fact occurred but on the spiritual not on the material plane and was not, therefore, directly observable to believers. Building on Zygmunt’s ideas, Melton (1985:21) argues “the denial of failure of prophecy is not just another option, but the common mode of adaptation of millennial groups following a failed prophecy.” He suggests two additional modes of adaptation, those being social and cultural. The cultural or spiritualization mode means the groups tend to reinterpret the promise of a visible verifiable event into the acceptance of a nonverifiable, invisible event. The prophecy has come about on a spiritual plane. Members may however still experience dissonance and emotions such as sadness, fear, bewilderment and disappointment and it is for this reason that the prophecy must be reinterpreted. The social mode addresses emotional distress by placing an emphasis on renewing group ties after disconfirmation. (What Really Happens When Prophesy Fails:   The Case of Lubavitch, p. 3, http://www.findarticles.com/cf_dls/m0SOR/3_62/79353385/p3/article.jhtml?term=)

 

In the Lubavitch case, the observed behaviour fit the theory of adaptation nicely. It is also interesting to note that Lubavitch is one of the few proselytizing, messianic factions of Judaism, and that its relatively moribund fortunes within the Hasidic branch of Judaism were rescued by several charismatic, missionary-oriented leaders, the last of whom was presumed to be the Messiah until well after his death. Again, the big stories are the ones that sell.

 

Other similar failed prophecy scenarios have received a great deal of attention. One, also referred to by Festinger, is known as "The Great Disappointment" - an event in the early history of the Seventh-day Adventist Church. Between 1831 and 1844, William Miller, a Baptist preacher, launched what he called the "great second advent awakening", also known as the Millerite Movement. Based on his study of the prophecy of Daniel 8:14, Miller calculated that Jesus would return to earth sometime between 1843 and 1844. Others within the movement calculated a specific date of October 22, 1844. When Jesus did not appear, Miller's followers experienced what came to be called "the Great Disappointment". Thousands of followers left the movement. A few, however, went back to their Bibles to find out why they had been disappointed. They concluded that the prophecy predicted not that Jesus would return to earth in 1844, but that a special ministry in heaven would be formed on that date. From this started the modern-day Adventist Church. (See http://en2.wikipedia.org/wiki/Great_Disappointment)

 

Of even more interest are the numerous similar failed predictions made by the Jehovah's Witnesses. As noted by Penton, a historian of the Watchtower movement:

 

No major Christian sectarian movement has been so insistent on prophesying the end of the present world in such definite ways or on such specific dates as have Jehovah’s Witnesses, at least since the Millerites and Second Adventists of the nineteenth century who were the Witnesses’ direct millenarian forbears. During the early years of their history, they consistently looked to specific dates-1874, 1878, 1881, 1910, 1914, 1918, 1920, 1925, and others – as having definite eschatological significance…When these prophecies failed, they had to be reinterpreted, spiritualized, or, in some cases, ultimately abandoned. This did not deter Russell [the JW leader] or his followers from setting new dates, however, or from simply proclaiming that the end of this world or system of things was no more than a few years or perhaps even months away.   (M. James Penton, “Apocalypse Delayed” (Toronto: University of Toronto Press, 1985), p. 34)

 

Later dates of 1944 and 1975 were also prophesied.

 

The experience of the JW community respecting the 1925 “second coming” was typical.   As noted by Randall Watters (See “When Prophecies Fail: A Sociological Perspective on Failed Expectation in the Watchtower Society” http://www.freeminds.org/psych/propfail.htm):

 

At the death of C.T. Russell in 1916, J.F. Rutherford took over the role of the “prophet”, proclaiming in 1920 that Millions Now Living Will Never Die in a booklet and lecture by the same name. Rutherford set a new date for the end for 1925, also claiming that it would bring the resurrection of the ancient men of God to the earth, such as Abraham, Isaac, David, etc. So sure was Rutherford of this that he made the following statements:

 

Therefore we may confidently expect that 1925 will mark the return of Abraham, Isaac, Jacob, and the faithful prophets of old, particularly those named by the apostle in Hebrews chapter 11, to the condition of human perfection.

 

The date 1925 is even more distinctly indicated by the scriptures than 1914.

 

Our thought is, that 1925 is definitely settled by the scriptures. As to Noah, the Christian now has much more upon which to base his faith than Noah had upon which to base his faith in a coming deluge.

 

Rutherford even had a house built in San Diego for these ancients, and it was deeded to them when it was built! Bearing witness to the ability of the Witnesses to ride out this period of disconfirmation, the house and the prophecy wasn't abandoned until 1943, when it was promptly sold. The Witnesses were later told that it was "built for brother Rutherford's use."

 

Tremendous disappointment and disillusionment followed this failure.   Watters continues:

 

The disappointment didn’t last long, however. The outbreak of World War II was seen as the beginning of Armageddon. An in-house publication of the Watchtower stated in 1940:

 

The Kingdom is here, the King is enthroned. Armageddon is just ahead. The glorious reign of Christ that shall bring blessings to the world will immediately follow. Therefore the great climax has been reached. Tribulation has fallen upon those who stand by the Lord.

 

The Watchtower of September 15, 1941 (p. 288) even stated that we are "in the remaining months before Armageddon." Armageddon fever was at an all-time high. Barbara Grizzuti Harrison, former member of the Watchtower's Bethel family, gives us a glimpse of the air of expectancy:

 

So firmly did Jehovah’s Witnesses believe this to be true that there were those who, in 1944, refused to get their teeth filled, postponing all care of their bodies until God saw to their regeneration in His New World. (One zealous Witness I knew carried a supply of cloves to alleviate the pain of an aching molar which she did not wish to have treated by her dentist, since the time was so short till Jehovah would provide a new and perfect one. To this day, I associate the fragrance of cloves with the imminence of disaster.)”

 

Amazingly, the new leadership, while initially cautious about making further predictions, eventually fell prey to its predecessors' habit and set another date. Watters notes:

 

... the prediction of 1975 that first appeared in Life Everlasting in Freedom of the Sons of God (1966). Exercising caution in stating that this new date would definitely be the end, Franz (through his public lectures and Watchtower articles) made statements such as "according to this trustworthy Bible chronology six thousand years from man's creation will end in 1975, and the seventh period of a thousand years of human history will begin in the fall of 1975 C.E." Any Jehovah's Witness knew that the end of 6000 years meant the beginning of the millennium of Christ's reign. The Awake! magazine of October 8, 1968 (p. 14) stated, "How fitting it would be for God, following this pattern, to end man's misery after six thousand years of human rule and follow it with his glorious Kingdom rule for a thousand years!"

 

In lectures given to the members of the headquarters staff in New York, Franz stated (regarding the end) that “we don’t know now if it will be weeks or months,” before a crowd of 2000 Witnesses. Many other statements were made in print. One traveling overseer even gave a public talk indicating it would be a total lack of faith to doubt that 1975 would be the end! Franz became the fourth president of the Watchtower a year later.

 

Unlike the flying saucer cult and the Millerites, the Watchtower was at first unwilling to accept blame for the disconfirmation, shifting it to “over-zealous brothers.” Many Witnesses, however, were outraged and the Watchtower finally accepted much of the blame publicly.

 

Friends of those who were Jehovah’s Witnesses often noted the changes in their lives as 1975 approached. Janice Godlove relates this regarding her JW brother and sister-in-law:

 

As 1975 approached, the signs of tension increased. Strange bits and pieces of the family atmosphere came to our attention. There was an almost morbid fascination with flocks of birds gathering in the fall. We were given all of their canned goods since they wouldn’t need them anymore. An access panel had been cut in the wall behind their washing machine and the boys (who were 5 and 3 at the time) were told to run to the kitchen and hide if they heard screams. Bill was so disappointed by the failure of 1975 that he attempted suicide. But the tract we left by his hospital bed went unread and the family remained in the organization.

 

Today, each of the above failures is played down, and no reason is officially given for them. Many recent JW converts are not even aware of the relevant history. There are close parallels here to the faithful Mormon community's ignorance of many important aspects of Mormon history and, in particular, of Joseph Smith's failed prophecy with regard to Christ's second coming. That failure is explained within the Mormon community, by the few who are aware of it, on the basis that the prophecy was conditional on the worthiness of the people, and they failed.

 

Watters concludes as follows respecting the resilience of the JW organization:

 

A pattern emerges when we examine the growth figures before and after each disconfirmation [failed prophecy]. Typically, there was a rapid growth in numbers at least two years before the prophetic date, followed by a falling away of some (viewed as a "cleansing" of the organization of the unfaithful), then another growth spurt as a new emphasis on evangelism was put forward.

 

It may seem incomprehensible how the Witnesses could ignore the implications of each disconfirmation. Outsiders view the Witnesses as lacking common sense for not leaving the organization after numerous failures. They fail to understand the dynamics of mind control as used by cults. Even many ex-JWs fail to understand that the further disconfirmation of the importance of 1914 and “this generation” will not seriously affect the numbers of those swelling the ranks of the Watchtower. The results of mind control and unquestioning obedience will have the same effect today as it did in Russell’s day. His view was, “Where else can we go?” Harrison writes regarding this attitude,

 

That, of course, is one of the keys to survival of the organization Russell founded on soft mysticism, glorious visions and worldly disaffection. The Witnesses had nowhere else to go. Their investment in their religion was total; to leave it would have meant spiritual and emotional bankruptcy. They were not equipped to function in a world without certainty. It was their life. To leave it would be a death.

 

This same dependency-unto-death phenomena is at work in thousands of cults all over the world. People wondered at Jonestown: “Why didn’t they leave when they saw what Jim Jones was becoming?” The people of Jonestown answered by their actions, “Where else would we go?” They had burned their bridges to follow their Messiah unto death.

 

Over 110 years and several failed prophecies later, the Watchtower movement is testimony enough that failed predictions do not mean the dissolution of a cult following. The failure of 1975 resulted in a decrease of less than 2%. The Watchtower will always be able to develop clever rationalizations regarding their changing dates, as their history documents. Today, the Watchtower grows at a rate of about 5% per year worldwide, with over 3.7 million door knockers and over 9 million sympathizers!

 

The behaviour of the JW faithful in the face of the kind of incontrovertibly disconfirming evidence just described is hard to understand. The reaction of the Millerites seems more sensible - most believers decided that they should not continue to believe. Nonetheless, the evidence from many cases in addition to the few summarized above seems clear: in certain cases, the ties that bind a group together may become so strong that, in the short term at least, it is virtually impossible to shake the members' faith. In fact, it is in my view consistent with the evidence to suggest that the more potentially disruptive a piece of information is to one's comfortable existence, the more likely it is to be suppressed or otherwise misrecognized. While evidence of this is everywhere around us, including in the summaries of human reaction to failed prophecy above, none is more disturbing than that found in the autobiographical Holocaust classic "Night", by Elie Wiesel. Here we find graphic evidence of misrecognition's power to shape our perception of reality.

 

Wiesel tells the story of how he lived as a 14-year-old Jewish boy in a small Hungarian town called Sighet during World War II. As the Nazis gradually closed their net around this town, rumours began to circulate. However, the residents found reasons to believe that their comfortable little world would not collapse, and so few if any of them escaped while they had the chance. At some point, all of the foreign Jews in Sighet were expelled. One of them was Wiesel's religious mentor, Moshe the Beadle, a joyful, deeply spiritual man. Months passed, and life in Sighet continued mostly at its comfortable pace.

 

Then Moshe returned. He told a chilling story. The train in which the deportees left Sighet had crossed the border into Poland and been handed over to the Gestapo. The Jews were forced to get off and dig huge pits. Then they were all - men, women and children - machine gunned and pushed into what became their graves. Some babies were tossed into the air and used as human skeet to entertain the soldiers. Moshe was wounded and left for dead. It had taken him months to make his way back to warn his friends. Wiesel notes:

 

Through long days and nights, [Moshe] went from one Jewish house to another, telling the story of Malka, the young girl who had taken three days to die, and of Tobias, the tailor, who had begged to be killed before his sons...

 

Moshe had changed.   There was no longer any joy in his eyes.   He no longer sang.   He no longer talked to me of God or of the cabbala, but only of what he had seen.   People refused not only to believe his stories, but even to listen to them.

 

"He's just trying to make us pity him. What an imagination he has!" they said. Or even, "Poor fellow. He's gone mad."

 

And as for Moshe, he wept.

 

No one would believe him.   Not even his protégé Wiesel.   This was toward the end of 1942.   There was plenty of time to escape.

 

Wiesel goes on to tell of how more and more news of the war, Hitler's atrocities in general and his plans for the extermination of all Jews gradually infiltrated his town. They heard of what the Germans were doing to the Jews in other parts of Europe. Still lots of time to escape, but no one put what Moshe had said together with these reports and acted. Finally the Germans arrived. They began to remove Jewish liberties - still time to escape, and no one acted. Then they created Jewish ghettos, and finally prepared them for mass deportation. All along the way, it would have been possible for many to escape. But at every juncture along this path, the good Jewish people of Sighet rationalized in different ways that things were not so bad; that their lives were not going to change too much; that their god would watch over them. This rationalization ended only as they watched their family members being led into the gas chambers and furnaces of Auschwitz, or entered there themselves.

 

With the benefit of hindsight, we can say that they should have been able to see the signs all around them as to what was happening.   Why could they not see the obvious?

 

Two things come to mind. First, I have become acutely aware during the past couple of years of how difficult it is to see anything that may shatter one's world. That is why spouses cannot see abusive, unfaithful, or addictive behaviour in each other. That is why Mormons are resistant to any information that suggests their worldview is incorrect. However, as the Jews of Sighet illustrate, this power is far greater than I could have understood until I read of their experience. In short, the emotional aspect of human life regularly overcomes what to an objective observer would likely seem to be an insurmountable rational case.

 

And second, the Jewish people of Sighet suffered from a kind of magical worldview. That is, they believed in miracles; they believed that their God had, could and would override the physical laws of cause and effect for their benefit, provided that they were appropriately obedient and exercised the right kind of faith. They believed that god would protect them. And most of all, they did not believe that the kind of evil Hitler represented could exist, let alone bring its weight to bear upon them, even in the face of first hand evidence from Moshe and unlimited amounts of anecdotal evidence from other sources. They were naturally resistant to anything that contradicted their magical worldview. This belief was part of what cost the vast majority of those who held it their lives. Many other Jews - often well educated and less magically inclined people such as Einstein - understood enough to leave while they could. That is, the broader a person's worldview and the better developed her connection to reality (and hence the less affected she was by magical thinking), the more likely it was that she would flee. Note that I did not mention intelligence. Many intelligent people died at Auschwitz, due to some extent at least to their poor purchase on reality. The framework within which intelligence is exercised is, as it turns out, much more important in many ways than the degree of intelligence.

 

Perhaps the message that comes clearly through the above accounts of group denial is that the denial inducing nature of cognitive dissonance makes it difficult to self-diagnose.   This highlights the importance of getting outside of one’s self perception.   By definition, we cannot see our own blind spots.   We must either have others we learn to trust point them out for us, or we may eventually feel them through the cognitive dissonance process, which is usually lengthy and painful.

 

This reminds me of something I have heard many of my clients complain about - the 360-degree review process that has become fashionable in the business community. That process requires senior executives to authorize an outside consultant, who will protect the confidentiality of the other participants, to collect and summarize their views regarding the executive in question. The participants are drawn from the environment that surrounds the executive - hence the term "360 degree". Superiors as well as subordinates will be interviewed, as will customers, suppliers and other organizational stakeholders to whom the executive is relevant.

 

For the average self-confident, often egotistical senior executive, this is a bruising process that can provide a wealth of information as to the location and nature of personal blind spots. It will usually dramatically increase cognitive dissonance in the short term (What do you mean they all hate me!? They smile at me and are nice to me every time I see them!), and it will cause behavioural modification that dramatically improves performance and likely reduces long-term cognitive dissonance, or dissonance respecting more important issues (What do you mean they have all quit!? They said they liked me every time they saw me!).

 

As noted above, one of Buddhism's central and enlightening notions is that most of mankind's ills are caused by the manner in which fear or desire cause us to make unwise decisions. As the following summary of recent research will show, this ancient insight is remarkably accurate. Buddha's "middle way" was the path that lay between fear and desire and so was out of both their reaches. And since a good portion of desire is fear that we will not obtain that which we most desire, fear is the most primal and effective of emotions. The well known case of denial in marriages where infidelity is a problem illustrates this. The faithful spouse is usually unable to see the evidence of cheating until well after most others can see it. This denial of reality is primarily a function of the spouse's fear of losing the relationship if the information in question is processed and dealt with. The greater the fear, the greater the cognitive dissonance it will produce, and the deeper will be the consequent denial and suppression of threatening information.

 

The psychology related to personality profiles indicates that not all people are influenced by fear and desire in the same way. In one study that focussed on the question of why some people are more religiously inclined than others, it was determined that the personality trait called "openness" correlates strongly with religious tendencies (see Shermer, "How We Believe"). Openness is the inclination toward new experience; the opposite of dogmatism. The more "open" a person is, the less likely she is to be influenced by fear in any particular situation, and the less likely she is to be religious in the traditional sense of that word. That is, the less likely she will be to accept traditional religious authority and the literalistic interpretation of scripture it posits. And of course the opposite is also true.

 

So, the picture that comes into focus is that in any particular case, the unconscious suppression or reinterpretation of information is a function of two things: first, how open to new experience the individual in question is, and second, how significant the fear is that the denied information is perceived to create.
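
To make that two-part claim concrete, here is a minimal sketch in the same spirit as the earlier one. The multiplicative form and the 0-to-1 scales are again my own assumptions made purely for illustration; the personality and dissonance research supplies no actual formula.

    # Toy sketch of the two-factor claim above: the tendency to suppress or
    # reinterpret threatening information rises with the fear it provokes and
    # falls with the person's openness to new experience. The functional form
    # and the 0-to-1 scales are invented assumptions.
    def denial_tendency(fear, openness):
        """Both inputs are on a 0-to-1 scale; the result is a 0-to-1 score."""
        return fear * (1.0 - openness)

    print(denial_tendency(fear=0.9, openness=0.2))  # fearful and dogmatic: 0.72
    print(denial_tendency(fear=0.9, openness=0.8))  # fearful but open: 0.18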

 

A faithful Mormon should be expected to experience massive amounts of fear upon contemplating the possibility that the religious truth-claims on which much of his life, family and social relationships are based are false. This fear produces a powerful form of cognitive dissonance, and hence an extensive reinterpretation or suppression of the information. We should expect that the more faithful the Mormon, the less able she will be to see the reality of the institution that sponsors her religious faith and the effect that faith has upon her.

 

So, what do we learn from these bits of theory and history? Is it as simple as the statement wrongly attributed to P.T. Barnum implies - that there is another sucker born every minute? (See http://www.historybuff.com/library/refbarnum.html.) Hardly, although since Barnum was allegedly equating "suckers" with those whose buttons he could push so as to cause them to buy his wares, the statement attributed to him was right. More importantly, the clear message of religious history relative to cognitive dissonance is that the social and psychological forces that are the subject of this essay are formidable enemies and powerful allies. And while the reaction of particular individuals or groups to information that should cast serious doubt on their religious beliefs cannot be predicted, the clear and universal pattern is one of denial and stubborn resistance to any evidence that may or should disconfirm the beliefs on which their worldview as well as their social and familial relationships are based. This denial is followed in some cases by painful acceptance and adjustment, usually within "the faith".

 

Cognitive Dissonance's Power is Magnified by Uncertainty and Fear

Many religions (including the Christian and Mormon) create significant amounts of uncertainty that would not have existed without their intervention. This uncertainty has the effect of creating fear, and so inflates the perceived value of the certainty and security that the very religions that created the fear have to offer. The Mormon use of the Celestial Kingdom exemplifies this practice. It is no different, however, from the use by the Christian, Jewish and Muslim faiths of the concepts of heaven, hell, purgatory and other afterlife states that depend upon obedience to religious authority during this life, or the Hindu concept of reincarnation that rewards or punishes one in the next life for his adherence to certain prescribed standards in this one, or innumerable other religious systems that are set up in this fashion.

 

The historians and anthropologists I have read on this topic suggest that religion has since time immemorial been used as an important part of the glue that binds groups together, and that fear is a significant part of that glue. However, many of the particularly inward looking aspects of religion are likely the result of a process that started when religious leaders first had to compete against each other for a limited pool of followers. This competition required that those leaders distinguish their religions from the competition. One effective way to do this is to posit that their particular brand of religion has a monopoly respecting certain concepts (such as the Celestial Kingdom) and on God's approval, and that anyone who rejects this belief will be punished by God. This likely raised the "fear" stakes significantly in the religious marketplace. Anthropologists who study isolated groups often find that religion in that context does not deal with the concept of one religion being "true" and others being "false". This is a foreign notion to people who do not have a history of having to choose between religious traditions.

 

If you think of what likely happened as small, isolated groups of humans merged into more cosmopolitan societies, the above theory makes sense. Each group would have brought its own religion. As the boundaries between the groups broke down, people would have had to choose between religions. Religious belief systems change over time, and so beliefs would have come into play that could then be used to help one religious leader persuade his flock not to leave, and others to join. And religion was of course used as a political tool. These concepts are relevant to that process as well. The Old Testament is an account of mankind during the period while this process was underway.

 

As noted above, the perception that we have power to overcome what frightens us makes us feel secure. So, we sometimes cling in an unhealthy fashion to the things that help us to overcome our fears. All a religion has to do is create beliefs (the Celestial Kingdom exists and only obedient Mormons can go there to live eternally with their families) that cause desire/fear (I want to be with my family and hence fear not being with them after death) and offer the "power" to obtain what is wanted and so to avoid the fear. The more deeply we fear, the more prepared we will be to bargain away much of what we have in terms of time, money, talent etc. to avoid that fear. So, the Mormon belief is that obedience to Mormon authority in myriad ways is required in order to have the power to be with our families after death. And the Mormon Church is not bashful - it requires that we promise all that we have before it will release us from our fear, and then makes us feel lucky that it does not require that we give all we have promised to give. This is a stock psychological persuasion or sales trick (see Robert Levine, "The Power of Persuasion"). First ask for the moon, and then when something more reasonable is requested in lieu, it is far more likely to be given than if the smaller request were made up front. "All you want is 10% of my income and most of my free time!? Sure! That is so much better than what I thought you were going to ask for." After all, the covenant each faithful Mormon makes in a Mormon temple is:

 

you do consecrate yourselves, your time, talents, and everything with which the Lord has blessed you, or with which he may bless you, to the Church of Jesus Christ of Latter-day Saints, for the building up of the Kingdom of God ...

 

The reason the system that links belief, fear and power works so well for religious leaders is that the thing desired/feared is based on a belief, the creation of which is under the religious leader’s control.   The thing believed is not real, and hence the power it creates lasts only as long as the belief does. That is, this power is not like that which comes from holding a loaded gun that would do something if the trigger were pulled. All a believer has to do is find out that the gun is not loaded, refuse to obey, and the power evaporates; the bully runs away. Regrettably, that is easier said than done.

 

Because the authority of religious leaders is strongest in environments of maximum uncertainty, religious claims tend to be both spectacular (to arouse desire) and untestable (to maximize uncertainty). Did the Virgin Birth occur? How about the resurrection or the miracles Christ is said to have performed? Did Christ himself exist? Did Joseph Smith see God? Is Heaven real? None of this can be proven or disproven. And the stakes are set as high as possible. If you are wrong, you will burn in hell; be without your family; suffer some other horrible thing; and so be miserable forever. So why not just obey, since what you are being asked to do is not so bad anyway?

 

When religious issues are framed in this fashion, uncertainty is maximized, and when a potential believer is surrounded by a group who have accepted the beliefs as both real and of the utmost importance, it is hard to resist. This decision making framework puts the maximum weight on apparent authority and on what the majority of the group by which the individual is surrounded does. This plays into the hands of the dominant religious force in the group, and since that would be the religious group that is trying to prevent its members from joining other groups, it makes sense that this is how things would have developed. And it makes it highly probable that any reality that is inconsistent with the group's foundational beliefs will be denied.

 

When a proposition that supports dogmatic religious authority becomes testable, this reduces the tendency of believers to rely upon groupthink and apparent authority. Hence, religious leaders in such cases are often quick to redefine the issues to the extent possible so that they remain uncertain. A recent example in this regard can be drawn from the Mormon context in the form of the debate over the Book of Mormon and DNA evidence.

 

One of my friends provided a few examples of how this kind of debate tends to work. He noted that Karl Popper, the philosopher who formulated the roots of the hypothetico-deductive model that is used by most working scientists today and that has been used to cast doubt on the Book of Mormon in many ways, worked to draw the line between what should be considered science and what should not. For instance, there may be an invisible unicorn behind the moon, but because there is no way to test this proposition it is not scientific in nature. Popper also claimed that Marxism was at one time based on a scientific model of society that could be tested. Later, it was tested and was shown not to work as it had predicted it would. But in the wake of its refutation, Marxists often attempted to redefine Marxism so that it could not be tested, and so that the tests to which history had already subjected it could be deemed invalid. Hence, Popper claims that Marxism as now often defined is not scientific.

 

In a fashion similar to Marxism, the Book of Mormon was set up by Joseph Smith and confirmed by many of his successors (including, most notably and recently, Spencer Kimball) on the so-called hemispheric theory (that Book of Mormon peoples occupied most of the Americas). This theory came within the grasp of science, was tested, and the evidence now shows a high probability of its refutation or falsification, to use Popper's term. So, the Mormon Church and its apologists now seek to redefine the theories on which the Book of Mormon is based. This requires that what Smith and other Mormon prophets said about the book that was consistent with the hemispheric theory be dismissed as "non-prophetic error" and henceforth ignored. The so-called limited geography theory (that the Book of Mormon narrative was played out in an area so small that we have not yet found it and can reasonably expect never to find it), which is the result of this redefinition process, has been made as difficult to test as possible - so difficult that it is likely a non-scientific theory.

 

I would suggest that the whole point of those who defend the Mormon Church's position is to make their positions as difficult to test as possible, and that this is done for the reasons noted above - to maximize uncertainty, which maximizes the influence of religious authority and groupthink over individual decision-making, which in turn maximizes the power of religious leaders. However, the efforts of Mormon leaders do not put their revised theories beyond the ken of probabilities. How probable is it that there is a unicorn behind the moon? It would be nice to test this theory, but we don't need to given what we know about the world in which we live. This experience gives us enough evidence to form an opinion on which we are prepared to act.

 

Given the experience we have seen others have with communism, do we much care how those who still wish to use it choose to define it? Given the mass of evidence contra the Book of Mormon being real history, do we care whether a small part of the theory believed by those who want to protect their beliefs respecting that book is not scientifically testable? And finally, respecting the Book of Mormon in general and the limited geography theory in particular, consider the following. We are dealing with scientists and historians whose stated objective is to support their belief in the Book of Mormon. Their conclusions disagree with those of every non-believing expert to have ever carefully considered the matter. And the effect (if not purpose) of these faithful Mormon scientists' and historians' research program is to take something that was testable and failed its test, and make it untestable, just as the Marxists have done with their failed project. Why should we take such people seriously? They have just slightly more credibility than those who still argue that the earth is flat. See http://www.flat-earth.org/ and http://www.talkorigins.org/faqs/flatearth.html for their story.

 

Regrettably, the kind of reasoning just outlined has little impact on most believers. Such is the effect of denial. They want to believe, and the weak arguments put forward by their religious leaders are more than enough to keep them marching in line.

 

Cognitive Dissonance and Self Esteem

A number of insightful experiments have been conducted respecting the connection between self-esteem and cognitive dissonance using cheating. In one experiment the self-esteem of certain college students was modified by giving them false feedback after they had filled out a personality survey. One-third of the students were told that the survey showed they were mature, interesting, etc. Another third were told that it indicated they were relatively immature, uninteresting, etc., and the remaining third were not given any information respecting the survey. Then the students were run through an experiment that involved playing a game of cards against other students. They were offered the opportunity to cheat where it seemed impossible that they would be detected. If they did not cheat, they would lose. If they did cheat, they would be certain to win a sizable sum of money which they would keep. The results showed that the students who had previously received information that tended to lower their self-esteem cheated far more than those who had received self-esteem bolstering information. Those who received no information fell between the other two groups. This highlights the importance of bolstering the self-esteem of children and others.

 

As noted above, obedience to Mormon authority is the dominant paradigm within Mormon culture. And obedience is required with regard to a host of rules. Many teenagers in particular, in our increasingly cosmopolitan and information-rich world, find it difficult to obey Mormonism's rules. This causes them to run headlong into family and social group influences that batter their self-esteem. Cognitive dissonance theory suggests that this will reduce the strength of their moral fibre. I experienced this myself as a teenager, and have seen it particularly in the life of my eldest daughter as she struggled during her teenage years with a personality that was naturally resistant to authority and therefore to the requirements of Mormon life.

 

There are, of course, limits to the utility of high self-esteem. If it borders on narcissism and is not grounded in reality - amounting to a false sense of superiority - it can cause severely dysfunctional behaviour.

 

Cognitive Dissonance and the Perception of Pain

Cognitive dissonance theory also applies to certain physiological phenomena. In one experiment, many people were subjected to intense electric shocks. Half of those people were in a high dissonance condition. That is, they had been induced to commit themselves to the experiment with little external justification (payment, for example). The other half were in a low dissonance condition. That is, they had significant external justification. The results of the experiment indicated that those in the high dissonance condition reported less pain than their low dissonance counterparts. This phenomenon was measured physiologically. That is, there is clear evidence that the physiological response to pain, as measured by galvanic skin response, was somewhat less intense for those in the high dissonance condition. Also, the pain of the high dissonance participants interfered less with tasks they were performing while experiencing that pain.

 

Similar results have been shown for hunger and thirst. A group of people were set up in similar high and low dissonance conditions and then subjected to significant degrees of hunger and thirst. The high dissonance participants reported less hunger and thirst than did the low dissonance participants. And after the experiment, the high dissonance participants consumed less food and water than did the low dissonance participants.

 

These experiments explained what I have observed with regard to the Mormon "fast". As indicated above, each month most faithful Mormons go without food and water for a period of 24 hours. While doing so, they go about a full set of Sunday activities. While I was serving as Bishop, this generally involved at least eight hours of meetings that started early on Sunday morning. Non-Mormons marvel at the Mormon ability to nonchalantly skip eating and drinking for an entire day (and going without food is much less difficult than going without water, I might add). I previously assumed that it was simply a matter of getting used to going without food that made the feat seem so extraordinary to those who had not done it. However, I can now see why a typical Mormon would have some significant assistance from cognitive dissonance in controlling his urges for food and drink that would not be available to a non-Mormon (Aronson, "The Social Animal", pages 190-191).

 

Cognitive Dissonance and Memory

In another series of cognitive dissonance experiments it was shown that when people hear or read arguments respecting something that is important to them, they do not remember the best arguments pro and con, which would be the rational thing to do. It would be rational because remembering the best argument contrary to your own position would enable you to modify your position if you were eventually presented with evidence consistent with that argument. What the experiments clearly showed was that people remember the strongest arguments in favour of their position, and the weakest arguments against their own position. Again, this behaviour is clearly in evidence when dealing with faithful Mormons (see Aronson, "The Social Animal", pages 176-177).

 

Cognitive Dissonance and Culture

It is likely that cognitive dissonance is a universal human phenomenon. However, the manner in which dissonance is experienced and how it affects individual behaviour depends to an extent on personality type and also on the nature of the culture within which the dissonance is experienced. In societies that are less individualistic than the North American, for example, dissonance reducing behaviour often takes on a more communal form. Japanese students were subjected to a replication of the original Festinger-Carlsmith experiment that gave rise to cognitive dissonance theory. That was the experiment in which students were induced to perform a boring task and then lie to the next student coming into the experiment with regard to how exciting the task was. The students who were paid a small amount of money to tell the lie changed their view with regard to the experiment (they decided that it was more exciting than they had previously thought), whereas those who were paid a large amount of money, and hence had adequate external justification for their lie, did not modify their opinion with regard to the nature of the experiment. When this experiment was repeated in Japan, it was found that those who lied for a minimal reward still came to believe their own lie. In addition, people who observed someone they knew and liked stating that an objectively boring task was interesting also experienced cognitive dissonance and, as a result, came to believe that the task was interesting. In short, in a communal and deferent-to-authority culture like Japan's, cognitive dissonance has a much greater effect than is the case in ordinary North American culture.

 

It is not a coincidence that the beehive is used as the Mormon societal symbol. I would suggest that Mormons are much more like the Japanese than are typical North Americans. This idea is supported by Richard Nisbett's "The Geography of Thought" as well as by the observations of many people who have served Mormon missions in Asia and then returned to a relatively non-Mormon social situation in North America. Hence, I would suggest that the line of Japanese research relative to cognitive dissonance is likely applicable within Mormon culture.

 

The Confirmation Bias

Often, cognitive dissonance theory is used to help us understand why humans in particular situations appear to be biased toward certain types of behaviour. For example, once we have made up our minds about something and held the opinion for some time, we are biased in favour of not changing our minds. This is called the "confirmation bias". The idea we first come to hold is one cognition; the new idea is another one. We tend to be irrationally biased in favour of the first idea. Some psychologists believe that this bias alone is responsible for more faulty human decisions than any other human foible.

 

Mark Twain, without the benefit of the research we have, hit this nail as squarely on the head as is possible when he said,

 

"I have seen several entirely sincere people who thought they were (permanent) Seekers after Truth. They sought diligently, persistently, carefully, cautiously, profoundly, with perfect honesty and nicely adjusted judgment–until they believed that without doubt or question they had found the Truth. That was the end of the search. The man spent the rest of his life hunting up shingles wherewith to protect his Truth from the weather. If he was seeking after political Truth he found it in one or another of the hundred political gospels which govern men in the earth; if he was seeking after the Only True Religion he found it in one or another of the three thousand that are on the market. In any case, when he found the Truth he sought no further; but from that day forth, with his soldering-iron in one hand and his bludgeon in the other he tinkered its leaks and reasoned with objectors." (Mark Twain, "What is Man?")

 

The confirmation bias has also been used to explain even more striking examples of information misrecognition. Aronson notes the following with respect to the Heaven's Gate cult:

 

In 1997, thirty-nine members of Heaven's Gate, an obscure religious cult, were found dead in a luxury estate in Rancho Santa Fe, California - participants in a mass suicide. Several weeks earlier, a few members of the cult had walked into a specialty store and purchased an expensive, high-powered telescope so that they might get a clear view of the Hale-Bopp comet and the spaceship they fervently believed was traveling behind it. Their belief was that, when the comet got close to earth, it was time to rid themselves of "earthly containers" (their bodies) by killing themselves so that their essence could be picked up by the spaceship. A few days after buying the telescope, they came back to the store, returned the telescope, and politely asked for their money back. When the store manager asked them if they had had a problem with the scope, they indicated that it was defective: "We found the comet all right, but we couldn't find the spaceship that is following it."

 

Needless to say, there was no spaceship. But, if you are so convinced of the existence of the spaceship that you are willing to die for a ride on it, and your telescope doesn't reveal it, then, obviously, there must be something wrong with your telescope! (Aronson, "The Social Animal", page 150)

 

This explains the inability of the Mormon faithful to process, in what most people would call a rational manner, clear-cut information with respect to the deceptive nature of their religious leaders, past and present. When faithful Mormons are confronted with this evidence, most initially dismiss it out of hand as untrue, and when confronted with evidence they cannot dismiss, they rationalize (for example) that Joseph Smith was a man who said he was imperfect, and that his imperfections do not taint his "prophetic work". When similar behaviour in other contexts is put to them, such as that exhibited by fraudulent stock promoters and others who mislead in order to induce behaviour in other people that works to the deceiver's benefit, they freely indicate that they would not trust the very kind of person Joseph Smith was. However, when brought back specifically to Joseph Smith's deceptive behaviour and questioned as to their continuing faith, in the vast majority of cases they will continue to be unable to see that his deceptive behaviour indicates clearly that it is unwise to believe anything he said when trying to persuade others to obey him. Cognitive dissonance theory, again, nicely explains this behaviour. Faithful Mormons have invested such huge amounts of time, money and effort in their faith that to acknowledge they have been defrauded would be similar to the Heaven's Gate cult members acknowledging that no spaceship existed.

 

The examples provided above are all part of the manner in which cognitive dissonance reducing behaviour or belief structuring is “ego defensive”.   That is, by reducing dissonance we maintain a positive image of ourselves.   This is an image that depicts us as being good, smart, etc.   Such ego defensive behaviour can have disastrous consequences.

 

Cognitive dissonance related to personal self-esteem and ego status is at the top end of the scale in terms of its attitude and information distorting capability. For example, if I perform a cruel or stupid action, it threatens my self-esteem because it makes me think that I may be a cruel or stupid person. For this reason, persons with high self-esteem are much more affected by cognitive dissonance than persons with low self-esteem. Consider the irony: it is precisely because I think I am such a nice person that, if I do something that causes you pain, I must convince myself that you are a rat. In other words, because nice guys like me don't go around hurting innocent people, you must have deserved every nasty thing I did to you.

 

The ego defensive mechanism just noted plays a role in most types of cognitive dissonance induced behaviour, and many of the biases described below.

 

The Authority and Conformist Biases

As noted above, Sripada and Stich describe two types of social learning that are important in creating cultures and the value structures they entail. These are "directly biased social learning" and "indirectly biased social learning". They also note that the prestige or authority bias and the conformist bias are responsible for a lot of our indirectly biased social learning.

 

I first note that these biases are both caused by sources of power.   The prestige bias is tied to particular authority figures who exercise coercive or persuasive power, and the conformist bias is tied to the authority or power represented by what the majority of a group does, whether under the influence of any particular authority figure or not.

 

The prestige bias works in part because those who are perceived to be successful are copied, without an understanding of which aspects of their behaviour cause their success. Is it Michael Jordan's shaved head that made him a great basketball player? Judging by the number of shaved heads on basketball courts (not to mention in office buildings) all over the world today, it seems clear that many people, subconsciously at least, think there is something to that theory. Or what of the case of third world mothers who abandoned breast-feeding their babies in favour of formula feeding, at least in part due to advertising designed to show that the high status members of their society had gone this route? The result was less healthy babies, and a substantial misallocation of financial resources within already impoverished societies.

 

The effect of the prestige bias is enhanced when a successful or otherwise authoritative person directs those who follow him to engage in certain behaviours because they are linked to success as he, or the value structure he seeks to inculcate, defines it.   Mormon polygamy, the continued emphasis on large families within Mormonism, Mormon food taboos, and a host of other religious practices provide examples of how this works.   Each of these is a highly charged emotional issue that arises out of a value structure.

 

The prestige bias is problematic for a variety of reasons. The first two that come to mind are as follows. First, the leader may genuinely believe that the behaviour he recommends or requires will benefit his followers, but he may misunderstand the applicable laws of cause and effect. For example, many counterproductive practices (such as "bleeding" patients) were long the state of the medical art and taught as healing techniques, while the importance of simple hygiene to medical success went unnoticed; despite Mormon food taboos, it is now clear that in certain cases the consumption of moderate amounts of red wine and black or green tea is good for human health; and recently gained knowledge of the genetic link to sexual preference shows the folly of the longstanding Mormon practice of counselling the gay community to simply repent and go "straight", while at the same time helping us to understand why so many gay Mormons, in the face of the treatment just noted, have resorted to suicide. Powerful emotions interfered with the advance of knowledge regarding ancient medical technique, and continue to enliven many maladaptive Mormon practices.

 

The second major problem with the prestige bias is that the leader of a group will often be subconsciously influenced by the needs of the group he leads as opposed to the needs of individual members of the group, and by the tangible or intangible benefits he, as the leader, obtains as a result of the group's continued existence and vitality. Hence, his bias is likely to be toward increasing the strength of the group and his own influence, and those interests often work against the interests of individual group members. So, for example, an emphasis on giving time and other resources to group efforts such as bringing new members into the group, or even on having large families that will continue to expand the group at a pace faster than the expansion of other competing parts of society, will often work in the interest of group leadership and against the interest of many followers. Followers can hence be influenced by the prestige bias and leadership dictates to engage in maladaptive behaviours based on value structures (often including the value of obedience to leadership dictates) designed to strengthen the group. Respecting family size in particular, Harold Bloom (see "The American Religion: The Emergence of the Post-Christian Nation") and other scholars have identified the continuing large size of Mormon families relative to family size in the rest of the developed world, combined with the ability of the Mormon Church to retain those who are born members, as the long term keys to the continued success of Mormonism as a movement. The continuing Mormon emphasis on large family size is hence not surprising.

 

The conformist bias is based on the idea that if most people are doing something, it is probably a wise thing to do. The lemming-like quality of humans has been well documented. However, as we just saw, some ideas become widespread without being adaptive, and the conformist bias sometimes then cements them in place. And particularly in cases of rapid environmental change, the conformist bias works against the kind of learning that needs to take place to adjust to new realities.

 

The conformist bias is also closely connected to the human instinct toward forming coalitions, as described by Boyer above.   In the contest for resources that determined which humans survived and which did not, the manner in which groups were formed, and the stability of groups, was of critical importance.   Hence, humans developed many social customs and instincts related to the question of whether a person was, or was not, part of a particular group.   Most initiation rituals are closely related to this.   The brutal nature of primitive rites of this type was designed expressly to bind the group together through a difficult shared experience.   The psychological turmoil caused for modern Western man by the Mormon temple ceremony performs a similar function (See http://www3.telus.net/public/rcmccue/bob/documents/temple%20marriage.pdf in that regard).

 

In addition, many neurological theorists now believe that the size of the human brain evolved in response to the complexity of social interaction that arose as human social groups moved from single families to groups of several families (see Steven Pinker, "The Blank Slate"). The exponential expansion of the relationships and interrelationships that this type of community required each individual to maintain was staggering. And today, it appears that groups of more than about 250 do not function efficiently if the personal relationships within them are important to that functioning. Within larger organizations, when functional units exceed this size they tend to divide. One of the many organizations in which this trend has been observed is the Mormon Church. When the number of people attending a congregation begins to move toward 300, the creation of a new ward is often considered. The optimum size of a participating congregation seems to be between 200 and 250.

 

Because of the evolutionary advantages conferred by the social group, it was important to be able to recognize who was a member of a group and who was not. Hence, each group evolved conscious and subconscious signifiers of membership. Some of these were easy to see and developed for other reasons, such as language, skin colour, body shape, etc. But within visually homogenous groups, the same kind of signifiers developed to distinguish subgroups from each other. This produces much of the uniqueness of human culture. Today it is still possible to quickly recognize a person who is "not from around here" by virtue of subtle differences in speech patterns, clothing styles, cultural references, etc. And when the best rapper is white and the best golfer black, the visual markers of race and gene pool are of less use than ever. This forces us to rely more heavily upon the subtle signs of cultural belonging to determine who is with us, and who is not.

 

Since social behaviour was so important to survival and reproduction, a key indicator of fitness for the evolutionary race was how attentive one was to the various, complex social identifiers just described. Thus fashion sense and cultural "with-it-ness" were, and to an extent still are, important intelligence tests. These things are serious business. The kids who "didn't get it" or "weren't cool" back in evolutionary time were in serious trouble. Their life expectancy was short. Today, the "not cool" kids and adults are still marginalized to some extent because of the instincts that were developed in this regard in our evolutionary past, that is, because they show what used to be clear liabilities in the ancient race for survival. And these instincts are to some extent still linked to things that are important to prosperity in our culture, while being misleading in other ways.

 

This brings into focus another striking example of cultural inertia. Dr. John Ratey of the Harvard Medical School ("Shadow Syndromes: Recognizing and Coping with the Hidden Psychological Disorders that Can Influence Your Behaviour and Silently Determine the Course of Your Life") describes a weak form of the brain disorder that creates autism as being a significant part of the mental makeup of many members of the technology "geek" community. Among other things, this disorder impairs the ability to read non-verbal cues and hence renders those who have it wooden, literal interpreters of the words they hear, with little ability to understand the complexities of how those around them experience life and are trying to communicate. For example, when they hear "Oh, you really shouldn't have!" from a grateful friend, they tend not to do whatever provoked that message again.

 

These people are usually male, analytical ("left brain") types, and tend to prefer environments that function on the basis of linear logic and can be controlled. That is, the same brain architecture that makes people like Bill Gates inattentive to social cues marvellously suits them for the complex mental tasks required to put together computer software and hardware. And modern society rewards these people very well.

 

This creates the spectacle in the mating arena of women whose instincts, based on the social cuing process described above, repel them from "geeks" whose talents have been richly rewarded by modern society, making them powerful and hence highly desirable mates in other ways. These conflicting signals create entertaining mating rituals. Gates, for example, is reputed to have enjoyed "virtual dates" with the woman who is now his wife. She worked for Microsoft in Redmond, Washington while they were dating. He was often elsewhere due to his responsibilities as CEO. So, at an appointed time they would both go to the same kind of restaurant (Italian, let's say). By cell phone, they would chat while reviewing their respective menus and deciding what they would order, the kind of wine that went best with that food, etc. They would then share their views respecting the food while eating. Afterward, they would both go to the same movie, and then get back on the cell phone to talk about what they had seen and felt.

 

Note how this form of interaction eliminates Gates's need to pick up on the subtle non-verbal cues that Dr. Ratey indicates people like him do not see, and as a result of which they both look and feel "out of it". Note also how an attractive woman, who if she is like most other women would not want to miss all of that fascinating non-verbal information, was prepared to meet Gates on his terms. But I digress.

 

There is a huge variety of other lines of research that could be dealt with under the authority and conformist bias heading. Here are a few of them.

 

The authority bias causes us to unconsciously screen information that may bring us into conflict with our social group or other sources of power (see Aronson, “The Social Animal” and Shermer, “Why People Believe Weird Things”).   Those youngsters in primitive times who failed to pay attention to their elders faced a higher probability of death and removal of their genes from the gene pool.   Evolution thus selected for deference to authority and to the expectations of one’s social groups. This form of bias explains why members of religious groups are able to identify illogical beliefs in other religious groups but not their own.

 

I again note the connection between emotion and the value structures that are supported by the conformist bias. To change one's religious (or in some places, political) beliefs is to repudiate one's social group, ancestors, etc., and almost invariably results in powerful, negative emotional reactions. Those who led the way in terms of recognizing the relative strengths of science and religion (remember Galileo?), the importance of universal suffrage, gay rights, civil rights and other similar social issues, were all pilloried as a result of the emotional reactions they provoked within their communities. We should hence expect any challenge to values held by the majority of our social group to be met with powerful, emotional, and largely irrational, resistance. This will be particularly the case where the challenge is perceived to threaten the security of important relationships.

 

Another line of research deals with decision-making under conditions of uncertainty, and indicates that the greater the uncertainty and the perceived risk or importance of the decision, the more likely it is that we will go with the crowd or accept what the most powerful authority figures in our dominant social group say about what we should do.

 

The conformist bias explains the stock market buying that leads to “bubbles” in the market, and the panic selling that leads to irrational market collapse.   That is, we have a tendency to follow the crowd.

 

The authority and conformist biases also apply to things like the global warming issue, politics, free trade and other important socio-political issues. And what is more uncertain, complex and of perceived importance than religious belief? We should accordingly expect questions of religious belief to be powerfully influenced by the authority bias.

 

Robert Cialdini adds (see "Scientific American Mind", a special edition of Scientific American, vol. 14, no. 1, 2004) that it is well established that people respond positively to those who have an authoritative appearance. Just by changing the dress of a person making a request from casual to formal, the positive response rate increases by 350%. We tend to equate authoritative appearance with expertise. "Four out of five doctors recommend …" A highly successful ad in the 1970s featured the actor Robert Young, who played the TV character Dr. Marcus Welby, proclaiming the health benefits of decaffeinated coffee. His well known but merely pretend authority had a significant impact. Such are the foibles of the human decision making process, and the nebulous nature of the boundary between reality and fantasy in our lives. This boundary fascinates me as I emerge from an illusion caused by Mormon ecological rationality, and wonder how real my present perceptions will prove to be.

 

Cialdini also notes what is sometimes called the "consistency bias", which may be thought of as a subset of the authority bias. That is, once a commitment has been made, it represents a form of authority in and of itself. And contrary to popular belief, the human tendency is to keep commitments. Restaurants find that far more patrons call back to cancel, instead of simply not showing up for their reservations, when they change their request from "Please call if you have to change your plans" to "Will you call if you have to change your plans?", followed by a pause to wait for the potential patron to say "yes". The patron's "yes" is a commitment that changes her call back behaviour.

 

Cialdini's "social validation bias" is another subset of the authority bias. A classic experiment in this regard involves one man standing on a crowded sidewalk looking up toward the sky, at nothing. About 4% of those who pass him stop to look as well. If five men are planted to look at the sky, 18% of passers-by stop. If 15 are planted, 40% stop, and within a minute the adjoining street is so clogged with people looking up in the air that traffic can't get through. The presence or lack of an authoritative appearance also affects how many people stop.

 

So, if we want people to do what we ask of them, it helps to make them think that they are in good company.   Sripada and Stich, as noted above, refer to this as the conformist bias.   And remember those third world mothers who abandoned breast feeding their babies in favour of formula.

 

Cialdini also describes a "liking bias". This, in my view, is also a subset of the conformist bias, since liking is based on a feeling of having something in common. That is, we like those to whom we feel connected for one reason or another. A blood or tribal relationship will do the trick in this regard.

 

So, we tend to say "yes" to people we like. Tupperware and innumerable copycats have based hugely profitable businesses on this single fact: friends buy from friends more willingly than from unknown sales people. It is also well known that good-looking, well dressed people are more successful in the sales professions than the less attractive. And people like us when we have something in common with them. Hence the effort sales people make to find a common denominator with their customers: "You're from Minnesota?! My Mom was raised in …" "You lived on a farm!? When I was a kid …" The addition of connecting factors of this kind (whether real or contrived) dramatically affects the probability of a sale being closed or the desired cooperation being offered.

 

Cialdini refers to the "reciprocation bias". This is another subset of the conformist bias. When we reciprocate, we are mimicking or conforming to the behaviour of another, namely the person who just gave us something.

 

If you give people something, even something of insignificant value, it is much more likely that they will do what you want them to do. This is why organizations like the Disabled Vets send out cheap address labels with their request for financial support. Their rate of donation doubles (from about 18% to about 35%) when they add those labels to their request. This is why so many products come with a "free gift". My wife was recently looking for a camera bag, found one she liked, but was told that the only way she could get it was to buy the camera; the bag came along as a "free" gift. She then found out that the camera without the bag was $x, and the camera with the "free" gift was $x + $35. But she could not persuade the salesman either to sell her the bag for $35, or even that the bag was not a "free gift". After several minutes of conversation, she left the store in frustration, feeling disgusted with the ethics of those who put this particular advertising gimmick in place, and with the intelligence of the salesman with whom she had dealt.

 

Many studies show how physicians' propensity to prescribe drugs is influenced by gifts from pharmaceutical companies, and how their propensity to find patients who need X-rays is influenced by ownership of a radiology lab. An odd way in which reciprocity can be used is to make a huge request (to act as a volunteer counsellor for two hours a week for two years, for example), and when that is declined, to ask for something more realistic (like a financial donation). The disappointed but gracious acceptance of the first request's refusal is a form of "gift", and makes the second, smaller request much more likely to be successful than if it had been made on its own.

 

The "scarcity bias" can also be thought of as part of the conformist bias. Why do we want things that are scarce? Because other people want them. Anything that is perceived to be scarce is more desirable than if it were not so perceived, and whether it is in fact scarce does not matter. The rating of cafeteria food on campus goes up dramatically before a temporary closure of the cafeteria. This is why the "limited time offer" has become so common a marketing tool as to be meaningless. It is why (by my observation) the most successful money raisers for business projects are those who subtly and with credibility (but often falsely) indicate that they don't really need the money they are asking for, and that so many others want "into the deal" that if the potential investor does not hurry she will not have the chance to participate. People like this fascinate me. They have the ability to believe 100% of the story they tell others, while to an objective observer like me who knows most of the story, it seems incredible. And these folks are not dumb. They are at the bright, articulate end of the spectrum. More "misrecognition" à la Bourdieu.

 

Having reviewed in general terms how the authority and conformist biases work, here is a summary of how they are used within Mormonism.

 

Mormon activities are particularly authority oriented. The leaders are prominent participants, they are usually dressed for "power", and the activities are carefully controlled. Ideas do not come up from the bottom, as is the case in organizations that are trying to maximize the creation of knowledge and growth. They come down from the top, as is the case in organizations where control and execution are the goal; the military is the paradigm example. The content of "testimonies" is highly controlled. Leaders set the tone for each sharing exchange. In relatively small group settings (such as Home and Visiting Teaching) great effort is made to control the message content. And most of all, the leaders project an image of supreme confidence. As McConkie puts it, they have "sure knowledge". The members, most of whom do not feel that they have sure knowledge despite how often they say words to that effect, are comforted by the leaders' confidence and come to regard the feelings they have as "sure knowledge". This is like the Emperor being told that he is wearing clothes for so long that he believes he is, in spite of all the evidence to the contrary.

 

The projection of confidence in the face of uncertainty is a classic leader characteristic. The greater the uncertainty (such as just before battle), the more prized leadership confidence becomes. Not surprisingly, one of Mormonism's most common metaphors is that of battle. Mormon leaders to a great extent, and members to a lesser extent, perceive themselves to be soldiers engaged on the side of right and god in a literal battle against the forces of evil, which are defined to include any influence that does not support the Mormon Church. The war metaphor explains much Mormon behaviour. It is axiomatic that truth is war's first casualty and that in times of war much behaviour is motivated by fear instead of reason.

 

The Mormon Church emphasizes doing things in groups since it is easier to extract commitments from people in that way. And it is important to control the validating messages the group sends. Hence, only those who will support the leadership's point of view are permitted air time. And anyone who goes against that grain must be quickly silenced and then gutted from a credibility point of view. Those who disagree are hence often assumed to have sinned and lost the spirit, or to have adopted values that render their judgement unreliable. The epithet "intellectual" in the Mormon context is often used to discount those who disagree with Mormon orthodoxy on the basis that they have learned some things that are more persuasive than what the Church teaches. Hence, it takes precious little learning to earn the title "intellectual" in faithful Mormon circles. Mormon "intellectuals" are said by the Mormon faithful to "know too much for their own good", and hence to have fallen prey to pride and abandoned their faith. They are pitied, and what they have to say is ignored by faithful Mormons.

 

The sharing of emotional experience creates strong bonds. These bonds make continued obedience to the norms that dominate the group that has shared the experience more likely. This strengthens the group's authority and the conformist bias driven by it. Hence, the trick from the Mormon Church's point of view is to control the dominant forces within the group.

 

There is a tremendous emphasis within Mormonism on getting people to make commitments. Boyd Packer, the architect of the faithful history policy and a Mormon Apostle who is currently second in line for the Church's presidency, has taught on many occasions that the purpose of each and every meeting or activity in which Mormons engage is to cause people to either make, or keep, the "covenants" required of Mormons. Almost all of those covenants are designed to reinforce obedience to the dictates of Mormon leaders.

 

At a relatively superficial level, it is fair to say that Mormons are provided with activities that supply the emotional experiences and community involvement they like or need. They respond by being more willing to do what the Mormon Church asks. This is a form of imitation: you give to me, so I will give to you. But this principle operates at a much deeper level as well. Authority figures can hence use this bias to strengthen their position by causing their followers to perceive the receipt of divine gifts. God is a useful tool in this regard. He is posited as the source of all that is good in our lives. This great gift puts us in his debt. We repay a tiny portion of what we owe him by obedience to his laws. His laws are communicated to Mormons by Mormon leaders, and Mormons end up paying their debt to god through those leaders as god's proxies. Many religious leaders have notoriously and literally appropriated gifts of this kind. I do not suggest that Mormon leaders are guilty of this. However, they would be most unusual humans if they did not derive immense ego gratification from the adulation they receive and the right they have to direct the massive wealth the Mormon Church has accumulated as a result of its members' largesse. They claim not to derive such gratification. When I was a Mormon leader, I so claimed. This is a classic example of denial or misrecognition.

 

Reciprocation research has indicated that if a large donation is asked for and refused, and then a smaller donation is asked for, it is much more likely that the smaller donation will be given than if it had been asked for on its own in the first place. This trick makes its way into Mormon culture largely through Mormon temple ritual. There, Mormons commit to give all of their time, talents and other resources to the Mormon Church, on demand. This promise is made each time a Mormon attends the temple. When less is asked for (and anything is less than that), it seems modest by comparison. In addition, the commitments made by modern Mormons are regularly compared to what their pioneer forebears gave. Their sacrifices included literally all of their time, talents and other resources, and in many cases life itself. So 10% of one's income, two years serving as a missionary, a large percentage of one's time outside of employment, etc., all seem like reasonable and relatively small requests to the average faithful Mormon.

 

Within Mormonism we find many "scarcity" ideas. One of the most powerful is that if you want to live after death with your family in a place too wonderful to describe (something most people would want), there is only one way to get that: by being a fully obedient Mormon. This is a very scarce good. Mormonism has a complete monopoly over it as a result of being god's "one true church", the only church that has a prophet who still communicates directly with god as a result of exclusive authority handed down from prior, authorized representatives of god, starting with Joseph Smith.

 

There is another well known persuasion/sales technique that is linked to the scarcity concept. At the critical moment during the course of the sales pitch, the salesman lowers his voice to an almost conspiratorial whisper. This change in tone draws the potential customer's attention, and conveys a sense of intimacy and secrecy to the exchange of information. It says, "This information is too important, too sensitive for others to have. This information is just for you!" Studies have shown this to be a highly effective sales technique. In Mormon culture, the lowering of tone occurs during testimony bearing. It conveys a sense of reverence for what is being said. This reverence connotes the special nature of the information being shared, in precisely the fashion the salesman's closing speech does.

 

The communication of a scarcity concept is also recommended to Mormons as a sales technique when they give Books of Mormon or other Mormon related material to their non-Mormon friends. It is suggested that when doing this, they first let their friend know how much they care about her, and that as a result of this caring they want to share the most rare and precious thing they have with her: their knowledge of the truthfulness of the Book of Mormon, the Mormon Church, etc. The tone of voice will intuitively lower during this communication. I did this countless times myself without being aware of the mechanisms I was using.

 

The various biases noted above are descriptions of how our subconscious or emotional selves assert themselves over our conscious, rational decision making process.   If fear and desire are fundamental to the emotional process, we should find them at work beneath each bias.   The following is what I observe in this regard.

 

Authority/prestige: We fear the sanctions of authority figures and desire the benefits they can confer if we are obedient to them, or that we can obtain by association if we imitate them.

 

Consistency: It has been shown to be important to an individual's success in our society that she be reliable. We desire the benefits conferred by consistent behaviour, and fear the consequences of being known to be inconsistent.

 

Social validation/conformist:   If others have or do something we assume it to be desirable and want it, or fear that if we do not have or do it we will miss out.   If we behave differently from the rest, we fear that we will stand out in a negative way and will be punished by the group.

 

Liking and Reciprocation:   When something has fulfilled a desire for us in the past, we are prepared to bet that more of the same will occur in the future and behave so as to enhance that possibility.   The more important the desire that has been fulfilled, the more extensive our cooperation is likely to be.

 

Scarcity:   We fear losing a chance to have something more when it is scarce than when it is not.   Hence, we desire scarce things more than things easily had.

 

Attachment Theory

Attachment theory applies the principles of evolutionary psychology to the study of child-parent relations and has been extended by some researchers to adult romantic relationships, and by others (see Kirkpatrick and Faber below) to the relationship between individuals and religious groups or ideologies. In light of the cognitive bias research we have already reviewed, it is fair to suggest that the application of attachment theory to religious behaviour is closely related to the authority and conformist biases, or perhaps tells us something about their origins.

 

Lee Kirkpatrick (see "Attachment, Evolution, and the Psychology of Religion", http://www.guilford.com/cgi-bin/cartscript.cgi?page=pr/kirkpatrick.htm&dir=pp/paci&cart_id), in what is likely the best book published on this topic so far, indicates that attachment is just one of a large number of evolved behavioural systems that comprise human nature and are relevant to how different kinds of religious beliefs can be expected to affect the behaviour of different kinds of people. He indicates that attachment theory is particularly explanatory with regard to how many monotheists (for example, Christians, Jews and Muslims) relate to their religious beliefs and groups, and are affected by them.

 

Kirkpatrick's statement as to why he believes attachment theory is a good place to start with regard to understanding monotheistic behaviour in particular is worth repeating in large measure. Here it is, from the introduction to his book:

 

First, attachment theory is a fundamentally psychological theory. It was developed initially as a theory of infant social development, particularly focusing on the ways in which experience with caregivers shapes subsequent behavior and social relations; it was in no way developed specifically for the purpose of describing or explaining religion. …

 

Second, attachment theory is more comprehensive than most alternatives currently extant in the psychology of religion. … It is not a theory about emotion, behavior, cognition, or physiology; it is a theory about all of these and, most important, about how all of these are integrated in an organized, functional way. The theory includes both normative and individual-difference components, which are needed if we wish to answer both normative questions (Why are people religious?) and individual-difference questions (Why are different people religious in different ways?) about religion.

 

Third, attachment theory is deeply explanatory. It does not merely describe how infants interact with their mothers, or adult romantic partners with one another, but purports to explain why humans are built in such a way that they behave this way. It not only provides a descriptive typology for conceptualizing individual differences in people's orientations toward personal relationships and intimacy, it purports to explain how these differences come about and why the system works in this rather than some other way. This functional, process-oriented approach enables its application to other phenomena such as religion, offering a basis for addressing questions about both the causes of and individual differences in religious belief and behavior.

 

Fourth, attachment theory is unambiguously a scientific theory. It has been supported by countless empirical studies reflecting a multitude of methodologies and populations, meaning not only that we can have considerable confidence in it, but also that it has clearly been demonstrated to be amenable to empirical testing. Perhaps equally important, however, is the fact that its application to religion is not laden by evaluative baggage. In contrast to earlier psychoanalytic formulations that presuppose religion to be inherently infantile, regressive, and mentally unhealthy, attachment theory provides a more value-neutral theoretical basis for understanding many of the same aspects of religious belief in which Freud was interested. Like Freud's theory, attachment theory focuses on human concerns about comfort and protection, and God is psychologically represented as a kind of parent figure. However, from an attachment theory perspective, there is absolutely nothing assumed to be "infantile" or "regressive" about any of this. As Bowlby argued cogently and other researchers have subsequently explored in depth, attachment system processes are designed to operate across the entire lifespan. Attachment theory thus provides a scientific view of how humans are designed with respect to these issues in a way that is inherently neither pro- nor antireligious.

 

The theory of attachment as it applies to children suggests that the manner in which a child relates to her parents (the form of attachment between child and parent) affects the way in which the child relates to many aspects of her environment. For example, one stream of research suggests that there are three common attachment "styles" demonstrated by infants toward their parents: secure attachment, anxious-ambivalent insecure attachment, and anxious-avoidant insecure attachment (a fourth category, disorganized attachment, has also been described). In each case, the nature of the attachment to the parent or primary caregiver is mirrored to an extent by other aspects of the infant's behaviour.

 

For example, the manner in which a child relates to its mother might be observed in a way that allows the child's attachment style to be determined. Then other aspects of its behaviour would be observed. In such experiments, a correlation has been found: children who are securely attached to their mothers tend to explore freely while the mother is present, will engage with strangers, will be visibly upset when the mother departs, and are happy to see the mother return. The theory says that children are best able to explore when they have the knowledge of a secure base to return to in times of need. When assistance is given, this bolsters the sense of security and also, assuming the mother's assistance is helpful, educates the child in how to cope with the same problem in the future. Therefore, secure attachment can be seen as the most adaptive attachment style. According to some psychological researchers, a child becomes securely attached when the mother is available and able to meet the needs of the child in a responsive and appropriate manner. Others have pointed out that there are also other determinants of the child's attachment (including genetic factors), and that the behaviour of the parent may in turn be influenced by the child's behaviour.

 

Other researchers detected similar patterns of behaviour in adult romantic relationships. Securely attached people are able to place trust in their partner which, in turn, means they can confidently spend time apart. People with an anxious-ambivalent attachment style may have difficulties because their way of behaving in relationships can be seen as needy or clingy by their partner. They are prone to worry about whether their partner loves them or whether they are valued by their partner. People with an avoidant-attachment style are uncomfortable being close to others. They have difficulties in trusting other people and do not like to depend on others.   These patterns are believed to develop in infancy, but can be modified as people enter into new relationships.

 

M.D. Faber, in "The Psychological Roots of Religious Belief: Searching for Angels and the Parent-God" (see http://www.skeptic.com/eskeptic/archives/2005/05-07-15.html#Krause), also develops the attachment theme along the religious axis, but in a narrower (and, in my view, less helpful) fashion than Kirkpatrick.

 

Faber focuses on how our earliest biological needs, our dependence on our parents and their endless satisfying of those needs predispose us toward a belief in a kind of God that would treat us in similar fashion. Faber makes a good case for the way in which some religions exploit this "weakness" in our character with which our biology has equipped us. As he puts it:

 

[Churches] strive to trigger state-dependent memories of the early period through formal, diurnal practices… [Religion] has shrewdly played into man's most childlike needs, not only by offering eternal guarantees for an omniscient power's benevolence (if properly appeased) but by magic words and significant gestures, soothing sounds and soporific smells - an infant's world… Thus religion is a cunning, unconscious method of preserving the tie to the… original mother and father… We can play the game of life in two directions, staying put and moving on… And so it is with religion… Not only does one get the caregiver back, but one gets the caregiver back in an idealized form. One is not alone, and one has nothing to fear from a just and merciful God.

 

The basic biological situation, the implicit memories, the desperate anxiety associated with separation, and every church's deliberate and clever attempt to seduce innocent minds - such factors travel a great distance in explaining monotheism's virtually irresistible attraction for humanity, including the most intelligent and educated among us.

 

Both Faber and Kirkpatrick note that not all religions present the kind of God just described, one that infantilises His followers. Many religions, and the Eastern religions in particular (at least as they tend to be interpreted in the West), posit a god that encourages us to grow out of ideas of dependence and attachment. In fact, Buddha blamed "attachment" for most of what ails humankind (see http://en.wikipedia.org/wiki/Buddhism). And Marvin Levine (see "The Positive Psychology of Buddhism and Yoga") does a fine job of pointing out many of the ways in which aspects of Eastern "religious" wisdom are well prescribed for what ails Westerners.

 

This reminds me of the various ways in which different kinds of religions have been categorized by religious studies scholars. We have, simply, "good religion" v. "bad religion", "sick-souled" v. "healthy-minded", "mature" v. "immature", "intrinsic" v. "extrinsic", etc. As we attempt to characterize religion in terms of positive or negative attachment style, we are making this kind of value judgment. That is, we are defining where a particular religion's ideology stands relative to what we value. We are not defining something essential about the religion. There are many people who believe that the best religions, for example, are those that cause the most complete dependence of the worshipper on the worshipped. Indeed, most of the Muslim faith and large parts of Christianity are so premised.

 

There is, of course, a vigorous debate in this field along nature v. nurture lines. To what extent, for example, does the way a mother parents cause the attachment style, and to what extent is it innate? Does the nature of one's belief in God affect parental attachment, and vice versa? And how much can that vary in accordance with romantic experience in adulthood? Many other similar questions are being asked.

 

Various fine books have recently been written on the nature v. nurture topic in general. Among my favourites are Steven Pinker's "The Blank Slate" and Quartz and Sejnowski's "Liars, Lovers and Heroes". None of them dares to do more than point the way and indicate that most of our major behavioural characteristics have large components of both nature and nurture.

 

Kirkpatrick makes a variety of worthwhile points about how religion works using attachment theory. For example, he notes that the attachment instinct is triggered by danger and fear. That is, when in danger or pain we need security, and the instinctive source of security (the primary caregiver in a child's case; a spouse in a romantic attachment case; the religious institution or authority figure in the case of religious attachment; etc.) becomes more attractive. And this will be the case even if the attachment figure is the source of the fear or pain. As Kirkpatrick puts it:

 

“Bowlby (1969) noted that lambs and puppies develop and maintain attachments despite receipt of unpredictable punishments from their caregivers and, moreover, that attachment behaviors actually increase as a result of such treatment.   Similarly, human infants are attached to parents who mistreat them (Egeland & Sroufe, 1981). The basis for this seemingly paradoxical behaviour is that the punishments, like other sources of fear and distress, activate the attachment system and hence the seeking of proximity to the primary attachment figure.   The same individual is, in a sense, both the source of the problem and the solution.

 

Belief in a god that rains fire and brimstone upon the world but who also serves as an attachment figure, may function in a similar manner. In fact, this may explain why beliefs about vindictive, frightening gods have persisted throughout the history of humankind despite the negative emotions they elicit.   It may also help to explain why conservative Christian churches, which generally give greater attention to this aspect of God, are growing, while mainline and liberal denominations continue on a downward spiral (Kelley, 1972; Stark and Glock, 1968).” (Kirkpatrick, page 83)

 

Let's extend this a bit further. The sad fact is that a certain type of abuse favours the abuser, at least in the short term. This explains why many a woman has been attracted to an abusive man (to the amazement of her family and friends), and then, worse, stayed with him for a long time. If the abuse becomes bad enough, the woman will eventually leave in most cases. However, just the right level of abuse will maximize the man's control over the woman by activating her attachment instinct while not being so bad as to drive her away. And I should add that women are sometimes in the abuser's role as well.

 

This idea explains a lot of religious behaviour to me. To an extent, the religious institutions that have established themselves in our lives as sources of security, and hence become attachment figures, can further their hold on us by causing us to fear and by otherwise abusing us. Think of this in light of the Mormon conversion process. The missionary lessons and the fellowshipping process establish the attachment figure by emphasizing God's love, the ability to be with family after death, etc. The convert is love bombed into the community, and given small tasks to help her integrate. Then the temple preparation lessons start, and about a year later in many cases the convert has become dependent on the Mormon community and its theology in various ways, and goes to the temple. There, much higher standards of obedience are required. The next step is leadership callings, which entail attendance at leadership meetings that are in large measure exercises in behaviour manipulation through guilt, the attendees being continually warned that they are not honouring their covenants if they do not donate enormous amounts of time to Mormon projects, do not do more temple and missionary work, etc. This form of abuse seems to me nicely designed to activate the attachment instincts.

 

The Bias Toward Risk Aversion

Yet another area of study focuses on our inherent risk aversion. We tend to overestimate risk and underestimate the potential gain from risk taking, and we tend to overvalue what we already possess when it is compared to what we don't possess. One fascinating study in this regard provided university students with one item each that had the same value (say $5) in their school book store. They were also given some money with which to bid on the items other students were given, and were required to put their own item up for auction with a minimum sale price. On average, each student was prepared to pay much less (say $3.50) for items similar to her own than the amount for which she was prepared to sell her own item (say $7). The tendencies to value what we have more than similar items we don't have, and to overestimate risk and underestimate the rewards to be gained by taking risk, would promote societal stability and hence make evolutionary sense. They also make us unlikely to change our minds respecting something like religious beliefs we have already accepted.

 

The Bias Against Acknowledging Inevitable Risks

When risks are perceived to be inevitable, however, cognitive dissonance produces quite a different result.   In one experiment that demonstrates this link, children were induced to eat a vegetable respecting which they had previously expressed significant dislike.   After eating the vegetable, half the children were led to believe that they would have to eat much more of that vegetable in the future, whereas the other half were not so conditioned.   The cognition “I hate vegetable X” is dissonant with the cognition “I will have to eat vegetable X regularly in the future”.   Cognitive dissonance theory predicts that the children would reduce the just described dissonance by modifying their attitude with regard to vegetable X.   In fact, that is what the experimental data disclosed.

 

In a similar experiment a group of college women volunteered to participate in a series of meetings in which they would each discuss their sexual behaviour with another woman with whom they were unfamiliar.   Before the meetings, each student was given two folders each containing the profile of a different woman.   They were told that one folder profiled the woman with whom they would be meeting, and the other profiled another member of the student study group.   The profiles were different in specifics but very similar in terms of the mix of positive and negative qualities described.   The women tended to develop a much more favourable impression of the woman with whom they had agreed to share the intimate details of their sexual lives.   This is consistent with what cognitive dissonance theory would predict.   If I have agreed to share the intimate details of my life with a stranger, it is best that the stranger in question have as many admirable characteristics as possible.

 

Once again, cognitive dissonance theory is highly explanatory of behaviours observed within the Mormon community.   Obedience to Mormon authority is perceived by faithful Mormons to be inevitable.   Spencer Kimball, a former Prophet of the Mormon Church, wrote what is still a highly influential book called “The Miracle of Forgiveness”.   It is only a small exaggeration to say that my generation was raised on the concepts contained in that book.   At page_____, Kimball describes the Mormon attitude with regard to obedience as follows:

 

That is, Mormons are encouraged not to think about obedience.  The decision to be obedient is made once, and after that it is not necessary to think about the matter again.  In other words, obedience is inevitable.

 

Similar quotes regarding the Mormon conception of freedom are found above under the heading "Different Conceptions of Freedom - An Illustration of Sacred Premises".

 

Many Mormons chafe under the burden imposed upon them by their commitment to obedience to Mormon authority.  Given that the things they have committed to are inevitable, they tend to look on the bright side and minimize what they are missing as a result of all of the time, energy and money they pour into their Mormon lifestyle.  The testimonies Mormons bear during their monthly fast and testimony meetings, as described above, and the lessons and talks given each week during Mormon meetings, are full of this kind of reasoning.

 

Mormons find in their life experience "blessings" that come to them as a result of the significant financial contributions they make to the Mormon Church.  Giving up all of that money makes them budget better, God provides unexpected financial blessings from time to time, and they feel good about the way in which they are helping other people.  They rationalize that the strictures of Mormon morality protect them from sexually transmitted diseases and other harms to which they assume they would be much more likely to fall prey were they not Mormon.  They are grateful for the way in which Mormonism shields them from "toxic" sources of information that would otherwise pollute their lives.  Most of these judgments are based in ignorance.  They have no idea what they are missing, what their options are, or what causal mechanisms actually underlie the maladaptive behaviours to which they assume they would be subject without the protective influence of Mormonism.  In short, their rational processes are short circuited once again by the perception that their obedience is inevitable, and by the resulting effect of cognitive dissonance in their lives.

 

Consider the case of university students living near UCLA.  Geological studies indicate a high probability of a major earthquake in that area in the near future.  And yet the attitudes of students regarding securing themselves against that highly probable eventuality disclosed very little concern.  Cognitive dissonance theory would suggest that if I deny there is going to be an earthquake and refuse to think about it, my dissonance will be reduced.  This is a self-justifying response to a dangerous and inevitable event that provides comfort in the short run, but could produce disaster in the long run.  A similar Mormon attitude is seen with regard to the undeniable tendency of the Mormon belief complex to limit members' access to information respecting the origins of their own belief system, and to any scientific information that questions that belief system.  Most intellectually oriented Mormons acknowledge that limiting access to information is a bad thing.  However, they downplay its significance in the case of Mormonism in a fashion they would not accept in virtually any other circumstance.  This is a classic example of justifying something that is inevitable, and minimizing its consequences in the short run to the probable great detriment of the individuals involved and their children, grandchildren, etc., in the long run.

 

Also, during the Three Mile Island nuclear power plant disaster in 1979, hundreds of thousands of people living in the vicinity of the plant were endangered.  Those living closest to the plant displayed a surprisingly nonchalant attitude respecting what had happened when compared to those living further away.  Those living further away could not be sure whether they had been exposed, and had significant motivation to leave before any exposure could become serious.  Many contradictory reports circulated with regard to the extent of the danger, so great uncertainty existed as to the nature of what had happened, and the human response to fear is generally speaking to flee.  However, those living close to the plant had little doubt that they had already been exposed.  Hence, they had little reason to leave.  It made much more sense for them to suppress their cognitions respecting the risks they faced, given the inevitability of the consequences of their prior exposure to the radiation released by the plant's failure.  Hence, they could be expected to stay put.  And when authorities from the Nuclear Regulatory Commission (NRC) arrived on the scene and issued reassuring statements respecting the minor nature of the radioactive leak, those living closest to the plant had the greatest incentive to believe this information, and again, the response predicted by cognitive dissonance theory would be for them to remain where they were.

 

A recent Stanford study (see http://www3.interscience.wiley.com/cgi-bin/abstract/97516222/ABSTRACT) indicated that poor African Americans and Hispanics in the United States tend to be more supportive of the status quo that prejudices them than are wealthier African Americans and Hispanics, who have a better chance of changing their circumstances.  Similarly, abused spouses who perceive themselves as unable to cope without the abusive spouse, in financial or other terms, are more likely not to perceive the danger of their situation.

 

The examples just provided illustrate something of fundamental importance with regard to cognitive dissonance as it relates to inevitable events.  In the case of something that is inevitable but not terribly important, such as the eating of disliked vegetables, it is possible to change the attitude with regard to vegetables.  That is, the vegetables can become more desirable.  However, in the case of something as dire as death by earthquake, there is no way to find a silver lining in the cloud.  Hence, cognitive dissonance causes the complete suppression of that information.  It is interesting to think along these lines about the illustrations I have just provided with regard to Mormon culture.  Some aspects of the Mormon belief structure are significant, but not so significant that cognitive dissonance requires them to be completely suppressed.  For example, the idea that Mormon leaders distort information delivered to the membership is something that most members can understand, and simply accept as not being that important.  The difficulties related to the Mormon lifestyle in terms of its requirements of time, money, etc., can likewise be justified and their favourable aspects emphasized.  However, issues related to the deception at the core of Mormonism, found throughout its foundation, are so threatening to faithful Mormons that information respecting them is completely suppressed.  When such information is first encountered, it is generally speaking denied.  Those few faithful Mormons who reach the point where they must accept certain facts as highly probable prove incapable of comprehending the significance of those facts relative to the trustworthiness of Joseph Smith and other early Mormon leaders.  This is akin to the complete suppression of information disclosed by the behaviour of the UCLA students just noted.

 

I also suspect that cognitive dissonance of the type just described causes faithful Mormons to be more prone than one would expect to accept the often specious arguments offered by Mormon apologists, and then heave a sigh of relief.  See the essays at http://mccue.cc/bob/documents/rs.dna%20controversy1004917.pdf and http://mccue.cc/bob/documents/rs.apologetic%20mind.pdf for examples of the type of argument Mormon apologists use and how it is accepted by the Mormon faithful.

 

The Bias Toward Escalating Commitment, or the "Foot-In-The-Door" Technique

It has been noted that during the Vietnam war the American government made a variety of decisions with regard to the continued escalation of bombing in North Vietnam that ignored crucial evidence, provided by the CIA and others, to the effect that bombing was an ineffective strategy.  Some researchers have concluded that the probable reason for the American government's inability to recognize the significance of its own intelligence was that this information was inconsistent with the many decisions it had already taken with regard to Vietnam.  Hence, the new information produced cognitive dissonance in the minds of those who had made the prior decisions, and its relevance was suppressed.  As Aronson notes at pages 157-158 of "The Social Animal" with regard to Vietnam,

 

Escalation is self-perpetuating.  Once a small commitment is made, it sets the stage for ever increasing commitments.  The behaviour needs to be justified [so information inconsistent with it is suppressed] …

 

Again, this behaviour is clearly evident in the behaviour of Mormons as they gradually become more committed to their faith.  Each commitment makes the next, increased level of commitment more likely, as long as the steps are not too large or taken too quickly.  And newly baptized members are not as likely to unconsciously suppress evidence concerning the murky origins of their faith as are people who have given many years of service and huge amounts of money, and who have made special covenants in Mormon temples.

 

The nature of this escalating commitment is also shown through experiments in which motorists' responses to signs requesting lower speeds were measured (Aronson, "The Social Animal", page 158).  Motorists who had been persuaded to sign a petition in favour of safe driving (a small commitment) modified their driving habits much more when they later passed a "drive carefully" sign than did ordinary motorists who had made no such commitment.  Other studies involving the effect of signing a petition on the willingness to make subsequent, more substantial commitments point to the same conclusion.  This is referred to as the "foot-in-the-door" technique.  Having made a small commitment, we are much more likely to accept a larger commitment that moves us in the same direction.

 

Mormon missionaries use something called the "commitment pattern", which explicitly employs the foot-in-the-door technique.  Those investigating the Mormon Church are encouraged to do things like pray while in the company of the missionaries, do certain readings required by the missionaries, attend Church meetings, etc.  Each step escalates the time and effort required from those investigating.  Once an individual has joined the Mormon Church, the process continues.  At baptism, covenants to be part of the Mormon community and to help take care of other members of the community are made.  This is followed immediately, in most cases, by a relatively unobtrusive "calling".  This calling would involve assisting the teacher of a class or some other community responsibility that does not require much time or effort.  Later, other responsibilities are given that are more time consuming.  Within a year of baptism, the new member is encouraged to go to the temple, where much more onerous covenants are made.  After the member has attended the temple, callings that require a much greater commitment of time and effort are in many cases extended.

 

The foot-in-the-door technique is similar to something in the automotive sales business called "low balling".  That is, get the customer in the door with the perception that a vehicle is available for a certain price, and then add extras, warranties, etc. that drive the price up to something that would be profitable for the dealer.  In the Mormon context, this is often referred to as the "milk before meat" approach.  From the Mormon point of view, the idea that full commitment is "good", and that the best way to get to full commitment is a little at a time without disclosing what the end of the road looks like, is used to justify disclosing only a small part of the evidence that an objective person investigating the Mormon faith would likely think relevant.  What is disclosed includes the kind of partial and misleading information contained in the Mormon missionary discussions; what is withheld includes the much more troubling story of Mormon leadership deception, Smith's sexual practices, and how much time and effort will eventually be demanded of a fully faithful Mormon.  This kind of information is thought to be too "strong" for a new member or someone investigating Mormonism.  Therefore, those relatively few Mormons who have done the math and are sufficiently self-aware to have figured out what their lifestyle means in terms of commitment would in most cases instinctively not provide a straightforward answer to a question of this sort put to them by an investigator, and would not consider volunteering the relevant information.  Most faithful Mormons, while believing themselves to be near the top of the honesty totem pole, are ironically rendered less reliable than the average used car salesman in this regard.

 

Even more revealing is the fact that most adult Mormons are unaware of the amount of time they spend engaging in Mormon Church related activities and what might be called "ritual behaviour".  Ritual behaviour relates to things like prayers, scripture study, meeting attendance, interviews, time spent on Mormon callings, etc.  When these behaviours are added up, they disclose a pattern of mental attachment to reminders of the importance of obedience to Mormon authority which most objective parties would not consider to be healthy.  Most Mormons are oblivious to the role that these rituals and this commitment to Mormonism play in their lives.

 

The foot-in-the-door technique also provides some interesting insights into the behaviour of cult leaders and the manner in which they accumulate power over those who follow them.  Consider the Jonestown massacre.  Jim Jones used the classic "foot-in-the-door" technique as he led his followers deeper and deeper into their commitment to him and the community he founded.  Aronson describes this process as follows at page 196:

 

Let us start at the beginning.  It is easy to understand how a charismatic leader like Jones might extract money from the members of his Church: once they have committed themselves to donating a small amount of money in response to his message of peace and universal brotherhood, he is able to request and receive a great deal more.  Next, he induced people to sell their homes and turn over the money to the Church.  Soon, at his request, several of his followers pull up stakes, leaving their families and friends, to start life anew in the strange and difficult environment of Guyana.  There, not only do they work hard (thus increasing their commitment), but they also are cut off from potential dissenting opinion, inasmuch as they are surrounded by true believers.  The chain of events continues.  Jones takes sexual liberties with several married women among his followers, who [consent], if reluctantly; Jones claims to be the father of their children.  Finally, as a prelude to the climactic event [group suicide], Jones induces his followers to perform a series of mock ritual suicides as a test of their loyalty and obedience.  Thus, in a step-by-step fashion, the commitment to Jim Jones increases.  Each step in itself is not a huge, ludicrous leap from the one preceding it.

 

This brings to mind Joseph Smith's saying that the gospel is to be revealed "line upon line, precept upon precept."  The parallels between Jim Jones and Joseph Smith, as disclosed in the preceding paragraph, are manifold.  Smith seemed to follow the pattern of someone who was surprised at the degree of power other human beings handed over to him, and who simply kept asking for more once his foot was in the door.

 

The Justification Bias

Cognitive dissonance theory predicts that if a person works hard to attain a goal, that goal will become more attractive to that person than would be the case had only a small effort been invested to attain it.  This is the concept that underlies difficult initiation rites in fraternities and primitive human societies.

 

For example, a group of women volunteered to join a group that would discuss the psychology of sex.  They were told that in order to be selected for the group, they would have to go through a screening process.  One-third of the women underwent a severe screening process that amounted to an initiation procedure, requiring them to recite aloud various offensive and obscene words.  One-third underwent a mild procedure of a similar sort.  And one-third were admitted without undergoing any procedure.  The discussion group they then joined was deliberately structured to be boring.  After the session was over, each participant was asked to rate the discussion in terms of how interesting it was, how intelligent the participants were, etc.  As predicted by the theory, those participants who made little effort to get into the group did not enjoy the discussion as much as those who underwent the severe initiation.  In fact, those who were not initiated had what might be called a "realistic" assessment of the discussion.  They viewed it as dull, boring, uninformative and a waste of time, which is precisely what the researchers who structured the discussion had designed it to be.  The severely initiated participants, however, had an unrealistic assessment of the discussion.  They found it enlightening, worthwhile, etc.

 

Another example showing the effect of entry-level commitment on long-term behavioural patterns is provided by experiments conducted respecting female weight loss.  A group of women was invited to participate in a cutting-edge weight loss program sponsored by a university.  They were told that entrants to the study were being screened and would have to pass a test before being told whether they would be selected to participate.  Half of the women were required to endure a very demanding test that required a great deal of effort on their part.  The other half were given a very easy test.  All of the women who applied were then admitted into the weight loss program, which lasted for four weeks.  During the course of the program, there was no material difference in the amount of weight lost by the two groups of women.  However, six and twelve months later, the experimenters contacted the women and found that those who had expended a significant effort to gain entry to the group had lost, on average, eight pounds more than those who had expended little effort.  That is, the amount of effort invested in order to gain entrance to the group appears to have had a significant effect on the manner in which the ideas and behaviours promoted by the group were absorbed by the women participating in the study (Aronson, "The Social Animal", page 193).

 

Psychologists call the phenomenon just described "justification of effort".  That is, if you expend a significant effort to obtain something, it would produce cognitive dissonance to acknowledge that it was not worthwhile.  Accordingly, you will tend to find it worthwhile whether it is or not.  This produces the kind of unrealistic appraisals described above.

 

There are various ways in which cognitive dissonance can be reduced in the type of situation described above.  One way, already described, is to read meaning and utility that are not present into the experience you have had.  Another way is to misremember what things were like before you obtained the goal for which you had worked so hard.  This is illustrated by another set of experiments.

 

For example, a group of students was signed up for a study skills course.  Prior to taking the course, they were asked to evaluate their study skills.  After three weeks of useless training, the objective data showed that the students were not doing any better in their course work.  They tended to reduce the dissonance created by having invested three weeks of effort in an apparently useless project by misremembering their performance prior to taking the course.  That is, by remembering that they had been doing much worse than they in fact were before the course, their performance after the course seemed better.

 

The concepts just described have numerous applications to Mormonism.  As already noted, the Mormon lifestyle requires a huge investment of time, money and effort.  And the Mormon Temple ceremony is a particularly bizarre form of psychological initiation.  The clothing worn, the things said, and the actions in which the initiates are required to engage are all otherworldly and embarrassing.  Some of these are described in the essay at http://mccue.cc/bob/documents/temple%20marriage.pdf.  That essay also references other online sources of information that more fully describe the Mormon Temple process.  Having been through an otherworldly, psychologically stressful experience such as the Mormon Temple initiation, having pledged allegiance to the Mormon Church in public during that ceremony and on numerous other occasions, and having invested all of the time and effort required by Mormonism over a significant period of time, cognitive dissonance theory would predict that Mormons would read much more meaning into their Mormon cultural experience than in fact exists there.  This is consistent with my observations.  Likewise, the last piece of experimental evidence noted above is also concordant with my observations respecting Mormons.  That is, they grossly undervalue and misremember the nature of their life prior to Mormonism.  This helps them to compare favourably what their tremendous investment in Mormonism has produced for them.  I hasten to note that this does not mean that Mormonism is useless in all or even very many cases.  Mormonism encourages many helpful habits.  However, as noted above, what Mormonism encourages is little more than universal moral values, and these values can be found almost anywhere.  Mormonism's unique claims tend toward the bizarre, and it is the radical conditioning and commitment pattern within Mormonism, pressing the psychological buttons described in this essay, that produces the seemingly profound and unquestionably moving experience within Mormonism.  Regrettably, it is this very experience that causes individuals to surrender a great deal of their personal responsibility to Mormon leaders.

 

I again note here as an aside that most of the cognitive dissonance functions I am describing are as applicable to post-Mormons as to Mormons.  That is, for example, having left Mormonism (and in most cases paid a huge price to do so), the post-Mormon would be predicted by cognitive dissonance theory to undervalue her Mormon experience and overvalue her post-Mormon experience.  We are on a two-way street here.

 

The justification bias can also be used to explain some of humanity's most apparently cruel behaviour.  For example, in 1970, during the infamous Kent State University demonstrations, four university students were shot and killed by members of the Ohio National Guard.  In the immediate wake of that tragedy a number of rumours spread.  Both of the women who were killed were rumoured to be pregnant, and therefore by implication of low morals.  The bodies of all four students were rumoured to have been crawling with lice and syphilis, such that they would likely have been dead within a short time in any event.  Those rumours were proven false.  The students who were killed were clean, bright people.  In fact, two of them were not even involved in the demonstrations that resulted in their deaths; they were simply walking across campus minding their own business.  An analysis of why the good folk of Kent would have spread such rumours is enlightening.

 

It is fair to assume that the conservative residents of Kent, Ohio were offended by the radical student demonstrations that rocked Kent State University.  It is also fair to assume that many would have hoped that the students would get their comeuppance.  However, death is a harsh punishment for demonstrating.  This could be expected to give rise to cognitive dissonance.  One way to reduce that dissonance would be to demonize the students, thus justifying their deaths.  This is likely the driving force behind the rumours just mentioned.  Unfortunately, this theory cannot be proven because no attitudinal data were collected from Kent's residents before and after the incident in question.  However, other experiments have been done which support the conclusions just drawn.  (Aronson, "The Social Animal", pages 179-180)

 

A variety of studies have been conducted in which one group of students either inflicts pain upon or insults another group.  The attitudes of the students delivering the insults or pain toward their classmates are measured before and after.  Generally speaking, those delivering the insults and pain express a lower, less human opinion of their "victims" after the pain has been inflicted.  Cognitive dissonance theory would suggest that this is done in order to justify the cruel act which has been committed.  That is, if the person delivering pain can make himself feel that the victim in some way deserved it, or was less than human, the cognitive dissonance caused by the delivery of pain to another human being is reduced.  Interestingly, in cases where the victim was given the opportunity to retaliate, this derogation of the victim did not occur.  Some experiments, for example, were set up so that the person delivering the pain was led to believe that the victim would later have the opportunity to deliver a similar type of discomfort in recompense.

 

While such experiments have not been conducted in wartime, the evidence just presented explains a well-known military phenomenon.  That is, soldiers tend to dehumanize the civilians upon whom they often inflict suffering much more than they do members of the opposing military force.  This would be explained by the fact that the civilians have no opportunity to retaliate, whereas the opposing military force does.  (Aronson, "The Social Animal", pages 181-183)

 

The research just summarized sheds light on a number of Mormon cultural practices.  First and foremost among them is the manner in which "apostates" are treated.  Apostates within Mormonism are those who have, first, declined further obedience to Mormon authority and, second, are prepared to state their views in public.  These people are particularly dangerous from the perspective of Mormon leaders because their dissenting voices may cause other Mormons to examine their experience closely and also disobey.  Apostates are referred to by Mormon leaders as a "cancer" which must be excised from the Mormon body, and with other similarly pejorative terms.

 

The manner in which Mormon leaders treat apostates is revealing with regard to the real priorities of the Mormon Church.   Mormons are not excommunicated for holding heretical beliefs.   One can believe as a Mormon virtually anything one wishes.   However, if heterodox beliefs are expressed within the Mormon community, this is clearly defined as apostate behaviour, and excommunication is the required remedy.

 

Once a person has been branded as an apostate, rumours tend to quickly circulate respecting them in the same fashion as they did respecting the Kent State students who were killed as described above.   The rumours in my case included that I was involved in a sexual affair with a variety of different women.   The identity of the woman varied from rumour to rumour, but the nature of the illicit extramarital affair was consistent.   It was also rumoured that one of my sons had become addicted to internet pornography, that I followed suit and that when Mormon authorities tried to help me overcome my alleged addiction I became angry with them and as a result began to publish “lies” on the internet about the Mormon Church.   It was also rumoured that I could no longer be trusted; that I was telling lies about the leaders of the Mormon Church; and that I was treating my family in a harsh and unloving fashion.   It was further rumoured that I had been deceived by a bunch of evil anti-Mormons and that my tendency toward intellectualism and pride had caused me to lose my formerly humble and obedient nature and to reject God, the Mormon Church and morality in general.

 

Cognitive dissonance theory would predict this behaviour on the basis that, because I had been a respected person whose views used to carry significant weight within the Mormon community, it was necessary to demonize me so that my new behaviour could be explained on a basis consistent with Mormon belief, and so that my voice would lose credibility where it used to be welcome.

 

The effect of ego on this aspect of cognitive dissonance is worth repeating.  As noted above, if I perform a cruel or stupid action, this threatens my self-esteem because it makes me think that I may be a cruel or stupid person.  But I know that I am not cruel or stupid.  Therefore, if I do something that causes you pain, I must convince myself that you deserved what I did to you.  What nasty irony.

 

Furthermore, once cognitive dissonance has been reduced by dehumanizing members of the outsider group or those that have been harmed by our actions, it becomes easier to hurt other human beings.   Hence, reducing dissonance in this fashion has potentially terrible consequences.

 

Insufficient Justification and the   “Saying is Believing” Principle

In his seminal cognitive dissonance experiments (Leon Festinger, "Conflict, Decision, and Dissonance", Stanford University Press, Stanford, CA, 1964), Festinger nicely illustrates another important aspect of cognitive dissonance that is relevant to the Mormon experience and is the flip side of the justification bias.  This is known as the principle of "insufficient justification", and has been used to explain a wide variety of odd human behaviours.  One aspect of this principle is known as the "saying is believing" principle.

 

Many experiments have been conducted with respect to the “saying is believing” principle. Some of these experiments are summarized in Aronson, “The Social Animal”. Aronson notes by way of summary:

 

If all I want you to do is recite a speech favouring Fidel Castro, the Marx Brothers, socialized medicine or anything else, the most efficient thing for me to do would be to give you the largest possible reward.  This would increase the probability of your complying by making that speech.  But suppose I have a more ambitious goal: suppose I want to effect a lasting change in your attitudes and beliefs.  In that case, just the reverse is true.  The smaller the external reward I give to induce you to recite the speech, the more likely it is that you will be forced to seek additional justification for delivering it by convincing yourself that the things you said were actually true.  This would result in an actual change in attitude rather than mere compliance.  The importance of this technique cannot be overstated.  If we change our attitudes because we have made a public statement for minimal external justification, our attitude change will be relatively permanent; we are not changing our attitudes because of reward (compliance) or because of the influence of an attractive person (identification).  We are changing our attitudes because we have succeeded in convincing ourselves that our previous attitudes were incorrect.  This is a very powerful form of attitude change.

 

Aronson later added:

… lying produces greater attitude change when the liar is undercompensated for lying, especially when the lie is likely to invoke a change in the audience's belief or behaviour.  A great deal of subsequent research supports this reasoning and allows us to state a general principle about dissonance and the self-concept: dissonance effects are greatest when

 

(1) people feel personally responsible for their actions and

 

(2) their actions have serious consequences.

 

That is, the greater the consequence and the greater our responsibility for it, the greater the dissonance; the greater the dissonance the greater our own attitude change.

 

So, there are three important principles to bear in mind. First, if we say something we don’t believe without receiving an adequate external reward for doing so, our attitudes are likely to shift in the direction of what we have said. Second, the more important the consequences for which we might be responsible as a result of saying what we did, the more likely it is that our attitudes will shift and the stronger that shift is likely to be. And third, I think that it is fair to assume that the more often a statement is repeated, the stronger its effect will be.

 

The "saying is believing" paradigm is a highly visible aspect of Mormon culture.  Young people who were raised as Mormons but do not "have a testimony" are encouraged to "bear their testimony" until they find it.  That is, they should publicly state that they believe Mormonism to be God's one and only true religion, etc., or that they want to believe this, even if they do not.  This is one of the primary techniques used by missionaries and members of the Mormon Church to "strengthen the faith" of prospective members and young Mormons, including Mormon missionaries who do not yet believe.  A large percentage of the Mormon missionaries who started their missionary service with me in 1977 fell into this category.

 

Mormon Apostle Russell Ballard referred to this practice in a talk given at a Mormon General Conference on October 3, 2004.  The story he told dates to Brigham Young, near the beginning of Mormonism, and it is fair to assume it has been told many times since then.  It is important to note that anything said by a Mormon Apostle at a General Conference is more important than scripture from a Mormon point of view.  That is, to the extent that it does not contradict scripture it is on par with it, and to the extent that it contradicts or "clarifies" scripture, the scripture is overridden.  Ballard's remarks included the following words:

 

My experience throughout the Church leads me to worry that too many of our members' testimonies linger on "I am thankful" and "I love," and too few are able to say with humble but sincere clarity, "I know."  As a result, our meetings sometimes lack the testimony-rich, spiritual underpinnings that stir the soul and have meaningful, positive impact on the lives of all those who hear them. …

 

Many years ago Brigham Young told of an early missionary in the Church who was asked to share his testimony with a large group of people.  According to President Young, this particular elder "never had been able to say that he knew Joseph [Smith] was a Prophet."  He would have preferred to just say a prayer and leave, but the circumstances made that impossible.  So he started to speak, and "as soon as he got out 'Joseph is a Prophet,' … his tongue was loosened, and he continued talking until near sun-down."

 

President Young used this experience to teach that "the Lord pours out His Spirit upon a man, when he testifies that [which] the Lord gives him to testify of" (Millennial Star, supplement, 1853, 30). …

 

The lesson, I believe, is clear: having a testimony alone is not enough.  In fact, when we are truly converted, we cannot be restrained from testifying.  And as it was with Apostles and faithful members of old, so is it also our privilege, our duty, and our solemn obligation to "declare the things which [we] know to be true" (D&C 80:4). …

 

Brothers and sisters, join together with the missionaries in sharing your precious testimony every day, witnessing at every opportunity the glorious message of the Restoration.  The fire of your testimony is all that you need in order to introduce the gospel to many more of our Father's children.  Trust in the Lord, and never underestimate the impact your testimony can have upon the lives of others as you bear it with the power of the Spirit.  Doubt and fear are tools of Satan.  The time has come for all of us to overcome any fear and boldly take every opportunity to share our testimonies of the gospel. …

 

So, Ballard is saying several things.  First, Mormons have a duty to say they "know" the Mormon Church is true more often.  Second, they should say that even if they don't believe it is true.  Third, they should ignore the feelings of fear and doubt that indicate they do not know the Mormon Church is what it claims to be.  Fourth, the act of saying something is true will cause them to "know" something they did not previously know.  And most of all, Ballard is saying that the basic premises of Mormonism are sacred, and hence unquestionable, as far as Mormon leaders are concerned.

 

When I served my Mormon mission, a large percentage of the missionaries who entered Mormon missionary service did not have a "testimony".  It is commonly believed within the Mormon community that young men are sent into the mission field first and foremost for their own good, that is, to get their own testimony and become firm in the faith.  And they are encouraged to find their testimony by bearing it.  That Ballard would say this at a Mormon General Conference is not surprising, since he has been a key player in the formation of Mormon missionary strategies for decades and has consistently taught this principle in that context at least since I was a missionary in the late 1970s.

 

Think of how the principles Aronson outlined above would be likely to apply to one of the many Mormon missionaries who starts his missionary service without a testimony and then, many times a week for two years, stands in public wearing an authoritative-looking suit and bears solemn testimony in God's name with regard to the truthfulness of the Mormon Church.  Initially, he mouths these words while memorizing the missionary discussions, whether he believes them or not.  When it is his turn to teach a lesson to an "investigator" or make a presentation on a doorstep, he usually says what he is required to say much as would a door-to-door home alarm salesman.  Except he is not paid to do this.  In fact, he knows that he is sacrificing his and his family's money and time in order to have the privilege of bearing this testimony.  So, either what he says is true, or he is a liar (or fool) to have said it.  Since few people like to admit that they have been fooled or are liars, the easiest conclusion to reach is that the statements made must be true.  He also knows he is encouraging the people who hear him to make a commitment that will absorb a huge percentage of their lives and change the course of their lives in dramatic fashion.  This situation is calculated to produce the maximum attitudinal change in those young missionaries.

 

Something similar happens when regular Mormon members bear testimony to their friends and neighbours, and it is intensified if any of those friends become Mormons as well.  This is why Mormon leaders like Ballard are constantly after the members to do missionary work with their friends, and to bear their testimonies.  That is not to say this is a conscious strategy on the part of Mormon leaders.  Rather, there is a strong correlation between members who bear regular testimony and members who remain faithful, hence testimony bearing is encouraged.  Cognitive dissonance theory, and the principle of insufficient justification in particular, provide a cogent explanation as to why this is the case, and it has nothing to do with the truth of Mormonism's claims.  Not surprisingly, a similar strategy works well for the Jehovah's Witnesses and many other religious groups.

 

Mormon leaders justify the practice of encouraging people to say things that they don't believe on the basis that those things are certain to be true, so even if the person saying them does not believe them to be true, she is still telling the truth.  So, testimony bearing is a fundamental part of Mormon culture.  Each meeting, class, Mormon activity, etc., is opened with prayer.  Married couples and families are counseled to pray together at least once each day.  Most Mormon prayers are an implicit bearing of testimony; a certification that the Mormon Church is God's true Church.  From the time they are able to speak their first words, little children are encouraged to utter such prayers.  They do so at their meetings on Sunday, and at home on a daily basis with their families.  This occurs both in private, with Mom and Dad initially saying the words for the child, and in public, before family members in the home each day and later in larger groups at Church.  Daily personal, spousal and family scripture reading is also counseled.  This leads naturally to testimony bearing.  Formal testimony bearing is part of every lesson presented at a Mormon Church meeting or activity, and of every speech (the "talks" given by Mormons) presented in Church services.  Young people, again, begin to give these talks on a regular basis starting at age three or four.  They are encouraged to bear their testimony each time they stand up and give a talk.  Most adults have teaching responsibilities within the Mormon Church.  They also bear their testimony each time they stand before the congregation to teach.  Mormon hymns are another form of testimony bearing.  Starting at age 18 months, Mormon children are taught to mouth the words to songs that testify to the truth of the Mormon message.  Each week these songs are sung at Mormon worship services for children, teenagers and adults.  Mormons are encouraged to sing these songs in their homes during weekly Family Home Evenings and to have them playing in the background at other times.

 

Once a month, each Mormon congregation has a "fast and testimony" meeting.  This is a meeting held at the end of a Sunday on which faithful Mormons abstain from food and water for a period of 24 hours.  Toward the end of that period, the testimony meeting occurs.  Going without food weakens the body and the intellect, making them more susceptible to emotional experiences, and these meetings are intended to provoke emotional experience.  Feelings are shared with regard to the importance of family and community, and part of every testimony is a formula which has been laid down by Church leaders: that Joseph Smith was a prophet, that the current prophet (whoever he may be) is also God's only prophet on Earth, and that the Mormon Church is the one and only true Church of God on the Earth.  No dissenting opinion is permitted.  And a steady stream of members approach the pulpit to express their faith in public.  It is thought "cute" to have small children stand up before the congregation to utter the words "I know the Church is true; I know the Book of Mormon is true; etc."  Special, highly charged testimony meetings are held for teenagers at "Youth Conferences" and other special youth meetings, which for many young Mormons are where the first visible glimmers of testimony are felt, and magnified.  The short story "The Missionary" explores this process (see

http://home.mccue.cc:10000/bob/documents/rs.the%20missionary.pdf).

 

A variety of other similar examples from within Mormon culture could be described.  I do not accuse Mormons of consciously planning to brainwash their children and those who investigate the merits of the Mormon Church; however, the system just described could hardly be better designed for that purpose.  This design is not purposeful, but rather functional.  Organizations are like organisms.  Those that survive have, through trial and error if not conscious planning, found tools that work to get them the resources they need to survive.

 

The saying is believing paradigm is also illustrated by experiments related to safer sex programs.  In some experiments, college students were asked to compose a speech describing the danger of AIDS and advocating the use of condoms.  All the students were easily persuaded to do this because they agreed with the ideas about which they were asked to write.  One group of students was asked simply to write the speech.  Another group was asked both to write the speech and then to recite it in front of a video camera, after being advised that the resulting tape would be played to an audience of high school kids.  Half of each group of students were made aware, before writing the speech, of their own past failures to use condoms by being required to make a list of circumstances in their own lives in which they had failed in that regard.  This meant that the students who made a video to be viewed by high school kids while already mindful of their own failure to use condoms were in a state of high dissonance.  This dissonance was caused by their awareness of their own hypocrisy: they were telling a group of high school students to behave in a fashion other than that in which they themselves behaved.  Cognitive dissonance theory would predict that, in order to reduce the feeling of hypocrisy this experience created and hence maintain their self-esteem, they would change their behaviour to make it concordant with what they were saying.  And this is precisely what the experimental data disclosed.

 

Something similar was found with regard to water conservation, which has long been a problem in California.  At the University of California at Santa Cruz, during a severe water shortage, students were approached on their way to shower at the university field house and asked to fill out a questionnaire measuring their attitudes with regard to water conservation.  One group of students was asked if they would sign a poster encouraging others to conserve water.  Another group was asked both to sign the poster and to respond to a water conservation survey consisting of items designed to remind them of their pro-conservation attitudes and of the fact that their showering behaviour wasted water.  The students were then surreptitiously followed into the shower room and the length of their showers was measured.  Just as in the condom experiment described above, only the students who had been placed in a high dissonance condition modified their showering behaviour.  That is, only where students had been induced both to advocate shorter showers and to recall their own prior inconsistent behaviour did they modify their current behaviour.  Those who were put in this condition had showers that averaged just over 3 ½ minutes in length, a far shorter period of time than was used by the students not in a high dissonance condition.

 

This aspect of cognitive dissonance theory is explanatory with regard to a variety of Mormon practices.  One of the interesting aspects of Mormon culture, as noted above, is the manner in which Mormons are led to believe that they are free to choose whatever they want; that free will is a fundamental aspect of the Mormon belief system; that they have access to all information that is relevant to their choice of belief system and behavioural practices; etc.  The truth is, of course, quite different.  This is similar to the manner in which the students' beliefs with regard to the importance of practicing safer sex and using condoms diverged from their behaviours.  And the reasons for the diverging beliefs and behaviours are similar.  The Mormon tendency to avoid information that questions their beliefs is based on fear of the damage to important relationships that acceptance of this kind of information could cause.  Student behaviour which causes them not to use condoms is also based on emotional forces, in that case related to the sexual drive.  Hence, one might predict that it would assist a faithful Mormon in addressing the inconsistency between her beliefs and behaviours if she were first induced to make statements in favour of freedom of thought, access to information, etc., then perhaps to "bear her testimony" with regard to such beliefs in public, and then finally confronted with a situation that required her to consider information that questioned her own belief system.  Having produced a high dissonance environment for the faithful person in question might help her to overcome her fear, a powerful emotional force similar to the sexual drive, that ordinarily causes many Mormons to behave in a fashion that is inconsistent with one of their most important beliefs.

 

For the most part, the forces of cognitive dissonance related to testimony bearing are brought to bear in the fashion indicated above to bind faithful Mormons to their belief system and make them resistant to any information that questions that system.  For example, Mormon leaders in particular are constantly placed in a position that requires them to stand and publicly affirm their belief in the Mormon Church.  This predictably puts them in a high dissonance position with regard to any information that questions that belief, which results in the suppression or denial of that information in the fashion described above.  For this reason, Mormon leaders tend to stay faithful.  While I was a Mormon Bishop, I had ample opportunity to consider information that should objectively have shaken my beliefs.  That information rolled off me like water off a duck's back.  It was only after a period of years during which I did not have heavy Mormon callings requiring me to constantly bear testimony that I was able to gain some emotional perspective and critically examine my own beliefs.

 

Another thing that anchors Mormons to their belief system, and something that is emphasized in the testimony bearing behaviour described above, is the fact that Mormonism does teach many good values.  Mormons are taught that education is important.  They are taught that families are important.  They are taught that morality and honesty are important, etc.  This constant mouthing of positive attitudes makes it easy for Mormons to believe that what they are involved with is good.  However, their limited perspective makes it difficult for them to see that each of the good things they are taught by their Mormon leaders and peers is subject to an important condition.  That is, education is important, but only so long as it does not question Mormon authority.  This radically diminishes the utility of education provided within the Mormon community.  One of my daughters recently provided me with the following story.  She seldom attends church anymore, but a couple of weeks ago went to see some friends.  She was sitting in the foyer with three older girls who are all in university.  They were approached by a young man, also a university student.  He was strutting a bit, obviously trying to impress one or perhaps all of them.  At one point he said:

"Yeah, I was in my physics class the other day and the prof starts talking about the big bang theory.  I listened for a couple of minutes and then just couldn't take it any more.  So I put up my hand and said, 'What you are saying offends me, and I don't think I should have to sit here and listen to it.'"

"Pardon me?", said the professor.

"What you said offends me," replied this faithful young paragon of Mormon intellectual virtue.  "It is not proven to be true; I don't think it's true; and it offends my religious beliefs."

 

After some discussion, the professor politely told the young man that if he wished to leave the classroom, he was welcome to do so. So he left. He told this story with his chest puffed out, obviously proud of how he had been valiant in the defence of his faith.   Three of the four girls listening to him tried to restrain the horror, mirth and disdain the proud young man standing before them had inspired.   The fourth congratulated him for standing up for his beliefs.

 

I do not suggest that the Mormon Church teaches the particular piece of nonsense this young man was spouting, although lots of Mormons share his belief.  The problem is in the paradigm he was using.  He had a religious belief, and the simple fact that it was a religious belief, instead of a belief of a different kind, put it beyond challenge.  I suspect that only someone in religious authority would be able to straighten him out.  What science had to say respecting the matter was not only irrelevant, it should be actively resisted because it questioned something that it is wrong to question.  He was doing precisely what he had been taught by Mormon leaders, such as those quoted above, to do.  Perhaps the most disturbing aspect of this story from my point of view is how it illustrates not only that Mormonism succeeds in inculcating ignorance within its membership, but that this ignorance is becoming increasingly wilful and militant.  Given the fear Mormon leaders use to shape the behaviour of the faithful, as noted below, none of this should surprise us.

 

Cognitive Biases and Memory

The confirmation, authority and other biases are aided by the nature of human memory.  Elizabeth Loftus, a world-renowned memory expert and University of Washington psychology professor, has noted:

 

Memories don't fade … they … grow.  What fades is the initial perception, the actual experience of the events.  But every time we recall an event, we must reconstruct the memory, and with each recollection the memory may be changed – colored by succeeding events, other people's recollections or suggestions … truth and reality, when seen through the filter of our memories, are not objective factors but subjective, interpretative realities. (Shermer, Why People Believe Weird Things, p. 182)

 

Loftus provides numerous examples of how easy it is to suggest to people that they have had an experience, and cause them to believe that they really had it (see "Memory, Faults and Fixes", Issues in Science and Technology, Summer 2002, reprinted in "The Best American Science and Nature Writing" (2003 Edition) at p. 127).  Of particular note are certain experiments that have been conducted to illustrate the way in which our memories and current perceptions are shaped by how we think others have perceived the same event we did.  For example, subjects might be shown a series of slides depicting an event or actually witness a staged event, such as a theft or a traffic accident.  Then, the subjects would be given additional information concerning the event.  The post-event information given to one group would contain material that contradicted some details of the actual event, such as a stop sign being described as a yield sign.  The post-event information provided to a second group of subjects (the control group) would contain no such conflicting information.  After ingesting the supplemental information, all subjects would be given a test concerning what they witnessed.  In all of these experiments, the subjects who were given the misleading supplemental information performed more poorly than control subjects respecting the items regarding which they had been given misleading information.

 

This research sheds light on how Mormon testimonies are created.  Once we have heard enough other people say, for example, that they felt God's spirit when they read the Book of Mormon, we are capable of manufacturing similar "memories" out of thin air and the words we have heard.  And the more authoritative, credible, loving, etc. the people who suggest these things to us, the more effective they are likely to be.

 

Again, the term   “denial” is often used by those observing Mormon behavior when it is likely that the memory construction function described by Loftus and others, when coupled with the way in which Mormon culture works, provides an adequate explanation for the odd Mormon perception of past and current reality.

 

Interpretative Ambiguity

Our tendency to be unconsciously influenced by biases and cognitive dissonance is captured in another way by research summarized by Harvard's Daniel Gilbert at http://www.edge.org/3rd_culture/gilbert05/gilbert05_index.html.  He notes that we tend to underestimate the power of random processes to create order, and hence we often seek explanations where none are needed.  This in turn leads us to sometimes erroneously assume that a god or some other powerful agency must be at work with regard to things that science has recently been able to show are possible on the basis of simple random chance operating over long periods of time.

 

He also notes that we tend to accept as authoritative nice-sounding statements that upon close analysis are found to say little or nothing.  These statements end up supporting the person or organization responsible for them and hence are part of the authority bias.

 

Gilbert also had some useful comments regarding our tendency to interpret ambiguous phenomena in only the way we are rewarded for interpreting them.  He notes that if we glance at a "Necker" cube (see the link above), we have the sense of looking through the lines that define the cube at a dot on its left inside corner.  But if we stare for a few moments, the cube suddenly shifts, and we seem to be looking down at a box that has a dot sitting on its lower left edge (see www.dogfeathers.com/java/necker.html for a similar illusion).

 

These illusions are ambiguous objects and hence can be interpreted in more than one way.   Our brains tend to jump between different possible interpretations until we are rewarded for seeing the illusion in one way.   The reward might be a dollar bill, or a friendly pat on the back.   Then, our brains begin to see only the rewarding view.

 

Events are much more ambiguous than graphic representations like a Necker cube, and hence the mental phenomena demonstrated by that cube should be expected to heavily influence our ability to see life as it is, or as it can be interpreted, in cases where our bread (or cube?) is buttered on only one side.

 

The human tendency just noted explains many aspects of our behavior.  For example, this is why most people adapt so well to most tragedies and windfalls.  And of course, if we are part of a social group that powerfully rewards interpretations of life that support the idea of a particular kind of god and religious belief, and punishes alternative views, it is not surprising that we will tend to perceive reality in a manner that is consistent with those beliefs.  This explains the remarkable "blindness" that religious believers of all stripes display from the point of view of non-believers.  And by that I mean that Mormons can see this blindness in JWs or Muslims, and they can each see it in Mormons, but none can see it in themselves.

 

The Necker cube kind of research also suggests that people often mistakenly attribute good fortune that is the result of circumstance, or perceived good fortune that is the result of the brain looking on the bright side of ambiguous events, to a helpful external agent.  For example, Gilbert describes one of his studies in the following terms:

 

… female volunteers were told that they would be working on a two-person task that required them to have a teammate whom they liked and trusted. The volunteers were shown four folders, each of which contained the biography of a potential teammate. They were told that before reading the biographies they must choose a folder randomly, and that the person whose biography was in the chosen folder would be their teammate. The volunteers looked at the four folders, chose one randomly, and then read the biography they found inside. What the volunteers did not know was that the experimenter had put the same biography in all four folders, and that it was the biography of someone who was not particularly likeable or trustworthy.

 

So what happened? As the volunteers read the biography, their brains naturally did what brains do best: They searched for, found, and held on to the best possible view of the teammate (“Her bio says that she doesn’t like people all that much, but I bet she’s just an exceptionally discerning person”). When volunteers finished reading their new teammate’s biography, they were given three other biographies to read, and they were then asked to rate all four of the biographies. Not surprisingly, the volunteers rated their teammate as superior to the others. The volunteers liked their teammates best because they had brains that knew how to find the most rewarding view of their current circumstances.

 

Now comes the interesting part. After the volunteers read and rated the biographies, the experimenter took the volunteer aside and made a confession. The experimenter explained that while the volunteer had been “randomly choosing” a folder, the experimenter had been using a subliminal message to try to make the volunteer choose the best possible partner. This wasn’t true, of course, but the volunteers believed it. Then the volunteers were asked the critical question: “Do you think the subliminal message had any effect on your choice of folders?” The results showed that, by and large, volunteers thought the subliminal message had guided their choice of folders. Although they had been given a relatively dislikeable teammate, their brains had managed to find a rewarding view of that teammate; but because they did not know that their brains deserved the credit for their good fortune, they gave the credit to a subliminal message. After all, they clearly chose the best possible teammate, and there had to be some explanation for their extraordinary luck!

 

This study wasn't about subliminal messages, of course. Like many psychological studies, this one was meant to be an allegory. It suggests that under some circumstances people can misattribute the uplifting work that their brains have done to a fictitious external source. Brains strive to provide the best view of things, but because the owners of those brains don't know this, they are surprised when things seem to turn out for the best. To explain this surprising fact, people sometimes invoke an external source - a subliminal message in the laboratory, God in everyday life.

 

I think Gilbert's explanation of the Necker cube and other research nicely explains how Mormons and many other religious people come to perceive as they do the ambiguous events that form the reality of human experience, and, more importantly, why alternative interpretations of the same events are invisible or nonsensical to "believers" while being both obvious and compelling to others who are not rewarded for them.

 

Gilbert concluded by noting that one of his co-researchers in the experiment just described is a seriously religious person.  Gilbert asked him how he felt about having demonstrated that people can misattribute the products of their own minds to powerful external agents. He said, "I feel fine. After all, God doesn't want us to confuse our miracles with his," thus inadvertently buttressing the research's conclusions.  He drew lines to define the logical implications of his research, which are of course ambiguous, in a manner consistent with the beliefs for which he is rewarded.

 

Gilbert notes that "Science rules out the most cartoonish versions of God by debunking specific claims about ancient civilizations in North America or the creation ex nihilo of human life. But it cannot tell us whether there is a force or entity or idea beyond our ken that deserves to be known as God."  I agree with him.  Regrettably, many Mormon beliefs are of the cartoonish sort he described.  These can only be maintained within the kind of mental walls this essay attempts to describe.

 

The Emotion of "Elevation"

It has also been shown that certain experiences that cause the emotion of "elevation" to occur are highly influential with respect to our behavior.  When people see unexpected acts of goodness, they commonly describe themselves as being surprised, stunned, and emotionally moved.  When asked "Did the feeling give you any inclination toward doing something?", the most common response is to describe generalized desires to help others and to become a better person, and feelings of joy.  These feelings bind human groups together, and so create strong, reliable communities.  Members of Mormon communities exhibit this kind of behavior.  However, the behaviors in question often also bind the participants to the Mormon institution itself.  For example, by leaving on a mission for two years, a young man in the Mormon community inspires precisely the kind of emotion described above.  He is also subjecting himself to a powerful conditioning force that will make it much more difficult for him to "question" when he returns, and he is keeping himself very busy during precisely the period of time during which most young men question.  Hence, the community is strengthened by an act that inspires the emotion of elevation, and at the same time the missionary engages in a wide range of behaviors explicitly intended to strengthen the Mormon community in other respects while truncating the missionary's individuation process and so arguably weakening him.  Many Mormon conventions have this kind of effect.

 

As noted above, the prize religion offers is huge - relief from the anguish caused by our greatest existential fears.  And the Mormon Church ups the stakes significantly in this regard by positing the possibility of eternal family life, and has created a society in which an admission of disbelief often costs dearly in terms of marriage and other family relationships, social status, etc.  In the face of this kind of prize/penalty structure, along with a large group of believers and a powerful storytelling mechanism, we should not be surprised that apparently rational people are easily persuaded to believe in irrational, extremely low probability versions of future reality such as the Celestial Kingdom.  And when you add to this the psychological pressure created by being surrounded by believing Mormons for most of my life, bearing public testimony on countless occasions as to the certainty of my belief, and then being placed in leadership positions within the Mormon community, it is not surprising to me that for almost three adult decades I was unable to see what is now so clear to me respecting the Church and the manner in which it treated me and continues to treat others.

 

Information Suppression

The Mormon Church enhances the power of its sacred premises and the emotional and social mechanisms that support them within its membership by suppressing the troubling aspects of its history and threatening those who dissent with being excised from their primary social group (see http://home.mccue.cc:10000/bob/documents/come%20clean.pdf).  This, of course, likely deepens and extends the average Mormon's denial.

 

Mormons are taught that those who write the "real" history of Mormonism are not trustworthy.  Because obedience to leadership authority is paramount within modern Mormonism, I chose not to read anything that questioned my religious leaders or the beliefs they approved.  Most faithful Mormons do likewise.  And so I made it to age 44 in a state of almost complete ignorance respecting the most important aspects of Mormon history and culture formation, and hence of the information most relevant to whether the spectacular claims Mormonism makes with respect to exclusive divine authority, and so to being God's "only" true church on earth, are justifiable.  Ironically, I considered myself well informed respecting Mormonism and religion in general, and was looked up to within the local Mormon community in that regard.

 

The suppression of information relevant to belief within the Mormon community slows the rate at which disconfirming information, and hence cognitive dissonance, builds within community members.  Changes, however, are afoot in this regard, largely forced, in my view, by the Internet.

 

I have noted, for example, an increasing tendency (even among Mormons who are moderately literate in Mormon history) to approach what is news to Mormons with regard to their own history in a manner consistent with how a General Authority dealt with me.  That is, during a lengthy interview with me regarding my Mormon history and other concerns, and after acknowledging the existence of the faithful history policy, accepting that I had been taught not to look or question, and accepting that, being an obedient member, I felt obliged to do what I had been taught, he somehow concluded that it was my fault that I had not looked and questioned.  He basically said that I had no one to blame but myself for my ignorance of the reality of LDS history.  He said that the information was there for me to see whenever I made the slightest effort to find it.

 

Recently, on an Internet bulletin board for faithful Mormons, I saw one Mormon gently rebuke another for her naiveté and hence her concern about Joseph Smith's polyandry.  The more knowledgeable of the two referred the other to the LDS.org website and the page there that shows that Smith was married to women who were married at the same time to other men.  "See," she said, "the Church isn't hiding anything."  I could not tell if the other woman accepted this disingenuous response or not.

 

Let me illustrate how this kind of thing could be spun by faithful Mormons who want to reinvent their faith, or by post-Mormons who are trying to get their loved ones to think in broader terms.

 

First, while what the General Authority said to me makes no sense on its face, what he may have been trying in an awkward way to tell me was that Mormon leaders and the faithful members use very different communication paradigms.  The leaders want obedience - there is no question about that - and hence their rhetoric gets pretty strong as they attempt to rein in the disobedient.  However, this General Authority told me that he never took the "don't look; don't question" rule seriously, and so always looked and questioned.  He interpreted this advice as being aimed at those who were too weak to handle looking and questioning.  That is, in his view, it was okay to look and question as long as you still obeyed.  So, in effect the advice had an implicit condition (don't look only if looking will make you disobey), and if you were too dull to figure that out, too bad for you.  So, he said, "Too bad for you, Bob."  While this kind of reasoning makes my head dance, it is the only way I can make sense of what this man told me without taking the position that he was simply lying.  I don't believe that he was doing that.  The human ability to rationalize renders lying, generally speaking, unnecessary.

 

Here is a second, and more metaphoric, way of interpreting what the General Authority told me.  I am confident that these ideas were not going through his head when we spoke since, if they were, he is the type who would have gloried in illustrating how deep his thinking was when compared to mine.

 

The reasoning implicit in the General Authority's explanation of his interpretation of the "don't look; don't question" rule could be said to underlie the Adam and Eve narrative as interpreted by the Mormon Church.  That is, God said "don't eat the apple" while fully intending that it be eaten.  So Mormon authority may not be intended to be as absolute as it appears.  And in both cases, knowledge is the forbidden fruit that, once ingested, changes the world.  The Mormon Church appears to wish to enforce the strict interpretation of the "don't look; don't question" rule.  But as the General Authority himself noted, when push comes to shove they don't really mean it.

 

I would extend this analogy to say that in the end, we become our own authorities.  This is the fundamental philosophy of the Mormon plan of salvation - personal choice.  Decide what kind of an existence you want, determine the laws that must be obeyed to obtain it, and go out and create your world.  I would say that is inherent in the most basic of the concepts Joseph Smith taught.  But I don't believe that is what he intended, as indicated above in the summary related to the Mormon conception of freedom.  I could just as easily have used quotes from early Mormon leaders, but chose contemporary leaders to make the point that this warped conception of freedom is still in use today.  Smith was working with stock mythic themes, most of which he borrowed from other places and cobbled together in an imaginative way.

 

Finally, here is the third way to interpret Mormonism's actions relative to faithful history in light of what the General Authority said.  This, in my view, is the reality of the situation.  The double-speak communication described above puts Mormon leaders in a position in which they can do what the General Authority did to me - interpret words in ways that are counterintuitive.  This might be called the Humpty Dumpty linguistic paradigm - words can mean almost whatever we wish them to mean.  And so, Mormon leaders told us not to look or question and now say that they did not really mean it.  Most reasonable observers would assume they have misrepresented both this and Mormonism's history while proclaiming the importance of truth and honesty to help keep the members in line.  And the powerful socialization forces described in this essay have a tendency to inculcate obedience, while the members feel free because personal agency is central to LDS theology and philosophy.  This maximizes the power of Mormon leaders.

 

When we apply David Sloan Wilson's evolutionary paradigm (see "Darwin's Cathedral") to this mess, I see a clear pattern.  Wilson says that organizations evolve as biological organisms do, as a result of similar processes.  While I don't think the analogy between biological evolution and social evolution is as complete as he suggests, there are certainly similarities.

 

When we think of Mormonism in this context, a pattern emerges.  Mormon leaders have, from near Mormonism's beginnings, had to twist words and concepts away from their usual meanings in order to evolve and deal with the challenges they faced from time to time.  It has been pointed out to me that Joseph Smith did not technically lie about his sexual relationships; he just carefully used language so as to mislead.  He was a lot like Bill Clinton in more ways than one.  The same thing happened relative to Brigham Young's taking control of Mormonism contrary to Joseph Smith's apparent wishes.  More word twisting.  And then when polygamy was done away with, linguistic pretzels were tied for over a decade, and outright lies were told during that same period of time.  More creative wordsmithing occurred when the priesthood was granted to black men.  This tradition continues to this day as Mormonism continues to change without wishing to appear to change.  And this is what I see in the General Authority's interpretation to me of the "don't look; don't question" rule.  The leaders are still trying to do what they can to maximize their influence.  As noted above, the key to understanding evolutionary theory is constantly asking "who benefits?"

 

Survival is far more important to an organization than the welfare of any individual member.  The leaders can't articulate this, because they are on the inside and so "misrecognize" what they are doing.  They are the individual bees who do a particular dance that sets off other actions, which set off others, etc., until a decision is taken and group action occurs that is beyond the control of even those who think they lead.  It takes outsiders armed with analytical tools honed in other contexts to see this pattern and shed light on it.

 

If an individual is aware of the conflict between his interest and the Mormon institution's interest, he can vote with his feet and avoid the harm an organization struggling for survival in a changing world may seek to inflict on him.  The more members who are informed enough to vote with their feet in this fashion, the greater the evolutionary pressure brought to bear on the institution.  If it can't respond quickly enough, it will cease to exist and other, more flexible organizations will take its place.

 

For example, the "faithful history" policy is Mormonism's reaction to vast amounts of information that question its foundations suddenly becoming accessible to large numbers of its members.  This policy caused the education of Mormons respecting any of the branches of science and history that might question LDS orthodoxy to take a Dark Ages-style step backwards.  This harms individuals and strengthens the Mormon institution, and can only be justified through the use of the kind of mumbo jumbo summarized above.  However, now that the members are becoming aware of faithful history itself, it is becoming an organizational liability and may have to be eliminated in order to save the institution.  I see in the "all the information has been there all along" approach the beginnings of the end of "faithful history".

 

In my view, the Mormon institution will have to evolve more quickly during the next decade than at any other time in its history in order to survive in the long term.  I see Gordon Hinckley's recent attempts to take Mormonism "mainstream" as another indication of the changes that are coming.  Only time will tell if this will be enough.

 

The Magic of "Misdirection"

Information suppression, and the time control to which Mormons are subject, are classic examples of how "misdirection" can be used to control what we perceive.  See http://www.leirpoll.com/misdirection/misdirection.htm for examples.

 

As the legendary magician Jean Hugard said,

 

“The principle of misdirection plays such an important role in magic that one might say that Magic is misdirection and misdirection is Magic”.

 

That is, magic is performed by the magician using tendencies in human perception to make us look at his left hand while his right hand (or foot, or assistant, etc.) does something that we do not notice and gives the impression that magic has occurred.

 

One of my favourite magic tricks (and one of the few that is simple enough for me to do) is performed as follows:

 

  • A group of people is seated in chairs and watching the trick.

 

  • I put my hands in front of the subject's face, and about a foot away from her nose.

 

  • I show her a handkerchief with my left hand, and then while moving my hands around each other in a circular manner that is supposed to look confusing but not be confusing, I stuff the handkerchief into my closed right fist so that an edge is still visible.

 

  • While doing so, I close my left hand into an identical fist.

 

  • I then ask her where the handkerchief is.

 

  • She points to my right hand.

 

  • I repeat this procedure twice.

 

  • Each time the subject easily spots the handkerchief.

 

  • Then, having defined the "relevant space" and "relevant actions" for my subject, I know that her attention will be focused on the area around my hands in front of her and on what my hands have done the past three times.

 

  • So, this time as I move my hands in precisely the circular motion I have trained her to watch, I release the handkerchief from my left hand so that it flies quickly over her head.

 

The flight of the handkerchief is obvious to everyone else in the room because they stand at a distance from the action that allows them to see the handkerchief as it hangs in the air for a second and falls to the floor.  That is, their perspective enables them to avoid the "misdirection" that tricks the subject.

 

However, the subject only has the chance to see the handkerchief as it moves about 12 inches before passing out of her field of vision, and she is focused on the area a few inches around my hands.  While the human eye is quick enough to pick up motion of this sort, when "misdirected" it will not do so.  The subject looks foolish when she assumes that the handkerchief is in the right hand again, and is amazed when it is not in either hand.

 

Mormons have their attention so focussed on the minutiae and ritual of life within the Mormon community that they are in the position of the subject in my trick.  Handkerchiefs of all kinds are flying right and left over their heads without them being able to see them.  But of course, those who are not so involved in Mormonism - who stand at a distance - can see what is going on and react a little like those who watch me trick my subject with the handkerchief.

 

But then the tables turn.  Mormons can look at many of those who laugh at them (some Evangelical Christians, for example, who believe that the Earth is 6,000 to 10,000 years old and that Noah really did get all those animals into the Ark, etc.) and see hankies galore flying around that are invisible to those believers themselves.

 

Such is the nature of cultural misdirection.   We all have blind spots that only others can help us see.   If we cling to the belief that we are uniquely capable of assessing truth and so refuse the help of others whose perspective is in some ways broader than our own, we will be forever blind in some ways.

 

While this may belabour the point, I think I should outline one more "misdirection" experiment.  The point here is that misdirection in magic is based on weaknesses in the human ability to perceive that psychologists and neuroscientists now study.  My favourite object lesson in this regard can be found on the Internet (see http://viscog.beckman.uiuc.edu/grafs/demos/15.html, if you have a Java-enabled computer).  It is a video clip of people passing a couple of basketballs between them.  Five (I think) people are dressed in relatively light coloured clothes, are walking in a complicated pattern and are passing two light coloured balls between them.

 

If you can access this video, you may as well perform the experiment on yourself.  So before reading further, watch the video and count the number of times the balls move from one person to another.  This is not easy to do because of the way the people move in front of each other while passing the balls around.

 

After this short video ends (maybe 30 seconds), you are asked if you noticed anything "odd".  I didn't.  "You didn't see the gorilla?" you are asked.  "Nope" was my response.  So you replay the video.

 

While the people are walking through their pattern and passing the ball, a man dressed in a black gorilla suit walks into the middle of the group, turns toward the camera, beats his chest and makes a face, and then walks out of the frame.  It is that obvious.  And I did not see it because I was focused on who was passing the ball to whom, and the gorilla was dressed like the background (dark) instead of the figures (light).  But once I knew that something "odd" had happened and paid close attention, it was as obvious as the computer sitting right now on the desk in front of me.  It seemed "magical" when the gorilla appeared, as if out of thin air.

 

Such is the power of misdirection.  And it is far from just a religious phenomenon.  It applies to politics, economics, social relationships of all kinds, etc.  It is one of those fundamentally important things to grasp if one wishes to understand as much as possible of human behaviour, both individual and social.

 

To show how deep this runs, consider how the unsettling story of progress setting traps that destroy entire civilizations is really about the human tendency to focus on social fine points (like how quickly our economy is growing) while missing critical big picture imperatives (like global warming).  Jared Diamond tells this story in "Collapse" (see http://www.newyorker.com/critics/books/?050103crbo_books and http://www.davidbrin.com/collapse.html).  For a shorter and much more accessible (if darker) version of the same story, see Ronald Wright's "A Short History of Progress" (see http://blogs.salon.com/0002007/2005/03/23.html).

 

So, if we are sufficiently focused on the minutiae of living a Mormon life, the big picture will not be questioned.  Hence, Mormonism (and many other religions that use the same system) is all about the details, routine and ritual of daily living, and results in such a busy day-to-day existence that there is little opportunity to think about where the train is headed.

 

This is not the result of the plan of some evil men sitting around in the Salt Lake Temple. Rather, this is how human social organizations of all types to some extent function, as already noted. They spontaneously organize to protect themselves, find the resources they need to flourish, etc. The reason that the rules of modern democracies are so important is that they run against the hierarchical grain of human groups, and so force human organizations in an unnatural direction. This requires leaders to account to members; this restrains the natural direction of hierarchical power; this requires information about how and why leadership decisions are made to be disclosed to the members.

 

Perhaps the clearest lesson from human history is that absent the constraints that democracy imposes on the power of those at the top of the social pyramid, power will be abused.

 

“Spiritual Experiences”

I have had many intense, moving spiritual experiences that, given the premises of the Mormon belief system as noted above, should be expected to buttress the belief that Mormonism was literally true and so increase the force of denial in my life.  These experiences included the loss of the sense of self that Newberg et al. describe in "Why God Won't Go Away", which I have had many times while praying about the Book of Mormon and in other Mormon religious contexts.  During these moments I felt connected to a source of love and peace that was at times overpowering and more attractive than anything else I have felt.  I also had many experiences while serving in various church leadership capacities, and particularly while serving as Bishop, during which I felt like I was the conduit of "pure knowledge" while counseling those in need or while giving priesthood blessings.  One particularly powerful experience occurred as I prepared to officiate and speak at an especially tragic funeral.  I still marvel at the power I felt, and the insights that came to me, at that time.  Finally, I had many intensely moving experiences in a family context while giving or receiving priesthood blessings (see http://mccue.cc/bob/documents/rs.the%20blessing%20chair.pdf for a look at the dark side of that), and wonderful experiences connected to the births of some of my wife's and my seven children.

 

How can we understand experiences of this nature?  How can such seemingly wonderful moments spring from something as bankrupt as literalist Mormonism seems to be?  Are these experiences not "good fruits" that indicate God's hand somehow works through Mormonism despite all of its obvious difficulties?

 

Sir Isaac Newton's history is instructive.  Among other things, he had worked out an approximate date for Christ's second coming based on a Biblical numerological system, and believed that he (Newton) was being guided by God in his scientific work to prepare the way for that day.  Newton's contributions to science were rejected by the scientific establishment of his day as "magical" and "unscientific" because, while describing gravity, Newton could not explain it from a cause and effect point of view and at the same time was calculating the time of Christ's second coming.  Newton was widely considered to be a dangerous, and very bright, nutcase.  One of Newton's most important promoters was the deist Voltaire, who, when questioned about his support for a religious lunatic, said that Newton had been so right with regard to so many things that his silliness regarding certain religious matters merely proved he was human.

 

There are countless other examples of people whom history has judged as brilliant in many ways and misguided in others.  Many of these people held (or still hold) strong religious convictions that most educated Mormons would reject as ridiculous.  These convictions are often based on "spiritual experiences" that are amazingly similar to what I described above.  I have found it helpful to study the various ways in which the lives of these people have been explained by social science and neurology, and then apply the same research to Mormonism and, in particular, my own experience as a faithful Mormon.  Here are a few ideas that I have found helpful in this regard.

 

The "God Spot" and Brain Dysfunction

I am a person who has what some have called a big "god spot" in my brain (see http://cas.bellarmine.edu/tietjen/images/new_page_2.htm and, for an Islamic perspective, http://www.islamonline.net/English/Science/2005/04/article02.shtml).  I have not been tested, but I would expect the activity rates in this part of my brain to be above the norm.  This is likely in part genetic, and in part related to how I was raised.  But even within the faithful Mormon population, my tendencies toward things of an emotional, or spiritual, nature were above average.

 

It is my belief that in general the Mormon population has more "god spot" activity than the rest of the population.  This would be the result of the conditioning forces exerted on Mormons week after week as they attend Mormon activities, pray daily and perform countless other Mormon rituals that should be expected to develop the god spot in the same way as daily pushups develop arm, chest and stomach muscles.  Mormons who remain orthodox and continue participating in Mormon meetings may also have a genetic predisposition toward this kind of brain activity that makes Mormon worship attractive to them.  And the people who are converted to Mormonism may reasonably be assumed to tend in this direction as well.

 

In some cases, I believe that what are assumed to be spiritual experiences result from brain dysfunctions, such as temporal lobe epilepsy.   There is a huge academic literature in this regard.   At the Cal Tech conference I attended recently on “brain and consciousness” a number of speakers referred to temporal lobe epilepsy and other similar phenomena as a source of paranormal experience. The presenters at that conference included several of the top neuroscientists in the world. The two links above also show what some scientists have been saying about this kind of thing recently.

 

And interestingly, Karen Armstrong's autobiography "The Spiral Staircase" notes her moderate temporal lobe epilepsy as a probable cause of her lifelong fascination with spiritual phenomena.  Michael Persinger of Laurentian University in Canada has for years been performing experiments that induce mild temporal lobe seizures and then observing the results, which include out-of-body experiences, quasi-religious experiences and a variety of other paranormal phenomena.  See http://www.wired.com/wired/archive/7.11/persinger.html for a summary of some of Persinger's work.  I note that many neuroscientists are critical of Persinger's methodology.  He apparently has not used double-blind procedures in the way they should be used, and so his findings have not been adequately purged of the psychosomatic elements they should be presumed to contain.  Nonetheless, his work is taken seriously enough that others are now trying to replicate (or falsify) his findings.

 

Sleep Paralysis and Alien Abductions

Other types of what might be called brain malfunction have also been shown to relate to paranormal (and hence possibly religious) experience.  For example, Harvard's Richard McNally and others (see http://cms.psychologytoday.com/articles/index.php?term=PTO-20030527-000002) have shown that certain types of sleep paralysis and hypnopompic hallucinations are likely to be perceived as alien abductions.

 

I heard McNally speak at a conference at Cal Tech in May, 2005.   He started with a description of sleep paralysis.   Roughly 30% of the population suffers from this problem to one extent or another.   It is caused by a misfiring of certain parts of the brain during the rapid eye movement stage of sleep that produces the terrifying feeling of being awake and seeing a dream while not being able to move.   About 5% of the population experiences a particularly intense form of this experience, and many of those people are within the group that report experiences such as being abducted by aliens.   McNally said this problem is exacerbated by people in the repressed memory industry getting involved and helping people to “recover” these memories.   That is, people who already have had these traumatic experiences with nightmares in sleep can be induced under certain types of therapy to believe that they have had experiences that they probably have not had.

 

The sleep paralysis experience has been shown to have a number of remarkably consistent components.  Most people who suffer from this disorder have a significant percentage of the same symptoms.  Among the most common of these are the following: the sense of being awake; the inability to move one's body, except the eyes; the sense of a terrifying physical presence in the room, often described as a dark blob; the sense of weight on the chest; the hearing of sounds consistent with a physical presence in the room; etc.

 

McNally summarized a fascinating set of experiments in which the brain states of people who believe they have been abducted by aliens are compared to the brain states of people who are suffering from post-traumatic stress disorder as a result of experiences during war.  One of the tests used to determine the degree of a person's post-traumatic stress disorder is to have him read a statement describing a terrifying experience while being wired up so that his brain and certain other physical states can be measured.  PTSD sufferers show an increase in the activity in the parts of their brains that relate to terror and certain other things, as well as other physical symptoms consistent with stress.  When these tests are used on people who report having alien abduction experiences, they show physical symptoms that indicate a higher level of PTSD on average than people who have had the real war experiences.  That is, those affected by imaginary experiences as a result of the misfiring of neurons, sleep paralysis, etc. are sometimes affected in a more "real" manner by that experience than people who have had terrifically traumatic war experiences.

 

This is evidence that what most of us are confident are experiences manufactured by the mind alone (alien abductions) seem "more real than real" to those who have them, and leave physiological evidence that is consistent with the subjects' reported perceptions.  It has been noted that since our perceptive systems are designed to take input from the external world and use it to interpret that world, it makes sense that when certain aspects of the perceptive system itself malfunction, we would interpret that as messages about the external world.

 

McNally also summarized the kinds of people who tend to have alien abduction experiences.  They tend to have new age beliefs, familiarity with alien abduction stories, elevated fantasy tendencies, a belief in non-naturalistic processes (god, devil, spirits, fates, etc. overriding the ordinary processes of cause and effect) and experience with sleep paralysis, and they have often engaged in memory recovery therapy sessions.

 

I was fascinated to learn that one of McNally's Harvard colleagues, the recently deceased John Mack (see http://www.pbs.org/wgbh/nova/aliens/johnmack.html), had become convinced of the reality of alien abductions as a result of research similar to that conducted by McNally.  This shows how bright, well-informed, well-qualified people can differ regarding things of this nature, and then get locked into paradigms that seem to make little sense to virtually all outsiders.  I think of Hugh Nibley when I run across people like Mack in fields as diverse as alien abductions, young earth creationism, intelligent design theory and holocaust denial, not to mention the best educated and articulate of the apologists for numerous religious faiths.

 

McNally also notes how surprised he was at how the people he studied greeted the conclusions to which his research pointed.  He thought that they would be thrilled to find that there was a plausible explanation for their experience that did not require them to believe that they had been abducted by, raped by, tortured by, etc. aliens.  To his surprise, almost all of his patients were highly resistant to his explanation of their experience.  Their perception of what happened to them was so real that it overrode what his data indicated, and they felt a deep sense of loss when confronted with the possibility that something they believed to be more real than anything else they had experienced was an illusion.  While he did not mention this in his lecture, my reading with regard to cognitive dissonance, "biases" and related areas of psychology indicates that one of our deepest human needs is to be able to trust our perceptions.  If we could not trust our perceptions, for example, we would be unable to make decisions, and often decisiveness is more important to our survival than accuracy of decision.  To admit that something that had seemed so compelling was the result of a brain hiccup of some kind is hence existentially threatening, and so very hard to do.  I also believe this is part of what prevented me for so long from reinterpreting some of my experiences.

 

McNally concluded his presentation at Cal Tech with a quote from the German sociologist Max Weber that described how science disenchants the world and how people need that enchantment and hence are resistant to anything that will take it away.  My experience as a Mormon suggests another hypothesis.  That is, we tend to identify with the most intense emotional experiences that we have.  The more we talk about these experiences and build relationships related to them, the more they become a part of our self-definition.  It is extraordinarily difficult for most people to change their self-definition once it has been established.  So, once we have gone a certain distance down a particular road, even though that road may be terrifying, inhumane, and something that most other people with choices would want nothing to do with, it has become our road; it has become us in a basic way and we are afraid to abandon it.

 

Also, the cognitive dissonance and bias research summarized in this essay makes it clear that we have a powerful need to feel that we can rely upon our own perceptions.  Any experience that leaves the kind of strong impression the sleep paralysis experience has been shown to leave should be expected to be trusted.  If you can't trust your impressions when they are that strong, what can you trust?  The same would apply to the Mormon testimony feelings.  If you can't trust things of this nature, you can't trust anything, and if you believe that, you would have a hard time making the decisions necessary to get by from day to day.  That is, you would be paralysed.  There is no reason to assume that evolution would have equipped us to respond to our experience in that way.  It therefore makes sense that we would accept powerful emotional and other experiences as "real" and act in a manner that is consistent with that perception.

 

Co-Opted Brain Functions

Newberg and D'Aquili's theories (as set out in "Why God Won't Go Away") explain the brain correlates of the mystic encounter with "ultimate reality" or "unitary being" that so many religious and other people have reported.  See http://mccue.cc/bob/documents/rs.do%20smart%20mormons%20make%20mormonism%20true.pdf starting at page 38 for a summary of some of this research.  I have had the experience Newberg describes (see http://mccue.cc/bob/documents/out%20of%20my%20faith.pdf starting at page 77).  So, I understand what Newberg means (I think) when he speaks of a spiritual experience being more real than our baseline waking experience.  His research in this regard is consistent with that summarized above with regard to the "more real than real" aspect of the sleep paralysis experience.

 

Newberg's book suggests that the mystic experience points toward a reality that is only accessed by passing through the gate of mystic experience, and that this reality is more real, more to be trusted, etc., than our waking, baseline experience.  I had the chance to spend a week with Dr. Newberg on Star Island a few weeks ago (see http://mccue.cc/bob/documents/rs.star%20island%20overview.pdf) and so heard him express his views firsthand and had the chance to ask him about them.  He is an extraordinarily pleasant, approachable person, and his statements in person were much weaker than what is in the book with regard to the reality of some kind of mystic "unitary being".  Other people on Star Island who know both him and his co-authors attribute the stronger view of ultimate reality to them.  I think it fair to characterize Dr. Newberg's view as being agnostic as to ultimate reality.  He seemed well grounded in the same kind of reality I am, and willing to consider whatever the evidence suggests.  I have decided, in part for the reasons noted above, that it is best to limit my belief to what can be established, or reasonably inferred, on a naturalistic and hence testable basis.

 

Newberg's research does not suggest that the mystic experience is caused by a brain dysfunction.  Rather, he has identified a brain function that has an important role to play in other aspects of human life.  Sexual climax, for example, is one of the few times during which the same pattern of brain activity seen in deep meditation is also displayed.  In that case, the outcome of the feelings related to that brain state is clear.  Euphoria makes the activity resulting in that state attractive.  This enhances the probability of human reproduction.  The loss of sense of self while in the embrace of another human being assists pair bonding, which is crucial to the care of human infants during their lengthy period of dependence while their brains mature.  Etc.

 

As Pascal Boyer so nicely explains in “Religion Explained”, the main social and psychological features that drive religious systems at the personal and group level are likely offshoots of individual and social functions that were developed for other purposes and are co-opted in the creation and maintenance of religious institutions, which likely only appeared with the formation of complex human society.   This was a relatively recent event in human history. For example, while it is possible that an infinite number of different types of god could be worshipped, relatively few are. The selection of these few is nicely explained by human psychological and social traits that are reasonably inferred to have developed long before religious institutions and the notion of “god” arrived on the scene.

 

So, when humans stumble across experiences that they can access on demand that put them into the same mental state as making love or something similarly compelling, why would we not expect that state to be accorded special significance?  This is what happens through learning certain contemplative disciplines; engaging in group activities involving chanting, dancing, fasting, etc.; engaging in individual activities such as those used by the Sufi whirling dervishes; using certain types of drugs such as the peyote button or ecstasy; etc.  And in the few other circumstances in which, as Newberg also indicates is likely, the same mental state is produced by circumstance, we should also expect that kind of experience to exert a special influence over us.  For example, Newberg suggests that the confluence of stress and relief (such as when being saved from disaster) will in some cases produce a euphoria or even the experience of "union with ultimate reality" of which the mystics speak.  He says that the same thing can happen as a result of coming to a mental conclusion, while in a state of stress, that relieves that stress.  This could occur, for example, after worrying for years about an inability to believe as the rest of your community does and then finding what seems like justifiable intellectual means to do so.  I believe it was precisely this that led to my first Mormon "spiritual" experiences.  This also explains many of the spiritual experiences of which I have read, starting with Saul of Tarsus, moving through Aquinas and other early Catholic mystics and theologians, and continuing through every tradition of which I have read.

 

The regrettable thing about the experience Newberg's book describes, as well as many of the others noted above, is that they create a sense of certainty with regard to things that have, in my view, nothing to do with physical reality and everything to do with social reality or brain states.  This conclusion is the result of considering the consistency of human biology and social interaction across groups, and the diversity of conflicting beliefs that are held to the death within different groups.

 

It is fair to say, in my view, that a person's socially conditioned beliefs tend to be set in concrete by experiences of the type Newberg describes.  My experience is consistent with this, as is that of countless people of whom I have read.  For example, a Jehovah's Witness girl in Calgary died last year while refusing blood transfusions.  Her belief that this was essential to her eternal salvation, and hence more important than life itself, was set in her mental concrete in the same way as was my belief in the Book of Mormon.  And there is a lot of controversy around here right now regarding the Fundamentalist Mormon community in Creston, British Columbia that is breaking up and flooding the countryside with former polygamists who are ill-equipped to deal with the complexity of life outside their community.  The "reality" that conditioned them to live so as not to develop the kind of coping skills necessary in our society was created just as mine was - by well intentioned family and community members.  And we don't need to talk about the Taliban, Al Qaeda, etc.

 

Inherited Beliefs

Much of what has been described so far can also be explained by way of the typical human reaction to what might be called inherited beliefs.

 

We all start somewhere in terms of what we think we "know".  This starting place is more scientific in some cases and less in others, but in all cases this position is handed to us by our social group.  We "inherit" this knowledge, regardless of how inaccurate it may be.  And we are taught to believe it.  These beliefs are hard to overcome because of things like the confirmation bias, authority bias, etc., as noted above.

 

As the generations pass, the foundational knowledge of each human group has tended to become increasingly scientific.  This means that in the course of most human lives and within virtually all human groups, there is tension between certain types of traditional or inherited knowledge and knowledge produced by science.  Within the scientific community itself this tension has been institutionalized, and the tendency of even our brightest scientific minds to become wedded to an idea and blinded by it is recognized and protected against.

 

Within most human groups the tension is not between different views of science but rather between certain inherited bits of knowledge and either old or new scientific ideas.  For example, we tend to have strong tribal tendencies as a result of our evolutionary history as small herd animals.  It is not surprising, therefore, that differences between races amplified our tribalism and that racism has been a terrible problem.  The science of genetics and other branches of human biological science have shown how minute the differences are between the various human races, and polymaths like Jared Diamond (see "Guns, Germs, and Steel" or "The Third Chimpanzee") paint compelling big pictures that show how arbitrary and insignificant the differences between various human groups are at the most fundamental level.  It is easy to understand why these ideas would be hard to swallow for white European males living in the Southern U.S. in the 1960s, given their inherited knowledge and how they were raised.

 

And what of those who have been raised from the cradle to believe that homosexual people are deviants, perverts, sinners, etc.?  The information produced by the scientific community during the past couple of decades with regard to the genetic and biological origins of the homosexual impulse has become increasingly clear.  Sexuality, it seems, is not "either/or".  It is a spectrum.  Even in bastions of conservatism like BYU, authoritative voices are now heard arguing from this point of view (see http://newsnet.byu.edu/story.cfm/49488).

 

Since science advances and these beliefs tend not to, science regularly challenges them and over time reduces them to the status of "prejudices".  Last year's wisdom is often next year's prejudice.  Medical history is full of this stuff.  Be thankful that you were not alive when "bleeding" was a near cure-all, and the idea of washing hands between surgeries was still considered silly.  Sky-high birth rates were required in those days to keep up with the deaths.

 

The relationship between science and inherited beliefs can be pictured in something like the division between religion, science and philosophy suggested at http://progressiveliving.org/religion/culture_war.htm.  This relationship can be usefully thought of in the following terms.

 

Think of a diagram as follows:

 

  • Science is a small, inner circle.

 

  • Then social science and less certain science form a surrounding circle.

 

  • Then metaphysics (beyond science, or beyond physics) forms a huge surrounding area.  This includes how we think about what is important, about meaning, about values, about the standards we use to justify knowledge or belief, etc.

 

I have heard science described as a small, slowly expanding clearing in the middle of an immense forest of the unknown.  However, the idea that science should be accorded primacy within the areas in which it has shown itself to be more reliable than any other source of knowledge makes sense to me.

 

The Pattern of Insider Belief and Outsider Rejection

It is also interesting to note a clearly defined pattern created by the generally scientific nature of modern society and the various sacred and often supernatural premises of religious groups. This is evident, for example, in how Mormons and most members of almost all other faiths (except the Jehovah’s Witnesses) reject the supernatural premises that are peculiar to JW beliefs – the sacred JW premises. Conversely, JWs side with the members of virtually all other faiths in rejecting the supernatural claims that support Mormon beliefs.   The pattern is that religious people tend to be scientific in orientation with regard to all beliefs except those required to support the premises of their particular belief system.   So, when they enter the arena defined by their own religious beliefs they become, from the perspective of all outsiders, “irrational”.   The same pattern is visible to a lesser degree (in most cases) when it comes to issues related to politics, economics, environmentalism, and other issues that are hard to definitively analyze and charged with emotion.

 

In this context, consider the relationship between science and inherited belief systems like religion throughout recorded history.   Religions have retreated over and over when their beliefs have come into conflict with science.   Most religions are now careful not to commit to anything that might conflict with science.

 

And look at the pattern of where various religions are now arguing against science.  They only do so where their most basic beliefs are in conflict with science.  And they each tend to accept all science that is not in conflict with those basic beliefs.

 

Going back to the diagram above, we could draw a circle to represent Mormon belief.  It would overlap almost all of science (while excluding some) and would include a specific swath of non-science.  If we mapped the JWs' beliefs, the circle would look similar except that it would exclude a different part of science and cover to some extent a different part of non-science.  The same could be done for Islam, the young earth creationists and every other belief system.

 

This map discloses a clear trend.  That is, it is only where the most important of their inherited beliefs conflict with science that religious people in the democratic west tend to disbelieve science.  This causes the insiders of each group to accept beliefs that conflict with science in a way that no outsider would.  And consistently, insiders in each group can recognize the fallacies in other groups, but not in their own.  What are we to make of this?

 

Think of it this way.  If 99 out of 100 people look at three colour swatches and identify them as green, orange and brown, and the last person insists that they are all brown, what do we conclude?  Regardless of how loudly the 100th person assures us of what he sees and that he is right and we are wrong, we will assume him to be colour-blind.  He has a perceptive defect that prevents him from seeing what is there.  If he is wise, he will eventually accept this to be the case and trust the judgement of others when it is important for him to be able to distinguish between these colours.

 

The study of cognitive bias, cognitive dissonance, denial and related subjects explains why most humans have blind spots when it comes to perceiving information that is relevant to the reality of their social groups.   This pattern – insiders being “irrational” but only with regard to beliefs and evidence related to their most sacred beliefs – suggests a widespread form of denial caused by cognitive dissonance.   The consistency of this pattern was, by itself, enough to convince me that there is an intellectual pathology caused by institutional religion (or other group based ideologies) that is the result of the combination of humanity’s mimetic (copying) behavioural nature and our small group animal biology.

 

Evolutionary theory has an elegant explanation for this.   That is, evolution selects for people who are pre-disposed to not “falsifying” the myths on which their society is based while being able to do so with regard to myths on which other social groups are based. This strengthens the “in group” ideology and weakens all “out group” ideologies, thus strengthening the “in group” itself.

 

Throughout most of human history, keeping one’s group together and holding one’s place within the group likely conferred greater survival and reproductive advantages than any individual “finding the truth”, etc.   That is, during most of human history, if you disagreed with the group to the point of being kicked out, you would shortly thereafter be dead.   And if your group disintegrated, many of its members would likely not survive.

 

So, our brains developed to tend toward conscious acknowledgement of the kind of realities that threatened group cohesion (or our place in the group) only when most of our “in group” was ready to come to the same conclusion, hence reducing pressure on group stability to manageable levels and protecting us from the harm that could come from seeing the group as it really is and feeling compelled to take action.   This causes Mormons generally to have trouble seeing problems with Mormon foundational planks that are obvious to non-Mormons, and vice versa.   Hence, it explains the odd pattern that comes into view when one considers how the religious world at large operates.

 

Cult Control Techniques

Steven Hassan (see “Releasing the Bonds – Empowering People to Think for Themselves”) is a former Moonie who has dedicated the last several decades to researching the methods of various cults and counselling primarily the loved ones of cult members as to how they can interact with those under the control of a cult so as to maximize the probability that they will rejoin the “real” world. I should note that in accordance with Hassan’s definition of cult behavior, contemporary Mormonism is not a full-blown cult. I would put it at a six to seven on a scale of ten, with organizations like the Hare Krishna movement, some Mormon polygamist groups, the Moonies and other communally oriented organizations near ten, and most of the Anglicans, Episcopalians, etc. near one. Mormonism in its earlier years would have been close to a ten.

 

Hassan focuses on mind control techniques, and notes that these are not necessarily good or bad. Rather, they can be used for good or bad purposes. For example, it is possible to use mind control techniques to help people to break addictive patterns, develop good habits or learn just about anything. In most such cases, the autonomy – and hence freedom and range of choice – of the individual is increased. The same techniques can be, and are, used by cults of various kinds to reduce human freedom. Hence, the question to ask is how mind control techniques are being used in our lives, and whether as a result of our interaction with them our freedom increases, or decreases.

 

Hassan’s summary of cult techniques cuts across many of the individual and social mechanisms I have described above without describing how they work in the fashion I did.   I will summarize the parts of his analysis that are relevant to the creation of an environment in which Mormons should expect to be subject to a powerful form of denial.

 

Hassan uses the acronym “BITE” to remind us of the dimensions of mind control that cults use: Behavior, Information, Thought, Emotion. He summarizes the cognitive dissonance literature and shows how it is used to control elements of each of these facets of the “mind”. Hassan provides an extensive checklist (see “Releasing the Bonds”, pages 42 – 45) that can be used to assess the mind control attributes of any organization. While this is too long to review in detail here, I will provide a few highlights as far as Mormonism is concerned. The bracketed information is my off-the-cuff thoughts as to the elements of Mormonism that fit Hassan’s paradigm.

 

Behaviour Control

Regarding Behavior control, Hassan notes the following as cult characteristics:

 

  • Control of time so as to permit little chance to question or find alternative social groups. (Think through, hour by hour, the weekly schedule of a TBM and count the number of rituals and the time they take. It is mind numbing.)

 

  • Extensive system of rewards and punishments that dominate the individual’s time and attention. (If you want to be with your kids when they get married, you must have a temple recommend. Holding one of these requires a promise of complete obedience to Mormon authority, the payment of tithes, etc. Mormonism encourages a “service” mentality. That is, the most respected community members are those who hold the heaviest callings. Mormons are required to confess any behavior that breaks a major rule to a Mormon authority figure. And the big one – only those who are obedient to Mormon authority throughout life will be allowed into the Celestial Kingdom to be with their families after death.)

 

  • Rigid rules and regulations regarding many types of behavior. (No comment required.)

 

Information Control

Regarding Information control, he notes:

 

  • Use of deception, such as deliberately withholding and distorting information about the organization, particularly for recruits and new members – information is distributed on a “need to know” basis once members are fully committed. (Think of “Faithful” history.)

 

  • Access to information not controlled by the organization is discouraged or restricted. (Mormons are counselled to avoid the Internet. All “questioning” or “anti-Mormon” literature is described as a form of “cancer”. Mormons are counselled to teach lessons only from lesson manuals instead of doing any independent research that might bring them into contact with “questioning” material. Mormon young people are given “canned” materials from which to give talks so that they do not have to do research. Etc.)

 

  • Use of   “outsider v. insider” and   “black v. white” thought systems. (No comment required.)

 

  • Use of systems that have members spying, and reporting to leaders, regarding each other’s activities. (Home teaching, visiting teaching, concerns reported by quorum leaders re. activity rates, etc.)

 

  • Use of confession and other systems to engrain authority and abolish identity boundaries. Past “sins” are used to manipulate and control current behavior. (Mormons are required to subject themselves to regular interviews, particularly as young people. There is a requirement to confess behaviours that are not authorized. Temple recommend interviews require an acknowledgement of Mormon authority. If a “sin” is committed after repentance, the effect of repentance is nullified, and the weight of all prior sin falls back on the sinner.)

 

  • Hassan says: “Looking at a group’s attitude toward information is the fastest way to evaluate whether it is using destructive mind control. A legitimate organization will allow people the freedom to think for themselves, read whatever they like and talk to whomever they choose in order to arrive at their own decisions, whereas a destructive mind control group will want to do the thinking for the people.” (“Releasing the Bonds”, page 50)

 

Thought Control

Regarding Thought control, Hassan notes the following:

 

  • The group’s doctrine must be internalized as “truth”. Berlin would call this a “monist” perspective. That is, there is only one right answer to every question. This approach fell out of favour well over a century ago in most aspects of human endeavour, giving way to various forms of pluralism. That is, there may be more than one right answer to many questions. (The “one true church” idea is classic monist thinking.)

 

  • The use of clichés to truncate critical thought. (“I know the church is true!”, “Follow the Prophet”, etc.)

 

  • Use of “thought stopping” techniques, such as singing songs or chanting mantras, in order to prevent “bad” thoughts from influencing us. (I was taught to sing “I am a Child of God” whenever a “bad” thought came into my mind. That song still pops into my head from time to time.)

 

  • Rejection of any form of reasoning that questions the group’s, or the leader’s, authority. (No comment required.)

 

Emotion Control

Regarding Emotion control, Hassan notes:

 

  • Narrowing of the range of the individual’s emotional spectrum. This is done by directing the individual’s attention toward cult-directed priorities. Love bombing makes the individual feel good about the cult. Fear is used extensively to bind the individual to the cult: fear of after-life consequences, of the evil nature of the “world outside”, and of what would happen to the individual without the support structure provided by the cult and its leaders. (No comment required. See extensive analysis of the fear issue in “Religious Faith: Enlightening or Blinding?” at http://mccue.cc/bob/documents/rs.religious%20faith%20-%20enlightening%20or%20blinding.pdf)

 

  • Use of ritual to create powerful emotional responses that can be used as evidence that the cult is “true”, indispensable, etc. (Mormons use testimony meetings; youth conferences; leadership meetings; temple weddings; fathers’ blessings; blessings of health, etc. The Church interposes itself as a third party in our most intimate moments, and then takes credit for the good things that we feel there, while blaming us if anything goes wrong.)

 

Miscellaneous Additional Denial Factors

Many of the biases and other perspective distorting forces described above are also dealt with under other names.   For additional perspective as to how these (and related biases) work, consider the following (and related information at http://en.wikipedia.org/wiki/List_of_cognitive_biases):

  • anchoring – the tendency to rely too heavily, or “anchor,” on one trait or piece of information when making decisions.
  • bandwagon effect – the tendency to do (or believe) things because many other people do (or believe) the same.
  • belief bias – the tendency to base assessments on personal beliefs (see also belief perseverance and Experimenter’s regress)
  • bias blind spot – the tendency not to compensate for one’s own cognitive biases.
  • confirmation bias – the tendency to search for or interpret information in a way that confirms one’s preconceptions.
  • contrast effect – the enhancement or diminishment of a weight or other measurement when compared with a recently observed contrasting object.
  • disconfirmation bias – the tendency for people to extend critical scrutiny to information which contradicts their prior beliefs and accept uncritically information that is congruent with their prior beliefs.
  • endowment effect – the tendency for people to value something more as soon as they own it.
  • hyperbolic discounting – the tendency for people to prefer more immediate smaller payoffs than having to wait for larger payoffs.
  • illusion of control – the tendency for human beings to believe they can control or at least influence outcomes which they clearly cannot.
  • impact bias – the tendency for people to overestimate the length or the intensity of the impact of future feeling states.
  • just-world phenomenon – the tendency for people to believe the world is “just” and so therefore people “get what they deserve.”
  • loss aversion – the tendency for people to strongly prefer avoiding losses to acquiring gains (see also sunk cost effects)
  • mere exposure effect – the tendency to express undue liking for things merely because we are familiar with them.
  • color psychology – the tendency for cultural symbolism of certain colors to affect affective reasoning.
  • planning fallacy – the tendency to underestimate task-completion times.
  • pseudocertainty effect – the tendency to make risk-averse choices if the expected outcome is positive, but risk-seeking choices to avoid negative outcomes.
  • rosy retrospection – the tendency to rate past events more positively than they had actually rated them when the event occurred.
  • selective perception – the tendency for expectations to affect perception.
  • status quo bias – the tendency for people to like things to stay relatively the same.
  • Von Restorff effect – the tendency for an item that “stands out like a sore thumb” to be more likely to be remembered than other items.
  • Zeigarnik effect – people remember uncompleted or interrupted tasks better than completed ones.

And for an interesting list of articles related to various aspects of the bias research, see http://www.austhink.org/critical/pages/cognitive_biases.html.

 

Emotional Experience, Ritual and Prayer – A Case Study

In myriad situations Mormonism and other religious groups harness the power of human emotion to strengthen the institution’s hold on individual behavior.   Let’s consider, for example, how various ritualized forms of prayer are used in this regard within Mormonism.

 

While I no longer pray in the way I used to, I still regularly (and in fact more often than before) express gratitude and love toward others for whom I care. And I am sure that this has a positive effect on both them and me. I will explain why below. And I believe that Mormon prayer and certain of its rituals that involve prayer are effective in a sense for this reason as well.

 

There is a large body of psychological literature that explores the way in which expressions of gratitude, love and encouragement affect both the persons giving and receiving them. The person expending energy to make the expression perceives herself to have been energized by the experience, and the person receiving the expression tends to react similarly though along different psychological correlates. This is a great deal for all involved. 2 + 2 = 6.

 

The various ritual behaviors used within the Mormon and other religious traditions harness the power of this long understood phenomenon in at least two important ways. First, it is used to make people feel good about themselves and others. And second, it is used to strengthen the perception that the Mormon institution is the source of this wonderful aspect of human experience. Let’s look at the second aspect of how Mormonism uses this part of human experience.

 

Frequently, Mormon prayers are not offered in private. Public and semi-private meetings open and close with prayer. Mormons pray over meals. Fathers’, priesthood, baby and patriarchal blessings are usually performed before groups of people. At the core of the marriage ceremony we find a prayer. The expression of testimony is close to a prayer in that while it is addressed to the group, it closes in the name of Jesus Christ, as do Mormon prayers.   Hence, it is in a sense offered to God, and is a form of prayer. “Prayer lists” are maintained in Mormon temples. The temple “prayer circle” is a particularly interesting form of semi-private prayer that gives a feeling of privacy and exclusivity since only the “most worthy” are permitted to participate. Missionary companions and married couples are encouraged to pray together, day and night, and to pray audibly as well as silently while with each other.   This is the “companion prayer”, infamous among many missionaries, that I was taught would be a precursor to daily prayer with my spouse.

 

In each of these prayer forms, Mormons are encouraged to do various things having different purposes that are intertwined by the act of praying about them at the same time. For example, Mormons are encouraged to express gratitude both to God and to other people, often people who are present or, if not present, who will become aware that the prayer was said. The temple prayer circle is particularly interesting in this regard. The names on the prayer roll are not a matter of public record. But frequently the word makes it back to the “sufferer” (my name was on various prayer rolls for a long time as I left Mormonism and, for all I know, still is) that her name had been put on the prayer roll at one temple or another (sometimes simultaneously at several) as an act of love; acts often performed by “insiders” who assume themselves to be more worthy than the “outsiders” who are unworthy in some way that motivates the act of putting their name on the roll. This simultaneously lifts the insider by making her feel good about having acted in a loving manner, while highlighting the insider v. outsider lines more clearly in her life.

 

The act of expressing gratitude, as noted above, has been shown to have a powerful positive effect on how both the person expressing and the person receiving the expression feel. All others present tend to be positively moved as a result of witnessing what has happened. This binds the social group together. By causing this to occur in the context of Mormon ritual, the Mormon institution can take credit for the good feelings produced by these universal human mechanisms, and so strengthen itself.

 

Another important aspect of expressing gratitude is that the thing we express gratitude toward becomes more precious to us. This is the case when we silently express gratitude, and the more public the expression becomes, the stronger the effect. This is why testimony bearing, or witnessing, in particular is stressed within Mormonism and other faiths. And this is why prayer is a stepping stone toward testimony bearing both for children and for those “investigating” Mormonism. The evolution of many rituals can be explained in this way. For example, this is one of the reasons psychologists and anthropologists believe the ritual of public marriage ceremonies evolved. The community has a stake in encouraging stable marriages for various reasons that I won’t go into here. The public nature of the commitment, expression of love, expression of gratitude by the couple for each other, etc. were found over time to help stabilize the marital relationship. And everyone likes an excuse to party anyway.

 

In Mormon prayer and prayer-like rituals, expressions of love and gratitude for those closest to us are intertwined with expressions of love for God. And these are confounded both with each other and with expressions of love for the Mormon institution and its symbols – Joseph Smith; the temple where absolute obedience to Mormon authority is promised; the current prophet; other current leaders; etc. So, much of the good feeling and energy that results from expressions of love and gratitude ends up solidifying the relationship between the individuals giving, receiving and witnessing these expressions and the Mormon institution.

 

In fact, when required to choose between the Mormon institution and any of these loved ones, the choice is intended by the Mormon institution to be clear, though few Mormon leaders will admit this. The Church comes first. The Celestial Kingdom is more important than Earthly life. But rather than counsel marital break up, most Mormon leaders will stand aside and let the chips fall where they may when one spouse seems clearly committed to leaving Mormonism and the other intent on staying. And this should not surprise us since many social groups historically have operated on this basis, and this teaching is at the core of Christianity. Christ’s message was intended to divide families as well as communities over the issue of religious faith, if it came to that. The only thing unclear about the many New Testament passages that make this point in different ways is whether Christ himself said what they say, or whether those building the Christian community after Christ’s death erroneously remembered him saying what was so obvious to them, added these sayings to his record themselves, and so invoked his authority.

 

Because the powerful feelings I have just tried to describe occur in circumstances that the Mormon Church creates, it is reasonable for a person with little or no experience outside of Mormonism with regard to these things to conclude that Mormonism is responsible for them.

 

This brings us to the emphasis on pageantry, solemnity, reverence etc. that accompany many Mormon rituals. These individual and group actions are well known to produce powerful emotional experiences that humans like. Combine that with the power of the personal expression of love and gratitude, and a wonderful cocktail has been mixed.

 

And then there is the so-called placebo effect. It is well established in the medical as well as psychological literature that if we believe that something will have a positive effect on at least some aspects of our physical health (herpes, for example, reacts positively to placebos), emotional well being (depression reacts particularly positively to placebos) or perception of pain, it probably will. An article in a recent issue of The Economist summarized current medical studies that have been done in this regard. These studies linked the latest brain imaging (PET) scans to traditional placebo studies to see what was happening in the brain when people were under the influence of a placebo they believed would reduce their perception of pain. It was shown that the brain produced increased levels of endorphins, the body’s natural pain killer, when the participants thought they were receiving a pain killer but in fact were only receiving a placebo. And they of course reported significantly decreased levels of pain.

 

I can think of no reason for which the placebo research would not apply as well to Mormon prayer and priesthood blessings as it would to sugar tablets and saline solution thought to contain effective medication. This would reinforce the idea that something supernatural was possessed by the Mormon institution in the form of priesthood authority, furthering the reverence, deference and obedience reasonable people would tend to show to that institution.

 

So prayer works. It does all kinds of powerful things when linked with the right social and psychological mechanisms that are known to be effective in many other contexts.

 

Does this mean that there is no God and that faith and prayer have no other effects? In my view, the evidence does not go that far. What this line of research clearly indicates is that many of the supernatural aspects of human experience that are attributed to prayer and faith are the result of misunderstood natural phenomena. And the most important lesson for me in all of this is that I was hoodwinked into believing that the Mormon institution had unique power to make me feel good; to heal me; to foster loving relationships; etc. when in fact it was simply misdirecting my attention from the most probable nature of the mechanisms that were having their expected effect in my life, and taking unearned credit for some wonderful aspects of life.

 

Now we have unwoven part of a rainbow. Part of the wonder and beauty of Mormon life may lie smashed on the floor all around us. Mormons are often critical of the “anti-Mormons” for the manner in which we tear down without building up. So let’s do a little building up, and notice how easily this occurs.

 

It took quite a while on my way out of Mormonism to pick apart the threads I just described. As I did so, something happened that I believe to be the normal, sensible response to what I had experienced. The key to understanding this is to appreciate the nature and importance of perspective, and what we should expect to happen to us once our perspective changes about anything that is important to us.

 

Once I understood how important the expression of gratitude was (thanks to Martin Seligman’s psychological studies), I made it a point to express gratitude more often. Each time I do this, it lifts me. Thanks Martin!!! Even writing that made me feel good. And the understanding that this is something natural, available to all, that has nothing to do with Mormon or any other kind of authority, fills me with joy. It felt wonderful, for example, to get rid of the idea that there was something unique and special about the feelings of joy a Mormon couple have as they promise to love each other in a Mormon temple and there express gratitude to a few crying, oddly dressed family members and friends. This is a universal human response to that kind of circumstance. So it makes perfect sense that, once I understood this, I would simply go out of my way to find opportunities to express sincere gratitude for those in my life.

 

The same thing applies to expressing love. The same thing applies to expressing encouragement. The same thing applies (to a point at least) to helping other people.

 

And having learned a few useful tricks from people like Seligman, I was encouraged to see what else they could teach me. The importance of forgiveness is something else I have learned from them. The importance of being involved for a significant part of each day in “flow activities” is another important point. The importance of identifying my “signature strengths” and focusing on doing as much as I can with them instead of worrying about fixing what I perceive to be “character flaws” that I am likely never to overcome. We seem to go further and enjoy the ride more if we concentrate on doing what we can with our strengths instead of beating ourselves up for what we are not so good at doing.   Etc.

 

My intent here is not to try to write a life manual, but rather to indicate that there is a vast world of information out there, well organized and backed up by solid empirical studies, that we can use to guide ourselves toward lives that we have reason to believe will be more joyful, productive and fun than anything the well-intended but ego-blinded old guys in Salt Lake City could possibly offer from their point of view. The basic reason for this is simple and clear. Their primary objective is not to create the strongest, healthiest, happiest individuals possible. Their objective is to create the strongest Mormon institution possible. And that often requires sacrifices to be made by many individual Mormons.

 

Factors that will Influence the Depth of Mormon Denial

The depth of denial, and hence both the likelihood of leaving Mormonism and the pain that process will cause, seems to be largely determined by two things – genetics and social conditioning.   The best recent research indicates that those two factors are about equally responsible for our major personality traits and behaviours (see Steven Pinker, “The Blank Slate”, for example).   See also http://home.mccue.cc:10000/bob/documents/rs.do%20smart%20mormons%20make%20mormonism%20true.pdf from pages 26 through 70 on this point.

 

The most important factors in the denial equation appear to be temperament and how much a person has to lose.   The more fearful a person is generally, the more they tend to shy away from things that appear dangerous, and we are all hardwired to feel threatened at the prospect of being thrown out of our primary social group.   Not long ago in our evolutionary past, that meant death.   Our genes do not forget things like this, and they change much more slowly than our social context.

 

Second, and more important, is what the person in question has to lose.   Someone who has a marriage, a career, social status in her community, etc. tied up with her Mormon beliefs will have a much more difficult time assessing the reality of Mormon history than a recent convert whose family and many of whose friends are unhappy with his decision anyway.

 

It is impossible to separate the influences just noted in most cases, and I will not try to do so here.   However, one can get a sense for how these factors are likely to influence a person by thinking about questions such as those found below.   To give hope to those who score “poorly” (that is, who would seem to have a low probability of leaving Mormonism and lots of pain on the way out) on this test, I will provide the answers I would have given just before I woke up:

 

  • How long have you been a Mormon?
    • All my life.   44 years.

 

  • Are your parents life long Mormons?
    • Yes.

 

  • Did you attend church with your parents virtually every week as a child?
    • Yes.

 

  • How   “orthodox” and   “faithful” were your parents in terms of their Mormon beliefs?
    • Very.

 

  • Did your parents hold Mormon leadership positions?
    • Yes.   Constantly.

 

  • Did you grow up in a community that was predominantly Mormon?
    • Mostly.   From age 11 to 16 I was in a predominantly non-Mormon community.   Otherwise, I was raised in   “the Corridor”.

 

  • To what extent did you   “rebel” against the Mormon way of doing things as child or teenager?
    • I rebelled from age 13 to 16, spent two years repenting and breaking   “bad habits”, and then lived in full compliance.

 

  • Did you attend Seminary regularly?
    • No.   I hated it but still attended for a total of a little over two of the four required years.

 

  • Did you graduate from Seminary?
    • No.

 

  • Did you attend Institute while in university (or attend an LDS run institution like BYU)?
    • Yes.

 

  • Did you graduate from Institute?
    • No.   Ironically, I was the president for two years of the LDS Students Association at the University of Alberta (in Edmonton, Alberta, Canada) but did not take quite enough courses to graduate from the Institute there.

 

  • Did you serve a mission for the Mormon Church?
    • Yes.   Southern Peru, 1977-79.

 

  • Were you an “obedient” missionary?
    • Yes.

 

  • Did you marry in a Mormon temple?
    • Yes.

 

  • Is your spouse a faithful Mormon?
    • Yes.

 

  • How old are your children?
    • 20 through 6 years of age.

 

  • What percentage of them are faithful Mormons?
    • 6 of 7.

 

  • How many of them have served missions, married in or otherwise been through a Mormon temple?
    • One.   The only one old enough to do so.

 

  • During what percentage of the time since you first attended a Mormon temple have you held a valid temple recommend?
    • 100%

 

  • During what percentage of the time since you first attended a Mormon temple have you attended the temple at least twice during each year?
    • 100%

 

  • During what percentage of your adult years have you been a full tithe payer?
    • 100%

 

  • How frequently during your adult life have you broken   “major” Mormon rules, such as the Word of Wisdom, the law of chastity, the law of tithing, etc.?
    • Never.

 

  • How frequently have you declined Mormon callings?
    • Never.

 

  • How frequently did you do 100% of your home or visiting teaching?
    • 100% from return of mission until five years ago (a total of almost 20 years of 100% home teaching).   Sporadic since then.

 

  • Did you hold Mormon leadership positions?
    • Yes.

 

  • What percentage of your close friends are faithful Mormons?
    • 95%

 

  • What percentage of your employment or business income depends to a significant extent on relationships with faithful Mormons who are likely to think less of you if they knew you were not   “faithful”?
    • 10%

 

  • What percentage of your discretionary time during the past year, or two, or five, or ten, has been spent doing things related to Mormonism, including socializing with faithful Mormons?
    • 90%

 

  • How old are you?
    • 44 years of age.

 

 

It did not look good for me, and I suffered immensely on my way “out”.   And I would do it again in a heartbeat.   I can’t imagine returning to the narrow life I once lived, and the benefits of life as I now know it far outweigh the price paid to get here in my case.

 

What Causes Denial – A Synthesis

We have just finished what may feel like a dizzying tour of concepts.   It is now time to sift back through them to look for patterns that will allow us to better understand and remember the most important aspects of how Mormonism and denial relate to each other, and to us.

 

The empirical and theoretical research produced by sociology, social psychology and psychology (as summarized above) can be synthesized into a description of a few features of human behavior that Mormonism is well suited to take advantage of.   These can be stated as follows:

 

  • Our perceptive faculties and brains do not primarily record objective information.   They rather function in a manner consistent with what evolutionary theory indicates to be our most basic objectives – they help us to maximize our probability of survival and reproduction.   Hence, we have an astonishing ability to more or less accurately perceive those aspects of reality that seem to increase the probability of our accomplishing those two objectives, and to suppress those aspects of reality that hinder us in that regard.

 

  • In order to understand why we both perceive and misperceive certain things, we need to think about what our environment was like when our biology, brains, perceptive systems, etc. developed instead of our environment as it is at the moment.   While this is a speculative exercise, some of the most basic assumptions that we need to make in this regard seem reasonable, such as the idea that each individual human was much more dependent on her relatively small social group for safety and prosperity throughout most of human history than is the case at present.

 

  • Our evolutionary imperative mandates many forms of relatively accurate perception, some of the most interesting of which are summarized in the heuristics research, and two overriding types of misperception which are as follows:

 

  • The first type of misperception relates to the importance of the group historically to our individual survival and prosperity.   While we are no longer so dependent on the group, our brains developed in an environment in which if the group disbanded, or if we were pushed out of the group, we were likely to die.   This causes us to tend to accept as “real” whatever we perceive to be important to the group’s survival and prosperity, and to suppress information that we perceive to threaten the group.   Think, for example, of Bourdieu’s “misrecognition” concept and the authority bias research.   Most of the rest of the bias research can be explained by this concept as well.

 

  • The second type of misperception is caused by our need to feel secure within the group as individuals.   For example, if our contribution v. our cost to the group does not meet some minimal standard, we may be pushed out and when our instincts were formed by evolution this likely often meant death.   And the greater our status within the group, the greater our security and reproductive opportunity will tend to be.   While this was likely true when our instincts were formed, it is still true in different ways now.   Think, for example, of the justification bias research in this light.

 

  • Our group’s beliefs are the cumulative effect of its historic perceptions, which evolved for the practical purposes just noted and are almost certain to be inaccurate to a significant extent.   See the information above regarding social context and “premises”.

 

  • We will be slower to accept accurate information that conflicts with an inaccurate belief we hold than would a similarly educated and intelligent person who was not burdened by our inaccurate belief.   This is likely in part because our brains format around our group’s foundational beliefs.   However, we behave this way with regard to foundational beliefs as well as beliefs formed on a deliberatively rational basis in adulthood.   This feature of our psychology likely evolved as a result of the importance of foundational beliefs to group stability and the likelihood that wisdom passed on to us by our elders of a more practical sort would be on balance adaptive.   The confirmation bias research bears this out.   This is one of the most pervasive and harmful cognitive biases.

 

  • Emotion is largely driven by the older structures within the brain’s core, while deliberative reason of the type used in the scientific method is largely driven by structures that evolved more recently and are in the cerebral cortex.   The older, cruder brain structures tend to overcome the more recent rational structures when they are pitted against each other. See the information above related to reason v. emotion.

 

  • The more heavily we are influenced by emotion as opposed to reason (“ecological rationality” as opposed to “deliberative rationality”, as noted above), the greater our tendency to misperceive.   This increases the probability that we will act in accordance with our evolutionary imperative (even though this behavior is often not adaptive in our current environment) when confronted with evidence, whether accurate or inaccurate, that could threaten our group or our place in it.   See, for example, the information above regarding taboos, ecological rationality, reason v. emotion, and value structures.

 

  • We tend to equate strong feelings with “knowing”.   This enhances our tendency to be certain of whatever moves us most deeply from an emotional point of view, whether it relates to fear or desire, and so strengthens the tendencies already noted.

 

  • Powerful emotional experiences, often characterized as   “spiritual experiences”, result from both normal brain functioning and brain dysfunction.   They are sometimes the result of solitary contemplation or other individual experience, and sometimes the result of group interaction of various sorts.   These experiences are human universals and are used in both deliberatively and ecologically rational ways in most human groups to support their foundational beliefs.   See the information above related to spiritual experience and the emotion of elation.

 

  • We are not as affected by emotion when examining the experience of other individuals or groups as we are when attempting to understand our own experience, and hence are able to see ecologically rational behavior (that from our point of view is not deliberatively rational) in others that we cannot see in ourselves.   See the information above related to the pattern of insider belief and outsider rejection.

 

  • Human tendencies evolve because they are on balance adaptive at the time of evolution.   Hence, a tendency like the authority bias may have been adaptive on balance, but in some cases maladaptive.   This would be particularly so from the perspective of many individuals within the group, since the authority bias likely evolved to strengthen groups, and so only indirectly to benefit individual members of groups.   And yet individual members of the group would be subject to it whether it was adaptive for them or not.   Individuals who become aware of this can now often leave groups that work contrary to their particular interest, but should be expected to instinctively fear doing so for the reasons indicated.   Doing this requires strong deliberatively rational abilities, and hence the suppression of the emotional forces that religion and other cultural influences tend to use to cause ecological rationality to dominate human decision making.

 

  • Human culture changes much more quickly than human biology.   So human tendencies that evolved because they were at one time adaptive on balance (such as the authority bias) may persist after they are less adaptive on balance or even maladaptive.   The declining importance of adherence to the dictates of certain kinds of small group authority makes the authority bias a likely example of this in certain cases.   This explains why entire groups are instinctively held together by obedience to authority even though the costs they impose on their members are far greater than the collective benefits the members receive.   These groups eventually either evolve, or pass out of existence.   The Fundamentalist Mormons are a current example of a North American group that faces this difficult decision.   Mainstream Mormonism faced a similar decision in the late 1800s related to polygamy.   Jonestown is an example of a group that chose to eliminate itself rather than adapt.   This is a classic example of the potentially toxic effects of non-democratic leadership and the belief in what amounts to magic – supernatural forces that confer special authority on certain leaders.   Jim Jones’ leadership, and the group structure that he controlled, was under pressure from forces outside the group.   Rather than allow change to occur, he led the group into mass suicide.   This is analogous to a man who kills his estranged lover because “if I can’t have you, no one will”.

 

Let’s now condense these principles by another order of magnitude to see if we can get a “take away” concept that is concise enough to be remembered.

 

  • The human capacity to perceive evolved to make it more likely that we would survive and propagate in our physical and social environment (our “evolutionary environment”) at the time we evolved.   In our evolutionary environment the well-being of our dominant, small social group and our security within it were far more important to our survival and reproductive opportunities than is now generally the case.   Therefore, both in our evolutionary environment and now, when we are confronted with information that might threaten one of our group’s foundational values and hence threaten our group, we tend to misperceive the information so that it is not threatening.   The same is true with regard to information that might threaten our place within the group.   For example, information that suggests Joseph Smith was a lying sexual predator is rejected out of hand by most faithful Mormons while being compelling to most non-Mormons.

 

  • We are more likely to misperceive when under the influence of our emotions.   Our emotions tend to flare when our group’s foundational values or our place in the group are threatened.   However, we tend to be deliberatively rational when examining the foundational values of other groups, and so can spot the ecological nature of their rationality, and note how it conflicts with our own brand of deliberative or ecological rationality.   The obvious “irrationality” (what we usually call ecological rationality) of other groups coupled with our inability to perceive our own irrationality strengthens our group.   And particularly powerful emotional experiences, often characterized as “spiritual experiences”, are human universals.   These are used in most human groups to support their foundational beliefs.

 

That is short enough that it will do the trick for me.

 

So, how does Mormonism use these attributes of human behavior to strengthen itself?

 

  • Mormonism emphasizes the possibility of knowing impossible-to-know and deeply comforting things with certainty, thus taking advantage of the human dislike of dissonance, bias toward certainty and fear of death and social instability.

 

  • Mormonism emphasizes emotional feeling as a form of knowledge that should take precedence over “rational” or “intellectual” knowledge whenever there is a conflict, and encourages both group and individual behaviour that will increase the likelihood of powerful emotional experiences.   This supercharges the irrational effect emotion has within the Mormon community.

 

  • Mormonism emphasizes a black and white understanding of reality.   If Joseph Smith was either 100% God’s prophet (and hence produced good things) or 100% a fraud (and hence produced bad things), it is sensible for a Mormon who has experienced the goodness of the Mormon community to conclude that the 100% fraud option is not viable.

 

  • Mormonism maintains control over as many of life’s experiences as possible that tend to produce positive emotions, and takes as much credit as possible for those feelings.   These feelings are then used as evidence that Mormonism’s truth claims are “true”.

 

  • Among Mormonism’s foundational beliefs we find many that raise the fear and desire stakes, thus intensifying an already powerful authority bias and making Mormons more prone to the irrational effect of emotion.   The most significant of these is that only those obedient to Mormon authority will be reunited after death in the Celestial Kingdom with their families in a state of unimaginable joy.   This concept’s most pervasive influence comes from its making complete obedience to Mormon authority a condition of marriage and close association with family members after death.   This means that any strong taboo set up by Mormon leaders will evoke the fear response, which will impair reason.   For the last several decades one of Mormonism’s strongest taboos has been against reading or talking about information that questions Mormon authority, regardless of the information’s academic merit.   Hence, the first hurdle most Mormons must get over when faced with information that questions the Mormon belief system is an irrationality-inducing fear response caused by the mere idea that one might look at such information.   If that can be overcome, the fear response that in most groups would be caused by seriously considering information that questions foundational group values must then be dealt with.

 

  • Mormonism monopolizes its members’ time as well as their physical, mental and emotional energy, while suppressing information that conflicts with Mormon belief.   This slows the rate at which cognitive dissonance of various types builds within the Mormon population, and limits the opportunity reason has to calm emotion and so overcome emotional irrationality.   Importantly, it is taboo to read or talk about anything that questions Mormon authority.   The mere appearance of this information is therefore enough to evoke a strong fear response in most Mormons, and so impair their rational faculties.

 

  • Mormonism uses a host of group and individual rituals that are likely to amplify the effect of various biases and cause both group and ego induced misperception so as to strengthen the Mormon group.   The emphasis on constant vocal affirmation of Mormon belief through public or semi-public scripture reading, praying and testimony bearing of various types is central to this.

 

While that is far from complete, it is good enough for present purposes.

 

Conclusion

When we add all of the above factors up, we should not be surprised that it is excruciatingly difficult for the typical faithful Mormon to look any information in the eye that questions the legitimacy of the beliefs on which his life is based.

 

So, we should not be surprised that it takes many of us until mid-life to   “wake up”.   And, we should not be surprised that many of our family and friends will never wake up.   In fact, we should expect those who wake up to be in the minority. The force of denial within a heavily conditioned, socially tight community like most Mormon communities should be expected to be powerful.

 

On the basis of the foregoing, I feel justified in concluding that under the influence of the powerful personal experiences and social conditioning I have noted, the socially relative becomes more real than everyday waking reality for many religious believers, including many Mormons, creating barriers to the kind of understanding across religious and other cultural lines that is becoming increasingly important in our shrinking world.   This amounts to the denial of many kinds of highly probable reality, and explains to me both my own experience and those of believers within many other traditions.

 

In sum, we should expect Mormons who have been fully conditioned by their community to be highly resistant to any information that challenges their beliefs.   And, if for some reason a faithful Mormon is put in a position where the certainty he has felt that the Mormon worldview is   “true” collapses, we should expect that to be a trauma on par with losing a close family member to death.

 

Postscript

I ran across another concept that is relevant to the topic of denial and, rather than weaving it into this essay, I am going to add it here.   I do this primarily because I have now referenced this essay, by page number, so many times that I don’t want to throw that off.

 

The topic is “ideomotor control” in general, and chiropractors in particular.   The following comes from a newspaper article by Jeremy Loome of the Edmonton Sun that can be found at http://edmsun.canoe.ca/Lifestyle/Columnists/Loome_Jeremy/2006/04/19/1539827.html.   Loome’s text is indented.   My comments are at the margin.

 

The science that says we’re frequently irrational may be easier to understand when you consider our subconscious may be to blame.

 

The survival instinct is wired deeply into the brain, as is the related fear of our own mortality.   That can lead to the brain performing some interesting tricks on our perception.   In Dr. Andrew Newberg’s experiments, outlined earlier in this series, he demonstrated how the brain can be tricked into thinking our internal monologue – that little voice on our shoulder – is a disembodied voice talking to us, rather than emanating from us.   Our bodies are just as susceptible.

 

A spiritual case in point?   Ouija boards.   Almost anyone who has tried them and successfully seen them spell out a message from spirits will have a hard time believing science has repeatedly disproven their value.

 

As with other “sciences” such as “facilitated communication,” “applied kinesiology” and “Toftness Radiation Detection,” Ouija communication seems to work due to what is termed “ideomotor action”:   the brain subconsciously influencing human muscles to turn a belief into reality.

 

This element of Loome’s research was new to me, and fascinating.   More proof as to how our most important beliefs affect what we perceive.   Without experiencing what I have re. Mormonism, no amount of reading could have drilled into me the “theory ladenness of observation”[1] concept as it has. I don’t take this as far as many do. That is, I believe there is an objective reality, but that the part of it we see is highly influenced by our worldview. Our worldview creates a portal through which we look, in effect. Even the best scientists are so limited, at least to an extent[2].

 

All of the above noted practices have been disproven using double blind studies, where no participants were able to view the process as it happens.   When no participants can see the board as they attempt to have spirits contact them, Ouija doesn’t work.   Ever.   Anywhere.   Yet hardcore believers will dismiss the science before the technique, demonstrating how a powerful belief can trump rationality.

 

“Under a variety of circumstances, our muscles will behave unconsciously in accordance with an implanted expectation,” writes Dr. Ray Hyman of the University of Oregon, in his paper “How People Are Fooled by Ideomotor Action”[3].   Hyman has used science to disprove everything from water divining to cold reading – the process of pretending to know about someone by reading their emotions and reactions.

 

“What makes this simple fact so important is that we are not aware that we ourselves are the source of the resulting action,” he says.

 

One of the most striking “medical” failures disproven by double blind experiments is applied kinesiology.   As bogus as it is as a science, it does a credible job of demonstrating the power of belief.   Thousands of North Americans still subscribe to the technique, which involves using muscular pressure and tongue sensation to allegedly diagnose illnesses and allergies.   Nonetheless, even after demonstrating to a roomful of chiropractors that it doesn’t work during double blind studies, Hyman could not get them to admit defeat.

 

“When these results were announced, the head chiropractor turned to me and said, ‘You see, that is why we never do double blind testing anymore.   It never works!’” Hyman writes.   “At first, I thought he was joking.   It turned out he was quite serious.”

 

The man was so convinced going in that the neutral, controlled science had to be wrong, not applied kinesiology.

 

“Many pseudo and fringe scientists often react to the failure of science to confirm their prized beliefs, not by gracefully accepting the possibility that they were wrong, but by arguing that science is defective,” he says.

 

This is just another illustration of how various of the factors described in the essay above contribute to beliefs that are not supported by the evidence.   Chiropractors have years of education, professional and community prestige, their livelihoods, etc. at stake in what Dr. Hyman is disproving.   There is enough evidence to superficially support the chiropractors’ claims.   And they should be expected to be in denial regarding the evidence that does not support their claims, as predicted by the confirmation bias, which in this case would be strengthened by all that they stand to lose if they are shown wrong.   I don’t have time to run down the long list above in terms of cognitive dissonance, the authority bias, etc., but quickly thinking about a few of them it is easy to see how they apply to this case.

 

Jon Haidt in “The Happiness Hypothesis” describes the kind of reasoning that is used by most people to make them feel comfortable with their position in life.   This is related to our tendency to deny reality when it conflicts with what we need to perceive in order to be comfortable (see my take on this at http://mccue.cc/bob/documents/rs.denial.pdf), as well as to why arguments related to anything important and uncertain (religion, politics, global warming, etc.) often become pitched.   Haidt uses different terminology than others I have read in this field.   What follows is a sample of his concepts.

 

p. 64 – 65:     “Deanna Kuhn, a cognitive psychologist who has studied such everyday reasoning, found that most people readily offered   “psuedoevidence” “¦   Most people gave no real evidence for their positions, and most made no effort to look for evidence opposing their initial positions.   David Perkins, a Harvard psychologist who has devoted his career to improving reasoning, found the same thing.   He says that thinking generally uses the   “makes-sense” stopping rule. We take a position, look for evidence that supports it, and if we find some evidence ““ enough so that our position   “makes sense” ““ we stop thinking.   But at least in a   lower-pressure situation such as this, if someone else brings up reasons and evidence on the other side, people can be induced to change their minds; they just don”™t make an effort to do such thinking for themselves.

 

Now let’s crank up the pressure.   The client has been caught cheating on her taxes.   She calls her lawyer. She doesn’t confess and ask, “Was that OK?”   She says, “Do something”.   The lawyer bolts into action, assesses the damaging evidence, researches precedents and loopholes, and figures out how some personal expense might be plausibly justified as business expense. The lawyer has been given an order: Use all your powers to defend me. [RDM note: This misunderstands the lawyer’s role.   Lawyers are risk assessors.   If I am consistently wrong in my risk assessments (that is, if I say “Sure you will win” all the time and my clients consistently lose), I will have trouble staying in business.   Hence, Haidt has chosen a poor example to make his point.   He should have left the lawyer out of it.   The client will likely grab any scrap of evidence in her favor, but a good tax lawyer in cases like this will often advise that the taxes owing should simply be paid unless the client wishes to live with a potential criminal prosecution hanging over her head.]

 

Studies of “motivated reasoning” show that people who are motivated to reach a particular conclusion are even worse reasoners than those in Kuhn’s and Perkins’ studies, but the mechanism is basically the same: a one-sided search for supporting evidence only. People who are told that they have performed poorly on a test of social intelligence think extra hard to find reasons to discount the test; people who have been asked to read a study showing that one of their habits – such as drinking coffee – is unhealthy think extra hard to find flaws in the study, flaws that people who don’t drink coffee don’t notice.   Over and over again, studies show that people set out on a cognitive mission to bring back reasons to support their preferred belief or action.   And because we are usually successful in this mission, we end up with the illusion of objectivity. We really believe that our position is rationally and objectively justified.”

 

pp. 66 – 69: Haidt summarizes various studies showing how unrealistic we tend to be when comparing ourselves to others, or to the average. These include the following:

 

  • People overestimated how many flowers they would buy at a charity event that raised money by selling flowers, but were pretty accurate in their estimates of the behaviour of others.
  • People underestimated their own propensity to cheat, but were relatively accurate in their assessment of the likelihood that others would cheat.
  • People overestimate their own leadership qualities.   In fact, the more ambiguous the characteristic under consideration is, the more likely it is to be overestimated.
  • University professors overestimate their ability level; 94% believe that they are "above average", for example (see the sketch just below).
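
As a rough check on figures like the 94%, here is a short sketch (Python, my own illustration rather than anything from Haidt) of why such self-assessments must involve inflation if "above average" is read as "above the group's median": whatever the distribution of real ability, only about half of any group can sit above its own median.

    import random

    random.seed(0)
    # Hypothetical "true" ability scores for 1,000 professors.
    true_ability = [random.gauss(100, 15) for _ in range(1000)]
    median = sorted(true_ability)[len(true_ability) // 2]

    above = sum(1 for score in true_ability if score > median)
    print(f"Share actually above the group median: {above / len(true_ability):.0%}")
    print("Share who believe they are above average (per Haidt): 94%")
    # The gap of roughly 44 percentage points is the self-inflation Haidt describes.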

 

p. 68: "If the only effect of these rampant self-esteem-inflating biases was to make people feel good about themselves, they would not be a problem. In fact, evidence shows that people who hold pervasive positive illusions about themselves, their abilities, and their future prospects are mentally healthier, happier, and better liked than people who lack such illusions. But such biases can make people feel that they deserve more than they do, thereby setting the stage for endless disputes with other people who feel equally over-entitled."

 

pp. 69 – 71: Haidt describes here a number of studies related to dispute resolution. Some of the findings were as follows:

 

  • Where the parties to a dispute were forced to become familiar with both sides of the case before finding out which side was theirs, settlement was far more likely than if they knew which side they were on before beginning to look at the evidence.
  • Reading an essay about bias did not help. People assume that others are biased but that they themselves are not.
  • Writing an essay from the other party's position did not help. In fact, it made settlement less likely than if nothing had been done, likely because thinking through an opponent's arguments from a biased point of view made those arguments seem weaker than they actually were.
  • It did work, however, to read an essay about how biases work and then write an essay about the weaknesses in your own case. This seems to work as long as the weaknesses in question do not imply character weaknesses in the person doing the analysis.
  • Learning about self-serving biases consistently helped people to predict others' behaviour, but not their own, except in the single class of cases just noted.

 

p. 71: "Pronin and Ross trace this resistance to a phenomenon they call "naïve realism": Each of us thinks that we see the world directly, as it really is. We further believe that the facts as we see them are there for all to see, and therefore others should agree with us. If they don't agree, it follows either that they have not yet been exposed to the relevant facts or else that they are blinded by their interests and ideologies [RDM note: or are stupid]. People acknowledge that their own backgrounds have shaped their views, but such experiences are invariably seen as deepening one's insights; for example, being a doctor gives a person special insight into the problems of the health-care industry. But the background of other people is used to explain their biases and covert motivations; for example, doctors think that lawyers disagree with them about tort reform not because they work with the victims of malpractice (and therefore have their own special insights) but because their self-interest biases their thinking. It just seems plain as day, to the naïve realist, that everyone is influenced by ideology and self-interest. Except for me. I see things as they are.

 

If I could nominate one candidate for "biggest obstacle to world peace and social harmony," it would be naïve realism, because it so readily scales up from the individual to the group level: My group is right because we see things as they are. Those who disagree are obviously biased by their religion, their ideology, or their self-interest. Naïve realism gives us a world full of good and evil, and this brings us to the most disturbing implication of the sages' advice about hypocrisy: Good and evil do not exist outside of our beliefs about them."

 

 

 

