Friday, October 31, 2014

Clinging Tightly

Closely related to loss aversion is the endowment effect, whereby things gain value in our eyes merely by virtue of our possessing them. Loss aversion makes us cling to what we have, not wanting to part with it; the endowment effect makes what we have seem worth more in the first place. Both can be applied to belief.

Daniel Kahneman repeated Richard Thaler's classic coffee-mug experiment many times, always with the same results. It is a simple demonstration of the endowment effect.

"Mugs were distributed randomly to half the participants. The Sellers had their mug in front of them, and the Buyers were invited to look at their neighbor's mug; all indicated the price at which they would trade. The Buyers had to use their own money to acquire a mug. The results were dramatic: the average selling price [determined by the one possessing the mug] was about double the average buying price [the price a buyer deemed he would be willing to pay for that mug], and the estimated number of trades was less than half of the number predicted by standard theory."

More than a coffee mug, beliefs are things that we cling to. We don't want to part with them; it is like they are a part of us and who we are. We gave up our former lives and our families for those beliefs. They cost us a lot. We value more highly things that require more effort to attain.

COG members were required to "witness," i.e. try to convince others of the value of our beliefs. Like the techniques used in multi-level marketing, the more we worked to "sell" our religion to others, the more we treasured it ourselves.

What of disconfirming evidence? 

"It is extremely rare for someone to simply abandon a valued belief when confronted with disconfirming information. In fact, recent psychological research shows that when this happens, people tend to hold the erroneous belief even more strongly." Daniel Kahneman

Your Deceptive Mind: A Scientific Guide to Critical Thinking Skills

Tuesday, October 28, 2014

Commitment

For some reason, I had placed an unrealistic importance on commitment. "I've given my word and I can't go back on it." This idea had a very strong hold on my mind.

I wanted to believe that I had made "the right" choice for my life. I wanted my life to have purpose and meaning. So in times of doubt, I would look back to that initial commitment, to the initial purity I thought I had seen in the COG, and I would determine to continue to work to make the group as good as it could be, which for me was to provide a decent education to the children in my care.

This can partly be explained by the phenomenon of sunk cost. I had already invested my everything - my meager savings, personal possessions, emotional dependence, and many years - into the cult; how could I leave now? I'd paid such a high price, so surely I should continue to give it my all.

In general, the harder a decision is to undo, the more convinced of it we become. The greater the cost - in embarrassment, money, time, whatever the case may be - the harder we cling, and the harder we strive for reasons to justify our decision as right, or even the best possible choice.
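The sunk-cost fallacy is easy to state in code. A minimal sketch (mine, not from any source I quote here): the rational rule for deciding whether to continue compares only future costs and future benefits, so the amount already invested should not even appear in the calculation - yet it is precisely the number we cannot stop staring at.

    def should_continue(future_benefit, future_cost, sunk_cost=0):
        """Rational decision rule: what's spent is spent either way,
        so only the future matters; sunk_cost is deliberately unused."""
        return future_benefit > future_cost

    # Years and savings already invested, little benefit left ahead:
    print(should_continue(future_benefit=1, future_cost=10, sunk_cost=10_000))
    # -> False. The rational answer is to leave, no matter the sunk cost.

The fallacy is, in effect, letting that unused third argument drive the decision.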

I never actually considered leaving the group as an option. Aside from my mental state, the logistics were difficult. I had nowhere to go - in effect, no family to return "home" to. I was in a foreign country, far from what had been familiar to me in my youth. As the years passed, it would become harder and harder for me to support myself in mainstream society. I had nothing of worth to show on a resume.  

I stayed in the group, considering it my life's work. 

As the author of The Skeptic's Dictionary (skepdic.com) so succinctly put it:


"To continue to invest in a hopeless project is irrational. Such behavior may be a pathetic attempt to delay having to face the consequences of one's poor judgment. The irrationality is a way to save face, to appear to be knowledgeable, when in fact one is acting like an idiot." 

Monday, October 27, 2014

The Power of Anecdotes

In addition to the Mo Letters and Bible that provided our mental sustenance, we were regularly fed a side dish of anecdotes, which we called “testimonies.” Such stories were particularly helpful in reinforcing the idea of meaning behind all things. Stories are, by nature, easy to remember, so they can play a big role in feeding the availability heuristic.

In society at large, anecdotes are used every day in advertising and all types of media. Because they are so memorable, they are a double-edged sword. Problems arise because the first thing we tend to forget about information we hear is its source. Disturbingly, the second thing we forget is whether or not it is true.


"I remember something about apples and cyanide... What was it? Well, to be safe, I won't buy apples." 


The Family used testimonies to appeal to the emotions of its members and increase the feeling of team spirit. Although not necessarily well-written, and with an over-generous use of exclamation marks, these vivid accounts of "miracles" were published regularly and were a part of our required reading. 


This was a perfect platform for subjective validation, where these stories were perceived to be completely true because they spoke so meaningfully to our faith. We could see the spiritual workings behind the scenes in action - the cause and effect that fit perfectly with our beliefs. No doubt these testimonies were, perhaps even unwittingly, tailor-made to fit, conform to, and reinforce our predetermined ideas.

They served as precedents for future expectations, built up the faith of group members to "expect miracles" themselves, provided a means of social norming, made members want to emulate the behavior of the ones in the stories, and even gave people a little claim-to-fame and social approval if they were fortunate enough to have one of their testimonies published for all.


These testimonies were not only in written form but were a regular part of group meetings, with members coming up with "inspiring" stories of events that happened during their days of proselytizing, told with excitement and animation, and raising the enthusiasm levels of the group to work harder for our cause.

Tuesday, October 21, 2014

Happiness

With the "power of prayer" at our disposal, of course we were "happy." In fact, we were constantly reminded in the letters from Berg that living for the Lord in the cult was enjoyable. We were experiencing God's blessings, we were playing the "glad game" (like in the movie, Pollyanna), and we were the happiest people on earth. After all, we were the "chosen ones."

In 1959, psychologists Leon Festinger and James Carlsmith conducted interesting experiments on cognitive dissonance, one portion being on people's attitudes toward the enjoyment of a very boring assignment, such as mindlessly turning knobs on a board.

When they were done with the tedious job given them, the remaining participants were offered the job of telling the next participants how enjoyable the tasks were. Some were paid $1 to do this; others were paid $20. Those who got only $1 had to justify to themselves why they were telling the next participants how fun the very boring job was, so they convinced themselves that the tasks were indeed enjoyable. Those who got $20 were content to lie just for the money; no self-justification was needed. ($1 evidently not being considered enough to pay someone to lie.)

The researchers concluded that when people are compelled to lie about, in this case, the supposed enjoyment of a boring task, in order to relieve the cognitive dissonance of the contradiction, they convince themselves that the lie is not really a lie but is actually true. Lying to themselves to justify lying to others.

In a curious admission of the folly of our happiness delusion, although it was not seen as such, Berg wrote (among other things), "I'd rather be happy in madness, than only be sane and sad." So madness is better than sanity, as long as one is happy.

I've only included the parts of this experiment that pertain to my point: that Family members considered themselves to be happy in spite of their circumstances. On the surface, that sounds like a wonderfully positive outlook. In reality, it was mindless and crippling denial.

If you care to read a more thorough synopsis of the study, you can find it here.

Monday, October 20, 2014

The Illusion of Control

Closely related to the idea of a "just world" is the need to feel that we have control over our lives and situations. Like those who hold to widespread superstitious beliefs and traditions, we Family members had the ultimate power of the universe at our disposal - prayer.

Imagine, I had a God who was concerned about every aspect of my life, as if I were the only one on Earth. He was interested in my every petty worry or imagined need. What a nice, feel-good story! 

This all-powerful God was ever-waiting to listen to my every word. He was also keeping track of those words to judge me for them. ("Every idle word that men shall speak, they shall give account thereof in the day of judgment. For by thy words thou shalt be justified, and by thy words thou shalt be condemned." Matthew 12:36-37) Frightening. Clearly, he was focused on me.

He answered all my prayers, but I needed to always be praying ("Pray without ceasing." I Thessalonians 5:17). Fervent prayer was all important, too, as without wholehearted fervency, my prayers might not be heard. 

If I didn't get an answer, well, that was just God's "no" for his divine reasons that were beyond my feeble and limited understanding. After all, as our loving father, he wants what is best for us. If I were to get what I perceived as the answer to my prayer, then surely God answers prayer! Hallelujah!

So either way, God wins. And we win, too, because we could be content knowing that we had our own (albeit delusional) method of control - prayer.

Sunday, October 19, 2014

Flirty Fishing Follows

As if the in-group promiscuous sex wasn't enough, potential converts were included in this practice.

In this thinly veiled prostitution, the women were encouraged to seek out men to "win to the lord" through having sex with them. How naive! It was easy to find men who were eager for some cheap sex with a young, healthy girl, even if they had to endure a bit of ridiculous religious preaching to get it.

So it was that I lost my virginity to a (fortunately for me) very kind man from Jordan just before my 21st birthday. 

This practice was called "Flirty Fishing," in reference to the Bible verse where Jesus said, "Follow me, and I will make you fishers of men." Yet, I rejected the terminology the cult used for such men. I could not bring myself to think of these outside men that I had sex with as "fish" or "kings," but rather I preferred to think of them as "friends." (Perhaps that is one way I dealt with the cognitive dissonance I experienced with this new morality.)

Why did I do it? Perhaps here is a piece of an explanation.

In the early 1960s, Stanley Milgram conducted some rather controversial experiments on obedience to authority. He postulated that people will do what they are told, even to the point of contravening their own sense of morality.

He set up his experiments using three roles: the Experimenter (the authority figure), the Teacher, and the Student.

The Experimenter explained to the Teacher that he would be part of an experiment on operant conditioning (think B.F. Skinner). The Student would be “punished” for answering questions wrongly, so that the Experimenter could (ostensibly) learn how punishments affected learning. Unknown to the Teacher, the Student was a confederate of the Experimenter and only acting his part, and the experiment actually involved studying the Teacher's level of obedience.


The Teacher, believing the experiment was about punishment and learning, was to help the Student answer questions correctly by administering mild electric shocks when he gave a wrong answer. These would gradually increase in strength as the Student continued to answer incorrectly. The Teacher was given a sample shock himself (45 volts in Milgram's setup) and thought it wasn't so bad, so he had no problem delivering shocks of that scale to the Student.


The Teacher was brought into a science lab fitted with an impressive-looking machine. It had wires, lights, and labeled switches covering ranges from mild through moderate and severe, up to a life-threatening 450 volts at the top of the incrementally increasing scale. The Experimenter instructed the Teacher to administer a shock 15 volts stronger for each wrong answer given, "teaching" the Student the importance of giving correct responses.

When the Student gave an incorrect answer, the Teacher gave him an electric shock, just as the Experimenter told him. After each incorrect answer, the strength of the shock was increased by 15 volts, gradually building up to frightening levels as the experiment went on.
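The arithmetic of that escalation is worth spelling out: thirty switches, each 15 volts stronger than the last, take the Teacher from a trivial 15 volts to 450 in small, individually easy steps. Here is a sketch of the ladder in code (the voltage arithmetic is Milgram's; the range labels follow this post's rough description of the machine, not his exact markings):

    def shock_schedule(step=15, top=450):
        """One switch per wrong answer, each 15 volts stronger
        than the last, from 15 V up to 450 V."""
        for volts in range(step, top + step, step):
            if volts <= 120:
                label = "mild"
            elif volts <= 240:
                label = "moderate"
            elif volts <= 360:
                label = "severe"
            else:
                label = "danger"
            yield volts, label

    for volts, label in shock_schedule():
        print(f"{volts:3d} V ({label})")
    # Thirty small steps: no single 15-volt increment ever feels
    # like the obvious moment to refuse.

That, psychologically, is the trap: each step is only a trivially small commitment beyond the last one.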

Although some of the Teachers grew visibly agitated, the majority carried on with the experiment, continuing to deliver shocks at the request of the Experimenter, in spite of the screams of the unseen Student in the next room. After 300 volts, the Student fell silent, which the Experimenter deemed an "incorrect answer," ordering continuing shocks - and a disturbing number of participants delivered them.

These results have been replicated by other psychologists. Milgram wrote, "Authority was pitted against the subjects' strongest moral imperatives...and...authority won more often than not."

He wrote further, "Often it is not so much the kind of person a man is as the kind of situation in which he finds himself that determines how he will act."


Perhaps the Teachers felt that they weren't responsible. It wasn't their fault if the Student was hurt. They were just doing what they were told.

The moral implications are alarming.

Saturday, October 18, 2014

"Free Love"

When I first joined the COG, what I saw was a group of young people dedicated to returning to the lifestyle and ways of early Christianity. As such, the group appeared to be chaste, with sex allowed only within the bonds of marriage. Little did I know...

Even at that time, Berg and his household were living with a completely different set of mores. Promiscuity, as long as it could be justified as being "done in love," was encouraged, since after all, "God is love" so then "love" must be "God." "As long as something is done in love, then it is completely lawful in God's eyes," or so said the "prophet." What a convenient and holy-sounding justification for his and others' licentiousness.

This gradually filtered down to the plebeian cult members through the letters Berg wrote. Finally, he felt it was "God's time" to share these new "strange truths" with "his children." It was 1977, and Berg instituted Flirty Fishing - sex with potential converts. Within months, we were officially given permission to have sex indiscriminately among ourselves, called "sharing."

Oh, how scared was this little former-Catholic virgin!

Friday, October 17, 2014

Social Control

An interesting theory of social control was developed by Travis Hirschi in 1969. Although referring to criminals and other social deviants, I found his ideas very applicable to the society within the COG.

He suggested that in order to discourage deviance - violation of accepted norms of the society, in my case, the cult - one needed to be bonded to the community in the following ways: attachment, commitment, involvement, and belief.

Attachment was easily provided through the hierarchy of Family leadership. Although friendships were often shattered, there was always a leader in my house, and always Berg as our ultimate "father" and his mistress, Zerby, as our ultimate "mother" figure. When one feels attached to a society through authority figures, such as pseudo-kinship or leadership structure, one naturally demonstrates respect and love for them by accepting their standards and values. When we disappoint those we feel attached to, we experience shame and guilt. Therefore, we are less inclined to deviate from what is deemed "normal behavior" in the group.

As far as commitment goes, one could not get much more committed than the burning of bridges required of a Family member. Hirschi argued that there is a relation between the amount of commitment to a society and the disinclination toward social deviance: the greater the commitment, the more one has to lose by misbehaving.

The involvement required of Family members was complete. Every moment of every day was scheduled and regulated. All my time was filled with my so-called "work for the lord." Time for thought of other lifestyles was seriously limited, although the likelihood of thinking such damning thoughts grew slimmer with each passing day. Besides, "An idle mind is the devil's workshop."

Belief, of course, being aided and abetted by the availability heuristic and the confirmation bias, among many others, was reinforced daily. The required 2 hours of "word time" (reading publications from the group, and/or the King James bible), plus memorization, kept me busy during any time that would have otherwise been free. "Living in the word," i.e. constantly having the "word of God and/or Berg" in one's mind, was held as virtuous and an ideal for which to strive.

As well, I was getting quite interested in the stories of the bible, and I ended up making that a main focus of study and the form of mandatory "word time" that I enjoyed teaching the most to the children I worked with. Since it was only that or the words of Berg, it was an easy choice.

Thursday, October 16, 2014

Neural Degeneration is No Joke

Years later, when I finally made the break from TFI (The Family International, the current name for any remnants of the COG), I found concentration very difficult. No longer under the thumb of the group, I felt I could return to reading books, a source of pleasure in my youth, but reading proved very hard. First, I was plagued by guilt, with recurring thoughts that "I shouldn't be wasting my precious time reading fiction when I could be studying the Bible." Second, my comprehension skills were shot. I had a really hard time following even the Sherlock Holmes stories I was trying to read through.

Thankfully, the brain is very plastic and can continue to form new cells and connections for our entire lives. Gradually I regained brain function and have been able to learn and study again, but it took time to pull out of the mental haze.

Now I am revisiting some of my favorite psychology courses that I listened to 8 years ago, and I've been surprised at how new they seem to me. Back when I first listened, I had trouble ingesting them. Finally, I have been able to experience the sheer joy of learning that I allowed myself to be robbed of during my years of cult dedication. May it never end.

Wednesday, October 15, 2014

Speaking of Stress

As my COG "family" replaced my biological family, I naturally formed attachments to people I loved and respected. For my first 12 years in the group, I found myself in the position of caring for leaders and their children for years at a time; much of the time, it was just myself and maybe someone else with the leadership couple and their children. Of course, I grew to love the children as my own and develop attachments to their parents as pseudo-parent figures for myself. The people I served in this capacity were considerably older than I was, making feelings of parental affection somewhat natural.

As well, my parents had died not long after I moved away from the US, so now they were completely absent from my life. I had minimal contact with my siblings.

True to the core beliefs of "The Family" (new name for the COG), these situations would come to sudden and abrupt ends, leaving me emotionally distraught.

Any friendships we may have made within the group never lasted, as we were strongly discouraged from keeping in touch with people once our paths had diverged.

It appears this (among other things) caused me some sort of psychological damage, as from time to time, when something struck me as funny, I would laugh uncontrollably for about 20 minutes straight. Although I would try to stop, all I could do was put my head down and try to stifle the laughter and the inevitable tears that would fall.

Oddly, this was not a "red flag" to my wise, older "shepherds in the lord."

Tuesday, October 14, 2014

Neural Atrophy

These attack times, sometimes called "pushes," continued intermittently throughout my entire time in the group.

We never actually habituated to these times, as the attack periods would last only a number of weeks, followed by a break of several months. Then the next one would come upon us as suddenly as the last, throwing us into another high-tension mode of operation, providing a strong distraction from thought, inflating our sense of self-importance, and bringing in extra funds for the cult.

Living with the stress and anxiety of cult life had an effect on my brain. Prolonged stress produces neural atrophy and suppresses neurogenesis - the birth of new brain cells. Chronic subordination also causes a decrease in neurogenesis. Or, put more simply, being in a situation where one has to constantly submit to someone else causes one's brain to stop producing new cells as it should. As I continued in the cult, I was gradually getting dumber and dumber.

Meanwhile, my frontal lobe was still nowhere near completely developed. I was missing the very capacity needed to discern good choices from bad and to foresee long-term consequences.

Now that I was on the far side of the globe, I continued to be absorbed deeper into the group and drawn further from any friends or family who could possibly have rescued me. Not that I would have listened.

Good-bye USA

The frequent search for some illusory correlation, i.e. looking for the reasons for random events, took a fair bit of time and introspection. We had to find out why a person, perhaps ourselves, "deserved" what they got. "Nothing happens by accident to one of God's children," or so said the "prophet." The concept of randomness did not fit into the parameters of a "just world" where everything happens for a reason.

Being busy was an important feature of the life of a cultist. This provided us with the illusion of importance, which of course fueled the idea of us being the "called out elite army of the lord." (In retrospect, the absurdity is baffling.)

To keep us even more busy and stressed, we had intermittent "attack times." These were periods of weeks when we were on "high alert" for some perceived threat or imminent disaster. (I wonder if Berg was consciously aware of the effect that a high-stress atmosphere had on people; if not, he must have had an innate sense that it would serve his purpose.)

These attack times would usually be in the form of long hours on the streets, passing out pamphlets and collecting donations. In later years, this would leave those at home who cared for the children busy 24/7 caring for large numbers of children who were divided roughly by age. A room full of 12 toddlers night and day can be quite a challenge.

It was during one such period of impending doom, just before July 4, 1976, that I, at the tender age of 19, moved to the opposite side of the world in my quest to "go into all the world and preach the gospel to every creature." The farcical truth was that the "gospel" we were preaching was nothing more than the teachings of the cult - but we believed they were one and the same.

From time to time, deep down inside me, there was a hint of embarrassment about the cult's beliefs and a desire not to let on to outsiders how really weird some things were. Not a strong enough hint to even consider leaving, though.

Sunday, October 12, 2014

Just World

The just world bias was kept alive and well on a daily basis in a COG commune. Since we believed that "life is fair and God is good," i.e. the world is just, then when an accident or illness occurred, we were obligated to look for its cause so that we could learn "what God was trying to teach us." 

In fact, there is a part of us that wants things to happen for a reason. We want the world to be fair and just. It makes things simpler and easier to understand, giving us a false sense of security. This naturally appeals to us.

With a combination of biases at play, we looked for an underlying pattern to point to as the cause of, say, an accident. Since "life was fair," we assumed there was a causal connection that we had to discover: we had sinned and were therefore "reaping what we sowed." Together with the negativity bias - giving more weight to negative information than to positive - we "searched our hearts" to come up with something, some "lesson" we could learn, so we could be delivered, healed, or whatever.


If one looks hard enough for a pattern or causation, one will find it. Indeed, "seek and ye shall find." We find what we are looking for, and we ignore what we are not looking for.

Granted, there are real causes for such things as catching a cold, but to look for a deep spiritual transgression as the reason behind a given mishap is absurd. Yet that was just a run-of-the-mill happening in the life of a COG member.


This was carried to extremes when disasters happened to cities or countries. "God is judging them for their sins," of course. 


Just as a basketball player's "hot hand is entirely in the eye of the beholders"* so the causation of sin being behind bad things happening is entirely in the eye of the believers. 


* Thinking, Fast and Slow, Daniel Kahneman, page 117

Saturday, October 11, 2014

Filling in Blind Spots

Just as we have visual blind spots where our minds fill in the gaps according to expectation, so we have many mental biases that do the same to the information we take in, some of which I've mentioned already. 

Our minds easily recognize and concoct patterns, whether it be a face in the clouds, a ghost in the shadows, or a snake on the path. This is an obvious help in dangerous situations, as we would all rather mistake a stick for a snake than make the reverse mistake and suffer for it. (This concept was given the name "apophenia" by Klaus Conrad, and then built on and renamed "patternicity" by Michael Shermer.)

With the help of this and other natural mental shortcuts, our minds fit together pieces of the patterns and information that we take in to form an image or idea consistent with our personal expectations and our own sense of reality.

This innate tendency was exploited to a great degree by the COG, with conspiracies lurking behind any small event. There were reasons for events, and there was meaning in everything. "Everything happens for a reason," and behind everything was the great God of the universe pulling the strings. And thank God, we were the chosen few who really knew what was going on. His invisible hand was working in all our lives. All these thoughts continually reinforced the notions of being chosen, called out, and separate, pulling us further into the group.

The daily bombardment of reading and memorizing cult materials made it so much easier to perceive patterns in the world around us and give them meaning. We deemed those who had this ability more "spiritual" and "in-tune with God." They were strong in "faith." Well yes, because "believing" truly "is seeing." We see what we believe we see.

From the Illuminati, to the CIA killing Kennedy, we were steeped in conspiracy theories through the group's publications. Daily delusion spinning, serving to further pull me into the web of deceit and lies that were the foundation of the COG.

Friday, October 10, 2014

Dancing Pigeons

B.F. Skinner did a now-famous experiment on pigeons and their reactions to randomness. He deprived pigeons of food for a while, then placed them in cages with a system where a food pellet would be delivered at regular intervals. There was nothing the pigeon could do to control the food supply.

As time went on, the pigeons seemed to remember what they had done just before the food was delivered. Being hungry, they wanted to speed up the process, so they would repeat what they had done just before they got food. Next time a pellet dropped, they would again try to replicate what they had done just before it arrived.  

Before long, pigeons were turning in circles, stepping from foot to foot, tossing or bobbing their heads - all the while perceiving that their actions "caused" the food to drop. It was as if the pigeons' prayers had caused the answers they desired to come to pass.
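The mechanism behind this superstition is simple enough to sketch in a few lines of code. This toy model is my own simplification, not Skinner's apparatus: food arrives on a fixed timer no matter what the bird does, yet a learner that credits (and then repeats) whatever action immediately preceded the food quickly locks onto an arbitrary ritual.

    import random

    random.seed(1)
    ACTIONS = ["turn in circle", "bob head", "step foot to foot", "peck corner"]

    def superstitious_pigeon(ticks=300, interval=15):
        """Food drops every `interval` ticks regardless of behavior,
        but the bird credits whatever it happened to be doing."""
        favored = None  # the bird's current "lucky" action
        for t in range(1, ticks + 1):
            # Once it has a favored action, the bird mostly repeats it.
            if favored and random.random() < 0.8:
                action = favored
            else:
                action = random.choice(ACTIONS)
            if t % interval == 0:   # food arrives on a fixed timer...
                favored = action    # ...but the action gets the credit
        return favored

    print("Ritual the bird settled on:", superstitious_pigeon())

The food would have come anyway; the dance is pure illusory correlation.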

Just like a pigeon, I wanted the illusion of control in my life. Prayer provided just such an illusion. "Prayer is the hand of faith" that moves the heart of God. Oh, what power I had! How important my petty problems were that the God of the universe would take time to listen to me! How good that made me feel! I was important. I had value. I was God's child. 

I was as deluded as those pigeons.

Wednesday, October 8, 2014

Dissonance

The majority of people have been shown to judge themselves above average in intelligence, driving ability, and other attributes. This illusion of superiority makes it difficult for us to admit failure and leads us to blame others and/or rationalize our circumstances. "Surely there were extenuating reasons why I failed. It couldn't have been my fault; after all, I'm smart." Failures and mistakes do not correspond with our self-image.

(Conversely, if you are among the minority who think themselves below average, you reinforce that negative mental image by explaining away successes, "It was just a fluke." "Before you know it people will find out how truly stupid I really am.") 

Now that I am aware of this inherent tendency, I try to alter my perception of myself and try not to give in to thinking of myself as smart, but rather try to think of myself as one who desires to learn and persevere. This has its own set of accompanying biases, but I prefer those to the ones I'd held for too many years.

To admit that I had wasted years - yea, even decades - in a fruitless, nonsensical delusion, and worse yet, subjected the people I love the most (my children) to the same delusion, was a deeply wrenching experience, to put it mildly. As I wrote in my introduction, this is what prompted my search to understand how such a dreadful and far-reaching mistake could have been made in my life.

I wasn't the "good person" I thought I was, and I certainly wasn't very smart.

Sunday, October 5, 2014

The Paradoxical Effects of Unfulfilled Prophecy

Inevitably in following a self-proclaimed "prophet" there will be prophecies that will not be fulfilled. One would think that an obvious unfulfilled prophecy would enlighten the followers that something was amiss. Not exactly.

Initially, yes. Something unexpected occurred. The predicted event didn't happen. This didn't fit our concept of what should happen. The inconsistency of it strikes our mind with discomfort. Imagine the mental dissonance that is clamoring to be resolved! Enter rationalization - usually in the form of another letter from our founder explaining how the Lord had changed his mind, due to our prayers, or whatever self-serving reason he came up with.

Since we'd all already "burned our bridges" (a requirement for members which equaled cutting ties with former friends and family), we were deeply committed to the cause. We were true believers, and as such, we unconsciously wanted to believe. And as we know, we see what we believe. (The confirmation bias is ever with us.)

The COG is just one among many cults and religious groups that have dealt with unfulfilled prophecies in this way. Reasons given for why the prophecy didn't come true can really be any ridiculous explanation, such as, "It did happen, but it was a spiritual occurrence invisible to us." Or, "It didn't happen because God saw your faithfulness and decided to reward your prayers by bestowing mercy on the world," fitting right in with the illusion of group importance.


Then, in order to further decrease the discomfort of the dissonance that we had earlier experienced, a new proselytizing campaign would begin, spreading the word of God's mercy to mankind in sparing them from whatever catastrophe didn't occur. 

So, paradoxically, what would seem like a wake-up call to reality actually serves to galvanize the faith of the true believers in the prophet and the group.

Saturday, October 4, 2014

Delusion

As I have written, once a decision is made, the information we get afterwards tends to only confirm it in our minds.

This is sadly true even when it comes to judicial decisions. Ever since DNA testing freed the first wrongly imprisoned innocent man, groups like the Innocence Project have been fighting a hard battle. Why is it so hard? Isn't the judicial system seeking truth and justice? Well, yes and no. Even more important than justice is the mind's internal, and nearly uncontrollable, desire for consistency and self-justification. "I'm an honest judge, and I would never punish the innocent." There is a natural resistance to changing one's mind on a verdict, even when presented with definitive evidence.

In my case, I was the honest, truth-seeking judge who had made the decision to join the COG. I would not change my mind. Ironically, I was also the prisoner of that decision, kept in a mental prison, unable to see past the bars.

This delusion I shared with the other COG members as the "called-out, elite soldiers of Christ." If it had been only me, perhaps I would have been deemed mentally ill. Surrounded by so many like-minded people, it was a group delusion.

Our minds have a reality testing feature that is very handy. This is turned off while we dream, which explains why we can believe dreams when we experience them, and then afterwards wonder at how bizarre they were. Much like a dream, being part of a group delusion hampers reality testing.

Of crucial importance to cult members was the concept of "surrender." I was oft-corrected for being willful - clearly a serious sin - and chided to "surrender my will to the Lord" (aka the whims of the leaders). Turning off my will equaled turning off critical thinking. It took me years of trying my best to attain that level of "dedication."

Friday, October 3, 2014

Self-serving Bias

By default, we protect our ego. Our image of ourselves is shielded from the dissonance of conflicting ideas by the lies we unconsciously tell ourselves.

Curiously, research has shown that the majority of people think themselves above average in many categories. But clearly, most people cannot be above average, and plenty of people fall prey to cons and deceits every day. "I'm not the sort of person who would..." join a cult and follow a leader mindlessly, fall for a Ponzi scheme, etc. "I'm smart. In fact, I'm smarter than average."

This tendency to think of ourselves as above average is known as illusory superiority, and it is yet another cognitive bias that we employ to rationalize our decisions. I personally think it is largely responsible for people remaining in cults, bad situations, abusive relationships, and the like. We tell ourselves, "I'm not the sort of person who would stay in an abusive situation, so it really can't be that bad," and we dismiss any contradictory evidence. We do this without realizing it, to protect our ego and remain consistent in our self-image.

Working in cahoots with this illusory superiority is the self-serving bias. It is ever-present to protect our idea of ourselves as a good person, a smart person, and in my case, a sacrificial, dedicated Christian. Combine this with the availability of all the information I was bombarded with about how noble the cause, how unselfish my motivation, and how correct my life-choice, and my narrative had plenty of confirming evidence on which to feed.

In fact, I had plenty of "good reasons" to continue "fighting the good fight of faith as a loyal follower of Jesus."

Thursday, October 2, 2014

Programmed to Believe

By nature, we are believing creatures. We have evolved to err on the side of caution as a survival mechanism. To borrow Michael Shermer's oft-used explanation, our ancestors survived if they assumed the rustle in the bushes was a predator and fled, even if it wasn't. If they had figured it was just the wind when it actually was a predator, their DNA would have perished and we would not have been born.
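Shermer's point can be put as simple expected-cost arithmetic. With made-up but plausible numbers (mine, purely for illustration): if a false alarm costs almost nothing and a miss can be fatal, the "always believe the rustle is a predator" strategy wins even when it is almost always wrong.

    # Hypothetical costs, for illustration only.
    COST_FALSE_ALARM = 1     # fleeing from what was only the wind
    COST_MISS = 1_000        # ignoring a real predator
    P_PREDATOR = 0.05        # fraction of rustles that really are predators

    # Expected cost per rustle for each strategy:
    always_flee = COST_FALSE_ALARM          # you flee every time: 1.0
    never_flee = P_PREDATOR * COST_MISS     # you gamble: 50.0
    print(always_flee, never_flee)

Under these numbers, the jumpy believer pays fifty times less per rustle than the calm skeptic. Natural selection, on this account, favored the believer.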

Likewise, we readily believe, ascribing causation and reason to events and things. These beliefs (as well as decisions) are generally reached through purely emotional impulses.


Once a belief is formed, we intuitively filter all incoming data through our biases to fit our belief system. We have our idea of what the world is like, and then we find confirming information in the data we take in.


If we happen to come across contradictory ideas, we experience the discomfort of cognitive dissonance. This must be resolved, so in comes the rationalization of why that information isn't important or true, or perhaps we twist the information to suit our narrative. (This is how news stories can be seen to support both sides of the political spectrum depending on how they are spun by the media.)


Once that contradictory information is explained to ourselves, we are mentally and emotionally rewarded with a feeling of pleasure. Our mental equilibrium has been restored.


Another technique for resolving dissonance, commonly used by the COG to clear up contradictions, was to advise members who were in doubt about any belief to "wrap it up in a bundle of faith" and set it aside. This is a clear application of our mental inclination to compartmentalize. We can manage like this, but of course, the most emotionally satisfying method is rationalization, as mentioned above.


We are very good at inventing reasons to justify our beliefs. And ironically, it appears that more intelligent people are even better at this. That is why you may see very smart people who believe very strange things.


As an aside, I will bring up the interesting case of Sir Arthur Conan Doyle, whom the world remembers for his rational, logical character, Sherlock Holmes. Perhaps not so well known was his avid interest in the supernatural and mysticism. His friend, the famous Harry Houdini, tried to convince Doyle that mystics employed tricks just as he did, yet bizarrely, Doyle chose to believe that Houdini himself possessed supernatural powers. Doyle even fell for the childish hoax of the Cottingley Fairies - cardboard cutouts of fairies posed with an amateur photographer's daughters - going so far as to use the photographs as illustrations in an article about fairies he wrote for The Strand, and later in his book, The Coming of the Fairies, published in 1922.

Wednesday, October 1, 2014

Confirmation Bias

As much as we may like to think that our minds are like video cameras, accurately recording what comes through our senses, this is very far from the truth. Our brains actively and continuously construct our own reality. We pick and choose, according to beliefs and biases, the information that pleases us, and ignore the rest. All this is done automatically.

The minds of all of us, not just members of religious groups, construct narratives about ourselves, our beliefs, and our lives. We tell ourselves what kind of person we are, and unconsciously fill in the details to fit the pattern we have decided upon, rejecting or ignoring what doesn't fit. We feel a need for consistency and an aversion to the mental discomfort of cognitive dissonance - holding two conflicting ideas in our minds, or being confronted with evidence that contradicts our beliefs.

When we accept information that confirms our image of ourselves/our beliefs, we are mentally rewarded. Finding confirming evidence is pleasant. (Why do liberals and conservatives choose to watch their favorite pundits instead of those who present views that conflict with their beliefs? Is it because they value an unbiased presentation of facts?)

This tendency to accept and remember what confirms our beliefs, and reject, ignore, or actually not even notice information that does not, has been called the confirmation bias. This has been a hugely important factor in my life, being partly responsible for my years spent in a bizarre cult. 

Oh, but that wasn't the only one. There are plenty more interesting biases that were at play in my psyche.