Buyer’s remorse. Sour grapes. Rationalization. Self-justification. How do we feel after we’ve made a decision that was hard and came with significant consequences?
I recently quit a reputed startup, choosing what I want to work on over a substantial sum of money. Now that the decision is behind me and a few weeks have gone by, how do I feel?
I can say the honeymoon phase is still on, i.e. I feel great and haven’t once regretted my decision. Curiosity about my own feelings, and about how people in general react in situations similar to mine, led me to revisit the literature I had encountered, recently and in the past.
Just about every one of us knows Aesop’s famous fable of the fox and its case of sour grapes. A fox salivates at the sight of a vineyard full of ripe, juicy grapes. Despite its best efforts, it cannot reach them. It leaves the vineyard in a huff, consoling itself with the thought that the grapes may have been sour and that its loss is not great.
It is one of the most popular stories illustrating cognitive dissonance, a concept that has its origins in the study of an alien-worshipping cult. Leon Festinger, a social psychologist, was curious about how people react when, having acted on a deeply held belief at significant cost, they find that the belief was wrong. He and two associates infiltrated a cult that believed the world would end on December 21, 1954. A number of the followers quit their jobs, left unbelieving spouses, sold off their possessions, gave away all their money, and assembled at the house of the cult leader, one Marian Keech, in anticipation of an alien spacecraft that would rocket them to safety as the world ended. The predicted time came and went and nothing happened. The world continued on its merry way.
A couple of hours later, the cult leader announced with great fanfare that she had received word from God that he had spared the world because of the actions and beliefs of the little group assembled in her living room. The group went from despondent to euphoric. They also went from secretive to proselytizing. Where they had felt no need to discuss the end of the world with others before December 21, they now felt the need to report the miracle of their survival to the world and to ask others to join the cult.
Leon Festinger went on to coin the term “cognitive dissonance” to define the emotional state that arises from holding two conflicting ideas. In the case of the cult, the conflicting ideas were that they had been duped and that their beliefs were the truth. Rather than admit to being wrong and deal with the enormous consequences of that (no job, no spouse, no house, what life?), they chose to proselytize their beliefs, as if convincing others would be sufficient to prove that they were right after all.
Festinger wrote a book about all of this, now a psychology classic, called “When Prophecy Fails”. Marian Keech, whose real name was Dorothy Martin, went on to found another cult and assume other names. She was known by the name of Sister Thedra when she died in 1992, as part of the cult that she founded called Association of Sananda and Sanat Kumara.
Aesop, that fantastic fabler of foibles, also fabled the after-effects of the cult in another story involving another fox (or was it the same one?). A fox that gets its tail cut off and becomes the laughing stock of the other animals sets about convincing them to cut their tails off too, because not having a tail is such a convenience!
Sororities and fraternities around the world (and military academies) enact a variation of this self-justification every year when they put their new members through hazing rituals. Studies show that the societies with the worst hazing rituals have the most fanatical followers. This attempt at resolving cognitive dissonance (was it worth the humiliation of those rituals to join this group?) is called effort justification and was demonstrated by Elliot Aronson and Judson Mills in 1959.
Cognitive dissonance has been called the engine of self-justification by Carol Tavris and Elliot Aronson in their popular 2007 book, “Mistakes Were Made (But Not By Me)”. Aronson says in an interview with NPR:
“Dissonance is disquieting because to hold two ideas that contradict each other is to flirt with absurdity and, as Albert Camus observed, we humans are creatures who spend our lives trying to convince ourselves that our existence is not absurd. At the heart of it, Festinger’s theory is about how people strive to make sense out of contradictory ideas and lead lives that are, at least in their own minds, consistent and meaningful. The theory inspired more than 3,000 experiments that, taken together, have transformed psychologists’ understanding of how the human mind works. Cognitive dissonance has even escaped academia and entered popular culture. The term is everywhere. The two of us have heard it in TV newscasts, political columns, magazine articles, bumper stickers, even on a soap opera. Alex Trebek used it on Jeopardy, Jon Stewart on The Daily Show, and President Bartlet on The West Wing. Although the expression has been thrown around a lot, few people fully understand its meaning or appreciate its enormous motivational power.”
How We Process the Sacred and the Mundane
A new study titled “The price of your soul: neural evidence for the non-utilitarian representation of sacred values” concludes that different parts of the brain are active depending on how we decide. If the decision is based on cost-benefit analysis, one part of the brain is active; a different part is more active if the choice is made on non-utilitarian principles. Here is how Gregory Berns, the lead author of the study, sets up the problem:
“Why do people do what they do?” said neuroscientist Greg Berns of Emory University. “Asked if they’d kill an innocent human being, most people would say no, but there can be two very different ways of coming to that answer. You could say it would hurt their family, that it would be bad because of the consequences. Or you could take the Ten Commandments view: You just don’t do it. It’s not even a question of going beyond.”
Dr. Berns and his team designed an experiment in four phases. In the first phase, the participants were asked whether they agreed or disagreed with each of 114 statements. The statements ranged from the pedestrian – you’re a Mac person, you like white wine – to the sacred – you’re willing to kill an innocent human being, you think it’s OK to sell a child, you believe in God. Next, the statements were presented as complementary pairs (you believe in God & you don’t believe in God, you’re a Mac person & you’re a PC person, etc.) and the participant was asked to choose one from each pair. In the third phase, the participant was asked if they were willing to switch their answer from the previous phase in return for money. For example, “Is there a dollar amount that you would accept to disavow your belief in God for the rest of your life?” In the final phase, the participants were given the chance to make money by selling their answers from the second phase. A participant could opt out of being paid for a statement, or could specify the amount they were willing to accept ($1–$100) in return for changing their answer (i.e. selling their soul). If the participant asked for $1 in exchange for changing the answer, they were assured of getting some amount, usually $50. If instead they asked for $100, there was a 1% chance of getting that amount (the amount and probability were determined by rolling a pair of 10-sided dice). They were also told at the beginning of this final phase that they’d be given a printout of their final choices and had to sign their name to the paper. They could then keep the paper and the money they had won; no one else would know they had sold out on their principles.
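The payout lottery of that final phase can be sketched in a few lines of Python. The linear mapping from the amount requested to the odds of winning is an assumption consistent with the two endpoints described above (asking $1 is a near-sure payout, asking $100 pays out 1% of the time); the study’s exact rule may differ.

```python
import random

def is_paid(amount_requested, rng=random):
    """Return True if the lottery pays out for a given ask ($1-$100).

    Sketch only: assumes the chance of being paid falls linearly
    from 100/100 at a $1 ask to 1/100 at a $100 ask.
    """
    # A pair of 10-sided dice gives a uniform roll from 1 to 100.
    roll = rng.randint(0, 9) * 10 + rng.randint(0, 9) + 1
    return roll <= 101 - amount_requested
```

Under this assumed rule, a $1 ask always pays, while over many trials a $100 ask pays out roughly once in a hundred, matching the 1% chance the participants were quoted.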
They signed up 43 on-site participants. Probably to counter the WEIRD charge (that psychology subjects skew Western, Educated, Industrialized, Rich, and Democratic), the team also signed up an additional 391 people online to participate in the study. The participants were a roughly equal mix of men and women, between 21 and 69 years of age, all in good health with no history of psychiatric or neurological disorders. Most were well educated (over 80% had at least some college education). Ethnically, most were Caucasian (76%), while African-Americans and Asians made up about 10% and 6% of the respondents. 32 of the participants were inside an fMRI scanner during the first three phases of the experiment.
The distribution of statements that people were willing to sell out on was bimodal, i.e. people were either willing to sell out an answer for very little ($1) or not at all (opting out). Also, some participants were willing to sell out on just about everything, while others were unwilling to sell out on anything but a handful of statements.
Dr. Berns and his team found that for statements participants were unwilling to sell out on, there was increased activity in the regions of the brain associated with evaluating right and wrong behavior (the left temporoparietal junction) and with semantic rule retrieval (the left ventrolateral prefrontal cortex), and not in the regions associated with reward or cost-benefit analysis. Furthermore, when participants evaluated the statement opposite to one of their inviolate beliefs, there was increased activity in the amygdala, a part of the brain known for processing the negative emotional reactions that arise when one’s values are violated.
A follow-up survey of the 32 on-site participants who had been through the fMRI, conducted 6–14 months after the initial testing, showed that their answers remained stable.
The article that accompanied the report quotes Dr. Berns:
“Most public policy is based on offering people incentives and disincentives,” Berns says. “Our findings indicate that it’s unreasonable to think that a policy based on costs-and-benefits analysis will influence people’s behavior when it comes to their sacred personal values, because they are processed in an entirely different brain system than incentives.”
And So …
According to the Wikipedia article on cognitive dissonance:
Cognitive dissonance theory warns that people have a bias to seek consonance among their cognitions. According to Festinger, we engage in a process he termed “dissonance reduction”, which he said could be achieved in one of three ways: lowering the importance of one of the discordant factors, adding consonant elements, or changing one of the dissonant factors.  This bias gives the theory its predictive power, shedding light on otherwise puzzling irrational and even destructive behavior.
In my case, lowering the importance of one of the discordant factors would be, for example, to say that the money is not important anyway; adding consonant elements might be to say that I could make that kind of money by joining another venture; and changing one of the dissonant factors might be to say that I might not have made that much money anyway.
So, do I suffer from cognitive dissonance as a result of my choice to quit my job? I can’t say that I have spent much time mulling over my choice. Nor have I felt any of the feelings (guilt, dread, embarrassment) that are supposed to be harbingers of dissonance.
Was my decision based on cost-benefit analysis, or made because I felt some inviolate principle had been threatened? Not being inside an fMRI scanner, I can’t really say. Were larger sums of money involved, would I have decided differently? I’d like to say no, but that would be hard to substantiate, since we can’t run the experiment again, not in any true fashion. Furthermore, if I had difficulty putting food on the table or didn’t feel financially secure, I might have decided differently. And I’m not planning to work for free (if I choose to get employed). I can only say that, given my current state, who I am, and what I want to be, I’d have been disappointed with myself had I chosen a different outcome than the one I chose.
I shall be telling this with a sigh
Somewhere ages and ages hence:
Two roads diverged in a wood, and I—
I took the one less traveled by,
And that has made all the difference.
– The Road Not Taken, Robert Frost