The following is a contribution to the JREF’s ongoing blog series on skepticism and education. If you are an educator and would like to contribute to this series, please contact the JREF.

“I’m an insulin-dependent diabetic. Twice a day I take synthetically manufactured insulin that still contains some animal products — and I have no qualms about it… I’m not going to take the chance of killing myself by not taking insulin. I don’t see myself as a hypocrite. I need my life to fight for the rights of animals.” --PETA Senior Vice President MaryBeth Sweetland on her use of insulin, which was tested on animals

Many years ago, while teaching my first college-level course in Human Sexuality, I was having a bit of difficulty dealing with two students who seemed always to be at each other’s throats. Each time a controversial topic (e.g., abortion, homosexuality, pornography) was covered, their open disagreements escalated into full-blown arguments. After a few weeks of this I’d had enough, and I executed a plan designed to teach them (and the rest of the class) to expand their horizons and find common ground. I asked all of the students to write a short paragraph expressing their positions on a list of ten sex-related topics, and to sign it before turning it in. They were then instructed to write a research paper in which they attempted to support the opposite of their own opinion on one of the topics in the list. This is an old trick, and I expected a bit of resistance and protest, but I had faith that most would follow through and learn from the experience.

A strange thing happened after the papers were submitted, however. Inexplicably, a few of the students thanked me for the assignment and said that they had changed their minds about the topic they chose for their papers. I asked whether they had learned something new about the topic, or whether they had been “on the fence” in the first place. Many said that they really hadn’t learned much and that they already knew their opponents’ positions, but they just felt differently after writing the paper. Although this reaction came from a relatively small percentage of students, and there are many possible explanations for their reports of attitude change, I couldn’t help wondering if I might have inadvertently “brainwashed” my students! A better explanation, however, can be found in the literature on cognitive dissonance.

Coined in the mid-1950s by psychologist Leon Festinger in his very popular book, When Prophecy Fails, cognitive dissonance refers to the psychologically uncomfortable state of holding two conflicting cognitions (thoughts, opinions, beliefs, memories, etc.). Dispelling many of the myths about so-called brainwashing (which assumes that others can directly change our cognitions), Festinger found that people engage in cognitive shifts designed to reduce the unpleasantness they experience when their cognitions contradict each other. These reactions can range from mild annoyance to mental torment, and Festinger suggested three possible routes to reducing the dissonance that creates them: 1) change the behavior that creates the dissonance; 2) add new cognitions that reduce the dissonance; or 3) change the social environment that reinforces the dissonance. Although I could never be certain, I suspected that the second route might have occurred in the students who had inexplicably changed their minds after turning in their papers. After all, they could not go back in time and alter the fact that they had written the paper and expressed certain things as true; what they could change, however, were their beliefs. Once those two elements were consonant, their cognitive dissonance was eliminated and they felt better.

The history of Cognitive Dissonance Theory is as interesting as the theory itself. The tale begins in the mid-1950s, when Festinger and his colleagues infiltrated a UFO cult led by Dorothy Martin, a Chicago housewife and former devotee of Scientology founder L. Ron Hubbard. Based upon messages she claimed to receive through spirit writing, Martin and her followers believed that the world was about to end and that the group’s adherents would be the only survivors. As Festinger predicted, when the end-of-the-world prophecy failed, the most faithful among them became even more committed to their leader and her teachings, because accepting the truth (that they had been wrong to give up their lives to follow her in the first place) was simply too psychologically distressing.

Although most of your students will, thankfully, never join a cult, they can all relate to holding some belief as a sacred cow about which they fail to think critically. Some might justify their cigarette smoking by stating that it prevents them from gaining an unhealthy amount of weight. Others may refuse to recycle, claiming that corporations and large factories pollute far more than they ever could as individuals. Still others might justify their religious beliefs in the face of evidence that their religious texts show no sign of divine inspiration, arguing that the books were written by fallible men who could have inserted their own ideas and/or mistranslated certain sections. Everyday examples such as these help me begin a discussion that allows my students to relate directly to people they might otherwise perceive as gullible or foolish.

After encouraging them to relate personally to the topic, I ask my students to consider how cognitive dissonance might help explain why our offers to educate, or to demonstrate errors in others’ reasoning, are sometimes ineffective. I try to guide them toward seeing how petty and insignificant factual information might seem to “believers” when compared to:

  • the chance that they might lose important connections to friends and family, or even their entire worldview (religion),

  • the idea that they must fully accept the death of a loved one, or abandon the belief that the future can be foreseen (psychics),

  • or even the relatively mild notion that they have wasted money or time on a useless product (Power Balance bracelets).

At this point, with a coherent framework established, students are usually prepared to approach the published literature on cognitive dissonance. Luckily, a wealth of studies and books is now available for them. As my colleague Carol Tavris and her co-author Elliot Aronson point out in their wonderful book Mistakes Were Made (But Not By Me), Festinger’s theory has been incredibly fruitful, inspiring over 3,000 published experiments and revolutionizing our understanding of human psychology. Tavris and Aronson have followed in Festinger’s footsteps, pointing out how cognitive dissonance affects many facets of our daily lives, influencing politics, war, marriage, prejudice, science, and countless other areas. In addition to Tavris and Aronson’s book, some of the published scientific literature can serve as good starting points for discussions with students about the power of cognitive dissonance. Among these are:

  • Research on the role cognitive dissonance plays in criminal investigations, and on how law enforcement and prosecutors can become biased in favor of evidence that confirms their beliefs rather than evidence that is true. (e.g., Ask, K., Reinhard, M., Marksteiner, T., & Granhag, P. A. (2011). Elasticity in evaluations of criminal evidence: Exploring the role of cognitive dissonance. Legal and Criminological Psychology, 16, 289-306.)

  • Brain scan (fMRI) studies suggesting that decision-related attitude changes are associated with increased activity in specific parts of the brain and occur rapidly, before a person has an opportunity to deliberate between options. (e.g., Jarcho, J. M., Berkman, E. T., & Lieberman, M. D. (2011). The neural basis of rationalization: Cognitive dissonance reduction during decision-making. Social Cognitive & Affective Neuroscience, 6(4), 460-467.)

  • Work explaining what differentiates those who join terrorist organizations from others who hold similar social and political views but never turn to acts of violence. (e.g., Maikovich, A. K. (2005). A new understanding of terrorism using cognitive dissonance principles. Journal for the Theory of Social Behaviour, 35(4), 373-397.)

  • Investigation of the effects of dissonance reported by lesbians who identify as Christians, and the strategies they employ to deal with those feelings of dissonance. (e.g., Mahaffy, K. A. (1996). Cognitive dissonance and its resolution: A study of lesbian Christians. Journal for the Scientific Study of Religion, 35(4), 392-402.)

 

References:

Festinger, L., Riecken, H. W., & Schachter, S. (1956). When Prophecy Fails: A Social and Psychological Study of a Modern Group that Predicted the Destruction of the World. University of Minnesota Press: Minneapolis, MN.

Tavris, C., & Aronson, E. (2007). Mistakes Were Made (But Not By Me). Harcourt Press: Orlando, FL.

 

Sheldon W. Helms is an associate professor of psychology at Ohlone College in Fremont, CA. He has taught psychology for more than 16 years, and teaches a wide range of topics including Abnormal Psychology, Experimental Psychology, Social Psychology, and Human Sexuality. He serves on the Board of Directors for the Bay Area Skeptics and is the founder of the Ohlone Psychology Club Speaker Series, through which he regularly hosts top name speakers in science and skepticism.