Hello all! Blog introductions are always awkward, so I’ll just say that my name is Mike Wood and I’m another contributor to this blog. I’m currently a Ph.D. student at the University of Kent, working with Karen Douglas alongside Dan. My research is mainly about conspiracy theories as belief systems or ideology, as well as how people come to believe or disbelieve them and how they affect the way people deal with the world.
I’ve been interested in the world of conspiracy since I was fairly young and I spend a lot of time reading about this on the Internet, so I know a good deal about most of the theories that are out there, as well as some of the communities and social movements that are associated with them. It’s really fascinating stuff, and it’ll probably be a theme in my posts here.
Today I’ll start off with reference to a classic in psychology, the seminal 1956 book When Prophecy Fails by Leon Festinger, Harry Riecken, and Stanley Schachter. It details the rise and fall (mostly fall) of an apocalyptic UFO cult, and charts the members’ reactions to failed prophecies of doom using cognitive dissonance theory. It’s also one of the books that first got me really interested in psychology when I read it during my undergrad years – equal parts thought-provoking, disturbing, and hilarious. Today I’ll talk a bit about the cult in the book and a similar situation with a recent conspiracy theory or two, and in a later post I’ll probably look at its relationship to the wider world of new religious movements and conspiracy theories.
The leader of the small cult was Dorothy Martin, who in the book was given the pseudonym Marion Keech in order to protect her identity. She was an automatic writer – a kind of channeler or medium. Rather than having other entities use her voice to speak, they took control of her hands and wrote out their messages with pen and ink – at great length, according to Festinger. The entities themselves were aliens from a distant planet called Clarion; the messages came primarily from a fellow called Sananda, who was probably actually Jesus, and they warned of great cataclysms that were to come to Earth in short order. Only a few would be saved – and joining Martin’s cult was the only sure way to be among those lucky enough to be picked up by UFOs and carried away before the flooding started.
Sananda and friends made a specific prediction of the apocalypse through their communication with Martin – that it would occur on a certain date (a few days before Christmas) and in a certain way. When the apocalypse didn’t happen as predicted, the waiting cult members seemed energized by the failed prediction and went after new converts with renewed fervor. It wasn’t until a couple more failed prophecies that the cult finally fell apart under the strain. Festinger explained this pattern as a result of cognitive dissonance – the inner circle of members had committed so much of themselves to their beliefs that they would not abandon them easily, and in fact increased the degree to which they put forth social signifiers of belief.
This is a pattern that has repeated many, many times – an initial prediction failure is rationalized and does little to dissuade the most committed believers, but repeated disconfirmations push people away and eventually lead to the dissolution of the movement. The same thing happened with the Great Disappointment of the Millerites in the 1800s; Nancy Lieder’s Nibiru cataclysm cult in 2003, one which bears very substantial similarities to the cult of Dorothy Martin; and most recently, the church of Harold Camping, who predicted that the Rapture would come in May of 2011, and then October of the same year when that failed. The more canny predictors, like Lieder, eventually stop giving specific dates and instead turn to vague predictions of doom once their hard core of members is established.
Dissonance theory has a lot to say about failed predictions, and a perfect example of its applicability comes in the form of the aftermath of the predictions of doom at the London Olympics. If you’re a watcher of the conspiracy world, you probably saw lots of theories being thrown around earlier this year about how there would be a false flag terror attack. All the signs were there, just like 9/11 – date numerology! “Predictions” in popular media! Weird symbology! Unfortunately it turns out these things are mostly just good for retrodiction rather than prediction – nothing like a terror attack happened at the London Games, despite some strange artistic choices at the opening ceremonies that sent hypervigilant symbologists into a tizzy. Other failed predictions about the London Games included a fake alien invasion in the style of Project Blue Beam and something vaguely related to Israel (cf. the “ZION” interpretation of the logo).
So how did the people who made these predictions react to the dissonance caused by their failure? Many of the most popular “upcoming false flag terror attack” YouTube videos were taken down by their authors, probably in part because the deluge of smug comments became too much to bear. Others maintained that they had never said that there would definitely be a false-flag terror attack at the London Games, just that if there were a “terrorist” attack, it would necessarily be a false flag (but it’s debatable whether the people who said that wouldn’t just assume any terror attack was a false flag anyway). Probably the most common rationalization was that since people had decoded the signs and symbols of the conspirators, they had been forced to cancel their plot. This is pretty much what happened with Dorothy Martin’s cult – when the apocalypse failed to come, a new communication from the aliens arrived, saying that Martin’s group had “spread so much light” that God decided to cancel his judgment on Earth.
Each of these reduces dissonance in a different way. Removing all evidence that you’d ever predicted an attack in the first place reduces the chances that others will remind you of it, making it easier to ignore or discount your own prediction; revising the prediction so that it’s not really wrong allows you to feel okay about it; and claiming that your prediction actually prevented the catastrophe in question is best of all. You, too, can be the hero of your very own Da Vinci Code – and all it takes is reposting some predictions and explaining them away when they don’t work out. It’s the logical extension of Internet slacktivism – the conspiracy theory equivalent of cryptic Facebook statuses claiming to raise awareness for breast cancer. What 21st century soothsayer could resist?