When the levees break: Hurricane conspiracy theorising

Preparing for Hurricane Sandy, October 2012

Hurricane Katrina remains one of the worst natural disasters to occur on U.S. soil. It’s estimated that at least 1,833 people were killed in the hurricane and subsequent floods, and property damage was in the region of $81 billion. Conspiracy theories about Katrina flourished almost immediately, and remain prevalent to this day. Some theories merely allege that the levees were deliberately destroyed in an act of profiteering or ethnic cleansing; others claim that the hurricane was conjured up by the Bush government using secret military weather manipulation technology.

Now Hurricane Sandy is set to pass directly through New York City and continue north towards Canada. Because of its record-breaking size, much of the north-eastern United States, from the East Coast to the Great Lakes, will be affected. Extensive damage is being predicted. It should come as no surprise that some familiar old conspiracy theories are being dusted off and updated for 2012. Could Obama be using weather modification technology to turn the upcoming election in his favour? Professional conspiracy theorist Alex Jones is calling Sandy the ‘engineered storm of the century’.

Will these theories remain popular after Hurricane Sandy has passed, like the Katrina theories have? I suspect it depends on how severe the consequences of Sandy are. Just over a year ago, Hurricane Irene threatened to do vast amounts of damage to New York City; the media were predicting major devastation and long-lasting consequences for the entire U.S. economy. In the days before Irene hit, the same kind of conspiracy theories were being passed around internet forums. Fortunately, Irene weakened and ultimately passed through New York without causing the chaos that had been predicted. The conspiracy theories fizzled out too.

Why have Hurricane Katrina conspiracy theories endured so much better than Hurricane Irene theories? It may have something to do with the proportionality bias. Patrick Leman referred to this psychological phenomenon as the ‘major event–major cause’ heuristic. We are unconsciously biased towards seeking proportionally grand, intricate, complex, and significant causes for grand, intricate, complex, and significant events. Explanations which meet this rule of thumb may seem more plausible and appealing than explanations which defy the intuitive logic of the proportionality bias. In his research, Leman found that conspiracist explanations were more likely to be endorsed for the successful assassination of a President than for an assassination attempt which failed. In the real world, conspiracy theories about J.F.K.’s assassination have become mainstream while conspiracy theories about the unsuccessful attempt on Reagan’s life are few and far between. It’s easy to believe a lone, unknown, and possibly unbalanced individual can wound the most powerful person on the planet. But such an insignificant cause may seem insufficient when it comes to killing the most powerful person on the planet; a long-running, systematic, and nefarious conspiracy can begin to seem more plausible. The death of Princess Diana, the 9/11 attacks, Hurricane Katrina: these momentous and shocking events have all produced enduring conspiracy theories, while the Reagans and Irenes have been forgotten.

In reality, small causes sometimes have large effects. In the world of conspiracy theories, large effects demand a conspiracy. If Hurricane Sandy has the devastating effects that some are predicting, it’s entirely possible that conspiracy theories will flourish. If the devastation fails to materialise, the conspiracy theories will be forgotten. Personally I’m hoping for the latter.


When Prophecy Fails in the Digital Age

Hello all! Blog introductions are always awkward, so I’ll just say that my name is Mike Wood and I’m another contributor to this blog. I’m currently a Ph.D. student at the University of Kent, working with Karen Douglas alongside Dan. My research is mainly about conspiracy theories as belief systems or ideologies, as well as how people come to believe or disbelieve them and how they affect the way people deal with the world.

I’ve been interested in the world of conspiracy since I was fairly young, and I’ve spent a lot of time reading about it on the Internet, so I know a good deal about most of the theories that are out there, as well as some of the communities and social movements that are associated with them. It’s really fascinating stuff, and it’ll probably be a theme in my posts here.

Today I’ll start off with reference to a classic in psychology, the seminal 1956 book When Prophecy Fails by Leon Festinger, Henry Riecken, and Stanley Schachter. It details the rise and fall (mostly fall) of an apocalyptic UFO cult, and charts the members’ reactions to failed prophecies of doom using cognitive dissonance theory. It’s also one of the books that first got me really interested in psychology when I read it during my undergrad years – equal parts thought-provoking, disturbing, and hilarious. Today I’ll talk a bit about the cult in the book and a similar situation with a recent conspiracy theory or two, and in a later post I’ll probably look at its relationship to the wider world of new religious movements and conspiracy theories.

The leader of the small cult was Dorothy Martin, who in the book was given the pseudonym Marion Keech in order to protect her identity. She was an automatic writer – a kind of channeler or medium. Rather than using her voice to speak, the entities took control of her hands and wrote out their messages with pen and ink – at great length, according to Festinger. The entities themselves were aliens from a distant planet called Clarion; the messages came primarily from a fellow called Sananda, who was probably actually Jesus, and they warned of great cataclysms that were to come to Earth in short order. Only a few would be saved – and joining Martin’s cult was the only sure way to be one of those lucky enough to be picked up by UFOs and carried away before the flooding started.

Sananda and friends made a specific prediction of the apocalypse through their communication with Martin – that it would occur on a certain date (a few days before Christmas) and in a certain way. When the apocalypse didn’t happen as predicted, the waiting cult members seemed energized by the failed prediction and went after new converts with renewed fervor. It wasn’t until a couple more failed prophecies that the cult fell apart under the strain. Festinger explained this pattern as a result of cognitive dissonance – the inner circle of members had committed so much of themselves to their beliefs that they would not abandon them easily, and in fact displayed their belief even more publicly than before.

This is a pattern that has repeated many, many times – an initial prediction failure is rationalized and does little to dissuade the most committed believers, but repeated disconfirmations push people away and eventually lead to the dissolution of the movement. The same thing happened with the Great Disappointment of the Millerites in the 1800s; with Nancy Lieder‘s Nibiru cataclysm cult in 2003, which bears substantial similarities to the cult of Dorothy Martin; and, most recently, with the followers of Harold Camping, who predicted that the Rapture would come in May of 2011, and then in October of the same year when that failed. The more canny predictors, like Lieder, eventually stop giving specific dates and instead turn to vague predictions of doom once their hard core of members is established.

Dissonance theory has a lot to say about failed predictions, and a good recent example is the aftermath of the predictions of doom surrounding the London Olympics. If you’re a watcher of the conspiracy world, you probably saw lots of theories being thrown around earlier this year about how there would be a false flag terror attack at the Games. All the signs were there, just like 9/11 – date numerology! “Predictions” in popular media! Weird symbology! Unfortunately it turns out these things are mostly just good for retrodiction rather than prediction – nothing like a terror attack happened at the London Games, despite some strange artistic choices at the opening ceremonies that sent hypervigilant symbologists into a tizzy. Other failed predictions about the Games included a fake alien invasion in the style of Project Blue Beam and something vaguely related to Israel (cf. the “ZION” interpretation of the logo).

So how did the people who made these predictions react to the dissonance caused by their failure? Many of the most popular “upcoming false flag terror attack” YouTube videos were taken down by their authors, probably in part because the deluge of smug comments became too much to bear. Others maintained that they had never said that there would definitely be a false flag terror attack at the London Games, just that if there were a “terrorist” attack, it would necessarily be a false flag (though it’s debatable whether the people who said that wouldn’t have assumed any terror attack was a false flag anyway). Probably the most common rationalization was that since people had decoded the signs and symbols of the conspirators, the conspirators had been forced to cancel their plot. This is pretty much what happened with Dorothy Martin’s cult – when the apocalypse failed to come, a new communication from the aliens arrived, saying that Martin’s group had “spread so much light” that God had decided to cancel his judgment on Earth.

Each of these reduces dissonance in a different way. Removing all evidence that you’d ever predicted an attack in the first place reduces the chances that others will remind you of it, making it easier to ignore or discount your own prediction; revising the prediction so that it’s not really wrong allows you to feel okay about it; and claiming that your prediction actually prevented the catastrophe in question is best of all. You, too, can be the hero of your very own Da Vinci Code – and all it takes is reposting some predictions and explaining them away when they don’t work out. It’s the logical extension of Internet slacktivism – the conspiracy theory equivalent of cryptosexy Facebook statuses claiming to raise awareness for breast cancer. What 21st-century soothsayer could resist?


Conspiracy Round-Up 19/10

A few noteworthy conspiracy-oriented stories from around the internet over the past few days and weeks.

Have I missed a good story? Let me know in the comments.


Dan’s consequences welcome

As this is my first blog post on here, I thought I’d best start by introducing myself properly. I’m Dan, based at the University of Kent, going into the second year of my PhD. My research is based around the psychology of conspiracy theories. Specifically, I am taking quite an experimental approach in exploring some of the consequences of conspiracy theories. As such, some of my previous empirical work has involved exposing participants to conspiracy theories and then measuring their behavioural intentions.

Using this kind of experimental approach, it is possible to explore cause and effect, which most research in the field has been unable to do. I have previously investigated the impact of exposure to conspiracy theories on both political intentions and carbon footprint intentions, and both studies yielded some interesting findings.

I am currently exploring other behavioural consequences – for example, immunisation conspiracy theories and their potential effect on vaccination intentions, again using experimental methods. However, in order to paint a fuller picture, I also aim to investigate some of the positive consequences that can be associated with conspiracy theories. For example, are there positive traits associated with belief in conspiracy theories?

I can’t promise I will be an amazing blogger, but hopefully I can provide an interesting perspective from a slightly different, experimental angle.


How to sell a conspiracy theory

Architects & Engineers for 9/11 Truth (A&E9/11 for short), more than any other conspiracist organisation I’ve come across, showcases the psychology of sales techniques, influence, and persuasion. I don’t doubt the sincerity of those who believe the claims made by A&E9/11; all the supporters I’ve spoken to have been knowledgeable, eloquent, and passionate about their cause (of course, being passionate about something often leads to confirmation bias, but that’s for another post). I’d be surprised if the people at the head of the movement responsible for generating web content, documentaries, and DVDs aren’t equally passionate. But whether they consciously intend to or not, they take advantage of almost every psychological sales technique in the book.

And when it comes to the psychology of sales techniques, the book is Influence: The Psychology of Persuasion by Robert Cialdini – a psychological classic describing the subtle (and sometimes not so subtle) persuasion techniques which play on our inbuilt psychological foibles. These are most obvious when it comes to the realm of professional sales, where persuading people by any means possible to part with their money is often all that matters. But persuasion techniques aren’t only used by salespeople: they are an ever-present feature of social life, and usually when we’ve been influenced by subtle persuasive tactics we are completely unaware of it.

Cialdini breaks the research down into six key principles of influence. Let’s take a look at how A&E9/11 describe a DVD of their documentary Solving the Mystery of WTC7 in light of these psychological insights.

Hard core activists can buy 15 of these mini-DVDs for $15

“Last September, our mini-documentary, Architects & Engineers – Solving the Mystery of WTC7, went ‘viral’ on YouTube, generating over 500,000 hits in less than four weeks. Now this powerful 15-minute film about the explosive destruction of WTC Building 7, narrated by legendary actor Ed Asner, is available as a 3-inch mini DVD to energize your outreach efforts.”

Straight away this intro demonstrates two of Cialdini’s principles. First, social proof – believing that a lot of other people want, like, or think something usually makes you more likely to want, like, or think it yourself. Half a million people watched this video in less than four weeks? Well, then there must be something in it… The second principle is liking – if a person we know, like, and trust is involved with something, we are more susceptible to influence. This is why advertisers pay celebrities huge sums of money to endorse their products. A&E9/11 got Ed Asner to narrate this documentary, and who could be more trustworthy than the guy who played Santa in the movie Elf?

“This new DVD is one of the most effective 9/11 Truth materials to ever hit the streets. Why?

6) The smaller 3-inch size means you can easily carry it in your pocket for quick distribution – and, for the hard core activists among you, give out dozens more DVDs per day. They are so cute that almost everyone will take one – and more importantly… watch it!”

9/11 Truth activists love to give away leaflets, pamphlets, and DVDs – I’ve got a drawer full of them. This illustrates another of Cialdini’s principles: reciprocity. If somebody does something for you, no matter how trivial, you feel obliged to reciprocate. This is why marketers like to give away free samples; it’s not just about selflessly giving you the chance to try out their product for free, it’s about subtly obliging you to return the favour by purchasing the product later. If someone gives you a free A&E9/11 DVD, it kick-starts unconscious processes in your brain which want you to watch it and like it so that you’re holding up your side of the social contract. And, regardless of the quality of the evidence it contains, the simple act of watching a DVD about 9/11 conspiracy theories means you’re psychologically primed to be persuaded, thanks to the principle of consistency. When you commit to an idea, even in a small way like spending 15 minutes of your valuable time watching a free DVD, you face unconscious pressure to internalise the idea as part of your self-image rather than admit that you committed time or effort to something you don’t believe in.

One more principle worth discussing in relation to A&E9/11 is authority. We are particularly susceptible to being influenced by those we believe to have some kind of authority. This was most famously demonstrated by Stanley Milgram, who persuaded many of his participants to administer what they believed to be dangerous, potentially lethal electric shocks to a fellow participant, merely by creating an air of authority. In everyday life the effects are usually less dramatic, but we constantly look to authorities for an indication of how we should behave or what we should believe. Often we’re justified in doing so – the important factor is whether the authority is legitimate, and sometimes it’s hard to tell. A&E9/11 is predicated on creating the perception that the 1,700+ architects and engineers who have signed their petition have the authority, by virtue of their profession, to know that the World Trade Center towers could not have collapsed without the use of controlled explosives. But it’s worth bearing in mind that not every architect has experience with 100+ storey skyscrapers, and not every engineer has trained in the relevant kinds of structural engineering.

The prominent use of this seemingly large number, 1,700+, is another example of the social proof persuasion technique. However, when given proper context, it starts to look somewhat less effective…

Number of A&E for 9/11 Truth: approx. 1,700
Total A&E in the U.S.: approx. 2,728,000 [1] [2]
A&E9/11 as a percentage of total A&E: approx. 0.06%
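
Just to make the arithmetic above explicit, here’s a quick back-of-the-envelope check – a minimal sketch, using nothing but the two approximate totals quoted above (the variable names are mine):

```python
# Back-of-the-envelope check of the percentage quoted above.
# Both totals are the approximate figures given in the post, not independent data.
ae911_signatories = 1_700                    # A&E for 9/11 Truth petition signatories (approx.)
total_us_architects_engineers = 2_728_000    # architects and engineers in the U.S. (approx.)

share = ae911_signatories / total_us_architects_engineers
print(f"{share:.2%}")  # prints 0.06%
```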


Psychology of A&E9/11Truth on SGU

Twin Towers, March 2001

This week’s Skeptics’ Guide to the Universe features a discussion (starting at 38:20) of the claims made by Architects and Engineers for 9/11 Truth. A&E9/11Truth is a conspiracist organisation whose main argument is that the collapse of the three World Trade Center towers could only have been caused by controlled demolition. An email is read asking “can you… honestly say that there is absolutely nothing interesting or suspicious about the manner in which all of the towers, but specifically WTC7, came crumbling down.” The panel offers cursory rebuttals of the main claims, noting that each point has already been covered in detail by people with legitimate expertise in the relevant subjects. The discussion turns towards the psychology of conspiracy theorising, pointing out that A&E9/11Truth’s claims are an example of anomaly hunting – combing through every eyewitness account, news report, photograph, video, official report, etc., about the events of 9/11, and seizing upon any piece of information that doesn’t immediately appear to fit with the generally accepted narrative. These small pieces of errant data are framed as being curious and sinister, with the implication that they provide evidence of a conspiracy.

Anomaly hunting is not unique to 9/11 conspiracy theories; it is characteristic of all conspiracy theories. Philosopher Brian Keeley argues that the reliance on errant data gives conspiracy theories an appearance of explanatory strength; the conspiracy theory is apparently able to account for everything explained by the mainstream narrative, plus all the anomalies and errant data which appear to go against the mainstream account. Yet this superior explanatory strength is an illusion. Under scrutiny, the leap from anomalies and errant data to a coherent alternative conspiratorial narrative is unjustified – the anomalies so crucial to conspiracy theories are not satisfactory evidence.

This feature of conspiracy theories is, in part, a product of the confirmation bias. The idea that a conspiracy took place is the starting point; any evidence that can be shoe-horned to fit with that theory is incorporated and any evidence that doesn’t fit is dismissed, distorted or ignored. In the case of A&E9/11Truth, anything that appears to support the hypothesis that the collapses were a result of controlled demolition is accepted uncritically, and everything else is disregarded. But conspiracy theorists aren’t the only ones who are susceptible to the confirmation bias; we all are. John McHoskey looked at biased evaluation of evidence in relation to J.F.K. conspiracy theories (full-text paywalled). Both believers and skeptics selectively accepted or dismissed pieces of evidence to fit with their preexisting point of view. Confirmation bias is unavoidable; it’s part of being human. To overcome it requires a conscious effort to let our beliefs be shaped by the evidence, rather than shaping the evidence to fit with our beliefs.


What ConspiracyPsych is all about

Now that the first-post-general-introduction is out of the way, here’s a little more about who’s behind the blog, and what to expect from it. ConspiracyPsych is written by myself (Rob), Dan, Mike, and Christopher. We’re all at various stages of earning PhDs in psychology, specifically researching the psychology of conspiracy theories. As far as psychology goes, this has been a fairly neglected field of study, producing only a handful of peer-reviewed research papers over the last four decades or so. Plenty of work has looked at belief formation more generally, but only recently have more psychologists begun to pay attention to conspiracy theories.

The four of us are taking quite different approaches in our research, and will each be blogging about our own ideas and findings. We’ll also be talking about research that has been done in the past, about new research as it’s published, and about ideas for the future. And we’ll be posting more general conspiracy theory-related things of interest as we come across them. Over the next few days and weeks I’ll be putting together a page full of links to other sites, articles, essays, videos, and podcasts from around the internet which discuss the psychology of conspiracy theories, so keep checking back. If there’s anything you think we should discuss or anything you think we should know about, leave a comment or drop us an email – you can find contact details, as well as more info about each of us, on the About page.

Christopher, Mike, and Dan will introduce themselves in due course. In the meantime, here’s a little about my background in conspiracy theories. I’ve been researching belief in conspiracy theories since my undergrad psychology degree at the University of Kent, where my final-year project looked at conspiracy theories from a social psychological perspective. I moved on to Goldsmiths, University of London, where my MSc project looked at the role of the conjunction fallacy, a common mental shortcut, in making people more susceptible to conspiracy theories. Finally, I was awarded funding to study for a PhD at Goldsmiths, supervised by Professor Chris French. The PhD has encompassed thinking about what the term conspiracy theory means, why conspiracy theories matter, how best to measure belief in conspiracy theories, and the potential role of more mental shortcuts and biases. I’ll be blogging about each of these topics, and more besides, over the coming weeks and months. Hope you enjoy it!


Welcome to ConspiracyPsych.com

Conspiracy theories are everywhere. Books, TV shows, films, documentaries, magazines, newspapers – you almost can’t help being exposed to some form of conspiracy theory on a daily basis. And then there’s the internet. Millions of websites and vast, thriving communities exist purely to discuss conspiracy theories. Some of these theories are believed only by a handful of people; some are embarrassing even to other conspiracy theorists. But some have broken through into the mainstream to such an extent that they are an established part of the cultural landscape; according to some polls, more people believe the J.F.K. assassination conspiracy theory than the ‘lone gunman’ theory. Can all those people be wrong?

I can’t know for sure what did or didn’t happen on the grassy knoll, but psychological research reveals that people – normal, intelligent, rational people – can come to fervently believe things which aren’t well supported, and may even be contradicted by the evidence. There are a whole host of social factors, personality traits, cognitive biases and heuristics, which can all contribute to belief formation. Sometimes the beliefs they lead to are reasonable, or at least useful. Sometimes not. This blog is going to be about what psychology can tell us about how people come to believe conspiracy theories.

If you arrived here looking for evidence for or against particular conspiracy theories you might be disappointed. The goal here isn’t to definitively prove or disprove the reality of any particular conspiracy; rather, it’s about the sometimes surprising psychology of belief. My hope is that regardless of your existing beliefs about conspiracy theories, you will read the information here with an open mind and think about how you arrived at your own set of beliefs about the world.
