Scientific Studies That Debunk The Brainwashing Myth

A Moonie Mass Wedding – Moonie Brainwashing? Or just a Minority Religious & Cultural Pursuit Outside of the Mainstream?

The following is a compilation of scientific studies that have thoroughly debunked the idea of ‘brainwashing’.

This settled science has been completely ignored by members of the anti-cult movement who, for their own various self-interests, keep the superstition of “brainwashing” alive – no matter how many ex-members of minority religions it has harmed.

The excerpt below is taken from a 2006 paper called “The Market for Martyrs” by Laurence R. Iannaccone of George Mason University. The purpose of the paper was to examine the dynamics of “cult involvement” where it counts most – among terrorist suicide bombers.

What better example of ‘cult brainwashing’ could there be than convincing someone to kill themselves for the group? Surely, if cult brainwashing exists, it would exist here, right?

I think this is good data for an Ex-member to think with. It’s certainly better than the dominant ideology offered today on the Internet. I hope you can use it to clarify some of the issues facing Exes of all minority religions who’ve come into contact with the ideology of the anti-cult movement, and who thought they had no alternatives for sorting out what happened to them after their own loss of faith.

*************** _____________________________________ ******************

The Cult Brainwashing Myth

It took a mountain of empirical evidence to establish that cult conversion and retention were largely matters of choice, and rational choice at that. Scholars initially viewed rational choice as the least likely explanation for anything as bizarre and costly as cult membership. If converts lacked histories of ignorance, deprivation, grievance, alienation, or mental abnormality, then they must be victims of extensive indoctrination, extreme social pressure, and systematic psychological persuasion that overwhelmed their capacity for rational choice. The most popular variant of this view came to be known as the “brainwashing” or “coercive persuasion” theory of conversion.

The term “brainwashing” was introduced in the 1950s to describe the indoctrination methods that Chinese and Korean communists used to elicit false confessions and political repudiations from prisoners of war. These victims were indeed coerced – held in confinement, deprived of food, water, and sleep, often tortured, threatened with death, and thereby forced to act, speak, and perhaps even think in ways that bore little relationship to their original beliefs and commitments. In the 1970s, however, Margaret Singer (1979), Richard Ofshe (1992), and others re-introduced “brainwashing” to describe the recruitment practices of the Moonies and other so-called cults. A spate of lurid books and news reports painted the Moonies as masters of mind control who duped and kidnapped unsuspecting youth, and forced them to attend indoctrination camps. Captive, sleep-deprived, and buzzed on sugar, the recruits were subjected to mind-numbing lectures, repetitive chanting, “love bombing,” and other insidious practices that overwhelmed their judgment, individuality, and personal will.

The truth, however, bore no relation to the sensational stories. NRMs did indeed devote tremendous energy to outreach and persuasion, but they employed conventional methods and enjoyed very limited success. By the mid-1980s, researchers had so thoroughly discredited “brainwashing” theories that both the Society for the Scientific Study of Religion and the American Sociological Association agreed to add their names to an amicus brief denouncing the theory in court (Richardson 1991). The brainwashing myth collapsed under the weight of numerous case studies.

One of the most comprehensive and influential studies was The Making of a Moonie: Choice or Brainwashing? by Eileen Barker (1984). Barker could find no evidence that Moonie recruits were ever kidnapped, confined, or coerced. Participants at Moonie retreats were not deprived of sleep; the lectures were not “trance-inducing”; and there was not much chanting, no drugs or alcohol, and little that could be termed “frenzy” or “ecstatic” experience. People were free to leave, and leave they did.

Barker’s extensive enumerations showed that among the recruits who went so far as to attend two-day retreats (claimed to be the Moonies’ most effective means of “brainwashing”), fewer than 25% joined the group for more than a week and only 5% remained full-time members one year later. And, of course, most contacts dropped out before attending a retreat. Of all those who visited a Moonie centre at least once, not one in two hundred remained in the movement two years later. With failure rates exceeding 99.5%, it comes as no surprise that full-time Moonie membership in the U.S. never exceeded a few thousand. And this was one of the most successful New Religious Movements of the era! When researchers began checking (as opposed to merely repeating) the numbers claimed by leaders, defectors, and journalists, they found similarly low retention rates in nearly all “cults.”10
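Barker’s attrition figures are simple arithmetic, and they are worth making concrete. Below is a minimal sketch, in Python, of the retention funnel she describes. Only the 25%, 5%, and one-in-two-hundred figures come from the quoted text; the starting visitor count and the share of visitors who attend a retreat are round numbers assumed purely for illustration.

    # Illustrative retention funnel based on the figures quoted above.
    # Only the 25%, 5%, and 1-in-200 rates come from Barker's study;
    # the visitor count and retreat-attendance share are ASSUMED.
    visitors = 10_000              # people who visit a centre at least once
    retreat = visitors * 0.30      # assumed share who attend a two-day retreat
    joined_week = retreat * 0.25   # fewer than 25% stay beyond a week (Barker)
    full_time_1yr = retreat * 0.05 # 5% are full-time a year later (Barker)
    members_2yr = visitors / 200   # fewer than 1 in 200 remain at two years

    for label, n in [
        ("visit a centre", visitors),
        ("attend a retreat (assumed 30%)", retreat),
        ("stay beyond a week (<25% of attendees)", joined_week),
        ("full-time at one year (5% of attendees)", full_time_1yr),
        ("still members at two years (<1 in 200)", members_2yr),
    ]:
        print(f"{label:45s} {n:>8.0f}  ({n / visitors:.1%} of visitors)")

    print(f"Implied failure rate: {1 - members_2yr / visitors:.1%}")  # 99.5%

However the “brainwashing” story is framed, a recruitment process with a better-than-99.5% failure rate is hard to square with an irresistible mind-control technique.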


Social Networks of Faith

This is not to say that outreach always failed. Some conversions did occur, and they followed consistent patterns. In place of the sensational stories and traditional theories, the case studies identified social networks and social capital as key to effective recruitment and retention. Later I shall argue that the same social processes operate in militant religious groups, including those that employ suicide-attacks.

The seminal work on cults, conversion, and social networks came from yet another study of the Moonies. By sheer luck, John Lofland and Rodney Stark (1965) chose to study the group back in the mid-1960s, when it was still microscopically small – a dozen young adults who had just moved to San Francisco from Eugene, Oregon. The group was led at the time by Young Oon Kim, a former professor of religion in Korea who had come to Oregon in 1959 to launch the Unification Church’s first American mission.

Lofland and Stark discovered that all the current members were united by close ties of friendship predating their contact with Miss Kim. The first three converts had been young housewives and next door neighbors who befriended Miss Kim after she rented a room from one of them. Subsequently, several of the husbands joined, followed by several of their friends from work. When Lofland and Stark began their study, the group had yet to convert a single stranger.

This recruitment pattern was not what Miss Kim had sought or expected. During her first year in America she had tried to win converts through lectures and press releases. Later, in San Francisco, the group also tried radio spots and public meetings in rented halls. But these methods yielded nothing. All the new recruits during Lofland and Stark’s period of observation were old friends or relatives of prior converts, or people who formed close friendships with one or more group members.

Proselytizing bore fruit only when it followed or coincided with the formation of strong social attachments, typically family ties or close personal friendships. Successful conversion was not so much about selling beliefs as it was about building ties, thereby lowering the social costs and raising the social benefits associated with changing one’s religious orientation. The converse was also true. Recruitment failure was all but assured if a person maintained strong attachments to a network of non-members. Many people spent time with the Moonies and expressed considerable interest in their doctrines but never joined. In nearly every case, these people had strong ongoing attachments to non-members who disapproved of the group. By contrast, those who joined were often newcomers to San Francisco and thus separated from their family and friends.

In short, social attachments lie at the heart of conversion, and conversion tends to proceed along social networks. This discovery has been replicated in scores of subsequent studies all over the world, with groups as varied as the Hare Krishna, Divine Light Mission, Nichiren Shoshu Buddhism, a UFO cult, fundamentalist Christian communes, Mormons, Catholic Charismatics, Christian Scientists, and the Baha’i (Robbins 1988: 80).

Stated somewhat more abstractly, the fundamental sociological “law” of conversion asserts that conversion to religious groups almost never occurs unless the recruit develops stronger attachments to members of the group than to non-members. Among other things, the law explains why the establishment of a new religion, cult, or sect almost always begins with the conversion of the founder’s own family members and close friends.11

The law likewise predicts that as long as people remain deeply attached to the social networks of one faith, they rarely ever switch to another faith. Thus, the Mormon missionaries who called upon the Moonies were immune to the appeals of Miss Kim and her followers, despite forming warm relationships with several members.

The typical convert was religiously unattached, and most were not actively searching for answers to religious questions. The Moonies quickly learned that they were wasting their time at church socials or denominational student centers. They did far better in places where they came in contact with the unattached and uncommitted. This finding too has been replicated in subsequent research (see Stark and Bainbridge 1985; Stark and Finke 2000). Hence, new religious movements draw most of their converts from among those who are religiously inactive or only loosely attached to their current religion.

Absent direct observation, all these points are easy to miss, because people’s retrospective descriptions of their conversion experiences tend to stress theology. As long as the group views belief as central to its mission, converts will face strong pressure to make doctrine the center of their subsequent testimonies. As Robbins (1988: 81) observes, citing studies by Greil and Rudy (1984), Heirich (1977), and others,

“Ideological pressure often leads converts to construct testimonials of the ‘I once was lost but now am found’ variety.” These retrospective accounts are best seen as products of the converts’ new identities rather than descriptions of their antecedents.13 Social attachments are the horse that pulls the cart of ideological change.

Because conversion is a social process, it rarely is sudden. Instead, people who have encountered a new religion through their friends or family go through a gradual process of learning and listening and questioning before finally embracing the new faith. Typically, they take a quite active role in this process.14 Conversion involves introspection as well as interaction. People question, weigh, and evaluate their situations and options. Nor does the introspective process end with early professions of faith. Members of religious groups continue to assess their commitment, and many recant.

Demolishing the Myths of ‘Cult Conversion’ by Brainwashing

Having reviewed several of the case studies that demolished major myths about cult conversion, let us consider the applicability of these findings to suicide bombing and radical Islam. In light of what we know about cults, and what we are learning about suicide bombing, I conjecture that nearly all of the following behavioral regularities carry over from deviant cults to the militant religious groups that perpetrate acts of terror.

The typical convert is normal in nearly all respects – economically, socially, psychologically:

  • Typical converts are not plagued by neurotic fears, repressed anger, high anxiety, religious obsession, personality disorders, deviant needs, and other mental pathologies.
  • Typical converts are not alienated, frustrated in their relationships, or lacking in social skills.
  • Typical converts are young, healthy, intelligent, with better than average backgrounds and prospects.

Conversion to radical groups rarely occurs unless the recruit develops stronger attachments to members of the group than to non-members:

  • People with relatively few or relatively weak social ties are more likely to join.15
  • People with strong ties are very unlikely to convert – including those who are married with children, home-owners, and people well-established in their jobs, occupations, and neighborhoods.
  • Groups tend to grow through pre-existing social networks.
  • Social barriers (whether economic, regional, ethnic, language, or religious) tend to block paths of recruitment.
  • New religious movements draw most of their converts from among those with low levels of religious activity and commitment.

Conversion is a process involving repeated social interactions, and recruits participate extensively and intentionally in their own conversions:

  • Conversions are almost always incremental, although behavioral changes may be fairly sudden.
  • The form and timing of institutionalized rites of passage (such as baptisms or public testimonies) rarely correspond to the actual form or timing of conversion.
  • The conversion process often involves reinterpreting one’s own life story so as to emphasize past levels of discontentment, sinfulness, or spiritual longing.
  • Analogous reconstructions often follow defection from movements.
  • Belief typically follows involvement. Strong attachments draw people into religious groups, but strong beliefs develop more slowly or never develop at all.
  • High rates of involvement and sacrifice can coexist with doubt, uncertainty, and high probability of defection.
  • Intensity of commitment is not synonymous with certainty of belief or stability of attachment.
  • Those who leave radical groups after weeks, months, or even years of membership have little difficulty returning to normal activities, beliefs, and relationships.

Recent studies on religiously-oriented terrorism confirm many of these conjectures, and the mass of relevant evidence continues to grow. The most striking results concern the personal characteristics of suicide bombers, the role of groups, and the importance of social networks.

The substantial body of empirical results reviewed or derived by Krueger and Maleckova (2003: 141) thus finds “little direct connection between poverty or [poor] education and participation in terrorism.” Moreover, Berrebi (2003) finds that Palestinian suicide bombers have substantially more schooling and better economic backgrounds than the average Palestinian. Berrebi’s statistical portrait reaffirms the portrait that emerges from Nasra Hassan’s (2001) interviews with potential Palestinian suicide-bombers, which in turn sounds exactly like a quote from the literature on cult converts: “None of [the bombers] were uneducated, desperately poor, simple minded or depressed. Many were middle class and, unless they were fugitives, held paying jobs. … Two were the sons of millionaires.”

Studies have likewise established the critical role of intense groups in recruiting, training, and directing suicide bombers. David Brooks (2002: 18-19) aptly describes Palestinian suicide bombing of the past several years as “a highly communitarian enterprise … initiated by tightly run organizations that recruit, indoctrinate, train, and reward the bombers.” Although the organizations seek to motivate potential bombers in many ways, the “crucial factor” is “loyalty to the group,” promoted by “small cells” and “countless hours of intense and intimate spiritual training.” As Kramer (1991) has emphasized, the “social dimension” was no less crucial in the Lebanese suicide attacks of the mid-1980s. Although these “‘self-martyrs’ sacrificed themselves, they were also sacrificed by others … [who] selected, prepared, and guided” them. For more on the activities of the bomber’s “sponsors,” see also Hoffman (2003: 43).

The role of social networks is most thoroughly documented in Understanding Terror Networks by Marc Sageman (2004), a forensic psychiatrist, political sociologist, and former Foreign Service officer who worked closely with Afghanistan’s mujahedin. Sageman documented what he calls the “global Salafi jihad” based on numerous sources, including biographical information that he collected on 172 radical Islamic terrorists. His results confirm that, as with cult converts of the 1970s, nearly all these people were recruited through existing social networks.17 Moreover, the vast majority were “cut off from their cultural and social origins, far from their families and friends” when they joined (p. 92). Preexisting ties also determined the small cells that perpetrated subsequent acts of terror, including the Hamburg cell responsible for the September 11 attacks. Indeed, Sageman has mapped the entire jihad as a globe-spanning network of cells linked through four major hubs.

These are by no means the only examples of recent findings recapitulating those of the literature on cults. Many others can be gleaned from recent work by Cronin (2003), Reuter (2004), Gambetta (2005), Pape (2005), and Bloom (2005). I expect the parallels to continue piling up as we acquire more data on suicide bombers. But the point, of course, is not to simply wait for the parallels to pile up. Scholars should actively mine the established literature on cults and sects for additional insights, predictions, and theories.

*************** _____________________________________ ******************

9 thoughts on “Scientific Studies That Debunk The Brainwashing Myth”

  1. In my opinion, many of the arguments used in this post to debunk the existence of brainwashing have been successfully refuted by sociologist Benjamin Zablocki. Readers who wish to do so can read his works on this subject at the following links:

    https://www.benzablocki.net/exit-cost-analysis
    https://www.benzablocki.net/toward-a-demystified-and-disinterested-scientific-theory-of-brainwashing/

    During my last conversation with Alanzo he made an attempt to argue against Zablocki’s position. The full exchange we had can be read here:

    https://twitter.com/ExposedPerfidy/status/1073527322559561728

    I’m not fully content with all of what I wrote; in retrospect I see that I did not raise some arguments I should have. That being said, I also believe that Alanzo’s attempt to refute Zablocki’s position was unsuccessful. A significant portion of Alanzo’s arguments consisted of irrelevant conjectures accusing his opponents of being motivated by bigotry. When Alanzo did try to argue against the logic used by Zablocki, his attempts were feeble. At one point he even seems to tacitly concede the validity of one of Zablocki’s points.* I’m revisiting this debate partly to inform this blog’s readers about the full range of opinions that exist amongst social scientists regarding the brainwashing debate, something Alanzo has not done. I also hope that maybe this time Alanzo will be able to bring up some good arguments against Zablocki’s position, but I doubt it will happen.

    *This is how I interpret the part of my conversation with Alanzo that dealt with the issue of cult recruitment, cult retention rates, and their relevance or lack thereof to proving whether or not brainwashing exists. As you can see, the last comment in the conversation on this subject was made by me; after it was made, Alanzo did not try to demonstrate that my thinking on this subject had any flaws, but rather proceeded to discuss other issues he saw as relevant to the brainwashing debate.

    • Benjamin Zablocki, the single saviour of the belief in brainwashing in social science.

      Science is about a preponderance of scientific results, not about the thoughts of a single outlier who props up your belief.

      His position IS arguable.

      But science is not on his side.

  2. Before we look at a “preponderance of scientific results” and use them to support a certain position, we need to ask ourselves whether or not said results are even relevant to the question they are supposed to answer. For example, suppose a certain researcher wants to see what kind of effect (if any) the consumption of apples has on a person’s eyesight. To do so, he recruits a group of test subjects and makes sure that for the following month they watch at least one episode of the TV show Seinfeld each day. At the end of the experiment it’s shown that the eyesight of the test subjects underwent neither a deterioration nor an improvement, so the researcher declares that apple consumption has no effect on eyesight one way or the other. One does not have to be versed in scientific research methods to see the flaws in this reasoning. If a thousand more studies using the same methodology come along and reach the same results and conclusions, we would still not have a preponderance of scientific results demonstrating that apple consumption has no effect on eyesight.

    Benjamin Zablocki has argued that much of the data used by you and others to settle the question of whether or not brainwashing exists is not even pertinent to the issue at hand. Said data may be useful for answering other important questions about cults/NRMs, but it has no bearing on the issue of brainwashing. I don’t cling to his views as an article of faith, but I do find merit in them. For example, after having read Eileen Barker’s The Making of a Moonie, I’m fairly convinced that her study is a case of “getting people to watch Seinfeld so as to examine the effect of apple consumption on eyesight”. I may elaborate on my thoughts in a future comment; I’m not going to do so right now as I’m a bit tired from work. Readers who wish to read Zablocki’s arguments can do so here:

    https://www.benzablocki.net/toward-a-demystified-and-disinterested-scientific-theory-of-brainwashing/

    A condensation of Zablocki’s arguments by me can be read in the twitter conversation I linked to in my previous comment.

    Before I sign off for the moment, one more thing I’d like to comment on:

    “Benjamin Zablocki, the single saviour of the belief in brainwashing in social science.”

    Actually, there’s also Janja Lalich, Stephen Kent, Marybeth Ayella, Alexandra Stein, and probably more whose names I’m not aware of or that escape me at the moment. Furthermore, I’d say that even some of the NRM scholars who are typically decried as “cult apologists” (such as David G. Bromley and Lorne L. Dawson) hold views that are not that diametrically opposed to those of Zablocki and other scholars who see the concept of brainwashing as valid. Comments made by both Bromley and Dawson suggest to me they’re not that opposed to the idea that the resocialization processes undergone by some members of some religious movements share similarities with Chinese Communist thought reform, both in terms of the techniques being used and in terms of the psychological effects such techniques have on those subjected to them. I may elaborate on this thought in a future comment, but right now I’m tired.

    • “Comments made by both Bromley and Dawson suggest to me they’re not that opposed to the idea that the resocialization processes undergone by some members of some religious movements share similarities with Chinese Communist thought reform, both in terms of the techniques being used and in terms of the psychological effects such techniques have on those subjected to them. I may elaborate on this thought in a future comment, but right now I’m tired.”

      You don’t say!

      Now that you’ve had time to rest – please provide these quotes.

      I do agree with you, none of this is black and white.

      But religious persecution of minority religions through the fatuous claims of cult brainwashing must be challenged.

      So I’m yer Huckleberry.

      Let’s see it.

  3. Have you read the chapter by Dick Anthony that comes right after Zablocki’s article in his book “Misunderstanding Cults”? It’s very admirable of Zablocki to include Anthony’s article in his book, because it completely blows Zablocki’s belief system in cult brainwashing away.

    For some reason, the God and Jesus of the AntiCult Movement are Lifton and Schein. Everything Anti-Cultists write about, and testify in court over, is supported by Lifton and Schein – even if it isn’t.

    In Dick Anthony’s 100-page chapter in Zablocki’s book, called “TACTICAL AMBIGUITY AND BRAINWASHING FORMULATIONS: SCIENCE OR PSEUDO-SCIENCE?”, Anthony shows how Zablocki, Ofshe, Margaret Singer, and others who refer to Lifton & Schein for the proof of their belief system actually get Lifton & Schein wrong!

    Here are 8 clear examples Anthony gave – in Zablocki’s own book!

    “As we have shown, the CIA brainwashing model, which had been disconfirmed by the CIA research program as well as by the research of Lifton, Schein, and others, provides the actual theoretical foundation for all statements of brainwashing theory, including cultic brainwashing formulations such as Zablocki’s.”

    “Consequently, his cultic brainwashing theory, like the earlier statements of this theory, such as those of Singer and Ofshe, is contradicted by its own claimed theoretical foundation, that is, the research of Schein and Lifton. My 1990 article demonstrated that eight variables differentiate Singer’s and Ofshe’s brainwashing theory from Schein’s and Lifton’s research.”

    “The present chapter has demonstrated the same set of conflicts between Zablocki’s approach and generally accepted research on Communist thought reform as characteristic of the Ofshe-Singer formulation.”

    “As I have shown above, the research of Schein and Lifton on Westerners in thought reform prisons, upon which Zablocki claims to base his brainwashing formulation, confirmed and extended Hinkle’s and Wolff’s earlier findings. As I argued in my 1990 article, their research on Communist forceful indoctrination practices disconfirmed the CIA model with respect to eight variables:”

    1 Conversion. None of Schein’s and Lifton’s subjects became committed to Communist worldviews as a result of the thought reform program. Only two of Lifton’s forty subjects and only one or two of Schein’s fifteen subjects emerged from the thought reform process expressing some sympathy for Communism, with none of them actually becoming Communists. In the remaining subjects, Communist coercive persuasion produced behavioural compliance but not increased belief in Communist ideology (Lifton 1961: 117, 248-9; Schein 1958: 332, 1961: 157-66, 1973: 295).

    2 Predisposing motives. Those subjects who were at all influenced by Communist indoctrination practices were predisposed to be so before they were subjected to them (Lifton 1961: 130; Schein 1961: 104-10, 140-56, 1973: 295).

    3 Physical coercion. Communist indoctrination practices produced involuntary influence only in that subjects were forced to participate in them through extreme physical coercion (Lifton 1961: 13, 1976: 327-8; Schein 1959: 437, 1961: 125-7).

    4 Continuity with normal social influence. The non-physical techniques of influence utilized in Communist thought reform are common in normal social influence situations and are not distinctively coercive (Lifton 1961: 438-61; Schein 1961: 269-82, 1962: 90-7, 1964: 331-51).

    5 Conditioning. No distinctive conditioning procedures were utilized in Communist coercive persuasion (Schein 1959: 437-8, 1973: 284-5; Biderman 1962: 550).

    6 Psychophysiological stress/debilitation. The extreme physically-based stress and debilitation to which imprisoned thought reform victims were subjected did not cause involuntary commitment to Communist worldviews (Hinkle and Wolff 1956; Lifton 1961: 117, 248-9; Schein 1958: 332, 1961: 157-66, 1973: 295). Moreover, no comparable practices are present in new religious movements (Anthony 1990: 309-11).

    7 Deception/defective thought. Victims of Communist thought reform did not become committed to Communism as a result of deception or defective thought (Schein 1961: 202-3, 238-9).

    8 Dissociation/hypnosis/suggestibility. Those subjected to thought reform did not become hyper-suggestible as a result of altered states of consciousness; for example, hypnosis, dissociation, disorientation, and so on (Schein 1959: 457; Biderman 1962: 550).

    This is just one devastating takedown of the anti-cult belief system of cult brainwashing in Dick Anthony’s 100-page chapter in that book.

    I am coming to find that Dick Anthony is my Faa-tha.

    But, DTG, he’s yo’ daddy!

    :>

    Alanzo

  4. Everything ascribed to brainwashing could be easily explained by conformity, emotional vulnerability, and ordinary human influence and manipulation. There is no mind control, just weak minds.

