Systematic effort to destroy an individual's former loyalties and beliefs and to substitute loyalty to a new ideology or power. It has been used by religious cults as well as by radical political groups. The techniques of brainwashing usually involve isolation from former associates and sources of information; an exacting regimen calling for absolute obedience and humility; strong social pressures and rewards for cooperation; physical and psychological punishments for noncooperation, including social ostracism and criticism, deprivation of food, sleep, and social contacts, bondage, and torture; and constant reinforcement. Its effects are sometimes reversed through deprogramming, which combines confrontation and intensive psychotherapy.
Brainwashing (also known as thought reform or as re-education) consists of any effort aimed at instilling certain attitudes and beliefs in a person — beliefs sometimes unwelcome or in conflict with the person's prior beliefs and knowledge, in order to affect that individual's value system and subsequent thought-patterns and behaviors.
In 1987 the American Psychological Association (APA) Board of Social and Ethical Responsibility for Psychology (BSERP) declined to endorse one particular theory of brainwashing as "lack[ing] the scientific rigor and evenhanded critical approach necessary for APA imprimatur". Debate among APA members on this subject continues.
The English words "re-educate" and "re-education", which the Oxford English Dictionary attests in general senses from 1808, began in the 1940s to express specifically political connotations. George Orwell mentioned in Animal Farm (1945) "the Wild Comrades' Re-education Committee (the object of this was to tame the rats and rabbits)"; and Arthur Koestler in The Age of Longing (1951) wrote of "revolutionary vigilance,.. and discipline, and re-education camps".
The term "brainwashing" first came into use in the English language in the 1950s. Author John Marks writes that a journalist later revealed to have worked undercover for the Central Intelligence Agency (CIA) first coined the term in 1950. The OED records its earliest known English-language usage of "brain-washing" by E. Hunter in New Leader on 7 October 1950.
Earlier forms of coercive persuasion occurred for example during the Inquisition and in the course of show trials against "enemies of the state" in the Soviet Union; but no specific term emerged until the methodologies of these earlier movements became systematized during the early decades of the People's Republic of China for use in struggles against internal class enemies and foreign invaders. Until that time, presentations of the phenomenon described only concrete specific techniques.
The term xǐ nǎo (洗腦, literally "to wash the brain") originally referred to methodologies of coercive persuasion used in the "reconstruction" (改造 gǎi zào) of the so-called feudal (封建 fēng jiàn) thought-patterns of Chinese citizens raised under pre-revolutionary régimes. The term punned on the Taoist custom of "cleansing/washing the heart" (洗心 xǐ xīn) prior to conducting certain ceremonies or entering certain holy places; in Chinese, the word 心 xīn also refers to the soul or the mind, as distinct from the brain. The term first came into use in the United States during the Korean War (1950–1953) to describe those same methods as applied by the Chinese communists to attempt deep and permanent behavioral changes in foreign prisoners, and in particular to disrupt the ability of captured United Nations troops to organize effectively and resist their imprisonment.
The word brainwashing consequently came into use in the United States of America to explain why, unlike in earlier wars, a relatively high percentage of American GIs defected to the enemy side after becoming prisoners of war in Korea. Later analysis determined that some of the primary methodologies employed on them during their imprisonment included sleep-deprivation and other intense psychological manipulations designed to break down the autonomy of individuals. American alarm at the new phenomenon of substantial numbers of U.S. troops switching their allegiance to support foreign Communists lessened after the repatriation of prisoners, when it emerged that few of them retained allegiance to the Marxist and "anti-American" doctrines inculcated during their incarcerations. When rigid control of information ceased and the former prisoners' "natural" cultural methods of reality-testing could resume functioning, the superimposed values and judgments rapidly decreased.
Although the use of brainwashing on United Nations prisoners during the Korean War produced some propaganda-benefits to the forces opposing the United Nations, its main utility to the Chinese lay in the fact that it significantly increased the maximum number of prisoners that one guard could control, thus freeing other Chinese soldiers to go to the battlefield.
After the Korean War the term "brainwashing" came to apply to other methods of coercive persuasion, and even to the effective use of ordinary propaganda and indoctrination. Formal discourses of the Chinese Communist Party came to prefer the more clinical-sounding term sī xiǎng gǎi zào (思想改造, "thought reform"). Metaphorical uses of "brainwashing" extended as far as the following of fashion.
The Communist Party of China used the phrase "xǐ nǎo" ("wash brain", 洗脑) to describe its methods of persuading into orthodoxy those members who did not conform to the Party message. The phrase played on xǐ xīn (洗心, "wash heart"), an admonition — found in many Daoist temples — which exhorted the faithful to cleanse their hearts of impure desires before entering.
In September 1950, the Miami Daily News published an article by Edward Hunter titled "'Brain-Washing' Tactics Force Chinese into Ranks of Communist Party". It contained the first printed use of the English-language term "brainwashing", which quickly became a stock phrase in Cold War headlines. Hunter, a CIA propaganda operator who worked undercover as a journalist, turned out a steady stream of books and articles on the theme. An additional article by Hunter on the same subject appeared in New Leader magazine in 1951. In 1953 Allen Welsh Dulles, the CIA director at that time, explained that "the brain under [Communist influence] becomes a phonograph playing a disc put on its spindle by an outside genius over which it has no control."
In his 1956 book Brain-Washing: The Story of Men Who Defied It (Pyramid Books), Hunter described "a system of befogging the brain so a person can be seduced into acceptance of what otherwise would be abhorrent to him". According to Hunter, the process became so destructive of physical and mental health that many of his interviewees had not fully recovered after several years of freedom from Chinese captivity.
Later, two studies of the Korean War defections by Robert Lifton and Edgar Schein concluded that brainwashing had a transient effect when used on prisoners-of-war. Lifton and Schein found that the Chinese did not engage in any systematic re-education of prisoners, but generally used their techniques of coercive persuasion to disrupt the ability of the prisoners to organize to maintain their morale and to try to escape. The Chinese did, however, succeed in getting some of the prisoners to make anti-American statements by placing the prisoners under harsh conditions of physical and social deprivation and disruption, and then by offering them more comfortable situations such as better sleeping quarters, quality food, warmer clothes or blankets. Nevertheless, the psychiatrists noted that even these measures of coercion proved quite ineffective at changing basic attitudes for most people. In essence, the prisoners did not actually adopt Communist beliefs. Rather, many of them behaved as though they did in order to avoid the plausible threat of extreme physical abuse. Moreover, the few prisoners influenced by Communist indoctrination apparently succumbed as a result of the confluence of the coercive persuasion, and of the motives and personality characteristics of the prisoners that already existed before imprisonment. In particular, individuals with very rigid systems of belief tended to snap and realign, whereas individuals with more flexible systems of belief tended to bend under pressure and then restore themselves after the removal of external pressures.
Working individually, Lifton and Schein discussed coercive persuasion in their published analyses of the treatment of Korean War POWs. They defined coercive persuasion as a mixture of social, psychological and physical pressures applied to produce changes in an individual's beliefs, attitudes, and behaviors. Lifton and Schein both concluded that such coercive persuasion can succeed in the presence of a physical element of confinement, "forcing the individual into a situation in which he must, in order to survive physically and psychologically, expose himself to persuasive attempts". They also concluded that such coercive persuasion succeeded only on a minority of POWs, and that the end-result of such coercion remained very unstable, as most of the individuals reverted to their previous condition soon after they left the coercive environment.
Following the armistice that interrupted hostilities in the Korean War, a large group of intelligence officers, psychiatrists, and psychologists received assignments to debrief United Nations soldiers in the process of repatriation. The government of the United States wanted to understand the unprecedented level of collaboration, the breakdown of trust among prisoners, and other such indications that the Chinese were doing something new and effective in their handling of prisoners of war. Formal studies in academic journals began to appear in the mid-1950s, as well as some first-person reports from former prisoners. In 1961, two specialists in the field published books which synthesized these studies for the non-specialists concerned with issues of national security and social policy. Edgar H. Schein wrote on coercive persuasion, and Robert J. Lifton wrote on thought reform and the psychology of totalism. Both books focused primarily on the techniques called "xǐ nǎo" or, more formally, "sī xiǎng gǎi zào" (reconstructing or remodeling thought). The following discussion largely builds on their studies.
Although the attention of Americans came to bear on thought reconstruction or brainwashing as one result of the Korean War (1950–1953), the techniques had operated on ordinary Chinese citizens after the establishment of the People's Republic of China (PRC) in October 1949. The PRC had refined and extended techniques earlier used in the Soviet Union to prepare prisoners for show-trials, and they in turn had learned much from the Inquisition. In the Chinese context, these techniques had multiple goals that went far beyond the simple control of subjects in the prison camps of North Korea. They aimed to produce confessions, to convince the accused that they had indeed perpetrated anti-social acts, to make them feel guilty of these crimes against the state, to make them desirous of a fundamental change in outlook toward the institutions of the new communist society, and, finally, to actually accomplish these desired changes in the recipients of the brainwashing/thought-reform. To that end, brainwashers desired techniques that would break down the psychic integrity of the individual with regard to information processing, with regard to information retained in the mind, and with regard to values. Chosen techniques included: dehumanizing of individuals by keeping them in filth, sleep deprivation, partial sensory deprivation, psychological harassment, inculcation of guilt, group social pressure, etc. The ultimate goal that drove these extreme efforts consisted of the transformation of an individual with a "feudal" or capitalist mindset into a "right-thinking" member of the new social system, or, in other words, to transform what the state regarded as a criminal mind into what the state could regard as a non-criminal mind.
The methods of thought-control proved extremely useful when deployed for gaining the compliance of prisoners-of-war. Key elements in their success included tight control of the information available to the individual and tight control over the behavior of the individual. When, after repatriation, close control of information ceased and reality-testing could resume, former prisoners fairly quickly regained a close approximation of their original picture of the world and of the societies from which they had come. Furthermore, prisoners subject to thought-control often had simply behaved in ways that pleased their captors, without changing their fundamental beliefs. So the fear of brainwashed sleeper agents, such as that dramatized in the novel and the films The Manchurian Candidate, never materialized.
Terrible though the process frequently seemed to individuals imprisoned by the Chinese Communist Party, these attempts at extreme coercive persuasion ended with a reassuring result: they showed that the human mind has enormous ability to adapt to stress (not a recognized term in common use with reference to psychology in the early 1950s) and also a powerful homeostatic capacity. John Clifford, S.J. gives an account of one man's adamant resistance to brainwashing in In the Presence of My Enemies that substantiates the picture drawn from the studies of large groups reported by Lifton and Schein. Allyn and Adele Rickett wrote a more penitent account of their imprisonment (Allyn Rickett had by his own admission broken PRC laws against espionage) in Prisoners of the Liberation, but it too details techniques such as the "struggle groups" described in other accounts. Between these opposite reactions to attempts by the state to reform them, experience showed that most people would change under pressure and would change back following the removal of that pressure. Interestingly, some individuals derived benefit from these coercive procedures because the interactions, perhaps as an unintended side effect, promoted insight into dysfunctional behaviors that the subjects then abandoned.
In Tibet in the 1950s the invading Chinese army arrested Robert W. Ford, a British radio operator working there. Ford spent nearly five years in jail, in constant fear of execution, undergoing interrogation and thought-reform. He published a book, Captured in Tibet, about his experience, describing and analyzing thought-reform in practice.
According to forensic psychologist Dick Anthony, the CIA invented the concept of "brainwashing" as a propaganda strategy to undercut communist claims that American POWs in Korean communist camps had voluntarily expressed sympathy for communism. Anthony stated that definitive research demonstrated that fear and duress, not brainwashing, caused western POWs to collaborate. He argued that the books of Edward Hunter (a secret CIA "psychological warfare specialist" passing as a journalist) pushed the CIA brainwashing-theory onto the general public. He further asserted that for twenty years, starting in the early 1950s, the CIA and the Defense Department conducted secret research (notably including Project MKULTRA) in an attempt to develop practical brainwashing techniques (possibly to counteract the brainwashing efforts of the Chinese), and that their attempt failed.
Frequent disputes regarding brainwashing take place in discussion of cults and of new religious movements (NRMs). The controversy about the existence of cultic brainwashing has become one of the most polarizing issues among cult-followers, academic researchers of cults, and cult-critics. Parties disagree about the existence of a social process attempting coercive influence, and also disagree about the existence of the social outcome — that people become influenced against their will.
The issue gets even more complicated due to the existence of several definitions of the term "brainwashing" (some of them almost strawman-caricature metaphors of the original Korean War era concept) and through the introduction of the similarly controversial concept of "mind control" in the 1990s. (In some usages "mind control" and "brainwashing" serve as exact synonyms; other usages differentiate the two terms.) Additionally, some authors refer to brainwashing as a recruitment method (Barker) while others refer to brainwashing as a method of retaining existing members (Kent 1997; Zablocki 2001).
Theories of brainwashing have also become the subject of discussion in legal courts, where experts have had to present their views to juries in simpler terms than those used in academic publications, and where the issue was framed in rather black-and-white terms in order to make a point in a case. The media have taken up some such cases — including their black-and-white colorings.
In 1984, the British sociologist Eileen Barker wrote in her book The Making of a Moonie: Choice or Brainwashing? (based on her first-hand studies of British members of the Unification Church) that she had found no extraordinary persuasion techniques used to recruit or retain members.
Charlotte Allen reported that "[i]n his article in Nova Religio, Zablocki was worried less about those academics who may stretch the brainwashing concept than about those, like Bromley, who reject it altogether. And in advancing his case, he took a hard look at such scholars' intentions and tactics. (His title is deliberately provocative: 'The Blacklisting of a Concept: The Strange History of the Brainwashing Conjecture in the Sociology of Religion.')" In his book Combatting Cult Mind Control, Steven Hassan describes the extraordinary persuasion techniques that (in his opinion) members of the Unification Church used in his own recruitment and retention.
Philip Zimbardo writes that "[m]ind control is the process by which individual or collective freedom of choice and action is compromised by agents or agencies that modify or distort perception, motivation, affect, cognition and/or behavioral outcomes. It is neither magical nor mystical, but a process that involves a set of basic social psychological principles."(Zimbardo, 2002)
Some people have come to use the terms "brainwashing" or "mind control" to explain the otherwise intuitively puzzling success of some fast-acting episodes of religious conversion or of recruitment of inductees into groups known variously as new religious movements or as cults.
One of the first published uses of the term thought reform occurred in the title of the book by Robert Jay Lifton: Thought Reform and the Psychology of Totalism: A Study of 'Brainwashing' in China (1961). (Lifton also testified on behavioral-change methodologies at the 1976 trial of Patty Hearst.) In his book Lifton used the term "thought reform" as a synonym for "brainwashing", though he preferred the first term. The elements of thought reform as published in that book sometimes serve as a basis for cult checklists, and read as follows: milieu control; mystical manipulation; the demand for purity; the cult of confession; "sacred science"; loading the language; doctrine over person; and the dispensing of existence.
Benjamin Zablocki sees brainwashing as a "term for a concept that stands for a form of influence manifested in a deliberately and systematically applied traumatizing and obedience-producing process of ideological resocializations" and states this same concept historically also bore the names "thought reform" and "coercive persuasion".
Before the popularization of the name and concept of "brainwashing" in the 1950s, popular lore often attributed the enthusiasm and commitment of recruits joining cults to witchcraft or to mesmerism/hypnotism.
In the early 1980s some U.S. mental-health professionals became prominent figures due to their involvement as expert witnesses in court-cases involving new religious movements. In their testimony they presented certain theories involving brainwashing, mind control, or coercive persuasion as concepts generally accepted within the scientific community. The American Psychological Association (APA) in 1983 asked Margaret Singer, one of the leading proponents of coercive persuasion theories, to chair a taskforce called the APA Task Force on Deceptive and Indirect Techniques of Persuasion and Control (DIMPAC) to investigate whether brainwashing or "coercive persuasion" did indeed play a role in recruitment by such movements. Before the taskforce had submitted its final report, the APA submitted on February 10, 1987 an amicus curiæ brief in an ongoing case. The brief stated that:
[t]he methodology of Drs. Singer and Benson has been repudiated by the scientific community, that the hypotheses advanced by Singer were little more than uninformed speculation, based on skewed data and that "[t]he coercive persuasion theory ... is not a meaningful scientific concept." [...] The theories of Drs. Singer and Benson are not new to the scientific community. After searching scrutiny, the scientific community has repudiated the assumptions, methodologies, and conclusions of Drs. Singer and Benson. The validity of the claim that, absent physical force or threats, "systematic manipulation of the social influences" can coercively deprive individuals of free will lacks any empirical foundation and has never been confirmed by other research. The specific methods by which Drs. Singer and Benson have arrived at their conclusions have also been rejected by all serious scholars in the field.
The brief characterized the theory of brainwashing as not scientifically proven and suggested the hypothesis that cult recruitment techniques might prove coercive for certain sub-groups while not affecting others coercively. On March 24, 1987, the APA filed a motion to withdraw its signature from this brief, as it considered the conclusion premature in view of the ongoing work of the DIMPAC taskforce. The brief itself remained on record, since only the APA withdrew its signature; the co-signing scholars (including Jeffrey Hadden, Eileen Barker, David Bromley and J. Gordon Melton) did not. On May 11, 1987, the APA Board of Social and Ethical Responsibility for Psychology (BSERP) rejected the DIMPAC report because the brainwashing theory espoused "lacks the scientific rigor and evenhanded critical approach necessary for APA imprimatur", and concluded: "Finally, after much consideration, BSERP does not believe that we have sufficient information available to guide us in taking a position on this issue."
With the rejection-memo came two letters from external advisers to the APA who reviewed the report. One of the letters, from Professor Benjamin Beit-Hallahmi of the University of Haifa, stated amongst other comments that "lacking psychological theory, the report resorts to sensationalism in the style of certain tabloids" and that "the term 'brainwashing' is not a recognized theoretical concept, and is just a sensationalist 'explanation' more suitable to 'cultists' and revival preachers. It should not be used by psychologists, since it does not explain anything". Professor Beit-Hallahmi asked that the report not be made public. The second letter, from Professor of Psychology Jeffrey D. Fisher, Ph.D., said that the report "[...] seems to be unscientific in tone, and biased in nature. It draws conclusions, which in many cases do not mesh well with the evidence presented. At times, the reasoning seems flawed to the point of being almost ridiculous. In fact, the report sometimes seems to be characterized by the use of deceptive, indirect techniques of persuasion and control — the very thing it is investigating".
When the APA's BSERP rejected her findings, Singer sued the APA in 1992 for "defamation, frauds, aiding and abetting and conspiracy"; and lost in 1994.
Zablocki (1997) and Amitrani (2001) cite APA boards and scholars on the subject and conclude that the APA has made no unanimous decision regarding this issue. They also write that Margaret Singer, despite the rejection of the DIMPAC report, continued her work and retained respect in the psychological community, which they corroborate by noting that Singer wrote the article "Group Psychodynamics and Cults" in the 1987 edition of the peer-reviewed Merck Manual (Singer, 1987).
Benjamin Zablocki, professor of sociology and one of the reviewers of the rejected DIMPAC report, wrote on the controversy in 1997.
APA Division 36 (then "Psychologists Interested in Religious Issues", today "Psychology of Religion") approved a resolution on the subject at its 1990 annual convention.
In 2002, Philip Zimbardo, APA's then president, wrote on mind control in the APA's Monitor on Psychology.
Two months after her kidnapping by the Symbionese Liberation Army in 1974, Patty Hearst, an American newspaper heiress, participated in a bank robbery with her kidnappers. At her trial, the defense made a concerted program of brainwashing central to its case. Despite this claim, the court convicted her of bank robbery.
In the 1990 U.S. v. Fishman case, Steven Fishman offered a "brainwashing" defense to charges of embezzlement. Margaret Singer and Richard Ofshe would have appeared as expert witnesses for him, but the court disallowed the introduction of their testimony.
Social scientists who study new religious movements, such as Jeffrey K. Hadden (see References), understand the general proposition that religious groups can have considerable influence over their members, and that that influence may have come about through deception and indoctrination. Indeed, many sociologists observe that "influence" occurs ubiquitously in human cultures, and some argue that the influence exerted in "cults" or new religious movements does not differ greatly from the influence present in practically every domain of human action and of human endeavor.
The Association of World Academics for Religious Education states that "... without the legitimating umbrella of brainwashing ideology, deprogramming — the practice of kidnapping members of NRMs and destroying their religious faith — cannot be justified, either legally or morally."
F.A.C.T.net states that "Forced deprogramming was sometimes successful and sometimes unsuccessful, but is not considered an acceptable, legal, or ethical method of rescuing a person from a cult."
The American Civil Liberties Union (ACLU) published a statement in 1977 relating to brainwashing and mind control. In this statement the ACLU opposed certain methods of "depriving people of the free exercise of religion". The ACLU also rejected (under certain conditions) the idea that claims of the use of "brainwashing" or of "mind control" should override the free exercise of religion.
In the 1960s, after coming into contact with new religious movements (NRMs, a subset of which have gained the popular designation of "cults"), some young people suddenly adopted faiths, beliefs, and behavior that differed markedly from their previous lifestyles and seemed at variance with their upbringings. In some cases, these people neglected or even broke contact with their families. Such changes appeared strange and upsetting to their families. To explain these phenomena, some postulated brainwashing on the part of new religious movements. Observers quoted practices such as isolating recruits from their family and friends (inviting them to an end-of-term camp after university for example), arranging a sleep-deprivation program (3 a.m. prayer meetings) and exposing them to loud and repetitive chanting. Another alleged technique of religious brainwashing involved love bombing rather than torture.
James T. (Jim) Richardson, a Professor of Sociology and Judicial Studies at the University of Nevada, states that if the NRMs had access to powerful brainwashing techniques, one would expect NRMs to have high growth rates, while in fact most have not had notable success in recruitment, most adherents participate for only a short time, and such groups have limited success in retaining members. Langone has rejected this claim, comparing the figures of various movements, some of which by common consent do not use brainwashing and others of which some authors report as using it. (Langone, 1993)
In their Handbook of Cults and Sects in America, Bromley and Hadden present one possible ideological foundation of brainwashing theories, which they argue lack scientific support: the simplistic perspective they see as inherent in the brainwashing metaphor appeals to those seeking an effective social weapon against disfavored groups, and any relative success of such efforts at social control should not obscure the lack of scientific basis for such opinions.
Philip Zimbardo, Professor Emeritus of Psychology at Stanford University, writes: "Whatever any member of a cult has done, you and I could be recruited or seduced into doing — under the right or wrong conditions. The majority of 'normal, average, intelligent' individuals can be led to engage in immoral, illegal, irrational, aggressive and self destructive actions that are contrary to their values or personality — when manipulated situational conditions exert their power over individual dispositions."(Zimbardo, 1997)
Some religious groups, especially those of Hindu and Buddhist origin, openly state that they seek to improve what they call the "natural" human mind by spiritual exercises. Intense spiritual exercises have an effect on the mind, for example by leading to an altered state of consciousness. These groups also state that they do not use coercive techniques to acquire or to retain converts.
On the other hand, several scholars in sociology and psychology have in recent years stated that many scholars of NRMs express a bias to deny any possibility of brainwashing and to disregard actual evidence. (Zablocki 1997, Amitrani 1998, Kent 1998, Beit-Hallahmi 2001)
Steven Hassan, author of the book Combatting Cult Mind Control, has suggested that the influence of sincere but misled people can provide a significant factor in the process of thought-reform. (Many scholars in the field of new religious movements do not accept Hassan's BITE model for understanding cults.)