The emotional tipping point between religion and cult

Pierre Le Coz: “…The aim of a religion is not to change its members’ ordinary lives. The realistic moderation of its ambitions explains why religion does not need to use artifices of manipulation. Admission into a cult is much more costly because it requires a conversion, a rupture with others and with oneself. The leader asks for an allegiance without limits and a total commitment. A cult is “a radical commitment to serve a radical cause” (9). Manipulative strategies are essential for the guru to incite the individual to leave his current life behind and to disown both his historical and family attachments…”

Pierre Le Coz
Director of the Department of Humanities at the Faculty of Medicine (Aix-Marseille University)

Introduction

Manipulation is a major criterion of distinction between cults and religions. Except in the particular case where it drifts into sectarianism, a religion does not need to use manipulative strategies; it inducts its members mainly through family tradition. A cult, by contrast, forces the new member to break with their tradition and their family. This break with their past, their relatives and their friends is a costly process for the new recruit. That is why a cult leader has to “force their hand”, using the whole range of manipulative techniques (emotional control, baiting, the “foot in the door”, emphasis…).
We all have an intuitive idea of what manipulation is, but few of us have conceptual and objective knowledge of it. We can sometimes have the impression that we have been manipulated, but it takes some time before we understand how the trap closed around us. Identifying and categorising undue influence and persuasion strategies takes a reflexive effort, a critical distance, to uncover all the components of the manipulative mechanism.
This difficulty in pinning down the cogs of undue influence is part of the reason for the spinner of yarns’ success. To smooth out these difficulties and to better understand how the power of cult domination is wielded, philosophers and thinkers, past and present, can provide a framework of analysis and conceptual tools.

1. Plato: the rationalist critic of the sophist’s manipulation

In philosophical texts throughout history, even the oldest, we find testimonies of concern about phenomena of mass manipulation. As early as the 5th century B.C., Socrates wondered how orators manage to overcome the people’s vigilance and lead them into counterproductive wars (1). A smooth talker like Alcibiades convinced Athens to fight a battle against its Spartan enemies during the Peloponnesian war that, in the end, turned out to be a disaster. The curse attached to the demagogic use of speech can explain Plato’s repeated attacks against the sophists, who were the worst usurpers of his time. Plato distinguishes the philosopher, who seeks knowledge (philosophia), from the sophists, who claim to possess knowledge. Sophists have one thing in common: they have an answer to everything. Plato describes them as “experts in the tempers and desires of a mighty strong beast”, that is to say the people, when their opinion is manipulated by opinion leaders. The sophist masters the demagogic art of cosying up to the public: “what irritates it”, “what softens it”, and “how to approach and handle it” (2).
Philosophical dialogue is about questioning, along with others, in order to walk together towards the truth, whereas in sophistic dialogue the aim is to produce an effect on others, to seduce them and convince them to concur with our own ideas. The sophist asks himself: “what does the other want to hear?”, “what can I say to please him?”
The first feature of manipulation, then, is to tell a tale unrelated to the truth, even though it claims to be the truth. The priorities of this tale are to win support, to lure mass approval and to capture targeted individuals. To persuade is not only to convince of an idea on rational grounds. To persuade is to defeat psychological resistance, to flatter self-esteem, to feed passions and to charm with promises of better days (3).
For the manipulator, what is essential is to start conversations, to find a good teaser, to have a launching pad. A few words stuttered by his prey will be enough to begin with. The main thing is to obtain a simple “yes” or, even better, several consecutive “yeses” (“yes, indeed, it’s a beautiful day”, “yes, you are right, there is a lot of traffic”, etc.). The skill of the manipulator consists in provoking in his interlocutor a state of mind favourable to acceptance. The manipulator does not appeal to reason or general ideas, but reaches for the imagination. It is about leading someone to adhere, not to think. In a manipulative process adherence is not rational, it is emotional. Perverted rhetoric becomes the mechanism to “seize emotionally” (4).
While the philosopher appeals to the mind, the sophist hounds his prey; he plays on emotions, which he manipulates with the artifices of eloquence. He knows how to overcome his audience’s vigilance with strong expressions, full of imagery. The sophist will be able to gain support where an honest orator would fail. This is how a man as wise as Socrates could defend “straight”, objectively valid ideas and yet not be heard or taken seriously by the masses: he did not know how to convey emotions through his speeches (5).

2. Troeltsch: the difference between adhering to a church and adhering to a cult

In the modern period, the first sociologists of religion, such as Max Weber (6) or Ernst Troeltsch (7), elaborated distinctions between cults and religions that we can still use to distinguish modes of adherence to spiritual groups. Religions recruit members through a system of affiliation. In broad outline, cults offer rupture where religions encourage continuity. Thus, the “Church” type corresponds to an institution of salvation that ensures, for everyone, the transmission of a founding story, such as the crucifixion and the resurrection of Christ. It does not look for innovation or exoticism. On the contrary, a religion remains loyal to a legacy, to the words of a prophet or a Messiah (8). The believer fits into the group through immersion in a tradition that existed before him. The present is placed under the authority of the past. New members are incorporated from birth. With baptism, the individual is at the centre of a ritual of incorporation into the community. He is assimilated into the religion de facto, with the support of his family. The new member has never been personally recruited; psychological manipulation is simply not needed.
To underline its specificity relative to cults, Troeltsch attributes to religion an extensive mode of action. The fact that “catholic” means “universal” is significant. The number of members matters more than the way they live their faith. The cult, on the contrary, follows a logic of intensity. It requires a conscious and personal commitment; the quality of the experience is prioritised over the quantity of members. Admission into the group proceeds, in theory at least, from an individual choice, if need be in rupture with the religion of origin.
Because religion places itself under the sign of extension, because it aims to embrace all societies and cultures (even if it means compromising with local beliefs and particularities), it makes only minimal ethical demands of its members. It is clear that if religion were demanding, it could not spread beyond a small circle of initiates. Either one asks a lot and will obtain little, or one asks little and can hope to obtain a lot. The aim of a religion is not to change its members’ ordinary lives. The realistic moderation of its ambitions explains why religion does not need to use artifices of manipulation. Admission into a cult is much more costly because it requires a conversion, a rupture with others and with oneself. The leader asks for an allegiance without limits and a total commitment. A cult is “a radical commitment to serve a radical cause” (9). Manipulative strategies are essential for the guru to incite the individual to leave his current life behind and to disown both his historical and family attachments.

3. Joule and Beauvois: the misleading feeling of freedom

We know that “free will” is often invoked by spiritual leaders and upholders of liberalism to counter the idea of “undue influence”. On this view, it would not be legitimate to talk about “enrolling” minds in cults because members are supposedly free to commit themselves. We can ask ourselves whether the inner feeling of freedom is not, precisely, the crucial cog of manipulation. Social psychologists have defined manipulation as a “voluntary submission” (10). To be manipulated is to “freely” do what the other expects us to do. Expressions such as “you are free to accept or to refuse to follow me” and “I would perfectly understand if you refused; you are free to do what you want” are used by any guru or spiritual adviser to create a bond of trust. The member is reassured by this apparent flexibility of choice left to him (“I can trust him because he is offering me the choice to come or not to come to the meditation sessions of his community”).
Another common belief is that some people, being vulnerable, are more easily influenced, and that this explains manipulation. Here again, social psychology warns us to be cautious with preconceptions. It teaches us that it is not people’s character that explains their submissive behaviour but the actions and decisions they have previously made (11). It shows that once one has committed to a cause or a group, there is a risk of being trapped by one’s own initiative. Everyone has a tendency to agree with his or her own decision, following a kind of self-manipulation. We would rather talk about “adherence” (10) to insist on the fact that it is not a conscious and rational agreement. Anyone who has taken a decision tends to stick to it and not let it go. This natural downward spiral can lead to what is called an “escalation of commitment” (11), based on our tendency to persevere in a course of action even when it becomes overly costly to us. In everyday language we designate this phenomenon with expressions full of imagery such as “slippery slope” or “tripping over the carpet”. During wars, belligerents seem to find in each defeat a reason to keep on fighting. We obscurely look for confirmation of the rightness of a decision even when the facts obviously show that it is time to stop and get out of the trap. We continue to act, against all odds, because we have spent time and energy. We are loath to undo what we have fought for, which is one of the main reasons for the phenomenon of “akrasia” (4): acting against our better judgment. What we refuse to accept is the “waste”. We want to rescue the meaning of what we did (“I did not do all this for nothing!”), perhaps also for the sake of our self-esteem. We can imagine the effect of the “escalation of commitment” on a person attracted to a cult who, facing mockery and irony from relatives, will persevere on her path to prove that she was right to frequent the group.
Psychologists Beauvois and Joule observe, among other things, that the force of our supposedly “free” commitment can vary according to certain attributes of the decision. Thus, we feel more obliged when we have made a decision in public: I committed myself before other people. My freedom is also reduced when I have committed myself explicitly. For instance, I was asked whether I would come to the community’s meeting or take out a subscription, and I said “yes”. My answer was straightforward; it was not uncertain or indecisive and could leave no doubt. My room for manoeuvre becomes even smaller when what I said is irrevocable. I feel even more bound by my first decision because I have the impression that I cannot undo it. I promised that I would come to the session or the seminar “tomorrow” or “on the weekend”. Now it seems hard to rethink my decision. If I had said something vague (“in the next two months”, “one of these days”, etc.), I would be freer in my acts and deeds. I could more easily change my mind, pleading a change of circumstances.
Beauvois and Joule also point out that we are even more committed to an action when it is done repeatedly. I have opened my door to Jehovah’s Witnesses once or twice in the recent past; it will be more difficult not to let them in next time (even if this is only a probability and not absolute determinism).
Let us note, once again, that I will struggle to free myself from my commitment when the beliefs the guru asks me to endorse are compatible with my own. Social psychology talks about the “non-problematic character” (10) of a belief: I submit more easily to beliefs that are dear to me. For instance, if I believe that modern civilisation is on a decadent path, that people should turn to God and that we need a spiritual regeneration, I will more readily agree with the discourse of a spiritual leader who supports this kind of affirmation.
Finally, we will note that we feel more committed to a decision when its consequences matter to us. For instance, if I have given several hundred euros to a cult leader (while already going through financial difficulties), I will feel more committed than if I had only given a little of my time. I want my decision to have been meaningful because it cost me, in an economic sense.
We can see, then, that behind the apparent freedom to commit to a cult lie six factors of commitment which create favourable ground for manipulation:
– The visibility of one’s decision in front of someone else
– Its explicit character
– Its irrevocability
– The repetition of the act
– The non-problematic character of belief
– The importance of consequences

4. The shrinking range of emotions

Social psychology looks at behaviour and considers it objectively, from the outside, by assessing probabilities. To complete this external perspective, we still have to characterise the phenomenon of manipulation from the inside. To do so, in my book Le Gouvernement des émotions et l’art de déjouer les manipulations (The Government of Emotions and the Art of Foiling Manipulations), I proposed the concept of the shrinking range of emotions (4). What is it?
We usually feel a wide range of emotions, with varying degrees of intensity. Descartes even enumerated 34 of them, some natural, some cultural, some simple, some complicated (13). Emotional shrinking refers to a decrease in the number of feelings the member experiences. It consists in a tendency to always feel the same emotions, and in a more acute way. That is where the emotional tipping point from religion to cult lies. When I live under the sway of a guru, my emotional life is virtually reduced to four feelings: admiration, fear, guilt and gratification:

– I will admire the guru’s charismatic aura and, correlatively, I will experience less admiration for movie stars or athletes.

– The briefest thought of leaving the cult will come with fear of punishment by forces from the beyond and of losing the esteem and recognition earned within the group.

– I will feel guilt because I cannot fulfil the cult’s expectations.

– I will also feel gratitude at having become someone important, the narcissistic satisfaction of having a mission on earth, of being chosen among the damned, lucid among the blind. Gérald Bronner highlights the fact that cult-related movements “offer the individuals who join them micro-societies where the cards have been re-dealt, where it is once again possible to hope to reach a status that meets their expectations” (9).

The intensity and recurrence of these four feelings correlate with the loss of other emotions from ordinary life. The focusing of emotional energies on the group and its leader causes a decrease in the usual range of emotions, which often gives relatives the impression of an “anaesthesia of the heart”. What moves us leaves the member indifferent. He seems to be “a stranger to the world”, which is the very definition of alienation (alienus, stranger). In psychoanalytic jargon, we would say that his libidinal resources are focused on the guru through a mechanism of transference. The member’s affectivity has not disappeared but has been channelled in a single direction and siphoned off by the guru for his own benefit.
Consequently, helping a disciple to leave a cult cannot be a matter of “bringing him back to reason” with philosophical or scientific arguments. Frontal opposition could even be counterproductive, strengthening the member in his beliefs as he grows irritated to see his “knowledge” questioned. It is rather with caring benevolence that relatives can hope to reset the cult victim’s emotional dynamic, by arousing emotions other than the ones his guru cultivates in order to manipulate him.

Conclusion: a new path to explore to help victims

The analysis of the affective tipping point between religion and cult allows us to understand why educated and sensible persons can also be trapped by manipulative strategies. It is on the terrain of affectivity that manipulation takes hold. Thus, besides rational means (those of education and culture), we need to focus our attention on other possible ways of preventing cults’ undue influence. From this perspective, I propose widening the range of expression of feelings in order to re-establish their natural diversity. The idea is to reactivate the natural relation of balance between emotions. By increasing their number, emotions can, among other things, decrease one another’s intensity. Only emotions can balance emotions and advance the liberating doubt in the member’s mind.
Associations that fight against cults could, in the future, look at this emotional shrinking phenomenon in order to clear paths for resetting the dynamic process of emotional balance in cult members’ minds. To the extent that emotions are often set off by sensory perceptions from the outside world, it is most likely by renewing the member’s field of sensory perception that he will be able to retrieve his emotional flow and see the world through an emotional kaleidoscope again. This approach could also help victims who have left a cult to free themselves mentally more effectively.

Bibliography:
(1) Plato, The Sophist, 231a, French translation by E. Chambry, Garnier-Flammarion, 1969.
(2) Plato, The Republic, Book VIII, 493c, French translation by R. Baccou, Garnier-Flammarion, Paris, 1966.
(3) Mucchielli A., L’art d’influencer : analyse des techniques de manipulation, Armand Colin, 2005.
(4) Le Coz P., Le gouvernement des émotions et l’art de déjouer les manipulations, Albin Michel, 2014.
(5) Plato, Apology, 34c, French translation by E. Chambry, Garnier-Flammarion, Paris, 1965.
(6) Weber M., Sociology of Religion (texts chosen and translated by Jean-Pierre Grossein), Gallimard, Paris, 1996.
(7) Troeltsch E., Protestantism and Progress, Gallimard, Paris, 1991 (republication).
(8) Hervieu-Léger D., La religion pour mémoire, Éditions du Cerf, Paris, 1993.
(9) Bronner G., « Approche sociologique : le terreau favorable à l’emprise mentale », in L’emprise mentale au cœur de la dérive sectaire : une menace pour la démocratie ?, Actes coll., 2013, pp. 14-43.
(10) Beauvois J.-L. and Joule R.-V., La soumission librement consentie, Presses Universitaires de France (PUF), 1998.
(11) Beauvois J.-L. and Joule R.-V., Petit traité de manipulation à l’usage des honnêtes gens, Presses Universitaires de Grenoble (PUG), 2002.
(13) La Rochefoucauld, Maximes, coll. « Grands Écrivains », Paris, 1987.
