Anthropology, Western polite society, and the conspiracy-theory taboo
Taboos reveal fundamental organizing values in a society.
Taboos can force people to bear enormous costs.
Among the university-educated in the West, conspiracy theory is taboo.
This taboo organizes an interesting social grammar, here described.
In modern democratic societies the term ‘conspiracy theory’ generally refers to Machiavellian hypotheses: claims about powerful actors who—either in secret or in a narratively camouflaged manner—work to wreck democratic institutions and citizen rights.
Among ‘Politers’—as I call Western natives of ‘polite society,’ or the university-educated world—a near-absolute moral prohibition against doing conspiracy theory has dominated for about a generation and a half. It’s a taboo. The experts on taboos, sociocultural anthropologists, should be taking a keen interest in this.
We anthropologists by tradition study (perhaps above all else) how particular values, beliefs, and norms change and become stable in different ways in different societies. We find taboos quite interesting and useful, analytically, because whatever a taboo protects is—by definition—sacred: an interpretive key to certain fundamental and organizing values.
If you wish to grok something deep, then, about university-trained Westerners—arguably the most powerful culture that ever existed—then consider the conspiracy-theory taboo. We’ll do that. But first, a brief preliminary to savor the basic concept.
Thinking through taboo
The word migrated into English after Captain James Cook, disembarking in the Tongan archipelago of Polynesia in the 1770s, was startled by the social prohibitions he witnessed, which the locals named taboo.
In the presence of the king, he saw, subjects felt compelled to perform a ritual:
“they come and squat down before him, bow the head to the sole of his foot, …[and] after raising the head they tap or touch the sole of the foot with the under and upper side of the finger of both hands then rise up and retire.”1
An obvious display of submission, and so interpreted by Cook. Yet the king, he also noticed, was bound to this ritual no less than his subjects.
“It appeared that the King could not refuse anyone who chose to pay him this compliment, for the common people would frequently take it into their heads to do it when he was walking and he was always obliged to stop and hold up one of his feet behind him till they had done; this, to a heavy, unwieldy man, like him, must be attended with some trouble and pain …”
The king was so annoyed by this, in fact, that Cook more than once witnessed the hapless monarch trying to run away or hide lest he be forced to accept the submission rituals of his eager subjects.
But on those occasions when the sovereign could not escape his kingly ritual fate, he passively operated a kind of magic on the hands of whoever had paid him respect.
“The hands [of the subject] after performing this ceremony are, in some cases, rendered useless for a time, for until they are washed they must not handle any kind of victuals (…) When the hands are in this state they call it Taboo rema. Taboo in general signifies forbidden and rema is hand.”
A person could remain in taboo rema or ‘forbidden hands’ for a while, and would then have to be fed by others. The way of removing this condition, to recover the hands, was cumbersome.
“[W]e have frequently met with women that have been Taboo rema and have been fed by others, but never with a man. When the time is expired, she goes and washes in one of their baths which are dirty holes for the most part of brackish water, she then comes to the King, makes her obeisance in the usual way, then takes his foot and applies it to her breast, shoulders and most other parts of her body, he then embraces her first on the one shoulder and then on the other, after which she retires purified from her uncleanness.”
Taboos often compel, as here, behaviors that are costly to perform. Yet, for all those costs, taboos cannot be trifled with. And why not? Because a taboo violation brings mental and social agony.
This is tremendous.
Yet, tremendous though it is, gazing at this analytically in one’s own society is difficult, for it is the foreigner’s behavior that cries out for explanation, not one’s own. And so, though stupefying social prohibitions—which coerced people into costly ritualized behaviors—existed aplenty in Captain Cook’s own 18th-c. society, it was Tonga rather than England that astonished the good captain.
The term taboo now stands in Western languages for any ‘don’t,’ in any society, the explicit policing of which is fairly ritualized, and whose violation produces severe mental suffering and even social upheaval (think, for example, of the mental and social consequences of violating incest taboos).
In my view, the ban on conspiracy theories, diffusely and semi-formally enforced throughout Western polite society, and with special ferocity inside the academic world, is—in the full, technical, anthropological sense—a taboo. This taboo reveals much about the most powerful culture in the world: university-trained Westerners.
The conspiracy-theory taboo as deployed in grammatical action
Anthropologists are keenly aware that all manner of human interactions beyond language itself are governed by grammars.2 So, just as an English article forces one to follow with an English noun, in other domains of action one must also always carefully precede and follow certain complex behaviors with certain others according to the relevant grammatical rules.
Philosopher Matthew Dentith, interested himself in how Politers—for whom he writes—process conspiracy theories, has observed one important aspect of this grammar: if you dare share with others something that might sound like a conspiracy theory, then you must—by grammar—precede it with a specific apology.
“…we … start out by saying ‘I’m not a conspiracy theorist, but…’ an expression that typically means we want to put forward some conspiracy theory without having to suffer the ignominy of being called a ‘conspiracy theorist.’ ”3
Preparing others for something risky with ‘I am not a conspiracy theorist, but…’ is an expected, ritualized, grammatical behavior, no less than when a Tongan subject bows the head to the sole of the sovereign’s foot. And it likewise expresses submission—in this case, to the social rule that Machiavellian hypotheses are unacceptable in polite society.
But what if, in brutal disregard for this etiquette, a Machiavellian hypothesis should be presented unapologetically? There you are, at a Politer social gathering, chatting with a man who out loud (gasp) considers the merits of a Machiavellian hypothesis, or even (horrors!) defends it. Now what happens?
He becomes social kryptonite. An almost overpowering urge wells up in you to avoid and/or deny and/or denounce and/or laugh at him. For this guy is not merely weird—he is socially dangerous; association with him will be interpreted by others (whose eyes you are already nervously casting about for) as either tolerance for or… or… (gasp!) agreement with unspeakable heresy. Shame!
Your next move will be important. It is grammatical to dismiss the Machiavellian hypothesis with a side flourish about craziness. And grammatical, too, to then change the subject. But first, lest a powerful collective shame utterly shatter the social peace, you may have to punish the conspiracy theorist—the apostate—with public ridicule. This shuts the briefly opened Overton window and reestablishes the taboo.
A preferred punishment strategy is to ask the deviant—loudly, so that all within earshot may hear—whether he also believes X, where X (inter-dimensional shape-shifting space lizards in government, say) is something all at hand can agree is nuts, thus killing the proposal—which was not about that—by gratuitous association. The intellectual work of refutation is thus conveniently sidestepped and the apostate, not without sadistic glee, socially quashed.
Unless pious repentance is soon forthcoming from the deviant theorist, the long-term consequence is social disarticulation, for “the fate of the apostate,” wrote Tom Wolfe, keen observer of polite society, “is that curse known as anathema.”4
The danger of that curse, of being cast out, is so keenly felt that an apprehension grows—almost a panic—that Machiavellian hypotheses, notwithstanding their alleged automatic silliness, might possess an addictive, quasi-erotic attraction, leading us into temptation and then getting us ejected from polite society. And so, like any addict powerless before the lure, we beseech a Higher Power—polite society—to help us stay sober. In many contexts, then, ‘I am not a conspiracy theorist, but…’ is like a confession to one’s fellow addicts: ‘Just look what my mind wants to do! Help guide me back—please!—to a properly dogmatic (non-conspiratorial) interpretation.’
Anyone tempted to scorn this should be reminded that a lot is at stake. Politers who carefully signal to others ‘I am not a conspiracy theorist’ are protecting their identities to stay safe.
Identity and taboo
A human life may be conceptualized as navigating an endless series of social ‘games,’ each ruled by a particular grammar. I have argued that, for analytical purposes, a social grammar should be understood as a functional, teleological consequence of the meaning goal that organizes the cooperative performances of the ‘players.’ Ideally, one reduces that meaning goal to one pithy sentence.
In any identity-signaling game, the meaning goal is, generally: ‘We are all legitimate members of this society, and we accept and respect the social consequences of belonging.’ Play the game improperly and you will lose personhood. This is no trifle.
As Grace Harris long ago explained in an influential essay published in American Anthropologist, “not all individuals acquire the standing of full persons as agents-in-society.” People everywhere “look to an ideal [a local ideal] of ‘normal’ human characteristics … [which] are seen as making possible”—or at any rate socially acceptable—“the performance of meaning-laden conduct.”5
Among Politers, the identity-signaling game has the following specific meaning goal: ‘we Politers don’t do conspiracy theory because that is for rednecks without a university education.’ Playing the game properly, in many contexts, requires that participants say, explicitly and in so many literal words, ‘I am (of course!) not a conspiracy theorist!’, or ‘I don’t believe in conspiracy theories.’ (Or some such.)
Such confessions are a bit like secret handshakes by which full (or ‘true’) Politers implicitly recognize each other. At stake is not only legitimate membership and normality, but also social status and self-esteem. For Politers are the intelligentsia, the university-educated professionals, cultured and educated, standing, as they see it, above the uncouth, ignorant, and cognitively undisciplined ‘rednecks.’
One who plays this game improperly—and especially within the halls of academia—may be compared to one who, in various societies of West Africa, refuses to get the traditional scars. The scars signal group identity, yet not everybody wanted them, as anthropologists have documented. And one can sympathize: the scarification patterns are huge, often covering the entire body, including the face, and each little element of the pattern requires cutting the skin with a knife. The cost in pain and risk of infection is high. But for those who refuse the scars there is a different cost: loss of personhood.
Following Harris, the ‘History of Scarification in Africa’ page explains that people without the requisite scars
“were generally not included in the group’s activities . . . [for they] are not considered as acquiring the full standings as agents in their society, [and] they would also lack the capacity for meaningful behavior, such as greeting, commanding, and stating. Therefore, scarification can transform partial tribe members into normal [persons] entirely accepted by the group. Scarification … gives the ability to communicate fully … [it’s] a key element for being considered as a normal member of the group.”
In social groups or categories where scarification or other outward markers are lacking, you must often display a scar of the mind.
In the identity-signaling game of political groups, especially, one must express a key belief or conviction in order to be recognized as a full member equipped with social agency. Among environmentalists, for example, belief in anthropogenic global warming plays this role. To doubt the anthropogenic hypothesis is to be revealed as a heretic, a traitor, or a spy for Big Oil, or some such, and one is instantly mobbed.
In Western polite society the mental scar to be displayed is one’s commitment to the conspiracy-theory taboo. A public and unapologetic defense of a Machiavellian hypothesis therefore implicitly labels a Politer as a kind of stealth redneck: someone who wasted a university education. Having thus become abnormal, these fringe Politers lose their scars and hence “the full standings as agents in their society,” forsaking their “ability to communicate fully” and wrecking their “capacity for meaningful behavior.” They can no longer speak and be heard. In the extreme, they are pathologized as half-persons fit for psychiatric attention to cure their ‘paranoia.’
The stakes in this game are high for all of us, not just Politers. For there is little question that Politers are in charge and their influence is vast. Politers author what modern society calls knowledge, as they populate the technocratic managerial positions in business, educational, and government bureaucracies. The taboo on conspiracy theory, therefore, since it is required for full membership in polite society, is also a de facto litmus test for social leadership.
The consequences of that for social science and democratic politics will be examined in a future essay.
But I must notice here, in closing, a curious formal similarity between our time and the social realities of the 1950s in the United States. Back then, during the infamous McCarthyite persecutions, US citizens presumed to be communists—and sometimes even clandestine Soviet agents—were dragged from every walk of life to appear before all manner of tribunals until ‘redeemed’ by public recantation: ‘I am not a communist (anymore).’
Many who recanted were not even communists (communists were quite scarce). Why did they recant, then? Because denouncing the unconstitutional process itself required refusing to recant on principle, so as to dispute the validity of persecuting someone for belief, and that called for great courage: it meant confronting society and government. And refusal to recant—as in the heretic and witch trials of the Inquisition—would be interpreted, anyway, as evidence of communist guilt! People had families, jobs, and lives; many preferred to recant and get on with it.
Similarly today, university-educated Westerners apparently all feel the burden of a silent accusation, charged implicitly as potentially ridiculous until cleared by pious, public genuflection: ‘I am not a conspiracy theorist.’ Whatever their true beliefs, they choose to recant.
(And did you notice the dramatic historical flip? Back when anyone in the United States, including people high up in government, could be suspected of working for the Soviets, conspiracy theory was the norm. But now it is taboo. Interesting. Might reality be managed?)
Beaglehole, J. C. (2015). The Journals of Captain James Cook on his Voyages of Discovery: Volume III, Part 2: The Voyage of the Resolution and Discovery 1776-1780 (1st ed.). Routledge. (pp. 175-176)
(I have updated Cook’s spelling when necessary and improved his punctuation.)
Gil-White, Francisco, Towards an integrated theory of cooperative grammatical performance: Saussure, Malinowski, Ardener, Geertz, and gene-culture coevolution (April 20, 2020). Available at SSRN: https://ssrn.com/abstract=3581263
Dentith, Matthew R. X. The Philosophy of Conspiracy Theories. Palgrave Macmillan. Kindle Edition. (p. 2)
Wolfe, Tom. (2009). From Bauhaus to Our House. Farrar, Straus and Giroux. Kindle Edition. (p. 68)
Harris, G. G. (1989). Concepts of individual, self, and person in description and analysis. American anthropologist, 91(3), 599-612.