Ship of Fools
They also found that:
Some elderly men rarely ate human flesh, and small children residing with their mothers ate what their mothers gave them. Youths, who were initiated around the age of ten, moved to the men’s house, where they began to observe the cultural practices and dietary taboos that defined masculinity. Consuming the dead was considered appropriate for adult women but not men, who feared the pollution and physical depletion associated with eating a corpse. The epidemiological information provided by Gajdusek and Zigas in 1957—that kuru occurred among women, children of both sexes, and a few elderly men—seemed to match perfectly the Fore rules for human consumption. (Lindenbaum 2015: 104-5)
Which it did. To cut a long story short, Gajdusek was joined in his research by Stanley Prusiner, a neurologist who, like Gajdusek, received the Nobel Prize. The genetic basis of kuru had been rejected, and Gajdusek had shown that the disease could be transmitted to chimpanzees exposed to infected material, which suggested to him that the disease was carried by a slow virus. Prusiner, however, showed that kuru was actually caused by prions, defective protein molecules which contain no genetic material, and was a spongiform encephalopathy in the same family as Creutzfeldt-Jakob disease. The point about prions was that, whereas a slow virus would allow kuru to be spread simply by contact, prions required the actual consumption of brain matter, and the obvious occasions for this were the Fore mortuary ceremonies in which the women ate the brains of the deceased. With the demise of cannibalism the incidence of kuru fell steadily over the years, and by 1982 there were very few deaths, and the sex ratios were now equal (Liberski 2013: Fig. 4, 476). The disease is now considered extinct.
While Arens admitted that “it is impossible to prove that cannibalism is not a factor in the kuru syndrome”, he nevertheless was not convinced: the evidence was circumstantial, there were contradictions in the ethnography, and the same material lent itself to alternate explanations (Arens 1979: 112). He points out that Fore cannibalism had never been observed by an outsider, and that the anthropologists were uncertain when it had been abolished. “As a result, Glasse and Lindenbaum relied upon Berndt’s idiosyncratic discussion of the material, the fact that the Fore had a reputation among surrounding groups for eating their dead, the odd report that someone had eaten someone else and the belief among the males that ‘the great majority of women’ were cannibals” (109). Of this belief about the Fore women he says: “Rather than uncritically accepting the native view that only women and children are cannibals, it would seem reasonable to question whether or not this might be a symbolic statement about females, in a culture area renowned for sexual antagonism and opposition.” (110). He goes on, “Another reasonable suspicion of the cannibalism hypothesis is aroused by the fact that among the Fore each death is followed by a mortuary feast involving the slaughter of pigs and distribution of the meat and vegetables…. This period of an abundance of animal protein would seem to be the least likely time to resort to cannibalism” (111).
With regard to the transmission of the disease, which by 1979 had been accepted as related to the Creutzfeldt-Jakob family, he remarks that no one has suggested that such diseases “are transmitted in the western world by cannibalism. However, such a hypothesis presents no problem when the affected population is the inhabitants of the New Guinea highlands. This is consistent with the general theoretical tone of much of the anthropological literature on this area, which effectively diminishes the cultural achievements of the inhabitants” (112). With regard to the initial appearance of the disease he says, “Surprisingly enough, no one has seriously considered the idea that the presence of Europeans in the area was responsible for the outbreak of the epidemic at the turn of the century. The arrival of the first two Europeans in 1932 does not deny the possible entry of the disease years before through indirect means and intermediaries” (113). He also points out the important social changes that have occurred since European contact, such as the disuse of the men’s house and men moving in to live with their wives and children: “In the light of the obvious cultural rearrangements and new experiences, it is odd that scientific researchers have seized on a correlation between something which was never seen and another phenomenon studied and measured so meticulously” (113).
Arens’s hilarity at the racist idea of Creutzfeldt-Jakob disease being transmitted by cannibalism turned out to be misplaced, however, since it was cattle cannibalism, in the form of brain and spinal cord matter from diseased animals being included in cattle feed, that led, a few years later, to the spread of BSE in Britain. Bovine Spongiform Encephalopathy, or Mad Cow Disease, was a prion disease that also infected a number of humans in the form of vCJD, variant Creutzfeldt-Jakob Disease, and led to a ban on the export of British beef in 1996.
In 1997, in “Man is off the menu”, Arens added a further refinement to his “refutation” of Fore cannibalism, which is worth quoting as an example of his methods of scholarly disputation:
There was a particularly notable agreement [among anthropologists] that cannibals did exist, however, until practically yesterday, in the highlands of New Guinea, the “final frontier” of western cultural contact. In this instance many smugly noted that the evidence for cannibalism emerged from medical research rather than from the usual less reliable forms of documentation. In the light of the exalted position of science, how could any rational person doubt this research? I discovered, with perhaps even more smugness, that one could. The story began in 1957, with the arrival in New Guinea of D. Carleton Gajdusek, an American research paediatrician on his way home from a fellowship year in Australia. Why he opted to visit this part of the world did not become clear until recently. However, the eventual results of the sojourn proved important for both medical science and for Dr Gajdusek. Eventually, he would receive the Nobel prize for medicine, and then, later, be arrested and plead guilty to the sexual abuse of minors in the US. He adopted a number of boys from part of New Guinea well known for institutionalised male homosexuality between youngsters and adults. Laudatory reports of Gajdusek’s charity, including references to his bringing a number of the lads to the Nobel ceremonies, were recounted in the media. (Arens 1997: 16)
Gajdusek’s subsequent criminal conviction related to boys of a different people from the Fore and had nothing whatever to do with his kuru research, and therefore provides Arens with no grounds for doubting it, smugly or otherwise. Arens, as we might expect, makes no reference in his article to Prusiner’s work and the crucial association of brain matter with prions, which provided conclusive support for the cannibalistic thesis and was well established by 1997.
I leave it to my readers to decide if they find these various arguments of Arens even a remotely adequate response to the facts presented on Fore cannibalism. Shirley Lindenbaum comments that “Although discredited today, the denial of cannibalism was kept alive during the 1980s and 1990s by a generational shift in the human sciences, glossed as postmodernism, which studied metaphor and representation, providing new life for the idea that cannibalism was nothing more than a colonizing trope and stratagem, a calumny used by colonizers to justify their predatory behavior” (Lindenbaum 2015: 108).
To sum up, then, Arens’s charge that anthropologists engage in “manipulating the data to generate a foregone conclusion” where “academic standards seem to function as an almost forgotten ideal”, actually turns out to be a very accurate description of his own book, and Marshall Sahlins, who has done more than most to refute it, may be allowed the last word:
It all follows a familiar American pattern of enterprising social science journalism: Professor X puts out some outrageous theory, such as the Nazis really didn’t kill the Jews, human civilization comes from another planet, or there is no such thing as cannibalism. Since facts are plainly against him, X’s main argument consists of the expression, in the highest moral tones, of his own disregard for all available evidence to the contrary. He rises instead to the more elevated analytical plane of ad hominem attack on the authors of the primary sources and those credulous enough to believe them. All this provokes Y and Z to issue a rejoinder, such as this one. X now becomes ‘the controversial Professor X’ and his book is respectfully reviewed by non-professionals in Time, Newsweek, and The New Yorker. There follow appearances on radio, TV, and in the columns of the daily newspapers. (Sahlins 1979)
Notes
1. The class of stupid, ignorant people.
2. For Cook’s actual Journal entry see J.C. Beaglehole, ed., 1969. The Voyage of the Resolution and Adventure 1772-1775 (Cambridge: The University Press for the Hakluyt Society), pp. 292-293.
3. But Sahlins also explains that the authorship of this account might have been mistakenly attributed to Endicott:
It could be that Endicott indeed did not see the event, insofar as he may well not be the author of the contested text. The original of that text, reprinted and signed by Endicott as an appendix to his book, is an article that appeared in The Danvers Courier newspaper on 16 August 1845, under the byline “By an Eye Witness”. The Peabody Museum, where the article is archived, apparently attributes it to a different member of the Glide’s crew, Henry Fowler (of Danvers), with whose papers it is included (Fowler, PMB 225). Indeed, a simple “F” is inscribed at the bottom of the original newspaper article (Sahlins 2003: 3, n.3). But whether Endicott or Fowler provided the actual account, it is confirmed by numerous other contemporary records.
Chapter VI: So all languages aren’t equally complex after all
1. All languages are born equal
People outside the specialised sphere of linguistics have generally taken it for granted that, just as there are simple and complex cultures, there would correspondingly be simple and complex languages. But for most of the last hundred years linguists have claimed that even if some cultures are simpler than others, All Languages are Equally Complex: ALEC, or uniformitarianism. “There are Stone Age societies, but there is no such thing as a Stone Age Language. Earlier in this [20th] century the anthropological linguist Edward Sapir wrote, ‘When it comes to linguistic form, Plato walks with the Macedonian swineherd, Confucius with the head-hunting savage of Assam’” (Pinker 2015: 25). Or again, “[N]o sign of evolution from a simpler to a more complex state of development can be found in any of the thousands of languages known to exist or to have existed in the past” (Lyons 1977 (I): 85, and see Lyons 1970: 21-22). Or, as a fairly recent linguistics textbook has said, “All languages are equally complex and equally capable of expressing any idea” (Fromkin et al. 2010: 34)¹.
Indeed, many would also dispute that there are “Stone Age” societies, and argue that non-industrial peoples had systems of language, knowledge, and culture as complex and valid in their world view as our own. As one anthropologist has said, “All people are essentially equal in their ability to become cultured, and all people encounter approximately the same amount of information in the process of enculturation. Thus it is untenable to maintain that one culture is ‘higher’ or more complex than another. In reality, there are no simple or primitive cultures: all cultures are equally complex and equally modern” (Hamill 1990: 106). Or again, “[People] think the same thoughts, no matter what kind of grammatical system they use; and they express the same kinds of thoughts, regardless of the grammatical tools they have: past, present and future events, cause and effect relationships, social relationships, hypothetical questions, and so forth” (Jackendoff and Wittenberg 2014: 66).
There is no doubt that egalitarian ideology has been a very powerful motivation for this belief. “The reason why linguistics was worth studying, for many descriptivists [such as Sapir and Boas], was that it helped to demonstrate that ‘all men are brothers’—Mankind is not divided into a group of civilized nations or races who think in subtle and complex ways, and another group of primitive societies with crudely simple languages” (Sampson 2009a: 4). But while linguists could justifiably point out that some languages spoken by tribal peoples could be grammatically and phonologically more complex than some European languages, there was no systematic attempt to find evidence for the general theory of uniformitarianism. Hockett, for example, simply maintained that the total grammatical complexity of any language was more or less bound to be the same as any other’s, “since all languages have about equally complex jobs to do” (1958: 180), a very strange assumption indeed, as we shall see.
When the traditional “descriptive” linguistics of Sapir and others² was replaced by the generative linguistics (which became Universal Grammar) of Chomsky and his school, the dogma of equal complexity remained the same:
If we come forward to the generative linguistics of the last forty-odd years, we find that linguists are no longer interested in discussing whether language structure reflects a society’s cultural level, because generative linguists do not see language structure as an aspect of human culture. Except for minor details, they believe it is determined by human biology, and in consequence there is no question of some languages being structurally more complex than other languages—in essence they are all structurally identical to one another. Of course there are some parameters which are set differently in different languages: adjectives precede nouns in English but follow nouns in French. But choice of parameter settings is a matter of detail that specifies how an individual language expresses a universal range of logical distinctions—it does not allow languages to differ from one another with respect to the overall complexity of the set of distinctions expressed, and it does not admit any historical evolution with respect to that complexity.… The innate cognitive machinery which is central to the generative concept of language competence is taken to be too comprehensive to leave room for significant differences with respect to complexity. (Sampson 2009a: 6-7)
2. The aims of this paper
In the general context of what we know about biological, social and cultural development the claim that all languages are equally complex is extremely odd. Biological organisms have obviously evolved increasingly complex forms, in the sense of having an increasing number of component parts, specialisation of function, and hierarchical structures, and the same process can be observed in social organization, culture, and technology. Why, then, should language be any different? As a social anthropologist who conducted several years’ fieldwork among the Konso of Ethiopia (Hallpike 2008) and the Tauade of Papua New Guinea (Hallpike 1977), I have always found it obvious from personal experience that claims that “all cultures are equally complex” are simply untrue, and my belief is supported by a vast ethnographic literature (see Hallpike 1992 for a summary). I also applied Piagetian developmental psychology to the data of small-scale, non-literate societies with subsistence economies (“primitive societies”) in The Foundations of Primitive Thought (1979) and Ethical Thought in Increasingly Complex Societies (2016). These assembled a wealth of evidence to show that modes of thought about the natural world, causality, classification, notions of the self, society, and ethics do indeed follow a developmental pathway, and that the thought worlds of modern literate urban societies are very different from those found in primitive societies. This work also refuted the standard anthropological dogma that individual psychology cannot be used to explain collective representations, and showed that since culture can only be transmitted through individuals, their psychology has to be an integral part in the formation of these collective representations.
Language is perhaps the pre-eminent example of a collective representation, although not being a linguist I did not feel professionally competent to challenge the doctrine of ALEC. But I have recently been encouraged³ to find that it, together with Chomsky’s Universal Grammar, is increasingly being rejected by linguists, and I have tried here to summarise their main conclusions for the benefit of anthropologists. The main theme of this paper is therefore a critique of the theory that language can be a genetically based “organ”, “instinct”, or “module”, and it aims to show that, while clearly the language capacity depends on some unique and evolved qualities of the human brain, the characteristics of natural languages cannot be understood unless they are also placed in the context of social relations and the ways in which these have developed in the course of history.
3. Chomsky and Universal Grammar
Chomsky began developing his theory of Universal Grammar or UG in the 1950s to demonstrate that language, or more specifically grammar (syntax + morphology), is a distinct cognitive function that is innate and genetically specified, a mental “organ” with very detailed characteristics like the heart or the eye. In adopting this approach Chomsky was in perfectly orthodox scientific company, since the prevailing view of the brain was that known as “localizationism”: “the idea that the brain is like a complex machine, made up of parts, each of which performs a specific mental function and exists in a genetically predetermined or hardwired location—hence the name” (Doidge 2007: 12). This view of language was highly compatible with subsequent developments in computer science, by which language could be represented as a specific computational programme, and it later also formed close links with the “evolutionary psychology” which grew up with sociobiology in the 1980s. This claimed that every mental function was a “module”, an encapsulated computational device evolved to solve all the various problems that our ancestors had encountered during the Pleistocene, whether it be detecting cheaters, child-care, mathematics, tool-use or, of course, language.
Chomsky used the theory of Universal Grammar very effectively to refute Skinner’s Behaviourist claim (Chomsky 1959) that speech could be explained without any reference to a supposed “mind”, but purely as the product of operant conditioning in which items of “verbal behaviour” were emitted in response to particular stimuli and then subject to reinforcement. Chomsky pointed out, however, that children were able to utter grammatically well-formed statements that they had never heard before, and could attain correct grammar without being constantly corrected, or even corrected at all. Behaviourist theory was quite incapable of answering these objections, which were decisive. For Chomsky, then, the basic justification for saying that the capacity for language must be an innate module or organ, a computational mechanism, was the argument from the poverty of the input, together with the lack of correction and the ease of acquisition in childhood (Pinker 2015: 40).