Why Don’t We Eat (More) Brains?
From Cannibals to Mario Batali, A Brief History of Eating Gray Matter
When I first touched a brain, it was braised and enveloped in a blanket of beaten eggs. That brain had started its life in the head of a calf, but ended in my mouth, accompanied by some potatoes and a beverage at an economical eatery in Seville. Seville is a Spanish city famous for its tapas, and tortilla de sesos (brain omelet), like other brain preparations, is an occasional offering there. On my brain-eating trip to Seville, I was too poor to afford sophisticated gastronomic experiences. Indeed, some of my most vivid recollections of the trip are of scrounging around supermarkets for rather less satisfying food, while the delectable tapas remained out of reach, only for the ogling. The brain omelet was certainly one of the better meals I had.
My next encounter with sesos came many years later in a laboratory at MIT, in a crash course on neuroanatomy whose highlight was certainly the handling and dissection of a real sheep’s brain. At that time, I was drawn to the class and to the sheep’s brain by a diffuse set of concerns that motivate many of my fellow humans to follow and even embed themselves in neuroscience. The brain is the seat of the soul, the mechanism of the mind, I thought; by studying it, we can learn the secrets of cognition, perception, and motivation. Above all, we can gain an understanding of ourselves.
The experience of handling a brain can be awesome, in the classical sense of the word. Is this lump of putty really the control center of a highly developed organism? Is this where the magic happens? Animals have had brains or brain-like structures for nearly 500 million years; for over 80 percent of that time, the ancestors of sheep were also our ancestors, and their brains were one and the same. Reflecting that extensive shared heritage, the shape, color, and texture of the sheep’s brain are quite like our own, and it is not hard to imagine that the sheep’s brain is endowed with transcendent capabilities analogous to ours. The internal complexity of the sheep’s organ is indeed almost as astounding as that of the human brain, with its billions of cells, trillions of connections between cells, and ability to learn and coordinate flexible behaviors that carry us across lifespans more convoluted than the cerebral cortex. The sheep’s brain bears witness to years of ovine toil, longing, passion, and caprice that are easily anthropomorphized. And that brain, removed from the rest of its body and everything the ex-sheep once felt or knew, is as powerful a memento mori as one can find.
But the sheep’s brain, like ours, is also a material highly similar to other biological tissues and organs. Live brains have a jellylike consistency that can be characterized by a quantity called an elastic modulus, a measure of a tissue’s capacity to jiggle without losing its form. The human brain has an elastic modulus of about 0.5–1.0 kilopascal (kPa), similar to that of Jell-O (1 kPa), but much lower than those of firmer biological substances such as muscle or bone. Brains can also be characterized by their density. Like many other biological materials, brains have a density close to that of water; given its size, an adult human brain therefore weighs about as much as a large eggplant, roughly three pounds. A typical brain is roughly 80 percent water, 10 percent fat, and 10 percent protein by weight, leaner than many meats. A quarter pound of beef brain contains 180 percent of the US recommended daily value of vitamin B12, 20 percent of the niacin and vitamin C, 16 percent of the iron and copper, 41 percent of the phosphorus, and over 1,000 percent of the cholesterol—a profile somewhat resembling an egg yolk. Risk of clogged arteries aside, why not eat the brain rather than study it?
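For the numerically inclined, the eggplant comparison follows from simple arithmetic: mass is density times volume. Here is a minimal sketch, assuming a typical adult brain volume of about 1,300 milliliters and a density of about 1.04 grams per milliliter; both are rough textbook figures rather than values given above.

```python
# Back-of-envelope estimate of adult human brain mass.
# Both figures below are assumed, commonly cited approximations.
BRAIN_DENSITY_G_PER_ML = 1.04   # water is 1.00 g/mL; soft tissue is slightly denser
BRAIN_VOLUME_ML = 1300.0        # typical adult range is roughly 1,200-1,400 mL

mass_g = BRAIN_DENSITY_G_PER_ML * BRAIN_VOLUME_ML   # mass = density x volume
print(f"Estimated brain mass: {mass_g / 1000:.2f} kg, "
      f"or about {mass_g / 453.6:.1f} lb")           # ~1.35 kg, ~3 lb
```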
*
About two million years ago, near what is now the southeastern shore of Lake Victoria in Kenya, ancient hominins were doing just that. Lake Victoria itself, the largest in Africa and source of the White Nile, is less than half a million years old and was then not even a glimmer in the eye of Mother Nature. Instead, the area was an expansive prairie, roamed by our foraging forebears, who subsisted on grassland plants and the flesh of prehistoric grazing mammals that shared the terrain. Archeological findings at this site, known as Kanjera South, document the accumulation of small and midsize animal skulls at specific locations over several thousand years. The number of skulls recovered, particularly from larger animals, substantially exceeds the corresponding numbers of other bones. This indicates that animal heads were separated from the rest of their carcasses and preferentially gathered at each site. Some skulls bear the marks of human tool use, thought to reflect efforts to break open the cranial cavities and consume their contents. Brains were apparently an important part of the diet of these early people.
Why brains? In evolutionary terms, the Kanjera humans were relatively new to meat eating; carnivory in Homo is documented as beginning only about 2.5 million years ago (Mya), though it is believed to have been a major factor in our subsequent development as a species. Nonhuman carnivorous families on the scene at 2 Mya had been established meat eaters for many millions of years already. The biting jaws and catching claws of the great Pleistocene cats, the giant hyenas, and the ancestral wild dogs were better adapted to slaying, flaying, and devouring their prey than anything in the contemporary hominin body plan. But early humans had advantages of their own: already the bipedal stance, the storied opposable thumb, and a nascent ability to form and apply artificial implements all conferred special benefits. If a primordial person stumbled across the carcass of a slain deer, pungent and already picked to the bone by big cats, she could raise a stone, bring it crashing down on the cranium, and break into a reservoir of unmolested edible matter. Or if she brought down an animal herself, she could pry off the head and carry it back for sharing with her clan, even if the rest of the animal was too heavy to drag. In such fashion, the hominins demonstrated their ability to carve out an ecological niche inaccessible to quadrupedal hunters. Although other carnivores competed vigorously with humans for most cuts of meat, brains may have been uniquely humankind’s for the taking.
Synchronicity on a geologic time scale may explain the coincidence of early hominin brain eating and the emergence of massive, powerful brains in our genus, but the two phenomena are connected in other ways as well. Highly evolved human civilizations and their corresponding cuisines across the world have produced edible brain preparations that range from simple, everyday dishes to splendid delicacies. Celebrity chef Mario Batali brings us calf brain ravioli straight from his grandmother, a dish that needs about an hour of preparation and cooking time. Traditional forms of the hearty Mexican hominy stew called posole are somewhat more involved: an entire pig’s head is boiled for about six hours until the meat falls off the bone. Unkosher, but perhaps appetizing all the same! Truly festive brain dishes are prepared across much of the Muslim world on the feast of sacrifice, Eid al-Adha, which celebrates Abraham’s offering of his son Ishmael to God. These recipes—brain masala, brains in preserved lemon sauce, steamed lamb’s head, and others—leverage the glut of ritually slaughtered animals generated on the holiday, as well as a cultural reluctance to let good food go to waste. And who could forget the highlight of Indiana Jones’s Himalayan banquet on the threshold of the Temple of Doom—a dessert of chilled brains cheerfully scooped out of grimacing monkey heads? Although it is a myth that monkey brains are eaten on the Indian subcontinent, they are a bona fide, if rare, component of the proverbially catholic Chinese cuisine to the east.
Even to the hardened cultural relativist, there is something slightly savage about the idea of consuming brains as food. “It’s like eating your mind!” my little girl said to me at the dinner table, a scowl on her face. Eating monkey brains seems most definitively savage because of the resemblance of monkeys to ourselves, and eating human brains is so far beyond the pale that on at least one occasion it has invited the wrath of God himself. The unhappy victims of that almighty vengeance were the Fore people of New Guinea, discovered by colonists only in the 1930s and decimated by an epidemic of kuru, sometimes called “laughing sickness.” Kuru is a disease we now believe to be transmitted by direct contact with the brains of deceased kuru sufferers; it is closely related to mad cow disease. The Fore were susceptible to kuru because of their practice of endocannibalism—the eating of their own kind—as Carleton Gajdusek discovered in epidemiological studies that later won him a Nobel Prize. “To see whole groups of well nourished healthy young adults dancing about, with athetoid tremors which look far more hysterical than organic, is a real sight,” Gajdusek wrote. “And to see them, however, regularly progress to neurological degeneration . . . and to death is another matter and cannot be shrugged off.”
The Fore people were surprisingly nonchalant about their cannibalism. The bodies of naturally deceased relatives were dismembered outside in the garden, and all parts were taken except the gallbladder, which was considered too bitter. The anthropologist Shirley Lindenbaum writes that brains were extracted from cleaved heads and then “squeezed into a pulp and steamed in bamboo cylinders” before eating. Fore cannibalism was not a ritual; it was a meal. The body was viewed as a source of protein and an alternative to pork in a society for which meat was scarce. The pleasure of eating dead people (as well as frogs and insects) generally went to women and children, because the more prestigious pig products were preferentially awarded to the adult males. The brain of a dead man was eaten by his sister, daughter-in-law, or maternal aunts and uncles, while the brain of a dead woman was eaten by her sister-in-law or daughter-in-law. There was no spiritual significance to this pattern, but it did closely parallel the spread of kuru along gender and kinship lines until Fore cannibalism was eliminated in the 1970s.
There are many reasons not to eat brains, from ethical objections to eating meat in general, to the sheer difficulty of the butchery, to the danger of disease; but all activities come with some difficulties and dangers. One can’t help thinking that the real reason our culture doesn’t eat brains is more closely related to the awesomeness of holding a sheep’s brain in one’s hand: brains are sacred to us, and it takes an exercise of willpower to think of them as just meat. Eating someone else’s brain, even an animal’s, is too much like eating our own brain, and eating our own brain—as my daughter asserted—is like eating our mind, and perhaps our very soul.
Some of us arrive at this conclusion through introspection. As early as the sixth century BCE, the Pythagoreans apparently avoided eating brains and hearts because of their belief that these organs were associated with the soul and its transmigration. But can we find objective data to demonstrate a modern disinclination to eat brains? Consumption of offal of all sorts, at least in Europe and the United States, has dropped precipitously since the beginning of the 20th century, but it seems that brains in particular are out of favor. A recent search of a popular online recipe database uncovered 73 liver recipes, 28 stomach recipes, 9 tongue recipes, 4 kidney recipes (not including beans), and just 2 brain recipes. If we suppose, somewhat crudely, that the number of recipes reflects the prevalence of these ingredients in actual cooking, there appears to be a distinct bias against brains. Some of the bias may be related to “bioavailability”—a cow’s brain weighs roughly a pound, compared with two to three pounds for a tongue or ten pounds for a liver—but a difference in popularity plausibly explains much of the trend. A 1990 survey of food preferences among a sample of English consumers also supports this point. Its results ranked dislike for various forms of offal in ascending order from heart, kidney, tripe, tongue, and pancreas to brain, the most disliked of all. The study is notable partly because it was performed before the mad cow outbreak of the mid-1990s, so the surveyed preferences are not easily explained by health concerns related to brain eating. The participants’ tendency to “identify with” brains might best explain their revulsion at eating them, suggested sociologist Stephen Mennell in an interpretation of the results.
__________________________________
From The Biological Mind. Used with permission of Basic Books. Copyright © 2018 by Alan Jasanoff.