As far back as written stories go, we see both awe and fear, delight and disgust in the wild world. We often see the urge to conquer and dominate, but we also see, like a transcendent counterpoint, an urge to be swallowed up by the wild, to merge with it.
The Epic of Gilgamesh, one of our oldest written narratives, chronicles the great city builder Gilgamesh’s friendship with a “wild man,” Enkidu, who, when we meet him, is living contentedly without human contact, “coated in hair like the god of the animals,” grazing at the water hole with a herd of gazelles, “his heart delighting with the beasts in the water.” Gilgamesh forcibly humanizes Enkidu by sending Shamhat, a priestess, to seduce him. After many days of lovemaking, Enkidu is spurned by the wild animals. He gets a haircut, eats bread and ale, and returns with Shamhat to the city.
After an epic street fight, he becomes Gilgamesh’s best friend and together they hit the road, slay the forest god Humbaba, and take his tusks as a prize. Afterward, they fell the biggest and best tree to take back to Uruk to make a fancy door. It is hard not to read this deforestation parable as a metaphor for the rise of humans as an ecological force—the triumphant expansion of urban and agricultural lands into formerly wild places. But even as Gilgamesh and Enkidu enter the forest, kitted out with axes and daggers, they pause for a moment to admire it, “gazing at the lofty cedars” and noting that “its shade was sweet and full of delight.”
Something about this moment reminds me of historical photos of 19th-century lumberjacks standing next to enormous trees, many of them halfway sawn through. These images, which I saw often growing up in the Pacific Northwest, speak both to the loggers’ pride in accomplishing such enormous tasks and to a genuine awe at the sheer scale of the trees. Loggers tend to have a real love for trees, much as many hunters truly respect their quarry.
Appreciation and awe are woven into humans’ most ancient feelings about wilderness. In the Bible, wilderness is a barren, life-threatening place, but it is also a place of revelation. In Genesis, Eden is a garden, but one that needs no tending, a kind of primeval wilderness with the sting of death removed. And in Christian accounts, the expulsion from Eden and the adoption of an agricultural way of life were a tragedy.
Consistent throughout is the same duality, the belief that humans—and the agricultural landscapes and cities they create—exist in a separate category from all other species and the landscapes those species create. In contemporary Western thought, this dualism often takes on a charged moral color. More than just different from the wild, humans are seen as capable of destroying the wild with their touch. In this framing, any human influence on nature is bad. All changes that flow from human action are degradation, no matter what they are.
This generalization is understandable. Humans have made so many changes that we now regret. We look at our oceans swirling with plastic, our shrinking old forests, the dying eyes of one of the last ten vaquita porpoises caught in a net, and we understandably feel strongly that we ruin everything. This view was memorably captured in March 2020 by a viral tweet that quickly became meme shorthand for taking a grim, misanthropic pleasure in the Covid-19 pandemic.
“Wow… Earth is recovering
—Air pollution is slowing down
—Water pollution is clearing up
—Natural wildlife returning home
Coronavirus is Earth’s vaccine
We’re the virus.”
This idea that humans are like a disease is not new. I remember attending a scholarly talk at a 2009 meeting of the Society for Conservation Biology in Flagstaff, Arizona, in which a professor compared cities to cancers, both in their physical growth patterns and in their moral value. The flip side of the idea that humans are bad by definition is that nature is good by definition. Thus we see the word “natural” splashed across our breakfast cereal, our shampoo, our dish soap. Expectant mothers opt for “natural” births, eschewing hospitals and anesthetics; shoppers flock to “natural” food stores, as if mainstream groceries sell toy broccoli and plastic steaks.
“Wild” and “natural,” “wilderness” and “nature”—these terms are often used interchangeably when talking about things, animals, and places. There have been whole books written about the meaning of these words. I wrote one of them myself.
In United States law, federally designated wilderness is famously defined as “an area where the earth and its community of life are untrammeled by man, where man himself is a visitor who does not remain.” One environmental ethics text defines natural like this: “Something is natural to the extent that it is independent of human design, control, and impacts.” Definitions like this start with a basic assumption that human beings are not part of nature. They assume, in fact, that humans are the opposite of nature, that our influence makes a thing less wild or natural. And I simply reject this premise.
After many years, I have come to see the concepts of wilderness and nature as not just unscientific but damaging. Firstly, all organisms alive today are influenced by humans. Secondly, we humans are deeply influenced by the plants and animals we evolved with; we are part of “nature,” too. Thirdly, “wilderness” rhetoric has long been used to justify denying land rights to Indigenous people and to erase their long histories. And finally, thinking of nature and humans as incompatible makes it impossible to discover or invent ways of working with and within nature for the common good. I still have hope for the word “wild,” but as a term for autonomous, not un-human.
Let’s start with that first point. All Earth’s species have been influenced by humanity’s changes to the environment. Even animals that have never seen a human are changed by our species, genome deep. In a sense we are only beginning to understand, “wild” animals today, and the ecosystems they live in, are all partly human creations.
All species that regularly interact act as selection pressures for one another, shaping each other’s evolution. Natural selection favors organisms that thrive in their environment, and the environment is as much the living species in a place as it is nonliving factors like climate. So like all animals on Earth, our species has been affecting other species for its entire run. For much of our history, though, our line didn’t have a particularly outsized influence. We may have been a minor factor in the evolution of some predators by being tricky prey, and we certainly molded the evolution of our direct parasites, like lice. Pediculus humanus can thrive on no other animal. But in general, we were just one animal among many, all creating evolutionary pressures for each other, in a complex web of pushes and pulls.
Our career as “super influencers” began in earnest relatively recently, evolutionarily speaking. There are hints that our ancestors in the Pliocene were so good at stealing food from large carnivores that we helped drive these great beasts extinct. One study says that the timing of extinctions of “species of bears, saber‐toothed cats, and giant species of martens, otters and civets” in East Africa correlates better with increases in our ancestors’ brainpower than with climate changes. (Though their model suggested that a decline in forest cover could also be an important causal factor.) In a time when competition for prey was fierce, being driven away from a kill by stick-wielding hominins might have been the difference between life and death.
By the late Pleistocene, though, Homo sapiens had arrived, and we weren’t just stealing food from better hunters; we had become the best hunters on Earth. A wave of extinctions of animals over about 100 pounds (known to researchers as “megafauna”) followed humans as they migrated around the globe. There are a huge number of these extinctions, and ultimately, each one has a slightly different causal story. Climate changes are likely to have contributed to many of them. But there are few scientists now who argue that human hunting—both of the extinct animals themselves and the prey they depended on—wasn’t a factor in at least some of these extinctions.
North America lost more than 70 percent of its megafauna in the Pleistocene, including the massive dire wolf, the American lion, giant ground sloths, a species of mountain goat, saber-toothed cats, the giant short-faced bear, mammoths, and mastodons. It is impossible to fully imagine the continent as it was when inhabited by these giants. Over my fireplace, I have a painting by my brother Alex of the skeleton of a short-faced bear, with the skeleton of a human being beside it, for scale. When standing on its hind legs, this bear was 12 feet tall. It could run up to 40 miles per hour. It ate meat. Probably, it ate us. When I gaze up at it, I try to picture what kind of culture, what kind of psychology, what kind of person could cope with living in a world where such raw fierceness was a real existential threat. I simply cannot.
Between 15,000 and 10,000 years ago, all these animals disappeared. Since they, like all creatures, influenced the species in their ecosystems, their landscapes transformed in their absence. Paleoecologists can look back in time by extracting long muddy cores from lake beds. The cores are like thousands of layered time capsules, each capturing one period of time through the sediment that gradually settled on the bottom of the lakes.
Scientists look in each layer for charcoal from wildfires, pollen grains, and spores of the fungus Sporormiella, which lives in herbivore dung. These studies have shown, for example, that when mastodons, mammoths, giant beavers, giant sloths, and a giant moose relative (Cervalces) declined in numbers across what is now the area from Indiana to New York—as evidenced by a crash in Sporormiella spores—broadleaf trees exploded in numbers, overtaking spruce as the most frequent source of pollen. As anyone who has tried to garden in a place with deer knows, herbivory can be a powerful force, keeping palatable species down while spiky or toxic plants flourish. The decline of the megafauna had an effect not unlike erecting a deer fence around the entire continent. Less grazing meant more plants, which also meant more fuel for fire, and researchers saw more charcoal in the sediment layers laid down after the extinctions as well.
In Australia, humans arrived much earlier, some 65,000 years ago. There was a pulse of extinction there following the arrival of humans, and, as in North America, a transformation of many landscapes. Areas of the Outback that are now nearly treeless were once “a mosaic of woodland, shrubland, and grassland, with a high proportion of plants with palatable leaves and fleshy fruits,” according to one analysis. As giant relatives of kangaroos and wombats and huge flightless birds disappeared, plant matter built up and then burned, starting a cycle of wildfires in some places that favored the tough, fire-adapted species now common across the arid parts of Australia.
The first people in New Zealand arrived less than 1,000 years ago—so recently that their lost megafauna is just barely gone. At Yale’s Peabody Museum of Natural History, I saw a full skeleton of a moa—one of a family of gargantuan birds up to ten feet tall that, like so many others, were likely ushered into extinction by human hunting. I looked down at the card and the hairs on the back of my neck stood up as I read the word “subfossil.” I realized I wasn’t looking at hard minerals that had taken the place of bones, as in your typical dinosaur fossil. I was looking at this bird’s actual bones—as greasy and real as the bones left over after turkey dinner. That’s how recently we lost the nine species of moa.
But even though the bodies are barely cold, the land has already responded to the change. Without the big herbivorous Moa keeping forest open enough for sunlight to reach the ground, at least in some places, the forest closed in and shade-adapted species took over.
All across the world, human hunting likely contributed to megafaunal extinctions, which in turn led to significant changes in ecosystems. These changes presented new challenges and opportunities for the surviving animals, influencing their behavior and, over the generations, shaping their evolution. There were likely some secondary extinctions—certainly of parasite species, perhaps also of dung beetle species that relied on the not insignificant dung heaps left by some of these massive creatures. Scavengers, similarly, were in trouble, with fewer giant carcasses to feast on. In North America, we lost seven entire genera of vultures, and the California condor survived only by taking advantage of beached whales and other marine mammals.
Other species had to change their behavior to survive. The common vampire bat was likely hit hard by megafauna extinctions in its home range in the Americas. This bat lands on sleeping mammals, neatly slices open their skin, and laps up their blood. With fewer large animals to sip from, life was tough, but the bat adapted by switching over to humans and their livestock. Today, pigs, horses, and cattle are its preferred prey.
Many large predators, such as dire wolves and saber-toothed cats, went extinct during this time, but others evolved to thrive in the new normal. In South America, jaguars shrank to scale with the average size of their new prey. Instead of gorging on wild horses and camels and ground sloths, they learned to make do with capybaras and giant anteaters. Coyotes shrank too, and used their legendary cleverness to survive in the new, smaller world they found themselves living in.
There’s not a lot of research on specific animal adaptations to the extinction wave. As one scientific paper put it, “To date, the influence of the terminal Pleistocene extinction on the surviving small and medium‐sized mammals has been largely ignored.” That paper looked at a serious grave site: Hall’s Cave in the Texas Hill Country, where bones have been building up for 22,000 years. The analysis found that the extinction pulse knocked out 80 percent of the large‐bodied herbivores and 20 percent of the apex predators in the area. Afterward, animals were, on average, smaller, and there were fewer grazers.
The paper’s authors noted another interesting finding: Today’s apex predators—jaguars, mountain lions, wolves, grizzly bears—used to be the little guys making a living in the shadows of even more imposing specimens: the saber‐tooth and scimitar‐toothed cats, dire wolves, and the short‐faced bear. These extinct “hyper carnivores” usually specialized in just one or a few prey species. In their wake, we are left with “apex” carnivores that grew up being scrappy opportunists. They can scavenge, eat plants, switch prey, and make other behavioral choices. That flexibility is no doubt serving them well now as they struggle to hang on in the 21st century.
In Africa and Eurasia, the effects of human hunting are more contentious, and some megafauna likely had time to evolve responses to the increasingly clever apes and so avoid extinction—but that doesn’t mean our influence was any less profound. Indeed, if such animals as African elephants, rhinos, hippos, bison, elk, and water buffalo survived our rise as efficient hunters by evolving new defenses—such as better hearing, sharper horns, or defensive behaviors—that fact only supports the argument that human influence is already woven into the genomes (and lives) of these species.
Humans didn’t only shape other species by contributing to extinctions. In other cases, we evolved mutually beneficial relationships with nonhumans that changed both parties. Robin Wall Kimmerer is an ecologist and enrolled member of the Citizen Potawatomi Nation. She’s also the director of the Center for Native Peoples and the Environment at the State University of New York College of Environmental Science and Forestry in Syracuse.
In her book Braiding Sweetgrass, she writes about how harvesting fragrant sweetgrass to make baskets encourages its growth—and how unharvested patches of the plant wither away and die. “With a long, long history of cultural use, sweetgrass has apparently become dependent on humans to create the ‘disturbance’ that stimulates its compensatory growth,” she writes. M. Kat Anderson, an ethnobotanist at the University of California, Davis, writes that “It is highly likely that over centuries or perhaps millennia of indigenous management, certain plant communities came to require human tending and use for their continued fertility and renewal.” In other cases, humans directed the evolution of food species by selective harvesting and by replanting seeds of individuals with desired characteristics.
Indigenous North Americans moved plants around intentionally, bringing fan palms to the Sonoran Desert for shade and fruit, for example. The Kumeyaay, who live in and around San Diego, cultivated manzanita, ceanothus, and wild roses and actively extended the range of plums, agave, yucca, sage, and mesquite. They brought prickly pear cactus from the desert to the seashore and planted orchards of oaks and fields of grain.
“Careful study of Spanish accounts indicates that little or no natural landscape existed,” anthropologist Florence Shipek writes. “Instead they describe grass, oak-park grasslands, limited chaparral, and areas with plants so even and regular they looked planted, all the product of human management to provide food.” The Kumeyaay’s neighbors and trading partners, the Chumash, may have brought gray foxes to the Channel Islands off the coast of California—which then evolved into an entirely new species, the miniature island fox.
Many Indigenous peoples use fire to manage both the land and its animals. Fire can keep down dry fuels, reducing the chance of a destructive, out-of-control fire later in the season. Fires also stimulate new plant growth, which is the most nourishing to many herbivores. So the practice both feeds wild animals and attracts them to be hunted. Many American prairies, meadows, and grasslands that colonizers assumed were “natural” were in fact intentionally maintained with fire.
Indigenous land management would have influenced the animals that lived in and around these ecosystems. More directly, humans altered the evolutionary trajectory of some wild animals so thoroughly that they ceased to be wild. One theory about how dogs came to be domesticated focuses on wolves and humans as ancient hunting partners. Indigenous peoples from North America, Siberia, and Eastern Asia all tell stories of cooperative, friendly relations with wolves.
Indeed, it is possible that the reason there aren’t more examples of mutualisms between humans and wild animals is that, in such cases, we have become so thoroughly intertwined that we call our “partners” something else: domesticates. Dogs are, in this obverse framing, just the subset of wolves who have mutualistic relations with humans.
Today, even the wildest of wild animals are not only influenced by all those millennia of human-caused changes, they are continuing to adapt to our ever-changing ways. Wild animals make their own choices about what to do every day—in that sense they are free. But their bodies and minds evolved in a world profoundly influenced by humans, and all too often, their daily choices involve navigating a world that has been rearranged for human needs and desires.
Excerpted from Wild Souls: Freedom and Flourishing in the Non-Human World. Used with the permission of the publisher, Bloomsbury. Copyright © 2021 by Emma Marris.