You’re an Animal
The fatal human impact of evolutionary mismatch.
Sea turtles can have a hard time in the modern world. When hatchlings are born on the beach, they instinctively head toward the brightest part of the horizon. They developed this instinct over millennia because the brightest part of the horizon had always been the ocean, where light reflected off the water. These days, this instinct often leads them astray. In developed areas, they head toward brightly lit beach houses or coastal roads instead.
This is a case of so-called evolutionary mismatch, when traits that evolved in one environment have negative consequences in another. The turtles’ instinct works well in a natural environment, without artificial light. In a man-made environment, it is often fatal.
“Things like that are just so striking—they cause the organism to do suicidal things—that they’re just begging for our attention,” says David Sloan Wilson, an evolutionary biologist at Binghamton University. What concerns Wilson and a growing number of other evolutionary biologists, neuroscientists, and psychologists is how evolutionary mismatch might be affecting humans.
Until about 12,000 years ago, our ancestors lived in small, pre-agricultural communities, where they hunted, fished, and foraged for food. Then, in the blink of an eye—in evolutionary terms—we went from hunting and gathering to agriculture, industry, and the information age. Today, for the first time in history, most of us live in urban centers, surrounded by man-made structures, completely immersed in the creations of our own culture. Our relationships are increasingly mediated by technology. Our work is often done seated, at a computer, using only our brains and our fingertips. Our diets are of our own design, free of natural constraints. It could hardly be a more different environment from the one our ancestors adapted to over the last 1.8 million years. Today’s world is rich in certain comforts—who doesn’t appreciate air conditioning or email?—but it may fail to meet some of our more animal needs that have been evolving for millennia.
Some of the ways humans are ill-adapted to modern life are obvious. For example, we evolved in an environment where sugar, an important source of energy, was scarce. Because it was good for our ancestors to get sugar when they could, they developed a taste for it. But now sugar is everywhere, and we have a hard time turning down that cheap, enormous soft drink, even though we know it’s not good for us. Sometimes, however, the mismatch between the environment we evolved in and the one we live in plays out in less clear-cut ways.
The causes of attention-deficit hyperactivity disorder and its cousin, attention-deficit disorder, are poorly understood. Most researchers believe there are both genetic and environmental factors. And diagnosing ADHD is more art than science. Like many mental conditions, it expresses itself as an amorphous collection of symptoms and behaviors rather than, for example, a score on a simple blood test. Treatment is difficult because medications don’t always work, and when they do they sometimes come with side effects or need to be taken for years.
In the summer of 2008, researchers from the University of Illinois, Urbana-Champaign, gathered a group of especially animated children—they all had been officially diagnosed with ADHD—and administered a simple treatment: a 20-minute walk in a park.
The Urbana-Champaign team, led by Frances Kuo, a psychologist who studies how trees, green space, and other natural features affect human health, had read reports from parents who said their children’s ADHD symptoms seemed to improve when they participated in after-school programs that took place outdoors. Kuo and her team wanted to find out if these reported improvements were consistent. They started by giving each child a standard concentration test that involved listening to a sequence of numbers and then repeating it back in reverse order. Then the researchers took the children for walks. Some children were taken to a downtown area, others to a residential neighborhood, and a third group to a park. Then they were given the concentration test again.
The results were remarkable. The children who had taken walks in the downtown or residential areas showed no improvement in concentration. But those children who had taken a walk in the park showed improvements that effectively canceled out their ADHD. A 20-minute walk in a natural environment was about as effective as common ADHD medications, the team concluded.
Kuo and her team think that the kind of attention we need to sustain a close focus on a single activity—homework, for example—can be restored by spending time in an environment, like a park, where we pay attention in a more diffuse, passive way. Balancing those two forms of attention, they posit, is critical to keeping both mental states vital. As they put it in their report, “Environments that are gently absorbing and thus draw predominantly on involuntary attention can be restorative; exposure to such environments can allow the mechanism underlying directed attention to rest and rejuvenate.”
Our ancestors, of course, spent lots of time in “gently absorbing” environments. They lived primarily outdoors, with lots of downtime. If our brains evolved to function well in that environment, then they may, in this way at least, be poorly adapted to modern life, with its constant demands on our attention and limited opportunities to fully relax. The ADHD study suggests that reconnecting with that ancestral life is more important for our mental health in the modern world than we think.
Indeed, our lack of contact with the natural world may be one of the most widespread forms of mismatch, with effects that go beyond ADHD. Evidence has been accumulating for years. In a 1984 study in Pennsylvania, gall bladder surgery patients were shown to recover more quickly when they could recuperate with a view of the outdoors. In a 1999 study, students who received lessons in rooms with more natural light scored up to 25 percent higher on standardized tests. (Ironically, for many years it was thought that limiting windows in classrooms would help prevent students from getting distracted.) A University of Kansas study from 2012 found that a four-day backpacking trip boosted scores on a commonly used creativity test by 50 percent.
In his book Last Child in the Woods, journalist and author Richard Louv coined the phrase “nature deficit disorder” to describe the problem. Louv has amassed mountains of research from all over the world documenting our increasing alienation from nature. In Norway, children spend less time in self-initiated play outdoors than previous generations did. In Australia, only 13 percent of children report playing outdoors more than indoors, compared with 73 percent of the prior generation.
The effects these massive changes are having on us are only beginning to be studied. Louv believes we don’t fully recognize the implications of our alienation from the natural world. “Nor,” he says, “do we fully understand the benefits and the costs of being immersed in technology all day. More research is coming in on that, too.”
Some of that research is coming from Peter Whybrow, a neuroscientist at UCLA. Whybrow argues that mechanisms in our brains, originally designed to deal with scarcity, can actually work against us in an environment abundant with food, information, and opportunity. “We have long been creating trouble for ourselves,” he says, “by creating mismatch between the culture which has been established since the industrial revolution and the evolution of our biology.”
Regions of the frontal cortex—the part of the human brain that keeps your impulses in check—help you resist the short-term urge to act rashly. Humans have a uniquely powerful ability to weigh long-term considerations when deliberating on a decision. But when the consequences to our urges seem low and the reward seems high, we tend to give in to our impulses. Whybrow contends that this is one of the effects of our dependence upon new technologies like social media. “Information is affluent at the moment because, increasingly, we can have these little gadgets which create an opportunity for us to do interesting things, and be amused by the novelty of people sending us messages and apps and so on.” The result, he says, is that we actually get worse at regulating our behavior for our own long-term benefit. “We’ve built an environment that ties into this short-term primitive thinking, and it tends to exclude the rational, long-term vision of the human capacity to plan, which is, of course, our great attribute.”
The way technology exploits our preference for short-term payoff doesn’t merely increase our propensity to scroll through pictures of our exes on Facebook or check our retweets every five minutes. For Whybrow, it’s at work in large-scale crises as well. “We’ve created all sorts of opportunities for people to benefit in the short term and to forget the long term,” he says. “That’s what happened in the 2008 financial crisis. That’s also what happens when you get sent a credit card, which you don’t have to pay off for six months.”
We already know, of course, that being in nature can be restorative and that humans are sometimes shortsighted. These are hardly radical thoughts in and of themselves. But viewing these issues through the lens of our evolution gives us a whole new way of thinking about our needs. It causes us to think about how other aspects of life in the urban, networked, information-affluent environment are holding us back.
Applying evolutionary theory to human culture and behavior was off limits until the late 20th century. That was for good reason because, as evolutionary biologist David Sloan Wilson explains, “when it was on limits it led to things like social Darwinism and eugenics.” We have a track record of abusing ideas like this.
There are also those who are wary of mismatch, like Marlene Zuk, an evolutionary biologist at the University of Minnesota, who worry that the theory idealizes an earlier way of life. “It obviously makes sense that we didn’t evolve to sit on the couch and live on Diet Coke and Cheetos,” she says, “but it doesn’t necessarily follow from that [theory] that what we should be doing is living exactly as our ancestors lived.” The barefoot running craze is one example. Sure, our hunter-gatherer ancestors may have run without shoes, but that doesn’t mean that’s the best way for us to run, skeptics like Zuk are apt to point out. The kind of running that is healthy for a person depends not only on their genes, but also on their development. If you grew up wearing shoes, then your foot and your gait are accustomed to them. You can’t make an easy switch to running barefoot and expect to thrive.
Connecting a health problem to mismatch is also made more complicated by the fact that we often don’t fully understand the underlying mechanisms at work. “You have to be cautious with this,” says Louv, “because a lot of this evidence that has emerged quite recently is correlative rather than causative evidence.” In the case of ADHD, it is hypothesized that involuntary attention restores directed attention, which explains the cognitive benefits of being outdoors. It’s plausible, but it’s not proven.
But incorporating an evolutionary perspective—remembering that we are, in fact, animals that adapted to live in a different environment—can be the key to answering important questions. “If you don’t have the idea of mismatch in mind,” says Wilson, “then you’re going to muddle all sorts of things.” Atherosclerosis, the leading cause of cardiovascular disease in modern humans, is one example. In trying to identify what causes the fatty buildup characteristic of the condition, researchers are awash in data. But it turns out atherosclerosis is all but nonexistent in people living on a paleo diet. The evolutionary perspective—understanding how our current diet differs from what we adapted to as hunter-gatherers—gives medical researchers a framework for understanding which foods contribute to the disease and which do not. It would scarcely occur to you to even make the distinction between an agricultural diet and a pre-agricultural diet if you weren’t thinking about what our ancestors evolved to eat.
Another example of how mismatch theory can shed light on the current plight of human beings is morning sickness. When we imagine ourselves to be separate from our animal ancestors and their evolved needs, we think of the nausea of morning sickness as a problem to be solved. And so we develop drugs to alleviate its symptoms so that expecting mothers can eat what they want. But from an evolutionary perspective, morning sickness is an adaptive response that causes the mother to eat on behalf of her fetus rather than on behalf of herself. It helps the mother filter out things that are toxic to a fetus during the first trimester when it is at its most vulnerable.
As the 21st century progresses, and urbanization continues, and ever more immersive and distracting technologies gain control over our lives, it will become more difficult to ignore the incredible explanatory power of mismatch theory.
Whybrow imagines a cultural shift in which we learn to reconnect with our more animal roots and, to borrow Henry David Thoreau’s phrase from Walden, “to live deliberately.”
It may be a while before the idea of evolutionary mismatch is part of the public conversation, and even longer before it’s used to guide public health. The science is still young; our understanding of our own health—both physical and psychological—is still incomplete. And, even as we gain more understanding, our environment will continue to shift. As it does, we may want to make a point of regularly asking ourselves, in what ways are we like the sea turtles heading toward the glow of the coastal road, and how can we point ourselves back to the brightest part of the ocean’s horizon?
Photos by Brian Paumier