Are children today really suffering nature deficit disorder (TM)?

Children working in a London hosiery mill around the turn of the century. Did they have “Nature-Deficit Disorder (TM)”? Source.

Maybe you’ve heard of the scourge plaguing modern-day children, the one known as Nature Deficit Disorder (TM). You won’t find it in any of the standard diagnostic manuals used to identify true disorders, but the “disorder” arises, so the story goes, from keeping children inside for fear of their safety and “stranger danger,” from the loss of natural surroundings in cities and neighborhoods, and from indoor attractions so enticing that children never venture outdoors.

This “disorder” is supposed to be a product of modern times, the combined effect of controlling, fearful parents and the irresistible screen-based attractions indoors. As a result of this “disorder,” children are allegedly susceptible to any number of ills, including less respect for and understanding of nature, depression, shorter life spans, and obesity.

Concerns like these, it seems, have arisen with the advent of each new technological advance. One wonders if the invention of the wheel raised alarms that children might move through their natural surroundings too quickly to take them in. At any rate, while the person who invented this disorder, Richard Louv, has actually trademarked the term, it doesn’t seem to have made a big splash in the scientific literature. Given that studies about “nature deficit disorder” are lacking–i.e., completely absent–one thing we can do is take a look back at how children lived before the technological age to see if their indoor-outdoor lives and exposure to the natural world were substantially different.

Go far enough back in human history, and of course, we all spent a lot of time outside. But how did we spend our time with the rise of civilization? Children in agrarian societies then and now worked from dawn to dusk as part of the family to put food on the table. In such a position, they certainly had no lack of exposure to nature, although how much they appreciated that endless grind could be in question. That is, of course, if they didn’t die in infancy or early childhood, as a large percentage of them did in spite of all that fresh air and time outside.

But what happened with children and how they spent their time with the rise of towns and cities? In early times, many of those cities were walled compounds, not necessarily hives of scum and villainy, but generally stacks upon stacks of living quarters existing solely for functionality. Nature? Outside the walls, where danger–including the most extreme kind of “stranger danger”–lurked. Cities that lacked walls, as ancient Rome did for a long period, still focused far more on efficient crowding and function than on nature, with only the wealthy having gardens, the ancient equivalent of today’s back yards. In general, there were people, there were buildings, and there were more people. Not wildly different from, say, Manhattan today–except for that whole natural jewel known as Central Park.

This piling on of people, brick, mortar, more people, and wood continued for children who didn’t live in agrarian societies. With the Industrial Revolution, what may have really been a nature deficit disorder for a child living in, say, London became a genuine threat to health. While they certainly didn’t have television to keep them indoors, they also didn’t have child labor laws. The result was that children who once might have been at work at age 4 in a field were now at work at age 3 or 4 in a factory, putting in 12 or so hours a day before stepping out into the coal-smoked, animal-dung-scented air of the city.

Child labor wasn’t something confined to Industrial Revolution Britain, and it continues today, both for agriculture and industry. I do wonder if the children harvesting oranges in Brazil feel any closer to nature than the children weaving carpets in Egypt. Likely, there are deficits more profound for them to worry about.

The trigger for this overview of whether or not things have really changed over recorded history in terms of children’s exposure to the natural world is this series of articles in the New York Times (NYT). In case you hit the paywall, it is the NYT’s “Room for Debate” series and includes four articles addressing whether or not nature shows and films connect people to the natural world or “contribute to ‘nature deficit disorder’” by keeping people glued to screens instead of being outside.

Louv, the coiner of “Nature Deficit Disorder (TM),” is one of the four contributors to the debate. He argues that viewing nature documentaries can inspire us to go outside. He also thinks many of us grew up watching “Lassie” instead of the “Gilligan’s Island” my generation watched, but perhaps there’s not a huge difference between Timmy in the well and Gilligan in the lagoon when it comes to inspiring outdoor time. At any rate, Louv does argue in favor of viewing nature shows, although from a very first-world perspective (like the Romans and their gardens, we don’t all have back yards, for example).

Perhaps the least defensible perspective is the one Ming (Frances) Kuo, an associate professor of natural resources and environmental sciences, has to offer. She compares nature documentaries to “junk food” and offers the obvious: They’re no substitute for the real thing. For some reason, she implies that someone has argued that when you have access to TV, you don’t need access to nature, saying, “Scientists have been discovering that even in societies where just about everyone has access to a TV, Internet, or both, having access to nature matters.” I honestly don’t think anyone’s ever argued against that.

Does “nature deficit disorder” exist and is indoor screen time with nature documentaries to blame? In addition to the historical observations I’ve made above suggesting that children from previous eras haven’t necessarily been wandering the glades and meadows like wayward pixies, all I have to offer is a bit of anecdata, and I’m curious about the experiences of others. Historical comparisons suggest that city-dwelling children are no more deficient nature-wise today than city-dwelling children of yesteryear. But do nature documentaries help… or hinder?

When I was young and watching too much “Sesame Street,” “Gilligan’s Island,” and “Star Trek,” the only nature show available to me was “Wild Kingdom” (Mutual of Omaha’s, natch). Other than that, we had nothing unless an occasional NOVA episode came on public television.

I was interested in science and nature, but acquiring knowledge outside of what I read in a book was difficult. As a resident of the great metropolis of Waco, Tex., yes, I had a natural world to explore, but let’s face it: The primates there weren’t that interesting, and bluebonnets get you only so far. I had no access to real-life live-motion visuals, auditory inputs, or information delivered in any form except what I could read in a book. Talk about sensory limitations.

These days, my children have a nature documentary library that extends to dozens and dozens of choices. And they have watched every single one, some of them repeatedly. That’s not to say that they don’t also have dozens of well-thumbed field guides and encyclopedias covering fossils, dinosaurs, plants, bugs, sharks, rocks–the usual obsessions of the young who are interested in nature. Our “movie nights” often kick off with a nature documentary, and our pick will frequently be one narrated by David Attenborough. My children want to be David Attenborough–so do I, for that matter–and I can’t recall ever really having that feeling about Marlin Perkins or Jim Fowler.

And the upshot of that access to an expanse of nature documentaries I never had is that their knowledge of nature is practically encyclopedic. I’m the biologist in the family–or at least the one who has the biology degree–but my children often know more than I do about a specific plant or animal or ecosystem or area of the world, all thanks to these documentaries they watch. And when we’re outside, they extrapolate what they’ve learned, generalizing it to all kinds of local natural situations.

Do children today just need to be moving around more, somewhere, somehow? Oh, yes. But watching nature shows hasn’t exacerbated some kind of “nature deficit” my children might have, Minecraft-obsessed as they are. And these documentaries haven’t replaced “real” nature with televised nature. Instead, the shows have expanded on and given context to the nature my children encounter, wherever that is–city, country, farm, sky, ocean, parking lot, grocery store, or even inside their own home, which is currently the scene of a sci-fi-like moth infestation that has triggered much excitement. I’d hazard that far from causing a deficit, nature shows have given my children a nature literacy that was unknown in previous generations.



What is your take on nature deficits and nature documentaries?


By Emily Willingham, DXS managing editor 

UneXXpected Science: Does ADHD have benefits in certain environments?

Mmmm. Novelty seeking. (Source.)

Anyone who has ADHD—attention-deficit/hyperactivity disorder—can tell you the stories. Stories of getting into constant trouble, hearing “sit down, sit still, be quiet” repeatedly, endlessly, feeling the urge to move, touch, jump, talk, roll, do anything but sit quietly at a desk, working on math. And, as anyone who has ADHD can also tell you, these traits often don’t exactly help you get ahead in modern society. School requires stillness, attention, focus on pencil and paper work. Most jobs require focus, attention, sometimes an ability to tolerate the sheer boredom of four walls of a grey cubicle for eight hours each day. Most people would struggle with that, but with attention deficit, it’s more than a struggle.

A cubicle environment is probably not the best place for someone with ADHD, although ADHD traits may even be beneficial in less boxy workplaces. And school can be a long, troubling, negative process, as well. People used to blame the parents of children with this disorder, laying the cause of ADHD at the feet of poor parenting–and some are still trying to lay that blame. Science has something else to say about it, having demonstrated that genes are actually the primary contributors to ADHD, specifically genes that encode proteins whose job is to “receive” messages from a brain chemical called dopamine.


Dopamine signaling underlies all kinds of behaviors, but primarily it is known for its involvement in reward pathways, novelty seeking, and addiction. Specific forms, or alleles, of dopamine receptor genes have been strongly associated with ADHD, and this disorder can be viewed in many cases as a constant search for reward and novelty, a search that can translate as inattentiveness or hyperactivity.


Given that this dopamine-based manifestation is rooted in genes, the question arises of why it has persisted in humans throughout our evolution. If we look around at modern society, it’s easy to see that ADHD behaviors generally are not conducive to being one of the “fittest” in many situations that take up most of our time. Yet, there has been enough associated advantage for these gene forms to persist and allow their carriers to reproduce and pass them along to offspring.


And that’s where we need to think in nature’s evolutionary terms. Modern society is just that—modern. This way of life has only been around for, at most, a few thousand years, which can be a blink of an eye for processes of natural selection. Dial back time about 10,000 years or 20,000 years, and you’ll be hard pressed to find any humans living in an environment anything remotely like a cubicle.


Natural selection results from the interaction of genes and environment, and the “selection” Nature’s making is for an individual’s genetic makeup to have some representation in future generations. To look at this process through Nature’s lens, take the gene forms associated with ADHD and place them in a different environment and ask the question: Do they help or hurt or make no difference at all?  
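
To make that logic concrete, here’s a toy numerical sketch–my own illustration, with invented fitness values, not anything from actual ADHD data. It’s a simple Wright-Fisher-style simulation in which the very same allele is slightly helpful in one environment and slightly harmful in another. Flip the environment, and the allele’s fate flips with it.

```python
import numpy as np

rng = np.random.default_rng(1)

def allele_fate(w_carrier, w_noncarrier, p0=0.2, pop=1000, generations=400):
    """Track the frequency of a hypothetical 'novelty-seeking' allele
    under simple haploid selection plus random drift."""
    p = p0
    for _ in range(generations):
        # Fitness-weighted expected frequency in the next generation
        expected = p * w_carrier / (p * w_carrier + (1 - p) * w_noncarrier)
        # Binomial sampling of offspring introduces genetic drift
        p = rng.binomial(pop, expected) / pop
        if p in (0.0, 1.0):  # allele lost or fixed
            break
    return p

# Invented fitnesses: a 2% edge for carriers in a "nomadic" environment,
# a 2% cost in a "settled" one. Same gene, different outcome.
print("nomadic environment, final allele frequency:",
      allele_fate(w_carrier=1.02, w_noncarrier=1.00))
print("settled environment, final allele frequency:",
      allele_fate(w_carrier=0.98, w_noncarrier=1.00))
```

With those numbers, the allele typically drifts toward fixation in the first run and toward extinction in the second–the whole point being that “helpful” or “harmful” isn’t a property of the gene alone but of the gene-environment pairing.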

This question is exactly what researchers addressed when they looked at the effects of an ADHD-related gene form on a group of nomadic people, the Ariaal, in Kenya. Some members of this population had, in only the last few decades, made a transition to a sedentary, city-type lifestyle. Others continued to live the fast-moving, nomadic existence of their herding ancestors.


Researchers looked at a version of a dopamine receptor called DRD4-7R, which also has been implicated in autism symptoms in people with ADHD. They found that settled Ariaal who carried this form of the gene didn’t fare as well in health as their settled cousins without it. But the Ariaal who continued their nomadic existence and carried the 7R form of the gene fared better than those nomadic tribesmen without it. To assess health, the researchers looked at body mass index and other factors.
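
For readers who like to see how a finding like that gets tested statistically: the signature of “same gene, different effect by environment” is an interaction term in a regression model. Below is a minimal sketch with made-up numbers–this is not the researchers’ actual analysis or their data, just an illustration of the general approach.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 200

# Synthetic stand-in data: genotype 1 = carries the 7R-like allele;
# nomadic 1 = nomadic lifestyle, 0 = settled. All numbers invented.
df = pd.DataFrame({
    "genotype": rng.integers(0, 2, n),
    "nomadic": rng.integers(0, 2, n),
})

# Build in a crossover effect: the allele nudges body mass index up
# among nomads and down among the settled.
df["bmi"] = (
    20.0
    + 1.5 * df["genotype"] * df["nomadic"]        # helps nomads
    - 1.5 * df["genotype"] * (1 - df["nomadic"])  # hurts settled
    + rng.normal(0, 1.5, n)                       # individual noise
)

# The genotype:nomadic interaction coefficient captures the
# environment-dependent effect of the gene.
model = smf.ols("bmi ~ genotype * nomadic", data=df).fit()
print(model.summary().tables[1])
```

If the interaction coefficient comes out large and significant–as it does here by construction–the effect of the genotype depends on the environment, which is exactly the pattern the Ariaal study reported.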

The results suggest that there might be some benefit to ADHD in the backdrop of a nomadic culture, although a more recent analysis of several studies together suggests a different form of this receptor may have an ADHD association (this kind of study, called a meta-analysis, doesn’t provide new data but synthesizes existing data).

Regardless of which gene forms are involved, you can imagine that in a nomadic culture, it might be useful to be always looking around, seeking novelty, thriving on the rewards of changing behaviors, defending food, and being always on the move. Someone with ADHD likely would be a far better fit for this kind of lifestyle than would the best desk jockey in the world. This interesting study demonstrates that when it comes to some of the neurological developmental manifestations we call “disorders,” how negative or positive they are may be a matter of environment.


By Emily Willingham, DXS managing editor 

I Am Mental Illness: Anorexia–Biting Back

Battling the uninformed, insurance companies, and your own compulsions

[Ed. note: This post is the first in our series, “I Am Mental Illness,” bringing you personal experiences of living with a mental illness. It’s likely that no single one of us lives a life untouched by mental illness, our own or that of someone we know. Yet in spite of their high prevalence, these disorders remain stigmatized and undersupported. To learn more about mental illness, you can start with the National Alliance on Mental Illness website. To learn more about anorexia and other eating disorders, you can start with this guidebook from the National Institute of Mental Health. Double X Science has previously featured a post by Harriet Brown describing the effects of family-based treatment for anorexia.]


25 myths about the flu vaccine debunked

Setting the record straight on the flu vaccine

by Tara Haelle


Parenting paranoia comes in different forms

It could be Andrew Wakefield or a brain-hijacking microbe.

by Meredith Swett Walker

I’m a scientist, but I’ve learned that becoming a parent lets paranoia trump the powers of rational analysis I’ve so carefully nurtured and developed. For some parents, media-whipped fears about vaccines take front and center in the anxiety lineup. For me, a brain-infecting microbe that makes mice hang around cats is at the top of my parenting paranoia list.

Parenting requires making many, many choices. Some seem inconsequential, like whether your child will wear overalls or sweatpants, pigtails or a pixie cut. But other choices have to do with health issues such as circumcision, immunization, and breast milk vs. formula – just a few in an endless list. For geeks like me, the first impulse is to research each issue, make a choice, and prepare an argument for anyone who questions the decision (and believe me, someone will). My response usually goes something like this: “Well, recent studies have shown that yada yada yada…” Then I pat myself on the back for being so informed and making such a well-reasoned decision.

My process ran into trouble, though, when my relationship with a university and its online library access ended. What happens when you can’t get your hands on peer-reviewed scientific journal articles? One consolation should be that we live in the “Information Age.” Surely, Google, a fast internet connection, and an overwhelming flood of information should lead to what we need to make well-reasoned, science-based parenting choices. Surely.

Maybe not. A friend recently shared with me an article from the open-access (i.e., free) online journal PLoS ONE: “Why Most Biomedical Findings Echoed by Newspapers Turn Out to be False: The Case of Attention Deficit Hyperactivity Disorder.” The gist is that the news media preferentially cover initial findings described in the most prominent scientific journals. The key word there is initial. No initial result is going to be the final word in science, and all results require confirmation from other researchers repeating or extending the experiments. Sadly, in practice, many of the follow-up studies don’t get published in the most prominent journals because they are not “a big scoop.” Yet they often show that the initial, Big Headline Finding was overblown or even incorrect.
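
A quick simulation shows why those Big Headline Findings tend to shrink on replication. This is the statistical “winner’s curse” in miniature–a toy model of my own, not anything from the PLoS ONE paper: when the true effect is small and studies are small, the studies that clear the significance bar (the ones that make headlines) are precisely the ones that overshot.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

true_effect = 0.2   # a small real effect, in standard-deviation units
n_per_group = 30    # a typical small "initial" study
n_studies = 10_000

published = []  # effect estimates from studies reaching p < 0.05
for _ in range(n_studies):
    control = rng.normal(0.0, 1.0, n_per_group)
    treated = rng.normal(true_effect, 1.0, n_per_group)
    t_stat, p_value = stats.ttest_ind(treated, control)
    if p_value < 0.05:  # the ones journals and newspapers notice
        published.append(treated.mean() - control.mean())

print(f"true effect:                 {true_effect}")
print(f"mean 'headline' estimate:    {np.mean(published):.2f}")
print(f"fraction reaching headlines: {len(published) / n_studies:.0%}")
```

With these numbers, only about a tenth of the studies reach significance, and the ones that do report an effect roughly three times the true one. Follow-up studies then look like failures when they’re merely measuring reality.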

That brings me to an example that really pushes my buttons — childhood immunizations. In 1998, Andrew Wakefield and colleagues published a study in the prominent British medical journal The Lancet. The paper examined a hypothesized association between the MMR (measles, mumps, rubella) vaccine and autism, and the authors used fairly moderate language in their conclusions. But then Wakefield participated in a press conference about the paper and asserted in much stronger language that the combined MMR vaccine and autism were linked and that parents should turn to single shots for measles, mumps, and rubella. The news media ate it up.

The scientific community immediately pointed out a number of glaring flaws in the study, and subsequent investigations over the next decade failed to reproduce or confirm the results. But it was too late. The popular media and celebrities like Jenny McCarthy had already done the damage. Parents were terrified, vaccination rates dropped, and deadly measles and whooping cough outbreaks started cropping up.

Yes, the news media covered subsequent studies reporting no link between vaccines and autism, but let’s face it: Science is slow, and news is fast. In the interval, scary information takes root. The Lancet retracted the article 12 years after its publication, and in 2011, British investigative journalist Brian Deer demonstrated that Wakefield actively falsified data. Still, to this day, vaccination rates have not fully recovered, and many parents remain misinformed and concerned about vaccinating their children. Indeed, the Wakefield debacle has been directly blamed for a huge and ongoing measles outbreak in Wales.

I could haz Toxoplasma in my poop, so be careful.

Admittedly, the MMR case is an extreme example but also a good one of how a single initial study and the ensuing media hysteria can have a huge effect on parents — and on children’s health.

And we all have our trigger points for fear. One (of the many) things in our family tree is schizophrenia. A member of our extended family developed schizophrenia as an adolescent and has never recovered. Schizophrenia can run in families, so my two children have up to a 4% chance of developing this disorder compared to the 1.1% chance of someone without close relatives who have it.
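
For perspective, it helps to run the arithmetic on those percentages–a back-of-the-envelope sketch using only the figures above:

```python
# Figures from the text: ~1.1% lifetime risk in the general population,
# up to ~4% with an affected member of the extended family.
baseline_risk = 0.011
family_risk = 0.04

print(f"relative risk: {family_risk / baseline_risk:.1f}x")    # ~3.6x
print(f"chance of never developing it: {1 - family_risk:.0%}")  # 96%
```

A 3.6-fold relative risk sounds alarming, but the absolute numbers still leave a 96% chance of never developing the disorder. That framing doesn’t stop the worrying, as you’ll see, but it’s worth keeping in view.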

So along comes my March 2012 issue of The Atlantic featuring “How Your Cat Is Making You Crazy” by Kathleen McAuliffe. I would have found this article fascinating even if schizophrenia weren’t a concern. Its subject is a parasite called Toxoplasma gondii, which usually cycles through two hosts: cats and rodents. Toxo, as I’ll call this beast, starts life as an egg in a cat, is pooped out, and then gets picked up by a new cat. How does it get into a new cat? Cats, unlike dogs, are pretty fastidious and don’t tend to eat or otherwise mess around with cat poop. So Toxo gets itself into a less fastidious but tasty morsel like a mouse, instead, making its way into the cat when the mouse becomes dinner.

That seems simple enough, but there’s more. Toxo infection ups the odds of a mouse–cat encounter by hijacking the mouse’s brain and changing its behavior. The mouse’s activity level increases (cats love to chase fast-moving objects), and the rodent might become less wary in exposed areas and even attracted to the smell of cats. Infected mice move faster and wander into unknown spaces, seemingly without fear, as you can see in this video and this one.

The trouble for humans is that we also can pick up Toxo through contact with cat poop or by eating undercooked meat or unwashed veggies from a garden where cats poop. Becoming infected with Toxo during pregnancy can be very harmful to a fetus, so pregnant women have long been warned off cleaning kitty litter boxes. But healthy, non-pregnant adults infected with Toxo weren’t thought to experience any detrimental effects — until recently. According to McAuliffe’s article, which focuses on the work of Czech biologist Jaroslav Flegr, Toxo might alter human behavior, too, in mouse-like ways, such as reducing fearfulness. In most people, these purported behavioral shifts are probably very subtle and unremarkable. But Flegr suggests that in some people, Toxo infection serves as the trigger for mental illness, including schizophrenia.

Schizophrenia likely develops because of interactions between genes and the environment. Having risk gene variants isn’t a guarantee a person will develop schizophrenia, and it can arise in people without those risk variants. The list of potential environmental triggers is long and includes childhood stress, prenatal undernutrition, drug abuse, and …  infections with microbes like Toxo.

Reading this article set me off on a tear of worrying. We have a cat, but I wasn’t worried about her. She is an indoor cat (we love birds), and there is a very low incidence of Toxo infections in indoor cats. But we have outdoor cats and feral cats in our neighborhood. They sometimes hang out in our yard, where my kids like to play in the dirt and eat things out of the garden, including the dirt itself. Oh, poop.

I took to Google and researched cat traps and repellents and how to get kids to wash their hands. I lay awake at night for hours strategizing about how to keep my home and yard Toxo free. And then I realized that even if I managed to exclude all cats from my yard and pulled off the totally impossible feat of getting my children (ages 1 and 2) to wash their hands every time before they touched their faces or food, I was still doomed to failure. My kids would go to friends’ houses and play in their Toxo-infested yards. Or they might already have encountered Toxo anyway.

Toxo was something I couldn’t control, and I needed to let it go. At our next check-up, I talked to our pediatrician, who had never heard of the potential Toxo–schizophrenia link. She graciously concealed her “Oh, Lord, another parent with a loony theory” reaction and calmed me down. As she put it, my only real option to prevent Toxo infection was to never allow my children to play outdoors or in the dirt, and the detrimental effects of that were likely far greater than the risk of schizophrenia, Toxo or no Toxo.

And she also reminded me of what I already knew and should have remembered: These findings about Toxo are initial findings.

As a scientist, I know that the schizophrenia–Toxo link needs more study. A lot more study. As a parent, well … yeah. I still worry, and no lack of replication or confirmation is likely to stop me.

[Image credits: cat on this page by Sasan Geranmehr under Creative Commons 3.0 license; mouse under same license.]