Recently, my two-year-old nephew Benjamin came across a copy of Vanity Fair abandoned on the floor. His eyes scanned the glossy cover, which shone less fiercely than the iPad he is used to but had a faint luster of its own. I watched his pudgy thumb and index finger pinch together and spread apart on Bradley Cooper's smiling mug. At last, Benjamin looked over at me, flummoxed and frustrated, as though to say, "This thing's broken."

Search YouTube for "baby" and "iPad" and you'll find clips featuring one-year-olds attempting to manipulate magazine pages and television screens as though they were touch-sensitive displays. These children are one step away from assuming that such technology is a natural, spontaneous part of the material world. They'll grow up thinking about the internet with the same nonchalance that I hold toward my toaster and teakettle. I can resist all I like, but for Benjamin's generation resistance is moot. The revolution is already complete.

Technology Is Evolving Just Like Our DNA Does

With its theory of evolution, Charles Darwin's On the Origin of Species may have outlined, back in 1859, an idea that explains our children's relationship with iPhones and Facebook. We are now witness to a new kind of evolution, one played out by our technologies.

The "meme," a term coined by evolutionary biologist Richard Dawkins in 1976, is an extension of Darwin's Big Idea past the boundaries of genetics. A meme, put simply, is a cultural product that is copied. We humans are enamored of imitation and so become the ultimate "meme machines." Memes—pieces of culture—copy themselves through history and enjoy a kind of evolution of their own, and they do so riding on the backs of successful genes: ours.

According to the memeticist Susan Blackmore, just as Darwinism submits that genes good at replicating will naturally become the most prevalent, technologies with a knack for replication rise to dominance. These "temes," as she calls the new replicators, can be copied, varied, and selected as digital information—thus establishing a new evolutionary process (and one far speedier than our genetic model). Blackmore's work offers a fascinating explanation for why each generation seems less capable of managing solitude, and less likely to opt for technological disengagement.


She suggests that temes are a different kind of replicator from the basic memes of everyday material culture. "Most memes . . . we forget how often we get them wrong," Blackmore says. (Oral traditions of storytelling, for example, were characterized by constant twists in the tale.) "But with digital machines the fidelity is almost 100 percent. As it is, indeed, with our genes." This is a startling thought: By delivering to the world technologies capable of replicating information with the same accuracy as DNA, we are playing a grand game indeed.

Old Ways of Thinking Are on the Verge of Extinction

The brains our children are born with are not substantively different from the brains our ancestors had 40,000 years ago. For all the wild variety of our cultures, personalities, and thought patterns, we're all still operating with roughly the same three-pound lump of gray matter. But almost from day one, the wiring of the neurons in those brains (and therefore the way they function) is different today from the way it was even one generation ago. Every second of your lived experience represents new connections among the roughly 86 billion neurons packed inside your brain. Children, then, can become literally incapable of thinking and feeling the way their grandparents did. A slower, less harried way of thinking may be on the verge of extinction.

Michael Harris

Michael Harris is a contributing editor at Western Living and Vancouver magazine. His award-winning writing appears regularly in publications such as The Huffington Post and The Walrus. He is the author of The End of Absence and lives in Toronto, Canada.

In your brain, your billions of neurons are tied to each other by trillions of synapses, a portion of which are firing right now, forging (by still mysterious means) your memory of this sentence, your critique of this very notion, and your emotions as you reflect on this information. Our brains are so plastic that they will reengineer themselves to function optimally in whatever environment we give them. Repetition of stimuli strengthens the neural circuits that respond to them; neglect of other stimuli causes the corresponding circuits to weaken. (Grannies who maintain their crossword-puzzle regimen knew that already.)

UCLA's Gary Small is a pioneer of neuroplasticity research, and in 2008 he produced the first solid evidence showing that our brains are reorganized by our use of the internet. He placed a set of "internet naïve" people in MRI machines and made recordings of their brain activity while they took a stab at going online. Small then had each of them practice browsing the internet for an hour a day for a week. On returning to the MRI machine, those subjects now toted brains that lit up significantly in the frontal lobe, where there had been minimal neural activity beforehand. Neural pathways quickly develop when we give our brains new tasks, and Small had shown that this held true—over the course of just a few hours, in fact—following internet use.


"We know that technology is changing our lives. It's also changing our brains," he announced. On the one hand, neuroplasticity gives him great hope for the elderly. "It's not just some linear trajectory with older brains getting weaker," he told me. The flip side of all this, though, is that young brains may be more equipped to deal with digital reality than with the decidedly less flashy reality that makes up our dirty, sometimes boring, material world.

In The Shallows, Nicholas Carr describes how the internet fundamentally works on our plastic minds to make them more capable of shallow thinking and less capable of deep thinking. After enough time in front of our screens, we learn to absorb more information less effectively, skip the bottom half of paragraphs, shift focus constantly; "the brighter the software, the dimmer the user," he suggests at one point.

Kids These Days Can Think Quickly—But Not Deeply

The most startling example of our brain's malleability, though, comes from new research by neural engineers at Boston University who now suggest that our children will be able to "incept" a person "to acquire new learning, skills, or memory, or possibly restore skills or knowledge that has been damaged through accident, disease, or aging, without a person's awareness of what is learned or memorized." The team was able to use decoded functional magnetic resonance imaging (fMRI) to modify in highly specific ways the brain activity in the visual cortex of their human subjects.

The possibilities of such injections of "unearned" learning are as marvelous as they are quagmires for bioethical debate. Your grandchild's brain could be trained in a certain direction while watching ads through digital contact lenses, without his or her awareness (or, for that matter, acquiescence). For now, we can tell that something has changed in our minds, but we still feel helpless against it, and we even feel addicted to the technologies that are that change's agents. But will our children feel the static?

Some argue that the young are developing new skills better suited to their own reality than to an outmoded past.

In 2012, Elon University and the Pew Internet and American Life Project released a report compiling the opinions of 1,021 critics, experts, and stakeholders on digital natives. Their boiled-down message was that young people now count on the internet as "their external brain" and have become skillful decision makers—even while they also "thirst for instant gratification and often make quick, shallow choices."

Some of those experts were optimistic about the future brains of the young. Susan Price, CEO and chief Web strategist at San Antonio's Firecat Studio, suggested that "those who bemoan the perceived decline in deep thinking . . . fail to appreciate the need to evolve our processes and behaviors to suit the new realities and opportunities." Price promises that the young are developing new skills and standards better suited to their own reality than to the outmoded reality of, say, 1992. Meanwhile, the report's coauthor, Janna Anderson, noted that while many respondents were enthusiastic about the future of such minds, there was a clear dissenting voice: "Some said they are already witnessing deficiencies in young people's abilities to focus their attention, be patient and think deeply. Some experts expressed concerns that trends are leading to a future in which most people become shallow consumers of information, endangering society."

We may be on our way to becoming servants to the evolution of our own technologies. The power shifts very quickly from the spark of human intention to the absorption of human will by a technology that seems to have intentions of its own.

But we'll likely find there was no robotic villain behind the curtain. Our own capitalist drive pushes these technologies to evolve. We push the technology down an evolutionary path that results in the most addictive possible outcome. Yet even as we do this, it doesn't feel as though we have any control. It feels, instead, like a destined outcome—a fate.

Excerpted from The End of Absence: Reclaiming What We've Lost in a World of Constant Connection by Michael Harris, by arrangement with Current, an imprint of Penguin Random House. Copyright © Michael Harris, 2014.

Editor: Samantha Oltman (@samoltman)