My latest review in the San Francisco Chronicle
‘iBrain’ examines digital era’s mind games
Sunday, October 12, 2008
By Gary Small and Gigi Vorgan
Collins Living; 240 pages; $24.95
Technology has changed the world. But has it changed our brains?
Doom-and-gloom warnings have long been a standard critique of the technological age. To be sure, we pay the price for our attachments to computers, cell phones and Internet connections. The effects are physical, social, emotional and especially – as Gary Small and Gigi Vorgan argue – neurological. “iBrain: Surviving the Technological Alteration of the Modern Mind” explores what growing up digital means not just for an entire generation of young users but also for those who share their world.
Writing with his wife, Vorgan, Small is director of the Memory & Aging Center at UCLA’s Institute for Neuroscience and Human Behavior. Together they track the ways in which malleable young minds respond to a constant flow of stimuli. They explore how technology influences language acquisition, intelligence, empathy, emotional well-being and social interaction.
Vorgan and Small worry that the digital age may disrupt normal neurological development, in subtle and not-so-subtle ways, from infancy to adulthood. They offer, for example, a discussion of the midlife brain. Between ages 35 and 55, the mind is remarkably elastic and at its prime. The hemispheres of the brain begin to work in concert, thinking becomes more nimble, and social skills are at their peak.
The authors argue that, in these early generations of techno-brains, we cannot know what the long-term implications of neurological changes will be. The “future brain is yet to emerge” – and what will be gained or lost as young brains evolve more rapidly than at any other time in history is anyone’s guess. Case in point: it’s not clear how technology use is linked to a rise in autism, attention deficit hyperactivity disorder and other syndromes.
In the meantime, some good news: Your brain on Google has promise. Vorgan and Small reassure us that Internet use activates the part of the brain involved in decision-making and the assimilation of complex information. Digital technology has also been associated with a marked and consistent rise in IQ scores.
But as always, too much of a good thing is too much of a good thing. Surgeons who play video games may make fewer errors, but Vorgan and Small stress the fine line between using technology to challenge your mind and being in a spaced-out trance as you play during a weeklong binge.
“iBrain” is at once provocative and pedestrian. Perhaps better monikers could have been found to describe the gaps that exist between the “digital natives” (those who grew up surrounded by computers) and the “digital immigrants” (the rest of us who arrived late to the party).
The book dips, at times, into stereotypical descriptions of generational clashes. Anecdotes describe the challenges that employers will face as they hire computer-savvy youngsters who are plugged in 24/7. Are there still tech-phobic fuddy-duddies who grouse that no one takes dictation anymore? Just in case, the book offers up a chapter on how to send e-mail and instant messages, a glossary of “high-tech” words and a primer on text messaging and emoticons: remedial language lessons for the “immigrants.”
But is the generation gap as wide as this? Small and Vorgan acknowledge that, once thrown into the digital world, we all risk being sucked in. And in this new pounding, digital world we inhabit, it is crucial that we find ways to maintain the human connections – and by extension, neural connections – that help distinguish us from the machines to which we’re attached. Perhaps we’re all natives after all.
Holly Tucker is associate professor of medicine, health and society and associate professor of French and Italian at Vanderbilt University. E-mail her at books@sfchronicle.com.