Our cell phones and computers had us convinced we could do five things at once. But neuroscience, says novelist Walter Kirn, is now finding that the mental gymnastics required actually dumbs us down.
In the Midwestern town where I grew up (a town so small that the phone line on our block was a “party line” well into the 1960s), there were two skinny brothers in their 30s who built a car that could drive into the river and become a fishing boat.
My pals and I thought the car-boat was a wonder. A thing that did one thing but also did another thing—ideally the opposite thing, but at least an unrelated thing—was our idea of a great invention and a bold stride toward the future. Where we got this idea, I’ll never know, but it caused us to envision a world-to-come teeming with crossbred, hyphenated machines. Refrigerator–TV sets. Dishwasher–air conditioners. Table saw–popcorn poppers. Camera-radios.
With that last dumb idea, we were getting close to something, as I’ve noted every time I’ve dropped or fumbled my cell phone and snapped a picture of a wall or the middle button of my shirt. Impressive. Ingenious. Yet juvenile. Arbitrary. And why a substandard camera, anyway? Why not an excellent electric razor?
Because (I told myself at the cell phone store in the winter of 2003, as I handled a feature-laden upgrade that my new contract entitled me to purchase at a deep discount that also included a rebate) there may come a moment on a plane or in a subway station or at a mall when I and the other able-bodied males will be forced to subdue a terrorist, and my color snapshot of his trussed-up body will make the front page of USA Today and appear at the left shoulder of all the superstars of cable news.
While I waited for my date with citizen-journalist destiny, I took a lot of self-portraits in my Toyota and forwarded them to a girlfriend in Colorado, who reciprocated from her Jeep. Neither one of us almost died. For months. But then, one night on a snowy two-lane highway, while I was crossing Wyoming to see my girl’s real face, my phone made its chirpy you-have-a-picture noise, and I glanced down in its direction while also, apparently, swerving off the pavement and sailing over a steep embankment toward a barbed-wire fence.
It was interesting to me—in retrospect, after having done some reading about the frenzied activity of the multitasking brain—how late in the process my prefrontal cortex, where our cognitive switchboards hide, changed its focus from the silly phone (Where did it go? Did it slip between the seats? I wonder if this new photo is a nude shot or if it’s another one from the topless series that seemed like such a breakthrough a month ago but now I’m getting sick of) to the important matter of a steel fence post sliding spear-like across my hood.
The laminated windshield glass must have been high quality; the point of the post bounced off it, leaving only a star-shaped surface crack. But I was still barreling toward sagebrush, and who knew what rocks and boulders lay in wait.
Five minutes later, I’d driven out of the field and gunned it back up the embankment onto the highway and was proceeding south, heart slowing some, satellite radio tuned to a soft-rock channel called the Heart, which was playing lots of soothing Céline Dion.
“I just had an accident trying to see your picture.”
“Will you get here in time to take me out to dinner?”
“I almost died.”
“Well, you sound fine.”
“Fine’s not a sound.”
I never forgave her for that detachment. I never forgave myself for buying a camera phone.
We all remember the promises. The slogans. They were all about freedom, liberation. Supposedly we were in handcuffs and wanted out of them. The key that dangled in front of us was a microchip.
“Where do you want to go today?” asked Microsoft in a mid-1990s ad campaign. The suggestion was that there were endless destinations—some geographic, some social, some intellectual—that you could reach in milliseconds by loading the right devices with the right software. It was further insinuated that where you went was purely up to you, not your spouse, your boss, your kids, or your government. Autonomy through automation.
This was the embryonic fallacy that grew up into the monster of multitasking.
Human freedom, as classically defined (to think and act and choose with minimal interference by outside powers), was not a product that firms like Microsoft could offer, but they recast it as something they could provide. A product for which they could raise the demand by refining its features, upping its speed, restyling its appearance, and linking it up with all the other products that promised freedom, too, but had replaced it with three inferior substitutes that they could market in its name:
Efficiency, convenience, and mobility.
For proof that these bundled minor virtues don’t amount to freedom but are, instead, a formula for a period of mounting frenzy climaxing with a lapse into fatigue, consider that “Where do you want to go today?” was really manipulative advice, not an open question. “Go somewhere now,” it strongly recommended, then go somewhere else tomorrow, but always go, go, go—and with our help. But did any rebel reply, “Nowhere. I like it fine right here”? Did anyone boldly ask, “What business is it of yours?” Was anyone brave enough to say, “Frankly, I want to go back to bed”?
Maybe a few of us. Not enough of us. Everyone else was going places, it seemed, and either we started going places, too—especially to those places that weren’t places (another word they’d redefined) but were just pictures or documents or videos or boxes on screens where strangers conversed by typing—or else we’d be nowhere (a location once known as “here”) doing nothing (an activity formerly labeled “living”). What a waste this would be. What a waste of our new freedom.
Our freedom to stay busy at all hours, at the task—and then the many tasks, and ultimately the multitask—of trying to be free. It isn’t working. It never has worked.
The scientists know this too, and they think they know why. Through a variety of experiments, many using functional magnetic resonance imaging to measure brain activity, they’ve torn the mask off multitasking and revealed its true face, which is blank and pale and drawn.
Multitasking messes with the brain in several ways. At the most basic level, the mental balancing acts that it requires—the constant switching and pivoting—energize regions of the brain that specialize in visual processing and physical coordination and simultaneously appear to shortchange some of the higher areas related to memory and learning. We concentrate on the act of concentration at the expense of whatever it is that we’re supposed to be concentrating on.
What does this mean in practice? Consider a recent experiment at UCLA, where researchers asked a group of 20-somethings to sort index cards in two trials, once in silence and once while simultaneously listening for specific tones in a series of randomly presented sounds. The subjects’ brains coped with the additional task by shifting responsibility from the hippocampus—which stores and recalls information—to the striatum, which takes care of rote, repetitive activities. Thanks to this switch, the subjects managed to sort the cards just as well with the musical distraction—but they had a much harder time remembering what, exactly, they’d been sorting once the experiment was over.
Even worse, certain studies find that multitasking boosts the level of stress-related hormones such as cortisol and adrenaline and wears down our systems through biochemical friction, prematurely aging us. In the short term, the confusion, fatigue, and chaos merely hamper our ability to focus and analyze, but in the long term, they may cause it to atrophy.
The next generation, presumably, is the hardest-hit. They’re the ones way out there on the cutting edge of the multitasking revolution, texting and instant messaging each other while they download music to their iPod and update their Facebook page and complete a homework assignment and keep an eye on the episode of The Hills flickering on a nearby television. (A recent study from the Kaiser Family Foundation found that 53 percent of students in grades seven through 12 report consuming some other form of media while watching television; 58 percent multitask while reading; 62 percent while using the computer; and 63 percent while listening to music. “I get bored if it’s not all going at once,” said a 17-year-old quoted in the study.) They’re the ones whose still-maturing brains are being shaped to process information rather than understand or even remember it.
This is the great irony of multitasking—that its overall goal, getting more done in less time, turns out to be chimerical. In reality, multitasking slows our thinking. It forces us to chop competing tasks into pieces, set them in different piles, then hunt for the pile we’re interested in, pick up its pieces, review the rules for putting the pieces back together, and then attempt to do so, often quite awkwardly. (Fact: A brain attempting to perform two tasks simultaneously will, because of all the back-and-forth stress, exhibit a substantial lag in information processing.)
Productive? Efficient? More like running up and down a beach repairing a row of sand castles as the tide comes rolling in and the rain comes pouring down. Multitasking, a definition: “The attempt by human beings to operate like computers, often done with the assistance of computers.” It begins by giving us more tasks to do, making each task harder to do, and dimming the mental powers required to do them. It finishes by making us forget exactly how on earth we did them (assuming we didn’t give up, or “multiquit”), which makes them harder to do again.
After the near-fatal consequences of my 2003 decision to buy a phone with a feature I didn’t need, life went on—and rather rapidly, since multitasking eats up time in the name of saving time, rushing you through your two-year contract cycle and returning you to the company store with a suspicion that you didn’t accomplish all you hoped to after your last optimistic, euphoric visit.
“Which of the ones that offer rebates don’t have cameras in them?”
“The decent models all do. The best ones now have video capabilities. You can shoot little movies.”
I wanted to ask, Of what? Oncoming barbed wire?
I shook my head. I was turning down whiz-bang features for the first time.
“I’ll take the fat little free one,” I told the salesman.
“The thing’s inert. It does nothing. It’s a pet rock.”
I informed him that I was old enough to have actually owned a pet rock once and that I missed it.
From a longer essay that appears in November’s The Atlantic Monthly. © 2007 by The Atlantic Monthly Group. Distributed by Tribune Media Services.