
The Mind — it is a-changin’

June 24, 2010


In three earlier posts [When does Connectivity Narrow Thinking?, Books Forever, and Is Google Rewiring our Brains?], I discussed how the availability of the Internet appears to have affected the way we think. The last of those posts focused on an article by Nicholas Carr, who believes that search engines like Google are doing more than making information available on the World Wide Web. He believes they are rewiring our brains, and he’s not sure he likes the result. Carr has now published a book entitled The Shallows: What the Internet Is Doing to Our Brains. John Horgan, a science journalist and director of the Center for Science Writings at the Stevens Institute of Technology in Hoboken, NJ, reviewed the book for the Wall Street Journal [“So Many Links, So Little Time,” 4 June 2010]. He writes:

“While toiling over what you are now reading, I scanned my three email accounts dozens of times and wrote a handful of emails; I responded on my cellphone to a score of text messages from my girlfriend and kids; I checked the balance of my bank account to see if a promised payment had arrived … and so on. Yet I’m relatively unwired. I don’t do Twitter, Facebook or Skype. And I did all this digital darting hither and thither even though I found the subject I was supposed to be writing about—Nicholas Carr’s ‘The Shallows’—quite absorbing. And disturbing. We all joke about how the Internet is turning us, and especially our kids, into fast-twitch airheads incapable of profound cogitation. It’s no joke, Mr. Carr insists, and he has me persuaded. The Internet has transformed my professional and personal lives in many positive ways. Writing about, say, the biology of aggression, I can find more high-quality information in minutes than I could have dug up in weeks when I was beginning my science-writing career in the early 1980s. I can post material online and start receiving feedback—not all of it inane—within minutes, all the while conversing with colleagues, friends and family members by email. Who would regret these advances? But Mr. Carr shows that we’re paying a price for plugging in. Many studies ‘point to the same conclusion,’ he writes. ‘When we go online, we enter an environment that promotes cursory reading, hurried and distracted thinking, and superficial learning.’ Mr. Carr calls the Web ‘a technology of forgetfulness.’ The average Web page entices us with an array of embedded links to other pages, which countless users pursue even while under constant bombardment from email, RSS, Twitter and Facebook accounts. As a result, we skim Web pages and skip quickly from one to another. We read in what is called an ‘F’ pattern: After taking in the first two lines of a text, we zip straight down the rest of the page. We lose the ability to transfer knowledge from short-term ‘working’ memory to long-term memory, where it can shape our worldviews in enduring ways.”

If you’ve ever been around someone with a learning disability that makes it difficult to transfer knowledge from short-term working memory to long-term memory, then you know that Carr’s concern is a real one. The inability to transfer knowledge from short-term to long-term memory makes life much more difficult. People with that disability find taking tests torturous. They know they aren’t stupid, but test results often undermine their self-esteem. Learning from past mistakes also becomes harder when those lessons never move from short-term to long-term memory. But memory transfer problems aren’t the only challenge facing information-age humans. Horgan moves on to another interesting topic Carr raises: multitasking. He writes:

“The multitasking that is enabled, and encouraged, by our laptops and hand-held devices is supposed to boost our productivity but often diminishes it, Mr. Carr says. Students who Net-surf during class, even if their searches are related to the professor’s lecture, remember less than unconnected students. (That settles it: I will never again let my students have open laptops in class.) Verbal SAT scores—which measure reading and writing aptitude—have dropped over the past decade as Internet usage has skyrocketed. What we gain from the Internet in breadth of knowledge—or rather, access to knowledge—we lose in depth. Mr. Carr quotes the playwright Richard Foreman’s lament that we are becoming ‘pancake people—spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.’ We are losing our capacity for the kind of sustained, deep contemplation and reflection required to read—let alone write—serious works of fiction and nonfiction. I sense these changes in myself, and I suspect that a lot of other people do, too.”

It used to be that a person with broad intellectual interests who was accomplished in both the arts and the sciences was labeled a Renaissance man, not a pancake person. The term reflects the fact that during the Renaissance, when the store of knowledge was limited, a scholar could know almost everything about many topics. With the explosion of knowledge that has occurred over the past couple of hundred years, knowing “almost everything” about anything is nearly impossible. Even so, I believe there is a place in the world for a person with broad interests and general knowledge on a lot of subjects. Carr’s argument appears to be that in a world that demands specialization, we need very few generalists but are creating a world full of them. Horgan continues:

“For many, the pros of connectedness vastly outweigh the cons. My 86-year-old father, who bought an iPhone this year, loves it. No matter where he is, he exults, he can check sports scores and stock prices, read his favorite pundits online, and stay in touch with his kids and grandkids. When I told him that I was reviewing a book about how the Internet is making us dumber, he said: ‘It makes me feel smarter!’ I expected a similar reaction from my teenage son and daughter. Like most American kids, they commune with friends via text messages and Facebook updates (email is so passé), and they spend endless hours trolling the Web for odd videos and cool music. But rather than dismissing Mr. Carr’s thesis as old-fogeyish, as I expected, they confessed that their dependence on the Internet sometimes worries them. My son would like to cut back on his online time, but he fears isolation from his friends.”

I believe there is a big difference between “connectedness” and “social connectivity.” Being connected is what has permitted billions of people to raise the quality of their lives and begin the slow climb out of poverty. They didn’t spend that connectivity tweeting about their bathroom habits; they used it to improve their education, foster business connections, and join the global economy. Horgan concludes:

“My own Internet usage feels compulsive, addictive. Which raises another matter posed by Mr. Carr: Are we really choosing these information technologies of our own free will, because they improve our lives? Our BlackBerrys and Droids offer us infinite options, but such virtual freedom masks a deeper loss of control. Mr. Carr quotes Ralph Waldo Emerson’s aphorism: ‘Things are in the saddle / and ride mankind.’ What Emerson said about railroads and steam engines is even truer of today’s information technologies. In a poignant ‘digression’ toward the end of ‘The Shallows,’ Mr. Carr addresses an obvious question: If the Internet is so distracting, how did a blogger, Facebooker and Tweeter like him manage to write a 276-page book? Answer: He and his wife moved to a mountain town in Colorado that lacked cellphone and broadband Internet service. He stopped blogging and cut back on instant messaging, Skyping, emailing. He gradually started to feel ‘less like a lab rat and more like, well, a human being. My brain could breathe again.’ As he finished the book, Mr. Carr plugged right back in. And upgraded: He bought a Wi-Fi gadget that lets him stream music, movies and videos from the Internet to his stereo and television. ‘I have to confess: it’s cool,’ he writes. ‘I’m not sure I could live without it.’”

I know I certainly wouldn’t like to return to a less connected time. That doesn’t mean there aren’t times when I wish I weren’t connected; everyone needs some “down” time. I certainly believe the pros of connectivity far outweigh the cons. But as Carr suspects, we are paying a mental price for being too connected and trying to multitask too often. He is supported by researchers who agree that our minds are being rewired [“Hooked on Gadgets, and Paying a Mental Price,” by Matt Richtel, New York Times, 6 June 2010]. Richtel reports:

“Scientists say juggling e-mail, phone calls and other incoming information can change how people think and behave. They say our ability to focus is being undermined by bursts of information. These play to a primitive impulse to respond to immediate opportunities and threats. The stimulation provokes excitement — a dopamine squirt — that researchers say can be addictive. In its absence, people feel bored. The resulting distractions can have deadly consequences, as when cellphone-wielding drivers and train engineers cause wrecks. And for millions of people …, these urges can inflict nicks and cuts on creativity and deep thought, interrupting work and family life. While many people say multitasking makes them more productive, research shows otherwise. Heavy multitaskers actually have more trouble focusing and shutting out irrelevant information, scientists say, and they experience more stress. And scientists are discovering that even after the multitasking ends, fractured thinking and lack of focus persist. In other words, this is also your brain off computers.”

Like Carr, Nora Volkow, director of the National Institute on Drug Abuse and one of the world’s leading brain scientists, believes that “technology is rewiring our brains.” Richtel reports that Volkow “and other researchers compare the lure of digital stimulation less to that of drugs and alcohol than to food and sex, which are essential but counterproductive in excess.” Even so, Internet usage does have some positive effects. Richtel explains:

“Technology use can benefit the brain in some ways, researchers say. Imaging studies show the brains of Internet users become more efficient at finding information. And players of some video games develop better visual acuity. More broadly, cellphones and computers have transformed life. They let people escape their cubicles and work anywhere. They shrink distances and handle countless mundane tasks, freeing up time for more exciting pursuits. For better or worse, the consumption of media, as varied as e-mail and TV, has exploded. In 2008, people consumed three times as much information each day as they did in 1960. And they are constantly shifting their attention. Computer users at work change windows or check e-mail or other programs nearly 37 times an hour, new research shows. The nonstop interactivity is one of the most significant shifts ever in the human environment, said Adam Gazzaley, a neuroscientist at the University of California, San Francisco. ‘We are exposing our brains to an environment and asking them to do things we weren’t necessarily evolved to do,’ he said. ‘We know already there are consequences.’ … At home, people consume 12 hours of media a day on average, when an hour spent with, say, the Internet and TV simultaneously counts as two hours. That compares with five hours in 1960, say researchers at the University of California, San Diego. Computer users visit an average of 40 Web sites a day, according to research by RescueTime, which offers time-management tools. As computers have changed, so has the understanding of the human brain. Until 15 years ago, scientists thought the brain stopped developing after childhood. Now they understand that its neural networks continue to develop, influenced by things like learning skills.”

A good portion of Richtel’s article deals with the effects of multitasking. He continues:

“Not long after Eyal Ophir arrived at Stanford in 2004, he wondered whether heavy multitasking might be leading to changes in a characteristic of the brain long thought immutable: that humans can process only a single stream of information at a time. Going back a half-century, tests had shown that the brain could barely process two streams, and could not simultaneously make decisions about them. But Mr. Ophir, a student-turned-researcher, thought multitaskers might be rewiring themselves to handle the load. His passion was personal. He had spent seven years in Israeli intelligence after being weeded out of the air force — partly, he felt, because he was not a good multitasker. Could his brain be retrained? Mr. Ophir, like others around the country studying how technology bent the brain, was startled by what he discovered.”

Richtel goes on to explain how Ophir set up his multitasking experiments:

“Test subjects were divided into two groups: those classified as heavy multitaskers based on their answers to questions about how they used technology, and those who were not. In a test created by Mr. Ophir and his colleagues, subjects at a computer were briefly shown an image of red rectangles. Then they saw a similar image and were asked whether any of the rectangles had moved. It was a simple task until the addition of a twist: blue rectangles were added, and the subjects were told to ignore them. The multitaskers then did a significantly worse job than the non-multitaskers at recognizing whether red rectangles had changed position. In other words, they had trouble filtering out the blue ones — the irrelevant information. So, too, the multitaskers took longer than non-multitaskers to switch among tasks, like differentiating vowels from consonants and then odd from even numbers. The multitaskers were shown to be less efficient at juggling problems.”
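
To make the structure of Ophir’s filtering task concrete, here is a minimal Python sketch of the trial logic as Richtel describes it. The grid size, the rectangle counts, the reshuffling of the distractors between frames, and the two observer rules are all illustrative assumptions of mine, not details of the Stanford team’s actual design or code.

```python
import random

GRID = 8  # hypothetical 8x8 grid of candidate rectangle positions

def make_trial(n_red, n_blue, change):
    """Build one trial: two frames of red targets plus blue distractors."""
    cells = random.sample(range(GRID * GRID), n_red + n_blue)
    red, blue = cells[:n_red], cells[n_red:]
    red2 = list(red)
    if change:
        # Move one red rectangle to a previously unoccupied cell.
        free = [c for c in range(GRID * GRID) if c not in cells]
        red2[random.randrange(n_red)] = random.choice(free)
    # Blue distractors reshuffle between frames; they carry no information
    # about the answer and are supposed to be ignored.
    blue2 = random.sample(
        [c for c in range(GRID * GRID) if c not in red2], n_blue)
    return (red, blue), (red2, blue2), change

def accuracy(trials, respond):
    """Fraction of trials where respond(frame1, frame2) matches the truth."""
    hits = sum(respond(f1, f2) == changed for f1, f2, changed in trials)
    return hits / len(trials)

if __name__ == "__main__":
    random.seed(0)
    trials = [make_trial(n_red=2, n_blue=4, change=random.random() < 0.5)
              for _ in range(2000)]
    # An observer that filters properly: compares only the red rectangles.
    filtered = lambda f1, f2: set(f1[0]) != set(f2[0])
    # An observer that fails to filter: compares every rectangle, so the
    # shifting blue distractors trigger constant false alarms.
    unfiltered = lambda f1, f2: (set(f1[0]) | set(f1[1])
                                 != set(f2[0]) | set(f2[1]))
    print("filtered observer:  ", accuracy(trials, filtered))
    print("unfiltered observer:", accuracy(trials, unfiltered))
```

Run as written, the observer that compares only the red rectangles answers perfectly, while the one that cannot ignore the shifting blue distractors falls to roughly chance accuracy, a toy version of the filtering deficit the heavy multitaskers showed.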

Getting to know yourself is important. Knowing your vulnerabilities and tendencies can help you develop strategies to compensate for your shortcomings. If our brains really are being rewired for multitasking, then we need to know how this might affect people in the workplace. Richtel continues:

“Other tests at Stanford, an important center for research in this fast-growing field, showed multitaskers tended to search for new information rather than accept a reward for putting older, more valuable information to work. Researchers say these findings point to an interesting dynamic: multitaskers seem more sensitive than non-multitaskers to incoming information. The results also illustrate an age-old conflict in the brain, one that technology may be intensifying. A portion of the brain acts as a control tower, helping a person focus and set priorities. More primitive parts of the brain, like those that process sight and sound, demand that it pay attention to new information, bombarding the control tower when they are stimulated. Researchers say there is an evolutionary rationale for the pressure this barrage puts on the brain. The lower-brain functions alert humans to danger, like a nearby lion, overriding goals like building a hut. In the modern world, the chime of incoming e-mail can override the goal of writing a business plan or playing catch with the children. ‘Throughout evolutionary history, a big surprise would get everyone’s brain thinking,’ said Clifford Nass, a communications professor at Stanford. ‘But we’ve got a large and growing group of people who think the slightest hint that something interesting might be going on is like catnip. They can’t ignore it.’ Mr. Nass says the Stanford studies are important because they show multitasking’s lingering effects: ‘The scary part for [multitaskers] is, they can’t shut off their multitasking tendencies when they’re not multitasking.’ Melina Uncapher, a neurobiologist on the Stanford team, said she and other researchers were unsure whether the muddied multitaskers were simply prone to distraction and would have had trouble focusing in any era. But she added that the idea that information overload causes distraction was supported by more and more research.”

Richtel reports that other studies indicate that multitasking affects more than just people’s minds.

“A study at the University of California, Irvine, found that people interrupted by e-mail reported significantly increased stress compared with those left to focus. Stress hormones have been shown to reduce short-term memory, said Gary Small, a psychiatrist at the University of California, Los Angeles.”

As noted above, not all physical and mental transformations taking place are negative.

“Preliminary research shows some people can more easily juggle multiple information streams. These ‘supertaskers’ represent less than 3 percent of the population, according to scientists at the University of Utah. Other research shows computer use has neurological advantages. In imaging studies, Dr. Small observed that Internet users showed greater brain activity than nonusers, suggesting they were growing their neural circuitry. At the University of Rochester, researchers found that players of some fast-paced video games can track the movement of a third more objects on a screen than nonplayers. They say the games can improve reaction and the ability to pick out details amid clutter. ‘In a sense, those games have a very strong both rehabilitative and educational power,’ said the lead researcher, Daphne Bavelier, who is working with others in the field to channel these changes into real-world benefits like safer driving.”

I suspect the day will come when employees are tested for multitasking ability, visual acuity, and other emerging skills, just as workers in the past were tested for things like typing and writing. In the meantime, Richtel reports that a debate rages over whether the changes being studied are good or bad.

“There is a vibrant debate among scientists over whether technology’s influence on behavior and the brain is good or bad, and how significant it is. ‘The bottom line is, the brain is wired to adapt,’ said Steven Yantis, a professor of brain sciences at Johns Hopkins University. ‘There’s no question that rewiring goes on all the time,’ he added. But he said it was too early to say whether the changes caused by technology were materially different from others in the past. Mr. Ophir is loath to call the cognitive changes bad or good, though the impact on analysis and creativity worries him. He is not just worried about other people. Shortly after he came to Stanford, a professor thanked him for being the one student in class paying full attention and not using a computer or phone. But he recently began using an iPhone and noticed a change; he felt its pull, even when playing with his daughter.”

The debate is important because multitasking technologies are starting to creep into schools. For several years, Duke University has issued iPods to freshmen. A school in Massachusetts recently announced that it would allow students to use smartphones in class. Korea and the United Kingdom are debating whether smartphones should have built-in features that assist in educational settings. Such moves are not without critics. “Researchers worry that constant digital stimulation … creates attention problems for children with brains that are still developing, who already struggle to set priorities and resist impulses.” Richtel concludes:

“Mr. Nass at Stanford thinks the ultimate risk of heavy technology use is that it diminishes empathy by limiting how much people engage with one another, even in the same room. ‘The way we become more human is by paying attention to each other,’ he said. ‘It shows how much you care.’ That empathy, Mr. Nass said, is essential to the human condition. ‘We are at an inflection point,’ he said. ‘A significant fraction of people’s experiences are now fragmented.'”

Someone once said that multitasking is a way to screw up everything simultaneously. It seems to me that there is a fine line between multitasking and the ability to move quickly back and forth between tasks. Multitasking appears to contribute to distraction and confusion while the ability to move quickly and successfully between tasks requires focus and clarity. If our brains are being rewired for multitasking, Carr’s and Volkow’s concerns should be taken seriously. On the other hand, if we can find a way to rewire our brains to enhance focus and clarity, I suspect that would be a good thing. Only time will tell.
