Oregon State Bar Bulletin AUGUST/SEPTEMBER 2015

“Court Martial” aired on Feb. 2, 1967, during the first season of the television series “Star Trek.” Attorney Samuel T. Cogley boards the starship Enterprise to represent Captain Kirk, who stands accused of negligence in the death of a crewman. On the Enterprise, circa 2267, all data, systems and information are available from talking computer terminals scattered densely throughout the ship. But lawyer Cogley has little use for the sophisticated technology. Instead, he crams Captain Kirk’s quarters with piles of thick leather-bound books. “This is where the law is,” he tells Kirk, “not in that homogenized, pasteurized, synthesized —” Cogley motions disdainfully at the computer.
Minus the intergalactic space travel, Gene Roddenberry’s vision of a computer-controlled future has become a reality. Author Nicholas Carr believes that in our own Digital Age, we are changing our brains as a species (The Shallows: What the Internet Is Doing to Our Brains (2010)). The intellectual functions we are sacrificing — sustained, linear thought; the ability to follow an argument — are the very ones that typify the traditional practice of law.
Caveman Lawyer
From the year 2267, we’ll look back to prehistory. Say you’re a cave dweller. You’re seated on a comfortable boulder on the front porch of your cave, using your stone knife to sharpen a spear braced between your knees. With all the powers of your primitive humanoid brain, you concentrate on the task before you, using what psychologists call “top-down attentional control.”
Suddenly, you hear a sound. Could be a predator: danger. Could be prey: an opportunity to put that spear to good use. “Bottom-up attentional control” takes over, an involuntary and instinctual shift in focus. Your task forgotten, you leap to your feet, poised for action.
“Fast-paced, reflexive shifts in focus were once crucial to human survival,” writes Lauren Newell, in “Redefining Attention (and Revamping the Legal Profession?) for the Digital Generation.” Bottom-up processing has always been and remains dominant over top-down attentional control. “Modern brains,” says Newell, “react to novel stimuli just as the brains of Cro-Magnon man did 40,000 years ago.”
Fast forward to 2015. You’re sitting at your desk, frowning over a lengthy brief, digesting your opponent’s convoluted arguments. Making full use of your brain’s executive functions, your top-down processing abilities, you bring all your powers of concentration to bear. Your surroundings fade as you read, your pencil scratching marginalia, your mind composing fragments of refutation.
Suddenly, you hear a sound. Could be danger. Could be opportunity. Because your brain has not changed that much in 40,000 years, your bottom-up processing kicks in and the brief is forgotten. Your iPhone has pinged. You have a new text message.
Survival of the Busiest
In The Shallows, Nicholas Carr reminds us that the human brain is plastic: “Evolution has given us a brain that can literally change its mind — over and over again.” Our brains are plastic, though, not necessarily elastic. We can form bad neurological habits as well as good ones. We can strengthen our mental capabilities, but we are equally subject to “intellectual decay.” The Shallows 31, 34. A great deal of research shows that our exposure to information and communication technologies, or “ICTs,” is changing us neurologically, probably not for the better. As a species, our overall facility for deep concentration is deteriorating. Indeed, we are redefining what it means to “pay attention” (Newell, “Redefining Attention”).
We are busy. We are multitaskers. Now that owning a mobile telephone is as common as having indoor plumbing, we talk on the phone while we drive, walk, order coffee, and use the indoor plumbing. We check email, send text messages, listen to podcasts, update our social network profiles. Our smartphones ring when we receive a call, just like our dumb phones used to. They also alert us — on average about 12 times every hour — to missed calls, text messages, email, software updates, feeds and tweets. We are always on alert, always prepared for interruption. Our clients seem to expect instantaneous access and response.
If we pause to think about it (But wait! A new email!), the apparent goal of all this superficial activity is time optimization. Yet plenty of research demonstrates that multitasking is far from efficient; in fact, it is impossible to fully attend to two tasks at once. As humans, we are capable of doing one thing at a time, competently. Or we can switch rapidly back and forth between two tasks — which is what we call “multitasking” — and perform both tasks badly.
Interestingly, the term “multitasking,” like the concept itself, is borrowed from computer engineering (see sidebar for more computer etymology). Computers with two or more independent processing units, or “cores,” are capable of true multitasking. Humans are not, for the simple reason that we have only one brain.
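For readers who want to see the engineering distinction concretely, here is a minimal sketch in Python (purely illustrative; the task names are invented for this example, and nothing like it appears in Carr or Newell). On a machine with two or more cores, two processes can genuinely run at the same time; a single worker, like a single brain, can only alternate:

    import time
    from multiprocessing import Process

    def task(name, steps=3):
        # A stand-in for any unit of work: reading a brief, answering an email.
        for i in range(steps):
            print(name + ": step " + str(i + 1))
            time.sleep(0.1)  # simulate effort

    if __name__ == "__main__":
        # True multitasking: two processes execute simultaneously,
        # each on its own core.
        brief = Process(target=task, args=("brief",))
        email = Process(target=task, args=("email",))
        brief.start()
        email.start()
        brief.join()
        email.join()

        # Human "multitasking": one processor switching rapidly between
        # tasks, never doing both at once (what psychologists call task
        # switching).
        for _ in range(3):
            task("brief", steps=1)
            task("email", steps=1)

In the toy example, the interleaved version takes just as long as doing each task whole; the switching buys nothing, a point David Meyer makes below.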
David Meyer, director of the University of Michigan’s Brain, Cognition, and Action Laboratory, says multitasking is counterproductive, stressful and ultimately futile (Woolston, “Multitasking and Stress”). “Anytime you’re trying to multitask,” says Meyer, “you have less attention available to store memories.” Thus, a lawyer who attempts to carry on a phone conversation while responding to an email is unlikely to recall the substance of either. The “overloaded brain” of the multitasker “shifts its processing from the hippocampus (responsible for memory) to the striatum (responsible for rote tasks), making it hard to learn a task or even recall what you’ve been doing once you’re done” (Anderson, “In Defense of Distraction”).
Multitasking produces not only inefficiency, but anxiety. Dr. Jon Kabat-Zinn, founding executive director of a stress-reduction program at the University of Massachusetts Medical Center, writes that our obsession with saving time, our practice of continually dividing our attention, creates a sense of “free-floating urgency” in nearly every aspect of our lives. He characterizes our digital society as an “ADD Nation,” suffering from a collective attention deficit disorder (Coming to Our Senses).
Nicholas Carr joins the attack on multitasking, describing the “impairment” caused by frequent interruptions such as email alerts: They “scatter our thoughts, weaken our memory, and make us tense and anxious,” he says (The Shallows 132). He also points to the “cognitive overload” created by even the most casual Internet use — a result of the Internet’s two central demands on its users: extraneous problem solving (such as the decision whether to click on a link) and divided attention.
With our smartphones, laptops, tablets and “phablets,” we can connect to the Internet from practically anywhere. And the Internet typifies, creates and reflects our busy-ness, our “state of continuous partial attention” (Newell, “Redefining Attention”). Skillfully and by design, the Net “seizes our attention only to scatter it,” writes Carr. “It returns us to our native state of bottom-up distractedness” (The Shallows 118).
We Internet consumers, indoctrinated into the cult and culture of multitasking, seem to prefer speed over depth, skimming over reading. But even if we didn’t, distraction would still be profitable. Nicholas Carr (whose book grew from a 2008 Atlantic article entitled “Is Google Making Us Stupid?”) assures us that Google’s profitability depends on the superficiality of our Internet experience: “Google’s profits are tied directly to the velocity of people’s information intake. The faster we surf across the surface of the Web — the more links we click and pages we view — the more opportunities Google gains to collect information about us and to feed us advertisements” (The Shallows 156-57).
Google is not the only company cashing in on our collective attention deficit disorder. Apple sold 74.5 million iPhones in the fourth quarter of 2014, following the release of the new iPhone 6. Since June 2007, when the “First Generation” iPhone was born, Apple has released a new version of its signature mobile device every year. In 2009 (release of the iPhone 3GS), a study at Ball State University revealed that most Americans spend more than eight hours each day looking at a screen. In 2013 (iPhone 5C and 5S), more than four hours of that daily screen time was split evenly between mobile devices and desktop or laptop computers; by 2014 (iPhone 6), mobile devices had taken a strong lead, averaging nearly three hours of use per day, excluding voice calls.
In contrast, according to 2008 (iPhone 3G) figures from the U.S. Bureau of Labor Statistics, Americans between the ages of 25 and 34 were spending less than one hour per week reading printed documents.
Resistance Is Futile
Even in our century, Samuel T. Cogley stands a little apart with his stacks of books. “For some people,” writes Nicholas Carr, “the very idea of reading a book has come to seem old-fashioned, maybe even a little silly — like sewing your own shirts or butchering your own meat” (The Shallows 8). But for 500 years, ever since Johannes Gutenberg created his movable-type printing press, books held their cherished place at the center of intellectual life in the West. When, in 1501, Italian printer Aldus Manutius produced the “octavo” format, the truly portable book became a personal possession, and reading wove itself indelibly into “the fabric of everyday life” (The Shallows 70).
Not everyone thought books were wonderful. Sixteenth-century scholars like Swiss scientist Conrad Gessner raised alarms about the confusing overabundance of information created by the printing industry. In the 18th century, with the growth of periodical publishing, critics argued that newspapers created social isolation. In fact, the protest against print publications was substantially identical to current concerns about the Internet and ICTs. As neuropsychologist Vaughan Bell writes, “Worries about information overload are as old as information itself, with each generation reimagining the dangerous impacts of technology on mind and brain” (“Don’t Touch That Dial!”).
The curmudgeons who predicted that books would cause the downfall of Western civilization could at least dispense their prophecies in print. When the Greek alphabet was invented around 750 B.C., the opinions of naysayers (who surely existed) mostly faded into oblivion. Even without a contemporaneous record, however, we know that written language transformed human communication and expression. It may also have alienated humans from their environment. Quoting from Marshall McLuhan’s Understanding Media: The Extensions of Man (1964), Nicholas Carr speculates that the “oral world of our distant ancestors may well have had emotional and intuitive depths that we can no longer appreciate,” that preliterate humans “must have enjoyed a particularly intense ‘sensuous involvement’ with the world” (The Shallows 56).
In The Rise of the Network Society, Manuel Castells posits that the “new alphabetic order … separated written communication from the audiovisual system of symbols and perceptions, so critical for the fully fledged expression of the human mind.” He believes the Digital Age represents a “transformation of similar historic dimensions,” with the potential to reintegrate “text, images and sounds” and fundamentally alter the nature of communication (356). And although Nicholas Carr mourns the departure of “the intellectual tradition of solitary, single-minded concentration,” he admits that we may “come to see digitization as a liberating act, a way of freeing text from the page” (The Shallows 108, 114).
Michael Chorost, author of World Wide Mind, goes further. He envisions an integration of humans and technology that would “interlink” our minds as seamlessly as the right and left hemispheres of the human brain, creating an interconnected hive of humanity and redefining our understanding of the individual and the community (14-16).
Warp Nine!
Our love affair with the Internet and ICTs is passionate, rushed and — like many great romances — largely unexamined. The “warp speed” at which we invite mobile devices into our lives has “far out-paced the available research on the subject,” warns Kendra Cherry in “How Do Smartphones Affect the Brain?” In a market economy, where consumerism and consumption reign supreme, the rate of acceleration appears inevitable, and our thirst for nifty new devices seems unquenchable. According to Manuel Castells, the market “gives rise to an acceleration of technological innovation and a faster diffusion of such innovation, as ingenious minds, driven by passion and greed, constantly scan the industry for market niches in products and processes” (The Rise of the Network Society 69).
It seems the faster we go, the more we crave speed. Beset by alerts and alarms, we remain in a primitive state of distraction. We skim through massive amounts of information, faster and faster but rarely with deep attention. Time seems compressed. Everything feels urgent. Inevitably, all of this affects the way we practice law.
Some of us are tempted to cast our lot with Samuel T. Cogley, stubbornly siding with the curmudgeons and Luddites in the face of inevitable change. But, as the enemy Borg of “Star Trek: The Next Generation” assure us, resistance is (probably) futile. Before we assimilate, though — before we embrace the very traits of our technology and abandon deep thinking — let us at least make a thorough and thoughtful examination of what we are doing.
Sources:
Sam Anderson, “In Defense of Distraction,” New York Magazine (May 17, 2009).
Vaughan Bell, “Don’t Touch That Dial! A History of Media Technology Scares, from the Printing Press to Facebook,” Slate (Feb. 15, 2010); www.slate.com/articles/health_and_science/science/2010/02/dont_touch_that_dial.single.html.
Nicholas Carr, The Shallows: What the Internet Is Doing to Our Brains (2010).
Manuel Castells, The Rise of the Network Society (2nd ed. 2000).
Tom Chatfield, “The 10 Best Words the Internet Has Given English,” The Guardian (April 17, 2013); www.theguardian.com/books/2013/apr/17/tom-chatfield-top-10-internet-neologisms.
Kendra Cherry, “How Do Smartphones Affect the Brain?”; http://psychology.about.com/od/biopsychology/fl/How-Do-Smartphones-Affect-the-Brain.htm.
Michael Chorost, World Wide Mind: The Coming Integration of Humanity, Machines, and the Internet (2011).
Jon Kabat-Zinn, Coming to Our Senses: Healing Ourselves and the World Through Mindfulness (2005).
Lauren Newell, “Redefining Attention (and Revamping the Legal Profession?) for the Digital Generation,” 15 Nev. L. J. (forthcoming 2015).
Chris Woolston, “Multitasking and Stress,” HealthDay (Mar. 11, 2015); http://consumer.healthday.com/encyclopedia/emotional-health-17/emotional-disorder-news-228/multitasking-and-stress-646052.html.
ABOUT THE AUTHOR
Jennie Bricker is a Portland-area attorney in private practice (Jennie Bricker Land & Water Law, jbrickerlaw.com) and also a freelance writer doing business as Brick Work Writing & Editing LLC. She can be reached at (503) 928-0976 or brickworkwriting@gmail.com.
© 2015 Jennie Bricker