From The Atlantic website
“Dave, stop. Stop, will you? Stop, Dave. Will you stop, Dave?” So the supercomputer HAL pleads with the implacable astronaut Dave Bowman in a famous and weirdly poignant scene toward the end of Stanley Kubrick’s 2001: A Space Odyssey.
Bowman, having nearly been sent to a deep-space death by the malfunctioning machine, is calmly, coldly disconnecting the memory circuits that control its artificial “brain.” “Dave, my mind is going,” HAL says, forlornly. “I can feel it. I can feel it.”
I can feel it, too. Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going - so far as I can tell - but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy.
My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text.
The deep reading that used to come naturally has become a struggle.
For more than a decade now, I’ve been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet. The Web has been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes.
A few Google searches, some quick clicks on hyperlinks, and I’ve got the telltale fact or pithy quote I was after. Even when I’m not working, I’m as likely as not to be foraging in the Web’s info-thickets - reading and writing e-mails, scanning headlines and blog posts, watching videos and listening to podcasts, or just tripping from link to link to link.
(Unlike footnotes, to which they’re sometimes likened, hyperlinks don’t merely point to related works; they propel you toward them.)
But that boon comes at a price.
As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information.
They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles.
Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.
Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether.
He speculates on the answer: “What if I do all my reading on the web not so much because the way I read has changed, i.e. I’m just seeking convenience, but because the way I THINK has changed?”
Bruce Friedman, who blogs regularly about the use of computers in medicine, also has described how the Internet has altered his mental habits.
A pathologist who has long been on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me.
His thinking, he said, has taken on a “staccato” quality, reflecting the way he quickly scans short passages of text from many sources online.
Anecdotes alone don’t prove much.
And we still await the long-term neurological and psychological experiments that will provide a definitive picture of how Internet use affects cognition. But a recently published study of online research habits, conducted by scholars from University College London, suggests that we may well be in the midst of a sea change in the way we read and think.
As part of the five-year research program, the scholars examined computer logs documenting the behavior of visitors to two popular research sites, one operated by the British Library and one by a U.K. educational consortium, that provide access to journal articles, e-books, and other sources of written information. They found that people using the sites exhibited “a form of skimming activity,” hopping from one source to another and rarely returning to any source they’d already visited.
They typically read no more than one or two pages of an article or book before they would “bounce” out to another site. Sometimes they’d save a long article, but there’s no evidence that they ever went back and actually read it.
The authors of the study report: “It is clear that users are not reading online in the traditional sense; indeed there are signs that new forms of ‘reading’ are emerging as users ‘power browse’ horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.”
Thanks to the ubiquity of text on the Internet, not to mention the popularity of text-messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice.
But it’s a different kind of reading, and behind it lies a different kind of thinking - perhaps even a new sense of the self.
Maryanne Wolf, a developmental psychologist at Tufts University and the author of Proust and the Squid: The Story and Science of the Reading Brain, worries that the style of reading promoted by the Net, a style that puts “efficiency” and “immediacy” above all else, may be weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose commonplace. When we read online, she says, we tend to become “mere decoders of information.”
Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged.
Experiments demonstrate that readers of ideograms, such as the Chinese, develop a mental circuitry for reading that is very different from the circuitry found in those of us whose written language employs an alphabet. The variations extend across many regions of the brain, including those that govern such essential cognitive functions as memory and the interpretation of visual and auditory stimuli.
We can expect as well that the circuits woven by our use of the Net will be different from those woven by our reading of books and other printed works.
Writing tools reshape expression, too: when the philosopher Friedrich Nietzsche, his eyesight failing, began composing on a typewriter, a friend noticed that his already terse prose had become even tighter, more telegraphic.
The human brain is almost infinitely malleable.
People used to think that our mental meshwork, the dense connections formed among the 100 billion or so neurons inside our skulls, was largely fixed by the time we reached adulthood. But brain researchers have discovered that that’s not the case. James Olds, a professor of neuroscience who directs the Krasnow Institute for Advanced Study at George Mason University, says that even the adult mind “is very plastic.”
Nerve cells routinely break old connections and form new ones.
As we use what the sociologist Daniel Bell has called our “intellectual technologies” - the tools that extend our mental rather than our physical capacities - we inevitably begin to take on the qualities of those technologies. The mechanical clock, which came into common use in the 14th century, provides a compelling example.
In Technics and Civilization, the historian and cultural critic Lewis Mumford described how the clock “disassociated time from human events and helped create the belief in an independent world of mathematically measurable sequences.” The “abstract framework of divided time” became “the point of reference for both action and thought.”
In deciding when to eat, to work, to sleep, to rise, we stopped listening to our senses and started obeying the clock.
But the changes, neuroscience tells us, go much deeper than metaphor. Thanks to our brain’s plasticity, the adaptation occurs also at a biological level.
The Internet, an immeasurably powerful computing system, is subsuming most of our other intellectual technologies.
It’s becoming our map and our clock, our printing press and our typewriter, our calculator and our telephone, and our radio and TV.

The result is to scatter our attention and diffuse our concentration.
When, in March of this year, The New York Times decided to devote the second and third pages of every edition to article abstracts, its design director, Tom Bodkin, explained that the “shortcuts” would give harried readers a quick “taste” of the day’s news, sparing them the “less efficient” method of actually turning the pages and reading the articles. Old media have little choice but to play by the new-media rules.
Toward the end of the 19th century, Frederick Winslow Taylor carried a stopwatch into the Midvale Steel plant in Philadelphia and began a historic series of experiments aimed at improving the efficiency of the plant’s machinists. By breaking down every job into a sequence of small, discrete steps and then testing different ways of performing each one, Taylor created a set of precise instructions - an “algorithm,” we might say today - for how each worker should work. Midvale’s employees grumbled about the strict new regime, claiming that it turned them into little more than automatons, but the factory’s productivity soared.
Seeking maximum speed, maximum efficiency, and maximum output, factory owners used time-and-motion studies to organize their work and configure the jobs of their workers.
The goal, as Taylor defined it in his celebrated 1911 treatise, The Principles of Scientific Management, was to identify and adopt, for every job, the “one best method” of work and thereby to effect “the gradual substitution of science for rule of thumb throughout the mechanic arts.”
Once his system was applied to all acts of manual labor, Taylor assured his followers, it would bring about a restructuring not only of industry but of society, creating a utopia of perfect efficiency.
Taylor’s system is still very much with us; it remains the ethic of industrial manufacturing. And now, thanks to the growing power that computer engineers and software coders wield over our intellectual lives, Taylor’s ethic is beginning to govern the realm of the mind as well.
The Internet is a machine designed for the efficient and automated collection, transmission, and manipulation of information, and its legions of programmers are intent on finding the “one best method” - the perfect algorithm - to carry out every mental movement of what we’ve come to describe as “knowledge work.”
Drawing on the terabytes of behavioral data it collects through its search engine and other sites, Google carries out thousands of experiments a day, according to the Harvard Business Review, and it uses the results to refine the algorithms that increasingly control how people find information and extract meaning from it. What Taylor did for the work of the hand, Google is doing for the work of the mind.
It seeks to develop “the perfect search engine,” which it defines as something that “understands exactly what you mean and gives you back exactly what you want.”
In Google’s view, information is a kind of commodity, a utilitarian resource that can be mined and processed with industrial efficiency. The more pieces of information we can “access” and the faster we can extract their gist, the more productive we become as thinkers.
Sergey Brin and Larry Page, the gifted young men who founded Google while pursuing doctoral degrees in computer science at Stanford, speak frequently of their desire to turn their search engine into an artificial intelligence, a HAL-like machine that might be connected directly to our brains.
In a 2004 interview with Newsweek, Brin said, “Certainly if you had all the world’s information directly attached to your brain, or an artificial brain that was smarter than your brain, you’d be better off.”
Last year, Page told a convention of scientists that Google is “really trying to build artificial intelligence and to do it on a large scale.”
A fundamentally scientific enterprise, Google is motivated by a desire to use technology, in Eric Schmidt’s words, “to solve problems that have never been solved before,” and artificial intelligence is the hardest problem out there.
Why wouldn’t Brin and Page want to be the ones to crack it?
Most of the proprietors of the commercial Internet have a financial stake in collecting the crumbs of data we leave behind as we flit from link to link - the more crumbs, the better. The last thing these companies want is to encourage leisurely reading or slow, concentrated thought.
It’s in their economic interest to drive us to distraction.
In Plato’s Phaedrus, Socrates bemoaned the development of writing. He feared that, as people came to rely on the written word as a substitute for the knowledge they used to carry inside their heads, they would, in the words of one of the dialogue’s characters, “cease to exercise their memory and become forgetful.” And because they would be able to “receive a quantity of information without proper instruction,” they would “be thought very knowledgeable when they are for the most part quite ignorant.”
Socrates wasn’t wrong - the new technology did often have the effects he feared - but he was shortsighted. He couldn’t foresee the many ways that writing and reading would serve to spread information, spur fresh ideas, and expand human knowledge (if not wisdom).
The arrival of Gutenberg’s printing press, in the 15th century, set off another round of teeth gnashing. As New York University professor Clay Shirky notes, “Most of the arguments made against the printing press were correct, even prescient.”
But, again, the doomsayers were unable to imagine the myriad blessings that the printed word would deliver.
The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author’s words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas.
Deep reading, as Maryanne Wolf argues, is indistinguishable from deep thinking.
In a recent essay, the playwright Richard Foreman eloquently described what’s at stake:
As we are drained of our “inner repertory of dense cultural inheritance,” Foreman concluded, we risk turning into “pancake people - spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.”
I’m haunted by that scene in 2001.
What makes it so poignant, and so weird, is the computer’s emotional response to the disassembly of its mind: its despair as one circuit after another goes dark, its childlike pleading with the astronaut - “I can feel it. I can feel it. I’m afraid” - and its final reversion to what can only be called a state of innocence.
HAL’s outpouring of feeling contrasts with the emotionlessness that characterizes the human figures in the film, who go about their business with an almost robotic efficiency. Their thoughts and actions feel scripted, as if they’re following the steps of an algorithm.
In the world of 2001, people have become so machinelike that the most human character turns out to be a machine.
That’s the essence of Kubrick’s dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.