Reading: Is Google Making Us Stupid?
I’m not the only one. When I mention my troubles with reading to friends and acquaintances—literary types, most of them—many say they’re having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing. Some of the bloggers I follow have also begun mentioning the phenomenon. Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether. “I was a lit major in college, and used to be [a] voracious book reader,” he wrote. “What happened?” He speculates on the answer: “What if I do all my reading on the web not so much because the way I read has changed, i.e. I’m just seeking convenience, but because the way I THINK has changed?”

Bruce Friedman, who blogs regularly about the use of computers in medicine, also has described how the Internet has altered his mental habits. “I now have almost totally lost the ability to read and absorb a longish article on the web or in print,” he wrote earlier this year. A pathologist who has long been on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me. His thinking, he said, has taken on a “staccato” quality, reflecting the way he quickly scans short passages of text from many sources online. “I can’t read War and Peace anymore,” he admitted. “I’ve lost the ability to do that. Even a blog post of more than three or four paragraphs is too much to absorb. I skim it.”

Anecdotes alone don’t prove much. And we still await the long-term neurological and psychological experiments that will provide a definitive picture of how Internet use affects cognition. But a recently published study of online research habits, conducted by scholars from University College London, suggests that we may well be in the midst of a sea change in the way we read and think. As part of the five-year research program, the scholars examined computer logs documenting the behavior of visitors to two popular research sites, one operated by the British Library and one by a U.K. educational consortium, that provide access to journal articles, e-books, and other sources of written information. They found that people using the sites exhibited “a form of skimming activity,” hopping from one source to another and rarely returning to any source they’d already visited. They typically read no more than one or two pages of an article or book before they would “bounce” out to another site. Sometimes they’d save a long article, but there’s no evidence that they ever went back and actually read it. The authors of the study report:
It is clear that users are not reading online in the traditional sense; indeed there are signs that new forms of “reading” are emerging as users “power browse” horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.
Thanks to the ubiquity of text on the Internet, not to mention the popularity of text-messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice. But it’s a different kind of reading, and behind it lies a different kind of thinking—perhaps even a new sense of the self. “We are not only what we read,” says Maryanne Wolf, a developmental psychologist at Tufts University and the author of Proust and the Squid: The Story and Science of the Reading Brain. “We are how we read.” Wolf worries that the style of reading promoted by the Net, a style that puts “efficiency” and “immediacy” above all else, may be weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose commonplace. When we read online, she says, we tend to become “mere decoders of information.” Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged.

Reading, explains Wolf, is not an instinctive skill for human beings. It’s not etched into our genes the way speech is. We have to teach our minds how to translate the symbolic characters we see into the language we understand. And the media or other technologies we use in learning and practicing the craft of reading play an important part in shaping the neural circuits inside our brains. Experiments demonstrate that readers of ideograms, such as the Chinese, develop a mental circuitry for reading that is very different from the circuitry found in those of us whose written language employs an alphabet. The variations extend across many regions of the brain, including those that govern such essential cognitive functions as memory and the interpretation of visual and auditory stimuli. We can expect as well that the circuits woven by our use of the Net will be different from those woven by our reading of books and other printed works.

Sometime in 1882, Friedrich Nietzsche bought a typewriter—a Malling-Hansen Writing Ball, to be precise. His vision was failing, and keeping his eyes focused on a page had become exhausting and painful, often bringing on crushing headaches. He had been forced to curtail his writing, and he feared that he would soon have to give it up. The typewriter rescued him, at least for a time. Once he had mastered touch-typing, he was able to write with his eyes closed, using only the tips of his fingers. Words could once again flow from his mind to the page.

But the machine had a subtler effect on his work. One of Nietzsche’s friends, a composer, noticed a change in the style of his writing. His already terse prose had become even tighter, more telegraphic. “Perhaps you will through this instrument even take to a new idiom,” the friend wrote in a letter, noting that, in his own work, his “‘thoughts’ in music and language often depend on the quality of pen and paper.”

“You are right,” Nietzsche replied, “our writing equipment takes part in the forming of our thoughts.” Under the sway of the machine, writes the German media scholar Friedrich A. Kittler, Nietzsche’s prose “changed from arguments to aphorisms, from thoughts to puns, from rhetoric to telegram style.”

The human brain is almost infinitely malleable.
People used to think that our mental meshwork, the dense connections formed among the 100 billion or so neurons inside our skulls, was largely fixed by the time we reached adulthood. But brain researchers have discovered that that’s not the case. James Olds, a professor of neuroscience who directs the Krasnow Institute for Advanced Study at George Mason University, says that even the adult mind “is very plastic.” Nerve cells routinely break old connections and form new ones. “The brain,” according to Olds, “has the ability to reprogram itself on the fly, altering the way it functions.”

As we use what the sociologist Daniel Bell has called our “intellectual technologies”—the tools that extend our mental rather than our physical capacities—we inevitably begin to take on the qualities of those technologies. The mechanical clock, which came into common use in the fourteenth century, provides a compelling example. In Technics and Civilization, the historian and cultural critic Lewis Mumford described how the clock “disassociated time from human events and helped create the belief in an independent world of mathematically measurable sequences.” The “abstract framework of divided time” became “the point of reference for both action and thought.”

The clock’s methodical ticking helped bring into being the scientific mind and the scientific man. But it also took something away. As the late MIT computer scientist Joseph Weizenbaum observed in his 1976 book, Computer Power and Human Reason: From Judgment to Calculation, the conception of the world that emerged from the widespread use of timekeeping instruments “remains an impoverished version of the older one, for it rests on a rejection of those direct experiences that formed the basis for, and indeed constituted, the old reality.” In deciding when to eat, to work, to sleep, to rise, we stopped listening to our senses and started obeying the clock.

The process of adapting to new intellectual technologies is reflected in the changing metaphors we use to explain ourselves to ourselves. When the mechanical clock arrived, people began thinking of their brains as operating “like clockwork.” Today, in the age of software, we have come to think of them as operating “like computers.” But the changes, neuroscience tells us, go much deeper than metaphor. Thanks to our brain’s plasticity, the adaptation occurs also at a biological level.

The Internet promises to have particularly far-reaching effects on cognition. In a paper published in 1936, the British mathematician Alan Turing proved that a digital computer, which at the time existed only as a theoretical machine, could be programmed to perform the function of any other information-processing device. And that’s what we’re seeing today. The Internet, an immeasurably powerful computing system, is subsuming most of our other intellectual technologies. It’s becoming our map and our clock, our printing press and our typewriter, our calculator and our telephone, and our radio and TV.

When the Net absorbs a medium, that medium is re-created in the Net’s image. It injects the medium’s content with hyperlinks, blinking ads, and other digital gewgaws, and it surrounds the content with the content of all the other media it has absorbed. A new e-mail message, for instance, may announce its arrival as we’re glancing over the latest headlines at a newspaper’s site.
The result is to scatter our attention and diffuse our concentration.

The Net’s influence doesn’t end at the edges of a computer screen, either. As people’s minds become attuned to the crazy quilt of Internet media, traditional media have to adapt to the audience’s new expectations. Television programs add text crawls and pop-up ads, and magazines and newspapers shorten their articles, introduce capsule summaries, and crowd their pages with easy-to-browse info-snippets. When, in March of this year, The New York Times decided to devote the second and third pages of every edition to article abstracts, its design director, Tom Bodkin, explained that the “shortcuts” would give harried readers a quick “taste” of the day’s news, sparing them the “less efficient” method of actually turning the pages and reading the articles. Old media have little choice but to play by the new-media rules.
Never has a communications system played so many roles in our lives—or exerted such broad influence over our thoughts—as the Internet does today. Yet, for all that’s been written about the Net, there’s been little consideration of how, exactly, it’s reprogramming us. The Net’s intellectual ethic remains obscure.

About the same time that Nietzsche started using his typewriter, an earnest young man named Frederick Winslow Taylor carried a stopwatch into the Midvale Steel plant in Philadelphia and began a historic series of experiments aimed at improving the efficiency of the plant’s machinists. With the approval of Midvale’s owners, he recruited a group of factory hands, set them to work on various metalworking machines, and recorded and timed their every movement as well as the operations of the machines. By breaking down every job into a sequence of small, discrete steps and then testing different ways of performing each one, Taylor created a set of precise instructions—an “algorithm,” we might say today—for how each worker should work. Midvale’s employees grumbled about the strict new regime, claiming that it turned them into little more than automatons, but the factory’s productivity soared.

More than a hundred years after the invention of the steam engine, the Industrial Revolution had at last found its philosophy and its philosopher. Taylor’s tight industrial choreography—his “system,” as he liked to call it—was embraced by manufacturers throughout the country and, in time, around the world. Seeking maximum speed, maximum efficiency, and maximum output, factory owners used time-and-motion studies to organize their work and configure the jobs of their workers. The goal, as Taylor defined it in his celebrated 1911 treatise, The Principles of Scientific Management, was to identify and adopt, for every job, the “one best method” of work and thereby to effect “the gradual substitution of science for rule of thumb throughout the mechanic arts.” Once his system was applied to all acts of manual labor, Taylor assured his followers, it would bring about a restructuring not only of industry but of society, creating a utopia of perfect efficiency. “In the past the man has been first,” he declared; “in the future the system must be first.”

Taylor’s system is still very much with us; it remains the ethic of industrial manufacturing. And now, thanks to the growing power that computer engineers and software coders wield over our intellectual lives, Taylor’s ethic is beginning to govern the realm of the mind as well. The Internet is a machine designed for the efficient and automated collection, transmission, and manipulation of information, and its legions of programmers are intent on finding the “one best method”—the perfect algorithm—to carry out every mental movement of what we’ve come to describe as “knowledge work.”

Google’s headquarters, in Mountain View, California—the Googleplex—is the Internet’s high church, and the religion practiced inside its walls is Taylorism. Google, says its chief executive, Eric Schmidt, is “a company that’s founded around the science of measurement,” and it is striving to “systematize everything” it does.
Drawing on the terabytes of behavioral data it collects through its search engine and other sites, it carries out thousands of experiments a day, according to the Harvard Business Review, and it uses the results to refine the algorithms that increasingly control how people find information and extract meaning from it. What Taylor did for the work of the hand, Google is doing for the work of the mind.

The company has declared that its mission is “to organize the world’s information and make it universally accessible and useful.” It seeks to develop “the perfect search engine,” which it defines as something that “understands exactly what you mean and gives you back exactly what you want.” In Google’s view, information is a kind of commodity, a utilitarian resource that can be mined and processed with industrial efficiency. The more pieces of information we can “access” and the faster we can extract their gist, the more productive we become as thinkers.

Where does it end? Sergey Brin and Larry Page, the gifted young men who founded Google while pursuing doctoral degrees in computer science at Stanford, speak frequently of their desire to turn their search engine into an artificial intelligence, a HAL-like machine that might be connected directly to our brains. “The ultimate search engine is something as smart as people—or smarter,” Page said in a speech a few years back. “For us, working on search is a way to work on artificial intelligence.” In a 2004 interview with Newsweek, Brin said, “Certainly if you had all the world’s information directly attached to your brain, or an artificial brain that was smarter than your brain, you’d be better off.” Last year, Page told a convention of scientists that Google is “really trying to build artificial intelligence and to do it on a large scale.”

Such an ambition is a natural one, even an admirable one, for a pair of math whizzes with vast quantities of cash at their disposal and a small army of computer scientists in their employ. A fundamentally scientific enterprise, Google is motivated by a desire to use technology, in Eric Schmidt’s words, “to solve problems that have never been solved before,” and artificial intelligence is the hardest problem out there. Why wouldn’t Brin and Page want to be the ones to crack it?

Still, their easy assumption that we’d all “be better off” if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling. It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google’s world, the world we enter when we go online, there’s little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.

The idea that our minds should operate as high-speed data-processing machines is not only built into the workings of the Internet, it is the network’s reigning business model as well. The faster we surf across the Web—the more links we click and pages we view—the more opportunities Google and other companies gain to collect information about us and to feed us advertisements.
Most of the proprietors of the commercial Internet have a financial stake in collecting the crumbs of data we leave behind as we flit from link to link—the more crumbs, the better. The last thing these companies want is to encourage leisurely reading or slow, concentrated thought. It’s in their economic interest to drive us to distraction.

Maybe I’m just a worrywart. Just as there’s a tendency to glorify technological progress, there’s a countertendency to expect the worst of every new tool or machine. In Plato’s Phaedrus, Socrates bemoaned the development of writing. He feared that, as people came to rely on the written word as a substitute for the knowledge they used to carry inside their heads, they would, in the words of one of the dialogue’s characters, “cease to exercise their memory and become forgetful.” And because they would be able to “receive a quantity of information without proper instruction,” they would “be thought very knowledgeable when they are for the most part quite ignorant.” They would be “filled with the conceit of wisdom instead of real wisdom.” Socrates wasn’t wrong—the new technology did often have the effects he feared—but he was shortsighted. He couldn’t foresee the many ways that writing and reading would serve to spread information, spur fresh ideas, and expand human knowledge (if not wisdom).

The arrival of Gutenberg’s printing press, in the fifteenth century, set off another round of teeth gnashing. The Italian humanist Hieronimo Squarciafico worried that the easy availability of books would lead to intellectual laziness, making men “less studious” and weakening their minds. Others argued that cheaply printed books and broadsheets would undermine religious authority, demean the work of scholars and scribes, and spread sedition and debauchery. As New York University professor Clay Shirky notes, “Most of the arguments made against the printing press were correct, even prescient.” But, again, the doomsayers were unable to imagine the myriad blessings that the printed word would deliver.

So, yes, you should be skeptical of my skepticism. Perhaps those who dismiss critics of the Internet as Luddites or nostalgists will be proved correct, and from our hyperactive, data-stoked minds will spring a golden age of intellectual discovery and universal wisdom. Then again, the Net isn’t the alphabet, and although it may replace the printing press, it produces something altogether different. The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author’s words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas. Deep reading, as Maryanne Wolf argues, is indistinguishable from deep thinking.

If we lose those quiet spaces, or fill them up with “content,” we will sacrifice something important not only in our selves but in our culture. In a recent essay, the playwright Richard Foreman eloquently described what’s at stake:
I come from a tradition of Western culture, in which the ideal (my ideal) was the complex, dense and “cathedral-like” structure of the highly educated and articulate personality—a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West. [But now] I see within us all (myself included) the replacement of complex inner density with a new kind of self—evolving under the pressure of information overload and the technology of the “instantly available.”
As we are drained of our “inner repertory of dense cultural inheritance,” Foreman concluded, we risk turning into “‘pancake people’—spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.”

I’m haunted by that scene in 2001. What makes it so poignant, and so weird, is the computer’s emotional response to the disassembly of its mind: its despair as one circuit after another goes dark, its childlike pleading with the astronaut—“I can feel it. I can feel it. I’m afraid”—and its final reversion to what can only be called a state of innocence. HAL’s outpouring of feeling contrasts with the emotionlessness that characterizes the human figures in the film, who go about their business with an almost robotic efficiency. Their thoughts and actions feel scripted, as if they’re following the steps of an algorithm. In the world of 2001, people have become so machinelike that the most human character turns out to be a machine. That’s the essence of Kubrick’s dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.