2013-10-10

Nicholas Carr. The Shallows: What the Internet Is Doing to Our Brains. W.W. Norton & Company, Inc., 2010. (276 pages)



Prologue: The Watchdog and the Thief

What both enthusiast and skeptic miss is what McLuhan saw: that in the long run a medium’s content matters less than the medium itself in influencing how we think and act. (3)

The computer screen bulldozes our doubts with its bounties and conveniences. It is so much our servant that it would seem churlish to notice that it is also our master. (4)

One: Hal and Me

Over the last few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. (5)

The boons are real. But they come at a price. As McLuhan suggested, media aren’t just channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. Whether I’m online or not, my mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. (7)

What if I do all my reading on the web not so much because the way I read has changed, i.e. I’m just seeking convenience, but because the way I THINK has changed? – Scott Karp

…for society as a whole the Net has become, in just the twenty years since the software programmer Tim Berners-Lee wrote the code for the World Wide Web, the communication and information medium of choice. (9)

Calm, focused, undistracted, the linear mind is being pushed aside by a new kind of mind that wants and needs to take in and dole out information in short, disjointed, often overlapping bursts — the faster, the better. (10)

For the last five centuries, ever since Gutenberg’s printing press made book reading a popular pursuit, the linear, literary mind has been at the center of art, science, and society. As supple as it is subtle, it’s been the imaginative mind of the Renaissance, the rational mind of the Enlightenment, the inventive mind of the Industrial Revolution, even the subversive mind of Modernism. It may soon be yesterday’s mind. (10)

The computer, I began to sense, was more than just a simple tool that did what you told it to do. It was a machine that, in subtle but unmistakable ways, exerted an influence over you. (13)

Sometime in 2007, a serpent of doubt slithered into my info-paradise. I began to notice that the Net was exerting a much stronger and broader influence over me than my old stand-alone PC ever had. It wasn’t just that I was spending so much time staring into a computer screen. It wasn’t just that so many of my habits and routines were changing as I became more accustomed to and dependent on the sites and services of the Net. The very way my brain worked seemed to be changing. It was then that I began worrying about my inability to pay attention to one thing for more than a couple of minutes. At first I’d figured that the problem was a symptom of middle-age mind rot. But my brain, I realized, wasn’t just drifting. It was hungry. It was demanding to be fed the way the Net fed it — and the more it was fed, the hungrier it became. Even when I was away from my computer, I yearned to check e-mail, click links, do some Googling. I wanted to be connected. Just as Microsoft Word had turned me into a flesh-and-blood word processor, the Internet, I sensed, was turning me into something like a high-speed data-processing machine, a human HAL. | I missed my old brain. (16)

Two: The Vital Paths

Our writing equipment takes part in the forming of our thoughts. – Friedrich Nietzsche

Neurons have central cores, or somas, which carry out the functions common to all cells, but they also have two kinds of tentacle-like appendages — axons and dendrites — that transmit and receive electric pulses. (19)

As the idea of the unchangeable adult brain hardened into dogma, it turned into a kind of “neurological nihilism,” according to the research psychiatrist Norman Doidge. Because it created “a sense that treatment for many brain problems was ineffective or unwarranted,” Doidge explains, it left those with mental illnesses or brain injuries little hope of treatment, much less cure. And as the idea “spread through our culture,” it ended up “stunting our overall view of human nature. Since the brain could not change, human nature, which emerges from it, seemed necessarily fixed and unalterable as well.” There was no regeneration; there was only decay. We, too, were stuck in the frozen concrete of our brain cells — or at least in the frozen concrete of received wisdom. (23-24)

As brain science continues to advance, the evidence for [neuro]plasticity strengthens. Using sensitive new brain-scanning equipment, as well as microelectrodes and other probes, neuroscientists conduct more experiments, not only on lab animals but on people. All of them confirm [Michael] Merzenich’s discovery. They also reveal something more: The brain’s plasticity is not limited to the somatosensory cortex, the area that governs our sense of touch. It’s universal. Virtually all of our neural circuits — whether they’re involved in feeling, seeing, hearing, moving, thinking, learning, perceiving, or remembering — are subject to change. The received wisdom is cast aside. (26)

THE ADULT BRAIN, it turns out, is not just plastic but, as James Olds, a professor of neuroscience who directs the Krasnow Institute for Advanced Study at George Mason University, puts it, “very plastic.” …The plasticity diminishes as we get older — brains do get stuck in their ways — but it never goes away. (26)

Cells that fire together wire together – Hebb’s rule
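
Hebb’s rule has a simple computational reading that may make the quote concrete: the strength of a synapse grows in proportion to how often the two connected neurons are active at the same time. The sketch below is my own toy illustration of that idea, not anything from Carr’s text; the function name and learning rate are arbitrary.

```python
# Toy illustration of Hebb's rule: the weight linking two neurons
# strengthens in proportion to their co-activation (dw = eta * pre * post).

def hebbian_update(weight, pre_activity, post_activity, learning_rate=0.1):
    """Return the weight after one step of Hebbian learning."""
    return weight + learning_rate * pre_activity * post_activity

weight = 0.0
# Repeated co-firing steadily strengthens the connection...
for _ in range(10):
    weight = hebbian_update(weight, pre_activity=1.0, post_activity=1.0)
print(round(weight, 2))  # -> 1.0

# ...while activity in only one neuron leaves it unchanged ("neurons that
# don't fire together don't wire together," as Carr echoes later).
print(hebbian_update(0.0, pre_activity=1.0, post_activity=0.0))  # -> 0.0
```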

The plasticity of our synapses brings into harmony two philosophies of the mind that have for centuries stood in conflict: empiricism and rationalism. (28)

…nature and nurture “actually speak the same language. They both ultimately achieve their mental and behavioral effects by shaping the synaptic organization of the brain.” – Joseph LeDoux, Synaptic Self. (28-29)

The brain is not the machine we once thought it to be. Though different regions are associated with different mental functions, the cellular components do not form permanent structures or play rigid roles. They’re flexible. They change with experience, circumstance, and need. (29)

We have learned that neuroplasticity is not only possible but that it is constantly in action. … That is the way we adapt to changing conditions, the way we learn new facts, and the way we develop new skills. – Mark Hallett

Neuroplasticity, argues Pascual-Leone, is one of the most important products of evolution, a trait that enables the nervous system “to escape the restrictions of its own genome and thus adapt to environmental pressures, physiologic changes, and experiences.” The genius of our brain’s construction is not that it contains a lot of hardwiring but that it doesn’t. (31)

Our ways of thinking, perceiving, and acting, we now know, are not entirely determined by our genes. Nor are they entirely determined by our childhood experiences. We change them through the way we live — and, as Nietzsche sensed, through the tools we use. (31)

Descartes may have been wrong about dualism, but he appears to have been correct in believing that our thoughts can exert a physical influence on, or at least cause a physical reaction in, our brains. We become, neurologically, what we think. (33) [VIA: cf. Philippians 4:8]

The brain’s adaptability hasn’t just led to new treatments, and new hope, for those suffering from brain injury or illness. It provides all of us with a mental flexibility, an intellectual litheness, that allows us to adapt to new situations, learn new skills, and in general expand our horizons. | But the news is not all good. Although neuroplasticity provides an escape from genetic determinism, a loophole for free thought and free will, it also imposes its own form of determinism on our behavior. As particular circuits in our brain strengthen through the repetition of a physical or mental activity, they begin to transform that activity into a habit. The paradox of neuroplasticity, observes Doidge, is that, for all the mental flexibility it grants us, it can end up locking us into “rigid behaviors.” The chemically triggered synapses that link our neurons program us, in effect, to want to keep exercising the circuits they’ve formed. Once we’ve wired new circuitry in our brain, Doidge writes, “we long to keep it activated.” That’s the way the brain fine-tunes its operations. Routine activities are carried out ever more quickly and efficiently, while unused circuits are pruned away. (34)

Plastic does not mean elastic, in other words. (34)

If we stop exercising our mental skills, we do not just forget them: the brain map space for those skills is turned over to the skills we practice instead. – Norman Doidge

Jeffrey Schwartz, a professor of psychiatry at UCLA’s medical school, terms this process “survival of the busiest.” (35)

a digression: on what the brain thinks about when it thinks about itself

The brain, packed neatly into the bone-crate of the skull, gives us no sensory signal of its existence. … The source of consciousness lies beyond the grasp of consciousness. (37)

…the brain — and the mind to which it gives rise — is forever a work in progress. That’s true not just for each of us as individuals. It’s true for all of us as a species. (38)

Three: Tools of the Mind

Our intellectual maturation as individuals can be traced through the way we draw pictures, or maps, of our surroundings. We begin with primitive, literal renderings of the features of the land we see around us, and we advance to ever more accurate, and more abstract, representations of geographic and topographic space. We progress, in other words, from drawing what we see to drawing what we know. (40)

The intellectual process of transforming experience in space to abstraction of space is a revolution in modes of thinking. – Vincent Virga

[cf. "The Flynn Effect"]

The map is a medium that not only stores and transmits information but also embodies a particular mode of seeing and thinking. (41)

The use of a reduced, substitute space for that of reality is an impressive act in itself. – Arthur Robinson

But what’s even more impressive is how the map “advanced the evolution of abstract thinking” throughout society. “The combination of the reduction of reality and the construct of an analogical space is an attainment in abstract thinking of a very high order indeed, for it enables one to discover structures that would remain unknown if not mapped.” (41)

What the map did for space — translate a natural phenomenon into an artificial and intellectual conception of that phenomenon — another technology, the mechanical clock, did for time. (41)

[Life was...] dominated by agrarian rhythms, free of haste, careless of exactitude, unconcerned by productivity. – Jacques Le Goff

The mechanical clock changed the way we saw ourselves. … Once the clock had redefined time as a series of units of equal duration, our minds began to stress the methodical mental work of division and measurement. … The clock played a crucial role in propelling us out of the Middle Ages and into the Renaissance and then the Enlightenment. (43)

[the clock] helped create the belief in an independent world of mathematically measurable sequences. [The] abstract framework of divided time [became] the point of reference for both action and thought. – Lewis Mumford

Independent of the practical concerns that inspired the timekeeping machine’s creation and governed its day-to-day use, the clock’s methodical ticking helped bring into being the scientific mind and the scientific man. (44)

Every technology is an expression of human will. (44)

Carr groups technologies into four categories, according to the way they extend our native capacities: those that amplify our physical strength, dexterity, or resilience; those that extend the range or sensitivity of our senses; those that enable us to reshape nature to better serve our needs or desires; and, finally, the “intellectual technologies” that extend or support our mental powers. (44-45)

…it is our intellectual technologies that have the greatest and most lasting power over what and how we think. They are our most intimate tools, the ones we use for self-expression, for shaping personal and public identity, and for cultivating relations with others. (45)

…intellectual technologies, when they come into popular use, often promote new ways of thinking that had been limited to a small, elite group. Every intellectual technology, to put it another way, embodies an intellectual ethic, a set of assumptions about how the human mind works or should work. The map and the clock shared a similar ethic. Both placed a new stress on measurement and abstraction, on perceiving and defining forms and processes beyond those apparent to the senses. (45)

The intellectual ethic of a technology is rarely recognized by its inventors. (45)

The intellectual ethic is the message that a medium or other tool transmits into the minds and culture of its users. (46)

Some have made the case for what the sociologist Thorstein Veblen dubbed “technological determinism”; they’ve argued that technological progress, which they see as an autonomous force outside man’s control, has been the primary factor influencing the course of human history. Karl Marx gave voice to this view when he wrote, “The windmill gives you society with the industrial capitalist.” Ralph Waldo Emerson put it more crisply: “Things are in the saddle / And ride mankind.” In the most extreme expression of the determinist view, human beings become little more than “the sex organs of the machine world,” as McLuhan memorably wrote in the “Gadget Lover” chapter of Understanding Media. (46)

At the other end of the spectrum are the instrumentalists. … The idea that we’re somehow controlled by our tools is anathema to most people. (46)

Technology is technology. It is a means for communication and transportation over space, and nothing more. – James Carey

But if you take a broader historical or social view, the claims of the determinists gain credibility. (47)

If the experience of modern society shows us anything, it is that technologies are not merely aids to human activity, but also powerful forces acting to reshape that activity and its meaning – Langdon Winner

The conflict between the determinists and the instrumentalists will never be resolved. It involves, after all, two radically different views of the nature and destiny of humankind. The debate is as much about faith as it is about reason. But there is one thing that determinists and instrumentalists can agree on: technological advances often mark turning points in history. (47-48)

There are plenty of fossilized bodies, but there are no fossilized minds. (48)

…the ways human beings think and act have changed almost beyond recognition through those millennia. … Through what we do and how we do it — moment by moment, day by day, consciously or unconsciously — we alter the chemical flows in our synapses and change our brains. And when we hand down our habits of thought to our children, through the examples we set, the schooling we provide, and the media we use, we hand down as well the modifications in the structure of our brains. (49)

Once maps had become common, people began to picture all sorts of natural and social relationships as cartographic, as a set of fixed, bounded arrangements in real or figurative space. We began to “map” our lives, our social spheres, even our ideas. Under the sway of the mechanical clock, people began thinking of their brains and their bodies — of the entire universe, in fact — as operating “like clockwork.” (50)

Technologies are not mere exterior aids but also interior transformations of consciousness, and never more than when they affect the word. – Walter J. Ong

…an inventor is not the most reliable judge of the value of his invention

O man full of arts, to one is it given to create the things of art, and to another to judge what measure of harm and of profit they have for those that shall employ them. And so it is that you, by reason of the tender regard for the writing that is your offspring, have declared the very opposite of its true effect. – Thamus

By substituting outer symbols for inner memories, writing threatens to make us shallower thinkers, he says, preventing us from achieving the intellectual depth that leads to wisdom and true happiness. (55)

Poetry and literature represented opposing ideals of the intellectual life. (55)

In a purely oral culture, thinking is governed by the capacity of human memory. (56)

In Plato’s time, and for centuries afterward, that heightened consciousness was reserved for an elite. Before the cognitive benefits of the alphabet could spread to the masses, another set of intellectual technologies — those involved in the transcription, production, and distribution of written works — would have to be invented. (57)

Four: The Deepening Page

Silent reading was largely unknown in the ancient world. (60)

When he read, his eyes scanned the page and his heart explored the meaning, but his voice was silent and his tongue was still – Augustine writing of Ambrose, the bishop of Milan

scriptura continua … reflected language’s origins in speech. (61)

The lack of word separation, combined with the absence of word order conventions, placed an “extra cognitive burden” on ancient readers… (61)
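
A toy way (my own illustration, not the book’s) to feel that “extra cognitive burden” of scriptura continua is simply to strip the word separation out of a modern sentence and hand the parsing work back to the reader:

```python
# Remove spaces and punctuation to approximate scriptura continua:
# the reader, not the page, must now decide where each word ends.
text = "The form of written language began to accommodate the needs of readers"
continua = "".join(ch for ch in text if ch.isalpha()).upper()
print(continua)
# THEFORMOFWRITTENLANGUAGEBEGANTOACCOMMODATETHENEEDSOFREADERS
```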

Not until well after the collapse of the Roman Empire did the form of written language finally break from the oral tradition and begin to accommodate the unique needs of readers. … By the start of the second millennium writers had begun to impose rules of word order on their work, fitting words into a predictable, standardized system of syntax. (62)

The visual cortex, for example, develops “a veritable collage” of neuron assemblies dedicated to recognizing, in a matter of milliseconds, “visual images of letters, letter patterns, and words.” As the brain becomes more adept at decoding text, turning what had been a demanding problem-solving exercise into a process that is essentially automatic, it can dedicate more resources to the interpretation of meaning. What we today call “deep reading” becomes possible. By “altering the neurophysiological process of reading,” word separation “freed the intellectual faculties of the reader,” Saenger writes; “even readers of modest intellectual capacity could read more swiftly, and they could understand an increasing number of inherently more difficult texts.” (63)

Readers didn’t just become more efficient. They also became more attentive. (63)

Our fast-paced, reflexive shifts in focus were once crucial to our survival. They reduced the odds that a predator would take us by surprise or that we’d overlook a nearby source of food. For most of history, the normal path of human thought was anything but linear. (64)

To read a book was to practice an unnatural process of thought, one that demanded sustained, unbroken attention to a single, static object. It required readers to place themselves at what T.S. Eliot, in Four Quartets, would call “the still point of the turning world.” (64)

In the quiet spaces opened up by the prolonged, undistracted reading of a book, people made their own associations, drew their own inferences and analogies, fostered their own ideas. They thought deeply as they read deeply. (65)

The medieval bishop Isaac of Syria described how, whenever he read to himself,

as in a dream, I enter a state when my sense and thoughts are concentrated. Then, when with prolonging of this silence the turmoil of memories is stilled in my heart, ceaseless waves of joy are sent me by inner thoughts, beyond expectation suddenly arising to delight my heart.

Readers disengaged their attention from the outward flow of passing stimuli in order to engage it more deeply with an inward flow of words, ideas, and emotions. That was — and is — the essence of the unique mental process of deep reading. It was the technology of the book that made this “strange anomaly” in our psychological history possible. The brain of the book reader was more than a literate brain. It was a literary brain. (65)

Now, writing began to take on, and to disseminate, the new intellectual ethic: the ethic of the book. The development of knowledge became an increasingly private act, with each reader creating, in his own mind, a personal synthesis of the ideas and information passed down through the writings of other thinkers. The sense of individualism strengthened. (67)

According to one estimate, the number of books produced in the fifty years following Gutenberg’s invention equaled the number produced by European scribes during the preceding thousand years. (69)

By the end of the fifteenth century, nearly 250 towns in Europe had print shops, and some 12 million volumes had already come off their presses. (70)

…the arrival of movable-type printing was a central event in the history of Western culture and the development of the Western mind. (72)

The brain regions that are activated often “mirror those involved when people perform, imagine, or observe similar real world activities.” Deep reading, says the study’s lead researcher, Nicole Speer, “is by no means a passive exercise.” The reader becomes the book. (74)

After Gutenberg’s invention, the bounds of language expanded rapidly as writers, competing for the eyes of ever more sophisticated and demanding readers, strived to express ideas and emotions with superior clarity, elegance, and originality. The vocabulary of the English language, once limited to just a few thousand words, expanded to upwards of a million words as books proliferated. Many of the new words encapsulated abstract concepts that simply hadn’t existed before. Writers experimented with syntax and diction, opening new pathways of thought and imagination. Readers eagerly traveled down those pathways, becoming adept at following fluid, elaborate, and idiosyncratic prose and verse. The ideas that writers could express and readers could interpret became more complex and subtle, as arguments wound their way linearly across many pages of text. As language expanded, consciousness deepened. (75)

The words in books didn’t just strengthen people’s ability to think abstractly; they enriched people’s experience of the physical world, the world outside the book. | One of the most important lessons we’ve learned from the study of neuroplasticity is that the mental capacities, the very neural circuits, we develop for one purpose can be put to other uses as well. As our ancestors imbued their minds with the discipline to follow a line of argument or narrative through a succession of printed pages, they became more contemplative, reflective, and imaginative. (75)

The literary ethic was not only expressed in what we normally think of as literature. It became the ethic of the historian, illuminating works like Gibbon’s Decline and Fall of the Roman Empire. It became the ethic of the philosopher, informing the ideas of Descartes, Locke, Kant, and Nietzsche. And, crucially, it became the ethic of the scientist. One could argue that the single most influential literary work of the nineteenth century was Darwin’s On the Origin of Species. In the twentieth century, the literary ethic ran through such diverse books as Einstein’s Relativity, Keynes’s General Theory of Employment, Interest and Money, Thomas Kuhn’s Structure of Scientific Revolutions, and Rachel Carson’s Silent Spring. None of these momentous intellectual achievements would have been possible without the changes in reading and writing — and in perceiving and thinking — spurred by the efficient reproduction of long forms of writing on printed pages. (76)

Writing and print and the computer are all ways of technologizing the word – Walter Ong

…and once technologized, the word cannot be de-technologized. But the world of the screen, as we’re already coming to understand, is a very different place from the world of the page. A new intellectual ethic is taking hold. The pathways in our brains are once again being rerouted. (77)

a digression: on lee de forest and his amazing audion

…the Audion…was the first electronic audio amplifier, and the man who created it was Lee de Forest. (78)

Electric currents are, simply put, streams of electrons, and the Audion was the first device that allowed the intensity of those streams to be controlled with precision. (79)

Five: A Medium of the Most General Nature

Constructed of millions of interconnected computers and data banks, the Net is a [Alan] Turing machine of immeasurable power, and it is, true to form, subsuming most of our other intellectual technologies. It’s becoming our typewriter and our printing press, our map and our clock, our calculator and our telephone, our post office and our library, our radio and our TV. (83)

…the limiting factor of his universal machine was speed. (83)

Over the past three decades, the number of instructions a computer chip can process every second has doubled about every three years, while the cost of processing those instructions has fallen by almost half every year. (83)
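
Taken at face value, those two rates compound enormously over three decades. A quick back-of-the-envelope check (my own arithmetic on the figures as stated, not a calculation from the book):

```python
# Compounding the rates Carr cites over roughly thirty years.
years = 30

# Instructions per second doubling about every three years:
speedup = 2 ** (years / 3)      # ~1024x more instructions per second

# Cost per instruction falling by almost half every year:
cost_fraction = 0.5 ** years    # ~1e-9 of the original cost

print(f"~{speedup:.0f}x throughput, cost per instruction ~{cost_fraction:.1e} of what it was")
```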

The Net differs from most of the mass media it replaces in an obvious and very important way: it’s bidirectional. (85)

The growth in our online time has, in other words, expanded the total amount of time we spend in front of screens. (87)

What does seem to be decreasing as Net use grows is the time we spend reading print publications — particularly newspapers and magazines, but also books. (87)

Once information is digitized, the boundaries between media dissolve. We replace our special-purpose tools with an all-purpose tool. And because the economics of digital production and distribution are almost always superior to what came before — the cost of creating electronic products and transmitting them through the Net is a small fraction of the cost of manufacturing physical goods and shipping them through warehouses and into stores — the shift happens very quickly, following capitalism’s inexorable logic. (89)

But the old technologies lose their economic and cultural force. They become progress’s dead ends. It’s the new technologies that govern production and consumption, that guide people’s behavior and shape their perceptions. That’s why the future of knowledge and culture no longer lies in books or newspapers or TV shows or radio programs or records or CDs. It lies in digital files shot through our universal medium at the speed of light. (89)

A new medium is never an addition to an old one, nor does it leave the old one in peace. It never ceases to oppress the older media until it finds new shapes and positions for them. – Marshall McLuhan in Understanding Media

When the Net absorbs a medium, it re-creates that medium in its own image. It not only dissolves the medium’s physical form; it injects the medium’s content with hyperlinks, breaks up the content into searchable chunks, and surrounds the content with the content of all the other media it has absorbed. All these changes in the form of the content also change the way we use, experience, and even understand that content. (90)

The shift from paper to screen doesn’t just change the way we navigate a piece of writing. It also influences the degree of attention we devote to it and the depth of our immersion in it. (90)

Hyperlinks are designed to grab our attention. (90)

Our attachment to any one text becomes more tenuous, more provisional. Searches also lead to the fragmentation of online works. (91)

Whenever we turn on our computer, we are plunged into an “ecosystem of interruption technologies.” [Cory Doctorow]

When access [to information] is easy, we tend to favor the short, the sweet, and the bitty. – Tyler Cowen

The predominant sound in the modern library is the tapping of keys, not the turning of pages. (97)

Six: The Very Image of a Book

And what of the book itself? Of all popular media, it’s probably the one that has been most resistant to the Net’s influence. (99)

As soon as you inject a book with links and connect it to the Web — as soon as you “extend” and “enhance” it and make it “dynamic” — you change what it is and you change, as well, the experience of reading it. An e-book is no more a book than an online newspaper is a newspaper. (103)

I fear that one of the great joys of book reading — the total immersion in another world, or in the world of the author’s ideas — will be compromised. We all may read books the way we increasingly read magazines and newspapers: a little bit here, a little bit there. – Steven Johnson

When a printed book — whether a recently published scholarly history or a two-hundred-year-old Victorian novel — is transferred to an electronic device connected to the Internet, it turns into something very like a Web site. Its words become wrapped in all the distractions of the networked computer. … The linearity of the printed book is shattered, along with the calm attentiveness it encourages in the reader. The high-tech features of devices like the Kindle and Apple’s new iPad may make it more likely that we’ll read e-books, but the way we read them will be very different from the way we read printed editions. (104)

Changes in reading style will also bring changes in writing style, as authors and their publishers adapt to readers’ new habits and expectations. (104)

…it does seem inevitable that the Web’s tendency to turn all media into social media will have a far-reaching effect on styles of reading and writing and hence on language itself. (106-107)

A printed book is a finished object. … The finality of the act of publishing has long instilled in the best and most conscientious writers and editors a desire, even an anxiety, to perfect the works they produce — to write with an eye and an ear toward eternity. Electronic text is impermanent. (107)

It seems likely that removing the sense of closure from book writing will, in time, alter writers’ attitudes toward their work. The pressure to achieve perfection will diminish, along with the artistic rigor that the pressure imposed. To see how small changes in writers’ assumptions and attitudes can eventually have large effects on what they write, one need only glance at the history of correspondence. A personal letter written in, say, the nineteenth century bears little resemblance to a personal e-mail or text message written today. Our indulgence in the pleasures of informality and immediacy has led to a narrowing of expressiveness and a loss of eloquence. [§]

When Amazon’s chief executive, Jeff Bezos, introduced the Kindle, he sounded a self-congratulatory note: “It’s so ambitious to take something as highly evolved as a book and improve on it. And maybe even change the way people read.” There’s no “maybe” about it. The way people read — and write — has already been changed by the Net, and the changes will continue as, slowly but surely, the words of books are extracted from the printed page and embedded in the computer’s “ecology of interruption technologies.” (108)

Thought will spread across the world with the rapidity of light, instantly conceived, instantly written, instantly understood. It will blanket the earth from one pole to the other — sudden, instantaneous, burning with the fervor of the soul from which it burst forth. This will be the reign of the human word in all its plenitude. Thought will not have the time to ripen, to accumulate into the form of a book — the book will arrive too late. The only book possible from today is a newspaper. – Alphonse de Lamartine, 1831

While physical books may be on the road to obsolescence, the road will almost certainly be a long and winding one. Yet the continued existence of the codex, though it may provide some cheer to bibliophiles, doesn’t change the fact that books and book reading, at least as we’ve defined those things in the past, are in their cultural twilight. (110)

Our old literary habits “were just a side-effect of living in an environment of impoverished access.” (111)

Although it may be tempting to ignore those who suggest the value of the literary mind has always been exaggerated, that would be a mistake. Their arguments are another important sign of the fundamental shift taking place in society’s attitude toward intellectual achievement. (112)

Our desire for fast-moving, kaleidoscopic diversions didn’t originate with the invention of the World Wide Web. It has been present and growing for many decades, as the pace of our work and home lives has quickened and as broadcast media like radio and television have presented us with a welter of programs, messages, and advertisements. The Internet, though it marks a radical departure from traditional media in many ways, also represents a continuation of the intellectual and social trends that emerged from people’s embrace of the electric media of the twentieth century and that have been shaping our lives and thoughts ever since. (112)

In the choices we have made, consciously or not, about how we use our computers, we have rejected the intellectual tradition of solitary, single-minded concentration, the ethic that the book bestowed on us. We have cast our lot with the juggler. (114)

Seven: The Juggler’s Brain

…the Internet’s import and influence can be judged only when viewed in the fuller context of intellectual history. As revolutionary as it may be, the Net is best understood as the latest in a long series of tools that have helped mold the human mind. (115)

What can science tell us about the actual effects that Internet use is having on the way our minds work? … The news is even more disturbing than I had suspected. Dozens of studies by psychologists, neurobiologists, educators, and Web designers point to the same conclusion: when we go online, we enter an environment that promotes cursory reading, hurried and distracted thinking, and superficial learning. It’s possible to think deeply while surfing the Net, just as it’s possible to think shallowly while reading a book, but that’s not the type of thinking the technology encourages and rewards. | One thing is very clear: if, knowing what we know today about the brain’s plasticity, you were to set out to invent a medium that would rewire our mental circuits as quickly and thoroughly as possible, you would probably end up designing something that looks and works a lot like the Internet. It’s not just that we tend to use the Net regularly, even obsessively. It’s that the Net delivers precisely the kind of sensory and cognitive stimuli — repetitive, intensive, interactive, addictive — that have been shown to result in strong and rapid alterations in brain circuits and functions. With the exception of alphabets and number systems, the Net may well be the single most powerful mind-altering technology that has ever come into general use. At the very least, it’s the most powerful that has come along since the book. (116)

As the psychotherapist Michael Hausauer notes, teens and other young adults have a “terrific interest in knowing what’s going on in the lives of their peers, coupled with a terrific anxiety about being out of the loop.” If they stop sending messages, they risk becoming invisible. (118)

Our use of the Internet involves many paradoxes, but the one that promises to have the greatest long-term influence over how we think is this one: the Net seizes our attention only to scatter it. (118)

The Net’s cacophony of stimuli short-circuits both conscious and unconscious thought, preventing our minds from thinking either deeply or creatively. Our brains turn into simple signal-processing units, quickly shepherding information into consciousness and then back out again. (119)

When culture drives changes in the ways that we engage our brains, it creates DIFFERENT brains. – Michael Merzenich

What we’re not doing when we’re online also has neurological consequences. Just as neurons that fire together wire together, neurons that don’t fire together don’t wire together. As the time we spend scanning Web pages crowds out the time we spend reading books, as the time we spend exchanging bite-sized text messages crowds out the time we spend composing sentences and paragraphs, as the time we spend hopping across links crowds out the time we devote to quiet reflection and contemplation, the circuits that support those old intellectual functions and pursuits weaken and begin to break apart. The brain recycles the disused neurons and synapses for other, more pressing work. We gain new skills and perspectives but lose old ones. (120)

The current explosion of digital technology not only is changing the way we live and communicate but is rapidly and profoundly altering our brains. – Gary Small

The need to evaluate links and make related navigational choices, while also processing a multiplicity of fleeting sensory stimuli, requires constant mental coordination and decision making, distracting the brain from the work of interpreting text or other information. (122)

The mind of the experienced book reader is a calm mind, not a buzzing one. When it comes to the firing of our neurons, it’s a mistake to assume that more is better. (123)

The depth of our intelligence hinges on our ability to transfer information from working memory to long-term memory and weave it into conceptual schemas. (124)

When our brain is overtaxed, we find “distractions more distracting.” (125)

There are many possible sources of cognitive overload, but two of the most important…are “extraneous problem-solving” and “divided attention.” (125)

Even though the World Wide Web has made hypertext commonplace, indeed ubiquitous, research continues to show that people who read linear text comprehend more, remember more, and learn more than those who read text peppered with links. (127)

Psychological research long ago proved what most of us know from experience: frequent interruptions scatter our thoughts, weaken our memory, and make us tense and anxious. The more complex the train of thought we’re involved in, the greater the impairment the distractions cause. (132)

Every time we shift our attention, our brain has to reorient itself, further taxing our mental resources. (133)

Our results suggest that learning facts and concepts will be worse if you learn them while you’re distracted. – Russell Poldrack

On the Net, where we routinely juggle not just two but several mental tasks, the switching costs are all the higher. (133)

We crave the new even when we know that “the new is more often trivial than essential.” (134)

What we’re experiencing is, in a metaphorical sense, a reversal of the early trajectory of civilization: we are evolving from being cultivators of personal knowledge to being hunters and gatherers in the electronic data forest. (138)

The Net grants us instant access to a library of information unprecedented in its size and scope, and it makes it easy for us to sort through that library — to find, if not exactly what we were looking for, at least something sufficient for our immediate purposes. What the Net diminishes is [Samuel] Johnson’s primary kind of knowledge: the ability to know, in depth, a subject for ourselves, to construct within our own minds the rich and idiosyncratic set of connections that give rise to a singular intelligence. (143)

a digression: on the buoyancy of IQ scores

Controversial when originally reported, the Flynn effect, as the phenomenon came to be called, has been confirmed by many subsequent studies. It’s real. (144)

After mulling over the paradoxes for many years, Flynn came to the conclusion that the gains in IQ scores have less to do with an increase in general intelligence than with a transformation in the way people think about intelligence. (147)

Living in a world of substance rather than symbol, they had little cause or opportunity to think about abstract shape and theoretical classification schemes. | But, Flynn realized, that all changed over the course of the last century when, for economic, technological, and educational reasons, abstract reasoning moved into the mainstream. (147)

We’re not smarter than our parents or our parents’ parents. We’re just smart in different ways. And that influences not only how we see the world but also how we raise and educate our children. This social revolution in how we think about thinking explains why we’ve become ever more adept at working out the problems in the more abstract and visual sections of IQ tests while making little or no progress in expanding our personal knowledge, bolstering our basic academic skills, or improving our ability to communicate complicated ideas clearly. We’re trained, from infancy, to put things into categories, to solve puzzles, to think in terms of symbols in space. Our use of personal computers and the Internet may well be reinforcing some of those mental skills and the corresponding neural circuits by strengthening our visual acuity, particularly our ability to speedily evaluate objects and other stimuli as they appear in the abstract realm of a computer screen. But, as Flynn stresses, that doesn’t mean we have “better brains.” It just means we have different brains. (148)

Eight: The Church of Google

In Google’s view, information is a kind of commodity, a utilitarian resource that can, and should, be mined and processed with industrial efficiency. The more pieces of information we can “access” and the faster we can distill their gist, the more productive we become as thinkers. Anything that stands in the way of the speedy collection, dissection, and transmission of data is a threat not only to Google’s business but to the new utopia of cognitive efficiency it aims to construct on the Internet. (152)

Without its search engine, and the other engines that have been built on its model, the Internet would have long ago become a Tower of Digital Babel. (156)

Google is, quite literally, in the business of distraction. (157)

But the inevitability of turning the pages of books into online images should not prevent us from considering the side effects. To make a book discoverable and searchable online is also to dismember it. The cohesion of its text, the linearity of its argument or narrative as it flows through scores of pages, is sacrificed. What that ancient Roman craftsman wove together when he created the first codex is unstitched. The quiet that was “part of the meaning” of the codex is sacrificed as well. Surrounding every page or snippet of text on Google Book Search is a welter of links, tools, tabs, and ads, each eagerly angling for a share of the reader’s fragmented attention. (165)

…for Google, the real value of a book is not as a self-contained literary work but as another pile of data to be mined. The great library that Google is rushing to create shouldn’t be confused with the libraries we’ve known up until now. It’s not a library of books. It’s a library of snippets. (165-166)

The irony in Google’s effort to bring greater efficiency to reading is that it undermines the very different kind of efficiency that the technology of the book brought to reading — and to our minds — in the first place. By freeing us from the struggle of decoding text, the form that writing came to take on a page of parchment or paper enabled us to become deep readers, to turn our attention, and our brain power, to the interpretation of meaning. With writing on the screen, we’re still able to decode text quickly — we read, if anything, faster than ever — but we’re no longer guided toward a deep, personally constructed understanding of the text’s connotations. Instead, we’re hurried off toward another bit of related information, and then another, and another. The strip-mining of “relevant content” replaces the slow excavating of meaning. (166)

The stress that Google and other Internet companies place on the efficiency of information exchange as the key to intellectual progress is nothing new. It’s been, at least since the start of the Industrial Revolution, a common theme in the history of the mind. It provides a strong and continuing counterpoint to the very different view, promulgated by the American Transcendentalists as well as the early English Romantics, that true enlightenment comes only through contemplation and introspection. The tension between the two perspectives is one manifestation of the broader conflict between, in Marx’s terms, “the machine” and “the garden” — the industrial ideal and the pastoral ideal — that has played such an important role in shaping modern society. (167)

The development of a well-rounded mind requires both an ability to find and quickly parse a wide range of information and a capacity for open-ended reflection. There needs to be time for efficient data collection and time for inefficient contemplation, time to operate the machine and time to sit idly in the garden. We need to work in Google’s “world of numbers,” but we also need to be able to retreat to Sleepy Hollow. The problem today is that we’re losing our ability to strike a balance between those two very different states of mind. Mentally, we’re in perpetual locomotion. (168)

Everything that human beings are doing to make it easier to operate computer networks is at the same time, but for different reasons, making it easier for computer networks to operate human beings. – George Dyson

It’s also a fallacy to think that the physical brain and the thinking mind exist as separate layers in a precisely engineered “architecture.” The brain and the mind, the neuroplasticity pioneers have shown, are exquisitely intertwined, each shaping the other. (176)

Google is neither God nor Satan, and if there are shadows in the Googleplex they’re no more than the delusions of grandeur. What’s disturbing about the company’s founders is not their boyish desire to create an amazingly cool machine that will be able to outthink its creators, but the pinched conception of the human mind that gives rise to such a desire. (176)

Nine: Search, Memory

Socrates was right. As people grew accustomed to writing down their thoughts and reading the thoughts others had written down, they became less dependent on the contents of their own memory. (177)

The proliferation of printed pages had another effect, which Socrates didn’t foresee but may well have welcomed. Books provided people with a far greater and more diverse supply of facts, opinions, ideas, and stories than had been available before, and both the method and the culture of deep reading encouraged the commitment of printed information to memory. (177)

I had thought that the magic of the information age was that it allowed us to know more, but then I realized the magic of the information age is that it allows us to know less. It provides us with external cognitive servants — silicon memory systems, collaborative online filters, consumer preference algorithms and networked knowledge. We can burden these servants and liberate ourselves. – David Brooks

Not only has memory lost its divinity; it’s well on its way to losing its humanness. (181-182)

Neurologists and psychologists had known since the end of the nineteenth century that our brains hold more than one kind of memory. … “primary memories” and “secondary memories.” (182-183)

…the memory in our heads is the product of an extraordinarily complex natural process that is, at every instant, exquisitely tuned to the unique environment in which each of us lives and the unique pattern of experiences that each of us goes through. (190)

Governed by highly variable biological signals, chemical, electrical, and genetic, every aspect of human memory — the way it’s formed, maintained, connected, recalled — has almost infinite gradations. (190)

Those who celebrate the “outsourcing” of memory to the Web have been misled by a metaphor. They overlook the fundamentally organic nature of biological memory. What gives real memory its richness and its character, not to mention its mystery and fragility, is its contingency. It exists in time, changing as the body changes. Indeed, the very act of recalling a memory appears to restart the entire process of consolidation, including the generation of proteins to form new synaptic terminals. (191)

Biological memory is in a perpetual state of renewal. The memory stored in a computer, by contrast, takes the form of distinct and static bits; you can move the bits from one storage drive to another as many times as you like, and they will always remain precisely as they were. (191)
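
That contrast is easy to demonstrate on the computer side. A small sketch of my own (not from the book): however many times the bits are copied or moved, a checksum shows them to be exactly what they were.

```python
# Digital memory is static: copying bytes any number of times leaves them
# precisely as they were, which a hash makes easy to verify.
import hashlib

original = "The brain of the book reader was a literary brain.".encode("utf-8")
copy = original
for _ in range(1000):      # "move" the data a thousand times
    copy = bytes(copy)

assert hashlib.sha256(copy).digest() == hashlib.sha256(original).digest()
print("after 1000 copies, the bits are unchanged")
```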

The proponents of the outsourcing idea also confuse working memory with long-term memory. When a person fails to consolidate a fact, an idea, or an experience in long-term memory, he’s not “freeing up” space in his brain for other functions. In contrast to working memory, with its constrained capacity, long-term memory expands and contracts with almost unlimited elasticity, thanks to the brain’s ability to grow and prune synaptic terminals and continually adjust the strength of synaptic connections. (192)

Unlike a computer, the normal human brain never reaches a point at which experiences can no longer be committed to memory; the brain cannot be full. – Nelson Cowan

Evidence suggests, moreover, that as we build up our personal store of memories, our minds become sharper. The very act of remembering…appears to modify the brain in a way that can make it easier to learn ideas and skills in the future. (192)

We don’t constrain our mental powers when we store new long-term memories. We strengthen them. With each expansion of our memory comes an enlargement of our intelligence. The Web provides a convenient and compelling supplement to personal memory, but when we start by using the Web as a substitute for personal memory, bypassing the inner processes of consolidation, we risk emptying our minds of their riches. (192)

What determines what we remember and what we forget? The key to memory consolidation is attentiveness. (193)

The influx of competing messages that we receive whenever we go online not only overloads our working memory; it makes it much harder for our frontal lobes to concentrate our attention on any one thing. …the more we use the Web, the more we train our brain to be distracted — to process information very quickly and very efficiently but without sustained attention. That helps explain why many of us find it hard to concentrate even when we’re away from our computers. Our brains become adept at forgetting, inept at remembering. Our growing dependence on the Web’s information stores may in fact be the product of a self-perpetuating, self-amplifying loop. As our use of the Web makes it harder for us to lock information into our biological memory, we’re forced to rely more and more on the Net’s capacious and easily searchable artificial memory, even if it makes us shallower thinkers. (194)

When we outsource our memory to a machine, we also outsource a very important part of our intellect and even our identity. William James, in concluding his 1892 lecture on memory, said, “The connecting is the thinking.” To which could be added, “The connecting is the self.” (195)

Personal memory shapes and sustains the “collective memory” that underpins culture. What’s stored in the individual mind — events, facts, concepts, skills — is more than the “representation of distinctive personhood” that constitutes the self, writes the anthropologist Pascal Boyer. It’s also “the crux of cultural transmission.” Each of us carries and projects the history of the future. Culture is sustained in our synapses.

| The offloading of memory to external data banks doesn’t just threaten the depth and distinctiveness of the self. It threatens the depth and distinctiveness of the culture we all share. … we risk turning into “pancake people — spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.” (196)

Culture is more than the aggregate of what Google describes as “the world’s information.” It’s more than what can be reduced to binary code and uploaded onto the Net. To remain vital, culture must be renewed in the minds of the members of every generation. Outsource memory, and the culture withers. (197)

a digression: on the writing of this book

I know what you’re thinking. The very existence of this book would seem to contradict its thesis. (198)

Ten: A Thing Like Me

To understand the effects of a computer, [Joseph Weizenbaum] argued, you had to see the machine in the context of mankind’s past intellectual technologies, the long succession of tools that, like the map and the clock, transformed nature and altered “man’s perception of reality.” Such technologies become part of “the very stuff out of which man builds his world.” Once adopted, they can never be abandoned, at least not without plunging society into “great confusion and possibly utter chaos.” An intellectual technology, he wrote, “becomes an indispensable component of any structure once it is so thoroughly integrated with the structure, so enmeshed in various vital substructures, that it can no longer be factored out without fatally impairing the whole structure.” (206-207)

Our ability to meld with all manner of tools is one of the qualities that most distinguishes us as a species. In combination with our superior cognitive skills, it’s what makes us so good at using new technologies. It’s also what makes us so good at inventing them. Our brains can imagine the mechanics and the benefits of using a new device before that device even exists. (208)

Even as our technologies become extensions of ourselves, we become extensions of our technologies. (209)

Every tool imposes limitations even as it opens possibilities. The more we use it, the more we mold ourselves to its form and function. (209)

McLuhan wrote that our tools end up “numbing” whatever part of our body they “amplify.” When we extend some part of ourselves artificially, we also distance ourselves from the amplified part and its natural functions. (210)

The price we pay to assume technology’s power is alienation. The toll can be particularly high with our intellectual technologies. The tools of the mind amplify and in turn numb the most intimate, the most human, of our natural capacities — those for reason, perception, memory, emotion. (211)

…an honest appraisal of any new technology, or of progress in general, requires a sensitivity to what’s lost as well as what’s gained. We shouldn’t allow the glories of technology to blind our inner watchdog to the possibility that we’ve numbed an essential part of our self. (212)

As a universal medium, a supremely versatile extension of our senses, our cognition, and our memory, the networked computer serves as a particularly powerful neural amplifier. (213)

As we’ve entered the computer age, however, our talent for connecting with other minds has had an unintended consequence. The “chronic overactivity of those brain regions implicated in social thought” can, writes [Jason] Mitchell, lead us to perceive minds where no minds exist, even in “inanimate objects.” There’s growing evidence, moreover, that our brains naturally mimic the states of the other minds we interact with, whether those minds are real or imagined. Such neural “mirroring” helps explain why we’re so quick to attribute human characteristics to our computers and computer characteristics to ourselves — why we hear a human voice when ELIZA speaks. (213)

The brighter the software, the dimmer the user. (216)

What matters in the end is not our becoming but what we become. (222)

Epilogue: Human Elements

Computers…follow rules; they don’t make judgments. (223)

In the world of 2001, people have become so machinelike that the most human character turns out to be a machine. That’s the essence of Kubrick’s dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence. (224)

——–

[§] Up to now, concerns about the influence of digital media on language have centered on the abbreviations and emoticons that kids use in instant messaging and texting. But such affectations will probably prove benign, just the latest twist in the long history of slang. Adults would be wiser to pay attention to how their own facility with writing is changing. Is their vocabulary shrinking or becoming more hackneyed? Is their syntax becoming less flexible and more formulaic? Those are the types of questions that matter in judging the Net’s long-run effects on the range and expressiveness of language.

— VIA —

Brilliant. An extremely accessible primer on the philosophy and psychology of technology. In a captivating journalistic manner, it awakens the reader to insights about technology’s effects that are extremely important to anyone studying anything pertaining to human behavior.

I have one question and one contention regarding the discussion on “language.”

On page 53, Carr suggests that the simpler form of language happened around 750 BC. I would want to inquire about his thoughts on Semitic languages and their development, of which we have archaeological evidence dating to around 1000 BC.

He also suggests that language is “native to our species” (page 51). I would want a stronger evaluation of the nature of language and the discussion of “techniness” in relation to language.

Regardless, it is my hope and prayer that books like these help to awaken our senses to the radical changes that are taking place as a result of you reading this blog and this review!
