2013-12-10


The wonders of the gut, why our brains are wired to be social, what poetry and math have in common, swarm intelligence vs. “God,” and more.

On the heels of the year’s best reads in psychology and philosophy, art and design, history and biography, and children’s books, the season’s subjective selection of best-of reading lists continues with the finest science and technology books of 2013. (For more timeless stimulation, revisit the selections for 2012 and 2011.)

1. THIS EXPLAINS EVERYTHING

Every year since 1998, intellectual impresario and Edge editor John Brockman has been posing a single grand question to some of our time’s greatest thinkers across a wide spectrum of disciplines, then collecting the answers in an annual anthology. Last year’s answers to the question “What scientific concept will improve everybody’s cognitive toolkit?” were released in This Will Make You Smarter: New Scientific Concepts to Improve Your Thinking, one of the year’s best psychology and philosophy books.

In 2012, the question Brockman posed, suggested by none other than Steven Pinker, was “What is your favorite deep, elegant, or beautiful explanation?” The answers, representing an eclectic mix of 192 (alas, overwhelmingly male) minds spanning psychology, quantum physics, social science, political theory, philosophy, and more, are collected in the edited compendium This Explains Everything: Deep, Beautiful, and Elegant Theories of How the World Works (UK; public library) and are also available online.

In the introduction preceding the micro-essays, Brockman frames the question and its ultimate objective, adding to history’s most timeless definitions of science:

The ideas presented on Edge are speculative; they represent the frontiers in such areas as evolutionary biology, genetics, computer science, neurophysiology, psychology, cosmology, and physics. Emerging out of these contributions is a new natural philosophy, new ways of understanding physical systems, new ways of thinking that call into question many of our basic assumptions.

[…]

Perhaps the greatest pleasure in science comes from theories that derive the solution to some deep puzzle from a small set of simple principles in a surprising way. These explanations are called ‘beautiful’ or ‘elegant.’

[…]

The contributions presented here embrace scientific thinking in the broadest sense: as the most reliable way of gaining knowledge about anything — including such fields of inquiry as philosophy, mathematics, economics, history, language, and human behavior. The common thread is that a simple and nonobvious idea is proposed as the explanation of a diverse and complicated set of phenomena.



Puffer fish with Akule by photographer Wayne Levin.

Stanford neuroscientist Robert Sapolsky, eloquent as ever, marvels at the wisdom of the crowd and the emergence of swarm intelligence:

Observe a single ant, and it doesn’t make much sense, walking in one direction, suddenly careening in another for no obvious reason, doubling back on itself. Thoroughly unpredictable.

The same happens with two ants, a handful of ants. But a colony of ants makes fantastic sense. Specialized jobs, efficient means of exploiting new food sources, complex underground nests with temperature regulated within a few degrees. And critically, there’s no blueprint or central source of command—each individual ant has algorithms for its behaviors. But this is not wisdom of the crowd, where a bunch of reasonably informed individuals outperform a single expert. The ants aren’t reasonably informed about the big picture. Instead, the behavior algorithms of each ant consist of a few simple rules for interacting with the local environment and local ants. And out of this emerges a highly efficient colony.

Ant colonies excel at generating trails that connect locations in the shortest possible way, accomplished with simple rules about when to lay down a pheromone trail and what to do when encountering someone else’s trail—approximations of optimal solutions to the Traveling Salesman problem. This has useful applications. In “ant-based routing,” simulations using virtual ants with similar rules can generate optimal ways of connecting the nodes in a network, something of great interest to telecommunications companies. It applies to the developing brain, which must wire up vast numbers of neurons with vaster numbers of connections without constructing millions of miles of connecting axons. And migrating fetal neurons generate an efficient solution with a different version of ant-based routing.
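Those pheromone rules are simple enough to simulate. Below is a minimal sketch of ant-based routing on a toy two-path network, assuming illustrative path names, constants, and an update rule of my own rather than Sapolsky’s: shorter round trips reinforce their trail faster, so the colony converges on the short route.

```python
# A minimal sketch of "ant-based routing" (ant colony optimization).
# The two-path network, constants, and update rule are illustrative
# assumptions, not taken from Sapolsky's essay.
import random

paths = {"short": 2.0, "long": 5.0}        # route lengths
pheromone = {name: 1.0 for name in paths}  # start unbiased

EVAPORATION = 0.1  # fraction of pheromone lost each round
DEPOSIT = 1.0      # pheromone an ant lays, scaled by trip length

for _ in range(200):
    # Each ant picks a route with probability proportional to pheromone.
    total = sum(pheromone.values())
    chosen = "short" if random.uniform(0, total) < pheromone["short"] else "long"

    # Evaporate everywhere, then reinforce the chosen route: shorter
    # trips finish sooner, so they lay relatively more pheromone.
    for name in pheromone:
        pheromone[name] *= 1 - EVAPORATION
    pheromone[chosen] += DEPOSIT / paths[chosen]

print(pheromone)  # the short route ends up with far more pheromone
```

Run it a few times and the feedback loop is visible: no ant knows the network, yet the colony’s trail map settles on the near-optimal route.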

A wonderful example is how local rules about attraction and repulsion (i.e., positive and negative charges) allow simple molecules in an organic soup to occasionally form more complex ones. Life may have originated this way without the requirement of bolts of lightning to catalyze the formation of complex molecules.

And why is self-organization so beautiful to my atheistic self? Because if complex, adaptive systems don’t require a blueprint, they don’t require a blueprint maker. If they don’t require lightning bolts, they don’t require Someone hurling lightning bolts.

Developmental psychologist Howard Gardner, who famously developed the seminal theory of multiple intelligences, echoes Anaïs Nin in advocating for the role of the individual and Susan Sontag in stressing the impact of individual acts on collective fate. His answer, arguing for the importance of human beings, comes as a welcome antidote to a question that risks being inherently reductionist:

In a planet occupied now by seven billion inhabitants, I am amazed by the difference that one human being can make. Think of classical music without Mozart or Stravinsky; of painting without Caravaggio, Picasso or Pollock; of drama without Shakespeare or Beckett. Think of the incredible contributions of Michelangelo or Leonardo, or, in recent times, the outpouring of deep feeling at the death of Steve Jobs (or, for that matter, Michael Jackson or Princess Diana). Think of human values in the absence of Moses or Christ.

[…]

Despite the laudatory efforts of scientists to ferret out patterns in human behavior, I continue to be struck by the impact of single individuals, or of small groups, working against the odds. As scholars, we cannot and should not sweep these instances under the investigative rug. We should bear in mind anthropologist Margaret Mead’s famous injunction: ‘Never doubt that a small group of thoughtful committed citizens can change the world. It is the only thing that ever has.’

Uber-curator Hans Ulrich Obrist, who also contributed to last year’s volume, considers the parallel role of patterns and chance in the works of iconic composer John Cage and painter Gerhard Richter, and the role of uncertainty in the creative process:

In art, the title of a work can often be its first explanation. And in this context I am thinking especially of the titles of Gerhard Richter. In 2006, when I visited Richter in his studio in Cologne, he had just finished a group of six corresponding abstract paintings to which he gave the title Cage.

There are many relations between Richter’s painting and the compositions of John Cage. In a book about the Cage series, Robert Storr has traced them from Richter’s attendance at a Cage performance at the Festum Fluxorum Fluxus in Düsseldorf in 1963 to analogies in their artistic processes. Cage often applied chance procedures in his compositions, notably with the use of the I Ching. Richter in his abstract paintings also intentionally allows effects of chance. In these paintings, he applies the oil paint on the canvas by means of a large squeegee. He selects the colors on the squeegee, but the factual trace that the paint leaves on the canvas is to a large extent the outcome of chance.

[…]

Richter’s concise title, Cage, can be unfolded into an extensive interpretation of these abstract paintings (and of other works)—but, one can say, the short form already contains everything. The title, like an explanation of a phenomenon, unlocks the works, describing their relation to one of the most important cultural figures of the twentieth century, John Cage, who shares with Richter the great themes of chance and uncertainty.

Writer, artist, and designer Douglas Coupland, whose biography of Marshall McLuhan remains indispensable, offers a lyrical meditation on the peculiar odds behind coincidences and déjà vus:

I take comfort in the fact that there are two human moments that seem to be doled out equally and democratically within the human condition—and that there is no satisfying ultimate explanation for either. One is coincidence, the other is déjà vu. It doesn’t matter if you’re Queen Elizabeth, one of the thirty-three miners rescued in Chile, a South Korean housewife or a migrant herder in Zimbabwe—in the span of 365 days you will pretty much have two déjà vus as well as one coincidence that makes you stop and say, “Wow, that was a coincidence.”

The thing about coincidence is that when you imagine the umpteen trillions of coincidences that can happen at any given moment, the fact is that, in practice, coincidences almost never do occur. Coincidences are actually so rare that when they do occur they are, in fact, memorable. This suggests to me that the universe is designed to ward off coincidence whenever possible—the universe hates coincidence—I don’t know why—it just seems to be true. So when a coincidence happens, that coincidence had to work awfully hard to escape the system. There’s a message there. What is it? Look. Look harder. Mathematicians perhaps have a theorem for this, and if they do, it might, by default, be a theorem for something larger than what they think it is.

What’s both eerie and interesting to me about déjà vus is that they occur almost like metronomes throughout our lives, about one every six months, a poetic timekeeping device that, at the very least, reminds us we are alive. I can safely assume that my thirteen-year-old niece, Stephen Hawking, and someone working in a Beijing luggage-making factory each experience two déjà vus a year. Not one. Not three. Two.

The underlying biodynamics of déjà vus is probably ascribable to some sort of tingling neurons in a certain part of the brain, yet this doesn’t tell us why they exist. They seem to me to be a signal from a larger point of view that wants to remind us that our lives are distinct, that they have meaning, and that they occur throughout a span of time. We are important, and what makes us valuable to the universe is our sentience and our curse and blessing of perpetual self-awareness.

Originally featured in January — read more here.

2. YOU ARE STARDUST

“Everyone you know, everyone you ever heard of, every human being who ever was … lived there — on a mote of dust suspended in a sunbeam,” Carl Sagan famously marveled in his poetic Pale Blue Dot monologue, titled after the iconic 1990 photograph of Earth. The stardust metaphor for our interconnection with the cosmos soon permeated popular culture and became a vehicle for the allure of space exploration. There’s something at once incredibly empowering and incredibly humbling in knowing that the flame in your fireplace came from the sun.

That’s precisely the kind of cosmic awe environmental writer Elin Kelsey and Toronto-based Korean artist Soyeon Kim seek to inspire in kids in You Are Stardust (public library) — an exquisite picture-book that instills that profound sense of connection with the natural world, and also among the best children’s books of the year. Underpinning the narrative is a bold sense of optimism — a refreshing antidote to the fear-appeal strategy plaguing most environmental messages today.

Kim’s breathtaking dioramas, to which this screen does absolutely no justice, mix tactile physical materials with fine drawing techniques and digital compositing to illuminate the relentlessly wondrous realities of our intertwined existence: The water in your sink once quenched the thirst of dinosaurs; with every sneeze, wind blasts out of your nose faster than a cheetah’s sprint; the electricity that powers every thought in your brain is stronger than lightning.

But rather than dry science trivia, the message is carried on the wings of poetic admiration for these intricate relationships:

Be still. Listen.

Like you, the Earth breathes.

Your breath is alive with the promise of flowers.

Each time you blow a kiss to the world, you spread pollen that might grow to be a new plant.

The book is nonetheless grounded in real science. Kelsey notes:

I wrote this book as a celebration — one to honor the extraordinary ways in which all of us simply are nature. Every example in this book is backed by current science. Every day, for instance, you breathe in more than a million pollen grains.

But what makes the project particularly exciting is that, in the face of the devastating gender gap in science education, here is a thoughtful, beautiful piece of early science education presented by two women, the most heartening such example since Lauren Redniss’s Radioactive.

A companion iPad app features sound effects, animation, an original score by Paul Aucoin, behind-the-scenes glimpses of Kim’s process in creating her stunning 3D dioramas, and even build-your-own-diorama adventures.

Originally featured in March — see more here.

3. ON LOOKING

“How we spend our days,” Annie Dillard wrote in her timelessly beautiful meditation on presence over productivity, “is, of course, how we spend our lives.” And nowhere do we fail at the art of presence more miserably and more tragically than in urban life — in the city, high on the cult of productivity, where we float past each other, past the buildings and trees and the little boy in the purple pants, past life itself, cut off from the breathing of the world by iPhone earbuds and solipsism. And yet: “The art of seeing has to be learned,” Marguerite Duras reverberates — and it can be learned, as cognitive scientist Alexandra Horowitz invites us to believe in her breathlessly wonderful On Looking: Eleven Walks with Expert Eyes (public library), also among the best psychology and philosophy books of the year — a record of her quest to walk around a city block with eleven different “experts,” from an artist to a geologist to a dog, and emerge with fresh eyes mesmerized by the previously unseen fascinations of a familiar world. It is undoubtedly one of the most stimulating books of the year, if not the decade, and the most enchanting thing I’ve read in ages. In a way, it’s the opposite but equally delightful mirror image of Christoph Niemann’s Abstract City — a concrete, immersive examination of urbanity — blending the mindfulness of Sherlock Holmes with the expansive sensitivity of Thoreau.

Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently call “reality”:

Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in your body, in the distance, and right in front of you.

By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance.

This adaptive ignorance, she argues, is there for a reason — we celebrate it as “concentration” and welcome its way of easing our cognitive overload by allowing us to conserve our precious mental resources only for the stimuli of immediate and vital importance, and to dismiss or entirely miss all else. (“Attention is an intentional, unapologetic discriminator,” Horowitz tells us. “It asks what is relevant right now, and gears us up to notice only that.”) But while this might make us more efficient in our goal-oriented day-to-day, it also makes us inhabit a largely unlived — and unremembered — life, day in and day out.

For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her “curly haired, sage mixed breed” (who also inspired Horowitz’s first book, the excellent Inside of a Dog: What Dogs See, Smell, and Know), as she found herself taking countless walks around the block, becoming more and more aware of the dramatically different experiences she and her canine companion were having along the exact same route:

Minor clashes between my dog’s preferences as to where and how a walk should proceed and my own indicated that I was experiencing almost an entirely different block than my dog. I was paying so little attention to most of what was right before us that I had become a sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to see; what my dog showed me was that my attention invited along attention’s companion: inattention to everything else.

The book was her answer to the disconnect, an effort to “attend to that inattention.” It is not, she warns us, “about how to bring more focus to your reading of Tolstoy or how to listen more carefully to your spouse.” Rather, it is an invitation to the art of observation:

Together, we became investigators of the ordinary, considering the block — the street and everything on it — as a living being that could be observed.

In this way, the familiar becomes unfamiliar, and the old the new.

Her approach is based on two osmotic human tendencies: our shared capacity to truly see what is in front of us, despite our conditioned concentration that obscures it, and the power of individual bias in perception — or what we call “expertise,” acquired by passion or training or both — in bringing attention to elements that elude the rest of us. What follows is a whirlwind of endlessly captivating exercises in attentive bias as Horowitz, with her archetypal New Yorker’s “special fascination with the humming life-form that is an urban street,” and her diverse companions take to the city.

First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides her deliciously ravenous intellectual curiosity, Horowitz is a rare magician with language. (“The walkers trod silently; the dogs said nothing. The only sound was the hum of air conditioners,” she beholds her own block; passing a pile of trash bags graced by a stray Q-tip, she ponders parenthetically, “how does a Q-tip escape?”; turning her final corner, she gazes at the entrance of a mansion and “its pair of stone lions waiting patiently for royalty that never arrives.” Stunning.)

But as soon as she joins her experts, Horowitz is faced with the grimacing awareness that despite her best, most Sherlockian efforts, she was “missing pretty much everything.” She arrives at a newfound, profound understanding of what William James meant when he wrote, “My experience is what I agree to attend to. Only those items which I notice shape my mind”:

I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is that this deficiency of mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing, frivolously considering its object. We see the signs, but not their meanings. We are not blinded, but we have blinders.

Originally featured in August, with a closer look at the expert insights. For another peek at this gem, which is easily among my top three favorite books of the past decade, learn how to do the step-and-slide.

4. WILD ONES

Wild Ones: A Sometimes Dismaying, Weirdly Reassuring Story About Looking at People Looking at Animals in America (public library) by journalist Jon Mooallem isn’t the typical story designed to make us better by making us feel bad, to scare us into behaving, into environmental empathy; Mooallem’s is not the self-righteous tone of capital-K knowing typical of many environmental activists but the scientist’s disposition of not-knowing, the poet’s penchant for “negative capability.” Rather than ready-bake answers, he offers instead directions of thought and signposts for curiosity and, in the process, somehow gently moves us a little bit closer to our better selves, to a deep sense of, as poet Diane Ackerman beautifully put it in 1974, “the plain everythingness of everything, in cahoots with the everythingness of everything else.”

In the introduction, Mooallem recalls looking at his four-year-old daughter Isla’s menagerie of stuffed animals and the odd cultural disconnect they mime:

[T]hey were foraging on the pages of every bedtime story, and my daughter was sleeping in polar bear pajamas under a butterfly mobile with a downy snow owl clutched to her chin. Her comb handle was a fish. Her toothbrush handle was a whale. She cut her first tooth on a rubber giraffe.

Our world is different, zoologically speaking — less straightforward and more grisly. We are living in the eye of a great storm of extinction, on a planet hemorrhaging living things so fast that half of its nine million species could be gone by the end of the century. At my place, the teddy bears and giggling penguins kept coming. But I didn’t realize the lengths to which humankind now has to go to keep some semblance of actual wildlife in the world. As our own species has taken over, we’ve tried to retain space for at least some of the others being pushed aside, shoring up their chances of survival. But the threats against them keep multiplying and escalating. Gradually, America’s management of its wild animals has evolved, or maybe devolved, into a surreal kind of performance art.

Yet even conservationists’ small successes — crocodile species bouncing back from the brink of extinction, peregrine falcons filling the skies once again — even these pride points demonstrate the degree to which we’ve assumed — usurped, even — a puppeteer role in the theater of organic life. Citing a scientist who lamented that “right now, nature is unable to stand on its own,” Mooallem writes:

We’ve entered what some scientists are calling the Anthropocene — a new geologic epoch in which human activity, more than any other force, steers change on the planet. Just as we’re now causing the vast majority of extinctions, the vast majority of endangered species will only survive if we keep actively rigging the world around them in their favor. … We are gardening the wilderness. The line between conservation and domestication has blurred.

He finds himself uncomfortably straddling these two animal worlds — the idyllic little-kid’s dreamland and the messy, fragile ecosystem of the real world:

Once I started looking around, I noticed the same kind of secondhand fauna that surrounds my daughter embellishing the grown-up world, too — not just the conspicuous bald eagle on flagpoles and currency, or the big-cat and raptor names we give sports teams and computer operating systems, but the whale inexplicably breaching in the life-insurance commercial, the glass dolphin dangling from a rearview mirror, the owl sitting on the rump of a wild boar silk-screened on a hipster’s tote bag. I spotted wolf after wolf airbrushed on the sides of old vans, and another wolf, painted against a full moon on purple velvet, greeting me over the toilet in a Mexican restaurant bathroom. … [But] maybe we never outgrow the imaginary animal kingdom of childhood. Maybe it’s the one we are trying to save.

[…]

From the very beginning, America’s wild animals have inhabited the terrain of our imagination just as much as they’ve inhabited the actual land. They are free-roaming Rorschachs, and we are free to spin whatever stories we want about them. The wild animals always have no comment.

So he sets out to better understand the dynamics of the cultural forces that pull these worlds together with shared abstractions and rip them apart with the brutal realities of environmental collapse. His quest, in which little Isla is a frequent companion, sends him on the trails of three endangered species — a bear, a butterfly, and a bird — which fall on three different points on the spectrum of conservation reliance, relying to various degrees on the mercy of the very humans who first disrupted “the machinery of their wildness.” On the way, he encounters a remarkably vibrant cast of characters — countless passionate citizen scientists, a professional theater actor who, after an HIV diagnosis, became a professional butterfly enthusiast, and even Martha Stewart — and finds in their relationship with the environment “the same creeping disquiet about the future” that Mooallem himself came to know when he became a father. In fact, the entire project was inextricably linked to his sense of fatherly responsibility:

I’m part of a generation that seems especially resigned to watching things we encountered in childhood disappear: landline telephones, newspapers, fossil fuels. But leaving your kids a world without wild animals feels like a special tragedy, even if it’s hard to rationalize why it should.

The truth is that most of us will never experience the Earth’s endangered animals as anything more than beautiful ideas. They are figments of our shared imagination, recognizable from TV, but stalking places — places out there — to which we have no intention of going. I wondered how that imaginative connection to wildlife might fray or recalibrate as we’re forced to take more responsibility for its wildness.

It also occurred to me early on that all three endangered species I was getting to know could be gone by the time Isla is my age. It’s possible that, thirty years from now, they’ll have receded into the realm of dinosaurs, or the realm of Pokémon, for that matter — fantastical creatures whose names and diets little kids memorize from books. And it’s possible, too, I realized, that it might not even make a difference, that there would still be polar bears on footsy pajamas and sea turtle-shaped gummy vitamins — that there could be so much actual destruction without ever meaningfully upsetting the ecosystems in our minds.

Originally featured in May — read more here.

5. THINKING IN NUMBERS

Daniel Tammet was born with an unusual mind — he was diagnosed with high-functioning autistic savant syndrome, which meant his brain’s uniquely wired circuits made possible such extraordinary feats of computation and memory as learning Icelandic in a single week and reciting the number pi up to the 22,514th digit. He is also among the tiny fraction of people diagnosed with synesthesia — that curious crossing of the senses that causes one to “hear” colors, “smell” sounds, or perceive words and numbers in different hues, shapes, and textures. Synesthesia is incredibly rare — Vladimir Nabokov was among its few famous sufferers — which makes it overwhelmingly hard for the majority of us to imagine precisely what it’s like to experience the world through this sensory lens. Luckily, Tammet offers a fascinating first-hand account in Thinking In Numbers: On Life, Love, Meaning, and Math (public library) — a magnificent collection of 25 essays on “the math of life,” celebrating the magic of possibility in all its dimensions. In the process, he also invites us to appreciate the poetics of numbers, particularly of ordered sets — in other words, the very lists that dominate everything from our productivity tools to our creative inventories to the cheapened headlines flooding the internet.

Reflecting on his second book, Embracing the Wide Sky: A Tour Across the Horizons of the Mind, and the overwhelming response from fascinated readers seeking to know what it’s really like to experience words and numbers as colors and textures — to experience the beauty that a poem and a prime number exert on a synesthete in equal measure — Tammet offers an absorbing simulation of the synesthetic mind:

Imagine.

Close your eyes and imagine a space without limits, or the infinitesimal events that can stir up a country’s revolution. Imagine how the perfect game of chess might start and end: a win for white, or black, or a draw? Imagine numbers so vast that they exceed every atom in the universe, counting with eleven or twelve fingers instead of ten, reading a single book in an infinite number of ways.

Such imagination belongs to everyone. It even possesses its own science: mathematics. Ricardo Nemirovsky and Francesca Ferrara, who specialize in the study of mathematical cognition, write that “like literary fiction, mathematical imagination entertains pure possibilities.” This is the distillation of what I take to be interesting and important about the way in which mathematics informs our imaginative life. Often we are barely aware of it, but the play between numerical concepts saturates the way we experience the world.
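To make one of those imaginings concrete: “counting with eleven or twelve fingers” is ordinary positional notation in a higher base. A minimal sketch, with arbitrarily chosen digit symbols (the function and symbols are illustrative, not Tammet’s):

```python
# "Counting with twelve fingers" is just base-12 positional notation.
# The digit symbols are an arbitrary choice for illustration.
DIGITS = "0123456789XE"  # X stands for ten, E for eleven

def to_base12(n: int) -> str:
    """Render a non-negative integer in base 12."""
    if n == 0:
        return "0"
    out = []
    while n:
        n, r = divmod(n, 12)
        out.append(DIGITS[r])
    return "".join(reversed(out))

print([to_base12(n) for n in range(10, 15)])  # ['X', 'E', '10', '11', '12']
```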

Sketches from synesthetic artist and musician Michal Levy’s animated visualization of John Coltrane’s ‘Giant Steps.’

Tammet, above all, is enchanted by the mesmerism of the unknown, which lies at the heart of science and the heart of poetry:

The fact that we have never read an endless book, or counted to infinity (and beyond!), or made contact with an extraterrestrial civilization (all subjects of essays in the book) should not prevent us from wondering: what if? … Literature adds a further dimension to the exploration of those pure possibilities. As Nemirovsky and Ferrara suggest, there are numerous similarities in the patterns of thinking and creating shared by writers and mathematicians (two vocations often considered incomparable).

In fact, this very link between mathematics and fiction, between numbers and storytelling, underpins much of Tammet’s exploration. Growing up as one of nine siblings, he recounts how the oppressive nature of existing as a small number in a large set spurred a profound appreciation of numbers as sensemaking mechanisms for life:

Effaced as individuals, my brothers, sisters, and I existed only in number. The quality of our quantity became something we could not escape. It preceded us everywhere: even in French, whose adjectives almost always follow the noun (but not when it comes to une grande famille). … From my family I learned that numbers belong to life. The majority of my math acumen came not from books but from regular observations and day-to-day interactions. Numerical patterns, I realized, were the matter of our world.

This awareness was the beginning of Tammet’s synesthetic sensibility:

Like colors, the commonest numbers give character, form, and dimension to our world. Of the most frequent — zero and one — we might say that they are like black and white, with the other primary colors — red, blue, and yellow — akin to two, three, and four. Nine, then, might be a sort of cobalt or indigo: in a painting it would contribute shading, rather than shape. We expect to come across samples of nine as we might samples of a color like indigo—only occasionally, and in small and subtle ways. Thus a family of nine children surprises as much as a man or woman with cobalt-colored hair.

Daniel Tammet. Portrait by Jerome Tabet.

Sampling from Jorge Luis Borges’s humorous fictional taxonomy of animals, inspired by the work of nineteenth-century German mathematician Georg Cantor, Tammet points to the deeper insight beneath our efforts to itemize and organize the universe — something Umberto Eco knew when he proclaimed that “the list is the origin of culture” and Susan Sontag intuited when she reflected on why lists appeal to us. Tammet writes:

Borges here also makes several thought-provoking points. First, though a set as familiar to our understanding as that of “animals” implies containment and comprehension, the sheer number of its possible subsets actually swells toward infinity. With their handful of generic labels (“mammal,” “reptile,” “amphibian,” etc.), standard taxonomies conceal this fact. To say, for example, that a flea is tiny, parasitic, and a champion jumper is only to begin to scratch the surface of all its various aspects.

Second, defining a set owes more to art than it does to science. Faced with the problem of a near endless number of potential categories, we are inclined to choose from a few — those most tried and tested within our particular culture. Western descriptions of the set of all elephants privilege subsets like “those that are very large,” and “those possessing tusks,” and even “those possessing an excellent memory,” while excluding other equally legitimate possibilities such as Borges’s “those that at a distance resemble flies,” or the Hindu “those that are considered lucky.”

[…]

Reading Borges invites me to consider the wealth of possible subsets into which my family “set” could be classified, far beyond those that simply point to multiplicity.
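The “swelling toward infinity” Tammet describes is plain power-set arithmetic: a set with n members has 2^n possible subsets, doubling with every member added. A minimal sketch, using a hypothetical, Borges-flavored handful of animals:

```python
# A set of n members has 2**n subsets; even modest sets explode.
# The animal names are hypothetical, a nod to Borges's taxonomy.
from itertools import combinations

animals = ["flea", "elephant", "stray dog", "mermaid"]

subsets = [set(c) for r in range(len(animals) + 1)
           for c in combinations(animals, r)]

print(len(subsets))  # 16, i.e. 2**4, counting the empty set
print(2 ** 30)       # over a billion subsets for just 30 animals
```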

Tammet circles back to the shared gifts of literature and mathematics, which both help cultivate our capacity for compassion:

Like works of literature, mathematical ideas help expand our circle of empathy, liberating us from the tyranny of a single, parochial point of view. Numbers, properly considered, make us better people.

Originally featured in August — read more here.

6. SMARTER THAN YOU THINK

“The dangerous time when mechanical voices, radios, telephones, take the place of human intimacies, and the concept of being in touch with millions brings a greater and greater poverty in intimacy and human vision,” Anaïs Nin wrote in her diary in 1946, decades before the internet as we know it even existed. Her fear has since been echoed again and again with every incremental advance in technology, often with simplistic arguments about the attrition of attention in the age of digital distraction. But in Smarter Than You Think: How Technology is Changing Our Minds for the Better (public library), Clive Thompson — one of the finest technology writers I know, with regular bylines for Wired and The New York Times — makes a powerful and rigorously thought out counterpoint. He argues that our technological tools — from search engines to status updates to sophisticated artificial intelligence that defeats the world’s best chess players — are now inextricably linked to our minds, working in tandem with them and profoundly changing the way we remember, learn, and “act upon that knowledge emotionally, intellectually, and politically,” and this is a promising rather than perilous thing.

He writes in the introduction:

These tools can make even the amateurs among us radically smarter than we’d be on our own, assuming (and this is a big assumption) we understand how they work. At their best, today’s digital tools help us see more, retain more, communicate more. At their worst, they leave us prey to the manipulation of the toolmakers. But on balance, I’d argue, what is happening is deeply positive. This book is about the transformation.

Page from ‘Charley Harper: An Illustrated Life.’

But Thompson is nothing if not a dimensional thinker with extraordinary sensitivity to the complexities of cultural phenomena. Rather than revisiting painfully familiar and trite-by-overuse notions like distraction and information overload, he examines the deeper dynamics of how these new tools are affecting the way we make sense of the world and of ourselves. Several decades after Vannevar Bush’s now-legendary meditation on how technology will impact our thinking, Thompson reaches even further into the fringes of our cultural sensibility — past the cheap techno-dystopia, past the pollyannaish techno-utopia, and into that intricate and ever-evolving intersection of technology and psychology.

One of his most fascinating and important points has to do with our outsourcing of memory — or, more specifically, our increasingly deft, search-engine-powered skills of replacing the retention of knowledge in our own brains with the on-demand access to knowledge in the collective brain of the internet. Think, for instance, of those moments when you’re trying to recall the name of a movie but only remember certain fragmentary features — the name of the lead actor, the gist of the plot, a song from the soundtrack. Thompson calls this “tip-of-the-tongue syndrome” and points out that, today, you’ll likely be able to reverse-engineer the name of the movie you don’t remember by plugging into Google what you do remember about it. Thompson contextualizes the phenomenon, which isn’t new, then asks the obvious, important question about our culturally unprecedented solutions to it:

Tip-of-the-tongue syndrome is an experience so common that cultures worldwide have a phrase for it. Cheyenne Indians call it navonotootse’a, which means “I have lost it on my tongue”; in Korean it’s hyeu kkedu-te mam-dol-da, which has an even more gorgeous translation: “sparkling at the end of my tongue.” The phenomenon generally lasts only a minute or so; your brain eventually makes the connection. But … when faced with a tip-of-the-tongue moment, many of us have begun to rely instead on the Internet to locate information on the fly. If lifelogging … stores “episodic,” or personal, memories, Internet search engines do the same for a different sort of memory: “semantic” memory, or factual knowledge about the world. When you visit Paris and have a wonderful time drinking champagne at a café, your personal experience is an episodic memory. Your ability to remember that Paris is a city and that champagne is an alcoholic beverage — that’s semantic memory.

[…]

What’s the line between our own, in-brain knowledge and the sea of information around us? Does it make us smarter when we can dip in so instantly? Or dumber with every search?

Vannevar Bush’s ‘memex’ — short for ‘memory index’ — a primitive vision for a personal hard drive for information storage and management.

That concern, of course, is far from unique to our age — from the invention of writing to Alvin Toffler’s Future Shock, new technology has always been a source of paralyzing resistance and apprehension:

Writing — the original technology for externalizing information — emerged around five thousand years ago, when Mesopotamian merchants began tallying their wares using etchings on clay tablets. It emerged first as an economic tool. As with photography and the telephone and the computer, newfangled technologies for communication nearly always emerge in the world of commerce. The notion of using them for everyday, personal expression seems wasteful, risible, or debased. Then slowly it becomes merely lavish, what “wealthy people” do; then teenagers take over and the technology becomes common to the point of banality.

Thompson reminds us of the anecdote, by now itself familiar “to the point of banality,” about Socrates and his admonition that the “technology” of writing would devastate the Greek tradition of debate and dialectic, and would render people incapable of committing anything to memory because “knowledge stored was not really knowledge at all.” He cites Socrates’s parable of the Egyptian god Theuth and how he invented writing, offering it as a gift to the king of Egypt, Thamus, who met the present with defiant indignation:

This discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.

That resistance endured as technology changed shape, across the Middle Ages and past Gutenberg’s revolution, but it wasn’t without counter-resistance: Those who recorded their knowledge in writing and, eventually, collected it in the form of books argued that it expanded the scope of their curiosity and the ideas they were able to ponder, whereas the mere act of rote memorization made no guarantees of deeper understanding.

Ultimately, however, Thompson points out that Socrates was both right and wrong: It’s true that, with some deliberately cultivated exceptions and neurological outliers, few thinkers today rely on pure memorization and can recite extensive passages of text from memory. But what Socrates failed to see was the extraordinary dot-connecting enabled by access to knowledge beyond what our own heads can hold — because, as Amanda Palmer poignantly put it, “we can only connect the dots that we collect,” and the outsourcing of memory has exponentially enlarged our dot-collections.

With this in mind, Thompson offers a blueprint to this newly developed system of knowledge management in which access is critical:

If you are going to read widely but often read books only once; if you are going to tackle the ever-expanding universe of ideas by skimming and glancing as well as reading deeply; then you are going to rely on the semantic-memory version of gisting. By which I mean, you’ll absorb the gist of what you read but rarely retain the specifics. Later, if you want to mull over a detail, you have to be able to refind a book, a passage, a quote, an article, a concept.
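That act of refinding is essentially what a search index mechanizes. Here is a toy sketch, assuming a hypothetical two-document corpus of my own invention, of how a few remembered fragments map back to their source:

```python
# A toy inverted index: remember only fragments ("gist"), and let the
# index refind the source. The corpus and queries are hypothetical.
from collections import defaultdict

corpus = {
    "sagan_pale_blue_dot": "a mote of dust suspended in a sunbeam",
    "mead_injunction": "a small group of thoughtful committed citizens",
}

index = defaultdict(set)  # word -> set of documents containing it
for doc, text in corpus.items():
    for word in text.split():
        index[word].add(doc)

def refind(*fragments):
    """Return the documents that match every remembered fragment."""
    matches = [index[w] for w in fragments]
    return set.intersection(*matches) if matches else set()

print(refind("dust", "sunbeam"))  # {'sagan_pale_blue_dot'}
```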

But Thompson argues that despite history’s predictable patterns of resistance followed by adoption and adaptation, there’s something immutably different about our own era:

The history of factual memory has been fairly predictable up until now. With each innovation, we’ve outsourced more information, then worked to make searching more efficient. Yet somehow, the Internet age feels different. Quickly pulling up [the answer to a specific esoteric question] on Google seems different from looking up a bit of trivia in an encyclopedia. It’s less like consulting a book than like asking someone a question, consulting a supersmart friend who lurks within our phones.

And therein lies the magic of the internet — that unprecedented access to humanity’s collective brain. Thompson cites the work of Harvard psychologist Daniel Wegner, who first began exploring this notion of collective rather than individual knowledge in the 1980s by observing how partners in long-term relationships often divide and conquer memory tasks in sharing the household’s administrative duties:

Wegner suspected this division of labor takes place because we have pretty good “metamemory.” We’re aware of our mental strengths and limits, and we’re good at intuiting the abilities of others. Hang around a workmate or a romantic partner long enough and you begin to realize that while you’re terrible at remembering your corporate meeting schedule, or current affairs in Europe, or how big a kilometer is relative to a mile, they’re great at it. So you begin to subconsciously delegate the task of remembering that stuff to them, treating them like a notepad or encyclopedia. In many respects, Wegner noted, people are superior to these devices, because what we lose in accuracy we make up in speed.

[…]

Wegner called this phenomenon “transactive” memory: two heads are better than one. We share the work of remembering, Wegner argued, because it makes us collectively smarter — expanding our ability to understand the world around us.
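Wegner’s division of labor can be sketched as a tiny data structure: each partner stores a few facts plus a metamemory of who knows the rest. Everything below (the Partner class, names, topics) is a hypothetical illustration, not Wegner’s own model:

```python
# A toy model of "transactive" memory: facts live in individual heads,
# and a metamemory of who-knows-what routes each lookup. All names,
# topics, and facts here are hypothetical.
class Partner:
    def __init__(self, name, facts):
        self.name = name
        self.facts = facts   # what this person actually remembers
        self.who_knows = {}  # metamemory: topic -> the person to ask

    def recall(self, topic):
        if topic in self.facts:             # try my own memory first
            return self.facts[topic]
        expert = self.who_knows.get(topic)  # otherwise, delegate
        return expert.facts.get(topic) if expert else None

alice = Partner("Alice", {"meeting_schedule": "Tuesdays at 10"})
bob = Partner("Bob", {"km_per_mile": 1.609})
alice.who_knows["km_per_mile"] = bob
bob.who_knows["meeting_schedule"] = alice

print(alice.recall("km_per_mile"))     # 1.609, recalled via Bob
print(bob.recall("meeting_schedule"))  # 'Tuesdays at 10', via Alice
```

Two heads really are better than one here: each partner answers questions neither could answer alone.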

This ability to “google” one another’s memory stores, Thompson argues, is the defining feature of our evolving relationship with information — and it’s profoundly shaping our experience of knowledge:

Transactive memory helps explain how we’re evolving in a world of on-tap information.

He illustrates this by turning to the work of Betsy Sparrow, a graduate student of Wegner’s, who conducted a series of experiments demonstrating that when we know a digital tool will store information for us, we’re far less likely to commit it to memory. On the surface, this may appear like the evident and worrisome shrinkage of our mental capacity. But there’s a subtler yet enormously important layer that such techno-dystopian simplifications miss: This very outsourcing of memory requires that we learn what the machine knows — a kind of meta-knowledge that enables us to retrieve the information when we need it. And, reflecting on Sparrow’s findings, Thompson points out that this is neither new nor negative:

We’ve been using transactive memory for millennia with other humans. In everyday life, we are only rarely isolated, and for good reason. For many thinking tasks, we’re dumber and less cognitively nimble if we’re not around other people. Not only has transactive memory not hurt us, it’s allowed us to perform at higher levels, accomplishing acts of reasoning that are impossible for us alone. It wasn’t until recently that computer memory became fast enough to be consulted on the fly, but once it did — with search engines boasting that they return results in tenths of a second — our transactive habits adapted.

Thompson’s most important point, however, has to do with how outsourcing our knowledge to digital tools actually hampers the very process of creative thought, which relies on our ability to connect existing ideas from our mental pool of resources.
