2013-12-02

How to think like Sherlock Holmes, make better mistakes, master the pace of productivity, find fulfilling work, stay sane, and more.

After the best biographies, memoirs, and history books of 2013, the season’s subjective selection of best-of reading lists continues with the most stimulating psychology and philosophy books published this year. (Catch up on the 2012 roundup here and 2011's here.)

1. ON LOOKING: ELEVEN WALKS WITH EXPERT EYES

“How we spend our days,” Annie Dillard wrote in her timelessly beautiful meditation on presence over productivity, “is, of course, how we spend our lives.” And nowhere do we fail at the art of presence more miserably and more tragically than in urban life — in the city, high on the cult of productivity, where we float past each other, past the buildings and trees and the little boy in the purple pants, past life itself, cut off from the breathing of the world by iPhone earbuds and solipsism. And yet: “The art of seeing has to be learned,” Marguerite Duras reverberates — and it can be learned, as cognitive scientist Alexandra Horowitz invites us to believe in her breathlessly wonderful On Looking: Eleven Walks with Expert Eyes (public library) — a record of her quest to walk around a city block with eleven different “experts,” from an artist to a geologist to a dog, and emerge with fresh eyes mesmerized by the previously unseen fascinations of a familiar world. It is undoubtedly one of the most stimulating books of the year, if not the decade, and the most enchanting thing I’ve read in ages. In a way, it’s the opposite but equally delightful mirror image of Christoph Niemann’s Abstract City — a concrete, immersive examination of urbanity — blending the mindfulness of Sherlock Holmes with the expansive sensitivity of Thoreau.

Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently call “reality”:

Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in your body, in the distance, and right in front of you.

By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance.

This adaptive ignorance, she argues, is there for a reason — we celebrate it as “concentration” and welcome its way of easing our cognitive overload by allowing us to conserve our precious mental resources only for the stimuli of immediate and vital importance, and to dismiss or entirely miss all else. (“Attention is an intentional, unapologetic discriminator,” Horowitz tells us. “It asks what is relevant right now, and gears us up to notice only that.”) But while this might make us more efficient in our goal-oriented day-to-day, it also makes us inhabit a largely unlived — and unremembered — life, day in and day out.

For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her “curly haired, sage mixed breed” (who also inspired Horowitz’s first book, the excellent Inside of a Dog: What Dogs See, Smell, and Know), as she found herself taking countless walks around the block, becoming more and more aware of the dramatically different experiences she and her canine companion were having along the exact same route:

Minor clashes between my dog’s preferences as to where and how a walk should proceed and my own indicated that I was experiencing almost an entirely different block than my dog. I was paying so little attention to most of what was right before us that I had become a sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to see; what my dog showed me was that my attention invited along attention’s companion: inattention to everything else.

The book was her answer to the disconnect, an effort to “attend to that inattention.” It is not, she warns us, “about how to bring more focus to your reading of Tolstoy or how to listen more carefully to your spouse.” Rather, it is an invitation to the art of observation:

Together, we became investigators of the ordinary, considering the block — the street and everything on it — as a living being that could be observed.

In this way, the familiar becomes unfamiliar, and the old the new.

Her approach is based on two osmotic human tendencies: our shared capacity to truly see what is in front of us, despite our conditioned concentration that obscures it, and the power of individual bias in perception — or what we call “expertise,” acquired by passion or training or both — in bringing attention to elements that elude the rest of us. What follows is a whirlwind of endlessly captivating exercises in attentive bias as Horowitz, with her archetypal New Yorker’s “special fascination with the humming life-form that is an urban street,” and her diverse companions take to the city.

First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides her deliciously ravenous intellectual curiosity, Horowitz is a rare magician with language. (“The walkers trod silently; the dogs said nothing. The only sound was the hum of air conditioners,” she beholds her own block; passing a pile of trash bags graced by a stray Q-tip, she ponders parenthetically, “how does a Q-tip escape?”; turning her final corner, she gazes at the entrance of a mansion and “its pair of stone lions waiting patiently for royalty that never arrives.” Stunning.)

But as soon as she joins her experts, Horowitz is faced with the grimacing awareness that despite her best, most Sherlockian efforts, she was “missing pretty much everything.” She arrives at a newfound, profound understanding of what William James meant when he wrote, “My experience is what I agree to attend to. Only those items which I notice shape my mind”:

I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is that this deficiency of mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing, frivolously considering its object. We see the signs, but not their meanings. We are not blinded, but we have blinders.

Originally featured in August, with a closer look at the expert insights. For another peek at this gem, which is easily among my top three favorite books of the past decade, learn how to do the step-and-slide.

2. TIME WARPED

Given my soft spot for famous diaries, it should come as no surprise that I keep one myself. Perhaps the greatest gift of the practice has been the daily habit of reading what I had written on that day a year earlier; not only is it a remarkable tool of introspection and self-awareness, but it also illustrates that our memory “is never a precise duplicate of the original [but] a continuing act of creation” and how flawed our perception of time is — almost everything that occurred a year ago appears as having taken place either significantly further in the past (“a different lifetime,” I’d often marvel at this time-illusion) or significantly more recently (“this feels like just last month!”). Rather than a personal deficiency, however, this tendency turns out to be a defining feature of how the human mind works — the science of which is at first unsettling, then strangely comforting, and altogether intensely interesting.

That’s precisely what acclaimed BBC broadcaster and psychology writer Claudia Hammond explores in Time Warped: Unlocking the Mysteries of Time Perception (public library) — a fascinating foray into the idea that our experience of time is actively constructed by our own minds, and into how the sensations of what neuroscientists and psychologists call “mind time” arise. As disorienting as the concept might seem — after all, we’ve been nursed on the belief that time is one of those few utterly reliable and objective things in life — it is also strangely empowering to think that the very phenomenon depicted as the unforgiving dictator of life is something we might be able to shape and benefit from. Hammond writes:

We construct the experience of time in our minds, so it follows that we are able to change the elements we find troubling — whether it’s trying to stop the years racing past, or speeding up time when we’re stuck in a queue, trying to live more in the present, or working out how long ago we last saw our old friends. Time can be a friend, but it can also be an enemy. The trick is to harness it, whether at home, at work, or even in social policy, and to work in line with our conception of time. Time perception matters because it is the experience of time that roots us in our mental reality. Time is not only at the heart of the way we organize life, but the way we experience it.



Discus chronologicus, a depiction of time by German engraver Christoph Weigel, published in the early 1720s; from Cartographies of Time.

Among the most intriguing illustrations of “mind time” is the incredible elasticity of how we experience time. (“Where is it, this present?” William James famously wondered. “It has melted in our grasp, fled ere we could touch it, gone in the instant of becoming.”) For instance, Hammond points out, we slow time down when gripped by mortal fear — the cliché about the slow-motion car crash is, in fact, a cognitive reality. This plays out even in situations that aren’t life-or-death per se but are still associated with strong feelings of fear. Hammond points to a study in which people with arachnophobia were asked to look at spiders — the very object of their intense fear — for 45 seconds and they overestimated the elapsed time. The same pattern was observed in novice skydivers, who estimated the duration of their peers’ falls as short, whereas their own, from the same altitude, were deemed longer.

Conversely, time seems to speed up as we get older — a phenomenon that competing theories have attempted to explain. One, known as the “proportionality theory,” uses pure mathematics, holding that a year feels faster when you’re 40 than when you’re 8 because it constitutes only one fortieth of your life rather than a whole eighth. Among its famous proponents are Vladimir Nabokov and William James. But Hammond remains unconvinced:

The problem with the proportionality theory is that it fails to account for the way we experience time at any one moment. We don’t judge one day in the context of our whole lives. If we did, then for a 40-year-old every single day should flash by because it is less than one fourteen-thousandth of the life they’ve had so far. It should be fleeting and inconsequential, yet if you have nothing to do or an enforced wait at an airport for example, a day at 40 can still feel long and boring and surely longer than a fun day at the seaside packed with adventure for a child. … It ignores attention and emotion, which … can have a considerable impact on time perception.
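To see the arithmetic behind the proportionality theory laid bare, here is a minimal Python sketch. It is purely illustrative — the function and the figures are my own assumptions, not anything from Hammond’s book — but it makes both the theory’s claim and her objection concrete:

```python
# A minimal, illustrative sketch of the "proportionality theory":
# the felt length of a unit of time is modeled as the fraction of
# life lived so far that the unit represents.

def proportional_weight(age_years: float) -> float:
    """Fraction of a life that one more year represents at a given age."""
    return 1 / age_years

for age in (8, 20, 40, 80):
    print(f"At {age:>2}, a year is 1/{age} of life lived so far "
          f"({proportional_weight(age):.3f})")

# The same logic applied to a single day at 40 -- the case Hammond uses
# to expose the theory's weakness: the day "should" feel vanishingly brief.
days_lived_at_40 = int(40 * 365.25)  # ~14,610
print(f"One day at 40 is 1/{days_lived_at_40} of life lived so far")
```

As Hammond’s objection makes plain, the ratio alone cannot explain why a dull day of enforced waiting at an airport still feels long: attention and emotion appear nowhere in the formula.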

Another theory suggests that perhaps it is the tempo of life in general that has accelerated, making things from the past appear as slower, including the passage of time itself.

But one definite change does take place with age: As we grow older, we tend to feel like the previous decade elapsed more rapidly, while the earlier decades of our lives seem to have lasted longer. Similarly, we tend to think of events that took place in the past 10 years as having happened more recently than they actually did. (Quick: What year did the devastating Japanese tsunami hit? When did we lose Maurice Sendak?) Conversely, we perceive events that took place more than a decade ago as having happened even longer ago. (When did Princess Diana die? What year was the Chernobyl disaster?) This, Hammond points out, is known as “forward telescoping”:

It is as though time has been compressed and — as if looking through a telescope — things seem closer than they really are. The opposite is called backward or reverse telescoping, also known as time expansion. This is when you guess that events happened longer ago than they really did. This is rare for distant events, but not uncommon for recent weeks.

[…]

The most straightforward explanation for it is called the clarity of memory hypothesis, proposed by the psychologist Norman Bradburn in 1987. This is the simple idea that because we know that memories fade over time, we use the clarity of a memory as a guide to its recency. So if a memory seems unclear we assume it happened longer ago.

Originally featured in July, with a deeper dive into the psychology of why time slows down when we’re afraid, speeds up as we age, and gets warped when we’re on vacation.

3. HOW TO FIND FULFILLING WORK

“If one wanted to crush and destroy a man entirely, to mete out to him the most terrible punishment,” wrote Dostoevsky, “all one would have to do would be to make him do work that was completely and utterly devoid of usefulness and meaning.” Indeed, the quest to avoid such work and make a living doing what you love is a constant conundrum of modern life. In How to Find Fulfilling Work (public library) — the latest installment in The School of Life’s wonderful series reclaiming the traditional self-help genre as intelligent, non-self-helpy, yet immensely helpful guides to modern living, which previously gave us Philippa Perry’s How to Stay Sane and Alain de Botton’s How to Think More About Sex — philosopher Roman Krznaric (remember him?) explores the roots of this contemporary quandary and guides us to its fruitful resolution:

The desire for fulfilling work — a job that provides a deep sense of purpose, and reflects our values, passions and personality — is a modern invention. … For centuries, most inhabitants of the Western world were too busy struggling to meet their subsistence needs to worry about whether they had an exciting career that used their talents and nurtured their wellbeing. But today, the spread of material prosperity has freed our minds to expect much more from the adventure of life.

We have entered a new age of fulfillment, in which the great dream is to trade up from money to meaning.

Krznaric goes on to outline two key afflictions of the modern workplace — “a plague of job dissatisfaction” and “uncertainty about how to choose the right career” — and frames the problem:

Never have so many people felt so unfulfilled in their career roles, and been so unsure what to do about it. Most surveys in the West reveal that at least half the workforce are unhappy in their jobs. One cross-European study showed that 60 per cent of workers would choose a different career if they could start again. In the United States, job satisfaction is at its lowest level — 45 per cent — since record-keeping began over two decades ago.

Of course, Krznaric points out, there’s plenty of cynicism and skepticism to go around, with people questioning whether it’s even possible to find a job in which we thrive and feel complete. He offers an antidote to the default thinking:

There are two broad ways of thinking about these questions. The first is the ‘grin and bear it’ approach. This is the view that we should get our expectations under control and recognize that work, for the vast majority of humanity — including ourselves — is mostly drudgery and always will be. Forget the heady dream of fulfillment and remember Mark Twain’s maxim: “Work is a necessary evil to be avoided.” … The history is captured in the word itself. The Latin labor means drudgery or toil, while the French travail derives from the tripalium, an ancient Roman instrument of torture made of three sticks. … The message of the ‘grin and bear it’ school of thought is that we need to accept the inevitable and put up with whatever job we can get, as long as it meets our financial needs and leaves us enough time to pursue our ‘real life’ outside office hours. The best way to protect ourselves from all the optimistic pundits peddling fulfillment is to develop a hardy philosophy of acceptance, even resignation, and not set our hearts on finding a meaningful career.

I am more hopeful than this, and subscribe to a different approach, which is that it is possible to find work that is life-enhancing, that broadens our horizons and makes us feel more human.

[…]

This is a book for those who are looking for a job that is big enough for their spirit, something more than a ‘day job’ whose main function is to pay the bills.

Krznaric considers the five keys to making a career meaningful — earning money, achieving status, making a difference, following our passions, and using our talents — but goes on to demonstrate that they aren’t all created equal. In particular, he echoes 1970s Zen pioneer Alan Watts and modern science in arguing that money alone is a poor motivator:

Schopenhauer may have been right that the desire for money is widespread, but he was wrong on the issue of equating money with happiness. Overwhelming evidence has emerged in the last two decades that the pursuit of wealth is an unlikely path to achieving personal wellbeing — the ancient Greek ideal of eudaimonia or ‘the good life.’ The lack of any clear positive relationship between rising income and rising happiness has become one of the most powerful findings in the modern social sciences. Once our income reaches an amount that covers our basic needs, further increases add little, if anything, to our levels of life satisfaction.

The second false prophet of fulfillment, as Y Combinator founder Paul Graham has poignantly cautioned and Debbie Millman has poetically articulated, is prestige. Krznaric admonishes:

We can easily find ourselves pursuing a career that society considers prestigious, but which we are not intrinsically devoted to ourselves — one that does not fulfill us on a day-to-day basis.

Krznaric positions respect, which he defines as “being appreciated for what we personally bring to a job, and being valued for our individual contribution,” as the positive counterpart to prestige and status, arguing that “in our quest for fulfilling work, we should seek a job that offers not just good status prospects, but good respect prospects.”

Rather than hoping to create a harmonious union between the pursuit of money and values, we might have better luck trying to combine values with talents. This idea comes courtesy of Aristotle, who is credited with saying, ‘Where the needs of the world and your talents cross, there lies your vocation.’

Originally featured in April — read the full article here.

4. INTUITION PUMPS

“If you are not making mistakes, you’re not taking enough risks,” Debbie Millman counseled. “Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before,” Neil Gaiman advised young creators. In Intuition Pumps And Other Tools for Thinking (public library), the inimitable Daniel Dennett, one of our greatest living philosophers, offers a set of thinking tools — “handy prosthetic imagination-extenders and focus holders” — designed to enhance our cognitive toolkit by allowing us to “think reliably and even gracefully about really hard questions.” He calls these tools “intuition pumps” — thought experiments designed to stir “a heartfelt, table-thumping intuition” (which we know is a pillar of even the most “rational” of science) about the question at hand, a kind of persuasion tool the reverse-engineering of which enables us to think better about thinking itself. Intuition, of course, is a domain-specific ability that relies on honed critical thinking rather than a mystical quality bestowed by the gods — but that’s precisely Dennett’s point, and his task is to help us hone it.

Though most of his 77 “intuition pumps” address concrete questions, a dozen are “general-purpose” tools that apply deeply and widely, across just about any domain of thinking. The first of them is also arguably the most useful yet most uncomfortable: making mistakes.

Echoing Dorion Sagan’s case for why science and philosophy need each other, Dennett begins with an astute contribution to the best definitions of philosophy, wrapped in a necessary admonition about the value of history:

The history of philosophy is in large measure the history of very smart people making very tempting mistakes, and if you don’t know the history, you are doomed to making the same darn mistakes all over again. … There is no such thing as philosophy-free science, just science that has been conducted without any consideration of its underlying philosophical assumptions.

He speaks for the generative potential of mistakes and their usefulness as an empirical tool:

Sometimes you don’t just want to risk making mistakes; you actually want to make them — if only to give you something clear and detailed to fix.

Therein lies the power of mistakes as a vehicle for, as Rilke famously put it, “living the questions” and thus advancing knowledge in a way that certainty cannot — for, as Richard Feynman memorably noted, the scientist’s job is to remain unsure, and so seems the philosopher’s. Dennett writes:

We philosophers are mistake specialists. … While other disciplines specialize in getting the right answers to their defining questions, we philosophers specialize in all the ways there are of getting things so mixed up, so deeply wrong, that nobody is even sure what the right questions are, let alone the answers. Asking the wrong questions risks setting any inquiry off on the wrong foot. Whenever that happens, this is a job for philosophers! Philosophy — in every field of inquiry — is what you have to do until you figure out what questions you should have been asking in the first place.

[…]

Mistakes are not just opportunities for learning; they are, in an important sense, the only opportunity for learning or making something truly new. Before there can be learning, there must be learners. There are only two non-miraculous ways for learners to come into existence: they must either evolve or be designed and built by learners that evolved. Biological evolution proceeds by a grand, inexorable process of trial and error — and without the errors the trials wouldn’t accomplish anything.

Dennett offers a caveat that at once highlights the importance of acquiring knowledge and reminds us of the power of “chance-opportunism”:

Trials can be either blind or foresighted. You, who know a lot, but not the answer to the question at hand, can take leaps — foresighted leaps. You can look before you leap, and hence be somewhat guided from the outset by what you already know. You need not be guessing at random, but don’t look down your nose at random guesses; among its wonderful products is … you!

And since evolution is the highest epitome of how the process of trial and error drives progress, Dennett makes a case for understanding evolution as a key to understanding everything else we humans value:

Evolution … is the central, enabling process not only of life but also of knowledge and learning and understanding. If you attempt to make sense of the world of ideas and meanings, free will and morality, art and science and even philosophy itself without a sound and quite detailed knowledge of evolution, you have one hand tied behind your back. … For evolution, which knows nothing, the steps into novelty are blindly taken by mutations, which are random copying “errors” in DNA.

Dennett echoes Dostoyevsky (“Above all, don’t lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him, and so loses all respect for himself and for others.”) and offers the key to making productive mistakes:

The chief trick to making good mistakes is not to hide them — especially not from yourself. Instead of turning away in denial when you make a mistake, you should become a connoisseur of your own mistakes, turning them over in your mind as if they were works of art, which in a way they are. … The trick is to take advantage of the particular details of the mess you’ve made, so that your next attempt will be informed by it and not just another blind stab in the dark.

We have all heard the forlorn refrain “Well, it seemed like a good idea at the time!” This phrase has come to stand for the rueful reflection of an idiot, a sign of stupidity, but in fact we should appreciate it as a pillar of wisdom. Any being, any agent, who can truly say, “Well, it seemed like a good idea at the time!” is standing on the threshold of brilliance.

Originally featured in May — read the full article here.

5. MASTERMIND: HOW TO THINK LIKE SHERLOCK HOLMES

“The habit of mind which leads to a search for relationships between facts,” wrote James Webb Young in his famous 1939 5-step technique for creative problem-solving, “becomes of the highest importance in the production of ideas.” But just how does one acquire those vital cognitive customs? That’s precisely what science writer Maria Konnikova explores in Mastermind: How to Think Like Sherlock Holmes (public library) — an effort to reverse-engineer Holmes’s methodology into actionable insights that help develop “habits of thought that will allow you to engage mindfully with yourself and your world as a matter of course.”

Bridging ample anecdotes from the adventures of Conan Doyle’s beloved detective with psychology studies both classic and cutting-edge, Konnikova builds a compelling case at the intersection of science and secular spiritualism, stressing the power of rigorous observation alongside a Buddhist-like, Cageian emphasis on mindfulness. She writes:

The idea of mindfulness itself is by no means a new one. As early as the end of the nineteenth century, William James, the father of modern psychology, wrote that, ‘The faculty of voluntarily bringing back a wandering attention, over and over again, is the very root of judgment, character, and will. … An education which should improve this faculty would be the education par excellence.’ That faculty, at its core, is the very essence of mindfulness. And the education that James proposes, an education in a mindful approach to life and to thought.

[…]

In recent years, studies have shown that meditation-like thought (an exercise in the very attentional control that forms the center of mindfulness), for as little as fifteen minutes a day, can shift frontal brain activity toward a pattern that has been associated with more positive and more approach-oriented emotional states, and that looking at scenes of nature, for even a short while, can help us become more insightful, more creative, and more productive. We also know, more definitively than we ever have, that our brains are not built for multitasking — something that precludes mindfulness altogether. When we are forced to do multiple things at once, not only do we perform worse on all of them but our memory decreases and our general wellbeing suffers a palpable hit.

But for Sherlock Holmes, mindful presence is just a first step. It’s a means to a far larger, far more practical and practically gratifying goal. Holmes provides precisely what William James had prescribed: an education in improving our faculty of mindful thought and in using it in order to accomplish more, think better, and decide more optimally. In its broadest application, it is a means for improving overall decision making and judgment ability, starting from the most basic building block of your own mind.

But mindfulness, and the related mental powers it bestows upon its master, is a skill acquired with grit and practice, rather than an inborn talent or an easy feat attained with a few half-hearted tries:

It is most difficult to apply Holmes’s logic in those moments that matter the most. And so, all we can do is practice, until our habits are such that even the most severe stressors will bring out the very thought patterns that we’ve worked so hard to master.

Echoing Carl Sagan, Konnikova examines the role of intuition — a grab-bag concept embraced by some of history’s greatest scientific minds, cultural icons, and philosophers — as both a helpful directional signpost of intellectual inquiry and a dangerous blind spot:

Our intuition is shaped by context, and that context is deeply informed by the world we live in. It can thus serve as a blinder — or blind spot — of sorts. … With mindfulness, however, we can strive to find a balance between fact-checking our intuitions and remaining open-minded. We can then make our best judgments, with the information we have and no more, but with, as well, the understanding that time may change the shape and color of that information.

“I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose,” Holmes famously remarked. Indeed, much like the inventor’s mind, the problem-solver’s mind is the product of that very choice: The details and observations we select to include in our “brain attic” shape and filter our perception of reality. Konnikova writes:

Observation with a capital O — the way Holmes uses the word when he gives his new companion a brief history of his life with a single glance — does entail more than, well, observation (the lowercase kind). It’s not just about the passive process of letting objects enter into your visual field. It is about knowing what and how to observe and directing your attention accordingly: what details do you focus on? What details do you omit? And how do you take in and capture those details that you do choose to zoom in on? In other words, how do you maximize your brain attic’s potential? You don’t just throw any old detail up there, if you remember Holmes’s early admonitions; you want to keep it as clean as possible. Everything we choose to notice has the potential to become a future furnishing of our attics — and what’s more, its addition will mean a change in the attic’s landscape that will affect, in turn, each future addition. So we have to choose wisely.

Choosing wisely means being selective. It means not only looking but looking properly, looking with real thought. It means looking with the full knowledge that what you note — and how you note it — will form the basis of any future deductions you might make. It’s about seeing the full picture, noting the details that matter, and understanding how to contextualize those details within a broader framework of thought.

Originally featured in January — read the full article for more, including Konnikova’s four rules for Sherlockian thinking.

6. MAKE GOOD ART

Commencement season is upon us and, after Greil Marcus’s soul-stirring speech on the essence of art at the 2013 School of Visual Arts graduation ceremony, here comes an exceptional adaptation of one of the best commencement addresses ever delivered: In May of 2012, beloved author Neil Gaiman stood up in front of the graduating class at Philadelphia’s University of the Arts and dispensed some timeless advice on the creative life; now, his talk comes to life as a slim but potent book titled Make Good Art (public library).

Best of all, it’s designed by none other than the inimitable Chip Kidd, who has spent the past fifteen years shaping the voice of contemporary cover design with his prolific and consistently stellar output, ranging from bestsellers like cartoonist Chris Ware’s sublime Building Stories and neurologist Oliver Sacks’s The Mind’s Eye to lesser-known gems like The Paris Review‘s Women Writers at Work and The Letter Q, that wonderful anthology of queer writers’ letters to their younger selves. (Fittingly, Kidd also designed the book adaptation of Ann Patchett’s 2006 commencement address.)

When things get tough, this is what you should do: Make good art. I’m serious. Husband runs off with a politician — make good art. Leg crushed and then eaten by a mutated boa constrictor — make good art. IRS on your trail — make good art. Cat exploded — make good art. Someone on the Internet thinks what you’re doing is stupid or evil or it’s all been done before — make good art. Probably things will work out somehow, eventually time will take the sting away, and that doesn’t even matter. Do what only you can do best: Make good art. Make it on the bad days, make it on the good days, too.

A wise woman once said, “If you are not making mistakes, you’re not taking enough risks.” Gaiman articulates the same sentiment with his own brand of exquisite eloquence:

I hope that in this year to come, you make mistakes.

Because if you are making mistakes, then you are making new things, trying new things, learning, living, pushing yourself, changing yourself, changing your world. You’re doing things you’ve never done before, and more importantly, you’re Doing Something.

So that’s my wish for you, and all of us, and my wish for myself. Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before. Don’t freeze, don’t stop, don’t worry that it isn’t good enough, or it isn’t perfect, whatever it is: art, or love, or work or family or life.

Whatever it is you’re scared of doing, Do it.

Make your mistakes, next year and forever.

Originally featured in May — read the full article here, along with a video of Gaiman’s original commencement address.

7. HOW CHILDREN SUCCEED

In How Children Succeed: Grit, Curiosity, and the Hidden Power of Character (public library) — a necessary addition to these fantastic reads on education — Paul Tough, whose writing has appeared in The New Yorker, Slate, Esquire, and The New York Times, sets out to investigate the essential building blocks of character through the findings and practical insight of exceptional educators and bleeding-edge researchers. One of his core arguments is based on the work of pioneering psychologist and 2013 MacArthur “genius” grantee Angela Duckworth, who studied under positive psychology godfather Martin Seligman at my alma mater, the University of Pennsylvania, and has done more than anyone to advance our understanding of how self-control and grit — the relentless work ethic of sustaining your commitments toward a long-term goal — impact success.

Duckworth had come to Penn in 2002, at the age of thirty-two, later in life than a typical graduate student. The daughter of Chinese immigrants, she had been a classic multitasking overachiever in her teens and twenties. After completing her undergraduate degree at Harvard (and starting a summer school for low-income kids in Cambridge in her spare time), she had bounced from one station of the mid-nineties meritocracy to the next: intern in the White House speechwriting office, Marshall scholar at Oxford (where she studied neuroscience), management consultant for McKinsey and Company, charter-school adviser.

Duckworth spent a number of years toying with the idea of starting her own charter school, but eventually concluded that the model didn’t hold much promise for changing the circumstances of children from disadvantaged backgrounds, those whom the education system was failing most tragically. Instead, she decided to pursue a PhD program at Penn. In her application essay, she shared how profoundly the experience of working in schools had changed her view of school reform and wrote:

The problem, I think, is not only the schools but also the students themselves. Here’s why: learning is hard. True, learning is fun, exhilarating and gratifying — but it is also often daunting, exhausting and sometimes discouraging. . . . To help chronically low-performing but intelligent students, educators and parents must first recognize that character is at least as important as intellect.

Duckworth began her graduate work by studying self-discipline. But when she completed her first-year thesis, based on a group of 164 eighth-graders from a Philadelphia middle school, she arrived at a startling discovery that would shape the course of her career: She found that the students’ self-discipline scores were far better predictors of their academic performance than their IQ scores. So she became intensely interested in what strategies and tricks we might develop to maximize our self-control, and whether those strategies can be taught. But self-control, it turned out, was only a good predictor when it came to immediate, concrete goals — like, say, resisting a cookie. Tough writes:

Duckworth finds it useful to divide the mechanics of achievement into two separate dimensions: motivation and volition. Each one, she says, is necessary to achieve long-term goals, but neither is sufficient alone. Most of us are familiar with the experience of possessing motivation but lacking volition: You can be extremely motivated to lose weight, for example, but unless you have the volition — the willpower, the self-control — to put down the cherry Danish and pick up the free weights, you’re not going to succeed. If a child is highly motivated, the self-control techniques and exercises Duckworth tried to teach [the students in her study] might be very helpful. But what if students just aren’t motivated to achieve the goals their teachers or parents want them to achieve? Then, Duckworth acknowledges, all the self-control tricks in the world aren’t going to help.

This is where grit comes in — the X-factor that helps us attain longer-term, more abstract goals. To measure it, Duckworth and her colleague Chris Peterson developed the Grit Scale — a deceptively simple test on which you rate how much each of twelve statements applies to you, from “I am a hard worker” to “New ideas and projects sometimes distract me from previous ones.” The results are profoundly predictive of success in domains as wide-ranging as the National Spelling Bee and the West Point military academy. Tough describes the surprising power of this seemingly mundane questionnaire:

For each statement, respondents score themselves on a five-point scale, ranging from 5, “very much like me,” to 1, “not like me at all.” The test takes about three minutes to complete, and it relies entirely on self-report — and yet when Duckworth and Peterson took it out into the field, they found it was remarkably predictive of success. Grit, Duckworth discovered, is only faintly related to IQ — there are smart gritty people and dumb gritty people — but at Penn, high grit scores allowed students who had entered college with relatively low college-board scores to nonetheless achieve high GPAs. At the National Spelling Bee, Duckworth found that children with high grit scores were more likely to survive to the later rounds. Most remarkable, Duckworth and Peterson gave their grit test to more than twelve hundred freshman cadets as they entered the military academy at West Point and embarked on the grueling summer training course known as Beast Barracks. The military has developed its own complex evaluation, called the whole candidate score, to judge incoming cadets and predict which of them will survive the demands of West Point; it includes academic grades, a gauge of physical fitness, and a leadership potential score. But the more accurate predictor of which cadets persisted in Beast Barracks and which ones dropped out turned out to be Duckworth’s simple little twelve-item grit questionnaire.

You can take the Grit Scale here (registration is free).
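For the programmatically inclined, here is a hedged sketch of how a twelve-item, five-point self-report scale like Duckworth’s is commonly scored: an average across items, with “distractibility”-type statements reverse-scored. The item indices and example answers below are illustrative assumptions, not the published instrument:

```python
# A sketch of scoring a twelve-item, five-point self-report scale in the
# style of the Grit Scale. Which items are reverse-scored is an
# illustrative assumption here, not the published questionnaire.

def grit_score(responses: list[int], reverse_scored: set[int]) -> float:
    """Average of twelve 1-5 responses, flipping reverse-scored items.

    responses: answers in item order, each 1 ("not like me at all")
               to 5 ("very much like me").
    reverse_scored: zero-based indices of items where agreement
                    indicates *less* grit (e.g., "New ideas and projects
                    sometimes distract me from previous ones").
    """
    if len(responses) != 12 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected twelve responses on a 1-5 scale")
    adjusted = [6 - r if i in reverse_scored else r
                for i, r in enumerate(responses)]
    return sum(adjusted) / len(adjusted)

# Example: a respondent who strongly endorses persistence items (5s)
# and weakly endorses distractibility items (2s, flipped to 4s).
answers = [5, 2, 5, 2, 5, 2, 5, 2, 5, 2, 5, 2]
print(grit_score(answers, reverse_scored={1, 3, 5, 7, 9, 11}))  # -> 4.5
```

Reverse-scoring is a standard guard against acquiescence bias: a respondent who simply agrees with every statement cannot max out the scale.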

8. THINKING: THE NEW SCIENCE OF DECISION-MAKING, PROBLEM-SOLVING AND PREDICTION

Every year, intellectual impresario and Edge editor John Brockman summons some of our era’s greatest thinkers and unleashes them on one provocative question, whether it’s the single most elegant theory of how the world works or the best way to enhance our cognitive toolkit. This year, he sets out on the most ambitious quest yet, a meta-exploration of thought itself: Thinking: The New Science of Decision-Making, Problem-Solving, and Prediction (public library) collects short essays and lecture adaptations from such celebrated and wide-ranging (though not in gender) minds as Daniel Dennett, Jonathan Haidt, Dan Gilbert, and Timothy Wilson, covering subjects as diverse as morality, essentialism, and the adolescent brain.

One of the most provocative contributions comes from Nobel-winning psychologist Daniel Kahneman — author of the indispensable Thinking, Fast and Slow, one of the best psychology books of 2012 — who examines “the marvels and the flaws of intuitive thinking.”

In the 1970s, Kahneman and his colleague Amos Tversky, self-crowned “prophets of irrationality,” began studying what they called “heuristics and biases” — mental shortcuts we take, which frequently result in cognitive errors. Those errors, however, reveal a great deal about how our minds work:

If you want to characterize how something is done, then one of the most powerful ways of characterizing how the mind does anything is by looking at the errors that the mind produces while it’s doing it because the errors tell you what it is doing. Correct performance tells you much less about the procedure than the errors do.

One of the most fascinating examples of heuristics and biases is what we call intuition —
