2015-08-11



INTRODUCTION

On the weekend of July 30th, Edge convened one of its "Master Classes." In the past, these classes have featured short courses taught by people such as psychologist and Nobel Laureate Daniel Kahneman ("A Short Course in Thinking About Thinking"); behavioral economists Richard Thaler and Sendhil Mullainathan, again with Kahneman ("A Short Course in Behavioral Economics"); and genomic researchers George Church and J. Craig Venter ("A Short Course on Synthetic Genomics").

This year, the psychologist and social scientist Philip E. Tetlock presented the findings based on his work on forecasting as part of the Good Judgment Project. In 1984, Tetlock began holding "forecasting tournaments" in which selected candidates were asked questions about the course of events: In the wake of a natural disaster, what policies will be changed in the United States? When will North Korea test nuclear weapons? Candidates examine the questions in teams. They are not necessarily experts, but attentive, shrewd citizens.

Steven Pinker, who has written about Tetlock's work on Superforecasting, noted that "Tetlock is one of the very, very best minds in the social sciences today. He has come up with one brilliant idea after another, and superforecasting is no exception. Everyone agrees that the way to know if an idea is right is to see whether it accurately predicts the future. But which ideas, which methods, which people have an actual, provable track record of non-obvious predictions vindicated by the course of events? The answers will surprise you, and have radical implications for politics, policy, journalism, education, and even epistemology—how we can best gain knowledge about the world we live in."

Among Tetlock's "students" at the Edge weekend were many intellectual heavyweights including political scientist and National Medal of Science winner Robert Axelrod; psychologist, Nobel Laureate, and recipient of the 2013 Presidential Medal of Freedom Daniel Kahneman; the political scientist and Director of Stanford's CASBS Margaret Levi; Google Senior Vice President Salar Kamangar; psychologist and National Medal of Science winner Anne Treisman; roboticist Rodney Brooks, former director of MIT's Computer Science and Artificial Intelligence Laboratory; W. Daniel Hillis, pioneer in massively parallel computation; medical inventor Dean Kamen; and Peter Lee, Corporate Vice President, Microsoft Research, overseeing MSR NExT.

Over the weekend in Napa, Tetlock held five classes, which are being presented by Edge in their entirety (8.5 hours of video and audio) along with accompanying transcripts (61,000 words). Commenting on the event, one of the participants wrote:

"The interesting thing is that this is not about a latest trend that might scale in one or two years, but about real change that might take a decade or two. These master classes are also far more profound than any of the conferences popularizing contemporary intellectualism. The chance to spend that much time with the clairvoyants in a setting like this gives you a sense of community much greater than any of the advertised events."

Enjoy!

Best,

John Brockman

Editor, Edge

PHILIP E. TETLOCK, Political and Social Scientist, is the Annenberg University Professor at the University of Pennsylvania, with appointments in Wharton, psychology, and political science. He is co-leader of the Good Judgment Project, a multi-year forecasting study, the author of Expert Political Judgment and (with Aaron Belkin) Counterfactual Thought Experiments in World Politics, and co-author (with Dan Gardner) of Superforecasting: The Art & Science of Prediction (forthcoming, US, Crown, September 29th; UK, Random House, September 24th). Further reading on Edge: "How To Win At Forecasting: A Conversation with Philip Tetlock" (December 6, 2012). Philip Tetlock's Edge Bio Page.

US edition: publication September 29th; available for pre-ordering.

UK edition: publication September 24th; available for pre-ordering.

ATTENDEES:
Robert Axelrod, Political Scientist; Walgreen Professor for the Study of Human Understanding, University of Michigan; Author, The Evolution of Cooperation; Member, National Academy of Sciences; Recipient, National Medal of Science; Stewart Brand, Founder, The Whole Earth Catalog; Co-Founder, The Well; Co-Founder, The Long Now Foundation; Author, Whole Earth Discipline; John Brockman, Editor, Edge; Author, The Third Culture; Rodney Brooks, Panasonic Professor of Robotics (emeritus), MIT; Founder, Chairman/CTO, Rethink Robotics; Author, Flesh and Machines; Brian Christian, Philosopher, Computer Scientist, Poet; Author, The Most Human Human; Wael Ghonim, Pro-democracy leader of the Tahrir Square demonstrations in Egypt; Anonymous administrator of the Facebook page, "We are all Khaled Saeed"; W. Daniel Hillis, Physicist; Computer Scientist; Chairman, Applied Minds; Author, The Pattern on the Stone; Jennifer Jacquet, Assistant Professor of Environmental Studies, NYU; Author, Is Shame Necessary?; Daniel Kahneman, Professor Emeritus of Psychology, Princeton; Author, Thinking, Fast and Slow; Winner of the 2013 Presidential Medal of Freedom; Recipient of the 2002 Nobel Prize in Economic Sciences; Salar Kamangar, Senior Vice President, Google; Former head of YouTube; Dean Kamen, Inventor and Entrepreneur, DEKA Research; Andrian Kreye, Feuilleton Editor, Sueddeutsche Zeitung, Munich; Peter Lee, Corporate VP, Microsoft Research; Founding Director, DARPA's technology office; Former Head, Carnegie Mellon Computer Science Department and CMU's Vice Provost for Research; Margaret Levi, Political Scientist; Director, Center for Advanced Study in the Behavioral Sciences (CASBS), Stanford University; Barbara Mellers, Psychologist; George Heyman University Professor at the University of Pennsylvania; Past President, Society for Judgment and Decision Making; Ludwig Siegele, Technology Editor, The Economist; Rory Sutherland, Executive Creative Director and Vice-Chairman, OgilvyOne London; Vice-Chairman, Ogilvy & Mather UK; Columnist, The Spectator; Philip Tetlock, Political and Social Scientist; Annenberg University Professor at the University of Pennsylvania; Author, Expert Political Judgment, and (with Dan Gardner) Superforecasting (forthcoming); Anne Treisman, James S. McDonnell Distinguished University Professor Emeritus of Psychology at Princeton; Recipient, National Medal of Science; D.A. Wallach, Recording Artist; Songwriter; Artist in Residence, Spotify; Hi-Tech Investor

"La Miravalle" at Spring Mountain Vineyard

The Office for Anticipating Surprise
by Andrian Kreye, Feuilleton Editor
English translation by Arya Kamangar


In the circle of clairvoyants: At a vineyard north of San Francisco, Philip Tetlock of the University of Pennsylvania (left) presented his findings. Nobel Laureate Daniel Kahneman (third from left) was initially skeptical. Photo: John Brockman / edge.org

CLASS I — Forecasting Tournaments: What We Discover When We Start Scoring Accuracy

It is as though high status pundits have learned a valuable survival skill, and that survival skill is they've mastered the art of appearing to go out on a limb without actually going out on a limb. They say dramatic things but there are vague verbiage quantifiers connected to the dramatic things. It sounds as though they're saying something very compelling and riveting. There's a scenario that's been conjured up in your mind of something either very good or very bad. It's vivid, easily imaginable.

It turns out, on close inspection they're not really saying that's going to happen. They're not specifying the conditions, or a time frame, or likelihood, so there's no way of assessing accuracy. You could say these pundits are just doing what a rational pundit would do because they know that they live in a somewhat stochastic world. They know that it's a world that frequently is going to throw off surprises at them, so to maintain their credibility with their community of co-believers they need to be vague. It's an essential survival skill. There is some considerable truth to that, and forecasting tournaments are a very different way of proceeding. Forecasting tournaments require people to attach explicit probabilities to well-defined outcomes in well-defined time frames so you can keep score.
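Keeping score can be made concrete. The standard accuracy metric for probability forecasts, and the one used in Tetlock's tournaments, is the Brier score: the mean squared difference between the probabilities a forecaster assigned and what actually happened. The sketch below is an illustrative implementation, not the Good Judgment Project's actual code:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error of probability forecasts against binary outcomes.

    forecasts: probabilities in [0, 1] that the event will occur.
    outcomes:  1 if the event occurred, 0 if it did not.
    Lower is better; 0.0 is a perfect record.
    """
    if len(forecasts) != len(outcomes):
        raise ValueError("forecasts and outcomes must have equal length")
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# A well-calibrated, decisive forecaster beats a perpetual hedger:
bold = brier_score([0.9, 0.1, 0.8], [1, 0, 1])    # 0.02
hedged = brier_score([0.5, 0.5, 0.5], [1, 0, 1])  # 0.25
```

Note why this punishes the vague pundit: saying "there is a real possibility" commits to nothing and cannot be scored, while always answering 0.5 is scoreable but guarantees a mediocre 0.25 on every binary question.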

CLASS II — Tournaments: Prying Open Closed Minds in Unnecessarily Polarized Debates

Tournaments have a scientific value. They help us test a lot of psychological hypotheses about the drivers of accuracy, they help us test statistical ideas; there are a lot of ideas we can test in tournaments. Tournaments have a value inside organizations and businesses. A more accurate probability helps to price options better on Wall Street, so they have value.

I wanted to focus more on what I see as the wider societal value of tournaments and the potential value of tournaments in depolarizing unnecessarily polarizing policy debates. In short, making us more civilized. ...

There is a well-developed research literature on how to measure accuracy. There is no such well-developed literature on how to measure the quality of questions. The quality of questions is going to be absolutely crucial if we want tournaments to be able to play a role in tipping the scales of plausibility in important debates, and if you want tournaments to play a role in incentivizing people to behave more reasonably in debates.

CLASS III — Counterfactual History: The Elusive Control Groups in Policy Debates

There's a picture of two people on slide seventy-two, one of whom is one of the most famous historians of the 20th century, E.H. Carr, and the other of whom is a famous economic historian at the University of Chicago, Robert Fogel. They could not have more different attitudes toward the importance of counterfactuals in history. For E.H. Carr, counterfactuals were a pestilence; they were a frivolous parlor game, a methodological rattle, a sore loser's history. It was a waste of cognitive effort to think about counterfactuals. You should think about history the way it did unfold and figure out why it had to unfold the way it did—almost a prescription for hindsight bias.

Robert Fogel, on the other hand, approached it more like a scientist. He quite correctly recognized that if you want to draw causal inferences from any historical sequence, you have to make assumptions about what would have happened if the hypothesized cause had taken on a different value. That's a counterfactual. You had this interesting tension. Many historians do still agree, in some form, with E.H. Carr. Virtually all economic historians would agree with Robert Fogel, who's one of the pivotal people in economic history; he won a Nobel Prize. But there's this very interesting tension between people who are more open or less open to thinking about counterfactuals. Why that is, is something worth exploring.

CLASS IV — Counterfactuals and the Making of (Better) Superforecasters
(Publication date: September 15)

CLASS V — Condensing it All Into Four Big Problems and a Killer App Solution
(Publication date: September 22)
