
Superforecasting: The Art and Science of Prediction
by Philip E. Tetlock
30 popular highlights from this book
Key Insights & Memorable Quotes
Below are the most popular and impactful highlights and quotes from Superforecasting: The Art and Science of Prediction:
"If you don't get this elementary, but mildly unnatural, mathematics of elementary probability into your repertoire, then you go through a long life like a one-legged man in an ass-kicking contest."
"For superforecasters, beliefs are hypotheses to be tested, not treasures to be guarded."
"For scientists, not knowing is exciting. It's an opportunity to discover; the more that is unknown, the greater the opportunity."
"Consensus is not always good; disagreement not always bad. If you do happen to agree, don't take that agreement – in itself – as proof that you are right. Never stop doubting."
"It is wise to take admissions of uncertainty seriously," Daniel Kahneman noted, "but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true."
"Churchill sent Keynes a cable reading, "Am coming around to your point of view." His Lordship replied, "Sorry to hear it. Have started to change my mind."
"If you have to plan for a future beyond the forecasting horizon, plan for surprise. That means, as Danzig advises, planning for adaptability and resilience."
"The test of a first-rate intelligence is the ability to hold two opposed ideas in mind at the same time and still retain the ability to function." – F. Scott Fitzgerald
"It's very hard to master and if you're not learning all the time, you will fail. That being said, humility in the face of the game is extremely different than humility in the face of your opponents."
"It was the absence of doubt – and scientific rigor – that made medicine unscientific and caused it to stagnate for so long."
"It follows that the goal of forecasting is not to see what's coming. It is to advance the interests of the forecaster and the forecaster's tribe."
"Fuzzy thinking can never be proven wrong. And only when we are proven wrong so clearly that we can no longer deny it to ourselves will we adjust our mental models of the world – producing a clearer picture of reality. Forecast, measure, revise: it is the surest path to seeing better."
"There is no divinely mandated link between morality and competence."
"Knowing what we don't know is better than thinking we know what we don't."
"Suppose someone says, "Unfortunately, the popularity of soccer, the world's favorite pastime, is starting to decline." You suspect he is wrong. How do you question the claim? Don't even think of taking a personal shot like "You're silly." That only adds heat, not light. "I don't think so" only expresses disagreement without delving into why you disagree. "What do you mean?" lowers the emotional temperature with a question but it's much too vague. Zero in. You might say, "What do you mean by 'pastime'?" or "What evidence is there that soccer's popularity is declining? Over what time frame?" The answers to these precise questions won't settle the matter, but they will reveal the thinking behind the conclusion so it can be probed and tested. Since Socrates, good teachers have practiced precision questioning, but still it's often not used when it's needed most. Imagine how events might have gone if the Kennedy team had engaged in precision questioning when planning the Bay of Pigs invasion: "So what happens if they're attacked and the plan falls apart?" "They retreat into the Escambray Mountains, where they can meet up with other anti-Castro forces and plan guerrilla operations." "How far is it from the proposed landing site in the Bay of Pigs to the Escambray Mountains?" "Eighty miles." "And what's the terrain?" "Mostly swamp and jungle." "So the guerrillas have been attacked. The plan has fallen apart. They don't have helicopters or tanks. But they have to cross eighty miles of swamp and jungle before they can begin to look for shelter in the mountains? Is that correct?" I suspect that this conversation would not have concluded "sounds good!" Questioning like that didn't happen, so Kennedy's first major decision as president was a fiasco. The lesson was learned, resulting in the robust but respectful debates of the Cuban missile crisis – which exemplified the spirit we encouraged among our forecasters."
"Foresight isn't a mysterious gift bestowed at birth. It is the product of particular ways of thinking, of gathering information, of updating beliefs. These habits of thought can be learned and cultivated by any intelligent, thoughtful, determined person."
"Be careful about making assumptions of expertise, ask experts if you can find them, reexamine your assumptions from time to time."
"All models are wrong," the statistician George Box observed, "but some are useful."
"the facts change, I change my mind."
"In the EPJ results, there were two statistically distinguishable groups of experts. The first failed to do better than random guessing, and in their longer-range forecasts even managed to lose to the chimp. The second group beat the chimp, though not by a wide margin, and they still had plenty of reason to be humble. Indeed, they only barely beat simple algorithms like "always predict no change" or "predict the recent rate of change." Still, however modest their foresight was, they had some. So why did one group do better than the other? It wasn't whether they had PhDs or access to classified information. Nor was it what they thought – whether they were liberals or conservatives, optimists or pessimists. The critical factor was how they thought. One group tended to organize their thinking around Big Ideas, although they didn't agree on which Big Ideas were true or false. Some were environmental doomsters ("We're running out of everything"); others were cornucopian boomsters ("We can find cost-effective substitutes for everything"). Some were socialists (who favored state control of the commanding heights of the economy); others were free-market fundamentalists (who wanted to minimize regulation). As ideologically diverse as they were, they were united by the fact that their thinking was so ideological. They sought to squeeze complex problems into the preferred cause-effect templates and treated what did not fit as irrelevant distractions. Allergic to wishy-washy answers, they kept pushing their analyses to the limit (and then some), using terms like "furthermore" and "moreover" while piling up reasons why they were right and others wrong. As a result, they were unusually confident and likelier to declare things "impossible" or "certain." Committed to their conclusions, they were reluctant to change their minds even when their predictions clearly failed. They would tell us, "Just wait." The other group consisted of more pragmatic experts who drew on many analytical tools, with the choice of tool hinging on the particular problem they faced. These experts gathered as much information from as many sources as they could. When thinking, they often shifted mental gears, sprinkling their speech with transition markers such as "however," "but," "although," and "on the other hand." They talked about possibilities and probabilities, not certainties. And while no one likes to say "I was wrong," these experts more readily admitted it and changed their minds. Decades ago, the philosopher Isaiah Berlin wrote a much-acclaimed but rarely read essay that compared the styles of thinking of great authors through the ages. To organize his observations, he drew on a scrap of 2,500-year-old Greek poetry attributed to the warrior-poet Archilochus: "The fox knows many things but the hedgehog knows one big thing." No one will ever know whether Archilochus was on the side of the fox or the hedgehog but Berlin favored foxes. I felt no need to take sides. I just liked the metaphor because it captured something deep in my data. I dubbed the Big Idea experts "hedgehogs" and the more eclectic experts "foxes." Foxes beat hedgehogs. And the foxes didn't just win by acting like chickens, playing it safe with 60% and 70% forecasts where hedgehogs boldly went with 90% and 100%. Foxes beat hedgehogs on both calibration and resolution. Foxes had real foresight. Hedgehogs didn't."
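A note on the terms "calibration" and "resolution" at the end of that passage: both can be computed directly from a forecasting track record. The sketch below (my own Python illustration, not code from the book) scores a set of probability forecasts with the Brier score and splits it into a calibration term and a resolution term using the standard Murphy decomposition; the function name, bin count, and example numbers are all invented for illustration.

    # A minimal sketch of measuring "calibration" and "resolution" for
    # probability forecasts via the Murphy decomposition of the Brier score.
    # Forecasts are probabilities in [0, 1]; outcomes are 1 (happened) or 0.

    from collections import defaultdict

    def brier_decomposition(forecasts, outcomes, n_bins=10):
        """Return (brier, calibration, resolution) for paired forecasts/outcomes."""
        n = len(forecasts)
        base_rate = sum(outcomes) / n

        # Overall Brier score: mean squared error of the stated probabilities.
        brier = sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / n

        # Group forecasts that state roughly the same probability.
        bins = defaultdict(list)
        for f, o in zip(forecasts, outcomes):
            bins[min(int(f * n_bins), n_bins - 1)].append((f, o))

        calibration = 0.0  # gap between stated probabilities and observed frequencies (lower is better)
        resolution = 0.0   # how far the forecaster's groups sit from the base rate (higher is better)
        for group in bins.values():
            k = len(group)
            mean_forecast = sum(f for f, _ in group) / k
            observed_freq = sum(o for _, o in group) / k
            calibration += k / n * (mean_forecast - observed_freq) ** 2
            resolution += k / n * (observed_freq - base_rate) ** 2

        return brier, calibration, resolution

    # Invented example: eight forecasts and what actually happened.
    forecasts = [0.9, 0.8, 0.7, 0.3, 0.2, 0.1, 0.6, 0.4]
    outcomes  = [1,   1,   0,   0,   0,   0,   1,   1]
    print(brier_decomposition(forecasts, outcomes))

Low calibration error means stated probabilities match observed frequencies; high resolution means the forecaster separates likely from unlikely cases instead of hugging the base rate, which is the sense in which foxes beat hedgehogs on both counts.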
"I have been struck by how important measurement is to improving the human condition," Bill Gates wrote. "You can achieve incredible progress if you set a clear goal and find a measure that will drive progress toward that goal.... This may seem basic, but it is amazing how often it is not done and how hard it is to get right."
"Here's a very simple example," says Annie Duke, an elite professional poker player, winner of the World Series of Poker, and a former PhD-level student of psychology. "Everyone who plays poker knows you can either fold, call, or raise [a bet]. So what will happen is that when a player who isn't an expert sees another player raise, they automatically assume that that player is strong, as if the size of the bet is somehow correlated at one with the strength of the other person's hand." This is a mistake."
"They aren't gurus or oracles with the power to peer decades into the future, but they do have a real, measurable skill at judging how high-stakes events are likely to unfold three months, six months, a year, or a year and a half in advance. The other conclusion is what makes these superforecasters so good. It's not really who they are. It is what they do. Foresight isn't a mysterious gift bestowed at birth. It is the product of particular ways of thinking, of gathering information, of updating beliefs."
"And yet this stagnation is a big reason why I am an optimistic skeptic. We know that in so much of what people want to predict – politics, economics, finance, business, technology, daily life – predictability exists, to some degree, in some circumstances. But there is so much else we do not know."
"Unpack the question into components. Distinguish as sharply as you can between the known and unknown and leave no assumptions unscrutinized. Adopt the outside view and put the problem into a comparative perspective that downplays its uniqueness and treats it as a special case of a wider class of phenomena. Then adopt the inside view that plays up the uniqueness of the problem. Also explore the similarities and differences between your views and those of others – and pay special attention to prediction markets and other methods of extracting wisdom from crowds. Synthesize all these different views into a single vision as acute as that of a dragonfly. Finally, express your judgment as precisely as you can, using a finely grained scale of probability."
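The "outside view, then inside view" sequence in that checklist can be made concrete with numbers. Below is a minimal sketch (my own illustration, not the book's procedure): start from a base rate for the wider class of cases, adjust it with case-specific evidence via Bayes' rule, and report the result on a finely grained probability scale. The function name and all figures are invented.

    # A minimal outside-view-then-inside-view sketch: begin at a base rate,
    # update on case-specific evidence with Bayes' rule, report a precise number.

    def bayes_update(prior, p_evidence_if_yes, p_evidence_if_no):
        """Posterior probability of 'yes' after seeing one piece of evidence."""
        numerator = prior * p_evidence_if_yes
        return numerator / (numerator + (1 - prior) * p_evidence_if_no)

    # Outside view: suppose about 25% of comparable past cases resolved "yes".
    estimate = 0.25

    # Inside view: case-specific observations, each given as
    # (probability of seeing it if "yes", probability of seeing it if "no").
    evidence = [
        (0.8, 0.4),  # an observation twice as likely if the event will happen
        (0.3, 0.5),  # an observation that leans the other way
    ]
    for p_yes, p_no in evidence:
        estimate = bayes_update(estimate, p_yes, p_no)

    print(f"Final judgment: {estimate:.2f}")  # a specific probability, not just "unlikely"

Reporting the answer as a number like 0.29 rather than a vague "probably not" is what the "finely grained scale of probability" in the passage is pointing at.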
"success can lead to acclaim that can undermine the habits of mind that produced the success."
"Not knowing is exciting. It's an opportunity to discover."
"The ultimate goal of science is uncertainty's total eradication."
"In describing how we think and decide, modern psychologists often deploy a dual-system model that partitions our mental universe into two domains. System 2 is the familiar realm of conscious thought. It consists of everything we choose to focus on. By contrast, System 1 is largely a stranger to us. It is the realm of automatic perceptual and cognitive operations – like those you are running right now to transform the print on this page into a meaningful sentence or to hold the book while reaching for a glass and taking a sip. We have no awareness of these rapid-fire processes but we could not function without them. We would shut down."
"Forget the old advice to think twice. Superforecasters often think thrice – and sometimes they are just warming up to do a deeper-dive analysis."


