Silver, Nate (2012). The Signal and the Noise: Why So Many Predictions Fail – but Some Don’t. New York: The Penguin Press. ISBN 9781101595954. 545 pages. €15.67
A very good book, though far from perfect. A book full of excesses rather than flaws: Nate Silver feels the need to write about everything that has interested and fascinated him over the years – baseball, poker, politics – and about every field in which, in his view, the science of prediction has piled up errors and can still make progress. All this makes The Signal and the Noise a monster of a book, but also one of the most stimulating books of the year.
Nate Silver has been in the spotlight in recent weeks, first because of the controversy over his forecasts of the outcome of the American elections, and then because of his triumph. I have written about him on this blog more than once, both about the American elections themselves (Nate Silver, il vincitore morale delle elezioni americane) and in a discussion of the reliability of private weather forecasts (Le previsioni dei meteorologi privati sono distorte?).
Since I noted down about fifty quotations while reading the book, I will refrain from writing a proper review – the book’s themes and Silver’s theses emerge clearly enough from the quotations themselves. But I’ll leave you a couple of video clips, because Silver is, in my opinion, a very interesting character, with a vague resemblance to Clark Kent, a prototype of the geek and the gay man, which naturally made him a target for Mitt Romney’s supporters. [In an interview with The Observer (Nate Silver: it’s the numbers, stupid), when Carole Cadwalladr asks him «What made you more of a misfit, […] being gay or a geek?», he replies: «Probably the numbers stuff since I had that from when I was six.»]
The first video is the presentation of the book:
The second is a long conversation (about an hour) held at Google a few days ago:
* * *
First, a curiosity: I learned from this book (Kindle location 261) that the phrase “information overload” was coined by Alvin Toffler in Future Shock in 1970.
* * *
Here are the quotations I threatened you with, though I suggest you read them, or at least skim them (references, as always, are to Kindle locations).
The instinctual shortcut that we take when we have “too much information” is to engage with it selectively, picking out the parts we like and ignoring the remainder, making allies with those who have made the same choices and enemies of the rest. [104]
A prediction was what the soothsayer told you; a forecast was something more like Cassius’s idea.
The term forecast came from English’s Germanic roots, unlike predict, which is from Latin. Forecasting reflected the new Protestant worldliness rather than the otherworldliness of the Holy Roman Empire. [134]
The human brain is quite remarkable; it can store perhaps three terabytes of information. And yet that is only about one one-millionth of the information that IBM says is now produced in the world each day. So we have to be terribly selective about the information we choose to remember. [257]
The printing press changed the way in which we made mistakes. Routine errors of transcription became less common. But when there was a mistake, it would be reproduced many times over, as in the case of the Wicked Bible. [275]
When you can’t state your innocence, proclaim your ignorance. [399]
“The major difference between a thing that might go wrong and a thing that cannot possibly go wrong is that when a thing that cannot possibly go wrong goes wrong it usually turns out to be impossible to get at or repair,” wrote Douglas Adams in The Hitchhiker’s Guide to the Galaxy series. [478]
[…] even if the amount of knowledge in the world is increasing, the gap between what we know and what we think we know may be widening. [828]
“When the facts change, I change my mind,” the economist John Maynard Keynes famously said. “What do you do, sir?” [1169]
Olympic gymnasts peak in their teens; poets in their twenties; chess players in their thirties; applied economists in their forties, and the average age of a Fortune 500 CEO is 55. [1437]
[…] statheads can have their biases too. One of the most pernicious ones is to assume that if something cannot easily be quantified, it does not matter. [1626]
When we can’t fit a square peg into a round hole, we’ll usually blame the peg — when sometimes it’s the rigidity of our thinking that accounts for our failure to accommodate it. Our first instinct is to place information into categories — usually a relatively small number of categories since they’ll be easier to keep track of. (Think of how the Census Bureau classifies people from hundreds of ethnic groups into just six racial categories or how thousands of artists are placed into a taxonomy of a few musical genres.) [1808]
It’s not merely that there is no longer a signal amid the noise, but that the noise is being amplified. [2322]
The statistical reality of accuracy isn’t necessarily the governing paradigm when it comes to commercial weather forecasting. It’s more the perception of accuracy that adds value in the eyes of the consumer. [2326]
Forecasts “add value” by subtracting accuracy. [2335]
“With four parameters I can fit an elephant,” the mathematician John von Neumann once said of this problem. “And with five I can make him wiggle his trunk.” [2865]
The government produces data on literally 45,000 economic indicators each year. Private data providers track as many as four million statistics. The temptation that some economists succumb to is to put all this data into a blender and claim that the resulting gruel is haute cuisine. There have been only eleven recessions since the end of World War II. If you have a statistical model that seeks to explain eleven outputs but has to choose from among four million inputs to do so, many of the relationships it identifies are going to be spurious. [3127]
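A small illustration of the point (mine, not Silver’s, with invented numbers – 1,000 random indicators instead of four million, to keep it fast): with only a handful of recession quarters and a large pool of purely random candidate indicators, some indicators will appear to track the recessions by chance alone. A minimal Python sketch:

```python
# Minimal sketch (not from the book): purely random "indicators" that
# nevertheless correlate with a short recession series by chance.
import numpy as np

rng = np.random.default_rng(0)

n_quarters = 260            # ~65 years of quarterly data (assumption)
n_recessions = 11           # recessions since World War II, per Silver
n_indicators = 1000         # stand-in for the "four million" statistics

# Fake outcome series: 11 recession quarters scattered at random.
recession = np.zeros(n_quarters, dtype=bool)
recession[rng.choice(n_quarters, size=n_recessions, replace=False)] = True

# Fake indicators: pure noise, no relationship to the outcome at all.
indicators = rng.normal(size=(n_indicators, n_quarters))

# Correlation of each noise indicator with the recession dummy.
y = recession.astype(float)
y = (y - y.mean()) / y.std()
x = (indicators - indicators.mean(axis=1, keepdims=True)) / indicators.std(axis=1, keepdims=True)
corr = x @ y / n_quarters

print("strongest spurious correlation:", np.abs(corr).max().round(3))
print("indicators with |corr| > 0.15 :", int((np.abs(corr) > 0.15).sum()))
```

Nothing in this data is real, yet a modeller free to pick among the candidates will always find a few that look impressively predictive in-sample – which is exactly the gruel-as-haute-cuisine trap the quotation describes.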
But in fact real management is mostly about managing coalitions, maintaining support for a project so it doesn’t evaporate. [3421]
[…] sophisticatedly simple. [3836]
This is why our predictions may be more prone to failure in the era of Big Data. […] Most of the data is just noise, as most of the universe is filled with empty space. [4258-4266]
We can think of these simplifications as “models,” but heuristics is the preferred term in the study of computer programming and human decision making. It comes from the same Greek root word from which we derive eureka. A heuristic approach to problem solving consists of employing rules of thumb when a deterministic solution to a problem is beyond our practical capacities. [4542]
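As a generic illustration of the distinction (my example, not Silver’s): a greedy rule of thumb for the travelling-salesman problem gives an answer instantly, while the deterministic exhaustive search quickly becomes impractical as the number of cities grows. A minimal Python sketch:

```python
# Minimal sketch (my example, not from the book): a greedy "rule of thumb"
# for the travelling-salesman problem vs. exhaustive (deterministic) search.
from itertools import permutations
import math, random

random.seed(1)
cities = [(random.random(), random.random()) for _ in range(9)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_length(order):
    return sum(dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

# Heuristic: always hop to the nearest city not yet visited.
def nearest_neighbour(start=0):
    unvisited = set(range(len(cities))) - {start}
    order = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda c: dist(cities[order[-1]], cities[c]))
        order.append(nxt)
        unvisited.remove(nxt)
    return order

# Deterministic solution: try every possible ordering (factorial cost).
best = min(permutations(range(1, len(cities))), key=lambda p: tour_length((0,) + p))

print("heuristic tour :", round(tour_length(nearest_neighbour()), 3))
print("optimal tour   :", round(tour_length((0,) + best), 3))
```

The greedy rule is usually close to, but not exactly, the optimum; past a few dozen cities the exhaustive search stops being a practical option at all, which is the trade-off the quotation has in mind.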
In many ways, we are our greatest technological constraint. The slow and steady march of human evolution has fallen out of step with technological progress: evolution occurs on millennial time scales, whereas processing power doubles roughly every other year. [4947]
[…] it is not really “artificial” intelligence if a human designed the artifice. [4972. Speaking of fibers, it is customary to distinguish between artificial fibers – those obtained by modifying natural fibers, as with viscose – and synthetic fibers – those obtained by synthesis from hydrocarbons]
As Arthur Conan Doyle once said, “Once you eliminate the impossible, whatever remains, no matter how improbable, must be the truth.” This is sound logic, but we have a lot of trouble distinguishing the impossible from the highly improbable and sometimes get in trouble when we try to make too fine a distinction. [5196: a new and stimulating take on a quotation that has been a favorite of mine for years. The line is spoken by Sherlock Holmes]
In the United States, we live in a very results-oriented society. If someone is rich or famous or beautiful, we tend to think they deserve to be those things. Often, in fact, these factors are self-reinforcing: making money begets more opportunities to make money; being famous provides someone with more ways to leverage their celebrity; standards of beauty may change with the look of a Hollywood starlet. [5519]
Smith’s “invisible hand” might be thought of as a Bayesian process, in which prices are gradually updated in response to changes in supply and demand, eventually reaching some equilibrium. Or, Bayesian reasoning might be thought of as an “invisible hand” wherein we gradually update and improve our beliefs as we debate our ideas, sometimes placing bets on them when we can’t agree. Both are consensus-seeking processes that take advantage of the wisdom of crowds. [5609]
“The market can stay irrational longer than you can stay solvent.” [6066: Keynes again]
[…] “the fight between order and disorder,” [6202: the phrase is Didier Sornette’s]
We could try to legislate our way out of the problem, but that can get tricky. If greater regulation might be called for in some cases, constraints on short-selling — which make it harder to pop bubbles — are almost certainly counter-productive. [6218]
CO2 quickly circulates around the planet: emissions from a diesel truck in Qingdao will eventually affect the climate in Quito. [6285]
Climate refers to the long-term equilibriums that the planet achieves; weather describes short-term deviations from it. [6501]
Uncertainty in forecasts is not necessarily a reason not to act — the Yale economist William Nordhaus has argued instead that it is precisely the uncertainty in climate forecasts that compels action […] [6716]
When we advance more confident claims and they fail to come to fruition, this constitutes much more powerful evidence against our hypothesis. We can’t really blame anyone for losing faith in our forecasts when this occurs; they are making the correct inference under Bayesian logic. [6855]
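A small worked example of that Bayesian inference (my numbers, not Silver’s): start out 50/50 on whether a forecaster is reliable, where a reliable forecaster’s confident calls come true 95% of the time and an unreliable one’s only half the time. One failed confident call should then move you much further than one failed hedged call. A minimal Python sketch:

```python
# Minimal sketch with invented numbers: how much a failed prediction should
# shift our belief that the forecaster is reliable, under Bayes' theorem.
def posterior_reliable(prior, p_fail_if_reliable, p_fail_if_unreliable):
    """P(reliable | prediction failed)."""
    num = prior * p_fail_if_reliable
    den = num + (1 - prior) * p_fail_if_unreliable
    return num / den

prior = 0.5  # we start out agnostic about the forecaster

# A confident claim ("95% sure") that fails: rare if reliable, common if not.
confident = posterior_reliable(prior, p_fail_if_reliable=0.05, p_fail_if_unreliable=0.50)

# A hedged claim ("60% sure") that fails: only mildly surprising either way.
hedged = posterior_reliable(prior, p_fail_if_reliable=0.40, p_fail_if_unreliable=0.50)

print(f"after a failed confident claim: P(reliable) = {confident:.2f}")  # ~0.09
print(f"after a failed hedged claim:    P(reliable) = {hedged:.2f}")     # ~0.44
```

The confident miss drags the posterior from 0.50 down to about 0.09, while the hedged miss barely moves it – the “correct inference” the quotation credits the audience with making.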
The fundamental dilemma faced by climatologists is that global warming is a long-term problem that might require a near-term solution. [6864]
In science, progress is possible. In fact, if one believes in Bayes’s theorem, scientific progress is inevitable as predictions are made and as beliefs are tested and refined. […]
In politics, by contrast, we seem to be growing ever further away from consensus. The amount of polarization between the two parties in the United States House, which had narrowed from the New Deal through the 1970s, had grown by 2011 to be the worst that it had been in at least a century. […]
The dysfunctional state of the American political system is the best reason to be pessimistic about our country’s future. Our scientific and technological prowess is the best reason to be optimistic. [6913-6930]
To Wohlstetter, a signal is a piece of evidence that tells us something useful about our enemy’s intentions; this book thinks of a signal as an indication of the underlying truth behind a statistical or predictive problem. Wohlstetter’s definition of noise is subtly different too. Whereas I tend to use noise to mean random patterns that might easily be mistaken for signals, Wohlstetter uses it to mean the sound produced by competing signals. [6999]
[…] signal detection vs. signal analysis [7023]
There is a tendency in our planning to confuse the unfamiliar with the improbable. The contingency we have not considered seriously looks strange; what looks strange is thought improbable; what is improbable need not be considered seriously. [7035: è una citazione di Thomas Schelling]
[T]here are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns—there are things we do not know we don’t know.—Donald Rumsfeld [7060]
[…] detecting a terror plot is much more difficult than finding a needle in a haystack, he said, and more analogous to finding one particular needle in a pile full of needle parts. [7177]
The more often you are willing to test your ideas, the sooner you can begin to avoid these problems and learn from your mistakes. Staring at the ocean and waiting for a flash of insight is how ideas are generated in the movies. In the real world, they rarely come when you are standing in place. Nor do the “big” ideas necessarily start out that way. It’s more often with small, incremental, and sometimes even accidental steps that we make progress. [7593]
Sanford J. Grossman and Joseph E. Stiglitz, “On the Impossibility of Informationally Efficient Markets,” American Economic Review, 70, 3 (June 1980), pp. 393–408. http://www.math.ku.dk/kurser/2003-1/invfin/GrossmanStiglitz.pdf. [9526]
A conspiracy theory might be thought of as the laziest form of signal analysis. As the Harvard professor H. L. “Skip” Gates says, “Conspiracy theories are an irresistible labor-saving device in the face of complexity.” [11798]