A little more signal, a lot more noise

I don’t fully understand this excerpt from Nassim Nicholas Taleb’s forthcoming book Antifragile, but I found this bit intriguing:

The more frequently you look at data, the more noise you are disproportionately likely to get (rather than the valuable part called the signal); hence the higher the noise-to-signal ratio. And there is a confusion that is not psychological at all, but inherent in the data itself. Say you look at information on a yearly basis, for stock prices or the fertilizer sales of your father-in-law’s factory, or inflation numbers in Vladivostok. Assume further that for what you are observing, at the yearly frequency the ratio of signal to noise is about one to one (say half noise, half signal) — it means that about half of changes are real improvements or degradations, the other half comes from randomness. This ratio is what you get from yearly observations. But if you look at the very same data on a daily basis, the composition would change to 95% noise, 5% signal. And if you observe data on an hourly basis, as people immersed in the news and market price variations do, the split becomes 99.5% noise to .5% signal. That is two hundred times more noise than signal — which is why anyone who listens to news (except when very, very significant events take place) is one step below sucker. … Now let’s add the psychological to this: we are not made to understand the point, so we overreact emotionally to noise. The best solution is to only look at very large changes in data or conditions, never small ones.

I’ve long suspected, based on observations of myself as well as of society, that beyond the psychological and cognitive strains produced by what we call information overload, there is a point in intellectual inquiry when adding more information decreases understanding rather than increasing it. Taleb’s observation that the amount of noise we take in expands more quickly than the amount of signal as the frequency of sampling increases might help explain the phenomenon, particularly if human understanding hinges at least as much on the noise-to-signal ratio of the information we take in as on the absolute amount of signal we’re exposed to. Because we humans seem to be natural-born signal hunters, we’re terrible at regulating our intake of information. We’ll consume a ton of noise if we sense we may discover an added ounce of signal. So our instinct is at war with our capacity for making sense.
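
Taleb’s percentages follow from a simple scaling argument: if the underlying quantity drifts steadily while its random fluctuations grow only with the square root of the observation interval, then shortening the interval shrinks the signal linearly but the noise only as a square root. Here is a minimal sketch of that arithmetic; the calibration that yearly observations are half signal, half noise comes from Taleb’s example, while the drift-plus-random-walk model and its parameters are my own illustrative assumptions:

```python
# Noise share of observed changes at different sampling frequencies,
# under an assumed drift-plus-random-walk model calibrated so that
# yearly changes are roughly half signal, half noise.
mu = 1.0     # drift per year: the "signal" component of a change
sigma = 1.0  # volatility per sqrt(year): the "noise" component

for label, dt in [("yearly", 1.0), ("daily", 1 / 365), ("hourly", 1 / 8760)]:
    signal = mu * dt           # expected real change over one interval
    noise = sigma * dt ** 0.5  # typical random change over one interval
    share = noise / (signal + noise)
    print(f"{label:>6}: {share:.1%} noise, {1 - share:.1%} signal")
```

Run, this prints roughly 50% noise for yearly, 95% for daily, and 99% for hourly observations, close to Taleb’s figures.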

If this is indeed a problem, it’s not an isolated one. We have a general tendency to believe that if x amount of something is good, then 2x must be better. This leads, for instance, to a steady increase in the average portion size of a soft drink – until the negative effects on health become so apparent that they’re impossible to ignore. Even then, though, it remains difficult to moderate our personal behavior. When given the choice, we continue to order the Big Gulps.

7 thoughts on “A little more signal, a lot more noise”

  1. JohnDCook

    This is related to what statisticians call multiple testing: the more questions you ask, the more likely you are to see a false positive. When you look at it from one angle, this makes perfect sense. But from another angle, it’s mind-boggling.

    When someone tells you it’s likely you’re fooling yourself because you’ve done multiple tests (without correcting for this statistically), you want to say “But I saw what I saw! How can you look at this data and tell me I’m not right?!”
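
    To make the arithmetic behind this concrete, here is a rough sketch. The 5% significance level is an assumed convention; the comment itself doesn’t specify one.

    ```python
    # Chance of at least one false positive across n independent tests
    # at significance level alpha, when no real effect exists.
    alpha = 0.05  # assumed conventional threshold

    for n in (1, 10, 20, 100):
        p_spurious = 1 - (1 - alpha) ** n
        print(f"{n:>3} tests: {p_spurious:.0%} chance of a spurious 'signal'")
    ```

    With twenty questions asked of the same noise, the odds of “seeing something” approach two in three.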

  2. Max Gano

    Excellent bit to highlight. An accomplished historian once explained to me that he seldom read the current news. Biased by fear, angst and myopia of the immediate perspective, it provides little reliable information. And that was twenty years ago.

  3. Drcoddwasright.blogspot.com

    The counterexamples: yearly (or quarterly) data is merely an averaging of daily data, and high-frequency trading (love it or hate it) works only because of data “overload”. In fact, as sample size increases, the variance of the calculated average decreases. Variance is where the noise lives.

    His assertion that more frequent data is inherently more noisy is laughed at by mathematical statisticians. And they know what’s going on.
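
    The standard result being invoked here: averaging n independent observations cuts the variance of the estimate by a factor of n. A minimal sketch, with purely illustrative parameters and no signal at all:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # 1,000 simulated "years", each of 365 pure-noise daily observations.
    daily = rng.normal(loc=0.0, scale=1.0, size=(1000, 365))

    print(np.var(daily))               # per-day variance: about 1.0
    print(np.var(daily.mean(axis=1)))  # variance of yearly averages: about 1/365
    ```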

  4. Nick Carr

    yearly (or quarterly) data is merely an averaging of daily data

    I see your point, and I recognize it’s an important one, but I don’t see that it necessarily contradicts Taleb’s point. There’s data-assembly (averaging, say) and there’s information-retrieval, and they’re not necessarily the same thing.

  5. Gsatell

    There’s an interesting story behind this. The concept comes from Benoit Mandelbrot, who first started working on noise in communications systems at IBM labs and noticed the pattern (which he called “Noah effects” and “Joseph effects”).

    Later he noticed that his charts looked exactly like cotton-market data, and he started studying markets. He took a lot of flak for arguing about the risks of financial engineering, and Taleb became a big fan.

    Most of Taleb’s ideas build on Mandelbrot, who died in 2010, soon after the financial collapse he had predicted in the 1960s actually occurred.

    – Greg

  6. Tom Lord

    Mandelbrot? Huh. I took the quote to be a kind of trite analogy to the sampling theorem (Shannon) and extensions of it to channels with noise.

    The problem with the application of the analogy to real world phenomena is that many real world phenomena aren’t usefully described by continuous functions of finite spectra. *That’s* where Mandelbrot comes in.
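
    For reference, the theorem in miniature: a band-limited tone sampled above twice its frequency can be reconstructed exactly, while one sampled below that rate masquerades as a slower tone. The frequencies below are arbitrary illustrative choices:

    ```python
    import numpy as np

    f = 5.0                           # a 5 Hz tone; its Nyquist rate is 10 Hz
    t_fast = np.arange(0, 1, 1 / 50)  # sampled at 50 Hz: above the Nyquist rate
    t_slow = np.arange(0, 1, 1 / 6)   # sampled at 6 Hz: below it

    x_fast = np.sin(2 * np.pi * f * t_fast)  # recoverable as the 5 Hz tone
    x_slow = np.sin(2 * np.pi * f * t_slow)  # same samples as a sign-flipped 1 Hz tone
    ```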

    Psychologically, I believe that when humans are confronted with a phenomenon that really is subject to the sampling theorem, we’re pretty much biologically programmed to get the sampling rate right. That’s part of the function of boredom. Even the most obsessive day traders are unlikely to keep double-checking the Dow’s closing value 100 times a night after hours, for example.

    It’s historically true that people sometimes go nuts staring at noise, but that seems to be the exception to the rule, in deep ways.

    There is an edge, though, where human cognition reliably fails for many: gambling against the stacked house.

    A stacked house (e.g., Vegas) stimulates with signals (payout events against chances taken) that are subject to the sampling theorem. And you know analytically it’s a sucker’s bet. But emotionally it’s so close to not being a sucker’s bet as to be just a little too compelling sometimes.
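
    The “analytically a sucker’s bet” part is just expected value; a tiny sketch with an invented house edge:

    ```python
    # Win 1 unit with probability p_win, lose 1 unit otherwise.
    p_win = 0.47  # illustrative house-favoring odds, not any real game's
    expected_value = p_win * 1 + (1 - p_win) * (-1)
    print(expected_value)  # about -0.06: six cents lost per dollar wagered, on average
    ```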

    I’m suspicious of this claim:

    Because we humans seem to be natural-born signal hunters, we’re terrible at regulating our intake of information. We’ll consume a ton of noise if we sense we may discover an added ounce of signal.

    I think our bug is more that we easily fall into the trap of consuming 1.01 ounces of noise for 1.00 ounces of signal, so to speak.

    Modern technology may indeed amplify our opportunities to fall into that trap.

  7. A J Marr

    Taleb’s position is well taken, but it nonetheless commits a category error by implicitly segregating rational from emotional faculties, when the two are in fact continuously and completely integrated in all decision-making.

    To illustrate: although the diminishing marginal utility of consuming information (checking the stock-market ticker, a social media account, or email every ten minutes) soon replaces salient information with mere ‘noise’, the perceived randomness of information (or the novel ways information is related) in itself has biological value. This is represented by the activity of midbrain dopamine systems, which add momentary or ‘decision’ utility to behavior by causing positive affect but do not predict the long-term utility of behavior. That is, the ‘gut feelings’ of positive affect predict nothing at all.

    This is best illustrated in the core metaphors we use to describe the unintended consequences of the information revolution. If, as we commonly believe, we are predominantly rational animals, then the negative or toxic results of the revolution are best represented by metaphors such as information overload, wherein our logical or reasoning circuitry is impaired by information of uniform and ever-growing utility or ‘goodness’. However, if we recognize that our emotions cause us to vastly overvalue the importance of information, then information delusion is the appropriate metaphor. That is, we are not overloaded by information. Rather, we are deluded by the integrated outcome of poor reasoning habits and affect into thinking that most information is useful, when it overwhelmingly is not. This of course changes the whole dynamic of the ‘information overload’ debate, since better information filters or information ‘diets’ are doomed to fail if we cannot recognize the fact that almost all the information we receive is just junk. In this regard, Taleb is spot on: we are indeed deluded creatures, not ‘overloaded’ ones!
