Charlie bit my cognitive surplus

“You can say this for the technological revolution; it’s cut way down on television.” So writes Rebecca Christian in a column for the Telegraph Herald in Dubuque. She’s not alone in assuming that the increasing amount of time we devote to the web is reducing the time we spend watching TV. It’s a common assumption. And, like many common assumptions, it’s wrong. Despite the rise of digital media – or perhaps because of it – Americans are watching more TV than ever.

The Nielsen Company has been tracking media use for decades, and it reported last year that in the first quarter of 2009, the amount of time Americans spend watching TV hit its highest level ever – the average American was watching 156 hours and 24 minutes of TV a month. Now, Nielsen has come out with an update for the first quarter of 2010. Once again, TV viewing has hit a new record, with the average American now watching 158 hours and 25 minutes of TV a month, a gain of 2 hours in just the past twelve months. Although two-thirds of Americans now have broadband Internet access at home, TV viewing continues its seemingly inexorable rise.

And the Nielsen TV numbers actually understate our consumption of video programming, because the time we spend viewing video on our computers and cell phones is also going up. The average American with Internet access is now watching 3 hours and 10 minutes of video on Net-connected computers every month, Nielsen reports, and the average American with a video-capable cell phone is watching an additional 3 hours and 37 minutes of video on his or her phone every month. Not surprisingly, expanding people’s access to video programming increases their consumption of that programming. The spread of high-definition digital TVs and broadcasts appears to be another factor propelling TV viewing upward, says Nielsen.

What about the young? Surely, so-called “digital natives” are watching less TV, right? Nope. The young, too, continue to ratchet up their TV viewing. A recent study of media habits by Deloitte showed, in fact, that over the past year people in the 14-to-26 age bracket increased their TV watching by a greater percentage than any other age group. An extensive Kaiser Family Foundation study released earlier this year found that while young people appear to be spending a little less time in front of TV sets today than they did five years ago, that decline is offset by increased viewing of television programming on computers, cell phones, and iPods. Overall, “the proliferation of new ways to consume TV content has actually led to an increase of 38 minutes of daily TV consumption” by the young, reports Kaiser. Nielsen, too, finds that TV viewing continues to rise among children, teens, and young adults.

What about the rise of amateur media production, abetted by sites like YouTube? That trend, at least, must be shifting us away from media consumption. Wrong again. As Bradley Bloch explained in a recent Huffington Post article, the ease with which amateur media productions can be distributed online actually has the paradoxical effect of increasing people’s media consumption even more than it increases their media production. “Even if we count posting a LOLcat as a creative act,” observes Bloch, “there are many more people looking at LOLcats than there are creating them.” Bloch runs the numbers on one oft-viewed YouTube entertainment: “One of the most popular videos on YouTube, ‘Charlie bit my finger – again!’ depicting a boy sticking his fingers in his little brother’s mouth, has been viewed 211 million times. Something that took 56 seconds to create – and which was only intended to be seen by the boys’ godfather – has sucked up the equivalent of 1600 people working 40 hours a week for a year. Now that’s leverage.” By giving us easy and free access to millions of short-form video programs, the web allows us to cram ever more video-viewing into the nooks and crannies of our daily lives.
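Bloch’s figure is easy to verify with a little arithmetic. Here’s a quick sketch using only the numbers quoted above (211 million views, a 56-second clip, and a 40-hour work week over 52 weeks):

```python
# Back-of-the-envelope check of Bloch's figure: the total time spent
# watching "Charlie bit my finger - again!" expressed in work-years.
views = 211_000_000        # view count quoted by Bloch
clip_seconds = 56          # length of the clip
work_year_hours = 40 * 52  # 40 hours/week for a year (2,080 hours)

total_hours = views * clip_seconds / 3600
work_years = total_hours / work_year_hours

print(f"{total_hours:,.0f} hours, or about {work_years:,.0f} work-years")
```

The result comes out to roughly 1,600 person-years of viewing, which matches Bloch’s claim.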

To give an honest accounting of the effects of the Net on media consumption, you need to add the amount of time that people spend consuming web media to the amount of time they already spend consuming TV and other traditional media. Once you do that, it becomes clear that the arrival of the web has not reduced the time people spend consuming media but increased it substantially. As consumption-oriented Internet devices, like the iPad, grow more popular, we will likely see an even greater growth in media consumption. The web, in other words, marks a continuation of a long-term cultural trend, not a reversal of it.

Take it away, Charlie:

11 thoughts on “Charlie bit my cognitive surplus”

  1. tomslee

    Spot on, and good references.

    I particularly like the Bradley Bloch point: there is an accounting identity (or conservation law?) that production * audience = consumption, and it follows that if we had more people producing more stuff and reaching bigger audiences than ever before, we would necessarily have a much larger amount of consumption.

  2. William

    The rise in hours people are watching TV coincides with a large downturn in the economy. It’s possible that folks can’t afford to go out as much so they take the cheaper option and watch TV and surf the net more. So the numbers for watching TV would increase along with the web surfing.

  3. John Schoettler

    There appears to be a long and constant historical pattern of individuals spending an ever increasing amount of time consuming TV/video, regardless of mild deviations tied to economic booms and busts. Video is simply a form of media that is easily consumed by the masses, regardless of differences in education and social status.

    Rome had its bread and circuses, while we have fast food joints on every corner and unlimited video on demand whenever and wherever we want. I don’t think this is any government conspiracy; it’s only that the natural direction of things is the path of least resistance, and that easy path is usually ‘down’ if left unchallenged.

  4. Hendrik Dacquin

    While I agree with the general idea of the post, I find the comparison of the time wasted on a silly video with the time spent by 1600 people working 40 hours a week for a year tendentious.

    Like the recent study that claimed Google’s Pac-Man game ate five million hours of work time, it compares two different, incompatible time perspectives.

    Since when are hours of work an objective measurement of time? Since Frederick Winslow Taylor? Isn’t it this mindset that ratifies the idea that skimming the web and absorbing information overload is actually an improvement over the slow, contemplative task of deep reading and thinking?

    It’s not the hours of work that should be measured; it’s the balance between two time perspectives: the delta between future-oriented book reading and the present-hedonistic fun of watching a clip of finger-eating babies. If they are balanced, I don’t see much harm.

  5. Brian Quass

    I think it’s interesting to notice how much of the TV that’s increasingly watched these days is itself of the “Charlie Bit My Finger” variety, which is to say “reality TV.” There’s a topic for another book, perhaps, Nicholas: the philosophical and cultural significance of the TV viewer’s increasing preference for voyeuristic programming over old-school, script-based programming, and how that relates (as cause, effect, etc.) to the impatient Netizen zeitgeist that you so excellently described in “The Shallows.”

  6. Kroberts39

    Best blog post title ever. This idea of “cognitive surplus” is something Harold Hill might have bottled and sold to the trusting townsfolk. “Step right up, folks! You too can be a producer of content on the internets! No more passive consuming of sitcoms and novels and history books and mathematics for us–no sir! With just one of these here bottles, and for the one time low, low price of…”

  7. Gavin Heaton

    All good points – but for the vast majority of us, “watching TV” equates to “watching the device called TV”. What we are seeing is a shift in the consumptive behaviour of individuals – moving from one device which is broadcast oriented to another which can be/is multicast in nature. So yes, I see your points, but I think the transformation under way runs deeper than the statistics suggest.

  8. Seth Finkelstein

    Excellent return to your pre-“Stupid/Shallows” material. It’s good to see this stuff getting some attention, amidst the huckstering Kelly Roberts details above.

  9. Jeanne

    Yes, time spent viewing TV on any platform far exceeds time spent using computers for other purposes.

    Having just finished reading The Shallows, I’m wondering whether neuroscientists have explored whether television viewing changes our brains. Similar arguments made about the Net were also made about television back in the 60s before brain imaging technology (e.g., Sesame Street’s chunked episodes and “rapid” pacing).

    (I only know of one brain study of children’s brains while watching TV violence.)

  10. John Schoettler

    It’s rather ironic that I’m suggesting a link to a video, but has everyone here already seen the new 2011 Jeep commercial with the very ‘Marshall McLuhan’-like statement at the end saying “The Things We Make, Make Us”?

    If not, you can see it @

Comments are closed.