Cognitive surplus watch

Thanks to the Internet, Americans are devoting less of their free time to watching television and more to creating socially useful stuff.

As if.

A year ago, the Nielsen Company reported that Americans’ TV viewing hit an all-time high in the first quarter of 2010, with the average person spending 158 hours and 25 minutes a month in front of the idiot box.* That record didn’t last long. Nielsen has released a new media-usage report, and it shows that in the first quarter of 2011, the average American watched TV for 158 hours and 47 minutes a month, up another 0.2 percent and, once again, a new all-time high.* Twenty years into the Web revolution, and we’re boob-tubier than ever.
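
For what it’s worth, Nielsen’s growth figure checks out. Here is a quick sanity check of the arithmetic (a minimal sketch in Python; the two monthly totals come from the Nielsen reports cited above, the rest is mine):

```python
# Year-over-year change in monthly TV viewing, per the Nielsen figures above.
q1_2010 = 158 * 60 + 25   # Q1 2010: 158 hours 25 minutes, in minutes
q1_2011 = 158 * 60 + 47   # Q1 2011: 158 hours 47 minutes, in minutes

growth = (q1_2011 - q1_2010) / q1_2010
print(f"{growth:.1%}")    # prints 0.2% -- matching the increase quoted above
```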

But even that understates our video consumption. One of the Net’s big effects has been to free TV programming from the living room and the bedroom. We can now watch the tube through our laptops and smartphones 24/7 – at work, in restaurants, and while strolling down the street. And that’s just what we’re doing. In the first quarter of 2011, the average American watched 4 hours and 33 minutes of streaming video a month on a computer, up a whopping 34.5 percent from year-earlier levels. That same average American watched an additional 4 hours and 20 minutes of video on a mobile phone, up 20 percent from Q1 2010. You no longer need a couch to be a couch potato.
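
Those growth rates also let you back out the implied year-earlier levels (again a minimal sketch; it assumes nothing beyond the Q1 2011 figures and percentages quoted above):

```python
# Implied Q1 2010 baselines, backed out from the Q1 2011 figures and growth rates.
streaming_2011 = 4 * 60 + 33   # streaming video: 4h 33m a month, in minutes
mobile_2011 = 4 * 60 + 20      # mobile video: 4h 20m a month, in minutes

print(f"Q1 2010 streaming: {streaming_2011 / 1.345:.0f} min/month")  # ~203 min (3h 23m)
print(f"Q1 2010 mobile:    {mobile_2011 / 1.20:.0f} min/month")      # ~217 min (3h 37m)
```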

Of course, we’re doing more on the Net than just watching video. We’re also playing Angry Birds. As of the start of this year, human beings were devoting 200 million minutes a day to playing the addictive computer game. That works out to 1.2 billion hours of our collective annual cognitive surplus.*
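
The conversion from daily minutes to annual hours is simple arithmetic (a minimal sketch; the only input is the 200-million-minutes-a-day figure cited above):

```python
# Angry Birds play time, converted from minutes per day to hours per year.
daily_minutes = 200_000_000               # reported minutes played per day, worldwide
annual_hours = daily_minutes * 365 / 60   # minutes/day -> hours/year

print(f"{annual_hours / 1e9:.1f} billion hours a year")  # prints "1.2 billion hours a year"
```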

Bottom line: the more time we spend in front of media devices, the more time we fritter away. Shocked? Me neither.

Previous posts on this subject:

Gilligan’s Web

Charlie Bit My Cognitive Surplus

32 thoughts on “Cognitive surplus watch”

  1. Kelly Roberts

    Heretic! I spent 4 hours yesterday updating the Wikipedia entry on Star Wars Action Figures. You don’t think that’s useful?

    It’s called collaboration, smart guy. We don’t need your “expertise” anymore. Just because the hive mind is mostly analyzing episodes of The Family Guy and posting recipes for pot brownies doesn’t mean Twitter didn’t save Egypt.

    Get on board the peace (via networking technology) train!

  2. tomslee

    One of the Net’s big effects has been to free TV programming from the living room and the bedroom.

    And conversely, one of mobile computing’s big effects (smartphones, fondleslabs) has been to bring social media into the living room and to gather round the TV. Self-promotional link.

  3. Nick Carr

    Tom,

    Can you stick a webcam on top of your TV? I would enjoy watching you watching So You Think You Can Dance. It would also serve to transform you from a consumer to a prosumer. A win-win.

    Nick

  4. Curiouslyp

    Sure, TV/video consumption is increasing, but plenty of other statistics show that we’re spending more time multi-tasking. You may contend that people can’t effectively work/edit Wikipedia/create something while the TV is playing in the background, but this is no more distracting than, say, email.

    In short, the conclusion you draw from the data is an interesting hypothesis, but it is not conclusive.

    Simon

  5. Matgb

    Basically, what Tom said. I’m sat on my sofa watching my fiance and kid play Worms, while reading around the internet and discussing economics theory with people.

    I find these days I can’t *just* sit and watch TV – it bores me – so while I undoubtedly have the TV on a lot more these days, half the time it’s background to what I’m actually doing, which is trawling around the internet teaching myself things.

  6. Nick Carr

    Doing other stuff while watching TV is nothing new. Long before the Net came along, people would knit, crochet, chat with friends or family members, do small household repairs, do crossword puzzles, read newspapers and magazines, talk on the phone, fiddle with various hobbies and crafts, etc., while the TV played. Tweeting while watching So You Think You Can Dance is absolutely fine by me, but it doesn’t strike me as being intrinsically more valuable than any other type of diversion. What the statistics show is that, contrary to common assumptions and expectations, we watch more TV than ever before and, thanks to being connected to media networks during pretty much all our waking hours, we consume far more media entertainment than ever before. I think Tom Slee’s point that the Net is turning out to be more a complement to mass media than a substitute for it is on target.

  7. Matt Seigal

    As a Brit, I’m fascinated by how cognitive surplus fits in with the abundance of free time that teenagers have during the summer holidays to destroy and loot in English cities. Presumably, social media, BBM, and television inspired copycat vandals to be actively destructive.

  8. tomslee

    I would enjoy watching you watching So You Think You Can Dance.

    Believe me, you wouldn’t.

    Matt Seigal – All that Angry Birds playing is just a gateway drug for throwing bricks through shop windows.

  9. Seth Finkelstein

    You’ve got to read that “cognitive surplus” phrase with the right – well, I’m not exactly sure what the correct word is – something like tone or spin or accent. Imagine a venture capitalist huckster giving a presentation along the lines of “China’s an enormous country, if we could just get 1% of them to buy our product, just one percent of the China market, it’d be HUGE, we’d be rich, and with the Internet we have the COGNITIVE SURPLUS …”

    (i.e. this “1%” argument is a well-known business plan fallacy)

    The hype-mongers aren’t saying “People won’t waste time.” They’re saying “There’s a huge amount of time up for grabs, and if we could grab some of it, we’d make out like bandits.” Of course, it’s much more difficult than they present. But their own business is selling the concept and collecting that way.

  10. Designcomment.blogspot.com

    Nick,

    I haven’t gotten around to reading Clay Shirky’s Cognitive Surplus yet. I still have great works like Don Tapscott’s Wikinomics sitting unread in boxes today. Nor did I read your blog during the period when Shirky’s latest book was being launched on the market, so I would have to backtrack through your site to find any Shirky-centric postings of yours.

    However, straight off the bat, I can see you are having the same knee-jerk reaction to Shirky’s work as you have when you speak about artists and creativity. The thing to bear in mind is to quit thinking in terms of Alvin Toffler- or Nicholas Negroponte-style 1980s prosumer individuals. The definition should move on from that original one to incorporate some basic understanding of groups, teams, crowds, mobs, etc.

    Fred Brooks, author of The Mythical Man-Month, commented several years ago in response to Eric S. Raymond of the open-source community on the subject of its production model and problem-solving approach – you know the one: given enough eyeballs, all bugs are shallow. Brooks posed the question: would the open-source approach be as useful for a design problem whose participants didn’t understand the nature of the problem as it is for a dispersed group of software professionals solving a software problem together?

    The other thing I would add is that the reason the open-source community functions as well as it does, solving difficult problems efficiently, is that from undergraduate level upwards software engineers work with collaboration-friendly, text-based programming tools. I’ve had several discussions on this topic lately with people from the software-engineering field, and they all told me the same thing: software engineers collaborate so willingly because their role models in the industry work on projects in a collaborative, largely distributed fashion.

    They also mentioned that the collaborative assignment-completion tools that have been available to young software-engineering undergraduates for several years now cannot cope with (wait for it) sophisticated file types such as Word documents. That’s meant to be a geek joke: for the most basic file format on *individual* personal computer platforms today, the collaborative tools simply aren’t out there in 2011. At this point, I would prefer to have someone such as Kevin Kelly enter the fray and tell us all again that personal computers were pretty dumb – until they started tying together via communications networks.

    Well, guess what: from the point of view of the prosumer at the most basic level in 2011, personal computer platforms (despite tablets, iPhones, netbooks, readers, blah, blah) are still as dumb as they ever were when Kevin Kelly started up Wired magazine.

  11. Designcomment.blogspot.com

    Nick,

    No television in the home whatsoever. I used to watch a lot of TV on the computer until a year ago, and stopped because I got back to a dissertation, etc. But still, I think you’ve sort of caught me out.

    The notebook computer is used as a DVD player. In the past year, I’ve gotten into a DVD box-set habit. It’s low-energy media consumption, and it fits in with the academic study. I knew folk who were box-set fanatics (in the Steven Berlin Johnson, things-that-make-us-dumb-make-us-smarter sense) several years ago. I never thought I would fall for it myself.

    But a couple of TV series got me interested. I knew about Johnson’s theory on the plot lines involved in TV series today – as opposed to the isolated, unrelated episodes of olden days – and I’ve found there is something to it. I’ve watched The West Wing’s seven seasons about three times in full in the past year: once forwards, then backwards, then forwards again.

    You get a bit of re-use out of these box sets if you put in the effort to follow the higher-level plot lines. I would probably find the same with games, but I haven’t had the patience to play them.

    Btw, your blog site is generally entertaining and well written, and it provides a nice bit of brain stimulus too. Not too much, but certainly some. Keep up the work.

  12. Seth Finkelstein

    Designcomment, I lost you somewhere. All of project management is about how to “work on projects in collaborative, largely distributed fashion”. It’s not like this was suddenly invented when a bunch of professional conference-clubbers decided they were going to sell the hype to their corporate clients. And Microsoft Word documents are a massive pain. It’s a complicated format that gets in the way of the simple text of programming.

  13. Designcomment.blogspot.com

    Seth,

    I don’t mean the Word application for programming. I meant Word for doing reports between multiple project participants. Nobody is doing that, despite Word now working on all platforms big and small, a lot of people owning several devices, and lots of people working on reports.

    It’s a big discussion, and I don’t want to crowd Nick’s comment thread out with an excess of paragraphs on this. In fairness to Nick, his original blog entry reads fine to me, and I agree with an awful lot of what he said. But I didn’t want to leave out the other half of the picture either.

    I’ve detected a pattern in his treatment of many topics to do with creation and production – he struggles to separate himself from the centuries-old notion of the romantic individual author. It’s not a technological brick wall; it’s a problem to do with our need to create superstars. Most industries, including the art industry, are about star creation. It’s been rampant in architecture for years now. Star-itects, they call them.

    What people don’t realise is that there is a vast collaborative effort happening behind the curtain in order to hoist the Star-itect up there in front of the audience. All that Richard M. Stallman, the open-source movement, and the wikinomics people are trying to do is move what already exists in the background into the foreground. It will not alter what already fundamentally happens; it will only present it differently, minus the ‘star’ individual.

    Reference: the iSchool at Berkeley. It takes ‘back room’ server administrators and teaches them how to become foreground figures, with a basic knowledge of the humanities. Or vice versa: humanities graduates learn about the engine room of it all. Apologies, Nick, for the length. As I said, this isn’t the place to open up this tin of worms.

  14. Nick Carr

    Despite the readily available statistics, the myth that internet use displaces TV viewing continues to spread. I just read this in a Mother Jones piece by Kevin Drum: “I always thought the anti-TV crowd at least had a point: television really did crowd out things like books and magazines, which were better suited to big ideas and complex arguments than the tube. But social networking? As near as I can tell, it’s mostly crowding out in-person gossip and….television. That seems like a much more benign trade.” As near as you can tell isn’t near enough.

  15. Jlmannisto

    I find these days I can’t *just* sit and watch TV – it bores me – so while I undoubtedly have the TV on a lot more these days, half the time it’s background to what I’m actually doing, which is trawling around the internet teaching myself things.

    I actually never used to be much of a TV watcher. Especially in recent years, my brain has been too jumpy to enjoy TV and movies that go on without me when my mind wanders. I like books partly because they don’t do that.

    But this summer (due to the Internet configuration — wired only — in the apartment where I’m living temporarily) I have ended up watching an hour of TV every other night or so, with no Internet anywhere in sight, and I’m starting to think that maybe it might be a good exercise in focus, to counter the way the Internet works with my brain.

    Weird…I never thought I’d be making an argument that TV is good for my brain, but (though I have no solid data on this) it seems like in this case, it might have been! I also might, maybe, be getting a little better at focusing on the books I read.

    ~Jessie Mannisto

  16. Kelly Roberts

    Jessie has a point. If *just* watching TV bores you, then why watch TV at all? It reminds me of that Seinfeld episode where George tries to combine sex, eating and TV watching (to the supreme disappointment of his ladyfriend).

    Internet use tends to displace not TV watching but attentive TV watching.

  17. Designcomment.blogspot.com

    Nick,

    We went over this issue not so long ago, as I recall. I remember writing something here about the architect who banned members of his staff from opening up a digital drawing on the screen when they received a phone call from a building-site operative and were talking over a design matter. I made the point that in the digital era, large-format paper output devices will fold all of the drawings for you. All that people do is have an assistant put the paper drawings in an envelope and write an address on it. Few people in a drawing office look at a paper drawing any more. Yet the fellow who receives the envelope looks at the paper drawing, and is very aware of the paper output.

    What we have in the modern world is certain parts of the process sped up, and others not at all. The older architect said to me: we moved to digital and we lost more than we gained. We gained a faster means to output information, but what we lost was a common medium through which everyone could agree and communicate.

    When we talk about the ‘benign trade’ or otherwise, we should think about these issues also. It’s like what you said about ‘search’, and people no longer using parts of their brains. What happens when architects no longer look at paper drawings, but still expect others to interpret what they produce? Or, further still, when the (digital) information production is outsourced to another continent, and the guy looking at the paper item has to phone that other continent? That’s happening already. How much is being lost in that equation, I wonder?

    Hal Varian of Google spoke about this at the iSchool in October 2009, as if it were a wonderful new development in collaborative working. He’s a smart fellow, Hal, and makes some great points in the podcast. I’d love to challenge him on a few of them, though.

    http://www.ischool.berkeley.edu/newsandevents/audiovideo?page=3

  18. Brutus.wordpress.com

    Nick’s post is all about quantity, but everyone seems to really want to talk about quality. The story told by the numbers is not the same as that told by the anecdotes. In our consumer society, we’re probably driven more by quantification (even if it feels like quality), especially the numbers representing hours spent allowing advertisers to infiltrate our minds and then the buying habits that flow from that exposure. And an inchoate fear of silence, inactivity, and boredom keeps the cycle churning. We gotta consume something, right?

    I’m seriously not buying the argument that electronic and social networking is freedom or a vehicle to achieving it. Nor am I convinced by the hipster admonition to get with it and embrace the crowd (group, team, mob, etc.). That’s just demographics run amok, another quantification. There probably is something to the notion that individual authorship is eroding in favor of collaboration, but that might be something to regard warily rather than effusively.

  19. Designcomment.blogspot.com

    Brutus,

    Very well considered post. Thanks.

    Having used a combination of Amazon for shopping lately and the Chrome web browser from Google, I am shocked at how many targeted advertising banners I see nowadays. I purchased a DVI-to-VGA adapter for €6.00 recently, and everywhere I go on the web now, I seem to see those things in banner ads! Honestly, I only needed the one. I was so innocent that at first I thought it was only a coincidence – until about the 100th time. Nick covers this very well in a certain chapter of his book, The Big Switch.

    If you really want to discover the amount of advertising you consume while watching the average movie on television, I recommend getting a digital video recorder some time and going back to delete the advertisements from what you record. Some stations aren’t too bad, but I was amazed to find myself sometimes deleting over five minutes’ worth of advertisements at a time – lots of those – in one 90-minute feature-length movie. Basically, the advertisements add almost half an hour onto the viewing time. I know, because I was trying to fit the recording onto a DVD disc to watch later on the laptop. At least the digital video recorder doesn’t follow you around with €6.00 DVI-to-VGA adapter special offers. But that is only a matter of time too.

  20. Stewart Dinnage

    What evidence is there that video is a poor medium in comparison to others?

    Comment is always free and, in Nick’s case, welcomed – he is clearly a great writer. That notwithstanding, the comments still seem highly opinionated and based on seriously blunt stats.

    On another note:

    http://www.guardian.co.uk/science/the-lay-scientist/2011/aug/08/1

    Susan Greenfield actually quoted Nick’s book as first in her list of “evidence” of the effect of technology on the brain (ironic, as Nick seems to quote her extensively – a circle of opinion?).

    On the matter of autism and technology, brought up by Greenfield, I find this post by Dorothy Bishop interesting:

    http://deevybee.blogspot.com/2011/08/open-letter-to-baroness-susan.html

    My thoughts: if you bill yourself as a scientist, try doing some science. If you are a writer of popular non-fiction and you suggest your work has a strong scientific foundation, work hard on researching that – maybe even define a hypothesis…

    Otherwise, how are these comments/books/products, etc., any different from the hyperlinked ramblings of the masses being poked fun at?

    I have no insight into the effect of technological distraction on the brains of humans. That said, I’m pretty sure I’d have found it difficult to call out or discuss anything with my favourite authors/directors and scientists of the past in the way I’m happy to here!

    I’m not saying the questions posed by Susan and Nick are wrong, even if they seem a little unintelligible; I’m just a little tired of the implied message, i.e. that the words shared are more than opinion!

  21. Nick Carr

    Susan Greenfield actually quoted Nick’s book as first in her list of “evidence” of the effect of technology on the brain (ironic, as Nick seems to quote her extensively – a circle of opinion?).

    Stewart: Are you making that up, or do you have some “evidence”? I don’t believe I quote Baroness Greenfield at all in my book. But if I did, how exactly would that be “ironic”?

  22. Stewart Dinnage

    Firstly, a massive apology: Susan (Baroness) Greenfield is not to be confused with Patricia Greenfield. As I understand it, Patricia’s 2009 meta-study/review is a major component of the science considered in your book.

    Again, my absolute apologies for confusing these two Greenfields.

    You can find Susan Greenfield discussing “Mind Change” as she calls it here: http://www.guardian.co.uk/commentisfree/video/2011/aug/15/susan-greenfield-video?INTCMP=SRCH

    Your book is first on the list of evidence, at 3:19.

    There is probably still reason to find this ironic. A neuroscientist is challenging the criticism that her stated opinions have no basis in evidence; rather than point to directly related, peer-reviewed and respected studies that back up her statements (whatever “Mind Change” is actually stating, which seems unclear), she mentions The Shallows first on the list. To my knowledge, you haven’t claimed to be the originator of scientific evidence in this field.

    Two things stand out to me. First, in Susan Greenfield’s case (I’m not sure about Patricia’s), games (which seems to mean computer/screen games) have become something whose effects you can test across the board. Are all games alike? Does killing zombies = Crayon Physics (Humble Indie Bundle)? Is a virtual game of Monopoly dangerously inconsequential, whilst on a board it is a valuable learning experience? It seems to me that the level of the debate, here and elsewhere, is currently rather black or white. Secondly, Susan Greenfield’s statements on autism and screen technology (which go beyond what was said in the video) seem particularly unlikely and have been very poorly received by scientists who actually work in that field.

    The increase in ADHD she mentions is also interesting. Tech news outlets reported this: http://www.sciencedaily.com/releases/2010/08/100817103342.htm

    It suggests a massive misdiagnosis issue based on where older and younger pupils fall within the school year. Sir Ken Robinson/the RSA also has a little segment on ADHD here:

    http://www.youtube.com/watch?v=zDZFcDGpL4U

    At about 3:40 (although the whole video is quite interesting).

    As I’ve said, I don’t have the background or skills here, and I agree that asking the questions is not inherently a bad idea. I will say that if you want to suggest “science” underpins what you are saying, then you need to say something scientific: a hypothesis, some test for evidence (either way), repeatability, even openness – the things good scientists actually do! Otherwise, isn’t it just opinion?
