Cognitive surplus watch

Thanks to the Internet, Americans are devoting less of their free time to watching television and more to creating socially useful stuff.

As if.

A year ago, the Nielsen Company reported that Americans’ TV viewing hit an all-time high in the first quarter of 2010, with the average person spending 158 hours and 25 minutes a month in front of the idiot box.* That record didn’t last long. Nielsen has released a new media-usage report, and it shows that in the first quarter of 2011 the average American watched TV for 158 hours and 47 minutes a month, up another 0.2 percent and, once again, a new all-time high.* Twenty years into the Web revolution, and we’re boob-tubier than ever.

But even that understates our video consumption. One of the Net’s big effects has been to free TV programming from the living room and the bedroom. We can now watch the tube through our laptops and smartphones 24/7 – at work, in restaurants, and while strolling down the street. And that’s just what we’re doing. In the first quarter of 2011, the average American watched 4 hours and 33 minutes of streaming video a month on a computer, up a whopping 34.5 percent from year-earlier levels. That same average American watched an additional 4 hours and 20 minutes of video on a mobile phone, up 20 percent from Q1 2010. You no longer need a couch to be a couch potato.

Of course, we’re doing more on the Net than just watching video. We’re also playing Angry Birds. As of the start of this year, human beings were devoting 200 million minutes a day to playing the addictive computer game. That works out to 1.2 billion hours of our collective annual cognitive surplus.*
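
The arithmetic above is easy to verify. A quick back-of-the-envelope sketch (the viewing minutes and the 200-million-minutes-a-day figure are taken from the post itself; a 365-day year is assumed):

```python
# Nielsen monthly TV viewing, Q1 2010 vs. Q1 2011, in minutes.
q1_2010 = 158 * 60 + 25   # 158h25m
q1_2011 = 158 * 60 + 47   # 158h47m
growth = (q1_2011 - q1_2010) / q1_2010
print(f"TV viewing growth: {growth:.1%}")  # about 0.2%

# Angry Birds: 200 million minutes a day, annualized, in hours.
minutes_per_day = 200_000_000
hours_per_year = minutes_per_day * 365 / 60
print(f"Angry Birds: {hours_per_year / 1e9:.2f} billion hours a year")  # about 1.22
```

Both results line up with the 0.2 percent and the 1.2 billion hours quoted above.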

Bottom line: the more time we spend in front of media devices, the more time we fritter away. Shocked? Me neither.

Previous posts on this subject:

Gilligan’s Web

Charlie Bit My Cognitive Surplus


32 Responses to Cognitive surplus watch

  1. Heretic! I spent 4 hours yesterday updating the Wikipedia entry on Star Wars Action Figures. You don’t think that’s useful?

    It’s called collaboration, smart guy. We don’t need your “expertise” anymore. Just because the hive mind is mostly analyzing episodes of Family Guy and posting recipes for pot brownies doesn’t mean Twitter didn’t save Egypt.

    Get on board the peace (via networking technology) train!

  2. One of the Net’s big effects has been to free TV programming from the living room and the bedroom.

    And conversely, one of mobile computing’s big effects (smartphones, fondleslabs) has been to bring social media into the living room and to gather round the TV. Self-promotional link.

  3. Tom,

    Can you stick a webcam on top of your TV? I would enjoy watching you watching So You Think You Can Dance. It would also serve to transform you from a consumer to a prosumer. A win-win.

    Nick

  4. Sure, TV/video consumption is increasing, but plenty of other statistics show that we’re spending more time multi-tasking. You may contend that people can’t effectively work/edit Wikipedia/create something while the TV is playing in the background, but this is no more distracting than, say, email.

    In short, the conclusion you draw from the data is an interesting hypothesis, but it is not conclusive.

    Simon

  5. Basically, what Tom said. I’m sat on my sofa watching my fiance and kid play Worms, while reading around the internet and discussing economic theory with people.

    I find these days I can’t *just* sit and watch TV – it bores me – so while I undoubtedly have the TV on a lot more these days, half the time it’s background to what I’m actually doing, which is trawling around the internet teaching myself things.

  6. Doing other stuff while watching TV is nothing new. Long before the Net came along, people would knit, crochet, chat with friends or family members, do small household repairs, do crossword puzzles, read newspapers and magazines, talk on the phone, fiddle with various hobbies and crafts, etc., while the TV played. Tweeting while watching So You Think You Can Dance is absolutely fine by me, but it doesn’t strike me as being intrinsically more valuable than any other type of diversion. What the statistics show is that, contrary to common assumptions and expectations, we watch more TV than ever before and, thanks to being connected to media networks during pretty much all our waking hours, we consume far more media entertainment than ever before. I think Tom Slee’s point that the Net is turning out to be more a complement to mass media than a substitute for it is on target.

  7. Matt Seigal

    As a Brit, I’m fascinated by how cognitive surplus fits in with the abundance of free time that teenagers have during the summer holidays to destroy and loot in English cities. Presumably, social media, BBM, and television inspired copycat vandals to be actively destructive.

  8. I would enjoy watching you watching So You Think You Can Dance.

    Believe me, you wouldn’t.

    Matt Seigal – All that Angry Birds playing is just a gateway drug for throwing bricks through shop windows.

  9. You’ve got to read that “cognitive surplus” phrase with the right – well, I’m not exactly sure what the correct word is – something like tone or spin or accent. Imagine a venture capitalist huckster giving a presentation along the lines of “China’s an enormous country, if we could just get 1% of them to buy our product, just one percent of the China market, it’d be HUGE, we’d be rich, and with the Internet we have the COGNITIVE SURPLUS …”

    (i.e. this “1%” argument is a well-known business plan fallacy)

    The hype-mongers aren’t saying “People won’t waste time”. They’re saying “There’s a huge amount of time up for grabs, and if we could grab some of it, we’d make out like bandits”. Of course, it’s much more difficult than they present. But their own business is selling the concept and collecting that way.

  10. Nick,

    I haven’t gotten around to reading Clay Shirky’s Cognitive Surplus yet. I still have great works like Don Tapscott’s Wikinomics sitting unread in boxes today. Nor did I read your blog during the period in which Shirky’s latest book was being launched on the market, so I would have to backtrack through your site to find any Shirky-centric posting of yours.

    However, straight off the bat, I can see you are having the same knee-jerk reaction to Shirky’s work as you have when you speak about artists and creativity. The thing to bear in mind is to quit thinking in terms of Alvin Toffler- or Nicholas Negroponte-type 1980s prosumer individuals. The discussion should move on from that original definition to incorporate some basic understanding of groups, teams, crowds, mobs, etc.

    Fred Brooks, author of The Mythical Man-Month, commented several years ago in response to Eric S. Raymond of the open-source community on the subject of their production model and problem-solving approach. You know the one: given enough eyeballs, all bugs are shallow. Brooks posed the question: would the open-source approach be useful for a design problem where the participants didn’t understand the nature of the problem, the way a dispersed group of software professionals solving a software problem together do?

    The other thing I would add is that the reason the open-source community functions as well as it does, solving difficult problems efficiently, is that from undergraduate level upwards, software engineers work with collaboration-friendly, text-based programming tools. I’ve had several discussions on this topic lately with people from the software engineering field, and they all told me the same thing: software engineers collaborate so willingly because their role models in the industry work on projects in a collaborative, largely distributed fashion.

    They also mentioned that the collaborative assignment-completion tools, available for several years now to the young software engineering undergraduate, cannot cope with (wait for it) sophisticated file types such as Word documents. That’s meant to be a geek joke. It means that for the basic, basic file format on *individual* personal computer platforms today, the collaborative tools simply aren’t out there in 2011. At this point, I would prefer to have someone such as Kevin Kelly enter the fray and tell us all again that personal computers were pretty dumb – until they started tying up together via communications networks.

    Well, guess what: from the point of view of the prosumer at the most basic, basic level in 2011, personal computer platforms (despite tablets, iPhones, netbooks, readers, blah, blah) are still as dumb as they ever were when Kevin Kelly started up Wired magazine.

  11. Nick Carr

    Designcomment: Were you watching TV when you wrote that?

  12. Nick,

    No television in the home whatsoever. I used to watch a lot of TV on the computer a year ago, and stopped because I got back to a dissertation, etc. But still, I think you’ve sort of caught me out.

    My notebook computer serves as a DVD player. In the past year, I’ve gotten into a DVD box-set habit. It’s low-energy media consumption, and it fits in with the academic study. I knew folk who were box-set fanatics (in the Steven Berlin Johnson, Everything Bad Is Good for You sense) several years ago. I never thought I would fall for it myself.

    But a couple of TV series got me interested. I knew about Johnson’s theory on the plot lines involved in TV series today – as opposed to the isolated, unrelated episodes of olden days. I’ve found there is something to it. In the past year I’ve watched The West Wing’s seven seasons about three times in full: once forward, then backwards, then forwards again.

    You get a bit of re-use out of these box sets, if you put in the effort to follow the higher level plot lines. I probably would find the same with games. But I hadn’t the patience to play them.

    Btw, your blog is generally entertaining and well written. And it provides a nice bit of brain stimulus too. Not too much, but certainly some. Keep up the work.

  13. Designcomment, I lost you somewhere. All of project management is about how to “work on projects in collaborative, largely distributed fashion”. It’s not like this was suddenly invented when a bunch of professional conference-clubbers decided they were going to sell the hype to their corporate clients. And Microsoft Word documents are a massive pain. It’s a complicated format that gets in the way of the simple text of programming.

  14. Seth,

    I don’t mean using Word for programming. I mean using Word for doing reports between multiple project participants. Nobody is doing that, despite Word working on all platforms, big and small, now – with a lot of people owning several devices, and lots of people working on reports.

    It’s a big discussion, and I don’t want to crowd out Nick’s comment thread with an excess of paragraphs on it. In fairness to Nick, his original blog entry reads fine to me, and I agree with an awful lot of what he said. But I didn’t want to leave out the other half of the picture either.

    I’ve detected a pattern in his treatment of many topics now, to do with creation and production – where he struggles to separate himself from the centuries-old notion of the romantic individual author. It’s not a technological brick wall; it’s a problem to do with our need to create superstars. Most industries, including the art industry, are about star creation. It’s been rampant in architecture for years now. Starchitects, they call them.

    What people don’t realise is that there is a vast collaborative effort happening behind the curtain in order to hoist the starchitect up there in front of the audience. All that Richard M. Stallman, the open-source movement, and the Wikinomics people are trying to do is move what already exists in the background into the foreground. It will not alter what already fundamentally happens – only present it differently, minus the ‘star’ individual.

    Reference: the iSchool at Berkeley. It takes ‘back room’ server administrators and teaches them how to become foreground figures, with a basic knowledge of the humanities. Or vice versa: humanities graduates learn about the engine room of it all. Apologies, Nick, for the length. As I said, this isn’t the place to open up that can of worms.

  15. Jim Takchess

    http://fold.it/portal/

    I thought this was an interesting way to grab cognitive surplus from games.

    http://realityisbroken.org/

    speaks to the cognitive-surplus topic at length.

  16. Despite the readily available statistics, the myth that internet use displaces TV viewing continues to spread. I just read this in a Mother Jones piece by Kevin Drum: “I always thought the anti-TV crowd at least had a point: television really did crowd out things like books and magazines, which were better suited to big ideas and complex arguments than the tube. But social networking? As near as I can tell, it’s mostly crowding out in-person gossip and….television. That seems like a much more benign trade.” As near as you can tell isn’t near enough.

  17. I find these days I can’t *just* sit and watch TV – it bores me – so while I undoubtedly have the TV on a lot more these days, half the time it’s background to what I’m actually doing, which is trawling around the internet teaching myself things.

    I actually never used to be much of a TV watcher. Especially in recent years, my brain has been too jumpy to enjoy TV and movies that go on without me when my mind wanders. I like books partly because they don’t do that.

    But this summer (due to the Internet configuration — wired only — in the apartment where I’m living temporarily) I have ended up watching an hour of TV every other night or so, with no Internet anywhere in sight, and I’m starting to think that maybe it might be a good exercise in focus, to counter the way the Internet works with my brain.

    Weird…I never thought I’d be making an argument that TV is good for my brain, but (though I have no solid data on this) it seems like in this case, it might have been! I also might, maybe, be getting a little better at focusing on the books I read.

    ~Jessie Mannisto

  18. Jessie has a point. If *just* watching TV bores you, then why watch TV at all? It reminds me of that Seinfeld episode where George tries to combine sex, eating and TV watching (to the supreme disappointment of his ladyfriend).

    Internet use tends to displace not TV watching but attentive TV watching.

  19. Nick,

    We went over this issue not so long ago, as I recall. I remember writing something here about the architect who banned members of his staff from opening up a digital drawing on screen when they received a phone call from a building-site operative about a design matter. I made the point that in the digital era, large-format paper output devices will fold all of the drawings for you. All that people do is have an assistant put the paper drawings in an envelope and write an address on it. Few people in a drawing office look at a paper drawing any more. Yet the fellow who receives the envelope looks at the paper drawing, and is very aware of the paper output.

    What we have in the modern world is certain parts of the process sped up, while others aren’t at all. The older architect said to me: we moved to digital and we lost more than we gained. We gained a faster means of outputting information. But what we lost was a common medium through which everyone could agree and communicate.

    When we talk about the ‘benign trade’, or otherwise, we should think about these issues too. It’s like what you said about ‘search’, and people no longer using parts of their brains. What happens when architects no longer look at paper drawings, but still expect others to interpret what they produce? Or further still, when the (digital) information production is outsourced to another continent, and the guy looking at the paper item has to phone that other continent? That’s happening already. How much is being lost in that equation, I wonder?

    Hal Varian of Google spoke about this at the iSchool in October 2009 as if it were a wonderful new development in collaborative working. He’s a smart fellow, Hal, and makes some great points in the podcast. I’d love to challenge him on a few of them, though.

    http://www.ischool.berkeley.edu/newsandevents/audiovideo?page=3

  20. Nick’s post is all about quantity, but everyone seems to really want to talk about quality. The story told by the numbers is not the same as that told by the anecdotes. In our consumer society, we’re probably driven more by quantification (even if it feels like quality), especially the numbers representing hours spent allowing advertisers to infiltrate our minds and then the buying habits that flow from that exposure. And an inchoate fear of silence, inactivity, and boredom keeps the cycle churning. We gotta consume something, right?

    I’m seriously not buying the argument that electronic and social networking is freedom or a vehicle to achieving it. Nor am I convinced by the hipster admonition to get with it and embrace the crowd (group, team, mob, etc.). That’s just demographics run amok, another quantification. There probably is something to the notion that individual authorship is eroding in favor of collaboration, but that might be something to regard warily rather than effusively.

  21. Brutus,

    Very well considered post. Thanks.

    Having used a combination of Amazon for shopping lately and the Chrome web browser from Google Inc., I am shocked at how many targeted advertisement banners I see nowadays. I purchased a DVI-to-VGA adapter for €6.00 recently, and everywhere I go now on the web, I seem to see those things in banner ads! Honestly, I only needed the one. I was so innocent that at first I thought it was only a coincidence, until about the 100th time. Nick covers this very well in a certain chapter of his book, The Big Switch.

    If you really want to discover the amount of advertisement you consume while watching the average movie on television, I recommend getting a digital video recorder some time and going back to delete the advertisements from what you record. Some stations aren’t too bad, but I was amazed to find myself sometimes deleting over five minutes’ worth of advertisements at a time – lots of those – in one 90-minute feature-length movie. Basically, the advertisements add almost half an hour onto the viewing time. I know, because I was trying to fit the recording onto a DVD disc to watch later on the laptop. At least the digital video recorder doesn’t follow you around with €6.00 DVI-to-VGA adapter special offers. But that is only a matter of time too.

  22. Stewart Dinnage

    What evidence is there that video is a poor medium in comparison to others?

    Comment is always free and, in Nick’s case, welcomed – he is clearly a great writer. That notwithstanding, the comments still seem highly opinionated and based on seriously blunt stats.

    On another note:

    http://www.guardian.co.uk/science/the-lay-scientist/2011/aug/08/1

    Susan Greenfield actually quoted Nick’s book first in her list of “evidence” of the effect of technology on the brain (ironic, as Nick seems to quote her extensively – a circle of opinion?).

    I find this post by Dorothy Bishop interesting on the matter of autism and technology, brought up by Greenfield.

    http://deevybee.blogspot.com/2011/08/open-letter-to-baroness-susan.html

    My thoughts: if you bill yourself as a scientist, try doing some science. If you are a writer of popular non-fiction and you suggest your work has a strong scientific foundation, work hard on researching that – maybe even define a hypothesis…

    Otherwise, how are these comments/books/products, etc., any different from the hyper-linked ramblings of the masses being poked fun at?

    I have no insight as to the effect of technological distraction on the brains of humans. That said, I’m pretty sure I’d have found it difficult to call out or discuss anything with my favourite authors/directors and scientists of the past, in the way I’m happy to here!

    I’m not saying the questions posed by Susan and Nick are wrong, even if they seem a little unintelligible. I’m just a little tired of the implied message, i.e. that the words shared are more than opinion!

  23. Susan Greenfield actually quoted Nick’s book first in her list of “evidence” of the effect of technology on the brain (ironic, as Nick seems to quote her extensively – a circle of opinion?).

    Stewart: Are you making that up, or do you have some “evidence”? I don’t believe I quote Baroness Greenfield at all in my book. But if I did, how exactly would that be “ironic”?

  24. Stewart Dinnage

    Firstly, a massive apology: Susan (Baroness) Greenfield is not to be confused with Patricia Greenfield. As I understand it, Patricia’s 2009 meta-study/review is a major component of the science considered in your book.

    Absolute apology again for confusing these two Greenfields.

    You can find Susan Greenfield discussing “Mind Change” as she calls it here: http://www.guardian.co.uk/commentisfree/video/2011/aug/15/susan-greenfield-video?INTCMP=SRCH

    Your book is first on the list of evidence, at 3:19.

    There is probably still reason to find this ironic. A neuroscientist is challenging the criticism that her stated opinions have no basis in evidence. Rather than point to directly related, peer-reviewed and respected studies that back up her statements (whatever “Mind Change” is actually stating, which seems unclear), she mentions The Shallows first in the list. To my knowledge, you haven’t claimed to be the originator of scientific evidence in this field.

    Two things stand out to me. First, studies of the effects of games (which seems to mean computer/screen games) have, in Susan Greenfield’s case (I’m not sure about Patricia’s), become something you can test across the board. Are all games alike? Does killing zombies equal Crayon Physics (Humble Indie Bundle)? Is a virtual game of Monopoly dangerously inconsequential, whilst on a board it is a valuable learning experience? It seems to me that the level of the debate, here and elsewhere, is currently rather black or white. Secondly, Susan Greenfield’s statements on autism and screen technology (which go beyond what was said in the video) seem particularly unlikely and have been very poorly received by scientists who actually work in that field.

    The increase in ADHD mentioned is also interesting. Tech news outlets reported this: http://www.sciencedaily.com/releases/2010/08/100817103342.htm

    It suggests a massive misdiagnosis issue based on the relative ages of older and younger pupils in the school year. Sir Ken Robinson/the RSA has a little segment on ADHD here also.

    http://www.youtube.com/watch?v=zDZFcDGpL4U

    At about 3:40 (although the whole video is quite interesting).

    As I’ve said, I don’t have the background or skills here, and I agree that asking the questions is not inherently a bad idea. I will say that if you want to suggest “science” underpins what you are saying, then you need to say something scientific: a hypothesis, some test for evidence (either way), repeatability – be open, even. The things good scientists actually do! Otherwise, isn’t it just opinion?

  25. Nick,

    I know this is a different issue from web usage and whether it is pushing out television media. But just for the sake of drawing your focus to issues in other areas – on the matter of digital drawings versus paper. Sydney Pollack made a documentary film with the architect Frank Gehry a few years back. In that documentary, you will see a very ambitious member of Gehry’s staff describe how much the world currently operates around paper. The guys on Gehry’s staff were of the opinion that paper had to be taken out of circulation altogether. It is not just the paperless office; it is all of the other public systems connected to a new building project that require paper in some shape or form in order to go through channels.

    It is the idea of one media trying to push out another, and the other, pushing back against the change.

    I was probably on the side of Gehry’s staff until I considered the views of my old colleague. What happens when architects no longer ‘look’ at their own drawings at all? It is like a musician not bothering to develop the technical skills of using an instrument.

    Jeff Hawkins talked once about his strategy with the Palm Pilot. He noted how other companies, had aimed with their handheld products to try and compete with other computer platforms, or other handheld devices. Hawkins was quite straight about what he was trying to do. He wasn’t competing with other technology products at all. He was competing with paper. It was that simple.

    I have to say, having used a Palm Pilot for three years running, at the end of that period I realized how USEFUL paper is. Paper is a really ingenious thing. It’s lightweight, doesn’t require charging, you can read it in bright sunlight, and you don’t have to boot it up to read it. There are a lot of advantages to paper that digital devices will take a long time to displace fully. I think that digital devices only fill in little gaps where paper can’t manage. For instance, you can carry around several books at a time with digital eReaders. That is precisely where you see eReaders gaining ground: where school kids have heavy bags to carry, the benefit outweighs the disadvantage of needing charge points and so forth.

    Also, Neil Gershenfeld of MIT wrote in one of his books, I think, about highly classified documents, such as aircraft maintenance manuals, which are a real security risk. Rather than leaving those very heavy manuals lying around the place, having to shred out-of-date versions, insert new pages, etc., it made sense to allow aircraft engineers to access them through secure eReader devices.

    I’m all for replacing paper in situations where it makes absolute sense. But the jury is still out on the Gehry staff’s notion that we need to eliminate paper throughout the system. It reminds me of John Thackara’s criticism of the fad of removing all human touch points in the design of services. Thackara’s point was simple: we need to leave touch points at various places, but get the most strategic benefit from them. I would replace paper in a ‘system’ where doing so offers maximum strategic benefit. But if we find ourselves in situations where architects no longer view their own drawings, then we have overshot that optimal balance by a big margin.

  26. Stewart, I can’t speak for Susan Greenfield. As to your sense of what may be in my book, the best way to discover what’s in a book, I’ve found, is to read it. Nick

  27. Nick wrote: “As to your sense of what may be in my book, the best way to discover what’s in a book, I’ve found, is to read it.”

    It’s like what happens with architects no longer viewing their own drawings. They assume the stuff is represented on the paper because it was on the screen, and so forth. I sometimes assume I’ve read a book because I’ve visited the author’s blog, etc. I think it creeps up on us all in the digital era, for some reason.

  28. Methinks you might be protesting too much here. It’s not the device that matters. It’s what we do on the device and what value we derive from it.

    My four hours a month of streaming video have most recently been spent watching documentaries for free from Amazon.com, because that is a great benefit extended to Prime members. I’ve learned tons, and I’ve actually bought a couple of books to learn more about the topics covered in the movies.

    And more and more conferences – TED, PopTech, etc. – are streaming video, so a lot of what’s going on might very well be good for our brains.

  29. Stewart Dinnage

    “Read the book” seems a fair comment, Nick. I’ve been trying to get my local library to sort it out for over six months.

    I know my comments may read as if I’m simply against your position, which isn’t the case. Your article in Wired shows some of the depth of consideration and research that has led to your thoughts, which are clearly valuable and a joy to read. The parallels you draw to previous historical technological developments are thought-provoking and insightful. Besides Charles Handy, I cannot think of another writer whose words on technology, progress, or business feel as well crafted and enjoyable to digest.

    My only complaint, regarding the framing of scientific enquiry, applies far less to someone whose position and history do not imply personal insight through primary research. I’m just concerned when people invoke the name of science without apparently following the typical scientific method to assess what has been suggested.

  30. Nick,

    I thought I might mention this. It somewhat relates to the issue of the cognitive surplus that Shirky talks about. This is part of an abstract from a talk delivered by Victoria Stodden at a conference in Toronto a couple of years back, which was summarised by a Canadian software blogger at the address linked below. Stodden’s blog is also good, and her talks, slides, etc. are available in several places if one searches.

    As computation becomes more pervasive in scientific research, it seems to have become a mode of discovery in itself, a “third branch” of the scientific method. Greater computation also facilitates transparency in research through the unprecedented ease of communication of the associated code and data, but typically code and data are not made available and we are missing a crucial opportunity to control for error, the central motivation of the scientific method, through reproducibility.

    http://www.globalnerdy.com/2009/08/01/science-2-0-how-computational-science-is-changing-the-scientific-method/