« "Is Google Making Us Stupid?": sources and notes | Main | Machine head »

Easy does it

August 12, 2008

A recent edition of Science featured a worrying paper by University of Chicago sociologist James A. Evans titled "Electronic Publication and the Narrowing of Science and Scholarship." Seeking to learn more about how research is conducted online, Evans scoured a database of 34 million articles from science journals. He discovered a paradox: as journals begin publishing online, making it easier for researchers to find and search their contents, research tends to become more superficial.

Evans summarizes his findings in a new post on the Britannica Blog:

[My study] showed that as more journals and articles came online, the actual number of them cited in research decreased, and those that were cited tended to be of more recent vintage. This proved true for virtually all fields of science ... Moreover, the easy online availability of sources has channeled researcher attention from the periphery to the core—to the most high-status journals. In short, searching online is more efficient, and hyperlinks quickly put researchers in touch with prevailing opinion, but they may also accelerate consensus and narrow the range of findings and ideas grappled with by scholars.

If part of the Carr thesis [in "Is Google Making Us Stupid?"] is that we are lazier online, and if efficiency is laziness (more results for less energy expended), then in professional science and scholarship, researchers yearn to be lazy…they want to produce more for less.

Ironically, my research suggests that one of the chief values of print library research is its poor indexing. Poor indexing—indexing by titles and authors, primarily within journals—likely had the unintended consequence of actually helping the integration of science and scholarship. By drawing researchers into a wider array of articles, print browsing and perusal may have facilitated broader comparisons and scholarship.

Evans's study is consistent with the study of researcher behavior conducted by University College London that I cited in my article, which found that online researchers tend to go for "quick wins" through rapid "power browsing."

When the efficiency ethic moves from the realm of goods production to the realm of intellectual exploration, as it is doing with the Net, we shouldn't be surprised to find a narrowing rather than a broadening of the field of study. Search engines, after all, are popularity engines that concentrate attention rather than expanding it, and, as Evans notes, efficiency amplifies our native laziness.

Comments


Hi Nick

Your article about Google made me think about how commodity markets behave - click through to my blog entry at http://www.icis.com/blogs/asian-chemical-connections/

Over the last ten years I've noticed an increase in frantic behaviour and outright mental exhaustion (the two are probably connected), partly from the effort of keeping up with rapidly changing information.

I think this has made producers, buyers and traders dive in and react sometimes before they think, adding to volatility.

Posted by: Organising Chaos at August 12, 2008 08:46 AM

One of my favorite books (http://www.amazon.com/Library-Research-Models-Classification-Cataloging/dp/019509395X/ref=pd_bbs_sr_1?ie=UTF8&s=books&qid=1218545046&sr=8-1) made this point... in 1994. Though he was talking about databases like Lexis/Nexis rather than the internet.

Posted by: ojbyrne at August 12, 2008 08:47 AM

The concept of "power browsing" is in much the same vein as "quality time" which is only an excuse for behavior you know is bad.

Posted by: Richard at August 12, 2008 10:12 AM

I would be more concerned about this if people knew how to search. I suggest that this problem can be cured/alleviated by teaching people to search effectively and reasonably efficiently.

There is plenty of data showing how most people search with one or two keywords, and so limit their exposure to easily available results (the first half of the first page), which is likely to contribute to Evans's results.

Posted by: Sabbasolo at August 12, 2008 10:14 AM

I'm not surprised, people are just moderating their efforts based on the medium. It's easy to publish online, and easy to fix it, so why not just rush through the work? You can always fix it later.

And of course, since it's become so much easier to produce (writing, photography, video, etc.), more and more amateurs are getting into the game. The glut of content changes its value. If there are only a few journals, that is way more serious than millions of blogs. The increasing volume of work causes diminishing gains for each piece.

It seems like a bit of a downward spiral where the easier it gets, the more people contribute, making it even easier. Where she stops, nobody knows?

Posted by: Paul W. Homer at August 12, 2008 04:16 PM

Not sure I agree with the argument that more amateurs = bad content. Only time will tell if there is any value in the content that is being produced.

I think that there is a degree of intellectual snobbery going on here.

I simply can't see the argument that making it easier to publish and research is a bad thing. Rather I think it is a miracle that the likes of Isaac Newton and Charles Darwin happened to have access to research and publishing media. Imagine all those poor people and women whose contributions to human knowledge will never be known.

Posted by: swainjo at August 12, 2008 04:42 PM

Why is this worrying? I think it worries you, Nick, only because it is new.

I can demonstrate this by inverting the time series.

Let's say that 20 years ago, all published papers in the world were online. And that they gave the pattern Evans reports.

But then suddenly 20 years later some worldwide mysterious disruption makes search less efficient and takes most of the papers offline. Now suddenly, the best papers can't be found by researchers, so they are forced to cite marginal papers. This is heralded by some folks as "increased diversity," when it really is increased inefficiency.

THAT would be something to worry about.

Posted by: Kevin Kelly at August 12, 2008 05:18 PM

Hello Nick

I don't agree with any of this. Firstly, there's no correlation between the quality of a piece of research and the number of other papers it cites. If anything there could be an inverse correlation. Secondly, in the sciences at least, the days of journal browsing and serendipity are long gone if they ever existed.

Anecdotally, the most important scientific paper of the 20th century was a letter to Nature with a puny 6 references (http://www.nature.com/nature/dna50/watsoncrick.pdf)

Posted by: bentoth at August 13, 2008 03:39 AM

Nick, I agree with some of the comments here that there's inadequate proof; Evans's thesis is an example of hasty generalization. Evans didn't actually read any of the 34 million articles; if there's a paradox anywhere it's that he's guilty of the same shallow research you complain about in your original claim. But Google's goal is not to make us smart, but rich, a goal it has surpassed. Evans though is posting for Britannica, a Google competitor that charges fees. Is that what this is really all about? I agree with bentoth, above, and another example is Garrett Lisi's recent publication of his Theory of Everything, now even on YouTube, with close to 200,000 views - but can you find it on Britannica? Not sure; don't have a subscription.

Posted by: Joe Linker at August 13, 2008 11:01 AM

Nick

an interesting premise, as usual... But are these researchers being 'lazy', or are they drowning in information because tools are failing to keep effective pace with the growth in content?

I've started to explore this a little at http://blogs.talis.com/xiphos/2008/08/15/nicholas-carr-comments-on-laziness-in-scholarly-publishing/ (more to follow, post-holiday...) and would welcome your thoughts...

Posted by: Paul Miller at August 15, 2008 10:42 AM

Civilization has never been hindered by an excess of information, but by a lack of it.

Remember, Kepler had to steal Tycho Brahe's data to prove his theory of elliptical planetary orbits.

Personally I am not even bothered about the redundant volume of paragraphs produced in the name of scholarly research.

An original one-pager will show itself in all its glory. New drugs will manifest themselves by the benefit they create for mankind. Libraries might complain about a lack of storage shelves or a lack of hard disk space.

The day mankind stops weighing intellect by the number of words written, this problem will be solved. And it is going to come soon.

The discussed problem arises out of social and other pressures faced by "scholars" to prove that they are "scholars" indeed. And churning out volumes is currently recognized as a necessary condition.

I see a day when scholarly papers will be like Arnie's movie dialogues. Few and powerful.

Posted by: Shouvik at August 15, 2008 03:03 PM

In the old days, you had to get up from your desk to do research. Perhaps it's the fact that you had to move through time and space to interact with other people that led to more ideas. Years ago I read an article by Kurt Vonnegut in "In These Times" about why he didn't use email. (I tried to Google it and used ITT's search engine but could not find it! LOL). He tells at length about the steps he took to get his manuscript to his publisher, including all of the people he met during the process. Interesting perspective; too bad it's not listed in Google's index!

Posted by: Linuxguru1968 at August 16, 2008 05:19 PM

Nick,

I agree with the comment from Paul Miller above that "tools are failing to keep effective pace with the growth in content."

My bet is that over time new tools will emerge in the world of online scientific publishing that specifically counteract this tendency of online research to "narrow the range of findings and ideas grappled with by scholars."

We are smart enough to figure out the ways that Google might "make us stupid" and develop new tools to counteract those tendencies.

The advantages of having scientific and scholarly research freely available online for linking and annotation far outweigh the disadvantages even now, and in the structured research environments of tomorrow there will be no question.

It's August 2008 - did you collect from Benkler on your bet?

I have weighed in my thoughts on the issue at my new blog: http://wearetheweb.wordpress.com.

-P

Posted by: publy at August 16, 2008 08:24 PM

There is a feeling in the scientific community that less "original" research is being performed and that more work is simply a rehash of previous data. First of all, it is difficult to show whether work is "original" just by looking at the citations. But let's suppose the premise is true and the impressions of many scientists are correct. If the author had included scientific literature written up to 2007, he could have argued that the decrease in funding by the NSF and NIH over the past several years has caused this.



http://www.nsf.gov/statistics/infbrief/nsf08303/

The irony that this work simply examines scientific literature has not been lost. Anyway, this guy is a sociologist - shouldn't he be setting up an experiment where scientists using the internet are compared to ones using a "sham internet" from which all useful content has been eliminated? I am certain Google could create this and provide sponsorship.

Posted by: grizzly marmot at August 17, 2008 11:13 AM

grizzly:
>> There is a feeling .... that less "original"
>> research .... more work is simply a rehash of
>> previous data.
There has also been a "drift" away from what we call real science. Science involves the study of natural phenomena that can be tested in the lab or observed indirectly, as in astronomy. For example, in physics we have String Theory, a complete head game which has no experimental basis and makes only one prediction, which is wrong. Yet physicists make careers out of that nonsense since they have to eat too. Could this be because today the economy is dominated by companies that produce no tangible technological artifacts? How about companies that claim to be in the computer science business but never publicly disclose the source to their products, which is needed for the independent verification at the heart of true science?

Posted by: Linuxguru1968 at August 19, 2008 05:12 PM

From my perspective, fewer citations often means more relevant ones, and the core of a literature is the core for a reason; among other things, this suggests that people are not trying to pat their local invisible college on the back, but to be part of the overall discipline. What surprises me is that the tendency to quote from drafts appears so recent: many older scholars criticize it as un-reviewed territory, and I'd disagree. But my main point is that we are only seeing the beginning of the transformations that Google Scholar, semantic clusters and other technologies will bring to science. Wait until my generation, raised with search engines, comes to monitor research...

Posted by: Bertil at August 23, 2008 09:58 AM

I think this critique applies to where the Web is today, not to the fundamental nature of the Web.

It may be true that the algorithms used to recover relevant content too heavily emphasize pointing people to "high-status" content. I've been disappointed by the available web tools for finding content on the periphery. You can use the Web to find obscure content, but it requires effort (that many choose not to give).

It may be true that the "poor indexing" of the print library force people to explore materials beyond what they can find easily on the Web, and this can be a positive thing.

However, the Web will ultimately provide a better solution than the print library when tools are developed that make exploring the periphery more efficient. While it might be occasionally fun spending hours in a library hoping to randomly stumble upon something, usually, I'd rather jump straight to the obscure content that interests me.

In general, it's not surprising that the Web is favoring certain behaviors that correspond with the problems that the web community has focused on solving so far. The fact that the web neglects other human goals just points to exciting developments that are on the horizon.

Posted by: Dan Schmidt at August 26, 2008 04:18 PM


Rough Type is:

Written and published by
Nicholas Carr
