
Is the internet too dumb to survive?

December 21, 2005

Maybe Bob Metcalfe was just ten years too early. If you remember, Metcalfe in 1995 wrote a column predicting that the net would "go spectacularly supernova and in 1996 catastrophically collapse." He later ate his words - literally.

The current issue of MIT's Technology Review, which, coincidentally, has Metcalfe on its board, features dark new predictions about the net. In a cover story called "The Internet Is Broken," David Talbot suggests that the internet, designed for fairly simple communications between fairly small groups, may finally be cracking under the weight of its ever growing complexity. He writes that "for the average user, the Internet these days all too often resembles New York's Times Square in the 1980s. It was exciting and vibrant, but you made sure to keep your head down, lest you be offered drugs, robbed, or harangued by the insane. Times Square has been cleaned up, but the Internet keeps getting worse, both at the user's level, and ... deep within its architecture."

Talbot quotes some heavy hitters. Here's MIT's David Clark: "We are at an inflection point, a revolution point ... We might just be at the point where the utility of the Internet stalls - and perhaps turns downward." Here's Jonathan Zittrain, formerly of Harvard, now at Oxford: "Take any of the top ten viruses and add a bit of poison to them, and most of the world wakes up on a Tuesday morning unable to surf the Net - or finding much less there if it can." Princeton's Larry Peterson explains how the myriad patchwork fixes to the net's security problems ultimately magnify the system's fragility: "We see vulnerability, we try to patch it. That approach is one that has worked for 30 years. But there is reason to be concerned. Without a long-term plan, if you are just patching the next problem you see, you end up with an increasingly complex and brittle system."

Talbot reports on some ongoing efforts, notably a program at the National Science Foundation, to reengineer the net - to "develop clean-slate architectures that provide security, accommodate new technologies, and are easier to manage." They aim, for instance, to build into the net "the ability to authenticate whom you are communicating with and prevent things like spam and viruses from ever reaching your PC." They also include the design of "protocols that allow Internet service providers to better route traffic and collaborate to offer advanced services without compromising their businesses" as well as mechanisms to "allow all pieces of the network to detect and report emerging problems - whether technical breakdowns, traffic jams, or replicating worms - to network administrators." In short, these initiatives seek to make the internet smarter.

Which, of course, means they run counter to the internet's native grain. From the start, many have argued that the internet's greatest strength is its essential stupidity - the inability of its pipes to distinguish between the different bits of data running through them. That dumbness is one of the main reasons the net's so flexible and fast, and it also lies at the heart of what Talbot calls "the libertarian culture of the Internet." Internet pioneer Vint Cerf, now with Google, says, "It's really hard to have a network-level thing do this stuff, which means you have to assemble the packets into something bigger and thus violate all the protocols."
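Cerf's objection can be sketched concretely. Below is a minimal Python illustration - all names are invented for this sketch, not drawn from the article or any real router - of the asymmetry he describes: a "dumb" router decides where a packet goes by reading its header alone, while any payload-level filter (for spam, viruses, and the like) must first reassemble packets into a larger stream, reaching above the network layer.

```python
# Illustrative sketch only: invented names, not a real protocol stack.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Packet:
    dest: str       # header field: all a dumb router ever reads
    seq: int        # sequence number within the stream
    payload: bytes  # opaque to the network core

def dumb_forward(packet: Packet, routes: dict) -> str:
    """End-to-end style: route on the header, never touch the payload."""
    return routes[packet.dest]

def filtering_forward(packets: list, routes: dict, banned: bytes) -> Optional[str]:
    """A 'smart' middlebox must reassemble the whole stream before it can
    inspect it - the layering violation Cerf describes."""
    stream = b"".join(p.payload for p in sorted(packets, key=lambda p: p.seq))
    if banned in stream:
        return None  # drop the connection
    return routes[packets[0].dest]

routes = {"host-b": "port-7"}
pkts = [Packet("host-b", 1, b"llo wor"),
        Packet("host-b", 0, b"he"),
        Packet("host-b", 2, b"ld")]

print(dumb_forward(pkts[0], routes))             # header-only decision: port-7
print(filtering_forward(pkts, routes, b"worm"))  # must rebuild "hello world" first
```

The dumb forwarder never grows in complexity as new applications appear; the filtering one must understand every stream it carries - which is why pushing intelligence into the core trades flexibility for fragility.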

The tension between security and openness, manageability and freedom, has always been around, of course. But, as Talbot's article makes clear, the day when that tension is resolved may be approaching. Princeton's Peterson, for instance, reports that "there is a wider recognition in the highest level of the government [of the seriousness of the problems]. We are getting to the point where we are briefing people in the president's Office of Science and Technology Policy. I specifically did, and other people are doing that as well. As far as I know, that's pretty new."

The internet has become an essential infrastructure - not just for communication but for commerce. When the workings of national economies hinge on an infrastructure's reliability and security, then you can be sure that reliability and security will in the end come to trump all other concerns, even openness.


Aha! The KISS design principle: Keep It Simple & Stupid.

Thanks a million for showing me that The Internet Is Broken. It is indeed.

Let me change it to Stupendous and it will be truly Free for the Whole Wide World.

Posted by: 666 at December 21, 2005 02:47 PM

I would think a simpler - a.k.a. "dumber" - network architecture would be more likely to provide security and reliability. More "intelligence" == more code == more bugs.

With all due respect to the folks quoted in the article, I think technical arguments against end-to-end "dumb network" design are likely to be adopted and amplified by those whose real problem is with the openness and libertarian culture of the Net itself.

Posted by: Doug Lay at December 21, 2005 06:43 PM

This confirms what I have been saying (and writing) for a while: one can't be exposed to potentially millions of (virtual) people and billions of (right or wrong) information items; it is simply unmanageable. Even if you filter out spam and noise and keep only what is relevant, there is no way one can handle such a one-dimensional, paradoxical world where everything and everyone is at the same distance, one click away. So this is not just a technical issue but a systemic one, so to speak: such a system can't survive without becoming totally anarchical. (This is also the case, incidentally, with cell phones: when one is reachable directly anywhere at any time, that is bound to become a problem.) The end of the utopian, free-for-all times of this medium is a-coming, as it did with previous media, and the counter-blow toward "order" may be difficult to stomach.

Posted by: Miklos at December 21, 2005 06:51 PM

I shudder at the prospect of building too much intelligence into the infrastructure of the internet. Having been involved in building a few of the systems the internet relies upon, I can say that each time we attempted to postulate the ways in which a new technology could be used, we were patently wrong.

Once you accept that you can't predict the ultimate usage of the system, it follows that all you accomplish by making a smarter system is to create the need for ever more complex hacks.

Posted by: Bert Armijo at December 21, 2005 07:50 PM

Or maybe the Terminator story was just ten years too early, setting Judgment Day on August 29, 1997. Skynet triggered a nuclear war that day, in an effort to protect the network against... humans. Skynet was a fictional intelligent distributed computer network designed by the military to (among other things) protect its computer systems from virus attacks.

Happy holidays!

Posted by: Filip Verhaeghe at December 23, 2005 10:26 AM

Interesting post! Perhaps the web fits this description, but not the net, if we consider the web (the part that contains most of the content) to be just a subset of the net, which is really a medium rather than content.

Posted by: lisa at December 28, 2005 11:39 PM

In reply to 666: read Bosworth's thoughts about the virtues of KISS.


Posted by: sy at December 31, 2005 07:53 AM

The number of Internet users is estimated at around 1 billion. This very size is at once a strength (it has become so much an infrastructure that if it's broken, we'll fix it) and a weakness (it's hard to move such a number of independent decision makers).

As a trained civil engineer, I would like to draw once more an analogy between the phone/internet network and railways/roads. For years in Europe, many doomsayers explained that the road network was 'broken' and pointed out that trains and airplanes are safer and more reliable. Did it really change the course of things?

Posted by: Jean-Philippe Papillon at January 2, 2006 06:40 AM
