Monthly Archives: August 2007

Virtualization gets personal

The fundamental promise of virtualization is that it breaks the lock between hardware and software, between the physical side of computing and the logical side. That lock, which most of us don’t even notice because we’re so accustomed to it, makes computing much more complicated, expensive, and inflexible than it needs to be.

Many big companies today are using virtualization to consolidate and automate their computer systems. Rather than being forced to dedicate stacks of hardware to particular applications, they can create, in essence, a multipurpose hardware infrastructure that can run all their applications, shifting resources among them as necessary. (That’s an overstatement, but it’s where virtualization is leading us.) That increases the capacity utilization of hardware enormously, reducing the amount of machinery you have to buy, and, by greatly simplifying the deployment of the hardware, it also reduces the number of people you have to employ to run your systems.

But virtualization can bring benefits to individuals as well. It promises, in fact, to make “personal computing” considerably more personal by separating what you do on your PC from the PC itself. For a preview of what that means, consider the personal virtualization software MojoPac, which the Financial Times tech columnist, Paul Taylor, reviews today. MojoPac, as Taylor explains, “lets you safely and securely separate software applications, files and settings from a PC and put them on to any USB device, such as an iPod, mini portable drive or flash memory stick. Instead of lugging a computer around, you load your desktop on to a pocket device. When you reach your destination, you plug it into [a PC] and your familiar home set-up is instantly in front of you.”

MojoPac, like similar products, has its limitations, but in Taylor’s tests it worked admirably well. “I am impressed,” he writes.

In MojoPac, we begin to see what happens to personal computing when it becomes separated from the personal computer. As storage devices get smaller, more capacious, and cheaper and as virtualization technologies advance, we’ll be able to carry our entire computing setup around on our keychain – including all our applications, data, and settings. Whenever we need to use a computing device, whether a PC, a mobile phone, a music player, a TV, or some nifty future gadget, we’ll plug in a little drive and – voila! – our personal setup will appear instantly. At the level of software, which is the level that matters for us, we’ll have just one device, even though it will take many physical forms. This won’t kill the PC, but it will make it far less important, far less central, than it has been.

But that’s only the start. Ultimately, we probably won’t even need a little storage drive to carry our setup around with us. All the required information will sit in a utility data center out on the Internet somewhere, in the “cloud,” and it will be served up automatically to whatever machine we happen to be using at any given moment. We’ll access it with a smartcard, or a password, or maybe a thumb print or a voice command. Our “PC” will float around with us, weightlessly, forever at our beck and call.

Rise of the wikicrats

It’s over. The Deletionists won.

“It’s like I’m in some netherworld from the movie Brazil, being asked for my Form 27B(stroke)6,” writes the media scholar and long-time Wikipedian Andrew Lih. He’s describing what it’s like these days to contribute to Wikipedia, the “encyclopedia that anyone can edit.” Lih recently noticed that Wikipedia lacked an article on Michael Getler, a reporter who now serves as ombudsman for the Public Broadcasting Service. Lih added a brief entry – a “stub,” in Wikipedia parlance – assuming that other contributors would flesh it out in due course. Within minutes, though, one of the site’s myriad wikicops had swooped in and marked Lih’s entry as a candidate for “speedy deletion,” citing the site’s increasingly arcane legal code:

It is a very short article providing little or no context (CSD A1), contains no content whatsoever (CSD A3), consists only of links elsewhere (CSD A3) or a rephrasing of the title (CSD A3).

Lih’s reaction: “What the… what manner of… who the… how could any self-respecting Wikipedian imagine this could be deleted? I’ve been an editor since 2003, an admin with over 10,000 edits and I had never been this puzzled by a fellow Wikipedian.” After some more digging, he discovered that the rapid deletion of new articles has become rampant on the site. Deletionism has become Wikipedia’s reigning ethic. Writes Lih:

It’s incredible to me that the community in Wikipedia has come to this, that articles so obviously “keep” just a year ago, are being challenged and locked out. When I was active back on the mailing lists in 2004, I was a well known deletionist. “Wiki isn’t paper, but it isn’t an attic,” I would say. Selectivity matters for a quality encyclopedia.

But it’s a whole different mood in 2007. Today, I’d be labeled a wild eyed inclusionist. I suspect most veteran Wikipedians would be labeled a bleeding heart inclusionist too. How did we raise a new generation of folks who want to wipe out so much, who would shoot first, and not ask questions whatsoever? It’s as if there is a Soup Nazi culture now in Wikipedia. There are throngs of deletion happy users, like grumpy old gatekeepers, tossing out customers and articles if they don’t comply to some new prickly hard-nosed standard.

But, given human nature, is it really so “incredible” that Wikipedia has evolved as it has? Although writers like Yochai Benkler have presented Wikipedia as an example of how large-scale, volunteer-based “social production” on the Internet can exist outside hierarchical management structures, the reality is very different. As Wikipedia has grown, it has developed a bureaucracy that is remarkable not only for the intricacies of its hierarchy but for the breadth and complexity of its rules. The reason Deletionism has triumphed so decisively over Inclusionism is pretty simple: It’s because Deletionism provides a path toward ever more elaborate schemes of rule-making – with no end – and that’s the path that people prefer, at least when they become members of a large group. The development of Wikipedia’s organization provides a benign case study in the political malignancy of crowds.

“Gone are the days of grassroots informality,” writes a saddened Lih in another post. “Has the golden age of Wikipedia passed?”

Maybe the time has come for Wikipedia to amend its famous slogan. Maybe it should call itself “the encyclopedia that anyone can edit on the condition that said person meets the requirements laid out in Wikipedia Code 234.56, subsections A34-A58, A65, B7 (codicil 5674), and follows the procedures specified in Wikipedia Statutes 31 – 1007 as well as Secret Wikipedia Scroll SC72 (Wikipedia Decoder Ring required).”

My, what a friendly ad

Is Larry Page now writing headlines for CNET?

Google’s YouTube, copying the “ticker ad” concept that VideoEgg introduced nearly a year ago, yesterday announced that it is slapping advertisements across the bottom of some of its videos. In a blog post titled “You Drive the YouTube Experience” (yeah, I’ve been dying to have ads injected into the videos I watch), the company says that the ads are “animated overlays that appear on the bottom 20 percent of a video. If you’re interested by what you see there, clicking on the overlay launches a deeper interactive video ad that we think is relevant and entertaining.” Sweet!

Now, obviously, it’s always been inevitable that YouTube would incorporate advertising into the videos it plays – whether or not Google acquired it. YouTube is not a public service; it’s a business. What gets me, though, is not just the patronizing spin that Google is putting on the news – “as always,” its announcement concludes, “we’re looking to improve the experience with you in mind” – but the way some respectable news organizations are echoing the company’s nonsense. Here, for instance, is the headline CNET is running on its story:

YouTube tests viewer-friendly ad format

What? Viewer-friendly? Is it viewer-friendly because it’s arguably less annoying than having an ad run in advance of a video? That’s like saying that being hit on the head once with a hammer is a pleasant experience because it’s not as bad as being hit on the head twice with a hammer.

I liked the reaction of the first viewer to leave a comment on the YouTube blog: “yuck.” If you’re going to stick ads on the videos, go ahead and stick ads on the videos. But, please, don’t tell us you’re doing it on our behalf. We’re not idiots.

UPDATE: CNET has changed the headline on its story to:

YouTube tests 10-second ad format

Invite bonanza: Pownce, Freebase, iMedix

UPDATE: Pownce invites are gone.

Somehow or other, I have managed to assemble a small pile of invitations for joining the communities of Pownce, Freebase, and iMedix. Because I consider everyone who visits Rough Type to be my friend, in the Web 2.0 sense of that term, I’m going to give them away, first come, first served. To request one, send an email to:

iloveroughtype @ mac . com

and put the name of the desired site in the subject field (one site per entrant). If you win, you’ll receive your invitation directly from the site. When the invites have all been given away, the above email address will be terminated and I will put a notice on this page.

Good luck, my friends.

Skype and the hedge fund problem

Hedge funds work on a simple wisdom-of-the-crowd principle: Because financial markets involve an extremely large number of transactions, which smooths out the vagaries of any individual transaction, their movements follow predictable patterns, which can be discerned from a study of their past behavior. Deviations from the patterns tend to be short-lived, and by making huge bets that the deviations will quickly return to the norm, you can make a whole lot of money. As we’ve seen recently, though, things aren’t quite as simple as the hedge fund operators assume. Sometimes, very weird things happen and the deviations become either larger or longer-lived than expected, at which point the big bets can unravel in very unpleasant ways.
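The mechanism can be caricatured in a few lines of code. This is a toy sketch, not any fund’s actual model: a spread reverts toward its norm, the fund bets with leverage that it will, and collects a small carry for doing so. Every name and number here is invented for illustration; the only point is what a fat-tailed “weird day” does to a leveraged bet on reversion.

```python
import random

def simulate(days=2000, kappa=0.2, leverage=50, carry=0.05,
             shock_prob=0.0, seed=0):
    """Toy convergence trade. A price spread `s` usually reverts toward
    zero (an AR(1) process with reversion speed kappa). Each day the fund
    bets, with leverage, that the spread will shrink, and pockets a small
    `carry` for holding the position. With probability `shock_prob`, a
    "weird" day occurs and the shock is fifteen times its normal size."""
    rng = random.Random(seed)
    capital, s = 100.0, 0.0
    for _ in range(days):
        shock = rng.gauss(0, 1)
        if rng.random() < shock_prob:
            shock *= 15                      # fat-tail event
        s_new = (1 - kappa) * s + shock
        # Gain when the deviation shrinks, lose when it widens;
        # leverage magnifies both.
        capital += carry - leverage * 0.1 * (abs(s_new) - abs(s))
        if capital <= 0:
            return 0.0                       # margin call: wiped out
        s = s_new
    return capital
```

With `shock_prob=0.0` the fund grinds out its carry, day after day, exactly as the historical patterns promise. Turn the fat tails on and a single outsized deviation can wipe the capital out before the spread ever gets a chance to revert.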

Peer-to-peer networks, which also involve lots and lots of different actors doing lots and lots of different things for lots and lots of different reasons, work in a similar way, and a company like Skype, whose telephone network is designed to run on many thousands of computers spread across a big P2P network, has built its system on the assumption that usage patterns are predictable – even if the actions of any individual user are not. Last week, Skype ran head-on into the hedge fund problem. The network’s behavior deviated from the norm in a way that was greater than the Skype engineers had planned for, and the system crashed. The catalyst was the distribution of a routine patch from Microsoft, which led to a cascade of unanticipated effects, as Skype’s Villu Arak explains:

The Microsoft Update patches were merely a catalyst — a trigger — for a series of events that led to the disruption of Skype, not the root cause of it … The high number of post-update reboots affected Skype’s network resources. This caused a flood of log-in requests, which, combined with the lack of peer-to-peer network resources at the time, prompted a chain reaction that had a critical impact. The self-healing mechanisms of the P2P network upon which Skype’s software runs have worked well in the past … Unfortunately, this time, for the first time, Skype was unable to rise to the challenge and the reasons for this were exceptional. In this instance, the day’s Skype traffic patterns, combined with the large number of reboots, revealed a previously unseen fault in the P2P network resource allocation algorithm Skype used. Consequently, the P2P network’s self-healing function didn’t work quickly enough. Skype’s peer-to-peer core was not properly tuned to cope with the load and core size changes that occurred on August 16.
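Arak’s chain of events – a synchronized wave of reboots, a flood of log-in requests, a self-healing mechanism that couldn’t keep up – can be caricatured in a few lines. The sketch below is purely illustrative (the numbers and the retry rule are invented; this is not Skype’s algorithm): a modest backlog of log-ins drains quickly, but past a threshold, client retries add load faster than the network can serve it.

```python
def drain_time(backlog, capacity, timeout):
    """Ticks needed to clear a backlog of login requests when the network
    can serve `capacity` requests per tick. Clients that have waited past
    `timeout` ticks re-send their request, so a big enough backlog feeds
    on itself instead of draining."""
    queue, t = backlog, 0
    while queue > 0 and t < 1_000:
        queue -= min(queue, capacity)    # serve what we can this tick
        if queue > capacity * timeout:   # waits now exceed the timeout...
            queue += queue // timeout    # ...so retries pile onto the queue
        t += 1
    return t if queue == 0 else None     # None: the network never recovers

drain_time(10_000, 2_000, 10)    # a small surge: clears in 5 ticks
drain_time(100_000, 2_000, 10)   # a mass reboot: retries outrun capacity
```

The same self-healing logic that handles the small surge gracefully diverges on the large one – which is the hedge fund problem in miniature: the system is tuned for the usual distribution of load, not the exceptional one.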

As our economy becomes ever more tightly and intricately networked, its continued operation will hinge on the assumptions that mathematicians and software engineers embed in the code that underpins it. Usually, the assumptions will hold. But usually isn’t always. Weird things happen, even in the largest of crowds.

Long player: bonus track

A while back, in the post Long player, I disputed David Weinberger’s contention, in his book Everything Is Miscellaneous, that the vinyl record album was a purely economic contrivance and that we purchased and listened to albums not “for artistic reasons,” as we had assumed, but only “because the economics of the physical world required it: Bundling songs into long-playing albums lowered the production, marketing, and distribution costs because there were fewer records to make, ship, shelve, categorize, alphabetize, and inventory.” The form of the album was actually created, I argued, to expand both the artistic canvas and the supply of recorded music, and, indeed, its arrival unleashed a remarkable flood of creativity in popular music while also vastly expanding the supply of recordings, to everyone’s benefit.

In recently rereading Marshall McLuhan’s classic Understanding Media – insanely brilliant, with an equal emphasis on both words – I came across a brief passage in which McLuhan describes how the LP album spurred a burst of creativity in jazz as well as pop:

… the l.p. record suddenly made the phonograph a means of access to all the music and speech of the world … With regard to jazz, l.p. brought many changes, such as the cult of “real cool drool,” because the greatly increased length of a single side of a disk meant that the jazz band could really have a long and casual chat among its instruments. The repertory of the 1920s was revived and given new depth and complexity by this new means.

McLuhan’s book was published in 1964, a couple of years before rock musicians would realize that the LP form allowed them a way to extend their creativity beyond the individual track. Well before what we now recognize as the golden age of the album, the LP was viewed as a liberating technology, for musician and listener alike, not as a means of constraining choice and oppressing music fans.

The end of ERP?

As the founder and leader of PeopleSoft, Dave Duffield played a seminal role in establishing enterprise resource planning, or ERP, systems as the IT engines of big business. But then, in a hostile takeover, the enterprise software giant Oracle yanked PeopleSoft out of Duffield’s hands. Now, Duffield’s back in town, and he’s gunning for ERP.

It’s the Shootout at Enterprise Gulch.

Today, Duffield’s new company, Workday, is announcing an expansion of its suite of software-as-a-service business applications to include not only human resource management – its original offering – but also a set of financial management services, including accounts payable and receivable, general ledger, and reporting and analysis. The integrated suite, which is being offered in beta form and will be further fleshed out in coming months, provides, Duffield’s deputy Mark Nittler told me, “the first alternative to ERP.”

It’s an alternative to ERP, rather than a Web-delivered version of ERP, argues Nittler, because the system’s software guts are entirely different. Rather than being tightly tied to a complex relational database, with thousands of different data tables, running on a separate disk, the Workday system uses a much simpler in-memory database, running in RAM, and relies on metadata, or tags, to organize and integrate the data. Having an in-memory database means that the system can run much faster (crucial for Web-delivered software), and using metadata rather than static tables, says Nittler, gives users greater flexibility in tailoring the system to their particular needs. It solves ERP’s complexity problem – or at least it promises to. (For more on the nuts and bolts, see David Dobrin’s whitepaper and Dan Farber’s writeup.)
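The metadata idea Nittler describes can be sketched in a few lines. This is a caricature, not Workday’s design – the class and its methods are invented for this post: records live in RAM as bags of tags, and the tags themselves stand in for the schema, so adding a new attribute requires no table migration.

```python
class MetadataStore:
    """Toy in-memory, metadata-driven record store: each record is a
    dict of tag->value pairs, and queries match on tags rather than
    on columns of a fixed relational table."""

    def __init__(self):
        self.records = []          # everything lives in RAM

    def add(self, **tags):
        self.records.append(tags)

    def query(self, **criteria):
        """Every record whose tags match all the criteria."""
        return [r for r in self.records
                if all(r.get(k) == v for k, v in criteria.items())]

# Payables, receivables, and the ledger coexist without separate tables;
# a record with a brand-new tag would need no schema change at all.
store = MetadataStore()
store.add(kind="invoice", ledger="payable", amount=1200)
store.add(kind="invoice", ledger="receivable", amount=800)
store.add(kind="journal_entry", ledger="general", amount=50)

payables = store.query(kind="invoice", ledger="payable")
```

A real system would add indexes and persistence, but the flexibility claim is visible even here: the “schema” is just whatever tags the records carry.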

So what are the odds that Duffield’s Workday will come out on top once the dust has settled in Enterprise Gulch? The odds are long. But Workday has three things going for it. First, it has the widely admired Duffield, who gives the upstart immediate credibility with customers, investors, and programmers. Second, it has a technological head start. There are reasons to believe that the secret new system, codenamed A1S, being developed by SAP, the biggest ERP provider, will resemble what Workday is doing, with an in-memory database and heavy use of metadata, but SAP is moving slowly, weighed down with the baggage of the past. Third, Workday is adopting a strategy of patience and steady gains. It’s targeting mid-sized companies that have not yet implemented full ERP systems – a rich market that’s also being targeted by SAP, Oracle, and Microsoft, among other mainstream software houses. The ERP virgins, who well know the costs, complexities, and risks of installing an ERP system on their own hardware, have good reason to give careful consideration to a software-as-a-service offering like Workday’s, which runs in a browser and requires little in the way of upfront capital investments. The middle market offers Workday a means of establishing a toehold before it moves upward to the big-company market, where it will actually have to displace installed systems – a tall order, indeed.

Salesforce.com’s marketing slogan has long been “The End of Software.” Workday’s pitch sounds like “The End of ERP.” Whether or not Workday itself succeeds in its battle against the behemoths, we already see in its innovative system the outlines of the post-ERP era of enterprise computing.