Monthly Archives: March 2007

Oracle sues SAP for “theft”

Wow. The Oracle-SAP battle just turned into a full-scale war. Reuters reports that Oracle today filed a suit against SAP accusing it

of gaining repeated and unauthorized access to [Oracle’s] password-protected customer support Web site. This allowed SAP to copy thousands of Oracle software products and other confidential materials onto its own servers to compile an illegal library of copyrighted software code, the lawsuit charged. “This case is about corporate theft on a grand scale …,” said the lawsuit, filed in United States District Court in San Francisco.

Here’s the full text of the 43-page lawsuit, which in addition to laying out the allegations in detail – Oracle claims SAP used the allegedly stolen code and documents to enable its TomorrowNow subsidiary to offer “cut rate support services” to users of Oracle applications as a means of getting them to shift over to SAP applications – provides an extensive discussion of the fierce competition between Oracle and SAP (from Oracle’s point of view, of course). Oracle is asking for a jury trial.

It will, to say the least, be very interesting to see SAP’s response.

Larry Dignan has more.

Is a “neutral” net anticompetitive?

The “net neutrality” debate is a complicated one (witness Google’s recent twists and turns). Take the very important issue of competition. On the surface, it would seem that those in favor of making net neutrality the law of the land are fighting the good pro-competition fight. By preventing telcos, cable operators, and other pipe owners from giving favorable treatment to certain forms of data – allowing, say, video from TV studios to flow faster than video from amateurs – a net-neutrality law would keep the playing field level for the little guys.

In theory, that’s true. In reality, it’s a little more complicated.

Net neutrality exists in the abstract, in the realm of protocol. Because the content of any packet of data is invisible to the pipe carrying it, by protocological fiat, every packet is treated the same. If that were all there was to it – if theory and reality were one – then pro-neutrality would mean pro-competition. But it’s not all there is to it. In addition to the abstract realm of protocol, there’s the very real – very physical – realm of infrastructure. Regardless of protocol, superior infrastructure provides superior quality of service – i.e., faster, more reliable transmission of data. To put it a different way, a company can buy a competitive advantage by buying (or renting) better infrastructure. So, for instance, if I have the money to contract with a caching company like Akamai to speed the delivery of my content, then I have an advantage over the saps who can’t afford such services.

As Akamai itself puts it: “Akamai’s technology … has transformed the chaos of the Internet into a predictable, scalable, and secure platform for business and entertainment. The Akamai EdgePlatform comprises 20,000 servers deployed in 71 countries that continually monitor the Internet – traffic, trouble spots and overall conditions. We use that information to intelligently optimize routes and replicate content for faster, more reliable delivery.” No wonder so many companies use services like Akamai’s – who wants to be stuck with all the little guys in the “chaos”?

Protocol is neutral. Infrastructure isn’t.

If net neutrality becomes law, it would prevent big companies from locking in an advantage at the protocological level – giving certain types of data privileged status – but it would allow big companies to lock in an advantage at the infrastructural level. And who has the best infrastructure? Well, Google, of course. Through billions of dollars in capital investments, it has created a kind of shadow internet for the express purpose of providing its content and services with an advantage in transmission speed and reliability. That’s great, because it means we all get our search results that much quicker. If the net were truly neutral, truly agnostic about what it carried, we’d spend a lot more time twiddling our thumbs.

But here’s the downside. As Google shifts into content and services businesses, that very expensive and very sophisticated infrastructure turns into a big entry barrier for would-be competitors. Think of the software-as-a-service (SaaS) market, for instance. For SaaS providers looking to serve businesses, the speed and reliability with which their applications run through the browser window are absolutely crucial to success. If I’m a small startup looking to compete against a Google (or a Microsoft or any other large company able to invest many hundreds of millions of dollars into its network), I start out at a big disadvantage at the infrastructural level. There’s no way in hell I can afford to build the kind of infrastructure that the big guys have. But perhaps I could, at a far lower cost, contract with the pipe owners to give my service privileged status. In this scenario, dismantling “net neutrality” (as commonly defined) could actually be pro-competitive by helping to counter the infrastructural advantages held by large companies. Embedding “net neutrality” into law would, by contrast, strengthen infrastructural advantages, creating ever larger barriers to entry over the long run.

I’m not trying to argue that protocological neutrality is a bad thing, and I’m certainly not suggesting that pipe owners should be trusted to promote competition. I’m just pointing out that it’s a dicey issue. Over the long haul, which would turn out to be more anti-competitive: a Net rendered non-neutral by protocol, or a Net rendered non-neutral by infrastructure? I don’t know. It’s a very good question to debate. But let the debate begin with an honest admission: The Internet is not neutral and never will be.

As for those who would look to politicians and lobbyists to maintain the net in its putatively Edenic state: Be careful what you wish for.

Two views of Web 2.0 in business

Some hard data is coming out this week on the adoption of Web 2.0 tools by companies. Yesterday, Forrester released some results from a December 2006 survey of 119 CIOs at mid-size and larger companies. It indicated that Web 2.0 is being broadly and rapidly brought into enterprises. Fully 89% of the CIOs said they had adopted at least one of six prominent Web 2.0 tools – blogs, wikis, podcasts, RSS, social networking, and content tagging – and a remarkable 35% said they were already using all six of the tools. Although Forrester didn’t break out adoption rates by tool, it did say that CIOs saw relatively high business value in RSS, wikis, and tagging and relatively low value in social networking and blogging.

Tomorrow, McKinsey will release the results of a broader survey of Web 2.0 adoption, and the results are quite different. In January 2007, McKinsey surveyed some 2,800 executives – not just CIOs – from around the world. It found strong interest in many Web 2.0 technologies but much less widespread adoption. McKinsey also looked at six tools. While it didn’t include tagging, it did include mashups; the other five were the same. It found that social networking was actually the most popular tool, with 19% of companies having invested in it, followed by podcasts (17%), blogs (16%), RSS (14%), wikis (13%), and mashups (4%). When you add in companies planning to invest in the tools, the percentages are as follows: social networking (37%), RSS (35%), podcasts (35%), wikis (33%), blogs (32%), and mashups (21%).

North American companies haven’t embraced Web 2.0 appreciably faster than companies in other countries, according to McKinsey. Although North American firms have been slightly more likely to invest in blogs and RSS, for instance, they’ve been slightly less likely to invest in social networks and wikis than their counterparts in some other regions. Perhaps the most surprising finding coming out of the McKinsey survey was that American companies are not poised to be the leaders in embracing Web 2.0 in coming years. If anything, they’re looking like laggards. Leading the way are Indian firms, 80% of which plan to increase their investments in Web 2.0 over the next three years, compared with 69% of Asia-Pacific firms, 65% of European firms, 64% of Chinese firms, 64% of North American firms, and 62% of Latin American firms.

In another sign of what the future holds for Web 2.0 in business, the Forrester survey found a clear preference among CIOs for buying a full suite of Web 2.0 tools from a large, established vendor. 74% of CIOs said they’d be more interested in investing in Web 2.0 if all the tools were offered as a suite, and 71% said they’d prefer the tools to be “offered by a major incumbent vendor like Microsoft or IBM [rather than] smaller specialist firms like Socialtext, NewsGator, MindTouch, and others.” Web 2.0 startups hoping to make inroads in the enterprise market, even among mid-sized firms, will continue to face big challenges, particularly as the larger vendors release their own suites of tools or incorporate them into existing products. You can bypass the CIO on a small scale, but it’s difficult to bypass the CIO when it comes time for a company to standardize on a particular product and vendor.

UPDATE: The McKinsey study is now available online.

Deneutralizing the net

Technology Review, which jumps on the Web 3.0 bandwagon in its current issue, reports that Stanford’s Clean Slate Design for the Internet program will be holding a coming out party this Wednesday. The interdisciplinary program seems to take the end of “net neutrality” as a given. Its thrust, in fact, is to make the Internet less Internety (at least as we’ve come to define the term) by redesigning it to be “inherently secure,” by making it possible to “determine the value of a packet … to better allocate the resources of the network, providing high-value traffic with higher bandwidth, more reliability, or lower latency paths,” and by “support[ing] anonymity where prudent, and accountability where necessary.”

Reports Technology Review:

The Internet may have revolutionized society, but [Stanford professor Nick] McKeown points out that there are still some basic things it doesn’t do well. There’s no reliable way of knowing whom data comes from, for example, because the Internet was designed in a way that makes it “ridiculously easy” to fake any information’s origin, McKeown says. It would be much easier to eliminate unsolicited e-mail messages if the sender could be verified because spammers could be quickly identified and prosecuted.

The intent of data can also be masked. Data packets that might look as though they were sent for a legitimate purpose could actually be intended to damage the network by spreading viruses or searching for secret information. When the Internet was first designed, “it was assumed that everyone would be well behaved, but we’re obviously in an era now where we can’t make that assumption,” McKeown says.

Commenting on the initiative, networking pioneer Bob Metcalfe goes even further, arguing that

there needs to be a way to ensure dedicated bandwidth. “The Internet was designed to get teletype characters echoed across the U.S. in under a half second,” Metcalfe wrote in an e-mail interview. “Soon we’ll have to handle [high-definition] video conversations around the world. The Internet must now allow bandwidth reservation, not just priority, to carry realtime, high-bandwidth communication – video in its many forms including video telephone.”

Maybe it will be the geeks rather than the suits who end up killing net neutrality.

Twitter dot dash

And so at last, after passing through Email and Instant Messaging and Texting, we arrive in the land of Twitter. The birds are singing in the trees – they look like that robin at the end of Blue Velvet – and the air itself is so clean you can see yourself in it.

Twitter is the telegraph system of Web 2.0. Like Morse’s machine, it limits messages to very brief strings of text. But whereas the telegraph imposed its limit through the market’s will – priced by the word, telegraph messages were too expensive to waste – Twitter imposes its limit through the iron law of code. Each message may include no more than 140 characters. As you type your message – your “tweet,” in Twitterese – in the Twitter messaging box, a counter lets you know how many characters you have left. (That last sentence wouldn’t quite have made the cut. It has 146 characters. Faulkner would have been a disaster as a Twitterer.)
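The counter’s arithmetic is as simple as the constraint itself. A toy sketch (the function name and messages are mine, not Twitter’s):

```python
TWEET_LIMIT = 140  # the "iron law of code": maximum characters per tweet


def chars_remaining(message: str) -> int:
    """Return how many characters are left before hitting the 140-character cap.

    A negative result means the message is over the limit and won't make the cut.
    """
    return TWEET_LIMIT - len(message)


# A short status leaves plenty of room...
print(chars_remaining("What are you doing?"))
# ...while an overlong one comes back negative, and Twitter rejects it.
print(chars_remaining("x" * 146))  # 6 characters over, Faulkner-style
```

The counter simply re-runs this subtraction on every keystroke.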

Only on the length of each message is a limit imposed. Because there’s no charge to send a message and no protocol governing the frequency of posting, you can send as many tweets as you want. The telegraph required you to stop and ask yourself: Is this worth it? Twitter says: Everything’s worth it! (If you’re sending or receiving tweets on your cell phone, though, you’d best have an all-you-can-eat messaging plan; Twitter is, among other things, a killer app for the wireless oligopoly.) You can also send each tweet to as large an audience as you want, and the recipients are free to read it via mobile phone, instant messaging, RSS, or web site. Twitter unbundles the blog, fragments the fragment. It broadcasts the text message, turns SMS into a mass medium.

And what exactly are we broadcasting? The minutiae of our lives. The moment-by-moment answer to what is, in Twitterland, the most important question in the world: What are you doing? Or, to save four characters: What you doing? Twitter is the telegraph of Narcissus. Not only are you the star of the show, but everything that happens to you, no matter how trifling, is a headline, a media event, a stop-the-presses bulletin. Quicksilver turns to amber.

Are you exhausted yet?

Dave Winer has succeeded in creating a New York Times feed through the Twitter service, as if to prove that everything is equal in its 140-character triviality. “All the news that’s fit to twit,” twitters Dave. The world is flat, and so is information.

my dog just piddled on the rug! :-) [less than 10 seconds ago]

Seventeen killed in Baghdad suicide bombing [2 minutes ago]

Oh my god I cant believe it I just ate 14 double stuff Oreos [3 minutes ago]

A conflicted Kathy Sierra explains why Twitter is so addictive. Boiled down to a couple of tweets, it goes like this: using Twitter presents us with the possibility of a social reward, while not using it presents us with the possibility of a social penalty – and the possibility of a reward or penalty is a far more compelling motivator than the reality of a reward or penalty. Look at me! Look at me! Are you looking?

Tara Hunt says, “Twitter is a representation of my stream of consciousness.” What used to happen in the privacy of the mind is now tossed into the public’s bowl like so many Fritos. The broadcasting of the spectacle of the self has become a full-time job. Au revoir, Jean Baudrillard, your work here is done.

Like so many other Web 2.0 services, Twitter wraps itself and its users in an infantile language. We’re not adults having conversations, or even people sending messages. We’re tweeters twittering tweets. We’re twitters tweetering twits. We’re twits tweeting twitters. We’re Tweety Birds.

I did! I did taw a puddy tat! [half a minute ago]

I tawt I taw a puddy tat! [1 minute ago]

Narcissism is just the user interface for nihilism, of course, and with artfully kitschy services like Twitter we’re allowed to both indulge our self-absorption and distance ourselves from it by acknowledging, with a coy digital wink, its essential emptiness. I love me! Just kidding!

The great paradox of “social networking” is that it uses narcissism as the glue for “community.” Being online means being alone, and being in an online community means being alone together. The community is purely symbolic, a pixellated simulation conjured up by software to feed the modern self’s bottomless hunger. Hunger for what? For verification of its existence? No, not even that. For verification that it has a role to play. As I walk down the street with thin white cords hanging from my ears, as I look at the display of khakis in the window of the Gap, as I sit in a Starbucks sipping a chai served up by a barista, I can’t quite bring myself to believe that I’m real. But if I send out to a theoretical audience of my peers 140 characters of text saying that I’m walking down the street, looking in a shop window, drinking tea, suddenly I become real. I have a voice. I exist, if only as a symbol speaking of symbols to other symbols.

It’s not, as Scott Karp suggests, “I Twitter, therefore I am.” It’s “I Twitter because I’m afraid I ain’t.”

As the physical world takes on more of the characteristics of a simulation, we seek reality in the simulated world. At least there we can be confident that the simulation is real. At least there we can be freed from the anxiety of not knowing where the edge between real and unreal lies. At least there we find something to hold onto, even if it’s nothing.

I did! I did taw a puddy tat!

SAP CEO calls SaaS “the better model”

SAP has changed its tune. The German software house, the leading maker of enterprise applications, has long looked down its nose at the idea of supplying software as a service over the Internet, arguing that companies will continue to buy complex programs and install them on their own machines in their own data centers. In February 2006, when SAP introduced a decidedly underwhelming on-demand version of its CRM application, it seemed less an endorsement of the SaaS concept than a dismissal of it. Commenting on the SaaS model, one SAP executive told Business Week, “Most customers are hitting a wall in terms of flexibility and the ability to integrate with other programs.” Last April, in a CNET interview in which he was asked about the challenge posed to SAP by SaaS suppliers like Salesforce.com, SAP CEO Henning Kagermann stressed his view that SaaS had limited applicability and value:

We have not changed our strategy … You can do this on-demand for certain areas and certain functions, but not for everything. Everybody starts with salesforce automation because it makes sense since it’s not very structured. It’s simple and more office-like. But the more you come from this type [of system] to the core of CRM [customer relationship management], the more difficult it will become to do it on-demand. People don’t want to share the data with others … I have spoken to many clients and they want to own [the software]. They are happy with this model.

What a difference a year makes. Last week, at the big Cebit trade show in Hanover, Kagermann heaped praise on the software-as-a-service model and introduced the company’s forthcoming suite of SaaS apps – codenamed A1S – as SAP’s future engine of growth. Kagermann calls the subscription-based suite, which spans not only CRM but also supply chain management (SCM) and SAP’s bread-and-butter enterprise resource planning (ERP), “game-changing.” It represents “a completely new model for us,” he said at Cebit. According to an InfoWorld report, he praised the SaaS suite as a much simpler and more business-friendly alternative to traditional installed systems:

“Once businesses have used the application, they’ll get the hang of it pretty quickly.” Kagermann expects even some senior managers of companies, who typically don’t spend much time with software issues, to show an interest and master many functions on their own. Traditionally, the software industry has developed applications and businesses have had to adapt their processes, according to Kagermann. “Now with A1S, users define their software requirements,” he said. [In addition to mid-sized and smaller companies,] Kagermann expects subsidiaries of large enterprises to be interested in the product.

Most remarkable were the comments Kagermann made about the SaaS suite in a Financial Times interview published on Friday:

Mr. Kagermann said investors were nervous as the new product was coupled with a new business model. While big groups buy SAP’s software for their offices, small companies will rent A1S and use it online. Installing databases meant the subscription model had big start-up costs. “People know this is the better model. But the upfront cost means few dare to introduce it,” he said. “You only start printing money later.”

Although Kagermann appears to have been referring to mid-sized and smaller companies, the fact that he would call SaaS “the better model” – both for customers and for his own company’s future profitability – represents a striking change of heart and of strategy. As Salesforce has demonstrated, the SaaS model, after proving its value in the mid-market, naturally moves up-market to ever larger companies – in classic disruptive-technology fashion. If SaaS is “the better model” for mid-sized companies today, how long will it take before it becomes the better model for big companies as well? Kagermann’s conversion, spurred no doubt by his firm’s recent earnings shortfalls, seems like a milestone on the road to the transformation of business software. The game, indeed, has changed.

UPDATE: Vinnie Mirchandani smells a rat.

Google should buy Intuit

Google has begun to nibble at the business market, introducing its $50-per-employee-per-year package of personal productivity applications, Google Apps, and buying up some little companies like JotSpot, a purveyor of corporate wikis. Now the time has come for the fast-growing company to take a bigger bite out of the enterprise pie. And the best way to do that would be to buy Intuit. I would argue, in fact, that there’s no company that provides a better immediate fit with Google than does the maker of QuickBooks, Quicken, and TurboTax.

Google’s strategy in the enterprise market is the same strategy pursued by most software-as-a-service companies: start by serving small and medium-sized companies, then work your way up into larger corporations. The core software program used by smaller companies is the bookkeeping and accounting application. Intuit’s QuickBooks holds the dominant position in this market and, as Larry Dignan notes today, Intuit has quietly been moving up into the middle market with its more powerful version of QuickBooks, QuickBooks Enterprise, which provides basic enterprise-resource-planning (ERP) functionality. As Dignan writes:

[QuickBooks executive Gary] Wiessinger says Intuit’s plan is to stay focused on mid-market companies looking for simplicity. The division started in 2001 following numerous customer surveys. What Intuit discovered was that more mid-market companies were maxing out QuickBooks as their firms grew. “From those surveys, we realized we had to put a focus on (QuickBooks Enterprise) so we launched enterprise solutions,” says Wiessinger. “Our original goal was to keep them in QuickBooks family. Now we’re focused on meeting needs of more complex companies by offering greater scale and power.”

What Dignan doesn’t mention is that Intuit also happens to be a major, if largely unacknowledged, player in the software-as-a-service business. It offers a SaaS version of QuickBooks called QuickBooks Online Edition, which has more than 85,000 subscribers (growing at a 50% annual clip), making it one of the most popular Web-based business apps. QuickBooks Online is a fairly rudimentary accounting program, though it has become steadily more sophisticated over the years with the addition of various features such as payroll management. Right now, it would be an ideal complement to Google Apps. Roll an accounting/payroll service into Google Apps, and you have a suite that literally covers all the software required by a whole lot of small businesses. And you have a strong base for moving up into the mid-market.

Intuit also, of course, has Quicken, the leading personal finance program, which is another natural for moving onto the Web. And it has the leading tax preparation program, TurboTax, which has already moved to the software-as-a-service model with the popular TurboTax Online. Weave this stuff together with Google Finance, run it on Google’s infrastructure, and you’ve got the makings of the dominant personal finance service.

If Google were to buy Intuit, it would also fulfill another of its core goals: annoying the hell out of Microsoft. You’ll remember that, back in 1995, Microsoft tried to acquire Intuit, only to be thwarted by the Justice Department’s antitrust regulators. (That marked the real beginning of Microsoft’s legal woes.) Even though, in the long run, Google could end up an even bigger monopoly than Microsoft, I don’t think it would run into antitrust problems if it tried to buy Intuit today. (If it waits, though, all bets are off.)

Bottom line: By acquiring Intuit, Google leaps to the forefront of the small-business software market and establishes a foundation for moving up-market with its software-as-a-service business suite, while at the same time gaining a big share of the personal finance sector just as it’s beginning a shift to the SaaS model. The acquisition is certainly do-able. Google’s current market cap is about $137 billion. Intuit, a nicely profitable company, has a market cap of just under $10 billion, with a little over $1 billion of cash on hand. And just a few months ago, Google and Intuit announced a major strategic alliance, so the lines of communication are certainly open. What’s not to like?

UPDATE: In the comments, Isaac Garcia notes that Intuit also has QuickBase, another popular software-as-a-service application for small businesses and corporate teams (which even includes a CRM module). Does anyone know how many companies currently use QuickBase?