Everybody’s appy nowadays

The soon-to-be-disappeared Sun Microsystems had a knack for prescient slogans. “The network is the computer” has come true. And then there was “write once, run anywhere,” which heralded the age of universal software applications. Rather than tailoring their programs to run on a particular type of computer – an IBM mainframe, say, or a Windows PC – programmers would use a language like Sun’s Java that was adaptable to any computer. It was a liberating idea: Software developers and users would no longer be locked into one operating system, and beholden to the owner of that system.

And it came to pass. The Web, a universal medium built on device-agnostic standards, sped the embrace of the “write once, run anywhere” ethic. The idea of tethering an app to an OS came to seem kind of absurd. All was good in the land of software.

And then Apple opened its iPhone app store, and in a Cupertino minute everything changed. Suddenly, the idea of tethered software seemed normal again. (Ironically, when Apple was struggling to survive in the 90s, the Web’s run-anywhere ethic had served as an important lifeline for the company, reducing the importance of Microsoft’s control of the PC software market.) Proprietary app stores are popping up everywhere, as device makers and social network operators seek to extend the usefulness of their gadgets and sites and at the same time strengthen their ability to lock in customers. Just last week, Amazon announced it would be opening an app store for its Kindle e-reader.

The rise of the app store comes as the nature of personal computing applications is changing. Some of the apps sold for the iPhone and other devices are old-fashioned, self-contained programs, drawing on data stored in the device itself. But most of them are what might be called “cloud translators.” They serve as software gateways between the Internet and the device. They tap into stores of data that exist out in the Net’s cloud, from maps to message streams, and they tailor that data for some practical use geared to the device’s form and interface.
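To make the “cloud translator” pattern concrete, here’s a minimal sketch of the shape such an app takes: pull a stream of data out of the Net’s cloud, then reshape it for a small screen. The endpoint and the Message type below are hypothetical stand-ins, not any particular app’s API.

```typescript
// Minimal sketch of a "cloud translator": fetch data that lives in the
// cloud and reshape it for a small-screen device. The endpoint and the
// Message shape are hypothetical stand-ins, not a real service's API.
interface Message {
  author: string;
  text: string;
  postedAt: string; // ISO 8601 timestamp
}

async function fetchRecentMessages(feedUrl: string): Promise<Message[]> {
  const response = await fetch(feedUrl);
  if (!response.ok) {
    throw new Error(`Feed request failed: ${response.status}`);
  }
  return (await response.json()) as Message[];
}

// Tailor the cloud data to the device: keep only the newest few items
// and truncate each one for a narrow display.
function formatForPhone(messages: Message[], maxItems = 5): string[] {
  return messages
    .slice(0, maxItems)
    .map((m) => `${m.author}: ${m.text.slice(0, 80)}`);
}

fetchRecentMessages("https://example.com/api/messages")
  .then((msgs) => formatForPhone(msgs).forEach((line) => console.log(line)))
  .catch((err) => console.error(err));
```

The device-specific part is the tailoring at the end; the data itself never leaves the cloud.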

So is Sun’s “write once, run anywhere” ethic as doomed as Sun itself? Is the proprietary app store model the new normal? Farhad Manjoo, writing in Fast Company, doesn’t believe so. Although “companies increasingly see it as the future model for all software distribution,” he argues, “the app bandwagon” may soon hit “a dead end.” The universality of the Web, he says, will once again win the day: “The App Store’s true rival isn’t a competing app marketplace. Rather, it’s the open, developer-friendly Web. When Apple rejected Google Latitude, the search company’s nearby-friend-mapping program, developers created a nearly identical version that works perfectly on the iPhone’s Web browser.”

“You’d be a fool,” concludes Manjoo, “to ignore the long-term trend in software – away from incompatible platforms and restrictive programming regimes, and toward write-once, run-anywhere code that works on a variety of devices, without interference from middlemen.”

He may well be right. The advantages of run-anywhere software, and of the browser as a universal platform, remain strong. And yet you’d also be a fool to ignore a different trend: people’s retreat from the open Web and their embrace of private networks, whether run by device makers like Apple or site operators like Facebook. The battle between universal software and proprietary apps is also, in other words, a battle between two models for the future of personal computing. While both models will almost certainly survive, only one will be the dominant model. The question that will be answered in the years ahead is this: Is write-once-run-anywhere the destiny of software, or was it an anomaly?

9 thoughts on “Everybody’s appy nowadays”

  1. Niraj

    The war has moved from Personal Computing to Cloud Computing. The Cloud is going to be the ultimate lock-in. Just look at Google’s App Engine, Microsoft’s AppFabric, Salesforce’s Force.com, or Amazon’s SDB and SQS.

    Seems like Manjoo is in the situation MSFT was in with the browser war. They won the browser battle, but the war moved elsewhere on the internet (i.e., search).

  2. KiltBear

    Well, if the network can be the computer, then the data can be the application.

    There are many, many Twitter and Facebook apps, but still only one Twitter or Facebook data repository.

    It’s that data repository that MAKES the application.

    It is the app that makes the experience.

    We constantly move back and forth between local computer power and network/server compute power based on the ratio of cost to compute cycles. Right now, and probably for a long time to come, local compute cycles give an edge and a better user experience than many web-based interactions. That will change back and forth depending on technologies and the capabilities of protocols.

    It is the local GPS position being available to Latitude running in MobileSafari that makes it work.
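    As a minimal sketch, this is roughly what reading that local GPS fix looks like from a page running in the mobile browser, via the standard W3C Geolocation API (the showNearbyFriends helper is just a hypothetical placeholder for whatever the app does with the position):

    ```typescript
    // Sketch: reading the device's GPS fix from a web page via the
    // W3C Geolocation API. showNearbyFriends is a hypothetical
    // placeholder, not part of any real app.
    function showNearbyFriends(lat: number, lon: number): void {
      console.log(`Centering map on ${lat}, ${lon}`);
    }

    if ("geolocation" in navigator) {
      navigator.geolocation.getCurrentPosition(
        (position) => {
          const { latitude, longitude } = position.coords;
          showNearbyFriends(latitude, longitude);
        },
        (error) => console.error(`Could not get a fix: ${error.message}`),
        { enableHighAccuracy: true, timeout: 10_000 }
      );
    }
    ```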

  3. dmarti

    It looks like “write once, run anywhere” is happening but on the development end, not the deployment end. We aren’t getting applications that will install on anything, but we do have the option of choosing development tools that will let us build iPhone applications without writing iPhone-exclusive code.

    Develop iPhone apps in Flash

    Develop cross-platform iPhone/Android apps

    Cross-platform might not work for high-performance games, but it’s probably good enough for casual gaming or business apps. So, developers aren’t as locked in as they were in the PC software market.
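    As a rough sketch of that development-end idea, assuming no particular toolkit: the application logic is written once against a small device interface, and only thin per-platform adapters get rewritten (all names below are hypothetical):

    ```typescript
    // Sketch of "write once" on the development end: shared app logic
    // coded against a small device interface, with thin per-platform
    // adapters. All names are hypothetical; no real toolkit's API is implied.
    interface Device {
      vibrate(ms: number): void;
      show(text: string): void;
    }

    // Shared, platform-neutral application logic.
    function notify(device: Device, message: string): void {
      device.vibrate(200);
      device.show(message);
    }

    // The adapters are the only pieces that change per platform.
    const iphoneAdapter: Device = {
      vibrate: (ms) => console.log(`[iPhone] vibrate ${ms}ms`),
      show: (text) => console.log(`[iPhone] alert: ${text}`),
    };

    const androidAdapter: Device = {
      vibrate: (ms) => console.log(`[Android] vibrate ${ms}ms`),
      show: (text) => console.log(`[Android] toast: ${text}`),
    };

    notify(iphoneAdapter, "You have a new message");
    notify(androidAdapter, "You have a new message");
    ```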

  4. Barry Kelly

    This dichotomy is false; it is temporal, rather than fundamental. Software is written targeting a particular abstraction. When hardware is slow, the abstraction can’t be thick; so the software targets the hardware, more or less directly.

    When hardware gets faster, cheaper and lower power, as it always does over time, the abstraction level can rise up. Rich web applications are only performant because the hardware is fast enough to run masses of code translating between the high-level application description and the processing of I/O at the hardware level. In effect, the software targets a virtual, or software, machine.

    Sun’s Java, with its JVM (Java Virtual Machine), was a very old idea coming in at the right time: a little too early for the desktop, but just about right for the server side.

    With mobile, the little gadgets people keep in their pockets are now powerful enough to run decent custom software, with network connectivity at reasonably fast speeds to make them compelling. But the hardware isn’t fast enough to support a really excellent software-level abstraction (and for games, it rarely ever is). So the focus on these devices is more toward native applications, which are necessarily not run-anywhere.

  5. Laurent

    Definitely agree with Barry: the phenomenon is temporal. I would also add that consumers will push for a reduction in the number of standards.

    If you recall the early days of home computing (before the PC), there were gazillions of software formats. Two computer models from the same vendor might require mutually incompatible software (e.g. the Commodore 64 and the Vic-20).

    But consumers grew tired of buying a computer and waiting for the software they wanted to be available for it. They started looking for a system which would have *all* the applications they wanted. IBM (unintentionally) helped with its PC. In the end, there are only two desktop systems left: Windows and MacOS X.

    That’s what Google is trying to achieve with Android. Time will tell if Android prevails, if the iPhone sweeps the market like Windows swept the desktop or if another solution wins (like Web apps becoming powerful enough to replace a lot of apps). But eventually there is no room for all the systems out there.

  6. Nick Carr

    Barry,

    Thanks for the perceptive comment. I think the tendency you point out is a very strong one, and may indeed rule the day, but I’m not convinced it renders the dichotomy “false.” There are other forces at work, including changes in consumers’ perception of personal computing and the Internet. In the battle between (open) Google and (closed) Apple to shape the future of personal computing, for instance, I would not rush to bet against Apple.

    Nick

  7. Cloudy

    Consider the (abridged) history of Java on the desktop: right out of the gate, Java let you do write-once, run-everywhere GUI application development (at a time when consumer-targeted GUI applications were still hot hot hot). The appeal was pretty thin in practice, however, as Java started off both too slow and too ugly to compete with actual native applications; it was only the widespread adoption of server-side Java that kept the platform alive.

    As of the past few years, Java on the desktop has closed the gap: CPUs are worlds faster, Java itself is faster, the ugliness problem is solved, etc.; you can now do high-quality write-once, run-everywhere desktop GUI application development in Java…

    …but Java got there “too late”, in a sense: by the time it caught up on the desktop the real action had moved to web apps, and so the “open desktop” technology never really caught on.

    I suspect the current crop of open web technologies (HTML5, etc.) is going to repeat that history, or at least rhyme with it.

    I see it playing out as HTML5 (et al) successfully closing the gap between what web apps can do and what desktop applications can do (with the usual caveats about CAD, numerics, etc. applying). This will happen over the next 5 years or so.

    However, by that point the major gap is going to be between the user experience offered by desktop/web apps and the user experience offered by, e.g., the iPhone, iSlate, and other multitouch interfaces.

    I don’t see HTML5 as proposed being adequate to implement the kinds of UI people will be expecting from quality products in the near future. I also don’t think HTML is the right standard to build on: it’s a text layout language with some hooks for user interaction and some hooks for communicating with actual programs; where we’re going, text documents aren’t going to be the right metaphor.

    So the open world needs to either settle on a single standard API (unlikely) or develop some open standard for specifying interactive front-ends that can adequately describe (and implement!) a slick interactive multitouch application… or perhaps the open world lucks out and people wind up not wanting the slick multitouch UI most of the time.
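    For a sense of the gap, here is a minimal sketch of the raw material the open web currently offers for multitouch: low-level touch events wired to an element by hand, with gestures and polish left entirely to the application (the element id below is hypothetical):

    ```typescript
    // Sketch: the low-level touch events the browser exposes. Everything
    // above this (gestures, physics, slickness) is the application's
    // problem. The element id is a hypothetical placeholder.
    const surface = document.getElementById("drawing-surface");

    if (surface) {
      surface.addEventListener("touchmove", (event: TouchEvent) => {
        event.preventDefault(); // keep the page from scrolling mid-drag
        for (let i = 0; i < event.touches.length; i++) {
          const touch = event.touches[i];
          console.log(`finger ${i}: ${touch.clientX}, ${touch.clientY}`);
        }
      });
    }
    ```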

  8. Neil Taggart

    The dichotomy is true, but it’s just part of the ebb and flow of IT standards and capabilities.

    Apple lost out the first time round because they tried to be too proprietary with their hardware, and were swarmed by the less-integrated-but-more-open PC clones.

    This time round mobile devices suit an app-centric view, and Apple were the first to exploit it and fence it. But the trouble with fences is that it’s hard to expand them. My guess is that Apple will not be able to keep up with the variety of hardware device needs in a massively expanding market, and the ‘run anywhere’ equilibrium will return.

    Social media is a different phenomenon because the loyalty is group-based: I won’t leave unless the bulk of my friends do. But every party has to end…!
