If you’re not familiar with the concept, a Web OS is essentially a computer inside a Web browser: a complete operating system virtualized in the browser, through which you access cloud-based applications.
Don’t get me wrong – running a computer inside a Web browser inside a computer is an impressive technical feat, and a pretty cool one at that. But that’s all it is. The problem is applications.
That is, in order to access the applications – word processing, spreadsheets, photo cataloguing, etc. – you have to A) use your computer’s OS to B) use your Web browser to C) use the Web OS to D) use the application you really wanted in the first place. Compared to most cloud applications, which take three steps (computer OS to browser to Web app), or desktop applications, which take two, this is a woefully inefficient way of doing things.
The advantage of a unified operating system on the Web, of course, is interoperability – being able to copy spreadsheet data and put it in a word processing document, etc. However, Google seems to be able to do this without the Web OS intermediary. Gmail integrates with Google Docs, which integrates with Google Spreadsheets, etc.
So right now, any so-called “Web OS” is less useful than no Web OS at all. It’s just another thing standing between you and the application you really want.
There is still potential in this model, however. The problem is that these Web OSes are very limited in the types of applications that will run on them. The main selling point of a desktop OS is breadth: Windows will let you run any Windows software that exists, Mac OS X will let you run any OS X app, and Linux will let you run any Unix app.
Why would you use a service that lets you run maybe 15 applications, when your desktop OS lets you run millions?
Cloud computing has already accomplished an amazing feat: about 80% of our computing can be done in the cloud, and about 80% of computer users could use cloud apps exclusively day-to-day. The 80/20 rule, if you will, strikes again. The trick for cloud apps now is reaching the “long tail” – the remaining 20% – by creating apps for the minority of users, rather than the majority.
I can see Web OSes solving this problem in one of two ways. The hard way would be for all these little startups to standardize on a single platform for app development and create millions of Web apps to rival commercial desktop apps. At the same time, they would be competing with Google and Microsoft, who plan to develop Web apps – or already have – without those standards. Again: this is the hard way.
The easy way would be for a Web OS to act as “middleware” that makes already-developed desktop applications work on the cloud: a Web OS that came with some basic features, but which let me run, say, anything in Debian’s APT packaging system by recompiling it for the cloud and running it as a cloud app. Sure, some apps would be too “chatty” to be worthwhile over an Internet connection, but those could be revised in later versions to be less chatty and more responsive over high-latency links. (Until then, monitoring how many round trips each app makes is a good policy.)
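To make the “chatty” point concrete, here’s a minimal sketch (the function and all the numbers are made up for illustration, not taken from any real Web OS) of why round trips matter: each sequential round trip adds at least one full link round-trip time to the wait for an action to complete.

```python
# Rough latency model: an action that needs N sequential round trips
# over a link with round-trip time rtt_ms waits at least N * rtt_ms.
def added_delay_ms(round_trips: int, rtt_ms: float) -> float:
    return round_trips * rtt_ms

# A "chatty" app: 40 round trips per save on a 100 ms link...
print(added_delay_ms(40, 100))  # 4000.0 ms of waiting
# ...versus the same save batched into 3 round trips.
print(added_delay_ms(3, 100))   # 300.0 ms
```

The same app on a LAN barely notices 40 round trips; over the Internet it feels broken – which is why batching matters more for the cloud than raw bandwidth does.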
There are advantages to such a setup: running Web apps is less bandwidth-intensive than traditional remote desktop virtualization (where the entire screen output gets pushed down the network tubes), and there’s certainly an advantage in providing app support only through a Web interface, taking no responsibility for the desktop OS on the client’s end.
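The bandwidth gap can be sketched with a back-of-envelope comparison (all figures here are assumed for illustration): a remote desktop streams pixels continuously, while a Web app sends only small document updates.

```python
# Back-of-envelope bandwidth comparison (illustrative numbers only).

def remote_desktop_kbps(width: int, height: int, fps: float,
                        bits_per_pixel: float) -> float:
    # Streaming the screen: every frame, every pixel, even compressed.
    return width * height * fps * bits_per_pixel / 1000

def web_app_kbps(update_bytes: int, updates_per_sec: float) -> float:
    # A Web app: occasional small state updates instead of pixels.
    return update_bytes * 8 * updates_per_sec / 1000

# 1024x768 screen at 10 fps, ~0.1 bits/pixel after compression:
print(remote_desktop_kbps(1024, 768, 10, 0.1))  # ~786 kbps
# A 2 KB update once per second:
print(web_app_kbps(2048, 1))                    # ~16 kbps
```

Even with generous compression assumptions, the pixel stream costs tens of times the bandwidth of the Web app’s updates, which is the whole argument for the Web interface.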