Google announced the Google Chrome “Operating System” a few days ago, and the world I used to live in is abuzz with people talking about the earthquake this represents for the computing industry. TechCrunch says, “Google Drops a Nuclear Bomb on Microsoft.” Leo Babauta of Zen Habits has a more thoughtful response, but also subscribes to the “future of computing” theme: “Google is moving everything online, and I really believe this is the future of computing. The desktop model of computing — the Microsoft era — is coming to an end. It’ll take a few years, but it will happen.”
The basic idea is that the future is all about connectivity. All your data and apps will be in the sky (on servers that you can access from anywhere). And, according to Babauta, the way we use computers is also changing:
While the business world has long used Microsoft Word to create rich documents full of formatting and charts, the increasingly mobile world doesn’t care about any of that. We send emails and text messages and tweets and messages on Facebook and forums and other social media — with no formatting at all. We do blog posts that have bold and italics and links and photos and videos and not much more in terms of formatting text.
We don’t need feature-bloated Microsoft Word anymore. Nor Excel, with its 2 million features, nor PowerPoint (who likes to watch slides?). Sure, there are still some great desktop apps that people use, for photo and video editing and much more … but the majority of us don’t need those. We need to communicate simply and quickly, without hassle.
I’m sympathetic to this picture, in part because I work on several different computers at different times. I already store a lot of my information on the Internet (including all my law school notes, which are accessible to all my classmates); I use web apps where feasible, including Gmail, Google Calendar, Google Docs (for lightweight work), Google Reader, Google Sites, Flickr, this blog, and so on. But I think there are a couple of holes in this story.
The small hole is that Google Chrome OS just doesn’t add up for me (even though Google Chrome is my favorite browser). In the old parlance, it isn’t even an operating system; it’s what used to be called an “operating environment.” The OS is Linux; Google Chrome OS is just the layer on top that you can see and touch.
More importantly, Google Chrome OS looks to me like a crippled version of Ubuntu (a popular desktop flavor of Linux), which is also free and open source. I recently installed Ubuntu on an old laptop, and I mainly just use the browser (Firefox). But if I need or want to use other applications on the laptop itself, I can; for example, if I’m doing some blogging and I want to download some data into a spreadsheet, I can. (Don’t try telling me that Google Docs has a spreadsheet program that’s usable for anything other than adding and subtracting, at least not yet.) Or I can play MP3s via the MP3 player, instead of having to stream them from some web site. With Google Chrome OS, all I’ve got is the browser. The only compensating advantage I can think of is that it should start up faster than Ubuntu, but since Ubuntu rarely needs to be rebooted, that doesn’t matter much.
The big and more interesting hole is that all this “future of computing” talk should come with an asterisk, with a note saying: “Applies only to personal computing by consumers with limited needs.” The technology media tend to think that the state of computing is reflected in the tools that they use: netbooks, built-in 3G wireless, Gmail, Facebook, Twitter, Amazon, eBay, etc. But I would submit that the computing that really matters is the kind that goes on inside companies.
Brad DeLong recently cited Robert Allen on the central role of “productivity-raising machinery” in the transformation of the first industrial revolution into the second industrial revolution and a virtuous cycle of continuous improvement. In recent years, people have talked about computer technology playing a similar role in boosting productivity across the economy, although on a smaller scale. But if computers are going to increase productivity and thereby our standards of living, it is not (or not primarily) because they make it easy to create and view pictures of cats with misspelled captions, entertaining though that is.
And there, where it really matters, at least from a macro-economic perspective, the future of computing is a long, long way off. If you hypothesize a uniform unit of “work” done by computers used in businesses – and I mean useful work accomplished, not calculations performed – all of my own observations indicate that the vast majority of this work is still being done by old-fashioned mainframe computers, and most of the rest is being done by those much-hated “client/server” systems.
I’m willing to allow that in some areas, like manufacturing, computers have made possible huge amounts of productivity-increasing automation. But in the mass services industries that I’m more familiar with, like insurance, the efficiency gains provided by computers have been limited. Think about every time you have tried to do something simple at an airport – like change your frequent flyer number – and watched as the agent typed code after code after code. Or all the times your customer service representative couldn’t answer the simplest question on the phone. My company, which does something so boring no technology writer would ever dream of writing about it, has been pretty successful because we picked an industry that was vastly underutilizing or misusing technology, and we built systems that, while far better than anything our customers had before, are not at the bleeding edge of software.
The big problem in the way most large companies use computer technology is not the software per se; it’s the complexity of conceptualizing, designing, building, and testing huge applications involving millions of lines of code to manage business processes in new and better ways, when those new and better ways can only be dimly glimpsed from the perspective of the current situation. Planning and running these projects is something that most companies are admittedly bad at. The example closest to the usual themes of this blog would be the failure of the mortgage servicing companies to get their modification programs off the ground, in part because they can’t get the systems and processes in place quickly enough. But this happens over and over again in virtually all large companies.
And for me, this is actually a reason to be hopeful. The potential for information technology to improve productivity is still enormous, and we don’t need netbooks or cloud computing or new operating systems or quantum computers to get there. The hardware and software tools of today – or of five years ago – would work just fine, if companies could figure out how to apply and implement them successfully and repeatably. From the point of view of the economy, whether college students are using Windows, OS X, Linux, Android, or Google Chrome OS in five years really doesn’t matter.
I don’t think the people talking about Google Chrome OS would actually disagree with this; they’re just more interested in knowing what tools they will be using in the future. But given that our economy has to figure out some way to increase productivity growth for the long term, the important question is whether companies will figure out how to use existing technology more effectively. The information technology industry is very, very immature. It can improve a lot.
Thank you, thank you, thank you!! It’s about time somebody stood up and said the Internet wasn’t the center of the universe. Web-based apps might aid in collaborative efforts, and they make life easier as far as sharing pictures and class notes goes, but to say that “web-based is the future of computing” is beyond foolish. As you point out, “The technology media tend to think that the state of computing is reflected in the tools that they use…” It’s human nature to ascribe one’s own thoughts, desires, and tendencies onto others, and more often than not, that thinking is inaccurate. Shades of the “But everybody’s doing it!” argument from childhood.
For myself, I see far too many bugs in the system for it to be considered “THE Future”. The biggest is security – do people “really” think that some bored kid, spiteful ex-, or downright criminal couldn’t figure out a way of getting into your data? That happens often enough, even with that data safely tucked away on people’s hard drives. On-line back-ups? Given the cost of storage now, you could buy yourself external, terabyte-sized drives and not worry about getting to them. If your computer AND external drives are destroyed, I’d figure you had bigger problems than restoring. “House fire” comes to mind, as does “flood” and “mud-slide”. As for claiming it’s “The Future of Computing”, from your description it sounds more like a window manager than an OS.
Let’s learn how to make efficient use of what we’ve got before we go running after the “Next Big Thing”, please. Or are bells and whistles more important than the train we’re riding on?
The Google Chrome OS is taking a chapter right out of Facebook’s success story. This social, interactive computing is the way of the future, without the personal investment in productivity programs. You get freedom from viruses and from the cost of updating those programs. But best of all is the promise of instantly shared, collaborative results using a single online program. Bottom line is the amount of money and productivity time you save. Smart move, Google.