One of the recurring themes I keep seeing is that the adoption of cloud computing is being driven by our current sucky economic circumstances. I don’t disagree that the macro-economy is a *contributing* factor in cloud adoption, but I really don’t think it’s the primary driver. Rather, I think cloud adoption is part of a much larger, much longer IT cycle.
Anyone remember Application Service Providers from the late ’90s? ASPs were a big deal. They were gonna change the way business did business. And then the dotcom bubble burst, and ASPs quickly hit the dustbin of history. Well, kind of. Salesforce.com emerged from those ashes, and somewhere between 2002 and 2004 we saw the first real ASP success story happen.
Could it be argued that this was all dependent on the macro-economy? Yeah, I suppose so. I mean: dotcom bust, recession, emergence from recession. That 1999-to-2002 window is exactly the timeframe for the founding and early growth of Salesforce.com. But it seems to me that there were bigger forces at work (no pun intended).
People, as in non-IT folks, began realizing that they could just purchase “seats” on Salesforce.com. Gartner has called it the “consumerization” of IT: a huge, sweeping change wherein mainstream technology adoption had “normal people” (non-IT pros) thinking they could just “download” something (or browse to something) to accomplish what they wanted. By 2005, IT departments everywhere had started to wake up to the beginning of this nightmare, where software as a service was being purchased “haphazardly” by line-of-business personnel. In other words, the purchasing and maintenance of enterprise software had been decentralized.
Of course, that wave of decentralization is still sweeping across IT. But it was in that context that “cloud computing” rose to prominence within IT departments. Which is to say that it wasn’t simply a “we can save money” or “screw CAPEX” decision. It was a reflection of a decentralization of the IT department that has been underway, in force, for over 10 years.
The “cloud” is the natural evolution of the decentralization of the complex system of enterprise IT. That evolution drives a commoditization of every point on the “value chain” — i.e., it drives value further and further up the chain. The value in IT used to be in the hardware, the mainframe. Then it was the OS. Then the platform and database. Commoditization — which is really just the cost of computing cycles dropping over time (while increasing in efficiency), Moore’s law — drives value further and further up the stack. As that drives us toward the application level, moving IT departments into “the cloud” was a completely rational occurrence.
How far will it go? Farther than any of us think it will, I’m sure. I remember a hallway conversation I had with Jamie Lewis (then CEO of Burton Group, now part of Gartner) back in 2004. I raised the idea of “identity as a service.” He said he didn’t think enterprises would ever put their identity data on someone else’s server; too risky. Of course, I countered with Salesforce.com getting people to put *customer* data on someone else’s server. Now, I don’t know that we’ve fully arrived at “identity as a service” yet, but the movement seems inevitable at this point (and it’s not that Jamie was “wrong,” so much as that neither of us could imagine the cloud world that was coming).
Cloud computing is not simply being driven by economic factors. Its rise is part of a much larger cycle, one that’s been happening for decades. In that context, it’s probably instructive to keep our eye focused on where the value will land: not in servers, or storage, or processing cycles, or even in applications themselves. Just as with email, the internet, and cell phones, the VALUE is in the connection. The value lives in the space between the servers, storage, apps, people, data and networks.