Now that the cloud is becoming a standard feature in the enterprise, a little truism has emerged: Resources are infinitely scalable, but so are the costs.
Theoretically, at least, cloud consumption should only grow alongside business activity, and therefore revenue, so the cost/benefit ratio should always favor the enterprise, at least if you're smart about it. In practice, it doesn't always work that way. But even if it did, the real question is not at what point a gargantuan cloud presence becomes a money loser, but when it ends up costing more than building and operating your own data center.
This conflict is particularly acute in rapidly growing enterprises. Companies that go from little-known start-up to must-have business solution provider overnight can suddenly find themselves on the hook for millions per year. Wired.com, for example, tells the tale of MemSQL, a West Coast database services company that originally provisioned its entire test and development infrastructure on Amazon, only to dump it one day in favor of in-house, bare metal infrastructure. A simple cost comparison was the key driver: for a one-time outlay of about $120,000, amortized over three years (roughly $40,000 per year), the company shed more than $300,000 in annual cloud costs, a reduction of more than 80 percent.
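The arithmetic behind that comparison is worth making explicit. The sketch below uses the figures reported in the article and a simple straight-line amortization over the three-year window MemSQL used; it deliberately ignores real-world factors such as power, cooling, staffing, and hardware refresh cycles, which any serious comparison would need to include.

```python
def annualized_capex(total_cost: float, years: int) -> float:
    """Spread a one-time hardware purchase evenly over its useful life."""
    return total_cost / years

# Figures from the article; the amortization window is MemSQL's.
bare_metal_per_year = annualized_capex(120_000, 3)  # $40,000/year
cloud_per_year = 300_000                            # reported annual cloud spend

savings = cloud_per_year - bare_metal_per_year
reduction_pct = 100 * savings / cloud_per_year

print(f"Bare metal: ${bare_metal_per_year:,.0f}/yr")
print(f"Reduction:  {reduction_pct:.0f}%")
```

Even on this back-of-the-envelope basis, the annualized hardware cost comes to well under a fifth of the cloud bill it replaced, which is consistent with the "more than 80 percent" figure the company cited.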