
Thursday, 10 October 2013 17:39

Big Data, the Cloud and the Exascale Dilemma

For enterprises looking to build their own private clouds, the rule of thumb is quickly becoming: Go big or go home.

It’s been clear from the beginning that one of the chief advantages the cloud brings to enterprise data environments is scale. But even the most advanced cloud architecture in the world can expand only so far before it hits the limits of physical infrastructure. That’s why enterprises with the resources, like General Motors, are doing their best to match the hyperscale infrastructure of Amazon, Google and other top providers. With a big enough physical footprint, they can gain scalability and flexibility without having to entrust critical data and applications to public resources.

Naturally, building data infrastructure on a grand scale is not without its challenges. One of the chief obstacles is delivering adequate power to hyperscale, or even exascale, architectures – something the NSA has apparently discovered at its new Bluffdale, Utah, facility. To the joy of civil libertarians everywhere, the plant has been experiencing unexplained electrical surges that have fried components and caused small explosions. The situation is reportedly so bad that insiders describe the center as largely unusable, and even leading experts in the facilities and data center fields are at a loss to explain what is going on.