CIO — The demands of big data applications can put a lot of strain on a data center. Traditional IT seeks to operate in a steady state, with maximum uptime and continuous equilibrium. After all, most applications tend to have a fairly light compute load—they operate inside a virtual machine and use only some of its resources.
Big data applications, on the other hand, tend to consume massive amounts of compute. They also tend to run in bursts of activity—they start and end at a particular point in time.
"Big data is really changing the way data centers are operating and some of the needs they have," says Rob Clyde, CEO of Adaptive Computing, a specialist in private/hybrid cloud and technical computing environments. "The traditional data center is very much about achieving equilibrium and uptime."