
The Need To Be Green: Data Center Strategies Becoming More Eco-Friendly

Written by Sai Gundavelli, Monday, 23 June 2008
As Nobel Peace Prize winner Al Gore stated recently, “It is a mistake to think of the climate crisis as one in a list of issues, it is the issue.”

Climate change, carbon credits or the need to be green is coming to the data center. For IT managers charged with implementing data center strategies that are more eco-friendly, the mandate is clear – turn the data center from inefficient to environmentally friendly. To do this requires a fundamental shift in the way corporate IT personnel think about the total impact of business applications, the underlying data and the processes required to maintain that data.

Transforming a data center into an environmentally friendly one will likely require new capital investment, along with a thorough review and understanding of the total cost of the data a company stores. By understanding that data (how and why it is used, and when it can be moved to lower-cost storage options), IT managers will begin to understand its overall impact on corporate costs.

How Much Power Is Consumed?

The energy used by companies to power data centers is significant. In 2006, the energy consumed by U.S. data centers was estimated at 61 billion kilowatt-hours (kWh), about 1.5 percent of total U.S. electricity consumption, for a total electricity cost of about $4.5 billion. This level of consumption exceeds the electricity used by all of the televisions in the U.S. and equals the electricity consumed by approximately 5.8 million U.S. households.

The power and cooling infrastructure alone, which is needed to keep IT equipment in data centers at moderate temperatures, accounts for 50 percent of the total electricity consumption within data centers. Among the different sizes of data centers, more than one-third (38 percent) of electricity use is attributable to the nation’s largest and most rapidly growing data centers.

If current efficiency trends continue, the energy consumed by servers and data centers will likely double by 2011 to more than 100 billion kWh, representing $7.4 billion in annual electricity costs. The peak load on the power grid from these servers and data centers is currently estimated at approximately 7 gigawatts (GW), equivalent to the output of about 15 base-load power plants. If current trends continue, this demand would rise to 12 GW by 2011, requiring an additional 10 power plants. In 2007, carbon dioxide emissions from all U.S. data centers were expected to exceed 44.4 million metric tons.

The estimated annual cost of operating a data center employing 75 workers and occupying 125,000 sq. ft. of new space also varies with geography, from under $10 million in Sioux Falls, S.D., to more than $14 million in New York City.

According to a 2005 survey of AFCOM (an IT trade association dedicated to providing education and resources for data center managers) members, data center power requirements are increasing an average of 8 percent per year, and the power requirements of the top 10 percent of data centers are growing at more than 20 percent annually. At those rates, companies in high-cost power markets have far more motivation to change their data centers than those in markets where power is cheap.

Going green can have a number of meanings, but the most common is recognizing the environmental impact that you and your organization have upon the world. U.S. data centers release more than 44 million metric tons of carbon dioxide into the air annually and consume 100 to 200 times more energy per square foot than the average household. Becoming green in the data center is about reducing power, reducing carbon and driving profit to the bottom line.

In order for a data center to be considered "green," many technology strategies need to work together. Server consolidation through virtualization and replacing older hardware with more efficient platforms is an easy start toward a green data center. However, low-energy hardware does not address a fundamental component of the data center: large volumes of data are being retained in production systems beyond their usable life, creating the need for high-availability servers and storage, with even larger server and storage requirements for test and development. Together, high-volume production and the required test and development processes create the need for extra power and cooling in the data center.

To achieve a cost-effective, power efficient green data center, a data management framework needs to be employed along with virtualization and energy saving hardware. Employing all three strategies will result in significant direct cost savings to companies and reduce the carbon load to the environment.

Large, multinational companies are beginning to make environmental needs a top priority. Rupert Murdoch announced earlier this year that News Corporation will be carbon neutral by 2010. During an interview, Murdoch said, “When all of News Corp. becomes carbon neutral, it will have the same impact as turning off the electricity in the city of London for five full days.”

IT data centers are also facing issues such as power scarcity, which results in skyrocketing cooling and electricity costs. The problem is compounded by the disproportionate amount of energy wasted on inefficient servers and storage infrastructures. Gartner reports that by 2008, 50 percent of IT managers will not have enough power to run their data centers. Governments around the globe are implementing legislation to enforce energy performance standards and regulate energy consumption, including the Japan Energy Law, SPEC Power, the ECMA energy efficiency effort (ECMA TC-38 TG2), Energy Star, the European Union Directive for Energy Using Products (EuP) and EPEAT. Added to this is consumer pressure on companies to be responsible environmental partners.

IT Problems with Data Management
Data center systems are straining under the volume of data they must manage. In the past few years, data center managers have experienced a dramatic increase in the amount of data collected. Online transaction systems have made it faster and easier to collect volumes of information about products, customers and suppliers, and in recent years companies have been legally required to save more and more data.

The large data footprint in production systems has led to a much larger downstream effect as companies struggle to manage not only the core transaction system but the proliferation of copies that are being made to support the production system. On average, for every production application, IT makes 8 copies for production support. As the production system grows, so do all the copies, consuming large quantities of storage and power. When an application or database needs to be upgraded, additional copies are required to reduce risk associated with the upgrade process.


IT data centers typically manage at least six mission-critical applications. Multiply the number of applications by the number of copies (6 apps x 8 copies = 48 total copies), add the storage those copies require, the servers needed to support each copy and the power to run the entire infrastructure, and it is no surprise that more than 70 percent of IT budgets are allocated to database applications even though only 20 percent of production data is database data.
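The multiplier effect of production-support copies can be sketched with a back-of-the-envelope calculation. The 1 TB-per-copy figure below is a hypothetical assumption for illustration, not a number from the article:

```python
# Rough estimate of the nonproduction data footprint created by
# production-support copies. The 1 TB copy size is an assumed figure.
apps = 6            # mission-critical applications
copies_per_app = 8  # production-support copies per application
tb_per_copy = 1.0   # assumed size of each copy, in terabytes

total_copies = apps * copies_per_app
nonproduction_tb = total_copies * tb_per_copy

print(total_copies)      # 48 copies
print(nonproduction_tb)  # 48.0 TB on top of the production data itself
```

Every terabyte here also implies servers to host it and power to run and cool them, which is why trimming the copies pays off well beyond the storage bill.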

Effective data management needs to incorporate the needs of not only the production environment but also of test and development. Reducing the need for full size development copies can significantly reduce the storage footprint required, saving storage, lowering power requirements and saving money.

What Can Your Company Do?
Strategies to manage the proliferation of data and systems include virtualization of servers, server consolidation and powering disks on an as-needed basis (focused on efficient power supplies through tighter integration between system workloads and storage drives). These strategies are all valid, but the fact remains that 80 percent of the data retained in production systems is underutilized by the organization. Better data-management techniques such as data compression, data de-duplication and tiered storage can create significant energy savings.
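To make the de-duplication idea concrete, here is a minimal content-addressed sketch in Python: each unique chunk of data is stored once, keyed by its hash, and repeated chunks become mere references. This illustrates the general technique, not any particular vendor's implementation:

```python
import hashlib

def dedupe(chunks):
    """Content-addressed de-duplication: store each unique chunk once,
    keyed by its SHA-256 digest; repeats are kept only as references."""
    store = {}       # digest -> chunk bytes, stored once
    references = []  # one digest per original chunk, in order
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)
        references.append(digest)
    return store, references

# Eight identical "copies" of a 1 KB block need only one stored chunk.
data = [b"x" * 1024] * 8
store, refs = dedupe(data)
print(len(refs), len(store))  # 8 references, 1 stored chunk
```

The same principle is what lets de-duplicating backup systems shrink the eight-copies-per-application footprint described earlier.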

The massive storage devices in data centers, and the physical space and power supply needed to support them, are all focused on retaining data on the most costly, highest-availability medium. Exponential data growth ultimately translates into more space, more storage servers, more applications to streamline the data, added networking, more complex data center designs, larger facilities, more air conditioning and power supply units and, in the end, a big fat power bill.

The data layer is where the real problems lie. Storage is consuming power, and the same principles that govern storage space savings should be applied to power management. Users need to identify what information is being stored, quantify where and for how long it is being kept, and detect redundancies and opportunities for information consolidation. Once this is known, IT managers can move forward with incremental backups, snapshots and other advanced storage technologies to separate out inactive data and reduce the overall volume of information.
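The incremental-backup approach mentioned above can be sketched as follows: rather than copying everything on every cycle, compare each file's content hash against a manifest from the last run and copy only what changed. The file names and manifest format are hypothetical, chosen for illustration:

```python
import hashlib

def incremental_backup(files, last_manifest):
    """Copy only files whose content changed since the previous backup.
    `files` maps path -> bytes; `last_manifest` maps path -> digest."""
    new_manifest, changed = {}, []
    for path, content in files.items():
        digest = hashlib.sha256(content).hexdigest()
        new_manifest[path] = digest
        if last_manifest.get(path) != digest:
            changed.append(path)  # only these need to be backed up
    return changed, new_manifest

files = {"orders.db": b"v2", "logs.txt": b"unchanged"}
manifest = {"orders.db": hashlib.sha256(b"v1").hexdigest(),
            "logs.txt": hashlib.sha256(b"unchanged").hexdigest()}
changed, manifest = incremental_backup(files, manifest)
print(changed)  # only the modified file is copied
```

Over many backup cycles, the unchanged majority of data is never re-copied, which is where the storage and power savings accumulate.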

Data archiving and migration are potent strategies that directly optimize storage capacity utilization and reduce the associated power demands. Policy-based archiving and purging eliminate redundant copies of stored data, enabling significant data reduction for backup and storage applications.
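A policy-based archiving rule can be as simple as an age threshold: records untouched for longer than the retention window move to a lower-cost archive tier. The record shapes and the one-year cutoff below are assumptions for the sake of the sketch:

```python
from datetime import date, timedelta

def apply_retention_policy(records, today, active_days=365):
    """Policy-based tiering: records touched within `active_days` stay in
    production; older records move to a lower-cost archive tier."""
    cutoff = today - timedelta(days=active_days)
    production = [r for r in records if r["last_used"] >= cutoff]
    archive = [r for r in records if r["last_used"] < cutoff]
    return production, archive

records = [
    {"id": 1, "last_used": date(2008, 5, 1)},   # recent -> stays in production
    {"id": 2, "last_used": date(2005, 3, 15)},  # stale  -> archive tier
]
prod, arch = apply_retention_policy(records, today=date(2008, 6, 23))
print(len(prod), len(arch))  # 1 1
```

In practice the policy would also capture legal retention requirements, but the mechanism, classify by usage and move the cold tier off expensive storage, is the same.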

Reducing the size of production systems can significantly lower the overall storage footprint in the data center by dramatically lowering the storage requirement for test and development processes. Application developers should define the testing requirements and build data sets that meet the specific needs of the test, rather than reproducing the entire database. By clearly defining the data needed for the test and creating a clone with only the required data, DBAs can significantly reduce the storage needed for application testing and maintenance functions.

For example, a 1-terabyte production database with seven copies requires 8 TB of storage capacity for a single application's data footprint. In classic test and development environments, this would be standard operating procedure. Using enterprise data management techniques, the specific requirements of a testing instance can be defined and only the data needed for the test created in the testing copy. Defining the test's requirements up front reduces the data volumes on these copies significantly, saving storage, server cycles and power in the data center.
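The subsetting idea can be sketched in a few lines: instead of cloning the full production table, clone only the rows a given test actually exercises. The orders table and the EU-region test scenario below are hypothetical examples:

```python
def build_test_subset(production_rows, predicate):
    """Clone only the rows a test actually needs, instead of
    copying the full production table."""
    return [dict(row) for row in production_rows if predicate(row)]

# Hypothetical orders table: a test of EU tax logic needs only EU rows.
production = [{"order": i, "region": "EU" if i % 10 == 0 else "US"}
              for i in range(1000)]
subset = build_test_subset(production, lambda r: r["region"] == "EU")
print(len(subset))  # 100 rows cloned instead of 1,000
```

Applied to the 1 TB database above, the same 10-to-1 reduction would shrink each of the seven copies from a terabyte to roughly 100 GB, with corresponding server and power savings.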

Data is expensive, from the direct costs of maintenance, storage and servers to the power required to run and cool the equipment. Effective data management can help organizations manage data growth, decrease costs in the data center and lower the company's impact on the environment.

Sai Gundavelli is chief executive officer of Solix Technologies of Sunnyvale, California. He can be reached at sai.gundavelli@solix.com.



"Appeared in DRJ's Summer 2008 Issue"