When one backup window closes, will another open for you to complete the process?
IT managers face the backup window dilemma all too often: data volumes continue to grow by as much as 50 percent annually, while business requirements increasingly dictate around-the-clock operation. The result is more business-critical data to back up in less time. Backup operations are straining organizations’ data protection objectives and demanding a more innovative solution.
With more than half of businesses running over their allotted backup window, according to a recent IDG study, what processes can be adopted to ensure the data protection strategies deployed at your organization remain viable?
Of the available data optimization technologies, data deduplication has the greatest potential to deliver a substantial, recurring impact on the cost and manageability of data growth. Backup solutions have employed deduplication engines for many years to save space in the backup data store. Although the cost of storage continues to decline, data has grown faster still, so more and more storage is consumed. And because backup windows are ever more critical, any performance penalty that deduplication imposes further erodes the backup solution’s ability to complete within the allocated window.
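The core idea behind a deduplication engine can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's implementation: data is split into fixed-size chunks, each chunk is identified by its SHA-256 digest, and only chunks not already present in the store are written. The chunk size, the in-memory dict used as a store, and the function names are all illustrative assumptions.

```python
# Minimal sketch of block-level deduplication (illustrative only).
# Each chunk is keyed by its SHA-256 digest; duplicates are stored once.
import hashlib

CHUNK_SIZE = 4096  # assumed fixed chunk size for illustration


def dedupe_write(data: bytes, store: dict) -> list:
    """Write data into the chunk store; return the list of chunk digests
    (the 'recipe' needed to reassemble the original stream)."""
    recipe = []
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:   # unique chunk: store it once
            store[digest] = chunk
        recipe.append(digest)     # duplicates cost only a reference
    return recipe


def read_back(recipe: list, store: dict) -> bytes:
    """Reassemble the original stream from its recipe."""
    return b"".join(store[d] for d in recipe)


# Two backup streams sharing most of their content consume the store once.
store = {}
r1 = dedupe_write(b"A" * 8192 + b"B" * 4096, store)
r2 = dedupe_write(b"A" * 8192 + b"C" * 4096, store)
assert read_back(r1, store) == b"A" * 8192 + b"B" * 4096
print(len(store))  # 3 unique chunks stored instead of 6 written
```

The second stream adds only one new chunk to the store, which is why successive, largely similar backups consume so little additional capacity.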
Some backup deployments use data optimization in a post-process mode to shrink the time required for backup and meet their backup window objectives. Doing so requires additional cache storage to hold the data before optimization, which adds cost and complexity to the IT equation. The only thing shrinking faster than today’s backup windows is the IT budget. Rampant data growth drives up operating costs, floor space and, of course, capital expenditure through the amount of data created and its associated cost, so efforts to shorten backup times may be further constrained by the need for cost containment.
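The trade-off between the two placements can be modeled in a short sketch, again purely illustrative rather than any product's design: a post-process pipeline must land the full raw stream in a staging cache before deduplicating it, while an inline pipeline deduplicates on the way to disk and needs no staging capacity at all. All names and sizes here are assumptions for the example.

```python
# Illustrative comparison of post-process vs. inline deduplication.
# Returns (staging cache bytes required, final deduplicated store bytes).
import hashlib


def chunks(data: bytes, size: int = 4096) -> list:
    return [data[i:i + size] for i in range(0, len(data), size)]


def post_process(data: bytes) -> tuple:
    staging = data  # a full-size cache must exist before optimization runs
    store = {hashlib.sha256(c).hexdigest(): c for c in chunks(staging)}
    return len(staging), sum(len(c) for c in store.values())


def inline(data: bytes) -> tuple:
    store = {}
    for c in chunks(data):  # deduplicate before anything lands on disk
        store.setdefault(hashlib.sha256(c).hexdigest(), c)
    return 0, sum(len(c) for c in store.values())


backup = b"X" * 4096 * 100  # a highly redundant 400 KB stream
for name, fn in [("post-process", post_process), ("inline", inline)]:
    cache, stored = fn(backup)
    print(f"{name}: staging cache {cache} bytes, final store {stored} bytes")
```

Both modes end with the same 4 KB deduplicated store, but only the post-process path required 400 KB of temporary staging capacity first, which is the cache cost the paragraph above describes. The catch, of course, is that the inline path must run fast enough not to slow ingest.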
To address these issues and to gain competitive market share and revenues, OEMs need to implement data optimization technologies that satisfy the following key requirements:
- Performance – Data optimization must be extremely efficient, maintaining a level of performance that does not impede overall storage/backup throughput. An engine that runs at sufficiently high performance can run inline, eliminating both the post-process optimization run and the cost of the temporary storage its data cache requires. Storage vendors have made billion-dollar R&D investments to optimize storage performance as a means of differentiating their offerings; an optimization engine must not slow the overall backup process.
- Scalability – Less than 10 years ago, only a handful of IT organizations had a petabyte of data. Today, thousands of large organizations manage more than that. Data optimization solutions must therefore scale to multiple petabytes today and, in the future, to exabyte capacities.
- Resource efficiency – Whether backup runs on a standalone appliance or as server-based software, the dedupe/data optimization process consumes RAM and CPU. Highly efficient resource utilization improves scalability and lets the backup process run as quickly as possible.
- Data integrity – Data optimization technology must not interfere with the storage application software in any way that increases data risk. The OEM’s storage software must retain control over writing data to disk, and the data optimization software must not modify the data format. This eliminates the need for complex data reassembly processes (commonly called “rehydration”) and protects data against possible corruption.
OEMs need to develop a backup solution that runs without latency or performance impact while eliminating the post-process data cache and its cost. And it must work seamlessly across primary, archive and backup storage, addressing the needs of each of these tiers.
Backup windows keep shrinking even as they grow more critical. When properly deployed, data optimization technologies help IT complete backup operations within the allotted time and at an improved TCO over alternative techniques.
Wayne Salpietro is director of marketing at Permabit Technology Corporation in Cambridge, Mass., a leading provider of storage efficiency software.