Winter Journal

Volume 31, Issue 4

Difficult economic conditions lead to fiscal belt tightening. The ever-increasing demand for data accelerates the growth of storage, making storage costs look like ripe, low-hanging fruit to cost-cutters. Buying low-priced, “good enough,” or mediocre storage appears to be an opportunity to trim a large and growing budget item. Purchase price, however, is only one part of the cost equation.

Low-cost gear costs less not only because of limited functionality but also because a number of engineering shortcuts were taken during manufacturing. For example, using lower-tolerance components that have higher failure rates or removing redundant components are common ways to reduce production costs. These shortcuts, however, negatively impact overall reliability.

Lower reliability means a greater number of outages that require restores, rebuilds, restarts, and reboots. The extra expense of these recovery actions as well as the lost productivity of diverting attention from more important productive activities can quickly exceed the one-time savings gained from buying cheap storage.
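The trade-off above can be sketched with a back-of-the-envelope comparison. All of the figures below are hypothetical assumptions chosen for illustration, not data from this article or any vendor:

```python
# Illustrative comparison (all figures are assumptions): the one-time
# saving from cheaper storage vs. the expected annual cost of the
# extra outages that lower reliability brings.

savings_one_time = 50_000          # assumed purchase discount for "good enough" gear

extra_outages_per_year = 4         # assumed additional incidents per year
hours_per_outage = 6               # assumed recovery time (restores, rebuilds, reboots)
staff_cost_per_hour = 150          # assumed loaded IT labor rate
lost_productivity_per_hour = 2_000 # assumed business impact while systems are down

cost_per_outage = hours_per_outage * (staff_cost_per_hour + lost_productivity_per_hour)
expected_annual_cost = extra_outages_per_year * cost_per_outage

print(f"One-time saving:      ${savings_one_time:,}")
print(f"Expected annual cost: ${expected_annual_cost:,}")
print(f"Saving erased within the first year: {expected_annual_cost > savings_one_time}")
```

Under these assumed figures the recovery costs alone consume the entire purchase saving in under a year, before any data integrity risk is counted.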

Mediocre storage poses a far greater danger, however, than increased operating expense. Because mediocre storage devices are more vulnerable to reliability problems, they expose the organization to a higher level of data integrity risk and, more seriously, to the risk of outright data loss.

Data is not an off-the-shelf commodity; you can’t buy replacement data if it is lost. Without a duplicate copy of critical data, the loss is irreversible and permanent. Transactional data, moreover, has grown in both value and volume, and because the paper source has been eliminated, reconstructing it without a duplicate copy is difficult if not impossible.

Data is one of only two irreplaceable corporate assets, second only to human life. ‘Oh, come on, really; compare loss of data to loss of life?’ Research has shown that 50 percent of companies that lose critical business systems for more than 10 days never recover, 43 percent of companies experiencing a disaster never reopen, and 29 percent of those that do reopen close within two years. That is the death of a corporation.

Technology is tightly woven into the operating fabric of today’s organizations, and in many ways technology has become the business. Using mediocre, “good enough” storage creates an untrustworthy environment for critical corporate information by placing vital data at risk. “Good enough” storage not only increases operational risk but also creates a material internal control weakness by contributing to data integrity problems and raising the likelihood of data loss. That risk, in turn, compromises compliance with a growing number of governmental regulations.

The heart of this growing government regulation of business is internal controls and operational risk. Not since the Nixon-era Foreign Corrupt Practices Act (FCPA) has so much attention been given to corporate governance. These new regulations have a big bite and very sharp teeth. The Sarbanes-Oxley Act holds senior executives personally liable and can result in penalties of up to $5 million in fines, up to 20 years in prison, or both. To say the least, this has gripped the attention of all corporate senior officers.

Sarbanes-Oxley Section 302 addresses material weakness in internal controls. A material weakness is a condition in which there is a high probability that material financial errors, irregularities, or risk events could occur and not be detected by employees or existing control processes. Implementing acceptable internal controls is the key to satisfying the requirements of Sarbanes-Oxley.

Although most IT organizations set policies and practices to limit vulnerabilities and reduce security incidents, this best-effort scenario is no longer enough for the federal government. An untrustworthy operation leads to serious noncompliance implications in today’s corporate governance environment. Is a nominal, one-time saving from purchasing mediocre storage worth the risk of prison?

Sarbanes-Oxley controls are not unlike those found in the Gramm-Leach-Bliley Act (GLBA) of 1999 and the Health Insurance Portability and Accountability Act (HIPAA) of 1996 that were enacted to safeguard data against unauthorized and improper use.
However, in this case the SEC is squarely focused on corporate accountability. Negligence, ignorance, or a “good enough” effort is no longer acceptable under this new law. Blind trust in an IT system will not be an acceptable defense. The law formally establishes corporate responsibility to create and maintain controls to identify and manage risks that result in inaccurate data.

Internal controls are largely in the realm of IT, and compliance is not an option. Required internal controls include policies and procedures to maintain accurate records, properly record and report transactions, and safeguard assets against unauthorized or improper use. Since mediocre storage puts data in jeopardy, mediocre storage is not really good enough for compliance.

If mediocre storage creates internal control weaknesses, then the reverse is also true: quality has value with respect to compliance. Quality storage solutions improve operational effectiveness by reducing operational risk and strengthening internal controls. Storage solutions built to superior design standards, including redundancy of critical components, protect data assets by sharply reducing the likelihood of data loss. Quality storage is technology’s “Keep Out of Jail” card.
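Why redundancy reduces the likelihood of data loss so sharply can be shown with a simple mirrored-pair model. The failure probability and rebuild window below are hypothetical assumptions, and failures are assumed independent:

```python
# Hypothetical sketch: annual data-loss risk of a single device vs. a
# mirrored (redundant) pair. With a mirror, data is lost only if the
# surviving copy also fails during the rebuild window after the first
# failure. All figures are illustrative assumptions.

p_fail_single = 0.03                     # assumed annual failure probability of one device
rebuild_window_years = 48 / (365 * 24)   # assumed 48-hour rebuild, as a fraction of a year

# Probability the second copy fails during the rebuild window.
p_second_fails_during_rebuild = p_fail_single * rebuild_window_years

# Either device may fail first, hence the factor of 2.
p_loss_mirrored = 2 * p_fail_single * p_second_fails_during_rebuild

print(f"Single device annual loss risk: {p_fail_single:.2%}")
print(f"Mirrored pair annual loss risk: {p_loss_mirrored:.6%}")
```

Even with these rough assumptions, mirroring cuts the annual loss risk from percent-scale to a few in a million, which is the internal-control strengthening the paragraph above describes.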

Dennis Wenk, CISA, CDP, CSP, is the senior global architect of business continuity and resiliency for Hitachi Data Systems. He is responsible for creating high availability solutions for HDS customers. Wenk holds a Master of Business Administration in accounting and finance (1985), a Bachelor of Science in computer science from Northern Illinois University (1977), and an Associate of Science in data processing from Moraine Valley Community College (1974). He is also a Certified Information Systems Auditor (CISA), Certified Data Processor (CDP), and Certified Systems Professional (CSP).