Summer Journal

Volume 30, Issue 2

Jon Seals


The cliché of “change is the only constant” is true for most enterprises. Customers, business analysts, and employees all expect some sort of evolution, even if it is with varying degrees of enthusiasm.

Even the minority whose positioning is deliberately one of no change (providers of traditional goods and services) are affected by changes in the way governments tax and regulate them, or how suppliers supply to them.

When it comes to business continuity, plans and management must keep pace with business changes too. But is an annual, a quarterly, or even a monthly BCP review the right way to stay synchronized?

There is a dilemma in business continuity planning and reviews. Making them too infrequent raises the risk of falling out of alignment with the business and being less able to react effectively to threats of interruption; making them too frequent consumes time and attention the business may struggle to spare.



The data center is on a clear trajectory toward greater abstraction, greater resource distribution, and greater diversity in both the workloads it supports and the technologies it brings to bear.

All of this leads to an increasingly complex management challenge that pits the need for greater autonomy among users and applications against the needs of the enterprise to maintain data availability and security while keeping budgets under control.

According to Shay Demmons, executive VP of BaseLayer’s RunSmart software division, this challenge is compounded by the fact that most organizations are branching into new IoT and service-level data architectures that must reach back to legacy infrastructure for crucial data support. This calls for a “looking forward, looking backward” management approach that utilizes many of the same technologies driving the transition to digital services – sensor-driven data systems, advanced visibility, and intelligent automation – to propel workflow management and resource allocation to the speed of modern business.



Tuesday, 06 June 2017 15:14

8 Vital Data Protection Tips

With so many files in existence and so many more being created every moment, it’s no wonder so many breaches and data loss incidents occur. We asked the experts for some of the top tips on keeping storage data protected.

1. Limit and Monitor Access

Many of the big data breaches we read about in the news trace their origins back to one of these two issues, and most likely both: too much access and little or no monitoring of that access. These are some of the biggest problems in data security, according to Rob Sobers, Director at Varonis.

The 2017 Varonis Data Risk Report found that 20 percent of folders are open to every employee. Forty-seven percent of organizations in the report had 1,000 or more sensitive files – containing personal data, health records, financial information, or intellectual property – open to every single user. Not only are sensitive files open to more people than necessary, but access abuse is not monitored and flagged. That is why 63 percent of data breaches take months or years to detect, according to the report.
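The finding about folders open to every employee suggests a simple first audit step. The sketch below is a hypothetical illustration, not the Varonis report's methodology: it walks a directory tree on a POSIX system and flags any file or directory whose permission bits grant read or write access to all users (the "other" bits). Windows ACLs would require a different approach.

```python
import os
import stat


def find_open_paths(root):
    """Walk a directory tree and flag entries whose POSIX mode
    grants read or write access to every user on the system."""
    flagged = []
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            path = os.path.join(dirpath, name)
            mode = os.stat(path).st_mode
            # S_IROTH / S_IWOTH are the world-readable / world-writable bits.
            if mode & (stat.S_IROTH | stat.S_IWOTH):
                flagged.append(path)
    return flagged
```

Pairing a periodic scan like this with access logging addresses both halves of the tip: limiting access and monitoring how it is used.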



(TNS) - With an above-average hurricane season predicted, the lack of leadership at two agencies responsible for protecting the United States' coastlines should be a sobering thought, said a widely admired general who led the military’s response to Hurricane Katrina.

The National Oceanic and Atmospheric Administration, which runs the National Hurricane Center, and the Federal Emergency Management Agency are both without leaders. Those positions must be appointed by President Donald Trump and confirmed by the U.S. Senate, CNN reported.

“That should scare the hell out of everybody,” retired Lt. Gen. Russel Honoré told CNN. “These positions help save lives.”

Honoré, who served as commander of Joint Task Force Katrina and coordinated military relief efforts, told CNN that the disaster proved “how important leadership was.”



Many critical industries – nuclear energy, commercial and military airlines, even drivers’ education – invest significant time and resources in developing processes. The data center industry … not so much.

That can be problematic, considering that two-thirds of data center outages are related to processes, not infrastructure systems, says David Boston, director of facility operations solutions for TiePoint-bkm Engineering.

“Most are quite aware that processes cause most of the downtime, but few have taken the initiative to comprehensively address them. This is somewhat unique to our industry.”