Summer Journal

Volume 29, Issue 3


Jon Seals

Thursday, 17 March 2016 00:00

Agility Trumps Cost in the Cloud

Most cloud experts will tell you that the real advantage of shedding static, legacy infrastructure is not the cost savings, but the enhanced agility. By quickly and easily developing new applications and pushing them out, organizations can craft a more responsive and compelling experience for customers, which should translate into higher sales.

But this agility doesn't happen by itself, even after the cloud environment has been deployed. The enterprise needs to make sure that cloud functionality exists across the data environment and that business managers know how to leverage the flexibility and agility that the new service-based infrastructure offers.

One of the ways to do this, of course, is rapid deployment and configuration of resources. But as Google and others are quick to point out, the goal is not simply to deploy a new environment and let it run but to constantly configure and reconfigure resources to produce optimal results with the lowest consumption. Google's new Custom Machine Types support this level of functionality by offering sub-minute configuration changes, which provide the twin benefits of highly accurate load balancing and the ability to quickly change underlying resources like compute and memory to meet shifting data requirements. Essentially, it gives the enterprise what it wants when it wants it, with only a fraction of the complexity that usually accompanies infrastructure change management.
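To make the mechanics concrete, here is a hedged Python sketch of how an automation layer might round a measured demand up to a valid custom machine shape. The sizing constraints encoded below (vCPU counts above 1 must be even, memory in 256 MB multiples, between 0.9 GB and 6.5 GB per vCPU) follow Google's documented rules for custom machine types; the function itself and its interface are invented for illustration.

```python
import math

def custom_machine_type(cpu_demand: float, mem_demand_gb: float) -> str:
    """Round observed demand up to a valid "custom-CPUS-MEMORY_MB" shape."""
    vcpus = max(1, math.ceil(cpu_demand))
    if vcpus > 1 and vcpus % 2:          # vCPU counts above 1 must be even
        vcpus += 1
    # Clamp memory to the per-vCPU floor, then round up to a 256 MB multiple,
    # then cap at the per-vCPU ceiling (6.5 GB/vCPU, itself a 256 MB multiple).
    mem_mb = max(mem_demand_gb * 1024, vcpus * 0.9 * 1024)
    mem_mb = math.ceil(mem_mb / 256) * 256
    mem_mb = min(mem_mb, vcpus * 6656)
    return f"custom-{vcpus}-{mem_mb}"
```

For example, a demand of 3 vCPUs and 4 GB of memory rounds up to the shape "custom-4-4096", which could then be applied to an instance through the provider's resize API.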



In recent years, more and more cybersecurity incidents have stemmed from insecure third-party vendors, business associates and contractors. The notorious Target breach, for example, originated with a vulnerable HVAC vendor, and its repercussions continue to plague the company today. With sensitive data, trade secrets and intellectual property at risk, hackers can easily leverage a third party's direct access to a company's network to break in.

While such incidents may cause significant financial and reputational harm to the first-party business, there is hope. Regulators are instituting a growing number of legal requirements that an organization must meet with respect to third-party vendor risk management. As liability and regulations take shape, it is important to assess whether your company currently employs a vendor risk management policy and, if not, understand how a lack of due diligence poses significant risk to your organization's overall cybersecurity preparedness.

A vendor management policy is put in place so an organization can tier its vendors based on risk. A policy like this identifies which vendors put the organization most at risk and then expresses which controls the company will implement to lessen this risk. These controls might include rewriting all contracts to ensure vendors meet a certain level of security or implementing an annual inspection.
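The tiering logic such a policy describes can be sketched in a few lines. The scoring criteria, vendor names and tier labels below are invented for illustration, not drawn from any standard framework; the point is only that each vendor's risk factors map to a tier, and each tier maps to a set of controls.

```python
def risk_score(vendor: dict) -> int:
    """Score 0-3: one point per risk factor the vendor exhibits."""
    return (vendor["has_network_access"]
            + vendor["handles_sensitive_data"]
            + (not vendor["security_audit_passed"]))

def tier(vendor: dict) -> str:
    score = risk_score(vendor)
    if score >= 2:
        return "high"     # e.g. rewrite contract, annual on-site inspection
    if score == 1:
        return "medium"   # e.g. annual security questionnaire
    return "low"          # e.g. standard contract terms

vendors = [
    {"name": "HVAC Co", "has_network_access": True,
     "handles_sensitive_data": False, "security_audit_passed": False},
    {"name": "Caterer", "has_network_access": False,
     "handles_sensitive_data": False, "security_audit_passed": True},
]
by_tier = {v["name"]: tier(v) for v in vendors}
```

Here the HVAC vendor, with direct network access and no passed audit, lands in the high tier, exactly the profile that proved costly in the Target case.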



One of the more frustrating aspects of analytics is the amount of time it takes to put data in a format that makes it useful. By some estimates, manually making data accessible to an analytics application can consume as much as 80 percent of an analyst’s time. Given the salary analysts command, the cost of prepping data can be considerable.

IBM today announced a partnership with Datawatch under which it will resell Datawatch Monarch, a self-service tool that enables end users to automate much of the data preparation work associated with running an analytics application. In this instance, IBM intends to provide access to Datawatch Monarch to end users making use of the IBM Cognos and IBM Watson Analytics services delivered via the cloud.

Datawatch Monarch makes it possible for an end user to automatically have all the data in a file turned into rows and columns that can be easily consumed by an analytics application. It also makes it possible to join dissimilar data, all of which can be reused across the organization.
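Monarch's engine is proprietary, but the kind of extraction it automates, and that analysts otherwise do by hand, can be illustrated with a minimal Python sketch: pulling the detail lines out of a report-style text file into uniform rows and columns, carrying group headers down onto each row. The sample report and field layout here are invented.

```python
import re

report = """\
ACME CORP - QUARTERLY SALES REPORT
Region: West
  2016-01   Widgets      1200
  2016-02   Widgets       950
Region: East
  2016-01   Gadgets      2100
"""

rows = []
region = None
for line in report.splitlines():
    m = re.match(r"Region:\s*(\w+)", line)
    if m:
        region = m.group(1)  # carry the group header down onto detail rows
        continue
    m = re.match(r"\s+(\d{4}-\d{2})\s+(\w+)\s+(\d+)", line)
    if m:                    # a detail line: month, product, units
        rows.append((region, m.group(1), m.group(2), int(m.group(3))))
```

The resulting tuples are already in the rows-and-columns shape an analytics application expects; a tool like Monarch does the equivalent without the user writing any regular expressions.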



A few short decades ago, safety planning was not considered a priority for the vast majority of corporations. Instead, most incidents and emergencies were handled as they occurred, as effectively as possible given the limited technology resources available at the time.

Today, workplace health and safety departments have evolved into something else entirely: a must-have element of any corporation seeking to maximize occupational health and safety.

To fully understand the importance of corporate safety planning—and to glimpse how much it has changed the modern work environment—you only need to see how far it has come. Here is how workplace safety programs have evolved, along with what worked and what didn't:



DUPONT, Wash. – Washington suffered its worst wildfire season in state history in 2015. Raging fires burned more than one million acres of public and private lands. After two straight years of record-breaking wildfires, vast areas of the state face a much greater risk of flash flooding, debris flow and mudslides. But a team effort by all levels of government aims to reduce those threats to public safety.

The team—called the Erosion Threat Assessment/Reduction Team (ETART)—was formed by the Washington Military Department’s Emergency Management Division (EMD) and the Federal Emergency Management Agency (FEMA) after the Carlton Complex Fire of 2014. A new ETART was formed in October 2015 following the federal disaster declaration for the 2015 wildfires.

ETART participants include EMD, FEMA, the U.S. Army Corps of Engineers, the National Weather Service, the Confederated Tribes of the Colville Reservation, the Washington State Conservation Commission, the Washington State Department of Natural Resources, the Spokane, Okanogan and Whatcom conservation districts, and many others.

Led by the Okanogan Conservation District, ETART members measured soil quality, assessed watershed changes, identified downstream risks and developed recommendations to treat burned state, tribal and private lands.

“Without vegetation to soak up rainwater on charred mountainsides, flash floods and debris flows may occur after a drizzle or a downpour,” said Anna Daggett, FEMA’s ETART coordinator. “ETART brings together partners to collaborate on ways to reduce the vulnerability of those downstream homes, businesses and communities.”

Besides seeding, erosion control measures may include debris racks, temporary berms, low-water crossings and sediment retention basins. Other suggestions may include bigger culverts, more rain gauges and warning signs, and improved road drainage systems.

While public health and safety remains the top priority, other values at risk include property, natural resources, fish and wildlife habitats, as well as cultural and heritage sites.

“ETART addresses post-fire dangers and promotes collective action,” said Gary Urbas, EMD’s ETART coordinator. “With experienced partners at the table, we can assess and prioritize projects, then identify potential funding streams to fit each project based on scale, location and other criteria, which may lead to a faster and more cost-effective solution.”

Since the major disaster declaration resulting from wildfire and mudslide damages that occurred Aug. 9 to Sept. 10, 2015, FEMA has obligated more than $2.9 million in Public Assistance grants to Washington. Those funds reimburse eligible applicants in Chelan, Ferry, Lincoln, Okanogan, Pend Oreille, Stevens, Whatcom and Yakima counties, as well as the Confederated Tribes of the Colville Reservation, for at least 75 percent of the costs of debris removal, emergency protective measures, and the repair or restoration of disaster-damaged infrastructure.

After the 2014 Carlton Complex Fire, FEMA provided $2.4 million in Public Assistance grants specifically for ETART-identified projects. Those grants funded erosion control measures that reduced the effects of the 2015 wildfires—such as installing straw wattles, clearing culverts and ditches of debris, shoring up breached pond dams, and seeding and mulching burned lands.

FEMA also offers fire suppression grants, firefighter assistance grants, Hazard Mitigation Grants and National Fire Academy Educational Programs.

Affected jurisdictions, landowners and business owners continue to submit requests for grants, disaster loans, goods, services and technical assistance from local, state and federal sources to recover from the wildfires, protect the watersheds or reduce the risks associated with flooding and other natural hazards.

ETART recently issued its final report, which details its methodology, assessments, debris-flow model maps, activities and recommendations. Completed activities include:

  • Compiled and shared multi-agency risk assessments across jurisdictions through a public file-sharing site.

  • Developed and disseminated an interagency program guide to assist jurisdictions seeking assistance.

  • Transitioned ETART to a long-term standing committee to address threats, improve planning, and resolve policy and coordination issues that may thwart successful response and recovery efforts related to past fires and potential future events.

The “2015 Washington Wildfires Erosion Threat Assessment/Reduction Team Final Report” is available at https://data.femadata.com/Region10/Disasters/DR4243/ETART/Reports/. Visitors to this site may also access “Before, During and After a Wildfire Coordination Guide” developed by ETART.

More information about the PA program is available at www.fema.gov/public-assistance-local-state-tribal-and-non-profit and on the Washington EMD website at http://mil.wa.gov/emergency-management-division/disaster-assistance/public-assistance.

Additional information regarding the federal response to the 2015 wildfire disaster, including funds obligated, is available at www.fema.gov/disaster/4243.

A breach a day is the new norm. In the past 12 months there have been a number of high-profile breaches. Take Sony, for example: the company lost control of its entire network, and the hackers released feature-length movies onto torrent sites for anyone to download freely. It was very high profile at the time and incredibly damaging. TalkTalk had its customer information dumped onto the internet for anyone to use. Xbox Live was taken down over the Christmas period by a distributed denial-of-service attack; the hackers did it simply for the fun of it. Famous political figures have also had their public profiles very notably defamed.

These hacks happen every day. A breach a day is the new norm.



It’s a common phrase you have probably heard throughout your career: A crisis management plan is a living document. It’s a reminder that any crisis plan should be updated continually to reflect a business, its employees and the threats that might impact normal operations.

However, in practice, ensuring your plan is current and your team is up to date requires a significant investment of time and patience, and can be downright challenging. But if your company makes crisis management a priority, it is possible.

Here are three key ways to ensure your plan and team are always up to date:




Active shooter incidents have become an increasingly significant threat in healthcare and hospital environments. According to an FBI study titled Workplace Violence: Issues in Response, healthcare employees experience the largest number of Type 2 active shooter assaults (assaults on an employee by a customer, patient or someone else receiving a service). [1] Also, in a 12-year study conducted by Johns Hopkins, hospital-based active shooter incidents in the United States increased from 9 per year in the first half of the study to 16.7 per year in the second half. [2]

Because of the increased active shooter risk that healthcare and hospital facilities face, it is crucial for decision-makers to integrate active shooter preparedness into their workplace violence prevention policy and to provide reality-based training and resources for their staff. Of equal importance is an emergency response procedure and communication strategy. Shooting incidents in hospital and healthcare settings are unique, and they require a clear, concise communication action plan.




Building Resilience, City by City

With escalating risks and uncertainty around the globe, cities are challenged to understand and mitigate those risks to stay vital. Much as in the business world, municipalities are moving toward resilience—the capability to survive, adapt and grow no matter what types of stresses are experienced.

Recognizing that they have much to offer each other, communities and businesses are often working together to pool their experience and knowledge. Helping to foster this is a project called the 100 Resilient Cities Challenge, funded by the Rockefeller Foundation. The project has selected 100 cities around the world and provided funding for them to hire a chief resilience officer.

“Resilience is a study of complex systems,” said Charles Rath, president and CEO of Resilient Solutions 21. He spoke about resilience and his experiences with the 100 Resilient Cities Challenge at the recent forum, “Pathways to Resilience,” hosted by the American Security Project and Lloyd’s in Washington, D.C. “To me, resilience is a mechanism that allows us to look at our cities, communities, governments and businesses almost as living organisms—economic systems that are connected to social systems, that are connected to environmental systems and fiscal systems. One area we need to work on is understanding those connections and how these systems work.”




Understanding the Value of Data

The enterprise has been sitting on a goldmine of valuable information for several decades now, but only recently has it had access to the technology to pull it all together and make sense of it. This is leading to a shift in the way organizations value both data and infrastructure: data is becoming increasingly important to the business model, while distributed cloud architectures and commodity hardware are diminishing the significance of infrastructure.

But raw data is like unrefined ore: There is potential there, but first it must be retrieved, cleaned, refined and then delivered to those who find it most desirable. For that, you need a top-notch data management platform.

According to a recent study by Veritas, many organizations are still squandering the value of data simply by not having a full understanding of what they have and how it can be utilized. More than 40 percent of data, in fact, hasn’t been accessed in three years. In some instances, this is due to compliance and regulatory issues, but in many cases it can be traced to improper management. Once data enters the archives, it tends to be lost forever even though it may still have value to present-day processes. As well, developer files and compressed files make up about a third of all stored data, even though the projects they supported are long gone. There is also a significant amount of orphaned data, unowned and unclaimed by anyone in the organization, and this is becoming increasingly populated with rich media files like video chats and graphics-heavy presentations.
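The first step toward reclaiming that value is simply finding the stale data. As a sketch only (this is not Veritas's methodology), a few lines of Python can walk a storage tree and flag files whose last-access time is more than three years old; note that some filesystems do not update access times reliably, so real audits combine this with other signals.

```python
import os
import time

THREE_YEARS = 3 * 365 * 24 * 3600  # seconds

def stale_files(root, now=None):
    """Return paths under root whose last access was over three years ago."""
    now = time.time() if now is None else now
    stale = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if now - os.stat(path).st_atime > THREE_YEARS:
                stale.append(path)
    return stale
```

Run against an archive share, the resulting list gives management a concrete inventory to triage: retain for compliance, re-activate for current projects, or assign an owner to orphaned files.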