The Continuity Logic customized demo provides an opportunity for qualifying organizations to evaluate Frontline Live 5™ with their own plans, desired controls, policies, and procedures. This first-of-its-kind system for business continuity and many other areas of Governance, Risk and Compliance (GRC) is powerful, but it is often best evaluated with some of your own familiar plans, data and templates.



Jon Seals

Wednesday, 17 December 2014 00:00

Using a Risk Model as a Common Language

The central purpose of a common risk language is to assist management with evaluating the completeness of its efforts to identify events and scenarios that merit consideration in a risk assessment. Either management begins a risk assessment with (a) a blank sheet of paper with all of the start-up that choice entails, or (b) a common language that enables busy people with diverse backgrounds and experience to communicate more effectively with each other and identify relevant issues more quickly.

In a Corporate Compliance Insights column earlier this year, we provided a suggested language for executive management and directors to use in the Boardroom to focus the board risk oversight process. This month, we discuss the merits of a common language for use by the entire organization.

The sources of uncertainty an enterprise must understand and manage may be external (arising from its environment) or internal (arising from its processes). Risk is also about knowledge: when management lacks knowledge, there is greater uncertainty. Thus sources of uncertainty also relate to the relevance and reliability of information about the external and internal environment. These three broad groups – environment, process and information for decision-making – provide the basis for an enabling framework summarizing the sources of uncertainty in a business.
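As an illustrative sketch of how a common language supports the completeness check described above (the category and risk names below are made-up examples, not an official taxonomy), the shared vocabulary can be modeled as a simple mapping that an assessment is tested against:

```python
# Illustrative sketch: using a shared risk vocabulary to check whether a
# risk assessment has touched every broad category. All names here are
# hypothetical examples, not an official framework taxonomy.

RISK_LANGUAGE = {
    "environment": {"commodity prices", "political instability", "regulation"},
    "process":     {"supply chain disruption", "system outage", "fraud"},
    "information": {"data reliability", "reporting timeliness"},
}

def uncovered_categories(identified_risks):
    """Return the broad categories the assessment has not touched at all."""
    return {
        category
        for category, terms in RISK_LANGUAGE.items()
        if not terms & identified_risks   # no overlap with identified risks
    }

assessment = {"commodity prices", "system outage"}
print(uncovered_categories(assessment))  # the assessment missed 'information'
```

Starting from a shared list like this, rather than a blank sheet of paper, lets a diverse group see at a glance which areas of uncertainty have not yet been discussed.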



Wednesday, 17 December 2014 00:00

2015 risk predictions

What emerging risks are likely to have an impact on organizations during 2015? Experts from The Institute of Risk Management give their views.

Political instability caused by low oil prices, increased shareholder activism, and the business threat posed by a potential UK exit from the EU are among the chief concerns voiced by some of the UK’s leading risk experts for 2015.

As the year comes to a close, members of the Institute of Risk Management (IRM) were asked to identify key risk areas for 2015. A broad range of oil and gas, political, healthcare, regulatory and insurance risks were highlighted as potential flashpoints.



Wednesday, 17 December 2014 00:00

2015 cyber risk and data protection predictions

Businesses in 2015 are expected to experience increasing challenges as they struggle to contend with the burgeoning threat of complex cybercrime. EY analysis has outlined some of the key areas that cyber risks threaten to impact in the coming year, including the difficulties in the insurance sector of underwriting cyber risk, the raft of regulation coming out of both the EU and the UK, the importance of integrated risk functions in firms, and the cyber risk of supply chains moving to the cloud.

Insuring against cyber risk

Cyber risk poses a serious and growing threat to businesses across the UK, and companies are increasingly looking to insurers for protection against financial losses in the face of attacks. Certain sectors already require firms to take out cyber risk cover as part of regulatory compliance. However, cybercrime is not a traditional area of risk for insurers, and the burden of underwriting the risk is proving to be very difficult.

Shaun Crawford, Global Head of Insurance at EY, comments: “Cyber risk will certainly be one of the biggest challenges to the insurance market in 2015. Cybercrime is a moving beast, making it impossible to quantify the risks neatly or to calculate them in an informed or consistent manner. With so much unknown, it’s not surprising that premiums are wildly different across the market, and without cross-market stability, the industry will most likely be operating on significant indemnity losses.”



MSPs specializing in cloud-based file sharing may not be shocked to discover that end users frequently share data via insecure means. What might come as a surprise, however, is that 20 percent of those files contain data directly related to compliance (or the lack thereof).

This statistic comes from a recent study that analyzed roughly 100 million files shared through public-cloud applications. You can see all the findings in this infographic, but here are a few key takeaways, specifically for MSPs:

Non-compliance is the norm, not the exception

Based on the numbers, most businesses are struggling to stay compliant. Some have not made it a priority at all. The compliance data that was shared on public clouds included personally identifiable information (PII), personal health information (PHI), and customer payment card information.

This presents an opportunity for MSPs. Conveying to companies the importance of compliance, and the risk of leaving their data vulnerable, gives you the opening to bring them a solution: a cloud file sharing system that stays compliant can deliver significant value.
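As a rough sketch of how shared files might be flagged for the data classes mentioned above (the patterns below are simplified assumptions; production DLP and compliance tools use far broader detection and context checks):

```python
import re

# Simplified, illustrative detectors for compliance-sensitive data.
SSN_RE  = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")   # PII: US Social Security Number
CARD_RE = re.compile(r"\b\d{13,16}\b")            # candidate payment card number

def luhn_valid(number: str) -> bool:
    """Luhn checksum, used to cut false positives on card-like digit runs."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:      # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def scan_text(text: str) -> set:
    """Return the classes of compliance-relevant data found in `text`."""
    found = set()
    if SSN_RE.search(text):
        found.add("PII")
    if any(luhn_valid(m) for m in CARD_RE.findall(text)):
        found.add("PCI")
    return found

scan_text("Invoice for card 4111111111111111, SSN 123-45-6789")
# returns {'PII', 'PCI'}
```

A scan like this, run against files before they leave the organization, is the kind of control an MSP could package into a compliant file sharing offering.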



Wednesday, 17 December 2014 00:00

Lessons Learned from Data Breaches

Recent data breaches have left some large organizations reeling as they deal with the aftermath. They include the Target data breach, compromises at Home Depot, JP Morgan, USPS (which exposed employee Social Security Numbers and other data) and, most recently, Sony Pictures. The Sony hack also proved to be embarrassing to some of the company’s executives, as private email correspondence was exposed.

Collateral damage from a data breach is significant: one in nine customers affected by a data breach stopped shopping at the affected retailer. According to LifeLock, a recent survey of corporate executive decision-makers found that while concern about a breach rates 4 or 5 on a 5-point scale, only 10% to 20% of their total cyber security budgets go to breach remediation. Establishing an incident response plan in advance can reduce the cost per compromised record by $17.



Wednesday, 17 December 2014 00:00

Here Comes the Big Data

About two decades ago I thought I had a handle on big data. I was doing some data warehousing work with a telephone utility that had about 100 million transactions. That was a lot of data, I said to myself. Then, about 10 years ago, I was doing a review of a firm that audited financial trading on one of the major stock markets and I asked its big data guy how many transactions the company processed. His initial answer was, “On a slow day we get about 2.5 billion transactions.” “How many do you have on a busy day?” I asked with an air of shock. “4 or 5 billion,” he responded. Now that was really a lot of data.

Jump ahead a decade or so, and on 24 July 2014 Facebook announced that it was processing 1 trillion transactions per day. Now “that” is really, really big data. If you are a CEO, that is just one of the reasons why you should worry about having a big data strategy. Even if your organization isn’t a telecom utility or a financial institution, the amount of data you’re going to have to process is shooting up, what with all the smart (wireless) devices your customers and employees use heavily, plus the volumes of data beginning to flood the organization from all the IoT devices and systems that increasingly control any number of real-time processes.
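To put those anecdotes on one scale, a quick back-of-the-envelope calculation (using only the figures quoted above):

```python
# Back-of-the-envelope scale comparison of the figures quoted above.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

warehouse_1990s   = 100_000_000          # ~100 million transactions, total
exchange_slow_day = 2_500_000_000        # ~2.5 billion transactions per day
facebook_2014     = 1_000_000_000_000    # ~1 trillion transactions per day

print(facebook_2014 // SECONDS_PER_DAY)    # ~11.6 million transactions per second
print(facebook_2014 // exchange_slow_day)  # 400x the exchange's slow day
```

In other words, Facebook was handling that 1990s "lot of data" roughly every ten seconds.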



DataBank confirms the highest standards of security in all of its data centers with a PCI-DSS RoC


DALLAS – DataBank, Ltd., a leading custom data center and colocation provider based in Dallas, announced the completion of its annual PCI-DSS RoC (Payment Card Industry Data Security Standard Report on Compliance) for all data center locations. DataBank engages in these rigorous annual audits to ensure that clientele with strict data compliance requirements are receiving a truly move-in ready environment for their entire critical IT infrastructure.

PCI-DSS was developed in 2004 by the founding payment brands of MasterCard, Visa, American Express, and JCB International. Because of the nature of the transactions and the sensitive data that resides on servers conducting financial transactions, the PCI-DSS is one of the strictest sets of audit guidelines conducted in the MTDC (Multi-Tenant Data Center) industry.

DataBank's PCI-DSS RoC means that businesses processing credit card transactions through servers deployed in any of the company's data centers are compliant with the required security standards. The annual audit process involves an in-depth review of data center operations, logging, monitoring, intrusion detection, security procedures, and staff training. The resulting environment enables DataBank to enforce security best practices and to augment risk management to further protect client IT assets.

"The PCI RoC framework consists of the highest standards for data-security in our industry," said Tim Moore, CEO of DataBank. "By extending this rigorous compliance standard into all of our facilities, we ensure DataBank clients are receiving the most secure data center environment available."

To learn more about DataBank's data centers, compliance standards, and the company's complete suite of service solutions, please visit the corporate website at www.databank.com.


About DataBank
DataBank is a leading provider of enterprise-class data center solutions aimed at providing customers with 100% uptime availability of data, applications and deployed infrastructure. We offer a full suite of hosting solutions including colocation, managed services and cloud solutions that are anchored in world-class secure data center facilities with best-of-breed infrastructure and highly robust network architecture. Our customized deployments are designed to help customers effectively manage risk, improve their technology performance and focus on their core business objectives. DataBank is headquartered in the historic former Federal Reserve Bank Building in downtown Dallas, TX and has additional data centers in Dallas, Minneapolis and Kansas City. For more information on DataBank locations and services, please visit www.databank.com or call 1(800) 840-7533.

Tuesday, 16 December 2014 00:00

Shaping mobile security

Keith Bird shows how a new approach to mobile security can help organizations achieve the right balance of protection, mobility and productivity.

Most of us are familiar with the ‘triangle’ project management model, which highlights the constraints on delivering results in projects. The three corners of the triangle are fast, good and cheap, showing that in any given project, all three attributes cannot be optimised: one will inevitably be compromised to maximise the other two. You can have a good project delivered quickly, but not cheaply, and so on.

It’s traditionally been the same in IT security, especially when it comes to mobility. In this case, the three corners of the triangle are security, mobility and productivity. Usually, organizations have taken one of two approaches: either they’ve enabled mobility to boost productivity, with security inevitably compromised, or they’ve tried to deliver more effective security for mobile fleets at the cost of productivity.

Recent research shows that a majority of organizations have used the first approach, with mobility racing ahead of security. We (Check Point) surveyed over 700 IT professionals worldwide about mobility and mobile device usage in their organizations, and 72 percent said the number of personal mobile devices connecting to their organizations' networks had more than doubled in the past two years. 82 percent expected mobile security incidents to grow over the next 12 months, with higher costs of remediation.



A Johns Hopkins University analysis has looked at how climate change will increase the risk of power outages for various major US metro areas.

Johns Hopkins engineers created a computer model to predict the increasing vulnerability of power grids in major coastal cities during hurricanes. By factoring historical hurricane information with plausible scenarios for future storm behavior, the team could pinpoint which of 27 cities, from Texas to Maine, will become more susceptible to blackouts from future hurricanes.

Topping the list of cities most likely to see big increases in their power outage risk are New York City; Philadelphia; Jacksonville, Fla.; Virginia Beach, Va.; and Hartford, Conn. Cities at the bottom of the list, whose future risk of outages is unlikely to dramatically change, include Memphis, Dallas, Pittsburgh, Atlanta and Buffalo.

Seth Guikema, an associate professor in the university’s Department of Geography and Environmental Engineering, said his team’s analysis could help metropolitan areas better plan for climate change.
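The published model is far more detailed, but the basic idea of combining storm frequency with per-storm outage impact can be sketched as a toy Monte Carlo simulation (all parameters below are made-up assumptions for illustration, not the study's values):

```python
import random

def outage_risk(storms_per_decade, p_blackout_per_storm,
                trials=100_000, seed=42):
    """Toy Monte Carlo: probability a city suffers at least one major
    blackout in a decade, given an expected storm count and a per-storm
    blackout probability. Illustrative only, not the Hopkins model."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Treat each expected storm as an independent Bernoulli trial.
        if any(rng.random() < p_blackout_per_storm
               for _ in range(storms_per_decade)):
            hits += 1
    return hits / trials

# Hypothetical baseline vs. climate-shifted storm behavior for one city.
baseline = outage_risk(storms_per_decade=5, p_blackout_per_storm=0.10)
warmer   = outage_risk(storms_per_decade=8, p_blackout_per_storm=0.15)
print(f"baseline decade risk ~{baseline:.2f}, climate-shifted ~{warmer:.2f}")
```

Even in this crude sketch, modest shifts in storm frequency and intensity compound into a much larger decade-level blackout risk, which is the kind of effect the Johns Hopkins team quantified per city.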



Ever since the cloud burst onto the IT consciousness, the primary focus of most organizations has been to prepare for this new data paradigm. The thinking has been that the enterprise needs to be ready for the cloud or risk being left behind.

Lately, however, we’ve seen a subtle shift in attitude on the part of both the enterprise and the nascent cloud industry: It’s not the enterprise that needs to adapt to the cloud, but the cloud that needs to adapt to the enterprise. Across the board, from the large players like Amazon and Google to smaller ones like CloudSigma and DigitalOcean, the goal has shifted from providing the commodity resources that appeal to consumers to more specialized offerings that the enterprise values.

To be sure, there is no shortage of enterprise interest in the cloud already. According to IDG, nearly 70 percent of organizations today utilize cloud-based infrastructure or applications in some way, and IT spending on the cloud is currently averaging about 20 percent growth per year. The thing is, the vast majority of that activity consists of low-level workloads and bulk storage applications that generally go to the lowest bidder, which is usually one of the hyperscale players that can shave margins to the bone and still turn a decent profit.